bevy/examples/3d/motion_blur.rs

//! Demonstrates how to enable per-object motion blur. This rendering feature can be configured per
//! camera using the [`MotionBlur`] component.
use bevy::{core_pipeline::motion_blur::MotionBlur, math::ops, prelude::*};
fn main() {
let mut app = App::new();
// MSAA and Motion Blur together are not compatible on WebGL
#[cfg(all(feature = "webgl2", target_arch = "wasm32", not(feature = "webgpu")))]
app.insert_resource(Msaa::Off);
app.add_plugins(DefaultPlugins)
.add_systems(Startup, (setup_camera, setup_scene, setup_ui))
.add_systems(Update, (keyboard_inputs, move_cars, move_camera).chain())
.run();
}
fn setup_camera(mut commands: Commands) {
commands.spawn((
Camera3d::default(),
// Add the `MotionBlur` component to a camera to enable motion blur.
// Motion blur requires the depth and motion vector prepass, which this component adds.
// Configure the amount and quality of motion blur per-camera using this component.
MotionBlur {
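// `shutter_angle` is the fraction of a frame period the simulated shutter stays open: 1.0 blurs
// movement across the entire frame time, while 0.5 approximates a film camera's 180° shutter.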
shutter_angle: 1.0,
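// `samples` is how many points are taken along each pixel's motion vector; more samples give
// smoother blur at a higher per-pixel cost.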
samples: 2,
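// Padding so the uniform built from this component meets WebGL2's 16-byte alignment rules;
// the field only exists when targeting WebGL2.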
#[cfg(all(feature = "webgl2", target_arch = "wasm32", not(feature = "webgpu")))]
_webgl2_padding: Default::default(),
},
));
}
// Everything past this point is used to build the example, but isn't required to use motion blur.
#[derive(Resource)]
enum CameraMode {
Track,
Chase,
}
#[derive(Component)]
struct Moves(f32);
#[derive(Component)]
struct CameraTracked;
#[derive(Component)]
struct Rotates;
fn setup_scene(
asset_server: Res<AssetServer>,
mut images: ResMut<Assets<Image>>,
mut commands: Commands,
mut meshes: ResMut<Assets<Mesh>>,
mut materials: ResMut<Assets<StandardMaterial>>,
) {
commands.insert_resource(AmbientLight {
color: Color::WHITE,
brightness: 300.0,
});
commands.insert_resource(CameraMode::Chase);
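// A single directional light acts as the sun and casts the scene's shadows.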
commands.spawn((
DirectionalLight {
illuminance: 3_000.0,
shadows_enabled: true,
..default()
},
Transform::default().looking_to(Vec3::new(-1.0, -0.7, -1.0), Vec3::X),
));
// Sky
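// A very large sphere with a negative scale flips its faces inward, so the unlit blue material
// is visible from inside and serves as a simple sky dome.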
commands.spawn((
Mesh3d(meshes.add(Sphere::default())),
MeshMaterial3d(materials.add(StandardMaterial {
unlit: true,
base_color: Color::linear_rgb(0.1, 0.6, 1.0),
..default()
})),
Transform::default().with_scale(Vec3::splat(-4000.0)),
));
// Ground
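// The plane's UVs are overridden below so the debug texture tiles `uv_size` times across the
// ground instead of being stretched across it once.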
let mut plane: Mesh = Plane3d::default().into();
let uv_size = 4000.0;
let uvs = vec![[uv_size, 0.0], [0.0, 0.0], [0.0, uv_size], [uv_size; 2]];
plane.insert_attribute(Mesh::ATTRIBUTE_UV_0, uvs);
commands.spawn((
Mesh3d(meshes.add(plane)),
MeshMaterial3d(materials.add(StandardMaterial {
base_color: Color::WHITE,
perceptual_roughness: 1.0,
base_color_texture: Some(images.add(uv_debug_texture())),
..default()
})),
Transform::from_xyz(0.0, -0.65, 0.0).with_scale(Vec3::splat(80.)),
));
spawn_cars(&asset_server, &mut meshes, &mut materials, &mut commands);
spawn_trees(&mut meshes, &mut materials, &mut commands);
spawn_barriers(&mut meshes, &mut materials, &mut commands);
}
fn spawn_cars(
asset_server: &AssetServer,
meshes: &mut Assets<Mesh>,
materials: &mut Assets<StandardMaterial>,
commands: &mut Commands,
) {
const N_CARS: usize = 20;
let box_mesh = meshes.add(Cuboid::new(0.3, 0.15, 0.55));
let cylinder = meshes.add(Cylinder::default());
let logo = asset_server.load("branding/icon.png");
let wheel_matl = materials.add(StandardMaterial {
base_color: Color::WHITE,
base_color_texture: Some(logo.clone()),
..default()
});
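// Helper that registers a solid-color `StandardMaterial` and returns its handle.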
let mut matl = |color| {
materials.add(StandardMaterial {
base_color: color,
..default()
})
};
let colors = [
matl(Color::linear_rgb(1.0, 0.0, 0.0)),
matl(Color::linear_rgb(1.0, 1.0, 0.0)),
matl(Color::BLACK),
matl(Color::linear_rgb(0.0, 0.0, 1.0)),
matl(Color::linear_rgb(0.0, 1.0, 0.0)),
matl(Color::linear_rgb(1.0, 0.0, 1.0)),
matl(Color::linear_rgb(0.5, 0.5, 0.0)),
matl(Color::linear_rgb(1.0, 0.5, 0.0)),
];
for i in 0..N_CARS {
let color = colors[i % colors.len()].clone();
commands
.spawn((
Mesh3d(box_mesh.clone()),
MeshMaterial3d(color.clone()),
Transform::from_scale(Vec3::splat(0.5)),
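// `Moves` stores a per-car offset that the movement system uses to spread the cars along the track.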
Moves(i as f32 * 2.0),
))
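// Only the first car is tracked by the camera; the rest just add motion to the scene.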
.insert_if(CameraTracked, || i == 0)
.with_children(|parent| {
parent.spawn((
Mesh3d(box_mesh.clone()),
MeshMaterial3d(color),
Transform::from_xyz(0.0, 0.08, 0.03).with_scale(Vec3::new(1.0, 1.0, 0.5)),
));
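// Spawns one textured wheel at the given corner of the car body; `Rotates` marks it for the
// example's rotation system so the wheels spin as the cars move.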
let mut spawn_wheel = |x: f32, z: f32| {
parent.spawn((
Mesh3d(cylinder.clone()),
MeshMaterial3d(wheel_matl.clone()),
Transform::from_xyz(0.14 * x, -0.045, 0.15 * z)
.with_scale(Vec3::new(0.15, 0.04, 0.15))
.with_rotation(Quat::from_rotation_z(std::f32::consts::FRAC_PI_2)),
Rotates,
));
};
spawn_wheel(1.0, 1.0);
spawn_wheel(1.0, -1.0);
spawn_wheel(-1.0, 1.0);
spawn_wheel(-1.0, -1.0);
});
}
}
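/// Spawn the orange, cone-like barriers (small capsules) that line both sides of the track.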
fn spawn_barriers(
meshes: &mut Assets<Mesh>,
materials: &mut Assets<StandardMaterial>,
commands: &mut Commands,
) {
const N_CONES: usize = 100;
let capsule = meshes.add(Capsule3d::default());
let matl = materials.add(StandardMaterial {
base_color: Color::srgb_u8(255, 87, 51),
reflectance: 1.0,
..default()
});
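    // Place `N_CONES` capsules evenly around the track on each side. `race_track_pos`
    // (defined elsewhere in this example) maps a lateral offset and an angle around the
    // track to a 2D position.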
let mut spawn_with_offset = |offset: f32| {
for i in 0..N_CONES {
let pos = race_track_pos(
offset,
(i as f32) / (N_CONES as f32) * std::f32::consts::PI * 2.0,
);
commands.spawn((
Mesh3d(capsule.clone()),
MeshMaterial3d(matl.clone()),
Transform::from_xyz(pos.x, -0.65, pos.y).with_scale(Vec3::splat(0.07)),
));
}
};
spawn_with_offset(0.04);
spawn_with_offset(-0.04);
}
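/// Spawn simple trees, a green sphere of leaves on a thin capsule trunk, along both sides of the track.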
fn spawn_trees(
meshes: &mut Assets<Mesh>,
materials: &mut Assets<StandardMaterial>,
commands: &mut Commands,
) {
const N_TREES: usize = 30;
let capsule = meshes.add(Capsule3d::default());
let sphere = meshes.add(Sphere::default());
let leaves = materials.add(Color::linear_rgb(0.0, 1.0, 0.0));
let trunk = materials.add(Color::linear_rgb(0.4, 0.2, 0.2));
let mut spawn_with_offset = |offset: f32| {
for i in 0..N_TREES {
let pos = race_track_pos(
offset,
(i as f32) / (N_TREES as f32) * std::f32::consts::PI * 2.0,
);
let [x, z] = pos.into();
commands.spawn((
Mesh3d(sphere.clone()),
MeshMaterial3d(leaves.clone()),
Transform::from_xyz(x, -0.3, z).with_scale(Vec3::splat(0.3)),
));
commands.spawn((
Mesh3d(capsule.clone()),
MeshMaterial3d(trunk.clone()),
Transform::from_xyz(x, -0.5, z).with_scale(Vec3::new(0.05, 0.3, 0.05)),
));
}
};
spawn_with_offset(0.07);
spawn_with_offset(-0.07);
}
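/// Spawn the help text overlay in the top-left corner of the window.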
fn setup_ui(mut commands: Commands) {
commands
.spawn((
Text::default(),
Node {
position_type: PositionType::Absolute,
top: Val::Px(12.0),
left: Val::Px(12.0),
..default()
},
))
.with_children(|p| {
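            // The first two spans start empty; `keyboard_inputs` fills them with the
            // current shutter angle and sample count.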
p.spawn(TextSpan::default());
p.spawn(TextSpan::default());
p.spawn(TextSpan::new("1/2: -/+ shutter angle (blur amount)\n"));
p.spawn(TextSpan::new("3/4: -/+ sample count (blur quality)\n"));
p.spawn(TextSpan::new("Spacebar: cycle camera\n"));
});
}
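/// Adjust the motion blur settings and the camera mode from keyboard input, then refresh the
/// help text. `shutter_angle` is the fraction of a frame the virtual shutter stays open
/// (1.0 blurs across the whole frame time), and `samples` is the number of samples taken
/// per pixel, trading blur quality for cost.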
fn keyboard_inputs(
mut motion_blur: Single<&mut MotionBlur>,
presses: Res<ButtonInput<KeyCode>>,
text: Single<Entity, With<Text>>,
mut writer: TextUiWriter,
mut camera: ResMut<CameraMode>,
) {
if presses.just_pressed(KeyCode::Digit1) {
motion_blur.shutter_angle -= 0.25;
} else if presses.just_pressed(KeyCode::Digit2) {
motion_blur.shutter_angle += 0.25;
} else if presses.just_pressed(KeyCode::Digit3) {
motion_blur.samples = motion_blur.samples.saturating_sub(1);
} else if presses.just_pressed(KeyCode::Digit4) {
motion_blur.samples += 1;
} else if presses.just_pressed(KeyCode::Space) {
*camera = match *camera {
CameraMode::Track => CameraMode::Chase,
CameraMode::Chase => CameraMode::Track,
};
}
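    // Keep the values in a range that makes sense for this demo: a shutter angle between
    // 0.0 and 1.0, and at most 64 samples.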
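    // Clamp the parameters to sensible ranges: the shutter angle is the fraction of the
    // frame interval the virtual shutter stays open (1.0 = the full frame), and the sample
    // count is capped to keep the per-pixel cost bounded.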
motion_blur.shutter_angle = motion_blur.shutter_angle.clamp(0.0, 1.0);
motion_blur.samples = motion_blur.samples.clamp(0, 64);
let entity = *text;
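    // Update the on-screen readout: spans 1 and 2 of the UI text hold the current values.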
*writer.text(entity, 1) = format!("Shutter angle: {:.2}\n", motion_blur.shutter_angle);
    *writer.text(entity, 2) = format!("Samples: {}\n", motion_blur.samples);
}
/// Parametric function for a looping race track. The `offset` parameter shifts the returned
/// point perpendicular to the track at the given `t`.
fn race_track_pos(offset: f32, t: f32) -> Vec2 {
let x_tweak = 2.0;
let y_tweak = 3.0;
let scale = 8.0;
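    // The base curve is a Lissajous-style figure: x = sin(2t), y = cos(3t), scaled up below.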
let x0 = ops::sin(x_tweak * t);
let y0 = ops::cos(y_tweak * t);
let dx = x_tweak * ops::cos(x_tweak * t);
let dy = y_tweak * -ops::sin(y_tweak * t);
let dl = ops::hypot(dx, dy);
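    // Normalize the tangent (dx, dy) and push the point along the perpendicular
    // direction (dy, -dx) by `offset`.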
let x = x0 + offset * dy / dl;
let y = y0 - offset * dx / dl;
Vec2::new(x, y) * scale
}
fn move_cars(
time: Res<Time>,
mut movables: Query<(&mut Transform, &Moves, &Children)>,
mut spins: Query<&mut Transform, (Without<Moves>, With<Rotates>)>,
) {
for (mut transform, moves, children) in &mut movables {
let time = time.elapsed_secs() * 0.25;
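        // Phase-shift each car along the track by its `Moves` value so the cars stay spread out.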
let t = time + 0.5 * moves.0;
let dx = ops::cos(t);
let dz = -ops::sin(3.0 * t);
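        // Advance the parameter by a small amount based on a rough local tangent magnitude,
        // which varies the cars' speed around the track.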
let speed_variation = (dx * dx + dz * dz).sqrt() * 0.15;
let t = t + speed_variation;
let prev = transform.translation;
transform.translation.x = race_track_pos(0.0, t).x;
transform.translation.z = race_track_pos(0.0, t).y;
transform.translation.y = -0.59;
let delta = transform.translation - prev;
transform.look_to(delta, Vec3::Y);
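        // Spin each wheel child to match the distance the car traveled this frame.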
for child in children.iter() {
let Ok(mut wheel) = spins.get_mut(*child) else {
continue;
};
let radius = wheel.scale.x;
let circumference = 2.0 * std::f32::consts::PI * radius;
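            // revolutions = distance / circumference; multiply by 2π to convert to radians.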
let angle = delta.length() / circumference * std::f32::consts::PI * 2.0;
wheel.rotate_local_y(angle);
}
}
}
fn move_camera(
camera: Single<(&mut Transform, &mut Projection), Without<CameraTracked>>,
tracked: Single<&Transform, With<CameraTracked>>,
mode: Res<CameraMode>,
) {
let (mut transform, mut projection) = camera.into_inner();
match *mode {
CameraMode::Track => {
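            // A distant, fixed camera with a very narrow FOV tracks the car, so the tracked
            // car stays sharp while the background streaks past.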
transform.look_at(tracked.translation, Vec3::Y);
transform.translation = Vec3::new(15.0, -0.5, 0.0);
if let Projection::Perspective(perspective) = &mut *projection {
perspective.fov = 0.05;
}
}
CameraMode::Chase => {
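            // Chase camera: ride just behind and above the car, looking in its direction of
            // travel, with a wide FOV.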
transform.translation =
tracked.translation + Vec3::new(0.0, 0.15, 0.0) + tracked.back() * 0.6;
transform.look_to(tracked.forward(), Vec3::Y);
if let Projection::Perspective(perspective) = &mut *projection {
perspective.fov = 1.0;
}
}
}
}
fn uv_debug_texture() -> Image {
use bevy::render::{render_asset::RenderAssetUsages, render_resource::*, texture::*};
const TEXTURE_SIZE: usize = 7;
let mut palette = [
164, 164, 164, 255, 168, 168, 168, 255, 153, 153, 153, 255, 139, 139, 139, 255, 153, 153,
153, 255, 177, 177, 177, 255, 159, 159, 159, 255,
];
let mut texture_data = [0; TEXTURE_SIZE * TEXTURE_SIZE * 4];
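    // Fill the texture one row at a time, rotating the palette by 12 bytes (3 RGBA texels)
    // per row to produce a varied pattern.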
for y in 0..TEXTURE_SIZE {
let offset = TEXTURE_SIZE * y * 4;
texture_data[offset..(offset + TEXTURE_SIZE * 4)].copy_from_slice(&palette);
palette.rotate_right(12);
}
let mut img = Image::new_fill(
Extent3d {
width: TEXTURE_SIZE as u32,
height: TEXTURE_SIZE as u32,
depth_or_array_layers: 1,
},
TextureDimension::D2,
&texture_data,
TextureFormat::Rgba8UnormSrgb,
RenderAssetUsages::RENDER_WORLD,
);
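    // Repeat the texture (mirrored along V) with nearest-neighbor magnification so the
    // pattern stays crisp when tiled.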
img.sampler = ImageSampler::Descriptor(ImageSamplerDescriptor {
address_mode_u: ImageAddressMode::Repeat,
address_mode_v: ImageAddressMode::MirrorRepeat,
mag_filter: ImageFilterMode::Nearest,
..ImageSamplerDescriptor::linear()
});
img
}