Per-Object Motion Blur (#9924)

https://github.com/bevyengine/bevy/assets/2632925/e046205e-3317-47c3-9959-fc94c529f7e0

# Objective

- Adds per-object motion blur to the core 3d pipeline. This is a common
effect used in games and other simulations.
- Partially resolves #4710

## Solution

- This is a post-process effect that uses the depth and motion vector
buffers to estimate per-object motion blur. The implementation combines
knowledge from multiple papers and articles. The approach itself, and
the shader, are quite simple; most of the effort went into wiring up the
bevy rendering plumbing and properly specializing for HDR and MSAA.
- To work with MSAA, the MULTISAMPLED_SHADING wgpu capability is
required. I've extracted this code from #9000, because the prepass
buffers are multisampled and must be accessed with `textureLoad`, as
opposed to the widely compatible `textureSample`.
- Added an example to demonstrate the effect of motion blur parameters.
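
To make the sampling scheme concrete, here is a minimal CPU-side sketch, illustrative only: `Vec2` and `sample_uvs` are my names, not part of the PR. It models what the shader does per pixel: scale the motion vector by the shutter angle to get an exposure vector, then gather samples centered on the fragment along that vector.

```rust
#[derive(Clone, Copy, Debug, PartialEq)]
struct Vec2 {
    x: f32,
    y: f32,
}

/// Return the UVs a fragment at `uv` would sample, given its motion vector,
/// the shutter angle, and the per-direction sample count.
fn sample_uvs(uv: Vec2, motion: Vec2, shutter_angle: f32, samples: i32) -> Vec<Vec2> {
    // The exposure vector is the distance the fragment moved while the
    // shutter was open: the motion vector scaled by the shutter angle.
    let exposure = Vec2 {
        x: motion.x * shutter_angle,
        y: motion.y * shutter_angle,
    };
    (-samples..=samples)
        .map(|i| {
            // Samples are centered on the fragment, so each direction
            // covers half of the exposure vector.
            let t = 0.5 * i as f32 / samples as f32;
            Vec2 {
                x: uv.x + exposure.x * t,
                y: uv.y + exposure.y * t,
            }
        })
        .collect()
}

fn main() {
    let uvs = sample_uvs(
        Vec2 { x: 0.5, y: 0.5 },
        Vec2 { x: 0.2, y: 0.0 },
        0.5,
        1,
    );
    assert_eq!(uvs.len(), 3); // samples * 2 + 1
    assert_eq!(uvs[1], Vec2 { x: 0.5, y: 0.5 }); // center sample is the fragment
    assert!((uvs[2].x - 0.55).abs() < 1e-6); // half the exposure vector ahead
    println!("ok");
}
```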

## Future Improvements

- While this approach does have limitations, it's one of the most
commonly used, and is much better than camera motion blur, which does
not consider object velocity. For example, this implementation allows a
dolly to track an object, and that object will remain unblurred while
the background is blurred. The biggest issue with this implementation is
that blur is constrained to the boundaries of objects, which results in
hard edges. This can be addressed by dilating the object or the motion
vector buffer, or by taking a different approach such as
https://casual-effects.com/research/McGuire2012Blur/index.html
- I'm using a noise PRNG function to jitter samples. This could be
replaced with a blue noise texture lookup or similar; however, after
tuning the parameters, it gives quite nice results with 4 samples, and
is significantly better than the artifacts generated when not
jittering.
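
For reference, the jitter described above is typically interleaved gradient noise (Jimenez 2014). Here is a Rust port of that standard published formula as a sketch; the constants are the commonly cited ones and may differ slightly from the shader's exact implementation.

```rust
/// Interleaved gradient noise: a cheap, low-discrepancy per-pixel noise
/// commonly used to jitter post-process sample positions.
fn interleaved_gradient_noise(pixel_x: f32, pixel_y: f32, frame: u32) -> f32 {
    // Animate the pattern over time by offsetting the pixel coordinates
    // each frame, wrapping to preserve float precision.
    let offset = 5.588238 * (frame % 64) as f32;
    let (x, y) = (pixel_x + offset, pixel_y + offset);
    // The magic constants produce a gradient pattern with good distribution
    // between neighboring pixels. Result is in [0, 1).
    (52.9829189 * (0.06711056 * x + 0.00583715 * y).fract()).fract()
}

fn main() {
    for i in 0..4 {
        let n = interleaved_gradient_noise(100.0 + i as f32, 200.0, 0);
        assert!((0.0..1.0).contains(&n)); // always in [0, 1)
        println!("{n}");
    }
}
```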

---

## Changelog

- Added: per-object motion blur. This can be enabled and configured by
adding the `MotionBlurBundle` to a camera entity.

---------

Co-authored-by: Torstein Grindvik <52322338+torsteingrindvik@users.noreply.github.com>
Commit ade70b3925 (parent 9592a40e1e)
Aevyrie 2024-04-24 18:16:02 -07:00, committed by GitHub
11 changed files with 1046 additions and 8 deletions


@ -766,6 +766,17 @@ description = "Loads and renders a glTF file as a scene"
category = "3D Rendering"
wasm = true
[[example]]
name = "motion_blur"
path = "examples/3d/motion_blur.rs"
doc-scrape-examples = true
[package.metadata.example.motion_blur]
name = "Motion Blur"
description = "Demonstrates per-pixel motion blur"
category = "3D Rendering"
wasm = false
[[example]]
name = "tonemapping"
path = "examples/3d/tonemapping.rs"


@ -26,6 +26,7 @@ pub mod graph {
MainTransparentPass,
EndMainPass,
Taa,
MotionBlur,
Bloom,
Tonemapping,
Fxaa,


@ -15,6 +15,7 @@ pub mod core_3d;
pub mod deferred;
pub mod fullscreen_vertex_shader;
pub mod fxaa;
pub mod motion_blur;
pub mod msaa_writeback;
pub mod prepass;
mod skybox;
@ -53,6 +54,7 @@ use crate::{
deferred::copy_lighting_id::CopyDeferredLightingIdPlugin,
fullscreen_vertex_shader::FULLSCREEN_SHADER_HANDLE,
fxaa::FxaaPlugin,
motion_blur::MotionBlurPlugin,
msaa_writeback::MsaaWritebackPlugin,
prepass::{DeferredPrepass, DepthPrepass, MotionVectorPrepass, NormalPrepass},
tonemapping::TonemappingPlugin,
@ -89,6 +91,7 @@ impl Plugin for CorePipelinePlugin {
BloomPlugin,
FxaaPlugin,
CASPlugin,
MotionBlurPlugin,
));
}
}


@ -0,0 +1,168 @@
//! Per-object, per-pixel motion blur.
//!
//! Add the [`MotionBlurBundle`] to a camera to enable motion blur. See [`MotionBlur`] for more
//! documentation.
use crate::{
core_3d::graph::{Core3d, Node3d},
prepass::{DepthPrepass, MotionVectorPrepass},
};
use bevy_app::{App, Plugin};
use bevy_asset::{load_internal_asset, Handle};
use bevy_ecs::{
bundle::Bundle, component::Component, query::With, reflect::ReflectComponent,
schedule::IntoSystemConfigs,
};
use bevy_reflect::{std_traits::ReflectDefault, Reflect};
use bevy_render::{
camera::Camera,
extract_component::{ExtractComponent, ExtractComponentPlugin, UniformComponentPlugin},
render_graph::{RenderGraphApp, ViewNodeRunner},
render_resource::{Shader, ShaderType, SpecializedRenderPipelines},
Render, RenderApp, RenderSet,
};
pub mod node;
pub mod pipeline;
/// Adds [`MotionBlur`] and the required depth and motion vector prepasses to a camera entity.
#[derive(Bundle, Default)]
pub struct MotionBlurBundle {
pub motion_blur: MotionBlur,
pub depth_prepass: DepthPrepass,
pub motion_vector_prepass: MotionVectorPrepass,
}
/// A component that enables and configures motion blur when added to a camera.
///
/// Motion blur is an effect that simulates how moving objects blur as they change position during
/// the exposure of film, a sensor, or an eyeball.
///
/// Because rendering simulates discrete steps in time, we use per-pixel motion vectors to estimate
/// the path of objects between frames. This kind of implementation has some artifacts:
/// - Fast moving objects in front of a stationary object, or in front of empty space, will not
have their edges blurred.
/// - Transparent objects do not write to depth or motion vectors, so they cannot be blurred.
///
/// Other approaches, such as *A Reconstruction Filter for Plausible Motion Blur* produce more
/// correct results, but are more expensive and complex, and have other kinds of artifacts. This
/// implementation is relatively inexpensive and effective.
///
/// # Usage
///
/// Add the [`MotionBlur`] component to a camera to enable and configure motion blur for that
/// camera. Motion blur also requires the depth and motion vector prepass, which can be added more
/// easily to the camera with the [`MotionBlurBundle`].
///
/// ```
/// # use bevy_core_pipeline::{core_3d::Camera3dBundle, motion_blur::MotionBlurBundle};
/// # use bevy_ecs::prelude::*;
/// # fn test(mut commands: Commands) {
/// commands.spawn((
/// Camera3dBundle::default(),
/// MotionBlurBundle::default(),
/// ));
/// # }
/// ```
#[derive(Reflect, Component, Clone, ExtractComponent, ShaderType)]
#[reflect(Component, Default)]
#[extract_component_filter(With<Camera>)]
pub struct MotionBlur {
/// The strength of motion blur from `0.0` to `1.0`.
///
/// The shutter angle describes the fraction of a frame that a camera's shutter is open and
/// exposing the film/sensor. For 24fps cinematic film, a shutter angle of 0.5 (180 degrees) is
/// common. This means that the shutter was open for half of the frame, or 1/48th of a second.
/// The lower the shutter angle, the less exposure time and thus less blur.
///
/// A value greater than one is non-physical and results in an object's blur stretching further
/// than it traveled in that frame. This might be a desirable effect for artistic reasons, but
/// consider allowing users to opt out of this.
///
/// This value is intentionally tied to framerate to avoid the aforementioned non-physical
/// over-blurring. If you want to emulate a cinematic look, your options are:
/// - Framelimit your app to 24fps, and set the shutter angle to 0.5 (180 deg). Note that
/// depending on artistic intent or the action of a scene, it is common to set the shutter
/// angle between 0.125 (45 deg) and 0.5 (180 deg). This is the most faithful way to
/// reproduce the look of film.
/// - Set the shutter angle greater than one. For example, to emulate the blur strength of
/// film while rendering at 60fps, you would set the shutter angle to `60/24 * 0.5 = 1.25`.
/// Note that this will result in artifacts where the motion of objects will stretch further
/// than they moved between frames; users may find this distracting.
pub shutter_angle: f32,
/// The quality of motion blur, corresponding to the number of per-pixel samples taken in each
/// direction during blur.
///
/// Setting this to `1` results in each pixel being sampled once in the leading direction, once
/// in the trailing direction, and once in the middle, for a total of 3 samples (`1 * 2 + 1`).
/// Setting this to `3` will result in `3 * 2 + 1 = 7` samples. Setting this to `0` is
/// equivalent to disabling motion blur.
pub samples: u32,
#[cfg(all(feature = "webgl", target_arch = "wasm32"))]
// WebGL2 structs must be 16 byte aligned.
pub _webgl2_padding: bevy_math::Vec3,
}
impl Default for MotionBlur {
fn default() -> Self {
Self {
shutter_angle: 0.5,
samples: 1,
#[cfg(all(feature = "webgl", target_arch = "wasm32"))]
_webgl2_padding: bevy_math::Vec3::default(),
}
}
}
pub const MOTION_BLUR_SHADER_HANDLE: Handle<Shader> =
Handle::weak_from_u128(987457899187986082347921);
/// Adds support for per-object motion blur to the app. See [`MotionBlur`] for details.
pub struct MotionBlurPlugin;
impl Plugin for MotionBlurPlugin {
fn build(&self, app: &mut App) {
load_internal_asset!(
app,
MOTION_BLUR_SHADER_HANDLE,
"motion_blur.wgsl",
Shader::from_wgsl
);
app.add_plugins((
ExtractComponentPlugin::<MotionBlur>::default(),
UniformComponentPlugin::<MotionBlur>::default(),
));
let Some(render_app) = app.get_sub_app_mut(RenderApp) else {
return;
};
render_app
.init_resource::<SpecializedRenderPipelines<pipeline::MotionBlurPipeline>>()
.add_systems(
Render,
pipeline::prepare_motion_blur_pipelines.in_set(RenderSet::Prepare),
);
render_app
.add_render_graph_node::<ViewNodeRunner<node::MotionBlurNode>>(
Core3d,
Node3d::MotionBlur,
)
.add_render_graph_edges(
Core3d,
(
Node3d::EndMainPass,
Node3d::MotionBlur,
Node3d::Bloom, // we want blurred areas to bloom and tonemap properly.
),
);
}
fn finish(&self, app: &mut App) {
let Some(render_app) = app.get_sub_app_mut(RenderApp) else {
return;
};
render_app.init_resource::<pipeline::MotionBlurPipeline>();
}
}
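
The doc comments in this module pin down two small formulas: the effective per-pixel sample count (`samples * 2 + 1`) and the shutter angle that emulates 24fps film at a given framerate. A quick illustrative sketch (the helper names are mine, not part of the PR):

```rust
/// Total texture reads per pixel for a given `MotionBlur::samples` value,
/// per its doc comment: one leading set, one trailing set, plus the center.
fn total_samples(samples: u32) -> u32 {
    samples * 2 + 1
}

/// Shutter angle that emulates 24fps film with a 180-degree (0.5) shutter
/// when rendering at `fps`, per the `shutter_angle` doc comment.
fn cinematic_shutter_angle(fps: f32) -> f32 {
    fps / 24.0 * 0.5
}

fn main() {
    assert_eq!(total_samples(1), 3); // the default quality: 3 reads
    assert_eq!(total_samples(3), 7);
    // The doc comment's example: emulating film blur at 60fps.
    assert!((cinematic_shutter_angle(60.0) - 1.25).abs() < 1e-6);
    println!("ok");
}
```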


@ -0,0 +1,149 @@
#import bevy_pbr::prepass_utils
#import bevy_pbr::utils
#import bevy_core_pipeline::fullscreen_vertex_shader::FullscreenVertexOutput
#import bevy_render::globals::Globals
#ifdef MULTISAMPLED
@group(0) @binding(0) var screen_texture: texture_2d<f32>;
@group(0) @binding(1) var motion_vectors: texture_multisampled_2d<f32>;
@group(0) @binding(2) var depth: texture_depth_multisampled_2d;
#else
@group(0) @binding(0) var screen_texture: texture_2d<f32>;
@group(0) @binding(1) var motion_vectors: texture_2d<f32>;
@group(0) @binding(2) var depth: texture_depth_2d;
#endif
@group(0) @binding(3) var texture_sampler: sampler;
struct MotionBlur {
shutter_angle: f32,
samples: u32,
#ifdef SIXTEEN_BYTE_ALIGNMENT
// WebGL2 structs must be 16 byte aligned.
_webgl2_padding: vec3<f32>
#endif
}
@group(0) @binding(4) var<uniform> settings: MotionBlur;
@group(0) @binding(5) var<uniform> globals: Globals;
@fragment
fn fragment(
#ifdef MULTISAMPLED
@builtin(sample_index) sample_index: u32,
#endif
in: FullscreenVertexOutput
) -> @location(0) vec4<f32> {
let texture_size = vec2<f32>(textureDimensions(screen_texture));
let frag_coords = vec2<i32>(in.uv * texture_size);
#ifdef MULTISAMPLED
let base_color = textureLoad(screen_texture, frag_coords, i32(sample_index));
#else
let base_color = textureSample(screen_texture, texture_sampler, in.uv);
#endif
let shutter_angle = settings.shutter_angle;
#ifdef MULTISAMPLED
let this_motion_vector = textureLoad(motion_vectors, frag_coords, i32(sample_index)).rg;
#else
let this_motion_vector = textureSample(motion_vectors, texture_sampler, in.uv).rg;
#endif
#ifdef NO_DEPTH_TEXTURE_SUPPORT
let this_depth = 0.0;
let depth_supported = false;
#else
let depth_supported = true;
#ifdef MULTISAMPLED
let this_depth = textureLoad(depth, frag_coords, i32(sample_index));
#else
let this_depth = textureSample(depth, texture_sampler, in.uv);
#endif
#endif
// The exposure vector is the distance that this fragment moved while the camera shutter was
// open. This is the motion vector (total distance traveled) multiplied by the shutter angle (a
// fraction). In film, the shutter angle is commonly 0.5 or "180 degrees" (out of 360 total).
// This means that for a frame time of 20ms, the shutter is only open for 10ms.
//
// Using a shutter angle larger than 1.0 is non-physical: objects would need to move further
// than they physically travelled during a frame, which is not possible. Note: we allow values
// larger than 1.0 because it may be desired for artistic reasons.
let exposure_vector = shutter_angle * this_motion_vector;
var accumulator: vec4<f32>;
var weight_total = 0.0;
let n_samples = i32(settings.samples);
let noise = utils::interleaved_gradient_noise(vec2<f32>(frag_coords), globals.frame_count); // 0 to 1
for (var i = -n_samples; i < n_samples; i++) {
// The current sample step vector, from in.uv
let step_vector = 0.5 * exposure_vector * (f32(i) + noise) / f32(n_samples);
var sample_uv = in.uv + step_vector;
let sample_coords = vec2<i32>(sample_uv * texture_size);
#ifdef MULTISAMPLED
let sample_color = textureLoad(screen_texture, sample_coords, i32(sample_index));
#else
let sample_color = textureSample(screen_texture, texture_sampler, sample_uv);
#endif
#ifdef MULTISAMPLED
let sample_motion = textureLoad(motion_vectors, sample_coords, i32(sample_index)).rg;
#else
let sample_motion = textureSample(motion_vectors, texture_sampler, sample_uv).rg;
#endif
#ifdef NO_DEPTH_TEXTURE_SUPPORT
let sample_depth = 0.0;
#else
#ifdef MULTISAMPLED
let sample_depth = textureLoad(depth, sample_coords, i32(sample_index));
#else
let sample_depth = textureSample(depth, texture_sampler, sample_uv);
#endif
#endif
var weight = 1.0;
let is_sample_in_fg = !(depth_supported && sample_depth < this_depth && sample_depth > 0.0);
if is_sample_in_fg {
// The following weight calculation is used to eliminate ghosting artifacts that are
// common in motion-vector-based motion blur implementations. While some resources
// recommend using depth, I've found that sampling the velocity results in significantly
// better results. Unlike a depth heuristic, this is not scale dependent.
//
// The most distracting artifacts occur when a stationary foreground object is
// incorrectly sampled while blurring a moving background object, causing the stationary
// object to blur when it should be sharp ("background bleeding"). This is most obvious
// when the camera is tracking a fast moving object. The tracked object should be sharp,
// and should not bleed into the motion blurred background.
//
// To attenuate these incorrect samples, we compare the motion of the fragment being
// blurred to the UV being sampled, to answer the question "is it possible that this
// sample was occluding the fragment?"
//
// Note to future maintainers: proceed with caution when making any changes here, and
// ensure you check all occlusion/disocclusion scenarios and fullscreen camera rotation
// blur for regressions.
let frag_speed = length(step_vector);
let sample_speed = length(sample_motion) / 2.0; // Halved because the sample is centered
let cos_angle = dot(step_vector, sample_motion) / (frag_speed * sample_speed * 2.0);
let motion_similarity = clamp(abs(cos_angle), 0.0, 1.0);
if sample_speed * motion_similarity < frag_speed {
// Project the sample's motion onto the frag's motion vector. If the sample did not
// cover enough distance to reach the original frag, there is no way it could have
// influenced this frag at all, and should be discarded.
weight = 0.0;
}
}
weight_total += weight;
accumulator += weight * sample_color;
}
let has_moved_less_than_a_pixel =
dot(this_motion_vector * texture_size, this_motion_vector * texture_size) < 1.0;
// In case no samples were accepted, fall back to base color.
// We also fall back if motion is small, to not break antialiasing.
if weight_total <= 0.0 || has_moved_less_than_a_pixel {
accumulator = base_color;
weight_total = 1.0;
}
return accumulator / weight_total;
}
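
The occlusion heuristic inside the shader's sample loop can be summarized outside WGSL. The following is a hand-translated Rust approximation, for illustration only: a sample contributes only if its own motion, projected onto the fragment's step vector, could have carried it across the fragment.

```rust
/// Approximation of the shader's sample-rejection heuristic. Returns the
/// weight (0.0 or 1.0) a foreground sample would receive.
fn sample_weight(step_vector: [f32; 2], sample_motion: [f32; 2]) -> f32 {
    let len = |v: [f32; 2]| (v[0] * v[0] + v[1] * v[1]).sqrt();
    let frag_speed = len(step_vector);
    let sample_speed = len(sample_motion) / 2.0; // halved: the sample is centered
    let dot = step_vector[0] * sample_motion[0] + step_vector[1] * sample_motion[1];
    let cos_angle = dot / (frag_speed * sample_speed * 2.0);
    let motion_similarity = cos_angle.abs().clamp(0.0, 1.0);
    if sample_speed * motion_similarity < frag_speed {
        // The sample did not cover enough distance to reach the original
        // fragment, so it could not have influenced it: discard.
        0.0
    } else {
        1.0
    }
}

fn main() {
    // A nearly stationary sample cannot have occluded a fast-moving fragment.
    assert_eq!(sample_weight([0.1, 0.0], [0.001, 0.0]), 0.0);
    // A sample moving quickly along the same direction is kept.
    assert_eq!(sample_weight([0.1, 0.0], [0.4, 0.0]), 1.0);
    println!("ok");
}
```

This mirrors why a camera tracking a fast car keeps the car sharp: stationary background samples fail the projection test and are rejected from the car's blur.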


@ -0,0 +1,101 @@
use bevy_ecs::{query::QueryItem, world::World};
use bevy_render::{
extract_component::ComponentUniforms,
globals::GlobalsBuffer,
render_graph::{NodeRunError, RenderGraphContext, ViewNode},
render_resource::{
BindGroupEntries, Operations, PipelineCache, RenderPassColorAttachment,
RenderPassDescriptor,
},
renderer::RenderContext,
view::{Msaa, ViewTarget},
};
use crate::prepass::ViewPrepassTextures;
use super::{
pipeline::{MotionBlurPipeline, MotionBlurPipelineId},
MotionBlur,
};
#[derive(Default)]
pub struct MotionBlurNode;
impl ViewNode for MotionBlurNode {
type ViewQuery = (
&'static ViewTarget,
&'static MotionBlurPipelineId,
&'static ViewPrepassTextures,
&'static MotionBlur,
);
fn run(
&self,
_graph: &mut RenderGraphContext,
render_context: &mut RenderContext,
(view_target, pipeline_id, prepass_textures, settings): QueryItem<Self::ViewQuery>,
world: &World,
) -> Result<(), NodeRunError> {
if settings.samples == 0 || settings.shutter_angle <= 0.0 {
return Ok(()); // We can skip running motion blur in these cases.
}
let motion_blur_pipeline = world.resource::<MotionBlurPipeline>();
let pipeline_cache = world.resource::<PipelineCache>();
let settings_uniforms = world.resource::<ComponentUniforms<MotionBlur>>();
let Some(pipeline) = pipeline_cache.get_render_pipeline(pipeline_id.0) else {
return Ok(());
};
let Some(settings_binding) = settings_uniforms.uniforms().binding() else {
return Ok(());
};
let (Some(prepass_motion_vectors_texture), Some(prepass_depth_texture)) =
(&prepass_textures.motion_vectors, &prepass_textures.depth)
else {
return Ok(());
};
let Some(globals_uniforms) = world.resource::<GlobalsBuffer>().buffer.binding() else {
return Ok(());
};
let post_process = view_target.post_process_write();
let msaa = world.resource::<Msaa>();
let layout = if msaa.samples() == 1 {
&motion_blur_pipeline.layout
} else {
&motion_blur_pipeline.layout_msaa
};
let bind_group = render_context.render_device().create_bind_group(
Some("motion_blur_bind_group"),
layout,
&BindGroupEntries::sequential((
post_process.source,
&prepass_motion_vectors_texture.texture.default_view,
&prepass_depth_texture.texture.default_view,
&motion_blur_pipeline.sampler,
settings_binding.clone(),
globals_uniforms.clone(),
)),
);
let mut render_pass = render_context.begin_tracked_render_pass(RenderPassDescriptor {
label: Some("motion_blur_pass"),
color_attachments: &[Some(RenderPassColorAttachment {
view: post_process.destination,
resolve_target: None,
ops: Operations::default(),
})],
depth_stencil_attachment: None,
timestamp_writes: None,
occlusion_query_set: None,
});
render_pass.set_render_pipeline(pipeline);
render_pass.set_bind_group(0, &bind_group, &[]);
render_pass.draw(0..3, 0..1);
Ok(())
}
}


@ -0,0 +1,173 @@
use bevy_ecs::{
component::Component,
entity::Entity,
query::With,
system::{Commands, Query, Res, ResMut, Resource},
world::FromWorld,
};
use bevy_render::{
globals::GlobalsUniform,
render_resource::{
binding_types::{
sampler, texture_2d, texture_2d_multisampled, texture_depth_2d,
texture_depth_2d_multisampled, uniform_buffer_sized,
},
BindGroupLayout, BindGroupLayoutEntries, CachedRenderPipelineId, ColorTargetState,
ColorWrites, FragmentState, MultisampleState, PipelineCache, PrimitiveState,
RenderPipelineDescriptor, Sampler, SamplerBindingType, SamplerDescriptor, ShaderDefVal,
ShaderStages, ShaderType, SpecializedRenderPipeline, SpecializedRenderPipelines,
TextureFormat, TextureSampleType,
},
renderer::RenderDevice,
texture::BevyDefault,
view::{ExtractedView, Msaa, ViewTarget},
};
use crate::fullscreen_vertex_shader::fullscreen_shader_vertex_state;
use super::{MotionBlur, MOTION_BLUR_SHADER_HANDLE};
#[derive(Resource)]
pub struct MotionBlurPipeline {
pub(crate) sampler: Sampler,
pub(crate) layout: BindGroupLayout,
pub(crate) layout_msaa: BindGroupLayout,
}
impl MotionBlurPipeline {
pub(crate) fn new(render_device: &RenderDevice) -> Self {
let mb_layout = &BindGroupLayoutEntries::sequential(
ShaderStages::FRAGMENT,
(
// View target (read)
texture_2d(TextureSampleType::Float { filterable: true }),
// Motion Vectors
texture_2d(TextureSampleType::Float { filterable: true }),
// Depth
texture_depth_2d(),
// Linear Sampler
sampler(SamplerBindingType::Filtering),
// Motion blur settings uniform input
uniform_buffer_sized(false, Some(MotionBlur::min_size())),
// Globals uniform input
uniform_buffer_sized(false, Some(GlobalsUniform::min_size())),
),
);
let mb_layout_msaa = &BindGroupLayoutEntries::sequential(
ShaderStages::FRAGMENT,
(
// View target (read)
texture_2d(TextureSampleType::Float { filterable: true }),
// Motion Vectors
texture_2d_multisampled(TextureSampleType::Float { filterable: false }),
// Depth
texture_depth_2d_multisampled(),
// Linear Sampler
sampler(SamplerBindingType::Filtering),
// Motion blur settings uniform input
uniform_buffer_sized(false, Some(MotionBlur::min_size())),
// Globals uniform input
uniform_buffer_sized(false, Some(GlobalsUniform::min_size())),
),
);
let sampler = render_device.create_sampler(&SamplerDescriptor::default());
let layout = render_device.create_bind_group_layout("motion_blur_layout", mb_layout);
let layout_msaa =
render_device.create_bind_group_layout("motion_blur_layout_msaa", mb_layout_msaa);
Self {
sampler,
layout,
layout_msaa,
}
}
}
impl FromWorld for MotionBlurPipeline {
fn from_world(render_world: &mut bevy_ecs::world::World) -> Self {
let render_device = render_world.resource::<RenderDevice>().clone();
MotionBlurPipeline::new(&render_device)
}
}
#[derive(PartialEq, Eq, Hash, Clone, Copy)]
pub struct MotionBlurPipelineKey {
hdr: bool,
samples: u32,
}
impl SpecializedRenderPipeline for MotionBlurPipeline {
type Key = MotionBlurPipelineKey;
fn specialize(&self, key: Self::Key) -> RenderPipelineDescriptor {
let layout = match key.samples {
1 => vec![self.layout.clone()],
_ => vec![self.layout_msaa.clone()],
};
let mut shader_defs = vec![];
if key.samples > 1 {
shader_defs.push(ShaderDefVal::from("MULTISAMPLED"));
}
#[cfg(all(feature = "webgl", target_arch = "wasm32"))]
{
shader_defs.push("NO_DEPTH_TEXTURE_SUPPORT".into());
shader_defs.push("SIXTEEN_BYTE_ALIGNMENT".into());
}
RenderPipelineDescriptor {
label: Some("motion_blur_pipeline".into()),
layout,
vertex: fullscreen_shader_vertex_state(),
fragment: Some(FragmentState {
shader: MOTION_BLUR_SHADER_HANDLE,
shader_defs,
entry_point: "fragment".into(),
targets: vec![Some(ColorTargetState {
format: if key.hdr {
ViewTarget::TEXTURE_FORMAT_HDR
} else {
TextureFormat::bevy_default()
},
blend: None,
write_mask: ColorWrites::ALL,
})],
}),
primitive: PrimitiveState::default(),
depth_stencil: None,
multisample: MultisampleState::default(),
push_constant_ranges: vec![],
}
}
}
#[derive(Component)]
pub struct MotionBlurPipelineId(pub CachedRenderPipelineId);
pub(crate) fn prepare_motion_blur_pipelines(
mut commands: Commands,
pipeline_cache: Res<PipelineCache>,
mut pipelines: ResMut<SpecializedRenderPipelines<MotionBlurPipeline>>,
pipeline: Res<MotionBlurPipeline>,
msaa: Res<Msaa>,
views: Query<(Entity, &ExtractedView), With<MotionBlur>>,
) {
for (entity, view) in &views {
let pipeline_id = pipelines.specialize(
&pipeline_cache,
&pipeline,
MotionBlurPipelineKey {
hdr: view.hdr,
samples: msaa.samples(),
},
);
commands
.entity(entity)
.insert(MotionBlurPipelineId(pipeline_id));
}
}
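
The specialization above boils down to a small mapping from the view's MSAA sample count to a bind group layout and shader defines. A simplified illustrative model (function name is mine):

```rust
/// Simplified model of the pipeline specialization: views with MSAA enabled
/// get the MULTISAMPLED shader define (and, in the real code, the
/// multisampled bind group layout).
fn shader_defs(msaa_samples: u32) -> Vec<&'static str> {
    let mut defs = Vec::new();
    if msaa_samples > 1 {
        defs.push("MULTISAMPLED");
    }
    // On WebGL2 the PR additionally pushes NO_DEPTH_TEXTURE_SUPPORT and
    // SIXTEEN_BYTE_ALIGNMENT; omitted here for brevity.
    defs
}

fn main() {
    assert!(shader_defs(1).is_empty());
    assert_eq!(shader_defs(4), vec!["MULTISAMPLED"]);
    println!("ok");
}
```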


@ -69,6 +69,7 @@ impl Plugin for TemporalAntiAliasPlugin {
(
Node3d::EndMainPass,
Node3d::Taa,
Node3d::MotionBlur, // Run MB after TAA, else TAA will add motion artifacts
Node3d::Bloom,
Node3d::Tonemapping,
),


@ -1,5 +1,8 @@
use crate::{
render_resource::*,
renderer::{RenderAdapter, RenderDevice},
Extract,
};
use bevy_asset::{AssetEvent, AssetId, Assets};
use bevy_ecs::system::{Res, ResMut};
use bevy_ecs::{event::EventReader, system::Resource};
@ -186,6 +189,15 @@ impl ShaderCache {
Features::UNIFORM_BUFFER_AND_STORAGE_TEXTURE_ARRAY_NON_UNIFORM_INDEXING,
Capabilities::UNIFORM_BUFFER_AND_STORAGE_TEXTURE_ARRAY_NON_UNIFORM_INDEXING,
),
(
Features::TEXTURE_FORMAT_16BIT_NORM,
Capabilities::STORAGE_TEXTURE_16BIT_NORM_FORMATS,
),
(Features::MULTIVIEW, Capabilities::MULTIVIEW),
(
Features::SHADER_EARLY_DEPTH_TEST,
Capabilities::EARLY_DEPTH_TEST,
),
];
let features = render_device.features();
let mut capabilities = Capabilities::empty();
@ -195,12 +207,28 @@ impl ShaderCache {
}
}
const DOWNLEVEL_FLAGS_CAPABILITIES: &[(DownlevelFlags, Capabilities)] = &[
(
DownlevelFlags::CUBE_ARRAY_TEXTURES,
Capabilities::CUBE_ARRAY_TEXTURES,
),
(
DownlevelFlags::MULTISAMPLED_SHADING,
Capabilities::MULTISAMPLED_SHADING,
),
];
for (downlevel_flag, capability) in DOWNLEVEL_FLAGS_CAPABILITIES {
if render_adapter
.get_downlevel_capabilities()
.flags
.contains(*downlevel_flag)
{
capabilities |= *capability;
}
}
#[cfg(debug_assertions)]

examples/3d/motion_blur.rs (new file, 402 lines)

@ -0,0 +1,402 @@
//! Demonstrates how to enable per-object motion blur. This rendering feature can be configured per
//! camera using the [`MotionBlur`] component.
use bevy::{
core_pipeline::motion_blur::{MotionBlur, MotionBlurBundle},
prelude::*,
};
fn main() {
App::new()
.add_plugins(DefaultPlugins)
.add_systems(Startup, (setup_camera, setup_scene, setup_ui))
.add_systems(Update, (keyboard_inputs, move_cars, move_camera).chain())
.run();
}
fn setup_camera(mut commands: Commands) {
commands.spawn((
Camera3dBundle::default(),
// Add the MotionBlurBundle to a camera to enable motion blur.
// Motion blur requires the depth and motion vector prepass, which this bundle adds.
// Configure the amount and quality of motion blur per-camera using this component.
MotionBlurBundle {
motion_blur: MotionBlur {
shutter_angle: 1.0,
samples: 2,
},
..default()
},
));
}
// Everything past this point is used to build the example, but isn't required to use motion blur.
#[derive(Resource)]
enum CameraMode {
Track,
Chase,
}
#[derive(Component)]
struct Moves(f32);
#[derive(Component)]
struct CameraTracked;
#[derive(Component)]
struct Rotates;
fn setup_scene(
asset_server: Res<AssetServer>,
mut images: ResMut<Assets<Image>>,
mut commands: Commands,
mut meshes: ResMut<Assets<Mesh>>,
mut materials: ResMut<Assets<StandardMaterial>>,
) {
commands.insert_resource(AmbientLight {
color: Color::WHITE,
brightness: 300.0,
});
commands.insert_resource(CameraMode::Chase);
commands.spawn(DirectionalLightBundle {
directional_light: DirectionalLight {
illuminance: 3_000.0,
shadows_enabled: true,
..default()
},
transform: Transform::default().looking_to(Vec3::new(-1.0, -0.7, -1.0), Vec3::X),
..default()
});
// Sky
commands.spawn(PbrBundle {
mesh: meshes.add(Sphere::default()),
material: materials.add(StandardMaterial {
unlit: true,
base_color: Color::linear_rgb(0.1, 0.6, 1.0),
..default()
}),
transform: Transform::default().with_scale(Vec3::splat(-4000.0)),
..default()
});
// Ground
let mut plane: Mesh = Plane3d::default().into();
let uv_size = 4000.0;
let uvs = vec![[uv_size, 0.0], [0.0, 0.0], [0.0, uv_size], [uv_size; 2]];
plane.insert_attribute(Mesh::ATTRIBUTE_UV_0, uvs);
commands.spawn(PbrBundle {
mesh: meshes.add(plane),
material: materials.add(StandardMaterial {
base_color: Color::WHITE,
perceptual_roughness: 1.0,
base_color_texture: Some(images.add(uv_debug_texture())),
..default()
}),
transform: Transform::from_xyz(0.0, -0.65, 0.0).with_scale(Vec3::splat(80.)),
..default()
});
spawn_cars(&asset_server, &mut meshes, &mut materials, &mut commands);
spawn_trees(&mut meshes, &mut materials, &mut commands);
spawn_barriers(&mut meshes, &mut materials, &mut commands);
}
fn spawn_cars(
asset_server: &AssetServer,
meshes: &mut Assets<Mesh>,
materials: &mut Assets<StandardMaterial>,
commands: &mut Commands,
) {
const N_CARS: usize = 20;
let box_mesh = meshes.add(Cuboid::new(0.3, 0.15, 0.55));
let cylinder = meshes.add(Cylinder::default());
let logo = asset_server.load("branding/icon.png");
let wheel_matl = materials.add(StandardMaterial {
base_color: Color::WHITE,
base_color_texture: Some(logo.clone()),
..default()
});
let mut matl = |color| {
materials.add(StandardMaterial {
base_color: color,
..default()
})
};
let colors = [
matl(Color::linear_rgb(1.0, 0.0, 0.0)),
matl(Color::linear_rgb(1.0, 1.0, 0.0)),
matl(Color::BLACK),
matl(Color::linear_rgb(0.0, 0.0, 1.0)),
matl(Color::linear_rgb(0.0, 1.0, 0.0)),
matl(Color::linear_rgb(1.0, 0.0, 1.0)),
matl(Color::linear_rgb(0.5, 0.5, 0.0)),
matl(Color::linear_rgb(1.0, 0.5, 0.0)),
];
for i in 0..N_CARS {
let color = colors[i % colors.len()].clone();
let mut entity = commands.spawn((
PbrBundle {
mesh: box_mesh.clone(),
material: color.clone(),
transform: Transform::from_scale(Vec3::splat(0.5)),
..default()
},
Moves(i as f32 * 2.0),
));
if i == 0 {
entity.insert(CameraTracked);
}
entity.with_children(|parent| {
parent.spawn(PbrBundle {
mesh: box_mesh.clone(),
material: color,
transform: Transform::from_xyz(0.0, 0.08, 0.03)
.with_scale(Vec3::new(1.0, 1.0, 0.5)),
..default()
});
let mut spawn_wheel = |x: f32, z: f32| {
parent.spawn((
PbrBundle {
mesh: cylinder.clone(),
material: wheel_matl.clone(),
transform: Transform::from_xyz(0.14 * x, -0.045, 0.15 * z)
.with_scale(Vec3::new(0.15, 0.04, 0.15))
.with_rotation(Quat::from_rotation_z(std::f32::consts::FRAC_PI_2)),
..default()
},
Rotates,
));
};
spawn_wheel(1.0, 1.0);
spawn_wheel(1.0, -1.0);
spawn_wheel(-1.0, 1.0);
spawn_wheel(-1.0, -1.0);
});
}
}
fn spawn_barriers(
meshes: &mut Assets<Mesh>,
materials: &mut Assets<StandardMaterial>,
commands: &mut Commands,
) {
const N_CONES: usize = 100;
let capsule = meshes.add(Capsule3d::default());
let matl = materials.add(StandardMaterial {
base_color: Color::srgb_u8(255, 87, 51),
reflectance: 1.0,
..default()
});
let mut spawn_with_offset = |offset: f32| {
for i in 0..N_CONES {
let pos = race_track_pos(
offset,
(i as f32) / (N_CONES as f32) * std::f32::consts::PI * 2.0,
);
commands.spawn(PbrBundle {
mesh: capsule.clone(),
material: matl.clone(),
transform: Transform::from_xyz(pos.x, -0.65, pos.y).with_scale(Vec3::splat(0.07)),
..default()
});
}
};
spawn_with_offset(0.04);
spawn_with_offset(-0.04);
}
fn spawn_trees(
meshes: &mut Assets<Mesh>,
materials: &mut Assets<StandardMaterial>,
commands: &mut Commands,
) {
const N_TREES: usize = 30;
let capsule = meshes.add(Capsule3d::default());
let sphere = meshes.add(Sphere::default());
let leaves = materials.add(Color::linear_rgb(0.0, 1.0, 0.0));
let trunk = materials.add(Color::linear_rgb(0.4, 0.2, 0.2));
let mut spawn_with_offset = |offset: f32| {
for i in 0..N_TREES {
let pos = race_track_pos(
offset,
(i as f32) / (N_TREES as f32) * std::f32::consts::PI * 2.0,
);
let [x, z] = pos.into();
commands.spawn(PbrBundle {
mesh: sphere.clone(),
material: leaves.clone(),
transform: Transform::from_xyz(x, -0.3, z).with_scale(Vec3::splat(0.3)),
..default()
});
commands.spawn(PbrBundle {
mesh: capsule.clone(),
material: trunk.clone(),
transform: Transform::from_xyz(x, -0.5, z).with_scale(Vec3::new(0.05, 0.3, 0.05)),
..default()
});
}
};
spawn_with_offset(0.07);
spawn_with_offset(-0.07);
}
fn setup_ui(mut commands: Commands) {
let style = TextStyle {
font_size: 24.0,
..default()
};
commands.spawn(
TextBundle::from_sections(vec![
TextSection::new(String::new(), style.clone()),
TextSection::new(String::new(), style.clone()),
TextSection::new("1/2: -/+ shutter angle (blur amount)\n", style.clone()),
TextSection::new("3/4: -/+ sample count (blur quality)\n", style.clone()),
TextSection::new("Spacebar: cycle camera\n", style.clone()),
])
.with_style(Style {
position_type: PositionType::Absolute,
top: Val::Px(12.0),
left: Val::Px(12.0),
..default()
}),
);
}
fn keyboard_inputs(
mut settings: Query<&mut MotionBlur>,
presses: Res<ButtonInput<KeyCode>>,
mut text: Query<&mut Text>,
mut camera: ResMut<CameraMode>,
) {
let mut settings = settings.single_mut();
if presses.just_pressed(KeyCode::Digit1) {
settings.shutter_angle -= 0.25;
} else if presses.just_pressed(KeyCode::Digit2) {
settings.shutter_angle += 0.25;
} else if presses.just_pressed(KeyCode::Digit3) {
settings.samples = settings.samples.saturating_sub(1);
} else if presses.just_pressed(KeyCode::Digit4) {
settings.samples += 1;
} else if presses.just_pressed(KeyCode::Space) {
*camera = match *camera {
CameraMode::Track => CameraMode::Chase,
CameraMode::Chase => CameraMode::Track,
};
}
settings.shutter_angle = settings.shutter_angle.clamp(0.0, 1.0);
settings.samples = settings.samples.clamp(0, 64);
let mut text = text.single_mut();
text.sections[0].value = format!("Shutter angle: {:.2}\n", settings.shutter_angle);
text.sections[1].value = format!("Samples: {}\n", settings.samples);
}
/// Parametric function for a looping race track. `offset` shifts the returned point
/// perpendicular to the track centerline at parameter `t`.
fn race_track_pos(offset: f32, t: f32) -> Vec2 {
let x_tweak = 2.0;
let y_tweak = 3.0;
let scale = 8.0;
let x0 = (x_tweak * t).sin();
let y0 = (y_tweak * t).cos();
let dx = x_tweak * (x_tweak * t).cos();
let dy = y_tweak * -(y_tweak * t).sin();
let x = x0 + offset * dy / (dx.powi(2) + dy.powi(2)).sqrt();
let y = y0 - offset * dx / (dx.powi(2) + dy.powi(2)).sqrt();
Vec2::new(x, y) * scale
}
fn move_cars(
time: Res<Time>,
mut movables: Query<(&mut Transform, &Moves, &Children)>,
mut spins: Query<&mut Transform, (Without<Moves>, With<Rotates>)>,
) {
for (mut transform, moves, children) in &mut movables {
let time = time.elapsed_seconds() * 0.25;
let t = time + 0.5 * moves.0;
let dx = t.cos();
let dz = -(3.0 * t).sin();
let speed_variation = (dx * dx + dz * dz).sqrt() * 0.15;
let t = t + speed_variation;
let prev = transform.translation;
let pos = race_track_pos(0.0, t);
transform.translation.x = pos.x;
transform.translation.z = pos.y;
transform.translation.y = -0.59;
let delta = transform.translation - prev;
transform.look_to(delta, Vec3::Y);
for child in children.iter() {
let Ok(mut wheel) = spins.get_mut(*child) else {
continue;
};
let radius = wheel.scale.x;
let circumference = 2.0 * std::f32::consts::PI * radius;
let angle = delta.length() / circumference * std::f32::consts::PI * 2.0;
wheel.rotate_local_y(angle);
}
}
}
fn move_camera(
mut camera: Query<(&mut Transform, &mut Projection), Without<CameraTracked>>,
tracked: Query<&Transform, With<CameraTracked>>,
mode: Res<CameraMode>,
) {
let tracked = tracked.single();
let (mut transform, mut projection) = camera.single_mut();
match *mode {
CameraMode::Track => {
transform.look_at(tracked.translation, Vec3::Y);
transform.translation = Vec3::new(15.0, -0.5, 0.0);
if let Projection::Perspective(perspective) = &mut *projection {
perspective.fov = 0.05;
}
}
CameraMode::Chase => {
transform.translation =
tracked.translation + Vec3::new(0.0, 0.15, 0.0) + tracked.back() * 0.6;
transform.look_to(*tracked.forward(), Vec3::Y);
if let Projection::Perspective(perspective) = &mut *projection {
perspective.fov = 1.0;
}
}
}
}
fn uv_debug_texture() -> Image {
use bevy::render::{render_asset::RenderAssetUsages, render_resource::*, texture::*};
const TEXTURE_SIZE: usize = 7;
let mut palette = [
164, 164, 164, 255, 168, 168, 168, 255, 153, 153, 153, 255, 139, 139, 139, 255, 153, 153,
153, 255, 177, 177, 177, 255, 159, 159, 159, 255,
];
let mut texture_data = [0; TEXTURE_SIZE * TEXTURE_SIZE * 4];
for y in 0..TEXTURE_SIZE {
let offset = TEXTURE_SIZE * y * 4;
texture_data[offset..(offset + TEXTURE_SIZE * 4)].copy_from_slice(&palette);
palette.rotate_right(12);
}
let mut img = Image::new_fill(
Extent3d {
width: TEXTURE_SIZE as u32,
height: TEXTURE_SIZE as u32,
depth_or_array_layers: 1,
},
TextureDimension::D2,
&texture_data,
TextureFormat::Rgba8UnormSrgb,
RenderAssetUsages::RENDER_WORLD,
);
img.sampler = ImageSampler::Descriptor(ImageSamplerDescriptor {
address_mode_u: ImageAddressMode::Repeat,
address_mode_v: ImageAddressMode::MirrorRepeat,
mag_filter: ImageFilterMode::Nearest,
..ImageSamplerDescriptor::linear()
});
img
}
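The offset math in `race_track_pos` normalizes the curve's tangent `(dx, dy)` and rotates it 90 degrees, so two track lines that differ only in `offset` stay a constant distance apart (which is why the barrier and tree spawners can just call the same function with `±offset`). A standalone sketch of the same math, using plain `f32` tuples instead of Bevy's `Vec2` so it runs without the engine (`track_point` is an illustrative name, not part of the PR):

```rust
// Hypothetical standalone version of the example's `race_track_pos` math.
fn track_point(offset: f32, t: f32) -> (f32, f32) {
    let (x_tweak, y_tweak, scale) = (2.0_f32, 3.0_f32, 8.0_f32);
    let (x0, y0) = ((x_tweak * t).sin(), (y_tweak * t).cos());
    // Tangent of the centerline at `t`.
    let dx = x_tweak * (x_tweak * t).cos();
    let dy = y_tweak * -(y_tweak * t).sin();
    let len = (dx * dx + dy * dy).sqrt();
    // Offset along the unit normal (the tangent rotated -90 degrees),
    // then apply the overall track scale.
    ((x0 + offset * dy / len) * scale, (y0 - offset * dx / len) * scale)
}

fn main() {
    let (cx, cy) = track_point(0.0, 1.3);
    let (ox, oy) = track_point(0.07, 1.3);
    let dist = ((ox - cx).powi(2) + (oy - cy).powi(2)).sqrt();
    // The offset is applied before the final scale of 8.0: 0.07 * 8.0 = 0.56.
    assert!((dist - 0.56).abs() < 1e-4);
    println!("perpendicular distance: {dist}");
}
```

Because the normal is unit length, the perpendicular distance between the two lines is exactly `|offset| * scale` everywhere along the loop, independent of `t`.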


@@ -138,6 +138,7 @@ Example | Description
[Lines](../examples/3d/lines.rs) | Create a custom material to draw 3d lines
[Load glTF](../examples/3d/load_gltf.rs) | Loads and renders a glTF file as a scene
[Meshlet](../examples/3d/meshlet.rs) | Meshlet rendering for dense high-poly scenes (experimental)
[Motion Blur](../examples/3d/motion_blur.rs) | Demonstrates per-pixel motion blur
[Orthographic View](../examples/3d/orthographic.rs) | Shows how to create a 3D orthographic view (for isometric-look in games or CAD applications)
[Parallax Mapping](../examples/3d/parallax_mapping.rs) | Demonstrates use of a normal map and depth map for parallax mapping
[Parenting](../examples/3d/parenting.rs) | Demonstrates parent->child relationships and relative transformations