Computer Graphics
- Volume Rendering -
Computer Graphics - Volume Rendering - Philipp Slusallek

Overview
– Motivation
– Volume Representation
– Indirect Volume Rendering
– Volume Classification
– Direct Volume Rendering
– Applications: Bioinformatics

Image by [Chimera 08]
Image by [Salama 07]
Image by [RTVG 08]
– Measure or compute the data
– Picking desired features, cleaning, noise-reduction, re-sampling, reconstruction, classification, ...
– Map N-dimensional data to visual primitives
– Generate the image
– Enhancements (gamma correction, tone mapping)
– Computed Tomography (CT, X-ray)
– Magnetic Resonance Imaging (MRI, nuclear spin)
– Positron Emission Tomography (PET)
– Ultrasound, sonar
– Electron microscopy
– Confocal microscopy
– Cryo-EM / light tomography
– Essentially everything > 2D
– Selection of relevant aspects
– Cleaning & repairing
– Correcting incomplete, out-of-scale values
– Noise reduction and removal
– Classification
– Re-sampling (often to Cartesian grids)
– Volume reconstruction of 3D data from projections
– Interpretation of measurement values
– Mapping to geometric primitives
– Mapping to parameters (colors, absorption coefficients, ...)
– Surface extraction vs. direct volume rendering
– Single volume vs. multiple (possibly overlapping)
– Object-based vs. image-based rendering
– Representation of volume
– Colors for given samples (pixels)
– Map “weird values” to optical properties – “Project 1D data values within 3D context to 2D image plane”
– Fluid dynamics
– Heat transfer
– etc.
– Generally: “Scientific Visualization”
– CT (Computed Tomography) scanner (good contrast for dense structures, e.g. bones)
– MRI (Magnetic Resonance Imaging) (good contrast for soft tissue, e.g. heart)
– PET (Positron Emission Tomography)
– And many others (also here on campus, e.g. material science)
– 3D field of values: Essentially a 3D scalar or color texture – Sometimes higher dimensional data (e.g. vector/tensor fields)
– 3D lattice of sample points (akin to an image but in 3D)
– Generally a point cloud in space
– Point neighborhood information (topology)
– Data values at the points
– Mathematical description of values in space
– Sum of Gaussians (e.g. in quantum mechanics)
– Perlin noise (e.g. for non-homogeneous fog)
– Always convertible to a sampled representation
– Common for scanned data – May have different spacings
– Warped rectilinear grids
– Common for simulated data – E.g. tetrahedral meshes
– No topological/connection information
– Cell-centered sample values
– Node-centered sample values
– wx = (x - x0) / (x1 - x0)
– wy = (y - y0) / (y1 - y0)
– wz = (z - z0) / (z1 - z0)
– f(x, y, z) = (1 - wz)(1 - wy)(1 - wx) c000
             + (1 - wz)(1 - wy) wx c100
             + (1 - wz) wy (1 - wx) c010
             + (1 - wz) wy wx c110
             + wz (1 - wy)(1 - wx) c001
             + wz (1 - wy) wx c101
             + wz wy (1 - wx) c011
             + wz wy wx c111
– Equivalent: three successive linear interpolations
– Along X
– Along Y
– Along Z
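The successive 1D interpolations can be written out directly; a minimal Python sketch (the dict-of-corner-values layout is a convention chosen for this example, not from the slides):

```python
def lerp(a, b, w):
    """Linear interpolation between a and b with weight w in [0, 1]."""
    return (1 - w) * a + w * b

def trilinear(c, wx, wy, wz):
    """Trilinearly interpolate the 8 cell-corner values c[(i, j, k)]
    using the normalized weights wx, wy, wz defined above."""
    # Along X: 4 interpolations
    cx00 = lerp(c[0, 0, 0], c[1, 0, 0], wx)
    cx10 = lerp(c[0, 1, 0], c[1, 1, 0], wx)
    cx01 = lerp(c[0, 0, 1], c[1, 0, 1], wx)
    cx11 = lerp(c[0, 1, 1], c[1, 1, 1], wx)
    # Along Y: 2 interpolations
    cxy0 = lerp(cx00, cx10, wy)
    cxy1 = lerp(cx01, cx11, wy)
    # Along Z: 1 interpolation
    return lerp(cxy0, cxy1, wz)
```

Seven lerps replace the eight-term sum and give the same result; the weights come from the cell coordinates as in the formula above.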
– Map scalar data values to optical properties, e.g. color and opacity
– Analytical function – Discrete representation
– Physically-based mapping via optical properties of material
– Allows for realistic rendering, often intuitively interpretable by us
– User-defined mapping from data to colors (color map / transfer function)
– Mapping may have no physical interpretation
– Highlight specific features of the data
– Pre-classification: first classify data values at the sample points, then interpolate the classified optical properties
– Post-classification: first interpolate data values, then classify the interpolated values
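The order of the two operations matters; a small Python sketch (the tabulated transfer function covering integer data values 0..len(tf)-1 is an assumption for illustration):

```python
def lookup(tf, v):
    """Classify scalar v with a tabulated transfer function tf
    (one optical value per integer data value), linearly interpolated."""
    i = min(int(v), len(tf) - 2)
    w = v - i
    return (1 - w) * tf[i] + w * tf[i + 1]

def pre_classified(tf, a, b, w):
    # Pre-classification: classify the samples, then interpolate optical properties
    return (1 - w) * lookup(tf, a) + w * lookup(tf, b)

def post_classified(tf, a, b, w):
    # Post-classification: interpolate the data values, then classify the result
    return lookup(tf, (1 - w) * a + w * b)
```

With a narrow peak in the transfer function (e.g. tf = [0, 0, 1, 0]) the two orders disagree: post-classification picks up the peak at the interpolated value 2, pre-classification misses it entirely.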
– Klaus Engel & Robert Schneider, Siemens Healthineers
– Directly render the volumetric data (only) as translucent material
– Assume constant optical density σ01 (extinction coefficient)
– Transmittance: T(x0, x1) = e^(-σ01 (x1 - x0))
– Transmitted radiance: L(x0, ω) = T(x0, x1) L(x1, ω)
– Also assume (constant) volume radiance Lv(x, ω) [Watt/(sr m^3)]
– Contributed radiance: (1 - T(x0, x1)) Lv(x01, ω)
– Radiance reaching the observer:
– L(x0, ω) = (1 - T(x0, x1)) Lv(x01, ω) + T(x0, x1) L(x1, ω)
– Assume constant volumetric albedo a_v
– Assume constant ambient lighting L_a (everywhere, no shadowing)
– Leads to constant volume radiance Lv(x, ω) = L_a a_v
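Putting transmittance and contributed radiance together for one homogeneous segment; a one-line sketch in Python (names are choices for this example):

```python
import math

def composite_segment(sigma, d, L_v, L_in):
    """Emission-absorption over one homogeneous segment of length d:
    L(x0) = (1 - T) * Lv + T * L(x1), with transmittance T = e^(-sigma * d)."""
    T = math.exp(-sigma * d)
    return (1 - T) * L_v + T * L_in
```

With sigma = 0 the incoming radiance passes through unchanged; for large sigma the result saturates at the volume radiance.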
– Entry at camera, exit at intersection, or inf.
– Compute surface illumination L(x1, ω) (ignoring the volume between surface and light source)
– Compute volume transmittance 𝑈 𝑦0, 𝑦1 and attenuate surface radiance – Add contributions from volume radiance
– Simple – Efficient
– No true light contributions – No volumetric shadows
– Non-constant optical density / non-constant volume radiance
– Sample volume at discrete locations
– Assume constant density and volume radiance in each interval
– L(x0, ω) = (1 - e^(-σ01 Δx)) Lv(x01, ω) + e^(-σ01 Δx) L(x1, ω)
– L(x1, ω) = (1 - e^(-σ12 Δx)) Lv(x12, ω) + e^(-σ12 Δx) L(x2, ω)
– L(x2, ω) = ...

Expanding the recursion along the samples x0, x1, x2, x3, ...:
L(x0, ω) = Σ_{j=0..n-1} [ Π_{k=0..j-1} e^(-σ_{k,k+1} Δx) ] (1 - e^(-σ_{j,j+1} Δx)) Lv(x_{j,j+1}, ω)
         + [ Π_{k=0..n-1} e^(-σ_{k,k+1} Δx) ] L(x_n, ω)
Ray-marching loop (front-to-back compositing):
  L = 0; T = 1; t = t_entry;
  while (t < t_exit) {
      dt = min(t_step, t_exit - t);
      P = ray.origin + (t + dt/2) * ray.direction;
      b = exp(-volume.density(P) * dt);
      L += T * (1 - b) * Lv(P);
      T *= b;
      if (T < epsilon) break; // optional early termination
      t += t_step;
  }
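A runnable version of the marching loop in Python; taking density and Lv as 1D callables along the ray parameter is a simplification for this sketch (a renderer would evaluate the 3D field at the sample point P):

```python
import math

def ray_march(density, Lv, t_exit, t_step, L_bg=0.0, eps=1e-4):
    """Front-to-back emission-absorption ray marching over t in [0, t_exit]."""
    L, T, t = 0.0, 1.0, 0.0
    while t < t_exit:
        dt = min(t_step, t_exit - t)
        tm = t + dt / 2                    # sample the interval midpoint
        b = math.exp(-density(tm) * dt)    # transmittance of this interval
        L += T * (1 - b) * Lv(tm)          # radiance contributed by the interval
        T *= b                             # accumulated transmittance
        if T < eps:                        # optional early termination
            return L
        t += t_step
    return L + T * L_bg                    # background attenuated by the volume
```

For a homogeneous volume the per-interval factors are exact, so the loop reproduces the analytic result (1 - e^(-σ t_exit)) Lv + e^(-σ t_exit) L_bg.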
– Similar to surface-reflected radiance (i.e. the rendering equation)
– Use a phase function f_p(x, Δω) (e.g. isotropic: a_v / 4π) instead of BRDF × cosine
– Modulate shadow visibility by transmittance
– Modulate visibility at surfaces by transmittance
– Modulate visibility at each volume sample by transmittance
  Lv(x, ω_o)     = I(-ω) / |x - y|² · V(x, y) · T(x, y) · a_v / 4π
  L_surf(x, ω_o) = I(-ω) / |x - y|² · V(x, y) · T(x, y) · f_r(ω(x, y), x, ω_o) cos θ_i
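The volume-sample term above, written out for a point light; a sketch (function name and argument order are choices for this example):

```python
import math

def single_scatter(I, dist, V, T, albedo):
    """In-scattered radiance at a volume sample from a point light of
    intensity I at distance dist: inverse-square falloff, visibility V,
    transmittance T along the shadow ray, isotropic phase function albedo/(4 pi)."""
    return I / dist**2 * V * T * albedo / (4 * math.pi)
```

Setting V = 0 (occluded light) or T = 0 (opaque medium in between) zeroes the contribution, which is exactly the volumetric shadowing the slide describes.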
– Non-constant-optical density – Non-constant volume radiance
– Ray-marched shadow rays at surface – Ray-marched shadow rays at each volume sample!!
T(x0, x_n) = Π_{k=0..n-1} e^(-σ_{k,k+1} Δx)
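The discretized transmittance is a product of per-interval exponentials, which is equivalent to exponentiating the summed optical depth once; a sketch:

```python
import math

def transmittance_product(densities, dx):
    """T(x0, xn) = prod_k e^(-sigma_{k,k+1} * dx), one factor per interval."""
    T = 1.0
    for sigma in densities:
        T *= math.exp(-sigma * dx)
    return T

def transmittance_sum(densities, dx):
    """Equivalent form: a single exp of the negated accumulated optical depth."""
    return math.exp(-dx * sum(densities))
```

The sum form needs only one exp call and is what a ray-marched shadow ray typically accumulates.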
– Abort ray-marching when subsequent contributions are negligible
– if (T < epsilon) return L;
– Very effective in dense volumes
– Also avoids ray-marching to infinity
– 3-D DDA – Ray-marching
– Bulk integration over homogeneous regions (e.g. octree, bricks)
– Pre-compute and store maximum step size separately
– Increasing step size with decreasing accumulated transmittance
– Vertex Connection and Merging & Joint Path Sampling [Siggraph’14]