

  1. April 4-7, 2016 | Silicon Valley. MASSIVE TIME-LAPSE POINT CLOUD RENDERING WITH VR. Innfarn Yoo (OpenGL Chips and Core); Markus Schuetz (Professional Visualization)

  2. AGENDA: Introduction; Previous Work; Methods (Progressive Blue-Noise Point Cloud, High-Quality VR Point Cloud); Conclusion; Demos

  3. INTRODUCTION: Point Cloud. A set of points that represents the external surface of an object. Our dataset: Project Endeavor, the new NVIDIA building under construction, captured as time-lapse point clouds.

  4. INTRODUCTION: Point Cloud Representation. Advantages: simplicity; scalability; ease of capturing; ease of data handling. Disadvantages: visually incomplete; prone to noise; ever-increasing data size.

  5. INTRODUCTION: Our Focus. Real-time rendering of point clouds; massive scale (more than 1 TB) with instant scalability; time-lapse playback with an efficient out-of-core design; VR rendering with plausible visualization and a high-quality VR experience.

  6. INTRODUCTION: Our Contributions. A novel approach for massive time-lapse point cloud rendering; adapting Progressive Blue-Noise Point Cloud (PBNPC) resampling; a high-quality point cloud VR experience. Our method provides several important features: performance, quality, and navigation.

  7. AGENDA: Introduction; Previous Work; Methods (Progressive Blue-Noise Point Cloud, High-Quality VR Point Cloud); Conclusion; Demos

  8. PREVIOUS WORK: Plant growth analysis using time-lapse point clouds. Image source: Li et al., Analyzing Growing Plants from 4D Point Cloud Data, 2013, ACM Transactions on Graphics.

  9. PREVIOUS WORK: Hyper-lapse from video (a sequence of images). Image source: Kopf et al., First Person Hyper-lapse Video, 2014, ACM Transactions on Graphics.

  10. AGENDA: Introduction; Previous Work; Methods (Progressive Blue-Noise Point Cloud, High-Quality VR Point Cloud); Conclusion; Demos

  11. PROGRESSIVE BLUE-NOISE POINT CLOUD (PBNPC)

  12. PROBLEMS. Data size: 1.5 GB of point cloud per day; 2 years: 1 TB; current: 120 GB; GPU memory: up to 24 GB (NVIDIA Quadro M6000), so out-of-core loading is required. Color mismatch: daily weather changes, changing capture times, changing sun position, shadows. Registration: drone-captured data; capturing is not perfect; the actual site changes daily due to construction.

  13. 1. Data Size: 1.5 GB Per Day

  14. 2. Color Mismatch. July 27, 2015 vs. July 30, 2015

  15. 3. Registration Problem

  16. PIPELINE OVERVIEW. Preprocessing: input point cloud (LAS files) → registration → color correction → PBNPC creation. Real-time processing: estimating time-lapse speed → adjusting the number of points → async loading and removing → sparse buffer filling → sparse vertex buffer → rendering.

  17. INPUT FILES. The drone captures a point cloud every 2-3 days. Input file format: LAS (libLAS). Sampling density varies per day; the boundary varies per day (noise). 1.5 GB per capture × 86 captures ≈ 120 GB, reduced to 50 GB.

  18. MAJOR PROBLEMS. Handling massive-scale point cloud data: we need "immediate scalability" while preserving visual quality.

  19. BLUE-NOISE POINT CLOUD. Types of noise, depending on frequency distribution: white noise, pink noise, and blue noise. A blue-noise point cloud has a nearly Poisson-disk distribution, approximates the distribution of human retina cells, and is visually plausible. Image source: Recursive Wang Tiles for Real-Time Blue Noise, Kopf et al., 2006, ACM SIGGRAPH.

  20. PROGRESSIVE BLUE-NOISE POINT CLOUD. A Progressive Blue-Noise Point Cloud (PBNPC) lets us add or remove any number of points while preserving blue-noise characteristics, using "Recursive Wang Tiles for Real-Time Blue Noise", Kopf et al., 2006, ACM SIGGRAPH. Image source: ibid.
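The progressive property can be sketched as follows: if points are stored pre-sorted by their blue-noise rank, any prefix of the array is itself a valid blue-noise subsampling at that density, so choosing an LOD reduces to choosing a prefix length. This is a minimal illustration under that assumption, not the authors' code, and the function name is hypothetical.

```cpp
#include <algorithm>
#include <cstddef>

// Points are assumed pre-sorted by blue-noise rank, so the first N points
// are a valid blue-noise subsampling. Selecting an LOD is then a trivial
// prefix slice: no re-sampling or octree traversal is needed.
std::size_t prefixForLod(std::size_t totalPoints, float lodPercent) {
    lodPercent = std::min(1.0f, std::max(0.0f, lodPercent));
    return static_cast<std::size_t>(totalPoints * lodPercent);
}
```

Because the prefix length can change every frame at negligible cost, this is what makes the "instant scalability" mentioned above possible.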

  21. VIDEO: PBNPC

  22. REGISTRATION. We first tried Approximate Nearest Neighbors (ANN) to align the point clouds, but it had several problems: too slow and low accuracy. Instead, we render depth maps from several different camera positions, generate gradient maps from the depth maps, and run an octree-based search plus a hill-climbing algorithm.
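The hill-climbing step can be illustrated on a toy objective. In the real pipeline the score comes from comparing the rendered depth/gradient maps of two captures; here a generic error callback stands in for that, and all names, step sizes, and the halving schedule are illustrative assumptions.

```cpp
#include <array>
#include <functional>
#include <initializer_list>

// Greedy hill climbing over a 3D translation: try stepping each axis in
// both directions, keep any move that lowers the alignment error, and
// halve the step size once no neighbor improves. Terminates when the
// step falls below minStep.
std::array<double, 3> hillClimb(
    const std::function<double(const std::array<double, 3>&)>& error,
    std::array<double, 3> t, double step, double minStep)
{
    while (step > minStep) {
        bool improved = false;
        double best = error(t);
        for (int axis = 0; axis < 3; ++axis) {
            for (double d : {step, -step}) {
                std::array<double, 3> cand = t;
                cand[axis] += d;
                double e = error(cand);
                if (e < best) { best = e; t = cand; improved = true; }
            }
        }
        if (!improved) step *= 0.5;  // no neighbor improved: refine the step
    }
    return t;
}
```

An octree-based coarse search can seed `t` so the climb starts near the optimum, which matches the slide's "octree-based search + hill climbing" combination.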

  23. TIME-LAPSE COLOR CORRECTION. We are not correcting colors yet; instead, we blend between days during time-lapse playback. Blending alleviates the color-mismatch problem, but it cannot solve the shadow (sun position) and color-distribution problems.
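A minimal sketch of per-point blending between two daily captures, assuming 8-bit RGB colors; the type and function names are hypothetical, not the presented implementation.

```cpp
#include <cstdint>

struct Rgb { std::uint8_t r, g, b; };

// Blend a point's color between two adjacent daily captures: t=0 gives
// day A, t=1 gives day B. Blending smooths per-day color mismatch but
// cannot remove shadow differences caused by the sun's position.
Rgb blendDays(Rgb a, Rgb b, float t) {
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    auto lerp = [t](std::uint8_t x, std::uint8_t y) {
        return static_cast<std::uint8_t>(x + (y - x) * t + 0.5f);  // round
    };
    return { lerp(a.r, b.r), lerp(a.g, b.g), lerp(a.b, b.b) };
}
```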

  24. TIME-LAPSE COLOR CORRECTION: No Blending

  25. TIME-LAPSE COLOR CORRECTION: Blending

  26. PIPELINE OVERVIEW. Preprocessing: input point cloud (LAS files) → registration → color correction → PBNPC creation. Real-time processing: estimating time-lapse speed → adjusting the number of points → async loading and removing → sparse buffer filling → sparse vertex buffer → rendering.

  27. OPENGL 4.5, SPARSE BUFFER. ARB_sparse_buffer is a newly introduced OpenGL 4.5 extension (https://www.opengl.org/registry/specs/ARB/sparse_buffer.txt). It decouples the GPU's virtual and physical memory, similar to the ARB_sparse_texture extension. We use the sparse buffer as a stack, preparing the entire virtual-memory range per daily point cloud.

  28. OPENGL 4.5, SPARSE BUFFER. With ARB_sparse_buffer we allocate virtual memory for the entire time-lapse point cloud: glBufferStorage(target, size, data_ptr, GL_SPARSE_STORAGE_BIT_ARB); data_ptr is ignored when the sparse storage bit is set. Physical memory can then be committed with glBufferPageCommitmentARB(target, offset, size, commit), passing GL_TRUE to commit or GL_FALSE to decommit memory. This greatly eases our GPU memory management.
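One practical detail when committing pages: the ARB_sparse_buffer extension requires the committed range to be aligned to the sparse page size (SPARSE_BUFFER_PAGE_SIZE_ARB, queried at runtime; 64 KiB is a typical value). A small helper, with hypothetical naming, that expands an arbitrary byte range to the page-aligned range that must actually be committed:

```cpp
#include <cstdint>
#include <utility>

// Expand [offset, offset + size) to the smallest enclosing range whose
// start and length are multiples of the sparse page size, i.e. the range
// to pass to glBufferPageCommitmentARB. Returns {alignedOffset, alignedSize}.
std::pair<std::uint64_t, std::uint64_t>
pageAlignedRange(std::uint64_t offset, std::uint64_t size, std::uint64_t page) {
    std::uint64_t begin = (offset / page) * page;                      // round down
    std::uint64_t end   = ((offset + size + page - 1) / page) * page;  // round up
    return { begin, end - begin };
}
```

Using the buffer as a stack, as the slide describes, means commits and decommits always happen at the top, so the bookkeeping stays a single high-water mark.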

  29. ESTIMATING TIME-LAPSE SPEED. Loading and unloading are probability-based: we calculate the time-lapse direction and speed, the amount of async loading requests is based on a probability, and that probability is adjusted by the time-lapse direction and speed.
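As an illustration only, since the slides do not give the actual probability model: a prefetch probability might fall off with a day's distance ahead of the playback position and widen with playback speed. Every name and the falloff shape below are assumptions.

```cpp
#include <algorithm>
#include <cmath>

// Illustrative load probability for day 'd' while playback sits at
// 'current' and moves with signed 'speed' (days per second). Days behind
// the playback direction get probability 0; days ahead fall off linearly
// inside a prefetch window that grows with |speed|.
double loadProbability(double d, double current, double speed) {
    double ahead = (d - current) * (speed >= 0.0 ? 1.0 : -1.0);
    if (ahead < 0.0) return 0.0;                     // behind playback: skip
    double window = 1.0 + std::fabs(speed) * 2.0;    // prefetch horizon in days
    return std::max(0.0, 1.0 - ahead / window);      // linear falloff
}
```

Note the sign flip on `ahead`: reversing playback direction mirrors the prefetch window, which matches the slide's point that the probability is adjusted by both direction and speed.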

  30. ADJUSTING THE NUMBER OF POINTS. For real-time rendering (normal > 60 Hz, VR > 90 Hz), the progressive blue-noise point cloud gives instant level of detail (LOD): we simply adjust the LOD percentage until the target FPS is reached.
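The "adjust until the target FPS is reached" loop can be sketched as simple proportional feedback; the gain and clamping values below are assumptions, not the presented implementation.

```cpp
// Nudge the PBNPC prefix percentage toward the point budget that
// sustains the target frame rate. With a progressive cloud the change
// takes effect instantly: no octree rebuild or re-sampling is needed.
float adjustLod(float lodPercent, float measuredFps, float targetFps) {
    const float gain = 0.1f;                            // damping factor
    float err = (measuredFps - targetFps) / targetFps;  // > 0 means headroom
    lodPercent *= 1.0f + gain * err;
    if (lodPercent < 0.01f) lodPercent = 0.01f;         // keep something visible
    if (lodPercent > 1.0f)  lodPercent = 1.0f;
    return lodPercent;
}
```

Called once per frame, this converges on a percentage where the measured frame rate hovers at the target (60 Hz on a monitor, 90 Hz in the HMD).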

  31. HIGH-QUALITY VR POINT CLOUD

  32. VR POINT CLOUDS: Performance, Quality, User Interaction

  33. VR PERFORMANCE. Each point cloud has at least 30 million points; two point clouds are rendered during time-slice transitions; everything is rendered twice (once for each eye) at 90 frames per second, plus some more to account for additional VR overhead.
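The per-second point budget implied by these numbers can be worked out directly:

```cpp
#include <cstdint>

// Back-of-the-envelope VR throughput: points per cloud x clouds rendered
// during a transition x eyes x frames per second = points the GPU must
// process each second, before any extra VR compositor overhead.
std::uint64_t pointsPerSecond(std::uint64_t pointsPerCloud,
                              unsigned clouds, unsigned eyes, unsigned fps) {
    return pointsPerCloud * clouds * eyes * fps;
}
```

With 30 million points per cloud, 2 clouds, 2 eyes, and 90 Hz this comes to 10.8 billion points per second, which is why the out-of-core LOD structures on the following slides are necessary.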

  34. VR PERFORMANCE. Out-of-core data structures are necessary. A multi-resolution octree is used for the point cloud VR demo ("Interactions with Gigantic Point Clouds", Claus Scheiblauer); only the visible parts are loaded and rendered, up to the desired level of detail. Source: "Potree: Rendering Large Point Clouds in Web Browsers", Markus Schuetz.

  35. VR PERFORMANCE. High level of detail near the camera: only ~3 million points out of billions are rendered; 1.3 billion points run at 500-700 FPS on a Quadro M6000 24 GB. Source: "Potree: Rendering Large Point Clouds in Web Browsers", Markus Schuetz.

  36. VR POINT CLOUDS: Quality. Strong aliasing is inherent to point cloud rendering: surfaces are made up of overlapping points that occlude each other, and the point closest to the camera wins. Aliasing is more noticeable in VR due to constant motion and low resolution, and is perceived as "sparkling".

  37. SOURCES OF ALIASING. Occlusions: surface patches are made up of overlapping points that fight for visibility. Level of detail: the multi-resolution octree is built considering only point coordinates, like nearest-neighbor filtering. Silhouettes: model silhouettes and point-sprite silhouettes. Source: "Potree: Rendering Large Point Clouds in Web Browsers", Markus Schuetz.

  38. SOURCES OF ALIASING: Occlusions. Surface patches are made up of overlapping points that constantly fight for visibility. Instead of rendering only the closest fragment, fragments are blended together ("High-Quality Surface Splatting on Today's GPUs", Botsch et al., 2005), using screen-aligned circles instead of oriented splats. Source: "Potree: Rendering Large Point Clouds in Web Browsers", Markus Schuetz.
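The blending idea can be sketched as a weighted average of the fragments that survive a depth test with a small tolerance; the generic weights here are a placeholder for the splat-falloff weights used in Botsch et al., and the names are hypothetical.

```cpp
#include <array>
#include <cstddef>
#include <vector>

// Normalize-pass of surface splatting: fragments near the closest depth
// are accumulated with per-fragment weights, then divided by the weight
// sum, so overlapping points blend instead of flickering for visibility.
std::array<float, 3> blendFragments(
    const std::vector<std::array<float, 3>>& colors,
    const std::vector<float>& weights)
{
    std::array<float, 3> sum = {0.0f, 0.0f, 0.0f};
    float wsum = 0.0f;
    for (std::size_t i = 0; i < colors.size(); ++i) {
        for (int c = 0; c < 3; ++c) sum[c] += colors[i][c] * weights[i];
        wsum += weights[i];
    }
    if (wsum > 0.0f)
        for (int c = 0; c < 3; ++c) sum[c] /= wsum;  // normalize
    return sum;
}
```

On the GPU the accumulation runs in a fragment shader with additive blending and the division in a final full-screen normalization pass.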

  39. SOURCES OF ALIASING: Level of Detail. Additionally, interpolated colors are stored in lower levels of detail, like mip-mapping for point clouds. Interpolated colors partially reduce occlusion aliasing.
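A sketch of the mip-map-style filtering, assuming a coarser-LOD point simply stores the mean color of the finer points it replaces; the type and function names are hypothetical.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

struct Color { std::uint8_t r, g, b; };

// Mip-map analogue for point clouds: a coarser-LOD point stores the mean
// color of the finer points it stands in for, so distant geometry shows
// filtered colors instead of one arbitrarily surviving point's color.
Color averageColor(const std::vector<Color>& pts) {
    if (pts.empty()) return {0, 0, 0};
    unsigned long r = 0, g = 0, b = 0;
    for (const Color& c : pts) { r += c.r; g += c.g; b += c.b; }
    std::size_t n = pts.size();
    return { static_cast<std::uint8_t>(r / n),
             static_cast<std::uint8_t>(g / n),
             static_cast<std::uint8_t>(b / n) };
}
```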

  40. SOURCES OF ALIASING: Silhouettes. A large number of silhouettes arises from holes in incompletely or sparsely captured 3D data and from noise; in addition, each point sprite has its own small silhouette, and the smallest HMD movements and noise constantly change the silhouettes. We use MSAA to reduce these aliasing artifacts; MSAA partially reduces occlusion aliasing, too.
