Optimizing Motion-to-Photon Latency on DAQRI Smart Glasses


  1. Optimizing Motion-to-Photon Latency on DAQRI Smart Glasses. Heinrich Fink (heinrich.fink@daqri.com), XDC 2018.

  2. System Overview
     • Hardware platform: Intel SKL (Core m7-6Y75) with Gen9 graphics; Himax DisplayPort controller + LCOS display
     • Software: Linux 4.14.x
     • Pipeline: Sensors → Camera + Tracking (Visual Inertial Odometry, VIO) → Rendering (User App) → Display
     • Data flow: motion samples → pose → frame → photons
     • Ideal motion-to-photon latency: AR < 5 ms [1,3], VR < 20 ms [2]
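
Not on the slide: a minimal sketch of how motion-to-photon latency can be measured against these targets, assuming each pipeline stage stamps its output with a monotonic clock. The struct and field names are illustrative, not DAQRI's actual instrumentation.

```c
#include <stdint.h>
#include <stdio.h>

/* Illustrative timestamps (ns, CLOCK_MONOTONIC) carried through the pipeline.
 * Stage names follow the diagram: motion sample -> pose -> frame -> photons. */
struct m2p_trace {
    uint64_t t_motion_sample; /* sensor sample used for the pose        */
    uint64_t t_pose;          /* VIO pose estimate available            */
    uint64_t t_frame;         /* application finished rendering         */
    uint64_t t_photons;       /* display actually emitted the frame     */
};

/* Motion-to-photon latency is the span from sensor sample to emitted light. */
static double m2p_latency_ms(const struct m2p_trace *t)
{
    return (double)(t->t_photons - t->t_motion_sample) / 1e6;
}

int main(void)
{
    /* Hypothetical numbers, roughly matching the "basic mode" demo below. */
    struct m2p_trace t = { 0, 4000000, 15000000, 26000000 };
    printf("AR target < 5 ms, VR target < 20 ms, measured: %.1f ms\n",
           m2p_latency_ms(&t));
    return 0;
}
```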

  3. Demo :: Basic Mode
     • Timeline: T to T + 33.3 ms in 11.1 ms (90 Hz) steps
     • VIO on the host (30 / 800 Hz) produces the render-pose
     • Application (GPU, 45 - 90 Hz) renders the frame
     • Himax display controller (90 Hz) scans out & deinterleaves to the LCOS display (540 Hz)
     • Motion-to-photon latency: ~26 ms, different for each color → rainbow effect
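
A rough back-of-the-envelope on why per-color latency differences read as a rainbow effect on a field-sequential LCOS: at the 540 Hz field rate from the slide, successive color fields are ~1.85 ms apart and therefore sample visibly different head poses during motion. The head rotation rate below is an assumed example value, not from the talk.

```c
#include <stdio.h>

int main(void)
{
    /* Field-sequential LCOS: each color field is displayed at a different time. */
    const double field_rate_hz   = 540.0;                   /* from the slide   */
    const double field_period_ms = 1000.0 / field_rate_hz;  /* ~1.85 ms         */

    /* Assumed head rotation rate during a quick glance (illustrative). */
    const double head_rate_deg_per_s = 100.0;

    /* Angular separation between consecutive color fields when no per-field
     * correction is applied -> perceived color fringing ("rainbow effect"). */
    double separation_deg = head_rate_deg_per_s * field_period_ms / 1000.0;

    printf("field period: %.2f ms, color separation: %.2f deg per field\n",
           field_period_ms, separation_deg);
    return 0;
}
```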

  4. Demo :: Optimized Mode
     • Timeline: T to T + 33.3 ms in 11.1 ms (90 Hz) steps
     • VIO on the host (30 / 800 Hz) produces the render-pose
     • Application (GPU, 45 - 90 Hz) renders the frame
     • Warp-poses → Compositor (GPU, 90 Hz) warps the rendered frame
     • Shift-pose → Himax 2D-shift; display controller (90 Hz) scans out & deinterleaves to the LCOS display (540 Hz)
     • Motion-to-photon latency: ~8 ms
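
A sketch of the idea behind the final 2D-shift stage, assuming a small-angle model: the yaw/pitch delta between the pose a frame was rendered with and the freshest shift-pose is turned into a horizontal/vertical pixel offset that a display controller could apply during scan-out. The function, the pose representation, and the focal-length parameter are illustrative; the real Himax interface is not described on the slide.

```c
#include <math.h>
#include <stdio.h>

/* Minimal orientation-only pose for the sketch (radians). */
struct pose {
    double yaw;
    double pitch;
};

struct shift_px {
    int x;
    int y;
};

/* Small-angle late correction: map the rotation delta between the render-pose
 * and the latest shift-pose to a 2D pixel shift. focal_px is the display's
 * focal length in pixels (illustrative value, not from the talk). */
static struct shift_px late_2d_shift(struct pose render_pose,
                                     struct pose shift_pose,
                                     double focal_px)
{
    struct shift_px s;
    s.x = (int)lround(tan(shift_pose.yaw   - render_pose.yaw)   * focal_px);
    s.y = (int)lround(tan(shift_pose.pitch - render_pose.pitch) * focal_px);
    return s;
}

int main(void)
{
    struct pose render_pose = { 0.000, 0.000 };
    struct pose shift_pose  = { 0.004, -0.001 }; /* ~0.23 deg of yaw drift */
    struct shift_px s = late_2d_shift(render_pose, shift_pose, 1200.0);
    printf("shift: %d px horizontal, %d px vertical\n", s.x, s.y);
    return 0;
}
```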

  5. Challenges of AR Compositor Architecture
     • Custom interface between app and compositor (see the sketch after this slide)
        • Need to attach a pose / time to the app's render buffer (for downstream stages)
        • Let the compositor pace the app's render cycles
        • Decouple render rate from display rate
        • No prevalent standard exists for this (yet?)
     • Keep the end-to-end pipeline short
        • No triple buffering, no intermediate buffers
        • Update poses as late as possible without stalling the full pipeline
     • Compositor and app compete for GPU resources
        • Pre-emption is likely needed
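
Since no standard interface exists yet (as noted above), the following is a purely hypothetical sketch of what attaching a pose and target display time to an application render buffer, and letting the compositor pace the application, might look like. All names are invented for illustration and are not DAQRI's API.

```c
#include <stdint.h>

/* Hypothetical submission record: the app's render buffer travels together
 * with the pose it was rendered for and the display time it targets, so the
 * compositor and later stages can compute the correct late warp. */
struct frame_submission {
    int      dmabuf_fd;          /* exported GPU buffer (dma-buf)             */
    int      acquire_fence_fd;   /* signaled when rendering has finished      */
    double   render_pose[7];     /* position (3) + orientation quaternion (4) */
    uint64_t target_display_ns;  /* predicted photon time for this frame      */
};

/* Hypothetical pacing callback: the compositor wakes the app once per display
 * refresh it wants a new frame for, handing it the predicted pose and display
 * time to render against; the app answers with a submission. */
typedef struct frame_submission
    (*render_frame_cb)(const double predicted_pose[7],
                       uint64_t target_display_ns,
                       void *user_data);
```

Keeping the pose and timestamp attached to the buffer is what lets the compositor decouple render rate from display rate without losing track of which head pose each frame corresponds to.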

  6. The Path Ahead
     • Remove the desktop render path
        • Use KMS directly to flip and to access timing info (see the sketch after this slide)
     • Use DRM format modifiers for optimal end-to-end buffer formats
        • Can be imported into EGL / Vulkan by applications
     • Use dma_fence for downstream/upstream synchronization and traceability
     • Use KMS-exposed hardware planes for simple compositing
        • Although varying timing requirements might be tricky
     • Observability through standard tools (GPUView, GPUtop, …?)
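
As a rough sketch of the "use KMS directly" item, assuming a DRM device, CRTC, plane, and framebuffer have already been set up and the plane's FB_ID property ID has been looked up elsewhere: an atomic non-blocking commit requests the flip, and the page-flip event delivers the vblank timestamp that can anchor motion-to-photon measurements. Error handling and property discovery are omitted; this is not the code shown in the talk.

```c
#include <stdint.h>
#include <stdio.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/* Called by drmHandleEvent() when the flip completes; tv_sec/tv_usec is the
 * kernel timestamp of the vblank that made the new buffer visible. */
static void on_page_flip(int fd, unsigned int sequence,
                         unsigned int tv_sec, unsigned int tv_usec,
                         unsigned int crtc_id, void *user_data)
{
    (void)fd; (void)crtc_id; (void)user_data;
    printf("flip %u displayed at %u.%06u\n", sequence, tv_sec, tv_usec);
}

/* Assumes drm_fd, plane_id, fb_id and the plane's FB_ID property id were
 * obtained elsewhere (drmModeGetPlaneResources / drmModeObjectGetProperties). */
static int flip_and_wait(int drm_fd, uint32_t plane_id,
                         uint32_t fb_id_prop, uint32_t fb_id)
{
    drmModeAtomicReqPtr req = drmModeAtomicAlloc();
    drmModeAtomicAddProperty(req, plane_id, fb_id_prop, fb_id);

    int ret = drmModeAtomicCommit(drm_fd, req,
                                  DRM_MODE_ATOMIC_NONBLOCK |
                                  DRM_MODE_PAGE_FLIP_EVENT, NULL);
    drmModeAtomicFree(req);
    if (ret)
        return ret;

    /* Block until the flip event arrives and dispatch it to on_page_flip(). */
    drmEventContext ev = {
        .version = 3,                 /* >= 3 is needed for page_flip_handler2 */
        .page_flip_handler2 = on_page_flip,
    };
    return drmHandleEvent(drm_fd, &ev);
}
```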

  7. References
     [1] Bailey, R. E., Arthur, J. J., & Williams, S. P. (2004, August). Latency requirements for head-worn display S/EVS applications. In Enhanced and Synthetic Vision 2004 (Vol. 5424, pp. 98-110). International Society for Optics and Photonics.
     [2] Yao, R., Heath, T., Davies, A., Forsyth, T., Mitchell, N., & Hoberman, P. (2014). Oculus VR best practices guide. Oculus VR, 4. http://static.oculusvr.com/sdk-downloads/documents/OculusBestPractices.pdf
     [3] Lincoln, P. C. (2017). Low latency displays for augmented reality (Doctoral dissertation, The University of North Carolina at Chapel Hill).
     [4] Presentation about [3]: https://www.microsoft.com/en-us/research/video/low-latency-displays-augmented-reality/
     [5] Wagner, D. (2018). Motion to photon latency in mobile AR and VR. https://medium.com/@DAQRI/motion-to-photon-latency-in-mobile-ar-and-vr-99f82c480926
