

SLIDE 1

SLIDE 2

Designing a Self-Calibrating Pipeline for Projection Mapping Application

Kevin Wright Kevin Moule

SLIDE 3

Who we are

  • Kevin Wright
  • Director of the Application Software group at Christie, responsible for developing video control and display calibration software
  • Kevin Moule
  • Focused on developing machine-vision-based systems for automatically warping and blending multi-projector displays

SLIDE 4

Christie at a Glance

SLIDE 5

Christie at a Glance


  • Cinema Projection
  • Business Displays
  • Visualization
  • Simulation

slide-6
SLIDE 6

Projection Mapping

Projection mapping, also known as video mapping, spatial augmented reality, or shader lamps, is a projection technique used to turn objects, often irregularly shaped, into a display surface for video projection.

SLIDE 7

Projection Mapping Today - Staged Events

SLIDE 8

The Potential for Projection Mapping

SLIDE 9

Christie’s C7 Automotive Kiosk

SLIDE 10

Manual Projection Mapping Workflow

SLIDE 11

How can we bring Projection Mapping to the Masses?


  • Integration Barriers
    • Design constraints
    • Integration expertise
  • Maintenance requirements
    • Alignment
    • Changing models
SLIDE 12

Automatic Projection Mapping Workflow

SLIDE 13

Christie’s Digital Sandbox

SLIDE 14

Automatic Projection Mapping

  • Starting with a known 3D object
  • Using structured light from the projectors plus a number of machine vision cameras
  • Derive all the parameters of the 3D physical world
    • Projector/camera/model pose
    • Projector lens parameters
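The deck does not spell out the math, but deriving pose and lens parameters from structured light comes down to solving for each device's projection matrix from 2D/3D correspondences. A minimal sketch using the Direct Linear Transform (illustrative only, not Christie's actual solver):

```python
import numpy as np

def dlt_projection_matrix(pts3d, pts2d):
    """Estimate a 3x4 projection matrix P (up to scale) from >= 6
    correspondences between known 3-D points and observed 2-D pixels,
    via the Direct Linear Transform."""
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The null-space vector of A (last right-singular vector) is P, flattened
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

# Synthetic round trip: project known points through a ground-truth matrix,
# then recover that matrix from the correspondences alone
P_true = np.array([[800.0, 0.0, 320.0, 10.0],
                   [0.0, 800.0, 240.0, 20.0],
                   [0.0, 0.0, 1.0, 2.0]])
rng = np.random.default_rng(0)
pts3d = rng.uniform(-1.0, 1.0, (12, 3))
h = np.hstack([pts3d, np.ones((12, 1))]) @ P_true.T
pts2d = h[:, :2] / h[:, 2:]
P_est = dlt_projection_matrix(pts3d, pts2d)
P_est *= P_true[2, 3] / P_est[2, 3]  # remove the scale ambiguity
```

In practice the recovered matrix is decomposed into intrinsics (lens) and extrinsics (pose), and refined with nonlinear optimization over all cameras and projectors at once.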

SLIDE 15

Automatic Projection Mapping

  • Replicate the physical world in the virtual world
  • Render images from projector viewpoints
  • Content creation happens in the virtual world; the system maps content to projectors
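"Rendering from the projector viewpoint" means pushing virtual-world geometry through each projector's recovered camera model. A pinhole-projection sketch (the specific numbers are assumptions, not from the deck):

```python
import numpy as np

def project(K, R, t, pts):
    """Map world-space points into a device's pixel grid: x ~ K (R X + t)."""
    cam = pts @ R.T + t             # world -> device coordinates
    pix = cam @ K.T                 # apply lens intrinsics
    return pix[:, :2] / pix[:, 2:]  # perspective divide

# Assumed intrinsics: 1200 px focal length, principal point at image centre
K = np.array([[1200.0, 0.0, 960.0],
              [0.0, 1200.0, 540.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                   # projector axes aligned with the world
t = np.array([0.0, 0.0, 3.0])   # model 3 m in front of the lens
uv = project(K, R, t, np.array([[0.0, 0.0, 0.0]]))  # model origin
# the origin lands on the principal point: (960, 540)
```

Rendering the full virtual model this way for each projector yields exactly the per-projector images the physical setup needs.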

SLIDE 16

Automatic Projection Mapping

  • Without the proper geometry, the images do not line up
  • The content looks correct only through the combination of geometry and projection

SLIDE 17

Automatic Projection Mapping

  • Projector locations need to be accurately evaluated
  • Being off by 1 cm leads to a significant misalignment
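A back-of-envelope check makes the 1 cm claim concrete (the focal length and throw distance below are assumed for illustration, not taken from the deck):

```python
# At throw distance d with focal length f (in pixels), one projector pixel
# covers roughly d / f metres on the surface.
focal_px = 1200.0                        # assumed focal length, in pixels
throw_m = 3.0                            # assumed throw distance
pixel_on_surface_m = throw_m / focal_px  # ~2.5 mm of surface per pixel
error_px = 0.01 / pixel_on_surface_m     # a 1 cm pose error, in pixels
# -> 4 px of misalignment, easily visible along edges and fine texture
```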

SLIDE 18

Automatic Projection Mapping

  • Content creation happens in the virtual world; the system maps content to projectors
  • The content looks correct only through the combination of geometry and projection

SLIDE 19

Issues With 3D Workflow – Viewport Limitations

Not all visual pipelines provide a full six degrees of freedom or the lensing parameters needed to properly correlate the viewport with the projector location. Pre-rendered content also forces a fixed viewport.
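One way to see the limitation: a lens-shifted projector requires an asymmetric (off-axis) viewing frustum, which a single symmetric field-of-view parameter cannot express. A sketch of the standard glFrustum-style matrix:

```python
import numpy as np

def off_axis_frustum(l, r, b, t, n, f):
    """OpenGL-style asymmetric projection matrix (same maths as glFrustum).
    Projector lens shift makes the frustum off-axis (l != -r or b != -t)."""
    return np.array([
        [2 * n / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * n / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(f + n) / (f - n), -2 * f * n / (f - n)],
        [0.0, 0.0, -1.0, 0.0]])

# 100% vertical lens shift (common on installation projectors): the frustum
# spans [0, 0.6] vertically at the near plane instead of [-0.3, 0.3]
M = off_axis_frustum(-0.5, 0.5, 0.0, 0.6, 1.0, 100.0)
```

A pipeline that only exposes a symmetric vertical FOV has no way to produce the non-zero off-axis terms in the third column of this matrix.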

SLIDE 20

Issues With 3D Workflow – Perspective Correction

When creating the illusion of depth (as when looking through the car window), the viewport must be drawn from the perspective of the viewer rather than the projector.

SLIDE 21

Issues With 3D Workflow – Perspective Correction

Something as simple as specular highlights falls into this category. A single light source may appear as many on a shiny object when using multiple projector viewpoints.

SLIDE 22

Issues With 3D Workflow – Perspective Correction


SLIDE 23

Issues With 3D Workflow – Projecting Depth


Geometry required for good projection can often differ from the actual geometry. Recreating the illusion of the actual geometry requires an eye point that matches the viewer's position.

SLIDE 24

Improvements to Workflow - Warping

Image warping may be applied to correct for the difference between eye-point and projector viewports.
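In code terms, a warp is just a per-pixel lookup table that maps each output (projector) pixel back into the source image. A minimal nearest-neighbour sketch (a hypothetical example, not the vendor warping engine):

```python
import numpy as np

def warp_image(src, map_x, map_y):
    """Resample src so that out[y, x] = src[map_y[y, x], map_x[y, x]].
    Nearest-neighbour for brevity; a real warper interpolates bilinearly."""
    xs = np.clip(np.rint(map_x).astype(int), 0, src.shape[1] - 1)
    ys = np.clip(np.rint(map_y).astype(int), 0, src.shape[0] - 1)
    return src[ys, xs]

# Toy warp: shift everything two pixels to the right
src = np.arange(16, dtype=float).reshape(4, 4)
yy, xx = np.mgrid[0:4, 0:4]
out = warp_image(src, xx - 2, yy)  # output (y, x) samples src (y, x - 2)
```

Correcting the eye-point/projector mismatch amounts to computing `map_x`/`map_y` from the calibrated geometry instead of a constant shift.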

SLIDE 25

Improvements to Workflow - Warping


SLIDE 26

Improvements to Workflow - Warping

  • NVIDIA Warp + Intensity API enables the warping and blending
  • The API is a low-level warping engine that needs additional software support
  • Ideal for us because we can layer our existing projector-based warping tools (Twist/AutoCal) on top
  • Ideal for the application because the warping support is baked into the driver; no need for software changes

SLIDE 27

Improvements to Workflow – Rendering Alternatives

A two-pass rendering process can capture an eye-point-correct image for use from the perspective of the projector. Start with an image rendered from the desired view.

SLIDE 28

Improvements to Workflow – Rendering Alternatives

Naively generating images from the projector locations leads to incorrect and inconsistent results.

SLIDE 29

Improvements to Workflow – Rendering Alternatives

Use the desired viewpoint image as a texture when generating images from the projectors.
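Tying the two passes together: pass 1 renders the scene from the desired (viewer) eye point; in pass 2, the surface point behind each projector-view fragment is re-projected into that pass-1 image to fetch its colour. The coordinate math, under an assumed viewer camera matrix:

```python
import numpy as np

def viewer_tex_coords(P_view, surface_pts):
    """Second pass of the two-pass approach: re-project surface points
    into the pass-1 (viewer) render to get texture-lookup coordinates."""
    h = np.hstack([surface_pts, np.ones((len(surface_pts), 1))]) @ P_view.T
    return h[:, :2] / h[:, 2:]  # pixel coordinates in the viewer image

# Assumed viewer camera: 500 px focal length, 640x480 image, at the origin
P_view = np.array([[500.0, 0.0, 320.0, 0.0],
                   [0.0, 500.0, 240.0, 0.0],
                   [0.0, 0.0, 1.0, 0.0]])
pt = np.array([[0.5, 0.0, 2.0]])    # a surface point the projector covers
uv = viewer_tex_coords(P_view, pt)  # -> (445, 240) in the viewer image
```

Because every projector samples the same viewer-perspective image, view-dependent effects such as specular highlights stay consistent across projectors.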

SLIDE 30

Future work

  • Formalizing workflow, enabling broader use
  • Improve automation of warping and blending
  • Investigate other rendering techniques and platforms
  • Increase creative control

SLIDE 31