

SLIDE 1

Cameras

Computer Vision

Prof. Flávio Cardeal – DECOM / CEFET-MG

cardeal@decom.cefetmg.br

SLIDE 2

Abstract

  • This lecture discusses features of cameras that help you to decide which camera should be used in your research or application.

SLIDE 3

Properties of Digital Cameras

  • For modeling the projective mapping of the 3D world into images, we need to understand the geometry and photometry of the cameras used.
  • So, now we are going to discuss some basic models for a single camera or a stereo-camera system, to be eventually used in our applications.

SLIDE 4

Properties of Digital Cameras

  • A digital camera uses one or several matrix sensors for recording a projected image.
  • A sensor matrix is an $N_{cols} \times N_{rows}$ array of sensor elements, named phototransistors, which capture photons and convert them to electrons.
  • Currently, they are produced either in CCD or CMOS technology.

SLIDE 5

CCD vs. CMOS Sensors

  • CCD is short for Charge-Coupled Device and CMOS is short for Complementary Metal-Oxide-Semiconductor. See examples below.

CCD sensor and CMOS sensor. The individual cells are so tiny that they cannot be seen here, even after zooming in.

SLIDE 6

CCD vs. CMOS Sensors

  • In the past, CCDs have been considered superior to CMOS because of their quality.
  • CCDs have traditionally offered higher dynamic range and higher resolution.
  • However: their frame rates are slower; they require more power dissipation; and they cost more to manufacture.

A scene demanding high dynamic range.

SLIDE 7

CCD vs. CMOS Sensors

  • Recently, CMOS sensors have shown significant improvements in quality.
  • In fact, CMOS sensor resolutions and data quality are approaching those of CCDs.
  • In addition, they have higher speed, lower power requirements, and higher integration potential.

SLIDE 8

Properties of Digital Cameras

  • Computer vision benefits from the use of high-quality digital cameras.
  • Important properties are, for example:
  • Color accuracy;
  • Reduced lens distortion;
  • Aspect ratio;
  • High spatial image resolution;
  • Large bit depth;
  • High dynamic range;
  • High speed of frame transfer.
SLIDE 9

Example of Application

  • Here we have an example of an application requiring high-quality cameras.

Analysis of a car crash test based on high-resolution images captured at 1,000 fps.

Source: R. Klette

SLIDE 10

Computer Vision Cameras

  • Computer vision cameras are typically permanently connected to a computer (via a video port or a frame grabber).
  • And they require software for frame capture or camera control (e.g., for time synchronization, panning, tilting, or zooming).

Panning and tilting.

SLIDE 11

Digital Video

  • Digital cameras normally provide both options of recording still images or video data.
  • For a given camera, spatial times temporal resolution is typically a constant.
  • For example, a camera that captures 7,680 × 4,320 pixels (i.e., 33 Mpx) at 60 fps records 1.99 Gpx (gigapixels) per second, as the quick check below shows.
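A minimal Python sketch of that arithmetic, using the numbers from the example above:

```python
# Pixel throughput = spatial resolution x temporal resolution (frame rate).
width, height, fps = 7680, 4320, 60

mpx_per_frame = width * height / 1e6          # ~33.2 megapixels per frame
gpx_per_second = width * height * fps / 1e9   # ~1.99 gigapixels per second

print(f"{mpx_per_frame:.1f} Mpx/frame, {gpx_per_second:.2f} Gpx/s")
```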

SLIDE 12

Interlaced vs. Progressive Videos

  • An interlaced video is created by scanning either the odd or the even lines of the image sensor.
  • Thus, the interlaced video contains two fields of a video frame captured at two different times.
  • For example, in the first field, the odd lines would be displayed, and then, with the second field, the even lines of that image would be shown; the sketch below makes this field structure concrete.

SLIDE 13

Interlaced vs. Progressive Videos

Interlaced scan: odd lines form field 1 (1/60th sec); even lines form field 2 (1/60th sec); fields 1 + 2 make one frame (1/30th sec).

Source: http://t3rfde.com/hdtv/

SLIDE 14

Interlaced vs. Progressive Videos

  • Interlaced videos require a display that is natively capable of showing the individual fields in sequential order.
  • The display screen shows one field at a time.
  • The screen keeps alternating rows very quickly, such that a human eye cannot perceive that there is always a blank field of rows on the screen.

SLIDE 15

Interlaced vs. Progressive Videos

  • In other words, only half of the resolution is available. This explains why interlaced videos become blurry when they are paused.
  • Video sources named with the letter i are called interlaced (e.g., 480i or 1080i video sources).
  • 480 or 1080 refers to the number of scan lines the video source uses to reproduce the video.

SLIDE 16

Interlaced vs. Progressive Videos

  • Progressive video, in contrast, is made up of consecutively displayed video frames that contain all the horizontal lines of the image being shown.
  • As a result, images appear smoother and fast-motion sequences are sharper.
  • This leads to better visual video quality and provides an appropriate input for video analysis.

SLIDE 17

Interlaced vs. Progressive Videos

Progressive scan: all lines are scanned in a single sweep; one frame takes 1/30th sec.

Source: http://t3rfde.com/hdtv/

SLIDE 18

Aspect Ratio

  • The role of aspect ratio has caused quite a bit of confusion, partly because there are different types of aspect ratio, not just one.
  • The aspect ratio most people know is the Display Aspect Ratio (DAR), or image aspect ratio.
  • This is the ratio of the width to the height of the display frame: the aspect ratio of what we see.

SLIDE 19

Display Aspect Ratio (DAR)

  • It is expressed as two numbers separated by a colon, as in 16:9 (width always comes first).
  • Typically, the DAR is 16:9 (widescreen) or 4:3 (full screen).
  • When comparing different display aspect ratios, one may compare images with equal height, width, diagonal, or area.

SLIDE 20

Display Aspect Ratio (DAR)

Comparison of crops of a given image at 4:3 and 16:9, with different parameters held equal: same diagonal size, same height, or same area (number of pixels).

Source: A. Hornig

SLIDE 21

Display Aspect Ratio (DAR)

SLIDE 22

Storage Aspect Ratio (SAR)

  • Two other kinds of aspect ratio are the Pixel Aspect Ratio (PAR) and the aspect ratio of the stored data, named the Storage Aspect Ratio (SAR).
  • When digital video is stored, it is stored with a particular frame size and aspect ratio: the SAR.
  • If DAR = SAR, then displaying a stored video is a matter of scaling it to the correct size.

SLIDE 23

Storage Aspect Ratio (SAR)

  • An example of this might be a 16:9 display showing video stored with a frame size of 1280 × 720 pixels. Both have the same aspect ratio.
  • In other cases, the video may be stored with an aspect ratio (SAR) that does not match the display.
  • Here, the process of displaying the video involves distorting the SAR to make it match the DAR.

SLIDE 24

Storage Aspect Ratio (SAR)

  • An example of this might be a 16:9 display showing video stored with a frame size of 720 × 480 pixels.
  • The SAR is 720:480 = 3:2, an aspect ratio which does not match the 16:9 display.
  • The stored video must be stretched horizontally or squeezed vertically to match the display.

SLIDE 25

Pixel Aspect Ratio (PAR)

  • The latter situation is often referred to as anamorphic video.
  • To correct for it, we introduce a third type of aspect ratio, the Pixel Aspect Ratio (PAR).
  • The basic relationship between the three aspect ratios is DAR = PAR × SAR.

SLIDE 26

Pixel Aspect Ratio (PAR)

  • Attention: we have previously defined a pixel as the smallest single component of a digital image.
  • But the definition of pixel is context-sensitive.
  • It could also refer, for instance, to "printed pixels" on a page, photosensor elements in a digital camera, or pixels on a display device.
  • The last one is considered when defining the PAR.

SLIDE 27

Pixel Aspect Ratio (PAR)

  • In digital video, the pixels used on a display are considered to be square (i.e., width = height).
  • If pixels are square, then the PAR is 1:1 and DAR = SAR. If pixels are non-square, then the PAR is not 1:1 and acts as a correction factor.
  • Since DAR = SAR × PAR, if the DAR is 16:9 and the SAR is 3:2, then the PAR is 32:27 (checked below).
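That arithmetic is easy to verify with Python's standard fractions module (a minimal sketch; the 720 × 480 frame is the example from the previous slides):

```python
from fractions import Fraction

# DAR = SAR * PAR, hence PAR = DAR / SAR.
dar = Fraction(16, 9)        # display aspect ratio
sar = Fraction(720, 480)     # storage aspect ratio, reduces to 3:2

par = dar / sar
print(par)                   # 32/27: pixels must be wider than tall

assert sar * par == dar      # the three ratios are consistent
```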

SLIDE 28

Phototransistor Aspect Ratio

  • Finally, we have the Phototransistor Aspect Ratio.
  • Each phototransistor is an $a \times b$ rectangular cell (e.g., $a$ and $b$ are about 2 μm each).
  • Ideally, the Phototransistor Aspect Ratio $a/b$ should be equal to 1 (i.e., square cells).

SLIDE 29

Megapixels

  • The image resolution $N_{cols} \times N_{rows}$ (= number of sensor elements) is specified in megapixels (Mpx).
  • For example, a 4-Mpx camera has ≈ 4,000,000 pixels for a 2272 × 1704 (4:3) image resolution (checked below).
  • Unless mentioned otherwise, the number of pixels means "color pixels". Observation: a large number of pixels alone does not yet ensure image quality.
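Checking the example (a trivial Python computation):

```python
from math import gcd

ncols, nrows = 2272, 1704
pixels = ncols * nrows
print(f"{pixels:,} pixels = {pixels / 1e6:.2f} Mpx")  # 3,871,488 = 3.87 Mpx

g = gcd(ncols, nrows)
print(f"aspect ratio {ncols // g}:{nrows // g}")      # 4:3
```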

SLIDE 30

Sensor Noise and Bit Depth

  • For example, more pixels means a smaller sensor area per pixel, thus less light per sensor area and a worse signal-to-noise ratio (SNR).
  • In some cases, it is also important to have more than just 8 bits per pixel value in one channel.
  • Example: it is of benefit to have 16 bits per pixel in a grey-level image when doing stereo analysis.

SLIDE 31

Color Accuracy

  • As we have seen, a digital camera uses an array of millions of tiny phototransistors, or light cavities, to record an image.

Sensor matrix or cavity array.

Source: http://www.cambridgeincolour.com

SLIDE 32

Color Accuracy

  • When you press your camera's shutter button and the exposure begins, each of these is uncovered to collect and store photons.

Phototransistors or light cavities.

Source: http://www.cambridgeincolour.com

SLIDE 33

Color Accuracy

  • Once the exposure finishes, the camera closes each of these phototransistors and then tries to assess how many photons fell into each.
  • The relative quantity of photons in each cavity is then sorted into various intensity levels.
  • The precision is determined by the bit depth (0–255 for an 8-bit image).

SLIDE 34

Color Accuracy

  • However, that scheme alone would only create grey-scale images, since these cavities are unable to distinguish how much they have of each color.
  • Well, so how can a digital camera capture color images?
  • One way to achieve this goal is by using the Bayer pattern, or Bayer filter mosaic.

SLIDE 35

Color Accuracy

  • A Bayer filter mosaic is a color filter array for arranging RGB color filters on the square grid of phototransistors. See the figure below.

A filter is placed over each cavity and permits only particular colors of light.

Source: R. Klette

SLIDE 36

Color Accuracy

  • Its particular arrangement of color filters is used in most current single-chip digital image sensors.
  • Note that the filter pattern is 50% green, 25% red, and 25% blue.
  • It uses twice as many green elements as red or blue to mimic the physiology of the human eye, which is most sensitive to green light.

SLIDE 37

Color Accuracy

  • Since each cavity is filtered to record only one color, the data from each cavity cannot fully specify each of the red, green, and blue values.
  • So, to obtain a full-color image, demosaicing algorithms can be used to interpolate a set of red, green, and blue values for each pixel.

SLIDE 38

Color Accuracy

  • For example, a pixel with a green filter provides an exact measurement of the green component.
  • However, by using its two red neighbors, we can interpolate its red value. Likewise, its two blue neighbors can be averaged to yield its blue value; a small code sketch below illustrates the idea.

Source: R. Klette
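A minimal NumPy/SciPy sketch of such bilinear interpolation over an RGGB Bayer layout (my own illustration of the idea; production demosaicing algorithms are considerably more sophisticated):

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw):
    """Bilinear demosaicing of an RGGB Bayer image (H x W float array):
    keep the measured value where a channel is known, and elsewhere
    average the known same-channel neighbors in the 3x3 neighborhood."""
    h, w = raw.shape
    rows, cols = np.mgrid[0:h, 0:w]
    r_mask = (rows % 2 == 0) & (cols % 2 == 0)   # R at even row, even col
    b_mask = (rows % 2 == 1) & (cols % 2 == 1)   # B at odd row, odd col
    g_mask = ~(r_mask | b_mask)                  # G at the other positions

    kernel = np.ones((3, 3))
    out = np.zeros((h, w, 3))
    for c, mask in enumerate((r_mask, g_mask, b_mask)):
        known = np.where(mask, raw, 0.0)
        total = convolve(known, kernel, mode="mirror")
        count = convolve(mask.astype(float), kernel, mode="mirror")
        out[..., c] = np.where(mask, raw, total / count)
    return out

# Toy check: a uniform grey scene should demosaic back to uniform grey.
rgb = demosaic_bilinear(np.full((8, 8), 0.5))
assert np.allclose(rgb, 0.5)
```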

SLIDE 39

Color Accuracy

What the camera sees (through a Bayer filter). Full-color image.

SLIDE 40

Color Accuracy

  • The color accuracy of a digital camera may be evaluated by using a color checker.
  • A color checker is a chart of squares showing different grey levels or color values.

Source: R. Klette

SLIDE 41

Color Accuracy

  • To evaluate color accuracy, take an image of such a chart under diffuse illumination (to reduce the impact of lighting on color appearance).
  • Position a window within one patch of the acquired image, for instance, the red one.

Source: R. Klette

SLIDE 42

Color Accuracy

  • The histogram of such a window (if a color patch, then three histograms) should describe a "thin peak" for a camera with high color accuracy; a small sketch of the test follows.

Source: R. Klette
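A hypothetical code sketch of that test, assuming the chart photo is already loaded as an H × W × 3 array:

```python
import numpy as np

def patch_peak_width(image, y0, y1, x0, x1):
    """Histogram check of a window inside one color-checker patch:
    a camera with high color accuracy should give a thin peak,
    i.e., a small standard deviation, in each channel."""
    window = image[y0:y1, x0:x1]
    for c, name in enumerate("RGB"):
        values = window[..., c].ravel()
        hist, _ = np.histogram(values, bins=256, range=(0, 256))
        print(f"{name}: peak at bin {hist.argmax()}, std = {values.std():.2f}")

# Toy usage: a synthetic, slightly noisy red patch.
rng = np.random.default_rng(0)
patch = np.clip(rng.normal([200, 40, 40], 3.0, size=(64, 64, 3)), 0, 255)
patch_peak_width(patch, 16, 48, 16, 48)
```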

SLIDE 43

Lens Distortion

  • Lens distortion refers to an optical aberration that deforms and bends physically straight lines and makes them appear curvy in images.
  • So, lens distortion is determined by the optical design of the lens.
  • Example: lenses with larger fields of view will generally exhibit greater amounts of distortion.

SLIDE 44

Lens Distortion

  • Lens distortion simply misplaces information geometrically, and it can be calculated or mapped out of an image.
  • Two common types of lens distortion are barrel distortion and pincushion distortion.
  • Let's examine each one in more detail, but before that, let's see a lens with zero distortion.

SLIDE 45

Lens Distortion

Such "perfect" lenses are very rare.

Source: https://photographylife.com

SLIDE 46

Lens Distortion

  • Because the barrel and pincushion distortions are radially symmetric, or approximately so, they are also called radial distortions.
  • When straight lines are curved inwards in the shape of a barrel, this type of aberration is called barrel distortion.
  • Let's see an example of strong barrel distortion.

SLIDE 47

Lens Distortion

Straight lines are visibly curved inwards.

Source: https://photographylife.com

SLIDE 48

Lens Distortion

  • Barrel distortion is commonly seen on wide-angle lenses (informally: short-focal-length lenses).
  • It happens because the field of view of the lens is much wider than the size of the image sensor, and hence the image needs to be "squeezed" to fit.
  • Fixing barrel distortion is usually a pretty straightforward process (see the model sketched below).
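A sketch of the usual polynomial radial model behind such corrections (not from the lecture; the coefficients $k_1$, $k_2$ are illustrative and in practice come from camera calibration):

```python
def radial_distort(xu, yu, k1, k2):
    """Forward polynomial radial-distortion model on normalized image
    coordinates centered at the principal point:
        x_d = x_u * (1 + k1*r^2 + k2*r^4), and likewise for y.
    Negative k1 pulls points inwards (barrel-like); positive k1 pushes
    them outwards (pincushion-like). Undistortion inverts this mapping,
    typically by iteration."""
    r2 = xu * xu + yu * yu
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return xu * factor, yu * factor

# Points far from the center are displaced much more than central ones.
print(radial_distort(0.1, 0.0, k1=-0.2, k2=0.0))  # (0.0998, 0.0)
print(radial_distort(0.9, 0.0, k1=-0.2, k2=0.0))  # (0.7542, 0.0)
```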

SLIDE 49

Lens Distortion

Barrel distortion caused by the usage of a wide-angle lens.
SLIDE 50

Lens Distortion

  • The pincushion distortion is the exact opposite of barrel distortion; that is, straight lines are curved outwards from the center.
  • It is commonly seen on telephoto lenses (informally: long-focal-length lenses).
  • Now, the field of view is smaller than the image sensor's size, and it needs to be "stretched" to fit.

SLIDE 51

Lens Distortion

Straight lines are visibly curved outwards.

Source: https://photographylife.com
SLIDE 52

Lens Distortion

Pincushion distortion caused by the usage of a telephoto lens.

Source: S. Rutherford / Tom's Guide

SLIDE 53

Central Projection

  • Ignoring radial distortions, a projection through a small hole can be described by the theoretical model of a pinhole camera.
  • In this model, the diameter of the hole is assumed to be "very close" to zero.
  • Existing pinhole cameras indeed use very small pinholes and long exposure times.

SLIDE 54

Make a Pinhole Camera

Source: https://www.youtube.com/watch?v=0Bx8P4EvYLA

SLIDE 55

Central Projection

  • The figure below illustrates the model of a pinhole camera.

The model contains an image plane of width $W$ and viewing angle $\alpha$. To avoid top-down reversed images, the projection center (= pinhole) is drawn behind the image plane.

Source: R. Klette

SLIDE 56

Central Projection

  • The subscript $s$ for the $X_s Y_s Z_s$ camera coordinate system comes from sensor.
  • The camera is a particular sensor for measuring data in the 3D world. A laser range-finder or a radar are other examples of sensors.
  • The $Z_s$-axis points into the world and is called the optic axis.

SLIDE 57

Central Projection

  • The figure below again illustrates the model of a pinhole camera (the marked point is the pinhole).

Source: R. Klette

SLIDE 58

Central Projection

  • Because we exclude the consideration of radial distortion, we have undistorted projected points in the image plane with coordinates $x_u$ and $y_u$.
  • The distance $f$ between the image plane and the projection center is the focal length. An ideal pinhole camera has a viewing angle of:

$$\alpha = 2 \arctan\left(\frac{W}{2f}\right)$$

SLIDE 59

Central Projection

Derivation of the viewing angle (figure: projection center and image plane of width $W$ at distance $f$, with $\alpha$ split into $\alpha_1$ and $\alpha_2$):

$$\tan\alpha_1 = \tan\alpha_2 = \frac{W/2}{f}, \qquad \alpha_1 = \alpha_2 = \arctan\frac{W}{2f}, \qquad \alpha = \alpha_1 + \alpha_2 = 2 \cdot \arctan\frac{W}{2f}$$

SLIDE 60

Central Projection

  • For example, for $W = 36$ mm and $f = 14$ mm, the horizontal viewing angle equals about $\alpha = 104.25°$ (verified below).
  • This model of a pinhole camera disregards the wave nature of light by assuming ideal geometric rays.
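Verifying the example numerically (plain Python):

```python
import math

# Viewing angle of an ideal pinhole camera: alpha = 2 * arctan(W / (2f)).
W, f = 36.0, 14.0   # image-plane width and focal length, in mm
alpha = 2 * math.atan(W / (2 * f))
print(f"{math.degrees(alpha):.2f} degrees")   # 104.25
```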

SLIDE 61

Central Projection

  • It also assumes that objects are in focus, whatever their distance to the camera.
  • The $X_s Y_s Z_s$ camera coordinate system can be used for representing any point in the 3D world.
  • A point $P = [X_s \; Y_s \; Z_s]^T$ in the world is mapped by the central projection into a pixel location $p = [x_u \; y_u]^T$ in the image plane.

SLIDE 62

Central Projection

  • The figure below on the left illustrates this process: the point $P = [X_s \; Y_s \; Z_s]^T$ is projected to the pixel location $p = [x_u \; y_u]^T$.

Source: R. Klette

SLIDE 63

Central Projection Equations

  • The ray theorem of elementary geometry tells us that the ratio of $f$ to $Z_s$ (of point $P$) is the same as the ratio of $x_u$ (of pixel location $p$) to $X_s$ (of point $P$).
  • This also happens with the ratios in the $Y_s Z_s$ plane. Thus, we have that:

$$x_u = f \cdot \frac{X_s}{Z_s} \quad \text{and} \quad y_u = f \cdot \frac{Y_s}{Z_s}$$

SLIDE 64

The Principal Point

  • The figure below illustrates that the optic axis intersects the image plane somewhere close to its center.

Source: R. Klette

SLIDE 65

The Principal Point

  • However, in our assumed $xy$ image coordinate system, we have the coordinate origin in the upper left corner of the image.
  • Therefore, the coordinate origin is not somewhere close to the image center, as it is for the $x_u y_u$ image coordinates.

SLIDE 66

The Principal Point

  • Let $[c_x \; c_y]^T$ be the intersection point of the optic axis with the image plane, in $xy$ coordinates.
  • This point $[c_x \; c_y]^T$ is called the principal point in the image plane, and it is determined by camera calibration. It follows that:

$$[x \; y]^T = [x_u + c_x \;\; y_u + c_y]^T = \left[ f \cdot \frac{X_s}{Z_s} + c_x \;\; f \cdot \frac{Y_s}{Z_s} + c_y \right]^T$$
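Putting the projection equations and the principal point together (a minimal sketch; the focal length is expressed in pixels here, and all numbers are made-up illustrative values):

```python
def project(P, f, cx, cy):
    """Central projection of a point P = (Xs, Ys, Zs), given in camera
    coordinates, into image coordinates with the origin at the upper
    left corner: x = f*Xs/Zs + cx, y = f*Ys/Zs + cy."""
    Xs, Ys, Zs = P
    xu = f * Xs / Zs          # undistorted coordinates, origin at the
    yu = f * Ys / Zs          # principal point
    return xu + cx, yu + cy

# Illustrative values: f = 800 px, principal point near the center of a
# 1280 x 720 image, point 2 m in front of the camera.
print(project((0.5, 0.25, 2.0), f=800.0, cx=640.0, cy=360.0))  # (840.0, 460.0)
```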

SLIDE 67

The Principal Point

Image plane with the origin $(0,0)$ in the upper left corner and axes $x$ and $y$:

$$c = [c_x \; c_y]^T, \qquad p_u = [x_u \; y_u]^T, \qquad p = c + p_u = [x \; y]^T = [c_x + x_u \;\; c_y + y_u]^T$$

The pixel location $[x \; y]^T$ in our 2D $xy$ image coordinate system also has the 3D coordinates $[x - c_x \;\; y - c_y \;\; f]^T$ in the $X_s Y_s Z_s$ camera coordinate system.

SLIDE 68

A Two-Camera System

  • For understanding the 3D geometry of a scene, it is convenient to use more than just one camera.
  • The extraction of 3D information from multiple cameras is commonly called Stereo Vision.
  • If we use two or more cameras in a computer vision application, then they should be as identical as possible to avoid unnecessary difficulties.

SLIDE 69

A Two-Camera System

  • Calibration will then allow us to have virtually two identical copies of the same camera.
  • The base distance $b$ is the translational distance between the projection centers of the two cameras.

A stereo camera rig on a suction pad with base distance $b$.

Source: R. Klette

SLIDE 70

A Two-Camera System

  • The figure below shows a quadcopter whose forward-looking integrated stereo camera system has a base distance of 110 mm.

Forward-looking stereo camera system.

Source: R. Klette

SLIDE 71

Canonical Stereo Geometry

  • Assume that we have two virtually identical cameras, perfectly aligned as illustrated below, both viewing a point $P = [X_s \; Y_s \; Z_s]^T$.

Source: R. Klette

SLIDE 72

Canonical Stereo Geometry

  • We describe each camera by using the model of a pinhole camera.
  • The canonical stereo geometry of two cameras is characterized by a copy of the camera on the left, translated by the distance $b$.
  • This translation occurs along the $X_s$-axis of the camera coordinate system of the left camera.

SLIDE 73

Canonical Stereo Geometry

  • The projection center of the left camera is at $(0,0,0)$ and the projection center of the cloned right camera is at $(b,0,0)$. In other words, we have:
  • Two coplanar images of identical size $N_{cols} \times N_{rows}$;
  • Parallel optic axes;
  • An identical focal length $f$;
  • Collinear image rows.

SLIDE 74

A Two-Camera System

  • Let's now apply the central projection equations for both cameras.
  • In this case, a 3D point $P = [X_s \; Y_s \; Z_s]^T$ in the $X_s Y_s Z_s$ coordinate system of the left camera is mapped into the undistorted image points:

$$p_{uL} = [x_{uL} \; y_{uL}]^T = \left[ f \cdot \frac{X_s}{Z_s} \;\; f \cdot \frac{Y_s}{Z_s} \right]^T \qquad p_{uR} = [x_{uR} \; y_{uR}]^T = \left[ f \cdot \frac{X_s - b}{Z_s} \;\; f \cdot \frac{Y_s}{Z_s} \right]^T$$

SLIDE 75

Central Projection

Derivation for a point $P = [X_s \; Y_s \; Z_s]^T$ seen by both cameras (figure: two pinhole cameras with focal length $f$ and base distance $b$):

$$\frac{x_{uL}}{X_s} = \frac{f}{Z_s} \Rightarrow x_{uL} = f \cdot \frac{X_s}{Z_s} \qquad \frac{x_{uR}}{X_s - b} = \frac{f}{Z_s} \Rightarrow x_{uR} = f \cdot \frac{X_s - b}{Z_s} \qquad y_{uL} = y_{uR} = f \cdot \frac{Y_s}{Z_s}$$

SLIDE 76

A Two-Camera System

  • Calibration has to provide accurate values for $b$ and $f$ for being able to use the equations below when doing stereo vision (see the sketch after the equations).

$$p_{uL} = [x_{uL} \; y_{uL}]^T = \left[ f \cdot \frac{X_s}{Z_s} \;\; f \cdot \frac{Y_s}{Z_s} \right]^T \qquad p_{uR} = [x_{uR} \; y_{uR}]^T = \left[ f \cdot \frac{X_s - b}{Z_s} \;\; f \cdot \frac{Y_s}{Z_s} \right]^T$$
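Subtracting the two x-coordinates gives the disparity $d = x_{uL} - x_{uR} = f \cdot b / Z_s$, so depth can be recovered as $Z_s = f \cdot b / d$; this is why $b$ and $f$ must be known accurately. A minimal sketch (illustrative numbers; $f$ in pixels, $b$ in meters):

```python
def stereo_project(P, f, b):
    """Project a 3D point into both cameras of a canonical stereo rig."""
    Xs, Ys, Zs = P
    return f * Xs / Zs, f * (Xs - b) / Zs   # x_uL, x_uR

# Illustrative values: f = 800 px, base distance b = 0.11 m (as in the
# quadcopter example), point 2 m in front of the left camera.
f, b = 800.0, 0.11
x_uL, x_uR = stereo_project((0.3, 0.0, 2.0), f, b)

d = x_uL - x_uR        # disparity: 44.0 px
Zs = f * b / d         # recovered depth: 2.0 m
print(d, Zs)
```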

SLIDE 77

Panoramic Camera Systems

  • Panoramic imaging sensor technology has been applied in several different areas.
  • Panoramic camera systems can either record a wide-angle image in one shot or are designed for recording multiple images.
  • Those multiple images are then stitched or combined into one wide-angle image.

SLIDE 78

Omnidirectional Camera System

  • Omnidirectional camera systems are examples of panoramic imaging sensor technology.
  • They have either a 360-degree field of view in the horizontal plane, or a visual field that covers a hemisphere or (approximately) the entire sphere.
  • They are important in areas where large visual field coverage is needed, such as robotics.

SLIDE 79

Omnidirectional Camera System

A fish-eye camera. A digital camera with a hyperboloidal-shaped mirror.

Source: R. Klette

SLIDE 80

Omnidirectional Camera System

With a single mirror on a mobile robot:
  • 1. Camera.
  • 2. Upper mirror.
  • 3. Lower mirror.
  • 4. Blind spot.
  • 5. Field of view.

Source: FU-Fighters Middle-Size Robot 2005. Source: Jahobr (Wikipedia).

SLIDE 81

Omnidirectional Camera System

Source: https://www.youtube.com/watch?v=0ZAuSFymeQY – Published on January 3rd, 2016.

SLIDE 82

Catadioptric vs. Dioptric Systems

  • Omnidirectional imaging can be classified into catadioptric or dioptric systems.
  • A catadioptric system combines a standard camera with a shaped mirror, such as a parabolic, hyperbolic, or elliptical mirror.
  • Therefore, it provides a 360-degree field of view in the horizontal plane.

SLIDE 83

Catadioptric Systems

Used for obtaining panoramic images. Used for surface reconstruction.

SLIDE 84

Catadioptric Systems

  • A mapping of a captured wide-angle field of view into a cylindric panorama is a solution to support common subsequent image analysis.
  • Single-center cylindric images have a perspective-like appearance.
  • Stitching is not required in this case, as the entire view is just one image.

SLIDE 85

Dioptric Systems

  • Dioptric systems use a combination of shaped lenses (e.g., fisheye lenses) and can reach a field of view even bigger than 180 degrees.
  • Fisheye cameras have been used in applications such as surgical operations, or on board micro aerial vehicles for pipeline inspection.

SLIDE 86

Dioptric Systems (Fisheye)

Used by video-based surveillance systems. Used for people counting.

Source: http://business.panasonic.com/

SLIDE 87

Sensor-Line Camera Systems

  • A rotating sensor-line camera produces cylindric panoramas when used in a configuration as follows.

It contains a small turntable (for selecting a viewing angle $\omega$), which is on an extension slide that allows us to choose a distance $R$ from the rotation center of a big turntable.

Source: R. Klette

SLIDE 88

Sensor-Line Camera Systems

  • That sensor-line camera captures $1 \times N_{rows}$ pixels in a single shot.
  • It subsequently records (say, $N_{cols}$ times) images during one rotation.
  • Thus, at the end, we merge those $N_{cols}$ line-images into one $N_{cols} \times N_{rows}$ array-image, as the sketch below illustrates.
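A sketch of that merging step (illustrative NumPy code; `capture_line` is a hypothetical stand-in for reading one line-image from the camera):

```python
import numpy as np

def capture_line(step, n_rows):
    """Hypothetical stand-in for one 1 x N_rows shot taken by the
    rotating sensor-line camera at the given rotation step."""
    return np.full(n_rows, step % 256, dtype=np.uint8)

# Merge N_cols line-images (one per rotation step) into one array-image.
n_rows, n_cols = 64, 128   # toy sizes; a real panorama may be 56,580 x 10,200
columns = [capture_line(i, n_rows) for i in range(n_cols)]
panorama = np.stack(columns, axis=1)   # shape (n_rows, n_cols)
print(panorama.shape)                  # (64, 128)
```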

SLIDE 89

Sensor-Line Camera Systems

  • A benefit is that the length $N_{rows}$ of the sensor line can be several thousands of pixels.
  • The sensor-line camera records 360° panoramic images within the time frame needed for taking the $N_{cols}$ individual shots during one full rotation.
  • The next figure shows a 56,580 × 10,200 panorama captured by a rotating sensor-line camera.

SLIDE 90

Sensor-Line Camera Systems

A panoramic image of Auckland CBD recorded from the top of Auckland Harbour Bridge using a sensor-line camera.

Source: R. Klette

SLIDE 91

Next Lecture

  • World Coordinates. Homogeneous Coordinates. Camera Calibration. Rectification of Stereo Image Pairs.
  • Suggested reading: Section 6.2 of the textbook.