VIDEO SIGNALS: Corners and Shapes - PowerPoint PPT Presentation



SLIDE 1

VIDEO SIGNALS

 Corners and Shapes

SLIDE 2

PROJECTION OF VECTORS

 any vector x can be represented as a linear combination of the direction vectors of the coordinate system: x = a1i1 + a2i2 + a3i3
 orthogonal projection of x onto each axis produces the components aj = ij^T x
 rotation of the coordinate system produces a new system, in which each axis direction vector can be represented as a linear combination of the direction vectors of the first system
 columns of matrix M are the direction vectors of the new system
 coordinates of the vector after transformation: y = Mx
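A minimal numpy sketch of this change of coordinates, assuming a hypothetical 2-D rotation as the new system (the angle and the test vector are illustrative, not from the slides):

```python
import numpy as np

theta = np.pi / 6                       # illustrative rotation angle
M = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # columns: new direction vectors

x = np.array([3.0, 4.0])
a = M.T @ x     # components a_j = i_j^T x (projection onto each new axis)
y = M @ x       # coordinates of the vector after transformation: y = M x
```

Because M is orthonormal, both operations preserve the length of x.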

SLIDE 3

THE CORRELATION MATRIX

 correlation c of zero-mean random variables x and y quantifies linear statistical dependency
 -1 ≤ c ≤ 1
 c = 0: uncorrelated
 c = 1: complete positive correlation
 c = -1: complete negative correlation
 correlation matrix C of n-dimensional data x
 – size n×n
 – computed through the covariance matrix R = (1/N) Σk xk xk^T (zero-centered data: Σi xi = 0)
 cij = rij / √(rii rjj)
 quantifies linear statistical dependencies of n random variables
 -1 ≤ cij ≤ 1: correlation of vector components i and j
 cii = 1
 cij = cji
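The covariance-to-correlation computation above can be sketched in numpy (the toy data set is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))     # N = 500 samples of n = 3 variables
X = X - X.mean(axis=0)            # zero-centered data: sum_i x_i = 0

N = X.shape[0]
R = X.T @ X / N                   # covariance matrix R = (1/N) sum_k x_k x_k^T
d = np.sqrt(np.diag(R))
C = R / np.outer(d, d)            # c_ij = r_ij / sqrt(r_ii * r_jj)
```

The resulting C is symmetric, has unit diagonal, and all entries in [-1, 1].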

SLIDE 4

EIGENVECTORS AND EIGENVALUES

We can interpret this correlation as an ellipse whose major-axis length is given by one eigenvalue and whose minor-axis length by the other: no correlation yields a circle, and perfect correlation yields a line.
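A quick numeric check of this ellipse interpretation for the 2×2 correlation matrix [[1, c], [c, 1]], whose eigenvalues are 1 − c and 1 + c (the helper name is mine):

```python
import numpy as np

def ellipse_axis_lengths(c):
    """Sorted eigenvalues of the 2x2 correlation matrix [[1, c], [c, 1]]."""
    return np.sort(np.linalg.eigvalsh(np.array([[1.0, c], [c, 1.0]])))

circle = ellipse_axis_lengths(0.0)   # no correlation: equal axes -> a circle
line = ellipse_axis_lengths(1.0)     # perfect correlation: one axis is 0 -> a line
```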

SLIDE 5

THE PRINCIPAL COMPONENTS

 All principal components (PCs) start at the origin of the ordinate axes
 First PC is the direction of maximum variance from the origin
 Subsequent PCs are orthogonal to the 1st PC and describe the maximum residual variance

[Figure: scatter plots with the PC 1 and PC 2 directions drawn through the data; axes from 5 to 30]
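A sketch of extracting the PCs as eigenvectors of the covariance matrix (a standard construction; the correlated toy data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 2)) @ np.array([[2.0, 0.0], [1.2, 0.5]])
X = X - X.mean(axis=0)              # variance is measured from the origin

evals, evecs = np.linalg.eigh(X.T @ X / len(X))
order = np.argsort(evals)[::-1]
pcs = evecs[:, order]               # columns: PC 1 (max variance), then PC 2

var_along = np.var(X @ pcs, axis=0) # variance captured along each PC
```

PC 2 is orthogonal to PC 1 and captures the residual (smaller) variance.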

SLIDE 6

ALGEBRAIC INTERPRETATION

 Given m points in an n-dimensional space, for large n, how does one project onto a low-dimensional space while preserving broad trends in the data and allowing it to be visualized?

SLIDE 7

ALGEBRAIC INTERPRETATION – 1D

 Given m points in an n-dimensional space, for large n, how does one project onto a 1-dimensional space?
 Choose a line that fits the data so the points are spread out well along the line

SLIDE 8

ALGEBRAIC INTERPRETATION – 1D

 Formally, minimize the sum of squares of distances to the line.
 Why sum of squares? Because it allows fast minimization.

SLIDE 9

ALGEBRAIC INTERPRETATION – 1D

 Minimizing the sum of squares of distances to the line is the same as maximizing the sum of squares of the projections on that line, thanks to Pythagoras.
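A numeric check of the Pythagoras argument for one point and one line through the origin (the point and direction are arbitrary):

```python
import numpy as np

u = np.array([1.0, 2.0]) / np.sqrt(5.0)         # unit direction of the line
p = np.array([3.0, 4.0])                        # one data point

proj_len = float(p @ u)                         # projection length onto the line
dist = float(np.linalg.norm(p - proj_len * u))  # distance from p to the line

# dist^2 + proj_len^2 = ||p||^2, so summed over all points, minimizing the
# squared distances is the same as maximizing the squared projections.
```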

SLIDE 10

ALGEBRAIC INTERPRETATION – 1D

 How is the sum of squares of projection lengths expressed in algebraic terms?

[Figure: points P1, P2, P3, ..., Pm projected onto a line, with the points collected as a matrix B and the line direction written as x, together with the transposes BT and xT]
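One common algebraic form, sketched here under the assumption that the points are collected as columns of a matrix B: the sum of squared projection lengths onto a unit vector x equals the quadratic form x^T B B^T x.

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.normal(size=(3, 5))      # columns are the m = 5 points P_1..P_m
x = np.array([1.0, 0.0, 0.0])    # unit vector along the candidate line

# Sum of squared projection lengths, point by point...
s1 = sum(float(B[:, j] @ x) ** 2 for j in range(B.shape[1]))
# ...equals the quadratic form x^T (B B^T) x.
s2 = float(x @ (B @ B.T) @ x)
```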

SLIDE 11

PCA: GENERAL

From k original coordinates x1,x2,...,xk, produce k new coordinates y1,y2,...,yk:

y1 = a11x1 + a12x2 + ... + a1kxk
y2 = a21x1 + a22x2 + ... + a2kxk
...
yk = ak1x1 + ak2x2 + ... + akkxk

SLIDE 12

PCA: GENERAL

From k original coordinates x1,x2,...,xk, produce k new coordinates y1,y2,...,yk:

y1 = a11x1 + a12x2 + ... + a1kxk
y2 = a21x1 + a22x2 + ... + a2kxk
...
yk = ak1x1 + ak2x2 + ... + akkxk

such that:
 the yk's are uncorrelated (orthogonal)
 y1 explains as much as possible of the original variance in the data set
 y2 explains as much as possible of the remaining variance
 etc.
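A sketch verifying these properties numerically, using the eigenvectors of the covariance matrix as the coefficient vectors (toy data; the names A and Y are mine):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(800, 3)) @ rng.normal(size=(3, 3))   # toy correlated data
X = X - X.mean(axis=0)

evals, A = np.linalg.eigh(np.cov(X.T))
A = A[:, np.argsort(evals)[::-1]]   # columns a_1..a_k, by decreasing variance
Y = X @ A                           # y_j = a_j1 x_1 + ... + a_jk x_k per sample

CY = np.cov(Y.T)                    # covariance of the new coordinates
off = CY - np.diag(np.diag(CY))     # off-diagonal part: ~0, so y's uncorrelated
var_y = np.diag(CY)                 # decreasing: y1 explains the most variance
```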

slide-13
SLIDE 13

PCA: 2D REPRESENTATION

5

1st Principal 2nd Principal Component, y2

4

1st Principal Component, y1 p y2

3 4.0 4.5 5.0 5.5 6.0 2

SLIDE 14

PCA SCORES

[Figure: the same scatter plot; a point (xi1, xi2) is shown with its scores yi,1 and yi,2 along the two principal components]

SLIDE 15

PCA EIGENVALUES

[Figure: the same scatter plot; the eigenvalues λ1 and λ2 indicate the spread of the data along the two principal components]

SLIDE 16

PCA: ANOTHER EXPLANATION

From k original coordinates x1,x2,...,xk, produce k new coordinates y1,y2,...,yk:

y1 = a11x1 + a12x2 + ... + a1kxk
y2 = a21x1 + a22x2 + ... + a2kxk
...
yk = ak1x1 + ak2x2 + ... + akkxk

The yk's are the Principal Components, such that:
 the yk's are uncorrelated (orthogonal)
 y1 explains as much as possible of the original variance in the data set
 y2 explains as much as possible of the remaining variance
 etc.

SLIDE 17

PCA: GENERAL

 {a11,a12,...,a1k} is the 1st eigenvector of the correlation/covariance matrix, and the coefficients of the first principal component
 {a21,a22,...,a2k} is the 2nd eigenvector of the correlation/covariance matrix, and the coefficients of the 2nd principal component
 …
 {ak1,ak2,...,akk} is the kth eigenvector of the correlation/covariance matrix, and the coefficients of the kth principal component

SLIDE 18

HARRIS CORNER DETECTOR

 Many applications benefit from features localized in (x,y)
 Edges are well localized only in one direction -> detect corners
 Desirable properties of a corner detector:
 – Accurate localization
 – Invariance against shift, rotation, scale, brightness change
 – Robust against noise, high repeatability

SLIDE 19

WHAT PATTERNS CAN BE LOCALIZED MOST ACCURATELY?

 Local displacement sensitivity:
S(Δx,Δy) = Σ_{(xi,yi) ∈ window} [f(xi+Δx, yi+Δy) − f(xi,yi)]²
 Linear approximation for small Δx,Δy:
f(xi+Δx, yi+Δy) ≈ f(xi,yi) + fx(xi,yi)Δx + fy(xi,yi)Δy
 Iso-sensitivity curves are ellipses
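Substituting the linear approximation into S leads to the structure matrix Σ [[fx², fx·fy], [fx·fy, fy²]]. A sketch computing it with simple central differences on a synthetic patch (the gradient filters and window choice are illustrative):

```python
import numpy as np

def structure_matrix(f):
    """Sum [[fx^2, fx*fy], [fx*fy, fy^2]] over the window (here, the whole patch)."""
    fx = np.zeros_like(f)
    fy = np.zeros_like(f)
    fx[:, 1:-1] = (f[:, 2:] - f[:, :-2]) / 2.0   # central difference in x
    fy[1:-1, :] = (f[2:, :] - f[:-2, :]) / 2.0   # central difference in y
    return np.array([[np.sum(fx * fx), np.sum(fx * fy)],
                     [np.sum(fx * fy), np.sum(fy * fy)]])

# A vertical step edge: all gradient energy in x, so one eigenvalue is ~0,
# matching the claim that edges are well localized in only one direction.
edge = np.zeros((9, 9))
edge[:, 5:] = 1.0
lams = np.linalg.eigvalsh(structure_matrix(edge))
```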

SLIDE 20

HARRIS CRITERIUM

 Often based on the eigenvalues λ1, λ2 of the "structure matrix" (also called "normal matrix" or "second-moment matrix"):

M = Σ_window [[fx², fx·fy], [fx·fy, fy²]]

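A common form of the Harris criterium is R = det(M) − k·trace(M)² = λ1λ2 − k(λ1+λ2)²; the slide does not fix k, so the value k = 0.04 below is an assumption (a standard choice):

```python
import numpy as np

def harris_response(M, k=0.04):
    """Cornerness R = det(M) - k * trace(M)^2 = l1*l2 - k*(l1 + l2)^2."""
    return np.linalg.det(M) - k * np.trace(M) ** 2

# Structure matrices written directly via their eigenvalues (diagonal form):
corner = np.diag([10.0, 8.0])   # two large eigenvalues  -> strong positive R
edge   = np.diag([10.0, 0.0])   # one eigenvalue near 0  -> negative R
flat   = np.diag([0.1, 0.1])    # two small eigenvalues  -> R near 0
```

Thresholding R then separates corners (large positive) from edges (negative) and flat regions (near zero), as the following slides illustrate.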
SLIDE 21

HARRIS CORNER VALUES

SLIDE 22

KEYPOINT DETECTION: INPUT

SLIDE 23

HARRIS CORNERNESS

SLIDE 24

THRESHOLDED CORNERNESS

SLIDE 25

LOCAL MAXIMA OF CORNERNESS

SLIDE 26

SUPERIMPOSED KEYPOINTS

SLIDE 27

ROBUSTNESS OF HARRIS CORNER DETECTOR

 Invariant to brightness offset: f(x,y) → f(x,y) + c
 Invariant to shift and rotation
 Not invariant to scaling

SLIDE 28

HOUGH TRANSFORM

Goal: recognize lines in images

Approach: for every point in the starting image, plot the sinusoid on the dual plane (parameter space):

ρ = x·cos(ϑ) + y·sin(ϑ)

where x and y are fixed (the coordinates of the considered point) while ρ and ϑ are variables.

The Hough transform of an image with K lines is the sum of many sinusoids intersecting in K points. Maxima in the dual plane indicate the parameters of the K lines.
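The sinusoid and its intersection property can be sketched numerically (the sample points are illustrative):

```python
import numpy as np

def hough_curve(x, y, thetas):
    """The sinusoid of point (x, y) in the dual plane: rho = x cos(t) + y sin(t)."""
    return x * np.cos(thetas) + y * np.sin(thetas)

# Three collinear points on the vertical line x = 2 (parameters theta = 0, rho = 2)
thetas = np.linspace(-np.pi / 2, np.pi / 2, 181)
curves = [hough_curve(2.0, yy, thetas) for yy in (0.0, 1.0, 5.0)]
i0 = int(np.argmin(np.abs(thetas)))   # column closest to theta = 0

# All three sinusoids pass through (theta = 0, rho = 2), the line's parameters.
```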

SLIDE 29

HOUGH: IMPLEMENTATION

 Consider a discretization of the dual plane for the parameters (ρ,ϑ): it becomes a matrix whose row and column indices correspond to the quantized values of ρ and ϑ.
 The limits of ρ are chosen according to the image size. Usually: -ρmax ≤ ρ ≤ ρmax, -π/2 ≤ ϑ ≤ π/2

SLIDE 30

HOUGH: IMPLEMENTATION

 Clear the matrix H(m,n)
 For every point P(x,y) of the image:
 1. for each ϑn ranging from -π/2 to π/2 with step dϑ:
    1. evaluate ρ(n) = x·cos(ϑn) + y·sin(ϑn)
    2. find the index m corresponding to ρ(n)
    3. increase H(m,n)
 2. end
 end
 Find the local maxima in H(·,·); they correspond to the parameters of the detected lines
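The steps above can be sketched as a runnable numpy routine (the discretization sizes are illustrative choices):

```python
import numpy as np

def hough_lines(points, n_theta=180, n_rho=400, rho_max=20.0):
    """Accumulate H(m, n) over the quantized dual plane (rho, theta)."""
    thetas = np.linspace(-np.pi / 2, np.pi / 2, n_theta, endpoint=False)
    H = np.zeros((n_rho, n_theta), dtype=int)
    for x, y in points:
        rho = x * np.cos(thetas) + y * np.sin(thetas)   # rho(n) for every theta_n
        m = np.round((rho + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        ok = (m >= 0) & (m < n_rho)
        H[m[ok], np.arange(n_theta)[ok]] += 1           # increase H(m, n)
    return H, thetas

# Ten points on the vertical line x = 4 (true parameters: theta = 0, rho = 4)
pts = [(4.0, float(y)) for y in range(10)]
H, thetas = hough_lines(pts)
m_star, n_star = np.unravel_index(np.argmax(H), H.shape)
rho_star = 2 * 20.0 * m_star / (400 - 1) - 20.0   # de-quantize the rho index
```

The global maximum of H collects one vote per point and recovers the line parameters up to the quantization step.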

SLIDE 31

HOUGH TRANSFORM

 5 points

SLIDE 32

HOUGH TRANSFORM

 Line (the sinusoids in the dual plane are periodic in ϑ)

SLIDE 33

HOUGH TRANSFORM

 Line

SLIDE 34

HOUGH TRANSFORM

 Line

SLIDE 35

HOUGH TRANSFORM

 Dotted line

SLIDE 36

HOUGH TRANSFORM

 Same text with different orientations

SLIDE 37

HOUGH TRANSFORM

 Noisy and noiseless square

SLIDE 38

HOUGH TRANSFORM

 Accumulation matrices of the previous images

SLIDE 39

EXAMPLES

SLIDE 40

EXAMPLE

SLIDE 41

CIRCLE DETECTION BY HOUGH TRANSFORM

 Find circles of fixed radius r
 For circles of undetermined radius, use a 3-D Hough transform for the parameters (x0, y0, r)
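A minimal sketch of the fixed-radius case: each edge point votes for every candidate centre at distance r from it (the grid size, radius, and synthetic circle are illustrative):

```python
import numpy as np

def hough_circle(points, r, size=64, n_angles=360):
    """Each point (x, y) votes once per candidate centre at distance r."""
    H = np.zeros((size, size), dtype=int)
    angles = np.linspace(0.0, 2 * np.pi, n_angles, endpoint=False)
    for x, y in points:
        cx = np.round(x - r * np.cos(angles)).astype(int)
        cy = np.round(y - r * np.sin(angles)).astype(int)
        ok = (cx >= 0) & (cx < size) & (cy >= 0) & (cy < size)
        cells = np.unique(np.stack([cy[ok], cx[ok]], axis=1), axis=0)
        H[cells[:, 0], cells[:, 1]] += 1   # one vote per centre cell per point
    return H

# Synthetic circle of radius 10 centred at (30, 25)
t = np.linspace(0.0, 2 * np.pi, 40, endpoint=False)
pts = list(zip(30 + 10 * np.cos(t), 25 + 10 * np.sin(t)))
H = hough_circle(pts, r=10)
cy_star, cx_star = np.unravel_index(np.argmax(H), H.shape)
```

The true centre is the only cell at distance r from every point, so it accumulates one vote per point and dominates the accumulator.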

SLIDE 42

EXAMPLE: CIRCLE DETECTION BY HOUGH TRANSFORM