SLIDE 1

Introduction Face Recognition Fast ℓ1-Minimization Algorithms Distributed Object Recognition Conclusion

Distributed Sensing and Perception via Sparse Representation

Allen Y. Yang yang@eecs.berkeley.edu CIS Seminar, Johns Hopkins, 2010

http://www.eecs.berkeley.edu/~yang Distributed Sensing and Perception via Sparse Representation


SLIDE 3

Distributed Sensing and Perception: A Comparison

Centralized perception — Up: powerful processors. Up: unlimited memory. Up: unlimited bandwidth. Down: single modality.
Distributed perception — Down: mobile processors. Down: limited onboard memory. Down: band-limited communications. Up: distributed, multi-modality.
Can we design an intelligent system over a network that performs better than the sum of its parts?

SLIDE 6

Challenges
1. Making real-time decisions on portable mobile devices is difficult.
2. Applications demand extremely high accuracy: 99% precision, 99% recall?
3. Scenarios demand the ability to reconstruct 3-D environments.

SLIDE 7

Smart Camera Platform: CITRIC v1

CITRIC platform — available library functions:
1. Full support for the Intel IPP Library and OpenCV.
2. JPEG compression: 10 fps.
3. Edge detector: 3 fps.
4. Background subtraction: 5 fps.
5. SIFT detector: 10 s per frame.

Academic users:

Reference: AY, et al. “CITRIC: A low-bandwidth wireless camera network platform.” (submitted) ACM Trans. Sensor Networks, 2010.

SLIDE 8

Body Sensor Platform: DexterNet
1. Body Sensor Layer (BSL)
2. Personal Network Layer (PNL)
3. Global Network Layer (GNL)

Reference: AY, et al. “DexterNet: An open platform for heterogeneous body sensor networks and its applications.” Body Sensor Networks, 2009.

SLIDE 11

Outline
1. Robust face recognition with low-resolution, distorted, and disguised images.
2. Fast ℓ1-minimization algorithms: x∗ = arg min_x ‖x‖₁, subject to b = Ax (via augmented Lagrange multipliers).
3. Distributed object recognition using a camera network.

SLIDE 12

Robust Face Recognition

SLIDE 15

Classification of the Mixture Subspace Model
1. Face-subspace model [Belhumeur et al. ’97, Basri & Jacobs ’03]. Assume b belongs to class i among K classes: b = α_{i,1} v_{i,1} + α_{i,2} v_{i,2} + ⋯ + α_{i,n_i} v_{i,n_i} = A_i α_i.
2. Nevertheless, class i is the unknown label we need to solve for. Sparse representation: b = [A_1, A_2, ⋯, A_K] [α_1; α_2; ⋯; α_K] = Ax.
3. x∗ = [0 ⋯ 0 α_iᵀ 0 ⋯ 0]ᵀ ∈ ℝⁿ. The sparse representation x∗ encodes membership!
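As a concrete illustration, the classification rule can be sketched in a few lines of numpy/scipy. The dictionary, class sizes, and test sample below are synthetic (not from the talk), and the ℓ1 problem is solved here as a generic linear program via scipy's linprog rather than a specialized solver:

```python
# Sketch of sparse-representation-based classification (SRC).
# All dimensions and the dictionary are synthetic/illustrative.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
d, npc, K = 40, 3, 20                   # feature dim, samples per class, classes
A = rng.standard_normal((d, K * npc))
A /= np.linalg.norm(A, axis=0)          # unit-norm columns

true_class = 2                          # test sample b lies in class 2's subspace
b = A[:, true_class*npc:(true_class+1)*npc] @ rng.standard_normal(npc)

# Basis pursuit as an LP: write x = u - v with u, v >= 0 and
# minimize 1^T(u + v) subject to [A, -A][u; v] = b.
n = A.shape[1]
res = linprog(np.ones(2*n), A_eq=np.hstack([A, -A]), b_eq=b,
              bounds=[(0, None)] * (2*n), method="highs")
x = res.x[:n] - res.x[n:]

# Classify by the smallest class-wise reconstruction residual ||b - A_i x_i||.
residuals = [np.linalg.norm(b - A[:, i*npc:(i+1)*npc] @ x[i*npc:(i+1)*npc])
             for i in range(K)]
label = int(np.argmin(residuals))
print(label)
```

Because the recovered x concentrates on the columns of the true class, the class-wise residual is near zero only for that class.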

SLIDE 17

Image Corruption

1. Sparse representation plus sparse error: b = Ax + e.
2. Occlusion compensation: b = [A | I] [x; e] = Bw.
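A small numpy/scipy sketch of this extended system (sizes and signals are synthetic; the ℓ1 problem is again posed as a linear program, one of several possible solvers):

```python
# Occlusion compensation sketch: recover a sparse signal x and a sparse
# corruption e jointly from b = Ax + e via the extended dictionary B = [A | I].
import numpy as np
from scipy.optimize import linprog

def l1min(M, b):
    # min ||w||_1 s.t. Mw = b, posed as an LP with w = u - v, u, v >= 0
    n = M.shape[1]
    res = linprog(np.ones(2*n), A_eq=np.hstack([M, -M]), b_eq=b,
                  bounds=[(0, None)] * (2*n), method="highs")
    return res.x[:n] - res.x[n:]

rng = np.random.default_rng(1)
d, n = 40, 60
A = rng.standard_normal((d, n)); A /= np.linalg.norm(A, axis=0)
x = np.zeros(n); x[[3, 17, 42]] = [1.0, -2.0, 0.5]   # sparse representation
e = np.zeros(d); e[[5, 30]] = [4.0, -3.0]            # sparse corruption
b = A @ x + e

B = np.hstack([A, np.eye(d)])                        # B = [A | I]
w = l1min(B, b)
x_hat, e_hat = w[:n], w[n:]
print(np.flatnonzero(np.abs(e_hat) > 1.0))           # indices of corrupted entries
```

The recovered e_hat localizes the corrupted entries, while x_hat carries the class information.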

SLIDE 18

Performance on the AR database

Reference: AY, et al. Robust face recognition via sparse representation. IEEE PAMI, 2009.

SLIDE 22

Face Alignment

Seek a 2-D transformation: b ∘ τ_i = A_i x + e. (1)
Although ‖x‖₁ is no longer penalized, the problem becomes nonlinear.
Linear approximation: b ∘ τ_i + ∇_τ(b ∘ τ_i) · Δτ_i ≈ A_i x + e. (2)
Convert to a linear equation: b_i^(k) = [A_i, −J_i^(k)] w + e, (3) where w ≐ [xᵀ, Δτ_iᵀ]ᵀ.

SLIDE 23

Demo I: Misalignment & Corruption Compensation

Alignment Demo

Reference: Wagner, et al. Towards a Practical Face Recognition System: Robust Registration and Illumination via Sparse Representation. CVPR, 2009.

SLIDE 24

Question: How to effectively estimate high-dimensional sparse signals?

“Black gold” age [Claerbout & Muir 1973; Taylor, Banks & McCoy 1979]

Figure: Deconvolution of a spike train.

Basis pursuit [Chen & Donoho 1999]: x∗ = arg min ‖x‖₁, subject to b = Ax.
The Lasso (least absolute shrinkage and selection operator) [Tibshirani 1996]: x∗ = arg min ‖b − Ax‖₂, subject to ‖x‖₁ ≤ k.

SLIDE 26

ℓ0/ℓ1 Equivalence Relationship

ℓ0-minimization over an underdetermined system (NP-hard): x∗ = arg min_x ‖x‖₀, subject to b = Ax. Here ‖·‖₀ simply counts the number of nonzero terms.

ℓ1-minimization (a linear program) [Candès & Tao 2006, Donoho 2006]: x∗ = arg min_x ‖x‖₁, subject to b = Ax, where ‖x‖₁ = |x₁| + |x₂| + ⋯ + |xₙ|.
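The equivalence can be checked directly on a toy problem. The sketch below (synthetic data) solves ℓ0-min by brute-force enumeration of supports and ℓ1-min as a linear program; a 1-sparse signal is planted, for which ℓ0/ℓ1 equivalence provably holds whenever no two dictionary columns are parallel:

```python
# Toy comparison of l0-min (brute force over supports, exponential cost)
# and l1-min (linear program) on a small underdetermined system b = Ax.
import numpy as np
from itertools import combinations
from scipy.optimize import linprog

rng = np.random.default_rng(3)
d, n = 6, 10
A = rng.standard_normal((d, n)); A /= np.linalg.norm(A, axis=0)
x0 = np.zeros(n); x0[7] = 1.5            # planted 1-sparse signal
b = A @ x0

# l0-min: try all supports of growing size until b = Ax is solvable exactly.
def l0min(A, b, tol=1e-8):
    for k in range(1, A.shape[1] + 1):
        for S in combinations(range(A.shape[1]), k):
            xs, *_ = np.linalg.lstsq(A[:, list(S)], b, rcond=None)
            if np.linalg.norm(A[:, list(S)] @ xs - b) < tol:
                x = np.zeros(A.shape[1]); x[list(S)] = xs
                return x

# l1-min as an LP: x = u - v with u, v >= 0.
res = linprog(np.ones(2*n), A_eq=np.hstack([A, -A]), b_eq=b,
              bounds=[(0, None)] * (2*n), method="highs")
x_l1 = res.x[:n] - res.x[n:]

x_l0 = l0min(A, b)
print(np.flatnonzero(np.abs(x_l0) > 1e-6))   # both recover the planted support
```

The brute-force loop makes the NP-hardness visible: the number of candidate supports grows combinatorially, while the LP scales polynomially.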

SLIDE 27

ℓ1-Minimization via Linear Programming

Using interior-point methods [Karmarkar ’84], the log-barrier formulation is
min_x 1ᵀx − µ Σ_{i=1}^n log x_i, subject to Ax = b, x ≥ 0. (4)
Using the Karush-Kuhn-Tucker (KKT) conditions: 1 − µX⁻¹1 − Aᵀy = 0, (5) where x ≥ 0 are the primal variables and y are the dual variables.
Update by solving a linear system at O(n³) cost [Monteiro & Adler ’89]:
Z^(k)Δx + X^(k)Δz = µ̂1 − X^(k)z^(k), AΔx = 0, AᵀΔy + Δz = 0. (6)

SLIDE 28

Fast ℓ1-minimization is still a difficult problem!

Interior-point methods are very expensive in high-dimensional spaces.

SLIDE 29

References
1. Primal-Dual Interior-Point Methods: Log-Barrier [Frisch 1955, Karmarkar 1984, Megiddo 1989, Monteiro-Adler 1989, Kojima-Megiddo-Mizuno 1993]
2. Homotopy Methods: Homotopy [Osborne-Presnell-Turlach 2000, Malioutov-Cetin-Willsky 2005, Donoho-Tsaig 2006]; Polytope Faces Pursuit (PFP) [Plumbley 2006]; Least Angle Regression (LARS) [Efron-Hastie-Johnstone-Tibshirani 2004]
3. Gradient Projection Methods: Gradient Projection Sparse Representation (GPSR) [Figueiredo-Nowak-Wright 2007]; Truncated Newton Interior-Point Method (TNIPM) [Kim-Koh-Lustig-Boyd-Gorinevsky 2007]
4. Iterative Thresholding Methods: Soft Thresholding [Donoho 1995]; Sparse Reconstruction by Separable Approximation (SpaRSA) [Wright-Nowak-Figueiredo 2008]
5. Proximal Gradient Methods [Nesterov 1983, Nesterov 2007]: FISTA [Beck-Teboulle 2009]; Nesterov’s Method (NESTA) [Becker-Bobin-Candès 2009]
6. Augmented Lagrange Multiplier Methods [Yang-Zhang 2009, AY et al. 2010]: YALL1 [Yang-Zhang 2009]; Primal ALM, Dual ALM [AY et al. 2010]

References: AY, et al. A review of fast ℓ1-minimization algorithms for robust face recognition. Submitted to SIAM Imaging Sciences, 2010.

SLIDE 32

Iterative Soft-Thresholding (IST) Methods

Objective: x∗ = arg min ‖x‖₁, subject to ‖e‖₂ = ‖b − Ax‖₂ < ε. Define
F(x) ≐ (1/2)‖b − Ax‖₂² + λ‖x‖₁ = f(x) + λ g(x).

IST iteratively approximates the composite objective function:
x^(k+1) ≈ arg min_x { f(x^(k)) + (x − x^(k))ᵀ ∇f(x^(k)) + (1/2)(x − x^(k))ᵀ ∇²f(x^(k)) (x − x^(k)) + λ g(x) }
 = arg min_x { (x − x^(k))ᵀ ∇f(x^(k)) + (α^(k)/2) ‖x − x^(k)‖₂² + λ g(x) },
where the Hessian ∇²f(x) is approximated by a diagonal matrix αI.

A closed-form solution exists element-wise [Donoho ’95, Wright et al. ’08]:
x_i^(k+1) = arg min_{x_i} { (x_i − u_i^(k))²/2 + (λ/α^(k)) |x_i| } = soft(u_i^(k), λ/α^(k)).
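The element-wise update above takes only a few lines of numpy. This is a generic ISTA sketch on synthetic data: the constant stepsize α = ‖A‖₂², λ, and the iteration count are my illustrative choices, not the talk's tuned settings:

```python
# Minimal ISTA sketch for F(x) = 0.5*||b - Ax||_2^2 + lam*||x||_1, with a
# fixed alpha = ||A||_2^2 (a standard conservative bound on the Hessian).
import numpy as np

def soft(u, t):
    # element-wise soft-thresholding operator soft(u, t)
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def ista(A, b, lam, iters=1000):
    alpha = np.linalg.norm(A, 2) ** 2      # diagonal Hessian surrogate alpha*I
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        u = x - A.T @ (A @ x - b) / alpha  # gradient step on f(x)
        x = soft(u, lam / alpha)           # prox step on lam*||x||_1
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 60)); A /= np.linalg.norm(A, axis=0)
x0 = np.zeros(60); x0[[4, 20, 50]] = [2.0, -1.5, 1.0]
b = A @ x0
x = ista(A, b, lam=0.01)
print(np.flatnonzero(np.abs(x) > 0.1))     # recovered support
```

Each iteration costs only two matrix-vector products, which is why IST-type methods scale to high dimensions where interior-point methods do not.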

SLIDE 35

Augmented Lagrange Multiplier

ALM considers an augmented Lagrange function
Lµ(x, y) = ‖x‖₁ + ⟨y, b − Ax⟩ + (µ/2) ‖b − Ax‖₂²,
where y are the Lagrange multipliers for the constraint b = Ax.
It can be shown that if y∗(µ) is optimal [Hestenes ’69, Powell ’69, Bertsekas ’03]:
x∗(µ) = arg min_x Lµ(x, y∗); x∗∗ = lim_{µ→∞} x∗(µ).
Iteratively update x, y, and µ at O(dn) cost per iteration:
x^(k+1) = arg min_x L_{µ_k}(x, y^(k)); y^(k+1) = y^(k) + µ_k (b − A x^(k+1)); µ_{k+1} → ∞.
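The three-step iteration can be sketched as follows. This is a generic primal-ALM-style illustration on synthetic data: the inner x-minimization uses a few prox-gradient (soft-thresholding) steps, and the schedule constants (outer/inner counts, µ growth factor) are my assumptions, not the talk's implementation:

```python
# Primal-ALM-style sketch: each outer step approximately minimizes L_mu over x
# with ISTA-type inner iterations, then updates y and grows mu.
import numpy as np

def soft(u, t):
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def alm_l1(A, b, outer=30, inner=50, mu=1.0, rho=1.2):
    d, n = A.shape
    x, y = np.zeros(n), np.zeros(d)
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz bound for the smooth part
    for _ in range(outer):
        for _ in range(inner):
            # gradient of <y, b-Ax> + (mu/2)||b-Ax||^2 is -A^T(y + mu(b-Ax))
            grad = -A.T @ (y + mu * (b - A @ x))
            x = soft(x - grad / (mu * L), 1.0 / (mu * L))
        y = y + mu * (b - A @ x)           # multiplier update
        mu *= rho                          # drive mu_k toward infinity
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 60)); A /= np.linalg.norm(A, axis=0)
x0 = np.zeros(60); x0[[4, 20, 50]] = [2.0, -1.5, 1.0]
b = A @ x0
x = alm_l1(A, b)
print(np.linalg.norm(A @ x - b))           # constraint residual
```

Every step is a matrix-vector product, matching the O(dn) per-iteration cost above.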

SLIDE 36

Demo II: Speed of ALM vs. Interior-Point

Table: source signal in 1000-D; sparsity = 200; random projection to 600-D.

Algorithm  Runtime
PDIPA      63 s
ALM        0.16 s

SLIDE 37

Distributed Object Recognition: Problem Statement
1. L camera sensors observe a single object in 3-D.
2. The relative positions between cameras are unknown, and cross-sensor communication is prohibited.
3. On each camera, seek an encoding function for a high-dimensional, sparse x_i (SIFT histogram): f : x_i ∈ ℝ^D → b_i ∈ ℝ^d.
4. At the base station, upon receiving b_1, b_2, ⋯, b_L, simultaneously recover x_1, x_2, ⋯, x_L, and classify the object class in space.

SLIDE 39

Key Observations: Scale Invariant Feature Transform

Figure: (a) Histogram 1; (b) Histogram 2.

All SIFT histograms are nonnegative and sparse. Multiple-view histograms share joint sparse patterns.

Reference: AY, et al. Multiple-view object recognition in smart camera networks. Springer, 2010.

SLIDE 40

The System

b_i = A x_i, where x_i is assumed sparse.

SLIDE 43

Joint Sparsity Model

Definition: Joint Sparsity Model [Baron et al. 2005]
x_1 = x_c + z_1, …, x_L = x_c + z_L.
x_c is called the common component, and z_i is called an innovation.

Recovery of the JS model: stack the per-camera measurements as
[b_1; …; b_L] = [A_1 A_1 0 ⋯ 0; A_2 0 A_2 ⋯ 0; ⋯; A_L 0 0 ⋯ A_L] [x_c; z_1; …; z_L] ⇔ b′ = A′x′ ∈ ℝ^{dL}.

1. The new histogram vector remains nonnegative and sparse.
2. The joint sparsity x_c is automatically determined by ℓ1-min: no prior training, and no assumption of fixed or calibrated cameras.
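The stacked system can be simulated directly. Below is a minimal numpy/scipy sketch with two cameras and synthetic matrices and signals (sizes and values are illustrative, and the ℓ1 problem is solved as a linear program):

```python
# Sketch of joint-sparsity recovery: stack per-camera measurements into
# b' = A'x' and solve one l1-min for [x_c; z_1; z_2].
import numpy as np
from scipy.optimize import linprog

def l1min(M, b):
    # min ||w||_1 s.t. Mw = b, posed as an LP with w = u - v, u, v >= 0
    n = M.shape[1]
    res = linprog(np.ones(2*n), A_eq=np.hstack([M, -M]), b_eq=b,
                  bounds=[(0, None)] * (2*n), method="highs")
    return res.x[:n] - res.x[n:]

rng = np.random.default_rng(2)
D, d = 40, 25                             # histogram dim, projected dim
A1 = rng.standard_normal((d, D))
A2 = rng.standard_normal((d, D))
xc = np.zeros(D); xc[[5, 17, 30]] = [3.0, 2.0, 1.5]   # common component
z1 = np.zeros(D); z1[8] = 2.0                         # innovations
z2 = np.zeros(D); z2[22] = -2.0
b1, b2 = A1 @ (xc + z1), A2 @ (xc + z2)

Z = np.zeros((d, D))
Aprime = np.block([[A1, A1, Z], [A2, Z, A2]])         # b' = A'[xc; z1; z2]
bprime = np.concatenate([b1, b2])
w = l1min(Aprime, bprime)
xc_hat = w[:D]
print(np.flatnonzero(np.abs(xc_hat) > 0.5))           # common support
```

Note why ℓ1-min separates the components: a coefficient shared by both views costs ‖·‖₁ once if placed in x_c but twice if duplicated across z_1 and z_2, so the minimizer routes shared structure into the common component.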

SLIDE 44

Berkeley Multiview Wireless (BMW) Database

20 landmarks at UC Berkeley. 16 different vantage points (large baseline); five images at each location (short baseline). Low-quality images: low resolution, inaccurate focal length, dusty lenses.

SLIDE 46

Experiment: Accuracy on BMW Database

A better recognition rate is achieved using multiple views with less bandwidth!

Reference: AY, et al. Towards an efficient distributed object recognition system in wireless smart camera networks. Information Fusion, 2010.

SLIDE 50

A distributed network shall be greater than the sum of its parts!

Our approach to the unique challenges:
1. How do we design real-time recognition systems in sensor networks? A: Efficient numerical solvers plus new computational models: parallel and cloud computing.
2. How do we achieve extremely high accuracy? A: Pay attention to the special structures in high-dimensional data imposed by applications; simple solutions are often the best (e.g., sparse representation).
3. How do we reconstruct 3-D environments using mobile smart cameras? A: Don’t just label the images; take advantage of available information in 3-D geometry (e.g., joint sparsity).

SLIDE 51

Sparse Representation is “the Next Wave”?

Single-Pixel Camera for Deep-Space Imaging [Baraniuk 2008]; Background Subtraction [Chellappa 2008]; MRI Imaging [Lustig 2007]; Robust PCA [Candès 2009].

SLIDE 52

References

Acknowledgments
UC Berkeley: Dr. S. Sastry, Dr. R. Bajcsy, Dr. E. Seto, Dr. T. Darrell, Dr. J. Malik, N. Naikal, V. Shia, P. Yan.
Univ. Illinois: A. Ganesh, Z. Zhou, A. Wagner.
MSR Asia: Dr. Y. Ma, Dr. J. Wright.

Funding Support
ARO MURI: Heterogeneous Sensor Networks in Urban Terrains. ARL: Micro Autonomous Systems and Technology.

Patents
Yang, et al. “Recognition via High-Dimensional Data Classification.” US & China Patent, 2009.
Yang, et al. “System for Detection of Body Motion.” US Patent, 2010.

Publications
Wright, Yang, Ganesh, Sastry, Ma. “Robust face recognition via sparse representation.” IEEE PAMI, 2009.
Yang, Gastpar, Bajcsy, Sastry. “Distributed sensor perception via sparse representation.” Proceedings of the IEEE, 2010.
Naikal, Yang, Sastry. “Towards an efficient distributed object recognition system in wireless smart camera networks.” Information Fusion, 2010.
Yang, Ganesh, Zhou, Sastry, Ma. “A review of fast ℓ1-minimization algorithms in robust face recognition.” arXiv, 2010.
Ganesh, Ma, Wagner, Wright, Yang. “Robust face recognition by sparse representation.” (submitted) Cambridge Press, 2010.

SLIDE 53

Feasibility and Uniqueness: ℓ0-Minimization

Spark condition. Spark(A) is the smallest number of columns of A that are linearly dependent.
1. Example I: identity matrix I ∈ ℝ^{d×d}: Spark(I) = d + 1.
2. Example II: A = [1 1; 1 1]: Spark(A) = 2.
3. Example III: random matrix [v_1, v_2, ⋯, v_n] ∈ ℝ^{d×n}: Spark(A) = d + 1 (with high probability).

A sparse signal x can be uniquely recovered by ℓ0-min if ‖x‖₀ < Spark(A)/2.
Proof.
1. Suppose two distinct x_1 ≠ x_2 both satisfy the spark condition, and b = Ax_1 = Ax_2.
2. Then A(x_1 − x_2) ≐ Ay = b − b = 0.
3. But ‖y‖₀ ≤ ‖x_1‖₀ + ‖x_2‖₀ < Spark(A)/2 + Spark(A)/2 = Spark(A), so fewer than Spark(A) columns of A would be linearly dependent. Contradiction.

Estimating Spark(A) is as expensive as ℓ0-min itself!
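The last point is easy to see from a direct implementation: computing the spark requires enumerating subsets of columns, just like ℓ0-min. A small numpy sketch (example matrices of my choosing):

```python
# Brute-force computation of Spark(A): the combinatorial search over column
# subsets below is exactly why estimating the spark is as hard as l0-min.
import numpy as np
from itertools import combinations

def spark(A, tol=1e-10):
    d, n = A.shape
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            # k columns are linearly dependent iff their rank is below k
            if np.linalg.matrix_rank(A[:, list(cols)], tol=tol) < k:
                return k
    return n + 1     # no dependent subset exists (e.g., the identity matrix)

# Example: v3 = v1 + v2, while any two columns are independent, so Spark(A) = 3.
A = np.array([[1., 0., 1.],
              [0., 1., 1.]])
print(spark(A))      # -> 3
```

The slide's examples check out the same way: spark of the all-ones 2×2 matrix is 2, and spark of the d×d identity is d + 1 by convention.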

SLIDE 54

Feasibility and Uniqueness: ℓ1-Minimization

k-Neighborliness condition. Define the cross-polytope C and the quotient polytope P such that P = AC. Then x is k-sparse ⇔ x lies in a unique (k − 1)-face of C.
Necessary and sufficient conditions:
1. If the (k − 1)-face where x lies maps to a face of P, then ℓ1/ℓ0 equivalence holds for this specific x.
2. If all (k − 1)-faces of C map to faces of P on the boundary, then ℓ1/ℓ0 equivalence holds for all k-sparse signals.
