SLIDE 1

> DEPARTMENT OF MATHEMATICS AND COMPUTER SCIENCE PROBABILISTIC MORPHABLE MODELS | JUNE 2017 | BASEL

2D Face Image Analysis

Probabilistic Morphable Models Summer School, June 2017. Sandro Schönborn, University of Basel

SLIDE 2

Contents

  • Landmarks fitting: observed landmarks in 2D
  • Image fitting: observed image

SLIDE 3

2D Face Image Analysis

𝑄 πœ„ 𝐽 ∝ β„“ πœ„; 𝐽 𝑄(πœ„)

Morphable Model adaptation to explain image

Bayesian Inference Setup

Face & Feature point detection

Fast bottom-up methods 𝐺

Image Likelihood

Image as observation

SLIDE 4

3D Face Reconstruction

SLIDE 5

Fitting as Probabilistic Inference

  • Probabilistic inference problem:

P(θ | Ĩ) = P(Ĩ | θ) P(θ) / P(Ĩ),   P(Ĩ) = ∫ P(Ĩ | θ) P(θ) dθ

  • Likelihood: P(Ĩ | θ), the image is the observation

ℓ(θ; Ĩ) = ∏_{i ∈ F} N(Ĩ_i | I_i(θ), τ² Id) · ∏_{j ∈ B} b(Ĩ_j)     (F: face region, B: background)

  • Prior: P(θ), statistical face model

Face shape & color (PPCA/GP models): s = μ + V E β,  β ~ N(0, I)
Scene: illumination, pose, camera
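The setup above can be made concrete with a toy 1-D sketch. All densities here are hypothetical stand-ins for the face-model prior and image likelihood, chosen conjugate so the evidence P(Ĩ) is known in closed form:

```python
import math

# Toy 1-D version of P(theta | I) = P(I | theta) P(theta) / P(I).
# Both densities are illustrative stand-ins, not the actual face-model terms.

def log_prior(theta):
    # standard-normal prior, like the PPCA coefficients beta ~ N(0, I)
    return -0.5 * theta ** 2 - 0.5 * math.log(2 * math.pi)

def log_likelihood(theta, obs, tau=0.5):
    # Gaussian observation model with noise stddev tau
    return -0.5 * ((obs - theta) / tau) ** 2 - math.log(tau * math.sqrt(2 * math.pi))

def log_posterior_unnorm(theta, obs):
    # point-wise evaluable and unnormalized: exactly what sampling needs
    return log_likelihood(theta, obs) + log_prior(theta)

# evidence P(I) = integral of P(I | theta) P(theta) dtheta, by brute-force quadrature
obs = 1.0
step = 0.01
evidence = sum(math.exp(log_posterior_unnorm(-10.0 + i * step, obs))
               for i in range(2001)) * step
```

The integral over θ is exactly what becomes intractable in 50–200 dimensions; the sampling methods on the following slides work with the unnormalized posterior instead.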

SLIDE 6

MH Inference of the 3DMM

  • Target distribution is our β€œposterior”:

Target: P*(θ | Ĩ) = ℓ(θ; Ĩ) · P(θ)

  • Unnormalized
  • Point-wise evaluation only
  • Parameters:
  • Shape: 50–200, low-rank parameterized GP shape model
  • Color: 50–200, low-rank parameterized GP color model
  • Pose/Camera: 9 parameters, pin-hole camera model
  • Illumination: 9×3 Spherical Harmonics illumination/reflectance

≈ 300 dimensions in total (!)

SLIDE 7

Metropolis Algorithm

Draw a proposal θ′ ~ R(θ′ | θ).
Accept with probability α = min( P(θ′ | Ĩ) / P(θ | Ĩ), 1 ): update θ ← θ′.
Otherwise (probability 1 − α): reject and keep θ.

  • Asymptotically generates samples θ_i ∼ P(θ | Ĩ): θ_1, θ_2, θ_3, …
  • Markov chain Monte Carlo (MCMC) Method
  • Works with unnormalized, point-wise posterior

MH Algorithm filters samples with stochastic accept/reject steps
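A minimal sketch of this accept/reject loop, on a toy 1-D target (the real target is the unnormalized 3DMM posterior, evaluated point-wise):

```python
import math
import random

def metropolis(log_target, theta0, propose, n_samples, rng=random.Random(0)):
    """Metropolis sampler: needs only an unnormalized, point-wise log-target."""
    theta, logp = theta0, log_target(theta0)
    samples = []
    for _ in range(n_samples):
        theta_new = propose(theta, rng)           # draw proposal theta' ~ R(theta' | theta)
        logp_new = log_target(theta_new)
        # accept with probability min(P(theta' | I) / P(theta | I), 1)
        if math.log(rng.random()) < logp_new - logp:
            theta, logp = theta_new, logp_new     # update theta <- theta'
        samples.append(theta)                     # on reject, keep the old state
    return samples

# toy target: standard normal (unnormalized); symmetric random-walk proposal
log_target = lambda t: -0.5 * t * t
propose = lambda t, rng: t + rng.gauss(0.0, 0.8)
samples = metropolis(log_target, 0.0, propose, 20000)
mean = sum(samples) / len(samples)
```

Only the ratio of target values enters the decision, so the unknown normalization constant cancels.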

SLIDE 8

Proposals

  • Choose simple Gaussian random walk proposals (Metropolis)

"𝑅 πœ„O|πœ„ = 𝑂(πœ„O|πœ„, Ξ£\)"

  • Normal perturbations of current state
  • Block-wise to account for different parameter types
  • Shape: N(β′ | β, τ_β² Σ_β)
  • Color: N(γ′ | γ, τ_γ² Σ_γ)
  • Camera: ∏_k N(θ_k′ | θ_k, τ_k²)
  • Illumination: ∏_k N(θ_k′ | θ_k, τ_{L,k}²)

  • Large mixture distributions, e.g.

R(θ′ | θ) = (2/3) R_RW(θ′ | θ) + (1/3) Σ_i μ_i R_i(θ′ | θ)

In practice, we often add more complicated proposals, e.g. shape scaling, a direct illumination estimation, and decorrelation.
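A sketch of such a block-wise random-walk proposal; the block names match the slide, but the dimensions and step sizes are illustrative, not the lecture's values:

```python
import random

# Block-wise Gaussian random-walk proposal: perturb exactly one randomly
# chosen parameter block per move, giving a mixture of block proposals.
STEP = {"shape": 0.1, "color": 0.1, "camera": 0.05, "illumination": 0.05}

def block_proposal(theta, rng):
    block = rng.choice(list(STEP))                       # pick one block
    theta_new = {k: list(v) for k, v in theta.items()}   # copy current state
    theta_new[block] = [x + rng.gauss(0.0, STEP[block]) for x in theta[block]]
    return theta_new

rng = random.Random(1)
theta = {"shape": [0.0] * 3, "color": [0.0] * 3,
         "camera": [0.0] * 2, "illumination": [0.0] * 4}
theta2 = block_proposal(theta, rng)
changed = [k for k in theta if theta2[k] != theta[k]]
```

Because each move touches a single block, the different parameter types can get step sizes matched to their scales.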
SLIDE 9

Landmarks Fitting

Variable parameters:
  • Pose
  • Shape

Likelihood: ℓ(θ; ỹ) ∝ P(ỹ | y(θ)), with target landmarks ỹ, rendered (projected) landmarks y(θ), and face model prior P(θ).

SLIDE 10

3DMM Landmarks Likelihood

Simple models: Independent Gaussians

  • Observation of landmark locations in the image
  • Single landmark position model:

y_i(θ) = (T_vp ∘ Pr ∘ T_3D ∘ h)(β)_i      (rendered 2D position of landmark i)

ℓ_i(θ; ỹ_i) = N(ỹ_i | y_i(θ), τ_LM² Id)

  • Independent model:

ℓ(θ; {ỹ_i}) = ∏_i ℓ_i(θ; ỹ_i)

  • Independence and Gaussian are just simple models (questionable)

T_3D(y) = S y + t (rigid 3D transform); (T_vp ∘ Pr)(y) projects the point and maps it to pixel coordinates (scaling by w/2 and −h/2, shifting to the image center).

SLIDE 11

Landmarks: Samples

SLIDE 12

Results: Landmarks

  • Landmarks posterior:

Manual labelling: τ_LM = 4 pix; image size: 512×512

  • Certainty of pose fit
  • Influence of ear points?
  • Frontal better than sideview?

Yaw (τ_LM = 4 pix):
            with ears       w/o ears
  Frontal   1.4° ± 1.0°     −1.4° ± 3.8°
  Sideview  24.8° ± 3.6°    25.2° ± 5.1°

Distance stdev:
            with ears       w/o ears
  Frontal   22 cm           125 cm
  Sideview  35 cm           35 cm

SLIDE 13

Face Model Fitting

Analysis-by-Synthesis: the parametric face model renders an image I(θ) to explain the target image Ĩ.

Likelihood: ℓ(θ; Ĩ) ∝ P(Ĩ | I(θ))

θ = (ϰ, β, γ):  ϰ scene parameters, β face shape, γ face color

SLIDE 14

Independent Pixels Likelihood

ℓ(θ; Ĩ) = ∏_{i ∈ F} N(Ĩ_i | I_i(θ), τ² Id)      (independent Gaussian per pixel, over the face region F)

Standard choice; corresponds to least-squares fitting.
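A sketch of the resulting log-likelihood on toy RGB values; dropping the constant normalizer makes the least-squares correspondence explicit:

```python
# log ell(theta; I) = -sum_{i in F} ||I_i_obs - I_i(theta)||^2 / (2 sigma^2) + const,
# so maximizing the likelihood is least-squares image fitting over the face region F.
# Pixel values and sigma are illustrative; the rendered image I(theta) is assumed given.

def pixel_log_likelihood(observed, rendered, face_mask, sigma=0.1):
    logp = 0.0
    for i, inside in enumerate(face_mask):
        if not inside:
            continue                      # pixels outside F need a background model
        d2 = sum((o - r) ** 2 for o, r in zip(observed[i], rendered[i]))
        logp += -d2 / (2 * sigma ** 2)    # constant normalizer dropped
    return logp

obs = [(0.5, 0.4, 0.3), (0.9, 0.9, 0.9)]
ren = [(0.52, 0.41, 0.33), (0.1, 0.1, 0.1)]
mask = [True, False]                      # second pixel lies outside the face region
lp = pixel_log_likelihood(obs, ren, mask)
```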

SLIDE 15

Background Model

(Figure: shrinking and misalignment artifacts)

The face model covers only a small part of the complete target image. What to do outside the face region?

β„“ πœ„; 𝐽 K = / β„“1 πœ„; 𝐽1 3

  • 1∈:
  • Explicit model
  • Ignore β†’ strong artifacts

SLIDE 16

Explicit Background Model

Why is ignoring the background bad? Add an explicit likelihood for the background.


Schönborn et al., "Background modeling for generative image models", Computer Vision and Image Understanding, Volume 136, July 2015, Pages 117–127, doi:10.1016/j.cviu.2015.01.008

An implicit background model is always present but might be inappropriate → better make it explicit!

Ignoring the background corresponds to b(Ĩ_j) = 1:
ℓ(θ; Ĩ) = ∏_{i ∈ F} ℓ_face(θ; Ĩ_i)

Explicit background model:
ℓ(θ; Ĩ) = ∏_{i ∈ F} ℓ_face(θ; Ĩ_i) · ∏_{j ∈ B} b(Ĩ_j)

Arbitrary background: the explicit background model needs to be based on generic and simple assumptions:
  • Constant model
  • Histogram model
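A sketch with the constant background model, on grayscale toy values (b_const and σ are illustrative; a histogram model would replace the constant by a lookup):

```python
import math

# Inside the face region F, use the Gaussian face likelihood; outside, use a
# generic background density b (here: the constant model).

def log_likelihood_explicit_bg(observed, rendered, face_mask, sigma=0.1, b_const=1.0):
    logp = 0.0
    for i, inside in enumerate(face_mask):
        if inside:
            d = observed[i] - rendered[i]
            logp += -d * d / (2 * sigma ** 2) - math.log(sigma * math.sqrt(2 * math.pi))
        else:
            logp += math.log(b_const)     # constant background model b(I_j) = b_const
    return logp

obs = [0.5, 0.2]
ren = [0.5, 0.9]                          # background pixel disagrees wildly, harmlessly
lp = log_likelihood_explicit_bg(obs, ren, [True, False])
```

With b_const = 1 the background pixels contribute nothing to the product, which is exactly the implicit model hidden behind "ignoring" the background.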

SLIDE 17

Collective Likelihood

  • Independence is not a good assumption

  • Too many observations (100k+): overconfident
  • Colors are correlated

  • Model distribution of image distance

Fit to an empirical histogram or use a model; can be any measure extracted from images

  • Most-likely solutions match the image with the expected noise level
  • A perfect reconstruction is unlikely

ℓ(θ; Ĩ) = h(d),   d = ‖Ĩ − I(θ)‖
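A sketch of such a collective likelihood, with a Gaussian h centered at the expected noise level (the level 0.074 and the spread are illustrative, cf. the sample trace on the next slide):

```python
import math

# Evaluate one summary statistic, the RMS image distance, under a model h,
# instead of multiplying 100k+ "independent" per-pixel terms.

def rms_distance(observed, rendered):
    return math.sqrt(sum((o - r) ** 2 for o, r in zip(observed, rendered)) / len(observed))

def collective_log_likelihood(observed, rendered, expected_d=0.074, spread=0.002):
    d = rms_distance(observed, rendered)
    return -0.5 * ((d - expected_d) / spread) ** 2    # Gaussian h(d)

obs = [0.2, 0.5, 0.8, 0.4]
perfect = list(obs)                                   # d = 0: "too good to be true"
noisy = [o + s * 0.074 for o, s in zip(obs, [1, -1, 1, -1])]   # d at the expected level
```

Under this model a pixel-perfect reconstruction scores worse than one that deviates by the expected noise level, matching the slide's point.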

SLIDE 18

Posterior Samples: Fitting Result

  • Model instances with comparable reconstruction quality
  • Remaining uncertainty of model representation
  • Integration of uncertain detection directly into model adaptation

(Plot: RMS image distance d_I of posterior samples, fluctuating around ⟨d_I⟩ ≈ 0.072–0.076 over 10⁶ samples)

Posterior using collective likelihood

SLIDE 19

Results: Image

Yaw angle: 1.9° ± 0.2°

SLIDE 20

Image: Samples

SLIDE 21

Posterior Shape Variation

(Figures: Landmarks posterior, sd [mm]; Image posterior, sd [mm])

SLIDE 22

Fitting Results

Images from: Huang, Gary B., et al. Labeled faces in the wild: A database for studying face recognition in unconstrained environments. Vol. 1. No. 2. Technical Report 07-49, University of Massachusetts, Amherst, 2007. Images from: Köstinger, Martin, et al. "Annotated facial landmarks in the wild: A large-scale, real-world database for facial landmark localization." Computer Vision Workshops (ICCV Workshops), 2011 IEEE International Conference on. IEEE, 2011.

Datasets: LFW, AFLW

SLIDE 23

Automatic Fitting

  • Detection of face and feature points
  • Scanning window & classifier
  • Uncertain results
  • Feed-forward: early hard decisions
  • Integration concept
  • Bayesian integration

→ Filtering

  • Metropolis sampling

→ Propose & verify


Which box contains the face?

Schönborn, Sandro, et al. "Markov Chain Monte Carlo for Automated Face Image Analysis." International Journal of Computer Vision (2016): 1–24.

SLIDE 24

Random Forest Detection

  • Scanning Window
  • Random Forest Classifier

(Figure: random forest, trees g_1(Ĩ), g_2(Ĩ), g_3(Ĩ) each voting face / not face)

  • Haar Features
  • Information gain splitting
  • Bagging many trees, depth ~16
  • ~200k training patches (AFLW)

> πœ„ ≀ πœ„

  • Classify each patch: face or not
  • Search over image
  • Search over scales
SLIDE 25

Bayesian Integration

  • Different modality
  • Box B: position & size
  • Landmarks L: certainty
  • Detection is uncertain
  • Likelihood models
  • Detection is observation
  • Different observation models
  • Conceptual uncertainty

Bayesian inference, with the detections as observations:

P(θ | B, L) = ℓ(θ; B, L) P(θ) / P(B, L),   ℓ(θ; B, L) = P(B | θ) P(L | θ)


SLIDE 26

Detection Likelihood

𝐸 π’š

β„“ πœ„; 𝐸 = max

𝒖

π’ͺ 𝒖|π’š πœ„ , 𝜏6 𝐸 𝒖

Face detection

Box: position & size of detected face

Landmarks detection

Detection map: certainty

  • f detecting at position π’š

Model: Best combination of landmarks uncertainty and detection certainty Model: Uncertainty of position and scale

β„“ πœ„; 𝐺 = π’ͺ 𝒒|π’š πœ„ , πœβ€Ί

6 β„’π’ͺ 𝑑|𝑑 πœ„ , 𝜏£ 6 26

𝒒, 𝑑
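A 1-D sketch of the landmark-detection likelihood, taking the best combination over the map positions (the grid and certainty values below are illustrative):

```python
import math

# ell(theta; L) = max_u N(u | y(theta), tau^2) * L(u): combine the detector's
# certainty map with the Gaussian landmark uncertainty by taking the best u.

def detection_log_likelihood(y_theta, certainty_map, tau=2.0):
    best = -math.inf
    for u, c in certainty_map.items():    # u: position, c: detector certainty
        if c <= 0.0:
            continue
        logp = -((u - y_theta) ** 2) / (2 * tau ** 2) + math.log(c)
        best = max(best, logp)
    return best

cmap = {8: 0.1, 10: 0.9, 12: 0.4}         # detector is most certain at u = 10
ll_near = detection_log_likelihood(10.0, cmap)   # rendered landmark agrees
ll_far = detection_log_likelihood(20.0, cmap)    # rendered landmark far away
```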

SLIDE 27

Integration by Filtering

  • Step-by-step Bayesian inference
  • Condition on observations one after the other
  • Posterior of first observation becomes prior for next step
  • Each step adds an observation through conditioning with its likelihood
  • Equivalent to single-step Bayesian inference

P(θ) → P(θ | B, L) → P(θ | B, L, Ĩ): condition first with ℓ(θ; B, L), then with ℓ(θ; Ĩ)

SLIDE 28

Filtering: Multiple Metropolis Decisions


  • Step-wise Bayesian inference: needs a likelihood ℓ(θ; ·) for each step
  • Saves computation time if properly ordered

P(θ) → condition on (B, L) → P(θ | B, L) → condition on Ĩ → P(θ | B, L, Ĩ)

Check if the proposals fit the detections first: a proposal θ′ ~ R(θ′ | θ) must pass a Metropolis decision with the cheap detection likelihood ℓ(θ′; B, L) before the expensive image likelihood ℓ(θ′; Ĩ) is evaluated; only proposals surviving all decisions update θ ← θ′.

Bayesian inference steps
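The cascaded decisions can be sketched as follows; the stage functions are toys that each return their log likelihood ratio, ordered cheap to expensive (the prior ratio would be folded into the first stage):

```python
import math
import random

# One cascaded Metropolis step: a proposal rejected by an early stage
# never reaches the costly later stages.

def cascade_step(theta, propose, log_stage_ratios, rng):
    theta_new = propose(theta, rng)
    for stage in log_stage_ratios:                  # ordered cheap -> expensive
        if math.log(rng.random()) >= stage(theta_new, theta):
            return theta                            # early reject: stop evaluating
    return theta_new                                # survived all decisions

calls = []
def cheap(tn, t):                                   # e.g. detection likelihood
    calls.append("cheap")
    return float("-inf")                            # always rejects in this toy
def costly(tn, t):                                  # e.g. image likelihood
    calls.append("costly")
    return 0.0

rng = random.Random(0)
state = cascade_step(0.0, lambda t, r: t + 1.0, [cheap, costly], rng)
```

Here the costly stage is never called, which is the computational saving the slide refers to when the stages are properly ordered.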

SLIDE 29

Alternatives

SLIDE 30

  • Metropolis algorithm formalizes: propose-and-verify
  • Decouples finding possible solution from selection
  • No need to always provide good solutions in proposals
  • Verification for consistency with the model
  • Algorithmic advantage beyond probabilistic Bayesian concept

Propose: draw a sample y′ from R(y′ | y).
Verify: accept y′ as new sample with probability α = min( f(y′) / f(y), 1 ).

Metropolis: Propose-and-Verify


“Anything that is more informed than random walks should improve fitting”

SLIDE 31

Multiple Alternative Proposals

  • Metropolis formalizes propose-and-verify
  • Decouples proposing possible solution from validation
  • No need to always provide good solutions in proposals
  • Introduce alternatives through proposals

(Diagram: a mixture of proposal generators R_1(θ′|θ), R_2(θ′|θ), R_3(θ′|θ), each followed by likelihood filtering stages before the update θ ← θ′)

Many candidates. Data-Driven Markov Chain Monte Carlo (DDMCMC): use data to build more informed proposals.



SLIDE 32

SLIDE 33

Summary

  • Fitting as probabilistic inference
  • Probabilistic inference is often intractable
  • Sampling methods approximate by simulation
  • MCMC methods provide a powerful sampling framework
  • Markov Chain with target distribution as equilibrium distribution
  • General algorithms, e.g. Metropolis-Hastings
  • Fitting of the 3DMM as a real inference problem
  • MH algorithm to integrate information: Framework
  • Filtering: Uncertain information as observation, step-by-step
  • Propose-and-verify: Alternatives, multiple hypotheses, heuristics
