Learning to separate shading from paint
Marshall F. Tappen1 William T. Freeman1 Edward H. Adelson1,2
1MIT Computer Science and Artificial
Intelligence Laboratory (CSAIL)
2MIT Dept. of Brain and Cognitive Sciences
Forming an Image
We illuminate a surface to get a shading image: the "shading image" is the interaction of the shape of the surface and the illumination. We can also include a reflectance pattern, or "paint." Multiplying the shading image by the reflectance image creates the observed scene image.
How can we access shape or reflectance information from the observed image? For example: from the image, estimate the shape of the surface; or decompose the image into its shading image and reflectance image.
These shading and reflectance components are intrinsic images (Barrow and Tenenbaum). We work in the log domain and assume the images add. Accessing shading and reflectance independently is necessary for most image understanding tasks, for example:
– Material recognition
– Image segmentation
Intrinsic images let us access and modify the shading and reflectance components separately:
– More informative than just the image
– Less complex than fully reconstructing the scene
Our goal is to determine which parts of the image were caused by shape changes and which parts were caused by paint changes. The strategy: classify each image derivative as being caused by shading or by paint, assuming each derivative has a single cause, either shading or a reflectance change. Each intrinsic image is then recovered by a least-squares reconstruction from its set of labeled derivatives. (Reconstruction code from Yair Weiss's web page.)
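The reconstruction step can be sketched as a small least-squares problem. This is illustrative numpy code, not the code from Yair Weiss's page: given the derivatives assigned to one cause (with the others zeroed out), solve for the image whose forward differences best match them.

```python
import numpy as np

def reconstruct_from_derivatives(dx, dy):
    """Least-squares recovery of an image (up to an additive constant,
    pinned here by I[0, 0] = 0) from forward-difference derivatives.
    dx has shape (h, w-1); dy has shape (h-1, w)."""
    h, w = dx.shape[0], dy.shape[1]
    n = h * w
    idx = lambda r, c: r * w + c
    A, b = [], []
    for r in range(h):                       # I[r, c+1] - I[r, c] = dx[r, c]
        for c in range(w - 1):
            row = np.zeros(n)
            row[idx(r, c + 1)], row[idx(r, c)] = 1.0, -1.0
            A.append(row); b.append(dx[r, c])
    for r in range(h - 1):                   # I[r+1, c] - I[r, c] = dy[r, c]
        for c in range(w):
            row = np.zeros(n)
            row[idx(r + 1, c)], row[idx(r, c)] = 1.0, -1.0
            A.append(row); b.append(dy[r, c])
    row = np.zeros(n); row[0] = 1.0          # fix the unknown constant offset
    A.append(row); b.append(0.0)
    sol, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
    return sol.reshape(h, w)
```

To split an image, run this once on the shading-labeled derivatives (reflectance-labeled ones set to zero) and once on the reflectance-labeled set; in the log domain the two reconstructions add back to the input.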
[Figure: original image, its x-derivative, and the classification of each derivative (white is reflectance).]
For simple scenes, painted patterns under smooth illumination, it is clear whether each derivative is shading or reflectance. In general we classify each derivative using two kinds of local evidence:
– Color (chromaticity changes)
– Form (local image patterns)
How do we combine the cues into labels? We assume a probabilistic model and use belief propagation. The unknowns are the derivative labels (hidden random variables that we want to estimate); the local color evidence and local form evidence are tied to the labels by a statistical relationship that we'll specify. We propagate the local evidence in a Markov random field (local evidence, hidden state to be estimated, influence of neighbors); this strategy can be used to solve other low-level vision problems.
Color evidence: for a Lambertian surface under simple illumination conditions, shading only affects the intensity of the color of a surface. Notice that the chromaticity of each face is the same, so any change in chromaticity must be a reflectance change.
[Figure: in RGB space, an intensity change keeps the two color vectors c1 and c2 parallel (the angle θ between them equals 0), while a chromaticity change makes θ greater than 0.]
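The angle test can be sketched as follows; this is an illustrative snippet, and the 0.05-radian threshold is an assumption, not a value from the talk. Normalize the two RGB vectors c1 and c2 and measure the angle θ between them.

```python
import numpy as np

def chromaticity_angle(c1, c2):
    """Angle theta (radians) between two RGB vectors. theta = 0 means the
    colors differ only in intensity, which is consistent with shading."""
    u = np.asarray(c1, dtype=float); u /= np.linalg.norm(u)
    v = np.asarray(c2, dtype=float); v /= np.linalg.norm(v)
    return float(np.arccos(np.clip(u @ v, -1.0, 1.0)))

def color_label(c1, c2, angle_thresh=0.05):  # threshold is illustrative
    """A chromaticity change (theta > threshold) must be reflectance;
    an intensity-only change is left ambiguous."""
    return "reflectance" if chromaticity_angle(c1, c2) > angle_thresh else "ambiguous"
```

Note the asymmetry: a nonzero angle proves a reflectance change, but a zero angle does not prove shading, which is why the intensity-only case is labeled "ambiguous" below.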
[Results: input image, recovered reflectance, recovered shading.]
An intensity change alone could still be a gray-level reflectance change, so we label it as "ambiguous": we need more information. Form evidence: reflectance changes and the ripples of the fabric have very different local appearances, which our form classifiers take advantage of.
[Examples from the reflectance-change training set and from the shading training set.]
A classifier trained on these examples supplies the form evidence for the final classification.
The learned classifier is a weighted combination of weak classifiers:

$H(x) = \mathrm{sign}\!\left(\sum_i \alpha_i h_i(x)\right)$
Each weak classifier thresholds the absolute response, $\mathrm{abs}(F * I_p)$, of a linear filter $F$ applied to the small image patch $I_p$ used to classify each derivative. Boosting starts with an initial uniform weight on the training examples.
AdaBoost (Freund & Schapire '95): train weak classifier 1 on the weighted examples; re-weight incorrect classifications more heavily; train weak classifier 2; re-weight again; train weak classifier 3; and so on. The final classifier is a weighted combination of the weak classifiers, thresholded at $\theta$:

$f(x) = \sum_t \alpha_t h_t(x)$
$\alpha_t = 0.5 \,\log\!\left(\frac{1 - \mathrm{error}_t}{\mathrm{error}_t}\right)$

$w_i^{t+1} = \frac{w_i^t\, e^{-\alpha_t y_i h_t(x_i)}}{\sum_i w_i^t\, e^{-\alpha_t y_i h_t(x_i)}}$
Viola and Jones, Robust object detection using a boosted cascade of simple features, CVPR 2001
AdaBoost greedily minimizes the exponential classification cost

$J = \sum_i e^{-y_i f(x_i)}$
Treat $h_m$ as a perturbation and expand the loss $J$ to second order in $h_m$: fitting the perturbed classifier reduces to a squared-error cost function on re-weighted examples.
We train the form classifiers using the AdaBoost algorithm (see www.boosting.org for an introduction).
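The update rules above can be exercised with a minimal AdaBoost sketch using decision stumps as the weak classifiers; in the talk the weak learners threshold filter responses, so the stumps and toy data here are stand-ins.

```python
import numpy as np

def train_adaboost(X, y, rounds=10):
    """AdaBoost (Freund & Schapire '95) with threshold stumps.
    X: (n, d) feature matrix; y: labels in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                  # initial uniform weights
    ensemble = []
    for _ in range(rounds):
        best = None
        for j in range(d):                   # exhaustively pick the best stump
            for t in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = np.where(X[:, j] > t, s, -s)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, t, s)
        err, j, t, s = best
        err = np.clip(err, 1e-12, 1 - 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)          # alpha_t
        pred = np.where(X[:, j] > t, s, -s)
        w = w * np.exp(-alpha * y * pred)              # mistakes re-weighted up
        w /= w.sum()
        ensemble.append((alpha, j, t, s))
    return ensemble

def predict(ensemble, X):
    """H(x) = sign(sum_t alpha_t h_t(x))."""
    f = sum(a * np.where(X[:, j] > t, s, -s) for a, j, t, s in ensemble)
    return np.sign(f)
```

On a 1-D "interval" labeling that no single stump can fit, three boosting rounds already reach zero training error, illustrating why a weighted combination of weak filter tests can form a strong local classifier.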
The weak classifiers use filters drawn from these 4 categories:
– Multiple orientations of 1st-derivative-of-Gaussian filters
– Multiple orientations of 2nd-derivative-of-Gaussian filters
– Several widths of Gaussian filters
– An impulse
Examples of learned behavior: the classifier treats vertical derivatives differently when the illumination comes from the top of the image; if a filter response is above a threshold, it votes for reflectance, an empirical justification for the Retinex algorithm, which treats small derivative values as shading; and it treats edges perpendicular to the lighting direction as evidence for a reflectance change.
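One such weak classifier can be sketched like this; the filter construction and the 0.1 threshold are illustrative, not the learned values.

```python
import numpy as np

def gaussian_deriv_x(sigma=1.0, size=7):
    """First derivative of a Gaussian along x, one of the filter families above."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return -xx * g / (sigma**2 * g.sum())

def weak_classify(patch, filt, thresh):
    """Vote reflectance (+1) when the absolute filter response on the
    patch exceeds the threshold; otherwise vote shading (-1)."""
    resp = float(np.sum(patch * filt))       # inner product = response at center
    return 1 if abs(resp) > thresh else -1
```

A step-edge patch produces a large absolute response and votes reflectance; a constant (or gently shaded) patch produces a near-zero response and votes shading, mirroring the Retinex-style heuristic above.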
[Results: input image, recovered shading image, recovered reflectance image, using only the chromaticity cue.]
[Results: input image, shading, reflectance.]
Is the change here better explained as shading or as a reflectance change? Local evidence can be ambiguous, so we propagate information from reliable areas of the image into ambiguous areas of the image to produce consistent intrinsic images.
A Markov random field lets us put in local relationships and get global effects out.
[Graph: hidden scene nodes linked by a scene-scene compatibility function; each scene node linked to its local image node by an image-scene compatibility function.]

$P(x, y) = \prod_{(i,j)} \Psi(x_i, x_j) \prod_i \Phi(x_i, y_i)$

Inference in an MRF (how do we infer the hidden states?):
– Gibbs sampling, simulated annealing
– Iterated conditional modes (ICM)
– Variational methods
– Belief propagation
– Graph cuts
See www.ai.mit.edu/people/wtf/learningvision for a tutorial on learning and vision.
Derivation of belief propagation on a three-node chain (hidden nodes x1, x2, x3, each with an observation y_i):
[Graph: y1–x1, y2–x2, y3–x3, with Φ(x_i, y_i) on each observation link and Ψ(x_i, x_{i+1}) between neighbors.]

$x_{1\,\mathrm{MMSE}} = \operatorname{mean}_{x_1} \sum_{x_2} \sum_{x_3} P(x_1, x_2, x_3, y_1, y_2, y_3)$

$x_{1\,\mathrm{MMSE}} = \operatorname{mean}_{x_1} \sum_{x_2} \sum_{x_3} \Phi(x_1, y_1)\,\Psi(x_1, x_2)\,\Phi(x_2, y_2)\,\Psi(x_2, x_3)\,\Phi(x_3, y_3)$

Pushing each sum as far to the right as it will go:

$x_{1\,\mathrm{MMSE}} = \operatorname{mean}_{x_1}\, \Phi(x_1, y_1) \sum_{x_2} \Psi(x_1, x_2)\,\Phi(x_2, y_2) \sum_{x_3} \Psi(x_2, x_3)\,\Phi(x_3, y_3)$
The innermost sums are messages passed between nodes:

$x_{1\,\mathrm{MMSE}} = \operatorname{mean}_{x_1}\, \Phi(x_1, y_1) \sum_{x_2} \Psi(x_1, x_2)\,\Phi(x_2, y_2) \sum_{x_3} \Psi(x_2, x_3)\,\Phi(x_3, y_3)$

$M_1^2(x_1) = \sum_{x_2} \Psi(x_1, x_2)\,\Phi(x_2, y_2)\,M_2^3(x_2)$
In general, the belief at a node is the product of its incoming messages, and each message is computed recursively:

$b_j(x_j) = \prod_{k \in N(j)} M_j^k(x_j)$

$M_i^j(x_i) = \sum_{x_j} \psi_{ij}(x_i, x_j) \prod_{k \in N(j)\setminus i} M_j^k(x_j)$
For a chain of Gaussian variables, belief propagation is the Kalman filter; for discrete variables on a chain, it is the forward/backward algorithm (and the MAP variant is Viterbi).
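The chain derivation can be checked numerically. This is a sketch with arbitrary random potentials (two states per node, three nodes, observations folded into the Φ vectors): sum-product messages reproduce the exact marginals obtained by brute force.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 2                                        # states per hidden node x_i
Phi = [rng.random(K) for _ in range(3)]      # local evidence Phi(x_i, y_i), y fixed
Psi = [rng.random((K, K)) for _ in range(2)] # compatibilities Psi(x_i, x_{i+1})

# Brute force: P(x1, x2, x3) proportional to Phi1 Phi2 Phi3 Psi12 Psi23
joint = np.einsum('a,b,c,ab,bc->abc', Phi[0], Phi[1], Phi[2], Psi[0], Psi[1])
joint /= joint.sum()
exact = [joint.sum(axis=(1, 2)), joint.sum(axis=(0, 2)), joint.sum(axis=(0, 1))]

# Sum-product messages: right[i]/left[i] arrive at node i from each direction
right = [np.ones(K), None, None]
left = [None, None, np.ones(K)]
right[1] = Psi[0].T @ (Phi[0] * right[0])    # M_2^1(x2) = sum_x1 Psi(x1,x2) Phi1(x1)
right[2] = Psi[1].T @ (Phi[1] * right[1])
left[1] = Psi[1] @ (Phi[2] * left[2])        # M_2^3(x2) = sum_x3 Psi(x2,x3) Phi3(x3)
left[0] = Psi[0] @ (Phi[1] * left[1])

# Belief: b_i(x_i) proportional to Phi_i(x_i) times the incoming messages
beliefs = []
for i in range(3):
    b = Phi[i] * right[i] * left[i]
    beliefs.append(b / b.sum())
```

On a tree, two sweeps (left-to-right and right-to-left) suffice; the exactness asserted here is exactly what breaks once a loop is added, as the next slides discuss.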
[Graph: the same chain y1–x1, y2–x2, y3–x3, but with an added link Ψ(x1, x3) closing a loop.] With the loop, the factorization

$x_{1\,\mathrm{MMSE}} = \operatorname{mean}_{x_1}\, \Phi(x_1, y_1) \sum_{x_2} \Psi(x_1, x_2)\,\Phi(x_2, y_2) \sum_{x_3} \Psi(x_2, x_3)\,\Phi(x_3, y_3)$

no longer holds, because $\Psi(x_1, x_3)$ couples the two ends of the chain.
Justification for running belief propagation in networks with loops:
– Experimental results: error-correcting codes (Kschischang and Frey, 1998; McEliece et al., 1998); vision applications (Freeman and Pasztor, 1999; Frey, 2000)
– Theoretical results (Weiss and Freeman, 1999, 2000; Yedidia, Freeman, and Weiss, 2000; Wainwright, Willsky, and Jaakkola, 2001): for Gaussian processes, the means are correct; a max-product fixed point is a local maximum over a large neighborhood of MAP assignments; BP is equivalent to the Bethe approximation in statistical physics; tree-reweighted reparameterization
The beliefs are expressed in terms of messages:

$b_i(x_i) = k\, \Phi(x_i, y_i) \prod_{k \in N(i)} M_i^k(x_i)$

$b_{ij}(x_i, x_j) = k\, \Psi_{ij}(x_i, x_j) \prod_{k \in N(i)\setminus j} M_i^k(x_i) \prod_{k \in N(j)\setminus i} M_j^k(x_j)$

The belief propagation equations come from the marginalization constraint $b_i(x_i) = \sum_{x_j} b_{ij}(x_i, x_j)$, which yields the message update

$M_i^j(x_i) = \sum_{x_j} \psi_{ij}(x_i, x_j) \prod_{k \in N(j)\setminus i} M_j^k(x_j)$
A fixed point of belief propagation is a stationary point of the Bethe free-energy approximation. Both approaches minimize approximations to the free energy:
– Variational methods: usually use the primal variables.
– Belief propagation: fixed-point equations for the dual variables (the messages).
Kikuchi approximations lead to improved "generalized" belief propagation algorithms (Yedidia, Yuille, Welling, etc.), in which groups of nodes send messages to other groups of nodes.
[Figure: a typical choice of Kikuchi cluster on a grid.]
[Generalized BP update equations for the group messages, and the resulting marginal probabilities for nodes in one row of the cluster.]
Annotated pointers to the belief propagation literature:
– The classic
– Inspires the application of BP to vision
– Applications in super-resolution, motion, and shading/paint discrimination
– Application to stereo
– Reparameterization version
– The clearest place to read about BP and GBP
The compatibility function between neighboring derivatives encourages them to take the same label (Yedidia et al., 2000):

$\psi(x_i, x_j) = \begin{bmatrix} \beta & 1-\beta \\ 1-\beta & \beta \end{bmatrix}$
Propagating the classification: derivatives along image contours should have the same label. We increase β when both derivatives are along a contour and a significant gradient is present, setting β as a function of the image gradient's magnitude and orientation, between β = 0.5 (no preference) and β = 1.0 (labels forced to agree):

$\psi(x_i, x_j) = \begin{bmatrix} \beta & 1-\beta \\ 1-\beta & \beta \end{bmatrix}$
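The potential is easy to write down; in this sketch the mapping from gradient strength to β is an invented placeholder (the talk sets β from the gradient's magnitude and orientation but does not give this exact rule).

```python
import numpy as np

def compatibility(beta):
    """psi(x_i, x_j): beta on the diagonal rewards equal labels."""
    return np.array([[beta, 1.0 - beta],
                     [1.0 - beta, beta]])

def beta_along_contour(grad_mag, aligned, mag_thresh=1.0):
    """Placeholder rule: force agreement (beta = 1) only when both
    derivatives lie along a contour with a significant gradient."""
    return 1.0 if (aligned and grad_mag > mag_thresh) else 0.5
```

At β = 0.5 the matrix is uniform, so the potential is uninformative and propagation has no effect across that pair; at β = 1 the two labels must agree.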
[Results: input image; reflectance image without propagation; reflectance image with propagation.]
[Results: original image (from an L.L. Bean catalog); recovered shading; recovered reflectance. Note: the color cue was omitted for this processing.]
Finally, returning to our explanatory example:
[Input image, ideal shading image, ideal paint image, and the algorithm's output.]
Note: occluding edges are labeled as reflectance.
Summary: we separate an image into its shading and reflectance image components.
– Learn classifiers for derivatives based on local evidence, both color and form.
– Use belief propagation to propagate the local classifications.
For manuscripts, see www.ai.mit.edu/people/wtf/