Scattered Data Interpolation for Computer Graphics
J.P. Lewis (Weta Digital), Ken Anjyo (OLM Digital), Fred Pighin (*) (Google Inc)
SIGGRAPH Asia 2010 Course: Scattered Data Interpolation for Computer Graphics
Monday, January 3, 2011
Bickel, Lang, Botsch, Otaduy, Gross Pose-Space Animation and Transfer of Facial Details ACM SCA 2008
Noh and Neumann, Expression Cloning, SIGGRAPH 2001
Todo, Anjyo, Baxter, Igarashi, Locally Controllable Stylized Shading, SIGGRAPH 2007
Savchenko, Kojekine, Unno, A Practical Image Retouching Method
Carr, Beatson et al. Reconstruction and Representation of 3D Objects with Radial Basis Functions SIGGRAPH 2001
Levin, Lischinski, Weiss, Colorization using Optimization SIGGRAPH 2004
Kurihara & Miyata, Modeling Deformable Human Hands from Medical Images, ACM SCA 2004
Taehyun Rhee et al. Scan-Based Volume Animation Driven by Locally Adaptive Articulated Registrations, IEEE Trans. Visualization and Computer Graphics March 2011
Outline: Euclidean invariance; Shepard interpolation (comparisons with p = 1, 2, 5); kernel smoothing; Foley and Nielsen; moving least squares (MLS = Shepard's when m = 0); natural neighbor interpolation; notation; Gaussian process regression
- Use Shepard's to interpolate onto a regular grid
- Interpolate the grid with a regular spline
- Interpolate the residual with a second Shepard's
- iterate...
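A minimal 1D sketch of the Foley-Nielsen multistage steps above. The data, grid resolution, and the use of linear interpolation in place of a spline are my own assumptions for illustration:

```python
import numpy as np

def shepard(x_eval, x_data, d_data, p=2, eps=1e-12):
    """Shepard's inverse-distance-weighted interpolation (1D)."""
    w = 1.0 / (np.abs(x_eval[:, None] - x_data[None, :])**p + eps)
    return (w * d_data).sum(axis=1) / w.sum(axis=1)

# Hypothetical scattered samples of a smooth function.
x_data = np.array([0.05, 0.2, 0.35, 0.55, 0.7, 0.9])
d_data = np.sin(2 * np.pi * x_data)

grid = np.linspace(0.0, 1.0, 65)
g = shepard(grid, x_data, d_data)        # 1. Shepard's onto a regular grid
base = np.interp(x_data, grid, g)        # 2. interpolate the grid (linear stands in for a spline)
resid = d_data - base                    #    residual at the data points
g = g + shepard(grid, x_data, resid)     # 3. second Shepard's on the residual
f = lambda x: np.interp(x, grid, g)      # 4. repeat steps 2-3 to iterate further
```

The residual pass corrects the error that the grid resampling introduces at the data points.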
- Fit a polynomial (or other basis) independently at each evaluation point
- Use weighted least squares; de-weight data that are far from the evaluation point
- For interpolation, weights must go to infinity at the data points
The moving least squares estimate is $f(x) = \sum_{i=0}^{m} a_i(x)\, x^i$, where at each $x$ the coefficients solve
\[
\mathbf{a}(x) = \arg\min_{\mathbf{a}} \sum_{k=1}^{n} w_k(x) \Big( \sum_{i=0}^{m} a_i x_k^i - d_k \Big)^2,
\qquad
w_k(x) = \frac{1}{\|x - x_k\|^p} .
\]
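A 1D sketch of this weighted least-squares fit; the data and parameter values are illustrative assumptions:

```python
import numpy as np

def mls(x_eval, x_data, d_data, m=1, p=2, eps=1e-12):
    """Moving least squares (1D): at each evaluation point x, fit a
    degree-m polynomial by least squares with Shepard-style weights
    w_k = 1/|x - x_k|^p, then evaluate the fitted polynomial at x."""
    B = np.vander(x_data, m + 1, increasing=True)     # row k is (1, x_k, ..., x_k^m)
    powers = np.arange(m + 1)
    out = np.empty(len(x_eval))
    for j, x in enumerate(x_eval):
        w = 1.0 / (np.abs(x - x_data)**p + eps)       # weights blow up at data points
        BW = B * w[:, None]
        a = np.linalg.solve(B.T @ BW, BW.T @ d_data)  # weighted normal equations
        out[j] = a @ (x**powers)                      # evaluate the polynomial at x
    return out
```

With m = 1 and exactly linear data the fit reproduces the line everywhere; at the data points the weights diverge, so the result interpolates.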
\[
\min_{\mathbf{a}} \sum_{k=1}^{n} w_k(x) \Big( d_k - \sum_{i=0}^{m} a_i x_k^i \Big)^2 .
\]
With $\mathbf{b}_k \equiv (1, x_k, \ldots, x_k^m)^T \in \mathbb{R}^{m+1}$, the polynomial basis evaluated at the $k$-th data point, this becomes
\[
\min_{\mathbf{a}} \sum_{k=1}^{n} w_k(x) \big( d_k - \mathbf{b}_k^T \mathbf{a} \big)^2 ,
\]
solved for $\mathbf{a}$ anew at each evaluation point $x$.
With $m = 0$ the fit is a constant $a$:
\[
\min_a \sum_{k=1}^{n} w_k (a \cdot 1 - d_k)^2 = \min_a \sum_{k=1}^{n} w_k (a^2 - 2 a d_k + d_k^2) .
\]
Setting the derivative with respect to $a$ to zero gives
\[
a = \frac{\sum_k w_k d_k}{\sum_k w_k},
\]
i.e. Shepard's interpolation.
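A quick numeric check that the m = 0 moving least squares minimizer coincides with Shepard's formula; the sample values are arbitrary test data:

```python
import numpy as np

x_data = np.array([0.0, 0.3, 0.7, 1.0])
d_data = np.array([1.0, -0.5, 2.0, 0.25])
x = 0.4
w = 1.0 / np.abs(x - x_data)**2                   # Shepard weights, p = 2

shepard_value = (w * d_data).sum() / w.sum()

# Brute-force minimize sum_k w_k (a - d_k)^2 over a constant a.
a_grid = np.linspace(-1.0, 3.0, 100001)
cost = (w[None, :] * (a_grid[:, None] - d_data[None, :])**2).sum(axis=1)
mls_value = a_grid[cost.argmin()]
```

The two values agree up to the resolution of the search grid.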
Image: Wikipedia
Image: N. Sukumar, Natural Neighbor Interpolation and the Natural Element Method (NEM)
from Generalized Stochastic Subdivision, ACM TOG July 1987
Seek a function f that minimizes a roughness functional of f (subject to some constraints, to avoid a trivial solution).
- function: f(x) → y
- operator: Mf → g, e.g. matrix-vector multiplication
- direct matrix inverse
- Jacobi (because the matrix is quite sparse)
- Jacobi variants (SOR)
- multigrid
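As an illustration, Jacobi iteration for 1D Laplace interpolation: each unknown relaxes toward the average of its neighbors while the scattered constraints stay pinned. The grid size and constraint values here are hypothetical:

```python
import numpy as np

n = 33
known = {0: 0.0, 10: 1.0, 20: -0.5, 32: 0.25}   # scattered constraints
f = np.zeros(n)
for i, v in known.items():
    f[i] = v

for _ in range(5000):                            # Jacobi sweeps
    g = f.copy()
    g[1:-1] = 0.5 * (f[:-2] + f[2:])             # unknowns: neighbor average
    for i, v in known.items():                   # constraints stay fixed
        g[i] = v
    f = g
```

In 1D the converged solution is piecewise linear between the constraints; in 2D the same sweep (averaging four neighbors) gives membrane interpolation, and SOR or multigrid accelerate convergence.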
\[
L = \begin{bmatrix}
1 & -2 & 1 & & \\
 & 1 & -2 & 1 & \\
 & & \ddots & \ddots & \ddots
\end{bmatrix},
\]
where one column is zeroed for each constrained sample.
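A small direct-solve sketch of this construction (grid size and constraint values are made up): each row applies the [1, -2, 1] stencil at an unknown node, and the column of each known sample is zeroed after moving its contribution to the right-hand side.

```python
import numpy as np

n = 9
known = {0: 0.0, 4: 1.0, 8: 0.5}             # constrained samples
free = [i for i in range(n) if i not in known]

L = np.zeros((len(free), n))
for r, i in enumerate(free):
    L[r, i - 1:i + 2] = [1.0, -2.0, 1.0]     # [1, -2, 1] row centered on unknown i

rhs = np.zeros(len(free))
for i, v in known.items():
    rhs -= L[:, i] * v                       # known values move to the RHS...
    L[:, i] = 0.0                            # ...and their columns are zeroed

f = np.zeros(n)
f[free] = np.linalg.solve(L[:, free], rhs)
for i, v in known.items():
    f[i] = v
```

The result is the piecewise-linear interpolant of the three constraints, matching what Jacobi iteration converges to.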
From: Lifting Detail from Darkness, SIGGRAPH 2001
- Broomhead & Lowe, 1988
- Werntges, ICNN 1993
- in graphics: 1999-2001
- Micchelli: for a large class of basis functions, the RBF interpolation matrix is guaranteed to be nonsingular
- any monotonic function can be used?!
- common choices:
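A minimal 1D RBF interpolation sketch; the Gaussian kernel, its width, and the data are assumptions for illustration:

```python
import numpy as np

def rbf_interp(x_eval, x_data, d_data, sigma=0.2):
    """RBF interpolation: f(x) = sum_k w_k phi(|x - x_k|), with the
    weights w chosen so that f interpolates the data."""
    phi = lambda r: np.exp(-r**2 / (2.0 * sigma**2))    # Gaussian kernel
    A = phi(np.abs(x_data[:, None] - x_data[None, :]))  # A_ij = phi(|x_i - x_j|)
    w = np.linalg.solve(A, d_data)                      # solve A w = d
    return phi(np.abs(x_eval[:, None] - x_data[None, :])) @ w
```

Here sigma controls the kernel width; kernels that grow with distance (e.g. the thin-plate spline) generally need an added polynomial term and side conditions.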
- removes the “dips” that result from too-narrow σ
- i.e. somewhat less sensitive to the choice of σ
- (for a decaying kernel) far from the data, the closest point dominates
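A sketch of the normalized variant (assumed Gaussian kernel; data are illustrative): each basis function is divided by the sum of all basis functions, so the basis sums to one everywhere, and far from the data the nearest basis function dominates.

```python
import numpy as np

def nrbf_interp(x_eval, x_data, d_data, sigma=0.1):
    """Normalized RBF: basis R_k(x) = phi(|x - x_k|) / sum_j phi(|x - x_j|)."""
    phi = lambda xs: np.exp(-np.abs(xs[:, None] - x_data[None, :])**2
                            / (2.0 * sigma**2))
    B = phi(x_data)
    B = B / B.sum(axis=1, keepdims=True)     # normalized basis at the data points
    w = np.linalg.solve(B, d_data)           # interpolation conditions
    E = phi(x_eval)
    return (E / E.sum(axis=1, keepdims=True)) @ w
```

Because the normalized basis partitions unity, the surface stays within a weighted average of the w values instead of dipping toward zero between narrow kernels.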
Euclidean invariant Shepard Interpolation Comparison: Shepard’s p = 1 Comparison: Shepard’s p = 2 Comparison: Shepard’s p = 5 Kernel smoothing Foley and Nielsen Moving Least Squares Moving Least Squares Moving Least Squares MLS = Shepard’s when m = 0 Moving Least Squares Moving Least Squares Moving Least Squares Natural Neighbor Interpolation Natural Neighbor Interpolation Notation Gaussian Process regression Gaussian Process regression
51 / 84
Euclidean invariant Shepard Interpolation Comparison: Shepard’s p = 1 Comparison: Shepard’s p = 2 Comparison: Shepard’s p = 5 Kernel smoothing Foley and Nielsen Moving Least Squares Moving Least Squares Moving Least Squares MLS = Shepard’s when m = 0 Moving Least Squares Moving Least Squares Moving Least Squares Natural Neighbor Interpolation Natural Neighbor Interpolation Notation Gaussian Process regression Gaussian Process regression
52 / 84
Euclidean invariant Shepard Interpolation Comparison: Shepard’s p = 1 Comparison: Shepard’s p = 2 Comparison: Shepard’s p = 5 Kernel smoothing Foley and Nielsen Moving Least Squares Moving Least Squares Moving Least Squares MLS = Shepard’s when m = 0 Moving Least Squares Moving Least Squares Moving Least Squares Natural Neighbor Interpolation Natural Neighbor Interpolation Notation Gaussian Process regression Gaussian Process regression
53 / 84
Euclidean invariant Shepard Interpolation Comparison: Shepard’s p = 1 Comparison: Shepard’s p = 2 Comparison: Shepard’s p = 5 Kernel smoothing Foley and Nielsen Moving Least Squares Moving Least Squares Moving Least Squares MLS = Shepard’s when m = 0 Moving Least Squares Moving Least Squares Moving Least Squares Natural Neighbor Interpolation Natural Neighbor Interpolation Notation Gaussian Process regression Gaussian Process regression
54 / 84
Euclidean invariant Shepard Interpolation Comparison: Shepard’s p = 1 Comparison: Shepard’s p = 2 Comparison: Shepard’s p = 5 Kernel smoothing Foley and Nielsen Moving Least Squares Moving Least Squares Moving Least Squares MLS = Shepard’s when m = 0 Moving Least Squares Moving Least Squares Moving Least Squares Natural Neighbor Interpolation Natural Neighbor Interpolation Notation Gaussian Process regression Gaussian Process regression
55 / 84
Euclidean invariant Shepard Interpolation Comparison: Shepard’s p = 1 Comparison: Shepard’s p = 2 Comparison: Shepard’s p = 5 Kernel smoothing Foley and Nielsen Moving Least Squares Moving Least Squares Moving Least Squares MLS = Shepard’s when m = 0 Moving Least Squares Moving Least Squares Moving Least Squares Natural Neighbor Interpolation Natural Neighbor Interpolation Notation Gaussian Process regression Gaussian Process regression
56 / 84
Euclidean invariant Shepard Interpolation Comparison: Shepard’s p = 1 Comparison: Shepard’s p = 2 Comparison: Shepard’s p = 5 Kernel smoothing Foley and Nielsen Moving Least Squares Moving Least Squares Moving Least Squares MLS = Shepard’s when m = 0 Moving Least Squares Moving Least Squares Moving Least Squares Natural Neighbor Interpolation Natural Neighbor Interpolation Notation Gaussian Process regression Gaussian Process regression
57 / 84
Euclidean invariant Shepard Interpolation Comparison: Shepard’s p = 1 Comparison: Shepard’s p = 2 Comparison: Shepard’s p = 5 Kernel smoothing Foley and Nielsen Moving Least Squares Moving Least Squares Moving Least Squares MLS = Shepard’s when m = 0 Moving Least Squares Moving Least Squares Moving Least Squares Natural Neighbor Interpolation Natural Neighbor Interpolation Notation Gaussian Process regression Gaussian Process regression
58 / 84
Euclidean invariant Shepard Interpolation Comparison: Shepard’s p = 1 Comparison: Shepard’s p = 2 Comparison: Shepard’s p = 5 Kernel smoothing Foley and Nielsen Moving Least Squares Moving Least Squares Moving Least Squares MLS = Shepard’s when m = 0 Moving Least Squares Moving Least Squares Moving Least Squares Natural Neighbor Interpolation Natural Neighbor Interpolation Notation Gaussian Process regression Gaussian Process regression
59 / 84
Euclidean invariant Shepard Interpolation Comparison: Shepard’s p = 1 Comparison: Shepard’s p = 2 Comparison: Shepard’s p = 5 Kernel smoothing Foley and Nielsen Moving Least Squares Moving Least Squares Moving Least Squares MLS = Shepard’s when m = 0 Moving Least Squares Moving Least Squares Moving Least Squares Natural Neighbor Interpolation Natural Neighbor Interpolation Notation Gaussian Process regression Gaussian Process regression
60 / 84
- if there are few known points: use RBF
- if there are many points, use multigrid instead
- but Carr, Beatson et al. (SIGGRAPH 01) use FMM for fast fitting and evaluation of RBFs with many points
- simplifying assumptions: uniform sampling, 1 dimension
- S is a diagonal "selection matrix" with 1s and 0s
- P is a diagonal-constant matrix that encodes the smoothness (derivative) penalty
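As an illustration (my own sketch, not the course's code): a 1D regularized reconstruction with a diagonal selection matrix S and a diagonal-constant penalty matrix P, minimizing ||Sf - d||² + lam ||Pf||². The second-difference choice for P and the weight `lam` are assumptions.

```python
import numpy as np

n = 50                       # number of grid samples
known = [5, 17, 30, 44]      # indices where data is known (made up)
vals = [0.0, 1.0, -0.5, 0.3]

# S: diagonal "selection matrix" with 1s at known samples, 0s elsewhere
S = np.zeros((n, n))
S[known, known] = 1.0

# P: diagonal-constant (Toeplitz) matrix; here a second-difference penalty
P = np.zeros((n, n))
for i in range(1, n - 1):
    P[i, i - 1], P[i, i], P[i, i + 1] = 1.0, -2.0, 1.0

d = np.zeros(n)
d[known] = vals

lam = 1e-3   # regularization weight (assumed)
# normal equations of  min ||S f - d||^2 + lam ||P f||^2
f = np.linalg.solve(S.T @ S + lam * P.T @ P, S.T @ d)

# with a small lam the solution nearly interpolates the known values
print(np.allclose(f[known], vals, atol=0.05))
```

With `lam` small the data term dominates and the result is close to interpolation; larger values trade fidelity for smoothness.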
- eigenvectors of a circulant matrix are sinusoids,
- and P is diagonal-constant (Toeplitz), or nearly circulant,
- so $V D^{-2} V^T$ is approximately the same as taking the Fourier transform, rescaling, and transforming back
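The circulant fact is easy to check numerically. This sketch (not from the course) builds a small circulant matrix and verifies that the DFT matrix, whose columns are complex sinusoids, diagonalizes it:

```python
import numpy as np

n = 8
c = np.array([2.0, -1.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.0])  # first row (made up)
# build the circulant matrix: row k is the first row rotated by k
C = np.stack([np.roll(c, k) for k in range(n)])

# columns of the DFT matrix are the (complex) sinusoid eigenvectors
F = np.exp(-2j * np.pi * np.outer(np.arange(n), np.arange(n)) / n)

# if the columns of F are eigenvectors, F^{-1} C F is (numerically) diagonal
D = np.linalg.inv(F) @ C @ F
off = D - np.diag(np.diag(D))
print(np.max(np.abs(off)) < 1e-8)
```

A Toeplitz penalty matrix is only approximately circulant (the boundary rows differ), which is why the Fourier-domain view is approximate rather than exact.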
Application Examples
Example results

…the examples as well. See the EUROGRAPHICS 2006 paper by Baxter and Anjyo for details.
To build a latent doodle space (LDS):
Do the best possible automatically, but a good UI is still needed.
To generate the space of similar drawings (LDS):
This talk focuses on the first strategy: PCA + RBF.
[Figure: each stroke of a line drawing is encoded by a feature vector, e.g. stroke length and average position (x, y); the feature vectors of the example drawings are assembled into a matrix X, and PCA via $X^T X$ yields the eigenvectors and eigenvalues.]
The interpolating spline:
$$f(s,t) := \sum_i w_i\,\phi\big(\|(s,t) - (s_i,t_i)\|\big), \qquad r := \|(s,t)\| = \sqrt{s^2 + t^2}.$$
Application Examples
[Figure: the N·L intensity distribution, with values ranging from 0.0 to 1.0.]
Artists want neat shading, which a conventional shader does not directly provide.

[Figure: "conventional" vs. "desired" shading.]

- Edit undesirable shaded areas
- Add artistic light and shade to the original 3D lighting
Modify intensity according to painted strokes: a paint brush stroke drives the intensity modification.
Key-frame animation: the painted offset data are linearly interpolated between key frames.
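The per-frame offset lookup is just a linear blend of the offset data at the surrounding key frames. A minimal sketch, with invented 4×4 offset maps and key times:

```python
import numpy as np

# offset maps painted at two key frames (made-up 4x4 intensity offsets)
O0 = np.zeros((4, 4))
O1 = np.full((4, 4), 0.5)

def offset_at(t, t0=0.0, t1=10.0):
    # linear interpolation of the painted offset data between key frames
    a = (t - t0) / (t1 - t0)
    return (1.0 - a) * O0 + a * O1

# halfway between the key frames the offset is the average of the two maps
print(float(offset_at(5.0)[0, 0]))
```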
[Figure: the painted shade is interpolated by RBF; the modified intensity falls off from the painted region, with boundary constraints at a threshold, fall-off constraints beyond it, and no modification outside.]
…specified for the boundary-constraint and fall-off-constraint points. [Figure: boundary constraints at the threshold; fall-off constraints; no modification outside.]
$$f(\mathbf{x}) = \sum_{i=1}^{l} w_i\,\phi(\mathbf{x} - \mathbf{c}_i) + P(\mathbf{x}),$$
where $\mathbf{x} = (x_1, x_2, x_3)$; $P(\mathbf{x})$ is a linear polynomial of $x_1, x_2, x_3$; the $\mathbf{c}_i$ are the constraint points; $l$ is the number of all the constraint points; and
$$\phi(\mathbf{x}) := \|\mathbf{x}\| = \sqrt{x_1^2 + x_2^2 + x_3^2}.$$
For the given values $h_j$ $(1 \le j \le l)$ we have the interpolation conditions
$$\sum_{i=1}^{l} w_i\,\phi(\mathbf{c}_j - \mathbf{c}_i) + P(\mathbf{c}_j) = h_j \quad \text{for } 1 \le j \le l,$$
so there are $l + 4$ unknown values in total.
We further add the following condition: for any linear polynomial $Q$,
$$\sum_{i=1}^{l} w_i\, Q(\mathbf{c}_i) = 0.$$
Taking $Q = 1, x_1, x_2,$ and $x_3$, we know that the above condition means:
$$\sum_{i=1}^{l} w_i = 0, \qquad \sum_{i=1}^{l} w_i\, c_{ij} = 0 \quad (j = 1, 2, 3).$$
We have the following linear equation:
$$
\begin{pmatrix}
\phi_{11} & \cdots & \phi_{1l} & 1 & c_{11} & c_{12} & c_{13}\\
\vdots & \ddots & \vdots & \vdots & \vdots & \vdots & \vdots\\
\phi_{l1} & \cdots & \phi_{ll} & 1 & c_{l1} & c_{l2} & c_{l3}\\
1 & \cdots & 1 & 0 & 0 & 0 & 0\\
c_{11} & \cdots & c_{l1} & 0 & 0 & 0 & 0\\
c_{12} & \cdots & c_{l2} & 0 & 0 & 0 & 0\\
c_{13} & \cdots & c_{l3} & 0 & 0 & 0 & 0
\end{pmatrix}
\begin{pmatrix} w_1 \\ \vdots \\ w_l \\ p_0 \\ p_1 \\ p_2 \\ p_3 \end{pmatrix}
=
\begin{pmatrix} h_1 \\ \vdots \\ h_l \\ 0 \\ 0 \\ 0 \\ 0 \end{pmatrix},
\qquad \phi_{ij} := \phi(\mathbf{c}_i - \mathbf{c}_j).
$$
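The $(l+4)\times(l+4)$ system is small and direct to solve. A NumPy sketch using the biharmonic kernel $\phi(\mathbf{x}) = \|\mathbf{x}\|$ from the slides; the constraint points and values below are randomly invented:

```python
import numpy as np

rng = np.random.default_rng(0)
l = 10
C = rng.standard_normal((l, 3))   # constraint points c_i in R^3 (made up)
h = rng.standard_normal(l)        # prescribed values h_i (made up)

# phi(x) = ||x||  -- the biharmonic kernel used by the shade painter
Phi = np.linalg.norm(C[:, None, :] - C[None, :, :], axis=2)  # Phi[i,j] = ||c_i - c_j||
P = np.hstack([np.ones((l, 1)), C])                          # rows (1, c_i1, c_i2, c_i3)

# assemble the (l+4) x (l+4) system  [Phi P; P^T 0][w; p] = [h; 0]
A = np.block([[Phi, P], [P.T, np.zeros((4, 4))]])
b = np.concatenate([h, np.zeros(4)])
sol = np.linalg.solve(A, b)
w, p = sol[:l], sol[l:]

def f(x):
    # evaluate the interpolant: RBF part plus the linear polynomial
    return w @ np.linalg.norm(x - C, axis=1) + p[0] + p[1:] @ x

# the interpolant reproduces the data at the constraint points
print(all(abs(f(C[i]) - h[i]) < 1e-8 for i in range(l)))
```

The zero block enforces the orthogonality conditions on the $w_i$ while the polynomial absorbs the affine part of the data.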
RBF and RKHS
- Background
- Differential equation for TPS
- RBF as TPS
- Regularization problem
Latent doodle space uses TPS:
$$G(\mathbf{x}) = \|\mathbf{x}\|^2 \log\|\mathbf{x}\|, \qquad p(\mathbf{x}) = p(x_1, x_2) = c_0 + c_1 x_1 + c_2 x_2.$$
The shade painter employs
$$G(\mathbf{x}) = \|\mathbf{x}\| = \sqrt{x_1^2 + x_2^2 + x_3^2}$$
and $p$ is linear:
$$p(\mathbf{x}) = p(x_1, x_2, x_3) = p_0 + p_1 x_1 + p_2 x_2 + p_3 x_3.$$
PSD applications use Gaussian RBF with no polynomial term ($p(\mathbf{x}) \equiv 0$).
$$f_k = \sum_{i=1}^{N} \alpha_i\, G(\mathbf{x}_k - \mathbf{x}_i) + p(\mathbf{x}_k),$$
where $G(\mathbf{x}) = \|\mathbf{x}\| = \sqrt{x_1^2 + x_2^2 + x_3^2}$ and $\mathbf{x}_i = (x_{i1}, x_{i2}, x_{i3})^T$, subject to
$$\sum_{i=1}^{N} \alpha_i = 0, \qquad \sum_{i=1}^{N} \alpha_i\, x_{ij} = 0 \quad (j = 1, 2, 3).$$
Get the values of $\{\alpha_i\}$ and the four coefficients of $p$ by solving the linear equation system!
Functional analysis might be helpful to answer these questions.
The TPS minimizes the following energy:
$$F(f) = \int_{\mathbf{R}^2} \Big( f_{x_1 x_1}(\mathbf{x})^2 + 2 f_{x_1 x_2}(\mathbf{x})^2 + f_{x_2 x_2}(\mathbf{x})^2 \Big)\, dx_1\, dx_2$$
over the space of functions $\{\, f : F(f) < \infty \,\}$ in $\mathbf{R}^2$.
Let $f = \arg\min F$. For $t \in \mathbf{R}$ and any $g$ that is a smooth function with compact support ($g(\mathbf{x}) = 0$ if $\mathbf{x} \notin \Omega$), define
$$G(t) = F(f + t\, g).$$
$G(t)$ is then a quadratic function of $t$, and it must satisfy $G'(0) = 0$:
$$G(t) = F(f) + t^2 \int \big( g_{x_1 x_1}^2 + 2 g_{x_1 x_2}^2 + g_{x_2 x_2}^2 \big)\, d\mathbf{x} + t \int \big( 2 f_{x_1 x_1} g_{x_1 x_1} + 4 f_{x_1 x_2} g_{x_1 x_2} + 2 f_{x_2 x_2} g_{x_2 x_2} \big)\, d\mathbf{x}.$$
$G'(0) = 0$ gives, integrating by parts twice (the boundary terms vanish because $g$ has compact support):
$$0 = \int \big( f_{x_1 x_1} g_{x_1 x_1} + 2 f_{x_1 x_2} g_{x_1 x_2} + f_{x_2 x_2} g_{x_2 x_2} \big)\, d\mathbf{x} = -\int \big( f_{x_1 x_1 x_1} g_{x_1} + 2 f_{x_1 x_2 x_1} g_{x_2} + f_{x_2 x_2 x_2} g_{x_2} \big)\, d\mathbf{x}$$
$$= \int \big( f_{x_1 x_1 x_1 x_1} + 2 f_{x_1 x_2 x_1 x_2} + f_{x_2 x_2 x_2 x_2} \big)\, g\, d\mathbf{x} = \int \big( \Delta^2 f \big)\, g\, d\mathbf{x}$$
for every such $g$, and therefore
$$\Delta^2 f = \Big( \frac{\partial^2}{\partial x_1^2} + \frac{\partial^2}{\partial x_2^2} \Big)^2 f = 0.$$
In fact, with
$$\phi(\mathbf{x}) = \|\mathbf{x}\|^2 \log\|\mathbf{x}\|,$$
we have
$$\Delta^2 \phi(\mathbf{x}) = \delta(\mathbf{x}) \quad \text{(up to a constant factor)},$$
where $\delta(\mathbf{x})$ is the Dirac delta; i.e. $\phi$ is the fundamental solution of the biharmonic equation.
Problem statement (regularization): given $\Omega = \mathbf{R}^2$ and data $(\mathbf{x}_i, f_i) \in \mathbf{R}^2 \times \mathbf{R}$ $(i = 1, 2, \ldots, N)$, find
$$\min_f \sum_{i=1}^{N} \big( f_i - f(\mathbf{x}_i) \big)^2 + \lambda\, F(f).$$
The problem is restated in $n$ dimensions and, instead of $F$, with:
$$\Omega = \mathbf{R}^n, \quad (\mathbf{x}_i, f_i) \in \mathbf{R}^n \times \mathbf{R} \; (i = 1, 2, \ldots, N), \quad F = J_2^{\,2}.$$
The same interpolation data with the seminorm as objective: minimize $J_2^{\,2}(f) = F(f)$ given $(\mathbf{x}_i, f_i) \in \mathbf{R}^n \times \mathbf{R}$ $(i = 1, 2, \ldots, N)$.
…with fidelity to the data and smoothness of the solution enforced by $J_m^n$:
$$\min_f \sum_{i=1}^{N} \big( f_i - f(\mathbf{x}_i) \big)^2 + \lambda\, J_m^n(f)^2, \qquad \mathbf{x} = (x_1, x_2, \ldots, x_n)^T \in \mathbf{R}^n.$$
The coefficients $\{\alpha_i\}$ satisfy
$$\sum_{k=1}^{N} \alpha_k\, Q(\mathbf{x}_k) = 0 \quad \text{for all } Q.$$
Latent doodle space uses TPS, where m = n = 2 and p is a linear polynomial. The shade painter deals with the case where m = 2, n = 3, and p is linear.
What is functional analysis? RBF and RKHS

Example spaces: $\mathbf{R}^n$, $\ell^2$.
Recall the technique of "integration by parts" in the TPS case. We therefore have the corresponding inner-product structure on the space $H_m^n$.
The minimizer is a combination of kernels centered at the data points,
$$f = \sum_k \alpha_k\, G(\mathbf{x} - \mathbf{c}_k), \qquad \mathbf{c}_i = \mathbf{x}_i \ \text{for } 1 \le i \le N$$
(the representer theorem), in $H_m^n$.
The kernel has the following properties:
$$K(\mathbf{x}, \mathbf{y}) := G(\mathbf{x} - \mathbf{y}), \qquad \Delta^m G(\mathbf{x}) = \delta(\mathbf{x}).$$
This gives an alternative definition of RKHS. $K$ is called the kernel function of the RKHS.
In the Beppo-Levi space $B_m^n$, with inner product
$$\langle f, g \rangle_{B_m^n} = (-1)^m \langle \Delta^m f, g \rangle_{L^2},$$
take $f = \sum_{i=1}^{N} \alpha_i\, G(\mathbf{x} - \mathbf{x}_i)$. Then for a polynomial $Q$,
$$\langle f, Q \rangle_{B_m^n} = (-1)^m \sum_{i=1}^{N} \alpha_i\, \big\langle \Delta^m G(\mathbf{x} - \mathbf{x}_i),\, Q(\mathbf{x}) \big\rangle_{L^2} = (-1)^m \sum_{i=1}^{N} \alpha_i\, \big\langle \delta(\mathbf{x} - \mathbf{x}_i),\, Q(\mathbf{x}) \big\rangle_{L^2} = (-1)^m \sum_{i=1}^{N} \alpha_i\, Q(\mathbf{x}_i) = 0.$$
…a polynomial term this time):
$$f = \sum_{i=1}^{N} \alpha_i\, G(\mathbf{x} - \mathbf{x}_i).$$
[Residue of a norm expansion in $\mathbf{R}^n$ over $m \ge m_0$ with positive coefficients $a_m > 0$.]
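As a complementary sketch, the Gaussian-kernel case mentioned earlier (PSD-style, with no polynomial term) reduces to a single positive-definite solve. The pose samples, values, and width `sigma` below are invented:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 8
X = rng.standard_normal((N, 3))   # pose samples in R^3 (made up)
h = rng.standard_normal(N)        # sculpted values at those poses (made up)

sigma = 1.0   # kernel width (assumed)
def G(r):
    # Gaussian radial kernel
    return np.exp(-r * r / (2.0 * sigma * sigma))

# kernel matrix; positive definite for distinct points, so no polynomial
# term or side conditions are needed
Phi = G(np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2))
alpha = np.linalg.solve(Phi, h)

def f(x):
    return alpha @ G(np.linalg.norm(x - X, axis=1))

# the interpolant reproduces the data at the sample poses
print(max(abs(f(X[i]) - h[i]) for i in range(N)) < 1e-6)
```

Unlike the biharmonic/TPS kernels, the Gaussian is positive definite on its own, which is why the orthogonality conditions and the appended polynomial drop out.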
Functional analysis should be helpful to answer these questions.