Learning with Low Rank Approximations
or how to use near separability to extract content from structured data

Jeremy E. Cohen
IRISA, INRIA, CNRS, University of Rennes, France
30 April 2019
1 Introduction: separability and matrix/tensor rank
2 Semi-supervised learning: dictionary-based matrix and tensor factorization
3 Complete dictionary learning for blind source separation
4 Joint factorization models: some facts, and the linearly coupled case
∫ y g(y) dy
…, but it is a separable function of the y_i:
∑_{q=1}^{r} a_q ⊗ b_q ⊗ c_q

multilinear rank (r_1, r_2, r_3)

{ T : T = ∑_{q=1}^{r} a_q ⊗ b_q ⊗ c_q }
→ I don’t know; this is the difficult part, but at least you may think about separability in the future.
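As a quick numerical illustration of the rank-r model ∑_{q=1}^{r} a_q ⊗ b_q ⊗ c_q above, here is a minimal NumPy sketch; the dimensions and random factors are assumptions chosen only for the example, not values from the talk.

```python
import numpy as np

# Minimal sketch: build a rank-r third-order tensor as a sum of
# outer products a_q ⊗ b_q ⊗ c_q. All sizes are illustrative assumptions.
rng = np.random.default_rng(0)
I, J, K, r = 4, 5, 6, 3
A = rng.standard_normal((I, r))  # columns a_q
B = rng.standard_normal((J, r))  # columns b_q
C = rng.standard_normal((K, r))  # columns c_q

# T[i, j, k] = sum_q A[i, q] * B[j, q] * C[k, q]
T = np.einsum('iq,jq,kq->ijk', A, B, C)
print(T.shape)  # (4, 5, 6)
```

The einsum contraction is exactly the separable sum: each term is a rank-one tensor built from one column of each factor.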
T = ∑_{q=1}^{r} a_q ⊗ b_q ⊗ c_q + N, with N some small Gaussian noise,

min_{A,B,C} ‖T − ∑_{q=1}^{r} a_q ⊗ b_q ⊗ c_q‖²_F
1 Pixels can contain several materials → unmixing!
2 Spectra and Abundances are nonnegative!
3 Few materials, many wavelengths
min_{W≥0, H≥0} ‖M − ∑_{q=1}^{r} w_q h_qᵀ‖²_F
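This NMF problem is usually solved with block updates. As a hedged illustration, here is the classic Lee–Seung multiplicative-update scheme for the Frobenius loss (the talk's experiments use HALS variants; this simpler scheme is shown only as an example, with assumed sizes and iteration count):

```python
import numpy as np

def nmf_mu(M, r, n_iter=500, eps=1e-12, seed=0):
    """Lee–Seung multiplicative updates for min_{W,H >= 0} ||M - WH||_F^2.
    Factors stay elementwise nonnegative because updates are multiplicative."""
    rng = np.random.default_rng(seed)
    m, n = M.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ M) / (W.T @ W @ H + eps)
        W *= (M @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Each update never increases the Frobenius cost, which is why this is a popular (if slow) baseline for hyperspectral unmixing-style problems.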
1 How to deal with the semi-supervised setting?
2 Blind is hard! E.g., NMF is often not identifiable.
3 What about dealing with several data sets (Hyper-Multispectral, time discussions)?
NMF
[Figure: NMF spectral signatures (x-axis: spectral band index) and abundance maps]
d ≫ r
T = ∑_{q=1}^{r} d_{s_q} ⊗ b_q ⊗ c_q, then the following holds:

min_{S,B,C} ‖T − ∑_{q=1}^{r} d_{s_q} ⊗ b_q ⊗ c_q‖²_F
min_{A,B,C,K} ‖T − ∑_{q=1}^{r} a_q ⊗ b_q ⊗ c_q‖²_F + λ‖A − D(:, K)‖²_F
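One way to read the λ‖A − D(:, K)‖²_F penalty: between factor updates, each column of A is matched to its closest dictionary atom D(:, k). Below is a hedged sketch of that selection step only; the function name and shapes are assumptions, not the exact MPALS code.

```python
import numpy as np

def nearest_atoms(A, D):
    """For each column a_q of A, return the index k_q of the dictionary
    column d_k minimizing ||a_q - d_k||_2, and the matched atoms D(:, K)."""
    # All pairwise squared distances: ||d_k||^2 - 2 d_k·a_q + ||a_q||^2
    d2 = (np.sum(D**2, axis=0)[:, None]
          - 2.0 * (D.T @ A)
          + np.sum(A**2, axis=0)[None, :])
    K = np.argmin(d2, axis=0)  # one atom index per column of A
    return K, D[:, K]
```

Alternating this matching with least-squares updates of the other factors is the general pattern behind dictionary-constrained factorizations.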
Figure: Spectral signatures and abundance maps identified using MPALS for the Urban data set with r = 6.
[Figure: sources b_j and libraries E_l]
r(r−2)/(r−k) … O(r³/(r−k)²) data points are sufficient for ensuring uniqueness.
min_{A≥0, B≥0, ‖b_i‖_0≤k} ‖M − ∑_{q=1}^{r} a_q b_qᵀ‖²_F
1 If k and r are small, trying all (r choose k) supports is an option.
2 We can try a variant of k-means.
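The first option (enumerate all size-k supports when k and r are small) can be sketched as follows. This is an assumption-laden illustration: the nonnegativity handling is a crude clip after unconstrained least squares rather than exact NNLS, and all names are invented for the example.

```python
import numpy as np
from itertools import combinations

def best_k_support(A, m, k):
    """Enumerate all supports S of size k, solve least squares on the
    columns A[:, S], clip to >= 0 (crude stand-in for exact NNLS),
    and keep the support with the smallest residual."""
    r = A.shape[1]
    best_b, best_err = None, np.inf
    for S in combinations(range(r), k):
        cols = A[:, list(S)]
        b_S, *_ = np.linalg.lstsq(cols, m, rcond=None)
        b_S = np.maximum(b_S, 0.0)          # enforce nonnegativity crudely
        err = np.linalg.norm(m - cols @ b_S)
        if err < best_err:
            b = np.zeros(r)
            b[list(S)] = b_S
            best_b, best_err = b, err
    return best_b, best_err
```

The cost grows as (r choose k) least-squares solves, which is exactly why this brute force is only viable for small k and r.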
[Figure: quasi-perfect reconstruction of D (%) and relative MSE on D versus d, comparing ESNA (proposed), KSub, NMF-HALS, Lasso-HALS, NNOMP, SuNNOMP, and NOLRAK (proposed)]
[B_1ᵀ, …, B_Kᵀ]
P_kᵀ P_k = I and P_k ∈ ℝ^{J×r} (if r < J).
∑_k B_k is constant.