SLIDE 1

A message-passing approach to low-rank matrix reconstruction and application to clustering

Toshiyuki TANAKA tt@i.kyoto-u.ac.jp

Graduate School of Informatics, Kyoto University

1 September, 2014


SLIDE 2

Table of contents

1. Low-rank matrix reconstruction via message passing
2. Application to clustering
3. Application to multivariate Poisson clustering
4. Conclusions

SLIDE 3

Collaborators

Ryosuke Matsushita (NTT DATA Mathematical Systems Inc., Japan)
Kei Sano (Graduate School of Informatics, Kyoto University, Japan)

SLIDE 4

Brief biography

1993: Graduated from the Department of Electronic Engineering, Graduate School of Engineering, the University of Tokyo.
1993–2005: Department of Electronics and Information Engineering, Tokyo Metropolitan University.
2005–: Graduate School of Informatics, Kyoto University.

SLIDE 5

Analysis and extensions of compressed sensing and low-rank matrix reconstruction

1. Low-rank matrix reconstruction via message passing
   Problem formulation
   Approach
2. Application to clustering
3. Application to multivariate Poisson clustering
4. Conclusions

SLIDE 6

Low-rank matrix reconstruction via message passing

Reference: R. Matsushita and T. Tanaka, “Low-rank matrix reconstruction and clustering via approximate message passing,” in C. J. C. Burges et al. (eds.), Advances in Neural Information Processing Systems, vol. 26, pp. 917–925, 2013.

SLIDE 7

Analysis and extensions of compressed sensing and low-rank matrix reconstruction

1. Low-rank matrix reconstruction via message passing
   Problem formulation
   Approach
2. Application to clustering
3. Application to multivariate Poisson clustering
4. Conclusions

SLIDE 8

Low-rank matrix reconstruction

Problem formulation:
A0 ∈ R^{M×N}, rank A0 = r ≪ min{M, N} (low-rank).
Observation noise: W = (W_ij) ∈ R^{M×N}, W_ij ~ N(0, Mσ²).
Observe (part of) A = A0 + W → estimate the low-rank matrix A0.

Truncated SVD: take the SVD A = UΣV^T and let Â = UΣ_r V^T, where Σ_r keeps only the r largest singular values of Σ.

Nuclear-norm minimization:
Â = arg min_{X ∈ R^{M×N}} [ ‖X‖_* + λ‖X − A‖_F² ]
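The truncated-SVD estimator above can be sketched in a few numpy lines (the dimensions, rank, and noise level are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
M, N, r = 40, 60, 3
A0 = rng.normal(size=(M, r)) @ rng.normal(size=(r, N))   # rank-r ground truth
A = A0 + 0.1 * rng.normal(size=(M, N))                   # noisy observation

# Keep only the r largest singular values: A_hat = U Sigma_r V^T.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_hat = (U[:, :r] * s[:r]) @ Vt[:r, :]

# Truncation suppresses most of the noise outside the signal subspace.
assert np.linalg.norm(A_hat - A0) < np.linalg.norm(A - A0)
```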

SLIDE 9

Schatten norms

Schatten p-norm: ‖A‖_p := (Σ_i σ_i^p)^{1/p}, p ≥ 1.
Rank = number of non-zero singular values = “0-norm”.
Nuclear norm = Schatten 1-norm; regarded as a convex relaxation of the rank.
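A numeric sketch of these norms (the helper name `schatten_norm` is ours):

```python
import numpy as np

def schatten_norm(A, p):
    """Schatten p-norm: the l_p norm of the vector of singular values."""
    s = np.linalg.svd(A, compute_uv=False)
    return (s ** p).sum() ** (1.0 / p)

A = np.diag([3.0, 4.0])                      # singular values {4, 3}
assert np.isclose(schatten_norm(A, 1), 7.0)  # nuclear norm (Schatten-1)
assert np.isclose(schatten_norm(A, 2), 5.0)  # Frobenius norm (Schatten-2)
```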

SLIDE 10

Low-rank matrix reconstruction

Formulation via probability model:
A = UV^T + W, U ∈ R^{M×r}, V^T ∈ R^{r×N}
⇒ p(A | U, V) ∝ Π_{i,j} exp[ −(A_ij − u_i^T v_j)² / (2Mσ²) ]
p(U) ∝ Π_{i=1}^M e^{−c(u_i)},  p(V) ∝ Π_{j=1}^N e^{−c(v_j)}
⇒ p(U, V | A) ∝ p(A | U, V) p(U) p(V)
  = [ Π_{i,j} e^{−(A_ij − u_i^T v_j)²/(2Mσ²)} ] [ Π_{i=1}^M e^{−c(u_i)} ] [ Π_{j=1}^N e^{−c(v_j)} ]

SLIDE 11

Low-rank matrix reconstruction

Formulation via probability model:
p(U, V | A) ∝ p(A | U, V) p(U) p(V)
  = [ Π_{i,j} e^{−(A_ij − u_i^T v_j)²/(2Mσ²)} ] [ Π_{i=1}^M e^{−c(u_i)} ] [ Π_{j=1}^N e^{−c(v_j)} ]

Allows straightforward incorporation of prior knowledge on U and V:
Non-negativity (Paatero–Tapper, 1994): c_u(u_i) = 0 (u_i ≥ 0), ∞ (otherwise)
Sparseness (Olshausen–Field, 1996): c_u(u_i) = ‖u_i‖_0 or ‖u_i‖_1

Generally a non-convex optimization; hard to solve.
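The cost functions c(·) enter the negative log-posterior as additive penalties; a minimal sketch (the names `neg_log_posterior`, `nonneg`, and `l1` are ours, and the dimensions are illustrative):

```python
import numpy as np

def neg_log_posterior(U, V, A, sigma2, c_u, c_v):
    """-log p(U, V | A) up to an additive constant, Gaussian-noise model."""
    M = A.shape[0]
    fit = ((A - U @ V.T) ** 2).sum() / (2 * M * sigma2)
    return fit + sum(c_u(u) for u in U) + sum(c_v(v) for v in V)

nonneg = lambda u: 0.0 if (u >= 0).all() else np.inf  # non-negativity prior
l1 = lambda u: np.abs(u).sum()                        # sparsity prior

rng = np.random.default_rng(1)
U = np.abs(rng.normal(size=(4, 2)))   # non-negative factors
V = np.abs(rng.normal(size=(5, 2)))
A = U @ V.T                           # noiseless observation
# Exact non-negative factors: zero fit term, zero non-negativity penalty.
assert neg_log_posterior(U, V, A, 1.0, nonneg, nonneg) == 0.0
```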

SLIDE 12

Low-rank matrix reconstruction

Two alternative objectives:
Posterior-mean (PM) estimation, optimal in terms of ‖UV^T − ÛV̂^T‖_F:
  (Û_PM, V̂_PM) = E_{p(U,V|A)}[(U, V)]
Maximum-a-posteriori (MAP) estimation:
  (Û_MAP, V̂_MAP) = arg max_{U,V} p(U, V | A)

SLIDE 13

Low-rank matrix reconstruction

PM estimation: (Û_PM, V̂_PM) = E_{p(U,V|A)}[(U, V)]
MAP estimation: (Û_MAP, V̂_MAP) = arg max_{U,V} p(U, V | A)

One-parameter extension: p(U, V | A; β) ∝ [p(U, V | A)]^β
Extended PM estimation: (Û_β, V̂_β) = E_{p(U,V|A;β)}[(U, V)]
PM estimation: (Û_PM, V̂_PM) = (Û_β, V̂_β)|_{β=1}
MAP estimation: (Û_MAP, V̂_MAP) = (Û_β, V̂_β)|_{β→∞}
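The effect of the β-tilting can be seen on a toy two-point posterior (made up for the example): β = 1 recovers the posterior mean, and large β concentrates the estimate on the mode, i.e. the MAP value:

```python
import numpy as np

def beta_posterior_mean(values, probs, beta):
    """Mean under the tilted distribution p(u)^beta / sum_u p(u)^beta."""
    w = probs ** beta
    w = w / w.sum()
    return (values * w).sum()

vals = np.array([0.0, 1.0])
p = np.array([0.4, 0.6])                                               # posterior masses
assert np.isclose(beta_posterior_mean(vals, p, 1.0), 0.6)              # PM estimate
assert np.isclose(beta_posterior_mean(vals, p, 50.0), 1.0, atol=1e-6)  # ≈ MAP estimate
```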

SLIDE 14

Analysis and extensions of compressed sensing and low-rank matrix reconstruction

1. Low-rank matrix reconstruction via message passing
   Problem formulation
   Approach
2. Application to clustering
3. Application to multivariate Poisson clustering
4. Conclusions

SLIDE 15

Our approach

p(U, V | A; β) ∝ [ Π_{i,j} e^{−β(A_ij − u_i^T v_j)²/(2Mσ²)} ] [ Π_{i=1}^M e^{−βc(u_i)} ] [ Π_{j=1}^N e^{−βc(v_j)} ]

Belief propagation (BP) / approximate message passing (AMP):
Factor-graph representation. Apply BP ⇒ message-passing algorithm; messages are densities.
Take the large-system limit ⇒ AMP algorithm; messages are parameters.

SLIDE 16

[Factor graph: factor nodes (i, j), 1 ≤ i ≤ 4, 1 ≤ j ≤ 5; variable nodes u_1, …, u_4 and v_1, …, v_5]

p(U, V | A; β) ∝ [ Π_{i,j} e^{−β(A_ij − u_i^T v_j)²/(2Mσ²)} ] [ Π_{i=1}^M e^{−βc(u_i)} ] [ Π_{j=1}^N e^{−βc(v_j)} ]

SLIDE 17

[Factor graph as on slide 16]

μ_{i→(i,j)}(u_i) ∝ p(u_i) Π_{l≠j} λ_{(i,l)→i}(u_i)
λ_{(i,j)→i}(u_i) ∝ ∫ e^{−(A_ij − u_i^T v_j)²/(2Mσ²)} μ_{j→(i,j)}(v_j) dv_j

Messages have a functional degree of freedom. Hard to implement.

SLIDE 19

[Factor graph as on slide 16]

μ_{i→(i,j)}(u_i) ∝ p(u_i) Π_{l≠j} λ_{(i,l)→i}(u_i)
  ∝ p(u_i) ∫ e^{−Σ_{l≠j} (A_il − u_i^T v_l)²/(2Mσ²)} Π_{l≠j} μ_{l→(i,l)}(v_l) dv_l
λ_{(i,j)→i}(u_i) ∝ ∫ e^{−(A_ij − u_i^T v_j)²/(2Mσ²)} μ_{j→(i,j)}(v_j) dv_j

AMP: apply the central limit theorem (CLT) and represent messages in terms of their means and covariances.

SLIDE 22

Approximate message passing

Approximate message passing: BP on a dense random graph. Formulated for compressed sensing (Donoho–Maleki–Montanari, 2009). Macroscopic description as density evolution (state evolution, SE). The same idea traces back to:
Perceptron learning (Wong, 1995; Opper–Winther, 1996)
CDMA multiuser detection (Kabashima, 2003)
…

SLIDE 23

Approach based on BP / AMP

BP / AMP approaches to low-rank matrix reconstruction:
r = 1 (Rangan–Fletcher, 2012)
r ≥ 1 (Matsushita–Tanaka, 2012; Javanmard–Montanari, 2012)
r ≫ 1 (“not so low-rank”) ⇒ dictionary learning (Kabashima et al., 2014)

SLIDE 24

Approach based on BP / AMP

Iterative algorithm: alternately update (U^t, S^t) and (V^t, T^t):
U^0 → (V^1, T^1) → (U^1, S^1) → (V^2, T^2) → (U^2, S^2) → (V^3, T^3) → (U^3, S^3) → …

U^t = (u_1^t, …, u_M^t)^T,  V^t = (v_1^t, …, v_N^t)^T,
S^t = {S_1^t, …, S_M^t},  T^t = {T_1^t, …, T_N^t},  S_i^t, T_j^t ∈ R^{r×r}

SLIDE 25

Approach based on BP / AMP

Iterative algorithm, U-step:
B_u^t = (1/(Mσ²)) [ A V^t − U^{t−1} Σ_{j=1}^N T_j^t ] = (b_{u1}^t, …, b_{uM}^t)^T,
Λ_u^t = (1/(Mσ²)) [ (V^t)^T V^t + (β^{−1} − 1) Σ_{j=1}^N T_j^t ],
u_i^t = f_β(b_{ui}^t, Λ_u^t; p_{ui}),
S_i^t = ∂f_β(b_{ui}^t, Λ_u^t; p_{ui}) / ∂b_{ui}^t

SLIDE 26

Approach based on BP / AMP

Iterative algorithm, V-step:
B_v^t = (1/(Mσ²)) [ A^T U^t − V^t Σ_{i=1}^M S_i^t ] = (b_{v1}^t, …, b_{vN}^t)^T,
Λ_v^t = (1/(Mσ²)) [ (U^t)^T U^t + (β^{−1} − 1) Σ_{i=1}^M S_i^t ],
v_j^{t+1} = f_β(b_{vj}^t, Λ_v^t; p_{vj}),
T_j^{t+1} = ∂f_β(b_{vj}^t, Λ_v^t; p_{vj}) / ∂b_{vj}^t
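With a standard Gaussian prior on the rows of U and V (for which f_β has a closed form, see slide 27) and β = 1, one alternating sweep can be sketched as follows. This is only a sketch under our own assumptions: the prior is our choice, the example takes M = N so one routine serves both half-steps, and all dimensions are illustrative:

```python
import numpy as np

def amp_sweep(A, X_prev, Y, corr_sum, sigma2):
    """One half-step: update (U, sum of S_i) from (V, sum of T_j), or vice
    versa. Standard Gaussian prior, beta = 1, square case M = N assumed."""
    M = A.shape[0]
    r = Y.shape[1]
    B = (A @ Y - X_prev @ corr_sum) / (M * sigma2)   # matched filter + correction
    Lam = (Y.T @ Y) / (M * sigma2)                   # (beta^-1 - 1) term vanishes at beta = 1
    prec = Lam + np.eye(r)                           # per-row posterior precision
    X = np.linalg.solve(prec, B.T).T                 # row-wise means f_beta(b, Lam)
    return X, M * np.linalg.inv(prec)                # sum_i S_i = M (Lam + I)^-1

rng = np.random.default_rng(4)
M = N = 50
r, sigma2 = 2, 0.01
A = rng.normal(size=(M, r)) @ rng.normal(size=(r, N)) \
    + np.sqrt(M * sigma2) * rng.normal(size=(M, N))
U, V = np.zeros((M, r)), rng.normal(size=(N, r))
T_sum = np.zeros((r, r))
for _ in range(3):
    U, S_sum = amp_sweep(A, U, V, T_sum, sigma2)      # U-step
    V, T_sum = amp_sweep(A.T, V, U, S_sum, sigma2)    # V-step
assert U.shape == (M, r) and V.shape == (N, r)
```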

SLIDE 27

Approach based on BP / AMP

f_β(b, Λ; p) := ⟨u⟩, where
⟨·⟩ := ∫ (·) e^{−β(u^T Λ u / 2 − b^T u)} p(u)^β du / ∫ e^{−β(u^T Λ u / 2 − b^T u)} p(u)^β du
→ posterior mean.
∂f_β(b, Λ; p)/∂b = β (⟨u u^T⟩ − ⟨u⟩⟨u⟩^T) → posterior covariance scaled by β.
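A concrete instance: with a standard Gaussian prior p(u) ∝ e^{−‖u‖²/2} (our choice, for illustration), the tilted density is Gaussian and f_β reduces to a closed form that does not depend on β:

```python
import numpy as np

def f_beta_gaussian(b, Lam):
    """f_beta(b, Lam; p) for p(u) ∝ exp(-||u||^2/2): the tilted density
    ∝ exp(-beta(u'Lam u/2 - b'u)) p(u)^beta is Gaussian with mean
    (Lam + I)^-1 b, for every beta > 0."""
    return np.linalg.solve(Lam + np.eye(Lam.shape[0]), b)

Lam = np.diag([2.0, 0.0])
b = np.array([3.0, 1.0])
assert np.allclose(f_beta_gaussian(b, Lam), [1.0, 1.0])
```

In this Gaussian case ∂f_β/∂b = (Λ + I)^{-1}, matching “β times the posterior covariance” since the covariance is (β(Λ + I))^{-1}.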

SLIDE 28

Analysis and extensions of compressed sensing and low-rank matrix reconstruction

1. Low-rank matrix reconstruction via message passing
2. Application to clustering
3. Application to multivariate Poisson clustering
4. Conclusions

SLIDE 29

Clustering


SLIDE 31

Clustering

k-means (Lloyd, 1982). Iterate the following until convergence:
1. Estimate class labels on the basis of distances from the reference vectors of the classes:
   c_j := arg min_{k∈{1,…,r}} ‖a_j − u_k‖₂²
2. Update the reference vectors of the classes:
   u_k := Σ_{j: c_j=k} a_j / Σ_{j: c_j=k} 1

Commonly used for clustering data. Can be interpreted as the high-SNR limit of the EM algorithm for a Gaussian mixture.
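The two-step iteration above in a minimal numpy sketch (the initialization by randomly chosen columns and the toy data are our choices):

```python
import numpy as np

def kmeans(A, r, n_iter=50, seed=0):
    """Lloyd's algorithm on the columns of A (each column a_j is a datum)."""
    rng = np.random.default_rng(seed)
    U = A[:, rng.choice(A.shape[1], r, replace=False)]      # init ref. vectors
    for _ in range(n_iter):
        d = ((A[:, None, :] - U[:, :, None]) ** 2).sum(axis=0)  # (r, N) distances
        c = d.argmin(axis=0)                                    # label step
        U = np.stack([A[:, c == k].mean(axis=1) for k in range(r)], axis=1)
    return c, U

A = np.array([[0.0, 0.1, 5.0, 5.1]])   # two well-separated 1-D clusters
c, U = kmeans(A, 2)
assert c[0] == c[1] and c[2] == c[3] and c[0] != c[2]
```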

SLIDE 32

Clustering as low-rank matrix reconstruction

A = U V^T + (noise)
Each column of A corresponds to a datum; each column of U corresponds to the reference vector of a class.
Impose the constraint “each column of V^T has exactly one 1” (non-separable).

SLIDE 34

AMP-MAP algorithm

AMP-MAP: the algorithm simplifies greatly in the MAP setting (β → ∞):
u_{ik}^t = Σ_{j: c_j^t=k} a_{ij} / Σ_{j: c_j^t=k} 1,
c_j^{t+1} = arg min_{k∈{1,…,r}} [ (1/(Mσ²)) ‖a_j − u_k^t‖² − (−1)^{δ_{c_j^t,k}} M / Σ_{j': c_{j'}^t=k} 1 ]

A simple expectation-maximization (EM) algorithm applied to a Gaussian mixture model with vanishing variance. Pros: guaranteed monotonic convergence of the marginal log-likelihood to a local optimum.
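One sweep of the update above can be sketched as follows. This is a sketch of our reconstruction of the garbled slide, so the sign convention of the (−1)^δ correction term should be checked against the paper; the function name and toy data are ours:

```python
import numpy as np

def amp_map_update(A, c, sigma2):
    """One AMP-MAP sweep: k-means-style means plus a size-dependent
    correction term indexed by the current label (as reconstructed here)."""
    M, N = A.shape
    r = int(c.max()) + 1
    n = np.array([(c == k).sum() for k in range(r)])                  # cluster sizes
    U = np.stack([A[:, c == k].mean(axis=1) for k in range(r)], axis=1)
    dist = ((A[:, None, :] - U[:, :, None]) ** 2).sum(axis=0) / (M * sigma2)
    sign = np.where(np.arange(r)[:, None] == c[None, :], -1.0, 1.0)   # (-1)^delta
    cost = dist - sign * (M / n)[:, None]                             # corrected cost
    return cost.argmin(axis=0), U

rng = np.random.default_rng(2)
A = np.hstack([rng.normal(0, 0.1, (3, 10)), rng.normal(5, 0.1, (3, 10))])
c0 = np.array([0] * 10 + [1] * 10)
c1, U = amp_map_update(A, c0, 1.0)
assert (c1 == c0).all()   # well-separated clusters keep their labels
```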

SLIDE 36

Clustering: numerical experiments

Dimension of data: M = 800. Number of data: N = 1600. Number of clusters: r ∈ {2, …, 18}. Each datum belongs to one of the r clusters with equal probability. Variance of observation noise: Mσ² = 80. Performance averaged over 500 trials.

SLIDE 37

Clustering: numerical experiments

[Plot: exactness of assignments (0.2–1) vs. number of clusters r (2–18); × proposed, + k-means]

SLIDE 39

Clustering: numerical experiments

[Plot: value of the objective function ‖A − ÛV̂^T‖_F² (0.97–1) vs. number of clusters r (2–18); × proposed, + k-means]

SLIDE 40

Clustering: numerical experiments

[Plot: error in matrix reconstruction ‖UV^T − ÛV̂^T‖_F² (0.5–3) vs. number of clusters r (2–18); × proposed, + k-means]

SLIDE 41

Analysis and extensions of compressed sensing and low-rank matrix reconstruction

1. Low-rank matrix reconstruction via message passing
2. Application to clustering
3. Application to multivariate Poisson clustering
4. Conclusions

SLIDE 42

Problem formulation

Additive Gaussian noise:
A0 ∈ R^{M×N}, rank A0 = r ≪ min{M, N} (low-rank).
Observe (part of) A = (A_ij), A_ij ~ N((A0)_ij, Mσ²) → estimate the low-rank matrix A0.

Poisson noise:
A0 ∈ R_+^{M×N}, rank A0 = r ≪ min{M, N} (low-rank).
Observe (part of) A = (A_ij), A_ij ~ Poisson(λ0 + (A0)_ij) → estimate the low-rank matrix A0.
A_ij ∈ ℕ; λ0 ≥ 0 is a background intensity.
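Synthetic observations from the Poisson model are easy to generate (the dimensions and intensities are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
M, N, r, lam0 = 30, 40, 2, 0.5
# Non-negative rank-r ground truth, entrywise Poisson observations.
A0 = rng.uniform(0, 2, size=(M, r)) @ rng.uniform(0, 2, size=(r, N))
A = rng.poisson(lam0 + A0)

assert A.dtype.kind == 'i' and (A >= 0).all()   # integer counts
assert np.linalg.matrix_rank(A0) == r           # low-rank signal
```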

SLIDE 43

[Factor graph as on slide 16]

μ_{i→(i,j)}(u_i) ∝ p(u_i) ∫ e^{Σ_{l≠j} [−(λ0 + u_i^T v_l) + A_il log(λ0 + u_i^T v_l)]} Π_{l≠j} μ_{l→(i,l)}(v_l) dv_l
λ_{(i,j)→i}(u_i) ∝ ∫ [ e^{−(λ0 + u_i^T v_j)} (λ0 + u_i^T v_j)^{A_ij} / A_ij! ] μ_{j→(i,j)}(v_j) dv_j

SLIDE 44

μ_{i→(i,j)}(u_i) ∝ p(u_i) ∫ e^{Σ_{l≠j} [−(λ0 + u_i^T v_l) + A_il log(λ0 + u_i^T v_l)]} Π_{l≠j} μ_{l→(i,l)}(v_l) dv_l
λ_{(i,j)→i}(u_i) ∝ ∫ [ e^{−(λ0 + u_i^T v_j)} (λ0 + u_i^T v_j)^{A_ij} / A_ij! ] μ_{j→(i,j)}(v_j) dv_j

AMP: the application of the CLT requires the assumption that u_i^T v_j is small, to allow a Taylor expansion of the logarithm. Feasible in the MAP setting (β → ∞).
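The small-u_i^T v_j assumption amounts to a second-order Taylor expansion of the logarithm around λ0, which is easy to sanity-check numerically (the numbers are illustrative):

```python
import numpy as np

# log(lam0 + x) ≈ log(lam0) + x/lam0 - x^2/(2*lam0^2) for |x| << lam0;
# the neglected remainder is O(x^3 / lam0^3).
lam0, x = 2.0, 0.1
approx = np.log(lam0) + x / lam0 - x**2 / (2 * lam0**2)
assert abs(np.log(lam0 + x) - approx) < 1e-4
```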

SLIDE 45

AMP for Poisson data

AMP-MAP: the algorithm simplifies greatly in the MAP setting (β → ∞). (We let p(u_i) ∝ e^{−μ‖u_i‖₁}.)
u_{ik}^t = [ Σ_{j: c_j^t=k} a_{ij} / (Σ_{j: c_j^t=k} 1 + μ) − λ0 ]_+,
σ_{ik}^t = Σ_{j: c_j^t=k} a_{ij} / (Σ_{j: c_j^t=k} 1 + μ)²  (u_{ik}^t > 0),  0 (u_{ik}^t = 0),
c_j^{t+1} = arg max_{k∈{1,…,r}} [ Σ_i (a_{ij} log(λ0 + u_{ik}^t) − u_{ik}^t) + (−1)^{δ_{c_j^t,k}} (1/(2λ0²)) Σ_{i: u_{ik}^t>0} (a_{ij} − λ0)² σ_{ik}^t ]

A simple EM algorithm applied to a multivariate Poisson mixture.

SLIDE 47

Analysis and extensions of compressed sensing and low-rank matrix reconstruction

1. Low-rank matrix reconstruction via message passing
2. Application to clustering
3. Application to multivariate Poisson clustering
4. Conclusions

SLIDE 48

Conclusions

Low-rank matrix reconstruction via message passing:
Problem formulation with a probability model
Derivation of a BP/AMP-based iterative algorithm
Application to clustering
Extension to Poisson noise