A Quantum-Statistical-Mechanical Extension of Gaussian Mixture Model

Kazuyuki Tanaka, Graduate School of Information Sciences, Tohoku University


  1. A Quantum-Statistical-Mechanical Extension of Gaussian Mixture Model
Kazuyuki Tanaka
Graduate School of Information Sciences, Tohoku University, Sendai, Japan
http://www.smapip.is.tohoku.ac.jp/~kazu/
In collaboration with Koji Tsuda, Max Planck Institute for Biological Cybernetics, Germany
September 2007, IW-SMI2007, Kyoto

  2. Contents
1. Introduction
2. Conventional Gaussian Mixture Model
3. Quantum Mechanical Extension of Gaussian Mixture Model
4. Quantum Belief Propagation
5. Concluding Remarks

  3. Information Processing by using Quantum Statistical Mechanics
Quantum Annealing in Optimizations
Quantum Error Correcting Codes
etc.
Massive Information Processing by means of Density Matrix

  4. Motivations
How can we construct the quantum Gaussian mixture model?
How can we construct a data-classification algorithm by using the quantum Gaussian mixture model?

  5. Contents
1. Introduction
2. Conventional Gaussian Mixture Model
3. Quantum Mechanical Extension of Gaussian Mixture Model
4. Quantum Belief Propagation
5. Concluding Remarks

  6. Prior of Gauss Mixture Model
One of three labels 1, 2 and 3 is assigned to each node. Label x_i is generated randomly and independently at each node:

P(\vec{X} = \vec{x}) = \prod_{i=1}^{N} P(X_i = x_i), \quad P(X_i = k) = \alpha_k \quad (k = 1, 2, 3)

[Histogram: the three labels occur with frequencies \alpha_1, \alpha_2, \alpha_3.]

  7. Data Generating Process
Data y_i are generated randomly and independently at each node:

P(Y_i = y_i \mid X_i = k) = \frac{1}{\sqrt{2\pi\sigma_k^2}} \exp\!\left( -\frac{(y_i - \mu_k)^2}{2\sigma_k^2} \right)

Example: (\mu_1, \sigma_1) = (60, 10), \ (\mu_2, \sigma_2) = (150, 30), \ (\mu_3, \sigma_3) = (200, 20).
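As an illustrative sketch (not from the slides), the two-step generative process — draw a label x_i with probabilities \alpha, then draw y_i from the corresponding Gaussian — can be sampled as follows. The uniform \alpha = (1/3, 1/3, 1/3) is an assumption; the slides leave \alpha free.

```python
import numpy as np

rng = np.random.default_rng(0)

# Slide parameters for the three Gaussian components
alpha = np.array([1/3, 1/3, 1/3])     # assumed uniform prior (not given on the slide)
mu    = np.array([60.0, 150.0, 200.0])
sigma = np.array([10.0, 30.0, 20.0])

N = 3000
labels = rng.choice(3, size=N, p=alpha)       # x_i ~ P(X_i = k) = alpha_k
y = rng.normal(mu[labels], sigma[labels])     # y_i ~ N(mu_{x_i}, sigma_{x_i}^2)

# Empirical per-class means should be close to mu
for k in range(3):
    print(k, y[labels == k].mean())
```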

  8. Gauss Mixture Models
Prior probability: P(X_i = x_i) = \alpha_{x_i}
Data generating process: P(Y_i = y_i \mid X_i = k) = g_k(y_i) = \frac{1}{\sqrt{2\pi\sigma_k^2}} \exp\!\left( -\frac{(y_i - \mu_k)^2}{2\sigma_k^2} \right)

Marginal likelihood for the hyperparameters \vec{\mu}, \vec{\sigma} and \vec{\alpha}:

P(\vec{Y} = \vec{y} \mid \vec{\mu}, \vec{\sigma}, \vec{\alpha})
= \prod_{i=1}^{N} \sum_{k=1}^{3} P(Y_i = y_i \mid X_i = k)\, P(X_i = k)
= \prod_{i=1}^{N} \sum_{k=1}^{3} \alpha_k\, g_k(y_i)

(\hat{\vec{\mu}}, \hat{\vec{\sigma}}, \hat{\vec{\alpha}}) = \arg\max_{(\vec{\mu}, \vec{\sigma}, \vec{\alpha})} P(\vec{y} \mid \vec{\mu}, \vec{\sigma}, \vec{\alpha})
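The per-datum mixture density \sum_k \alpha_k g_k(y) can be evaluated directly. A minimal sketch, using the slide's \mu and \sigma and an assumed uniform \alpha, checking that the mixture is a normalized density:

```python
import numpy as np

mu    = np.array([60.0, 150.0, 200.0])
sigma = np.array([10.0, 30.0, 20.0])
alpha = np.array([1/3, 1/3, 1/3])   # assumed; the slides leave alpha free

def g(k, y):
    """Gaussian component density g_k(y)."""
    return np.exp(-(y - mu[k])**2 / (2 * sigma[k]**2)) / np.sqrt(2 * np.pi * sigma[k]**2)

def mixture(y):
    """Mixture density sum_k alpha_k g_k(y)."""
    return sum(alpha[k] * g(k, y) for k in range(3))

# The mixture integrates to 1 (Riemann sum on a fine grid covering the support)
ys = np.linspace(-100.0, 400.0, 100001)
dy = ys[1] - ys[0]
total = mixture(ys).sum() * dy
print(total)   # ~ 1.0
```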

  9. Conventional Gauss Mixture Models
Data \vec{y} = (y_1, y_2, \ldots, y_N)^T; labels \vec{x} = (x_1, x_2, \ldots, x_N)^T (parameters).
The maximization (\hat{\vec{\mu}}, \hat{\vec{\sigma}}, \hat{\vec{\alpha}}) = \arg\max_{(\vec{\mu}, \vec{\sigma}, \vec{\alpha})} P(\vec{y} \mid \vec{\mu}, \vec{\sigma}, \vec{\alpha}) is solved by iterating, with responsibilities

\rho_k(y_i) = \frac{\alpha_k\, g_k(y_i)}{\sum_{l=1}^{3} \alpha_l\, g_l(y_i)},

the updates

\mu_k \leftarrow \frac{\sum_{i=1}^{N} y_i\, \rho_k(y_i)}{\sum_{i=1}^{N} \rho_k(y_i)}, \quad
\sigma_k^2 \leftarrow \frac{\sum_{i=1}^{N} (y_i - \mu_k)^2\, \rho_k(y_i)}{\sum_{i=1}^{N} \rho_k(y_i)}, \quad
\alpha_k \leftarrow \frac{1}{N} \sum_{i=1}^{N} \rho_k(y_i)
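The update rules above are the standard EM iteration for a 1-D Gaussian mixture. A minimal sketch, where the synthetic data, number of components, and initialization heuristic are assumptions not given on the slide:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D data from two well-separated Gaussians (illustration only)
y = np.concatenate([rng.normal(60.0, 10.0, 500), rng.normal(200.0, 20.0, 500)])
N, K = len(y), 2

# Simple initialization heuristic (not specified on the slide)
mu    = np.quantile(y, [0.25, 0.75])
sigma = np.full(K, y.std())
alpha = np.full(K, 1.0 / K)

for _ in range(100):
    # E-step: responsibilities rho_k(y_i) = alpha_k g_k(y_i) / sum_l alpha_l g_l(y_i)
    g = np.exp(-(y[:, None] - mu)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    rho = alpha * g
    rho /= rho.sum(axis=1, keepdims=True)

    # M-step: the slide's update equations
    Nk = rho.sum(axis=0)
    mu = (rho * y[:, None]).sum(axis=0) / Nk
    sigma = np.sqrt((rho * (y[:, None] - mu)**2).sum(axis=0) / Nk)
    alpha = Nk / N

print(np.sort(mu), np.sort(sigma), alpha)
```

The recovered means should approach the generating values 60 and 200.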

  10. Contents
1. Introduction
2. Conventional Gaussian Mixture Model
3. Quantum Mechanical Extension of Gaussian Mixture Model
4. Quantum Belief Propagation
5. Concluding Remarks

  11. Quantum Gauss Mixture Models

P(\vec{Y} = \vec{y} \mid \vec{\mu}, \vec{\sigma}, \vec{\alpha}) = \prod_{i=1}^{N} \sum_{k=1}^{3} \alpha_k\, g_k(y_i),
\quad g_k(y_i) = \frac{1}{\sqrt{2\pi\sigma_k^2}} \exp\!\left( -\frac{(y_i - \mu_k)^2}{2\sigma_k^2} \right)

Each factor can be rewritten as the trace of a matrix exponential:

\sum_{k=1}^{3} \alpha_k\, g_k(y_i)
= \mathrm{Tr} \begin{pmatrix} \alpha_1 g_1(y_i) & 0 & 0 \\ 0 & \alpha_2 g_2(y_i) & 0 \\ 0 & 0 & \alpha_3 g_3(y_i) \end{pmatrix}
= \mathrm{Tr} \exp \begin{pmatrix} \ln \alpha_1 g_1(y_i) & 0 & 0 \\ 0 & \ln \alpha_2 g_2(y_i) & 0 \\ 0 & 0 & \ln \alpha_3 g_3(y_i) \end{pmatrix}
= \mathrm{Tr} \exp \left[ \begin{pmatrix} \ln \alpha_1 & 0 & 0 \\ 0 & \ln \alpha_2 & 0 \\ 0 & 0 & \ln \alpha_3 \end{pmatrix} + \begin{pmatrix} \ln g_1(y_i) & 0 & 0 \\ 0 & \ln g_2(y_i) & 0 \\ 0 & 0 & \ln g_3(y_i) \end{pmatrix} \right]
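The identity \sum_k \alpha_k g_k(y) = Tr exp(diag(ln \alpha_k g_k(y))) is elementary for diagonal matrices. A quick numerical check (the parameter values here are illustrative assumptions):

```python
import numpy as np
from scipy.linalg import expm

mu    = np.array([60.0, 150.0, 200.0])
sigma = np.array([10.0, 30.0, 20.0])
alpha = np.array([0.2, 0.5, 0.3])   # illustrative mixing weights

def g(y):
    """Vector of component densities (g_1(y), g_2(y), g_3(y))."""
    return np.exp(-(y - mu)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

y = 120.0
direct = np.sum(alpha * g(y))                         # sum_k alpha_k g_k(y)
A = np.diag(np.log(alpha)) + np.diag(np.log(g(y)))    # diag(ln alpha_k) + diag(ln g_k(y))
via_trace = np.trace(expm(A))                         # Tr exp(...)

print(direct, via_trace)   # these agree
```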

  12. Quantum Gauss Mixture Models
Quantum representation:

P(\vec{y} \mid \vec{\mu}, \vec{\sigma}, \vec{\alpha}) = \prod_{i=1}^{N} \frac{\mathrm{Tr}\, \exp(-F - G(y_i))}{\mathrm{Tr}\, \exp(-F)}

The classical, diagonal F is extended by off-diagonal elements \gamma:

F = -\begin{pmatrix} \ln \alpha_1 & 0 & 0 \\ 0 & \ln \alpha_2 & 0 \\ 0 & 0 & \ln \alpha_3 \end{pmatrix}
\;\longrightarrow\;
F = -\begin{pmatrix} \ln \alpha_1 & \gamma & \gamma \\ \gamma & \ln \alpha_2 & \gamma \\ \gamma & \gamma & \ln \alpha_3 \end{pmatrix},
\quad
G(y_i) = -\begin{pmatrix} \ln g_1(y_i) & 0 & 0 \\ 0 & \ln g_2(y_i) & 0 \\ 0 & 0 & \ln g_3(y_i) \end{pmatrix}

  13. Quantum Gauss Mixture Models

P(\vec{y} \mid \vec{\mu}, \vec{\sigma}, \vec{\alpha}) = \prod_{i=1}^{N} \frac{\mathrm{Tr}\, \exp(-H(y_i))}{\mathrm{Tr}\, \exp(-F)}

H(y_i) = -\begin{pmatrix} \ln \alpha_1 g_1(y_i) & \gamma & \gamma \\ \gamma & \ln \alpha_2 g_2(y_i) & \gamma \\ \gamma & \gamma & \ln \alpha_3 g_3(y_i) \end{pmatrix}
= -\sum_{k=1}^{3} \sum_{l=1}^{3} B_{kl}(i)\, X_{kl},
\quad
B_{kl}(i) = \begin{cases} \ln \alpha_k g_k(y_i) & (k = l) \\ \gamma & (k \neq l) \end{cases}

where X_{kl} is the 3×3 matrix unit whose (k,l) element is 1 and all other elements are 0:

X_{11} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}, \quad
X_{12} = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}, \quad \ldots, \quad
X_{33} = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}
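A sketch of the resulting density matrix \rho(y) = e^{-H(y)} / Tr e^{-H(y)}, built from H(y) as defined above (parameter values are illustrative). At \gamma = 0 the diagonal of \rho(y) reduces to the classical responsibilities \alpha_k g_k(y) / \sum_l \alpha_l g_l(y):

```python
import numpy as np
from scipy.linalg import expm

mu    = np.array([60.0, 150.0, 200.0])
sigma = np.array([10.0, 30.0, 20.0])
alpha = np.array([0.2, 0.5, 0.3])   # illustrative mixing weights

def g(y):
    return np.exp(-(y - mu)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

def H(y, gamma):
    """H(y) = -B, with B_kk = ln(alpha_k g_k(y)) and B_kl = gamma for k != l."""
    B = np.full((3, 3), gamma)
    np.fill_diagonal(B, np.log(alpha * g(y)))
    return -B

def rho(y, gamma):
    """Density matrix rho(y) = exp(-H(y)) / Tr exp(-H(y))."""
    E = expm(-H(y, gamma))
    return E / np.trace(E)

y = 120.0
r0 = rho(y, gamma=0.0)
classical = alpha * g(y) / np.sum(alpha * g(y))   # classical responsibilities
print(np.diag(r0), classical)                     # agree at gamma = 0
```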

  14. Maximum Likelihood Estimation in Quantum Gauss Mixture Model

(\hat{\vec{\mu}}, \hat{\vec{\sigma}}, \hat{\vec{\alpha}}) = \arg\max_{(\vec{\mu}, \vec{\sigma}, \vec{\alpha})} P(\vec{y} \mid \vec{\mu}, \vec{\sigma}, \vec{\alpha}),
\quad
P(\vec{y} \mid \vec{\mu}, \vec{\sigma}, \vec{\alpha}) = \prod_{i=1}^{N} \frac{\mathrm{Tr}\, \exp(-H(y_i))}{\mathrm{Tr}\, \exp(-F)}

Linear response formula:

\frac{\partial}{\partial B_{kl}} \ln\!\left( \mathrm{Tr}\, e^{-H(y_i)} \right)
= \frac{\mathrm{Tr} \int_0^1 e^{-(1-\lambda) H(y_i)}\, X_{kl}\, e^{-\lambda H(y_i)}\, d\lambda}{\mathrm{Tr}\, e^{-H(y_i)}}
= \frac{\mathrm{Tr}\, X_{kl}\, e^{-H(y_i)}}{\mathrm{Tr}\, e^{-H(y_i)}}

Extremum conditions for \vec{\mu}, \vec{\sigma} and \vec{\alpha}.
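The linear response formula can be verified numerically: a central finite difference of ln Tr e^{-H} in B_{kl} should reproduce Tr[X_{kl} e^{-H}] / Tr e^{-H}. The matrix B below is an illustrative stand-in for B(i):

```python
import numpy as np
from scipy.linalg import expm

# Illustrative B (diagonal plays the role of ln(alpha_k g_k(y)), off-diagonal of gamma)
B = np.array([[-2.0, 0.3, 0.3],
              [ 0.3, -1.0, 0.3],
              [ 0.3,  0.3, -3.0]])
H = -B   # H = -sum_{kl} B_kl X_kl

def log_trace_exp(Bmat):
    """ln Tr e^{-H} expressed through B, since e^{-H} = e^{B}."""
    return np.log(np.trace(expm(Bmat)))

k, l = 0, 1
Xkl = np.zeros((3, 3))
Xkl[k, l] = 1.0                     # matrix unit X_kl

# Right-hand side: Tr[X_kl e^{-H}] / Tr e^{-H}
E = expm(-H)
rhs = np.trace(Xkl @ E) / np.trace(E)

# Left-hand side: central finite difference in B_kl
eps = 1e-6
Bp, Bm = B.copy(), B.copy()
Bp[k, l] += eps
Bm[k, l] -= eps
lhs = (log_trace_exp(Bp) - log_trace_exp(Bm)) / (2 * eps)

print(lhs, rhs)   # these agree
```

The second equality in the formula follows from the cyclic property of the trace applied under the \lambda integral.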

  15. Quantum Gauss Mixture Models
Data \vec{y} = (y_1, y_2, \ldots, y_N)^T; parameters \vec{x} = (x_1, x_2, \ldots, x_N)^T.
With the density matrix

\rho(y_i) = \frac{e^{-H(y_i)}}{\mathrm{Tr}\, e^{-H(y_i)}},

the maximization (\hat{\vec{\mu}}, \hat{\vec{\sigma}}, \hat{\vec{\alpha}}) = \arg\max P(\vec{y} \mid \vec{\mu}, \vec{\sigma}, \vec{\alpha}) is solved by iterating

\mu_k \leftarrow \frac{\sum_{i=1}^{N} y_i\, \mathrm{Tr}[X_{kk}\, \rho(y_i)]}{\sum_{i=1}^{N} \mathrm{Tr}[X_{kk}\, \rho(y_i)]}, \quad
\sigma_k^2 \leftarrow \frac{\sum_{i=1}^{N} (y_i - \mu_k)^2\, \mathrm{Tr}[X_{kk}\, \rho(y_i)]}{\sum_{i=1}^{N} \mathrm{Tr}[X_{kk}\, \rho(y_i)]}, \quad
\alpha_k \leftarrow \exp\!\left( \frac{1}{N} \sum_{i=1}^{N} \mathrm{Tr}[X_{kk} \ln \rho(y_i)] \right)
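A hedged sketch of one sweep of these updates: the quantities Tr[X_kk ρ(y_i)] — the diagonal elements of the density matrix — play the role of the classical responsibilities. The data, current parameter estimates, and γ below are assumptions, and the α update is simplified to a renormalization of those weights rather than the slide's exponential-trace form:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# Illustrative data and current parameter estimates (assumptions)
y = np.concatenate([rng.normal(60.0, 10.0, 200), rng.normal(200.0, 20.0, 200)])
mu    = np.array([80.0, 180.0, 150.0])
sigma = np.array([20.0, 20.0, 20.0])
alpha = np.array([1/3, 1/3, 1/3])
gamma = 0.2

def quantum_resp(yi):
    """Quantum responsibilities Tr[X_kk rho(y_i)]: the diagonal of rho(y_i)."""
    g = np.exp(-(yi - mu)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    B = np.full((3, 3), gamma)
    np.fill_diagonal(B, np.log(alpha * g))
    E = expm(B)                      # e^{-H(y_i)} with H = -B
    return np.diag(E) / np.trace(E)

R = np.array([quantum_resp(yi) for yi in y])   # N x 3; each row sums to 1

# One sweep of the mu and sigma updates with quantum weights; the alpha
# update here just renormalizes the weights (a simplification -- the slide's
# own alpha update involves exp((1/N) sum_i Tr[X_kk ln rho(y_i)]).
Nk = R.sum(axis=0)
mu = (R * y[:, None]).sum(axis=0) / Nk
sigma = np.sqrt((R * (y[:, None] - mu)**2).sum(axis=0) / Nk)
alpha = Nk / len(y)

print(mu, sigma, alpha)
```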

  16. Image Segmentation

P(y_i \mid \vec{\mu}, \vec{\sigma}, \vec{\alpha}) = \frac{\mathrm{Tr}\, \exp(-F - G(y_i))}{\mathrm{Tr}\, \exp(-F)}

[Figure: segmentation of the original image by the conventional Gauss mixture model and by the quantum Gauss mixture model (\gamma = 0.2 and \gamma = 0.4), with the corresponding gray-level histograms over 0–255.]
