
Noisy In-Memory Recursive Computation with Memristor Crossbars - PowerPoint PPT Presentation



  1. Noisy In-Memory Recursive Computation with Memristor Crossbars
  Elsa Dupraz† (elsa.dupraz@imt-atlantique.fr), IMT Atlantique, Lab-STICC, UBL
  Lav Varshney‡ (varshney@illinois.edu), University of Illinois at Urbana-Champaign
  Funded by the Thomas Jefferson Fund and by ANR project EF-FECtive

  2. Section 1: Introduction - Computation in memory
  ◮ Data transfer bottleneck in conventional setups: memory banks and processing units are separate, and data must be moved across a bandwidth bottleneck.
  ◮ In-memory computing: processing is performed inside the memory banks, turning them into computational memory.
  ◮ Memristors [SSSW08]: the device technology considered here for computational memory.

  3. Section 1: Introduction - Dot-product computation from memristor crossbars [LWFV18]
  Memristor crossbar notation:
  ◮ u_i: input voltages
  ◮ x_j: output voltages
  ◮ g_ij: conductance values
  The crossbar computes the normalized dot products
  $$x_j = \sum_{i=1}^{N} \frac{g_{ij}}{\sum_{k=0}^{N} g_{kj}} \, u_i$$
  Issue: uncertainty on the conductance values.
  In this work: noisy computation from memristor crossbars.
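As a reading aid, here is a minimal Python/NumPy sketch of this noiseless normalized dot product. Treating index 0 as an extra row of conductances that only enters the normalizing sum is an assumption about the crossbar layout, and all numerical values are arbitrary.

```python
import numpy as np

def crossbar_dot_product(G, u):
    """Noiseless crossbar output: x_j = sum_{i=1..N} g_ij u_i / sum_{k=0..N} g_kj.

    G : (N+1, M) array of target conductances; row 0 is the extra conductance
        g_0j that only enters the normalizing sum (assumed layout).
    u : (N,) array of input voltages u_1, ..., u_N.
    """
    denom = G.sum(axis=0)          # sum_{k=0}^{N} g_kj, one value per column j
    return (G[1:].T @ u) / denom   # x_j for every output column j

# Tiny usage example with arbitrary values.
rng = np.random.default_rng(0)
G = rng.uniform(0.5, 1.5, size=(5, 3))   # N = 4 inputs, M = 3 outputs, plus row 0
u = rng.uniform(0.0, 1.0, size=4)
print(crossbar_dot_product(G, u))
```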

  4. Section 1: Introduction - Existing works
  Existing works:
  ◮ Logic-in-memory [GTDS17, AHC+19]
  ◮ Dot-product computation from memristor crossbars [NKSB14]
  ◮ Memristor crossbars for machine learning [LWFV18, JAC+19]
  ◮ Hamming distance computation [CC15]
  ◮ Noisy Hamming distance computation [CSD18]
  In this work:
  ◮ Noisy dot-product computation in memory: probability distribution of the final computation error
  ◮ Noisy iterative dot-product computation in memory: recursive expressions for the means and variances of the successive outputs

  5. Section 1: Introduction - Table of contents
  1. Introduction
  2. Dot-product computation
  3. Iterative dot-product computation
  4. Simulation results
  5. Conclusion

  6. Section 2: Dot-product computation - Table of contents (section divider; same outline as slide 5)

  7. Section 2: Dot-product computation - Computation model
  Noisy dot-product computation:
  $$X_j = \sum_{i=1}^{N} \frac{G_{ij}}{\sum_{k=0}^{N} G_{kj}} \, U_i$$
  ◮ The G_ij and U_i are independent random variables
  ◮ They are characterized by their first- and second-order moments
  ◮ E[G_ij] = g_ij (target value)
  ◮ E[U_i] = u_i (target value)
  Objective: determine the probability distributions of the X_j.
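A small Monte Carlo sketch of this computation model follows. The slides only fix the means E[G_ij] = g_ij, E[U_i] = u_i and the independence of the variables, so the Gaussian perturbations and noise levels below are illustrative assumptions.

```python
import numpy as np

def noisy_crossbar_samples(g, u, sigma_g, sigma_u, n_samples, rng):
    """Samples of X_j = sum_i G_ij U_i / sum_k G_kj with independent
    perturbations around the targets g (shape (N+1, M)) and u (shape (N,))."""
    samples = np.empty((n_samples, g.shape[1]))
    for s in range(n_samples):
        G = g + sigma_g * rng.standard_normal(g.shape)   # noisy conductances
        U = u + sigma_u * rng.standard_normal(u.shape)   # noisy input voltages
        samples[s] = (G[1:].T @ U) / G.sum(axis=0)
    return samples

rng = np.random.default_rng(1)
N, M = 100, 4
g = rng.uniform(0.5, 1.5, size=(N + 1, M))
u = rng.uniform(0.0, 1.0, size=N)
X = noisy_crossbar_samples(g, u, sigma_g=0.05, sigma_u=0.01, n_samples=2000, rng=rng)
print("empirical means :", X.mean(axis=0))
print("target outputs x:", (g[1:].T @ u) / g.sum(axis=0))
```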

  8. Section 2: Dot-product computation - Main result
  Objective: determine the probability distribution of
  $$X_j = \sum_{i=1}^{N} \frac{G_{ij}}{\sum_{k=0}^{N} G_{kj}} \, U_i$$
  Theorem: If $\alpha_j = \lim_{N \to \infty} \frac{\left( \sum_{i=0}^{N} g_{ij} \right)^2}{N^2} \neq 0$, then
  $$\frac{N^2}{\sqrt{v_j}} \left( X_j - x_j \right) \xrightarrow{d} \mathcal{N}\!\left( 0, \frac{1}{\alpha_j^2} \right)$$
  ◮ $x_j = \sum_{i=1}^{N} \frac{g_{ij}}{\sum_{k=0}^{N} g_{kj}} u_i$ is the true output value
  ◮ $\xrightarrow{d}$ denotes convergence in distribution

  9. Section 2: Dot-product computation - Conclusions of the Theorem
  ◮ The Theorem is valid for a large range of distributions
  ◮ The distribution of X_j can be approximated by a Gaussian:
  $$X_j \sim \mathcal{AN}\!\left( x_j, \frac{v_j}{\alpha_j^2 N^4} \right)$$
  ◮ The mean-squared error can be approximated as
  $$E\!\left[ (X_j - x_j)^2 \right] \approx \frac{v_j}{\alpha_j^2 N^4}$$
  ◮ Conclusion: if v_j and α_j tend to constants, then $E[(X_j - x_j)^2] \to 0$ as N grows
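As a quick numerical companion to this conclusion, the sketch below measures how the empirical mean-squared error of a single output shrinks as N grows, under the same illustrative Gaussian conductance-noise assumption; the constants v_j and α_j themselves are defined in the paper and are not computed here.

```python
import numpy as np

def empirical_mse(N, sigma_g=0.05, n_samples=2000, seed=0):
    """Empirical E[(X_j - x_j)^2] for one output column of size N, with
    independent Gaussian conductance noise (illustrative assumption)."""
    rng = np.random.default_rng(seed)
    g = rng.uniform(0.5, 1.5, size=N + 1)     # target conductances g_0, ..., g_N
    u = rng.uniform(0.0, 1.0, size=N)         # inputs kept noiseless here
    x_true = (g[1:] @ u) / g.sum()
    G = g + sigma_g * rng.standard_normal((n_samples, N + 1))
    X = (G[:, 1:] @ u) / G.sum(axis=1)
    return np.mean((X - x_true) ** 2)

for N in (250, 500, 1000, 2000):
    print(N, empirical_mse(N))   # the error shrinks as N grows
```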

  10. Section 3: Iterative dot-product computation - Table of contents (section divider; same outline as slide 5)

  11. Section 3: Iterative dot-product computation - Computation model
  Iterative dot-product computation:
  $$X^{(T)} = G^{(T)} G^{(T-1)} \cdots G^{(1)} X^{(0)}$$
  Recursion:
  ◮ $X^{(t)} = G^{(t)} X^{(t-1)}$
  Objective: recursive expressions for the first- and second-order statistics of the X^(t).
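A minimal sketch of the iterative computation, assuming each G^(t) is a column-normalized noisy crossbar as in the single-step model; reusing the same target crossbar at every step and the Gaussian conductance noise are illustrative assumptions.

```python
import numpy as np

def noisy_crossbar_matrix(g, sigma_g, rng):
    """One realization of a column-normalized crossbar: entry (j, i) equals
    G_ij / sum_k G_kj, with Gaussian noise around the targets g of shape
    (N+1, N); row 0 only feeds the normalization (illustrative model)."""
    G = g + sigma_g * rng.standard_normal(g.shape)
    return (G[1:] / G.sum(axis=0)).T

rng = np.random.default_rng(2)
N, T, sigma_g = 64, 8, 0.05
g = rng.uniform(0.5, 1.5, size=(N + 1, N))   # target crossbar reused at every step
x = rng.uniform(0.0, 1.0, size=N)            # initial vector X^(0)
for t in range(T):                           # X^(t) = G^(t) X^(t-1)
    x = noisy_crossbar_matrix(g, sigma_g, rng) @ x
print(x[:5])                                 # first entries of X^(T)
```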

  12. Section 3: Iterative dot-product computation - Main results
  Objective: first-order moments of
  $$X_j^{(t)} = \sum_{i=1}^{N} \frac{G_{ij}}{\sum_{k=0}^{N} G_{kj}} \, X_i^{(t-1)}$$
  Proposition 1 (Mean): The second-order Taylor expansion of the mean $\mu_j^{(t)}$ of $X_j^{(t)}$ is given by
  $$\mu_j^{(t)} = \sum_{i=1}^{N} \frac{g_{ij}^{(t)}}{\delta_j^{(t)}} \, \mu_i^{(t-1)} - \frac{\Theta_j}{(\delta_j^{(t)})^2} + \frac{\Gamma_j \Lambda_j}{(\delta_j^{(t)})^3} + O\!\left( \frac{1}{(\delta_j^{(t)})^3} \right)$$
  where $\delta_j^{(t)} = \sum_{k=0}^{N} g_{kj}$.
  Remark: $\lim_{N \to \infty} \mu_j^{(t)} = x_j^{(t)}$.

  13. Section 3: Iterative dot-product computation - Main results
  Objective: second-order moments of
  $$X_j^{(t)} = \sum_{i=1}^{N} \frac{G_{ij}}{\sum_{k=0}^{N} G_{kj}} \, X_i^{(t-1)}$$
  Proposition 2 (Variance): The second-order Taylor expansion of the variance $\gamma_j^{(t)}$ of $X_j^{(t)}$ is given by
  $$\gamma_j^{(t)} = \left( \frac{\Lambda_j}{\delta_j^{(t)}} \right)^2 + \frac{\Theta_j \Psi_j}{(\delta_j^{(t)})^2} - \frac{2 \Lambda_j \Theta_j}{(\delta_j^{(t)})^3} + \frac{3 \Theta_j^2 \Gamma_j}{(\delta_j^{(t)})^4} - \left( \mu_j^{(t)} \right)^2 + O\!\left( \frac{1}{(\delta_j^{(t)})^3} \right)$$
  Proposition 3 (Covariance): The second-order Taylor expansion of the covariance $\gamma_{jj'}^{(t)}$ of $X_j^{(t)}, X_{j'}^{(t)}$, with $j \neq j'$, is given by
  $$\gamma_{jj'}^{(t)} = \sum_{i=1}^{N} \sum_{i'=1}^{N} \lambda_{ij} \lambda_{i'j'} \, \gamma_{i,i'}^{(t-1)} + O\!\left( \frac{1}{(\delta_j^{(t)})^3} \right)$$
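The coefficients Θ_j, Γ_j, Λ_j, Ψ_j and λ_ij above are defined in the paper and are not reproduced here. As an empirical companion, the sketch below simply tracks the per-iteration empirical mean and variance of X^(t) over repeated noisy runs of the same illustrative model; these are the quantities the propositions characterize analytically.

```python
import numpy as np

def track_moments(g, x0, sigma_g, T, n_runs, seed=3):
    """Empirical mean and variance of X^(t), t = 0..T, over n_runs independent
    noisy iterative computations X^(t) = G^(t) X^(t-1) (illustrative model)."""
    rng = np.random.default_rng(seed)
    trajs = np.empty((n_runs, T + 1, x0.size))
    for r in range(n_runs):
        x = x0.copy()
        trajs[r, 0] = x
        for t in range(1, T + 1):
            G = g + sigma_g * rng.standard_normal(g.shape)
            x = (G[1:] / G.sum(axis=0)).T @ x     # column-normalized noisy crossbar
            trajs[r, t] = x
    return trajs.mean(axis=0), trajs.var(axis=0)

rng = np.random.default_rng(4)
N = 64
g = rng.uniform(0.5, 1.5, size=(N + 1, N))
x0 = rng.uniform(0.0, 1.0, size=N)
mean_t, var_t = track_moments(g, x0, sigma_g=0.05, T=8, n_runs=500)
print(var_t.mean(axis=1))   # average output variance at each iteration t
```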

  14. Section 4: Simulation results - Table of contents (section divider; same outline as slide 5)

  15. Section 4: Simulation results - Synthetic data
  Parameters:
  ◮ Uniform random variables G_ij, U_i
  ◮ Dot-product computation: N = 1000, K = 10000 samples of X_j
  ◮ Iterative computation: T = 8, N ∈ {256, 512, 1024}
  [Figure: histogram of X_j for the dot-product computation, compared with the Gaussian approximation and the approximation of [18]; x-axis: x_j (roughly 4.8 to 5.2), y-axis: density.]
  [Figure: variance versus iteration number for the iterative computation; empirical values, Gaussian approximation, and Taylor expansion for N = 256, 512, 1024.]
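The setup for the dot-product histogram can be reproduced along the following lines. The uniform supports for the conductance and input noise are assumptions; the slide only states that G_ij and U_i are uniform, with N = 1000 and K = 10000 samples.

```python
import numpy as np

rng = np.random.default_rng(6)
N, K = 1000, 10000
g = rng.uniform(0.5, 1.5, size=N + 1)     # target conductances g_0, ..., g_N
u = rng.uniform(0.0, 1.0, size=N)         # target input voltages

# K noisy realizations with uniform noise around the targets (assumed width 0.05).
G = rng.uniform(g - 0.05, g + 0.05, size=(K, N + 1))
U = rng.uniform(u - 0.05, u + 0.05, size=(K, N))
X = (G[:, 1:] * U).sum(axis=1) / G.sum(axis=1)   # K samples of X_j

print("empirical mean / std of X_j:", X.mean(), X.std())
# Plotting a histogram of X (e.g. with matplotlib) and overlaying a Gaussian
# with this mean and std gives the kind of comparison shown on the slide.
```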

  16. Section 4: Simulation results - PCA
  Parameters:
  ◮ Memristor-based PCA [LWFV18]
  ◮ 10 images of size 16 × 16
  ◮ Noise variance σ² ∈ {0.01, 1, 4} (i.e. σ ∈ {0.1, 1, 2})
  Results:
  [Figure: reconstructed 16 × 16 images. Panels: original image, noisy image, standard PCA, and memristor PCA with σ = 0.1, σ = 1, σ = 2.]
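The memristor-based PCA of [LWFV18] is not reproduced here. As a loosely related illustration of the idea, the sketch below runs power iteration for the leading principal component while perturbing every matrix-vector product, abstracting the crossbar as a noisy in-memory multiply; the random data, the noise model, and the use of power iteration are all assumptions made for this example.

```python
import numpy as np

def noisy_power_iteration(C, sigma, n_iter, rng):
    """Leading eigenvector of C via power iteration, where every matrix-vector
    product is perturbed to mimic a noisy in-memory multiply (abstract model)."""
    v = rng.standard_normal(C.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        noisy_C = C + sigma * rng.standard_normal(C.shape)   # fresh noise per use
        v = noisy_C @ v
        v /= np.linalg.norm(v)
    return v

rng = np.random.default_rng(5)
data = rng.standard_normal((10, 256))       # stand-in for 10 vectorized 16x16 images
C = np.cov(data, rowvar=False)              # 256 x 256 sample covariance
v_noisy = noisy_power_iteration(C, sigma=0.01, n_iter=50, rng=rng)
v_exact = np.linalg.eigh(C)[1][:, -1]       # reference leading eigenvector
print(abs(v_noisy @ v_exact))               # close to 1 when the noise is mild
```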
