
Neuromorphic Computing in CMOS: Digital, Analog or Mixed-Signal?



  1. Neuromorphic Computing in CMOS: Digital, Analog or Mixed-Signal?
  Shreyas Sen, Ayan Biswas, Priyadarshini Panda, Kaushik Roy
  ECE, Purdue University. Sep 27, 2016

  2. Neuromorphic – Fundamental Question
  Error-resiliency → Energy-efficiency
  Neuron architecture:
  • Approximate neurons?
  • Digital, analog or mixed-signal?
  • Noisy neurons?

  3. Digital vs. Analog
  Von-Neumann computing: digital vs. analog [1]
  [1] Sarpeshkar, Rahul. "Analog versus digital: extrapolating from electronics to neurobiology." Neural Computation 10.7 (1998): 1601-1638.
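To make the tradeoff concrete, here is a toy Python sketch of Sarpeshkar's scaling argument [1]: digital energy per operation grows roughly linearly with bit width, while thermal-noise-limited analog energy grows roughly exponentially with effective bits. All constants are hypothetical, chosen only to show the crossover.

# Toy model of the digital-vs-analog energy tradeoff [1].
# Hypothetical constants, for illustration only:
#   - digital energy per op grows ~linearly with bit width b
#   - thermal-noise-limited analog energy grows ~4**b (each extra bit
#     needs ~4x the SNR, hence ~4x the bias current or capacitance)
for b in range(1, 13):
    e_digital = 1.0 * b         # arbitrary units
    e_analog = 0.01 * (4 ** b)  # arbitrary units
    winner = "analog" if e_analog < e_digital else "digital"
    print(f"{b:2d}b: digital={e_digital:7.2f}  analog={e_analog:11.2f}  -> {winner}")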

  4. Neuron Architecture – Digital
  Digital MAC (n=2): multiplier + adder + thresholding
  [Transistor count of the 8b multiplier, adder and thresholding stages]
  8b: high static leakage
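A behavioral sketch of this Dig-N datapath (the scaling and threshold conventions are assumptions, not the paper's design):

import numpy as np

def digital_neuron(x, w, bits=8):
    # Fixed-point MAC neuron: quantize inputs/weights to `bits`,
    # multiply-accumulate in integer arithmetic, then hard-threshold.
    scale = 2 ** (bits - 1) - 1
    xq = np.round(np.clip(x, -1, 1) * scale).astype(np.int32)
    wq = np.round(np.clip(w, -1, 1) * scale).astype(np.int32)
    acc = int(np.dot(xq, wq))      # multiplier + adder (the MAC)
    return 1 if acc > 0 else 0     # thresholding stage

print(digital_neuron(np.array([0.5, -0.25]), np.array([0.8, 0.3])))  # n=2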

  5. Neuron Architecture – Mixed-Signal
  • Resistive load R_load: no PMOS load, to ensure R_load ≠ f(I_total)
  • Large-signal multiplication: V_out = σ( Σ_{k=1}^{n} g_m · w_k · V_k · R_load )
  [Circuit: inputs V_1 … V_n drive g_m stages biased by i_bias; each N-bit weight w_k selects binary-weighted branches (1X, 2X, …, 2^{N-1}X); the branch currents sum into R_load to produce V_out]
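A minimal behavioral model of the MS-N, assuming the N-bit weight code linearly scales the mirrored g_m current; g_m, R_load and the tanh output stage are illustrative stand-ins, not values from the paper:

import numpy as np

def ms_neuron(v_in, w_codes, g_m=1e-4, r_load=1e4):
    # Each input V_k drives a g_m stage; a binary-weighted mirror
    # (1X, 2X, ..., 2^(N-1)X branches) multiplies the current by the
    # integer weight code w_k; all branch currents sum into R_load
    # (resistive, so the load does not depend on I_total).
    i_total = np.sum(g_m * np.asarray(w_codes) * np.asarray(v_in))
    return np.tanh(r_load * i_total)   # sigma(): saturating output

print(ms_neuron(v_in=[0.10, -0.05, 0.20], w_codes=[5, 2, 7]))  # 3b codes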

  6. Comparison: Dig-N vs. MS-N

  7. Error Resiliency vs. MS-N Noise
  • Quantization noise
  • Thermal noise
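Both noise sources can be injected into a behavioral simulation, as in this sketch: a uniform quantizer models quantization noise and additive Gaussian noise models thermal noise. The 3b width matches the deck's MS-N operating point; sigma_thermal is a hypothetical level.

import numpy as np

rng = np.random.default_rng(0)

def noisy_activation(pre_act, bits=3, sigma_thermal=0.05):
    # Quantization noise: round to a 2**bits-level uniform grid on [-1, 1].
    step = 2.0 / (2 ** bits)
    q = np.clip(np.round(np.asarray(pre_act) / step) * step, -1, 1)
    # Thermal noise: additive Gaussian; in the MS-N it grows as the
    # bias current is scaled down.
    return q + rng.normal(0.0, sigma_thermal, np.shape(q))

print(noisy_activation(np.linspace(-1, 1, 5)))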

  8. Conclusion
  • Digital neuron: 8b precision
  • Mixed-signal neuron: 3b (LS), 1 MHz frequency

  9. THANK YOU

  10. Dig-N: Noise and BW vs. Power

  11. Dig-N: Noise and BW vs. Power

  12. [CNN architecture: 28x28 input → 6@24x24 conv → 6@12x12 pool → 12@8x8 conv → 12@4x4 pool → FCN → 10 outputs]
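The feature-map sizes on this slide are consistent with 5x5 valid convolutions and 2x2 pooling (a LeNet-style assumption); this short walk-through reproduces them:

def conv(hw, k=5):   # valid convolution output size
    return hw - k + 1

def pool(hw, p=2):   # non-overlapping pooling output size
    return hw // p

hw = 28                                      # 28x28 input
hw = conv(hw); print(f"conv1: 6@{hw}x{hw}")  # 6@24x24
hw = pool(hw); print(f"pool1: 6@{hw}x{hw}")  # 6@12x12
hw = conv(hw); print(f"conv2: 12@{hw}x{hw}") # 12@8x8
hw = pool(hw); print(f"pool2: 12@{hw}x{hw}") # 12@4x4
print(f"FCN: {12 * hw * hw} -> 10")          # flatten -> 10 classes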

  13. FCN (I: 784, H1: 50, H2: 100, O: 10) – precision vs. error
  • Trained with high precision (16b), then scaled down (16b → 1b)
  • Classification error rises as quantization noise increases; at 5b the network can still hold accuracy
  [Plots: error vs. bias current (500nA, 100nA, 10nA, 1nA) for MNIST FCN, MNIST CNN and CIFAR CNN; CIFAR CNN errors: 44.5, 35.4, 28.6, 25.6]
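As a rough proxy for the 16b → 1b sweep, this sketch quantizes the weights of an FCN with the slide's layer sizes and reports quantization-noise power at each width; measuring the actual classification error would need the trained MNIST network, so random weights stand in here:

import numpy as np

rng = np.random.default_rng(1)

def quantize(w, bits):
    # Uniform symmetric quantization to the given bit width.
    levels = 2 ** (bits - 1) - 1 if bits > 1 else 1
    scale = np.max(np.abs(w)) / levels
    return np.round(w / scale) * scale

# FCN weight shapes from the slide: I784 -> H1 50 -> H2 100 -> O10
trained = [rng.standard_normal(s) for s in [(784, 50), (50, 100), (100, 10)]]
for bits in [16, 8, 5, 4, 3, 2, 1]:
    mse = np.mean([np.mean((w - quantize(w, bits)) ** 2) for w in trained])
    print(f"{bits:2d}b: quantization-noise power = {mse:.2e}")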


  15. MNIST CNN vs. FCN (I: 784, H1: 50, H2: 100, O: 10); includes retraining for 3b (12%)
  • CNN is better trained (shared weights) – baseline error is low (good training)
  • Resilient to the introduction of thermal noise; 50mV (low SNR) diverges more
  • More neurons, bigger network → more error averaging
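The "retraining for 3b" step can be mimicked by training through a forward pass that includes the MS-N non-idealities; this sketch shows only that noisy forward pass, with hypothetical weight scales and noise level:

import numpy as np

rng = np.random.default_rng(2)

def forward_with_noise(x, weights, bits=3, sigma=0.05):
    # Each layer uses bits-wide quantized weights (quantization noise)
    # plus additive Gaussian thermal noise, so retraining on this pass
    # lets the network adapt to both non-idealities.
    a = x
    for w in weights:
        levels = 2 ** (bits - 1) - 1
        scale = np.max(np.abs(w)) / levels
        wq = np.round(w / scale) * scale
        a = np.tanh(a @ wq + rng.normal(0.0, sigma, w.shape[1]))
    return a

weights = [0.1 * rng.standard_normal(s) for s in [(784, 50), (50, 100), (100, 10)]]
print(forward_with_noise(rng.standard_normal(784), weights).shape)  # (10,)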
