
Efficient Deep Feature Learning and Extraction via StochasticNets



  1. Efficient Deep Feature Learning and Extraction via StochasticNets. Mohammad Javad Shafiee, Parthipan Siva, Paul Fieguth and Alexander Wong. Presentation by Ivan Slobozhan

  2. Embedded devices and DNNs • Low-power CPUs • Custom embedded CPUs are not flexible • Additional costs

  3. How to improve • Synaptic formation is stochastic in nature

  4. Random Graphs • Common random graph representation: G(n, p), where 0 < p < 1. • Generalized random graph model: G(V, {p_{i,j}}), with p_{i,j} ∈ [0, 1]

  5. Random Graphs • Start with n vertices V = {v_i | 1 ≤ i ≤ n} and add edges E = {e_{i,j} | 1 ≤ i ≤ n, 1 ≤ j ≤ n, i ≠ j}. • Random graphs with 7 nodes and p_{i,j} = 0.5
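  A minimal sketch (my own illustration, not from the slides) of drawing one realization of the random graph G(n, p) described above, using numpy with the 7-node, p_{i,j} = 0.5 example:

    import numpy as np

    def sample_random_graph(n, p, seed=0):
        """One realization of G(n, p): each possible edge e_{i,j} with i != j
        is included independently with probability p."""
        rng = np.random.default_rng(seed)
        adj = (rng.random((n, n)) < p).astype(int)
        np.fill_diagonal(adj, 0)  # no self-loops, since i != j
        return adj

    # The 7-node, p = 0.5 example from the slide.
    print(sample_random_graph(n=7, p=0.5))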

  6. StochasticNets: Deep Neural Networks as Random Graph Realizations • G(V, p(i → j)), where V = {v_i | 1 ≤ i ≤ n} is the set of neurons, with v_i denoting the i-th neuron and n the total number of neurons, and p(i → j) is the probability that a neural connection occurs between neurons v_i and v_j.
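  As an illustration (again my sketch, not the authors' code), a StochasticNet realization can be drawn from G(V, p(i → j)) by keeping each possible neural connection with its own probability; the probability function below is purely illustrative:

    import numpy as np

    def realize_stochastic_net(n, p, seed=0):
        """Draw one random-graph realization over n neurons, where p(i, j)
        gives the probability of a connection from neuron v_i to neuron v_j."""
        rng = np.random.default_rng(seed)
        adj = np.zeros((n, n), dtype=int)
        for i in range(n):
            for j in range(n):
                if i != j and rng.random() < p(i, j):
                    adj[i, j] = 1
        return adj

    # Illustrative probability function: every distinct pair connects with 0.3.
    print(realize_stochastic_net(n=6, p=lambda i, j: 0.3))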

  7. Constructing a DNN as a Random Graph • Ensure the random graph realization preserves the properties of the desired network architecture

  8. FFN as a StochasticNet • No neural connections between non-adjacent layers. • No neural connections between neurons in the same layer. • p(i → j) = 0 when L(i) = L(j) or |L(i) − L(j)| > 1, where L(·) denotes the layer index of a neuron
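  A hedged sketch of realizing a feed-forward network as a StochasticNet under these constraints: connections are sampled only between adjacent layers, so p(i → j) = 0 holds automatically for same-layer and non-adjacent-layer pairs. The layer sizes and connection probability are assumptions for illustration:

    import numpy as np

    def stochastic_ffn_masks(layer_sizes, p, seed=0):
        """Return one 0/1 connectivity mask per pair of adjacent layers;
        each candidate connection is kept with probability p."""
        rng = np.random.default_rng(seed)
        masks = []
        for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
            masks.append((rng.random((n_in, n_out)) < p).astype(np.float32))
        return masks

    # Example: a 4-layer feed-forward net keeping roughly half of all connections.
    for m in stochastic_ffn_masks([8, 16, 16, 4], p=0.5):
        print(m.shape, int(m.sum()), "connections kept")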

  9. Realization of an FFN as a StochasticNet (figure: a standard FFN and its corresponding graph realization)

  10. Relationship to Other Methods

  11. Feature learning via Deep Convolutional StochasticNets • Neural connectivity in the convolutional layers is arranged such that small, spatially localized neural collections are connected to the same output neuron in the next layer

  12. Deep convolutional StochasticNet

  13. Spatial neural connectivity models • Uniform: p(i → j) = U(0, 1) if v_j ∈ S_i, 0 otherwise • Gaussian: p(i → j) = N(i, σ) (a Gaussian centered on neuron v_i with spread σ) if v_j ∈ S_i, 0 otherwise, where S_i is the spatially localized neighborhood (receptive field) of neuron v_i
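  A sketch of the two connectivity models (my own illustration; the receptive field size k and spread sigma are assumptions): probabilities are defined only inside the receptive field S_i, and a sparse connection mask is then realized by sampling each position with its probability:

    import numpy as np

    def connection_probabilities(k=5, model="gaussian", sigma=1.0, seed=0):
        """Per-position p(i -> j) over a k x k receptive field S_i;
        positions outside S_i implicitly have probability 0."""
        rng = np.random.default_rng(seed)
        if model == "uniform":
            return rng.random((k, k))  # U(0, 1) for every j in S_i
        # Gaussian model: probability decays with distance from the field center.
        y, x = np.mgrid[:k, :k] - (k - 1) / 2.0
        return np.exp(-(x**2 + y**2) / (2 * sigma**2))

    def sample_mask(probs, seed=1):
        """Realize the sparse receptive field: keep each connection with p(i -> j)."""
        rng = np.random.default_rng(seed)
        return (rng.random(probs.shape) < probs).astype(int)

    print(sample_mask(connection_probabilities(model="uniform")))
    print(sample_mask(connection_probabilities(model="gaussian", sigma=1.0)))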

  14. CIFAR-10 dataset (results for the Gaussian and Uniform connectivity models)

  15. STL-10 dataset (results for the Gaussian and Uniform connectivity models)

  16. Extraction time versus the number of neural connections

  17. Classification using the Uniform model

  18. Classification using the Gaussian model

  19. Conclusion • DNNs can be effectively constructed via stochastic connectivity between neurons. • StochasticNets reduce computational complexity. • They provide feature-extraction results that are better than or close to those of ordinary CNNs.
