Efficient Deep Feature Learning and Extraction via StochasticNets - PowerPoint PPT Presentation

SLIDE 1

Efficient Deep Feature Learning and Extraction via StochasticNets

Mohammad Javad Shafiee, Parthipan Siva, Paul Fieguth and Alexander Wong.

Presentation by Ivan Slobozhan

SLIDE 2

Embedded devices and DNN

  • Low-power CPUs
  • Custom embedded CPUs are not flexible
  • Additional costs
SLIDE 3

How to improve

  • Synaptic formation is stochastic in nature
SLIDE 4

Random Graphs

  • Common random graph representation: G(n, p), where 0 < p < 1.
  • Generalized random graph model: G(V, p_{j,k}), with p_{j,k} ∈ [0, 1].
SLIDE 5

Random Graphs

Random graphs with 7 nodes and p_{j,k} = 0.5

  • Start with n vertices V = {v_i | 1 ≤ i ≤ n} and add the edge set E = {e_{j,k} | 1 ≤ j ≤ n, 1 ≤ k ≤ n, j ≠ k}.
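This sampling procedure can be sketched in Python (an illustrative sketch, not from the slides; function name and signature are my own):

```python
import random

def random_graph(n, p, seed=0):
    """Sample a random graph G(n, p): each of the n*(n-1)/2
    possible edges between distinct vertices is included
    independently with probability p."""
    rng = random.Random(seed)
    return [(j, k) for j in range(n) for k in range(j + 1, n)
            if rng.random() < p]

# 7 nodes with edge probability 0.5, as on this slide
edges = random_graph(7, 0.5)
```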

SLIDE 6

StochasticNets: Deep Neural Networks As Random Graph Realizations

  • 𝐻(π‘Š, π‘ž 𝑗 β†’ π‘˜ ), where π‘Š is a set of neurons π‘Š = 𝑀) 1 β‰₯ 𝑗 β‰₯ π‘œ , with 𝑀)

denoting the 𝑗@A neuron and n denoting the total number of neurons and π‘ž 𝑗 β†’ π‘˜ is the probability that neural connection occurs between neuron 𝑀) and 𝑀*.
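A minimal sketch of drawing one network realization, assuming p(j → k) is supplied as a Python function (names are illustrative, not from the paper):

```python
import random

def realize_connections(n, prob, seed=0):
    """Sample a network realization: a directed connection
    j -> k exists with probability prob(j, k)."""
    rng = random.Random(seed)
    return {(j, k) for j in range(n) for k in range(n)
            if j != k and rng.random() < prob(j, k)}

# e.g. a constant connection probability of 0.5 between all neuron pairs
connections = realize_connections(5, lambda j, k: 0.5)
```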

SLIDE 7

Constructing DNN as Random Graph

  • Ensure the realization preserves the intended properties of the network architecture
SLIDE 8

FFN as a StochasticNet

  • No neural connections between non-adjacent layers.
  • No neural connections between neurons on the same layer.

π‘ž 𝑗 β†’ π‘˜ 𝑙 β†’ β„Ž = 0, π‘₯β„Žπ‘“π‘œ 𝑗 = π‘˜|| 𝑗 βˆ’ π‘˜ >

SLIDE 9

Realization of FFN as a StochasticNet

[Figure: an FFN and its graph realization]

SLIDE 10

Relationship to Other Methods

SLIDE 11

Feature learning via Deep Convolutional StochasticNets

  • Neural connectivity in the convolutional layers is arranged such that small spatially localized neural collections are connected to the same output neuron in the next layer
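A rough 1-D sketch of this idea (helper name and signature are hypothetical): each output neuron draws its connections only from a small localized window of the previous layer.

```python
import random

def sample_receptive_field(center, radius, p, n_inputs, seed=0):
    """Sample which inputs inside the localized window around
    `center` connect to a single output neuron, each kept with
    probability p."""
    rng = random.Random(seed)
    window = range(max(0, center - radius),
                   min(n_inputs, center + radius + 1))
    return [i for i in window if rng.random() < p]

# output neuron centered at input 5, window radius 2
field = sample_receptive_field(5, 2, 0.5, 10)
```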

SLIDE 12

Deep convolutional StochasticNet

SLIDE 13

Spatial neural connectivity models

  • π‘ž 𝑗 β†’ π‘˜ = G𝑉 0, 1 , π‘˜πœ—π‘†)

0, π‘π‘’β„Žπ‘“π‘ π‘₯𝑗𝑑𝑓 βˆ’Uniform

  • π‘ž 𝑗 β†’ π‘˜ = N 𝑂 𝑗, 𝜏 , π‘˜πœ—π‘†)

0, π‘π‘’β„Žπ‘“π‘ π‘₯𝑗𝑑𝑓 βˆ’ Gaussian

SLIDE 14

CIFAR-10 dataset

[Figure: results for the Gaussian and Uniform models]

SLIDE 15

STL-10

[Figure: results for the Gaussian and Uniform models]

SLIDE 16

Extraction time versus the number of neural connections

SLIDE 17

Classification using Uniform model

SLIDE 18

Classification using Gaussian model

SLIDE 19

Conclusion

  • DNNs can be effectively constructed via stochastic connectivity between neurons.
  • StochasticNets reduce computational complexity.
  • They deliver feature-extraction results better than, or close to, those of an ordinary CNN.