SLIDE 1

Laterally Connected Lobe Component Analysis: Precision and Topography

Matt Luciw Juyang Weng

Embodied Intelligence Laboratory Department of Computer Science and Engineering Michigan State University East Lansing, MI

SLIDE 2

Enabling Emergent Internal Representation

  • Internal representation for non-symbolic agents
  • Emergent: develops from experience
  • Efficient: utilize limited resource
  • Effective: leads to good performance
  • How must learning mechanisms adapt throughout the time course of development?

SLIDE 3

Cortex-Inspired Multilayer Two-Way Networks

  • Level of modeling
  • Firing rate model
  • Explicitly weighted connected neurons
  • Hebbian learning (LCA)
  • Pathway development
  • Hierarchical, layered networks from sensors to motors

  • Sensors: pixels from camera
  • Motors control actions and behavior

Visuomotor pathways in cortex (adapted from Kandel, Schwartz and Jessell, 2000)

  • Three types of connections:
    1. Bottom-up
    2. Top-down
    3. Lateral

SLIDE 4

Context of This Work

  • Optimal synaptic weight learning: LCA (Weng and Zhang, WCCI 2006; Weng and Luciw, TAMD)
  • Top-down connections
    • Class-based grouping (Luciw and Weng, WCCI 2008)
  • Top-down connections and time
    • Almost-perfect recognition of centered objects (Luciw and Weng, ICDL 2008)
  • This work: extend LCA and MILN with adaptive lateral excitatory connections

SLIDE 5

Related Work: Lateral Connections

  • SOM (Kohonen)
    • Isotropic updating
    • Scope and learning rates manually tuned
  • LISSOM (Miikkulainen et al.)
    • Explicit lateral excitatory and inhibitory connections
    • Learning rates manually tuned
  • MILN (Weng et al.)
    • "Growing cortex" (scheduled growth times)
    • Optimal LCA: automatic tuning of learning rates
  • LC-LCA within MILN (this work)
    • Explicit lateral excitatory connections
    • Uses the optimal LCA framework
    • Includes top-down connections

SLIDE 6

Motor and Somatosensory Organization

  • Primary cortical areas are organized topographically.
SLIDE 7

Tootell et al., fMRI mapping of a morphed continuum of 3D shapes within inferior temporal cortex, 2007

FFA and PPA Areas

SLIDE 8

Buzás et al., Model-based analysis of excitatory lateral connections in visual cortex, 2006

V1 Connectivity

SLIDE 9

Lateral Excitation: Conflicting Criteria

  • Early stages: the brain must organize more globally
    • Critical for generalization with limited connections
    • Mechanism: isotropic updating
  • Later stages: the brain must fine-tune its representation
    • Critical for superior performance
  • Lateral excitation mechanisms for both?
    • Isotropic updating: organized but not precise
    • Neurons do not excite (or interfere with) one another: precise but not organized
  • Solution: adaptive lateral connections

Figure from Weng, Luwang, Lu and Xue, 2007

SLIDE 10

LC-LCA Algorithm

SLIDE 11

Network Computation

  • 1. Neurons compute: each neuron computes a pre-response from its bottom-up, top-down, and lateral inputs
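As a rough illustration of this step (assumptions: each connection type contributes a normalized inner product between its weights and its input, and the three contributions are summed with equal weight; the paper's exact formula may differ):

```python
import numpy as np

def ncc(w, v):
    """Normalized inner product (cosine); a common response measure in LCA."""
    nw, nv = np.linalg.norm(w), np.linalg.norm(v)
    return float(w @ v) / (nw * nv) if nw > 0 and nv > 0 else 0.0

def pre_response(x, z, r, wb, wt, wl):
    """Pre-response of one neuron from bottom-up input x (weights wb),
    top-down input z (weights wt), and lateral input r (weights wl).
    Equal weighting of the three terms is an assumption."""
    return ncc(wb, x) + ncc(wt, z) + ncc(wl, r)
```

A neuron whose three weight vectors all align with their inputs would score the maximum pre-response of 3; the pre-responses are then ranked by the competition step on the next slide.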
SLIDE 12

Lateral Inhibition

  • 2. Neurons compete:
    1. Rank neuron pre-responses
    2. Top-k are scaled and will update
    3. Others do not fire

  • Efficient approximation: replaces iterations
  • Simplifies lateral dynamics, but not biologically plausible
  • Sparse coding emerges
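The three-step competition replaces iterative lateral-inhibition dynamics with a single ranking pass. A minimal sketch, where the linear rescaling of the winners into (0, 1] is an illustrative choice rather than the paper's exact scaling:

```python
import numpy as np

def top_k_compete(pre_responses, k):
    """Winner-take-k approximation of lateral inhibition:
    1. rank neurons by pre-response,
    2. rescale the top-k winners (linear rescaling against the
       first loser is an assumption, not the paper's formula),
    3. silence all other neurons (they do not fire or update).
    """
    r = np.asarray(pre_responses, dtype=float)
    order = np.argsort(r)[::-1]                  # indices, best first
    cut = r[order[k]] if k < r.size else 0.0     # first loser's pre-response
    denom = r[order[0]] - cut
    out = np.zeros_like(r)
    for i in order[:k]:
        out[i] = (r[i] - cut) / denom if denom > 0 else 1.0
    return out
```

Because only the k winners fire and later adapt, the layer's response vector is sparse by construction, which is how sparse coding emerges without simulating recurrent dynamics.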
SLIDE 13
  • 3. Optimal Hebbian Learning
  • Optimal adaptation of winners
    • Hebbian adaptation, given stimulus (pre-synaptic activity) and firing (post-synaptic activity)
    • Learning rates are automatically tuned
    • Each neuron converges to the principal component of its observations
    • Minimizes representational error in a mean-square sense

LCA Updating
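The "automatically tuned" learning rate is, in the published CCI LCA, an amnesic-mean schedule driven by each winning neuron's firing age. A minimal sketch, assuming typical parameter values (t1, t2, c, r here are illustrative defaults, not necessarily the values used in this work):

```python
import numpy as np

def lca_update(w, n, x, y, t1=20.0, t2=200.0, c=2.0, r=10000.0):
    """One amnesic-mean LCA weight update for a winning neuron.

    w : current weight vector
    n : this neuron's firing age (>= 1)
    x : pre-synaptic input (stimulus)
    y : post-synaptic firing response
    The three-section amnesic function mu(n) keeps the learning
    rate decaying with age while staying large enough to track
    change, so no rate needs manual tuning.
    """
    if n < t1:
        mu = 0.0
    elif n < t2:
        mu = c * (n - t1) / (t2 - t1)
    else:
        mu = c + (n - t2) / r
    keep = (n - 1 - mu) / n      # retention rate
    learn = (1 + mu) / n         # auto-tuned learning rate
    return keep * w + learn * y * x, n + 1
```

Iterating this over a neuron's response-weighted observations drives the weight vector toward their first principal component, the mean-square-optimal direction claimed on the slide.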

SLIDE 14

LCA Weight Development

  • Neuron weights are roughly the expectation of their response-weighted input
  • For lateral weights: (figure: neurons j and i; neuron i fires)
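The expectation claim can be checked empirically. In this hypothetical sketch, a neuron's firing response tracks one input component, and the response-weighted mean of its inputs tilts toward exactly that component:

```python
import numpy as np

def response_weighted_expectation(xs, ys):
    """Empirical form of the slide's claim: the weight vector is
    roughly E[y * x] / E[y], the expectation of the neuron's
    response-weighted input.  xs: inputs (n, dim); ys: responses (n,)."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    return (ys[:, None] * xs).sum(axis=0) / ys.sum()

# Hypothetical demo: the neuron fires in proportion to input component 0.
rng = np.random.default_rng(0)
xs = rng.random((2000, 4))
ys = xs[:, 0]
w = response_weighted_expectation(xs, ys)
# For uniform inputs, w[0] approaches E[x0^2]/E[x0] = 2/3 while the
# other components approach 1/2: the weights favor what drives firing.
```

The same logic applies to lateral weights: the weight from neuron j to neuron i grows with how strongly j was active whenever i fired.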

SLIDE 15

Previous Method: 3 x 3 Updating

  • Some neurons converge to specialize in low-probability parts of the input space

SLIDE 16

Effect of Adaptive Lateral Connections: Intuitive (?) Example

  • No lateral excitation: unbalanced
  • Isotropic updating: balanced but wasteful
  • Adaptive

SLIDE 17

Relative Levels of Modeling

  • Bottom-up connections (+): level l
  • Top-down connections (+): level l
  • Lateral connections (-): higher than l
  • Before: lateral connections (+): higher than l
  • Now: lateral connections (+): level l

SLIDE 18

Experiments

  • MSU 25-Objects Dataset: 25 classes
  • 200 images per class: 3D rotation
  • 4/5 training data, 1/5 testing data
  • Grayscale

SLIDE 19

Network Setup

(Figure labels: imposed communication of top firing; Left: training (learning); Below: testing)

SLIDE 20

Experiment Setup

  • 5 trials for each test
  • Each trial trained on 25,000 image/label pairs
  • Error measured on disjoint testing samples
  • Developmental scheduling
    • No adaptation of lateral weights for the first 500 time steps
    • Schedule the number of winners (k)
  • Compared:
    • 3x3 with top-down
    • 3x3 without top-down
    • LC-LCA with top-down
    • LC-LCA without top-down

SLIDE 21

Results: Recognition Rate

SLIDE 22

Results: Neuronal Entropy

  • If entropy is zero, neuron i fires for samples from a single class.
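Given the class labels of the samples for which neuron i fired, this metric is the Shannon entropy of that class distribution; a minimal sketch:

```python
import math
from collections import Counter

def neuronal_entropy(firing_classes):
    """Entropy (bits) of the class distribution over the samples for
    which a neuron fired.  Zero means the neuron fires for only one
    class, matching the interpretation on the slide."""
    counts = Counter(firing_classes)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())
```

Lower average entropy across the layer indicates more class-pure (less interfering) neurons.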
SLIDE 23

Topographic Organization Comparison

SLIDE 24

Comparison of Updating Methods

(Figure: update equations shown for LCA, SOM, and LISSOM)

SLIDE 25

Results: Comparison of Updating Methods

SLIDE 26

Conclusion

  • Laterally Connected Lobe Component Analysis
  • Smoothness and precision are conflicting criteria
    • Mitigated through adaptive lateral connections
    • Developmental scheduling
  • Integrated networks with bottom-up, lateral, and top-down connections
  • LCA's optimal update leads to more stable performance
  • Future directions
    • Locally connected, laterally connected networks
    • Adaptive lateral connections in what/where networks
