Feature Grouping as a Stochastic Regularizer for High Dimensional Structured Data
Bertrand Thirion
(INRIA, France)
Gaël Varoquaux
(INRIA, France)
Sergül Aydöre
(Stevens Institute of Technology, USA)
POSTER: Pacific Ballroom #121, 06/11, Tuesday
Examples of high-dimensional structured data (image credits):
- PET acquisition process [Wikipedia]
- MRI scanner and rs-fMRI time-series acquisition [NVIDIA]
- Genomics [Integrative Genomics Viewer, 2012]
- Seismology [https://www.mapnagroup.com]
- Astronomy [Astronomy Magazine, 2015]
Typical MEG equipment [BML2001]
Recursive Nearest Agglomeration (ReNA) [Hoyos et al 2016], here with number of clusters = 5. The grouping algorithm runs on the training data over iterations 1, 2, …, N: starting from one cluster per feature, clusters are recursively merged with their nearest neighbors until the desired number of clusters remains. This leads to good signal approximations.
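The recursive merging step can be sketched in plain NumPy. This is a minimal illustration, not the authors' implementation: it assumes a 1-D chain graph over features (ReNA works on a general neighborhood graph), and the helper names `merge_pass` and `rena_1d` are hypothetical.

```python
import numpy as np

def merge_pass(X, labels):
    """One agglomeration pass: each cluster is merged with its nearest
    neighbor among the clusters adjacent to it on a 1-D chain graph."""
    ids = np.unique(labels)
    K = len(ids)
    # cluster-mean signals, shape (n_samples, K)
    means = np.column_stack([X[:, labels == c].mean(axis=1) for c in ids])
    parent = np.arange(K)                  # union-find forest over clusters

    def root(i):
        while parent[i] != i:
            i = parent[i]
        return i

    for k in range(K):
        nbrs = [j for j in (k - 1, k + 1) if 0 <= j < K]
        # nearest neighbor by distance between cluster-mean signals
        j = min(nbrs, key=lambda j: np.linalg.norm(means[:, k] - means[:, j]))
        ri, rj = root(k), root(j)
        if ri != rj:
            parent[ri] = rj                # union: merge the two clusters
    roots = np.array([root(k) for k in range(K)])
    # map each feature's old cluster id to its merged root, relabel 0..K'-1
    new = roots[np.searchsorted(ids, labels)]
    return np.unique(new, return_inverse=True)[1]

def rena_1d(X, n_clusters):
    labels = np.arange(X.shape[1])         # start: every feature its own cluster
    # NOTE: a single pass can merge below the target count; ReNA proper
    # controls this more carefully.
    while len(np.unique(labels)) > n_clusters:
        labels = merge_pass(X, labels)
    return labels
```

On data with two well-separated blocks of correlated features, one pass already recovers the two groups.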
Feature Grouping Matrix:

$$
\Phi = \begin{bmatrix}
\alpha_1 \cdots \alpha_1 & 0 \cdots 0 & 0 \cdots 0 & 0 \cdots 0 & 0 \cdots 0\\
0 \cdots 0 & \alpha_2 \cdots \alpha_2 & 0 \cdots 0 & 0 \cdots 0 & 0 \cdots 0\\
0 \cdots 0 & 0 \cdots 0 & \alpha_3 \cdots \alpha_3 & 0 \cdots 0 & 0 \cdots 0\\
0 \cdots 0 & 0 \cdots 0 & 0 \cdots 0 & \alpha_4 \cdots \alpha_4 & 0 \cdots 0\\
0 \cdots 0 & 0 \cdots 0 & 0 \cdots 0 & 0 \cdots 0 & \alpha_5 \cdots \alpha_5
\end{bmatrix}
$$
Each row captures a different structure
Reduction and Low-rank Approximation
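A grouping matrix of this block form can be built directly from cluster labels. The sketch below assumes $\alpha_k = 1/\sqrt{|\text{cluster } k|}$, one common normalization (an assumption here) that makes the rows orthonormal, so reduction is $z = \Phi x$ and the low-rank approximation $\hat{x} = \Phi^\top \Phi x$ is a cluster-wise-constant version of $x$.

```python
import numpy as np

def grouping_matrix(labels):
    """Build Phi from cluster labels. With alpha_k = 1/sqrt(|cluster k|)
    the rows are orthonormal, so Phi @ Phi.T is the identity."""
    ids = np.unique(labels)
    Phi = np.zeros((len(ids), len(labels)))
    for k, c in enumerate(ids):
        members = labels == c
        Phi[k, members] = 1.0 / np.sqrt(members.sum())
    return Phi

labels = np.array([0, 0, 0, 1, 1, 2])   # 6 features in 3 clusters
Phi = grouping_matrix(labels)
x = np.array([1., 2., 3., 10., 12., 7.])
z = Phi @ x                             # reduction: 3 values instead of 6
x_hat = Phi.T @ z                       # low-rank approximation (cluster means)
```

With this normalization, `x_hat` replaces each feature by the mean of its cluster, which is exactly the "good signal approximation" property the clustering is chosen for.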
Consider a fully connected neural network with I layers, and pre-compute a bank of feature grouping matrices. Then, at each training iteration:
1. Sample a minibatch from the training set
2. Sample a grouping matrix from the bank of feature grouping matrices
3. Forward propagation: re-define the parameter space and project the input onto the lower-dimensional space
4. Back propagation: apply backpropagation through the reduced parameters
5. Update the parameters
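Putting the loop together, one stochastic training step can be sketched on a toy linear model standing in for the network's first layer. Everything here is an illustrative assumption, not the paper's setup: `random_phi` fakes the pre-computed bank with random contiguous groupings, and the learning rate, sizes, and step count are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
P, K, n = 12, 4, 32                                # features, clusters, samples

def random_phi(P, K, rng):
    """Hypothetical stand-in for one entry of the pre-computed bank:
    a random contiguous grouping with orthonormal rows."""
    cuts = np.sort(rng.choice(np.arange(1, P), size=K - 1, replace=False))
    bounds = np.concatenate(([0], cuts, [P]))
    Phi = np.zeros((K, P))
    for k in range(K):
        Phi[k, bounds[k]:bounds[k + 1]] = 1.0 / np.sqrt(bounds[k + 1] - bounds[k])
    return Phi

bank = [random_phi(P, K, rng) for _ in range(5)]   # pre-computed bank

X = rng.normal(size=(n, P))
y = X @ rng.normal(size=P)                         # toy regression target

w = np.zeros(P)                                    # full-dimensional weights
lr = 0.1
for step in range(200):
    idx = rng.choice(n, size=8, replace=False)     # 1. sample a minibatch
    Phi = bank[rng.integers(len(bank))]            # 2. sample a grouping matrix
    Z = X[idx] @ Phi.T                             # 3. project the input: z = Phi x
    w_red = Phi @ w                                #    reduced parameters
    err = Z @ w_red - y[idx]                       #    forward pass (linear model)
    grad_red = Z.T @ err / len(idx)                # 4. backprop in the reduced space
    w -= lr * (Phi.T @ grad_red)                   # 5. map back, update parameters
```

Because a fresh grouping matrix is drawn at every step, the projection acts as a stochastic regularizer, analogous in spirit to dropout over groups of correlated features.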
Feature Grouping performs best as the sample size decreases
Performance in terms of sample size for fMRI data; performance in terms of computation time for Olivetti Faces.
Feature Grouping is computationally efficient and robust to noise.