CS 103: Representation Learning, Information Theory and Control
Lecture 5, Feb 8, 2019
Representation Learning and the Information Bottleneck
Desiderata for representations: an optimal representation z of the data x for the task y is a stochastic function of x that is sufficient for y and, among sufficient representations, minimal, invariant to nuisances, and maximally disentangled.
Disentanglement is measured by the total correlation TC(z) = KL( p(z) ‖ ∏_i p(z_i) ), which is zero exactly when the components z_i are independent.
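A representation is disentangled when its components are independent, which the total correlation TC(z) = KL( p(z) ‖ ∏_i p(z_i) ) quantifies. As a concrete illustration (not code from the lecture; all names are made up), TC can be computed exactly for a small discrete two-component z:

```python
import numpy as np

def total_correlation(p_joint):
    """TC of a 2D discrete joint p(z1, z2) (rows: z1, cols: z2), in nats.

    TC(z) = KL( p(z1, z2) || p(z1) p(z2) ); zero iff z1 and z2 are
    independent, i.e. the representation is disentangled.
    """
    p1 = p_joint.sum(axis=1, keepdims=True)   # marginal p(z1)
    p2 = p_joint.sum(axis=0, keepdims=True)   # marginal p(z2)
    mask = p_joint > 0                        # skip zero-probability cells
    return float(np.sum(p_joint[mask] * np.log(p_joint[mask] / (p1 * p2)[mask])))

# Independent components: TC = 0 (disentangled)
p_indep = np.outer([0.5, 0.5], [0.25, 0.75])
print(total_correlation(p_indep))   # ~0.0

# Perfectly correlated components: TC = log 2 (fully entangled)
p_corr = np.array([[0.5, 0.0], [0.0, 0.5]])
print(total_correlation(p_corr))    # ~0.693
```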
Minimal sufficient representations for deep learning
The training loss splits into a cross-entropy term and a regularizer: L = H_{p,q}(y | z) + β I(x; z). The cross-entropy enforces sufficiency for the task; the regularizer I(x; z) enforces minimality.
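A minimal numpy sketch of such a loss, assuming a Gaussian encoder q(z|x) = N(μ(x), σ(x)²) whose I(x; z) term is upper-bounded by an average KL to a standard normal (a common choice in this literature; the function names and the choice of prior are illustrative, not the lecture's code):

```python
import numpy as np

def cross_entropy(probs, labels):
    """H_{p,q}(y|z): mean negative log-probability of the correct class."""
    return float(-np.mean(np.log(probs[np.arange(len(labels)), labels])))

def kl_to_standard_normal(mu, sigma):
    """KL( N(mu, sigma^2) || N(0, 1) ), summed over latent dims, batch-averaged.

    This is an upper bound on I(x; z) when N(0, 1) stands in for the
    (intractable) marginal p(z).
    """
    kl = 0.5 * (mu**2 + sigma**2 - 2.0 * np.log(sigma) - 1.0)
    return float(np.mean(kl.sum(axis=1)))

def ib_lagrangian(probs, labels, mu, sigma, beta=0.1):
    """Cross-entropy (sufficiency) + beta * regularizer (minimality)."""
    return cross_entropy(probs, labels) + beta * kl_to_standard_normal(mu, sigma)
```

With β = 0 this reduces to the usual cross-entropy loss; increasing β trades task accuracy for a more compressed (and, per the next slides, more invariant) representation.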
We only need to enforce minimality (easy) to gain invariance (difficult)
For any nuisance n and sufficient representation z, I(z; n) ≤ I(z; x) − I(x; y): the term I(x; y) is a constant of the task, I(z; x) measures (lack of) minimality, and I(z; n) measures (lack of) invariance. Reducing I(z; x) therefore tightens the bound and forces invariance.
The standard architecture alone already promotes invariant representations
Creating a soft bottleneck with controlled noise
Achille and Soatto, "Information Dropout: Learning Optimal Representations Through Noisy Computation", PAMI 2018 (arXiv 2016)
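The soft bottleneck can be sketched as a layer that multiplies its activations by log-normal noise, z = f(x) · ε with log ε ~ N(0, α(x)²), where a larger learned noise level α destroys more information about x. A minimal numpy sketch under these assumptions (function and argument names are illustrative, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)

def information_dropout(features, alpha, train=True):
    """Multiplicative log-normal noise: z = f(x) * eps, log eps ~ N(0, alpha^2).

    alpha plays the role of a learned, input-dependent noise level: alpha = 0
    passes features through unchanged, while larger alpha injects more noise
    and so limits how much information about x survives the layer.
    """
    if not train:
        return features  # no noise is injected at test time in this sketch
    log_eps = rng.normal(0.0, 1.0, size=features.shape) * alpha
    return features * np.exp(log_eps)

x = np.ones((4, 8))
z = information_dropout(x, alpha=0.5)   # noisy, information-limited features
```

In the full method the per-unit α(x) is produced by the network and penalized by the loss's regularizer, so the bottleneck tightens exactly where information about x is not needed for y.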
Only the informative part of the image is retained; other information is discarded.
(Achille and Soatto, 2017)