Neural Collaborative Subspace Clustering - PowerPoint PPT Presentation
SLIDE 1

International Conference on Machine Learning (ICML), Long Beach, CA, June 10-15, 2019

Neural Collaborative Subspace Clustering

Tong Zhang, Pan Ji, Mehrtash Harandi, Wenbing Huang, Hongdong Li

SLIDE 2

Subspace Clustering

  • Cluster data points drawn from a union of low-dimensional subspaces
  • Applications: image clustering, motion segmentation, etc.
SLIDE 3

Subspace Clustering Methods

  • State-of-the-art methods consist of two steps:
  • 1. Construct an affinity matrix for the whole dataset;
  • 2. Apply normalized cuts or spectral clustering.
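As a hedged illustration of these two steps (not the talk's own method), the pipeline can be sketched in NumPy. Absolute cosine similarity stands in for the self-expressive affinities that real methods such as SSC or LRR compute, and a tiny k-means completes the spectral step:

```python
import numpy as np

def two_step_subspace_clustering(X, k, iters=50):
    """Sketch of the classical two-step pipeline.
    Step 1 uses absolute cosine similarity as a stand-in affinity
    (real methods such as SSC/LRR solve a self-expressive program).
    Step 2 is plain spectral clustering with a tiny k-means."""
    # Step 1: affinity matrix for the whole dataset.
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    A = np.abs(Xn @ Xn.T)
    np.fill_diagonal(A, 0.0)

    # Step 2: spectral clustering on the symmetric normalized Laplacian.
    d = np.sqrt(A.sum(axis=1))
    L = np.eye(len(A)) - A / d[:, None] / d[None, :]
    _, vecs = np.linalg.eigh(L)            # eigenvalues in ascending order
    U = vecs[:, :k]                        # embedding from k smallest eigenvectors

    # Deterministic farthest-point init, then Lloyd iterations.
    centers = [U[0]]
    for _ in range(k - 1):
        dist2 = np.min([((U - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(U[int(np.argmax(dist2))])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((U[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([U[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels
```

Note that both steps operate on the full N × N affinity matrix, which is exactly the scalability issue raised on the next slide.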
SLIDE 4

Scalability Issue!!

  • Affinity matrix construction is expensive:
  • Large memory footprint;
  • High complexity in optimization.
  • Spectral clustering is expensive:
  • Computing SVD on large matrices is demanding.
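To make the memory footprint concrete: a single dense N × N float64 affinity matrix for an MNIST-sized dataset (N = 70,000) already needs about 39 GB, before any optimization or SVD runs. A back-of-the-envelope helper (illustrative, not from the talk):

```python
def affinity_memory_gb(n_points: int, bytes_per_entry: int = 8) -> float:
    """Memory for one dense n x n affinity matrix, in GB (10^9 bytes)."""
    return n_points ** 2 * bytes_per_entry / 1e9

# MNIST scale: 70,000 points -> 39.2 GB for a single float64 affinity matrix.
```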
SLIDE 5

Can we avoid constructing huge affinity matrices and bypass spectral clustering?

SLIDE 6

Our Idea

  • Construct affinity matrices on mini-batches;
  • Train a classifier using those affinity matrices.
SLIDE 7

Our Idea

  • Construct affinity matrices on mini-batches;
  • Train a classifier using those affinity matrices.
  • How?
SLIDE 8

Affinity from Classification

  • Build a connection between clustering and classification via affinity matrices.

Affinity matrix from classification: A_cls(i, j) is derived from the classifier's soft cluster assignments for samples i and j.
SLIDE 9

Affinity from Subspace

  • Subspace affinity A_sub from self-expressiveness.

Architecture: input → Encoder → latent codes Z → Self-Expressive Layer (Z ↦ ZC) → Decoder → output; the learned coefficients C* give A_sub.
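A minimal NumPy sketch of the self-expressive idea, using a ridge-regularized closed form in place of the learned layer. Zeroing the diagonal afterwards is a simplification of the usual diag(C) = 0 constraint, and `lam` is an illustrative parameter, not a value from the talk:

```python
import numpy as np

def subspace_affinity(Z, lam=0.1):
    """Self-expressive affinity from latent codes Z (d x n, one column per
    sample). Solves min_C ||Z - ZC||_F^2 + lam * ||C||_F^2 in closed form,
    then symmetrizes |C| into an affinity matrix."""
    n = Z.shape[1]
    G = Z.T @ Z                                   # Gram matrix of the codes
    C = np.linalg.solve(G + lam * np.eye(n), G)   # ridge solution of Z ~ ZC
    np.fill_diagonal(C, 0.0)                      # simplification of diag(C) = 0
    return (np.abs(C) + np.abs(C).T) / 2
```

For codes drawn from independent subspaces, the cross-subspace blocks of C vanish, so A_sub connects only samples from the same subspace.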

SLIDE 10

Collaborative Learning

  • The subspace affinity A_sub is more confident than the classification affinity A_cls at identifying samples from the same class (positive confidence).

SLIDE 11

Collaborative Learning (cont’d)

  • The classification affinity A_cls is more confident than the subspace affinity A_sub at identifying samples from different classes (negative confidence).
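The exchange described on these two slides can be sketched as follows; the thresholds and the cross-entropy form are illustrative choices, not the paper's exact loss:

```python
import numpy as np

def collaborative_targets(A_sub, A_cls, pos_thresh=0.8, neg_thresh=0.2):
    """A_sub nominates confident positive pairs (likely same cluster);
    A_cls nominates confident negative pairs (likely different clusters).
    Thresholds are illustrative."""
    return A_sub >= pos_thresh, A_cls <= neg_thresh

def collaborative_loss(A_sub, A_cls, pos, neg, eps=1e-9):
    """Each affinity is supervised by the other's confident pairs:
    pull the classification affinity up on subspace-nominated positives,
    push the subspace affinity down on classifier-nominated negatives."""
    l_cls = -np.log(A_cls[pos] + eps).mean() if pos.any() else 0.0
    l_sub = -np.log(1.0 - A_sub[neg] + eps).mean() if neg.any() else 0.0
    return l_cls + l_sub
```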

SLIDE 12

Our Framework: Collaborative Learning

(Framework diagram: a data batch feeds both the subspace-affinity branch and the classification branch, which supervise each other.)

SLIDE 13

Clustering via Classifier

  • Output the clustering directly through the classification branch (bypassing spectral clustering):

    c_i = argmax_h p_ih, h = 1, ⋯, K,

    where K is the number of clusters and p_ih is the classifier's soft assignment of sample i to cluster h.
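At inference this assignment is just an argmax over the K classifier outputs; a minimal sketch:

```python
import numpy as np

def assign_clusters(probs):
    """Cluster label per sample: argmax over the K classifier outputs.
    probs has shape (n_samples, K); no spectral step is needed."""
    return np.argmax(probs, axis=1)
```

For example, `assign_clusters(np.array([[0.1, 0.9], [0.8, 0.2]]))` yields labels `[1, 0]`.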

SLIDE 14

Experiments:

  • MNIST
  • Fashion-MNIST
SLIDE 15

Conclusion

  • Subspaces are a powerful tool for representing data in high-dimensional space.

  • Introduced a collaborative learning paradigm for clustering.
  • Made subspace clustering scalable through batch-wise training.
SLIDE 16

We’re hiring! For more details, visit http://www.nec-labs.com/research-departments/media-analytics.