EE 6882 Visual Search Engine
Lecture #7, March 5, 2012: Relevance Feedback; Graph-Based Semi-Supervised Learning; Application of Image Matching: Manipulation Detection


1. EE 6882 Visual Search Engine, Lecture #7 (March 5, 2012)
   • Relevance Feedback
   • Graph-Based Semi-Supervised Learning
   • Application of Image Matching: Manipulation Detection

   What We Have Learned
   • Image Representation and Retrieval Using Global Features
   • Local Features and Image Matching
   • Image Classification

2. What More Can Be Done?
   • Image Representation and Retrieval
     – User in the Loop: Relevance Feedback
   • Local Features and Image Matching
     – New Features
     – Different Ways of Quantization, Codebook Learning
     – Application: Duplicate Detection
   • Image Classification
     – Machine Learning Techniques
     – Semi-Supervised Learning
     – Multi-Modal Fusion
   • Others: User Interfaces

   When the User Is in the Loop: Interactive Query Refinement
   [Figure: interactive loop of (1) query formulation (query examples, key words), (2) query processing (feature selection, distance metric, classifiers, ranking model), and (3) results. Relevance feedback (on shots, tracks, or track intervals), updated features/attributes, new classifiers, and the interaction log drive online updates and reranking, letting the system handle novel data.]

3. Example: Columbia TAG Interactive Image Search System
   • Demo: Rapid Image Annotation with User Interaction (S.-F. Chang, Columbia U.)

   A Very Simple Case: Query Update
   • Automatically update the query point based on user feedback.
   [Figure: query point and retrieval results plotted in a 2-D feature space (f1, f2).]

4. User Provides Feedback
   [Figure: positive and negative feedback samples marked in the (f1, f2) feature space.]

   Query Update
   • New query vector = mean(positive feedback vectors)
   [Figure: the updated query point moves toward the positive feedback cluster.]
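To make the query-update step concrete, here is a minimal NumPy sketch. The slide's rule sets the new query to the mean of the positive feedback vectors; the optional Rocchio-style weights (alpha, beta, gamma) and the use of negative feedback are assumptions added for illustration, not something the slide specifies.

```python
import numpy as np

def update_query(query, positives, negatives=None,
                 alpha=0.0, beta=1.0, gamma=0.0):
    """Move the query point based on relevance feedback.

    With the defaults (alpha=0, beta=1, gamma=0) this reduces to the
    slide's rule: new query = mean of the positive feedback vectors.
    Nonzero alpha/gamma give a Rocchio-style update (an assumption,
    not from the slide).
    """
    query = np.asarray(query, dtype=float)
    new_q = alpha * query + beta * np.mean(positives, axis=0)
    if negatives is not None and gamma > 0:
        new_q -= gamma * np.mean(negatives, axis=0)
    return new_q

# Toy example in a 2-D feature space (f1, f2).
q = [0.2, 0.8]
pos = np.array([[0.6, 0.4], [0.7, 0.5]])
print(update_query(q, pos))   # -> [0.65, 0.45], the positive mean
```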

5. Query Expansion
   [Figure: multiple query points placed over the positive feedback region of the (f1, f2) space.]

   Build Multiple Classifiers
   [Figure: decision boundaries separating positive from negative feedback in the (f1, f2) space.]
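The slide leaves the classifier choice open; as one plausible instantiation (an assumption, not specified in the slides), a linear SVM can be trained on the positive and negative feedback and used to rerank the database:

```python
import numpy as np
from sklearn.svm import LinearSVC

def rerank_with_feedback(database, positives, negatives):
    """Train a classifier on user feedback and rerank the database.

    The choice of a linear SVM is illustrative; the slide only says
    'build multiple classifiers' without fixing the model.
    """
    X = np.vstack([positives, negatives])
    y = np.array([1] * len(positives) + [0] * len(negatives))
    clf = LinearSVC().fit(X, y)
    scores = clf.decision_function(database)     # higher = more relevant
    return np.argsort(-scores)                   # indices, best first

rng = np.random.default_rng(0)
db = rng.normal(size=(100, 2))                   # 100 items, features (f1, f2)
pos = db[:3] + 1.0                               # stand-in feedback samples
neg = db[3:6] - 1.0
print(rerank_with_feedback(db, pos, neg)[:5])    # top-5 after reranking
```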

6. Graph-Based Semi-Supervised Learning
   • Given a small set of labeled data and a large number of unlabeled data in a high-dimensional feature space:
     – Build sparse graphs with local connectivity
     – Propagate information over graphs of large data sets
     – Hopefully robust to noise and scalable to gigantic sets
   [Figure: input samples with sparse labels → label propagation on the graph → label inference results (unlabeled, positive, negative).]

   Intuition
   • Capture local structures via a sparse graph: graph-based semi-supervised learning through sparse graph construction (e.g., kNN) can behave like a nonlinear classifier where a linear classifier fails.
   [Figure: a linear classifier vs. the nonlinear boundary recovered by graph-based SSL.]

7. Possible Applications: Propagating Labels in Interactive Search & Auto Re-ranking
   [Figure: system diagram (S.-F. Chang, Columbia U.): image/video data → processing (denoising, cropping, ...) → feature extraction → compute similarity (no predefined category) → graph construction → label propagation. In automatic mode, top-ranked results from an existing ranking/filtering system seed re-ranking over the large set; in interactive mode, the user browses and labels through the user interface. Applications: search, browsing.]

   Background Review
   • Given a dataset of labeled samples $\{(x_i, y_i)\}_{i=1}^{l}$ and unlabeled samples $\{x_i\}_{i=l+1}^{n}$
   • Build an undirected graph with samples as vertices and edges weighted by sample similarity
   • Define the weight matrix $W = \{w_{ij}\}$; the vertex degree is $d_i = \sum_{j=1}^{n} w_{ij}$, collected in $D = \mathrm{diag}(d_1, \dots, d_n)$

8. Example
   [Figure: a small graph with its weight matrix $W$, the diagonal node-degree matrix $D$ (degrees 1, 3, 3, 4, 3 in the example), and a label matrix $F$ of size samples × classes, with one-hot rows for labeled samples and predicted scores (e.g., 0.1, 0.2, 0.9) or '?' for unlabeled ones; graph-based SSL fills in the missing label predictions.]

   Some Options for Constructing a Sparse Graph
   • Distance threshold: connect $i$ and $j$ if $\|x_i - x_j\| \le \epsilon$
   • k-nearest-neighbor graph: $\max_{B} \sum_{ij} w_{ij} B_{ij}$ s.t. $\sum_j B_{ij} = k$, $B_{ii} = 0$ (then symmetrize)
   • b-matched graph (Huang and Jebara, AISTATS 2007; Jebara, Wang, and Chang, ICML 2009): $\max_{B} \sum_{ij} w_{ij} B_{ij}$ s.t. $\sum_j B_{ij} = b$, $B_{ii} = 0$, $B_{ij} = B_{ji}$
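A minimal sketch of the kNN option (true b-matching requires solving the symmetric, degree-constrained matching problem above, so this sketch only does the greedy kNN variant with symmetrization, a common simplification):

```python
import numpy as np

def knn_graph(X, k=4, sigma=1.0):
    """Build a symmetrized kNN graph with Gaussian edge weights.

    Each node keeps edges to its k nearest neighbors; the adjacency is
    then symmetrized with max(B, B.T). Unlike b-matching, this does not
    guarantee every node ends up with exactly k edges.
    """
    n = len(X)
    # Pairwise squared Euclidean distances.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    B = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]        # skip self (distance 0)
        B[i, nbrs] = 1
    B = np.maximum(B, B.T)                       # symmetrize
    W = B * np.exp(-d2 / (2 * sigma ** 2))       # Gaussian kernel weights
    return W

X = np.random.default_rng(0).normal(size=(20, 2))
W = knn_graph(X, k=4)
print(W.shape, (W > 0).sum(axis=1)[:5])         # node degrees vary around k
```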

9. Several Ways of Constructing Sparse Graphs
   [Figure: example graphs built by distance threshold, rank threshold (kNN), and b-matching, for k, b = 4 and k, b = 6.]

   Examples of Graph Construction
   [Figure: side-by-side comparison of a kNN graph (k = 4) and a b-matched graph (b = 4); b-matching yields a regular graph in which every node has exactly b edges.]

10. Graph Construction – Edge Weighting
    • Binary weighting: $w_{ij} = 1$ if $(i, j)$ is an edge, else 0
    • Gaussian kernel weighting: $w_{ij} = \exp(-\|x_i - x_j\|^2 / 2\sigma^2)$
    • Locally linear reconstruction weighting: weights that best reconstruct each sample from its neighbors

    Measure Smoothness: Graph Laplacian
    • Graph Laplacian $\Delta = D - W$, and normalized Laplacian $\mathcal{L} = D^{-1/2}(D - W)D^{-1/2}$
    • Smoothness of a function $f$ over the graph:
      $\Omega(f) = \frac{1}{2} \sum_{i,j} w_{ij} (f_i - f_j)^2 = f^\top \Delta f$
    • Multi-class: $\Omega(F) = \mathrm{tr}(F^\top \Delta F)$
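A short NumPy sketch of these definitions (the Gaussian weighting and the quadratic smoothness functional; the variable names are mine):

```python
import numpy as np

def graph_laplacian(W, normalized=False):
    """Return the (optionally normalized) graph Laplacian of W."""
    d = W.sum(axis=1)
    L = np.diag(d) - W
    if normalized:
        D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
        L = D_inv_sqrt @ L @ D_inv_sqrt
    return L

def smoothness(f, W):
    """Omega(f) = 1/2 * sum_ij w_ij (f_i - f_j)^2 = f^T (D - W) f."""
    return float(f @ graph_laplacian(W) @ f)

# A 3-node chain: a smooth labeling scores lower than a jagged one.
W = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
print(smoothness(np.array([1., 1., 1.]), W))   # 0.0 (constant is smoothest)
print(smoothness(np.array([1., -1., 1.]), W))  # 8.0
```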

11. Classical Methods (Zhu et al., ICML 03; Zhou et al., NIPS 04; Joachims, ICML 03)
    • Predict a graph function $F$ via cost optimization: minimize an empirical loss on the labeled samples plus a smoothness term on the prediction function, e.g.
      $Q(F) = \mathrm{tr}(F^\top \mathcal{L} F) + \mu \|F - Y\|_F^2$
    • Local and Global Consistency (LGC, Zhou et al., NIPS 04): closed-form solution $F^* = (\mathcal{L}/\mu + I)^{-1} Y$
    • Gaussian Random Fields (GRF, Zhu et al., ICML 03): the $\mu \to \infty$ limit, which clamps $F = Y$ on the labeled samples and enforces harmonicity, $\Delta F = 0$, on the unlabeled ones

    Empirical Observations (Jebara, Wang, and Chang, ICML 2009)
    • Compared combinations of propagation method, graph construction, and edge weighting
    • B-matching tends to outperform kNN
    • B-matching is particularly good for GTAM with locally linear (LLR) weights
    [Figure: accuracy curves for GTAM across graph and weighting combinations.]
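A compact sketch of LGC-style propagation in closed form, under the regularized formulation above (the value mu = 0.1 is an arbitrary illustration, not taken from the slides):

```python
import numpy as np

def lgc_propagate(W, Y, mu=0.1):
    """Local-and-global-consistency propagation, F* = (L/mu + I)^-1 Y.

    W : (n, n) symmetric weight matrix
    Y : (n, c) label matrix, one-hot rows for labeled samples,
        all-zero rows for unlabeled ones
    """
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    L = np.eye(len(W)) - D_inv_sqrt @ W @ D_inv_sqrt   # normalized Laplacian
    F = np.linalg.solve(L / mu + np.eye(len(W)), Y)
    return F.argmax(axis=1)                            # predicted class per node

# Chain graph 0-1-2-3-4: label the two endpoints, infer the middle.
W = np.diag(np.ones(4), 1); W = W + W.T
Y = np.zeros((5, 2)); Y[0, 0] = 1; Y[4, 1] = 1
print(lgc_propagate(W, Y))   # e.g. [0 0 0 1 1]
```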

12. Noisy Labels and Other Challenges
    [Figure: LGC and GRF propagation results degrade under unbalanced labels, ill label locations, and noisy data and labels.]

    Label Unbalance – A Quick Fix
    • Normalize labels within each class based on node degrees
    [Figure: example showing the samples × classes label matrix rescaled by the node degree matrix, so that each class contributes a comparable degree-weighted label mass.]
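One plausible reading of this fix, as a sketch (the exact normalization in the GTAM paper uses a diagonal matrix built from node degrees; the per-column rescaling below is my approximation of it):

```python
import numpy as np

def degree_normalize_labels(Y, W):
    """Rescale each class column of Y so its degree-weighted label
    mass is 1, damping classes with many / well-connected seeds.

    This follows the spirit of the degree-based normalization in
    Wang, Jebara & Chang (GTAM); the exact matrix there may differ.
    """
    d = W.sum(axis=1)                       # node degrees
    mass = (d[:, None] * Y).sum(axis=0)     # degree-weighted mass per class
    return Y / np.maximum(mass, 1e-12)

W = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], float)
Y = np.array([[1, 0], [1, 0], [0, 1]], float)   # class 0 has two seeds
print(degree_normalize_labels(Y, W))            # class-0 seeds downweighted
```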

13. Dealing with Noisy Labels – Graph Transduction via Alternate Minimization
    (GTAM: Wang, Jebara, & Chang, ICML 2008; LDST: Wang, Jiang, & Chang, CVPR 2009)
    • Change the uni-variate optimization over $F$ to a bi-variate formulation over $(F, Y)$

    Alternate Optimization
    • First, given $Y$, solve for the continuous-valued $F^*$
    • Then, search for the optimal integer label matrix $Y$ given $F^*$, via gradient-descent-style greedy search

14. Alternate Minimization for Label Tuning
    [Figure: worked example. Current labels Y = [[1, 0], [0, 1], [0, 0], [0, 0]] and gradient ∂Q/∂Y = [[0.8, 0.1], [-0.23, -0.25], [-0.31, 0.07], [-0.17, -0.04]]. The entry with the most negative gradient, (3, 1), is added as a label, and the labeled entry with the most positive gradient, (1, 1), is deleted, giving Y = [[0, 0], [0, 1], [1, 0], [0, 0]].]
    • Iteratively repeat the above procedure

    Label Diagnosis and Self-Tuning (LDST, Wang, Jiang, & Chang, CVPR 2009)
    [Figure: starting from the initial labels, labels are added and deleted over iterations (shown at iteration #2 and #6); the cost function Q declines faster with label tuning than without.]
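A schematic sketch of one add/delete round of this greedy tuning (my own simplified reading of LDST: the actual method ties the gradient to the GTAM cost and renormalizes labels after each change):

```python
import numpy as np

def tune_labels_once(Y, gradQ, labeled_mask):
    """One greedy round of label tuning, in the spirit of LDST.

    Y            : (n, c) binary label matrix
    gradQ        : (n, c) gradient of the cost Q w.r.t. Y
    labeled_mask : (n,) bool, True for currently labeled samples

    Adds the unlabeled entry whose gradient is most negative (steepest
    decrease of Q) and deletes the labeled entry whose gradient is most
    positive (it hurts Q the most) -- a simplification of the paper.
    """
    Y = Y.copy()
    add = np.where(~labeled_mask[:, None], gradQ, np.inf)
    i, j = np.unravel_index(np.argmin(add), add.shape)
    Y[i, j] = 1                                    # add most helpful label
    dele = np.where(Y == 1, gradQ, -np.inf)
    p, q = np.unravel_index(np.argmax(dele), dele.shape)
    Y[p, q] = 0                                    # delete most suspect label
    return Y
```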

15. Application: Web Search Reranking
    [Figure: Google search for "Tiger". Pipeline: keyword search over web images → treat top-ranked images as pseudo-positives and bottom-ranked images as pseudo-negatives → label diagnosis → diffusion over the image graph → reranked results.]
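Putting the pieces together, a sketch of this reranking loop using the helpers sketched earlier (knn_graph and the LGC-style solve; the pseudo-label counts and mu are arbitrary choices of mine):

```python
import numpy as np

def rerank(features, initial_order, n_pos=10, n_neg=10, mu=0.1):
    """Rerank keyword-search results by graph diffusion.

    Top-ranked results become pseudo-positives, bottom-ranked ones
    pseudo-negatives; the scores then diffuse over a kNN image graph.
    Assumes the knn_graph() helper defined in the earlier sketch.
    """
    W = knn_graph(features, k=5)
    n = len(features)
    Y = np.zeros((n, 2))
    Y[initial_order[:n_pos], 0] = 1        # pseudo-positive seeds
    Y[initial_order[-n_neg:], 1] = 1       # pseudo-negative seeds
    d = W.sum(axis=1)
    Dis = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    L = np.eye(n) - Dis @ W @ Dis          # normalized Laplacian
    F = np.linalg.solve(L / mu + np.eye(n), Y)
    score = F[:, 0] - F[:, 1]              # positive minus negative diffusion
    return np.argsort(-score)              # new ranking, best first
```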

16. Possible Applications: Propagating Labels in Interactive Search & Auto Re-ranking
    [Figure: the same system diagram as slide 7 (S.-F. Chang, Columbia U.), here highlighting the interactive mode: the user browses and labels through the user interface, and label propagation drives search and browsing applications.]

    Application: Brain-Machine Interface for Image Retrieval
    • Denoise unreliable labels from brain-signal decoding (joint work with Sajda et al., ACM MM 2009; J. of Neural Engineering, May 2011)
    • Use EEG brain signals to detect targets of interest; use the image graph to tune and propagate that information

17. The Paradigm
    [Figure: a database containing any target that may interest users.]
    [Figure: the database feeds a neural (EEG) decoder, which assigns EEG scores to the viewed items.]

18. The Paradigm (continued)
    [Figure: the neural (EEG) decoder supplies noisy exemplar labels; graph-based semi-supervised learning over image features from the database produces prediction scores.]
    [Figure: database results before (pre-triage) and after (post-triage) the pipeline.]

19. The Paradigm (continued)
    The human inspects only a small sample set via the BCI, while the machine filters out noise and retrieves targets from the very large database.
    • General: no predefined target models, no keywords
    • High throughput: neuro-vision as a bootstrap for fast computer vision
    [Figure: pre-triage vs. post-triage result sets.]

    The Neural Signatures of "Recognition" (D. Linden, Neuroscientist, 2005: the Oddball Effect)
    [Figure: event-related potentials over time for standard, novel (P3a), and target (P3b) stimuli.]

20. Single-Trial EEG Analysis
    • Typically, EEG is averaged over trials to raise the amplitude of the signal correlated with cortical processes relative to artifacts (the raw signal has very low SNR)
    • High-density EEG systems were designed without a principled approach to handling the volume of information from simultaneously sampling large electrode arrays
    • Our solution: identify neural correlates of individual stimuli via single-trial EEG analysis
    • We apply principled methods to find optimal ways of combining the information across electrodes and moments in time contained in individual trials
    [Figure: single-trial EEG vs. averaged event-related potentials. NSF HNCV10]

    Identifying Discriminative Components in the EEG Using Single-Trial Analysis (Parra, Sajda et al., 2002, 2003)
    • LDA or logistic regression learns the contributions of EEG signal components at different spatial-temporal locations: optimal spatial filtering across electrodes within each short window (e.g., 100 ms), followed by optimal temporal filtering over the time windows after stimulus onset (a sketch appears below)
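To make the two-stage filtering concrete, a schematic sketch (the array shapes, window length, toy data, and the use of scikit-learn's LogisticRegression are all my assumptions; the actual method of Parra, Sajda et al. learns the spatial filters from real EEG rather than averaging electrodes):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def single_trial_features(eeg, win=10):
    """Stage 1: spatial combination per short time window.

    eeg : (n_trials, n_electrodes, n_samples) single-trial recordings
    Averaging electrodes per window stands in for the learned spatial
    filter; the real method fits one filter per window.
    """
    n_trials, n_elec, n_samp = eeg.shape
    n_win = n_samp // win
    w = eeg[:, :, :n_win * win].reshape(n_trials, n_elec, n_win, win)
    return w.mean(axis=(1, 3))            # (n_trials, n_win) window scores

# Stage 2: logistic regression combines the per-window scores over time
# to discriminate target from non-target trials.
rng = np.random.default_rng(0)
X_raw = rng.normal(size=(200, 32, 100))   # 200 trials, 32 electrodes (toy data)
y = rng.integers(0, 2, size=200)          # fake target/non-target labels
X_raw[y == 1, :, 40:60] += 0.5            # inject a 'P300-like' deflection
X = single_trial_features(X_raw)
clf = LogisticRegression().fit(X, y)
print(clf.score(X, y))                    # above chance on this toy data
```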
