  1. Discrete Hashing Fast, scalable retrieval and classification Fumin Shen Center for Future Media, University of Electronic Science and Technology of China

  2. Outline • Introduction to Hashing • Discrete optimization for Hashing • Applications of Discrete Hashing • Classification by Hamming Retrieval

  3. Background: Hashing Hamming distance: extremely fast!

  4. Background: Hashing • Locality-Sensitive Hashing (LSH): [Gionis, Indyk, and Motwani 1999], [Datar et al. 2004], etc. • A random hash function maps each data vector to one bit; concatenating bits gives a binary code (e.g., 101) for the query and each database item. • Recent study: learn to hash, with hash functions learned from data.
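The "random" variant the slide sketches is hyperplane LSH for cosine similarity: each bit is the sign of a projection onto a random direction, so nearby vectors tend to agree on most bits. A hedged sketch (NumPy; the dimensions and noise level are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_bits = 128, 16
W = rng.standard_normal((dim, n_bits))  # one random hyperplane per bit

def lsh_code(x):
    """Map a real vector to an n_bits binary code: signs of random projections."""
    return (x @ W > 0).astype(np.uint8)

x = rng.standard_normal(dim)
near = x + 0.05 * rng.standard_normal(dim)  # a close neighbor of x
far = rng.standard_normal(dim)              # an unrelated vector
# The near neighbor typically shares far more bits with x than the random one.
print(np.sum(lsh_code(x) != lsh_code(near)), np.sum(lsh_code(x) != lsh_code(far)))
```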

  5. Background: Hashing The main application: Approximate Nearest Neighbor Search (ANNS) Image Database query visually relevant

  6. LSH • LSH has many variants, suitable for ℓp distances, cosine similarity, Gaussian and χ² kernels (KLSH), etc. • Good: sublinear search time O(n^ρ) for some ρ < 1. • Bad: long hash codes and hundreds of hash tables (big memory footprint).

  7. Learning based Hashing

  8. Unsupervised Hashing Learning binary codes preserving data similarities • PCAH: generate W by principal component analysis (PCA) • SH: (Weiss et al., 2008) introduce unsupervised graph hashing • ITQ: (Gong and Lazebnik, 2011) learn an orthogonal rotation matrix to refine the initial PCA projection matrix • AGH: (Liu et al., 2011) solve SH by anchor graphs • IMH: (Shen et al., 2013) generate binary codes from general data manifolds • DGH: (Liu et al., 2014) solve SH by discrete optimization • AIBC: (Shen et al., 2015) asymmetric hashing • …

  9. Supervised Hashing Learning binary codes supervised by pointwise or pairwise/ranking labels • SSH: (Wang et al., 2010) exploit both labeled and unlabeled data for hashing • MLH: (Norouzi and Fleet, 2011) based on structural SVM • KSH: (Liu et al., 2012) kernel-based supervised hashing • FastH: (Lin et al., 2014) solve hashing by graph cuts • SDH: (Shen et al., 2015) generate binary codes by discrete optimization • COSDISH: (Kang et al., 2016) column-sampling-based discrete supervised hashing • DSeRH: (Liu et al., 2017) deep ranking hashing • …

  10. Deep learning based Hashing • Lots of supervised methods • DAPH (Shen et al., MM’17) • DSeRH (Liu et al., CVPR’17) • DPSH (Li et al., IJCAI’16) • VDSH (Zhang et al., CVPR’16) • DSH (Liu et al., CVPR’16) • CNNH (Xia et al., AAAI’15) • Very few unsupervised ones • DH (Liong et al., CVPR’15) • DeepBit (Lin et al., CVPR’16) • UH-BDNN (Do et al., ECCV’16)

  11. Deep vs. Shallow • Deep learning boosts supervised hashing • Still a long way to go for unsupervised deep hashing
      Method:  ITQ     IMH     CNN+ITQ   DH      UH-BDNN
      MAP:     17.76   18.38   0.255     16.62   18.35

  12. Manifold learning vs. Hashing Spectral Hashing Optimal hash codes • Very similar formulation • Key difference: discrete constraint
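For reference, the spectral hashing objective (Weiss et al., 2008) that the slide compares against manifold learning can be written in standard notation as:

```latex
\min_{B}\ \operatorname{tr}\!\left(B^{\top} L B\right)
\quad \text{s.t.}\quad B \in \{-1,1\}^{n \times r},\quad
B^{\top}\mathbf{1} = 0,\quad B^{\top} B = n I_r ,
```

where $L$ is the graph Laplacian of the data similarity graph. Dropping the discrete constraint $B \in \{-1,1\}^{n \times r}$ turns this into a spectral (eigenvector) problem, i.e., Laplacian eigenmaps; that discrete constraint is exactly the "key difference" the slide highlights.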

  13. The hashing problem • A mixed-integer program, generally NP-hard • Difficult to optimize due to the discrete variables

  14. Solutions in the literature • Step 1: Relaxation -- discard the discrete constraints • Mimic the sign function with a continuous sigmoid • Hard to reach a good (local) optimum • Step 2: Rounding -- threshold after learning • Quantization techniques: ITQ (Gong and Lazebnik, 2011) • Quantization distortion grows with longer hash codes
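The two-step recipe can be made concrete with a small sketch (NumPy; PCA stands in for the relaxed step, as in PCAH): step 1 solves the continuous relaxation, step 2 rounds with sign, and the gap between the two is the quantization distortion that an ITQ-style learned rotation tries to shrink.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 32))
Xc = X - X.mean(axis=0)                      # center the data

# Step 1 (relaxation): real-valued codes from the top-8 PCA projections.
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Y = Xc @ Vt[:8].T

# Step 2 (rounding): threshold at zero to get +/-1 codes.
B = np.where(Y >= 0, 1.0, -1.0)

# Quantization distortion: the gap an ITQ-style rotation reduces.
print(np.linalg.norm(B - Y))
```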

  15. Our solution

  16. (I) Supervised Discrete Hashing Formulation: joint learning of the binary codes B, the feature representation F(x), and the linear classifier W; Y is the ground-truth label matrix. Algorithm: alternating minimization until convergence • solve the W-subproblem (multi-class classification) • solve the F-subproblem (feature learning) • solve the B-subproblem (hash learning), the key problem F. Shen, C. Shen, W. Liu, H. T. Shen, “Supervised Discrete Hashing”, CVPR’15.
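In the notation of the SDH paper, the joint objective being described is (with $Y$ the label matrix, $B$ the codes, $W$ the classifier, and $F(X)$ the learned hash mapping):

```latex
\min_{B,\,W,\,F}\ \|Y - W^{\top} B\|_F^2 \;+\; \lambda \|W\|_F^2
\;+\; \nu \,\|B - F(X)\|_F^2
\quad \text{s.t.}\quad B \in \{-1, 1\}^{L \times n},
```

and the alternation over $W$, $F$, and $B$ listed on the slide minimizes this objective one block at a time.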

  17. (I) Supervised Discrete Hashing The key binary code optimization problem Algorithm: Discrete Cyclic Coordinate descent (DCC) learns the code bit by bit, with an optimal, closed-form solution in each iteration!
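A sketch of the bit-by-bit B-update under the least-squares SDH loss (the variable names and shapes are my assumptions: B is the L×n code matrix, W the L×C classifier, Y the C×n labels, F the L×n embedded features): with all other bits fixed, each code row has a closed-form sign update.

```python
import numpy as np

def dcc_update_B(B, W, Y, F, nu, n_sweeps=3):
    """Cyclic closed-form updates for min ||Y - W^T B||^2 + nu*||B - F||^2
    over B in {-1,1}^(L x n). Each sweep updates every bit (row of B) once."""
    L = B.shape[0]
    Q = W @ Y + nu * F                 # L x n: all terms linear in B
    for _ in range(n_sweeps):
        for l in range(L):
            w = W[l]                   # classifier row for bit l
            # interaction with the other bits: w W^T B minus bit l's own term
            other = (w @ W.T) @ B - (w @ w) * B[l]
            B[l] = np.where(Q[l] - other >= 0, 1.0, -1.0)  # closed-form sign
    return B
```

Because each row update is an exact minimizer of the objective over that row, every sweep is non-increasing, which is the convergence argument behind DCC.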

  18. (I) Supervised Discrete Hashing Results: discrete optimization vs. relaxed optimization on the CIFAR-10 dataset. Discrete optimization is important for hashing!

  19. (I) Supervised Discrete Hashing • SDH supports other losses such as the hinge loss. The B-subproblem still has a closed-form update, while the W-subproblem becomes a multi-class SVM. • SDH scales linearly with the number n of labeled examples, so it can incorporate massive labeled data into training.

  20. Binary optimization How to solve the general binary code learning problem? • Design a new algorithm for every different loss? • The loss can be too complex to admit a feasible discrete optimization algorithm.

  21. (II) Discrete Proximal Linearized Minimization Motivation: minimize an equivalent smooth + non-smooth loss Algorithm: Discrete Proximal Linearized Minimization Each iteration: closed-form, optimal solution! F. Shen, X. Zhou, Y. Yang, J. Song, H. T. Shen and D. Tao, “A Fast Optimization Method for General Binary Code Learning”, IEEE Transactions on Image Processing (TIP), 2016.
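The flavor of the update can be sketched generically (the loss below is my own illustration, not the paper's exact objective): linearize the smooth part at the current codes, take a gradient step, and project back onto {-1, 1}, which is just a sign with ties keeping the previous bit, so every step is closed-form.

```python
import numpy as np

def dplm(B, grad_f, mu, n_iters=50):
    """Proximal-linearized iterations over the binary set {-1,1}:
    B <- sign(B - grad_f(B)/mu), with ties keeping the previous value."""
    for _ in range(n_iters):
        Z = B - grad_f(B) / mu                              # linearized step
        B = np.where(Z > 0, 1.0, np.where(Z < 0, -1.0, B))  # project to {-1,1}
    return B
```

For example, with the toy smooth loss f(B) = ½‖B − T‖² (gradient B − T) and μ = 1, a single step already lands on the target codes T.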

  22. (II) Discrete Proximal Linearized Minimization • Theoretical: guaranteed to converge! • Practical: • very fast, even faster than DCC in SDH • successfully applied to supervised and unsupervised hashing

  23. (III) Asymmetric Inner-product Binary Coding Hashing for Maximum Inner Product Search (MIPS): retrieve the datum having the largest inner product with query q from database A Algorithm: inner-product fitting by asymmetric hash functions, where S is the matrix of inner products between the database A and the queries. The hard problem is decomposed into two sub-problems, each solved by DCC as in SDH. F. Shen, W. Liu, S. Zhang, Y. Yang, and H. T. Shen, “Learning Binary Codes for Maximum Inner Product Search”, ICCV 2015.
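For reference, MIPS asks for the database item with the largest inner product with the query, and the asymmetric scheme fits those inner products with two different coding functions (the notation here is schematic):

```latex
x^{*} = \arg\max_{x \in A} \; q^{\top} x,
\qquad
\min_{h,\,z}\ \bigl\| S - h(A)^{\top} z(Q) \bigr\|_F^2 ,
```

where $S$ collects the inner products between the database items $A$ and the queries $Q$, and $h(\cdot)$, $z(\cdot)$ are the two (asymmetric) binary coding functions; each subproblem is then solved by DCC as in SDH.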

  24. Results: unsupervised hashing Asymmetric Inner-product Binary Coding (AIBC)

  25. (IV) Discrete Collaborative Filtering Collaborative Filtering Our proposal: Discrete Collaborative Filtering H. Zhang, F. Shen, L. Liu, W. Liu, X. He, H. Luan and T.-S. Chua, “Discrete Collaborative Filtering”, SIGIR 2016. Best Paper Award Honorable Mention

  26. (IV) Discrete Collaborative Filtering

  27. (V) Classification by Hamming Retrieval Motivation • Very little learn-to-hash work targets classification! • Existing classification methods treat hash codes as real-valued features • Goal: boost even linear classification by hashing

  28. (V) Classification by Hamming Retrieval Idea: classify binary data with binary weights, replacing floating-point multiplications with XNOR operations. F. Shen, Y. Mu, Y. Yang, W. Liu, L. Liu, J. Song, H. T. Shen, “Classification by Retrieval: Binarizing Data and Classifier”, SIGIR 2017. Best Paper Award Honorable Mention

  29. (V) Classification by Hamming Retrieval Framework Classifying an image reduces to retrieving its nearest class codes in the Hamming space.
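The retrieval view admits a very small sketch (NumPy; the function and variable names are mine): with ±1 codes for both data and class weights, the score wᵀx determines the Hamming distance, so prediction reduces to nearest-class-code search.

```python
import numpy as np

def predict(codes, class_codes):
    """codes: (n, L) in {-1,1}; class_codes: (C, L) in {-1,1}.
    Hamming distance between +/-1 vectors is (L - dot)/2, so the
    nearest class code is the one with the largest inner product."""
    dists = (codes.shape[1] - codes @ class_codes.T) / 2
    return np.argmin(dists, axis=1)

class_codes = np.array([[1, 1, 1, 1], [-1, -1, -1, -1]], dtype=float)
codes = np.array([[1, 1, 1, -1], [-1, -1, 1, -1]], dtype=float)
print(predict(codes, class_codes))  # -> [0 1]
```

In a real deployment the codes would be bit-packed so the inner product becomes XNOR plus popcount, integer operations only; the float version above just shows the equivalence.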

  30. (V) Classification by Hamming Retrieval Formulation: joint learning of binary codes and binary weights with an inter-class margin • The loss can be any proper empirical loss; we particularly study the exponential loss and the linear loss.

  31. (V) Classification by Hamming Retrieval Solution for the exponential loss: • W-subproblem: a Binary Quadratic Program (BQP), solved bit by bit with a sequential bit-flipping algorithm (locally optimal) • B-subproblem: solved bit by bit • P-subproblem
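The sequential bit-flipping idea can be sketched for a generic BQP min wᵀAw + bᵀw over w ∈ {-1,1}^L (my own generic form, not the slide's exact subproblem): flip any single bit that lowers the objective, and stop when no flip helps, which is a local optimum.

```python
import numpy as np

def bit_flip(A, b, w, max_passes=20):
    """Greedy local search for min w^T A w + b^T w over w in {-1,1}^L."""
    def obj(v):
        return float(v @ A @ v + b @ v)
    for _ in range(max_passes):
        improved = False
        for i in range(len(w)):
            before = obj(w)
            w[i] = -w[i]               # tentatively flip bit i
            if obj(w) < before - 1e-12:
                improved = True        # keep the flip
            else:
                w[i] = -w[i]           # revert: no improvement
        if not improved:
            break                      # no single flip helps: local optimum
    return w
```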

  32. (V) Classification by Hamming Retrieval Results: LibLinear vs. our method on SUN 397

  33. (V) Classification by Hamming Retrieval Results: comparison in accuracy (%), training and testing time (seconds).

  34. (V) Classification by Hamming Retrieval Results: Accuracy (%) with increasing binary code length

  35. (V) Classification by Hamming Retrieval Conclusions: • Convert linear classification to Hamming retrieval • Binarize both data and classifier in a joint problem • Support many empirical loss functions • Significant reduction on storage, training and testing computation

  36. (VI) Deep Sketch Hashing Sketch-based image retrieval Existing methods: • hand-crafted feature engineering (e.g., SIFT, HOG, HELO [1], LKS [2]) • deep learning based feature extraction Li Liu, Fumin Shen, Yuming Shen, Xianglong Liu, Ling Shao, “Deep Sketch Hashing: Fast Free-hand Sketch-Based Image Retrieval”, CVPR 2017.

  37. (VI) Deep Sketch Hashing Framework of DSH We integrate a convolutional neural network and discrete binary code learning into a unified framework.

  38. (VI) Deep Sketch Hashing Objective Formulation of DSH non-convex and non-smooth

  39. (VI) Deep Sketch Hashing Alternating Optimization

  40. (VI) Deep Sketch Hashing Comparison with previous SBIR methods

  41. (VI) Deep Sketch Hashing Experimental results of DSH Comparison with cross-modality methods

  42. (VI) Deep Sketch Hashing Successful Cases of DSH:
