Introduction to Speaker Diarization
Dr. Gerald Friedland
International Computer Science Institute, Berkeley, CA
friedland@icsi.berkeley.edu

Speaker Diarization tries to answer the question: who spoke when?


Segmentation & Clustering
• Originally: segment first, cluster later (Chen, S. S. and Gopalakrishnan, P., "Clustering via the Bayesian information criterion with applications in speech recognition," Proc. IEEE International Conference on Acoustics, Speech and Signal Processing, 1998, Vol. 2, Seattle, USA, pp. 645-648)
• More efficient: top-down (start with one cluster and split) and bottom-up (start with many clusters and merge) approaches


Segmentation: Secret Sauce
• How do you distinguish speakers?
• Combination of MFCC+GMM+BIC seems unbeatable!
• Can be generalized to audio percepts

MFCC: Idea
Pipeline: Audio Signal → Pre-emphasis → Windowing → FFT → Mel-Scale Filterbank → Log-Scale → DCT → MFCC (the power cepstrum of the signal)
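To make the pipeline concrete, here is a minimal numpy sketch of each stage. The frame size, hop, filter count, and the 0.97 pre-emphasis coefficient are common defaults assumed here, not values given in the talk.

```python
# Minimal MFCC pipeline in numpy, one step per stage of the flowchart.
import numpy as np
from scipy.fftpack import dct

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mfcc(signal, sr=16000, n_fft=512, hop=160, n_mels=26, n_ceps=13):
    # 1. Pre-emphasis: boost high frequencies.
    signal = np.append(signal[0], signal[1:] - 0.97 * signal[:-1])
    # 2. Windowing: overlapping Hamming-windowed frames.
    starts = range(0, len(signal) - n_fft, hop)
    frames = np.array([signal[s:s + n_fft] * np.hamming(n_fft) for s in starts])
    # 3. FFT -> per-frame power spectrum.
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2 / n_fft
    # 4. Mel-scale filterbank: triangles spaced evenly on the mel scale.
    hz = mel_to_hz(np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2))
    bins = np.floor((n_fft + 1) * hz / sr).astype(int)
    fbank = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(n_mels):
        l, c, r = bins[i], bins[i + 1], bins[i + 2]
        fbank[i, l:c] = (np.arange(l, c) - l) / max(c - l, 1)
        fbank[i, c:r] = (r - np.arange(c, r)) / max(r - c, 1)
    # 5. Log filterbank energies, then 6. DCT -> cepstral coefficients.
    logmel = np.log(power @ fbank.T + 1e-10)
    return dct(logmel, type=2, axis=1, norm="ortho")[:, :n_ceps]
```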

MFCC: Mel Scale
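The slide's mel-scale plot is not reproduced here; the standard mapping used by most MFCC implementations, and by the hz_to_mel helper in the sketch above, is mel(f) = 2595 * log10(1 + f / 700).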

MFCC: Result

Gaussian Mixtures
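For reference (the standard definition, not shown on the slide itself): a Gaussian mixture models the feature distribution as p(x) = sum_i a_i * N(x; mu_i, Sigma_i), where the mixture weights a_i are non-negative and sum to 1.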

Training of Mixture Models
Goal: find the mixture weights a_i (together with the component means and covariances) that maximize the likelihood of the data. Training alternates two steps (EM):
• Expectation: for each frame, compute the posterior probability (responsibility) that each Gaussian component generated it.
• Maximization: re-estimate each component's weight, mean, and covariance from the frames, weighted by those responsibilities.
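A compact numpy sketch of this EM loop for a diagonal-covariance GMM, written from the standard equations rather than from the slide's formulas (which did not survive extraction); all variable names are this sketch's own.

```python
# EM training of a diagonal-covariance Gaussian mixture model.
import numpy as np

def train_gmm(X, M=5, iters=50, eps=1e-6):
    """X: (N, D) feature matrix; returns weights a, means mu, variances var."""
    N, D = X.shape
    rng = np.random.default_rng(0)
    a = np.full(M, 1.0 / M)                    # mixture weights a_i
    mu = X[rng.choice(N, M, replace=False)]    # init means from data points
    var = np.tile(X.var(axis=0), (M, 1))       # diagonal covariances
    for _ in range(iters):
        # E-step: r[n, i] = posterior P(component i | frame n).
        logp = (-0.5 * (((X[:, None, :] - mu) ** 2) / var
                        + np.log(2 * np.pi * var)).sum(axis=2)
                + np.log(a))
        logp -= logp.max(axis=1, keepdims=True)      # numerical stability
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances from responsibilities.
        Nk = r.sum(axis=0) + eps
        a = Nk / N
        mu = (r.T @ X) / Nk[:, None]
        var = (r.T @ X ** 2) / Nk[:, None] - mu ** 2 + eps
    return a, mu, var
```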

Bayesian Information Criterion
BIC = log L(X | Θ) - λ * (K / 2) * log N
where X is the sequence of features for a segment, Θ are the parameters of the statistical model for the segment, L(X | Θ) is the likelihood of X under that model, K is the number of parameters of the model, N is the number of frames in the segment, and λ is an optimization parameter (λ = 1 gives the original criterion).
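In diarization this criterion is typically applied as a delta-BIC merge test: is a pair of clusters better modeled jointly or separately? A hedged sketch follows, using single full-covariance Gaussians per cluster as a common simplification of the GMM case.

```python
# Delta-BIC merge test: compare one joint Gaussian against one Gaussian
# per cluster; lam is the lambda penalty weight from the slide's formula.
import numpy as np

def gauss_loglik(X):
    """Max log-likelihood of X under one full-covariance Gaussian."""
    N, d = X.shape
    cov = np.cov(X, rowvar=False, bias=True) + 1e-6 * np.eye(d)
    _, logdet = np.linalg.slogdet(cov)
    # For an ML-fitted Gaussian the data log-likelihood collapses to this.
    return -0.5 * N * (logdet + d * (1.0 + np.log(2 * np.pi)))

def delta_bic(X1, X2, lam=1.0):
    """BIC(joint model) minus BIC(separate models); > 0 suggests merging."""
    X = np.vstack([X1, X2])
    N, d = X.shape
    K = d + d * (d + 1) / 2        # parameters saved by dropping one Gaussian
    return (gauss_loglik(X) - gauss_loglik(X1) - gauss_loglik(X2)
            + lam * 0.5 * K * np.log(N))
```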


Bayesian Information Criterion: Explanation
• BIC penalizes the complexity of the model (as measured by the number of parameters in the model).
• BIC measures the efficiency of the parameterized model in terms of predicting the data.
• BIC is therefore used to choose the number of clusters according to the intrinsic complexity present in a particular dataset.


Bayesian Information Criterion: Properties
• BIC is a minimum description length criterion.
• BIC is independent of the prior.
• It is closely related to other penalized likelihood criteria such as RIC and the Akaike Information Criterion (AIC).
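For context (standard definitions, not from the slides): in the usual minimization convention, AIC = -2 log L + 2K while BIC = -2 log L + K log N, so BIC penalizes extra parameters more heavily than AIC whenever N ≥ 8 (i.e. log N > 2).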


Bottom-Up Algorithm
Flowchart: Initialization → (Re-)Training → (Re-)Alignment → Merge two Clusters? (Yes: back to (Re-)Training; No: End)
• Start with too many clusters (initialized randomly)
• Purify clusters by comparing and merging similar clusters
• Resegment and repeat until no more merging is needed
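A schematic of this loop is sketched below. It reuses delta_bic() from the BIC sketch above, and realignment is reduced to nearest-mean reassignment of fixed 2.5 s segments, a crude stand-in for the GMM/HMM realignment a real system such as ICSI's performs; the parameter values are illustrative assumptions.

```python
# Agglomerative (bottom-up) clustering loop: align, test merges, repeat.
import numpy as np

def diarize(frames, n_init=16, seg_len=250, lam=1.0):
    """frames: (N, D) MFCC matrix for the speech regions of a recording."""
    rng = np.random.default_rng(0)
    segs = [frames[i:i + seg_len] for i in range(0, len(frames), seg_len)]
    labels = rng.integers(0, n_init, len(segs))      # random initialization
    while True:
        # (Re-)Training / (Re-)Alignment: move each segment to the cluster
        # with the nearest mean (proxy for re-training GMMs and re-scoring).
        ids = sorted(set(labels))
        means = {c: np.vstack([s for s, l in zip(segs, labels) if l == c]).mean(0)
                 for c in ids}
        labels = np.array([min(ids, key=lambda c: np.linalg.norm(s.mean(0) - means[c]))
                           for s in segs])
        # Merge test: pick the cluster pair whose joint model has the best BIC.
        ids = sorted(set(labels))
        best, pair = 0.0, None
        for i, c1 in enumerate(ids):
            for c2 in ids[i + 1:]:
                X1 = np.vstack([s for s, l in zip(segs, labels) if l == c1])
                X2 = np.vstack([s for s, l in zip(segs, labels) if l == c2])
                score = delta_bic(X1, X2, lam)
                if score > best:
                    best, pair = score, (c1, c2)
        if pair is None:               # no merge improves BIC: done
            return labels              # one cluster id per segment = speaker
        labels[labels == pair[1]] = pair[0]          # merge and repeat
```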


ICSI's Speaker Diarization
• Speaker Diarization research @ ICSI since 2001
• Various versions of diarization engines developed over the years
• Status: research code, but stable for some error-tolerant applications


ICSI's Speaker Diarization Engine Variants
• Basic (single mic, easy installation)
• Fast (single mic, multiple CPU cores)
• Super fast (single mic, multiple GPUs)
• Accurate but slow (multi mic, additional preprocessing)
• Audio/Visual (single and multi mic, for localization)
• Online (single mic, "who is speaking now")


Basic Speaker Diarization: Facts
• Input: 16 kHz mono audio
• Features: 19 MFCCs, no delta or delta-delta
• Speech/non-speech detection is external
• Runtime: ~realtime (1 h of audio needs 1 h of processing on a single CPU, excluding speech/non-speech detection)
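For illustration, the same feature setup expressed with librosa (an assumption for readability; the ICSI engine predates librosa and used its own front end):

```python
import librosa

# "meeting.wav" is a placeholder filename.
y, sr = librosa.load("meeting.wav", sr=16000, mono=True)   # 16 kHz mono input
# 19 MFCCs per frame, deliberately without delta or delta-delta features.
feats = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=19)        # shape: (19, n_frames)
```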


Multi-CPU Speaker Diarization: Facts
• Same as Basic Speaker Diarization
• Runtime: depends on the number of CPUs used. Example: with 8 cores, runtime = 14.3 × realtime, i.e. about 14 minutes of audio need 1 minute of processing.
• Runtime bottleneck is usually the speech/non-speech detector


GPU Speaker Diarization: Facts
• Same as Basic Speaker Diarization
