
Matrix Factorization
March 17, 2020
Data Science CSCI 1951A, Brown University
Instructor: Ellie Pavlick
HTAs: Josh Levin, Diane Mutako, Sol Zitter

Announcements

Today: Matrix Factorization with SVD. Applications to: Topic …


  1. Singular Value Decomposition (SVD). Worked example on a 4×5 document-term matrix:

     M (documents × terms):

            congress  parliament  the  US  UK
     doc1      1          1        1    1   0
     doc2      1          0        1    0   1
     doc3      1          1        0    1   0
     doc4      1          0        1    0   1

     M = U × D × V:

     U (documents × components):
     d1   -0.60  -0.39   0.70   0.00
     d2   -0.48   0.50  -0.12  -0.71
     d3   -0.43  -0.58  -0.69   0.00
     d4   -0.48   0.50  -0.12   0.71

     D (singular values on the diagonal, zeros elsewhere):
     3.06   1.81   0.57   0.00

     V (components × terms: congress, parliament, the, US, UK):
     -0.65  -0.34  -0.51  -0.34  -0.31
      0.02  -0.54   0.34  -0.54   0.56
     -0.42   0.02   0.79   0.02  -0.44
     -0.63   0.27   0.00   0.37   0.63
     -0.04   0.73   0.00  -0.68   0.04

     Callout: row d1 of U is doc1 in the new feature space.

  2. Same decomposition. Callout: the entry of U at row d1, column 1 (−0.60) is the weight of component 1 for doc 1.

  3. Same decomposition. Callout: the diagonal entry of D at position 1 (3.06) is the weight of component 1 over all the data.

  4. Same decomposition. Callout: row 1 of V is component 1.

  5. Same decomposition. Callout: the entry of V at row 1, column “the” (−0.51) is the contribution of “the” to component 1.
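The whole example can be reproduced with a few lines of numpy; a minimal sketch (note that singular vectors are only determined up to a sign flip, so individual entries of U and V may differ from the slide by a sign):

```python
import numpy as np

# Document-term matrix from the slides (rows doc1-doc4;
# columns congress, parliament, the, US, UK).
M = np.array([[1, 1, 1, 1, 0],
              [1, 0, 1, 0, 1],
              [1, 1, 0, 1, 0],
              [1, 0, 1, 0, 1]], dtype=float)

U, s, Vt = np.linalg.svd(M, full_matrices=True)  # U: 4x4, s: 4 values, Vt: 5x5

print(np.round(s, 2))              # ~[3.06 1.81 0.57 0.], the diagonal of D
D = np.zeros_like(M)               # rebuild the 4x5 D with s on the diagonal
np.fill_diagonal(D, s)
print(np.allclose(U @ D @ Vt, M))  # True: M = U x D x V
```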

  6. Singular Value Decomposition (SVD): M = U × D × V, with shapes (m × n) = (m × m) × (m × n) × (n × n).

  8. Truncated Singular Value Decomposition (SVD): M ≈ U × D × V, with shapes (m × n) ≈ (m × l) × (l × l) × (l × n). Keep only the first l components: this gives the “best l-rank approximation of M”, i.e. it makes ||M − UDV||² as small as possible among rank-l matrices.
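A sketch of the truncation in numpy, reusing the matrix from the earlier slides; the choice l = 2 is illustrative:

```python
import numpy as np

M = np.array([[1, 1, 1, 1, 0],
              [1, 0, 1, 0, 1],
              [1, 1, 0, 1, 0],
              [1, 0, 1, 0, 1]], dtype=float)

l = 2                                         # keep only the first l components
U, s, Vt = np.linalg.svd(M, full_matrices=False)
M_l = U[:, :l] @ np.diag(s[:l]) @ Vt[:l, :]   # (m x l)(l x l)(l x n)

# Eckart-Young: no rank-l matrix is closer to M than M_l,
# so this ||M - UDV|| is as small as possible.
print(np.linalg.norm(M - M_l))
```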

  13. Dimensionality Reduction • “Low-Rank Assumption”: we typically assume that our features contain a large amount of redundant information • We can throw away a lot of principal components without losing too much of the signal needed for our task (a quick way to sanity-check this is sketched below)
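One quick way to check the low-rank assumption on your own data is to look at how fast the squared singular values accumulate; a sketch, with `X` as a random stand-in for a real feature matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))     # stand-in for your own feature matrix

s = np.linalg.svd(X, compute_uv=False)    # singular values, largest first
energy = np.cumsum(s**2) / np.sum(s**2)   # share of squared Frobenius norm

# If energy[k] is already near 1 for small k, the low-rank
# assumption is reasonable for this data.
print(energy[:10])
```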

  14. Clicker Question!

  15. Clicker Question! In practice, is this assumption of low rank valid? a) Yes b) No c) Yeah, sure, why not?

  16. Matrices IRL • Data is noisy, so M is most likely full-rank • We assume that M is close to a low-rank matrix, and we approximate that nearby low-rank matrix • The result can be viewed as a “de-noised” version of M • “Original matrix exhibits redundancy and noise, low-rank reconstruction exploits the former to remove the latter”* — *Matrix and Tensor Factorization Methods for Natural Language Processing (ACL 2015)
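The quote can be demonstrated on synthetic data: build a genuinely low-rank matrix, add noise so it becomes full-rank, and check that the low-rank reconstruction lands closer to the clean signal than the noisy matrix was. A sketch with made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 3))
B = rng.normal(size=(3, 40))
clean = A @ B                                        # rank-3 "redundant" signal
noisy = clean + 0.1 * rng.normal(size=clean.shape)   # noise makes it full-rank

U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
denoised = U[:, :3] @ np.diag(s[:3]) @ Vt[:3, :]     # rank-3 reconstruction

print(np.linalg.norm(noisy - clean))     # error before reconstruction
print(np.linalg.norm(denoised - clean))  # smaller: redundancy removed the noise
```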

  21. Matrices IRL • Data is also often incomplete… missing values, new observations, etc. • Can we use SVD for this? • Yes! Though we need to make a few changes…

  25. Matrix Completion. [Figure: a partially observed user × movie ratings matrix — rows user1–user5; columns Roma, The Ballad of Buster Scruggs…, Okja, Mudbound, To All the Boys I Loved Before — with 0/1 entries and many cells missing.]

  26. Matrix Completion. [Same figure, with previously missing cells filled in with predicted values.] “people also liked…”

  27. Matrix Completion: M ≈ UDV = M′, where M is the original matrix and M′ is the completed one. Problems?

  31. Matrix Completion: M = U × D × V. Exact SVD assumes M is complete…

  32. Matrix Completion: M = U × D × V. …just gradient descent that MF!

  33. MF with Gradient Descent: M = U × D × V

  34. MF with Gradient Descent: M = U × V

  35. MF with Gradient Descent: M = U × V. Not properly SVD (fewer guarantees, e.g. the components are not orthonormal), but good enough.

  36. MF with Gradient Descent: M = U × V, with objective min_{U,V} Σ_{ij} (M_{ij} − u_i · v_j)²

  37. MF with Gradient Descent: M = U × V, with objective min_{U,V} Σ_{ij} (M_{ij} − u_i · v_j)². But! Only consider cases where M_ij is observed!
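A minimal gradient-descent sketch of this masked objective, assuming numpy; the ratings matrix, the rank k, and the learning rate are all hypothetical stand-ins, not values from the course:

```python
import numpy as np

rng = np.random.default_rng(0)
M = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [1, 0, 0, 4]], dtype=float)  # hypothetical ratings; 0 = missing
observed = M > 0                           # mask: only these cells enter the loss

k, lr, steps = 2, 0.01, 5000               # illustrative hyperparameters
U = rng.normal(scale=0.1, size=(M.shape[0], k))
V = rng.normal(scale=0.1, size=(k, M.shape[1]))

for _ in range(steps):
    R = (U @ V - M) * observed   # residuals, zeroed where M_ij is unobserved
    U, V = U - lr * (R @ V.T), V - lr * (U.T @ R)  # gradient step (up to a 2x)

M_completed = U @ V              # the missing cells now hold predictions
```

The only change from ordinary least-squares factorization is the `observed` mask, which zeroes the residual, and hence the gradient, at every missing entry.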

  38. Clicker Question!

  39. Clicker Question! min_{U,V} Σ_{ij} (M_{ij} − u_i · v_j)². Compute the loss given this setting of U and V… [small matrices M, U, and V shown on the slide] a) 14 b) 10 c) 6

  40. Clicker Question! min_{U,V} Σ_{ij} (M_{ij} − u_i · v_j)². Compute the loss given this setting of U and V… [the slide shows M, U, V, and the product M′ = U × V] a) 14 b) 10 c) 6
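Checking an answer like this by hand is just a sum of squared differences; a sketch with hypothetical matrices (substitute the slide's own M, U, and V):

```python
import numpy as np

# Hypothetical stand-ins: replace with the values from the slide.
M = np.array([[2., 1.],
              [3., 2.],
              [0., 0.]])
U = np.array([[1.],
              [2.],
              [0.]])
V = np.array([[1., 2.]])

M_prime = U @ V                    # each entry of M' is u_i . v_j
loss = np.sum((M - M_prime) ** 2)  # sum over ij of (M_ij - u_i . v_j)^2
print(M_prime, loss)
```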
