Session 4: Norms and inner products



1. Session 4: Norms and inner products. Optimization and Computational Linear Algebra for Data Science. Léo Miolane.

2. Contents: 1. Norms & inner products 2. Orthogonality 3. Orthogonal projection 4. Proof of the Cauchy-Schwarz inequality

3. Norms and inner products

4. Why norms? In machine learning, norms are used to measure distances: the Euclidean norm ‖x‖ = √(x_1² + ... + x_n²) is the norm induced by the Euclidean dot product ⟨x, y⟩ = x_1 y_1 + ... + x_n y_n. Norms are also used for regularization: one minimizes over x ∈ ℝ^n an objective of the form Loss(data, x) + λ‖x‖.
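As a quick illustration of this use of norms, here is a minimal NumPy sketch (not from the slides; the least-squares loss and the helper names are illustrative choices):

    import numpy as np

    def euclidean_norm(x):
        # ||x|| = sqrt(x_1^2 + ... + x_n^2), the norm induced by the dot product
        return np.sqrt(np.dot(x, x))

    def regularized_objective(A, b, x, lam):
        # Loss(data, x) + lambda * ||x||, with a least-squares loss as example
        residual = A @ x - b
        return np.dot(residual, residual) + lam * euclidean_norm(x)

    rng = np.random.default_rng(0)
    A, b = rng.normal(size=(50, 3)), rng.normal(size=50)
    print(regularized_objective(A, b, np.zeros(3), lam=0.1))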

5. Questions

6. Questions

7. Orthogonality

8. Definition. We say that vectors x and y are orthogonal if ⟨x, y⟩ = 0; we then write x ⊥ y. We say that a vector x is orthogonal to a set of vectors A if x is orthogonal to all the vectors in A; we then write x ⊥ A. Exercise: if x is orthogonal to v_1, ..., v_k, then x is orthogonal to any linear combination of these vectors, i.e. x ⊥ Span(v_1, ..., v_k). Solution: for y = a_1 v_1 + ... + a_k v_k, linearity gives ⟨x, y⟩ = a_1 ⟨x, v_1⟩ + ... + a_k ⟨x, v_k⟩ = 0.
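A small numerical check of the exercise (a sketch with arbitrary vectors):

    import numpy as np

    x = np.array([0.0, 0.0, 1.0])   # orthogonal to the plane spanned by v1, v2
    v1 = np.array([1.0, 0.0, 0.0])
    v2 = np.array([1.0, 1.0, 0.0])

    # x is orthogonal to v1 and v2, hence to any linear combination of them
    print(np.dot(x, v1), np.dot(x, v2))      # 0.0 0.0
    print(np.dot(x, 2.5 * v1 - 3.0 * v2))    # 0.0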

9. Pythagorean Theorem. Theorem (Pythagorean theorem). Let ‖·‖ be the norm induced by ⟨·,·⟩. For all x, y ∈ V we have: x ⊥ y ⇔ ‖x + y‖² = ‖x‖² + ‖y‖². Proof: ‖x + y‖² = ⟨x + y, x + y⟩ = ‖x‖² + 2⟨x, y⟩ + ‖y‖², which equals ‖x‖² + ‖y‖² if and only if ⟨x, y⟩ = 0. ∎
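A numeric sanity check of the theorem (a sketch; the classic 3-4-5 right triangle):

    import numpy as np

    x = np.array([3.0, 0.0])
    y = np.array([0.0, 4.0])           # <x, y> = 0, so x ⊥ y

    lhs = np.dot(x + y, x + y)         # ||x + y||^2
    rhs = np.dot(x, x) + np.dot(y, y)  # ||x||^2 + ||y||^2
    print(lhs, rhs)                    # 25.0 25.0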

10. Application to random variables. For random variables X, Y with finite second moment, define ⟨X, Y⟩ = E[XY]; the induced norm is ‖X‖ = √E[X²]. Assume that X and Y have zero mean, so that Cov(X, Y) = E[XY]. Then Cov(X, Y) = 0 is equivalent to X ⊥ Y. By the Pythagorean theorem, this is equivalent to ‖X + Y‖² = ‖X‖² + ‖Y‖², i.e. E[(X + Y)²] = E[X²] + E[Y²], which reads Var(X + Y) = Var(X) + Var(Y).
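A simulation illustrating this (a sketch; independent Gaussians stand in for uncorrelated zero-mean variables):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    X = rng.normal(size=n)          # zero mean
    Y = rng.normal(size=n)          # independent of X, so Cov(X, Y) = 0

    # Var(X + Y) matches Var(X) + Var(Y) up to sampling error
    print(np.var(X + Y), np.var(X) + np.var(Y))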

11. Orthogonal & orthonormal families. Definition. We say that a family of vectors (v_1, ..., v_k) is: orthogonal if the vectors v_1, ..., v_k are pairwise orthogonal, i.e. ⟨v_i, v_j⟩ = 0 for all i ≠ j; orthonormal if it is orthogonal and all the v_i have unit norm: ‖v_1‖ = ... = ‖v_k‖ = 1. Examples: the canonical basis of ℝ^n is orthonormal; the family ((1, 1), (-1, 1)) is orthogonal but not orthonormal.
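These examples can be checked with the matrix of pairwise inner products (a sketch; the helper gram is not from the slides):

    import numpy as np

    def gram(vectors):
        # G[i, j] = <v_i, v_j>; identity <=> orthonormal, diagonal <=> orthogonal
        V = np.column_stack(vectors)
        return V.T @ V

    e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
    print(gram([e1, e2]))           # identity: orthonormal

    u1, u2 = np.array([1.0, 1.0]), np.array([-1.0, 1.0])
    print(gram([u1, u2]))           # diag(2, 2): orthogonal, not orthonormal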

12. Coordinates in an orthonormal basis. Proposition. A vector space of finite dimension admits an orthonormal basis. Proposition. Assume that dim(V) = n and let (v_1, ..., v_n) be an orthonormal basis of V. Then the coordinates of a vector x ∈ V in the basis (v_1, ..., v_n) are (⟨v_1, x⟩, ..., ⟨v_n, x⟩): x = ⟨v_1, x⟩ v_1 + ... + ⟨v_n, x⟩ v_n. Proof: we have x = a_1 v_1 + ... + a_n v_n for some a_1, ..., a_n ∈ ℝ. Then ⟨v_i, x⟩ = ⟨v_i, a_1 v_1 + ... + a_n v_n⟩ = a_1 ⟨v_i, v_1⟩ + ... + a_n ⟨v_i, v_n⟩ = a_i. ∎

13. Coordinates in an orthonormal basis. Let x, y ∈ V and write x = ⟨v_1, x⟩ v_1 + ... + ⟨v_n, x⟩ v_n = a_1 v_1 + ... + a_n v_n and y = ⟨v_1, y⟩ v_1 + ... + ⟨v_n, y⟩ v_n = b_1 v_1 + ... + b_n v_n. Then ⟨x, y⟩ = Σ_{i,j} a_i b_j ⟨v_i, v_j⟩, and since ⟨v_i, v_j⟩ = 1 if i = j and 0 otherwise, we get ⟨x, y⟩ = a_1 b_1 + ... + a_n b_n. In particular ‖x‖ = √(a_1² + ... + a_n²).
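Both facts are easy to verify numerically (a sketch; the orthonormal basis below is a 45° rotation of the canonical one):

    import numpy as np

    v1 = np.array([1.0, 1.0]) / np.sqrt(2)
    v2 = np.array([-1.0, 1.0]) / np.sqrt(2)

    x = np.array([3.0, -2.0])
    a = np.array([np.dot(v1, x), np.dot(v2, x)])   # coordinates <v_i, x>
    print(a[0] * v1 + a[1] * v2)                   # recovers x

    y = np.array([0.5, 4.0])
    b = np.array([np.dot(v1, y), np.dot(v2, y)])
    print(np.dot(x, y), np.dot(a, b))              # equal: <x, y> = sum_i a_i b_i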

14. Proof

15. Orthogonal projection

16. Picture. From now on, ⟨·,·⟩ denotes the Euclidean dot product and ‖·‖ the Euclidean norm. Question: what is the vector y of a subspace S that is the closest to a given vector x? [Figure: a vector x, a subspace S, and the distance d(x, S).]

17. Orthogonal projection and distance to a subspace. Definition. Let S be a subspace of ℝ^n. The orthogonal projection of a vector x onto S is defined as the vector P_S(x) in S that minimizes the distance to x: P_S(x) := argmin_{y ∈ S} ‖x - y‖. The distance of x to the subspace S is then defined as d(x, S) := min_{y ∈ S} ‖x - y‖ = ‖x - P_S(x)‖. Remarks: if x ∈ S then P_S(x) = x; if x ∉ S then P_S(x) ≠ x.

18. Computing orthogonal projections. Proposition. Let S be a subspace of ℝ^n and let (v_1, ..., v_k) be an orthonormal basis of S. Then for all x ∈ ℝ^n, P_S(x) = ⟨v_1, x⟩ v_1 + ... + ⟨v_k, x⟩ v_k. Proof: let y ∈ S and write y = a_1 v_1 + ... + a_k v_k for some a_1, ..., a_k ∈ ℝ. Then ‖x - y‖² = ‖x‖² - 2⟨x, y⟩ + ‖y‖² = ‖x‖² - 2 Σ_i a_i ⟨x, v_i⟩ + Σ_i a_i².
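A direct implementation of the proposition's formula (a sketch; project is a hypothetical helper, and S here is the xy-plane in ℝ³):

    import numpy as np

    def project(x, basis):
        # P_S(x) = sum_i <v_i, x> v_i, valid when `basis` is orthonormal
        return sum(np.dot(v, x) * v for v in basis)

    v1 = np.array([1.0, 0.0, 0.0])
    v2 = np.array([0.0, 1.0, 0.0])

    x = np.array([1.0, 2.0, 3.0])
    print(project(x, [v1, v2]))     # [1. 2. 0.]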

19. Proof (continued). Minimizing ‖x - y‖² over y ∈ S amounts to minimizing, independently for each i, f(a_i) = a_i² - 2 a_i ⟨x, v_i⟩. Setting f′(a_i) = 2 a_i - 2 ⟨x, v_i⟩ = 0 gives a_i = ⟨x, v_i⟩. Hence the minimizer is y = ⟨x, v_1⟩ v_1 + ... + ⟨x, v_k⟩ v_k = P_S(x). ∎

20. Consequence. Define V as the n × k matrix whose columns are v_1, ..., v_k. Then V^T x = (⟨v_1, x⟩, ..., ⟨v_k, x⟩)^T, hence P_S(x) = ⟨v_1, x⟩ v_1 + ... + ⟨v_k, x⟩ v_k = V V^T x: the orthogonal projection onto S is the linear map x ↦ V V^T x.
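The same projection in matrix form (a sketch continuing the example above):

    import numpy as np

    v1 = np.array([1.0, 0.0, 0.0])
    v2 = np.array([0.0, 1.0, 0.0])
    V = np.column_stack([v1, v2])   # n x k matrix with orthonormal columns

    P = V @ V.T                     # projection matrix onto S = Span(v1, v2)
    x = np.array([1.0, 2.0, 3.0])
    print(P @ x)                    # [1. 2. 0.], same as the coordinate formula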

21. Consequence. Corollary. For all x ∈ ℝ^n, x - P_S(x) is orthogonal to S, and ‖P_S(x)‖ ≤ ‖x‖.
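Both claims of the corollary can be checked on a random subspace (a sketch; QR is used only to produce an orthonormal basis):

    import numpy as np

    rng = np.random.default_rng(1)
    V, _ = np.linalg.qr(rng.normal(size=(5, 2)))   # orthonormal basis of a plane S in R^5
    P = V @ V.T

    x = rng.normal(size=5)
    r = x - P @ x                                  # residual x - P_S(x)
    print(np.abs(V.T @ r).max())                   # ~0: x - P_S(x) is orthogonal to S
    print(np.linalg.norm(P @ x) <= np.linalg.norm(x))  # True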

22. Proof of Cauchy-Schwarz inequality

23. Cauchy-Schwarz inequality. Theorem. Let ‖·‖ be the norm induced by the inner product ⟨·,·⟩ on the vector space V. Then for all x, y ∈ V: |⟨x, y⟩| ≤ ‖x‖ ‖y‖. (1) Moreover, there is equality in (1) if and only if x and y are linearly dependent, i.e. x = α y or y = α x for some α ∈ ℝ. Proof: let x, y ∈ V. If x = 0 or y = 0 the result is obvious, so from now on we assume that x, y ≠ 0.

24. Proof. We define f : ℝ → ℝ by f(t) = ‖y + t x‖² = ‖y‖² + 2t ⟨x, y⟩ + t² ‖x‖², a polynomial of degree 2 in t. Remark 1: f(t) ≥ 0 for all t. Remark 2: a degree-2 polynomial that is nonnegative for all t has discriminant ≤ 0. Hence Δ = 4 ⟨x, y⟩² - 4 ‖x‖² ‖y‖² ≤ 0, which gives ⟨x, y⟩² ≤ ‖x‖² ‖y‖², i.e. |⟨x, y⟩| ≤ ‖x‖ ‖y‖.

25. Proof (equality case). There is equality |⟨x, y⟩| = ‖x‖ ‖y‖ if and only if the discriminant Δ is 0, i.e. if and only if there exists some t such that f(t) = 0. Since f(t) = ‖y + t x‖², this means ‖y + t x‖ = 0, i.e. y = -t x: x and y are linearly dependent. ∎
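A numeric check of both the inequality and its equality case (a sketch with random vectors):

    import numpy as np

    rng = np.random.default_rng(2)
    x, y = rng.normal(size=4), rng.normal(size=4)
    print(abs(np.dot(x, y)) <= np.linalg.norm(x) * np.linalg.norm(y))  # True

    # Equality holds exactly when x and y are linearly dependent
    y = -3.0 * x
    print(abs(np.dot(x, y)), np.linalg.norm(x) * np.linalg.norm(y))    # equal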

26. Questions?

27. Questions? The angle θ between two nonzero vectors x and y is given by cos θ = ⟨x, y⟩ / (‖x‖ ‖y‖), which is well defined since Cauchy-Schwarz guarantees that this ratio lies in [-1, 1].
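This is how angles between vectors are computed in practice (a sketch; the clip only guards against rounding slightly outside [-1, 1]):

    import numpy as np

    def angle(x, y):
        # cos(theta) = <x, y> / (||x|| ||y||), in [-1, 1] by Cauchy-Schwarz
        c = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
        return np.arccos(np.clip(c, -1.0, 1.0))

    print(np.degrees(angle(np.array([1.0, 0.0]), np.array([1.0, 1.0]))))  # 45.0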

28. Orthogonal matrices. Definition. A matrix A ∈ ℝ^{n×n} is called an orthogonal matrix if its columns form an orthonormal family. Example: the rotation matrix R_θ = ((cos θ, -sin θ), (sin θ, cos θ)) is orthogonal.
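Checking that a rotation matrix has orthonormal columns (a sketch; theta is arbitrary):

    import numpy as np

    theta = 0.7
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    print(np.allclose(R.T @ R, np.eye(2)))   # True: columns are orthonormal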

29. A proposition. Proposition. Let A ∈ ℝ^{n×n}. The following points are equivalent: 1. A is orthogonal. 2. A^T A = Id_n. 3. A A^T = Id_n. In particular, an orthogonal matrix A is invertible with A^{-1} = A^T.
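The characterizations and the inverse formula, verified on a random orthogonal matrix (a sketch; the Q factor of a QR decomposition is orthogonal):

    import numpy as np

    rng = np.random.default_rng(3)
    A, _ = np.linalg.qr(rng.normal(size=(4, 4)))

    print(np.allclose(A.T @ A, np.eye(4)))       # 2. A^T A = Id_n
    print(np.allclose(A @ A.T, np.eye(4)))       # 3. A A^T = Id_n
    print(np.allclose(np.linalg.inv(A), A.T))    # A^{-1} = A^T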

30. Orthogonal matrices & norm. Proposition. Let A ∈ ℝ^{n×n} be an orthogonal matrix. Then A preserves the dot product, in the sense that for all x, y ∈ ℝ^n, ⟨Ax, Ay⟩ = ⟨x, y⟩. In particular, taking x = y, we see that A preserves the Euclidean norm: ‖Ax‖ = ‖x‖.
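And the dot-product and norm preservation (a sketch reusing a random orthogonal matrix):

    import numpy as np

    rng = np.random.default_rng(4)
    Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))
    x, y = rng.normal(size=4), rng.normal(size=4)

    print(np.isclose(np.dot(Q @ x, Q @ y), np.dot(x, y)))        # True
    print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # True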
