
Matrix Calculations: Inner Products & Orthogonality (A. Kissinger) - PowerPoint PPT Presentation



  1. Matrix Calculations: Inner Products & Orthogonality
     A. Kissinger
     Institute for Computing and Information Sciences
     Radboud University Nijmegen
     Version: spring 2017

  2. Outline
     • Inner products and orthogonality
     • Orthogonalisation
     • Application: computational linguistics
     • Wrapping up

  3. Length of a vector
     • Each vector v = (x₁, …, xₙ) ∈ ℝⁿ has a length (a.k.a. norm), written ‖v‖
     • This ‖v‖ is a non-negative real number: ‖v‖ ∈ ℝ, ‖v‖ ≥ 0
     • Some special cases:
       • n = 1: v ∈ ℝ, with ‖v‖ = |v|
       • n = 2: v = (x₁, x₂) ∈ ℝ², and with Pythagoras: ‖v‖² = x₁² + x₂², thus ‖v‖ = √(x₁² + x₂²)
       • n = 3: v = (x₁, x₂, x₃) ∈ ℝ³, and also with Pythagoras: ‖v‖² = x₁² + x₂² + x₃², thus ‖v‖ = √(x₁² + x₂² + x₃²)
     • In general, for v = (x₁, …, xₙ) ∈ ℝⁿ: ‖v‖ = √(x₁² + x₂² + ⋯ + xₙ²)
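The general norm formula can be sketched in a few lines of Python (the function name `norm` is ours, not from the slides):

```python
import math

def norm(v):
    """Length (Euclidean norm) of v = (x1, ..., xn): sqrt(x1^2 + ... + xn^2)."""
    return math.sqrt(sum(x * x for x in v))

# n = 2: the classic 3-4-5 right triangle
print(norm((3.0, 4.0)))  # 5.0
```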

  4. Distance between points
     • Assume now we have two vectors v, w ∈ ℝⁿ, written as: v = (x₁, …, xₙ) and w = (y₁, …, yₙ)
     • What is the distance between their endpoints?
       • commonly written as d(v, w)
       • again, d(v, w) is a non-negative real
     • For n = 2: d(v, w) = √((x₁ − y₁)² + (x₂ − y₂)²) = ‖v − w‖ = ‖w − v‖
     • This will be used also for other n, so: d(v, w) = ‖v − w‖
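The distance formula d(v, w) = ‖v − w‖ translates directly to code (a sketch; `dist` is our name):

```python
import math

def dist(v, w):
    """Distance d(v, w) = ||v - w|| between the endpoints of v and w."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(v, w)))

print(dist((1.0, 1.0), (4.0, 5.0)))  # 5.0: again a 3-4-5 triangle
# d(v, w) = d(w, v), since ||v - w|| = ||w - v||
print(dist((4.0, 5.0), (1.0, 1.0)))  # 5.0
```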

  5. Length is fundamental
     • Distance can be obtained from the length of vectors
     • Interestingly, angles can also be obtained from length!
     • Both lengths of vectors and angles between vectors can be derived from the notion of inner product

  6. Inner product definition
     Definition: For vectors v = (x₁, …, xₙ), w = (y₁, …, yₙ) ∈ ℝⁿ, define their inner product as the real number:
     ⟨v, w⟩ = x₁y₁ + ⋯ + xₙyₙ = Σᵢ xᵢyᵢ  (sum over 1 ≤ i ≤ n)
     Note: the length ‖v‖ can be expressed via the inner product:
     ‖v‖² = x₁² + ⋯ + xₙ² = ⟨v, v⟩, so ‖v‖ = √⟨v, v⟩.
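Both the inner product and the norm-via-inner-product identity ‖v‖ = √⟨v, v⟩ can be checked with a short sketch (function name `inner` is ours):

```python
import math

def inner(v, w):
    """Inner product <v, w> = x1*y1 + ... + xn*yn."""
    return sum(x * y for x, y in zip(v, w))

v = (1.0, 2.0, 2.0)
print(inner(v, v))             # 9.0, since 1 + 4 + 4 = 9
print(math.sqrt(inner(v, v)))  # 3.0: the norm ||v|| = sqrt(<v, v>)
```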

  7. Inner products via matrix transpose
     Matrix transposition: For an m × n matrix A, the transpose Aᵀ is the n × m matrix obtained by mirroring A in the diagonal: the entry of Aᵀ at row i, column j is the entry of A at row j, column i. In other words, the rows of A become the columns of Aᵀ.
     The inner product of v = (x₁, …, xₙ), w = (y₁, …, yₙ) ∈ ℝⁿ is then a matrix product:
     ⟨v, w⟩ = x₁y₁ + ⋯ + xₙyₙ = (x₁ ⋯ xₙ) · (y₁, …, yₙ)ᵀ = vᵀ · w
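Viewing column vectors as n × 1 matrices, ⟨v, w⟩ = vᵀ · w can be demonstrated with plain list-of-rows matrices (a sketch; `transpose` and `matmul` are our helper names):

```python
def transpose(A):
    """Transpose of a matrix given as a list of rows: rows become columns."""
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    """Ordinary matrix product of A (m x n) and B (n x p)."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

v = [[1.0], [2.0], [3.0]]  # column vector as a 3 x 1 matrix
w = [[4.0], [5.0], [6.0]]
# <v, w> as the 1 x 1 matrix product v^T * w
print(matmul(transpose(v), w))  # [[32.0]], since 1*4 + 2*5 + 3*6 = 32
```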

  8. Properties of the inner product
     1. The inner product is symmetric in v and w: ⟨v, w⟩ = ⟨w, v⟩
     2. It is linear in v:
        ⟨v + v′, w⟩ = ⟨v, w⟩ + ⟨v′, w⟩    ⟨av, w⟩ = a⟨v, w⟩
        ...and hence also in w (by symmetry):
        ⟨v, w + w′⟩ = ⟨v, w⟩ + ⟨v, w′⟩    ⟨v, aw⟩ = a⟨v, w⟩
     3. And it is positive definite: v ≠ 0 ⟹ ⟨v, v⟩ > 0
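These three properties can be spot-checked numerically on random vectors (a sketch, not a proof; tolerances allow for floating-point rounding):

```python
import random

def inner(v, w):
    return sum(x * y for x, y in zip(v, w))

random.seed(0)
v  = [random.uniform(-1, 1) for _ in range(4)]
v2 = [random.uniform(-1, 1) for _ in range(4)]
w  = [random.uniform(-1, 1) for _ in range(4)]
a  = 3.0

# symmetry: <v, w> = <w, v>
assert abs(inner(v, w) - inner(w, v)) < 1e-12
# linearity in the first argument
assert abs(inner([x + y for x, y in zip(v, v2)], w) - (inner(v, w) + inner(v2, w))) < 1e-12
assert abs(inner([a * x for x in v], w) - a * inner(v, w)) < 1e-12
# positive definiteness (for this nonzero v)
assert inner(v, v) > 0
print("all three properties hold on this sample")
```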

  9. Inner products and angles, part I
     For v = w = (1, 0), ⟨v, w⟩ = 1. As we start to rotate w, ⟨v, w⟩ goes down until 0:
     ⟨v, w⟩ = 1,  ⟨v, w⟩ = 4/5,  ⟨v, w⟩ = 3/5,  ⟨v, w⟩ = 0
     ...and then goes to −1:
     ⟨v, w⟩ = 0,  ⟨v, w⟩ = −3/5,  ⟨v, w⟩ = −4/5,  ⟨v, w⟩ = −1
     ...then down to 0 again, then to 1, then repeats...
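The rotation experiment is easy to reproduce: keep v = (1, 0) fixed and rotate a unit vector w by various angles (a sketch; the sampled angles are our choice, not the slide's):

```python
import math

def inner(v, w):
    return sum(x * y for x, y in zip(v, w))

v = (1.0, 0.0)
for deg in (0, 60, 90, 120, 180):
    # unit vector at `deg` degrees from the x-axis
    w = (math.cos(math.radians(deg)), math.sin(math.radians(deg)))
    print(deg, round(inner(v, w), 3))  # 1.0, 0.5, 0.0, -0.5, -1.0
```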

  10. Cosine
      Plotting these numbers against the angle between the vectors [figure: plot of ⟨v, w⟩ against the angle], it looks like ⟨v, w⟩ depends on the cosine of the angle between v and w. Let's prove it!

  11. Recall: definition of cosine
      In a right triangle with hypotenuse a, angle γ, adjacent side x, and opposite side y:
      cos(γ) = x/a  ⟹  x = a cos(γ)

  12. The cosine rule
      In a triangle with sides a, b, c, where γ is the angle between a and b, drop the perpendicular from the endpoint of a onto b: it meets b at distance x from γ, with height y.
      Claim: cos(γ) = (a² + b² − c²) / (2ab)
      Proof: We have three equations to play with:
      x² + y² = a²
      (b − x)² + y² = c²
      x = a cos(γ)
      ...let's do the math. □
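Writing the foot of the height as x along side b (so that the side c opposite γ satisfies (b − x)² + y² = c²), the omitted computation goes as follows (our reconstruction of the step left to the reader):

```latex
\begin{align*}
c^2 &= (b - x)^2 + y^2 \\
    &= b^2 - 2bx + (x^2 + y^2) \\
    &= b^2 - 2bx + a^2             && \text{using } x^2 + y^2 = a^2 \\
    &= a^2 + b^2 - 2ab\cos(\gamma) && \text{using } x = a\cos(\gamma)
\end{align*}
```

Rearranging gives 2ab cos(γ) = a² + b² − c², i.e. cos(γ) = (a² + b² − c²)/(2ab). □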

  13. Inner products and angles, part II
      Translating this to a statement about vectors: take the triangle with sides ‖v‖ and ‖w‖, the angle γ between them, and third side d(v, w) := ‖v − w‖. The cosine rule gives:
      cos(γ) = (‖v‖² + ‖w‖² − ‖v − w‖²) / (2‖v‖‖w‖)
      Let's clean this up...

  14. Inner products and angles, part II
      Starting from the cosine rule:
      cos(γ) = (‖v‖² + ‖w‖² − ‖v − w‖²) / (2‖v‖‖w‖)
             = (x₁² + ⋯ + xₙ² + y₁² + ⋯ + yₙ² − (x₁ − y₁)² − ⋯ − (xₙ − yₙ)²) / (2‖v‖‖w‖)
             = (2x₁y₁ + ⋯ + 2xₙyₙ) / (2‖v‖‖w‖)
             = (x₁y₁ + ⋯ + xₙyₙ) / (‖v‖‖w‖)
             = ⟨v, w⟩ / (‖v‖‖w‖)
      Remember this: cos(γ) = ⟨v, w⟩ / (‖v‖‖w‖)
      Thus, angles between vectors are expressible via the inner product (since ‖v‖ = √⟨v, v⟩).
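The boxed formula cos(γ) = ⟨v, w⟩ / (‖v‖‖w‖) gives a direct way to compute angles between vectors (a sketch; `angle` is our name, and it assumes both vectors are nonzero):

```python
import math

def inner(v, w):
    return sum(x * y for x, y in zip(v, w))

def angle(v, w):
    """Angle in degrees between nonzero v and w, via cos(gamma) = <v, w> / (||v|| ||w||)."""
    cos_gamma = inner(v, w) / (math.sqrt(inner(v, v)) * math.sqrt(inner(w, w)))
    return math.degrees(math.acos(cos_gamma))

print(angle((1.0, 0.0), (0.0, 2.0)))  # 90.0: perpendicular vectors
print(angle((1.0, 0.0), (1.0, 1.0)))  # 45.0 (up to floating-point rounding)
```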

  15. Linear algebra in gaming, part I
      • Linear algebra plays an important role in game visualisation
      • Here: a simple illustration, borrowed from blog.wolfire.com (more precisely: http://blog.wolfire.com/2009/07/linear-algebra-for-game-developers-part-2)
      • Recall: the cosine function is positive on angles between −90 and +90 degrees.

  16. Linear algebra in gaming, part II
      • Consider a guard G and a hiding ninja H
      • The guard is at position (1, 1), facing in direction D = (1, 1), with a 180 degree field of view
      • The ninja is at (3, 0). Is he in sight?

  17. Linear algebra in gaming, part III
      • The vector from G to H is: V = (3, 0) − (1, 1) = (2, −1)
      • The angle γ between D and V must be between −90 and +90 degrees
      • Hence we must have: cos(γ) = ⟨D, V⟩ / (‖D‖ · ‖V‖) ≥ 0
      • Since ‖D‖ > 0 and ‖V‖ > 0, it suffices to have: ⟨D, V⟩ ≥ 0
      • Indeed, ⟨D, V⟩ = 1 · 2 + 1 · (−1) = 1 ≥ 0. Hence H is within sight!
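The whole visibility test boils down to one inner product. A minimal sketch (`in_sight` is our name; the 180 degree field of view is the slide's assumption):

```python
def inner(v, w):
    return sum(x * y for x, y in zip(v, w))

def in_sight(guard_pos, facing, target_pos):
    """With a 180 degree field of view, the target is visible iff the angle
    between the facing direction and the guard-to-target vector is at most
    90 degrees, i.e. iff their inner product is >= 0."""
    to_target = [t - g for t, g in zip(target_pos, guard_pos)]
    return inner(facing, to_target) >= 0

# The slide's example: guard at (1, 1) facing D = (1, 1), ninja at (3, 0)
print(in_sight((1, 1), (1, 1), (3, 0)))    # True: <D, V> = 1*2 + 1*(-1) = 1 >= 0
print(in_sight((1, 1), (1, 1), (-1, -2)))  # False: this ninja is behind the guard
```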
