
ASPECTS OF QUADRATIC FORMS IN THE WORK OF HIRZEBRUCH AND ATIYAH


1. ASPECTS OF QUADRATIC FORMS IN THE WORK OF HIRZEBRUCH AND ATIYAH - the director’s cut
   Andrew Ranicki, University of Edinburgh and MPIM, Bonn
   http://www.maths.ed.ac.uk/~aar
   Topology seminar, Bonn, 12th October 2010
   An extended version of the lecture given at the Royal Society of Edinburgh on 17th September 2010, on the occasion of the award to F. Hirzebruch of an Honorary RSE Fellowship.

2. James Joseph Sylvester (1814–1897), Honorary Fellow of the RSE, 1874

3. Sylvester’s 1852 paper
   ◮ Fundamental insight: the invariance of the numbers of positive and negative eigenvalues of a quadratic polynomial under linear substitutions.
   ◮ Impact statement: the Sylvester crater on the Moon.

4. Symmetric matrices
   ◮ An m × n matrix S = (s_ij ∈ R) corresponds to a bilinear pairing
   $$S : \mathbb{R}^m \times \mathbb{R}^n \to \mathbb{R};\quad ((x_1, x_2, \dots, x_m), (y_1, y_2, \dots, y_n)) \mapsto \sum_{i=1}^{m} \sum_{j=1}^{n} s_{ij} x_i y_j.$$
   ◮ The transpose of an m × n matrix S is the n × m matrix S* = (s*_ji) with s*_ji = s_ij, so that S*(x, y) = S(y, x).
   ◮ An n × n matrix S is symmetric if S* = S, i.e. S(x, y) = S(y, x).
   ◮ Quadratic polynomials Q(x) correspond to symmetric matrices S: Q(x) = S(x, x), with S(x, y) = (Q(x + y) − Q(x) − Q(y))/2.
   ◮ Spectral theorem (Cauchy, 1829) The eigenvalues of a symmetric n × n matrix S are real: the characteristic polynomial ch_z(S) = det(zI_n − S) ∈ R[z] has real roots λ_1, λ_2, ..., λ_n ∈ R ⊂ C.
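
A quick numerical companion to this slide (not part of the original lecture; the matrix and vectors below are my own examples): a minimal Python/NumPy sketch checking that a symmetric matrix has real eigenvalues and that the quadratic polynomial Q(x) = S(x, x) recovers the bilinear pairing through the polarization identity.

```python
import numpy as np

# Example matrix chosen for illustration only.
S = np.array([[2.0, 1.0, 0.0],
              [1.0, 0.0, 3.0],
              [0.0, 3.0, -1.0]])
assert np.allclose(S, S.T)                 # S is symmetric: S* = S

# Spectral theorem: the eigenvalues of a symmetric matrix are real.
eigenvalues = np.linalg.eigvalsh(S)
print("eigenvalues:", eigenvalues)

# Polarization: S(x, y) = (Q(x + y) - Q(x) - Q(y)) / 2 with Q(x) = S(x, x).
Q = lambda x: x @ S @ x
x = np.array([1.0, 2.0, -1.0])
y = np.array([0.0, 1.0, 1.0])
assert np.isclose((Q(x + y) - Q(x) - Q(y)) / 2, x @ S @ y)
```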

5. Linear and orthogonal congruence
   ◮ Two n × n matrices S, T are linearly congruent if T = A*SA for an invertible n × n matrix A, i.e. T(x, y) = S(Ax, Ay) ∈ R (x, y ∈ R^n).
   ◮ An n × n matrix A is orthogonal if it is invertible and A^{−1} = A*.
   ◮ Two n × n matrices S, T are orthogonally congruent if T = A*SA for an orthogonal n × n matrix A. Then T = A^{−1}SA is conjugate to S.
   ◮ Diagonalization A symmetric n × n matrix S is orthogonally congruent to a diagonal matrix
   $$A^*SA = D(\chi_1, \chi_2, \dots, \chi_n) = \begin{pmatrix} \chi_1 & 0 & \dots & 0 \\ 0 & \chi_2 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & \chi_n \end{pmatrix}$$
   with A = (b_1 b_2 ... b_n) the orthogonal n × n matrix whose columns are an orthonormal basis of R^n of eigenvectors b_k ∈ R^n, Sb_k = χ_k b_k ∈ R^n.
   ◮ Proposition Symmetric n × n matrices S, T are orthogonally congruent if and only if they have the same eigenvalues.
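
A sketch of the diagonalization statement in NumPy (again with an example matrix of my own choosing): np.linalg.eigh returns the eigenvalues together with an orthogonal matrix A of eigenvectors, and A*SA is the corresponding diagonal matrix.

```python
import numpy as np

S = np.array([[2.0, 1.0, 0.0],
              [1.0, 0.0, 3.0],
              [0.0, 3.0, -1.0]])               # symmetric example matrix

chi, A = np.linalg.eigh(S)                     # eigenvalues chi_k, eigenvectors as columns of A
assert np.allclose(A.T @ A, np.eye(3))         # A is orthogonal: A^{-1} = A*
assert np.allclose(A.T @ S @ A, np.diag(chi))  # orthogonal congruence to D(chi_1, ..., chi_n)
assert np.allclose(S @ A, A @ np.diag(chi))    # columnwise: S b_k = chi_k b_k
```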

6. The indices of inertia and the signature
   ◮ The positive and negative indices of inertia of a symmetric n × n matrix S are
   τ_+(S) = (number of eigenvalues λ_k > 0), τ_−(S) = (number of eigenvalues λ_k < 0) ∈ {0, 1, 2, ..., n}.
   ◮ τ_+(S) = dim(V_+) is the dimension of any maximal subspace V_+ ⊆ R^n with S(x, x) > 0 for all x ∈ V_+ \ {0}. Similarly for τ_−(S).
   ◮ The signature (= index of inertia) of S is the difference
   $$\tau(S) = \tau_+(S) - \tau_-(S) = \sum_{k=1}^{n} \mathrm{sign}(\lambda_k) \in \{-n, \dots, -1, 0, 1, \dots, n\}.$$
   ◮ The rank of S is the sum
   $$\dim_{\mathbb{R}}(S(\mathbb{R}^n)) = \tau_+(S) + \tau_-(S) = \sum_{k=1}^{n} |\mathrm{sign}(\lambda_k)| \in \{0, 1, 2, \dots, n\}.$$
   S is invertible if and only if τ_+(S) + τ_−(S) = n.
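
The indices of inertia are easy to compute numerically. A minimal sketch (the matrix and the tolerance are my own choices; a tolerance is needed because floating-point eigenvalues of a singular matrix are only approximately zero):

```python
import numpy as np

def inertia(S, tol=1e-9):
    """Return (tau_+, tau_-) counted from the eigenvalues of the symmetric matrix S."""
    lam = np.linalg.eigvalsh(S)
    return int(np.sum(lam > tol)), int(np.sum(lam < -tol))

S = np.diag([3.0, -1.0, 0.0, 5.0])
tau_plus, tau_minus = inertia(S)
signature = tau_plus - tau_minus             # tau(S) = tau_+(S) - tau_-(S)
rank = tau_plus + tau_minus                  # rank(S) = tau_+(S) + tau_-(S)
print(tau_plus, tau_minus, signature, rank)  # 2 1 1 3
```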

7. Sylvester’s Law of Inertia (1852)
   ◮ Law of Inertia Symmetric n × n matrices S, T are linearly congruent if and only if they have eigenvalues of the same signs, i.e. the same indices
   τ_+(S) = τ_+(T) and τ_−(S) = τ_−(T) ∈ {0, 1, ..., n}.
   ◮ Proof (i) If x ∈ R^n is such that S(x, x) ≠ 0 then S is linearly congruent to
   $$\begin{pmatrix} S(x, x) & 0 \\ 0 & S' \end{pmatrix}$$
   with S' the (n − 1) × (n − 1) matrix of S restricted to the (n − 1)-dimensional subspace x^⊥ = {y ∈ R^n | S(x, y) = 0} ⊂ R^n.
   ◮ (ii) A symmetric n × n matrix S with eigenvalues λ_1 ≥ λ_2 ≥ ... ≥ λ_n is linearly congruent to the diagonal matrix
   $$D(\mathrm{sign}(\lambda_1), \mathrm{sign}(\lambda_2), \dots, \mathrm{sign}(\lambda_n)) = \begin{pmatrix} I_p & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & -I_q \end{pmatrix}$$
   with τ_+(S) = p, τ_−(S) = q, τ(S) = p − q, rank(S) = p + q.
   ◮ Invertible symmetric n × n matrices S, T are linearly congruent if and only if they have the same signature τ(S) = τ(T).
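
A numerical spot-check of the Law of Inertia (my own example, not from the lecture): congruence by a random invertible matrix changes the eigenvalues but not the indices of inertia.

```python
import numpy as np

def inertia(S, tol=1e-9):
    lam = np.linalg.eigvalsh(S)
    return int(np.sum(lam > tol)), int(np.sum(lam < -tol))

rng = np.random.default_rng(0)
S = np.diag([2.0, 2.0, -3.0, 0.0])           # tau_+ = 2, tau_- = 1
A = rng.normal(size=(4, 4))                  # almost surely invertible
assert abs(np.linalg.det(A)) > 1e-8

T = A.T @ S @ A                              # T is linearly congruent to S
print(inertia(S), inertia(T))                # same indices: (2, 1) (2, 1)
```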

8. Regular symmetric matrices
   ◮ The principal k × k minor of an n × n matrix S = (s_ij)_{1 ≤ i,j ≤ n} is μ_k(S) = det(S_k) ∈ R, with S_k = (s_ij)_{1 ≤ i,j ≤ k} the principal k × k submatrix:
   $$S = \begin{pmatrix} S_k & \dots \\ \vdots & \ddots \end{pmatrix}$$
   ◮ An n × n matrix S is regular if μ_k(S) ≠ 0 ∈ R (1 ≤ k ≤ n), that is if each S_k is invertible.
   ◮ In particular, S_n = S is invertible, and the eigenvalues are λ_k ≠ 0.
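
A short sketch of the regularity test via principal minors (the example matrices are my own; note that the second matrix is invertible but not regular):

```python
import numpy as np

def principal_minors(S):
    """mu_k(S) = det(S_k) for k = 1, ..., n."""
    n = S.shape[0]
    return [np.linalg.det(S[:k, :k]) for k in range(1, n + 1)]

def is_regular(S, tol=1e-12):
    return all(abs(m) > tol for m in principal_minors(S))

S = np.array([[1.0, 2.0],
              [2.0, 3.0]])
print(principal_minors(S), is_regular(S))    # [1.0, -1.0] True

T = np.array([[0.0, 1.0],
              [1.0, 0.0]])                   # invertible, but mu_1(T) = 0
print(is_regular(T))                         # False
```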

9. The Sylvester-Gundelfinger-Frobenius theorem
   ◮ Theorem (Sylvester 1852, Gundelfinger 1881, Frobenius 1895) The eigenvalues λ_k(S) of a regular symmetric n × n matrix S have the signs of the successive minor quotients
   sign(λ_k(S)) = sign(μ_k(S)/μ_{k−1}(S)) ∈ {−1, 1} for k = 1, 2, ..., n, with μ_0(S) = 1.
   The signature is
   $$\tau(S) = \sum_{k=1}^{n} \mathrm{sign}(\mu_k(S)/\mu_{k-1}(S)) \in \{-n, -n+1, \dots, n\}.$$
   ◮ Proved by the "algebraic plumbing" of matrices – the algebraic analogue of the geometric plumbing of manifolds.
   ◮ Corollary If S is an invertible symmetric n × n matrix which is not regular, then for sufficiently small ε ≠ 0 the symmetric n × n matrix S_ε = S + εI_n is regular, with eigenvalues λ_k(S_ε) = λ_k(S) + ε ≠ 0 and sign(λ_k(S_ε)) = sign(λ_k(S)) ∈ {−1, 1}, so that
   $$\tau(S) = \tau(S_\varepsilon) = \sum_{k=1}^{n} \mathrm{sign}(\mu_k(S_\varepsilon)/\mu_{k-1}(S_\varepsilon)) \in \mathbb{Z}.$$
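
The theorem can be checked numerically: the signature computed from eigenvalue signs agrees with the signature computed from the successive minor quotients. A sketch with a regular example matrix of my own:

```python
import numpy as np

def signature_from_eigenvalues(S):
    return int(np.sum(np.sign(np.linalg.eigvalsh(S))))

def signature_from_minors(S):
    n = S.shape[0]
    mu = [1.0] + [np.linalg.det(S[:k, :k]) for k in range(1, n + 1)]   # mu_0 = 1
    return int(sum(np.sign(mu[k] / mu[k - 1]) for k in range(1, n + 1)))

S = np.array([[1.0, 2.0, 0.0],
              [2.0, 3.0, 1.0],
              [0.0, 1.0, 1.0]])              # regular: mu = 1, -1, -2
print(signature_from_eigenvalues(S), signature_from_minors(S))   # 1 1
```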

10. Algebraic plumbing
   ◮ Definition The plumbing of a regular symmetric n × n matrix S with respect to v ∈ R^n, w ≠ vS^{−1}v* ∈ R is the regular symmetric (n + 1) × (n + 1) matrix
   $$S' = \begin{pmatrix} S & v^* \\ v & w \end{pmatrix}.$$
   ◮ Proof of the Sylvester-Gundelfinger-Frobenius Theorem It suffices to calculate the jump in signature under plumbing. The matrix identity
   $$S' = \begin{pmatrix} 1 & 0 \\ vS^{-1} & 1 \end{pmatrix} \begin{pmatrix} S & 0 \\ 0 & w - vS^{-1}v^* \end{pmatrix} \begin{pmatrix} 1 & S^{-1}v^* \\ 0 & 1 \end{pmatrix}$$
   shows that S' is linearly congruent to
   $$\begin{pmatrix} S & 0 \\ 0 & w - vS^{-1}v^* \end{pmatrix}.$$
   By the Law of Inertia τ(S') = τ(S) + sign(w − vS^{−1}v*) ∈ Z, so that τ(S') − τ(S) = sign(μ_{n+1}(S')/μ_n(S')).
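
A numerical sketch of the plumbing construction and of the signature jump (the matrix S, the vector v and the scalar w are my own test data):

```python
import numpy as np

def signature(M):
    return int(np.sum(np.sign(np.linalg.eigvalsh(M))))

S = np.array([[1.0, 2.0],
              [2.0, 3.0]])                   # regular symmetric 2 x 2 matrix
v = np.array([[1.0, -1.0]])                  # row vector v; v* is its transpose
w = 4.0
schur = w - (v @ np.linalg.inv(S) @ v.T).item()   # w - v S^{-1} v*
assert schur != 0.0                          # plumbing condition: w != v S^{-1} v*

S_prime = np.block([[S, v.T],
                    [v, np.array([[w]])]])   # the plumbed 3 x 3 matrix

jump = signature(S_prime) - signature(S)
print(jump, int(np.sign(schur)))             # equal, as predicted by the Law of Inertia
```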

11. Tridiagonal matrices
   ◮ The tridiagonal symmetric n × n matrix of χ = (χ_1, χ_2, ..., χ_n) ∈ R^n is
   $$\mathrm{Tri}(\chi) = \begin{pmatrix} \chi_1 & 1 & 0 & \dots & 0 & 0 \\ 1 & \chi_2 & 1 & \dots & 0 & 0 \\ 0 & 1 & \chi_3 & \dots & 0 & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & 0 & \dots & \chi_{n-1} & 1 \\ 0 & 0 & 0 & \dots & 1 & \chi_n \end{pmatrix}$$
   ◮ Jacobi, Ein leichtes Verfahren, die in der Theorie der Säcularstörungen vorkommenden Gleichungen numerisch aufzulösen (1846) (an easy method for the numerical solution of the equations occurring in the theory of secular perturbations). Tridiagonal matrices were first used in the numerical solution of simultaneous linear equations.
   ◮ Tridiagonal matrices and continued fractions feature in recurrences, Sturm theory, numerical analysis, orthogonal polynomials, integrable systems ... and in the Hirzebruch-Jung resolution of singularities.
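
Constructing Tri(χ) in NumPy is a one-liner; a minimal sketch (the vector χ is my own example):

```python
import numpy as np

def tri(chi):
    """Tridiagonal symmetric matrix with diagonal chi and off-diagonal entries 1."""
    chi = np.asarray(chi, dtype=float)
    n = len(chi)
    return np.diag(chi) + np.eye(n, k=1) + np.eye(n, k=-1)

print(tri([2.0, -1.0, 3.0]))
# [[ 2.  1.  0.]
#  [ 1. -1.  1.]
#  [ 0.  1.  3.]]
```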

12. Sylvester’s 1853 paper
   ◮ Sturm’s theorem gave a formula for the number of roots in an interval of a generic real polynomial f(x) ∈ R[x].
   ◮ The formula was in terms of the numbers of changes of signs at the ends of the interval in the polynomials which occur as the successive remainders in the Euclidean algorithm in the polynomial ring R[x] applied to f(x)/f'(x).
   ◮ Sylvester recast the formula as a difference of signatures, using an expression for the signature of a tridiagonal matrix in terms of continued fractions.
   ◮ Barge and Lannes, Suites de Sturm, indice de Maslov et périodicité de Bott (2008) gives a modern take on the algebraic connections between Sturm sequences, the signatures of tridiagonal matrices and Bott periodicity.

13. Tridiagonal matrices and continued fractions
   ◮ A vector χ = (χ_1, χ_2, ..., χ_n) ∈ R^n is regular if
   χ_k ≠ 0, μ_k(Tri(χ)) ≠ 0 (k = 1, 2, ..., n),
   so that the tridiagonal symmetric matrix Tri(χ) is regular.
   ◮ Theorem (Sylvester, 1853) A tridiagonal matrix Tri(χ) for a regular χ ∈ R^n is linearly congruent to the diagonal matrix with entries the continued fractions
   $$\lambda_k(\mathrm{Tri}(\chi)) = \mu_k(\mathrm{Tri}(\chi))/\mu_{k-1}(\mathrm{Tri}(\chi)) = [\chi_k, \chi_{k-1}, \dots, \chi_1] = \chi_k - \cfrac{1}{\chi_{k-1} - \cfrac{1}{\chi_{k-2} - \cdots - \cfrac{1}{\chi_1}}}$$
   ◮ The signature of Tri(χ) is
   $$\tau(\mathrm{Tri}(\chi)) = \sum_{k=1}^{n} \mathrm{sign}(\lambda_k(\mathrm{Tri}(\chi))) \in \{-n, -n+1, \dots, n\}.$$
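
A numerical check of Sylvester's 1853 theorem (the regular vector χ is my own example): the minor quotients μ_k/μ_{k−1} of Tri(χ) agree with the continued fractions [χ_k, χ_{k−1}, ..., χ_1], and the signs of these quotients add up to the signature.

```python
import numpy as np

def tri(chi):
    n = len(chi)
    return np.diag(np.asarray(chi, dtype=float)) + np.eye(n, k=1) + np.eye(n, k=-1)

def continued_fraction(chis):
    """[chi_k, ..., chi_1] = chi_k - 1/(chi_{k-1} - 1/(... - 1/chi_1)) for chis = (chi_1, ..., chi_k)."""
    value = chis[0]
    for c in chis[1:]:
        value = c - 1.0 / value
    return value

chi = [2.0, -1.0, 3.0]                       # regular: all chi_k and all mu_k nonzero
T = tri(chi)
mu = [1.0] + [np.linalg.det(T[:k, :k]) for k in range(1, len(chi) + 1)]

for k in range(1, len(chi) + 1):
    print(mu[k] / mu[k - 1], continued_fraction(chi[:k]))   # equal term by term

signature = sum(int(np.sign(mu[k] / mu[k - 1])) for k in range(1, len(chi) + 1))
print("signature:", signature)               # 1 for this example
```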

14. "Aspiring to these wide generalizations, the analysis of quadratic functions soars to a pitch from whence it may look proudly down on the feeble and vain attempts of geometry proper to rise to its level or to emulate it in its flights." (1850)
   Savilian Professor of Geometry, Oxford, 1883–1894
