
Interlacing Families II: Mixed Characteristic Polynomials and the Kadison-Singer Problem
Nikhil Srivastava (Microsoft Research), Adam Marcus (Yale), Daniel Spielman (Yale)
Discrepancy Theory: How well can you approximate a discrete object by ...


  1. In One Dimension. Main Theorem. Suppose $v_1, \dots, v_m \in \mathbb{R}$ are numbers with $|v_i|^2 \le \epsilon^2$ and total energy one: $\sum_i v_i^2 = 1$. Then there is a partition $T_1 \cup T_2$ such that each part has energy close to half: $\sum_{i \in T_k} v_i^2 = \tfrac{1}{2} \pm 5\epsilon$.
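
The one-dimensional statement already has an elementary flavor. As a hedged illustration (a toy Python sketch of my own, not part of the talk): a greedy split of numbers whose squares are at most $\epsilon$ balances the energy to within $\epsilon$.

```python
import random

# Toy sketch (not from the talk): greedily split numbers w_i with
# w_i^2 <= eps and total energy 1 into two parts, always adding the next
# number to the part that currently has less energy.
eps = 0.01
w, total = [], 0.0
while total < 1.0:
    x = random.uniform(0, eps) ** 0.5        # so that x^2 <= eps
    w.append(x)
    total += x * x
scale = total ** 0.5                         # rescale so the energy is exactly 1
w = [x / scale for x in w]

parts = [0.0, 0.0]                           # energy in T_1 and T_2
for x in w:
    k = 0 if parts[0] <= parts[1] else 1
    parts[k] += x * x

# The imbalance never exceeds the largest single square, i.e. at most eps.
print(parts)
```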

  2. In Higher Dimensions. Main Theorem. Suppose $v_1, \dots, v_m \in \mathbb{R}^n$ are vectors with $\|v_i\| \le \epsilon$ and energy one in each direction: $\sum_i \langle u, v_i \rangle^2 = 1$ for all $\|u\| = 1$. Then there is a partition $T_1 \cup T_2$ such that each part has energy close to half in each direction: $\sum_{i \in T_k} \langle u, v_i \rangle^2 = \tfrac{1}{2} \pm 5\epsilon$ for all $\|u\| = 1$.

  3. In Higher Dimensions. Main Theorem. Suppose $v_1, \dots, v_m \in \mathbb{R}^n$ are vectors with $\|v_i\| \le \epsilon$ and energy one in each direction: $\sum_i \langle u, v_i \rangle^2 = 1$ for all $\|u\| = 1$. Then there is a partition $T_1 \cup T_2$ such that each part has energy close to half in each direction: $\sum_{i \in T_k} \langle u, v_i \rangle^2 = \tfrac{1}{2} \pm 5\epsilon$ for all $\|u\| = 1$. Optimal in high dimension.

  4. Matrix Notation. Given vectors $v_1, \dots, v_m$, write the quadratic form as $\sum_i \langle v_i, u \rangle^2 = u^T \big( \sum_i v_i v_i^T \big) u$.

  5. Matrix Notation. Given vectors $v_1, \dots, v_m$, write the quadratic form as $\sum_i \langle v_i, u \rangle^2 = u^T \big( \sum_i v_i v_i^T \big) u$. Isotropy: $\sum_i v_i v_i^T = I_n$.

  6. Matrix Notation. Given vectors $v_1, \dots, v_m$, write the quadratic form as $\sum_i \langle v_i, u \rangle^2 = u^T \big( \sum_i v_i v_i^T \big) u$. Isotropy: $\sum_i v_i v_i^T = I_n$. Comparison: $A \preceq B \iff u^T A u \le u^T B u$ for all $u$.
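
A small numerical sketch of this notation (my own example, assuming NumPy; none of it is from the deck): the quadratic form equals $u^T(\sum_i v_i v_i^T)u$, isotropy means that matrix is the identity, and $A \preceq B$ can be checked via the eigenvalues of $B - A$.

```python
import numpy as np

# Sketch of the matrix notation: rows of V are the vectors v_i.
rng = np.random.default_rng(0)
n, m = 4, 40
V = rng.standard_normal((m, n))
V = V @ np.linalg.inv(np.linalg.cholesky(V.T @ V)).T   # force sum_i v_i v_i^T = I_n

Q = V.T @ V                       # = sum_i v_i v_i^T
u = rng.standard_normal(n)
u /= np.linalg.norm(u)

lhs = np.sum((V @ u) ** 2)        # sum_i <v_i, u>^2
rhs = u @ Q @ u                   # u^T Q u
print(np.isclose(lhs, rhs), np.allclose(Q, np.eye(n)))   # quadratic form, isotropy

# Comparison: A <= B in the PSD order iff B - A has no negative eigenvalues.
A, B = 0.25 * Q, Q
print(np.all(np.linalg.eigvalsh(B - A) >= -1e-12))
```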

  7. Matrix Notation. Main Theorem. Suppose $v_1, \dots, v_m \in \mathbb{R}^n$ are vectors with $\|v_i\| \le \epsilon$ and $\sum_i v_i v_i^T = I_n$. Then there is a partition $T_1 \cup T_2$ such that $\big(\tfrac{1}{2} - 5\epsilon\big) I \preceq \sum_{i \in T_k} v_i v_i^T \preceq \big(\tfrac{1}{2} + 5\epsilon\big) I$.

  8. Matrix Notation. Main Theorem. Suppose $v_1, \dots, v_m \in \mathbb{R}^n$ are vectors with $\|v_i\|^2 \le \epsilon$ and $\sum_i v_i v_i^T = I_n$. Then there is a partition $T_1 \cup T_2$ such that $\big(\tfrac{1}{2} - 5\sqrt{\epsilon}\big) I \preceq \sum_{i \in T_k} v_i v_i^T \preceq \big(\tfrac{1}{2} + 5\sqrt{\epsilon}\big) I$.
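
As a sanity check (again my own sketch, not the proof), one can brute-force the conclusion on a tiny random isotropic instance. With so few vectors $\epsilon$ is large and the bound $5\sqrt{\epsilon}$ is weak, but the quantities being compared are visible.

```python
import numpy as np

# Brute-force check of the partition's existence on a tiny instance.
rng = np.random.default_rng(1)
n, m = 3, 12
V = rng.standard_normal((m, n))
V = V @ np.linalg.inv(np.linalg.cholesky(V.T @ V)).T   # rows v_i with sum v_i v_i^T = I_n
eps = float(np.max(np.sum(V ** 2, axis=1)))            # eps = max_i ||v_i||^2

best = np.inf
for mask in range(2 ** (m - 1)):                       # fix v_{m-1} in T_1 (partitions are symmetric)
    T1 = [i for i in range(m) if not (mask >> i) & 1]
    Q1 = V[T1].T @ V[T1]
    # Q_2 = I - Q_1, so both parts deviate from (1/2) I by the same amount.
    dev = np.max(np.abs(np.linalg.eigvalsh(Q1 - 0.5 * np.eye(n))))
    best = min(best, dev)

print(f"eps = {eps:.3f}, best deviation = {best:.3f}, 5*sqrt(eps) = {5 * np.sqrt(eps):.3f}")
```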

  9. Unnormalized Version. Suppose I get some vectors $w_1, \dots, w_m$ which are not isotropic: $\sum_i w_i w_i^T = W \succeq 0$. Consider $v_i := W^{-1/2} w_i$ and apply the theorem to the $v_i$. The normalized vectors have $\|v_i\|^2 = \|W^{-1/2} w_i\|^2 \le \epsilon$. The theorem gives $\big(\tfrac{1}{2} - 5\sqrt{\epsilon}\big) I \preceq \sum_{i \in T_k} v_i v_i^T \preceq \big(\tfrac{1}{2} + 5\sqrt{\epsilon}\big) I$.

  10. Unnormalized Version. Suppose I get some vectors $w_1, \dots, w_m$ which are not isotropic: $\sum_i w_i w_i^T = W \succeq 0$. Consider $v_i := W^{-1/2} w_i$ and apply the theorem to the $v_i$. The normalized vectors have $\|v_i\|^2 = \|W^{-1/2} w_i\|^2 \le \epsilon$. The theorem gives $\big(\tfrac{1}{2} - 5\sqrt{\epsilon}\big) I \preceq \sum_{i \in T_k} W^{-1/2} w_i w_i^T W^{-1/2} \preceq \big(\tfrac{1}{2} + 5\sqrt{\epsilon}\big) I$.

  11. Unnormalized Version. Suppose I get some vectors $w_1, \dots, w_m$ which are not isotropic: $\sum_i w_i w_i^T = W \succeq 0$. Consider $v_i := W^{-1/2} w_i$ and apply the theorem to the $v_i$. The normalized vectors have $\|v_i\|^2 = \|W^{-1/2} w_i\|^2 \le \epsilon$, so $\big(\tfrac{1}{2} - 5\sqrt{\epsilon}\big) I \preceq \sum_{i \in T_k} W^{-1/2} w_i w_i^T W^{-1/2} \preceq \big(\tfrac{1}{2} + 5\sqrt{\epsilon}\big) I$.

  12. Unnormalized Version. Suppose I get some vectors $w_1, \dots, w_m$ which are not isotropic: $\sum_i w_i w_i^T = W \succeq 0$. (Fact: $A \preceq B \iff CAC \preceq CBC$ for invertible $C$.) Consider $v_i := W^{-1/2} w_i$ and apply the theorem to the $v_i$. The normalized vectors have $\|v_i\|^2 = \|W^{-1/2} w_i\|^2 \le \epsilon$, so $\big(\tfrac{1}{2} - 5\sqrt{\epsilon}\big) I \preceq \sum_{i \in T_k} W^{-1/2} w_i w_i^T W^{-1/2} \preceq \big(\tfrac{1}{2} + 5\sqrt{\epsilon}\big) I$.

  13. Unnormalized Version. Suppose I get some vectors $w_1, \dots, w_m$ which are not isotropic: $\sum_i w_i w_i^T = W \succeq 0$. Consider $v_i := W^{-1/2} w_i$ and apply the theorem to the $v_i$. The normalized vectors have $\|v_i\|^2 = \|W^{-1/2} w_i\|^2 \le \epsilon$. Conjugating back by $W^{1/2}$ gives $\big(\tfrac{1}{2} - 5\sqrt{\epsilon}\big) W \preceq \sum_{i \in T_k} w_i w_i^T \preceq \big(\tfrac{1}{2} + 5\sqrt{\epsilon}\big) W$.
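
A sketch of the normalization step described on slides 9-13 (my own NumPy example, not from the deck): conjugating by $W^{-1/2}$ makes arbitrary vectors isotropic, so the main theorem applies to them.

```python
import numpy as np

# Normalize non-isotropic vectors w_i: with W = sum_i w_i w_i^T > 0,
# the vectors v_i = W^{-1/2} w_i satisfy sum_i v_i v_i^T = I_n.
rng = np.random.default_rng(2)
n, m = 4, 30
Wvecs = rng.standard_normal((m, n)) * rng.uniform(0.5, 2.0, size=(m, 1))  # rows w_i
W = Wvecs.T @ Wvecs                        # = sum_i w_i w_i^T

lam, U = np.linalg.eigh(W)
W_inv_half = U @ np.diag(lam ** -0.5) @ U.T
V = Wvecs @ W_inv_half                     # rows v_i = W^{-1/2} w_i

print(np.allclose(V.T @ V, np.eye(n)))         # isotropy of the normalized vectors
print(float(np.max(np.sum(V ** 2, axis=1))))   # eps = max_i ||W^{-1/2} w_i||^2
```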

  14. Unnormalized Theorem. Given arbitrary vectors $w_1, \dots, w_m \in \mathbb{R}^n$, there is a partition $[m] = T_1 \cup T_2$ with $\big(\tfrac{1}{2} - 5\sqrt{\epsilon}\big) \sum_i w_i w_i^T \preceq \sum_{i \in T_k} w_i w_i^T \preceq \big(\tfrac{1}{2} + 5\sqrt{\epsilon}\big) \sum_i w_i w_i^T$, where $\epsilon := \max_i \|W^{-1/2} w_i\|^2$.

  15. Unnormalized Theorem. Given arbitrary vectors $w_1, \dots, w_m \in \mathbb{R}^n$, there is a partition $[m] = T_1 \cup T_2$ with $\big(\tfrac{1}{2} - 5\sqrt{\epsilon}\big) \sum_i w_i w_i^T \preceq \sum_{i \in T_k} w_i w_i^T \preceq \big(\tfrac{1}{2} + 5\sqrt{\epsilon}\big) \sum_i w_i w_i^T$, where $\epsilon := \max_i \|W^{-1/2} w_i\|^2$. Any quadratic form in which no vector has too much influence can be split into two representative pieces.

  16. Applications

  17. 1. Graph Theory. Given an undirected graph $G = (V, E)$, consider its Laplacian matrix: $L_G = \sum_{ij \in E} (e_i - e_j)(e_i - e_j)^T$.

  18. 1. Graph Theory. Given an undirected graph $G = (V, E)$, consider its Laplacian matrix: $L_G = \sum_{ij \in E} (e_i - e_j)(e_i - e_j)^T$. (Figure: the edge $ij$ contributes a rank-one matrix supported on rows and columns $i$ and $j$.)

  19. 1. Graph Theory. Given an undirected graph $G = (V, E)$, consider its Laplacian matrix: $L_G = \sum_{ij \in E} (e_i - e_j)(e_i - e_j)^T = D - A$.

  20. 1. Graph Theory. Given an undirected graph $G = (V, E)$, consider its Laplacian matrix: $L_G = \sum_{ij \in E} (e_i - e_j)(e_i - e_j)^T = D - A$. Quadratic form: $x^T L_G x = \sum_{ij \in E} (x_i - x_j)^2$ for $x \in \mathbb{R}^n$.
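
A short sketch of these definitions (the edge list below is an arbitrary illustrative graph, not the one pictured in the deck): build $L_G$ from the rank-one edge terms, check $L_G = D - A$, and check the quadratic-form identity.

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # illustrative graph on 4 vertices
n = 4

L = np.zeros((n, n))
for i, j in edges:
    e = np.zeros(n); e[i], e[j] = 1, -1
    L += np.outer(e, e)                    # (e_i - e_j)(e_i - e_j)^T

A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1
D = np.diag(A.sum(axis=1))
print(np.allclose(L, D - A))               # L_G = D - A

x = np.array([-1.0, 1.0, 2.0, 0.0])
print(x @ L @ x, sum((x[i] - x[j]) ** 2 for i, j in edges))   # the two sides agree
```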

  21. The Laplacian Quadratic Form. An example: (Figure: a graph with vertex values $-1, +1, +2, 0, +1, +3$.)

  22. The Laplacian Quadratic Form. An example: (Figure: the same graph, with the squared difference shown on each edge.) $x^T L x = \sum_{ij \in E} (x(i) - x(j))^2$.

  23. The Laplacian Quadratic Form. An example: (Figure: the same graph, with the squared difference shown on each edge.) $x^T L x = \sum_{ij \in E} (x(i) - x(j))^2 = 28$.

  24. The Laplacian Quadratic Form. Another example: (Figure: a graph with vertex values $0, 0, 0, +1, 0, +1$.)

  25. The Laplacian Quadratic Form. Another example: (Figure: the same graph, with the squared difference shown on each edge.) $x^T L_G x = 3$.

  26. Cuts and the Quadratic Form. For the characteristic vector $x = \mathbf{1}_S$ of a vertex set $S$, $x^T L_G x$ is the number of edges crossing the cut $(S, V \setminus S)$.

  27. Cuts and the Quadratic Form. For the characteristic vector $x = \mathbf{1}_S$ of a vertex set $S$, $x^T L_G x$ is the number of edges crossing the cut $(S, V \setminus S)$. The Laplacian quadratic form encodes the entire cut structure of the graph.
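
Continuing the same toy example (my own, not from the deck): plugging in the characteristic vector of a vertex set recovers the size of the corresponding cut.

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # same illustrative graph as above
n = 4
L = np.zeros((n, n))
for i, j in edges:
    e = np.zeros(n); e[i], e[j] = 1, -1
    L += np.outer(e, e)

S = {0, 1}
x = np.array([1.0 if v in S else 0.0 for v in range(n)])   # characteristic vector of S
crossing = sum(1 for i, j in edges if (i in S) != (j in S))
print(x @ L @ x, crossing)                 # both equal the number of edges leaving S
```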

  28. Application to Graphs. G.

  29. Application to Graphs. G: $L_G = \sum_{ij \in E} (e_i - e_j)(e_i - e_j)^T$.

  30. Application to Graphs. Theorem.

  31. Application to Graphs. Theorem. (Figure: the edges of $G$ are split into two graphs $H_1$ and $H_2$, with Laplacians $L_{H_1}$ and $L_{H_2}$.)

  32. Application to Graphs. Theorem. (Figure: $G$ split into $H_1$ and $H_2$.) $x^T L_{H_1} x \approx \tfrac{1}{2}\, x^T L_G x$.
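
A brute-force illustration of this statement on a very small graph (my own sketch; $K_5$ is too small for the approximation to be tight, but the split is visible): search over all ways to divide the edges into $H_1$ and $H_2$ and report how close $L_{H_1}$ gets to $\tfrac{1}{2} L_G$.

```python
import numpy as np

n = 5
edges = [(i, j) for i in range(n) for j in range(i + 1, n)]   # K_5

def laplacian(edge_list):
    L = np.zeros((n, n))
    for i, j in edge_list:
        e = np.zeros(n); e[i], e[j] = 1, -1
        L += np.outer(e, e)
    return L

L_G = laplacian(edges)
best = np.inf
for mask in range(2 ** (len(edges) - 1)):
    H1 = [e for b, e in enumerate(edges) if (mask >> b) & 1]
    dev = np.max(np.abs(np.linalg.eigvalsh(laplacian(H1) - 0.5 * L_G)))
    best = min(best, dev)

# best spectral deviation of L_{H_1} from (1/2) L_G; compare with ||(1/2) L_G|| = 2.5
print(best)
```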

  33. Recursive Application Gives: 1. Graph Sparsification. Theorem [Batson-Spielman-S. '09]: Every graph $G$ has a weighted $O(1)$-cut approximation $H$ with $O(n)$ edges. (Figure: $G$, unweighted, with $O(n^2)$ edges; $H$, weighted, with $O(n)$ edges.)

  34. Approximating One Graph by Another. Cut Approximation [Benczur-Karger '96]: for every cut, the weight of edges crossing it in $G$ β‰ˆ the weight of edges crossing it in $H$.

  39. Approximating One Graph by Another. Cut Approximation [Benczur-Karger '96]: for every cut, the weight of edges crossing it in $G$ β‰ˆ the weight of edges crossing it in $H$. So $G$ and $H$ have the same cuts; they are equivalent for min cut, max cut, sparsest cut, ...

  40. Recursive Application Gives: 2. Unweighted Graph Sparsification. Every transitive graph $G$ has an unweighted $O(1)$-cut approximation $H$ with $O(n)$ edges. (Figure: $K_n \to H$, an expander graph.)

  41. Recursive Application Gives: 2. Unweighted Graph Sparsification. Every transitive graph $G$ can be partitioned into $O(1)$-cut approximations with $O(n)$ edges. (Figure: $K_n \to H_1, \dots, H_n$, expander graphs.)

  42. Recursive Application Gives: 2. Unweighted Graph Sparsification. Every transitive graph $G$ can be partitioned into $O(1)$-cut approximations with $O(n)$ edges. (Figure: $K_n \to H_1, \dots, H_n$, expander graphs.) Generalizes [Frieze-Molloy].

  43. Recursive Application Gives: 2. Unweighted Graph Sparsification. Every transitive graph $G$ can be partitioned into $O(1)$-cut approximations with $O(n)$ edges. (Figure: $G$ split into $H_1$ and $H_2$ with the same cut structure.)

  44. 2. Uncertainty Principles. Signal $x \in \mathbb{C}^n$. Discrete Fourier transform: $\hat{x}(k) = \big\langle x, \big(\exp(-\tfrac{2\pi i jk}{n})\big)_{j \le n} \big\rangle$.

  45. 2. Uncertainty Principles. Signal $x \in \mathbb{C}^n$. Discrete Fourier transform: $\hat{x}(k) = \big\langle x, \big(\exp(-\tfrac{2\pi i jk}{n})\big)_{j \le n} \big\rangle$. Uncertainty Principle: $x$ and $\hat{x}$ cannot be simultaneously localized: $|\mathrm{supp}(x)| \times |\mathrm{supp}(\hat{x})| \ge n$.

  46. 2. Uncertainty Principles. Signal $x \in \mathbb{C}^n$. Discrete Fourier transform: $\hat{x}(k) = \big\langle x, \big(\exp(-\tfrac{2\pi i jk}{n})\big)_{j \le n} \big\rangle$. Uncertainty Principle: $x$ and $\hat{x}$ cannot be simultaneously localized: $|\mathrm{supp}(x)| \times |\mathrm{supp}(\hat{x})| \ge n$. If $x$ is supported on $|S| = \sqrt{n}$ coordinates, then $|\mathrm{supp}(\hat{x})| \ge \sqrt{n}$.
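
A numerical illustration of the uncertainty principle on this slide (a sanity check using NumPy's FFT, which uses the same $\exp(-2\pi i jk/n)$ convention; not part of the talk):

```python
import numpy as np

n = 64
x = np.zeros(n)
x[:8] = 1.0                               # supported on 8 = sqrt(64) coordinates
xhat = np.fft.fft(x)                      # xhat(k) = sum_j x_j exp(-2*pi*i*j*k/n)

supp_x = np.count_nonzero(np.abs(x) > 1e-9)
supp_xhat = np.count_nonzero(np.abs(xhat) > 1e-9)
print(supp_x, supp_xhat, supp_x * supp_xhat >= n)   # the product bound holds
```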

  47. 2. Uncertainty Principles. Signal $x \in \mathbb{C}^n$. Discrete Fourier transform: $\hat{x}(k) = \big\langle x, \big(\exp(-\tfrac{2\pi i jk}{n})\big)_{j \le n} \big\rangle$. Stronger Uncertainty Principle: for every subset $S$ with $|S| = \sqrt{n}$, there is a partition $[n] = T_1 \cup \dots \cup T_{\sqrt{n}}$ such that $\|\widehat{x|_S}\,|_{T_\ell}\|^2 \approx \tfrac{1}{\sqrt{n}}\, \|x|_S\|^2$ for all $x$ and all $T_\ell$.

  49. 2. Uncertainty Principles. Proof. Let $f_k = \big(\exp(-\tfrac{2\pi i jk}{n})\big)_{j \le n}$ be the Fourier basis. Fix a subset $S \subset [n]$ of $\sqrt{n}$ coordinates. The restricted norm $\|x|_S\|^2 = \sum_k |\langle x|_S, f_k \rangle|^2$ is a quadratic form in $\sqrt{n}$ dimensions. Apply the theorem.
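
A sketch of the identity this proof rests on (the unit normalization of the Fourier vectors below is my convention, chosen so the identity is exact): the restricted norm is a sum of squared inner products against the $n$ vectors $f_k|_S$, which is exactly the kind of quadratic form the main theorem splits.

```python
import numpy as np

n = 16
idx = np.arange(n)
F = np.exp(-2j * np.pi * np.outer(idx, idx) / n) / np.sqrt(n)   # columns are unit-norm f_k
S = idx[:4]                                  # a set of sqrt(n) coordinates

x = np.zeros(n, dtype=complex)
x[S] = np.random.default_rng(3).standard_normal(len(S))         # a signal supported on S

lhs = np.linalg.norm(x[S]) ** 2
rhs = np.sum(np.abs(F[S].conj().T @ x[S]) ** 2)   # sum_k |<x|_S, f_k|_S>|^2
print(np.isclose(lhs, rhs))
```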

  50. 2. Uncertainty Principles. Applications in analytic number theory and harmonic analysis. Proof. Let $f_k = \big(\exp(-\tfrac{2\pi i jk}{n})\big)_{j \le n}$ be the Fourier basis. Fix a subset $S \subset [n]$ of $\sqrt{n}$ coordinates. The restricted norm $\|x|_S\|^2 = \sum_k |\langle x|_S, f_k \rangle|^2$ is a quadratic form in $\sqrt{n}$ dimensions. Apply the theorem.

  51. 3. The Kadison-Singer Problem. Dirac, 1930s: bra-ket formalism for quantum states: $\psi$, $\varphi$, ...

  52. 3. The Kadison-Singer Problem. Dirac, 1930s: bra-ket formalism for quantum states: $\psi$, $\varphi$, ... What are bras and kets? NOT vectors.

  53. 3. The Kadison-Singer Problem. Dirac, 1930s: bra-ket formalism for quantum states: $\psi$, $\varphi$, ... What are bras and kets? NOT vectors. Von Neumann, 1936: theory of $C^*$-algebras.

  54. 3. The Kadison-Singer Problem. Dirac, 1930s: bra-ket formalism for quantum states: $\psi$, $\varphi$, ... What are bras and kets? NOT vectors. Von Neumann, 1936: theory of $C^*$-algebras. Kadison-Singer, 1959: Does this lead to a satisfactory notion of measurement? (A conjecture about infinite matrices.)

  55. 3. The Kadison-Singer Problem. Kadison-Singer, 1959: Does this lead to a satisfactory notion of measurement? (A conjecture about infinite matrices.)

  56. 3. The Kadison-Singer Problem. Kadison-Singer, 1959: Does this lead to a satisfactory notion of measurement? (A conjecture about infinite matrices.) Anderson, 1979: reduced it to a question about finite matrices (the "Paving Conjecture").

  57. 3. The Kadison-Singer Problem. Kadison-Singer, 1959: Does this lead to a satisfactory notion of measurement? (A conjecture about infinite matrices.) Anderson, 1979: reduced it to a question about finite matrices. Akemann-Anderson, 1991: reduced it to a question about finite projection matrices.

  58. 3. The Kadison-Singer Problem. Kadison-Singer, 1959: Does this lead to a satisfactory notion of measurement? (A conjecture about infinite matrices.) Anderson, 1979: reduced it to a question about finite matrices. Akemann-Anderson, 1991: reduced it to a question about finite projection matrices. Weaver, 2002: a discrepancy-theoretic formulation of the same question.

  59. 3. The Kadison-Singer Problem. Kadison-Singer, 1959: Does this lead to a satisfactory notion of measurement? (A conjecture about infinite matrices.) Anderson, 1979: reduced it to a question about finite matrices. Akemann-Anderson, 1991: reduced it to a question about finite projection matrices. This work: proof of Weaver's conjecture.

  61. In General. Anything that can be encoded as a quadratic form can be split into pieces while preserving certain properties. Many different things can be gainfully encoded this way.

  62. Proof

  63. Main Theorem. Suppose $v_1, \dots, v_m \in \mathbb{R}^n$ are vectors with $\|v_i\|^2 \le \epsilon$ and $\sum_i v_i v_i^T = I_n$. Then there is a partition $T_1 \cup T_2$ such that $\big(\tfrac{1}{2} - 5\sqrt{\epsilon}\big) I \preceq \sum_{i \in T_k} v_i v_i^T \preceq \big(\tfrac{1}{2} + 5\sqrt{\epsilon}\big) I$.

  64. Equivalent Theorem. Suppose $v_1, \dots, v_m \in \mathbb{R}^n$ are vectors with $\|v_i\|^2 \le \epsilon$ and $\sum_i v_i v_i^T = I_n$. Then there is a partition $T_1 \cup T_2$ such that $\sum_{i \in T_k} v_i v_i^T \preceq \big(\tfrac{1}{2} + 5\sqrt{\epsilon}\big) I$ for $k = 1, 2$.
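
The equivalence is the standard one-line observation (filling in the step the slide leaves implicit): the two parts sum to the identity, so the one-sided bound applied to $T_2$ yields the missing lower bound for $T_1$,

$$\sum_{i \in T_1} v_i v_i^T \;=\; I - \sum_{i \in T_2} v_i v_i^T \;\succeq\; I - \Big(\tfrac{1}{2} + 5\sqrt{\epsilon}\Big) I \;=\; \Big(\tfrac{1}{2} - 5\sqrt{\epsilon}\Big) I,$$

and symmetrically for $T_2$, recovering the two-sided Main Theorem.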
