SLIDE 1

Model Order Reduction of Higher Order Systems

Joint work with Peter Benner and Philip Saltenberger Heike Faßbender, ICERM, February 2020

Institute for Numerical Analysis, TU Braunschweig

SLIDE 2

Introduction Approach 1 Linearizations Example Robot Conclusions

Higher Order Linear Time-Invariant Systems

Given matrices P_j ∈ R^{n×n}, 0 ≤ j ≤ ℓ, C_j ∈ R^{p×n}, 0 ≤ j < ℓ, B ∈ R^{n×m}, D ∈ R^{p×m} and an input function u : [0, ∞) → R^m, we seek the state function x : [0, ∞) → R^n and the output function y : [0, ∞) → R^p such that

  P_ℓ (d^ℓ/dt^ℓ) x(t) + P_{ℓ−1} (d^{ℓ−1}/dt^{ℓ−1}) x(t) + ⋯ + P_1 (d/dt) x(t) + P_0 x(t) = B u(t),
  y(t) = D u(t) + C_{ℓ−1} (d^{ℓ−1}/dt^{ℓ−1}) x(t) + ⋯ + C_1 (d/dt) x(t) + C_0 x(t),

with initial conditions (d^j/dt^j) x(t)|_{t=0} = x_0^{(j)}, 0 ≤ j ≤ ℓ−1, where the x_0^{(j)} ∈ R^n are given vectors.

Transfer Function

  G(s) = D + (∑_{j=0}^{ℓ−1} s^j C_j) (P(s))^{−1} B,  where P(s) = P_0 + s P_1 + s^2 P_2 + ⋯ + s^ℓ P_ℓ.

H. Faßbender: MOR of Higher Order Systems
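As a quick numerical sanity check of the transfer function formula above, G(s) can be evaluated directly for a small system. The random second-order coefficients below are illustrative stand-ins, not data from the talk:

```python
import numpy as np

# Hypothetical small example (ell = 2, n = 4, m = p = 1); the matrices are
# random placeholders for the P_j, C_j, B, D of the slides.
rng = np.random.default_rng(0)
ell, n, m, p = 2, 4, 1, 1
P = [rng.standard_normal((n, n)) for _ in range(ell + 1)]  # P_0, ..., P_ell
C = [rng.standard_normal((p, n)) for _ in range(ell)]      # C_0, ..., C_{ell-1}
B = rng.standard_normal((n, m))
D = rng.standard_normal((p, m))

def transfer_function(s):
    """G(s) = D + (sum_j s^j C_j) (P(s))^{-1} B with P(s) = sum_i s^i P_i."""
    Ps = sum(s**i * P[i] for i in range(ell + 1))          # matrix polynomial P(s)
    Cs = sum(s**j * C[j] for j in range(ell))              # output polynomial
    return D + Cs @ np.linalg.solve(Ps, B)

G1 = transfer_function(1.0 + 0.5j)   # a p x m matrix for each s
```

At s = 0 this collapses to D + C_0 P_0^{−1} B, which gives an easy consistency check.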
SLIDE 4


Model Order Reduction for Higher Order Linear Time-Invariant Systems

Given matrices P_j ∈ R^{n×n}, 0 ≤ j ≤ ℓ, C_j ∈ R^{p×n}, 0 ≤ j < ℓ, B ∈ R^{n×m}, D ∈ R^{p×m}, we seek reduced order matrices P̂_j ∈ R^{r×r}, 0 ≤ j ≤ ℓ, Ĉ_j ∈ R^{p×r}, 0 ≤ j < ℓ, B̂ ∈ R^{r×m}, D̂ ∈ R^{p×m} with r ≪ n such that

  P̂_ℓ (d^ℓ/dt^ℓ) x̂(t) + P̂_{ℓ−1} (d^{ℓ−1}/dt^{ℓ−1}) x̂(t) + ⋯ + P̂_1 (d/dt) x̂(t) + P̂_0 x̂(t) = B̂ u(t),
  ŷ(t) = D̂ u(t) + Ĉ_{ℓ−1} (d^{ℓ−1}/dt^{ℓ−1}) x̂(t) + ⋯ + Ĉ_1 (d/dt) x̂(t) + Ĉ_0 x̂(t),

with suitable initial conditions yields a transfer function Ĝ(s) such that Ĝ(s) = G(s) + O((s − s_0)^r) for some s_0 ∈ C.

SLIDE 5


Galerkin Projection of Higher Order Linear Time-Invariant Systems

Given matrices P_j ∈ R^{n×n}, C_j ∈ R^{p×n}, B ∈ R^{n×m}, D ∈ R^{p×m}, find a matrix V ∈ R^{n×r} with orthonormal columns, r ≪ n, and construct

  P̂_j = V^T P_j V ∈ R^{r×r},  B̂ = V^T B ∈ R^{r×m},  Ĉ_j = C_j V ∈ R^{p×r},  D̂ = D ∈ R^{p×m},

such that Ĝ(s) = G(s) + O((s − s_0)^r) for some s_0 ∈ C.
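The projection above can be sketched in a few lines. Note that the slides leave the choice of V open (a good V comes from a Krylov subspace, as discussed later); the random V and the sizes below are illustrative assumptions only, used to show the shapes of the projected quantities:

```python
import numpy as np

# Galerkin projection of a higher order system with a single matrix V.
# All coefficient matrices here are random placeholders.
rng = np.random.default_rng(1)
ell, n, m, p, r = 2, 20, 1, 1, 4
P = [rng.standard_normal((n, n)) for _ in range(ell + 1)]
C = [rng.standard_normal((p, n)) for _ in range(ell)]
B = rng.standard_normal((n, m))
D = rng.standard_normal((p, m))

# V with orthonormal columns (here random; in practice chosen carefully)
V, _ = np.linalg.qr(rng.standard_normal((n, r)))

Phat = [V.T @ Pj @ V for Pj in P]   # \hat P_j = V^T P_j V  in R^{r x r}
Bhat = V.T @ B                      # \hat B = V^T B
Chat = [Cj @ V for Cj in C]         # \hat C_j = C_j V
Dhat = D                            # \hat D = D

def G(s, Pc, Cc, Bc, Dc, deg):
    """Transfer function of a higher order system given its coefficients."""
    Ps = sum(s**i * Pc[i] for i in range(deg + 1))
    Cs = sum(s**j * Cc[j] for j in range(deg))
    return Dc + Cs @ np.linalg.solve(Ps, Bc)
```

The reduced transfer function is then G(s, Phat, Chat, Bhat, Dhat, ell), a p × m rational function of the same degree structure but with r × r coefficients.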

SLIDE 6


Standard approach: Linearization

Consider the associated matrix polynomial

  P(λ) = λ^ℓ P_ℓ + λ^{ℓ−1} P_{ℓ−1} + ⋯ + λ P_1 + P_0 ∈ Π_n^ℓ

and convert it into a linear pencil λE + A ∈ Π_{ℓn}^1 with the same eigenvalues.

Outline
• Illustrative examples
• Approach 1: MOR for higher order systems by Freund (2005)
• (Approach 2: MOR for higher order systems by Li, Bao, Lin, Wei (2011))
• New developments in linearization of matrix polynomials
  – Generalization of the companion form linearization L1
  – Block Kronecker linearizations G_{r+1}
• Higher order LTI systems and block Kronecker linearizations

SLIDE 15


Illustrative examples

Gyroscopic system, P(λ) ∈ Π_n^2

  P(λ) = λ^2 M + λ G + K,  M = M^T, G = −G^T, K = K^T,  M, G, K ∈ R^{n×n}.

Such problems arise, for example, from finite element discretizations in structural analysis and in the elastic deformation of anisotropic materials. They are used to model vibrations of spinning structures such as the simulation of tire noise, helicopter rotor blades, or spin-stabilized satellites with appended solar panels or antennas.

Robot, P(λ) ∈ Π_n^4

  P(λ) = λ^4 P_4 + λ^3 P_3 + λ^2 P_2 + λ P_1 + P_0,  P_i = (−1)^i P_i^T,  P_i ∈ R^{n×n}, i = 0, …, 4.

Such problems arise, e.g., from the model of a robot with electric motors in the joints.

T-even matrix polynomials

For both examples: P(λ) = P(−λ)^T.
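The T-even property is easy to verify numerically. For the gyroscopic case, P(−λ)^T = λ^2 M^T − λ G^T + K^T = λ^2 M + λ G + K = P(λ). A minimal check with a random symmetric/skew-symmetric/symmetric triple (illustrative data only):

```python
import numpy as np

# Random gyroscopic pencil: M symmetric, G skew-symmetric, K symmetric.
rng = np.random.default_rng(2)
n = 5
A = rng.standard_normal((n, n)); M = (A + A.T) / 2   # M = M^T
A = rng.standard_normal((n, n)); G = (A - A.T) / 2   # G = -G^T
A = rng.standard_normal((n, n)); K = (A + A.T) / 2   # K = K^T

def P(lam):
    """Quadratic matrix polynomial P(lam) = lam^2 M + lam G + K."""
    return lam**2 * M + lam * G + K

lam = 0.7
print(np.allclose(P(lam), P(-lam).T))  # True: the pencil is T-even
```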

SLIDE 20


Back to Higher Order Linear Time-Invariant Systems

  P_ℓ (d^ℓ/dt^ℓ) x(t) + P_{ℓ−1} (d^{ℓ−1}/dt^{ℓ−1}) x(t) + ⋯ + P_1 (d/dt) x(t) + P_0 x(t) = B u(t)
  y(t) = D u(t) + C_{ℓ−1} (d^{ℓ−1}/dt^{ℓ−1}) x(t) + ⋯ + C_1 (d/dt) x(t) + C_0 x(t)

Let (block rows separated by semicolons)

  z(t) = [x(t); (d/dt) x(t); …; (d^{ℓ−1}/dt^{ℓ−1}) x(t)] ∈ R^{ℓn},
  BF = [0; …; 0; B] ∈ R^{ℓn×m},
  AF = [0 −I_n ⋯ 0; ⋮ ⋱ ⋱ ⋮; 0 ⋯ 0 −I_n; P_0 P_1 P_2 ⋯ P_{ℓ−1}] ∈ R^{ℓn×ℓn},
  EF = diag(I_{(ℓ−1)n}, P_ℓ),
  CF = [C_0 C_1 ⋯ C_{ℓ−1}],  DF = D.

SLIDE 22


Approach 1

[Freund 2005]

Approach 1: Linearization via the first companion form

The higher order system is equivalent to the first order system

  EF (d/dt) z(t) + AF z(t) = BF u(t)
  y(t) = DF u(t) + CF z(t)
  z(0) = z_0

where (block rows separated by semicolons)

  z(t) = [x(t); (d/dt) x(t); …; (d^{ℓ−1}/dt^{ℓ−1}) x(t)],  z_0 = [x_0^{(0)}; x_0^{(1)}; …; x_0^{(ℓ−1)}],
  BF = [0; …; 0; B],
  AF = [0 −I_n ⋯ 0; ⋮ ⋱ ⋱ ⋮; 0 ⋯ 0 −I_n; P_0 P_1 P_2 ⋯ P_{ℓ−1}],
  EF = diag(I_{(ℓ−1)n}, P_ℓ),
  CF = [C_0 C_1 ⋯ C_{ℓ−1}],  DF = D.

SLIDE 23


Transfer function

  G(s) = DF + CF (sEF + AF)^{−1} BF = D + (∑_{j=0}^{ℓ−1} s^j C_j) (P(s))^{−1} B ∈ C(s)^{p×m}.

EF, AF ∈ R^{ℓn×ℓn} and BF ∈ R^{ℓn×m} are large and (block-)sparse. The pencil λEF + AF does not inherit any structure from P(λ); e.g., P(λ) = P(λ)^T does not imply that (λEF + AF)^T = λEF + AF.

SLIDE 27


For s_0 ∈ C such that s_0 EF + AF is nonsingular, rewrite G(s) = DF + CF (sEF + AF)^{−1} BF as

  G(s) = DF + CF (I + (s − s_0) MF)^{−1} RF

with MF = (s_0 EF + AF)^{−1} EF ∈ C^{ℓn×ℓn} and RF = (s_0 EF + AF)^{−1} BF ∈ C^{ℓn×m}.

Compute an orthonormal basis of the block Krylov subspace

  K_s(MF, RF) = span{RF, MF RF, …, MF^{s−1} RF}.

Let W be the matrix representing the basis. Generate the reduced order system

  Ê (d/dt) ẑ(t) + Â ẑ(t) = B̂ u(t),  ŷ(t) = D u(t) + Ĉ ẑ(t)

with Ê = W^T EF W, Â = W^T AF W ∈ C^{r×r}, B̂ = W^T BF ∈ C^{r×m}, Ĉ = CF W ∈ C^{p×r}.

It seems as if no ℓth order ODE can be extracted.
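A compact numerical sketch of this procedure (random second-order data, sizes and the number of Krylov blocks k chosen for illustration): build MF and RF at an expansion point s_0, orthonormalize the stacked Krylov blocks with a QR factorization, and project. Since RF lies in the range of W, the reduced model reproduces G exactly at s_0:

```python
import numpy as np

rng = np.random.default_rng(4)
ell, n, m, p = 2, 6, 1, 1
P = [rng.standard_normal((n, n)) for _ in range(ell + 1)]
C = [rng.standard_normal((p, n)) for _ in range(ell)]
B = rng.standard_normal((n, m)); D = rng.standard_normal((p, m))

# first companion form
N = ell * n; I = np.eye(n)
EF = np.zeros((N, N)); EF[: N - n, : N - n] = np.eye(N - n); EF[N - n :, N - n :] = P[ell]
AF = np.zeros((N, N))
for i in range(ell - 1):
    AF[i * n : (i + 1) * n, (i + 1) * n : (i + 2) * n] = -I
for j in range(ell):
    AF[N - n :, j * n : (j + 1) * n] = P[j]
BF = np.zeros((N, m)); BF[N - n :, :] = B
CF = np.hstack(C)

s0 = 0.5
MF = np.linalg.solve(s0 * EF + AF, EF)     # MF = (s0 EF + AF)^{-1} EF
RF = np.linalg.solve(s0 * EF + AF, BF)     # RF = (s0 EF + AF)^{-1} BF

k = 4                                       # number of Krylov blocks
blocks = [RF]
for _ in range(k - 1):
    blocks.append(MF @ blocks[-1])
W, _ = np.linalg.qr(np.hstack(blocks))      # orthonormal basis of K_k(MF, RF)

Eh, Ah = W.T @ EF @ W, W.T @ AF @ W         # projected (reduced) system
Bh, Ch = W.T @ BF, CF @ W

def G(s):
    return D + CF @ np.linalg.solve(s * EF + AF, BF)

def Gh(s):
    return D + Ch @ np.linalg.solve(s * Eh + Ah, Bh)

print(abs(G(s0) - Gh(s0)).max() < 1e-8)     # zeroth moment matches at s0
```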

SLIDE 32


The matrices MF and RF have a particular structure:

  MF = (s_0 EF + AF)^{−1} EF = (c ⊗ I_n) [M^(1) M^(2) M^(3) ⋯ M^(ℓ)] + Σ ⊗ I_n,
  RF = (s_0 EF + AF)^{−1} BF = c ⊗ R,

where

  M^(i) = (P(s_0))^{−1} ∑_{j=0}^{ℓ−i} s_0^j P_{i+j} ∈ C^{n×n},  i = 1, …, ℓ,
  R = (P(s_0))^{−1} B ∈ C^{n×m},
  c = [1; s_0; s_0^2; …; s_0^{ℓ−1}] ∈ C^ℓ,

and Σ ∈ C^{ℓ×ℓ} is the strictly lower triangular Toeplitz matrix with subdiagonal entries 1, s_0, …, s_0^{ℓ−2}:

  Σ = [0; 1 0; s_0 1 0; ⋮ ⋱ ⋱; s_0^{ℓ−2} ⋯ s_0 1 0].

SLIDE 33


Theorem (Freund 2005)

Let MF = (c ⊗ I_n) [M^(1) M^(2) M^(3) ⋯ M^(ℓ)] + Σ ⊗ I_n and RF = c ⊗ R with c ∈ C^ℓ, c_j ≠ 0, j = 1, …, ℓ, R ∈ C^{n×m}, M^(i) ∈ C^{n×n}, i = 1, …, ℓ, Σ ∈ C^{ℓ×ℓ}. Let 𝒲 ∈ C^{ℓn×r} be any basis of the block Krylov subspace K_s(MF, RF), r ≤ sm. Then 𝒲 can be represented in the form

  𝒲 = [W U^(1); W U^(2); …; W U^(ℓ)],

where W ∈ C^{n×r} and, for each i = 1, 2, …, ℓ, U^(i) ∈ C^{r×r} is nonsingular and upper triangular. Thus K_s(MF, RF) ⊂ C^{ℓn} consists of ℓ 'copies' of the subspace S_r = span{W} ⊂ C^n. Let V be the matrix representing an orthonormal basis of span{W}, V^H V = I_r, and choose 𝒱 = diag(V, V, …, V) ∈ C^{ℓn×ℓr}. Then K_s(MF, RF) ⊆ range 𝒱.
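The 'ℓ copies' statement can be observed directly: splitting the block Krylov matrix of the companion pencil into its ℓ row blocks of size n, all blocks span the same column space (the check below uses random second-order data and a nonzero s_0, so that all entries of c are nonzero as the theorem requires):

```python
import numpy as np

rng = np.random.default_rng(5)
ell, n, m = 2, 6, 1
P = [rng.standard_normal((n, n)) for _ in range(ell + 1)]
B = rng.standard_normal((n, m))

# first companion form
N = ell * n; I = np.eye(n)
EF = np.zeros((N, N)); EF[: N - n, : N - n] = np.eye(N - n); EF[N - n :, N - n :] = P[ell]
AF = np.zeros((N, N))
for i in range(ell - 1):
    AF[i * n : (i + 1) * n, (i + 1) * n : (i + 2) * n] = -I
for j in range(ell):
    AF[N - n :, j * n : (j + 1) * n] = P[j]
BF = np.zeros((N, m)); BF[N - n :, :] = B

s0 = 0.3                                   # s0 != 0 so c has no zero entries
MF = np.linalg.solve(s0 * EF + AF, EF)
RF = np.linalg.solve(s0 * EF + AF, BF)

k = 3                                      # Krylov matrix [RF, MF RF, MF^2 RF]
K = RF
for _ in range(k - 1):
    K = np.hstack([K, MF @ K[:, -m:]])

# split into ell row blocks of size n x (k m) and compare column spaces
blocks = [K[i * n : (i + 1) * n, :] for i in range(ell)]
r0 = np.linalg.matrix_rank(blocks[0])
same_space = all(
    np.linalg.matrix_rank(np.hstack([blocks[0], Bi])) == r0 for Bi in blocks[1:]
)
print(same_space)  # True
```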

SLIDE 38


Project the first order system using 𝒱:

  (𝒱^H EF 𝒱) (d/dt)(𝒱^H z(t)) + (𝒱^H AF 𝒱) (𝒱^H z(t)) = (𝒱^H BF) u(t)
  y(t) = DF u(t) + (CF 𝒱) 𝒱^H z(t)

with

  𝒱^H BF = [0; …; 0; V^H B],
  𝒱^H AF 𝒱 = [0 −I_r ⋯ 0; ⋮ ⋱ ⋱ ⋮; 0 ⋯ 0 −I_r; V^H P_0 V  V^H P_1 V  V^H P_2 V ⋯ V^H P_{ℓ−1} V],
  𝒱^H EF 𝒱 = diag(I_{(ℓ−1)r}, V^H P_ℓ V),
  CF 𝒱 = [C_0 V  C_1 V ⋯ C_{ℓ−1} V],  DF = D.

An ℓth order reduced order system can be read off immediately. The first moments of the reduced order system match those of the original system.

SLIDE 41


Approach 1 and 2

Approaches 1 and 2 both use a companion form linearization.
• Approach 1 uses the block Krylov subspace K_s(MF, RF) with MF = (s_0 EF + AF)^{−1} EF and RF = (s_0 EF + AF)^{−1} BF.
• Approach 2 uses the block Krylov subspace K_s(MB, RB) with MB = A_B^{−1} EB and RB = A_B^{−1} BB.

Neither λEF + AF nor λEB + AB is structure-preserving; e.g., (−λEF + AF)^T ≠ λEF + AF and (−λEB + AB)^T ≠ λEB + AB. There are numerous other linearizations.

SLIDE 46


Vector space L1(P) of linearizations – Motivation

A systematic way to construct linearizations that allow for the preservation of structure and/or are better conditioned than the companion forms.

[Mackey, Mackey, Mehl, Mehrmann, SIMAX 2006] = [4M]

P(λ)x = Σ_{i=0}^{ℓ} λ^i Pi x  ⇒  linearization of size ℓn × ℓn:

$$
\underbrace{\left(\lambda\begin{bmatrix}P_\ell & & & \\ & I_n & & \\ & & \ddots & \\ & & & I_n\end{bmatrix}
+\begin{bmatrix}P_{\ell-1} & P_{\ell-2} & \cdots & P_1 & P_0\\ -I_n & & & & \\ & \ddots & & & \\ & & & -I_n & 0\end{bmatrix}\right)}_{L_1(\lambda)}
\begin{bmatrix}\lambda^{\ell-1}x\\ \lambda^{\ell-2}x\\ \vdots\\ \lambda x\\ x\end{bmatrix}
=\begin{bmatrix}P(\lambda)x\\ 0\\ \vdots\\ 0\end{bmatrix}.
$$
slide-49
SLIDE 49

Vector space L1(P) of linearizations – Motivation

Thus

$$
L_1(\lambda)\begin{bmatrix}\lambda^{\ell-1}x\\ \lambda^{\ell-2}x\\ \vdots\\ \lambda x\\ x\end{bmatrix}
=\begin{bmatrix}P(\lambda)x\\ 0\\ \vdots\\ 0\end{bmatrix}
\iff
L_1(\lambda)\,(\Lambda_\ell\otimes I_n)\,x = e_1\otimes P(\lambda)x,
$$

as

$$
\begin{bmatrix}\lambda^{\ell-1}x\\ \lambda^{\ell-2}x\\ \vdots\\ \lambda x\\ x\end{bmatrix}
=\left(\begin{bmatrix}\lambda^{\ell-1}\\ \lambda^{\ell-2}\\ \vdots\\ \lambda\\ 1\end{bmatrix}\otimes I_n\right)x
=(\Lambda_\ell\otimes I_n)\,x
\qquad\text{and}\qquad
\begin{bmatrix}P(\lambda)x\\ 0\\ \vdots\\ 0\end{bmatrix}=e_1\otimes P(\lambda)x.
$$
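The identity above can be checked numerically. The following is a minimal sketch (not from the slides) for a random quadratic P(λ) = λ²P2 + λP1 + P0 with its first companion form; all data are assumed random test matrices:

```python
import numpy as np

# Check: L1(lam) (Lambda_2 ⊗ I_n) x = e1 ⊗ P(lam) x for the first
# companion form of a random quadratic matrix polynomial.
rng = np.random.default_rng(0)
n, lam = 4, 0.7
P0, P1, P2 = (rng.standard_normal((n, n)) for _ in range(3))

E1 = np.block([[P2, np.zeros((n, n))], [np.zeros((n, n)), np.eye(n)]])
A1 = np.block([[P1, P0], [-np.eye(n), np.zeros((n, n))]])
L1 = lam * E1 + A1

x = rng.standard_normal(n)
Lam = np.array([lam, 1.0])                    # Lambda_2 = [lam, 1]^T
lhs = L1 @ np.kron(Lam, x)                    # L1(lam) (Lambda ⊗ I) x
P = lam**2 * P2 + lam * P1 + P0
rhs = np.concatenate([P @ x, np.zeros(n)])    # e1 ⊗ P(lam) x
print(np.allclose(lhs, rhs))                  # True
```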
slide-52
SLIDE 52

Vector space L1(P) of linearizations

Generalize L1(λ) · (Λℓ ⊗ In) = e1 ⊗ P(λ) to L(λ) · (Λℓ ⊗ In) = v ⊗ P(λ) for L(λ) = λE + A.

Definition [Ansatz space] [4M]

L1(P) = {L(λ) = λE + A | E, A ∈ Rℓn×ℓn, L(λ) · (Λℓ ⊗ In) = v ⊗ P(λ) for some ansatz vector v ∈ Rℓ}.

Theorem [4M],[FS-1]

L1(P) is a vector space over R with dim L1(P) = ℓ(ℓ − 1)n² + ℓ. Almost all pencils in L1(P) are strong linearizations of P(λ). L(λ) = [v ⊗ In W] L1(λ) for v ≠ 0 and an arbitrary W ∈ Rℓn×(ℓ−1)n is a strong linearization of P(λ) if [v ⊗ In W] is nonsingular. A similar derivation for the second companion form L2(λ) gives L2(P). There do exist linearizations that are neither in L1(P) nor in L2(P).
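The ansatz equation can be verified directly for a pencil of the form L(λ) = [v ⊗ In W] L1(λ). A sketch with assumed random data (quadratic case, ℓ = 2):

```python
import numpy as np

# Check: L(lam) = [v ⊗ I_n  W] L1(lam) satisfies the ansatz equation
# L(lam) (Lambda ⊗ I) x = v ⊗ P(lam) x, i.e. L lies in L1(P).
rng = np.random.default_rng(1)
n, lam = 3, -0.4
P0, P1, P2 = (rng.standard_normal((n, n)) for _ in range(3))
E1 = np.block([[P2, np.zeros((n, n))], [np.zeros((n, n)), np.eye(n)]])
A1 = np.block([[P1, P0], [-np.eye(n), np.zeros((n, n))]])

v = np.array([2.0, -1.0])                     # ansatz vector, v != 0
W = rng.standard_normal((2 * n, n))           # arbitrary block
T = np.hstack([np.kron(v.reshape(2, 1), np.eye(n)), W])
L = T @ (lam * E1 + A1)                       # pencil in L1(P)

x = rng.standard_normal(n)
P = lam**2 * P2 + lam * P1 + P0
lhs = L @ np.kron(np.array([lam, 1.0]), x)
rhs = np.kron(v, P @ x)                       # v ⊗ P(lam) x
print(np.allclose(lhs, rhs))                  # True
```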
slide-59
SLIDE 59

Matrix Polynomials – (Strong) Linearization

Definition (Linearization)

A pencil L(λ) = λE + A with E, A ∈ Rkn×kn is called a linearization of P(λ) ∈ Πnℓ if there exist unimodular matrix polynomials E(λ), F(λ) such that

$$
\mathcal{E}(\lambda)\,L(\lambda)\,\mathcal{F}(\lambda)=\begin{bmatrix}P(\lambda) & \\ & I_{(k-1)n}\end{bmatrix}
$$

for some k ∈ N. A matrix polynomial E(λ) is unimodular if det E(λ) is a nonzero constant.

Theorem [Lancaster, Psarrakos, Report 2005]

For regular polynomials P(λ): any linearization preserves the Jordan structure of all finite eigenvalues; a strong linearization also preserves the Jordan structure of the eigenvalue ∞.

Example:

$$
\lambda P_1+P_0=\lambda\begin{bmatrix}4 & 5\\ 0 & 0\end{bmatrix}+\begin{bmatrix}-1 & 2\\ 0 & 3\end{bmatrix}
\;\Rightarrow\;\lambda_1=\tfrac{1}{4},\quad \lambda_2=\tfrac{3}{0}=\infty.
$$
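The 2×2 example can be checked numerically (the matrix entries are as reconstructed above): λ1 = 1/4 makes the pencil singular, and the singular leading coefficient P1 signals an eigenvalue at ∞.

```python
import numpy as np

# Check of the 2x2 example pencil lam*P1 + P0 (entries as reconstructed):
# det(0.25*P1 + P0) = 0  -> finite eigenvalue lam1 = 1/4,
# det(P1) = 0            -> eigenvalue at infinity.
P1 = np.array([[4.0, 5.0], [0.0, 0.0]])
P0 = np.array([[-1.0, 2.0], [0.0, 3.0]])

print(np.isclose(np.linalg.det(0.25 * P1 + P0), 0.0))  # True
print(np.isclose(np.linalg.det(P1), 0.0))              # True
```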
slide-62
SLIDE 62

Vector space L1(P) of linearizations and Approach 1

Freund considers

EF (d/dt) z(t) + AF z(t) = BF u(t),  y(t) = DF u(t) + CF z(t).

Interpret Freund’s approach in terms of the first companion form L1(λ) = λE1 + A1:

E1 (d/dt) z̃(t) + A1 z̃(t) = B1 u(t),  y(t) = DF u(t) + C1 z̃(t),

with z̃(t) = PT z(t), B1 = PT BF, C1 = CF P, as L1(λ) = λE1 + A1 = λPT EF P + PT AF P with P the block permutation matrix P = antidiag(In, …, In), which reverses the block order.
slide-64
SLIDE 64

Vector space L1(P) of linearizations and Approach 1

Approach 1 is based on the Krylov subspace induced by M = (L1(s0))−1E1 and R = (L1(s0))−1B1. All linearizations in L1 can be written as

L(λ) = [v ⊗ In W] L1(λ) = T L1(λ) = λTE1 + TA1

with v ∈ Rℓ, W ∈ Rℓn×(ℓ−1)n such that T = [v ⊗ In W] is nonsingular. As

(TE1) (d/dt) z(t) + (TA1) z(t) = (TB1) u(t)

and

(L(s0))−1(TE1) = (L1(s0))−1E1 = M,  (L(s0))−1(TB1) = (L1(s0))−1B1 = R,

all linearizations in L1 will (theoretically) yield the same reduced order system. A similar observation holds for Approach 2.
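The invariance of M and R under premultiplication by T can be confirmed numerically; a sketch with assumed random data for a quadratic companion pencil:

```python
import numpy as np

# Check: premultiplying the companion pencil by a nonsingular T leaves
# M = L1(s0)^{-1} E1 and R = L1(s0)^{-1} B1 unchanged.
rng = np.random.default_rng(2)
n, s0 = 3, 1.5
P0, P1, P2 = (rng.standard_normal((n, n)) for _ in range(3))
E1 = np.block([[P2, np.zeros((n, n))], [np.zeros((n, n)), np.eye(n)]])
A1 = np.block([[P1, P0], [-np.eye(n), np.zeros((n, n))]])
B1 = np.vstack([rng.standard_normal((n, 1)), np.zeros((n, 1))])

T = rng.standard_normal((2 * n, 2 * n))       # nonsingular (a.s.)
M = np.linalg.solve(s0 * E1 + A1, E1)
R = np.linalg.solve(s0 * E1 + A1, B1)
M_T = np.linalg.solve(T @ (s0 * E1 + A1), T @ E1)
R_T = np.linalg.solve(T @ (s0 * E1 + A1), T @ B1)
print(np.allclose(M, M_T) and np.allclose(R, R_T))   # True
```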
slide-68
SLIDE 68

Vector space L1(P) – Structured Linearizations

Gyroscopic system P(λ) = P(−λ)T ∈ Πn2:

P(λ) = λ²M + λG + K,  M = MT, G = −GT, K = KT,  M, G, K ∈ Rn×n.

Companion form in L1(P):

$$
L_1(\lambda)=\lambda\begin{bmatrix}M & \\ & I\end{bmatrix}+\begin{bmatrix}G & K\\ -I & \end{bmatrix}
$$

is not structure preserving, as L1(λ) ≠ L1(−λ)T.

Structured linearization in L1(P):

$$
L(\lambda)=\lambda\begin{bmatrix} & -M\\ M & G\end{bmatrix}+\begin{bmatrix}M & \\ & K\end{bmatrix}\in L_1(P)
$$

is a structure-preserving linearization (L(λ) = L(−λ)T).
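Both properties of the structured pencil can be checked numerically. In this sketch (assumed random symmetric/skew-symmetric data) the pencil also satisfies the ansatz equation with ansatz vector e2:

```python
import numpy as np

# Check: L(lam) = lam*[[0,-M],[M,G]] + [[M,0],[0,K]] satisfies
# L(lam) = L(-lam)^T and L(lam) (Lambda ⊗ I) x = e2 ⊗ P(lam) x.
rng = np.random.default_rng(3)
n, lam = 3, 0.9
M = rng.standard_normal((n, n)); M = M + M.T      # symmetric
G = rng.standard_normal((n, n)); G = G - G.T      # skew-symmetric
K = rng.standard_normal((n, n)); K = K + K.T      # symmetric

Z = np.zeros((n, n))
E = np.block([[Z, -M], [M, G]])
A = np.block([[M, Z], [Z, K]])
L = lambda t: t * E + A

print(np.allclose(L(lam), L(-lam).T))             # True: structure preserved

x = rng.standard_normal(n)
P = lam**2 * M + lam * G + K
lhs = L(lam) @ np.kron(np.array([lam, 1.0]), x)
rhs = np.kron(np.array([0.0, 1.0]), P @ x)        # e2 ⊗ P(lam) x
print(np.allclose(lhs, rhs))                      # True
```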
slide-71
SLIDE 71

Vector space L1(P) – Structured Linearizations

Robot P(λ) = P(−λ)T ∈ Πn4:

P(λ) = λ⁴P4 + λ³P3 + λ²P2 + λP1 + P0,  Pi = (−1)^i PiT,  Pi ∈ Rn×n, i = 0, …, 4.

Companion form in L1(P):

$$
L_1(\lambda)=\lambda\begin{bmatrix}P_4 & & & \\ & I_n & & \\ & & I_n & \\ & & & I_n\end{bmatrix}
+\begin{bmatrix}P_3 & P_2 & P_1 & P_0\\ -I_n & & & \\ & -I_n & & \\ & & -I_n & \end{bmatrix}
$$

Structured linearizations in L1(P), different from those in [4M]: a structure-preserving pencil L(λ) = λE + A ∈ L1(P) with L(λ) = L(−λ)T (so ET = −E and AT = A) can be assembled from the coefficient blocks ±Pi and differences such as P1 − P3, P0 − P2 and P2 − P0.
slide-74
SLIDE 74

Vector space L1(P) – Structured Linearizations

Test data (MATLAB):

n = 100;
P0 = 1/100*gallery('poisson',10);
P1 = rand(100);  P1 = P1 - P1';
P2 = randn(100); P2 = (P2 + P2')/30;
P3 = randn(100); P3 = P3 - P3';
P4 = eye(n);

L1(λ) and L(λ) may be very differently conditioned.
slide-76
SLIDE 76

Vector space L1(P) – Structured Linearizations

Test data (MATLAB):

n = 100;
P0 = 1/100*gallery('poisson',10);
P1 = rand(100);  P1 = P1 - P1';
P2 = randn(100); P2 = (P2 + P2')/5;
P3 = randn(100); P3 = P3 - P3';
P4 = .5*gallery('poisson',10);

L1(λ) and L(λ) may be very differently conditioned. L(λ) is not (block) sparse, while L1(λ) is.
slide-78
SLIDE 78

Structured Linearization not in L1(P)

Robot P(λ) = P(−λ)T ∈ Πn4:

P(λ) = λ⁴P4 + λ³P3 + λ²P2 + λP1 + P0,  Pi = (−1)^i PiT,  Pi ∈ Rn×n, i = 0, …, 4.

(Structured) Linearization not in L1(P):

$$
L(\lambda)=\begin{bmatrix}
P_4 & & & I & \\
 & -P_2-\lambda P_3 & & \lambda I & I\\
 & & P_0+\lambda P_1 & & \lambda I\\
I & -\lambda I & & & \\
 & I & -\lambda I & &
\end{bmatrix}=\lambda E+A.
$$

Note: E, A ∈ R5n×5n! This is a linearization, as V(λ)L(λ)U(λ) = diag(I4n, P(λ)) for unimodular matrix polynomials V(λ), U(λ) with det U(λ) = det V(λ) = 1, whose entries are built from In, λIn, λ²In and λ-polynomials in P2, P3, P4 (e.g. λ²P4, λ²P4 + λP3 + P2, λ³P4 + λ²P3 + λP2).
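The 5n × 5n pencil above (block placement as reconstructed) can be checked numerically: with the left/right multipliers from the next slides it compresses to P(λ), and it is structured for P4, P2, P0 symmetric, P3, P1 skew-symmetric. A sketch with assumed random data:

```python
import numpy as np

# Check: the 5n x 5n pencil compresses to P(lam) under the indicated
# multipliers and satisfies L(lam) = L(-lam)^T.
rng = np.random.default_rng(4)
n, lam = 2, 0.8
S = lambda X: X + X.T      # symmetrize
Wk = lambda X: X - X.T     # skew-symmetrize
P0, P2, P4 = (S(rng.standard_normal((n, n))) for _ in range(3))
P1, P3 = (Wk(rng.standard_normal((n, n))) for _ in range(2))
I, Z = np.eye(n), np.zeros((n, n))

def L(t):
    return np.block([
        [P4, Z, Z, I, Z],
        [Z, -P2 - t * P3, Z, t * I, I],
        [Z, Z, P0 + t * P1, Z, t * I],
        [I, -t * I, Z, Z, Z],
        [Z, I, -t * I, Z, Z],
    ])

print(np.allclose(L(lam), L(-lam).T))              # True: structured

left = np.kron(np.array([lam**2, -lam, 1.0, 0.0, 0.0]), I)
right = np.kron(np.array([lam**2, lam, 1.0, 0.0, 0.0]).reshape(5, 1), I)
P = lam**4 * P4 + lam**3 * P3 + lam**2 * P2 + lam * P1 + P0
print(np.allclose(left @ L(lam) @ right, P))       # True
```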
slide-81
SLIDE 81

Block Kronecker Ansatz space Gr+1

Definition [Block Kronecker ansatz space] [FS-2]

Let P(λ) ∈ Πnℓ with ℓ = r + s + 1. The block Kronecker ansatz space Gr+1(P) is the set of all ℓn × ℓn matrix pencils L(λ) that satisfy the block Kronecker ansatz equation

$$
\begin{bmatrix}\begin{bmatrix}\lambda^{r}I_n & \cdots & \lambda I_n & I_n\end{bmatrix} & \\ & I_{sn}\end{bmatrix}
\begin{bmatrix}L_{11}(\lambda) & L_{12}(\lambda)\\ L_{21}(\lambda) & L_{22}(\lambda)\end{bmatrix}
\begin{bmatrix}\begin{bmatrix}\lambda^{s}I_n\\ \vdots\\ \lambda I_n\\ I_n\end{bmatrix} & \\ & I_{rn}\end{bmatrix}
=\begin{bmatrix}\alpha P(\lambda) & \\ & 0\end{bmatrix}.
$$

Gr+1(P) is a vector space over R of dimension (ℓ − 1)ℓn² + 1. [FS-2] Thus, L1(P) ≠ Gr+1(P). Almost all pencils in Gr+1(P) are strong linearizations of P(λ). [FS-2]
slide-85
SLIDE 85

Higher order system and block Kronecker linearizations

Robot P(λ) ∈ Πn4:

P4 d⁴/dt⁴ x(t) + P3 d³/dt³ x(t) + P2 d²/dt² x(t) + P1 d/dt x(t) + P0 x(t) = Bu(t)
Du(t) + C3 d³/dt³ x(t) + C2 d²/dt² x(t) + C1 d/dt x(t) + C0 x(t) = y(t)

The linearization

$$
L(\lambda)=\lambda E+A=\begin{bmatrix}
P_4 & & & I & \\
 & -P_2-\lambda P_3 & & \lambda I & I\\
 & & P_0+\lambda P_1 & & \lambda I\\
I & -\lambda I & & & \\
 & I & -\lambda I & &
\end{bmatrix}
$$

does not give an equivalent first order ODE of the form E (d/dt) z(t) + A z(t) = Bu(t), as

$$
\begin{bmatrix}\lambda^{2}I_n & -\lambda I_n & I_n & 0 & 0\end{bmatrix}\,L(\lambda)\,
\begin{bmatrix}\lambda^{2}I_n\\ \lambda I_n\\ I_n\\ 0\\ 0\end{bmatrix}=P(\lambda).
$$
slide-88
SLIDE 88

Block Kronecker Ansatz space Gr+1

In L1 all linearizations are based on L1(λ); the linearizations in Gr+1 are based on

$$
L_K(\lambda)=\lambda E_K+A_K=\begin{bmatrix}\Sigma_r(\lambda) & L_r(\lambda)^{T}\\ L_s(\lambda) & 0\end{bmatrix},
$$

where the first block row carries the scaled coefficients λαPℓ + αPℓ−1, αPℓ−2, …, αP0 and the remaining block rows consist of −In / λIn entries, with ℓ = r + s + 1, Σr(λ) ∈ C[λ](r+1)n×(s+1)n, and Lj(λ) ∈ C[λ]jn×(j+1)n.
slide-89
SLIDE 89

Block Kronecker Ansatz space Gr+1

We can find BK, CK such that

G(s) = D + Σ_{j=0}^{ℓ−1} Cj (P(s))−1 B = DK + CK (LK(s))−1 BK.

Introduce a shift s0 ∈ C such that LK(s0) = s0EK + AK is nonsingular. Then

G(s) = DK + CK (LK(s))−1 BK = DK + CK (I + (s − s0)MK)−1 RK

with MK = (LK(s0))−1EK, RK = (LK(s0))−1BK.

Compute a basis of Ks(MK, RK). Represent the basis in block form [W1; W2; …; Wℓ], Wj ∈ Cn×r. Generate the reduced order higher order system via projection with V, the matrix representing an orthonormal basis of span{Wr+1}.
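The projection step can be sketched for a generic pencil. This is a minimal illustration with assumed random data in first-order form (standing in for LK, EK, AK, BK, CK): a one-sided Krylov projection whose basis contains R reproduces the transfer function value at the shift s0.

```python
import numpy as np

# Sketch: one-sided Galerkin projection with an orthonormal basis V of
# span{R, M R}, M = L(s0)^{-1} E, R = L(s0)^{-1} B, interpolates
# G(s) = C (s E + A)^{-1} B at s = s0.
rng = np.random.default_rng(5)
N, m, p, s0 = 8, 2, 2, 0.3
E, A = rng.standard_normal((N, N)), rng.standard_normal((N, N))
B, C = rng.standard_normal((N, m)), rng.standard_normal((p, N))

L0 = s0 * E + A
R = np.linalg.solve(L0, B)
M = np.linalg.solve(L0, E)
V, _ = np.linalg.qr(np.hstack([R, M @ R]))    # orthonormal Krylov basis

Er, Ar = V.T @ E @ V, V.T @ A @ V             # projected pencil
Br, Cr = V.T @ B, C @ V

G = C @ np.linalg.solve(s0 * E + A, B)
Gr = Cr @ np.linalg.solve(s0 * Er + Ar, Br)
print(np.allclose(G, Gr))                     # True: interpolation at s0
```

The interpolation property holds because R lies in the range of V, so the reduced resolvent reproduces the full one when applied to B at s0.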
slide-93
SLIDE 93

Block Kronecker Ansatz space Gr+1

Any linearization L̃K in Gr+1 can be expressed as

  L̃K(λ) = T1 LK(λ) T2

with T1 = [I(r+1)n, B1; 0, C1], T2 = [I(s+1)n, 0; B2, C2] and B1 ∈ R(r+1)n×sn, B2 ∈ Rrn×(s+1)n, C1 ∈ Rsn×sn, C2 ∈ Rrn×rn.

Then G(s) = DK + C̃K (L̃K(s))−1 B̃K with C̃K = CK T2 and B̃K = T1 BK, and

G(s) = DK + C̃K (I + (s − s0) M̃K)−1 R̃K

with M̃K = (L̃K(s0))−1 T1 EK T2 = T2−1 MK T2 and R̃K = (L̃K(s0))−1 B̃K = T2−1 RK.

Thus, Ks(M̃K, R̃K) = T2−1 Ks(MK, RK).

As before: compute a basis of Ks(M̃K, R̃K), represent it in block form with blocks Wj ∈ Cn×r, j = 1, …, ℓ, and generate the reduced order higher order system via projection with V, the matrix representing an orthonormal basis of span{Wr+1}.
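The similarity relation between the original and the transformed Krylov quantities is easy to check numerically. The Python/NumPy sketch below (illustrative names only; here T1, T2 are arbitrary invertible matrices, whereas in Gr+1 they carry the block structure described above) forms the transformed pencil and confirms M̃K = T2−1 MK T2 and R̃K = T2−1 RK, which implies the Krylov space relation.

```python
import numpy as np

rng = np.random.default_rng(1)
N, m, s0 = 8, 2, 0.5
E  = rng.standard_normal((N, N))
A  = rng.standard_normal((N, N))
BK = rng.standard_normal((N, m))
T1 = rng.standard_normal((N, N))   # generic invertible transformations
T2 = rng.standard_normal((N, N))

# original shifted-and-inverted quantities
L0 = s0 * E + A
MK = np.linalg.solve(L0, E)
RK = np.linalg.solve(L0, BK)

# transformed pencil Lt(s) = T1 (s E + A) T2 and its quantities
Lt0 = T1 @ L0 @ T2
Mt = np.linalg.solve(Lt0, T1 @ E @ T2)   # should equal T2^{-1} MK T2
Rt = np.linalg.solve(Lt0, T1 @ BK)       # should equal T2^{-1} RK

T2inv = np.linalg.inv(T2)
# the Krylov blocks transform the same way: Mt^k Rt = T2^{-1} MK^k RK
ok = all(np.allclose(np.linalg.matrix_power(Mt, k) @ Rt,
                     T2inv @ np.linalg.matrix_power(MK, k) @ RK, atol=1e-8)
         for k in range(4))
print(np.allclose(Mt, T2inv @ MK @ T2), np.allclose(Rt, T2inv @ RK), ok)
# True True True
```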
slide-98
SLIDE 98

Four different Linearizations for Robot Example

Robot example: P(λ) ∈ Πn4, i.e. a fourth-order system

P4 (d⁴/dt⁴)x(t) + P3 (d³/dt³)x(t) + P2 (d²/dt²)x(t) + P1 (d/dt)x(t) + P0 x(t) = B u(t),  Pi = (−1)^i Pi^T,
y(t) = D u(t) + C3 (d³/dt³)x(t) + C2 (d²/dt²)x(t) + C1 (d/dt)x(t) + C0 x(t),

with coefficients generated in MATLAB as

P0 = 1/100*gallery('poisson',10);
P1 = rand(100);  P1 = P1 - P1';
P2 = randn(100); P2 = (P2 + P2')/5;
P3 = randn(100); P3 = P3 - P3';
P4 = .5*gallery('poisson',10);
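The alternating structure Pi = (−1)^i Pi^T (even coefficients symmetric, odd coefficients skew-symmetric) is what the structured linearizations exploit. A rough Python/NumPy translation of the MATLAB snippet above, with a hand-built 2D Poisson matrix standing in for gallery('poisson',10):

```python
import numpy as np

def poisson(m):
    """m^2 x m^2 2D Poisson matrix, analogous to MATLAB's gallery('poisson',m)."""
    T = (2*np.eye(m) - np.diag(np.ones(m-1), 1)
                     - np.diag(np.ones(m-1), -1))       # 1D Laplacian tridiag(-1,2,-1)
    return np.kron(np.eye(m), T) + np.kron(T, np.eye(m))

rng = np.random.default_rng(2)
n = 100
P0 = poisson(10) / 100                                   # symmetric
P1 = rng.random((n, n));          P1 = P1 - P1            .T  # skew-symmetric
P2 = rng.standard_normal((n, n)); P2 = (P2 + P2.T) / 5   # symmetric
P3 = rng.standard_normal((n, n)); P3 = P3 - P3.T         # skew-symmetric
P4 = 0.5 * poisson(10)                                   # symmetric

# verify the structure: Pi = (-1)^i Pi^T for all coefficients
for i, Pi in enumerate([P0, P1, P2, P3, P4]):
    assert np.allclose(Pi, (-1)**i * Pi.T)
print("P(lambda) satisfies Pi = (-1)^i Pi^T")
```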
slide-99
SLIDE 99

Four different Linearizations for Robot Example

[figure not reproduced]
slide-100
SLIDE 100

Four different Linearizations for Robot Example

Robot example, the same fourth-order system as above (Pi = (−1)^i Pi^T), with a second choice of coefficients:

P0 = 1/100*gallery('poisson',10);
P1 = rand(100);  P1 = P1 - P1';
P2 = randn(100); P2 = (P2 + P2')/30;
P3 = randn(100); P3 = P3 - P3';
P4 = eye(n);
slide-101
SLIDE 101

Four different Linearizations for Robot Example

[figure not reproduced]
slide-102
SLIDE 102

Eigenvalues of Robot Example

[figure not reproduced]
slide-103
SLIDE 103

MOR for Robot Example, expansion points ±0.5i

[figure not reproduced]
slide-106
SLIDE 106

Conclusions

  • Galerkin projection based MOR for higher order LTI systems: the projection is computed from a linearization of the higher order LTI system in such a way that a higher order reduced system can be recovered.
  • The vector spaces L1(P) and Gr+1(P) allow one to generate an abundance of linearizations. Different linearizations have different conditioning. It is not (yet) clear how to choose an optimally conditioned linearization; for the structured robot example, the structured linearizations seem to be better conditioned.
  • The LU decomposition of the linearization needs to be computed efficiently. For block-dense linearizations, it can be computed in about O(ℓ³n³) flops; for the structured robot example, the LU decomposition of the structured block Kronecker linearization can be computed in just O(n³ + ℓ²n²) flops.
  • Open question: what are the dominant poles of a higher order system?

Thank you for your attention!
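As a concrete instance of the "abundance of linearizations" mentioned in the conclusions, the Python/NumPy sketch below (illustrative, not the slides' code) builds the first companion linearization of a random degree-4 matrix polynomial and checks the defining property of a linearization: every eigenvalue of the pencil s EK + AK makes P(s) singular.

```python
import numpy as np

rng = np.random.default_rng(3)
n, l = 3, 4
P = [rng.standard_normal((n, n)) for _ in range(l + 1)]   # P(s) = sum_j s^j P[j]

# first companion linearization L(s) = s*E + A (one of many possible choices)
E = np.zeros((l*n, l*n)); A = np.zeros((l*n, l*n))
E[:n, :n] = P[l]
for j in range(1, l):
    E[j*n:(j+1)*n, j*n:(j+1)*n] = np.eye(n)
for j in range(l):
    A[:n, j*n:(j+1)*n] = P[l-1-j]          # first block row: P_{l-1}, ..., P_0
for j in range(1, l):
    A[j*n:(j+1)*n, (j-1)*n:j*n] = -np.eye(n)

# pencil eigenvalues: det(lam*E + A) = 0  <=>  lam is an eigenvalue of -E^{-1}A
lam = np.linalg.eigvals(np.linalg.solve(E, -A))

# at each pencil eigenvalue, P(lam) must be (numerically) singular
Pval = lambda s: sum(s**j * P[j] for j in range(l + 1))
sigmas = [np.linalg.svd(Pval(s), compute_uv=False) for s in lam]
print(max(sv[-1] / sv[0] for sv in sigmas))   # tiny: the pencil linearizes P
```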
slide-117
SLIDE 117

Main References

[FS-1]: H. Faßbender and P. Saltenberger, On vector spaces of linearizations for matrix polynomials in orthogonal bases. Linear Algebra and its Applications 525 (2017), pp. 59–83.
[FS-2]: H. Faßbender and P. Saltenberger, Block Kronecker Ansatz Spaces for Matrix Polynomials. Linear Algebra and its Applications 542 (2018), pp. 118–148.
[Freund BIT 2005]: R. Freund, Krylov subspaces associated with higher-order linear dynamical systems. BIT 45 (2005), pp. 495–526.
[4M]: D. S. Mackey, N. Mackey, C. Mehl and V. Mehrmann, Vector spaces of linearizations for matrix polynomials. SIAM J. Matrix Anal. Appl. 28 (2006), pp. 971–1004.
[LBLW 2011]: B. Li, L. Bao, Y. Lin, Y. Wei, Model-order reduction of kth order MIMO dynamical systems using block kth order Krylov subspaces. International Journal of Computer Mathematics 88(1) (2011), pp. 150–162.

Some results from [FS-2] have been discovered independently in: M. Bueno, F. Dopico, J. Pérez, R. Saavedra, B. Zykoski, A unified approach to Fiedler-like pencils via strong block minimal bases pencils. arXiv preprint arXiv:1611.07170v1.
slide-118
SLIDE 118

More References

Dopico, Lawrence, Pérez, Van Dooren, Block Kronecker linearizations of matrix polynomials and their backward errors. MIMS EPrint 2016.34.
Freund, Padé-type model reduction of second-order and higher-order linear dynamical systems. In Benner, Mehrmann, Sorensen (eds.), Dimension Reduction of Large-Scale Systems, Springer, 2005.
Lancaster, Psarrakos, A Note on Weak and Strong Linearizations of Regular Matrix Polynomials. Numerical Analysis Report No. 470, 2005.
Lin, Bao, Wei, Model-order reduction of large-scale kth order linear dynamical systems via a kth order Arnoldi method. International Journal of Computer Mathematics 87(2) (2010), pp. 435–453.
Mackey, Mackey, Mehl, Mehrmann, Structured polynomial eigenvalue problems: good vibrations from good linearizations. SIAM J. Matrix Anal. Appl. 28 (2006).
Mehrmann, Schröder, Simoncini, An implicitly-restarted Krylov subspace method for real symmetric/skew-symmetric eigenproblems. Linear Algebra Appl., 2009.
Mehrmann, Watkins, Structure-preserving methods for computing eigenpairs of large sparse skew-Hamiltonian/Hamiltonian pencils. SIAM J. Sci. Comput. 22 (2001).