CAD&CG Bundle Adjustment


SLIDE 1

Bundle Adjustment

Guofeng Zhang (章国锋), State Key Lab of CAD&CG, Zhejiang University

SLIDE 2

Bundle Adjustment

• Jointly optimize all cameras and points

$$\{C_1,\dots,C_{N_c},X_1,\dots,X_{N_p}\}^* = \arg\min_{C_1,\dots,C_{N_c},\,X_1,\dots,X_{N_p}} \sum_{i,j} \big\| \pi(C_j, X_i) - x_{ij} \big\|^2$$

where $\pi(C_j, X_i)$ projects point $X_i$ into camera $C_j$ and $x_{ij}$ is the observed feature position.

Triggs, B., McLauchlan, P., Hartley, R., and Fitzgibbon, A. 1999. Bundle adjustment—a modern synthesis. In Proceedings of the International Workshop on Vision Algorithms: Theory and Practice, 298–372.
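To make the objective above concrete, here is a minimal sketch of the summed reprojection cost, assuming a simple pinhole camera parameterized by a rotation R, translation t, and focal length f (these names and the camera model are illustrative, not the parameterization used in the slides):

```python
import numpy as np

def reproject(R, t, f, X):
    """Project a 3D point X into a pinhole camera (R, t, f)."""
    Xc = R @ X + t                # point in camera coordinates
    return f * Xc[:2] / Xc[2]     # perspective division

def ba_cost(cameras, points, observations):
    """Sum of squared reprojection errors over all observations.

    observations: list of (i, j, x_ij) with point index i, camera index j,
    and observed 2D feature position x_ij.
    """
    cost = 0.0
    for i, j, x_ij in observations:
        R, t, f = cameras[j]
        r = reproject(R, t, f, points[i]) - x_ij
        cost += r @ r
    return cost
```

Bundle adjustment then minimizes this cost jointly over all camera and point parameters.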
SLIDE 3

Nonlinear Least Squares

• Gauss-Newton
• Levenberg-Marquardt

Gauss-Newton: starting from the current estimate $\hat{x}$, linearize the residual, $f(x) \approx f(\hat{x}) + J(\hat{x})(x - \hat{x})$:

$$x^* = \arg\min_x \|f(x)\|^2 \approx \arg\min_x \|f(\hat{x}) + J(\hat{x})(x - \hat{x})\|^2,$$

which leads to the normal equation

$$J^\top J\,\delta x = -J^\top f(\hat{x}),\qquad x^* = \hat{x} + \delta x.$$

Levenberg-Marquardt damps the step:

$$(J^\top J + \lambda I)\,\delta x = -J^\top f(\hat{x}).$$

$J$ is the Jacobian matrix of $f$; $J^\top J$ is a first-order approximation to the Hessian.
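As an illustration, a single damped Gauss-Newton step on a generic residual function might look like the sketch below; the residual and Jacobian are supplied as callables here, whereas a real BA solver would exploit the sparse structure discussed on the following slides:

```python
import numpy as np

def lm_step(residual, jacobian, x_hat, lam=0.0):
    """One damped Gauss-Newton step: solve (J^T J + lam*I) dx = -J^T f(x_hat).

    lam = 0 gives a plain Gauss-Newton step; lam > 0 gives a Levenberg-Marquardt step.
    """
    f = residual(x_hat)
    J = jacobian(x_hat)
    H = J.T @ J + lam * np.eye(x_hat.size)   # first-order approximation of the Hessian
    dx = np.linalg.solve(H, -J.T @ f)
    return x_hat + dx
```

In Levenberg-Marquardt, lam is typically increased when a step fails to reduce the cost and decreased when it succeeds.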

SLIDE 4

Sparse Bundle Adjustment

[Figure: sparsity pattern of the Hessian, with one block labeled "1 Camera" and one labeled "1 Point"]

Lourakis, M. I. A., and Argyros, A. A. 2009. SBA: A software package for generic sparse bundle adjustment. ACM Transactions on Mathematical Software, 36(1).

$$\arg\min_{C_1,\dots,C_{N_c},\,X_1,\dots,X_{N_p}} \sum_{i,j} \big\| \pi(C_j, X_i) - x_{ij} \big\|^2$$

SLIDE 5

Sparse Bundle Adjustment

• A simple example
  • 4 points
  • 3 cameras
  • All points are visible in all cameras

SLIDE 6

Sparse Bundle Adjustment

Every observation $x_{ij}$ (point $X_i$ seen in camera $C_j$) contributes one row block to the Jacobian $J$, containing

$$A_{ij} = \frac{\partial \pi(C_j, X_i)}{\partial C_j}\quad\text{in the columns of camera } C_j,\qquad
B_{ij} = \frac{\partial \pi(C_j, X_i)}{\partial X_i}\quad\text{in the columns of point } X_i,$$

and zeros everywhere else. For the 4-point / 3-camera example, $J$ stacks the 12 row blocks $[A_{ij}\ \ B_{ij}]$, $i = 1,\dots,4$, $j = 1,2,3$.

SLIDE 7

Sparse Bundle Adjustment

Normal equation: $J^\top J\,\delta_x = J^\top \epsilon$, where $\epsilon$ stacks the reprojection residuals $\epsilon_{ij}$. The left-hand side has the block structure

$$J^\top J = \begin{pmatrix} U & W \\ W^\top & V \end{pmatrix},\qquad
U = \operatorname{diag}(U_1, U_2, U_3),\quad V = \operatorname{diag}(V_1, V_2, V_3, V_4),$$

$$U_j = \sum_{i=1}^{4} A_{ij}^\top A_{ij},\qquad
V_i = \sum_{j=1}^{3} B_{ij}^\top B_{ij},\qquad
W_{ij} = A_{ij}^\top B_{ij}.$$

SLIDE 8

Sparse Bundle Adjustment

Normal equation: $J^\top J\,\delta_x = J^\top \epsilon$, where the update vector stacks all camera and point increments:

$$\delta_x = \big(\delta C_1^\top,\ \delta C_2^\top,\ \delta C_3^\top,\ \delta X_1^\top,\ \delta X_2^\top,\ \delta X_3^\top,\ \delta X_4^\top\big)^\top = \begin{pmatrix}\delta_C \\ \delta_X\end{pmatrix}.$$

SLIDE 9

Sparse Bundle Adjustment

Normal equation: $J^\top J\,\delta_x = J^\top \epsilon$, where the right-hand side splits into camera and point parts:

$$J^\top \epsilon = \begin{pmatrix}\epsilon_C \\ \epsilon_X\end{pmatrix},\qquad
\epsilon_{C_j} = \sum_{i=1}^{4} A_{ij}^\top \epsilon_{ij},\qquad
\epsilon_{X_i} = \sum_{j=1}^{3} B_{ij}^\top \epsilon_{ij},$$

with $\epsilon_{ij}$ the reprojection residual of point $X_i$ in camera $C_j$.

SLIDE 10

Sparse Bundle Adjustment

In block form, the normal equation $J^\top J\,\delta_x = J^\top \epsilon$ reads

$$\begin{pmatrix} U & W \\ W^\top & V \end{pmatrix}
\begin{pmatrix} \delta_C \\ \delta_X \end{pmatrix}
= \begin{pmatrix} \epsilon_C \\ \epsilon_X \end{pmatrix}.$$

Eliminating the point block with the Schur complement $S = U - W V^{-1} W^\top$ gives

$$S\,\delta_C = \epsilon_C - W V^{-1}\epsilon_X,\qquad
\delta_X = V^{-1}\big(\epsilon_X - W^\top \delta_C\big).$$

Schur complement: compute the cameras first (# cameras << # points), then back substitution for the points.
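A dense toy version of this reduction (ignoring the block bookkeeping and sparsity, purely to show the algebra; the function and argument names are illustrative):

```python
import numpy as np

def schur_solve(U, W, V, eps_C, eps_X):
    """Solve [[U, W], [W^T, V]] [dC; dX] = [eps_C; eps_X] via the Schur complement."""
    V_inv = np.linalg.inv(V)                 # V is block-diagonal in real SBA, so this is cheap
    S = U - W @ V_inv @ W.T                  # reduced camera system
    dC = np.linalg.solve(S, eps_C - W @ V_inv @ eps_X)
    dX = V_inv @ (eps_X - W.T @ dC)          # back substitution for the points
    return dC, dX
```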

SLIDE 11

Sparse Bundle Adjustment

• In general, NOT all points are visible in all cameras.
• A_ij = B_ij = 0 if the i-th point is invisible (or not matched) in the j-th camera.
• The sparser the structure, the larger the speed-up.

$$U_j = \sum_{i=1}^{4} A_{ij}^\top A_{ij},\qquad
V_i = \sum_{j=1}^{3} B_{ij}^\top B_{ij},\qquad
W_{ij} = A_{ij}^\top B_{ij},$$

where the terms of invisible observations are zero.
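A hedged sketch of how the blocks U_j, V_i, W_ij could be accumulated over only the visible observations (the data layout and names are assumptions for the illustration):

```python
import numpy as np
from collections import defaultdict

def accumulate_blocks(observations, jacobian_blocks, cam_dof=6, pt_dof=3):
    """Accumulate the non-zero blocks of J^T J from visible observations only.

    observations: iterable of (i, j) pairs, point i visible in camera j.
    jacobian_blocks: function (i, j) -> (A_ij, B_ij) of shapes (2, cam_dof), (2, pt_dof).
    """
    U = defaultdict(lambda: np.zeros((cam_dof, cam_dof)))   # U[j]
    V = defaultdict(lambda: np.zeros((pt_dof, pt_dof)))     # V[i]
    W = {}                                                  # W[(i, j)], only for visible pairs
    for i, j in observations:
        A, B = jacobian_blocks(i, j)
        U[j] += A.T @ A
        V[i] += B.T @ B
        W[(i, j)] = A.T @ B
    return U, V, W
```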

SLIDE 12

Related Works

• Hierarchical BA
  • Steedly et al. 2003, Snavely et al. 2008, Frahm et al. 2010
• Segment-based BA
  • Zhu et al. 2014, Zhang et al. 2016 (ENFT)
• Incremental BA
  • Kaess et al. 2008 (iSAM), Kaess et al. 2011 (iSAM2), Indelman et al. 2012 (iLBA), Ila et al. 2017 (SLAM++), Liu et al. 2017 (EIBA)
• Parallel BA
  • Ni et al. 2007, Wu et al. 2011 (PBA)

SLIDE 13

Segment-based Bundle Adjustment

Zhang, G., Liu, H., Dong, Z., et al. 2016. Efficient non-consecutive feature tracking for robust structure-from-motion. IEEE Transactions on Image Processing, 25(12), 5957-5970.

SLIDE 14

The Difficulties for Large-Scale SfM

• Global Bundle Adjustment
  • Huge number of variables
  • Memory limit
  • Time-consuming
• Iterative Local Bundle Adjustment
  • Large errors are difficult to propagate to the whole sequence.
  • Easily gets stuck in a local optimum.
• Pose Graph Optimization
  • May not sufficiently minimize the error.

SLIDE 15

Segment-based Progressive SfM

• Split a long sequence into multiple short sequences.
• Perform SfM for each sequence and align the results together.
• Detect the "split point" and further split the sequence if the reprojection error is large.
• Repeat the above procedure until the error is below a threshold (see the sketch below).
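A high-level sketch of this loop; run_sfm, align, reproj_error, and find_split_point are hypothetical stand-ins for the actual components, not names from the paper:

```python
def progressive_sfm(sequence, threshold, run_sfm, align, reproj_error, find_split_point):
    """Segment-based progressive SfM: split, reconstruct, align, and re-split until the error is small."""
    segments = [sequence]
    while True:
        recons = [run_sfm(seg) for seg in segments]
        model = align(recons)
        errors = [reproj_error(r) for r in recons]
        if max(errors) < threshold:
            return model
        # Further split the worst segment at the detected split point and repeat.
        worst = errors.index(max(errors))
        split = find_split_point(recons[worst])
        segments[worst:worst + 1] = [segments[worst][:split], segments[worst][split:]]
```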

SLIDE 16

Segment-based Progressive SfM

• Split Point Detection
  • Best minimize the reprojection error w.r.t. a, i.e. the steepest descent direction.
  • The inconsistency between two consecutive frames.

SLIDE 17

Split Point Detection

SLIDE 18

SFM on Garden Dataset

6 long video sequences, nearly 100,000 frames in total; feature matching took 74 minutes and SfM solving took 16 minutes (single-threaded), 17.7 fps on average. VisualSFM: SfM solving took 57 minutes (GPU-accelerated).

SLIDE 19

Comparison on Garden Dataset

ENFT-SFM VisualSFM ORB-SLAM

SLIDE 20

Comparison with ORB-SLAM in Garden 01 Sequence

ENFT-SLAM: Non-consecutive Track Matching + Segment-based BA
ORB-SLAM: Bag-of-words Place Recognition + Pose Graph Optimization + Traditional BA

SLIDE 21

Incremental BA in iSAM2 Based on Bayes Tree

Kaess, M., Johannsson, H., Roberts, R., Ila, V., Leonard, J. J., & Dellaert, F. (2012). iSAM2: Incremental smoothing and mapping using the Bayes tree. The International Journal of Robotics Research, 31(2), 216-235.

SLIDE 22

Incremental Bundle Adjustment

To benefit from the increased accuracy offered by relinearization in batch optimization:

• Fixed-lag / Sliding-window Approaches
• Keyframe-based Approaches
• Incremental Approaches (iSAM, iSAM2, our EIBA)

SLIDE 23

Kaess, M., Johannsson, H., Roberts, R., Ila, V., Leonard, J. J., & Dellaert, F. (2012). iSAM2: Incremental smoothing and mapping using the Bayes tree. The International Journal of Robotics Research, 31(2), 216-235.

Gaussian Factor Graph

[Figure: Gaussian factor graph. Factor types: loop constraint, a-priori constraint, kinematics measurement, projection measurement. Node types: state, landmark.]

SLIDE 24

Main Ideas of iSAM2

• Reduce fill-in: use the heuristic CCOLAMD algorithm to provide a suboptimal ordering for factorization (finding the optimal ordering is NP-hard).
• Encode with the Bayes tree: introduce the Bayes tree (a.k.a. directed clique tree) to encode the square root information matrix.
• Fluid relinearization: perform fluid relinearization when adding new factors or updating the linearization points, to avoid batch optimization.
• Partial state updates: perform partial state updates when solving the Bayes tree, so that a state variable is updated only when necessary.

SLIDE 25

One step: linearization

[Figure: factor graph → chordal Bayes net → Bayes tree, with cliques 𝑦2,𝑦3 (root), 𝑚1,𝑦1|𝑦2, and 𝑚2|𝑦3]

• Eliminate the factor graph using the CCOLAMD ordering (e.g. 𝑚1, 𝑚2, 𝑦1, 𝑦2, 𝑦3).
• Create the Bayes tree in reverse elimination order (e.g. 𝑦3, 𝑦2, 𝑦1, 𝑚2, 𝑚1).
• Add new factors/states and apply fluid relinearization (e.g. 𝑔(𝑦1, 𝑦3)).

SLIDE 26

One step: partial update

[Figure: Bayes tree with cliques 𝑦2,𝑦3 (root), 𝑚1,𝑦1|𝑦2, and 𝑚2|𝑦3]

Starting from the root clique, update all variables that change by more than a threshold.

SLIDE 27

Reordering with CCOLAMD / CHOLMOD

Kaess, M., Ranganathan, A., & Dellaert, F. (2008). iSAM: Incremental smoothing and mapping. IEEE Transactions on Robotics, 24(6), 1365-1378.

Reduce Fill-in

SLIDE 28

In Gaussian factor graphs, elimination is equivalent to sparse QR factorization of the measurement Jacobian.

[Figure: sparsity pattern of the measurement Jacobian 𝐾 over the variables 𝑚1, 𝑚2, 𝑦1, 𝑦2, 𝑦3]

SLIDE 29

In Gaussian factor graphs, elimination is equivalent to sparse QR factorization of the measurement Jacobian.

[Figure: sparsity pattern of the information matrix 𝐼 over the variables 𝑚1, 𝑚2, 𝑦1, 𝑦2, 𝑦3]

SLIDE 30

In Gaussian factor graphs, elimination is equivalent to sparse QR factorization of the measurement Jacobian.

[Figure: sparsity pattern of the square root information matrix 𝑆 over the variables 𝑚1, 𝑚2, 𝑦1, 𝑦2, 𝑦3]

There is no fill-in if we eliminate the factor graph using the elimination ordering 𝑚1, 𝑚2, 𝑦1, 𝑦2, 𝑦3. The resulting directed graph is called the chordal Bayes net.
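As a small numerical illustration of how the elimination ordering affects fill-in (this toy factor graph is made up for the example and is not the one from the slides), compare the number of non-zeros in the R factor of a QR factorization under two column orderings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparsity pattern of a toy measurement Jacobian; columns = [m1, m2, y1, y2, y3].
pattern = np.array([
    [1, 0, 1, 1, 0],   # projection factor on m1, y1, y2
    [0, 1, 0, 0, 1],   # projection factor on m2, y3
    [0, 0, 1, 1, 0],   # kinematics factor between y1 and y2
    [0, 0, 0, 1, 1],   # kinematics factor between y2 and y3
    [0, 0, 1, 0, 0],   # prior on y1
])
A = pattern * rng.uniform(0.5, 2.0, pattern.shape)   # random values avoid accidental cancellation

def nnz_of_R(A, order):
    """Number of non-zeros in R after QR factorization with the given column ordering."""
    _, R = np.linalg.qr(A[:, order])
    return int(np.sum(np.abs(R) > 1e-9))

print("nnz(R), landmarks eliminated first:", nnz_of_R(A, [0, 1, 2, 3, 4]))
print("nnz(R), poses eliminated first:    ", nnz_of_R(A, [2, 3, 4, 0, 1]))
```

On this toy problem the landmarks-first ordering yields a sparser R factor, which is the effect CCOLAMD tries to achieve heuristically on large problems.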

SLIDE 31

Encode with the Bayes Tree

[Figure: chordal Bayes net over the variables 𝑚1, 𝑚2, 𝑦1, 𝑦2, 𝑦3]

For each conditional density 𝑄(𝜄𝑗|𝑇𝑗) of the Bayes net, in reverse elimination order (i.e. 𝑦3, 𝑦2, 𝑦1, 𝑚2, 𝑚1), we create a Bayes tree.

SLIDE 32

Encode with the Bayes Tree

[Figure: Bayes tree with cliques 𝑦2,𝑦3 (root), 𝑚1,𝑦1|𝑦2, and 𝑚2|𝑦3]

The clique 𝑚1,𝑦1|𝑦2 of the Bayes tree encodes the conditional density 𝑄(𝑚1, 𝑦1|𝑦2): 𝑚1, 𝑦1 are called the frontal variables and 𝑦2 is called the separator.

SLIDE 33

Adding New Factors

[Figure: Bayes tree before (𝑦2,𝑦3; 𝑚1,𝑦1|𝑦2; 𝑚2|𝑦3) and after (𝑦1,𝑦2,𝑦3; 𝑚1|𝑦1,𝑦2; 𝑚2|𝑦3) the update]

ALGORITHM: Fluid relinearization when adding new factors.

• For each variable affected by new factors, remove the corresponding clique and all parents up to the root.
• Re-interpret the removed part as a factor graph.
• Add the new factors into the resulting factor graph.
• Re-order variables and eliminate the factor graph to recreate a top Bayes tree.
• Insert the orphaned sub-trees back into the new Bayes tree.

SLIDE 34

Example: adding a factor

[Figure: adding a new factor 𝑔(𝑦1, 𝑦3) and updating the Bayes tree]

• Remove the top of the Bayes tree and re-interpret it as a factor graph.
• Add the new factor 𝑔(𝑦1, 𝑦3).
• Reorder and re-eliminate to create a new Bayes tree.
• Insert the orphaned sub-tree back.

SLIDE 35

Kaess, M., Johannsson, H., Roberts, R., Ila, V., Leonard, J. J., & Dellaert, F. (2012). iSAM2: Incremental smoothing and mapping using the Bayes tree. The International Journal of Robotics Research, 31(2), 216-235.

Example of adding new states and factors. Information only propagates upwards.

SLIDE 36

Example of adding new states and factors. Information only propagates upwards.

Kaess, M., Johannsson, H., Roberts, R., Ila, V., Leonard, J. J., & Dellaert, F. (2012). iSAM2: Incremental smoothing and mapping using the Bayes tree. The International Journal of Robotics Research, 31(2), 216-235.

SLIDE 37

Kaess, M., Johannsson, H., Roberts, R., Ila, V., Leonard, J. J., & Dellaert, F. (2012). iSAM2: Incremental smoothing and mapping using the Bayes tree. The International Journal of Robotics Research, 31(2), 216-235.

Example of adding new states and factors. Information only propagates upwards.

SLIDE 38

While adding new states (always along with adding new factors), information only propagates upwards.

Constrained COLAMD

1. Force the most recently accessed variables to the end and still provide a good overall ordering.
2. Subsequent updates will then only affect a small part of the tree (the top of the Bayes tree).
3. Efficient in most cases, except for large loop closures.

SLIDE 39

Fluid Relinearization

ALGORITHM: Fluid relinearization when linearization points change (together with adding new factors).

1. For each affected variable, remove the corresponding clique and all parents up to the root.
2. Relinearize all factors required to recreate the top.
3. Add cached linear factors from orphans.
4. Re-order variables and eliminate the factor graph to create a new top Bayes tree.
5. Insert the orphaned sub-trees back into the new Bayes tree.

SLIDE 40

Partial State Updates

ALGORITHM: Starting from the root clique:

1. For the current clique: compute the update of the frontal variables from the local conditional density.
2. For all variables that change by more than a threshold: recursively process each descendant containing such a variable (a sketch of this recursion follows below).
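A minimal sketch of this recursion over a generic clique tree; the Clique structure and the solve_frontal callable are assumptions made for the illustration, not iSAM2 code:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Clique:
    frontals: List[str]                 # variables solved in this clique
    separator: List[str]                # variables shared with the parent
    children: List["Clique"] = field(default_factory=list)

def partial_update(clique: Clique, delta: Dict[str, float], threshold: float,
                   solve_frontal: Callable[[Clique, Dict[str, float]], Dict[str, float]]):
    """Update frontal variables, descending only where the change is significant."""
    changed = set()
    for var, new_value in solve_frontal(clique, delta).items():
        if abs(new_value - delta.get(var, 0.0)) > threshold:
            changed.add(var)
        delta[var] = new_value
    # Recurse only into sub-trees whose separator contains a significantly changed variable.
    for child in clique.children:
        if changed & set(child.separator):
            partial_update(child, delta, threshold, solve_frontal)
    return delta
```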

SLIDE 41

Kaess, M., Johannsson, H., Roberts, R., Ila, V., Leonard, J. J., & Dellaert, F. (2012). iSAM2: Incremental smoothing and mapping using the Bayes tree. The International Journal of Robotics Research, 31(2), 216-235.

SLIDE 42

Efficient Incremental BA

Liu, H., Li, C., Chen, G., et al. 2017. Robust Keyframe-based Dense SLAM with an RGB-D Camera. arXiv preprint arXiv:1711.05166.

SLIDE 43

Revisit Standard BA

• A regular BA energy function
• Convert the Huber norm by a re-weighting scheme
• Linearization
• Solving the normal equation

[Equations: reprojection error term and inverse depth error term; the Jacobian matrix is 3𝑜𝑦 × (6𝑜𝑑 + 3𝑜𝑞)]

SLIDE 44

Revisit Standard BA

• Step 1: Construct the normal equation
  • Compute and store the small non-zero block matrices 𝐕𝑗𝑗, 𝐖𝑘𝑘, 𝐗𝑗𝑘.
  • They do not need to be reconstructed from scratch; only new block matrices need to be added.
  • Sizes: 𝐕 is 𝑜𝑑 × 𝑜𝑑, 𝐖 is 𝑜𝑞 × 𝑜𝑞, 𝐗 is 𝑜𝑑 × 𝑜𝑞. 𝐗𝑗𝑘 is non-zero only if point 𝑘 is visible in camera 𝑗.

SLIDE 45

Revisit Standard BA

• Step 2: Marginalize points to construct the Schur complement
  • 𝐓 is also sparse, with a non-zero block (for camera pair 𝑗1, 𝑗2) if and only if cameras 𝑗1 and 𝑗2 share common points.

SLIDE 46

Revisit Standard BA

• Step 3: Update cameras
  • Use preconditioned conjugate gradient (PCG) to solve the reduced camera system.
  • PCG naturally leverages the sparseness of 𝐓.
• Step 4: Update points
  • Back substitution (see the sketch below).
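A hedged sketch of steps 3 and 4 using SciPy's conjugate gradient with a simple diagonal (Jacobi) preconditioner; S stands for the Schur complement (called 𝐓 in the slides), and the remaining arguments are assumed to come from the previous steps:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

def solve_cameras_then_points(S, rhs_cams, W, V_inv, eps_X):
    """Step 3: PCG on the reduced camera system; Step 4: back substitution for the points."""
    # Diagonal (Jacobi) preconditioner; a real solver would rather use block-Jacobi over camera blocks.
    M = diags(1.0 / S.diagonal())
    dC, info = cg(S, rhs_cams, M=M, maxiter=100)
    if info != 0:
        print("PCG did not fully converge, info =", info)
    dX = V_inv @ (eps_X - W.T @ dC)          # back substitution for the points
    return dC, dX
```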

SLIDE 47

Revisit Standard BA

• The number of observations in each keyframe is much larger than the number of cameras.
• Computation: Steps 1, 2 ≫ Step 3
  • Constructing the normal equation and the Schur complement takes much more time than the PCG iterations.
• Most variables are nearly unchanged (incremental reconstruction).
  • Most of the computation in steps 1, 2, 4 is unnecessary.
  • The contribution of most observations to the normal equation remains nearly the same.

SLIDE 48

Efficient Incremental BA (EIBA)

• Local BA vs. Global BA
  • Local BA: suboptimal, especially when the local map contains large error.
  • Global BA: accurate but slow, high latency, lots of unnecessary computation.
• Incremental BA
  • Makes maximum use of intermediate computation for efficiency.
  • Adaptively updates the affected keyframes for map refinement.

SLIDE 49

One iteration in EIBA

• Step 1: Update the normal equations and the Schur complement from the last iteration
  • Store the effect of … in …; initialize to 0 at first; re-compute only when the linearization point of … changes.
  • Remove the contribution from the last iteration, refresh it, and update it for the current iteration (see the sketch below).
  • Update … from …
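The underlying idea (subtract an observation's stale contribution, recompute it at the new linearization point, and add it back) can be sketched generically as follows; the bookkeeping here is deliberately much simpler than the actual EIBA data structures:

```python
import numpy as np

def update_observation(H, b, cache, key, new_J, new_r):
    """Replace one observation's contribution to the normal equation H dx = b.

    cache[key] holds the (J^T J, J^T r) contribution added in the previous iteration,
    so only observations whose linearization actually changed need to be touched.
    """
    old_H, old_b = cache.get(key, (0.0, 0.0))
    H -= old_H                               # remove the stale contribution
    b -= old_b
    new_H, new_b = new_J.T @ new_J, new_J.T @ new_r
    H += new_H                               # add the refreshed contribution
    b += new_b
    cache[key] = (new_H, new_b)
    return H, b
```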

SLIDE 50

One iteration in EIBA

• Step 1: Update the normal equations and the Schur complement from the last iteration

SLIDE 51

One iteration in EIBA

• Step 2: Update the point marginalization and the Schur complement from the last iteration

SLIDE 52

One iteration in EIBA

• Step 3: Update cameras
  • Solve … by PCG; a camera is changed only if its update exceeds a threshold.
• Step 4: Update points
  • Back substitution only for the points visible in the changed cameras; a point is changed only if its update exceeds a threshold.

SLIDE 53

EIBA in RKD-SLAM

• Energy function
  • Consists of a 3D point observation term and a loop constraint term.

[Equation: energy with reprojection error, inverse depth error, and loop constraint terms]

SLIDE 54

EIBA in RKD-SLAM

• 3D point observation term
  • Use the inverse depth parameterization (see the sketch below).
  • Each re-projection equation relates two camera poses and one 3D point.
  • Linearization
  • Also need to update …
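For illustration, a minimal inverse-depth reprojection residual between a reference keyframe and a target keyframe; the pose convention (world-to-camera R, t) and the intrinsics handling are assumptions for this sketch, not the exact formulation in RKD-SLAM:

```python
import numpy as np

def inv_depth_residual(K, R_ref, t_ref, R_tgt, t_tgt, u_ref, inv_depth, u_obs):
    """Reprojection residual of a point given by its pixel u_ref and inverse depth in the reference frame.

    K: 3x3 intrinsics; (R_*, t_*): world-to-camera poses; u_ref, u_obs: 2D pixel positions.
    """
    # Back-project the reference pixel at depth 1 / inv_depth into world coordinates.
    ray_ref = np.linalg.inv(K) @ np.array([u_ref[0], u_ref[1], 1.0])
    X_world = R_ref.T @ (ray_ref / inv_depth - t_ref)
    # Project into the target camera; the residual couples two poses and one point.
    x = K @ (R_tgt @ X_world + t_tgt)
    return x[:2] / x[2] - np.asarray(u_obs)
```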

SLIDE 55

EIBA in RKD-SLAM

• Loop constraint term
  • Represented as a relative pose.
  • Linearization
  • Update …

SLIDE 56

Performance of EIBA

• Computation time

SLIDE 57

Performance of EIBA

• Computation time
  • Our EIBA is faster than iSAM2 by an order of magnitude.

SLIDE 58

Performance of EIBA

• Optimized reprojection error

SLIDE 59

Open-source Solver & BA

• g2o: https://github.com/RainerKuemmerle/g2o
• GTSAM & iSAM: https://bitbucket.org/gtborg/gtsam/
• Ceres Solver: http://ceres-solver.org/
• Bundler: http://www.cs.cornell.edu/~snavely/bundler/
• PBA: https://grail.cs.washington.edu/projects/mcba/
• EIBA: the source code will be released soon.

http://www.zjucvg.net