

SLIDE 1

Probabilistic dipole inversion for adaptive quantitative susceptibility mapping

Jinwei Zhang1,2, Hang Zhang1,3, Mert Sabuncu1,2,3, Pascal Spincemaille1, Thanh Nguyen1, Yi Wang1,2

1 Department of Radiology, Weill Medical College of Cornell University, New York, NY, USA
2 Department of Biomedical Engineering, Cornell University, Ithaca, NY, USA
3 Department of Electrical and Computer Engineering, Cornell University, Ithaca, NY, USA

SLIDE 2

Quantitative susceptibility mapping (QSM)

! " # ?

Wang, Yi, and Tian Liu. Magnetic resonance in medicine 73.1 (2015): 82-101.

The zero cone of $D$ in k-space

$D = \mathcal{F}[d]$

$b = d \ast \chi + n$

(Image space)

$b = \mathcal{F}^{-1} D \mathcal{F} \chi + n$

(K-space)

$\chi$: tissue susceptibility
$b$: magnetic field
$d$: dipole kernel
$n$: measurement noise

SLIDE 3

COSMOS and MEDI

!"#$ = arg min

, log /(1|!) + log / !

COSMOS MEDI

Liu, Tian, et al. Magnetic Resonance in Medicine 61.1 (2009): 196-204. Liu, Jing, et al. Neuroimage 59.3 (2012): 2560-2568.

Zeroes located on a pair of cone surfaces Rotation 2 Rotation1 Binary-valued weighting matrix 5 (three spatial directions)

Multi-orientation scans, golden standard QSM Single-orientation scan, clinically feasible QSM / 1 ! = 6 1 7897!, Σ<|, , / ! ∝ >?@∥B∇,∥D
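As an illustration of the MEDI-style MAP objective, here is a hedged numerical sketch with a Gaussian likelihood and weighted total-variation prior; the function names and the inputs `W`, `M`, `lam` are assumptions for this sketch, not the authors' implementation:

```python
import numpy as np

def grad3(x):
    """Forward finite differences along the three spatial axes."""
    return np.stack([np.roll(x, -1, axis=a) - x for a in range(3)])

def map_objective(chi, b, D, W, M, lam):
    """-log p(b|chi) - log p(chi), up to constants, for a MEDI-style model."""
    residual = W * (np.real(np.fft.ifftn(D * np.fft.fftn(chi))) - b)
    data_term = 0.5 * np.sum(residual**2)            # Gaussian likelihood term
    reg_term = lam * np.sum(np.abs(M * grad3(chi)))  # weighted total-variation prior
    return data_term + reg_term
```

Minimizing this objective over $\chi$ with a gradient-based solver gives the single-orientation MAP estimate discussed above.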

SLIDE 4

Motivation: fitting susceptibility distributions

  • Given $p(\chi)$ and $p(b \mid \chi)$, how do we solve for $p(\chi \mid b)$?
  • Traditional approximate inference methods (MCMC, VI) need to be run on each subject.
  • Can we learn a general distribution $p(\chi \mid b)$ for any given $b$?
  • Introduce a parametrized distribution $q_\theta(\chi \mid b)$ and learn $\theta$ so that $q_\theta(\chi \mid b) \approx p(\chi \mid b)$ (amortized optimization).

SLIDE 5

COSMOS dataset and modeling

! " , $ " , … , ! & , $ & sampled from '()*)(!|$) . '()*) ! $ = 1 1 Σ34"

5 6[! = ! 3 |$ = $(3)]

9:[ ̂ '()*) ! $ ∥ => ! $ ]

1 1 Σ34"

5

− log => ! 3 $ 3 ) + D(.

'()*))

empirical distribution
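Assuming $q_\theta(\chi \mid b)$ is a diagonal Gaussian whose mean and log-variance are predicted by the network (consistent with the $\mu_{\chi|b}$ and $\Sigma_{\chi|b}$ outputs shown later), this negative log-likelihood takes a simple closed form; a minimal PyTorch sketch:

```python
import torch

def cosmos_nll(mu, log_var, chi_cosmos):
    """Per-voxel average of -log q_theta(chi_cosmos | b) for a diagonal Gaussian,
    dropping the constant 0.5 * log(2 * pi)."""
    return 0.5 * torch.mean(log_var + (chi_cosmos - mu) ** 2 / torch.exp(log_var))
```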

SLIDE 6

MEDI dataset and modeling

Only ! " , … , ! % are given. & ! ' and & ' .

+,[./ ' ! ∥ & ' ! ]

+,[./ ' ! ∥ & ' ] − 345[log &(!|')]

Amortized formulation

Σ=>"

? +,[./ ' !(=) ∥ & ' ] − 345[log &(!(=)|')]

Regularization Likelihood
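A hedged PyTorch sketch of this amortized VI loss for one subject's 3D volume, estimated with a single reparameterized Monte Carlo sample; the dipole operator `forward_field`, the noise weighting `w`, and the prior weight `lam` are illustrative placeholders, and the TV-style prior mirrors the MEDI prior from slide 3 rather than the authors' exact code:

```python
import torch

def tv(chi):
    """Sum of |grad chi| via forward differences over a 3D volume."""
    return sum(torch.sum(torch.abs(torch.roll(chi, -1, dims=a) - chi)) for a in range(3))

def vi_loss(mu, log_var, b, forward_field, w, lam):
    # Reparameterized sample chi ~ q_theta(chi | b)
    chi = mu + torch.exp(0.5 * log_var) * torch.randn_like(mu)
    # Likelihood term: -E_q[log p(b | chi)] with Gaussian noise weighting w (constants dropped)
    likelihood = 0.5 * torch.sum(w * (forward_field(chi) - b) ** 2)
    # Regularization term: KL[q_theta || p(chi)] = E_q[-log p(chi)] - H[q_theta],
    # approximated with the same Monte Carlo sample; Gaussian entropy kept up to a constant
    prior = lam * tv(chi)
    entropy = 0.5 * torch.sum(log_var)
    return likelihood + prior - entropy
```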

SLIDE 7

Probabilistic Dipole Inversion (PDI) network

[Network diagram: the input is the local field $b$; the network outputs the mean $\mu_{\chi|b}$ and variance $\Sigma_{\chi|b}$ of $q_\theta(\chi \mid b)$. On COSMOS data the training loss is $-\log q_\theta(\chi \mid b)$; on MEDI data it is $\mathrm{KL}[q_\theta(\chi \mid b) \,\|\, p(\chi \mid b)]$, estimated by MC sampling.]
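A toy architecture sketch of the two-headed interface implied by the diagram; the presentation does not spell out the backbone here, so this small 3D convolutional stack is only an assumed placeholder, not the actual PDI network:

```python
import torch
import torch.nn as nn

class PDINetSketch(nn.Module):
    """Maps a local-field volume to the mean and log-variance of q_theta(chi | b)."""

    def __init__(self, channels=16):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv3d(1, channels, 3, padding=1), nn.ReLU(),
            nn.Conv3d(channels, channels, 3, padding=1), nn.ReLU(),
        )
        self.mean_head = nn.Conv3d(channels, 1, 3, padding=1)     # mu_{chi|b}
        self.log_var_head = nn.Conv3d(channels, 1, 3, padding=1)  # log Sigma_{chi|b}

    def forward(self, b):
        h = self.backbone(b)
        return self.mean_head(h), self.log_var_head(h)
```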

SLIDE 8

Experimental setups

  • Pre-trained on COSMOS (3D patches): 4 training, 1 validation, 2 test subjects, each with 5 orientations (PDI)
  • Domain adaptation on MEDI data (whole brains), giving PDI-VI:
    - Multiple sclerosis dataset (6 training, 1 validation, 7 test)
    - Hemorrhage dataset (4 training, 1 validation, 2 test)

SLIDE 9

Healthy subject with COSMOS

Panels: COSMOS | MEDI | FINE | PDI ($\mu_{\chi|b}$) | PDI ($\Sigma_{\chi|b}$) | QSMnet

QSMnet: Yoon, Jaeyeon, et al. NeuroImage 179 (2018): 199-206.
FINE: Zhang, Jinwei, et al. NeuroImage 211 (2020): 116579.

SLIDE 10

Healthy subject with COSMOS

SLIDE 11

Multiple sclerosis patients

Panels: MEDI | FINE | PDI ($\mu_{\chi|b}$) | PDI ($\Sigma_{\chi|b}$) | QSMnet | PDI-VI ($\mu_{\chi|b}$) | PDI-VI ($\Sigma_{\chi|b}$); rows: subject 1, subject 2

SLIDE 12

Hemorrhagic patient

Panels: MEDI | FINE | PDI ($\mu_{\chi|b}$) | PDI ($\Sigma_{\chi|b}$) | QSMnet | PDI-VI ($\mu_{\chi|b}$) | PDI-VI ($\Sigma_{\chi|b}$)

SLIDE 13

!"|$ Σ"|$ & ' &

Encoder /(1|&) Decoder 4(&|1)

− (67("|$) log 4 & 1 − :; / 1 & ∥ 4 1 )

VAE architecture PDI architecture … …

= !>|? Σ>|?

“Encoder” /A B = “Decoder” 4(=|B) (forward dipole model)

D = EFGEB, Σ?|> I =

loss function

ELBO − (67J[log 4(=|B)] − :;[/A B = ∥ 4 B ])

loss function

Discussion: relationship to VAE

Kingma, Diederik P., and Max Welling. "Auto-Encoding Variational Bayes." arXiv preprint arXiv:1312.6114 (2013).
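To make the analogy concrete, a hedged side-by-side sketch of the two negative-ELBO objectives; `decode`, `forward_field`, and `neg_log_prior` are illustrative placeholders, and the Gaussian forms are assumptions consistent with the earlier slides:

```python
import torch

def neg_elbo_vae(x, z_mu, z_log_var, decode):
    """VAE: both the encoder and the decoder p(x | z) are learned."""
    z = z_mu + torch.exp(0.5 * z_log_var) * torch.randn_like(z_mu)
    recon = 0.5 * torch.sum((decode(z) - x) ** 2)                         # -E_q[log p(x|z)]
    kl = 0.5 * torch.sum(z_mu**2 + torch.exp(z_log_var) - z_log_var - 1)  # KL vs N(0, I)
    return recon + kl

def neg_elbo_pdi(b, chi_mu, chi_log_var, forward_field, neg_log_prior):
    """PDI: the 'decoder' is the fixed forward dipole model, only the encoder is learned."""
    chi = chi_mu + torch.exp(0.5 * chi_log_var) * torch.randn_like(chi_mu)
    recon = 0.5 * torch.sum((forward_field(chi) - b) ** 2)                # -E_q[log p(b|chi)]
    kl = neg_log_prior(chi) - 0.5 * torch.sum(chi_log_var)                # KL[q || p(chi)], MC approx
    return recon + kl
```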

SLIDE 14

Conclusion

  • Learn a neural-network-parametrized distribution that yields the posterior distribution of susceptibility given the input local field.
  • Train the network parameters by fitting to the empirical distribution defined by the COSMOS dataset.
  • Adapt the pre-trained parameters to different domains using (amortized) variational inference.

SLIDE 15

Future work

  • A more expressive model family for $q_\theta(\chi \mid b)$: invertible neural networks.
  • Learn a prior density $p(\chi)$ instead of pre-defining it: autoregressive or VAE density estimation.

SLIDE 16

Thank you

Questions
