
Selection, calibration, and validation of models of tumor growth. Regina C. Almeida, Laboratório Nacional de Computação Científica - LNCC. Chemnitz Symposium on Inverse Problems: New Trends in Parameter Identification for Mathematical Models.


  1. Selection, calibration, and validation of models of tumor growth
     Regina C. Almeida
     Laboratório Nacional de Computação Científica - LNCC
     Chemnitz Symposium on Inverse Problems: New Trends in Parameter Identification for Mathematical Models
     Oct. 30 to Nov. 03, 2017, RJ
     Tumor Modeling Group
     (1 / 37)

  2. Collaborations: Ernesto A. B. F. Lima (ICES), J. Tinsley Oden (ICES), Thomas E. Yankeelov (ICES), David A. Hormuth II (ICES)
     Supported by Grants:
     • DOE-DESC009286 MMICC
     • National Science Foundation CCO, The Center for Computational Oncology
     • CNPq 305612/2013-1
     (2 / 37)

  3. The Imperfect Paths to Knowledge (Oden & Prudhomme, 2010)
     [Diagram: physical realities are probed by observations (observational errors); theory yields mathematical models (modeling errors), which are discretized into computational models (discretization errors) used for prediction. Validation compares models against observations; verification compares computational models against mathematical models.]
     (3 / 37)

  4. Goals of this Presentation
     • Review some concepts on the Foundations of Predictive Science: the scientific discipline concerned with the predictability of computational models of physical events in the presence of uncertainties ∗
     • Describe OPAL (the Occam Plausibility Algorithm): selection, calibration, validation, model inadequacy, UQ of QoIs †
     • Application: Predictive Modeling of Tumor Growth ∗∗
     ∗ JTO, Foundations of Predictive Computational Sciences, ICES, 17-01, Austin, 2017
     † K. Farrell, JTO, D. Faghihi, J. Comp. Physics, 2015
     ∗∗ E. A. B. F. Lima, JTO, D. A. Hormuth, T. E. Yankeelov, R. C. Almeida, M3AS, 2016
     (4 / 37)

  5. Some Definitions
     Model (Mathematical Model):
     (5 / 37)

  6. Some Definitions
     Model (Mathematical Model):
     Quantity of Interest (QoI): the goal of constructing and solving the model, e.g. tumor volume/area, RECIST (longest diameter), etc.
     (6 / 37)

  7. A hierarchy of scenarios
     • Calibration Scenarios S_c: domain of unit tests designed to initialize parameters (e.g., in vitro observation in a pathology lab)
     • Validation Scenarios S_v: domains of subsystem experiments designed to yield observational data that represent the QoI, testing the ability of the model to predict the QoI (e.g., in vivo data through MRI)
     • Prediction Scenarios S_p: the full-system domain, in which the target QoI resides
     (7 / 37)

  8. Sources of Uncertainty in Predictive Science (Terenin and Draper, 2015)
     1. Discretization
     2. Observational Data
     3. Model Parameters
     4. Model Selection
     (8 / 37)

  9. Sources of Uncertainty in Predictive Science (Terenin and Draper, 2015)
     1. Discretization: h-sensitivity; a posteriori goal-oriented error estimators
     2. Observational Data
     3. Model Parameters
     4. Model Selection
     (9 / 37)

  10. Sources of Uncertainty in Predictive Science (Terenin and Draper, 2015)
     1. Discretization
     2. Observational Data: y-uncertainty
        y = { y(x_i) },  y = f(g, ε)
        y_i = g_i + ε_i  (truth g_i plus experimental noise ε_i), with noise model ε_i ∼ p
     3. Model Parameters
     4. Model Selection
     (10 / 37)
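The additive data model y_i = g_i + ε_i can be made concrete with a short sketch. The "truth" function g, the Gaussian noise assumption, and all numbers below are illustrative choices, not from the slides:

```python
import math
import random

random.seed(0)

def generate_data(g, sigma, n):
    """Synthetic observations y_i = g(x_i) + eps_i, with eps_i ~ N(0, sigma^2)."""
    xs = [i / (n - 1) for i in range(n)]
    return xs, [g(x) + random.gauss(0.0, sigma) for x in xs]

# Hypothetical "truth": a logistic-like growth curve on [0, 1].
g = lambda x: 1.0 / (1.0 + math.exp(-10.0 * (x - 0.5)))

xs, ys = generate_data(g, sigma=0.05, n=21)
residuals = [y - g(x) for x, y in zip(xs, ys)]  # these are the realized noise eps_i
```

Inference then works backwards: given the noisy y_i and an assumed noise model, recover information about g (or about model parameters θ that produce it).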

  11. Sources of Uncertainty in Predictive Science (Terenin and Draper, 2015)
     1. Discretization
     2. Observational Data
     3. Model Parameters: θ-uncertainty
        Bayes' rule: P(A|B) P(B) = P(B|A) P(A), hence
        π(θ | y, S) ∝ π(y | θ, S) π(θ | S)
        θ-sensitivity: Y(θ) = QoI
        (A. Saltelli et al. 2004, 2008; M. D. Morris 1991; I. M. Sobol 2006)
     4. Model Selection
     (11 / 37)
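The posterior π(θ | y, S) is rarely available in closed form and is typically sampled. A minimal random-walk Metropolis sketch, with a Gaussian likelihood and prior chosen purely for illustration (none of this is the authors' implementation):

```python
import math
import random

random.seed(1)

def log_post(theta, data, sigma=1.0):
    """log pi(theta | y) up to a constant: Gaussian likelihood times N(0, 10^2) prior."""
    log_lik = sum(-0.5 * ((y - theta) / sigma) ** 2 for y in data)
    log_prior = -0.5 * (theta / 10.0) ** 2
    return log_lik + log_prior

def metropolis(data, n_steps=5000, step=0.5):
    """Random-walk Metropolis: accept with prob min(1, pi(prop)/pi(curr))."""
    theta, chain = 0.0, []
    for _ in range(n_steps):
        prop = theta + random.gauss(0.0, step)
        if math.log(random.random()) < log_post(prop, data) - log_post(theta, data):
            theta = prop  # accept the proposal
        chain.append(theta)
    return chain

data = [2.1, 1.9, 2.3, 2.0, 1.8]        # toy observations
chain = metropolis(data)
post_mean = sum(chain[1000:]) / len(chain[1000:])  # discard burn-in
```

With this prior and five observations near 2.0, the chain should concentrate around the data mean; the posterior spread quantifies the remaining θ-uncertainty.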

  12. Sources of Uncertainty in Predictive Science (Terenin and Draper, 2015)
     1. Discretization
     2. Observational Data
     3. Model Parameters
     4. Model Selection: model uncertainty and model (in)adequacy
        "All models are wrong but some are useful" (Box, 1978)
        AIC, BIC, plausibilities, ... validation
        g_i = d_i(θ) − η_i(θ)  and  g_i = y_i − ε_i,  hence  y_i − d_i(θ) = ε_i − η_i(θ)
        discrepancy model (GP) or model validation
     (12 / 37)

  13. The Major Issues
     • Computing / selecting priors π(θ)
     • Model selection: which model is best?
     • Parameter sensitivity: which parameters do not influence the QoI?
     • Computational algorithms: for sampling the posterior and solving the stochastic forward problem
     • Validation tolerances and metrics
     • Quantifying uncertainty in the QoI
     (13 / 37)

  14. Model Selection
     M = set of parametric model classes = {P_1, P_2, ..., P_m}; each P_j has its own likelihood and parameters θ_j.
     Bayes' rule:
        π(θ_j | y, P_j, M) = π(y | θ_j, P_j, M) π(θ_j | P_j, M) / π(y | P_j, M),   1 ≤ j ≤ m,
     where the evidence is
        π(y | P_j, M) = ∫ π(y | θ_j, P_j, M) π(θ_j | P_j, M) dθ_j.
     Now apply Bayes' rule to the evidence:
        ρ_j = π(P_j | y, M) = π(y | P_j, M) π(P_j | M) / π(y | M),   Σ_{j=1}^m ρ_j = 1,
     where ρ_j is the model plausibility.
     (H. Jeffreys, 1961; E. E. Prudencio & S. H. Cheung, 2012)
     (14 / 37)
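The evidence integral can be approximated by simple Monte Carlo over prior samples, which then gives the plausibilities ρ_j directly. A sketch with two hypothetical one-parameter model classes (priors, likelihood, and data are all illustrative):

```python
import math
import random

random.seed(2)

def evidence(log_lik, sample_prior, n=20000):
    """pi(y | P_j) ~= (1/n) * sum_k pi(y | theta_k), with theta_k drawn from the prior."""
    return sum(math.exp(log_lik(sample_prior())) for _ in range(n)) / n

data = [1.1, 0.9, 1.2, 1.0]

def log_lik(theta):
    # Gaussian likelihood with unit noise; model output is just theta (toy model).
    return sum(-0.5 * (y - theta) ** 2 - 0.5 * math.log(2 * math.pi) for y in data)

# Model class 1: prior concentrated near the data; class 2: prior far from it.
ev1 = evidence(log_lik, lambda: random.gauss(1.0, 0.5))
ev2 = evidence(log_lik, lambda: random.gauss(5.0, 0.5))

# Plausibilities, assuming equal prior model probabilities pi(P_j | M) = 1/2:
rho1 = ev1 / (ev1 + ev2)
rho2 = ev2 / (ev1 + ev2)
```

Since the second prior puts almost no mass where the data live, its evidence collapses and ρ_1 ≈ 1: the plausibility automatically rewards models whose prior predictive matches the observations.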

  15. Model Validation
     Validation Metrics
     • Total Variation Metric:
        d(y_v, d_v(θ_i)) = ||y_v − d_v(θ_i)||_{L1} = ∫_{−∞}^{∞} |y(x) − d(θ, x)| dx
     • L1 error between cumulative distribution functions:
        d(y_v, d_v(θ_i)) = ∫_{−∞}^{∞} | ∫_{−∞}^{x} ( y(s) − d(θ, s) ) ds | dx
     • D_KL (Kullback-Leibler divergence): measures the difference in information content between probability distributions; for pdfs q (data) and p (prediction),
        D_KL(q || p) = ∫_{−∞}^{∞} q(x) log( q(x) / p(x) ) dx
     Tolerances:
        d(y_v, d_v(θ_i)) ≤ γ_tol : model is "valid" (not invalid)
        d(y_v, d_v(θ_i)) > γ_tol : model is invalid
        γ_tol = ?
     (15 / 37)
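These metrics are straightforward to evaluate numerically on a grid. A sketch of the L1 (total-variation) distance and the KL divergence between two densities; the Gaussian pdfs are purely illustrative stand-ins for the data and prediction distributions:

```python
import math

def gauss_pdf(x, mu, s):
    return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

def l1_distance(p, q, lo=-10.0, hi=10.0, n=4000):
    """||p - q||_L1 via the midpoint rule."""
    h = (hi - lo) / n
    return sum(abs(p(lo + (i + 0.5) * h) - q(lo + (i + 0.5) * h)) for i in range(n)) * h

def kl_divergence(q, p, lo=-10.0, hi=10.0, n=4000):
    """D_KL(q || p) = int q log(q/p) dx, via the midpoint rule."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h
        qx, px = q(x), p(x)
        if qx > 1e-300 and px > 1e-300:  # guard against log(0) in the far tails
            total += qx * math.log(qx / px) * h
    return total

data_pdf = lambda x: gauss_pdf(x, 0.0, 1.0)   # q: "data"
pred_pdf = lambda x: gauss_pdf(x, 0.5, 1.0)   # p: "prediction"

d_l1 = l1_distance(data_pdf, pred_pdf)
d_kl = kl_divergence(data_pdf, pred_pdf)  # closed form here: (0.5)^2 / 2 = 0.125
```

A validation decision then reduces to comparing such a distance against γ_tol; the open question on the slide is how γ_tol itself should be chosen.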

  16. Bayesian Model Calibration, Validation, and Prediction
     Prior: π(θ)
     Calibration (S_c, y_c):
        π(θ | y_c) = π(y_c | θ) π(θ) / π(y_c)
     Validation (S_v, y_v):
        π(θ | y_v, y_c) = π(y_v | θ, y_c) π(θ | y_c) / π(y_v | y_c)
     Validation forward problem:
        A(θ, S_v; u(θ, S_v)) = 0,   ||d(u(θ, S_v)) − y_v|| ≤ γ_tol
     Prediction (S_p, QoI):
        A(θ, S_p; u(θ, S_p)) = 0,   Q(u(θ, S_p)) ∼ π(Q) = π(Q | θ, S_v, S_c, γ_tol)
     (JTO, Moser, Ghattas, 2010; JTO, Babuska, Faghihi, 2017)
     (16 / 37)
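The sequential structure, where the calibration posterior π(θ | y_c) becomes the prior for the validation data, can be illustrated with conjugate Gaussian updates, for which each stage has a closed form. All numbers are illustrative, and the scalar-mean model is a deliberate simplification of the forward problem A(θ, S; u) = 0:

```python
def gaussian_update(prior_mu, prior_var, data, noise_var):
    """Conjugate posterior of a Gaussian mean theta under a Gaussian prior."""
    n = len(data)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mu = post_var * (prior_mu / prior_var + sum(data) / noise_var)
    return post_mu, post_var

# Calibration stage: broad prior pi(theta), calibration data y_c.
mu_c, var_c = gaussian_update(0.0, 100.0, [2.1, 1.9, 2.0], noise_var=1.0)

# Validation stage: pi(theta | y_c) is the prior when assimilating y_v.
mu_v, var_v = gaussian_update(mu_c, var_c, [2.2, 2.1], noise_var=1.0)

# Validation test (toy version): prediction vs. validation data within gamma_tol.
gamma_tol = 0.5
is_valid = abs(mu_v - (2.2 + 2.1) / 2) <= gamma_tol
```

Each stage shrinks the posterior variance, mirroring how the calibration and validation scenarios progressively constrain θ before the prediction scenario is attempted.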

  17. Occam's Razor
     "Non sunt multiplicanda entia sine necessitate": entities should not be multiplied beyond necessity.
     When choosing among a set of competing models, the simplest valid model is the best choice.
     • simple ⇒ smallest number of parameters
     • valid ⇒ passes the Bayesian validation test
     How do we choose a model that adheres to this principle?
     (17 / 37)

  18. The Occam-Plausibility Algorithm (OPAL)
     START: Identify a set of possible models M = {P_1(θ_1), ..., P_m(θ_m)}.
     SENSITIVITY ANALYSIS: Eliminate parameters to which the model output is insensitive, yielding M̄ = {P̄_1(θ̄_1), ..., P̄_l(θ̄_l)}.
     OCCAM STEP: Choose the model(s) in the lowest Occam Category, M∗ = {P∗_1(θ∗_1), ..., P∗_k(θ∗_k)}.
     CALIBRATION STEP: Calibrate all models in M∗.
     PLAUSIBILITY STEP: Compute plausibilities and identify the most plausible model P∗_j.
     VALIDATION STEP: Submit P∗_j to the validation test.
     • If P∗_j is valid: use the validated parameters to predict the QoI.
     • If P∗_j is invalid and does not have the most parameters in M̄: choose the models in the next Occam category (iterative Occam step) and repeat.
     • If P∗_j is invalid and already has the most parameters in M̄: identify a new set of possible models and restart.
     (K. Farrell, JTO, D. Faghihi, 2015)
     (18 / 37)
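The flow above can be sketched as a driver loop. The model representation and the calibrate, plausibility, and is_valid routines below are hypothetical placeholders standing in for the real Bayesian machinery, not the authors' implementation:

```python
def opal(models, sensitive_params, calibrate, plausibility, is_valid):
    """Sketch of the OPAL driver loop.

    models: list of (name, n_params) candidate model classes.
    sensitive_params: maps name -> list of parameters the output is sensitive to.
    Returns (model, calibrated_params) for the simplest valid model, else None.
    """
    # Sensitivity step: each model is reduced to its influential parameters.
    reduced = [(name, len(sensitive_params[name])) for name, _ in models]

    # Occam categories: group by reduced parameter count, simplest category first.
    for c in sorted({n for _, n in reduced}):
        category = [m for m in reduced if m[1] == c]
        calibrated = [(m, calibrate(m)) for m in category]          # calibration step
        best = max(calibrated, key=lambda mc: plausibility(*mc))    # plausibility step
        if is_valid(*best):                                         # validation step
            return best   # valid model + parameters: go predict the QoI
    return None           # no category yielded a valid model: revise the model set

# Toy run with stub routines (purely illustrative):
models = [("exp", 2), ("logistic", 3), ("gompertz", 3)]
sens = {"exp": ["a"], "logistic": ["a", "b"], "gompertz": ["a", "b", "c"]}
result = opal(models, sens,
              calibrate=lambda m: {"theta": 1.0},
              plausibility=lambda m, th: -m[1],            # stub: favor fewer params
              is_valid=lambda m, th: m[0] == "logistic")   # pretend only this passes
```

Here the one-parameter category is tried and rejected first, and the loop moves up to the next category, returning the logistic model: the simplest model that survives validation wins, exactly the Occam principle of the previous slide.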

  19. Tumor Model Framework
     How does one develop mathematical models of tumor growth?
     • Intratumor heterogeneity
     • Cancer in living tissue
     [Image: metastatic carcinoma in bone marrow †]
     † Silver S, at https://drsusansilverpathologist.wordpress.com
     (19 / 37)
