Spatio-Temporal Uncertainty Assessment of GHG Emission Inventories with the Specific Focus on Austria and Ukraine: Learning in Space and Time and into the Future

A Metric for the Prognostic Outreach of Scenarios: Learning from the Past to Establish a Standard in Applied Systems Analysis


SLIDE 1
  • M. Jonas

30 April 2015 – 1

A Metric for the Prognostic Outreach of Scenarios

Learning from the Past to Establish a Standard in Applied Systems Analysis

  • M. Jonas, E. Rovenskaya and P. Żebrowski

Spatio-Temporal Uncertainty Assessment of GHG Emission Inventories with the Specific Focus on Austria and Ukraine: Learning in Space and Time and into the Future

Lviv, Ukraine; 13 October 2015

SLIDE 2
  • M. Jonas et al.

14 October 2015 – 2

This talk covers

  • 1. Motivation
  • 2. Framing conditions and definitions
  • 3. Why diagnostic and prognostic uncertainty are different and independent

  • 4. Learning in a prognostic context
  • 5. Toward application: an accurate and precise system
  • 6. Learning in a diagnostic context
  • 7. Insights and outlook
SLIDE 3

  • 1. Motivation

Our motivation is two-fold:

  • 1. to expand Jonas et al. (2014), Uncertainty in an emissions-constrained world, which emerged from the 3rd (2010) Uncertainty Workshop;
  • 2. and to contribute to the unresolved question: How limited are prognostic scenarios?

We are still moving at a theoretical level, but we already encounter important insights and unexpected side benefits!

SLIDE 4

  • 1. Motivation (2)

An easy-to-apply metric or indicator is needed that informs non-experts of the time in the future at which a prognostic scenario ceases (for whatever reason) to be in accordance with the system's past. This indicator should allow a system / model to be treated coherently, from beginning to end!
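A minimal numerical sketch of such an indicator, assuming a linear view of the past (every name, threshold and data value below is hypothetical, not from the talk): fit a line to the historical record, open a wedge of k residual standard deviations around its extrapolation, and report the first future time at which the scenario leaves that wedge.

```python
import numpy as np

def outreach_time(t_hist, y_hist, t_scen, y_scen, k=2.0):
    """First future time at which a scenario leaves the wedge spanned by
    a linear extrapolation of the past, widened by k residual std devs.
    Illustrative only; names and the choice of k are assumptions."""
    a, b = np.polyfit(t_hist, y_hist, 1)        # linear fit to the past
    sigma = np.std(y_hist - (a * t_hist + b))   # residual spread
    outside = np.abs(y_scen - (a * t_scen + b)) > k * sigma
    return t_scen[np.argmax(outside)] if outside.any() else None
```

For a scenario that tracks the extrapolated trend the function returns None; as soon as the scenario departs by more than the wedge half-width, the departure time is reported.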

SLIDE 5

  • 1. Motivation (1)

Jonas et al. (2014): The mode of bridging diagnostic and prognostic uncertainty across temporal scales relies on two discrete points in time: ‘today’ and 2050. Now we want to become continuous ...

[Figure: timeline from the past through 2014 ('today') to 2050]

SLIDE 6

  • 2. Framing conditions and definitions

[Figure: globe / group of countries / individual country; carbon fluxes from the fossil-fuel industry, the Kyoto biosphere and the "non-Kyoto" biosphere, with net storage in the atmosphere; sphere of activity under the Kyoto Protocol. Jonas and Nilsson (2007: Fig. 4); modified]

Only the fossil-fuel, terrestrial and oceanic carbon fluxes (F_FF_C, F_terr_C, F_oc_C) can be discriminated top-down globally!

SLIDE 7

[Figure: net atmospheric flux F_net between times t1 and t2. Jonas and Nilsson (2007: Fig. 6); modified]

  • 2. Framing conditions and definitions

Bottom-up / top-down (full carbon) accounting is not in place. We cannot yet verify net carbon (ΔC) fluxes at the country scale!

SLIDE 8

  • 2. Framing conditions and definitions

[Figure: characterizing uncertainty, prognostic vs diagnostic. Moss & Schneider (2000: Fig. 5; see also Giles, 2002); IPCC (2006: Vol. 1, Fig. 3.2)]

SLIDE 9

  • 3. Diagnostic vs prognostic uncertainty

Diagnostic uncertainty

  • can increase or decrease, depending on whether or not our knowledge of accounting emissions becomes more accurate and precise!

Prognostic uncertainty

  • always increases with time under a prognostic scenario!
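The second claim can be illustrated with a toy stochastic scenario (a plain random walk; an invented illustration, not the authors' model): the ensemble spread about the mean grows with lead time, roughly like the square root of the horizon.

```python
import random

random.seed(0)  # deterministic illustration

def spread(horizon, n=2000, step=1.0):
    """Standard deviation of n random-walk trajectories after `horizon` steps."""
    finals = []
    for _ in range(n):
        x = 0.0
        for _ in range(horizon):
            x += random.gauss(0.0, step)
        finals.append(x)
    m = sum(finals) / n
    return (sum((f - m) ** 2 for f in finals) / n) ** 0.5

s1, s4, s16 = spread(1), spread(4), spread(16)  # spread keeps growing
```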

SLIDE 10

  • 3. Diagnostic vs prognostic uncertainty

Meinshausen et al. (2009: Fig. 2)

SLIDE 11

Meinshausen et al. (2009: Fig. 3)


  • 3. Diagnostic vs prognostic uncertainty
SLIDE 12

Probability of exceeding 2 °C:

  • 3. Diagnostic vs prognostic uncertainty

Meinshausen et al. (2009: Tab. 1)

SLIDE 13

  • 3. Diagnostic and prognostic uncertainty

[Figure: diagnostic, prognostic and combined uncertainty over time up to 2050, with additional undershooting. Massari Coelho et al. (2012: Fig. 10)]

SLIDE 14

  • 4. Learning in a prognostic context
SLIDE 15

  • 4. Learning in a prognostic context

Task: Find the optimum between the 'order of the signal's dynamics' and both the extension and the opening of the uncertainty wedge!
SLIDE 16

  • 4. Learning in a prognostic context

Andriana (2015: Slide 15); modified

SLIDE 17

  • 5. Toward application: accurate + precise system

Assume that we have learned from an RL (retrospective learning) exercise

  • that each historical data record has a memory and exhibits (though not necessarily) a linear dynamics;
  • that each data record's uncertainty (learning) wedge unfolds linearly into the future (until when?);
  • and that our data records exhibit linear interdependencies [e.g.: T = T(C); C = C(E); E = E(t)]

[Figure: uncertainty wedge of a data record Y = Y(t), its half-width ΔY ≈ a_Y · t unfolding linearly from 'today']
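Under these three assumptions the wedges chain through the linear interdependencies: because each map is linear, a wedge propagates by multiplying with the (constant) slope of the next map. A sketch with invented coefficients (the wedge-opening rates aE, aC, aT and all slopes below are illustrative, not from the talk):

```python
def wedge(t, a):
    """Half-width of a linearly unfolding uncertainty wedge at time t."""
    return a * t

# hypothetical wedge-opening rates for E, C and T (invented units)
aE, aC, aT = 0.5, 0.3, 0.2
dCdE, dTdC = 2.0, 0.01   # constant slopes of C = C(E) and T = T(C)

t = 5.0
dE = wedge(t, aE)                 # emissions wedge
dC = wedge(t, aC) + dCdE * dE     # concentration wedge, inherits dE
dT = wedge(t, aT) + dTdC * dC     # temperature wedge, inherits dC
```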

SLIDE 18

  • 5. Toward application: accurate + precise system

We merge an accurate and precise system with classical statistics! DfEt combines uncertainty (learning) and dynamics (memory) knowledge!

SLIDE 19

  • 5. Toward application: accurate + precise system

Source: http://en.wikipedia.org/wiki/Propagation_of_uncertainty
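For independent inputs, the first-order (Gaussian) rule from the cited page reads σ_f² = Σ (∂f/∂x_i)² σ_i²; a minimal sketch (function names are ours):

```python
import math

def propagate(partials, sigmas):
    """Gaussian error propagation for independent inputs:
    sigma_f = sqrt(sum((df/dx_i * sigma_i)^2))."""
    return math.sqrt(sum((p * s) ** 2 for p, s in zip(partials, sigmas)))

# Example: f = x * y at x = 3, y = 4, so df/dx = 4 and df/dy = 3.
sigma_f = propagate([4.0, 3.0], [0.1, 0.2])
```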

SLIDE 20

  • 5. Toward application: accurate + precise system
SLIDE 21

  • 5. Toward application: accurate + precise system

This is a game changer that has not been considered so far!

SLIDE 22

  • 5. Toward application: accurate + precise system

Jonas & Nilsson (2007: Fig. 9); modified

SLIDE 23

  • 7. Insights and outlook
  • 1. The risk of exceeding a 2050 global temperature target (e.g., 2 °C) appears to be greater than assessed by the IPCC! The correct approach would have been to treat cumulative emissions and removals individually in order to determine their combined risk of exceeding the agreed temperature target. RL allows exactly this to be done: it overcomes this shortfall and allows the effect of learning about emissions and removals individually to be grasped.

SLIDE 24

  • 2. We anticipate that, in the case of success, the way of constructing prognostic models and conducting systems analysis will have to meet certain quality standards:

  • Better diagnostic data handling (retrospective learning)!
  • Specifying the models' outreach limits!
  • Safeguarding complex models by means of meta-models which fulfill the above!

  • 7. Insights and outlook
SLIDE 25

  • 6. Learning in a diagnostic context

Hamal (2010: Fig. 9, 12); modified

[Figure: EU-15 total uncertainty of CO2 emission estimates (without LULUCF) reported to the UNFCCC, 1983–2013: initial vs most recent emission and precision estimates; accuracy trend ≈ 4.2 %/yr, R² = 0.9345]
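Figures of the kind "4.2 %/yr" with a quoted R² typically come from a log-linear least-squares fit of the uncertainty series; a sketch with invented data (the series below is not the EU-15 record):

```python
import math

years = [1990, 1995, 2000, 2005, 2010]
unc = [5.0, 4.1, 3.3, 2.7, 2.2]   # hypothetical total uncertainty (%)

# log-linear fit: ln(unc) = a + b * (year - year0)
x = [y - years[0] for y in years]
z = [math.log(u) for u in unc]
n = len(x)
mx, mz = sum(x) / n, sum(z) / n
b = sum((xi - mx) * (zi - mz) for xi, zi in zip(x, z)) / \
    sum((xi - mx) ** 2 for xi in x)
a = mz - b * mx

rate = (math.exp(b) - 1.0) * 100.0     # percent change per year
ss_res = sum((zi - (a + b * xi)) ** 2 for xi, zi in zip(x, z))
ss_tot = sum((zi - mz) ** 2 for zi in z)
r2 = 1.0 - ss_res / ss_tot             # goodness of fit
```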

SLIDES 26–32

  • 6. Learning in a diagnostic context
SLIDE 33

  • 7. Insights and outlook
  • 3. We consider diagnostic learning to be on the right track. Nonetheless:

  • We see the need for complete emission records/histories.
  • We see the need to agree on a standard for processing emission data, since we operate at the limits of skillful resolution!
  • We see an important role for top-down emissions accounting!
SLIDE 34

References