Performance Evaluation for NOAA Satellite Observing System Architecture (NSOSA) Study - PowerPoint PPT Presentation



SLIDE 1

NSOSA

  • M. M. Coakley, D. P. Ryan-Howard, F. J. Rich, M. K. Griffin,
  • G. P. Ginet, H. Iskenderian, W. E. Bicknell, and W. J. Blackwell

MIT Lincoln Laboratory

  • F. W. Gallagher III

NOAA / NESDIS / Office of System Architecture and Advanced Planning

  • M. W. Maier

The Aerospace Corporation

10 January 2018

Performance Evaluation for NOAA Satellite Observing System Architecture (NSOSA) Study

DISTRIBUTION STATEMENT A. Approved for public release: distribution unlimited. This material is based upon work supported by the National Oceanic and Atmospheric Administration under Air Force Contract No. FA8721-05-C-0002 and/or FA8702-15-D-0001. Any opinions, findings, conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Oceanic and Atmospheric Administration.
SLIDE 2

Purpose of NOAA Satellite Observing System Architecture (NSOSA) Study

[Figure: Current constellations (S-NPP and JPSS in LEO; GOES-13, GOES-14, GOES-15, and GOES-16 in the GEO belt; DSCOVR at L1) alongside an example constellation for the study (LEO satellites, GEO satellites in the geostationary belt, and other orbits such as L1, L5, and L1/L5). The study determines what satellites are needed and where they should be located.]

SLIDE 3

NSOSA Study Evaluation Methodology

[Diagram: The Instrument Catalog, Configuration Matrix, and additional information (references, algorithm complexity) feed the Environmental Data Record (EDR) Value Model; the resultant values are combined with configuration costs to produce the Efficient Frontier. The study ran over 4 cycles (1, 2a, 2b, 3), with report and input updates between cycles.]

SLIDE 4

  • Example of Objective for Real-Time Regional Weather Imagery
  • SPRWG members suggested limits for each objective and each parameter below it

EDR Value Model (EVM) Structure

Real-time Regional weather imagery

Units | Objective parameter | Study Threshold | Expected | Max Effective
km | Ground-projected instantaneous field of view (GIFOV), IR / Vis / NIR | 4 / 2 / 3 | 2 / 0.5 / 1 | 1 / 0.25 / 0.3
min | Sampling frequency (update rate) | 30 | 5 | 2.5
min | Latency (image time to delivery) | 10 | 5 | 2.5
# | Mesoscale (movable 1000 km x 1000 km): number of regions in CONUS | 1 | 2 | 5
min | Mesoscale (movable 1000 km x 1000 km): update rate | 7 | 0.5 | 0.25
min | Mesoscale (movable 1000 km x 1000 km): latency (image time to delivery) | 7 | 0.5 | 0.25
micron | Wavelengths covered, lower / upper / day-night | 0.63 / 11 / - | 0.47 / 13.35 / 1E-09 | 0.4 / 13.7 / 0.64
# | Number of bands (specific bands) | 4 | 16 | 32
K | Radiometric accuracy | 0.2 | 0.1 | 0.05
km | Navigation accuracy | 3 | 1 | 0.5
SLIDE 5

  • Example of Objective for Real-Time Regional Weather Imagery
  • SPRWG members suggested limits for each objective and each objective parameter
  • MIT LL quantified non-linear relationships between the ST, EXP, and ME levels

EDR Value Model (EVM) Structure

Real-time Regional weather imagery, with SPRWG value and Beta exponents (first/second/third corresponding to the IR/Vis/NIR bands)

Units | Objective parameter | Study Threshold | Expected | Max Effective | SPRWG value | Beta (first) | Beta (second) | Beta (third)
km | Ground-projected instantaneous field of view (GIFOV), IR / Vis / NIR | 4 / 2 / 3 | 2 / 0.5 / 1 | 1 / 0.25 / 0.3 | 0.7 | 0.88 | 2.31 | 1.19
min | Sampling frequency (update rate) | 30 | 5 | 2.5 | 0.7 | 3.74 | - | -

π‘‡π‘π‘’π‘—π‘‘π‘”π‘π‘‘π‘’π‘—π‘π‘œ = π‘§π‘‡π‘ˆ βˆ’ π‘§πΉπ‘Œπ‘„ 𝑧𝑁𝐹 βˆ’ π‘§π‘‡π‘ˆ

𝛾

Note: Ξ² is set by SPRWG satisfaction level for the objective and allows non-linear relation across ST, EXP, and ME levels

Study Threshold Expected Max Expected
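The Beta values in the table can be recovered from these limits: γ is solved so that the satisfaction curve passes through the SPRWG value (0.7 here) at the Expected level. A minimal sketch (the function name and the solve-for-gamma framing are mine, not from the slides):

```python
import math

def beta(st, exp, me, sprwg_value=0.7):
    """Solve ((y_EXP - y_ST) / (y_ME - y_ST)) ** gamma = sprwg_value for gamma."""
    frac = (exp - st) / (me - st)
    return math.log(sprwg_value) / math.log(frac)

# Sampling frequency: ST = 30 min, EXP = 5 min, ME = 2.5 min
print(round(beta(30, 5, 2.5), 2))  # 3.74, matching Beta in the table
# GIFOV IR band: ST = 4 km, EXP = 2 km, ME = 1 km
print(round(beta(4, 2, 1), 2))     # 0.88, matching Beta (first)
```

The Vis and NIR GIFOV limits give 2.31 and 1.19 the same way, matching Beta (second) and Beta (third).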
SLIDE 6

  • Assessed observational properties for instruments against the EVM to generate weighted value output for each objective
  • Sum weighted value outputs over all parameters and objectives to generate overall value

EDR Value Model (EVM): Computing Value

Real-time Regional weather imagery (objective A1, overall swing weight 0.0652), scored against Arch60 inputs

Units | Objective parameter | Arch60 input | Objective value | Weighted value
km | Ground-projected instantaneous field of view (GIFOV), IR / Vis / NIR | 2 / 0.5 / 1 | 0.7000 | 0.0457
min | Sampling frequency (update rate) | 10 | 0.3037 | 0.0198
min | Latency (image time to delivery) | 1 | 1.0000 | 0.0652
# | Mesoscale (movable 1000 km x 1000 km): number of regions in CONUS | 2 | 0.7000 | 0.0457
min | Mesoscale (movable 1000 km x 1000 km): update rate | 0.5 | 0.7000 | 0.0457
min | Mesoscale (movable 1000 km x 1000 km): latency (image time to delivery) | 0.5 | 0.7000 | 0.0457
micron | Wavelengths covered, lower / upper / day-night | 0.47 / 13.6 / 0.64 | 0.8499 | 0.0554
# | Number of bands (specific bands) | 16 | 0.7000 | 0.0457
K | Radiometric accuracy | 0.1 | 0.7000 | 0.0457
km | Navigation accuracy | 1 | 0.7000 | 0.0457

Sum of objective values: 7.0536; final weighted score for Arch60: 0.0460

π‘ƒπ‘π‘˜π‘“π‘‘π‘’π‘—π‘€π‘“ π‘Šπ‘π‘šπ‘£π‘“ 𝑧𝑗 = π‘‹π‘“π‘—π‘•β„Žπ‘’π‘— 𝑂𝑗 ෍

𝑗=1 𝑂

π‘§π‘‡π‘ˆ

𝑗 βˆ’ 𝑧𝑗

𝑧𝑁𝐹

𝑗 βˆ’ π‘§π‘‡π‘ˆ 𝑗

𝛾𝑗 where Weight𝑗 = overall swing weight per objective N𝑗 = number of objective parameters 𝛾𝑗is defined previously
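The scoring above can be reproduced directly. A minimal sketch using the per-parameter objective values from the Arch60 table (parameters at the Expected level score 0.7; values beyond the Max Effective level are capped at 1.0):

```python
# Per-parameter objective values for A1 scored against Arch60 (from the table).
scores = [0.7000, 0.3037, 1.0000, 0.7000, 0.7000,
          0.7000, 0.8499, 0.7000, 0.7000, 0.7000]
weight_a1 = 0.0652      # overall swing weight for objective A1
n_params = len(scores)  # N_j = 10 objective parameters

weighted_value = weight_a1 / n_params * sum(scores)
print(round(sum(scores), 4))     # 7.0536
print(round(weighted_value, 4))  # 0.046
```

The result matches the final weighted score of 0.0460 shown for Arch60.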

SLIDE 7

  • Group A and Group B weights recommended by the SPRWG
  • Draws on priority for improvements between the Study Threshold and Maximum Effective levels
  • Weights reflect objective interweaving performed by NOAA leadership
  • Final weights assigned by tanh function

EVM: Group A and Group B Objectives

Overall Swing Weights | Objective Number | Objective Name
0.065216 | A1 | Real-time Regional weather imagery
0.058438 | A2 | Global real-time weather imagery
0.032066 | A3 | Non-Real-Time global weather imagery (Vis and IR) other than ocean color
0.028707 | A4 | Global ocean color/phytoplankton composition
0.052690 | A5 | Global RT vertical IR soundings
0.004364 | A6 | Regional (CONUS) RT vert IR soundings
0.055681 | A7 | Global RT vertical MW soundings
0.004956 | A8 | Regional (CONUS) RT vert MW sounding
0.063206 | A9 | Global GNSS-RO soundings
0.022560 | A10 | Lightning
0.003972 | A11 | Sea surface height (global)
0.042643 | A12 | Ocean surface vector wind (OSVW)
0.066988 | A13 | 3-D winds (Horizontal wind in 3D)
0.003545 | A14 | Ozone-global vertical profiles in troposphere and stratosphere and total column
0.025524 | A15 | Microwave imagery
0.003435 | A16 | Outgoing Longwave Radiation (OLR)
0.003364 | A17 | Incoming solar radiation (TOA)
0.005840 | A18 | Radar-Based Global Precipitation rate
0.003714 | A19 | Global soundings chemical concentrations
0.046128 | B1 | Coronagraph imagery: Sun-Earth line
0.049493 | B2 | Coronagraph imagery: Off Sun-Earth line
0.004148 | B3 | Solar EUV imaging
0.005355 | B4 | Photospheric magnetogram imagery: Sun-Earth line
0.019845 | B5 | Photospheric magnetogram imagery: Off Sun-Earth line
0.004630 | B6 | Solar X-ray irradiance
0.003830 | B7 | Solar EUV irradiance
0.003621 | B8 | Interplanetary Solar wind: Sun-Earth line
0.007995 | B9 | Interplanetary Solar wind: Off Sun-Earth line
0.017396 | B10 | Heliospheric images
0.003484 | B11 | Interplanetary Energetic particles
0.003317 | B12 | Interplanetary Magnetic Field
0.003338 | B13 | Geomagnetic field
0.003396 | B14 | Geospace Energetic particles
0.010226 | B15 | Ionospheric electron density profiles
0.015219 | B16 | Auroral imaging
0.013307 | B17 | Thermospheric O/N2 ratio (height integrated)
0.011649 | B18 | Upper thermospheric density
0.009016 | B19 | Ionospheric drift velocity

Group A (Terrestrial Weather) Group B (Space Weather)

SLIDE 8

  • Group D weights defined by NOAA leadership
  • Group D objectives assessed through evaluation of constellation properties:
  • Imaging and sounding availability
  • Availability for least-available objective
  • Exceedances of budget
  • Duration of program reflecting adaptability
  • Ability to form international partnerships (normalized)
  • Technology Readiness Level by PDR

EVM: Group D Objectives

Overall Swing Weights | Objective Number | Objective Name
0.068538 | D1 | Assurance of Core Capabilities
0.060948 | D2 | Compatibility with fixed budgets
0.039096 | D3 | Assurance of all capabilities
0.035549 | D4 | Programmatic Flexibility and Adaptability
0.007140 | D5 | Develop and Maintain International Partnerships
0.006453 | D6 | Low Risk at Constellation Level

Group D (Strategic)
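The overall swing weights on the Group A/B and Group D slides should form one normalized set. A quick check, with the weights transcribed from the tables, that the three groups sum to 1 within rounding of the six-decimal values:

```python
# Overall swing weights transcribed from the Group A/B and Group D slides.
group_a = [0.065216, 0.058438, 0.032066, 0.028707, 0.052690, 0.004364,
           0.055681, 0.004956, 0.063206, 0.022560, 0.003972, 0.042643,
           0.066988, 0.003545, 0.025524, 0.003435, 0.003364, 0.005840,
           0.003714]
group_b = [0.046128, 0.049493, 0.004148, 0.005355, 0.019845, 0.004630,
           0.003830, 0.003621, 0.007995, 0.017396, 0.003484, 0.003317,
           0.003338, 0.003396, 0.010226, 0.015219, 0.013307, 0.011649,
           0.009016]
group_d = [0.068538, 0.060948, 0.039096, 0.035549, 0.007140, 0.006453]

total = sum(group_a) + sum(group_b) + sum(group_d)
print(round(total, 4))  # 1.0
```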

SLIDE 9

Groups A, B, and D Value Summary by Objective for Cycle 3 (C60 – C96)

[Chart: value by objective and by constellation number for Groups A, B, and D, for Cycle 3 constellation series 60-69, 70-78, 80-88, and 90-96.]

  • 60s series: Modernized Legacy Continuation/Expansion, building up from the current system
  • 70s series: Exclusive Cost-Benefit, building up from hosted payloads
  • 80s series: Hybrid, mixture of US Government satellite and hosted payload
  • 90s series: EXP focused
SLIDE 10

Efficient Frontier Shows Value vs. Cost

[Chart: Efficient Frontier of value vs. cost; the region above the frontier is labeled Currently Unachievable, the region below is labeled Cost Inefficient / Lower Value, and (Potential) frontier points are marked.]
SLIDE 11

  • Value error bars have two components rss-ed together
  • MIT LL error bar component:
  • Separately reported values giving feasible maximum values and minimum values
  • Monte Carlo analysis over 1000 trials was used to generate distributions
  • Points on the distribution chosen to combine with the second error term, developed by Eric Wendoloski of Aerospace
  • Aerospace error bar component:
  • Sensitivity to judgements on ordering and weighting of objectives
  • Cost error bars computed by Kiley Yeakel of Aerospace

Value Error Bars Reflect Uncertainty in Assessment and True Constellation Differences
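The two-component combination described above can be sketched as follows. The nominal scores, perturbation size, percentile choice, and second error term below are illustrative placeholders, not numbers from the study:

```python
import math
import random

random.seed(0)

# Hypothetical per-parameter satisfaction scores for one constellation.
nominal_scores = [0.70, 0.30, 1.00, 0.85]

# Monte Carlo analysis over 1000 trials: perturb the scores and record the
# resulting objective-value distribution (perturbation size is a placeholder).
trials = []
for _ in range(1000):
    perturbed = [min(1.0, max(0.0, s + random.gauss(0, 0.05)))
                 for s in nominal_scores]
    trials.append(sum(perturbed) / len(perturbed))
trials.sort()

# Choose points on the distribution (here the 5th/95th percentiles) to form
# the first error-bar component.
mitll_component = (trials[949] - trials[49]) / 2

# Second component (sensitivity to objective ordering/weighting); placeholder.
aerospace_component = 0.02

# The two components are rss-ed (root-sum-squared) together.
total_error = math.hypot(mitll_component, aerospace_component)
```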

SLIDE 12

  • Efficient Frontier curves can also be drawn to show Cycle 3 constellations in the 60s, 70s, and 80s series
  • At the original anticipated budget near 1.6, error bars show overlap between series
  • At lower budget values near 1, error bars for the series do not overlap
  • Near 1.4, error bars still overlap
  • C76 and C86 are close to the knee in the curve
  • Maturity also needs to be examined by configuration
  • Anti-correlated with risk

Efficient Frontier Curves By Series

SLIDE 13

  • Constellation-level maturity assessment is shown here as sphere size
  • Employed instrument catalog maturity evaluation
  • Scale of low to high (1 to 3) averaged over each constellation
  • Combined with a system uptime complexity
  • Reduces resultant maturity value by increments of 0.25
  • Maturity is highest in the 60s series, then in the 80s series
  • Value and maturity are higher in C86

Efficient Frontier with Maturity
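The maturity roll-up described in the bullets can be sketched as follows. The instrument ratings and the number of complexity reductions are hypothetical inputs, not values from the study:

```python
# Instrument catalog maturity ratings (1 = low, 3 = high) for the instruments
# in one hypothetical constellation.
instrument_maturity = [3, 2, 3, 1, 2]

# System uptime complexity reduces the averaged maturity in 0.25 increments;
# two increments are assumed here for illustration.
complexity_increments = 2

avg_maturity = sum(instrument_maturity) / len(instrument_maturity)
constellation_maturity = avg_maturity - 0.25 * complexity_increments
print(round(constellation_maturity, 2))  # 1.7
```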

SLIDE 14

  • Developed method to quantitatively compute value for each objective and each constellation
  • Results report benefit/value that provides guidance for architectural choices
  • High-level model is limited by number of parameters and single value entries
  • Further trade studies to define instruments are needed
  • Worked with team to provide and refine configurations for study
  • Developed assessment of value error to assess results
  • Efficient Frontier plots of constellation value reflect achievable configurations and show more efficient cost for the same value
  • At the original anticipated budget near 1.6 in relative cost, error bars show overlap between series studied during Cycle 3
  • Near 1.4, value and maturity are higher in C86, which is also near the knee in the curve
  • Further study near this value vs. cost region is needed to help define future NOAA programs
  • Currently working with NOAA on report and needed next steps with uncertain budget
  • Scoring methodology was key to evaluation of constellations

Summary