
NSOSA Performance Evaluation for NOAA Satellite Observing System Architecture (NSOSA) Study - PowerPoint PPT Presentation



  1. Performance Evaluation for NOAA Satellite Observing System Architecture (NSOSA) Study. M. M. Coakley, D. P. Ryan-Howard, F. J. Rich, M. K. Griffin, G. P. Ginet, H. Iskenderian, W. E. Bicknell, and W. J. Blackwell, MIT Lincoln Laboratory; F. W. Gallagher III, NOAA / NESDIS / Office of System Architecture and Advanced Planning; M. W. Maier, The Aerospace Corporation. 10 January 2018. DISTRIBUTION STATEMENT A. Approved for public release: distribution unlimited. This material is based upon work supported by the National Oceanic and Atmospheric Administration under Air Force Contract No. FA8721-05-C-0002 and/or FA8702-15-D-0001. Any opinions, findings, conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Oceanic and Atmospheric Administration.

  2. Purpose of NOAA Satellite Observing System Architecture (NSOSA) Study. The study determines what satellites are needed and where they should be located: L1, L5, LEO, and the geostationary belt. [Figure: the current constellations — DSCOVR at L1, S-NPP and JPSS in LEO, and GOES-13, GOES-14, GOES-15, and GOES-16 on the GEO belt — alongside an example study constellation with satellites at L1/L5, in LEO, and at three GEO slots.]

  3. NSOSA Study Evaluation Methodology. [Flow diagram: an Environmental Data Record (EDR) Value Model and an Instrument Catalog feed a Configuration Matrix; the resultant values and the configuration costs feed an efficient-frontier algorithm, with reports and input updates cycling back into the process. The evaluation ran over four cycles (1, 2a, 2b, 3) of increasing complexity; references provide additional information.]

  4. EDR Value Model (EVM) Structure. Example of the objective for Real-Time Regional Weather Imagery. SPRWG members suggested limits for each objective and each parameter below it: a Study Threshold (ST), an Expected (EXP) level, and a Maximum Effective (ME) level.

     Parameter (units)                                        | Study Threshold | Expected             | Max Effective
     Ground-projected instantaneous field of view (GIFOV),
       IR / Vis / NIR (km)                                    | 4 / 2 / 3       | 2 / 0.5 / 1          | 1 / 0.25 / 0.3
     Sampling frequency (update rate) (min)                   | 30              | 5                    | 2.5
     Latency, image time to delivery (min)                    | 10              | 5                    | 2.5
     Mesoscale (movable 1000 km x 1000 km),
       number of regions in CONUS (#)                         | 1               | 2                    | 5
     Mesoscale update rate (min)                              | 7               | 0.5                  | 0.25
     Mesoscale latency, image time to delivery (min)          | 7               | 0.5                  | 0.25
     Wavelengths covered, lower - upper - day/night (micron)  | 0.63 - 11 - 0   | 0.47 - 13.35 - 1E-09 | 0.4 - 13.7 - 0.64
     Number of bands (specific bands) (#)                     | 4               | 16                   | 32
     Radiometric accuracy (K)                                 | 0.2             | 0.1                  | 0.05
     Navigation accuracy (km)                                 | 3               | 1                    | 0.5

  5. EDR Value Model (EVM) Structure (continued). For the same Real-Time Regional Weather Imagery example, SPRWG members suggested limits for each objective and each objective parameter, and MIT LL quantified the non-linear relationship between the ST, EXP, and ME levels through an exponent β:

     Satisfaction = ( (z_EXP - z_ST) / (z_ME - z_ST) )^β

     β is set by the SPRWG satisfaction level for the objective (0.7 at the Expected level in this example) and allows a non-linear relation across the ST, EXP, and ME levels. For the GIFOV parameter the fitted exponents are β = 0.88 (IR), 2.31 (Vis), and 1.19 (NIR); for the sampling frequency (update rate), β = 3.74.
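The β fit can be checked numerically: given the ST/EXP/ME limits from the previous slide and a SPRWG satisfaction level of 0.7 at the Expected level, solving the satisfaction equation for β reproduces the exponents listed above. A minimal sketch (the function name is ours, not from the study):

```python
import math

def beta_from_satisfaction(z_st, z_exp, z_me, satisfaction=0.7):
    """Solve satisfaction = ((z_exp - z_st) / (z_me - z_st)) ** beta for beta."""
    frac = (z_exp - z_st) / (z_me - z_st)
    return math.log(satisfaction) / math.log(frac)

# ST / EXP / ME limits from the Real-Time Regional Weather Imagery example
print(round(beta_from_satisfaction(4, 2, 1), 2))       # GIFOV IR    -> 0.88
print(round(beta_from_satisfaction(2, 0.5, 0.25), 2))  # GIFOV Vis   -> 2.31
print(round(beta_from_satisfaction(3, 1, 0.3), 2))     # GIFOV NIR   -> 1.19
print(round(beta_from_satisfaction(30, 5, 2.5), 2))    # Update rate -> 3.74
```

That all four tabulated β values fall out of one formula and one satisfaction level is good evidence for this reading of the garbled slide equation.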

  6. EDR Value Model (EVM): Computing Value. Assessed observational properties for instruments against the EVM to generate a weighted value output for each objective, then summed the weighted outputs over all parameters and objectives to generate an overall value:

     Objective Value_i = (Weight_i / N_i) * sum_{j=1..N_i} [ (z_j - z_ST,j) / (z_ME,j - z_ST,j) ]^(β_j)

     where Weight_i is the overall swing weight per objective, N_i is the number of objective parameters, β_j is defined previously, and each bracketed term is clipped to [0, 1] (inferred from the latency score of 1.0000 below). Example inputs and scores for constellation Arch60 on objective A1 (swing weight 0.0652):

     Parameter                                               | Arch60 input       | Score  | Weighted
     GIFOV, IR / Vis / NIR (km)                              | 2 / 0.5 / 1        | 0.7000 | 0.0457
     Sampling frequency (update rate)                        | 10 min             | 0.3037 | 0.0198
     Latency (image time to delivery)                        | 1 min              | 1.0000 | 0.0652
     Mesoscale, number of regions in CONUS (#)               | 2                  | 0.7000 | 0.0457
     Mesoscale update rate                                   | 0.5 min            | 0.7000 | 0.0457
     Mesoscale latency (image time to delivery)              | 0.5 min            | 0.7000 | 0.0457
     Wavelengths covered (micron, lower - upper - day/night) | 0.47 - 13.6 - 0.64 | 0.8499 | 0.0554
     Number of bands (specific bands)                        | 16                 | 0.7000 | 0.0457
     Radiometric accuracy                                    | 0.1 K              | 0.7000 | 0.0457
     Navigation accuracy                                     | 1 km               | 0.7000 | 0.0457
     Sum of scores: 7.0536; final weighted score for Arch60 on objective A1: 0.0460
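The per-parameter score and the final weighted objective value can be reproduced from the numbers on this slide. A sketch under our reading of the formula (the clipping to [0, 1] is inferred from the data, and small differences against the slide come from the rounded β):

```python
def param_score(z, z_st, z_me, beta):
    """Normalized achievement between Study Threshold (0) and Max Effective (1),
    clipped to [0, 1], raised to the SPRWG-derived exponent beta."""
    x = (z - z_st) / (z_me - z_st)
    return min(max(x, 0.0), 1.0) ** beta

# Update rate: Arch60 delivers 10 min (ST = 30, ME = 2.5, beta = 3.74)
print(round(param_score(10, 30, 2.5, 3.74), 4))  # ~0.3039 (slide: 0.3037)

# Objective value for A1: swing weight times the mean of the 10 parameter scores
scores = [0.7, 0.3037, 1.0, 0.7, 0.7, 0.7, 0.8499, 0.7, 0.7, 0.7]
value = 0.0652 * sum(scores) / len(scores)
print(round(sum(scores), 4), round(value, 4))  # 7.0536 0.046
```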

  7. EVM: Group A and Group B Objectives • Group A (Terrestrial Weather) and Group B (Space Weather) weights recommended by the SPRWG • Draws on priority for improvements between the Study Threshold and the Maximum Effective level • Weights reflect objective interweaving performed by NOAA leadership • Final weights assigned by tanh function

     Overall swing weights, Group A (Terrestrial Weather):
     A1  Real-time Regional weather imagery                                       0.065216
     A2  Global real-time weather imagery                                         0.058438
     A3  Non-Real-Time global weather imagery (Vis and IR) other than ocean color 0.032066
     A4  Global ocean color/phytoplankton composition                             0.028707
     A5  Global RT vertical IR soundings                                          0.052690
     A6  Regional (CONUS) RT vert IR soundings                                    0.004364
     A7  Global RT vertical MW soundings                                          0.055681
     A8  Regional (CONUS) RT vert MW sounding                                     0.004956
     A9  Global GNSS-RO soundings                                                 0.063206
     A10 Lightning                                                                0.022560
     A11 Sea surface height (global)                                              0.003972
     A12 Ocean surface vector wind (OSVW)                                         0.042643
     A13 3-D winds (Horizontal wind in 3D)                                        0.066988
     A14 Ozone - global vertical profiles in troposphere and stratosphere and total column 0.003545
     A15 Microwave imagery                                                        0.025524
     A16 Outgoing Longwave Radiation (OLR)                                        0.003435
     A17 Incoming solar radiation (TOA)                                           0.003364
     A18 Radar-Based Global Precipitation rate                                    0.005840
     A19 Global soundings chemical concentrations                                 0.003714

     Overall swing weights, Group B (Space Weather):
     B1  Coronagraph imagery: Sun-Earth line                                      0.046128
     B2  Coronagraph imagery: Off Sun-Earth line                                  0.049493
     B3  Solar EUV imaging                                                        0.004148
     B4  Photospheric magnetogram imagery: Sun-Earth line                         0.005355
     B5  Photospheric magnetogram imagery: Off Sun-Earth line                     0.019845
     B6  Solar X-ray irradiance                                                   0.004630
     B7  Solar EUV irradiance                                                     0.003830
     B8  Interplanetary Solar wind: Sun-Earth line                                0.003621
     B9  Interplanetary Solar wind: Off Sun-Earth line                            0.007995
     B10 Heliospheric images                                                      0.017396
     B11 Interplanetary Energetic particles                                       0.003484
     B12 Interplanetary Magnetic Field                                            0.003317
     B13 Geomagnetic field                                                        0.003338
     B14 Geospace Energetic particles                                             0.003396
     B15 Ionospheric electron density profiles                                    0.010226
     B16 Auroral imaging                                                          0.015219
     B17 Thermospheric O/N2 ratio (height integrated)                             0.013307
     B18 Upper thermospheric density                                              0.011649
     B19 Ionospheric drift velocity                                               0.009016

  8. EVM: Group D Objectives • Group D (Strategic) weights defined by NOAA leadership • Group D objectives assessed through evaluation of constellation properties: imaging and sounding availability; availability for the least-available objective; exceedances of budget; duration of program, reflecting adaptability; ability to form international partnerships (normalized); Technology Readiness Level by PDR

     Overall swing weights, Group D (Strategic):
     D1 Assurance of Core Capabilities                  0.068538
     D2 Compatibility with fixed budgets                0.060948
     D3 Assurance of all capabilities                   0.039096
     D4 Programmatic Flexibility and Adaptability       0.035549
     D5 Develop and Maintain International Partnerships 0.007140
     D6 Low Risk at Constellation Level                 0.006453
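As a consistency check on the reconstructed tables, the swing weights listed on slides 7 and 8 sum to approximately 1 across all three groups. A sketch (the grouping into Python lists is ours; the values are as listed above):

```python
group_a = [0.065216, 0.058438, 0.032066, 0.028707, 0.052690, 0.004364,
           0.055681, 0.004956, 0.063206, 0.022560, 0.003972, 0.042643,
           0.066988, 0.003545, 0.025524, 0.003435, 0.003364, 0.005840,
           0.003714]
group_b = [0.046128, 0.049493, 0.004148, 0.005355, 0.019845, 0.004630,
           0.003830, 0.003621, 0.007995, 0.017396, 0.003484, 0.003317,
           0.003338, 0.003396, 0.010226, 0.015219, 0.013307, 0.011649,
           0.009016]
group_d = [0.068538, 0.060948, 0.039096, 0.035549, 0.007140, 0.006453]

total = sum(group_a) + sum(group_b) + sum(group_d)
print(round(sum(group_a), 4), round(sum(group_b), 4),
      round(sum(group_d), 4), round(total, 4))
```

The per-group subtotals also show the relative emphasis: terrestrial weather (Group A) carries a little over half the total weight, with space weather (Group B) and strategic objectives (Group D) splitting the remainder.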

  9. Groups A, B, and D Value Summary by Objective for Cycle 3 (C60 - C96). [Charts: value vs. constellation number for Group A, Group B, and Group D objectives, constellations 60-69, 70-78, 80-88, and 90-96.] The constellation series are: 60s series: Modernized Legacy Continuation/Expansion, building up from the current system; 70s series: Exclusive Cost-Benefit, building up from hosted payloads; 80s series: Hybrid, a mixture of US Government satellites and hosted payloads; 90s series: EXP focused.

  10. Efficient Frontier Shows Value vs. Cost. [Chart: constellation value plotted against cost. The efficient frontier separates the currently unachievable region (more value than any candidate delivers at a given cost) from the cost-inefficient, lower-value region below it, where more value is potentially available for the same cost.]
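The efficient frontier on this slide is the standard Pareto frontier over (cost, value) points: a constellation is on the frontier if no cheaper-or-equal alternative delivers more value. A minimal sketch with hypothetical (cost, value) pairs — none of these numbers are from the study:

```python
def efficient_frontier(points):
    """Return the subset of (cost, value) points not dominated by any
    cheaper alternative with higher value, in ascending cost order."""
    frontier, best = [], float("-inf")
    for cost, value in sorted(points):
        if value > best:           # strictly more value than anything cheaper
            frontier.append((cost, value))
            best = value
    return frontier

# hypothetical (cost, value) pairs for candidate constellations
archs = [(10, 0.42), (12, 0.40), (14, 0.55), (18, 0.52), (20, 0.61)]
print(efficient_frontier(archs))  # [(10, 0.42), (14, 0.55), (20, 0.61)]
```

Points below the returned frontier are the "cost inefficient / lower value (potential)" region of the chart; points above it are currently unachievable.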

  11. Value Error Bars Reflect Uncertainty in Assessment and True Constellation Differences • Value error bars have two components, combined by root-sum-square (RSS) • MIT LL error-bar component: separately reported feasible maximum and minimum values; a Monte Carlo analysis over 1000 trials was used to generate distributions, and points on the distributions were chosen to combine with the second error term, developed by Eric Wendoloski of Aerospace • Aerospace error-bar component: sensitivity to judgements on the ordering and weighting of objectives • Cost error bars computed by Kiley Yeakel of Aerospace
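The two-component combination described above can be sketched as follows. This is illustrative only: the Monte Carlo spread and the second (weighting-sensitivity) component use made-up magnitudes, not the study's values.

```python
import math
import random

def rss(*components):
    """Root-sum-square combination of independent error components."""
    return math.sqrt(sum(c * c for c in components))

# Monte Carlo spread of a value score under hypothetical assessment
# uncertainty (nominal score 0.70, assumed sigma 0.03), 1000 trials
# as on the slide, scores clipped to the valid [0, 1] range.
random.seed(0)
nominal, sigma_assess = 0.70, 0.03
trials = [min(max(random.gauss(nominal, sigma_assess), 0.0), 1.0)
          for _ in range(1000)]
mean = sum(trials) / len(trials)
spread = math.sqrt(sum((t - mean) ** 2 for t in trials) / len(trials))

sigma_weighting = 0.02          # assumed second (Aerospace) component
total_err = rss(spread, sigma_weighting)
```

RSS is the appropriate combination when the two components are treated as independent; the total error bar is therefore always at least as large as either component alone.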
