Plans for investigating time-dependent biases in AIRS temperature profiles (PowerPoint presentation)



SLIDE 1

Plans for investigating time-dependent biases in AIRS temperature profiles

Bill Irion
Jet Propulsion Laboratory, California Institute of Technology

With thanks to:
Chris Barnet, Murty Divakarla, Eric Maddy (NOAA)
John Blaisdell, Thomas Hearty, Gyula Molnar (GSFC)
George Aumann, Eric Fetzer, Evan Fishbein, Steve Friedman, Bjorn Lambrigtsen, Sung-Yung Lee, Evan Manning, Ed Olsen, Joao Teixeira (Jet Propulsion Laboratory, California Institute of Technology)


SLIDE 2

Comparisons of trends in AIRS V5 (Final and MW products), AVN and ECMWF to sonde temperatures

Courtesy of Eric Maddy


SLIDE 3

V5.0.14 + AVN Tq a priori + CarbonTracker CO2 + no AMSU + CO2 noise covariance matrix ON

[Figure: trend plots; panel annotation "NO CO2 NCV"]

Eric Maddy


SLIDE 4

V5.0.14 + AVN Tq a priori + CarbonTracker CO2 + no AMSU + CO2 noise covariance matrix OFF

Eric Maddy


SLIDE 5

What do we know?

  • V5.0.14 shows a time-dependent bias w.r.t. sondes of ~ -100 mK yr⁻¹; similar results against ECMWF
  • Microwave-only retrievals show a lesser, but still significant, bias of ~ -50 mK yr⁻¹
  • Switching to CarbonTracker CO2 plus AVN Tq a priori with CO2 NCV helps at 200 mb and in the boundary layer, but makes little difference in between
    – Turning CO2 NCV off brings the bias rate down to ~ -50 mK yr⁻¹
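The bias rates quoted above (e.g. ~ -100 mK yr⁻¹ against sondes) are linear trends fit to a retrieval-minus-truth time series. As a minimal sketch of how such a rate can be estimated, here is an ordinary least-squares fit in Python on synthetic data; the function name and the synthetic series are illustrative, not the actual AIRS/sonde matchup record:

```python
import numpy as np

def bias_trend_mk_per_yr(times_yr, bias_k):
    """Least-squares linear trend of a bias time series.
    times_yr: sample times in decimal years; bias_k: retrieval-minus-
    truth biases in kelvin.  Returns the slope in mK per year."""
    slope_k_per_yr, _intercept = np.polyfit(times_yr, bias_k, 1)
    return 1000.0 * slope_k_per_yr

# Synthetic monthly series with a built-in -0.1 K/yr drift
t = np.arange(2003.0, 2009.0, 1.0 / 12.0)
bias = -0.1 * (t - t[0]) + 0.2
print(bias_trend_mk_per_yr(t, bias))  # ~ -100 mK/yr
```

In practice the series would be monthly mean biases at a fixed pressure layer, and the fit uncertainty matters as much as the slope itself.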


SLIDE 6

AIRS Retrieval System

[Flowchart: L0 MW / L0 IR → L1A MW / L1A IR → MW calibration / IR calibration → L1B MW / L1B IR → MW-only retrieval (MW-Only Product); cloudy regression; cloud-clearing #1; regression solution products; physical retrieval #1 with error estimation & quality control → L2 product; cloud parameter retrievals #1-#3; cloud-clearing #2-#3; physical retrieval #2; AMSU retrievals #1-#2]


SLIDE 7

AIRS Retrieval System

[Flowchart repeated from Slide 6]

Next Steps:

  • 1. A test bed for the AIRS PGE, with hooks and routines to write out results (incl. Jacobians etc.) from each processing step.
  • Needed to evaluate trends introduced by each process; this problem appears to have more than one contributor.
  • A turnkey system will be useful for later algorithm testing and evaluation.
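The hook-and-dump idea in item 1 can be illustrated with a small callback registry. This is a sketch of the pattern only; the actual AIRS PGE is not Python, and `PipelineTestBed`, `run_step`, and the stub step names are hypothetical:

```python
class PipelineTestBed:
    """Wrap each processing step so registered hooks see its outputs
    (retrieved state, Jacobians, etc.) before they flow downstream."""

    def __init__(self):
        self.hooks = []  # each hook: callable(step_name, outputs)

    def register(self, hook):
        self.hooks.append(hook)

    def run_step(self, name, step_fn, inputs):
        outputs = step_fn(inputs)
        for hook in self.hooks:
            hook(name, outputs)  # e.g. write Jacobians to a file
        return outputs

# Record what each (stub) step produced
dumped = []
tb = PipelineTestBed()
tb.register(lambda name, out: dumped.append((name, out)))
x = tb.run_step("mw_only_retrieval", lambda v: v + 1, 1)
x = tb.run_step("physical_retrieval_1", lambda v: v * 10, x)
print(dumped)  # [('mw_only_retrieval', 2), ('physical_retrieval_1', 20)]
```

Because the steps themselves are untouched, the same wrapper serves later algorithm testing: swap in a modified step, rerun, and diff the dumped intermediates.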

SLIDE 8

AIRS Retrieval System

[Flowchart repeated from Slide 6]

Next Steps:

  • 2. A consistent set of radiosonde observations and ECMWF days (i.e. focus days) against which to compare AIRS results.
  • We're looking for small but robust effects, so a consistent validation set across different tests will be important.
  • Radiosondes remain the best "truth," but ECMWF comparison allows tests under more atmospheric states (different regions, seasons, cloud conditions, etc.)
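A consistent validation set also means a fixed collocation criterion between sondes and AIRS footprints. A sketch of one common form of criterion, a great-circle distance cap plus a time window; the field names and thresholds are illustrative, not the project's actual matchup rules:

```python
import math

def is_matchup(sonde, airs, max_km=100.0, max_hr=3.0):
    """True if an AIRS footprint collocates with a radiosonde launch:
    within max_km great-circle distance (haversine) and within max_hr
    hours.  Inputs are dicts with 'lat', 'lon' (degrees) and 'time'
    (hours); this schema is illustrative only."""
    r_earth_km = 6371.0
    la1, lo1, la2, lo2 = map(
        math.radians, (sonde["lat"], sonde["lon"], airs["lat"], airs["lon"])
    )
    a = (math.sin((la2 - la1) / 2) ** 2
         + math.cos(la1) * math.cos(la2) * math.sin((lo2 - lo1) / 2) ** 2)
    dist_km = 2 * r_earth_km * math.asin(math.sqrt(a))
    return dist_km <= max_km and abs(sonde["time"] - airs["time"]) <= max_hr

sonde = {"lat": 40.0, "lon": -105.0, "time": 12.0}
print(is_matchup(sonde, {"lat": 40.2, "lon": -105.1, "time": 13.0}))  # True
print(is_matchup(sonde, {"lat": 48.0, "lon": -105.0, "time": 13.0}))  # False
```

Whatever thresholds are chosen, freezing them across all tests is what makes the small trend differences comparable.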



SLIDE 9

AIRS Retrieval System

[Flowchart repeated from Slide 6]

Next Steps:

  • 3. A consistent method of vertically averaging and calculating biases and trends.
  • This must be chosen carefully: too fine a vertical region (say, under the AIRS resolution) may produce an erroneous trend, while too thick a region may mask it.
  • Different hypotheses will be tested by different groups, so we need assurance that results can be properly compared to each other.
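One simple, reproducible convention for the vertical averaging in item 3 is a mean over a fixed pressure layer, weighted uniformly in ln(p). The sketch below is one possible way to pin the convention down, not the method the groups have agreed on; the function name and the ln(p)-uniform weighting are illustrative choices:

```python
import numpy as np

def layer_mean_temperature(p_hpa, t_k, p_top, p_bot, n=200):
    """Mean temperature over the layer [p_top, p_bot] (hPa), averaging
    on a grid uniform in ln(p), with T interpolated linearly in ln(p).
    p_hpa must be strictly increasing (top of atmosphere first)."""
    lnp_fine = np.linspace(np.log(p_top), np.log(p_bot), n)
    t_fine = np.interp(lnp_fine, np.log(p_hpa), t_k)
    return float(np.mean(t_fine))

# An isothermal layer averages to its own temperature
p = np.array([100.0, 200.0, 500.0, 1000.0])
t = np.full_like(p, 250.0)
print(layer_mean_temperature(p, t, 150.0, 800.0))  # 250.0
```

Fixing the layer bounds and the weighting once, and applying them identically to AIRS, sonde, and ECMWF profiles, is what makes the resulting bias rates comparable across groups.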


SLIDE 10

Discussion