SLIDE 1

“. . . providing timely, accurate, and useful statistics in service to U.S. agriculture.”

Challenges of implementing adaptive design strategies in establishment surveys

Melissa Mitchell Kathy Ott Jaki S. McCarthy National Agricultural Statistics Service

SLIDE 2

AAPOR May 16, 2014

Outline

  • Introduction

– National Agricultural Statistics Service (NASS)
– Summary of the Agricultural Resource Management Survey (ARMS)

  • Method to proactively target non-respondents

– Decision tree modeling

  • Adaptive Design Study
  • Lessons learned from implementing the study

– Experiments in production settings
– Issues with targeting high impact operations

SLIDE 3

National Agricultural Statistics Service (NASS)

  • Conducts surveys of farm operations

– Establishment surveys

  • Survey topics include agricultural

– Production
– Economics
– Demographics
– Environment

  • Every five years, NASS also conducts the Census of Agriculture

SLIDE 4

The Agricultural Resource Management Survey (ARMS)

  • Collects farm financial information and costs associated with producing agricultural commodities
  • Estimates at the U.S., regional, and state level (for 15 states)
  • Lengthy annual survey with historically low response rates (in the 60% range)

– 32 pages long
– Over an hour to complete

  • Sample sizes typically >30k
  • Data collection primarily in person over approx. 3 months
  • Uses calibration weighting

– Weights the respondent sample so that estimated variable totals for a large set of items match “targets” determined from other sources
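The calibration idea can be sketched with a small linear (GREG-style) calibration example. This is a generic illustration on made-up numbers, not NASS's production weighting system: base weights are adjusted as little as possible so that weighted totals of auxiliary variables hit known targets.

```python
import numpy as np

# Toy respondent sample: base (design) weights and two auxiliary
# variables per respondent (e.g., acreage and head of cattle).
# All numbers are invented for illustration.
d = np.array([10.0, 12.0, 8.0, 15.0, 9.0])   # base weights
X = np.array([[100.,  5.],                    # auxiliary data, one row per respondent
              [250., 12.],
              [ 80.,  3.],
              [400., 20.],
              [120.,  6.]])
targets = np.array([12000., 550.])            # "known" population totals

# Linear calibration: find adjusted weights w close to d such that the
# weighted auxiliary totals match the targets exactly:
#   w_i = d_i * (1 + x_i @ lam), with lam chosen so that X.T @ w = targets
A = X.T @ (d[:, None] * X)                    # 2x2 calibration system
lam = np.linalg.solve(A, targets - X.T @ d)
w = d * (1.0 + X @ lam)

print(np.allclose(X.T @ w, targets))          # prints True: totals now match
```

One consequence visible even in this toy example: respondents that are large on the auxiliary variables can receive sharply adjusted weights, which is why unusually large operations matter so much to the calibration.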

SLIDE 5

Nonresponse propensity models

  • Data mining approach

– Census of Agriculture data were used as a proxy for ARMS data (both respondents and nonrespondents)
– Classification tree models used to identify

  • Operations with greater than 70% likelihood of being a survey nonrespondent
  • These operations are typically large operations (i.e., capacity, production, acreage)
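A classification-tree propensity model of this kind can be sketched with scikit-learn. This is a generic illustration on synthetic data; NASS's actual variables, software, and model specifications are not shown here.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic frame where larger operations refuse more often (illustrative only)
n = 2000
acres = rng.lognormal(mean=5.0, sigma=1.0, size=n)
sales = rng.lognormal(mean=10.0, sigma=1.5, size=n)
p_true = 1.0 / (1.0 + np.exp(-(np.log(acres) - 5.5)))   # hidden nonresponse propensity
nonrespondent = rng.random(n) < p_true

# Fit a classification tree to predict nonresponse from frame variables
X = np.column_stack([acres, sales])
tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=50, random_state=0)
tree.fit(X, nonrespondent)

# Flag operations whose predicted nonresponse propensity exceeds 70%
propensity = tree.predict_proba(X)[:, 1]
likely_nonrespondents = propensity > 0.70
print(likely_nonrespondents.sum(), "operations flagged for targeting")
```

In practice the tree would be fit on one data source (here, the role the Census of Agriculture played) and the resulting propensity scores applied to the survey sample before data collection begins.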

SLIDE 6

Impact Operations

  • NASS also wants to identify likely nonrespondent operations that are influential
  • These large operations may also be more critical than smaller operations to calibration weighting

– Operations large relative to calibration targets are “impact operations”
– Rank ordered from 1 to 3

  • This talk will only discuss impact operations that are most important to calibration weighting

SLIDE 7

Adaptive Design Study

  • Split sample experiment

– 441 operations in treatment group & 446 operations in control
– Recall that all of the operations in the study are likely nonrespondents

  • Manipulated data collection strategies

– For operations in the targeted/treatment group

  • Manipulated who goes to the operations based on impact score

– For those most important to the calibration weighting

» Initial in-person contact by field office director or other senior-level staff
» Data collection by experienced or supervisory interviewers
» Interviewer incentives ($20) for hard cases

– For operations in the comparison/control group

  • Treat as you normally would
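The split-sample assignment above can be sketched in a few lines. The 441/446 split mirrors the study's design; the record IDs and the seed are invented for illustration.

```python
import random

# Toy illustration of a split-sample assignment: randomly divide the
# likely nonrespondents into a targeted (treatment) group and a control group.
random.seed(2012)

likely_nonrespondents = [f"op_{i:04d}" for i in range(887)]  # hypothetical record IDs
random.shuffle(likely_nonrespondents)

treatment = likely_nonrespondents[:441]   # adaptive procedures applied
control = likely_nonrespondents[441:]     # treated as usual

print(len(treatment), len(control))       # prints: 441 446
```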

SLIDE 8

Overall ARMS Response Rates

              Likely Nonrespondents   Other
Complete              55.6%           73.1%
Refusal               36.9%           21.8%
Noncontacts            4.9%            4.2%
Office Hold            2.9%            0.9%

  • Our models predicted nonresponse well

SLIDE 9

Response Rate for ARMS Study

              Targeted records   Control records
Complete           55.3%              55.2%
Refusal            36.2%              37.5%
Noncontact          4.8%               5.2%
Office Hold         3.7%               2.1%

  • Our treatment did not improve response rates

SLIDE 10

Why didn’t our treatment work?

  • Debriefing with state offices

– Procedures weren’t followed

  • 25% compliance for high impact operations

– Procedures weren’t followed because of:

» Field office reorganization; lack of available staff

  • “office staff were not available to contact in person”

» Lack of funds
» Some didn’t like the interviewer incentives

  • “Supervisors were in agreement that incentives are best used in recognition of the group, not the individual.”

SLIDE 11

What does this study reinforce?

  • Our models to identify likely nonrespondents work fairly well

– How we harness & use this information is still an area of growth

  • Discussions currently ongoing

– Change the focus of who we target with the nonresponse propensity scores?

SLIDE 12

Difficulty of experiments in production

  • Control versus treatment procedures

– No one wants to adhere to control procedures that may make things worse
– Not enough buy-in to the study

  • Lack of control over how implementation is carried out

– Directions not followed as anticipated (or at all)
– Cannot monitor in real time (only after the fact)
– No consequences for not following instructions

  • Progress is slow with testing in production

– This survey is only done once a year, so we only get one chance to experiment with it before waiting an entire calendar year to try something else

SLIDE 13

Difficulty of targeting impact operations

  • These operations may already have special arrangements with the regional field offices

– Currently, we are developing a program to formalize these special arrangements

  • No one wants to experiment with these cases in particular

– Experimentation may have unanticipated consequences

  • These operations are just hard to get

SLIDE 14

Next steps

  • Adjustments made to our ARMS 2013 study

– Work with a smaller group of states

  • 12 states in 4 regional offices

– Use the same procedures as the 2012 study
– Working closely with training and tracking the study in these states

  • More communication between research staff and field staff conducting field procedures
  • Already a sign of improvement: travel funds requested for a number of states (so they are requesting funds to send people out to the field)
  • Still in the field, so we don’t have results yet

SLIDE 15

Where should we go from here?

  • More questions than answers

– Should more effort be redirected from the hardest-to-get cases to those high impact operations predicted to be more likely to respond?
– Should we re-examine our definition of impact operation?
– What other data collection strategies can we employ?

  • Mode-switching
  • Letters
  • Incentives

SLIDE 16

Contact Info

  • Email: melissa.mitchell@nass.usda.gov
  • Phone: 703-877-8000 x 141
