Challenges of implementing adaptive design strategies in establishment surveys (AAPOR, May 16, 2014)


  1. Challenges of implementing adaptive design strategies in establishment surveys
     Melissa Mitchell, Kathy Ott, Jaki S. McCarthy
     National Agricultural Statistics Service
     ". . . providing timely, accurate, and useful statistics in service to U.S. agriculture."

  2. Outline
     • Introduction
       – National Agricultural Statistics Service (NASS)
       – Summary of the Agricultural Resource Management Survey (ARMS)
     • Method to proactively target nonrespondents
       – Decision tree modeling
     • Adaptive Design Study
     • Lessons learned from implementing the study
       – Experiments in production settings
       – Issues with targeting high-impact operations

  3. National Agricultural Statistics Service (NASS)
     • Conducts surveys of farm operations
       – Establishment surveys
     • Survey topics include agricultural
       – Production
       – Economics
       – Demographics
       – Environment
     • Every five years NASS also conducts the Census of Agriculture

  4. The Agricultural Resource Management Survey (ARMS)
     • Collects farm financial information and costs associated with producing agricultural commodities
     • Estimates at the U.S., regional, and state level (for 15 states)
     • Lengthy annual survey with historically low response rates (in the 60% range)
       – 32 pages long
       – Over an hour to complete
     • Sample sizes typically >30,000
     • Data collection primarily in person over approximately 3 months
     • Uses calibration weighting
       – Weights the respondent sample so estimated variable totals for a large set of items match "targets" determined from other sources (see the sketch below)
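
The deck doesn't show the calibration math; as a minimal sketch of the idea only (not NASS's production system), a linear (GREG-style) calibration in Python could look like this, with the toy weights, auxiliaries, and targets invented for the example:

```python
import numpy as np

def linear_calibration(d, X, targets):
    """Adjust design weights d so that the weighted totals of the auxiliary
    variables in X (n units x k items) match the k calibration targets.
    Linear (GREG-style) calibration: w_i = d_i * (1 + x_i' lam)."""
    base = X.T @ d                      # weighted totals before adjustment
    M = X.T @ (d[:, None] * X)          # k x k matrix: sum_i d_i x_i x_i'
    lam = np.linalg.solve(M, targets - base)
    return d * (1.0 + X @ lam)

# toy example: a population-count auxiliary and an acreage auxiliary
d = np.array([10.0, 10.0, 10.0, 10.0, 10.0])           # design weights
X = np.array([[1, 120.0], [1, 80.0], [1, 300.0],
              [1, 50.0], [1, 40.0]])                    # [count, acres]
targets = np.array([60.0, 7000.0])                      # known totals from other sources
w = linear_calibration(d, X, targets)
assert np.allclose(X.T @ w, targets)                    # weighted totals now hit the targets
```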

  5. Nonresponse propensity models
     • Data mining approach
       – Census of Agriculture data were used as a proxy for ARMS data (for both respondents and nonrespondents)
       – Classification tree models used to identify operations with a greater than 70% likelihood of being a survey nonrespondent (see the sketch below)
     • These operations are typically large operations (e.g., in capacity, production, or acreage)
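
A minimal sketch of this modeling step, assuming scikit-learn and simulated frame variables standing in for the Census of Agriculture items (the 70% cutoff is the one from the slide; everything else is invented):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 5000
# hypothetical frame variables standing in for Census of Agriculture items
acres = rng.lognormal(4.0, 1.0, n)
sales = rng.lognormal(10.0, 1.5, n)
# simulate nonresponse that rises with operation size
p = 1 / (1 + np.exp(-(0.8 * np.log(acres) + 0.5 * np.log(sales) - 8.0)))
nonrespondent = rng.random(n) < p

X = np.column_stack([acres, sales])
tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=100,
                              random_state=0).fit(X, nonrespondent)

# flag operations whose predicted nonresponse probability exceeds 70%
p_hat = tree.predict_proba(X)[:, 1]
likely_nonrespondents = p_hat > 0.70
```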

  6. Impact operations
     • NASS also wants to identify likely nonrespondent operations that are influential
     • These large operations may also be more critical than smaller operations to calibration weighting
       – Operations that are large relative to the calibration targets are "impact operations"
       – Rank ordered from 1 to 3 (see the sketch below)
     • This talk discusses only the impact operations that are most important to calibration weighting
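
The slides don't give the exact scoring rule, so the following is one plausible, purely illustrative way to operationalize "large relative to the calibration targets": rank each operation by its largest weighted share of any target, with made-up cutoffs.

```python
import numpy as np

def impact_rank(d, X, targets, cuts=(0.05, 0.01)):
    """Rank operations 1-3 by how much of any calibration target they
    would account for (1 = most important).  The cutoffs are invented."""
    shares = (d[:, None] * X) / targets        # n x k contribution shares
    score = shares.max(axis=1)                 # largest share per operation
    return np.where(score >= cuts[0], 1,
                    np.where(score >= cuts[1], 2, 3))

# e.g., using d, X, targets from the calibration sketch above:
# ranks = impact_rank(d, X, targets)
```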

  7. Adaptive Design Study
     • Split sample experiment (assignment sketched below)
       – 441 operations in the treatment group and 446 in the control group
       – Recall that all of the operations in the study are likely nonrespondents
     • Manipulated data collection strategies for the targeted/treatment group
       – Who goes to the operation depends on its impact score
       – For those most important to calibration weighting:
         » Initial in-person contact by the field office director or other senior-level staff
         » Data collection by experienced or supervisory interviewers
         » Interviewer incentives ($20) for hard cases
     • The comparison/control group was treated as usual
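
A minimal sketch of the split-sample assignment and the rank-dependent treatment protocol; the case IDs and protocol strings are invented, and only the group sizes and the $20 incentive come from the slide:

```python
import random

random.seed(2012)
cases = [f"op{i:04d}" for i in range(887)]     # 441 + 446 likely nonrespondents
random.shuffle(cases)
treatment, control = cases[:441], cases[441:]

def protocol(impact_rank):
    """Treatment-group procedures by impact rank (1 = most important)."""
    if impact_rank == 1:
        return ["initial in-person contact by senior field office staff",
                "data collection by experienced/supervisory interviewer",
                "$20 interviewer incentive for hard cases"]
    return ["standard field procedures"]       # control cases always get these
```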

  8. Overall ARMS response rates

                     Likely nonrespondents   Other
     Complete               55.6%            73.1%
     Refusal                36.9%            21.8%
     Noncontact              4.9%             4.2%
     Office hold             2.9%             0.9%

     • Our models predicted nonresponse well

  9. Response rates for the ARMS study

                     Targeted records   Control records
     Complete             55.3%              55.2%
     Refusal              36.2%              37.5%
     Noncontact            4.8%               5.2%
     Office hold           3.7%               2.1%

     • Our treatment did not improve response rates

  10. Why didn't our treatment work?
      • Debriefing with state offices
        – Procedures weren't followed (only 25% compliance for high-impact cases)
        – Procedures weren't followed because of:
          » Field office reorganization and lack of available staff ("office staff were not available to contact in person")
          » Lack of funds
          » Dislike of the interviewer incentives ("Supervisors were in agreement that incentives are best used in recognition of the group, not the individual.")

  11. What does this study reinforce?
      • Our models to identify likely nonrespondents work fairly well
        – How we harness and use this information is still an area of growth
      • Discussions currently ongoing
        – Should we change the focus of who we target with the nonresponse propensity scores?

  12. Difficulty of experiments in production
      • Control versus treatment procedures
        – No one wants to adhere to control procedures that may make things worse
        – Not enough buy-in to the study
      • Lack of control over how implementation is carried out
        – Directions not followed as anticipated (or at all)
        – Cannot monitor in real time (only after the fact)
        – No consequences for not following instructions
      • Progress is slow when testing in production
        – This survey is conducted only once a year, so we get one chance to experiment before waiting an entire calendar year to try something else

  13. Difficulty of targeting impact operations
      • These operations may already have special arrangements with the regional field offices
        – We are currently developing a program to formalize these special arrangements
      • No one wants to experiment with these cases in particular
        – Experimentation may have unanticipated consequences
      • These operations are just hard to get

  14. Next steps
      • Adjustments made for the ARMS 2013 study
        – Work with a smaller group of states: 12 states in 4 regional offices
        – Use the same procedures as the 2012 study
        – Work closely on training, and track the study in these states
      • More communication between research staff and the field staff carrying out the procedures
      • Already a sign of improvement: several states have requested travel funds to send people out to the field
      • The study is still in the field, so we don't have results yet

  15. Where should we go from here?
      • More questions than answers
        – Should effort be redirected from the hardest-to-get cases to the high-impact operations predicted to be more likely to respond?
        – Should we re-examine our definition of an impact operation?
        – What other data collection strategies can we employ?
          » Mode switching
          » Letters
          » Incentives

  16. Contact info
      • Email: melissa.mitchell@nass.usda.gov
      • Phone: 703-877-8000 x 141
