Adaptive Design in an Establishment Survey: Strategic Targeting in the Agricultural Resource Management Survey (ARMS)
Dr. Jaki McCarthy and Tyler Wilson, Research and Development Division, National Agricultural Statistics Service


  1. Adaptive Design in an Establishment Survey: Strategic Targeting in the Agricultural Resource Management Survey (ARMS). Dr. Jaki McCarthy, Tyler Wilson. Research and Development Division, National Agricultural Statistics Service

  2. ARMS
  • Annual survey run by the National Agricultural Statistics Service (NASS) and Economic Research Service (ERS) of the U.S. Department of Agriculture.
  • Three-stage survey:
    1. First stage is a screening process
    2. Second stage captures production expenses, chemical use, and area-specific commodities
    3. Third stage focuses on financial data like expenses and income
  • Stage 3 is the focus of this research

  3. ARMS Stage III
  • Sample of 30,000-40,000 farms and ranches
  • Potentially sensitive topics like finances and household characteristics
  • Long survey; interviews can exceed an hour
  • Relatively low historical response rates (50-60%)
  • Mixed-mode approach: mail and web, with in-person interview follow-up
  • Nonresponse propensity scores used to target some operations

  4. Using Trees to Predict Nonrespondents
  • Census data imported and used as a proxy for ARMS III respondent characteristics
    – Process repeated to generate alternative trees using each of 70 different variables as the starting point
    – Each tree generates different groups of nonrespondents
    – Initial efforts targeted operations with >70% nonresponse in ANY of the trees
      • Too many records identified
      • Some had low overall nonresponse propensity
    – Current efforts use average nonresponse across all trees, i.e. consistent nonrespondents
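The averaging step described above can be sketched in Python. Each toy "tree" here is reduced to a dict of per-record propensity scores (real trees would be fit to Census data); all record ids and score values are illustrative, not NASS's actual model output.

```python
# Sketch: average nonresponse propensity across multiple classification
# trees, each grown from a different starting variable. Scores and ids
# below are made up for illustration.

def average_propensity(tree_scores):
    """tree_scores: list of dicts mapping record id -> predicted
    nonresponse propensity from one tree. Returns the per-record
    average across all trees (the 'consistent nonrespondents' view)."""
    records = tree_scores[0].keys()
    return {r: sum(t[r] for t in tree_scores) / len(tree_scores)
            for r in records}

# Two toy trees scoring the same three operations
tree_a = {"op1": 0.80, "op2": 0.40, "op3": 0.75}
tree_b = {"op1": 0.70, "op2": 0.30, "op3": 0.20}

avg = average_propensity([tree_a, tree_b])
# op1 stays high under averaging; op3 is high in only one tree, so the
# average filters it out, unlike the earlier "any tree > 70%" rule.
```

This illustrates why averaging identifies fewer, more consistent nonrespondents than flagging every record that exceeds 70% in any single tree.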

  5. An Example Tree (figure: example classification tree)

  6. Initial Use of Models
  • Models provide a “profile” of operations
  • Field Offices (FOs) provided with the least-likely-to-respond operations (>70% NR)
  • FOs directed to use field office directors, deputies, and supervisory enumerators to recruit respondents
  Results:
  • Small sample sizes
  • Slight rise in response rates, but unclear

  7. A Different View?
  • The Improbables: operations highly unlikely to respond even when targeted
    – 70% or higher nonresponse propensity scores
  • The Persuadables: operations who respond when targeted
    – 50-69% nonresponse propensity scores
  • The Sure Things: operations likely to respond whether targeted or not
    – 0-49% nonresponse propensity scores
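The three-way grouping above can be expressed as a small function. The thresholds come from the slides; the function name and group labels as strings are our own shorthand.

```python
# Sketch of the propensity-score grouping; thresholds per the slides.

def propensity_group(score):
    """Map a nonresponse propensity score (0.0-1.0) to a targeting group."""
    if score >= 0.70:
        return "Improbable"   # highly unlikely to respond even if targeted
    if score >= 0.50:
        return "Persuadable"  # may respond when targeted
    return "Sure Thing"       # likely to respond regardless

print(propensity_group(0.65))  # Persuadable
```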

  8. Data Collection Driven by Nonresponse Propensity Scores
  1. Highly unlikely to respond and high impact on estimates
    • Small group of large, hard-to-get operations assigned for recruitment by high-level staff (Regional Directors and State Statisticians)
  2. Somewhat unlikely to respond
    • Testing a drop off, pick up (DOPU) distribution model that has increased response rates in previous studies (Steel et al. 2001)
    • First contact IN PERSON to solicit cooperation and drop off the questionnaire
    • Interviewer schedules a SECOND IN PERSON appointment to pick up the completed questionnaire or conduct an interview
  3. Likely responders
    • Standard procedures

  9. The Persuadables
  • Test operations identified as unlikely, but not impossible
    – 50-69% nonresponse propensity using classification trees (Earp and McCarthy 2010)
  • Why investigate this subgroup?
    – With alternative collection strategies (incentives, face-to-face…) these operations could prove more likely responders

  10. Influencing the Persuadables
  • Customized approach to hand-delivered (drop off, pick up) survey distribution (Allred and Ross-Davis 2011)
  • NASS’s hand delivery mode sought to:
    1. Increase face-to-face interaction
    2. Increase the use of token items and sponsorship
    3. Provide additional flexibility to the respondent
    4. Decrease the opportunities and ease of refusal

  11. The Drop Off Approach
  Unique to this study:
  • No pre-mailing or phone call
  • Must establish contact before ‘dropping off’
  • Required to leave the packet with the respondent even if they refuse
  • After 2 personal contact tries, the surveyor can default to standard procedures

  12. The Data Collection Bag
  The bag helps facilitate the basic requirements of cooperation in a social exchange: reciprocation, consistency, social validation, authority, liking, scarcity (Groves, Cialdini, Couper 1992)
  - Questionnaire
  - DOPU cover letter
  - Brochure
  - Postcard
  - Door hangers
  - ERS data uses fact sheet
  - NASDA token item
  - Privacy envelope

  13. Experimental Design
  • Targeted all operations with nonresponse propensity scores of 50-69% (n = 1,552)
    – Randomly split into treatment (n = 774) and control (n = 778)
    – Mandatory follow-up supplement sheet for the treatment sample to verify how each case was handled
    – Selected at a U.S. level; some regions had more cases than others
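A minimal sketch of the treatment/control split, assuming simple random assignment (the slides say only that the sample was "randomly split"; the seed and id scheme are illustrative):

```python
import random

# Illustrative random assignment of the 1,552 targeted operations
# (50-69% nonresponse propensity) into treatment and control.
random.seed(0)                      # fixed seed for reproducibility
cases = list(range(1552))           # placeholder operation ids
random.shuffle(cases)
treatment, control = cases[:774], cases[774:]

print(len(treatment), len(control))  # 774 778
```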

  14. Data Collection Mode
  Mode of interview, treatment vs. control:
  • Face-to-face: 83% treatment, 62% control
  • Telephone: 4% treatment, 6% control
  • Mail: 9% treatment, 24% control

  15. Were New Procedures Feasible in an Operational Environment?
  Determinants of feasibility:
  1. Training and outreach
  2. Handling requirements (sample & materials)
  3. Operational deadlines
  4. Specific requests from respondents
  All these determinants were tracked under a ‘catch-all’ question: Were alternative methods used? 85% said no, so the initial treatment sample was deemed viable (N = 658).

  16. Was Delivery Possible?
  Were surveyors able to establish contact to deliver the data collection packet?
  • Contact established on first attempt: 45%
  • Contact established on second attempt: 12%
  • In total, 57% (n = 388) of cases in the treatment sample were able to follow procedures and deliver the data collection packet within their two face-to-face attempts

  17. Completion Rates
  (Bar chart: percentage completed for the treatment sample by contact outcome (1st contact, 2nd contact, no contact established), compared with the control group.)

  18. Completion Rate
  • Completion percentage: treatment 60%, control 58%
  • No significant difference found between the drop-off sample and the standard-procedures sample
  • However, when surveyors were able to establish face-to-face contact, cooperation rates were very high for this unlikely group
  • Cooperation percentage when contact was established: treatment 70%
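The "no significant difference" finding can be illustrated with a two-proportion z-test. The completed-case counts below are back-calculated approximations from the reported rates (60% of 774 treatment cases, 58% of 778 control cases), not the study's actual tallies.

```python
from math import sqrt

# Hedged sketch: two-proportion z-test on approximate completion counts.
x_t, n_t = 464, 774   # approx. completed / total, treatment (~60%)
x_c, n_c = 451, 778   # approx. completed / total, control (~58%)

p_t, p_c = x_t / n_t, x_c / n_c
p_pool = (x_t + x_c) / (n_t + n_c)                      # pooled proportion
se = sqrt(p_pool * (1 - p_pool) * (1 / n_t + 1 / n_c))  # pooled std. error
z = (p_t - p_c) / se

print(round(z, 2))  # about 0.79, well inside the +/-1.96 cutoff at alpha = 0.05
```

With these assumed counts the test statistic falls far short of significance, consistent with the slide's conclusion.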

  19. Discussion
  • When drop-off procedures were followed and contact was achieved, response rates increased by 12%
  • Was it operationally possible to make a second in-person survey attempt?
  • Fewer inaccessible and out-of-scope operations in treatment
    – A possible indication of DOPU’s ability to decrease the ease of refusing the survey
  • Mandatory this year

  20. Conclusions
  • When personally contacted using a drop off, pick up process, 70% of operations previously identified as 50-69% unlikely to respond completed ARMS III
  • Main determinants:
    – Face-to-face interaction
    – Incentives
    – Fewer ‘easy’ ways to refuse
    – Flexibility
  • Need to follow up with Field Offices and surveyors to see why they could not establish contact

  21. Further Research
  • Measurement error
    – Do harder-to-get respondents provide lower-quality data? (Dahlhamer 2012)
    – Examination of raw, edited, and imputed rates between samples
  • Response bias
    – Are harder-to-get respondents different (with respect to estimates) from easier-to-obtain respondents?
    – Analysis of distributions
  • Resources
    – Is this an effective use of data collection dollars?
