
Structure of Responsive and Adaptive Design (R)evolutions: In Memory of Professor Stephen E. Fienberg, 1942-2016 - PowerPoint PPT Presentation



  1. Structure of Responsive and Adaptive Design (R)evolutions In Memory of Professor Stephen E. Fienberg, 1942-2016 Responsive and Adaptive Design Workshop March 14, 2018 Asaph Young Chun U.S. Census Bureau Guest Editor, Journal of Official Statistics Special Issue on Responsive and Adaptive Survey Design

  2. What is RAD? RAD = Wonderful, extraordinary! (Youth slang)

  3. The JOS Special Issue in Memory of Professor Stephen Fienberg (1942-2016) • University Professor of Statistics and Social Science at Carnegie Mellon University • Transformative contributor to JOS for decades (e.g., Fienberg 1994; Fienberg and Makov 1998) • Guest Editor of JOS Special Issue on Disclosure Limitation Methods (Fienberg and Willenborg 1998) • Plenary Speaker at the JOS 30th Anniversary Conference (Fienberg 2015)

  4. Outline • Introduction • Reflections on RAD • Overview of the Special Issue • Comments Tailored to Two Papers • Challenges Remaining for RAD • Conclusions

  5. Introduction • A rapidly changing survey environment requires nimble, flexible designs • Birth of responsive and adaptive survey design (Groves and Heeringa 2006; Wagner 2008) • RAD continues to evolve

  6. Triple Phenomena to Watch • Computerization of survey data collection enables real-time analysis of paradata (Couper, 1998) • Methods from fields as diverse as machine learning, operations research, and Bayesian statistics are useful (Early, Mankoff and Fienberg, 2017) • Evidence-driven policy makers as well as survey researchers have renewed their attention to administrative records (Chun 2009; Chun, Larsen, Reiter, and Durrant, forthcoming 2018)
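As a small illustration of the first bullet, call-record paradata can be rolled up into daily fieldwork indicators as soon as the records arrive. The record layout and values below are hypothetical, invented only to show the idea.

```python
# Minimal sketch: turning call-record paradata into daily fieldwork indicators.
# The record layout and values are hypothetical.
from collections import defaultdict

call_records = [
    # (day, case_id, outcome) with outcome in {"no_contact", "refusal", "interview"}
    (1, "A01", "no_contact"), (1, "A02", "interview"), (1, "A03", "refusal"),
    (2, "A01", "interview"),  (2, "A04", "no_contact"), (2, "A05", "interview"),
]

daily = defaultdict(lambda: {"calls": 0, "interviews": 0})
for day, _, outcome in call_records:
    daily[day]["calls"] += 1
    daily[day]["interviews"] += outcome == "interview"

for day in sorted(daily):
    d = daily[day]
    print(f"day {day}: {d['calls']} calls, "
          f"{d['interviews'] / d['calls']:.0%} ending in an interview")
```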

  7. Reflections on RAD • The birth of RAD is a natural reaction to the basic rationale of survey design: addressing response and measurement errors in population subgroups • A systematic approach to adaptive design evolved (Schouten et al. 2013) • The evolution of RAD is driven by: increasing pressure on response rates, use of paradata, and IT-driven data collection methods

  8. Responsive vs. Adaptive • Responsive survey design originates from settings with less auxiliary data, long data collection periods and detailed quality-cost constraints • Adaptive survey design comes from settings with richer auxiliary data, short data collection periods and structural variation

  9. What Drives RAD? • Use of auxiliary data • Design features/interventions • Explicit quality and cost metrics • Quality-cost optimization
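To make the last bullet concrete, the sketch below treats quality-cost optimization as spending a follow-up budget where it most reduces the spread of response propensities across subgroups. The subgroups, propensities, per-attempt costs, and budget are all hypothetical, and a real design would use a formal optimization rather than this greedy rule.

```python
# Toy sketch of a quality-cost trade-off in adaptive follow-up allocation.
# Subgroup names, propensities, costs, gains, and the budget are hypothetical.

subgroups = {
    # group: (current response propensity, cost per extra attempt, propensity gain per attempt)
    "young_renters":   (0.35, 40.0, 0.05),
    "older_owners":    (0.70, 25.0, 0.02),
    "rural_addresses": (0.50, 60.0, 0.04),
}
budget = 2000.0

def imbalance(props):
    """Quality metric: spread of response propensities (smaller is better)."""
    mean = sum(props.values()) / len(props)
    return sum((p - mean) ** 2 for p in props.values()) / len(props)

props = {g: v[0] for g, v in subgroups.items()}
spent = 0.0
# Greedy rule: repeatedly buy the follow-up attempt with the best imbalance
# reduction per dollar, until the budget runs out or nothing helps.
while spent < budget:
    best, best_gain = None, 0.0
    for g, (_, cost, gain) in subgroups.items():
        if spent + cost > budget:
            continue
        trial = dict(props)
        trial[g] = min(1.0, trial[g] + gain)
        reduction = imbalance(props) - imbalance(trial)
        if reduction / cost > best_gain:
            best, best_gain = g, reduction / cost
    if best is None:
        break
    props[best] = min(1.0, props[best] + subgroups[best][2])
    spent += subgroups[best][1]

print(f"spent {spent:.0f}, final propensities {props}, imbalance {imbalance(props):.4f}")
```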

  10. What Stems the Growth of Responsive & Adaptive Design? • Literature tends to be produced by survey statisticians rather than survey managers • Such survey designs demand more complex monitoring and case management systems, as well as explicit cost-quality control • The number of success stories is limited

  11. Overview of JOS Special Issue • Provide formalized rules for adaptation. • Examine the impact of responsive and adaptive designs on the quality of estimates • Consider adaptive design tailored to panel surveys • Examine responsive and adaptive designs for establishment surveys

  12. Rules for adaptive design • Paiva, T., Reiter, J. Stop or Continue Data Collection: A Nonignorable Missing Data Approach for Continuous Variables • Lewis, T. Univariate Tests for Phase Capacity: Tools for Identifying When to Modify a Survey’s Data Collection Protocol • Early, K., Mankoff, J., Fienberg, S. Dynamic Question Ordering in Online Surveys • Vandenplas, C., Loosveldt, G., Beullens, K. Fieldwork Monitoring for the European Social Survey: An Illustration with Belgium and the Czech Republic in Round 7 • Burger, J., Perryck, K., Schouten, B. Robustness of Adaptive Survey Designs to Inaccuracy of Design Parameters
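The first two papers in this group ask when additional data collection stops changing the estimates. The sketch below illustrates that kind of phase-capacity check in the simplest possible form; the estimate series, window, and tolerance are hypothetical and not taken from either paper.

```python
# Sketch of a phase-capacity style stopping check. The daily estimates,
# window, and tolerance below are hypothetical.

def reached_phase_capacity(daily_estimates, window=3, tol=0.002):
    """Flag phase capacity when a key estimate has moved less than `tol`
    over each of the last `window` days of data collection."""
    if len(daily_estimates) <= window:
        return False
    recent = daily_estimates[-(window + 1):]
    changes = [abs(b - a) for a, b in zip(recent, recent[1:])]
    return all(c < tol for c in changes)

# Example: a cumulative estimate that stabilizes as more cases arrive.
estimates = [0.412, 0.431, 0.438, 0.4405, 0.4412, 0.4415, 0.4416]
for day in range(1, len(estimates) + 1):
    if reached_phase_capacity(estimates[:day]):
        print(f"Phase capacity reached on day {day}: consider switching protocol.")
        break
```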

  13. Impact of responsive and adaptive designs on the quality of estimates • Lundquist, P., Särndal, C. Inconsistent Regression and Nonresponse Bias: Exploring Their Relationship as a Function of Response Imbalance • Brick, M., Tourangeau, R. Responsive Survey Designs for Reducing Nonresponse Bias
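Both papers tie nonresponse bias to how unevenly response accumulates across the sample. As a toy illustration, the sketch below tracks a simple weighted-variance measure of response imbalance across auxiliary groups; the groups, shares, and rates are invented, and the measure is not necessarily the one used by Lundquist and Särndal.

```python
# Hypothetical sketch of monitoring response imbalance across auxiliary groups.
# Group shares and response rates are invented for illustration.

groups = {
    # group: (share of sample, current response rate)
    "urban_young":  (0.30, 0.42),
    "urban_older":  (0.25, 0.61),
    "rural_young":  (0.20, 0.38),
    "rural_older":  (0.25, 0.66),
}

overall = sum(share * rate for share, rate in groups.values())
imbalance = sum(share * (rate - overall) ** 2 for share, rate in groups.values())

print(f"overall response rate: {overall:.3f}")
print(f"response imbalance:    {imbalance:.4f}")
for g, (share, rate) in groups.items():
    flag = "candidate for extra effort" if rate < overall else ""
    print(f"  {g}: rate {rate:.2f} {flag}")
```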

  14. Comments on Brick, Tourangeau • Reanalysis of Groves and Peytcheva (2008) • Response Propensity Model Typology is viable • Responsive designs are useful for reducing nonresponse bias (NRB) by employing unequal efforts (e.g., modes, incentives, LOE for subgroups) • (Suggestion) Conduct the next round of meta-analysis with NRB studies conducted since 2008, including studies employing administrative records (Chun, 2009; Chun and Scheuren, 2011)

  15. Groves and Couper (1998) • “The decision to participate may be affected by characteristics of the survey design and topic as well as by characteristics of the population sampled…We shouldn’t expect there to be a simple relationship between nonresponse error and nonresponse rate.” (p. 319)

  16. Questions to Brick, Tourangeau • How would you operationalize costs and monitor cost-quality tradeoffs in justifying responsive designs that reduce NRB? • Under what conditions would you expend more effort to reduce the imbalance of the sample during data collection, or instead rely on postsurvey adjustment such as weighting?

  17. Adaptive design tailored to panel surveys • Shlomo, N., Plewis, I. Using Response Propensity Models to Improve the Quality of Response Data in Longitudinal Studies • Lynn, P., Kaminska, O. The Implications of Alternative Allocation Criteria in Adaptive Design for Panel Surveys • Durrant, G., Maslovskaya, O., Smith, P. Using Prior Wave Information and Paradata: Can They Help to Predict Response Outcomes and Call Sequence Length in a Longitudinal Study?

  18. Comments on Durrant, Maslovskaya, Smith • Prior wave information and paradata are useful in predictive analytics • Outcomes of the most recent calls are the most significant predictors of response outcomes in a future wave (recency effect) • (Suggestion) Contact and participation propensity models need to reflect these separate processes in their model specification (Groves and Couper, 1998; Chun, 2009); see the sketch below
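The sketch below illustrates the suggested two-stage specification, fitting contact and participation (given contact) as separate models on hypothetical prior-wave paradata. The data are simulated and scikit-learn's LogisticRegression is only a stand-in for whatever model the authors would actually fit.

```python
# Sketch of the two-stage propensity idea: model contact, then participation
# given contact, using prior-wave paradata as predictors. All data are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

# Hypothetical prior-wave paradata: number of calls last wave, whether the last
# prior-wave call reached anyone, and an indicator for weekday-evening calling.
X = np.column_stack([
    rng.integers(1, 10, n),   # calls in prior wave
    rng.integers(0, 2, n),    # last prior-wave call made contact
    rng.integers(0, 2, n),    # usually called on weekday evenings
])

# Hypothetical current-wave outcomes.
contacted = rng.integers(0, 2, n)
responded = contacted * rng.integers(0, 2, n)   # a case can only respond if contacted

contact_model = LogisticRegression().fit(X, contacted)
# Participation is modelled only among contacted cases, keeping the processes separate.
mask = contacted == 1
participation_model = LogisticRegression().fit(X[mask], responded[mask])

# Overall response propensity = P(contact) * P(respond | contact).
p_response = contact_model.predict_proba(X)[:, 1] * participation_model.predict_proba(X)[:, 1]
print("predicted response propensity for first 5 cases:", np.round(p_response[:5], 3))
```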

  19. Questions to Durrant, Maslovskaya, Smith • How would you dissect and integrate contact propensity and response propensity to make your models more rigorous and predictive of outcomes of interest in light of a nonresponse lifecycle? • What auxiliary data, including administrative records, would you consider identifying and adding to your models to improve predictive power?

  20. Responsive and adaptive design for establishment surveys • Thompson, K.J., Kaputa, S. Investigating Adaptive Nonresponse Follow-up Strategies for Small Businesses through Embedded Experiments • McCarthy, J., Wagner, J., Sanders, H. The Impact of Targeted Data Collection on Nonresponse Bias in an Establishment Survey: A Simulation Study of Adaptive Survey Design

  21. RAD Prospects • Build the toolkit of evidence-based designs • Learn more about deploying features and allocating costs differentially across population subgroups • Have survey managers work more closely with statisticians, survey methodologists, and cost experts • Develop rules for phase switching and for stopping data collection (e.g., adaptive designs in clinical trials) • Design and test cost-quality tradeoff models (Groves, 1989)

  22. Conclusions • RAD is evolving today • Further innovation and cross-fertilization are required • We hope the JOS articles will be a catalyst for further innovation in RAD

  23. Thank you for listening! In Memory of Professor Stephen E. Fienberg 1942-2016 Asaph Young Chun US CENSUS BUREAU Asaph.young.chun@census.gov
