
An Email Contact Protocol Experiment in a Large-Scale Survey of U.S. Government Employees


  1. An Email Contact Protocol Experiment in a Large-Scale Survey of U.S. Government Employees. DC-AAPOR Summer Conference Preview/Review, Bureau of Labor Statistics Conference Center, Washington, DC. August 3, 2015. Taylor Lewis and Karl Hess. The opinions, findings, and conclusions expressed in this presentation are those of the authors and do not necessarily reflect those of the U.S. Office of Personnel Management.

  2. Acknowledgments
     1. Jim Caplan at the Department of Defense (DoD), for helping us procure the necessary approvals to conduct this experiment on DoD employees sampled as part of the 2015 Federal Employee Viewpoint Survey (FEVS)
     2. Bill Dristy at the U.S. Office of Personnel Management, for serving as an IT liaison to the FEVS team and sending out the survey invitations and reminders at our prescribed times

  3. Outline
     I. Introduction
        A. Background on the FEVS
        B. Brief Literature Review
     II. Experimental Methods and Data
        A. Traditional Email Contact Protocol (Control)
        B. Rotating Cohort Design
        C. Dynamic Adaptive Design
     III. Results
     IV. Discussion and Ideas for Further Research

  4. I. Introduction

  5. Background on the FEVS
     • The FEVS is an annual organizational climate survey administered by the U.S. Office of Personnel Management (OPM) to a sample of 800,000+ federal employees from 80+ agencies (biennial until 2010)
     • Web-based instrument composed mainly of attitudinal items (e.g., perceptions of leadership, job satisfaction), sent via a personalized link embedded in an email message
     • Agencies launch in one of two cohorts staggered one week apart, for a six-week field period
     • Nonrespondents are sent weekly reminder emails

  6. The Federal Employee Viewpoint Survey
     [Chart: FEVS Sample & Respondent Counts, 2004-2014]
     [Chart: FEVS Response Rates, 2004-2014]

  7. More on Declining Response Rates
     • Many other surveys are facing similar response rate declines (de Leeuw and de Heer, 2002; Petroni et al., 2004; Curtin et al., 2005; Brick and Williams, 2013)
     • The trend continues despite recent enhancements believed to increase response levels:
        – More inclusive scope – increased sample size over time (all but 16 agencies conduct a census)
        – Aggressive communications campaigns – agencies are provided fillable posters and template email messages to be sent from senior leaders; several agencies disseminate YouTube videos
        – Real-time response rate website – a controlled-access website for agency points-of-contact to track their response rate status and compare it to the governmentwide rate and that of prior FEVS administrations
     • Untapped area of research: evaluating alternative email contact protocols

  8. Literature on Optimizing Contact Times
     • Abundance of research in interviewer-administered surveys:
        – Telephone surveys: Weeks et al., 1987; Greenberg and Stokes, 1990; Brick et al., 1996
        – Face-to-face surveys: Purdon et al., 1999; Wagner, 2013
        – Longitudinal surveys: Lipps, 2012; Durrant et al., 2013
     • Important to acknowledge that optimizing contact rates does not guarantee an increase in response rates (Kreuter and Müller, 2014)
     • At present, we have no direct way to assess when contact was made after sending an FEVS email invitation (future research could look into this)

  9. Literature on Email Timing in Self-Administered Web Surveys
     • Research is scant, but results are mixed:
        1. Faught et al. (2004) – establishment survey; randomly assigned email addresses to one of 14 a.m./p.m. timeblocks defined for the seven days of the week → found that emailing Wednesday morning produced the highest response rate
        2. Sauermann and Roach (2013) – 25 experimental groupings in a sample of approx. 25,000 science and engineering professionals; examined combinations of time of day, day of week, and lag time between contacts → found no significant effects
     • Seems unlikely a single email protocol "treatment" would work best on all survey populations

  10. II. Experimental Methods and Data

  11. Traditional Email Contact Protocol (Control)
     • Initial email invitation to participate sent on Tuesday morning of the first week of the field period
     • Five weekly reminders thereafter, also on Tuesday morning
     • Final reminder sent on Friday of the sixth week (even though the survey stays open through COB the following Monday)
     Date                      Event
     Tuesday of Week 1         Initial Invitation
     Tuesday of Week 2         Reminder 1
     Tuesday of Week 3         Reminder 2
     Tuesday of Week 4         Reminder 3
     Tuesday of Week 5         Reminder 4
     Tuesday of Week 6         Reminder 5
     Friday of Week 6          Final Reminder with wording "Survey Closes Today"
     Following Monday, COB     Survey Links Deactivated
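To make the fixed schedule concrete, here is a minimal sketch (not from the presentation) that derives the control-arm send dates from the Tuesday that opens the field period; the function name, helper structure, and example launch date are illustrative assumptions.

```python
from datetime import date, timedelta

def control_schedule(week1_tuesday: date) -> list[tuple[date, str]]:
    """Fixed Tuesday-morning contact schedule used by the control arm.

    week1_tuesday: the Tuesday of the first week of the field period.
    (Hypothetical helper for illustration; not from the presentation.)
    """
    events = [(week1_tuesday, "Initial Invitation")]
    # Five weekly reminders, each on the Tuesday morning of weeks 2-6.
    for week in range(1, 6):
        events.append((week1_tuesday + timedelta(weeks=week), f"Reminder {week}"))
    # Final reminder on Friday of week 6 (three days after the Week 6 Tuesday).
    events.append((week1_tuesday + timedelta(weeks=5, days=3),
                   'Final Reminder ("Survey Closes Today")'))
    # Survey links deactivated at COB on the following Monday.
    events.append((week1_tuesday + timedelta(weeks=5, days=6),
                   "Survey Links Deactivated (COB)"))
    return events

# Example with an arbitrary Tuesday launch date (illustrative, not the actual FEVS date).
for when, what in control_schedule(date(2015, 4, 28)):
    print(when.strftime("%a %Y-%m-%d"), "-", what)
```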

  12. Alternative #1: Rotating Cohort Protocol
     • A static adaptive design, following the terminology of Bethlehem et al. (2011)
     • Randomly assign employees to one of six timeblocks for the initial email invitation, and then cycle through the five remaining timeblocks when sending reminders (a minimal sketch of this rotation follows below)
              Tuesday   Wednesday   Thursday
     A.M.     A         C           E
     P.M.     B         D           F
     • All nonrespondents still receive the final reminder on Friday of the sixth week
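The slides do not spell out an algorithm for the rotation, so the following is a minimal sketch of one plausible implementation: each employee starts in a random timeblock and each subsequent contact steps to the next block in a fixed cycle, so all six blocks are used once before any repeats. The block labels (mapped A-F from the grid above), the cycling order, the seed, and the function name are assumptions for illustration.

```python
import random

# Hypothetical labels for the six timeblocks: A=Tue AM, B=Tue PM, C=Wed AM,
# D=Wed PM, E=Thu AM, F=Thu PM, following the grid on the slide.
TIMEBLOCKS = ["Tue AM", "Tue PM", "Wed AM", "Wed PM", "Thu AM", "Thu PM"]

def rotating_assignments(employee_ids, n_contacts=6, seed=2015):
    """Rotating cohort sketch: random starting timeblock per employee, then
    each weekly reminder advances to the next block in a fixed cycle."""
    rng = random.Random(seed)
    schedule = {}
    for emp in employee_ids:
        start = rng.randrange(len(TIMEBLOCKS))
        schedule[emp] = [TIMEBLOCKS[(start + k) % len(TIMEBLOCKS)]
                         for k in range(n_contacts)]
    return schedule

# Example: one employee's invitation plus five reminder timeblocks.
print(rotating_assignments(["emp001"])["emp001"])
```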

  13. Alternative #2: Dynamic Adaptive Protocol
     • So named per the terminology of Bethlehem et al. (2011)
     • As with the first alternative, employees are randomly assigned to one of six timeblocks for the initial email invitation
     • At each week's end, a multinomial logistic regression model is fitted using sampling frame covariates (e.g., gender, supervisory status, subagency, minority status) → a vector of six timeblock-specific response propensities is generated for each nonrespondent
     • The subsequent week's timeblock is assigned stochastically in proportion to these probabilities (see the sketch below)
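The slides do not give the model's exact specification, so the sketch below is one plausible reading rather than the authors' implementation: fit a multinomial logistic regression of respondents' timeblocks on frame covariates, score each nonrespondent to obtain six propensities, and draw next week's block in proportion to them. The function and column names, the use of scikit-learn, and the assumption that covariates are already numerically encoded are all illustrative.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def assign_next_timeblocks(respondents: pd.DataFrame,
                           nonrespondents: pd.DataFrame,
                           covariates: list[str],
                           seed: int = 0) -> np.ndarray:
    """End-of-week step of the dynamic adaptive protocol (illustrative sketch).

    Assumes `respondents` carries a 'timeblock' column (0-5) recording the
    block of the email that preceded each response, and that the columns in
    `covariates` are already numerically encoded (e.g., dummy variables for
    gender, supervisory status, subagency, minority status).
    """
    rng = np.random.default_rng(seed)

    # Multinomial logistic regression of timeblock on frame covariates.
    model = LogisticRegression(multi_class="multinomial", max_iter=1000)
    model.fit(respondents[covariates], respondents["timeblock"])

    # Vector of six timeblock-specific propensities for each nonrespondent.
    propensities = model.predict_proba(nonrespondents[covariates])

    # Stochastic assignment: draw next week's block in proportion to them.
    return np.array([rng.choice(model.classes_, p=p) for p in propensities])
```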

  14. Experimental Data Set
     • FEVS 2015 sample (n = 34,799) of the Department of Defense 4th Estate (DoD excluding Army, Navy, and Air Force)
     • Sample divided randomly into three groups of approximately equal size, each receiving one of the three email contact protocols (see the sketch below)
     • Bulk emailer used to send out the invitations simultaneously within a given timeblock
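A minimal sketch of the random three-way split; the arm labels, seed, and function name are assumptions, and the presentation does not describe the actual assignment procedure.

```python
import random

def three_way_split(employee_ids, seed=2015):
    """Randomly divide the frame into three protocol arms of roughly equal size."""
    ids = list(employee_ids)
    random.Random(seed).shuffle(ids)
    arms = ("traditional", "rotating_cohort", "dynamic_adaptive")
    return {arm: ids[i::3] for i, arm in enumerate(arms)}

# Example with a toy frame of 10 IDs.
groups = three_way_split(range(10))
print({arm: len(members) for arm, members in groups.items()})
```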

  15. III. Results

  16. Response Rates by Experimental Group
     • The traditional contact protocol consistently outperformed the two alternatives over the field period
     • The two alternatives performed nearly identically

  17. Response Rates by Select Demographics
     [Charts: response rates by Gender (Female, Male); Minority Status (Non-Minority, Minority); Supervisory Status (Non-Supervisor, Supervisor/Manager, Executive); and Pay Level (< $50,000; $50,001 - $100,000; $100,001 - $150,000; More than $150,000)]

  18. Responses by Day of Work Week
     • The bulk of responses arrive on the same day as the email invitation or reminder, particularly if it is sent in the morning
     • More spill-over to the next day when emails go out in the afternoon
     • Relatively little spill-over into the subsequent week

  19. Visualizing Impact of Reminders
     • Histogram illustrates hourly response counts on days emails go out
     • Intraday heaping indicates upticks in reaction to the emails

  20. IV. Discussion and Ideas for Further Research

  21. Discussion
     • The constant Tuesday morning invitation/reminder schedule yielded the highest response rate, although the edge was slight
     • Possible explanations for the Tuesday morning effect:
        – Extra time/days to handle the request during the work week?
        – Extra time/days of the survey open period?
        – Respondents react quicker (e.g., more likely to respond the same day) when the invitation/reminder arrives earlier in the week?
        – Respondents more likely to be in the office and/or checking email that particular day?
     • Unfortunately, our experimental design leaves us unable to contrast the effect relative to a fixed Wednesday/Thursday protocol
