An Email Contact Protocol Experiment in a Large-Scale Survey of U.S. Government Employees


SLIDE 1

An Email Contact Protocol Experiment in a Large-Scale Survey of U.S. Government Employees

DC-AAPOR Summer Conference Preview/Review, Bureau of Labor Statistics Conference Center, Washington, DC, August 3, 2015

Taylor Lewis1 and Karl Hess1

1 The opinions, findings, and conclusions expressed in this presentation are those of the authors and do not necessarily reflect those of the U.S. Office of Personnel Management.

SLIDE 2

Acknowledgments

1. Jim Caplan at the Department of Defense (DoD), for helping us procure the necessary approvals to conduct this experiment on DoD employees sampled as part of the 2015 Federal Employee Viewpoint Survey (FEVS)
2. Bill Dristy at the U.S. Office of Personnel Management, for serving as an IT liaison to the FEVS team and sending out the survey invitations and reminders at our prescribed times

SLIDE 3

Outline

I. Introduction

A. Background on the FEVS
B. Brief Literature Review

II. Experimental Methods and Data

A. Traditional Email Contact Protocol (Control)
B. Rotating Cohort Design
C. Dynamic Adaptive Design

  • III. Results
  • IV. Discussion and Ideas for Further Research

SLIDE 4
  • I. Introduction

SLIDE 5

Background on the FEVS

  • The FEVS is an annual organizational climate survey administered by the U.S. Office of Personnel Management (OPM) to a sample of 800,000+ federal employees from 80+ agencies (biennial until 2010)

  • Web-based instrument composed mainly of attitudinal items (e.g., perceptions of leadership, job satisfaction), sent via a personalized link embedded in an email message

  • Agencies launch in one of two cohorts staggered one week apart for a six-week field period

  • Nonrespondents are sent weekly reminder emails

SLIDE 6

The Federal Employee Viewpoint Survey

[Figure: FEVS Sample & Respondent Counts, 2004–2014]

[Figure: FEVS Response Rates, 2004–2014]

SLIDE 7

More on Declining Response Rates

  • Many other surveys facing similar response rate declines (de Leeuw and de Heer, 2002; Petroni et al., 2004; Curtin et al., 2005; Brick and Williams, 2013)

  • Trend continues despite recent enhancements believed to increase response levels:

– More inclusive scope – increased sample size over time (all but 16 agencies conduct a census)
– Aggressive communications campaigns – agencies are provided fillable posters and template email messages to be sent from senior leaders; several agencies disseminate YouTube videos
– Real-time response rate website – a controlled-access website for agency points-of-contact to track their response rate status and compare it to the governmentwide rate and that of prior FEVS administrations

  • Untapped area of research: evaluating alternative email contact protocols

SLIDE 8

Literature on Optimizing Contact Times

  • Abundance of research in interviewer-administered surveys:

– Telephone Surveys: Weeks et al., 1987; Greenberg and Stokes, 1990; Brick et al., 1996
– Face-to-Face Surveys: Purdon et al., 1999; Wagner, 2013
– Longitudinal Surveys: Lipps, 2012; Durrant et al., 2013

  • Important to acknowledge that optimizing contact rates does not guarantee an increase in response rates (Kreuter and Müller, 2015)

  • At present, we have no direct way to assess when contact was made after sending an FEVS email invitation (future research could look into this)

SLIDE 9

Literature on Email Timing in Self-Administered Web Surveys

  • Research is scant, but results are mixed:

1. Faught et al. (2004) – establishment survey; randomly assigned email addresses to one of 14 a.m./p.m. timeblocks defined for the seven days of the week → found emailing Wednesday morning produced the highest response rate
2. Sauermann and Roach (2013) – 25 experimental groupings in a sample of approx. 25,000 science and engineering professionals; examined combinations of time of day, day of week, and lag time between contacts → found no significant effects

  • Seems unlikely a single email protocol “treatment” would work best on all survey populations

SLIDE 10

  • II. Experimental Methods and Data

SLIDE 11

Traditional Email Contact Protocol (Control)

  • Initial email invitation to participate sent on Tuesday morning of first week of field period

  • Five weekly reminders thereafter, also on Tuesday morning

  • Final reminder sent on Friday of sixth week (even though survey stays open through COB the following Monday)

Date | Event
Tuesday of Week 1 | Initial Invitation
Tuesday of Week 2 | Reminder 1
Tuesday of Week 3 | Reminder 2
Tuesday of Week 4 | Reminder 3
Tuesday of Week 5 | Reminder 4
Tuesday of Week 6 | Reminder 5
Friday of Week 6 | Final Reminder with wording “Survey Closes Today”
Following Monday, COB | Survey Links Deactivated
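To make the fixed cadence concrete, here is a minimal Python sketch that generates the control schedule from the launch date; the function name and the use of the standard datetime module are illustrative, not part of the actual FEVS production tooling.

```python
from datetime import date, timedelta

def traditional_schedule(first_tuesday):
    """Generate the control-protocol contact schedule from the
    Tuesday of Week 1 (date of the initial invitation)."""
    events = [(first_tuesday, "Initial Invitation")]
    for week in range(1, 6):  # Tuesdays of Weeks 2-6
        events.append((first_tuesday + timedelta(weeks=week), f"Reminder {week}"))
    # Friday of Week 6 is three days after that week's Tuesday
    events.append((first_tuesday + timedelta(weeks=5, days=3),
                   'Final Reminder ("Survey Closes Today")'))
    # Survey links deactivated COB the following Monday
    events.append((first_tuesday + timedelta(weeks=5, days=6),
                   "Survey Links Deactivated (COB)"))
    return events

# Example (hypothetical launch date): traditional_schedule(date(2015, 4, 28))
# returns all eight contact events in order.
```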

SLIDE 12

Alternative #1: Rotating Cohort Protocol

  • A static adaptive design, following terminology of Bethlehem et al. (2011)

  • Randomly assign employees to one of six timeblocks for the initial email invitation, and then cycle through the five remaining timeblocks when sending reminders (sketched in code below)

     | Tuesday | Wednesday | Thursday
A.M. | A | C | E
P.M. | B | D | F

  • All nonrespondents still receive the final reminder on Friday of the sixth week
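A minimal Python sketch of the rotating assignment. The slide gives the six timeblocks but not the exact rotation order, so stepping through the blocks in a fixed A-to-F cycle (and the seed) is an assumption for illustration.

```python
import random

# Timeblocks from the table above: A = Tue a.m., B = Tue p.m., C = Wed a.m.,
# D = Wed p.m., E = Thu a.m., F = Thu p.m.
TIMEBLOCKS = ["Tue a.m.", "Tue p.m.", "Wed a.m.", "Wed p.m.", "Thu a.m.", "Thu p.m."]

def rotating_cohort_schedule(employee_ids, seed=2015):
    """Randomly assign each employee a starting timeblock for the initial
    invitation, then cycle through the five remaining blocks for the
    weekly reminders."""
    rng = random.Random(seed)
    schedule = {}
    for emp in employee_ids:
        start = rng.randrange(len(TIMEBLOCKS))  # random initial timeblock
        schedule[emp] = [TIMEBLOCKS[(start + k) % len(TIMEBLOCKS)]
                         for k in range(len(TIMEBLOCKS))]
    return schedule
```

Each employee's list covers the initial invitation plus the five weekly reminders; per the bullet above, the Friday final reminder in Week 6 goes to all remaining nonrespondents regardless of timeblock.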

SLIDE 13

Alternative #2: Dynamic Adaptive Protocol

  • So named per terminology of Bethlehem et al. (2011)
  • As with the first alternative, employees are randomly assigned to one of six timeblocks for the initial email invitation

  • At each week’s end, a multinomial logistic regression model is fitted using sampling frame covariates (e.g., gender, supervisory status, subagency, minority status) → a vector of six timeblock-specific response propensities is generated for each nonrespondent

  • Subsequent week’s timeblock assigned stochastically in proportion to these probabilities (see the sketch below)
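The slides do not spell out the estimation details, so the following Python sketch is one plausible implementation of the weekly step, assuming scikit-learn and a training set in which each respondent to date is labeled with the timeblock (0-5) in which they responded; the function and its inputs are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def next_week_timeblocks(X_resp, resp_block, X_nonresp, seed=2015):
    """Fit a multinomial logit of response timeblock on frame covariates,
    predict six timeblock-specific response propensities for each
    nonrespondent, and draw next week's timeblock stochastically in
    proportion to those propensities.

    Assumes all six timeblocks are observed among respondents so far.
    """
    rng = np.random.default_rng(seed)
    model = LogisticRegression(max_iter=1000)  # multinomial for >2 classes
    model.fit(X_resp, resp_block)              # covariates -> observed timeblock
    propensities = model.predict_proba(X_nonresp)  # shape (n, 6); rows sum to 1
    # Stochastic assignment: one weighted draw per nonrespondent
    return np.array([rng.choice(len(p), p=p) for p in propensities])
```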

SLIDE 14

Experimental Data Set

  • FEVS 2015 sample (n = 34,799) of Department of Defense 4th Estate (DoD excluding Army, Navy, and Air Force)
  • Divided sample randomly into three groups of approximately equal size, each receiving one of the three email contact protocols (see the sketch below)

  • Bulk emailer used to send out invitations simultaneously within a given timeblock
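A minimal sketch of the three-way random split, assuming the frame is an array of employee IDs; the group labels and seed are illustrative.

```python
import numpy as np

def split_into_protocols(employee_ids, seed=2015):
    """Randomly partition the sample into three groups of approximately
    equal size, one per email contact protocol."""
    rng = np.random.default_rng(seed)
    shuffled = rng.permutation(np.asarray(employee_ids))
    control, rotating, dynamic = np.array_split(shuffled, 3)
    return {"control": control,
            "rotating_cohort": rotating,
            "dynamic_adaptive": dynamic}
```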

SLIDE 15
  • III. Results

SLIDE 16

Response Rates by Experimental Group

  • Traditional contact protocol consistently outperformed the two alternatives over the field period
  • The two alternatives performed nearly identically

SLIDE 17

Response Rates by Select Demographics

[Figure: four panels of response rates by Gender (Female, Male), Minority Status (Non-Minority, Minority), Supervisory Status (Non-Supervisor, Supervisor/Manager, Executive), and Pay Level (< $50,000; $50,001 – $100,000; $100,001 – $150,000; More than $150,000)]

SLIDE 18

Responses by Day of Work Week

  • Bulk of responses arrive the same day as the email invitation or reminder, particularly if sent in the morning

  • More spill-over to the next day when emails go out in the afternoon

  • Relatively little spill-over into the subsequent week

SLIDE 19

Visualizing Impact of Reminders

  • Histogram illustrates hourly response counts on days emails go out

  • Intraday heaping indicates upticks in reaction to emails

SLIDE 20

  • IV. Discussion and Ideas for Further Research

SLIDE 21

Discussion

  • Constant Tuesday morning invitation/reminder schedule yielded the highest response rate, although the edge was slight

  • Possible explanations for the Tuesday morning effect:

– Extra time/days to handle the request during the work week?
– Extra time/days of the survey open period?
– Respondents react quicker (e.g., more likely to respond the same day) when the invitation/reminder arrives earlier in the week?
– Respondents more likely to be in the office and/or checking emails that particular day?

  • Unfortunately, our experimental design leaves us unable to contrast the effect relative to a fixed Wednesday/Thursday protocol

SLIDE 22

Ideas for Future Research

  • Investigate additional protocols, such as targeting an individual’s response time from a prior FEVS, where available
  • Verify results on other subsets of the FEVS survey population, not just the DoD 4th Estate

  • Explore methods to assess contact of emails (e.g., via email read receipts)

  • Consider personalization of emails to improve response rates, which has proven to be effective in Web-based self-administered surveys (Heerwegh, 2005) just as in paper surveys (Dillman et al., 2007)

SLIDE 23

References

Bethlehem, J., Cobben, F., and Schouten, B. (2011). Handbook of Nonresponse in Household Surveys. Hoboken, NJ: Wiley.

Brick, M., Allen, B., Cunningham, P., and Maklan, D. (1996). “Outcomes of a Calling Protocol in a Telephone Survey,” Proceedings of the Survey Research Methods Section. Alexandria, VA: American Statistical Association.

Brick, M., and Williams, D. (2013). “Explaining Rising Nonresponse Rates in Cross-Sectional Surveys,” The Annals of the American Academy of Political and Social Science, 645, pp. 36–59.

Curtin, R., Presser, S., and Singer, E. (2005). “Changes in Telephone Survey Nonresponse Over the Past Quarter Century,” Public Opinion Quarterly, 69, pp. 87–98.

Dillman, D., Lesser, V., Mason, R., Carlson, J., Willits, F., Robertson, R., and Burke, B. (2007). “Personalization of Mail Surveys for General Public and Populations with a Group Identity: Results from Nine Studies,” Rural Sociology, 72, pp. 632–646.

Durrant, G., D’Arrigo, J., and Müller, G. (2013). “Modeling Call Record Data: Examples from Cross-Sectional and Longitudinal Surveys,” in Improving Surveys with Paradata: Analytic Uses of Process Information, ed. Kreuter, F.

Faught, K., Whitten, D., and Green, K. (2004). “Doing Survey Research on the Internet: Yes, Timing Does Matter,” Journal of Computer Information Systems, 45, pp. 26–34.

Greenberg, B., and Stokes, S. (1990). “Developing an Optimal Call Scheduling Strategy for a Telephone Survey,” Journal of Official Statistics, 6, pp. 421–435.

Heerwegh, D. (2005). “Effects of Personal Salutations in E-Mail Invitations to Participate in a Web Survey,” Public Opinion Quarterly, 69, pp. 588–598.

Kreuter, F., and Müller, G. (2015). “A Note on Improving Process Efficiency in Panel Surveys with Paradata,” Field Methods, 27, pp. 55–65.

SLIDE 24

References (cont’d)

de Leeuw, E., and de Heer, W. (2002). “Trends in Household Survey Nonresponse: A Longitudinal and International Comparison,” in Survey Nonresponse, eds. Groves, R., Dillman, D., Eltinge, J., and Little, R. New York: Wiley.

Lipps, O. (2012). “A Note on Improving Contact Times in Panel Surveys,” Field Methods, 24, pp. 95–111.

Petroni, R., Sigman, R., Willimack, D., Cohen, S., and Tucker, C. (2004). “Response Rates and Nonresponse in Establishment Surveys – BLS and Census Bureau,” Paper presented to the Federal Economic Statistics Advisory Committee. Available online at: https://www.bea.gov/about/pdf/ResponseratesnonresponseinestablishmentsurveysFESAC121404.pdf.

Purdon, S., Campanelli, P., and Sturgis, P. (1999). “Interviewers’ Calling Strategies on Face-to-Face Interview Surveys,” Journal of Official Statistics, 15, pp. 199–216.

Sauermann, H., and Roach, M. (2013). “Increasing Web Survey Response Rates in Innovation Research: An Experimental Study of Static and Dynamic Contact Design Features,” Research Policy, 42, pp. 273–286.

Wagner, J. (2013). “Using Paradata-Driven Models to Improve Contact Rates in Telephone and Face-to-Face Surveys,” in Improving Surveys with Paradata: Analytic Uses of Process Information, ed. Kreuter, F.

Weeks, M., Kulka, R., and Pierson, S. (1987). “Optimal Call Scheduling for a Telephone Survey,” Public Opinion Quarterly, 51, pp. 540–549.

SLIDE 25

Thanks!

Questions/Comments?

Taylor.Lewis@opm.gov
Karl.Hess@opm.gov
