

SLIDE 1

When More Gets You Less: A Meta-Analysis of the Effect of Concurrent Web Options on Mail Survey Response Rates

Jenna Fulton and Rebecca Medway
Joint Program in Survey Methodology, University of Maryland
May 19, 2012

SLIDE 3

Background: Mixed-Mode Surveys

  • Growing use of mixed-mode surveys among practitioners
  • Potential benefits for cost, coverage, and response rate
  • One specific mixed-mode design – mail + Web – is often used in an attempt to increase response rates
  • Advantages: both are self-administered modes, likely have similar measurement error properties
  • Two strategies for administration:
  • “Sequential” mixed-mode
  • One mode in initial contacts, switch to the other in later contacts
  • Benefits response rates relative to a mail survey
  • “Concurrent” mixed-mode
  • Both modes simultaneously in all contacts
  • Mixed effects on response rates relative to a mail survey

SLIDE 4

Methods: Meta-Analysis

  • Given mixed results in the literature, we conducted a meta-analysis to:
  • Estimate the effect of concurrent Web options on mail survey response rates
  • Evaluate whether study features influence the size of the effect
  • Search for studies
  • Searched journals and conference abstracts
  • Posted messages to AAPOR and ASA listservers
  • Eligible studies
  • Randomly assigned respondents to either
  • a “mail-only” condition, or
  • a “mode choice” condition (offered mail and Web options concurrently)
  • Both conditions: included the same survey items and the same incentive (if offered), made all contacts by mail, and did not encourage response by a particular mode

SLIDE 5

Methods: Effect size

  • Odds ratios (ORs) to quantify the relationship between response rates in the mail-only and mode choice conditions
  • Used the response rate for the mail-only condition as the reference
  • To calculate the overall OR: weighted study-level ORs by the inverse of their variance
  • Interpretation of ORs
  • OR < 1: adding a Web option to a mail survey has a negative impact on response rates
  • OR = 1: adding a Web option has no effect
  • OR > 1: adding a Web option has a positive impact

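The effect-size computation described above can be sketched as follows: compute each study's OR with the mail-only condition as the reference, then pool on the log scale with inverse-variance weights. The respondent counts here are hypothetical, for illustration only, not data from any of the eligible studies.

```python
import math

# Hypothetical respondent/nonrespondent counts for two illustrative comparisons:
# (choice_resp, choice_nonresp, mail_resp, mail_nonresp)
studies = [
    (480, 520, 550, 450),
    (300, 700, 330, 670),
]

log_ors, weights = [], []
for a, b, c, d in studies:
    log_or = math.log((a / b) / (c / d))  # mail-only condition is the reference
    var = 1 / a + 1 / b + 1 / c + 1 / d   # variance of log(OR)
    log_ors.append(log_or)
    weights.append(1 / var)               # inverse-variance weight

# Pool on the log scale, then exponentiate back to an OR
pooled = sum(w * y for w, y in zip(weights, log_ors)) / sum(weights)
pooled_or = math.exp(pooled)
print(f"overall OR = {pooled_or:.2f}")  # OR < 1: the Web option lowers the odds of response
```

Pooling on the log scale keeps the estimate symmetric (an OR of 2 and an OR of 0.5 are equally far from 1), which is why the exponentiation comes last.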
SLIDE 6

Methods: Moderator Analyses

  • Used moderator analyses to determine whether study characteristics affected the magnitude of the effect size
  • Greater % of respondents selecting Web
  • Young people as target population
  • Published study
  • Government sponsorship
  • Required participation
  • Incentive
  • Salient topic
  • Comprehensive Meta-Analysis software, random effects model

These characteristics would increase motivation to complete the survey, reducing the difference between conditions. Adding a Web option would be more effective for studies with these characteristics.
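The random effects model mentioned on this slide can be sketched with the DerSimonian-Laird estimator, which adds a between-study variance component to each study's weight. This is an illustration with made-up inputs, not a reproduction of the Comprehensive Meta-Analysis computation.

```python
import math

# Illustrative inputs: log odds ratios and their variances for five
# hypothetical studies (not the values from this meta-analysis).
log_or = [-0.56, -0.42, -0.15, -0.07, 0.01]
var = [0.04, 0.02, 0.05, 0.01, 0.03]
k = len(log_or)

# Fixed-effect (inverse-variance) pooled estimate on the log scale
w = [1 / v for v in var]
fixed = sum(wi * yi for wi, yi in zip(w, log_or)) / sum(w)

# Cochran's Q and the DerSimonian-Laird between-study variance tau^2
q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_or))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (k - 1)) / c)

# Random-effects weights add tau^2 to each within-study variance
w_re = [1 / (v + tau2) for v in var]
pooled = sum(wi * yi for wi, yi in zip(w_re, log_or)) / sum(w_re)
pooled_or = math.exp(pooled)
```

When the studies are heterogeneous (Q exceeds k-1), tau^2 is positive and the random-effects weights are more even across studies than the fixed-effect weights, so no single large study dominates the pooled OR.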

SLIDE 7

Results: Eligible Studies

  • Search produced 19 eligible experimental comparisons
  • All studies conducted during or after 2000
  • Choice response rate lower than mail-only response rate for almost all comparisons

                                             Min       Max     Mean   Median
  Mail-only sample size                      139   212,072   17,547    1,107
  Choice sample size                         141    32,520    5,161    1,106
  Mail-only response rate                    16%       75%      51%      58%
  Choice response rate                       15%       74%      48%      53%
  Proportion utilizing Web (choice)           4%       52%      17%      10%

SLIDE 8

Results: Effect Sizes

  • ORs ranged from 0.57 to 1.13
  • 17 of 19 ORs less than 1.00
  • 8 of 19 ORs significantly less than 1.00
  • Only 1 OR significantly greater than 1.00
  • Overall weighted OR was 0.87 (p < 0.001)
  • Providing a concurrent Web option in mail surveys decreases the odds of response by 12.8% as compared to a mail-only survey.

SLIDE 9

Results: Forest Plot of Effect Sizes

[Forest plot; x-axis: odds ratio, 0.4 to 1.4. Comparisons, ordered by OR:]

Radon et al. (0.57)
Smyth et al. (0.66)
Ziegenfuss et al. (0.70)
Schmuhl et al. (0.72)
Hardigan et al. (0.72)
Griffin et al. (0.79)
Israel (0.80)
Gentry and Good (a) (0.84)
Werner and Forsman (0.84)
Millar and Dillman (a) (0.87)
Gentry and Good (b) (0.87)
Total (0.87)
Millar and Dillman (b) (0.89)
Lesser et al. (b) (0.90)
Brogger et al. (0.93)
Turner et al. (0.93)
Friese et al. (0.96)
Lesser et al. (a) (0.96)
Brady et al. (1.01)
Schneider et al. (1.13)

  • The number in parentheses is the odds ratio for each comparison.
  • The dot for each comparison represents the odds ratio value, while the bar spans the 95% confidence interval.

SLIDE 10

Results: Moderator Analyses

  • None of the moderator analyses were significant at the 0.05 level.

[Bar chart: subgroup odds ratios for Percent Choosing Web (15%+, n=7 vs. <15%, n=12), Age of Target Pop. (Youth, n=11 vs. All ages, n=8), and Published (No, n=14 vs. Yes, n=5); subgroup ORs ranged from 0.82 to 0.90.]

SLIDE 11

Results: Moderator Analyses

  • Surveys with a government sponsor see a smaller difference between mail-only and mode choice condition response rates (significant at the 0.10 level)

[Bar chart: subgroup odds ratios for Government Sponsor (Yes, n=10 vs. No, n=9), Required (Yes, n=3 vs. No, n=16), Topic Salience (High vs. Regular), and Incentive (Yes vs. No); subgroup ORs ranged from 0.83 to 0.96; the government-sponsor OR of 0.92 is marked * (p < 0.10).]

SLIDE 12

Discussion

  • Across 19 experimental comparisons, we find that offering a concurrent Web option in a mail survey results in a significant reduction in the response rate.
  • As demonstrated by the moderator analyses, the study characteristics we examined largely do not influence the magnitude of the effect.
  • Potentially due to: the small number of eligible studies; variation in design characteristics, sample sizes, and response rate calculations

SLIDE 13

Discussion

Three hypotheses to explain the negative effect of concurrent Web options:

1. Making a choice between two modes
  • Increases the complexity and burden of responding
  • Weighing the pros and cons of each mode may cause both to appear less attractive

2. Replying by Web involves a break in the response process
  • Sample members receive the survey invitation in the mail and likely open it as part of a larger task of sorting through and responding to mail
  • If they choose to complete the survey on the Web, they must transition to a different category of behavior.

3. Implementation problems with the Web instrument
  • Sample members who attempt to complete the survey online may abandon the effort due to frustration with the computerized instrument or Internet connection

SLIDE 14

Discussion

  • Our findings are only generalizable to the specific type of concurrent Web option included in this meta-analysis.
  • Concurrent Web options may also be offered in mail surveys in other ways, such as:
  • Adding email or telephone contacts to the design
  • Offering incentives that are conditional on Web response.
  • Further research will need to be conducted to determine whether these types of designs can be used to increase response rates and improve research quality.

SLIDE 15

Thank you!

  • For additional information:
  • JFulton@survey.umd.edu
  • RMedway@survey.umd.edu

SLIDE 16

Additional Slides

SLIDE 17

Discussion

  • Researchers may be interested in outcomes other than response rate – such as cost, nonresponse bias, timeliness, or data quality.
  • These outcomes were reported only occasionally in the studies included in this meta-analysis; as a result, we are not able to empirically evaluate the effect of concurrent Web options on them.
