

SLIDE 1

Pilot surveys, cognitive interviews, and focus groups – Oh my!

A mixed-methods approach toward designing a longitudinal survey

Darnell Cole, Joseph Kitchen, Adrianna Kezar & Matt Soldner

SLIDE 2
  • Introduction
  • Research Project Overview
  • Eight-Step Mixed Methods Process
  • 1. Review of psycho-social measures/Qual. data
  • 2. Baseline psychometric pilot
  • 3. Baseline questionnaire administration
  • 4. Focus group discussions
  • 5. Qualitative case study research
  • 6. Follow-up psychometric pilot
  • 7. Cognitive Interviews
  • 8. Follow-up questionnaire administration
  • Conclusions and lessons learned
  • Q & A

Session Overview

SLIDE 3

Research Project Overview

SLIDE 4

Comprehensive College Transition Program

  • Overview

    ○ Since the 1960s, the foundation has provided scholarships for low-income students
    ○ Five years of funding
    ○ Three campuses in a university system
    ○ Small learning community courses, mentoring, residential component, first-year experience course, programming from staff
    ○ Two-year program

  • Purpose for project

    ○ To examine how the comprehensive college transition program facilitates engagement, the development of academic self-efficacy, mattering/sense of belonging, persistence, and other outcomes

SLIDE 5

Methods

  • Concurrent mixed-methods design

– 2 cohorts (2015, 2016)
– Embedded sequential exploratory/explanatory
– Six-year study

  • Two important tenets of mixed-methods design:

– Methodological eclecticism
– Paradigm pluralism

SLIDE 6

Initial qualitative data → Baseline Pilot → Baseline Survey → Focus Group Interviews → Case Study Research → Follow-Up Pilot → Cognitive Interviews → Follow-Up Survey

Longitudinal Survey Development

SLIDE 7

Step One: Qualitative Data & Review of Psycho-Social Measures

SLIDE 8

Qualitative Data & Review of Psycho-Social Measures

  • Site visits with programs
  • Interviews with program staff and key administrators
  • Program document collection
  • Focus groups
  • Program faculty
  • 50 students on each campus
  • 12-20 campus stakeholders (e.g., counseling, TRIO, student affairs)
  • Literature review conducted after site visits
  • Identify measures to test in psychometric pilot
SLIDE 9

Qualitative Data & Review of Psychosocial Measures: Major Takeaways

  • Identified several additional constructs to measure in the survey questionnaire
  • Used literature review to identify other measures that may be relevant given the characteristics of the program and participants
  • Biannual process and half-day retreats to continue the literature review
  • Quant. team consistently works with qual. team to inform the survey and get feedback

SLIDE 10

Step Two: Baseline Psychometric Pilot Survey

SLIDE 11

Psychometric Pilot Survey

  • Conducted in July 2015
  • Purposes

– Test and validate more than a dozen outcome variables, including intermediate outcomes
– Evaluate operational protocols

  • Sample included 972 scholars from the 2012 and 2013 cohorts
  • 350 respondents, about a 36% response rate
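The response-rate figures quoted for the pilots and surveys in this deck can be reproduced with a one-line helper (illustrative only; the function name is ours, not the authors'):

```python
def response_rate(respondents: int, sample_size: int) -> float:
    """Response rate as a percentage, rounded to one decimal place."""
    return round(100 * respondents / sample_size, 1)

print(response_rate(350, 972))   # baseline pilot: 36.0
print(response_rate(352, 1652))  # follow-up pilot: 21.3
print(response_rate(951, 1297))  # follow-up survey: 73.3
```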
SLIDE 12

Key Baseline Constructs

  • Peer Interactions
  • Faculty Interactions
  • Time Use
  • Social self-efficacy
  • Academic self-efficacy
  • Career decision-making self-efficacy
  • Sense of belonging
  • Perceived academic control
  • Resiliency
  • Expectations about mattering
  • Financial Stress
  • Malleability of ability
  • Interpretation of difficulty
SLIDE 13

Psychometric Pilot: Major Takeaways

  • Psychometric properties of adapted measures are often in the ballpark of published results, but using Confirmatory Factor Analysis to verify can add confidence in your choices.
  • We used graded response IRT models as an additional check following CFA, particularly when we were considering scale reduction.
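The CFA and graded-response IRT checks described above need dedicated modeling software, but the spirit of a psychometric pilot check can be illustrated with a simpler internal-consistency statistic. A minimal Cronbach's alpha computation in plain NumPy (an illustration of this kind of check, not the authors' analysis):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of scale scores."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Three respondents answering two perfectly consistent items -> alpha = 1.0
responses = np.array([[1, 1], [2, 2], [3, 3]])
print(cronbach_alpha(responses))  # 1.0
```

In a pilot like the one described here, low alpha on an adapted scale would be one early signal to investigate before running the full CFA/IRT workup.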

SLIDE 14

Step Three: Baseline Survey Administration

SLIDE 15

Baseline Survey

  • Primary purpose – To obtain an early (expected) measure of key outcome variables (and intermediate outcome variables)
  • Secondary purpose – evaluate improved operational protocols
  • Baseline Survey administered in August 2015
  • Sample was 1335 program participants from the 2015 cohort (first-year students only)

SLIDE 16

Key Baseline Constructs

  • High school sense of belonging
  • High school interactions with peers
  • High school interactions with faculty
  • High school time use
  • Expectations about social self-efficacy
  • Expectations about academic self-efficacy
  • Expectations about mattering
  • Expectations about belonging
  • Expectations about likelihood of graduation
SLIDE 17

Baseline Survey Administration: Major Takeaways

  • Our baseline administration uncovered at least one phenomenon that was not detected in the pilot, given its population and the timing of administration: higher-than-anticipated baseline scores on self-efficacy measures.
  • What is the temporal nature of your phenomena? If they are time-variant, how best to measure?

SLIDE 18

Step Four: Focus Group Discussions

SLIDE 19

Focus Group Discussions

  • Exploring a set of phenomena with the intent of determining appropriateness of survey item development (i.e., exploratory)
  • Semi-structured, open-ended protocol with 7 guiding questions
  • Student experiences, perceptions, and opinions of first contact with the program, summer experience, and transition into the program

SLIDE 20

Focus Group Discussions: Major Takeaways

Key findings

  • Advising
  • Scheduling conflicts
  • Power of small class size
  • High expectations within the program
  • Considered adding items to further discriminate key constructs around high expectations
  • Considered differences between program and non-program classroom experiences

SLIDE 21

Step Five: Case Study Research

SLIDE 22

Case Study Research

  • Better understanding of program elements on each campus, student bodies, major stakeholders, and broader campus environment
  • Document the ways that the program operates to shape student experiences
  • Case study elements included program observations, interviews with staff and stakeholders, and social media/document analysis

SLIDE 23

Case Study Research: Major Takeaways

Key findings

  • Helped identify the important program elements to study
  • Helped with developing appropriate survey phrasing and wording
  • Shaped survey instrument design
  • Identified important role of staff members; added questions related to this
  • Items added to address staff-student relationships, engagement, and support
  • Added in mid-semester grade check
  • Digital diary information resulted in addition of time-use questions
  • Observations led to addition of faculty-student questions

SLIDE 24

Step Six: Follow-up Psychometric Pilot

SLIDE 25

Follow-Up Pilot Survey

  • Administered in January 2016
  • Purpose was to test and validate measurement of 10 key engagement constructs
  • Sample included 1652 scholars from the 2013 and 2014 cohorts
  • 352 respondents, about a 21% response rate
SLIDE 26

Follow-Up Pilot Survey: Major Takeaways

Key findings

  • Once again, results demonstrated the importance of pilot testing even those items you think students should have reasonable knowledge of!
  • Expanded response options to seven points
  • Time anchor in the question stem may help the performance of constructs
  • Do we continue to include mediocre-performing constructs for the sake of the longitudinal analysis?
  • Minor tweaks to items within the constructs are still needed to improve performance

SLIDE 27

Step Seven: Cognitive Interviews

SLIDE 28

Cognitive Interviews

  • Conducted to revise survey items related to student engagement with specific program elements
  • Eleven interviews were conducted in Feb. 2016
  • Two components: (a) think-aloud method; and (b) question probing
  • Contributed to survey refinement and helped ensure a quality, valid instrument

SLIDE 29

Cognitive Interviews: Major Takeaways

Key findings

  • Many survey items sufficiently strong
  • Changes to survey resulting from cognitive interviews:
    ○ Using a gate question to determine event participation prior to asking about engagement
    ○ Use of grid instead of repeated sections about each event across different types of engagement
    ○ Including a “not sure” option
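The gate-question change can be pictured as simple branching logic: engagement items are asked only about events the respondent reports participating in. A loose sketch (event names and function are hypothetical, not from the actual instrument):

```python
# Hypothetical sketch of gate-question branching: engagement items are
# only generated for events the respondent reports having attended.
EVENTS = ["mentoring", "learning community course", "residential programming"]

def engagement_items(participated: dict) -> list:
    """Build the follow-up item list, skipping events gated out upstream."""
    items = []
    for event in EVENTS:
        if participated.get(event):
            # Each item's response scale would also offer "not sure",
            # per the takeaways above.
            items.append(f"How often did you engage with {event}?")
    return items

print(engagement_items({"mentoring": True, "residential programming": False}))
# ['How often did you engage with mentoring?']
```

Gating like this keeps non-participants from guessing on engagement items, which the cognitive interviews suggested was a source of noise.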
SLIDE 30

Step Eight: Follow-up Survey Administration

SLIDE 31


Follow-up Survey Sample

  • Sample was 1297 program participants, COS, and Control from the 2015 cohort (first-year students only)
  • 951 respondents (73.3% response rate)
SLIDE 32

Key Follow-up Constructs

  • “Actual” academic self-efficacy
  • “Actual” social self-efficacy
  • Experience of mattering
  • Experience of belonging
  • Engagement with peers and faculty
  • Engagement with program features
  • Financial stress
  • Revised graduation expectations
SLIDE 33


Follow-up Survey: Major Takeaways

  • We’re still in the preliminary phases of the substantive analysis.
  • However, we did some analyses immediately that were designed to feed back into the questionnaire design phase for the next cohort.
  • We also conducted analyses to evaluate the effect of methodological experiments designed to improve response, and tweak them for the Cohort 2 Baseline.

SLIDE 34

Conclusions and Lessons Learned

SLIDE 35

Initial qualitative data → Baseline Pilot → Baseline Survey → Focus Group Interviews → Case Study Research → Follow-Up Pilot → Cognitive Interviews → Follow-Up Survey

Longitudinal Survey Development

SLIDE 36


Challenges

  • Planning of qualitative and quantitative sequence
  • Time-intensive strategy
  • Gaining buy-in and securing collaborative relationships
  • Decision-making about what to do next when established measures don’t work
  • Deciding how to edit or add new items for survey based on qual. findings
  • Bringing together (sometimes conflicting) qual. and quant. findings to make a cohesive argument
  • Striking the right methodological balance

Benefits

  • Established valid, reliable measures
  • Strengthened and contextualized measures via cognitive interviews, psychometrics
  • Moved beyond reliance on literature alone
  • Tested items with students drawn from our population of interest
  • New items added as a result of qualitative findings
  • Triangulation of data and clarification of findings from quant. or qual.
  • Qual. work provides guidance in developing survey items and identifying relevant measures

SLIDE 37

Questions and comments – We welcome your feedback and ideas!

SLIDE 38

Pullias Center for Higher Education
Rossier School of Education
University of Southern California
3470 Trousdale Parkway, Waite Phillips Hall 701
Los Angeles, CA 90089
Website: pullias.usc.edu