Capture Elusive Level 3 Data: The Secrets of Survey Design


SLIDE 1

Capture Elusive Level 3 Data: The Secrets of Survey Design

Presented by: Ken Phillips, Phillips Associates, March 22, 2018
SLIDE 2

Agenda

  1. Examine Level 3 evaluation facts
  2. Analyze survey creation errors in a sample Level 3 evaluation
  3. Discover 12 tips for creating valid, scientifically sound Level 3 evaluations

SLIDE 3

Kirkpatrick / Phillips Evaluation Model

  • Level 1: Reaction. Degree to which participants find the training favorable, engaging, and relevant to their jobs
  • Level 2: Learning. Degree to which participants acquire the intended knowledge, skills, attitude, confidence, and commitment based on their participation in the training
  • Level 3: Behavior. Degree to which participants apply what they learned during training when they are back on the job
  • Level 4: Results. Degree to which targeted outcomes improve as a result of the training and the support and accountability package
  • Level 5: ROI. Degree to which monetary program benefits exceed program costs

SLIDE 4

Level 3 Evaluation Facts

  • 60% of organizations evaluate some programs at Level 3
  • 33% of live classroom programs being evaluated
  • 18% of tech-based programs being evaluated
  • 75% of organizations view the data collected as having high or very high value

Source: ATD Research Study, “Evaluating Learning: Getting to Measurements That Matter,” 2015

SLIDE 5

Data Collection Methods

Source: Donald & James Kirkpatrick, “Evaluating Training Programs: The Four Levels,” 2006

SLIDE 6

Possible Survey Respondents

  • Learners
  • Managers of learners
  • Direct reports
  • Peers/colleagues
  • External customers

SLIDE 7

How to Decide

  • Who has first-hand knowledge of learners’ behavior?
  • How credible do the results need to be?

SLIDE 8

Sample Level 3 Participant Survey

SLIDE 9

Instructions

Note: Survey respondents are the direct reports of managers/supervisors who attended an interpersonal feedback learning program.

  1. Form a group of 3, 4, or 5 persons
  2. Review the sample Level 3 participant survey in the handout and see how many different survey creation errors you can find (Hint: 9 different errors are built into the survey)
  3. Be prepared to discuss your findings with the whole group

SLIDE 10

Scientifically Sound Survey Design

  • Content
  • Format
  • Measurement

SLIDE 11

Content

SLIDE 12

What’s Wrong With These?

  • 8. Before providing employees with feedback about their job performance, my manager considers whether or not he or she is knowledgeable about their job.
  • 25. When giving feedback to an employee, my manager considers whether it should be done privately or in the presence of others.

SLIDE 13

Tip 1: Content

Focus on observable behavior, not thoughts or motives.

Source: Palmer Morrel-Samuels, “Getting the Truth into Workplace Surveys,” Harvard Business Review, 2002

SLIDE 14

What’s Wrong With These?

  • 18. My manager provides employees with regular, ongoing feedback about their job performance and speaks in a normal, conversational tone or manner when delivering the feedback.
  • 14. My manager gives his or her employees feedback just as soon as possible after an event has happened and avoids getting emotional or evaluative.

SLIDE 15

Tip 2: Content

Limit each item to a single description of behavior.

Source: Palmer Morrel-Samuels, 2002

SLIDE 16

Example

  • My manager gives his or her employees feedback just as soon as possible after an event has happened.
  • My manager avoids getting emotional or evaluative when giving feedback to his or her employees.

SLIDE 17

What’s Wrong With These?

  • 2. My manager doesn’t get to know his or her employees as individuals before providing them with feedback about their job performance.
  • 7. When giving employees feedback about their job performance, my manager doesn’t distinguish between patterns of behavior and random, one-time events.

SLIDE 18

Tip 3: Content

Word about 1/3 of the survey items so that the desired answer is negative.

Source: Palmer Morrel-Samuels, 2002
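Reverse-worded items have to be flipped back onto the common scale before analysis, or they will drag down scores that are actually good. A minimal sketch of that rescoring step; the 7-point scale and the specific item numbers are illustrative assumptions, not taken from the survey in the handout:

```python
# Flip reverse-worded items back onto the common scale before analysis.
# On a 1..k scale, the rescored value of a reverse-worded item is (k + 1) - raw.

SCALE_MAX = 7            # assumed 7-point response scale
REVERSED_ITEMS = {2, 7}  # hypothetical reverse-worded item numbers

def rescore(item: int, raw: int) -> int:
    """Return the analysis-ready score for one response."""
    if not 1 <= raw <= SCALE_MAX:
        raise ValueError(f"response {raw} outside 1..{SCALE_MAX}")
    return (SCALE_MAX + 1) - raw if item in REVERSED_ITEMS else raw

print(rescore(2, 7))  # reverse-worded item: a raw 7 becomes 1
print(rescore(8, 7))  # normally worded item: unchanged
```

The same arithmetic works for any odd- or even-point scale; only `SCALE_MAX` changes.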

SLIDE 19

Format

SLIDE 20

What’s Wrong With These?

  • Building Trust
  • Credibility
  • Feedback Sign
  • Feedback Timing
  • Feedback Frequency
  • Message Characteristics

SLIDE 21

Tip 4: Format

Keep sections of the survey unlabeled.

Source: Palmer Morrel-Samuels, 2002

SLIDE 22

Tip 5: Format

Design sections to contain a similar number of items and questions to contain a similar number of words.

Source: Palmer Morrel-Samuels, 2002

SLIDE 23

Tip 6: Format

Place questions regarding respondent demographics (e.g., name, title, department) at the end of the survey, make completion optional, and keep such questions to a minimum.

Source: Palmer Morrel-Samuels, 2002

SLIDE 24

Measurement

SLIDE 25

Tip 7: Measurement

Collect data from multiple observers or from a single observer multiple times.

Source: Ken Phillips, “Capturing Elusive Level 3 Data: The Secrets of Survey Design,” unpublished article, 2013
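The aggregation step behind Tip 7 can be sketched in a few lines: ratings of one learner on one item, collected from several observers, are combined into a single, more reliable item score. The observer labels and ratings below are made-up illustration data, not from the sample survey:

```python
from statistics import mean

# Hypothetical ratings of one learner on one survey item,
# collected from multiple observers (Tip 7).
ratings_by_observer = {
    "direct_report_1": 6,
    "direct_report_2": 5,
    "peer_1": 4,
}

# Averaging across observers smooths out any single rater's bias,
# which is why a multi-observer score is more credible than one rating.
item_score = mean(ratings_by_observer.values())
print(item_score)  # -> 5
```

The same pattern applies to the "single observer multiple times" variant: the keys simply become survey administrations instead of observers.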
SLIDE 26

What’s Wrong With This?

Strongly Agree (4) · Agree (3) · Disagree (2) · Strongly Disagree (1) · N/A

SLIDE 27

Tip 8: Measurement

Create a response scale with numbers at regularly spaced intervals and words only at each end.

Source: Palmer Morrel-Samuels, 2002

SLIDE 28

Examples

This:

Not at all True  1  2  3  4  5  6  7  Completely True

Not This:

Not at all True · Rarely True · Occasionally True · Somewhat True · Mostly True · Frequently True · Completely True
1  2  3  4  5  6  7

Or This:

Not at all True · Rarely True · Occasionally True · Somewhat True · Mostly True · Frequently True · Completely True

SLIDE 29

Tip 9: Measurement

Use only one response scale, with an odd number of points (7-, 9-, and 11-point scales are best).

Source: Palmer Morrel-Samuels, 2002

SLIDE 30

Odd vs. Even Scale

[Visual comparison of an odd-point scale (“This”) vs. an even-point scale (“Not This”)]

SLIDE 31

Tip 10: Measurement

Use a response scale that measures frequency, not agreement or effectiveness.

Source: Palmer Morrel-Samuels, 2002

SLIDE 32

Examples

This:

Never  1  2  3  4  5  6  7  Always

Or this:

Not at all True  1  2  3  4  5  6  7  Completely True

SLIDE 33

Tip 11: Measurement

Place small numbers at the left or low end of the scale and large numbers at the right or high end of the scale.

Source: Ken Phillips, 2013
SLIDE 34

Examples

This:

Not at all True  1  2  3  4  5  6  7  Completely True

Not This:

Completely True  7  6  5  4  3  2  1  Not at all True

SLIDE 35

Tip 12: Measurement

Include a “Did Not Observe” response choice and make it visually distinct from the numeric scale.

Source: Palmer Morrel-Samuels, 2002

SLIDE 36

Example

Not at all True  1  2  3  4  5  6  7  Completely True

Did Not Observe
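Because “Did Not Observe” is not a point on the numeric scale, it must be excluded from item statistics rather than averaged in. A small illustrative sketch of why (the coding string and response values are assumptions for the example):

```python
from statistics import mean

DID_NOT_OBSERVE = "DNO"  # hypothetical coding for the extra response choice

# Hypothetical responses to one item on the 1..7 scale.
responses = [6, 7, DID_NOT_OBSERVE, 5, DID_NOT_OBSERVE]

# Right: treat "Did Not Observe" as missing data and drop it.
observed = [r for r in responses if r != DID_NOT_OBSERVE]
right = mean(observed)
print(right)  # -> 6

# Wrong: coding "Did Not Observe" as 0 deflates the item score.
wrong = sum(0 if r == DID_NOT_OBSERVE else r for r in responses) / len(responses)
print(wrong)  # -> 3.6
```

This is the practical reason the choice must be set apart from the scale: respondents and analysts alike must see that it carries no numeric value.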

SLIDE 37

Summary: Content

  • Focus on observable behavior
  • Limit each item to a single description of behavior
  • Word 1/3 of items as reverse-scored

SLIDE 38

Summary: Format

  • Keep survey sections unlabeled
  • Design sections to contain a similar number of items and questions a similar number of words
  • Place respondent demographic questions at the end of the survey, make completion optional, and keep them to a minimum

SLIDE 39

Summary: Measurement

  • Collect data from multiple observers or multiple times
  • Create a response scale that:
      • Has words only at each end
      • Has an odd number of points
      • Measures frequency
      • Has small numbers at left and large numbers at right
      • Includes a “Did Not Observe” choice that is distinct

SLIDE 40

The difference between a good survey and a bad one, quite simply, is careful and informed design.

Source: Palmer Morrel-Samuels, 2002

SLIDE 42

Free Articles

  • Phillips, Ken, “Eight Tips on Developing Valid Level 1 Evaluation Forms,” Training Today, Fall 2007, pp. 8 & 14
  • Phillips, Ken, “Developing Valid Level 2 Evaluations,” Training Today, Fall 2009, pp. 6–8
  • Phillips, Ken, “Capturing Elusive Level 3 Data: The Secrets of Survey Design,” unpublished article, 2013
  • Phillips, Ken, “Level 1 Evaluations: Do They Have a Role in Organizational Learning Strategy?,” unpublished article, 2013
  • Phillips, Ken, “Business Results Made Visible: Designing Proof Positive Level 4 Evaluations,” unpublished article, 2013

SLIDE 43

Ken Phillips
Phillips Associates
ken@phillipsassociates.com
(847) 231-6068
www.phillipsassociates.com
34137 N. Wooded Glen Drive, Grayslake, Illinois 60030