SLIDE 1

Performance Management in VR: What is it and Why do it?

Northern Arizona University (NAU), Institute for Human Development (IHD)
Evidence for Success Combined Disability Conference
Scottsdale/Fountain Hills, Arizona
July 10, 2018
Cayte Anderson, Ph.D., CRC

SLIDE 2

Session Overview

  • Evaluation principles
  • Why should we integrate evaluation into our programs?
  • Demystifying the process
  • Culturally responsive approaches to evaluation
  • Program evaluation and quality assurance within VR and AIVRS programs
  • Resources (AIVRTTAC, PEQA, AEA, conferences)
SLIDE 3

What happens when you hear the words “data” or “evaluation”?

a) Do your eyes glaze over?
b) Do you picture a room full of serious people, hunched over their keyboards and calculators, crunching numbers?
c) Embrace your inner nerd?

[Image: “data nerd”]

SLIDE 4

What if using data is really more like this?

SLIDE 5

What is Program Evaluation?

“the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future program development.”1

Program Evaluation or Informal Assessment?

  • The difference is that program evaluation is conducted according to guidelines and grounded in principles.

SLIDE 6

Program Evaluation Guiding Principles

  • Systematic Inquiry
  • Competence
  • Integrity
  • Respect for People
  • Common Good and Equity

SLIDE 7

Principle A: Systematic Inquiry2

Evaluators conduct systematic, data-based inquiries:

  • Conduct data-based inquiries that are thorough, methodical, and contextually relevant
  • Explore strengths and shortcomings of evaluation questions and approaches
  • Communicate approaches, methods, and limitations accurately

SLIDE 8

Principle B: Competence

Evaluators provide competent performance to stakeholders:

  • Possess appropriate skills and experience
  • Demonstrate cultural competence
  • Practice within limits of competence
  • Continually improve competencies
SLIDE 9

Principle C: Integrity/Honesty

Evaluators display honesty and integrity and attempt to ensure them throughout the entire evaluation process:

  • Practice honesty with clients and stakeholders
  • Disclose values, interests, and conflicts of interest
  • Represent accurately methods, data, and findings
  • Disclose source of request and financial support for evaluation

SLIDE 10

Principle D: Respect for People

Evaluators respect security, dignity, and self-worth of all stakeholders:

  • Honor the dignity, well-being, and self-worth of individuals
  • Acknowledge the influence of culture within and across groups

SLIDE 11

Principle E: Common Good and Equity

Evaluators take into account general and public interests:

  • Include relevant stakeholders
  • Balance client and stakeholder needs
  • Examine assumptions and potential side effects
  • Present results in understandable forms
SLIDE 12

Debunking Evaluation Myths

Evaluation is thought to be...    Evaluation can be...
Expensive                         Cost-effective
Time-consuming                    Strategically timed
Tangential                        Integrated
Technical                         Accurate
Not inclusive                     Engaging
Academic                          Practical
Punitive                          Helpful
Political                         Participatory
Useless                           Useful

SLIDE 13

Are You Being Pushed or Pulled Into Evaluation?

Pushed: external mandates from funders, authorizers, or others. Performance measures are really just another way of asking, “How are we doing?”, leading to a deeper exploration of “Why?”

Pulled: internal need to determine how the program is performing and what can be improved.

It’s usually a combination of both!

SLIDE 14

A Framework for Program Evaluation

https://www.cdc.gov/mmwr/PDF/rr/rr4811.pdf

[Figure: CDC Framework for Program Evaluation, with the four evaluation standards at its center: Utility, Feasibility, Propriety, Accuracy]

SLIDE 15

Research or Evaluation?

SLIDE 16

RESEARCH

  • Seek to generate new knowledge
  • Researcher-focused
  • Hypotheses
  • Make research recommendations
  • Publish results

EVALUATION

  • Information for decision making
  • Stakeholder-focused
  • Key questions
  • Recommendations based on key questions
  • Report to stakeholders

SLIDE 17

Distinguishing Principles of Research and Evaluation

(CDC, Introduction to Program Evaluation for Public Health Programs, 2012)

Planning
  Research: scientific method (state hypothesis, collect data, analyze data, draw conclusions)
  Program evaluation: framework for program evaluation (engage stakeholders, describe the program, focus the evaluation design, gather information, justify conclusions, ensure use and share lessons learned)

Decision Making
  Research: investigator-controlled
  Program evaluation: stakeholder-controlled

Standards
  Research: validity (internal: accuracy, precision; external: generalizability)
  Program evaluation: repeatability (utility, feasibility, propriety, accuracy)

Questions
  Research: facts: descriptions, associations, effects
  Program evaluation: values: merit (quality), worth (value), significance (importance)

SLIDE 18

Distinguishing Principles of Research and Evaluation (cont’d)

Design
  Research: isolate changes and control circumstances (minimize context)
  Program evaluation: incorporate changes and account for circumstances (maximize context; encourage flexibility)

Data Collection
  Research: limited number; sampling strategy critical; concern for protecting human subjects
  Program evaluation: multiple sources (triangulation preferred); sampling strategy critical; concern for protecting human subjects

Analysis & Synthesis
  Research: at the end; scope focused on specific variables
  Program evaluation: ongoing; scope integrates all data

Uses
  Research: disseminate to interested audiences in various formats
  Program evaluation: feedback to stakeholders; build capacity; disseminate to interested audiences

SLIDE 19

Why Evaluate our Programs?

  • Monitor progress toward program goals
  • Determine whether program components are producing the desired progress on outcomes
  • Permit comparisons among groups, particularly among populations with disproportionately high risk factors and adverse health outcomes
  • Justify the need for further funding and support
  • Find opportunities for continuous quality improvement
  • Ensure that effective programs are maintained and resources are prioritized

SLIDE 20

The Top Six Reasons to Love Your Data

Data...

1. Helps us make better decisions
2. Tells us more about our “customers”
3. Helps us improve services
4. Allows us to do really cool things
5. Helps us analyze processes
6. Helps us learn what’s working and what’s not
SLIDE 21

“Vision without measurement is just dreaming. Measurement without vision is just wasted effort.”

(IRI, 2011)

SLIDE 22


Understanding & Using the Data

DATA → KNOWLEDGE → ACTION

SLIDE 23

[Chart: 22% “no data”]

SLIDE 24


Performance Management

Goal: continuous improvement through systematic and constructive action

Program Evaluation
  • Design
  • Data Collection
  • Data Analysis
  • Report Results

Quality Assurance
  • Quality of the data
  • Ensure standards of quality are met
  • Corrective Actions
  • Evaluation of Actions Taken

SLIDE 25

Performance and WIOA

  • As of July 1, 2017: 392 data elements
  • 150 new elements under WIOA: barriers to employment, education, credentials, customized training, measurable skill gains, employment, post-exit
  • Reporting moved from annual to quarterly, and from the federal fiscal year (FFY) to the program year (PY, July 1 to June 30)
  • Reports due 45 days after the end of each quarter (a firm deadline): Nov. 15, Feb. 15, May 15, Aug. 15
  • RSA Portal and Edit Checks (see the sketch below)
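The slide names edit checks without defining one, so here is a minimal sketch, in Python, of the kind of pre-submission edit check an agency might run on its quarterly case data before uploading through the RSA portal. All field names and records here are hypothetical illustrations, not actual RSA-911 data elements.

```python
# Minimal edit-check sketch; field names (case_id, exit_date, exit_type,
# hourly_wage, hours_per_week) are hypothetical, not RSA-911 elements.
from datetime import date

records = [
    {"case_id": "A-101", "exit_date": date(2018, 3, 30), "exit_type": "employed",
     "hourly_wage": 11.50, "hours_per_week": 30},
    {"case_id": "A-102", "exit_date": None, "exit_type": "employed",
     "hourly_wage": None, "hours_per_week": None},
]

def edit_check(rec):
    """Return a list of problems found in one record (empty list = passes)."""
    problems = []
    if rec["exit_type"] == "employed":
        if rec["exit_date"] is None:
            problems.append("employed exit is missing an exit date")
        if not rec["hourly_wage"] or rec["hourly_wage"] <= 0:
            problems.append("employed exit is missing a valid hourly wage")
        if not rec["hours_per_week"] or rec["hours_per_week"] <= 0:
            problems.append("employed exit is missing valid weekly hours")
    return problems

# Flag records needing correction before the quarterly deadline.
for rec in records:
    for problem in edit_check(rec):
        print(f"{rec['case_id']}: {problem}")
```

This also mirrors the quality assurance cycle on the previous slide: failed checks become the corrective-action list, and re-running the checks after fixes evaluates the actions taken.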
SLIDE 26

Performance and WIOA

1. % of participants who are in competitive integrated employment during the 2nd quarter after exit
2. % of participants who are in competitive integrated employment during the 4th quarter (12 months) after exit
3. Of those in competitive integrated employment during the 2nd quarter after exit, the median earnings: the middle value when all participants’ earnings are ordered from lowest to highest (see the worked sketch below)
4. Credential Rate
5. Measurable Skill Gains
6. Effectiveness in Serving Employers (determined jointly with core partners)
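To make measures 1 through 3 concrete, here is a minimal worked sketch in Python using a handful of invented exit records; actual calculations follow RSA's detailed reporting specifications, so treat this only as an illustration of the arithmetic.

```python
import statistics

# Hypothetical exit records, invented for illustration:
# (in competitive integrated employment (CIE) in the 2nd quarter after exit,
#  in CIE in the 4th quarter after exit, total 2nd-quarter earnings in dollars)
exits = [
    (True,  True,  4_200),
    (True,  False, 3_100),
    (False, False,     0),
    (True,  True,  5_750),
    (True,  True,  2_900),
]

# Measures 1 and 2: share of all participants in CIE during Q2 / Q4 after exit.
q2_rate = sum(1 for q2, _, _ in exits if q2) / len(exits)
q4_rate = sum(1 for _, q4, _ in exits if q4) / len(exits)

# Measure 3: MEDIAN earnings of those in CIE during Q2 -- the middle value of
# the ordered earnings, not the midpoint of the highest and lowest values.
q2_earnings = [earn for q2, _, earn in exits if q2]
median_earnings = statistics.median(q2_earnings)

print(f"In CIE, 2nd quarter after exit: {q2_rate:.0%}")   # 80%
print(f"In CIE, 4th quarter after exit: {q4_rate:.0%}")   # 60%
print(f"Median Q2 earnings: ${median_earnings:,.0f}")     # $3,650
```

With these invented numbers the median is $3,650, whereas the midpoint between the highest earner ($5,750) and the lowest ($2,900) would be $4,325; the measure reports the former.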

SLIDE 27

WIOA Common Performance Measures

(the six common measures listed on the previous slide)

SLIDE 28

Performance Management Questions

  • What are my program’s goals/outcomes?
  • Who are our stakeholders?
  • What information is needed?
  • How do we measure progress and outcomes?
  • How do we define success?
  • When and how are the measurements presented?
  • To whom should the measurements and data be provided?
  • What is our baseline?

SLIDE 29

Performance Measures

  • Process
  • Progress
  • Outcomes

SLIDE 30

Using Evaluation Data

  • Modify the program to improve service delivery
  • Planning
  • Provide information (quantitative and qualitative) to leadership
  • Identify capacity building needs
  • Recognize stakeholders
  • Identify professional development opportunities
  • Monitor and improve programs
  • Inform a wide variety of stakeholders: consumers, councils, advocates, providers, public

SLIDE 31

Culturally Responsive Approaches

Public Statement on Cultural Competence in Evaluation, American Evaluation Association (2011)

Cultural competence is a stance taken toward culture, not a discrete status or simple mastery of particular knowledge or skills.

Culturally competent evaluators...
...engage with diverse segments of communities to include cultural and contextual dimensions important to the evaluation.
...respect the cultures represented in the evaluation throughout the process.

SLIDE 32

4 Core Foundational Concepts in the Pursuit of Cultural Competence (AEA, 2011)

1. Culture is central to economic, political, and social systems, and to individual identity.
2. Cultural competence is fluid.
3. Evaluators must maintain a high degree of self-awareness and self-examination.
4. Culture has implications for all phases of evaluation (staffing, development, implementation, communication).

SLIDE 33

https://aea365.org/blog/rodney-hopson-on-culturally-responsive-evaluation/

[Figure: Culturally Responsive Evaluation Framework]
SLIDE 34

The Gold Standard

[Figure: the four program evaluation standards (Utility, Feasibility, Propriety, Accuracy) paired with the Culturally Responsive Evaluation Framework]

SLIDE 35

Program Evaluation & Quality Assurance Within Vocational Rehabilitation Programs

  • “Valued knowledge” or “evidence-based practice”?
  • Ask stakeholders what they want
  • Importance of valuing and using participatory “ways of knowing” in knowledge creation and evaluation
  • Build trust through respect, reciprocity, and relationship
  • Performance management focus: how can we improve our service delivery?
  • Share knowledge with others; openly communicate results
  • Reports need to be accessible and culturally appropriate; use infographics when possible

SLIDE 36

Resources

American Indian Vocational Rehabilitation Training and Technical Assistance Center (AIVRTTAC): http://aivrttac.org
Program Evaluation and Quality Assurance Technical Assistance Center (PEQA): https://peqatac.org
American Evaluation Association (AEA): https://www.eval.org
The Summit Group: http://vocational-rehab.com

SLIDE 37

Thank You!

Cayte Anderson, Ph.D., CRC
andersoncay@uwstout.edu
caanderson8@wisc.edu