SLIDE 1

The “Measurability” of Organizational Values & Ethics: Lessons and Experiences from IDRC

Colleen Duggan, Evaluation Unit

Ethics Practitioners Association of Canada Workshop, Ottawa, April 18th, 2012

SLIDE 2

Common questions that need to be asked before you start evaluating

  • What are you expecting to change?
  • Who are you trying to influence?
  • Why are you evaluating? (purpose, use)
  • Who is this evaluation for? (user)
  • What would success look like? Failure?
SLIDE 3

Evaluating Research, Evaluating Ethical Climate/Cultures: Common Challenges in Measuring Intangible Outcomes

  • The attribution problem
  • The multiple pathways to “impact” problem
  • The timeline to impact problem
SLIDE 4

How is Evaluating Ethics Different?

SLIDE 5

Challenge: Did the change happen because it’s the “right thing to do” (values-driven), or because it’s something that “must be done” (compliance-driven)? This is especially hard to determine in an accountability-driven environment.

SLIDE 6

Solution: Evaluate both for accountability and for organizational learning. Look for compliance as well as cultural shifts.

SLIDE 7

Challenge: Timeframes can be unpredictable.

[Diagram: the reporting timeframe runs in fixed 1-year cycles, while the influencing timeframe toward the goal may stretch over 1, 5, or 10 years.]

SLIDE 8


Solution: Assess progress and contribution, not just the end result.

SLIDE 9

What does changing the ethical culture or climate entail? What does enhancing compliance entail?

SLIDE 10

Your Organization: Who are you trying to influence?

[Diagram: “Framework to Illustrate Ethical Change Strategy”, based on Michael Hoffman’s Ethical Maturity Model. One axis is the AUDIENCE (individuals, groups/units, senior decision-makers); the other is the OUTCOME sought (awareness, reasoning, action, leadership). Example activities/outputs plotted on the grid: public awareness campaigns, code of conduct, public polling, policymaker education, unit-based education campaign, political will campaigns, rewards & recognition program, ethics leadership development, senior integrity officer, public forums, ethics champion development model, ethics legislation, values-based recruitment drive, managers’ V&E dialogue kit, coalition building, course on values and ethics.]

SLIDE 11

What can we measure about ethical change? How do we know we are making a difference?

SLIDE 12

Measure meaningful things that capture scale and ethical embeddedness. Don’t just count what is easy to quantify.

SLIDE 13

[Diagram: progress toward an ethical compliance climate and culture, with interim outcomes marked along the way.]

Measure the changes made along the way, not just the end result.

SLIDE 14

Interim outcomes are expected and unexpected changes in our organizations as we work toward the goal.

Think about the different sorts of changes you will see in your audiences.

SLIDE 15

[Diagram: interim outcomes sit between activities and outputs on one side and goals on the other, grouped by the kind of change sought: awareness, will, action, and leadership.]

Examples of interim outcomes:

  • Increased knowledge
  • Changed attitudes or beliefs
  • Increased salience
  • Increased issue visibility or recognition
  • Increased media coverage
  • Reframing of the issue
  • Increased willingness to act
  • Increased collaboration among ethics advocates
  • New and active advocates
  • New and active high-profile champions
  • Increased capacity to act
  • Increased personal or collective efficacy
  • Approval of enhanced ethics legislation

SLIDE 16

[Framework diagram repeated: audiences (individuals, groups/units, decision-makers) mapped against outcomes (awareness, reasoning, action, leadership).]

Use the framework to think about interim outcomes:

  • WHO will change as a result of your programs’ work?
  • HOW will they change?
  • HOW will you know?

SLIDE 17

[Framework diagram repeated, with example audiences plotted against the outcome stages: human resource officers, front-line employees, managers.]

Where are your audiences, and how far do you need to move them toward the goal of an ethical culture, climate, and full compliance?

SLIDE 18

How can we measure it?

SLIDE 19

Traditional evaluation methods: interviews, surveys, focus groups, polling.

SLIDE 20

IDRC’s Corporate Assessment Framework (CAF)

  • A tool for corporate-level mission assessment
  • Focuses on the work that IDRC senior management does to guide program thinking and systems
  • And on the way IDRC staff implement programs in line with this thinking and these systems

Non-traditional evaluation method: content/discourse analysis

SLIDE 21


The Framework

  • Based on 7 performance areas identified by senior management
  • Identified as critical to mission-level assessment
  • 3 strategic goals
  • 4 operating principles fundamental to the way IDRC works

SLIDE 22


The Performance Areas

SLIDE 23


The Performance Areas

  • Enhancing Capacities: … strengthen the capacities of Southern researchers …
  • Policy and Technology Influence: link research to policy formulation and implementation …
  • Canadian Partnerships: … collaborative research that is mutually beneficial
  • Donor Partnerships: … like-minded and innovative donors
  • Gender Equality and Women’s Rights: … mainstreaming gender … supporting gender-transformative and gender-specific research
  • Strategic Knowledge Gathering: gathering and use of knowledge and feedback to … respond to the needs of developing countries …

SLIDE 24


How Does it Work?

  • Collection and coding of data on the performance areas (466 documents in 2007)
  • All levels
  • Computer-assisted coding (NVivo) with a coding frame and metrics (a rough illustrative sketch follows this list)
  • Triangulation
  • Management response
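
The slides do not spell out the coding mechanics; in practice this was done in NVivo with a coding frame applied by analysts. Purely as an illustration, the minimal Python sketch below shows what a simple keyword-based coding pass over a document set could look like. The coding frame, keyword lists, and file names here are hypothetical, not IDRC's actual frame.

```python
import re
from collections import Counter

# Hypothetical coding frame: each performance area mapped to a few indicative
# keywords. The real CAF frame is richer and was applied in NVivo with analyst
# judgment rather than simple keyword matching.
CODING_FRAME = {
    "evaluative_thinking": ["evaluation", "evidence", "results", "learning"],
    "enhancing_capacities": ["capacity", "capacities", "training", "strengthen"],
    "policy_influence": ["policy", "policymaker", "legislation"],
}

def code_document(text: str) -> Counter:
    """Count keyword 'hits' per performance area in a single document."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = Counter()
    for area, keywords in CODING_FRAME.items():
        hits[area] = sum(words.count(k) for k in keywords)
    return hits

def code_corpus(documents: dict[str, str]) -> dict[str, Counter]:
    """Apply the coding frame to every document in the corpus."""
    return {name: code_document(text) for name, text in documents.items()}

if __name__ == "__main__":
    # Stand-in corpus; the 2007 CAF exercise coded 466 real documents.
    docs = {
        "board_minutes.txt": "The evaluation evidence points to results we can use for learning.",
        "program_report.txt": "Training helped strengthen capacities and inform policy debates.",
    }
    for name, hits in code_corpus(docs).items():
        print(name, dict(hits))
```

Even in this toy form the output feeds two of the sample metrics on SLIDE 27 (keyword hits and documents coded); in the actual CAF the counts were only one input, triangulated with qualitative coding of depth and purpose and with management response.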
SLIDE 25


The Premise

This approach is grounded in, and tests, the idea that the work of managers is to discuss, deliberate, and consider, and that the nature, content, and quality of these discussions and decisions are what move the organization forward and contribute to mission-level performance.

SLIDE 26


Evaluative Thinking

The Centre supports evaluative thinking by staff and partner organizations: the effort to be clear and specific about the results being sought and the means used to achieve them, and to assure the systematic use of evidence to demonstrate achievements for both learning and accountability purposes.

SLIDE 27

Sample Metrics (for Evaluative Thinking)

  • Are the core issues related to Evaluative Thinking being documented, and if documented, with what frequency?
  • Where documented and applicable, with what depth of analysis, discussion and/or deliberation is ‘Evaluative Thinking’ being discussed?
  • Number of ‘hits’ across data for keywords
  • Number of documents coded for each characteristic and performance area (see the tabulation sketch after this list)
  • The nature of, or purpose for, deliberation on evaluation
  • The intentionality with which an evaluation is discussed
  • Reference made to wider body of literature
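
Two of these metrics are directly countable from coded output: keyword ‘hits’ across the data and the number of documents coded per characteristic or performance area. Continuing the hypothetical example under SLIDE 24, the minimal tabulation sketch below (with made-up counts) shows how those two counts could be rolled up; the remaining metrics, such as depth of deliberation or intentionality, call for analyst judgment rather than counting.

```python
from collections import Counter

# Hypothetical coded output: document name -> keyword hits per performance area
# (the kind of result the coding sketch under SLIDE 24 produces; figures made up).
coded = {
    "board_minutes.txt": Counter({"evaluative_thinking": 4, "policy_influence": 0}),
    "program_report.txt": Counter({"evaluative_thinking": 1, "policy_influence": 3}),
}

def total_hits(coded_docs: dict[str, Counter]) -> Counter:
    """Metric: number of keyword 'hits' across all data, per performance area."""
    totals = Counter()
    for hits in coded_docs.values():
        totals.update(hits)
    return totals

def documents_coded(coded_docs: dict[str, Counter]) -> Counter:
    """Metric: number of documents with at least one hit, per performance area."""
    counts = Counter()
    for hits in coded_docs.values():
        counts.update(area for area, n in hits.items() if n > 0)
    return counts

print(total_hits(coded))       # Counter({'evaluative_thinking': 5, 'policy_influence': 3})
print(documents_coded(coded))  # Counter({'evaluative_thinking': 2, 'policy_influence': 1})
```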

SLIDE 28


Theoretical Underpinnings

  • Rooted in discourse analysis, institutionalization theory, and organizational studies
  • Language as fundamental to institutionalization: how the social ideas and norms that comprise organizations are created, maintained, and changed
  • Attention to the flow of texts: where the texts come from, how they are used, who creates them, and how they are connected
  • Influence of text: work on “sensemaking” (Weick), legitimation and legitimacy (Phillips, Lawrence and Hardy)

SLIDE 29


What can the CAF tell IDRC?

  • What the Centre is reporting on, discussing, and what decisions are made, along with documentation on results at the project level
  • Does not rank performance: the performance areas do not have simple targets or benchmarks
  • Part of the analysis happens with management
SLIDE 30

THANK YOU!

*Special thanks to Julia Coffman of the Center for Evaluation Innovation. A number of these slides were borrowed from her presentation “Evaluating for Influence”, September 13th, 2012

www.idrc.ca/evaluation