  1. “Placing a Better Bet”: The Importance of Bridging Research and Practice in Youth Mentoring. April 26, 2018. Michael Garringer, Director of Research and Evaluation, MENTOR

  2. MENTOR’s Research Agenda
     • Making sure we can make the case (as strongly as possible) for mentoring’s value to youth and society as a whole
     • Helping practitioners find and apply research to improve their programs
     • Collaborating with researchers to understand more about mentoring relationships and interventions
     • Developing and disseminating tools and evidence-based practice guidance

  3. What we’ll cover today…
     • Why and how research can improve a program’s outcomes
     • Strategies for measuring the right things in any program
     • Applying research in mentoring and other disciplines to your own program
     • Sources of translational research

  4. Why research matters in mentoring work

  5. Are we making a difference?

  6. The practices we use influence the relationships (and outcomes) that result…
     • “Usual care” youth psychotherapy: effect size of .11
     • Evidence-based youth psychotherapy: effect size of .54
     • N = 32 studies; average ES (evidence-based vs. usual care) = .30 (Weisz et al., 2006, American Psychologist)

  7. For comparison…
     • Evidence-based youth psychotherapy: .54 (Weisz et al., 2010)
     • Youth mentoring: .21 (DuBois et al., 2011)
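Assuming these figures follow the usual meta-analytic convention, they are standardized mean difference effect sizes,

\[ d = \frac{\bar{X}_{\text{intervention}} - \bar{X}_{\text{comparison}}}{SD_{\text{pooled}}}, \]

so against Cohen’s common benchmarks (roughly 0.2 = small, 0.5 = medium, 0.8 = large) the .21 for mentoring reads as a small effect and the .54 for evidence-based psychotherapy as a medium one, which matches the “small” and “medium” labels used on the next slides.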

  8. How mentoring programs fare…
     [Bar chart: number of program samples by effect on youth, grouped from negative effect through small, small-to-medium, medium-to-large, and large effects (DuBois et al., 2002)]

  9. Other reviews of the effectiveness of mentoring programs
     • Crime Solutions reviews of mentoring programs:
       – 8 are rated as “effective”
       – 23 are “promising”
       – 14 are “no effects”
     • Several programs with multiple variations show different impacts
     • We are slowly improving the art of making mentoring intentional and grounded in proven approaches

  10. The good news!
     [Line chart: size of youth outcomes, from small to medium effect, plotted against the number of empirically based and theory-based practices a program uses (0–11); outcomes grow as more practices are implemented (DuBois et al., 2002)]

  11. Converting research into Standards of Practice
     • Research-informed standards of practice for our field
     • Adapted from the health care literature on the development of Clinical Practice Guidelines
     • Builds on implementation science
     • Supplements coming soon!
       – STEM mentoring (May/June)
       – LGBTQ youth (August)
       – Workforce and career exploration (Dec./Jan.)

  12. Studies validating aspects of the EEPM (Elements of Effective Practice for Mentoring)
     • Research by Kupersmidt and colleagues found that the “sum total of both Benchmark program practices and Standards were associated with match length and long-term relationships”
       – Neither predicted premature match closure
       – Training was the only Standard that predicted these things
     • Research by Keller has found that programs reporting higher implementation of Benchmark practices and Enhancements had:
       – Stronger staff-mentor interactions
       – Mentors who were more satisfied with the program and felt more effective
       – Stronger organizational learning cultures

  13. Structured mentors can face a range of challenges
     Challenges come from…
     • The youth’s parent/guardian, serious needs of the youth, or cultural barriers and communication barriers
     • Scheduling conflicts/available time
     • Challenges with the program, including lack of support, lack of training, or a difference in values
     • 27% of mentors rated the quality of program support as poor
     Major and Minor Challenges (Major / Minor / Major + Minor):
     – Meeting times/schedules (that work for both): 26% / 47% / 73%
     – Severe needs expressed by youth/family: 25% / 45% / 70%
     – Lack of support by parent/guardian: 30% / 37% / 67%
     – Communication between youth/family: 30% / 37% / 67%
     – Schedule/availability prevent me mentoring: 27% / 38% / 65%
     – Cost of the mentoring activities: 28% / 34% / 63%
     – Getting time off from work to mentor: 24% / 38% / 63%
     – Lack of training for mentoring role: 20% / 43% / 63%
     – Differences in values (me/mentoring program): 18% / 44% / 61%
     – Language or cultural barriers: 23% / 36% / 59%
     – Lack of support by the mentoring program: 24% / 30% / 54%

  14. What drives recommending mentoring to others?

  15. We get what we pay for in mentoring
     Expected frequency (cost per youth):
     – No expectation or requirement: $1,000
     – 2-3 times a month: $1,523
     – Monthly: $1,537
     – Weekly: $1,769
     – More than once a week: $1,847
     – Other (write-in): $1,881
     Hours, pre-match / post-match (cost per youth):
     – None: $1,413 / $1,149
     – 1: $1,413 / $1,340
     – 1-2: $1,433 / $1,746
     – 3-4: $1,541 / $1,933
     – 4+: $1,637 / $2,074
     • This trend also holds true for match support tasks and expected match duration

  16. Higher costs result in matches that tend to persist
     [Chart: increases in match persistence with cost per youth served]
     • This trend is not true for all states, but it holds nationally
     • Sheds light on what it takes to deliver quality services

  17. Mentors seem to be more effective when…
     • The programs they serve in are aligned with evidence-based practices, AND…
     • Mentors are trained to use evidence-based intervention strategies in their work
     “Often this EB practice draws on cognitive-behavioral activities such as thought labelling, behavioral activation, and relaxation exercises (Day et al., 2013). Such mentor approaches stand in contrast to… ‘mentoring as relationship’ (i.e., where the goal is to create more free-flowing, supportive relationships), which continue to dominate youth mentoring practice.”

  18. The best possible “evidence”… Your own

  19. Remember that “evidence-based” is broader than just hard research
     [Diagram: optimal EBP draws on researchers, practitioners, service users, policymakers, provider organizations & intermediaries, and communities. Adapted from Pawson, Boaz, Grayson, Long, & Barnes (2003)]

  20. What to measure in any youth mentoring program
     1. Implementation of services
        – Poor implementation is what results in poor outcomes in most instances
     2. Quality of the mentoring relationship
        – Hard to achieve outcomes if the relationships themselves are rocky
     3. Proximal outcomes that fit the duration and activities of the relationship
        – Measure what makes sense for the time your mentors spend with youth

  21. 1. Implementation of Services
     • Recruitment accuracy (do your applicants fit your target audiences?)
     • Training delivery (is everyone getting the same level of training?)
     • Meeting frequency and intensity (are youth getting the mentoring promised?)
     • Match activities (are matches engaging in the things you expect them to?)
     • Match duration (are the matches lasting long enough to make a difference?)
     • Match support (are you offering the support needed to overcome hurdles?)
     • Alignment with the Elements of Effective Practice for Mentoring (can you demonstrate that you’re running the best program you can?)
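Several of these measures can come straight out of a program’s own match records. As a minimal sketch, assuming a simple activity log exported from a case-management system (the field names and the two-meetings-per-month expectation below are hypothetical), a program could compute meeting frequency, hours, match duration, and the share of the promised dosage actually delivered:

```python
# Minimal sketch of summarizing implementation data from a match activity log.
# All field names and the dosage expectation are hypothetical examples.
from datetime import date

# One record per logged mentor-mentee meeting (invented sample data).
meeting_log = [
    {"match_id": "M001", "date": date(2018, 1, 9),  "hours": 1.5},
    {"match_id": "M001", "date": date(2018, 1, 23), "hours": 2.0},
    {"match_id": "M001", "date": date(2018, 3, 6),  "hours": 1.0},
    {"match_id": "M002", "date": date(2018, 1, 15), "hours": 1.0},
]

EXPECTED_MEETINGS_PER_MONTH = 2  # the dosage the program promises families

def months_spanned(start: date, end: date) -> int:
    """Whole calendar months a match spans, with a minimum of one."""
    return max(1, (end.year - start.year) * 12 + (end.month - start.month) + 1)

def implementation_summary(log):
    """Per-match meeting count, hours, duration, and % of expected dosage."""
    by_match = {}
    for record in log:
        by_match.setdefault(record["match_id"], []).append(record)
    summary = {}
    for match_id, records in by_match.items():
        dates = [r["date"] for r in records]
        duration = months_spanned(min(dates), max(dates))
        expected = EXPECTED_MEETINGS_PER_MONTH * duration
        summary[match_id] = {
            "meetings": len(records),
            "total_hours": sum(r["hours"] for r in records),
            "duration_months": duration,
            "pct_of_expected_dosage": round(100 * len(records) / expected, 1),
        }
    return summary

for match_id, stats in implementation_summary(meeting_log).items():
    print(match_id, stats)
```

The same log can also answer the “participation” questions on the next slide (how much, how frequently, when, and where each youth is served).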

  22. Tips for what to measure around implementation
     • Consistency (and Adaptation)
       – Are program components being implemented as intended?
       – Adaptations may be useful and beneficial.
     • Participation
       – To what extent does each youth take part in or receive the intended activities or experiences?
       – Addresses how much, how frequently, when, and where each activity/experience is received.
     • Quality
       – How well is the program delivered?
       – Are practices being implemented to an intended standard?

  23. 2. Mentoring Relationship Quality
     • Mentoring is a context for the delivery of other practices and interventions
     • But also an intervention unto itself
     • At the heart of each approach is a real relationship (Rhodes, 2006)

  24. What Constitutes Relationships?
     • Mentor feelings of self-efficacy
     • Shared decision-making
     • Instrumental growth
       – Adherence to/results of curricula
     • Trust/closeness
     • Enjoyment
     • Academic or career support
     • Reciprocity
     • Cultural sensitivity/responsiveness
     • Feeling safe/valued
     • Youth-centeredness
     • Mentor-parent relationship
     • Staff-participant relationship
     • Dissatisfaction/unmet expectations
     • Future mentoring receptiveness

  25. Measurement Guidance Toolkit Offers Ready-to-Use Scales and Surveys
     • Additions around this topic should be complete in June 2018
     • “Omnibus” measures (whole relationship)
     • Unique relationship features too (such as group cohesion for programs using a group model)
     • Available at www.nationalmentoringresourcecenter.org

  26. 3. Outcome Measurement
     • Potential categories include:
       – Mental and Emotional Health
       – Social Emotional Skills
       – Healthy and Prosocial Behavior
       – Problem Behavior
       – Interpersonal Relationships
       – Academics
       – Risk and Protective Factors
     • Toolkit offers ready-to-use scales in all these domains

  27. Tips for how to measure results (outcomes)
     • Don’t look for too many outcomes at once
       – Can make you look less successful than you really are
     • Don’t look too far out
       – Remember that you can’t control what happens when youth leave your services; focus on proximal outcomes
     • Don’t cherry-pick your results when reporting!
     • Don’t report youth achievements or changes as proof of your impact without a counterfactual (use a comparison group, at the very least; see the sketch below)
     • Don’t use homegrown surveys and scales
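On that comparison-group point, a minimal sketch of the “at the very least” analysis (the scores below are invented, and the outcome is assumed to come from a validated scale rather than a homegrown survey) is a standardized mean difference between program youth and a comparison group:

```python
# Minimal sketch: compare program youth to a comparison group on one outcome
# scale using a standardized mean difference (Cohen's d). Scores are invented.
from statistics import mean, stdev

program_scores    = [3.8, 4.1, 3.6, 4.4, 3.9, 4.2]   # program youth, post-program
comparison_scores = [3.5, 3.7, 3.4, 3.9, 3.6, 3.3]   # comparison group, same scale

def cohens_d(treated, comparison):
    """Standardized mean difference using a pooled standard deviation."""
    n1, n2 = len(treated), len(comparison)
    pooled_sd = (((n1 - 1) * stdev(treated) ** 2 +
                  (n2 - 1) * stdev(comparison) ** 2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treated) - mean(comparison)) / pooled_sd

print(f"Effect size (Cohen's d): {cohens_d(program_scores, comparison_scores):.2f}")
```

Even a simple contrast like this keeps a program from attributing normal maturation or school-wide trends to its own services.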
