

Establishing Legitimacy among STEM Intervention Programs: The Need for Evaluation
CASEY GEORGE-JACKSON, Ph.D.
BLANCA RINCON
University of Illinois at Urbana-Champaign
ASQ Advancing the STEM Agenda
University of Wisconsin-Stout
July 19-20, 2011

Project STEP-UP: STEM Trends In Enrollment & Persistence for Underrepresented Populations
• Examines factors that impact the entrance into, persistence in, and degree attainment in the STEM fields at large, public, research universities
  • By gender
  • By race/ethnicity
  • By socioeconomic status
  • By STEM field

STEM Intervention Programs (SIPs)
• Examine the design, implementation, and impact of STEM intervention programs on underrepresented undergraduate students
• Serve women, students of color, low-income, and first-generation students
• Sample programs: tutoring, mentoring, financial aid, research experiences, first-year seminars, living-learning communities
• Questions of interest:
  • Has the program been formally evaluated (internally or externally)?
  • If so, what were the focus and results of the evaluation?

Objectives of STEM Intervention Programs
• Recruitment and/or retention
• Encourage and prepare students for graduate education and/or careers in STEM
• Aid in the transition to college
• Increase awareness of STEM majors and careers
• Assist in transforming the composition of the STEM workforce
• Create opportunities for access and success
• Increase representation and success of select populations in STEM

Evaluation
• The systematic review of a program or policy, which uses various methodological approaches, to determine its merit, quality, worth, or value
• Evaluations are used to:
  • Respond to the need for accountability, including efforts “to improve and better programs and society” (Alkin and Christie, 2004, p. 12)
  • Inform decisions and changes
  • Determine best practices
  • Apply for funding
  • Determine and demonstrate merit or worth

Legitimacy Theory
• Found in organizational theory
• Describes how an organization (SIP) gains acceptance due to its relationship with mainstream norms and values
• SIPs may be influenced to align their missions and goals with certain values, but also to demonstrate their value
• As an institution becomes legitimate, it sustains the flow of resources from the environment to the organization
• SIPs use evaluations to demonstrate the value of their services, gain support, and secure funding based on the demonstration of desired outcomes
• Legitimacy increases resources and support over time; sustains services to students

Legitimation Process for SIPs
• SIP provides recruitment and retention services to students →
• SIP uses evaluation to demonstrate the value of services and desired outcomes →
• SIP gains legitimacy and is viewed as legitimate by stakeholders →
• SIP gains access to resources (finances & human resources)

Data & Methods
• Data
  • Collected in 2009-2010
  • 10 large, public, research universities
  • SIP directors & administrators
  • 137 invited; 55 interviewed
• Qualitative analysis
  • Semi-structured interviews with program administrators
  • Coded for common themes and issues

Summary of Evaluation Efforts
• Structure of evaluations:
  • 42%: Formal internal evaluation
  • 18%: Informal internal evaluation
  • 18%: Formal external evaluation
  • 15%: Combination of internal and external
  • 6%: No response
• Focus of evaluations:
  • Students’ experiences, including issues of climate
  • Student outcomes, including recruitment and retention in STEM
  • Extent the SIP’s mission is being met
• Evaluation methods and techniques:
  • Pre- and post-tests
  • Focus groups
  • Exit interviews
  • Observation
  • Students’ self-evaluation
  • Comparison groups

Internal vs. External Evaluations
• Internal evaluations
  • Conducted own data collection and assessment
  • Formal (e.g., surveys) and informal (e.g., anecdotal information) approaches
  • Some staff members trained and/or experienced in conducting evaluations
• External evaluations
  • Hired outside evaluators; paid from the program’s budget
  • Partnered with evaluators on campus (e.g., graduate students in education or evaluation programs)

Results: Use of Evaluation Results
• Make decisions about and inform changes to the SIP
  • “For us to figure out what we’re doing right and wrong”
• Develop new programs and services
  • “Where our students fit in and where they’re lacking”
• Report to others, including campus diversity offices
  • Bottom line is the “numbers” (students entering and succeeding in STEM)
• Share with funders who want to assess the impact of a particular program and its services
• Aids in establishing legitimacy
  • Enables the program to secure recurring funding & new resources

Results: Evaluation as a Requirement
• Meet requirements established by the funder (e.g., NSF)
• Provide evidence of specific outcomes in order for funding to be renewed
  • “Good stewards of that money”
  • “College wants to make sure they’re getting their money’s worth”
• Criticisms:
  • Funds for evaluation should go towards serving students
  • Expectations may not be clearly articulated
    • “Vague pressure to evaluate our programs”

Results: Evaluation Expertise
• Partner with local experts
  • Faculty or graduate students in education departments and/or evaluation programs
  • Experts speak the language of evaluation
  • Demonstrates level of expertise
• Lack of staff expertise
  • Need for training and/or purposeful hiring
  • “I’m not a statistician. I don’t know how to design a questionnaire. I don’t know how to do that … I’m a community organizer.”

Results: Resource Constraints
• A significant roadblock to performing evaluations
  • Funding
  • (Qualified) staff
  • Knowledge of evaluation
  • Time
• Can result in incomplete evaluation efforts
  • Data gathered but not analyzed
• Can lead to difficult decisions
  • Importance of evaluation still recognized
  • Using funds to conduct evaluations or to provide services

Limitations
• Limited generalizability
  • 10 large, public, research universities
  • Four-year, doctoral-granting universities
  • Predominantly White Institutions (PWIs)
• Participant recruitment based on publicly available information about SIPs on each university’s website
• Response rate based on self-selection
• Analysis based on opinions and perceptions of directors and administrators
  • May not reflect opinions or perceptions of funders or other stakeholders

Recommendations
• Conduct evaluations and assessments
  • To demonstrate and report the value and worth of the SIP to others
  • To secure additional funding and support
  • To inform decisions, changes, and development of new programs/services
• Online resources:
  • National Center for Women in Information Technology: http://www.ncwit.org/resources.assessment.html
  • Pell Institute Evaluation Toolkit: http://toolkit.pellinstitute.org/

Recommendations (cont’d)
• Seek out strategic partnerships
• Hire graduate student and/or faculty evaluators from the local campus
• Budget evaluation activities in requests for funding
• Provide training opportunities to current staff members
• Hire new staff members with evaluation experience
• Collaborate with similar SIPs or student service programs to combine evaluation efforts and resources

Conclusions
• View evaluations as an important and necessary way to gain legitimacy, garner support, and secure funding
• Program design improves based on the use of evaluation results, so more students benefit and/or receive the services they need
• Legitimized SIPs are better situated and able to serve students, recruit and retain students in STEM fields, and contribute to students’ educational success

Questions & Discussion

Contact Information
Project STEP-UP
stem@education.illinois.edu
http://stepup.education.illinois.edu/
http://twitter.com/ProjectStepUP
Facebook: step-up project

This material is based upon work supported by the National Science Foundation under Grant No. 0856309. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
