PRACTICAL STRATEGIES TO SUPPORT DATA COLLECTION AND ANALYSIS
WE ARE GLAD YOU ARE HERE!

LEARNING OBJECTIVES
Participants will recognize basic concepts related to data collection;
ITACC: 2017 AIDD Technical Assistance Institute
Notes for Slide 4: Before collecting data, know why you want to evaluate a project/activity. A logic model identifies the desired outcomes, resources, and activities necessary to accomplish them, and maps the process between work and desired outcomes. This groundwork supports monitoring data, communicating results with internal and external stakeholders, and keeping strategies and activities aligned.
Systematic approach to ensure quality and progress towards goals
Aligning structures, processes, and routines
Logic Model
Evaluation Plan
Annual work plan – data collection
Notes for Slide 6: Collect and Use Data Establish and implement routines and processes for collecting, analyzing, and monitoring data. There is a resource on the itacchelp.org website - http://itacchelp.org/wp-content/uploads/2015/12/2015-TAI-Evaluation-FAQ-3.pdf
Notes for Slide 7: Making decisions to continue, improve or end practices based on data; implementing incentives tied to performance; and engaging and communicating results with internal and external stakeholders.
Notes for Slide 8: To ensure the Council is getting what it wants from projects and activities, it is important to establish priorities for the activity and to communicate expectations. Of course, setting clear, measurable goals and outcomes that align well with the strategies and activities will help.
Notes for Slide 9: When Councils engage other entities to help accomplish goals and objectives, and changes are made to the initial plans, don't forget to adjust the evaluation component as well. ITACC staff frequently hears about data being provided to Councils that does not speak to the overall goals and objectives; identifying the data that is needed to measure impact is critical!
Notes for Slide 10: Let's take a look at the new performance measures. The new measures call for a connection between investments, outputs, and the outcomes of the activity; this translates into results and the short-term, intermediate, and ultimate impacts of Council activities.
Notes from Slide 11:

Data: Good data isn't just a bunch of raw numbers; it's a compilation of facts based on a clear understanding of WHAT you are trying to find out. The results should tell a story. Do you know what you need to know? These PMs aren't new. Although they keep changing slightly, the basic concept of what you need to collect isn't. Make sure you really read and think through these PMs so you know what you need to collect.

Lessons Learned: Initial training is vital. Talk to the grantees about the new way we need to collect and report on data. Put the work in up front: look ahead at what you need to report in the PPR, then track back to figure out how you will get that information from the grantees in a way that's clear and concise. For us, that meant creating our own reporting templates, which we will get to in a minute. Take the lead in telling sub-grantees what you need to know and how you want it reported. I tell the grantees: if they question whether info is relevant, put it in and I will determine that, because I'd rather have more than not enough.

Continuous monitoring allows you to make sure that you're getting the data, and that it's relevant to what you need to report in the PPR. Follow-up with sub-grantees is very important. Make sure you get clarification.

Looking at surveys: not everyone completes every question on a survey! You have to calculate percentages based on the number of people who answered a question, not on the total number of survey recipients.
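The per-question percentage rule above can be sketched in a few lines. Everything here is hypothetical (the question names, answers, and `percent` helper are invented for illustration); the point is that the denominator is the number of people who answered that question, not the whole respondent list.

```python
# Hypothetical survey responses; None marks a skipped question.
responses = [
    {"q1": "yes", "q2": "agree"},
    {"q1": "no",  "q2": None},      # this respondent skipped q2
    {"q1": "yes", "q2": "agree"},
]

def percent(question, answer, surveys):
    """Percent choosing `answer`, out of those who answered `question`."""
    answered = [s[question] for s in surveys if s.get(question) is not None]
    if not answered:
        return 0.0
    return 100 * answered.count(answer) / len(answered)

# q2 was answered by 2 of 3 people, and both said "agree": 100%.
# Dividing by all 3 surveys instead would wrongly report about 67%.
print(percent("q2", "agree", responses))
```

The same helper works for any question, which makes it easy to spot when skip rates are high enough to mention in the PPR narrative.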
PPR/IMPACT
How We Show Our Worth
STATE PLAN
DATA
PPR
OUR IMPACT
Notes from Slide 13: The Oreo effect. The filling is the data collection, it’s what holds the top and bottom wafers together. Without it, you’d just have chocolate wafers, which are ok, but it’s just not an Oreo.
Notes from Slide 14: With change, Councils have the opportunity to revisit the evaluation tools that will best meet the Council's needs…
Notes for Slide 15: Adapted AIDD performance measures into quarterly progress reporting and surveying processes, with follow-up provided by monitoring visits. Program monitoring site visits complement the sub-grantee reporting process, in addition to ensuring compliance and integrity with the Terms and Conditions, relevant state and federal rules, and the original grant Request for Proposal. While the SC DD Council provides the minimum requirements for survey data collected, we encourage sub-grantees to modify our tools to best meet their program needs (surveys and other impact-gathering tools may be modified only; the quarterly progress report is standard). The most common question from DD Council program staff to sub-grantees is "how are you measuring the impact of your programming?"
data being collected is relevant to all involved.
people being impacted by activities and raise awareness of the SC DD Council (we are often confused with our state's DD service provider, DDSN)
to be a part of (not a requirement, though)
developed a peer network of support while learning about new employment benefits) but also long-term (e.g., participants organized themselves into a local "advocacy" group in Charleston and speak regularly at City Council meetings. This contact resulted in a member being asked to join the Mayor's Advisory Committee a year after the individual participated in the original grant activities). Projects working directly with individuals and families are asked to assess how their grant activities are affecting participants' lives outside of grant activities. E.g., a new skill or knowledge area might be introduced to participants, but how are the participants using these new skills, and how have their lives in general been affected by being a part of sub-grantees' activities?
Notes from Slide 16: Word document created with the "developer tools tab," "design mode," and "controls panel," including "click text" boxes to help guide sub-grantees through the QPR. The document is "locked" so sub-grantees cannot accidentally modify it and can only enter info in the "click text" boxes. There are instances in which a sub-grantee needs to add rows to the objectives section of the QPR, in which case the sub-grantee must come to program staff. This is good, because some sub-grantees include a lot of objectives in their programming, and this affords an opportunity to discuss realistic expectations for a 12-month program. More than six objectives on a grant project is a "rose flag" (not a red flag) to program staff to take a closer look at the sub-grantee's programming expectations so they don't freak out (and Council program staff doesn't freak out, too).
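The "more than six objectives" screening rule described above is simple enough to automate when QPRs are processed. This is only a sketch under assumed names (the `review_flag` function and threshold constant are invented here, not part of the Council's actual tooling):

```python
# Assumed threshold from the speaker notes: more than six objectives on a
# 12-month grant triggers a closer look ("rose flag", not a red flag).
MAX_OBJECTIVES = 6

def review_flag(grant_name, objectives):
    """Return a short status line for a grant's objective count."""
    count = len(objectives)
    if count > MAX_OBJECTIVES:
        return f"{grant_name}: {count} objectives - review scope with sub-grantee"
    return f"{grant_name}: ok"

print(review_flag("Employment Pilot", ["objective"] * 8))
print(review_flag("Small Grant", ["objective"] * 3))
```

A check like this could run over all submitted QPRs at once, so staff conversations about scope happen early in the grant year rather than at reporting time.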
Notes from Slide 17:
lawyers, etc.)
should never be counted in the number of project participants). Sub-grantees are asked to show how they are tracking participants and gathering information on disability.
new best and/or promising practices (AIDD PM).
activities are leading to improvements in best/promising practices, this section helps program staff evaluate whether or not the sub-grantees' activities have clear links to the overall program outcomes. Quarterly narrative: focus on impact. Sub-grantees are encouraged to share success stories and lessons learned. Program staff gathers questions for monitoring visits (from the whole QPR, but particularly these narrative sections). Duplication: some grantees partner on multiple projects, including conferences, so program staff has to ensure participants are being counted only once.
Sub-grantees often influence policy and procedure and do not realize it, especially processes internal to their own organization.
with people. Systems change work is more nuanced, and, therefore, we work with individual systems change projects to create assessments for impact.
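The duplication point above, counting each person only once when sub-grantees partner on multiple projects, comes down to taking a set union over rosters before totals go into the PPR. A minimal sketch, assuming each participant has some stable identifier (the rosters and IDs below are invented):

```python
# Hypothetical rosters keyed by an assumed participant ID; two partnered
# projects report overlapping attendees.
project_rosters = {
    "Conference A": {"p01", "p02", "p03"},
    "Partner Workshop": {"p02", "p03", "p04"},
}

# Summing per-project counts double-counts the shared attendees.
naive_total = sum(len(roster) for roster in project_rosters.values())  # 6

# A set union counts each participant exactly once across projects.
unique_total = len(set().union(*project_rosters.values()))  # 4

print(naive_total, unique_total)
```

In practice the hard part is the identifier, not the union: without a consistent ID (or name plus date of birth, or similar), the same person on two rosters cannot be matched at all.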
Questions before we review survey tools?
http://www.scddc.state.sc.us/Reportandreimb.html
Notes for Slide 19:
behavioral, affective, emotional, and cognitive satisfaction. Therefore, program staff TA ensures that any satisfaction survey sub-grantees create is relevant to these measures.
(keeping in mind a broad definition of advocacy/self-advocacy) increases as a result of the project's
Survey Monkey or other online platforms.
Notes for Slide 21: There are overlaps in affective/behavioral and affective/cognitive satisfaction, because affective (liking/disliking something specific) measures indicate how a specific item affects one's behavior (behavioral) or judgment (cognitive).
Affective (like/dislike) and behavioral satisfaction: may measure how something specific affected one's behavior
Behavioral satisfaction: may measure intent or similar behavior
Affective (like/dislike) and emotional satisfaction: measures satisfaction overall or on specific topics
Cognitive satisfaction: measures judgment
http://www.scddc.state.sc.us/documents/DDC%20Outcomes%20Survey.docx
Notes for Slide 22:
and their family members.
we partner with them to work on data collection tools appropriate for their audience.
project staff and DD Council program staff or, my favorite, peer-led group interviews.
advocacy, and self-determined behavior. A definition is provided on the survey, and sub-grantees are encouraged to emphasize the "speaking up for someone else or yourself" aspect of advocacy.
testing, because you don't know what you don't know.
general way
testimony, Advocacy Day for Access and Independence, Community Engagement Day
formal services (depends on the project)
they would like to be a part of, and make connections when appropriate
direction
Notes for Slide 23: Use SDM survey as an example.
Notes for Slide 24:

COLLECTION
projects in-house.
they said they would meet the specific requirements of the Goal/Objective in our 5-year state plan (review sections of the template), and add what I need to know for the PPR.
collect if we limited the PMs we had them report on).
surveys for each activity, and we also do surveys for Council or "in-house" activities. Each survey has the correlating objective printed on it. That info ends up in our Sat Fact. Survey results tab on our spreadsheet.

CALCULATION

IMPACT
feel like your 5-year state plan is following you around! And although we may all take different directions, we all end up back at the same place.
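The workflow sketched above (each survey carries its state-plan objective, and results roll up into a spreadsheet tab) amounts to a group-by on the objective label. A hypothetical sketch of that roll-up; the objective labels, ratings, and summary format below are invented, not the Council's actual spreadsheet:

```python
from collections import defaultdict

# Hypothetical rows: (state-plan objective printed on the survey, 1-5 rating)
survey_rows = [
    ("Objective 1.1", 5), ("Objective 1.1", 4),
    ("Objective 2.3", 3), ("Objective 2.3", 5),
]

# Group ratings under the objective each survey was printed with.
by_objective = defaultdict(list)
for objective, rating in survey_rows:
    by_objective[objective].append(rating)

# One summary line per objective, ready for a results tab or the PPR.
for objective, ratings in sorted(by_objective.items()):
    mean = sum(ratings) / len(ratings)
    print(f"{objective}: n={len(ratings)}, mean={mean:.1f}")
```

Printing the objective on each survey, as the notes describe, is what makes this grouping trivial; without that label, results from different activities cannot be tied back to the state plan.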