Washington, District of Columbia September 2017
Standard 4: Program Impact
Emerson Elliott - Special Projects, CAEP Jennifer Carinci – Director of Research and Engagement, CAEP
Session Overview: the standard; what the standard is trying …
Fall 2017 | Washington, D.C.
The provider demonstrates the impact of its completers on P-12 student learning and development, classroom instruction, and schools, and the satisfaction of its completers with the relevance and effectiveness of their preparation.
effectiveness, and satisfaction? What research methodologies could you feasibly employ to gain such information?
teachers
If state data are not available:
the data
The provider demonstrates, through structured and validated observation instruments AND/OR student surveys, that completers effectively apply the professional knowledge, skills, and dispositions that the preparation experiences were designed to achieve.
– Comparison of exit surveys with in-service surveys
“...Standard 4 addresses the results of preparation at the point where they most matter—in classrooms and schools.” “...judgment (of candidate knowledge and skills) is finally dependent on the impact the completers have on-the-job with P-12 student learning and development.”
Criteria for Performance Excellence
preparation where it matters most—in the schools and classrooms where completers are employed
– from state to state
– within states
“Standard 4 asks the right questions”
show comparisons to performance standards
Guidelines for EPP self-study reports: present results appropriately, in a way that aligns with the standard
for limitations of any one data source
CONSULT:
STANDARD 4
AREAS FOR IMPROVEMENT (AFIs) MAY BE CITED WHEN
EPP’s case that it meets the standard. E.g.:
have significant deficiencies with respect to CAEP’s assessment evaluation framework
STIPULATIONS MAY BE CITED WHEN
representativeness of the data
STANDARD 4 MAY BE DEEMED UNMET WHEN
assigned, and the Standard may or may not be met (depending on other accreditation findings)
When a stipulation is cited for a component, the EPP has 24 months from the decision to provide sufficient evidence to remedy the deficiency.
site visitors.
When data are not available from the state, consider these options:
– Teacher-linked P-12 student learning data from school districts or from individual schools
– Teacher information from district or school tests
– Teacher action research information on P-12 student learning, perhaps in the form of a “portfolio” of different teacher experiences and results with P-12 student learning
CAEP encourages providers whose completers are employed by the same school districts to collaborate in the development and conduct of such options.
These are examples of evidence that CAEP has suggested in Qs and As and in the Handbook
learning impact data that are returned to EPPs, e.g.:
– the psychometric qualities of the P-12 assessments,
– the alignment of the assessments with the state’s curriculum,
– technical features such as the proportion of students included,
– the soundness of the student–teacher link,
– the method of forecasting expected student growth, and
– the adjustments for classroom or school characteristics so that teachers in similar situations can be fairly compared.
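As an illustration only (not CAEP's or any state's actual value-added model), the kind of growth forecast and teacher-level adjustment described above can be sketched in a few lines: fit expected post-test scores from pre-test scores, then treat each teacher's mean residual as a rough impact estimate. All teacher IDs and scores below are hypothetical.

```python
# Hypothetical sketch of a simple student-growth forecast (illustration only).
# Each record: (teacher_id, pre_test_score, post_test_score)
records = [
    ("T1", 50, 62), ("T1", 60, 70), ("T1", 40, 55),
    ("T2", 55, 58), ("T2", 65, 66), ("T2", 45, 50),
]

# Fit expected post = a + b * pre by ordinary least squares (closed form).
pre = [r[1] for r in records]
post = [r[2] for r in records]
n = len(records)
mean_pre = sum(pre) / n
mean_post = sum(post) / n
b = sum((x - mean_pre) * (y - mean_post) for x, y in zip(pre, post)) / \
    sum((x - mean_pre) ** 2 for x in pre)
a = mean_post - b * mean_pre

# Teacher "impact" = mean residual (actual growth minus forecast growth).
impact = {}
for tid, x, y in records:
    impact.setdefault(tid, []).append(y - (a + b * x))
impact = {tid: sum(res) / len(res) for tid, res in impact.items()}
```

Real state models add the further adjustments listed above (classroom and school characteristics); this sketch shows only the core forecast-and-residual idea.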
EPPs should:
– characteristics and patterns in the data (e.g., stability of the data, or trends in the data),
– their interpretations of the data (e.g., comparisons with completers from other EPPs, possible influences on the data from the particular places in which completers are employed, and consistency or differences in data compared with other sources such as employer surveys or teacher observation measures).
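One way an EPP might quantify the "trend" and "stability" mentioned above is sketched below, using least-squares slope for trend and standard deviation for stability. The yearly survey means are hypothetical, purely for illustration.

```python
# Hypothetical yearly means from a completer or employer survey
# (illustrative data only), on a 1-4 satisfaction scale.
years = [2013, 2014, 2015, 2016, 2017]
means = [3.4, 3.5, 3.6, 3.6, 3.8]

n = len(years)
mx = sum(years) / n
my = sum(means) / n

# Trend: least-squares slope of the yearly means (points per year).
trend = sum((x - mx) * (y - my) for x, y in zip(years, means)) / \
        sum((x - mx) ** 2 for x in years)

# Stability: standard deviation of the yearly means (smaller = more stable).
stability = (sum((y - my) ** 2 for y in means) / n) ** 0.5
```

A positive slope with small year-to-year dispersion would support an interpretation of steady improvement rather than noise.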
implications for features of the preparation experiences.
Note, however, that even though the EPP reports data that the state has shared, it may conclude that the state data are of limited value for its demonstration that Standard 4 is met.
pre-service
highly summarized, so of little utility to the EPP.
completers, and being constructed as a continuing activity
specifically include pre- and post-measures for teaching a comprehensive unit.
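For pre- and post-measures of a comprehensive unit, one common summary (offered here as an illustration, not a CAEP requirement) is the normalized gain: the share of each student's available room for growth that was actually gained. The scores below are hypothetical.

```python
# Hypothetical pre/post scores for one comprehensive unit (illustration only).
# Normalized gain = (post - pre) / (max_possible - pre).
MAX_SCORE = 100
pairs = [(40, 70), (55, 80), (70, 85)]  # (pre, post) per student

gains = [(post - pre) / (MAX_SCORE - pre) for pre, post in pairs]
mean_gain = sum(gains) / len(gains)
```

Because the denominator shrinks for students who start high, this metric compares growth more fairly across different starting points than a raw score difference would.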
information is available. But these data are part of the state’s teacher evaluation assessment, so evidence for 4.1 and 4.2 is linked.
and designates one as the path it intends to follow. It will work with a school partner to gather and evaluate classroom data from novice teachers.
that CAEP has examined generally found them insufficient as feedback on the EPP’s own performance
– teacher data
– an instructional unit
– input
– recurring activity
Jennifer Carinci, Director of Research and Engagement: Jennifer.carinci@caepnet.org
Emerson Elliott, Director, Special Projects: Emerson.elliott@caepnet.org