Standard 4: Program Impact – PowerPoint PPT Presentation


SLIDE 1

Standard 4: Program Impact

Emerson Elliott – Special Projects, CAEP
Jennifer Carinci – Director of Research and Engagement, CAEP

Washington, District of Columbia
September 2017

SLIDE 2

Session Overview

  • the standard
  • what the standard is trying to do
  • challenges and how CAEP has addressed them
  • examples of evidence and ways of reviewing
  • illustrations from recent EPP submissions

SLIDE 3

Standard 4 and its Components

SLIDE 4

STANDARD 4: PROGRAM IMPACT

  • The provider demonstrates the impact of its completers on P-12 student learning and development, classroom instruction, and schools, and the satisfaction of its completers with the relevance and effectiveness of their preparation.
  • This standard must be met to be fully accredited.
  • Consider: What evidence do you have that would demonstrate graduates’ impact, effectiveness, and satisfaction? What research methodologies could you feasibly employ to gain such information?

SLIDE 5

4.1 Impact on P-12 Student Learning and Development

  • Direct measures of student learning and development for in-service teachers
  • Addresses diverse subjects and grades
  • P-12 impact or growth data from state teacher evaluations (if available)

If state data are not available:

  • Teacher-linked student assessments from districts
  • Teacher-conducted action research
  • Focus groups producing qualitative data
  • For EPPs using qualitative data, a qualitative research method must be identified for the analysis of the data
  • Results must be reported

SLIDE 6

4.2 Indicators of Teaching Effectiveness

The provider demonstrates, through structured and validated observation instruments AND/OR student surveys, that completers effectively apply the professional knowledge, skills, and dispositions that the preparation experiences were designed to achieve.

  • Multi-dimensional measures of preparation impact
  • State data
  • Work Sample
  • Case Study
  • Action Research

SLIDE 7

4.3 Satisfaction of Employers

  • Employer satisfaction data – EPP or State instruments
  • Descriptive of knowledge and skills that were developed during preparation
  • Surveys, focus groups, interviews, case studies, etc.
  • Data on employment milestones
  • Promotion, employment trajectory, retention

SLIDE 8

4.4 Satisfaction of Completers

  • Completer satisfaction data – EPP or State instruments
  • Descriptive of knowledge and skills that were developed during preparation
  • Surveys, focus groups, interviews, case studies, etc. (See CAEP Evidence Guide)
    – Comparison of exit surveys with in-service surveys (see the sketch below)
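
Where an EPP holds both kinds of survey data, the exit-versus-in-service comparison can be a simple tabulation. Below is a minimal sketch in Python/pandas; the file names, columns ("item", "rating" on a 1-5 scale), and layout are illustrative assumptions, not a CAEP-prescribed format.

```python
# Compare completers' exit-survey ratings with their later in-service ratings.
# Hypothetical CSV layout: one row per respondent per item, with columns
# "item" (survey item text) and "rating" (1-5 agreement scale).
import pandas as pd

exit_survey = pd.read_csv("exit_survey.csv")             # at program completion
inservice_survey = pd.read_csv("inservice_survey.csv")   # after time in the classroom

# Mean rating per item in each survey
exit_means = exit_survey.groupby("item")["rating"].mean()
inservice_means = inservice_survey.groupby("item")["rating"].mean()

# Side-by-side comparison; a negative shift flags items where completers
# judge their preparation less favorably once they are on the job.
comparison = pd.DataFrame({"exit": exit_means, "in_service": inservice_means})
comparison["shift"] = comparison["in_service"] - comparison["exit"]
print(comparison.sort_values("shift"))
```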

SLIDE 9

Purpose of Standard 4

SLIDE 10

CAEP Standards Commission Rationale

“...Standard 4 addresses the results of preparation at the point where they most matter—in classrooms and schools.”

“...judgment (of candidate knowledge and skills) is finally dependent on the impact the completers have on the job with P-12 student learning and development.”

SLIDE 11

Emphasis on Results

  • inspired by the Baldrige Performance Excellence Program, Education Criteria for Performance Excellence
  • describes four complementary aspects of the results of educator preparation where it matters most—in the schools and classrooms where completers are employed
  • results matter – effort is not enough

SLIDE 12

Standard 4 in Context

  • Standards 1-3
    • NRC 2010 aspects of teacher preparation “likely to have the strongest effects” on outcomes
  • Standard 4
    • Lynchpin – assessment of outcomes critical to
      • testing inputs and
      • grounding data-informed improvement
  • Standard 5
    • Quality assurance system & organizational improvement
    • research advanced by the Carnegie Foundation for the Advancement of Teaching

SLIDE 13

Addressing the Challenges

SLIDE 14

Acknowledging the Challenges of Standard 4

“Standard 4 asks the right questions”

  • This is a new category of evidence
  • Data gathered on student learning and shared with EPPs differ
    – from state to state
    – within states
  • Methodologies differ for measuring student learning
  • CAEP phase-in policy
    • Making the case, multiple measures
    • Accreditation Council decision
    • CAEP Accreditation Policy pages 17, 33, and 34

SLIDE 15

Guidelines for EPP Self-Study Reports: Present Results Appropriately, in a Way that Aligns with the Standard

  • Describe data sources, trends/patterns, and representativeness
  • Examine differences and similarities
    • Look across licensure areas; show differences across demographic categories (see the sketch below)
  • Identify comparison points and contextualize data
    • Comparisons over time; highlight any existing benchmarks; make normative comparisons to peers; show comparisons to performance standards
  • Discuss implications of the findings for subsequent action
  • Focus is on the standard’s holistic and overarching expectations
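
As one way to act on the disaggregation and comparison guidance above, here is a minimal Python/pandas sketch. The input file, the columns ("licensure_area", "cycle", "score"), and the benchmark value are hypothetical stand-ins for whatever measure and performance standard an EPP actually uses.

```python
# Disaggregate a completer-impact measure by licensure area and data cycle,
# then contextualize against a benchmark, as the guidelines above suggest.
import pandas as pd

data = pd.read_csv("completer_impact.csv")  # hypothetical: one row per completer

# Mean score and group size per licensure area per cycle (trend across cycles)
table = (
    data.groupby(["licensure_area", "cycle"])["score"]
        .agg(mean="mean", n="size")
        .unstack("cycle")
)
print(table)

# Flag areas falling below an illustrative performance standard
BENCHMARK = 3.0  # stand-in value, not a CAEP-defined cut score
area_means = data.groupby("licensure_area")["score"].mean()
print("Areas below benchmark:", list(area_means[area_means < BENCHMARK].index))
```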

SLIDE 16

More Guidelines for EPPs: Frame the Argument at the Standard Level

  • What points or data are provided as evidence?
  • Which points or data support the argument?
  • Which points or data are neutral?
  • Which points or data conflict with each other?
  • Draw conclusions about the extent to which the data support the standard
  • If appropriate, address triangulation and convergence of different forms of evidence to compensate for limitations of any one data source (a convergence check is sketched below)
  • Acknowledge weaknesses in data or evidence and how you are addressing them
  • Appropriate interpretations and conclusions are supported by data
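
The triangulation bullet above can be made concrete with a quick convergence check. This sketch assumes a hypothetical file with one row per completer and two measure columns; the names are illustrative, and a correlation is only one of several ways to examine convergence.

```python
# Check convergence of two independent measures of the same completers --
# e.g., an observation-instrument score and a P-12 growth score. A moderate
# positive correlation supports triangulation; a near-zero or negative one
# signals that the sources conflict and need discussion in the self-study.
import pandas as pd

df = pd.read_csv("completer_measures.csv")  # hypothetical: one row per completer

r = df["observation_score"].corr(df["p12_growth_score"])  # Pearson r
print(f"Convergence between measures: r = {r:.2f} (n = {len(df)})")
```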

SLIDE 17

Evidence and Sufficiency Criteria

SLIDE 18

EVIDENCE SUFFICIENCY: RESOURCES

CONSULT:

  • Assessment Sufficiency Criteria
  • CAEP Evaluation Framework for EPP-Created Assessments
  • Evidence Sufficiency Criteria
  • CAEP Evaluation Rubric for Visitor Teams

SLIDE 19

EVIDENCE SUFFICIENCY: GENERAL RULES

STANDARD 4

  • All components of each standard are addressed
  • At least three cycles of data
    • Sequential and most recent available
  • Results disaggregated by licensure area, campus, delivery method
  • EPP-created assessments meet CAEP’s assessment sufficiency criteria
  • All data are from/about completers (in-service)
  • All phase-in requirements are met
    • Site visits in Academic Year 2017-2018 can present a plan along with progress data
    • Site visits in Fall 2018 and beyond are not eligible for phase-in

SLIDE 20

POTENTIAL ISSUES: STANDARD 4

AREAS FOR IMPROVEMENT (AFIs) MAY BE CITED WHEN

  • The site team identifies specific gaps or inconsistencies in the coherence of the EPP’s case that it meets the standard, e.g.:
    • Instrument quality is poor: EPP-created assessments used to collect Standard 4 data have significant deficiencies with respect to CAEP’s assessment evaluation framework
    • Phase-in plans for one or more components do not meet CAEP’s guidelines for plans
    • Fewer than three cycles of data are provided
    • Less than one cycle of phase-in data collected by calendar 2018
    • Interpretations of evidence are not well grounded in the provided evidence
    • Inaccuracies are found when comparing original data to reported results
    • Measures are not relevant for the purpose for which they are used

SLIDE 21

POTENTIAL ISSUES: STANDARD 4

STIPULATIONS MAY BE CITED WHEN

  • No significant analysis or interpretation of evidence
  • Evidence not responsive to the standard
    • Preservice data rather than in-service
  • Significant aspects of the standard not addressed by relevant measures
  • No efforts to document validity of evidence and/or no information on representativeness of the data
  • No explicit plans for the EPP to move toward stronger Standard 4 data in the future

SLIDE 22

POTENTIAL ISSUES: STANDARD 4

STANDARD 4 MAY BE DEEMED UNMET WHEN

  • Standard 4 will not be met when two or more stipulations are cited
    • within a component
    • across components
  • If required evidence is not provided for any component, a stipulation is assigned, and the standard may or may not be met (depending on other accreditation findings)
  • If the standard is met, with one stipulation cited for insufficient evidence on a component, the EPP has 24 months from the decision to provide sufficient evidence to remedy the deficiency
    • A document review will be conducted by a site team of 2 or 3 site visitors

SLIDE 23

These are examples of evidence that CAEP has suggested in Qs and As and in the Handbook

When data are not available from the state, consider these options:

  • Teacher-linked P-12 student learning data from school districts or from individual schools
  • Teacher information from district or school tests
  • Teacher action research information on P-12 student learning, perhaps in the form of a “portfolio” of different teacher experiences and results with P-12 student learning

CAEP encourages providers whose completers are employed by the same school districts to collaborate in development and conduct of such options.

SLIDE 24

When data are available from the state or its districts, EPPs should:

1. Demonstrate that they are familiar with the sources of the P-12 student learning impact data that are returned to EPPs, e.g.:

  • the psychometric qualities of the P-12 assessments,
  • the alignment of the assessments with the state’s curriculum,
  • technical features such as the proportion of students included and the soundness of the student-teacher link,
  • the method of forecasting expected student growth (a toy illustration follows below), and
  • the adjustments for classroom or school characteristics so that teachers in similar situations can be fairly compared.
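
To illustrate what “forecasting expected student growth” means at its simplest, the toy sketch below predicts each student’s post-test from the pre-test and treats the residual as growth beyond expectation. Actual state models (value-added models, student growth percentiles) are far more elaborate and adjust for classroom and school characteristics; the numbers here are invented purely for illustration.

```python
# Toy illustration of "forecasting expected student growth": fit a
# least-squares line predicting post-test from pre-test, then treat the
# residual as growth beyond (or below) the forecast. Not a real state model.
import numpy as np

# Invented scores, purely for illustration
pretest = np.array([410.0, 425.0, 440.0, 455.0, 470.0, 485.0])
posttest = np.array([430.0, 438.0, 465.0, 472.0, 480.0, 510.0])

slope, intercept = np.polyfit(pretest, posttest, 1)  # expected-growth line
expected = slope * pretest + intercept

residual_growth = posttest - expected  # positive = grew more than forecast
print(np.round(residual_growth, 1))
```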

SLIDE 25

More on when data are available from the state

2. Provide their own analysis and evaluation of the state-provided information on P-12 student learning, e.g.:

  • characteristics and patterns in the data (e.g., stability of the data, or trends in the data),
  • their interpretations of the data (e.g., comparisons with completers from other EPPs),
  • possible influences on the data from the particular places in which completers are employed,
  • consistency or differences in data compared with other sources such as employer surveys or teacher observation measures.

SLIDE 26

AND, FINALLY, WHEN DATA ARE AVAILABLE FROM THE STATE

3. Indicate how they have used the P-12 student impact data to consider implications for features of the preparation experiences.

Note, however, that even though the EPP reports data that the state has shared, it may conclude that the state data are of limited value for its demonstration that Standard 4 is met.

SLIDE 27

Annotated EPP Evidence Examples for 4.1 & 4.2

SLIDE 28

Resource with Illustrations of and Comments on Recent EPP Submissions to Demonstrate 4.1 & 4.2

  • CAEP Standard 4 evidence: A resource for EPPs

SLIDE 29

Example #1—STATE University

  • P-12 academic achievement comparison using NWEA data
  • makes use of data in the state from NWEA-available student growth measures
  • with confirmation from correlated measures and a case study using Danielson
  • case study including TWS-type data that are linked with similar assessments used during pre-service
  • triangulates findings
  • emphasizes the importance, first, of building a tracking system for graduates

SLIDE 30

Example #2—PRIVATE University

  • P-12 student growth
  • Self-study report includes student growth percentiles from the state, but the information is highly summarized, so of little utility to the EPP
  • Complemented by a teacher action research project using volunteering completers, being constructed as a continuing activity
  • design permits links with pre-service data for the same completers
  • candidate tasks are similar to those in a teacher work sample assessment, and specifically include pre and post measures for teaching a comprehensive unit (a gain-summary sketch follows below)
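
A pre/post task of the kind described can be summarized in a few lines of pandas. This is a sketch under assumed names: a hypothetical file with one row per P-12 student and columns "completer_id", "pre", and "post".

```python
# Sketch of the pre/post element of a teacher-work-sample-style task: each
# volunteering completer administers the same assessment before and after a
# comprehensive unit, and the EPP summarizes class-level gains.
import pandas as pd

records = pd.read_csv("action_research_unit.csv")  # hypothetical layout
records["gain"] = records["post"] - records["pre"]

# Class-level summary per completer: mean gain and share of students who improved
summary = records.groupby("completer_id")["gain"].agg(
    mean_gain="mean",
    pct_improved=lambda g: (g > 0).mean() * 100,
    n_students="size",
)
print(summary.round(1))
```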

SLIDE 31

Example #3—PUBLIC University

  • student value-added data as part of state teacher evaluation
  • State value-added data are available but are aggregated so that little useful information is available. But these data are part of the state’s teacher evaluation assessment, so evidence for 4.1 and 4.2 is linked
  • Characteristics of the state system are described
  • complemented by planned teacher action research
  • The EPP shows options that it has considered to complement the state data, and designates one as the path it intends to follow. It will work with a school partner to gather and evaluate classroom data from novice teachers

SLIDE 32

Summary of key points

  • Use available data, but learn their strengths and weaknesses
  • State data on P-12 learning are highly variable from state to state, and self-study reports that CAEP has examined generally found them insufficient as feedback on the EPP’s own performance
  • Need something to complement the state data
  • Many EPPs are developing or conducting some kind of “case study”
  • One important lesson: build a tracking system with your completers (a minimal record sketch follows below)
  • Collaborating with partner school districts may be a strategy
  • Collaboration across EPPs might be another
  • Think about what the data are telling you
  • how you can get the most important messages from the data so that you can consider the implications
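
For the tracking-system lesson above, here is a minimal sketch of the kind of record an EPP might maintain. Every field name is illustrative; CAEP does not prescribe a schema, and the point is only that a stable completer ID lets in-service evidence be linked back to pre-service data.

```python
# Minimal sketch of a completer-tracking record (illustrative fields only).
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CompleterRecord:
    completer_id: str            # stable key shared with pre-service records
    program: str                 # licensure area
    completion_year: int
    district: Optional[str] = None   # updated as employment information arrives
    school: Optional[str] = None
    consented_to_followup: bool = False
    evidence: List[str] = field(default_factory=list)  # e.g., survey, observation, growth data

roster = [
    CompleterRecord("C-0001", "Elementary Education", 2016),
    CompleterRecord("C-0002", "Secondary Mathematics", 2017,
                    district="District A", consented_to_followup=True),
]
print(sum(r.consented_to_followup for r in roster), "completers reachable for follow-up")
```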

SLIDE 33

Case studies have been a frequent strategy for EPPs

  • These can supplement evidence to make it more complete
    • E.g., other grades, subjects
  • And they can provide feedback in useful forms for the EPP
    • E.g., use pre-service teacher work sample-type assessments as a base point for analyzing new in-service teacher data
    • E.g., tasks could include some version of assessment information on P-12 knowledge/skills before and after an instructional unit
  • One approach was in the form of teacher action research—requires completer input
    • E.g., pilot a teacher action research project using volunteering completers, and constructed as an annually recurring activity

SLIDE 34

Take Away Messages

  • Strengths found in evidence:
    • Significant amount
    • Multiple Measures
  • Some concerns in evidence:
    • Evidence Choices
    • Wide Variations

SLIDE 35

SLIDE 36

Contact Us

  • Jennifer Carinci
    Director of Research and Engagement
    Jennifer.carinci@caepnet.org

  • Emerson Elliott
    Director, Special Projects
    Emerson.elliott@caepnet.org