

Jim Pellegrino, LSRI, University of Illinois at Chicago
ASEE Annual Meeting, Columbus, OH, June 28, 2017

The Integration of Teaching, Learning and Assessment: A Design-Based Approach

Jim Pellegrino

“I think you should be more explicit here in Step Two.”

A common problem is that the program designs, instructional activities, and assessment strategies we use in higher education often do not lead to the desired student outcomes. We need to make our assumptions more EXPLICIT rather than believe in miracles.

Learning Sciences Research Institute

Making Explicit What is Often Tacit

  • Learning as the Core for Integrating Curriculum, Instruction & Assessment
  • About “How People Learn”
  • About Deeper Learning & Teaching for Transfer
  • About “Knowing What Students Know”
  • Putting the Pieces Together: Principled Redesign of Advanced Placement Science
  • Some Considerations for the Road Ahead

Learning at the Center of a Larger Coordinated System

Research & Theory on Knowing and Learning

Making Explicit What is Often Tacit

  • Learning as the Core for Integrating Curriculum, Instruction & Assessment
  • About “How People Learn”

Generalizations About Performance: Competence & Expertise

  – Performance develops in communities that value certain forms of knowledge and activity, like modeling in science or design in engineering.
  – Knowledge is tuned to specific patterns of activity, like solving certain kinds of problems or designing classes of objects and tools.
  – Performance increases in scope and precision with multiple, contextualized experiences.
  – No magic levers: practice, disciplined inquiry.

  • Implication: Assessments must be designed to capture the complexity of expertise and competent performance, ranging from mental processes to participation in forms of practice.

Generalizations about Development

  • Not all persons learn in the same way or follow the same paths to competence.
    – Conceptual change is often not a simple, uniform progression, nor is there movement directly from erroneous to optimal solution strategies or incorrect to correct representations.
    – Intermediate forms of knowledge may not resemble expert forms, so simple building-block relations may not hold.
    – Participation often “starts at the edges” and becomes progressively more aligned with core disciplinary practices.
  • Implication: Assessments should identify specific strategies, knowledge representations, and forms of activity with respect to the role they play in developmental trajectories.

Generalizations about Knowledge

  • Disciplinary Knowledge
    – Is organized in ensembles that facilitate its use.
    – Is amplified by processes of self-regulation, or “metacognition,” where learners spontaneously evaluate their knowledge and its limits.
    – Is developed in communities that foster identity and interest.
  • Implications for Assessment: multiple “questions”
    – Knowledge issues: specific facts, procedures, schemas
    – Reflection issues: articulation, evaluation
    – Practice issues: Why prove? Model?

Domain-Based Models at the “Core”

[Diagram: Curriculum, Instruction, and Assessment coordinated around domain-based models of learning & understanding (HPL)]


Making Explicit What is Often Tacit

  • Learning as the Core for Integrating Curriculum, Instruction & Assessment
  • About “How People Learn”
  • About Deeper Learning & Teaching for Transfer

Skills Identified in an Influential OECD Survey

Many, if not all, of these are what we expect of the graduates from our engineering programs

Clarifying the Larger “Educational Question”

  • Why such great interest in 21st Century Skills?
  • What exactly is everyone after -- what do people see as missing from the current education system?
  • What’s different relative to the 20th or 19th centuries?
  • Some possible answers:
    – #1 -- The Holy Grail of Education: TRANSFER
    – #2 -- Not so different than in the past, with the exception of technology-mediated activities
    – #3 -- What’s different is that everyone needs to develop a high level of competency, given the nature of life and work in the 21st century

Clarifying Terms

  • Deeper learning is the process of learning for transfer. It enables an individual to take what was learned in one situation and apply it to new situations.
  • The product of deeper learning is transferable knowledge, including content knowledge in a subject area and procedural knowledge of how, why, and when to apply this knowledge to answer questions and solve problems in the subject area.
  • We refer to this transferable knowledge as “21st century competencies” to reflect that both skills and knowledge are included.


Three Domains of Competence

  • The Cognitive Domain includes three clusters of competencies: cognitive processes and strategies; knowledge; and creativity. These clusters include competencies such as critical thinking, information literacy, reasoning and argumentation, and innovation.
  • The Intrapersonal Domain includes three clusters of competencies: intellectual openness; work ethic and conscientiousness; and positive core self-evaluation. These clusters include competencies such as flexibility, initiative, appreciation for diversity, and metacognition (the ability to reflect on one’s own learning and make adjustments accordingly).
  • The Interpersonal Domain includes two clusters of competencies: teamwork and collaboration; and leadership. These clusters include competencies such as communication, collaboration, responsibility, and conflict resolution.

Teaching for Transfer

  • Emerging evidence indicates that cognitive, intrapersonal, and interpersonal competencies can be taught and learned in ways that promote effective transfer.

Transfer is Supported When:

  • Learners understand general principles, as emphasized in the recent standards in mathematics, science, and English language arts.
  • Learners understand factual and conceptual knowledge in a subject area and also applicable problem-solving strategies.
  • Learners recognize how, when, and why to apply their factual, conceptual, and procedural knowledge and skills.

To Design Instruction for Transfer:

  • Begin with clearly defined learning goals and a model of how learning is expected to develop.
  • Use assessments to measure and support progress toward goals.
  • Provide multiple, varied representations of concepts and tasks.
  • Encourage questioning and discussion.
  • Engage learners in challenging tasks, with support and guidance.
  • Teach with carefully selected sets of examples and cases.
  • Prime student motivation.
  • Use formative assessment to provide feedback.

Making Explicit What is Often Tacit

  • Learning as the Core for Integrating Curriculum, Instruction & Assessment
  • About “How People Learn”
  • About Deeper Learning & Teaching for Transfer
  • About “Knowing What Students Know”

Contexts and Purposes: Distinctions Making a Difference

  • Contexts:
    – small scale: individual classrooms
    – intermediate scale: departments, colleges
    – large scale: university systems, states, nations
  • Purposes:
    – assist learning (formative)
    – measure individual achievement (summative)
    – evaluate programs (accountability)
  • Problem: One size does not fit all
    – Educators at different levels need different information at different grain sizes and time scales
    – Differing priorities, constraints, & tradeoffs

Assessment as a Process of Reasoning from Evidence

  • Cognition: a model of how students represent knowledge & develop competence in the domain
  • Observation: tasks or situations that allow one to observe students’ performance
  • Interpretation: a method for making sense of the data

The three vertices (cognition, observation, interpretation) must be coordinated!

Scientific Foundations of Educational Assessment

  • Advances in the Sciences of Thinking and Learning -- the cognition vertex
    – informs us about what observations are important and sensible to make
  • Contributions of Measurement and Statistical Modeling -- the interpretation vertex
    – informs us about how to make sense of the observations we have made

Why Models of Development of Domain Knowledge are Critical

  • Tell us what are the important aspects of knowledge that we should be assessing.
  • Give us strong clues as to how such knowledge can be assessed.
  • Can lead to assessments that yield more instructionally useful information
    – diagnostic & prescriptive
  • Can guide the development of systems of assessments
    – work across contexts & time


Beyond Issues of Theory to Issues of Design & Use

  • Assessment design spaces vary tremendously & involve multiple dimensions
    – Type of knowledge and skill and levels of sophistication
    – Time period over which knowledge is acquired
    – Intended use and users of the information
    – Availability of detailed theories & data
    – Distance from instruction and assessment purpose
  • Need a principled process that can help structure going from theory, data, and/or speculation to an operational assessment
    – Evidence-Centered Design

Evidence-Centered Design

  • Claim space (student model): Exactly what knowledge do you want students to have, and how do you want them to know it?
  • Evidence model: What will you accept as evidence that a student has the desired knowledge?
  • Task model: What task(s) will the students perform to communicate their knowledge? How will you analyze and interpret the evidence?
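The three ECD models can be read as a simple data contract: a claim, the evidence that would support it, and a task that elicits that evidence. The sketch below is illustrative only; ECD is a design framework, not a software API, and every class and field name here is invented for the sketch.

```python
from dataclasses import dataclass

# Illustrative sketch of the three ECD models. All names are invented;
# ECD itself prescribes no code.

@dataclass
class StudentModel:
    # Claim space: exactly what knowledge should students have,
    # and how should they know it?
    claim: str

@dataclass
class EvidenceModel:
    # What will you accept as evidence that a student has that knowledge?
    observables: list

@dataclass
class TaskModel:
    # What tasks elicit that evidence, and how is it interpreted?
    task: str
    interpretation: str

@dataclass
class AssessmentArgument:
    student: StudentModel
    evidence: EvidenceModel
    task: TaskModel

    def is_coordinated(self) -> bool:
        # The three models "must be coordinated": every layer has to be
        # specified before the assessment argument holds together.
        return bool(self.student.claim and self.evidence.observables
                    and self.task.task and self.task.interpretation)

arg = AssessmentArgument(
    StudentModel("Student can justify a claim about speciation with evidence"),
    EvidenceModel(["cites the data set", "links data to a cladogram model"]),
    TaskModel("Evaluate a data set against a simple cladogram",
              "rubric scoring the link between claim and evidence"),
)
print(arg.is_coordinated())  # True
```

The point of writing it down this way is that a missing layer (a task with no stated evidence, or a claim with no task) becomes immediately visible.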

Making Explicit What is Often Tacit

  • Learning as the Core for Integrating Curriculum, Instruction & Assessment
  • About “How People Learn”
  • About Deeper Learning & Teaching for Transfer
  • About “Knowing What Students Know”
  • Putting the Pieces Together: Principled Redesign of Advanced Placement Science

Why an AP Science Redesign?

  • A 2002 NRC report identified ways to improve advanced study of math and science in the U.S. The report’s recommendations are applicable to all AP course subjects:
    – Emphasize deep understanding rather than comprehensive coverage -- avoid the “mile wide & inch deep” syndrome
    – Reflect current understanding of how students learn in a discipline
    – Reflect current research directions within the disciplines
    – Emphasize the development of inquiry and reasoning skills

Conceptual Approach Builds Upon Work of Others:

  • Understanding by Design, G. Wiggins & J. McTighe (1998, 2006).
  • How People Learn: Brain, Mind, Experience & School, J. Bransford, A. Brown, R. Cocking, S. Donovan, & J. Pellegrino (2000).
  • Knowing What Students Know: The Science and Design of Educational Assessment, J. Pellegrino, N. Chudowsky, & R. Glaser (2001).
  • Evidence-Centered Assessment Design: Layers, Structures, and Terminology, R. J. Mislevy & M. M. Riconscente (2005).


Components & Timeline (2006 through 2011-2015+)

  • Establish Foundations: Disciplinary experts identify essential concepts and reasoning skills
  • Process Design: Based on how people learn, identified shortcomings, and discipline expertise
  • Model of Knowing & Learning: Framework for curriculum and assessment are claims and evidence
  • Curriculum Design: Formative assessments and interpretive framework; Course Description
  • Assessment Design: Tasks derived from claims and evidence; exam score descriptions from claims and evidence
  • Involve AP and Professional Community: Review approach with, and solicit feedback from, instructors (secondary and post-secondary) and scientific communities
  • Professional Development: Provide AP teachers with the curriculum resources & training needed to teach the redesigned courses
  • Operations: Deliver the redesigned course & exam to students, and incorporate into ongoing operational processes

Three Critical Design Phases

Domain Analysis: Core Disciplinary Ideas & Science Practices

The science practices:
1. Use representations and models to communicate scientific phenomena and solve scientific problems.
2. Use mathematics appropriately.
3. Engage in scientific questioning to extend thinking or to guide investigations within the context of the AP course.
4. Plan and implement data collection strategies in relation to a particular scientific question.
5. Perform data analysis and evaluation of evidence.
6. Work with scientific explanations and theories.
7. Connect and relate knowledge across various scales, concepts, and representations in and across domains.


Peer Review by Science Experts

Biology: “Big Ideas”

The unifying concepts or Big Ideas increase coherence both within and across disciplines. A total of four Big Ideas:

  • Big Idea 1: The process of evolution drives the diversity and unity of life.
  • Big Idea 2: Biological systems utilize energy and molecular building blocks to grow, reproduce, and maintain homeostasis.
  • Big Idea 3: Living systems retrieve, transmit, and respond to information essential to life processes.
  • Big Idea 4: Biological systems interact, and these interactions possess complex properties.

Biology: “Enduring Understandings”

For each Big Idea, there are enduring understandings which incorporate core concepts that students should retain: a total of 17 enduring understandings across the four Big Ideas. For Big Idea 1 (The process of evolution drives the diversity and unity of life):

  • Enduring Understanding 1.A: Change in the genetic makeup of a population over time is evolution
  • Enduring Understanding 1.B: Organisms are linked by lines of descent from common ancestry
  • Enduring Understanding 1.C: Life continues to evolve within a changing environment
  • Enduring Understanding 1.D: The origin of living systems is explained by natural processes


AP Science Practices: Level 1

1. Use representations and models to communicate scientific phenomena and solve scientific problems.
2. Use mathematics appropriately.
3. Engage in scientific questioning to extend thinking or to guide investigations within the context of the AP course.
4. Plan and implement data collection strategies in relation to a particular scientific question.
5. Perform data analysis and evaluation of evidence.
6. Work with scientific explanations and theories.
7. Connect and relate knowledge across various scales, concepts, and representations in and across domains.

AP Science Practices: Level 2

Level 1: work with scientific explanations & theories
Level 2:
1. Justify claims with evidence
2. Construct explanations of phenomena based on evidence produced through scientific practices
3. Articulate the reasons that scientific explanations and theories are refined or replaced
4. Make claims and predictions about natural phenomena based on scientific theories & models
5. Evaluate alternative scientific explanations

AP Science Practices: Level 3 -- Evidence of Competency

Level 1: work with scientific explanations and theories
Level 2: justify claims with evidence
Level 3: a description of what observable data are needed to support the claim that a student can achieve the Level 2 practice

Level 3: Examples of Evidence Found in Students’ Work

  • Robustness of evidence (from investigations, theories, or models) mustered in support of a claim
  • Appropriateness of reasoning behind the selection and exclusion of evidence
  • Appropriateness of the model incorporated
  • Consideration of data from multiple sources (e.g., investigations, scientific observations, the findings of others, historic reconstruction, and/or archived data from historical experiments)
  • Differentiation between a claim and the evidence that supports it
  • Differentiation between evidence and explanation
  • Inclusion and reasonableness of a statement of prediction or existence of a phenomenon

Three Critical Design Phases

Evidence-Centered Design, instantiated through the intersection of course content & science reasoning:

  • Claim space: Exactly what knowledge do you want students to have, and how do you want them to know it?
  • Evidence: What will you accept as evidence that a student has the desired knowledge?
  • Task: What task(s) will the students perform to communicate their knowledge? How will you analyze and interpret the evidence?


Integrating the Content and Science Practices

Content + Science Practice = Learning Objective

  • Essential Knowledge 1.B.2: Phylogenetic trees and cladograms are graphical representations (models) of evolutionary history that can be tested
  • Science Practice 5.3: The student connects phenomena and models across spatial and temporal scales
  • Learning Objective (1.B.2 & 5.3): The student is able to evaluate evidence provided by a data set in conjunction with a phylogenetic tree or a simple cladogram to determine evolutionary history and speciation
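The composition rule (Essential Knowledge + Science Practice = Learning Objective) can be sketched as data. The helper name and dictionary layout below are invented for illustration; the AP framework defines the pairing, not any code.

```python
# Hypothetical sketch: cross an essential-knowledge statement with a
# science practice to form an AP-style learning objective. The function
# and field names are invented, not part of any College Board tool.

def compose_learning_objective(ek_id: str, sp_id: str, statement: str) -> dict:
    return {
        # Learning objectives are labeled by the pair they cross, e.g. "1.B.2 & 5.3"
        "id": f"{ek_id} & {sp_id}",
        "statement": f"The student is able to {statement}.",
    }

lo = compose_learning_objective(
    "1.B.2", "5.3",
    "evaluate evidence provided by a data set in conjunction with a "
    "phylogenetic tree or a simple cladogram",
)
print(lo["id"])  # 1.B.2 & 5.3
```

Writing the objective as an explicit cross of content and practice makes it checkable: an objective with no practice attached, or a practice with no content, cannot be formed.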


Connecting the Domain Model to Curriculum, Instruction, & Assessment

Structure of the AP Biology Curriculum Framework

4 Big Ideas → Enduring Understandings → Essential Knowledge → Learning Objectives, integrated throughout with the Science Practices (science inquiry & reasoning)



ECD and the New AP Exam

  • No test items will focus on low cognitive level / declarative knowledge / recall.
  • For each exam item, students will either produce the evidence (CR) or engage with the evidence (SR/MC):
    ► explain ► justify ► predict ► evaluate ► describe ► analyze ► pose scientific questions ► construct explanations ► construct models ► represent graphically ► solve problems ► select and apply mathematical routines

Lessons Learned from the AP Redesign Project

  • No Pain -- No Gain!!! This is hard work.
  • Backwards Design and Evidence-Centered Design are challenging to execute & sustain
    – Requires multidisciplinary teams
    – Requires sustained effort and negotiation
    – Requires time, money & patience
  • Value added: validity is “designed in” from the start as opposed to “grafted on”
    – Elements of a validity argument are contained in the process and the products


Application to Re-envisioning K-12 Science Education (2012-2014)

Application to Changes in STEM in Higher Education

  • What are the big ideas in “my course”?
    – What claims about student knowledge and competence do I want to make as a result of my course?
    – What are my assessments assessing?
  • What are the big ideas that run through our collection of courses and our program?
    – What claims do we want/need to make -- for ourselves and our students?
    – What evidence do we have to support those claims?
  • Given the assessments and data that we have now, what claims can they support?
    – What’s missing? What should be changed?

Making Explicit What is Often Tacit

  • Learning as the Core for Integrating Curriculum, Instruction & Assessment
  • About “How People Learn”
  • About Deeper Learning & Teaching for Transfer
  • About “Knowing What Students Know”
  • Putting the Pieces Together: Principled Redesign of Advanced Placement Science

  • Some Considerations for the Road Ahead

About Teaching, Learning, & Assessment

  • There is an evolving science of learning and assessment that incorporates ideas about the development of disciplinary knowledge and the nature of competence.
  • It has major implications for all aspects of schooling from pre-kindergarten through post-secondary education: the design of curriculum, instruction, assessment, and faculty professional development.
  • It provides a basis for knowing when, how, and why to use various instructional strategies; it can guide intelligent design and use of new curricular materials as well as information technologies.


Things to Consider

  • Important outcomes of engineering education align well with general goals for education & prediction of workplace success
    – Deeper learning
    – 21st Century Competencies
  • We develop these competencies through well-designed disciplinary instruction and programs
    – Transfer is challenging and is not universal
    – Instructional design can foster deeper learning
    – Assessment is a key part of instructional design

More Things to Consider

  • Assessment of critical competencies is challenging and requires principled approaches to assessment design and validation
    – Cognitive, interpersonal, intrapersonal
    – Formative and summative assessment purposes
    – Certain assessment tools can add value in this effort
  • Evaluation requires us to specify our Theory of Action, and we can do so using Logic Models
    – Helps to clarify intents, goals, and actions
    – Framework for formative & summative evaluation
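A logic model is conventionally laid out as inputs, activities, outputs, and outcomes, and one minimal way to make a Theory of Action explicit is simply to write those columns down as data. The sketch below is illustrative; the four column names follow the common logic-model convention, and all entries are invented examples, not drawn from any specific program.

```python
# Minimal, illustrative logic model for a course redesign.
# Column names follow the conventional inputs/activities/outputs/outcomes
# layout; the entries are invented examples.

logic_model = {
    "inputs":     ["faculty time", "disciplinary expertise", "assessment tools"],
    "activities": ["define learning objectives", "redesign formative assessments"],
    "outputs":    ["revised syllabus", "item bank aligned to objectives"],
    "outcomes":   ["deeper learning", "evidence-backed claims about competence"],
}

def is_explicit(model: dict) -> bool:
    # The theory of action is "explicit" only when every column is filled in:
    # an empty column marks an assumption that is still tacit.
    return all(model.get(k) for k in ("inputs", "activities", "outputs", "outcomes"))

print(is_explicit(logic_model))  # True
```

Even this crude check captures the talk's closing point: the value of the exercise is forcing unstated assumptions (an outcome with no activity behind it, an activity with no resourced input) into the open.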

“I think you should be more explicit here in Step Two.”

In Higher Education we need to make our assumptions and our Theory of Action more EXPLICIT.