

SLIDE 1

Reporting Results from Three Dimensional Science Assessments

National Conference On Student Assessment San Diego, CA June 27th, 2018

Nathan Dadey Center for Assessment

SLIDE 2

6/28/2018

Most educational assessment score reports look and feel about the same, and have for some time.

SLIDE 3

Look at this part of a report from 2001.

SLIDE 4

SLIDE 5

Seem familiar?

SLIDE 6

And this would be fine, if we thought score reports were working the way we want them to.

SLIDE 7

Take Away Points

1. Consider Audience and Use
   – Deemphasize indefensible uses
   – Design early and ask people to think aloud using draft reports
2. Put psychometric concerns in context, as one of many competing concerns
3. Stop thinking in terms of a two-page paper or PDF score report – if possible, leverage technology

SLIDE 8

This presentation focuses on the first two points.

SLIDE 9

Take Away Points

1. Consider Audience and Use
   – Deemphasize indefensible uses
   – Design early and ask people to think aloud using draft reports
2. Put psychometric concerns in context, as one of many competing concerns
3. Stop thinking in terms of a two-page paper or PDF score report – if possible, leverage technology

SLIDE 10

Part I Audience & Use

SLIDE 11

Score reports are infamously “the last thing developed and the first thing seen”
(cf. Zenisky, Hambleton & Sireci, 2009)

SLIDE 12

The Current State of Affairs

Educators used the score report information for a wide range of purposes, from planning everyday lessons to making school-level programming decisions (e.g., placement) or district-level decisions (e.g., offering professional learning opportunities). Although ACCESS is not intended to inform daily instruction, educators relied heavily on ACCESS scores due to a lack of access to, or familiarity with, other resources for understanding students’ English language development.

SLIDE 13

Thus we need to show best practice around intended uses and also signal unsupported uses.

SLIDE 14

Design table columns: Audience | Intended Use | Narrative or Story (needed info) | Supports

SLIDE 15


Narrative is about translating the claim to make sense to the intended audience.

SLIDE 16


For parents, the narrative might not use terms like “three dimensional” or DCIs, SEPs, or CCCs.

SLIDE 17


Supports are about what needs to be put in place, beyond the report itself, to ensure the outcomes tied to the intended uses can be met.

SLIDE 18


Which gets into your theory of action that supports assessment score use, but that’s not a story for today.

SLIDE 19

Testing out Designs

• Action plan – developing a template based on the prior table and working with states to see how it works
  – At several stages of development
• Building on the design pattern work coming out of ETS (cf. Zapata-Rivera & Katz, 2016)
SLIDE 20

Part II Putting Psychometrics in its Place

SLIDE 21

Don’t try to solve a communication problem with a measurement model.

SLIDE 22

Behrens, DiCerbo, Murphy & Robinson (2013)

SLIDE 23

Behrens, DiCerbo, Murphy & Robinson (2013)

SLIDE 24

A Balancing Act

• Psychometrics is only one concern among many; other concerns include:
  – Audience
  – Intended Use
  – Constraints
• Sometimes sacrificing psychometric “integrity” to better tell a story is worthwhile (e.g., using scores with no “value-added”)!

SLIDE 25

A Balancing Act, Continued

Two perspectives on subscores:

• Subscores are less reliable versions of the total score and should not be reported
• Subscores provide insight into the structure of the assessment
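The first perspective rests on classical test theory: a subscore built from a fraction of the test's items is typically less reliable than the total score, simply because it has fewer items. A minimal sketch using the Spearman–Brown prophecy formula (the item counts and average inter-item correlation below are hypothetical, chosen only for illustration):

```python
def spearman_brown(avg_inter_item_corr, n_items):
    # Reliability of a sum score over n_items parallel items
    # (Spearman-Brown prophecy formula / standardized alpha).
    r = avg_inter_item_corr
    return n_items * r / (1 + (n_items - 1) * r)

total_rel = spearman_brown(0.25, 40)  # hypothetical 40-item total score
sub_rel = spearman_brown(0.25, 8)     # hypothetical 8-item subscore
print(round(total_rel, 2), round(sub_rel, 2))  # prints 0.93 0.73
```

Same items and same inter-item correlation, but shrinking the pool from 40 items to 8 drops reliability from roughly .93 to .73 – the core of the “don’t report subscores” argument.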

SLIDE 26

There are examples of each perspective within the field.

SLIDE 27

Here’s an example reporting structure reflecting the latter perspective.

SLIDE 28

SLIDE 29

Foregrounding

• Any aspect of the standards could be “foregrounded,” or emphasized, to provide additional structure to the claims, achievement level descriptors, score reports, blueprints, item clusters, or any combination thereof
  – For example, reporting three subscores based on the Disciplinary Core Idea domains will foreground those domains

SLIDE 30

Foregrounding & Subscores

• Should the foregrounded aspects of the assessment be reflected in:
  – ALDs?
  – Subscores?
• The model used to produce the overall score and the subscores doesn’t have to be the same!

SLIDE 31

What Defines Subscores?

• A Single Dimension
  – Leveraging 2D coding
  – Probably a bad idea (even if, say, a Scientific and Engineering Practice score and a Crosscutting Concept score are reported within a Content Domain)

SLIDE 32

What Defines Subscores?

  • A Single Dimension
  • Dimension Groupings
  • PEs
SLIDE 33

What Defines Subscores?

• Dimension Groupings
  – DCI Topic Domains, Topic Bundles
  – Groupings of the Scientific and Engineering Practices (Investigating; Evaluating; Developing Explanations & Solutions)
  – Groupings of Crosscutting Concept scores (Systems, Causality, Patterns)

SLIDE 34

Cross-tabular Score Report?

A cross-tab with SEP groupings (Investigating; Evaluating; Developing Explanations & Solutions) as columns and CCC groupings (Systems, Causality, Patterns) as rows, with a score reported in each cell.
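One way such a cross-tabular report could be assembled is to aggregate 2D-coded item results into CCC-by-SEP cells. A sketch under assumed data – the item records, grouping labels, and percent-correct cell scoring below are all hypothetical, not a description of any operational program:

```python
from collections import defaultdict

# Hypothetical 2D-coded item results for one student: each item carries
# one SEP grouping, one CCC grouping, and points earned out of possible.
item_results = [
    {"sep": "Investigating", "ccc": "Systems",   "earned": 2, "possible": 3},
    {"sep": "Investigating", "ccc": "Causality", "earned": 2, "possible": 2},
    {"sep": "Evaluating",    "ccc": "Patterns",  "earned": 1, "possible": 2},
    {"sep": "Developing Explanations & Solutions",
     "ccc": "Systems", "earned": 3, "possible": 4},
]

earned = defaultdict(int)
possible = defaultdict(int)
for item in item_results:
    cell = (item["ccc"], item["sep"])  # row = CCC grouping, column = SEP grouping
    earned[cell] += item["earned"]
    possible[cell] += item["possible"]

# Percent-correct score for each populated cell of the cross-tab.
crosstab = {cell: earned[cell] / possible[cell] for cell in possible}
```

Note the design implication: each cell pools only the items double-coded to that SEP-by-CCC combination, so sparse cells (few items) run straight into the reliability concerns raised earlier.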

SLIDE 35

SLIDE 36

Supplemental Slides

SLIDE 37

Part III Moving Beyond the PDF

SLIDE 38

Leveraging digital formats

• How can we use the affordances of digital technology to tell the right kinds of stories?
  – Tooltips, hyperlinks, tabs, etc.
  – A score report should be less of a “one page summary” and more of a flexible tool that scaffolds users in developing understanding (e.g., score reporting as more of an app than a single static page)
SLIDE 39

An example (not so great) parent report

How did my student do? What kinds of things can my student do? What are some things I can do to help my student?

SLIDE 40

How did my student do? What kinds of things can my student do? What are some things I can do to help my student?

An example (not so great) parent report

This category could be a bad idea.

SLIDE 41

Reporting Scores from NGSS Assessments: Exploring Scores & Subscores

Nathan Dadey Center for Assessment

Reidy Interactive Lecture Series Portsmouth, NH September 28th, 2017

SLIDE 42

Part III Moving Beyond the PDF

SLIDE 43

Leveraging digital formats

• How can we use the affordances of digital technology to tell the right kinds of stories?
  – Tooltips, hyperlinks, tabs, etc.
  – A score report should be less of a “one page summary” and more of a flexible tool that scaffolds users in developing understanding (e.g., score reporting as more of an app than a single static page)
SLIDE 44

An example, perhaps not so great, parent report

How did my student do? What kinds of things can my student do? What are some things I can do to help my student?

SLIDE 45

How did my student do? What kinds of things can my student do? What are some things I can do to help my student?

An example, perhaps not so great, parent report

SLIDE 46

How did my student do? What kinds of things can my student do? What are some things I can do to help my student?

This category could be a bad idea, as the assessment may not support such uses, but it pushes on the supports that could be provided.

An example, perhaps not so great, parent report

SLIDE 47

Slides from a previous presentation at the 2017 Reidy Interactive Lecture Series.

SLIDE 48

For federally required accountability, likely an achievement level classification*. And the score reports will need at least one score.

*ESSA also requires “individual student interpretive, descriptive, and diagnostic reports… that allow parents, teachers, principals, and other school leaders to understand and address the specific academic needs of students” (§1111(b)(2)(B)(ix))

SLIDE 49

A Quick Aside on Score Reports

• Infamously “the last thing developed and the first thing seen” (cf. Zenisky, Hambleton & Sireci, 2009)
  – Instead, develop score reports (reporting categories and mock-ups) throughout the development cycle, starting in conjunction with the development of claims and blueprints, and involving a multidisciplinary team (Zenisky & Hambleton, 2012)

SLIDE 50

Best Practices (Zenisky & Hambleton, 2012)

Development
• Tailored to stakeholder groups
• Field tested

Design
• Clean and simple layout
• Clear and concise language
• Graphs

Content
• Contain all needed information
• Be actionable
• Contain anchor points
• Align clearly and explicitly to standards
• Reported at the most fine-grained level possible
• Provide context for score scales

Ancillary materials
• Annotated example score report
• In-depth background materials
• Sample questions

Dissemination efforts
• Timely enough to be meaningful
• Menu-driven websites with on-demand information
• Languages other than English, as well as offline formats

SLIDE 51

Best Practices (Zenisky & Hambleton, 2012)

(Best-practices list repeated from Slide 50.)

In short, tell a compelling story about a student’s performance to a particular audience for a particular use.

SLIDE 52

Best Practices (Zenisky & Hambleton, 2012)

(Best-practices list repeated from Slide 50.)

Focus areas for this presentation:
1. Tailored to Stakeholders
2. Clearly Convey Standards
3. Provide Sample Questions
4. Provide Fine-Grain Information

SLIDE 53

1. Defining Stakeholder Groups

• Who are the audiences, and how can we best communicate the NGSS to them? Often we have a:
  – Student Report (for Students, Teachers & Parents)
    • For the NGSS, Students & Parents may need a different report than Teachers
  – Classroom Report (for Teachers and Administrators)
  – School Report (for Administrators & Policy Makers)
  – District Report (for Administrators & Policy Makers)
• Stakeholder groups are not exchangeable.
SLIDE 54

1. Defining Stakeholder Groups

(Repeats the audience list from Slide 53.)

SLIDE 55

2. Clearly Convey the Standards

• Different than reporting performance on a task! Instead, we are trying to make an overall claim about student performance in relation to the standards.
• Likely, the claim is based on a selected (and limited) number of performance expectations.

SLIDE 56

2. Clearly Convey the Claim

• The claims we’ve seen thus far are often in the form of:

  Students can engage in Science & Engineering Practices and draw on Disciplinary Core Ideas & Crosscutting Concepts to explain phenomena or solve problems.
SLIDE 57

2. Clearly Convey the Claim

• This type of claim is very general – can we support such a claim given a particular set of PEs?
  – Or should the overall claim be delimited, or even defined operationally?
  – For example, grade 5 is heavy on the developing and using models and engaging in argument SEPs. Should the claim be specific enough to reflect that?
• We suggest making the overall claim specific enough to inform test development, and then translating that claim for each audience (or perhaps starting at the subclaim level).

SLIDE 58

2. Translate the Claim

• Rephrase the claim in ways that communicate to the intended audience.
• Develop user-friendly text and graphics that go above and beyond statements of “mastery of the standards”.
  – A rough example might be: “science learning involves not only knowing the core ideas of science, but also being able to apply the practices scientists and engineers use to solve problems and to draw on concepts that cut across the domains of science”.
  – Clearly, the above text would need to be carefully explored to determine how best to communicate the standards for the given audience.

SLIDE 59

3. Sample Questions

• Given the complexity of the standards, as well as the items and tasks, providing a sample question or questions that conveys the gist of the standards is likely to be more important for the NGSS than for other standards.
• Particularly for assessments with rich tasks (e.g., those designed under the item clustering approach)
• Could these be part of the score reporting (e.g., on a second page)?

SLIDE 60

4. Fine-Grained Information

• How can we convey information at a finer grain than the overall score, e.g., about student strengths and weaknesses?
• Particularly difficult – are we trying to disintegrate the integrated NGSS? Maybe, but maybe not.
  – Need to tackle this tension, as subclaims can help provide clarity on issues of design, and
  – Subscores are often expected on score reports.

SLIDE 61

4. Subscores

• Subclaims & subscores have often been used synonymously, but we suggest that subclaims be developed to help guide development and reporting.
• Whether subclaims can be used to create subscores for the NGSS is an open question.
• Often, subscores are a “less reliable version of the total score”.

SLIDE 62

4. Consumer Reports?

SLIDE 63

4. Approaching Subscores

• We suggest that a subclaim encompassing all of a dimension is untenable (e.g., a claim about a student’s ability to apply the full set of SEPs)
• What will be “foregrounded” within the subclaims?
  – DCI Domains?
  – SEPs and CCCs?
  – Phenomenon?
• Stringent item classifications to create better subscores?
SLIDE 64

Foregrounding DCI Domains

• The student understands physical systems as demonstrated through the application of the Science and Engineering Practices and the Crosscutting Concepts.
• The student understands Earth and space systems as demonstrated through the application of the Science and Engineering Practices and the Crosscutting Concepts.
• The student understands living systems as demonstrated through the application of the Science and Engineering Practices and the Crosscutting Concepts.

SLIDE 65

Grade 5 Standards Matrix (rows: Crosscutting Concepts; columns: Science and Engineering Practices – Asking questions and defining problems; Developing and using models; Planning and carrying out investigations; Analyzing and interpreting data; Using mathematics…; Constructing explanations and designing solutions; Engaging in argument from evidence; Obtaining, evaluating, and communicating information. Cells contain performance expectations with their DCI elements; SEP column placement is omitted below.)

• Patterns: 5-ESS1-2 (ESS1B.a)
• Cause & Effect: 5-PS1-4 (PS1B.a); 5-PS2-1 (PS2B.c); 5-PS1-2 (PS1A.b, PS1B.a, PS1B.b); 5-ESS2-2 (ESS2C.a, ESS2C.b); 5-LS2-1 (LS2A.a, LS2A.b, LS2A.c, LS2A.d, LS2B.a); 5-ESS2-1 (ESS2A.b)
• Scale, proportion, & quantity: 5-PS1-1 (PS1A.a); 5-PS1-3 (PS1A.c)
• Systems & system models: 5-ESS1-1 (ESS1A.a)
• Energy & matter: 5-PS3-1 (PS3D.b, LS1C.a); 5-LS1-1 (LS1C.b)
• Structure & function: –
• Stability & change: 5-ESS3-1 (ESS3C.a, ETS1B.c)

SLIDE 66

Foregrounding SEPs & CCCs

• Gathering Data and Investigating Scientific Questions: The student is able to obtain information, ask questions or define problems, plan and carry out investigations, use models to gather data and information, and/or use mathematics and computational thinking to gather evidence relevant to a scientific question or problem relating to the structure and properties of matter.
• Reason with Evidence and Evaluate Scientific Claims and Questions: The student is able to evaluate information, analyze data, use mathematics and computational thinking, construct explanations, develop arguments from evidence, and/or use models to predict and develop evidence to make sense of scientific phenomena specific to the structure and properties of matter.
• Construct Scientific Explanations: The student is able to explain or develop an argument to support or refute another explanation of scientific phenomena relevant to the structure and properties of matter by arguing from evidence and/or using models to communicate information.
• Making Connections: The student is able to use crosscutting concepts to define the physical system being investigated, recognize changes in the system, and/or find patterns to use as evidence to support explanations or arguments of how or why the phenomenon occurs.

SLIDE 67

Grade 5 Standards Matrix (as on Slide 65), with the SEP columns reordered and three subclaims defined over groupings of the Science and Engineering Practices:
• Subclaim #1: Gathering Data & Investigating Scientific Questions
• Subclaim #2: Reason with Evidence and Evaluate Scientific Claims and Questions
• Subclaim #3: Construct Scientific Explanations

SLIDE 68

(Grade 5 Standards Matrix with Subclaims 1–3, repeated from Slide 67.)

SLIDE 69

(Grade 5 Standards Matrix with Subclaims 1–3, repeated from Slide 67.)

SLIDE 70

(Grade 5 Standards Matrix with Subclaims 1–3, repeated from Slide 67.)

SLIDE 71

(Grade 5 Standards Matrix repeated from Slide 67, now marking Subclaim 4.)

SLIDE 72

Foregrounding Phenomenon

• The student has explained the phenomenon of migration by describing the variations in available food using a model (5-PS3-1)

SLIDE 73

(Grade 5 Standards Matrix repeated from Slide 65.)