Best Practice for Designing Exams to Assess Student Learning (PowerPoint presentation by Seema C. Shah-Fairbank)



  1. Best Practice for Designing Exams to Assess Student Learning
Seema C. Shah-Fairbank, P.E., PhD
Director of Assessment and Program Review, Division of Academic Affairs
Associate Professor of Civil Engineering
Office of Assessment and Program Review, Academic Affairs
Tiffany Frontino, Administrative Analyst, Assessment and Program Review

  2. Outcomes
During the Workshop
• Distinguish between the dos and don'ts of common question types
▫ Binary, Matching, Multiple Choice, Short Answer, Open Ended
• Align specific test questions (items) to learning outcomes
▫ Not a holistic test score, but evaluation of specific questions
▫ Go beyond content and think about what you want students to learn
▫ Results will provide you with evidence of student learning, which you can act on
After the Workshop
• Design fair, yet challenging, exams that accurately gauge student learning

  3. Types of Test Questions
• Binary (True/False)
▫ Advantages: easy to correct; easy to compile results; instructions easy to understand
▫ Disadvantages: range of responses is limited; test creator must possess content mastery; high odds of accidental correctness
• Matching
▫ Advantages: easy to correct; easy to create; allows many items to be tested; very applicable to pairing items
▫ Disadvantages: only tests lower-level objectives; medium odds of accidental correctness
• Multiple Choice (A B C)
▫ Advantages: easy to correct; can include distractors; permits testing of a large body of material rapidly
▫ Disadvantages: generally limited to fact-based questions; does not allow for explanation; not that easy to create
• Short Answer (Closed Question)
▫ Advantages: easy to create; easy to insert during instruction
▫ Disadvantages: limits richness of response; takes longer to correct; can result in variability
• Open Ended (Where? Who? How? What? When?)
▫ Advantages: easy to create; allows freedom in response; appropriate for "why" and "how" questions
▫ Disadvantages: requires strong subject matter knowledge; correction is labor intensive; allows highly diverse responses

  4. Select Response Items: True/False, Matching, Multiple Choice
• Assess mastery of factual information
• Demonstrate complex thinking
▫ Make inferences about the topic based on data provided
▫ Select the correct answer by solving a quantitative problem
▫ Predict a possible effect based on a situation
▫ Draw a conclusion based on a graph

  5. Binary (True/False)
Dos
• Make the statement concise and direct.
• When writing a true/false problem, only ask about a single situation/problem.
Don'ts
• Focus on small factual details that are not crucial to student learning.
• Negatively phrased questions test a student's reasoning ability, not knowledge of the outcome.
• Use qualifiers such as "usually" or "seldom."
• Use absolute statements such as "always" and "never."

  6. Matching
Dos
• Include two parallel lists with no more than approximately 7 items each.
• Make sure the entire list fits on one page.
• Draw components from the same subcategory.
Don'ts
• Create one giant matching problem.
• Make students flip pages to see all the options.
• Put items from different categories into a single matching problem.

  7. Multiple Choice: Anatomy of an Item
Stem: "1. To ensure the quality of multiple choice questions:"
a. Make some of the options and distractors negative. (distractor)
b. Include qualifiers and absolutes. (distractor)
c. Make all options and distractors similar in length. (correct answer)
d. Include several correct answer options. (distractor)

  8. Multiple Choice – Dos & Don'ts
Dos
• Make the stem concise and direct.
• When writing a stem, only ask about a single situation/problem.
• Make correct answers and distractors all plausible.
• Evaluate what you want the student learning to be on a specific item.
Don'ts
• Using the same word, or a derivative, in both the stem and the options.
• Grammatical clues in which an article, verb, or pronoun eliminates options from being correct.
• Repeating the same words across options.
• Making response options unequal lengths.
• Negatively worded stems.
• Using "All of the above" (discourages discrimination between options).
• Using "None of the above" (encourages guessing).

  9. Constructed Responses: Short Answer/Quantitative, Essay
• Specific questions or open-ended responses
• Assess everything from basic recall to the creation of new ideas

  10. Short Answer
Dos
• Ask direct questions using your own words.
• Ask specific, concise problems.
• Inform students to keep responses brief.
• Provide the units you want the final answer in.
Don'ts
• Ask trivia questions.
• Ask questions that are long or composed of complex sentences.

  11. Essay
Dos
• Use your own words to develop the prompt.
• Use words such as "compare" and "contrast" at the beginning of a question.
• Make the statement concise and direct.
• Provide students with a clear rubric so that they understand the expectations.
• Include time for thinking and brainstorming prior to writing.
Don'ts
• Complex and ambiguous wording in the problem statement.
• Questions that are too broad to allow time for an in-depth response.

  12. Effective Exam Design
• How many items?
▫ As many as you need to capture student learning
▫ Think about the time (students need roughly 3x as long as the instructor)
• Instructions
▫ Group items by question type
▫ How to record answers
▫ Whether or not to show work
▫ Point values for each item
▫ Neatness expectations
• Rule of thumb
▫ Don't let one early incorrect answer repeatedly penalize a student
▫ Alternative: tell students to assume a specific answer to a prior item
▫ Students' capacity is a limited resource

  13. Learning Outcomes
Program Level – Student Learning Outcomes (SLOs)
• Student learning outcomes clearly state the specific and measurable behaviors students will display to verify learning has occurred at the program level.
• Key characteristics of student learning outcomes include 1) clarity, 2) specificity (they are worded with active verbs stating observable behaviors), and 3) measurability.
Course Learning Outcomes (CLOs)
• CLOs clearly relate to topics, assignments, and exams that are covered in the present course.
• CLOs should be measurable and map to SLOs for assessment.
• CLOs are more detailed and specific; they identify the unique knowledge and skills expected to be gained from a given course.

  14. Assessment at Program Level Curriculum Matrix
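A curriculum matrix can be represented as a simple mapping from courses to the program SLOs each course addresses. The course codes, SLO labels, and "Introduced/Developed/Mastered" levels below are illustrative assumptions, not content from the presentation:

```python
# Sketch of a curriculum matrix: courses mapped to program SLOs.
# All course names, SLO labels, and levels are hypothetical examples.
curriculum_matrix = {
    "CE 101": {"SLO 1": "Introduced", "SLO 2": "Introduced"},
    "CE 201": {"SLO 1": "Developed", "SLO 3": "Introduced"},
    "CE 401": {"SLO 1": "Mastered", "SLO 2": "Developed", "SLO 3": "Mastered"},
}

# Which courses address SLO 1, and at what level?
for course, slos in curriculum_matrix.items():
    if "SLO 1" in slos:
        print(f"{course}: SLO 1 {slos['SLO 1']}")
```

A matrix like this makes coverage gaps visible: an SLO that no course marks as "Mastered" signals a program-level curricular issue.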

  15. Assessment
• Align the items students performed on to a CLO or SLO
• Grade/assess whether the students have met the learning outcome
▫ Set a criterion for success, e.g., 75% of students will get each item correct
▫ Determine the percent correct for each item
• Evaluate the results – close the loop
▫ Did the students understand the item?
▫ Was the item written clearly?
▫ Do changes need to be made within this course or a prior course?
▫ Are there curricular changes needed at the program level?
▫ Are there different pedagogical methods to teach the material?
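The item-analysis step described in the slide (percent correct per item, compared against a success criterion) can be sketched in a few lines. The score matrix and the 75% threshold below are illustrative assumptions:

```python
# Item analysis sketch: percent of students answering each item correctly,
# compared against a success criterion. Data and threshold are hypothetical.

# rows = students, columns = exam items; 1 = correct, 0 = incorrect
scores = [
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 0, 0],
]
CRITERION = 0.75  # e.g., "75% of students will get each item correct"

n_students = len(scores)
for item in range(len(scores[0])):
    pct = sum(row[item] for row in scores) / n_students
    status = "criterion met" if pct >= CRITERION else "review item (close the loop)"
    print(f"Item {item + 1}: {pct:.0%} correct -> {status}")
```

Items that fall below the criterion are candidates for the follow-up questions above: was the item unclear, or does the course itself need a change?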

  16. Questions – Comments - Practice Seema C. Shah-Fairbank, P.E., PhD shahfairbank@cpp.edu

  17. Resources
• Cal Poly Pomona
▫ https://www.cpp.edu/~academic-programs/program-review
• Other websites
▫ https://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/
▫ https://uwaterloo.ca/centre-for-teaching-excellence/teaching-resources/teaching-tips/developing-assignments/exams/questions-types-characteristics-suggestions
• Books
▫ Meaningful and Manageable Program Assessment
▫ The College Instructor's Guide to Writing Test Items: Measuring Student Learning
