Ask-Elle
An Adaptable Programming Tutor for Haskell Giving Automated Feedback
Bastiaan Heeren April 26, 2016 OU Research Seminar
[Screenshot of the Ask-Elle tutor interface, showing: 1. list of exercises, 2. exercise description, 3. student program, 4. high-level hint, 5. bottom-out hint]
Why use an ITS (intelligent tutoring system)?
Evaluation studies have indicated that:
− an ITS can be almost as effective as a human tutor (VanLehn 2011)
− automated feedback improves students' performance on tests (Odekirk-Hash and Zachary 2001)
− immediate feedback is preferable to the delayed feedback common in classroom settings (Mory 2003)
Types of exercises
− Class 1: single correct solution
− Class 2: different implementation variants
− Class 3: alternative solution strategies (see the sketch below)
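As an illustration (our example, not from the slides): a class 3 exercise such as reversing a list must accept genuinely different solution strategies, not just small variants of one implementation.

-- Strategy A: explicit recursion with an accumulator
reverseAcc :: [a] -> [a]
reverseAcc = go []
  where
    go acc []     = acc
    go acc (x:xs) = go (x:acc) xs

-- Strategy B: a left fold
reverseFoldl :: [a] -> [a]
reverseFoldl = foldl (flip (:)) []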
Ask-Elle’s contribution
The design of a programming tutor that:
1. …
2. supports incremental development of solutions
3. automatically calculates feedback and hints
4. allows teachers to add exercises and adapt feedback
Our approach:
Overview
Example
Student session: the example evaluates to 32 + 8 + 2 = 42 (the value of the binary number 101010 in the fromBin exercise)
We follow the foldl approach.
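A minimal sketch of such a foldl-based solution, assuming the exercise is fromBin (the binary-to-decimal exercise discussed later in the talk); the slide itself does not show the model solution:

-- foldl-based fromBin: each bit doubles the accumulator and adds the bit
-- fromBin [1,0,1,0,1,0] = 42, i.e. 32 + 8 + 2
fromBin :: [Int] -> Int
fromBin = foldl (\acc b -> 2 * acc + b) 0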
Session
Student session: the program contains a hole (an expression still to be filled in)
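For illustration, an intermediate program with a hole could look as follows; the ? notation for holes is the tutor's, not standard Haskell accepted by GHC:

-- the student has committed to foldl but left the operator open
fromBin :: [Int] -> Int
fromBin = foldl ? 0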
Session (continued)
Student session: a standard compiler error, reported by the Helium compiler
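For example, a fragment like the following (our reconstruction, not the slide's) is ill-typed, and the tutor forwards Helium's beginner-oriented error message instead of producing its own:

-- ill-typed: the initial accumulator is missing, so the list xs is
-- passed where foldl expects an Int; Helium reports a type error here
fromBin :: [Int] -> Int
fromBin xs = foldl (\acc b -> 2 * acc + b) xs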
Model solutions
Teacher session
Recognising solutions
Teacher session
Student programs can be recognised by normalising them and comparing the result with the (equally normalised) model solutions:
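For instance (our examples, assuming normalisation performs alpha-renaming, eta-reduction, and inlining of local definitions), all of the following variants reduce to the same normal form:

fromBin1, fromBin2, fromBin3 :: [Int] -> Int
fromBin1 = foldl (\acc b -> 2 * acc + b) 0       -- the model form
fromBin2 xs = foldl (\x y -> 2 * x + y) 0 xs     -- alpha-renamed, eta-expanded
fromBin3 = foldl op 0                            -- local definition to inline
  where op acc b = 2 * acc + b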
Adapting feedback
Teacher session
− description of the solution
− textual feedback annotations
− enforce use of a library function
− alternative definition
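A sketch of an annotated model solution; the pragma-style syntax below is hypothetical, chosen only to illustrate the kinds of annotations listed above, and is not Ask-Elle's actual annotation language:

{-# DESC Convert a list of bits to the number it represents #-}
fromBin :: [Int] -> Int
fromBin = foldl next 0          {-# FEEDBACK use a left fold over the bits #-}
  where
    next acc b = 2 * acc + b    {-# ALT next acc b = b + 2 * acc #-}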
Properties
Teacher session
− in the QuickCheck properties, f denotes the student program
− for example, a round-trip property (see the sketch below)
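A sketch of such a property in QuickCheck, which Ask-Elle uses for property-based testing; the names and the toBin reference helper are our assumptions:

import Test.QuickCheck

-- f plays the role of the student program
f :: [Int] -> Int
f = foldl (\acc b -> 2 * acc + b) 0

-- assumed reference helper: a non-negative Int to its list of bits
toBin :: Int -> [Int]
toBin 0 = [0]
toBin n = reverse (go n)
  where
    go 0 = []
    go m = m `mod` 2 : go (m `div` 2)

-- the round-trip property: converting to bits and back is the identity
prop_roundTrip :: NonNegative Int -> Bool
prop_roundTrip (NonNegative n) = f (toBin n) == n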
Ask-Elle’s design
Design
Assessing Student Programs
Automated assessment
How do you determine:
1. that you have tested enough (coverage)?
2. that good programming techniques are used?
3. which algorithm was used?
4. that the executed code has no malicious features?
Assessing student programs
Classification (by hand)
− sanity checks (e.g. input checks)
− imperfections, such as superfluous cases or length (x:xs) - 1 instead of length xs (reconstructed below)
− 94 submissions for fromBin
− 64 are good, 8 good with modifications (total: 72)
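A reconstruction of the length (x:xs) - 1 imperfection mentioned above (the surrounding definition is our guess):

-- works, but length (x:xs) - 1 is just length xs written obscurely
fromBin :: [Int] -> Int
fromBin []     = 0
fromBin (x:xs) = x * 2 ^ (length (x:xs) - 1) + fromBin xs
-- cleaner:       x * 2 ^ length xs          + fromBin xs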
Assessing student programs
Results
Assessing student programs
Questionnaire
Questionnaire
− FP experts from the IFIP WG 2.1 group
− Student participants of the CEFP 2011 summer school
Questionnaire
Results
Questionnaire
Evaluation of open questions
Remarks that appeared most often:
Student Program Analysis
Classification (by Ask-Elle)
Each program is classified along two dimensions:
− Correctness: …
− Categories: …
Analysis
Questions related to feedback quality:
− How many programs with correct behaviour contain imperfections (hard to remove)?
− How often is no counterexample found, although the student program is incorrect? (precise answers in the paper)
Analysis
Correct (but no match)
Cases:
− the solution follows an approach that significantly differs from the model solutions
− the program does more than required by the exercise (e.g. extra checks)
− the program does not follow good programming practices or contains imperfections
Analysis
Incorrect (but no counterexample)
Cases:
− only a limited number of test cases, with random values, are run for each property
− test cases that fail a property's precondition are discarded; discarding more than 90% is considered to be too many (see the sketch below)
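A sketch of why counterexamples can be missed in QuickCheck-style testing (property and names are illustrative):

import Test.QuickCheck

-- inputs failing the precondition (==>) are discarded rather than
-- tested; if too large a fraction is discarded, the run gives up
-- without a verdict
prop_headIsElem :: [Int] -> Property
prop_headIsElem xs = not (null xs) ==> head xs `elem` xs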
Analysis
Results
Analysis
Missing program transformations
Analysis (by hand) of 436 interactions in ‘Tests passed’:
− most often missing: inlining of definitions (37); followed by beta-reduction (39) (see the sketch below)
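The two transformations illustrated on our own example: inlining the local definition and then beta-reducing the resulting application rewrite the program towards the model form.

fromBin :: [Int] -> Int
fromBin xs = let step = \acc b -> 2 * acc + b
             in  foldl (\acc b -> step acc b) 0 xs
-- after inlining step:
--   foldl (\acc b -> (\acc' b' -> 2 * acc' + b') acc b) 0 xs
-- after beta-reduction:
--   foldl (\acc b -> 2 * acc + b) 0 xs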
Analysis
Updated results
Analysis
Conclusions
− a programming tutor that gives automated feedback on student programs for class 3 programming exercises
− feedback and hints are calculated automatically from teacher-specified annotated model solutions and properties
− assessment combines program transformations and property-based testing
− Ask-Elle can recognise nearly 82% of (correct) interactions
− Ask-Elle can classify nearly 93% of interactions
Future work
− a systematic literature review of learning environments for programming
− Part 1 to be presented at ITiCSE 2016 (69 tools)