Reflect began as an aid for students to learn more from reading examples. The inspiration was to help students learn to think like a computer scientist as they studied solutions to homework programming tasks. Essentially, the first versions of Reflect (called Assess) asked students to answer a set of questions about each supplied example answer to a problem: for example, they might be asked whether it worked correctly when the input file was empty. This stage aimed to help students think about the code as the teacher did, in terms of the aspects that should be considered.
Since the teacher had coded their own answer to the question, the next stage of interaction was to show the student how their answers to the questions compared with the teacher's. At this stage, the student could also see general comments from the teacher, perhaps pointing out why certain features of the code are interesting, elegant, and so on.
As the student used the system, it kept track of their answers and displayed how they were doing. Initially, this was a very simple indication of performance on each question. In more recent versions of Reflect, Licaho Li has built a detailed learner model, and this is displayed using tools like SIV.
While the initial vision was purely to support reading of examples, Reflect now has support for student submission of their own answers. If these are programs, there is support for automated grading and feedback.
With the addition of a learner model, it was necessary to provide an authoring environment that enables the teacher to create the set of learning objectives which constitute the components of the learner model. This makes use of our Mecureo automatic ontology builder: the author simply provides a glossary definition of each component and this is analysed to define an ontology for the set of Reflect tasks in the site. This means that Reflect can do ontological inference to give feedback on the learner’s progress at a range of granularity levels.
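The glossary-to-ontology step can be illustrated with a small sketch. This is a hypothetical simplification, not Mecureo's actual algorithm or Reflect's API: it links each glossary term to the other terms mentioned in its definition, then rolls fine-grained per-term scores up through those links to give feedback at a coarser granularity. All function names and data below are illustrative.

```python
# Hypothetical sketch in the spirit of a glossary-driven ontology builder:
# each term is linked to every other glossary term its definition mentions,
# and per-term scores are averaged across linked terms for coarser feedback.

def build_ontology(glossary):
    """Map each term to the set of other glossary terms its definition mentions."""
    ontology = {}
    for term, definition in glossary.items():
        text = definition.lower()
        ontology[term] = {other for other in glossary
                          if other != term and other.lower() in text}
    return ontology

def rolled_up_score(term, scores, ontology, seen=None):
    """Average a term's own score with scores of the terms it links to."""
    seen = set() if seen is None else seen
    seen.add(term)
    related = [rolled_up_score(t, scores, ontology, seen)
               for t in ontology.get(term, ()) if t not in seen]
    return sum([scores.get(term, 0.0)] + related) / (1 + len(related))

# Illustrative glossary and per-objective scores (not real Reflect data).
glossary = {
    "recursion": "a function that calls itself, relying on a base case",
    "base case": "the condition that stops recursion",
    "iteration": "repeating a step with a loop",
}
ontology = build_ontology(glossary)
scores = {"recursion": 0.9, "base case": 0.5, "iteration": 0.7}

print(ontology["recursion"])                          # linked terms
print(rolled_up_score("recursion", scores, ontology))  # coarse-grained score
```

In this toy version, "recursion" links to "base case" (and vice versa) because each appears in the other's definition, so feedback on "recursion" blends both scores rather than reporting each objective in isolation.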
Learner Reflection in Student Self-assessment Inproceedings
Proceedings of ACE 2007, 9th Australasian Computing Education Conference, pp. 89–95, Australian Computer Society, Ballarat, Victoria, 2007.
The Carrick Vision and Computing Education: Four Case Studies in Multi-institutional Collaboration Inproceedings
Proceedings of ACE '07, Australian Computing Education Conference, pp. 3–8, Australian Computer Society, Ballarat, Victoria, 2007.
Scrutable Learner Modelling and Learner Reflection in Student Self-assessment Inproceedings
Ikeda, M; Ashley, K D; Chan, T (Ed.): Intelligent Tutoring Systems: Proceedings 8th International Conference, ITS 2006, pp. 197–206, Springer, Jhongli, Taiwan, 2006.
Learner Reflection in Student Self-Assessment Technical Report
School of Information Technologies, The University of Sydney (568), 2005.
The Cost of Authoring with a Knowledge Layer Inproceedings
Online Proceedings of the AIED (Artificial Intelligence in Education) 2005 Workshop on Authoring of Adaptive and Adaptable Educational Hypermedia (A3EH), pp. 72–79, Amsterdam, The Netherlands, 2005.
School of IT, The University of Sydney (577), 2005.
Metacognition and Open Learner Models Inproceedings
The 3rd Workshop on Meta-Cognition and Self-Regulated Learning in Educational Technologies, at ITS2008, pp. 7–20, Montreal, Canada, 2008.