Summary: learners' reflection in technological learning environments

by P

We are in the process of reading and summarising papers that will help inform our thinking on rip-mix-learn practices in higher education. We are keeping them on an internal wiki, which has a few public pages. I am working on a way to make it easier to navigate only the pages that are accessible. Our summaries are not intended as a comprehensive description of the papers; instead, we focus on the aspects that are important in our context.

http://freecourseware.uwc.ac.za/dokuwiki/doku.php?id=rml:rimor_summary

Here is the summary:

Learners’ Reflections in Technological Learning Environments: Why to Promote and How to Evaluate
Rimor, R. and Kozminsky, E.
Proceedings of SITE 2000, February 8-12, 2000, San Diego, California, USA

This paper introduces the concept of metacognitive processes as a driver of learning, using a case study of grade 9 learners working with electronic information (the paper, somewhat confusingly, calls this a “data-base environment”) as an example. The underlying argument is that successful learners practise metacognition more efficiently and more frequently than less successful learners; in other words, students who engage in metacognitive activity learn more and learn better.

Metacognition is defined as the ability to reflect upon one’s own thinking (and action), a definition going back to Flavell (1976). It is connected to learning as an important contribution to Self-Regulated Learning (SRL) (Butler & Winne, 1995). In SRL the “teacher’s role has changed from being an infallible expert responsible for a final product, to being a guide who is more responsive to the context in which learning is occurring”, and SRL is thus directly grounded in constructivist theory. “This approach encourages learners to control their learning processes, reflect upon them and evaluate their results and progress in an open debugging procedure, which entails self reflection and peer dialogue.” (Note: that sounds just like what we mean by rip-mix-learn.)

Students were asked to keep journals (personal reflection notes) over a period of 5 months. The journal content was then analysed and categorised using the authors’ tool for evaluating the metacognitive components of students’ reflection (MCSR), which is based on Flavell’s three components of metacognition (a rough coding sketch follows the list):
1 personal characteristics (P) (for example, referring to one’s preferred way of learning)
2 task requirements (T) (for example, evaluating the relevance of the task against goals and objectives)
3 strategies for accomplishing the task (S) (for example, referring to the results of a data search)
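
To make the categorisation concrete, here is a minimal sketch of how journal entries tagged with the three components could be tallied per student. This is purely our own illustration (in Python), not the authors’ procedure; the example entries and their tags are invented.

    # Hypothetical illustration only: count how often each MCSR component
    # ("P" = personal characteristics, "T" = task requirements,
    #  "S" = strategies) appears in a set of tagged journal entries.
    from collections import Counter

    journal = [  # (entry text, assigned component) -- invented examples
        ("I prefer reading all the sources before I start writing.", "P"),
        ("I am not sure this task really matches our project goal.", "T"),
        ("Searching the database by keyword gave too many irrelevant hits.", "S"),
        ("Narrowing the search to the last five years worked much better.", "S"),
    ]

    counts = Counter(component for _, component in journal)
    print(counts)  # e.g. Counter({'S': 2, 'P': 1, 'T': 1})

Comparing such counts across students is exactly the step where, as noted below, the paper stops short of linking frequencies to learning outcomes.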

Where the paper seems weak is in connecting the data to the model, and in making a clear link between the type and frequency of personal reflection and learning progress. The authors do not examine which types of reflection correlate with which learning outcomes. As such, the tool remains a useful guide for categorising content, but it lacks predictive or analytical power. What does it “mean” if one student shows more (or different types of) metacognition than another?

For example, the authors do not go further than the following statement: “… Analysis of three [journal entries] suggest that [student 1's] explanations are better articulated, detailed and explicit that [student 2's]… Based on this analysis we can claim that [student 1's] articulation is qualitatively richer than [student 2's].” What this means in terms of actual learning is not explained; for all we know, student 2 might be learning more.

Notes on rip-mix-learn:

One thread that seems to be emerging is the issue of students’ awareness of what they are doing and why they are doing it. It could be interesting to investigate whether students who can make sense of the practices used in their course learn more than those who question them. For example, in Richard’s/Jay’s course, are students who are familiar with blogs and believe they are a useful tool learning more (and enjoying the learning more) than those who do not see the value of using these ICTs for biology? Students’ understanding of their own learning and of the process they are going through could be described using the model developed in this paper. The connection to the quality of learning would be our contribution (and, given that the authors did not attempt it, might be quite a challenge).