Reflecting on Reflective Writing Analytics: Assessment Challenges and Iterative Evaluation of a Prototype Tool
Simon Buckingham Shum, Ágnes Sándor, Rosalie Goldsmith, Xiaolong Wang, Randall Bass, Mindy McWilliams
When used effectively, reflective writing tasks can deepen learners’ understanding of key concepts, help them critically appraise their developing professional identity, and build qualities for lifelong learning.
As such, reflective writing is attracting substantial interest from universities concerned with experiential learning, reflective practice, and developing a holistic conception of the learner. However, reflective writing is for many students a novel genre to compose in, and tutors may be inexperienced in its assessment.
While these conditions set a challenging context for automated solutions, natural language processing may also help address the challenge of providing real-time, formative feedback on draft writing. This paper reports progress in designing a writing analytics application, detailing the methodology by which informally expressed rubrics are modelled as formal rhetorical patterns, a capability delivered by a novel web application.
The tool has been evaluated iteratively on an independently human-annotated corpus, showing improvements from the first to the second version. We conclude by discussing the reasons why classifying reflective writing has proven complex, and reflect on the design processes enabling work across disciplinary boundaries to develop the prototype to its current state.
Citation: 6th International Learning Analytics & Knowledge Conference (LAK), University of Edinburgh, Edinburgh, UK, April 25-29, 2016.
Co-author Ágnes Sándor will present the paper at the conference.