Publications
Authors:
  • Simon Knight, Simon Buckingham Shum, Philippa Ryan, Agnes Sandor, Xiaolong Wang
Citation:
Published in International Journal of Artificial Intelligence in Education
Abstract:
Research into the teaching and assessment of student writing shows that many students find academic writing a challenge to learn, and legal writing is no exception. Improving the availability and quality of timely formative feedback is an important aim. However, the time-consuming nature of assessing writing makes it impractical for instructors to provide rapid, detailed feedback on hundreds of draft texts which might be improved prior to submission. This paper describes the design of a natural language processing (NLP) tool to provide such support. We report progress in the development of a web application called AWA (Academic Writing Analytics), which has been piloted in a Civil Law degree. We describe: the underlying NLP platform and the participatory design process through which the law academic and analytics team tested and refined an existing rhetorical parser for the discipline; the user interface design and evaluation process; and feedback from students, which was broadly positive but also identified important issues to address. We discuss how our approach is positioned in relation to concerns regarding automated essay grading, and ways in which AWA might provide more actionable feedback to students. We conclude by considering how this design process addresses the challenge of making explicit to learners and educators the underlying mode of action in analytic devices such as our rhetorical parser, which we term algorithmic accountability.
Year:
2017
Report number:
2016/044