Writing Analytics

Writing analytics across essay tasks with different cognitive load demands

ASCILITE, 2020.

keywords: writing analytics, learning analytics, stylometry, keystroke analysis, cognitive load

Here I am again, writing about cool research!

I wish I could do this more often, but I've been way too busy in the past few days :)

Today I'm very pleased to share a little bit about my research collaboration with Dr Paula de Barba (CSHE/Unimelb), Professor Gregor Kennedy (Unimelb), Dr Kelly Trezise (Loughborough University), Dr Rianne Conijn (Eindhoven University of Technology) and Dr Menno van Zaanen (South African Centre for Digital Language Resources).

As we all know, essay writing is a widely used form of assessment in higher education, and it can be used to assess different learning objectives. Bloom's taxonomy proposes six educational objectives: (1) remember, e.g., retrieval; (2) understand, e.g., interpret and explain; (3) apply, e.g., execute and implement; (4) analyse, e.g., organise and attribute; (5) evaluate, e.g., critique and make judgements; and (6) create, e.g., generate and plan. These categories are thought to place increasingly higher cognitive load demands on students. If the cognitive demands of a given task exceed students' available working memory capacity, their ability to perform the task will be affected. Previous research has found that such differences in cognitive load demands can be detected in essay writing using writing analytics.

Isn't this super interesting? Have you ever thought about how the way you design your online assignments can (or will?) affect the way students type (writing process - keystroke dynamics)? And that the different cognitive load, or complexity, required by each question (or task) may also change students' linguistic style and their static/final completed answers (writing product - stylometry)?

Several studies have examined the relationship between the type of question (or task) and keystroke or stylometry patterns. Together, these studies suggest that varying task load can affect authors' writing process and writing outcomes, as reflected in keystroke patterns and stylometry. However, existing studies have largely focused on examining whether keystrokes and stylometry can predict low- versus high-cognitive-load tasks. What is missing is an analysis of which aspects of the writing process and writing outcomes vary with cognitive load, i.e., which metrics change and in what way. (aHaaaaaaa!) Developing a characterization of the keystroke and stylometry changes associated with cognitive load offers two advantages: (1) understanding in what way the metrics obtained from the writing product and writing process are affected by essay tasks with different cognitive load demands, and (2) identifying the metrics that do not vary with cognitive load, so that authorship identification/verification and automatic assessment are not compromised by variations in essay task demands. This may in turn inform educators in developing educational practices that improve writing for high cognitive load tasks (e.g., if the planning stage of the writing process is affected, educators could increase the amount of class time spent on preparation before writing begins).

From 2017 to 2019 we investigated how students' writing analytics varied across essay tasks with different cognitive load (or complexity), considering both their typing behavior (i.e., writing process) and their writing style (i.e., writing product). From the keystroke data, seven metrics were selected for the analysis of the writing process. From the stylometry data, we used another seven metrics across four dimensions to analyse the writing product [for more details about our method and analysis, please access our study here].
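To make the two kinds of metrics a little more concrete, here is a minimal sketch of how one might compute a few illustrative process and product measures. The metrics below (mean inter-key interval, pause rate, type-token ratio, and so on) are generic examples rather than the seven metrics used in the paper, and the `events` and `essay` inputs are hypothetical.

```python
from statistics import mean

def keystroke_metrics(events, pause_threshold=2.0):
    """Illustrative writing-process metrics from a keystroke log.

    `events` is a list of (timestamp_in_seconds, key) tuples in typing order.
    These are generic examples, not the exact metrics used in the study.
    """
    times = [t for t, _ in events]
    intervals = [b - a for a, b in zip(times, times[1:])]
    return {
        "mean_inter_key_interval": mean(intervals) if intervals else 0.0,
        # Long pauses are often interpreted as markers of planning or revision.
        "pause_rate": sum(i >= pause_threshold for i in intervals) / max(len(intervals), 1),
        "total_keystrokes": len(events),
    }

def stylometry_metrics(essay):
    """Illustrative writing-product metrics from the final essay text."""
    words = essay.lower().split()
    sentences = essay.count(".") + essay.count("?") + essay.count("!")
    return {
        "type_token_ratio": len(set(words)) / max(len(words), 1),  # lexical diversity
        "mean_word_length": mean(len(w) for w in words) if words else 0.0,
        "mean_sentence_length": len(words) / max(sentences, 1),
    }

# Hypothetical usage with a tiny made-up log and essay:
events = [(0.0, "T"), (0.3, "h"), (0.5, "e"), (3.1, " "), (3.4, "c"), (3.7, "a"), (3.9, "t")]
print(keystroke_metrics(events))
print(stylometry_metrics("The cat sat on the mat. It was a very fluffy cat."))
```

The point of the split is simply that process metrics come from how the text was typed, while product metrics come only from the final text.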

In short, we found that the writing process is affected by task cognitive load (we also present and discuss in detail how the writing process is affected in the paper, as shown in the image below). While there were some effects of task cognitive load on the writing product, these effects were moderate (we also discuss this in detail in the paper). However, given that the writing product is the outcome of the writing process, we would have expected similar effects of cognitive load on both. This indicates that the relation between the writing product and the writing process might not be as straightforward as expected.

The findings of this research have implications for educational practice. Educators and instructional designers could use keystroke and stylometry metrics to identify and compare the cognitive demands imposed by their chosen learning design. This is particularly relevant to educators setting open book exams in online environments. In addition, the relation between cognitive load and students' writing process and product may be used as a starting point for (personalized) feedback on students' writing processes, to improve their writing strategies.

Moreover, this research has implications for authorship identification within academic institutions. The use of stylometry for authorship identification assumes that an author's writing style is consistent and recognizable, much like a fingerprint. However, our findings show that some keystroke and stylometry measures commonly used in authorship identification studies can vary in response to the cognitive demands of the writing task. This raises questions as to whether the accuracy of keystroke and/or stylometry analysis for authorship identification is affected by the cognitive load of the writing task. If the ability to verify authorship is impaired, that is an important issue for academic integrity in educational institutions. Hence, it is important to analyse the accuracy of stylometry and keystroke metrics for authorship identification in educational contexts, where cognitive load can differ across tasks.
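To illustrate why drifting metrics matter for authorship verification, here is a toy sketch (not the method used in our paper or in real verification systems): it compares two stylometric feature vectors with cosine similarity, so you can see how a shift in a few metrics under higher cognitive load could push a genuine author below a verification threshold. The feature values and the threshold are made up for illustration.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Hypothetical stylometric profiles: [type-token ratio, mean word length, mean sentence length]
reference_profile = [0.62, 4.8, 17.0]       # from the student's earlier, low-load writing
same_author_high_load = [0.55, 4.5, 12.0]   # same student on a high-cognitive-load task

similarity = cosine_similarity(reference_profile, same_author_high_load)
THRESHOLD = 0.998  # made-up verification threshold, purely for illustration

# With these made-up numbers, similarity comes out around 0.996, below the threshold,
# so the genuine author would be flagged.
print(f"similarity = {similarity:.4f}")
print("verified" if similarity >= THRESHOLD else "flagged: style does not match reference")
```

Real systems use far more features and proper models, but the underlying concern is the same: if process and product features drift with task demands, a genuine author can end up looking unlike themselves.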

So, after reading this, will you (or can you) rethink the way you design open book exams? ;) What educational objectives do you want to explore with your students and how will you do that?

All the details from this incredible study are now available (for free) in the ASCILITE 2020 Proceedings and can be accessed here: https://minerva-access.unimelb.edu.au/bitstream/handle/11343/258925/ASCILITE-2020-Proceedings-Oliveira-E-et-al-Writing-analytics.pdf?sequence=6

Let me know your thoughts about it. I'd love to hear from you.

Have you been researching in this area as well? Let's chat!