Assessing oracy reliably and at scale with Adaptive Comparative Judgement

Oracy is recognised worldwide as a powerful tool for learning that empowers students to find their own voice and to better understand themselves and the people around them. Assessing oracy, however, has proved challenging, as existing approaches raise concerns over validity and reliability. What is needed is a holistic approach, one that takes account of performance qualities, subjectivity and variability. Voice 21, the UK’s oracy education charity, partnered with Assessment from RM to explore the use of Adaptive Comparative Judgement (ACJ) to assess oracy at scale. Being able to assess oracy reliably will raise its status and encourage more schools to develop their oracy provision, making strides towards ensuring every voice is valued and heard.

Oracy in education: the present and the future

The All-Party Parliamentary Group (APPG) for Oracy report ‘Speak for Change’ identifies that oracy improves academic outcomes, underpins literacy and vocabulary acquisition, supports wellbeing and confidence, enables young people to access employment and opportunities beyond school, and develops citizenship and agency. While an oracy education is universally important, it is especially valuable to groups experiencing poverty and to those with special educational needs and disabilities. Yet oracy has been marginalised within school curricula because of the difficulty of assessing it with existing methods. A reliable assessment of oracy would help to establish an evidence-based approach for teachers and build a strong case for oracy to receive the acknowledgement in education that it deserves.

When considering the longevity of oracy in education, alternative and holistic approaches to assessment need to be explored, and comparative judgement is one of them. In this approach, an assessor compares two pieces of work and decides which is better; repeating the exercise across many pairs and many assessors produces a rank order with a high level of reliability. RM Compare is a digital tool that combines comparative judgement with a real-time adaptive algorithm, which uses prior comparisons to determine how many more times each piece of work needs to appear and be judged. Using RM Compare to assess oracy makes it possible to take account of its key characteristics: performance, subjectivity and variability.
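This article does not detail RM Compare’s internal model, but comparative judgement engines are typically built on a Bradley-Terry-style pairwise model, in which each judgement nudges the winner’s estimated score up and the loser’s down until the scores best explain all the decisions made. The sketch below is a minimal, hypothetical illustration of that idea in Python, not RM Compare’s actual algorithm; the item ids and judgements are invented:

```python
import math
from collections import defaultdict

def fit_bradley_terry(judgements, n_iters=500, lr=0.05):
    """Turn pairwise judgements into per-item scores.

    judgements: list of (winner, loser) pairs, one per comparison.
    Returns {item: score} on a logit scale; higher means judged better.
    Fitted by gradient ascent on the Bradley-Terry log-likelihood.
    """
    theta = defaultdict(float)          # every item starts at score 0
    for _ in range(n_iters):
        grad = defaultdict(float)
        for winner, loser in judgements:
            # Model: P(winner beats loser) = sigmoid(theta_winner - theta_loser)
            p_win = 1.0 / (1.0 + math.exp(theta[loser] - theta[winner]))
            grad[winner] += 1.0 - p_win  # push the winner's score up
            grad[loser] -= 1.0 - p_win   # and the loser's score down
        for item, g in grad.items():
            theta[item] += lr * g
    return dict(theta)

# Hypothetical example: five recordings, a handful of judgements
judgements = [("A", "B"), ("A", "C"), ("B", "C"),
              ("C", "D"), ("B", "D"), ("D", "E"), ("A", "E")]
scores = fit_bradley_terry(judgements)
print(sorted(scores, key=scores.get, reverse=True))  # e.g. ['A', 'B', 'C', 'D', 'E']
```

Any single comparison is quick and subjective; the reliability comes from pooling many such comparisons from many judges into one fitted scale.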

Comparative judgement as a best fit

RM Compare was used as an Adaptive Comparative Judgement (ACJ) tool to generate learnings about oracy. A reliable assessment like this would enable schools to monitor students’ progress and compare performance with other schools, and in the longer term it would help to create a standardised rank order across the UK that could be used as a benchmark for future assessment. Assessing oracy at this scale would also provide invaluable insight into national performance in oracy, as already exists for other key subjects.

With this assessment, it was important to draw on the tacit knowledge of teachers and their confidence in judging ‘what a good one looks like’. To achieve this, they needed a clear understanding of effective oracy, and RM Compare’s use of ACJ provided the opportunity to establish reliable oracy standards in an authentic and valid way, with the potential to use this standard for ‘when-ready’ assessments. Any approach to assessing oracy must also be able to handle video and audio content from multiple sources, and RM Compare’s technology enabled this.

The sessions

Two sessions were run across the UK, with teachers examining oracy tasks from two age groups. RM Compare presented two pieces of student work at a time, and the teachers selected the better piece against a holistic statement. Combining all the judgements produced a standardised rank order that was representative of students across the UK, resulting in a reliable assessment. The intrinsically iterative nature of the comparison process also helps teachers to develop their knowledge of, and familiarity with, the oracy standard at a national level, which makes the assessment process more reliable and straightforward.
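The article does not specify how RM Compare’s adaptive scheduling or reliability check work, but published ACJ designs typically prioritise under-judged items, pair them with items of similar current score, and stop once a Scale Separation Reliability (SSR) statistic passes a threshold, often around 0.9. The sketch below illustrates both ideas under those assumptions; the function names and figures are hypothetical:

```python
import statistics

def next_pair(theta, counts):
    """Adaptive pairing: take the least-judged item and match it with
    the item closest to it in current estimated score, so each new
    comparison is as informative as possible."""
    items = sorted(theta, key=lambda i: counts[i])   # least judged first
    anchor = items[0]
    partner = min(items[1:], key=lambda i: abs(theta[i] - theta[anchor]))
    return anchor, partner

def ssr(theta, std_errors):
    """Scale Separation Reliability: the share of observed score
    variance not attributable to estimation error. Judging typically
    continues until this exceeds roughly 0.9."""
    var_obs = statistics.pvariance(list(theta.values()))
    if var_obs == 0:
        return 0.0
    mse = statistics.fmean(se * se for se in std_errors.values())
    return max(0.0, (var_obs - mse) / var_obs)

# Illustrative numbers only: estimated scores, judgement counts, standard errors
theta = {"A": 1.2, "B": 0.4, "C": 0.1, "D": -0.5, "E": -1.2}
counts = {"A": 3, "B": 3, "C": 3, "D": 3, "E": 2}
print(next_pair(theta, counts))          # ('E', 'D'): least judged, closest score
print(round(ssr(theta, {i: 0.35 for i in theta}), 2))
```

The adaptive step is what lets ACJ reach a reliable rank order with far fewer comparisons than judging every possible pair.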

An authentic and ethical approach

There are several challenges with assessing oracy that must be considered. In particular, the performance aspect of oracy is in danger of being lost in a live context. A noisy classroom environment can pose questions about the feasibility of assessment, while the perceived authenticity of the audience can affect its validity. With this in mind, it was imperative to use real students and teachers in this project, both to validate the process and to examine how people interacted with the task at hand. To offset concerns around using video and image content, the participants were asked to create the content themselves, so that the authenticity of the work was not at risk of being compromised by the digital tool.

Working with early adopters meant it was vital for them to understand their role and the project as a whole, so Voice 21 acted as the data controller to ensure safeguarding throughout, while RM Compare, as the data processor, applied data protection by default. Ultimately, the resulting variation in item quality and the experience of working with early adopters provided invaluable learnings for this project and for future plans. The efficiency, predictability and higher output of ACJ were all validated, enabling progression to a second project and giving insight into what might be needed for an on-demand solution.

What next?

“Oracy can be assessed – Comparative Judgment – which relies on assessors making quick comparisons between videos of student talk – is a reliable way to assess oracy” (Voice 21, Annual Impact Report)

RM Compare puts the user at its core. As part of this project, the team spent over 50 hours with participants to gain a better understanding of the user experience. These learnings informed several changes to the production environment, as well as the creation of additional proofs of concept to test other hypotheses.

As RM Compare continues to support other oracy projects, a larger project is on the horizon to extend these learnings, validate the use case, and fuel the development of the world’s first on-demand ACJ platform. As a starting point, this will give schools worldwide the ability to assess oracy on a ‘when-ready’ basis, facilitating the inclusion of oracy education in curricula across the globe.


References  

Insights and Impact Report 2021–22 – Voice 21

Speak for Change: Oracy APPG Report – Voice 21
