New digital assessment solution successfully trialled in UK schools
RM Results announces a successful research project in schools using RM Compare.
In a successful pilot, new digital assessment technology has proven its ability to improve students' writing quality and attainment, aid teachers' Continuous Professional Development (CPD), and reduce teacher workload.
Fourteen primary schools in Oxfordshire took part in the research project using RM Compare, a new assessment solution from e-assessment technology organisation RM Results.
RM Compare is an online, cloud-based tool that teachers can use from any device and location, whenever is most convenient. It removes the need for traditional paper marking, instead showing teachers successive pairs of anonymous pieces of work side-by-side on a screen, with the teacher judging which best meets the assessment criteria.
The software uses Adaptive Comparative Judgement (ACJ) technology, based on Thurstone's Law of Comparative Judgement, which shows that people make more reliable comparative, paired judgements than absolute ones.
RM Compare’s ACJ technology uses a unique algorithm that continuously updates in order to intelligently select and pair pieces of work based on previous judgements, displaying similarly ranked work side-by-side. This intelligent pairing reduces the time it takes to accurately place a piece of work on the “professional consensus rank order”, streamlining the assessment and moderation process to be as efficient as possible.
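RM Compare's own algorithm is proprietary, but the adaptive pairing idea described above can be illustrated with a minimal sketch: items start with equal ratings, each round pairs two similarly ranked pieces of work for a judge, and an Elo-style update (a common stand-in here, not RM Compare's actual method) nudges the winner up and the loser down until a rank order emerges. The function name, rating scale, and update rule below are all illustrative assumptions.

```python
import random

def adaptive_cj(items, judge, rounds=200, k=32, seed=0):
    """Rank items via a simplified adaptive comparative judgement loop.

    items: list of item identifiers (e.g. anonymised scripts)
    judge: function(a, b) -> the winner of the pair (a or b)

    Note: this is an illustrative sketch, not RM Compare's algorithm.
    """
    rng = random.Random(seed)
    ratings = {item: 1000.0 for item in items}  # everyone starts equal
    for _ in range(rounds):
        # Adaptive pairing: sort by current rating and pick two
        # neighbours, so similarly ranked work is compared.
        order = sorted(items, key=lambda i: ratings[i])
        idx = rng.randrange(len(order) - 1)
        a, b = order[idx], order[idx + 1]
        winner = judge(a, b)
        loser = b if winner == a else a
        # Elo-style update: the smaller the expected chance of the
        # win, the larger the rating adjustment.
        expected = 1 / (1 + 10 ** ((ratings[loser] - ratings[winner]) / 400))
        ratings[winner] += k * (1 - expected)
        ratings[loser] -= k * (1 - expected)
    # Highest-rated first: the emerging consensus rank order.
    return sorted(items, key=lambda i: ratings[i], reverse=True)
```

Because each round re-sorts by rating before choosing a pair, comparisons concentrate where the ranking is least settled, which is what lets an adaptive scheme converge with far fewer judgements than exhaustive pairwise marking.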
This new approach to moderation amongst Oxfordshire schools has been pioneered by Steve Dew, head teacher at Church Cowley St. James School, who brought together a group of previously unlinked schools to work together on assessing writing tasks undertaken by Year 6 children. Steve explained:
“We had previously led local partnership moderation and were looking for a valid way of assessing writing, whilst also hoping to cut down on the time it would take to get everyone involved. RM Compare and the power of adaptive comparative judgement gave us the ability to work and achieve our goals in a completely different way. It’s such a simple process and had really positive feedback from all the teachers and head teachers involved. Each teacher receives a comprehensive overview of all of the children’s work, so we are collectively raising standards in a very collaborative way.”
During the trial, children in each school were set the same task – a piece of writing based on a video – which was then uploaded to RM Compare. All of the teachers were then invited to assess the work, which was anonymised, as part of the moderation process.
By the end of each assessment at least 20 teachers across the schools had viewed each child’s writing, creating a collective consensus of what “good” work looks like across the 14 schools taking part.
Schools received the resulting data, which allowed them to see, for the first time, not only where children sat within their own school, but within the whole cohort of schools taking part. In addition, RM Compare’s versatile reporting enabled the creation of reports for sub-sets of schools within the cohort, providing schools with further insight. This new method of assessment and moderation allowed teachers to forensically understand the detail of children’s performance in each writing task, across various genres, so that raising attainment was more achievable.
Teachers could compare their own judgements with those of teachers from other schools, to see whether they were in line with the professional consensus. They could also see best practice in other schools and compare it with work from their own children. These outcomes helped to inform their strategic lesson planning, whilst also aiding teacher CPD.
The new technology is especially useful in formative assessment, which is assessment carried out during the learning process, aimed at allowing teachers to modify and focus their approaches to increase student attainment. RM Compare seeks to improve formative assessment and collaborative learning, particularly in more creative subjects, such as English or Art.
Steve Dew added:
“Due to the inherently more natural comparison process, rather than marking each piece of work against a mark scheme, we found that the time spent assessing the work was reduced, creating a more manageable workload. RM Compare gave us a much more reliable assessment and far greater insight into each child’s strengths and weaknesses. The assessment method is also fairer, helping to combat common assessment biases and reducing the impact of any unintentional preference caused by the assessor knowing the child.”
Following the success of the trial, Church Cowley St. James School has now implemented the technology for formative assessment of writing across all year groups. Instead of marking papers weekly against a traditional, formal mark scheme, the school now uses RM Compare ten times a year.
Steve Dew elaborated on the outcomes of the trial:
“It helps to boost the attainment of learners and improves the teacher’s judgements, as they have a clearer view of what good quality writing looks like, leading to better professional conversations between teachers at school. The tool is very flexible, so it has been straightforward for us to implement it however we choose. I think any literacy leader or head teacher who is keen on reducing the amount of time teachers spend marking would benefit from the use of RM Compare.
“It allowed us to moderate and benchmark writing across a group of schools. RM Compare made it easy for us to connect. Whilst we undertook the trial with 14 schools that were geographically close, there is no reason why we couldn’t use this on a larger scale, nationally and even internationally.
“It’s reduced the teacher workload hugely – most of our teachers now leave school at half past four and don’t take any books home. The time that we used to spend marking, we can use more strategically – planning and delivering the highest quality of learning in the classroom.”
The technology behind RM Compare was originally developed by a group of assessment scientists, with input from senior academics at Cambridge University and Goldsmiths, University of London. RM Results acquired the ACJ technology earlier this year.