the e-Assessment Association

Recent Conference Round-Up

The late summer/early autumn in the UK is always a busy time for conferences, providing an opportunity to reflect on what happened during the first half of the year and to look forward to the innovations promised for the coming year.

๐™‰๐™–๐™ฉ๐™ž๐™ค๐™ฃ๐™–๐™ก ๐˜ผ๐™ฅ๐™ฅ๐™ง๐™š๐™ฃ๐™ฉ๐™ž๐™˜๐™š๐™จ๐™๐™ž๐™ฅ๐™จโ€™ ๐™‰๐™ค๐™ฌ ๐™–๐™ฃ๐™™ ๐™ฉ๐™๐™š ๐™๐™ช๐™ฉ๐™ช๐™ง๐™š ๐˜พ๐™ค๐™ฃ๐™›๐™š๐™ง๐™š๐™ฃ๐™˜๐™š – ๐˜ผ๐™ช๐™œ๐™ช๐™จ๐™ฉ, ๐˜ฝ๐™ž๐™ง๐™ข๐™ž๐™ฃ๐™œ๐™๐™–๐™ข

This event caters primarily for employers, training providers and end-point assessment organisations (EPAOs) working within the apprenticeship sector in England, and was well attended.

Several key issues were highlighted:

– ๐—ฅ๐—ฒ๐—ฐ๐—ฟ๐˜‚๐—ถ๐˜๐—บ๐—ฒ๐—ป๐˜ ๐—ผ๐—ณ ๐—ป๐—ฒ๐˜„ ๐—ฎ๐—ฝ๐—ฝ๐—ฟ๐—ฒ๐—ป๐˜๐—ถ๐—ฐ๐—ฒ๐˜€ ๐—ฟ๐—ฒ๐—บ๐—ฎ๐—ถ๐—ป๐˜€ ๐—ฐ๐—ต๐—ฎ๐—น๐—น๐—ฒ๐—ป๐—ด๐—ถ๐—ป๐—ด โ€“ the 3m new apprenticeship starts by 2020 target set by government in 2015 has proved woefully over optimistic, with just 325k (12.3%) actual starts since 2020;

– Apprentice retention – nearly 50% of learners fail to complete their apprenticeships for a variety of reasons, including lack of familiarity and support in how to use online learning and assessment platforms;

– Apprenticeships continue to be seen as a second-class option to university degrees – despite the so-called 'Baker Clause', introduced in 2018, which makes it mandatory for schools to facilitate a better understanding of the options for all Year 8-13 students;

– A more joined-up approach to assessment evidence gathering throughout an apprenticeship is needed, to take pressure off the learner and to support accurate tracking of learner progression;

– ๐—•๐—ฒ๐˜๐˜๐—ฒ๐—ฟ ๐—ฐ๐—ผ๐—บ๐—บ๐˜‚๐—ป๐—ถ๐—ฐ๐—ฎ๐˜๐—ถ๐—ผ๐—ป ๐—ฎ๐—ป๐—ฑ ๐—ฐ๐—ผ๐—น๐—น๐—ฎ๐—ฏ๐—ผ๐—ฟ๐—ฎ๐˜๐—ถ๐—ผ๐—ป ๐—ฏ๐—ฒ๐˜๐˜„๐—ฒ๐—ฒ๐—ป ๐—ฒ๐—บ๐—ฝ๐—น๐—ผ๐˜†๐—ฒ๐—ฟ๐˜€, ๐˜๐—ฟ๐—ฎ๐—ถ๐—ป๐—ถ๐—ป๐—ด ๐—ฝ๐—ฟ๐—ผ๐˜ƒ๐—ถ๐—ฑ๐—ฒ๐—ฟ๐˜€ ๐—ฎ๐—ป๐—ฑ ๐—˜๐—ฃ๐—”๐—ข๐˜€, moving away from the current linear model and putting the apprentice at the core.

e-ATP Conference – October, London

This three-day conference, which is run by the European branch of the Association of Test Publishers, has a full and interesting programme and is always well attended.

The first day included a dedicated Security Summit, which focused on the latest news and developments related to assessment security and malpractice detection. Dr. Thomas Lancaster delivered a fascinating keynote that looked at how the assessment malpractice landscape has changed in the last 20 years: a journey that has taken us from a preoccupation with preventing plagiarism in 2003, something that technology now rigorously detects, through the rise of contract cheating in 2006 (where someone else is paid to take the test for you), to the present day, where we are most focused on preventing the harvesting of question materials for republication online. He argued that whilst Artificial Intelligence (AI) has a lot to offer in detecting malpractice, the people behind cheating also use AI to develop ever more sophisticated ways to cheat!

Whilst invigilation/proctoring (be this in person or delivered remotely online) is an effective way of reducing malpractice, Paul Muir, Head of Technology Enabled Assessment for the British Council and the eAA's Vice Chair, reminded us all that the main purpose of invigilation/proctoring is to ensure candidates have a fair and consistent assessment experience – reducing cheating is just a very desirable bonus.

We were also reminded by Pearson VUE's Director of Special Investigations, Harry Samit, that candidate cheating is not something new and that delivering assessments through a secure online assessment platform can reduce candidate malpractice significantly – he stated that typically less than 12% of candidates using their online platform try to cheat, versus 50% in physical test centres.

The ๐—ธ๐—ฒ๐˜† ๐˜๐—ฎ๐—ธ๐—ฒ-๐—ฎ๐˜„๐—ฎ๐˜† from all of this for me was that we need to focus on ways of decreasing the demand for cheating rather than just relying on technology to detect it when it happens. We need to ๐—ฑ๐—ฒ๐˜๐—ฒ๐—ฟ ๐—ฎ๐—ป๐—ฑ ๐—ฒ๐—ฑ๐˜‚๐—ฐ๐—ฎ๐˜๐—ฒ ๐—ฐ๐—ฎ๐—ป๐—ฑ๐—ถ๐—ฑ๐—ฎ๐˜๐—ฒ๐˜€ ๐—ฎ๐—ฏ๐—ผ๐˜‚๐˜ ๐—ฐ๐—ต๐—ฒ๐—ฎ๐˜๐—ถ๐—ป๐—ด and ensure they understand the consequences if they are caught.

The eAA was also given the opportunity to showcase three winners from this year's International eAssessment Awards – for an overview of all of this year's finalists and winners, please visit the awards website here.

๐™๐™š๐™™๐™š๐™ง๐™–๐™ฉ๐™ž๐™ค๐™ฃ ๐™ค๐™› ๐˜ผ๐™ฌ๐™–๐™ง๐™™๐™ž๐™ฃ๐™œ ๐˜ฝ๐™ค๐™™๐™ž๐™š๐™จ (๐™๐˜ผ๐˜ฝ) ๐˜พ๐™ค๐™ฃ๐™›๐™š๐™ง๐™š๐™ฃ๐™˜๐™š – ๐™Š๐™˜๐™ฉ๐™ค๐™—๐™š๐™ง, ๐™‡๐™š๐™ž๐™˜๐™š๐™จ๐™ฉ๐™š๐™ง

This year's conference theme was Delivering Opportunity for All through Equity, Diversity and Inclusion.

The opening address, given by Kirstie Donnelly, CEO of City & Guilds and Co-Chair of FAB, highlighted some key facts about the very diverse FE and Vocational/Technical learning and assessment sector: a sector that in 2020/21 saw 1.5 million learners and 600,000 VTQ certificates issued. Through a conversation with EDI advocate Frank Douglas, they also highlighted the difference between equity and equality – something I hope the image that accompanies this article helps to articulate – and stressed the importance of ensuring equity in learning and assessment to deliver true inclusion.

There were lots of very interesting sessions throughout the conference, but one that stood out was from Esther O'Callaghan, founder of hundo.xyz, who introduced us to the Learning Metaverse – an immersive, virtual learning environment. I was also really interested to hear about Hundo's development of a digital skills wallet secured on the blockchain, which has to be the future of certification, especially with micro-credentials becoming ever more popular.
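
For anyone wondering how a blockchain can 'secure' a credential at all, the sketch below shows the general principle only: the credential itself stays with the learner, while a tamper-evident fingerprint (hash) of it is anchored to a ledger that anyone can check. This is a hypothetical illustration, not a description of Hundo's wallet or of any particular blockchain; all names and data in it are invented.

```python
# Hypothetical illustration of hash-anchored credentials (not Hundo's system).
# Only the digest would be written to a ledger; the credential stays with the learner.
import hashlib
import json

def credential_digest(credential: dict) -> str:
    """Return a deterministic SHA-256 fingerprint of a credential record."""
    canonical = json.dumps(credential, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

credential = {
    "learner": "A. Candidate",
    "award": "Digital Skills: Data Handling (micro-credential)",
    "issuer": "Example Awarding Body",
    "issued": "2022-10-01",
}

anchored = credential_digest(credential)  # this value is what gets anchored publicly

# A verifier later recomputes the digest from the presented credential;
# any alteration to the record produces a different hash and fails the check.
assert credential_digest(credential) == anchored
tampered = dict(credential, award="Level 7 Everything")
assert credential_digest(tampered) != anchored
```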

๐™Š๐™›๐™ฆ๐™ช๐™–๐™ก’๐™จ ๐˜ผ๐™ก๐™ก ๐™Ž๐™ฉ๐™–๐™›๐™› ๐˜พ๐™ค๐™ฃ๐™›๐™š๐™ง๐™š๐™ฃ๐™˜๐™š – ๐™‰๐™ค๐™ซ๐™š๐™ข๐™—๐™š๐™ง, ๐˜พ๐™ค๐™ซ๐™š๐™ฃ๐™ฉ๐™ง๐™ฎ

The eAA was invited to deliver a session on the increasing use of Online Proctoring by exam awarding and professional bodies, both here in the UK and internationally. Fellow eAA Board member Patrick Coates and Andrew Croydon, Director of Examinations, Education and Skills Policy for the Association of the British Pharmaceutical Industry (ABPI), joined me to give this very positively received presentation to over 70 of Ofqual's staff. My sincere thanks not only to Patrick and Andrew, but also to eAA sponsors Examity and Questionmark for introducing me to Andrew and the ABPI.

Looking ahead, there are a number of conferences and events on the horizon, but one that I'd like to call out is the eAA's own End of Year Wrap virtual event on Wednesday 7th December at 2pm (GMT), where leading educationalist Professor Ger Graus OBE will be looking at how we measure the learning impact of educational technologies – from AI-assisted learning and assessment technologies to use of the learning metaverse. Full details can be found in this newsletter.

Full details of all upcoming events can be found on the eAA's event calendar.

Developing and delivering e-assessments: the importance of communication

by Helen Claydon

For much of the past 10 years I have been involved in e-assessment content development across a variety of organisations and assessments. In all instances, my experience of e-assessment has involved the assessment 'owner' outsourcing various aspects to third parties.

Handing over responsibility for aspects of your assessment to a third party can be a nerve-wracking experience. Establishing a collaborative working relationship is key to successful delivery. In this blog I share some thoughts based upon past experience and hope that they strike a chord with your own experiences.

Aims

We clearly have the shared aim of wanting the assessment to succeed, but to what extent do we agree on how to achieve this? Quite often compromises will need to be made along the journey. The main areas for discord and compromise relate to quality, timescales and costs. In my view quality is very important, but it is worth remembering that quality and accuracy are not the same thing. A candidate will probably not be too bothered if a question doesn't look 'pretty'.

However, it is a different story if there is an error in a question which results in the candidate losing a mark and potentially failing the exam. For high-stakes examinations, it is no exaggeration to say that one small error in a line of code has the potential to be life-altering for a candidate. It is worth agreeing up front who has responsibility for ensuring accuracy, and how and when this will be achieved. For example, will it be through:

  • up-front quality assurance testing prior to launch of the exam (a sketch of one such automated check follows this list), or
  • remedial work post-launch to fix issues identified by candidates while taking their actual exam?
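
As an illustration of the first of those options, the snippet below sketches the kind of automated pre-launch check that can catch item errors before any candidate sees them. The item format here is entirely made up for the example – real item banks use QTI or suppliers' own schemas – and scripted checks like this complement, rather than replace, human review.

```python
# A minimal, hypothetical pre-launch accuracy check for multiple-choice items.
# The item structure is illustrative only; real item banks will differ.
from typing import Dict, List

def validate_mcq(item: Dict) -> List[str]:
    """Return a list of problems found with a multiple-choice item."""
    problems = []
    if not item.get("stem", "").strip():
        problems.append("empty question stem")
    options = item.get("options", [])
    if len(options) < 2:
        problems.append("fewer than two options")
    if len(set(options)) != len(options):
        problems.append("duplicate options")
    keys = item.get("keys", [])
    if len(keys) != 1:
        problems.append(f"expected exactly one key, found {len(keys)}")
    elif keys[0] not in options:
        problems.append("key does not match any option")
    return problems

item = {
    "stem": "Which organisation regulates qualifications in England?",
    "options": ["Ofqual", "Ofsted", "Ofcom", "Ofgem"],
    "keys": ["Ofqual"],
}
print(validate_mcq(item))  # an empty list means the item passed every check
```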

What are the consequences if the assessments are not ready by the agreed launch date? If the launch of a new high-stakes examination needs to be delayed, particularly if it was due to be administered on a fixed day, the consequences can be significant. All parties need to be clear about the importance of the timeline, with early indication provided of any potential slippages to the critical path. In the event of a slippage it is important that all parties work together to agree realistic mitigations. Transparency is key; covering up slippages or downplaying risks has the potential to create bigger problems later on which are more difficult to resolve. However, the launch date isn't always critical; sometimes it is okay for dates to slip to avoid compromising quality.

The test owner will often have a finite budget, so additional spend to address unexpected issues will not always be a realistic option. As a test owner, we never want to compromise on quality or timescale. However, there will be instances when we have no choice; it is worth deciding up front where the line lies that we are not prepared to cross.

Successful collaboration

Collaborative scoping of requirements at the start of a project is invaluable. Making time to talk before commencement of any system development will help to develop a shared and consistent understanding of high-level requirements and reduce the likelihood of time and money being invested in developing a solution that doesn't meet the test owner's expectations.

At the start of a project, a common pitfall is failing to realise that you do not have a shared vocabulary. The resulting repercussions for a project can be significant, with solutions developed by the e-assessment provider not meeting the expectations of the assessment owner. If you perceive this to be a problem, it can help to work together to develop a shared glossary of terms and definitions. An alternative is to use definitions from an externally published source such as the 'e-Assessment Glossary', commissioned by JISC and QCA and published in 2006.
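
As a purely illustrative sketch (the terms and definitions below are my own shorthand, not taken from the JISC/QCA glossary), even a tiny glossary kept in a shared, version-controlled file can stop words like 'item' and 'form' meaning different things to the two parties:

```python
# Hypothetical shared project glossary, kept somewhere both parties can edit and review.
glossary = {
    "item": "A single question or task presented to a candidate.",
    "form": "A fixed, assembled set of items delivered as one test.",
    "candidate": "The person taking the assessment.",
    "invigilation": "Supervision of candidates during a live assessment.",
}

for term, definition in sorted(glossary.items()):
    print(f"{term}: {definition}")
```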

In conclusion โ€ฆ

To use the old BT adage, 'It's good to talk.' The importance of investing time in up-front collaborative discussion should not be underestimated. It can help to increase the likelihood that our e-assessments succeed, with reduced need for compromise along the way.

Standards in e-Testing

I started working in e-assessment sales in the late 1990s and carried on into the mid-2000s. Looking back, perhaps I see this period as the heyday of e-assessment, with the new world of e-assessment providing an opportunity to give learners and candidates a better experience, including instant feedback and on-demand testing. I worked closely with QCA (as was) and with a number of awarding bodies, and even completed my MBA dissertation on the incredibly exciting topic of 'Barriers to adoption of e-assessment by UK Awarding Bodies'. However, having had a seven-year break from e-assessment, focussing more on the e-learning side of things, I have returned to the e-assessment fold and discovered that things have not really moved on. I sat through a presentation from one of the stalwarts of e-testing – I won't mention any names – and it was the same presentation I had seen them give 10 years ago.

Working internationally now, I speak to many organisations that are contemplating moving to e-assessment, and they are struggling. They still see e-assessment as being about the technology, whereas I think that's the easy bit. The real challenge is that, as with any IT project, you need to understand what it is you are trying to achieve strategically, and then align the organisation to achieve it. Once you know this, it will direct you towards the tactical and operational decisions you need to make. This is a process I am hoping to provide a solution for, and I am currently working with the International Organization for Standardization (ISO) to put together a standard on high-stakes e-testing, which will include the more strategic implications of e-testing and how these impact on the operational elements. NB: I am specifically using the terms 'high-stakes' and 'e-testing' as this narrows the scope and makes things somewhat easier.

Whilst this may sound straightforward, the actuality is much more difficult, as different types of organisations (for example, academic and commercial testing organisations) and different countries see things differently. And, as with the United Nations, there has to be consensus! This diversity is further exacerbated by the challenge of creating a standard for something that is subjective, like strategy. The intention is not to recreate work that has already been completed in the area of developing and delivering e-tests, of which there is a lot, but to draw this information together into a single body of knowledge, in the context of understanding why e-testing is being implemented within an organisation.

This work is in progress and, whilst I have paraphrased a lot of the activity here, I have put together a schematic which I think neatly brings the different elements of e-testing together and explains them. I would appreciate any feedback, and I would be pleased to hear from anyone who would be interested in being involved or who has anything they would like to share.

Author: Patrick Coates, [email protected]

The personal views expressed in this blog are those of the eAA Board member, and not of their employer.

Mobile Technology and e-Assessment

The technology research company Gartner reported that sales of PCs were down yet again, and that the trend towards mobile devices was increasing apace, especially in countries, such as those across Africa, where there has previously been little adoption of computer technology. The number of smartphones and tablet devices I see around the university bears testament to this. If these devices are so common, are there ways we can harness their power for e-assessment?

Socrative

This tool can be used on any device, as you can access it via a web interface or via apps on some devices. The beauty of it is that the interface is identical no matter which way you choose to access the quiz. The tool allows you to set up a variety of question formats through a simple interface, and you can now include images with the questions. When learners take the quiz you can see their progress immediately. It also lets you create an exit poll to ask what has been learned in a session, and set a closing problem. www.socrative.com

QuizSlides

This has to be the simplest quiz creator ever: if you can make a PowerPoint, you can do this. You create your quiz in PowerPoint and upload it as a .ppt or .pdf file, then you take the quiz yourself to set the correct answers and you're done. All quizzes run through a browser, so they can be used on any web-connected device or PC. It's not sophisticated, only allowing four-choice MCQs, but it does let you create wonderful-looking quizzes based simply on existing Office templates. When you run these on handheld devices, it looks as if the quiz was designed for that device. www.quizslides.com

InfuseLearning

This platform offers a great choice of question formats that are easy to set up in advance and run as you need them. You can even choose to run spot questions as you go. The really nice option, though, is the ability to use drawing questions. You can upload your own images and then ask learners to indicate specific things, or even draw a simple diagram on a blank canvas. When their images are submitted, you can view them on screen and save them in .pdf format. The slightly frustrating thing is that drawing can be a bit difficult if you want fine detail, so getting people to write labels isn't viable; it is better to ask them to indicate things as A, B, C.

Another feature, which is wonderful for accessibility, is that InfuseLearning offers text-to-speech reading of questions and answers. Alongside this it can convert questions and answers into a limited number of alternative languages. It runs on all web browsers except Internet Explorer, so it can be accessed on many different devices. www.infuselearning.com

NearPod

This is a beautiful tool that can be run via an iPad, iPhone or iPod Touch app available through iTunes. A new function has just been added to allow access via a browser for participation in the quizzes; however, the administrator still needs to use an Apple device to control the session. You can upload a PowerPoint presentation or PDF to create a slide show, to which you can then add a variety of styles of questions at relevant points. Originally the 'teacher' controlled the pace, deciding when to move each slide forward and when to move on at each quiz point. The tool lets you see each individual's progress to determine when to move forward. Now there is also a 'homework' mode that allows learners to work through at their own pace. This tool really lets you test understanding as learners work their way through. www.nearpod.com

There are many other tools out there, but these are my favourites. The beauty of all of them is that you can test them out with no financial outlay; for some there are costs when you go beyond certain limits – number of participants, number of quizzes, or improved reporting. What I especially like is how easy it is to set up quizzes on all of them. In a visually savvy world, they also produce professional-looking output to engage learners.

Author: Judy Bloxham | eLearning Adviser FE&SFC | Jisc RSC Northwest

[email protected]

CACHE Awards Assessment Delivery Contract to BTL

BTL is pleased to announce that CACHE (Council for Awards in Care, Health and Education) has awarded BTL a contract for the delivery of Assessment Services through Surpass Professional Edition, BTLโ€™s new SaaS-based Assessment Platform.
ย 
Through 2013, CACHE ran a major procurement exercise to 'test the market', seeking a new supplier and a new system for its online examination programme. From a competitive field, BTL was awarded the contract. The new Surpass product is built on the Surpass Enterprise Edition platform already used by 70% of the major Awarding Organisations in the UK.
ย 
Rob Bendelow, Procurement & Project Manager at CACHE, comments:
ย 
"Four potential suppliers were short-listed. Of course, all four systems could do the job, but the Surpass product from BTL ticked 95% of our boxes. It's pretty easy to use and has clearly been developed with Awarding Organisations in mind. Creating assessments, scheduling tests, candidates taking on the challenge, co-ordinating the results… it's all in there."
ย 
Clare Ruiz-Palma, Project Manager for the Surpass implementation at CACHE, added:
ย 
"BTL and CACHE have worked well together on this project. They (BTL) have also gone shoulder-to-shoulder with another of our strategic suppliers to help forge a seamless union between Surpass and our back office learner management system 'Parnassus'. We launched this new e-assessment platform in early April 2014, right on schedule."
ย 
Rob added: "Sourcing a new IT system involves far more than bytes, browsers and bandwidth. People like to do business with people and so far the team from Shipley (BTL) have worked really well with CACHE."

Keith Myers, Managing Director, BTL, says: "BTL are delighted to be working with CACHE. CACHE were very rigorous in their tendering process and we regard this choice of Surpass as an acknowledgement of the quality of our Assessment System."
ย 
About CACHE
CACHE is a leading UK Awarding Organisation, based in St Albans and focussing primarily on the child care, adult care, health and education sectors.
ย 
All CACHE qualifications require the learners to undertake assessment, to measure and confirm their progress towards achievement of the qualification.
ย 
Assessment includes assignments, extended essays and examinations under controlled conditions. Online exams, wherein the learner's answers are auto-marked by the computer system itself, are a growing segment of the assessment landscape.
ย 
www.cache.org.uk
ย 
About BTL
BTL is a leading UK provider of assessment systems to Awarding Organisations, Regulators and Government Bodies. First established in 1985, BTL is a privately held Limited Company founded by Chairman Dr Bob Gomersall and based in Yorkshire, UK. Trusted by clients to deliver high-stakes examinations, BTL's assessment platform, Surpass, is the most widely used e-Assessment system in the UK's education market.
ย 
www.btl.com
www.surpass.com
ย 
For further information please contact:
ย 
Jo Matthews
CACHE Corporate Communications Manager
Telephone: 01727 818 646
Email:ย [email protected]
ย 
Geoff Chapman
BTL Head of Marketing
Telephone: 07866 317346
Email:ย [email protected]

Redefining Assessment with New Technology

In learning, the SAMR (Substitution, Augmentation, Modification, Redefinition) model is designed to encourage those involved in teaching to move away from simply substituting technology for what they have always done manually, and towards redesigning their activity by using technology to support completely new things that were previously not feasible. Perhaps there is something for assessment to learn here too. e-Assessment can simply be thought of as delivering the same assessment by putting it online; however, here again is an opportunity to think outside the box and aim to redefine how we assess. Here is an example of how assessment was redefined to make use of technological innovation.

Quite often the problem with assessment is that it can appear to the learner to be an artificial process, with tasks that seem bolted on rather than a realistic part of the situation. To enable all the learning outcomes to be shown to be met, some 'extras' have to be added in; the requirement to write a guide on how to use a website was one I found myself having to add to meet set criteria. At Myerscough College, one tutor's use of emerging technology enabled outcomes to be met effectively.

Jon, a lecturer in Animal Studies, supervised learners as part of the ASDAN enterprise management unit[1], a short course supplementary to the main animal management qualification that can make up part of a PDP qualification or evidence wider key skills. The learners had initially come up with ideas like cake baking, car washes, sock puppets etc. to make money. Jon showed the learners how the 'student' ID tags on the bearded dragon vivarium were augmented. The learners were enthralled and wanted to know how it was possible. He told them that it wasn't too difficult, and the idea of an augmented reality calendar was born.

The development of the calendar naturally lent itself to meeting all the project process areas: planning, design, research and production. The learners took responsibility for the months, trying to theme each one to suit that point in the year, e.g. February as "Love Birds for Valentine's Day" and April as "Rabbits for Easter". The learners then had to research that animal and write a script that included the main care points for it. They found that this helped focus them, as they were writing for an external audience. The whole process of developing the content for the calendar covered all the unit assessment outcomes.

Each month's contribution was filmed and photographed, voiceovers were added, and everything was edited together. The final results were uploaded to the Aurasma[2] studio with the help of the tutor, and each month was 'brought to life'. Finally, all aspects of the calendar were there and ready for printing.

Speaking about the experience, Joe said, "As students, we wanted to try something new." Jayne followed this up by adding, "It was really cool when you got to see it, you felt proud that you'd actually made something." From Jon's point of view, he said it was easier to get the learners to meet all the necessary criteria because "it was like they were doing work, without actually doing work!" He explained that "it was a lot easier to get them to buy into the idea when they were engaged, focused and enthusiastic."

For the assessment, Jon was able to observe the learners involved in the process, capture evidence of them participating, and then link to the final outputs for external examiners. The learners themselves had to add their evidence to an electronic book. Although there may appear to be a lot of work involved in this type of assessment activity, if the learners are engaged it lessens the need to keep pushing them to complete tasks. It may well be that this activity is not scalable to cover a large group, but it is certainly possible to replicate with around 15-20 learners. Alongside the assessment benefits, it can also contribute to wider skills and confidence building.

The learners recognised that the technology could be utilised again when they were given an assessment in this academic year. They had to design a poster that helped people understand animal training, but with a limited word count. One way they found to get round this was to add augmented reality content. They also noted that it was a great way for someone who wasn't too confident with words to ensure their message got across as they wanted, without having to struggle with writing.

This goes to show how well-designed activities that make use of technology can make the assessment process easier. Listen to the students and Jon talk about their experiences: http://youtu.be/n2wgEhUFG-Y

Author: Judy Bloxham

[1] http://www.asdan.org.uk/courses/programmes/enterprise-short-course

[2] http://www.aurasma.com