Recent Conference Round-Up
The late summer/early autumn in the UK is always a busy time for conferences, providing an opportunity to reflect on what happened during the first half of the year and to look forward to the innovations promised for the coming year.
𝙉𝙖𝙩𝙞𝙤𝙣𝙖𝙡 𝘼𝙥𝙥𝙧𝙚𝙣𝙩𝙞𝙘𝙚𝙨𝙝𝙞𝙥𝙨’ 𝙉𝙤𝙬 𝙖𝙣𝙙 𝙩𝙝𝙚 𝙁𝙪𝙩𝙪𝙧𝙚 𝘾𝙤𝙣𝙛𝙚𝙧𝙚𝙣𝙘𝙚 – 𝘼𝙪𝙜𝙪𝙨𝙩, 𝘽𝙞𝙧𝙢𝙞𝙣𝙜𝙝𝙖𝙢
This event caters primarily for employers, training providers and end-point assessment organisations (EPAOs) working within the apprenticeship sector in England, and was well attended.
Several key issues were highlighted:
– 𝗥𝗲𝗰𝗿𝘂𝗶𝘁𝗺𝗲𝗻𝘁 𝗼𝗳 𝗻𝗲𝘄 𝗮𝗽𝗽𝗿𝗲𝗻𝘁𝗶𝗰𝗲𝘀 𝗿𝗲𝗺𝗮𝗶𝗻𝘀 𝗰𝗵𝗮𝗹𝗹𝗲𝗻𝗴𝗶𝗻𝗴 – the government’s 2015 target of 3 million new apprenticeship starts by 2020 has proved woefully over-optimistic, with just 325k (12.3%) actual starts since 2020;
– 𝗔𝗽𝗽𝗿𝗲𝗻𝘁𝗶𝗰𝗲 𝗿𝗲𝘁𝗲𝗻𝘁𝗶𝗼𝗻 – nearly 50% of learners fail to complete their apprenticeships for a variety of reasons, including lack of familiarity and support in how to use online learning and assessment platforms;
– 𝗔𝗽𝗽𝗿𝗲𝗻𝘁𝗶𝗰𝗲𝘀𝗵𝗶𝗽𝘀 𝗰𝗼𝗻𝘁𝗶𝗻𝘂𝗲 𝘁𝗼 𝗯𝗲 𝘀𝗲𝗲𝗻 𝗮𝘀 𝗮 𝘀𝗲𝗰𝗼𝗻𝗱-𝗰𝗹𝗮𝘀𝘀 𝗼𝗽𝘁𝗶𝗼𝗻 𝘁𝗼 𝘂𝗻𝗶𝘃𝗲𝗿𝘀𝗶𝘁𝘆 𝗱𝗲𝗴𝗿𝗲𝗲𝘀 – despite the so-called ‘Baker Clause’, introduced in 2018, making it mandatory for schools to give all Year 8–13 students a better understanding of the options available.
– 𝗔 𝗺𝗼𝗿𝗲 𝗷𝗼𝗶𝗻𝗲𝗱 𝘂𝗽 𝗮𝗽𝗽𝗿𝗼𝗮𝗰𝗵 𝘁𝗼 𝗮𝘀𝘀𝗲𝘀𝘀𝗺𝗲𝗻𝘁 𝗲𝘃𝗶𝗱𝗲𝗻𝗰𝗲 𝗴𝗮𝘁𝗵𝗲𝗿𝗶𝗻𝗴 𝘁𝗵𝗿𝗼𝘂𝗴𝗵𝗼𝘂𝘁 𝗮𝗻 𝗮𝗽𝗽𝗿𝗲𝗻𝘁𝗶𝗰𝗲𝘀𝗵𝗶𝗽 𝗶𝘀 𝗻𝗲𝗲𝗱𝗲𝗱, to take pressure off the learner and to support accurate tracking of learner progression.
– 𝗕𝗲𝘁𝘁𝗲𝗿 𝗰𝗼𝗺𝗺𝘂𝗻𝗶𝗰𝗮𝘁𝗶𝗼𝗻 𝗮𝗻𝗱 𝗰𝗼𝗹𝗹𝗮𝗯𝗼𝗿𝗮𝘁𝗶𝗼𝗻 𝗯𝗲𝘁𝘄𝗲𝗲𝗻 𝗲𝗺𝗽𝗹𝗼𝘆𝗲𝗿𝘀, 𝘁𝗿𝗮𝗶𝗻𝗶𝗻𝗴 𝗽𝗿𝗼𝘃𝗶𝗱𝗲𝗿𝘀 𝗮𝗻𝗱 𝗘𝗣𝗔𝗢𝘀, moving away from the current linear model and putting the apprentice at the core.
𝙚-𝘼𝙏𝙋 𝘾𝙤𝙣𝙛𝙚𝙧𝙚𝙣𝙘𝙚 – 𝙊𝙘𝙩𝙤𝙗𝙚𝙧, 𝙇𝙤𝙣𝙙𝙤𝙣
This three-day conference, which is run by the European branch of the Association of Test Publishers, has a full and interesting programme and is always well attended.
The first day included a dedicated 𝗦𝗲𝗰𝘂𝗿𝗶𝘁𝘆 𝗦𝘂𝗺𝗺𝗶𝘁, which focused on the latest news and developments related to assessment security and malpractice detection. 𝗗𝗿. 𝗧𝗵𝗼𝗺𝗮𝘀 𝗟𝗮𝗻𝗰𝗮𝘀𝘁𝗲𝗿 delivered a fascinating keynote that looked at how the 𝗮𝘀𝘀𝗲𝘀𝘀𝗺𝗲𝗻𝘁 𝗺𝗮𝗹𝗽𝗿𝗮𝗰𝘁𝗶𝗰𝗲 landscape has changed over the last 20 years: a journey that has taken us from a preoccupation with preventing 𝗽𝗹𝗮𝗴𝗶𝗮𝗿𝗶𝘀𝗺 in 2003 (something that technology now rigorously detects), through the rise of 𝗰𝗼𝗻𝘁𝗿𝗮𝗰𝘁 𝗰𝗵𝗲𝗮𝘁𝗶𝗻𝗴 in 2006 (where someone else is paid to take the test for you), to the present day, where we are most focused on preventing the 𝗵𝗮𝗿𝘃𝗲𝘀𝘁𝗶𝗻𝗴 𝗼𝗳 𝗾𝘂𝗲𝘀𝘁𝗶𝗼𝗻 𝗺𝗮𝘁𝗲𝗿𝗶𝗮𝗹𝘀 for republication online. He argued that whilst 𝗔𝗿𝘁𝗶𝗳𝗶𝗰𝗶𝗮𝗹 𝗜𝗻𝘁𝗲𝗹𝗹𝗶𝗴𝗲𝗻𝗰𝗲 (𝗔𝗜) has a lot to offer in detecting malpractice, the people behind cheating also use AI to develop ever more sophisticated ways to cheat!
Whilst 𝗜𝗻𝘃𝗶𝗴𝗶𝗹𝗮𝘁𝗶𝗼𝗻/𝗣𝗿𝗼𝗰𝘁𝗼𝗿𝗶𝗻𝗴 (be this in person or delivered remotely online) is an effective way of reducing malpractice, 𝗣𝗮𝘂𝗹 𝗠𝘂𝗶𝗿, Head of Technology Enabled Assessment for 𝗧𝗵𝗲 𝗕𝗿𝗶𝘁𝗶𝘀𝗵 𝗖𝗼𝘂𝗻𝗰𝗶𝗹 and the eAA’s Vice Chair, reminded us all that the main purpose of invigilation/proctoring is to ensure candidates have a fair and consistent assessment experience – reducing cheating is just a very desirable bonus.
We were also reminded by 𝗣𝗲𝗮𝗿𝘀𝗼𝗻 𝗩𝗨𝗘’𝘀 Director of Special Investigations 𝗛𝗮𝗿𝗿𝘆 𝗦𝗮𝗺𝗶𝘁 that candidate cheating is nothing new, and that delivering assessments through a secure online assessment platform can reduce candidate malpractice significantly – he stated that typically fewer than 12% of candidates using their online platform try to cheat, versus 50% in physical test centres.
The 𝗸𝗲𝘆 𝘁𝗮𝗸𝗲-𝗮𝘄𝗮𝘆 from all of this for me was that we need to focus on ways of decreasing the demand for cheating rather than just relying on technology to detect it when it happens. We need to 𝗱𝗲𝘁𝗲𝗿 𝗮𝗻𝗱 𝗲𝗱𝘂𝗰𝗮𝘁𝗲 𝗰𝗮𝗻𝗱𝗶𝗱𝗮𝘁𝗲𝘀 𝗮𝗯𝗼𝘂𝘁 𝗰𝗵𝗲𝗮𝘁𝗶𝗻𝗴 and ensure they understand the consequences if they are caught.
The eAA was also given the opportunity to showcase three winners from this year’s 𝗜𝗻𝘁𝗲𝗿𝗻𝗮𝘁𝗶𝗼𝗻𝗮𝗹 𝗲𝗔𝘀𝘀𝗲𝘀𝘀𝗺𝗲𝗻𝘁 𝗔𝘄𝗮𝗿𝗱𝘀 – for an overview of all of this year’s finalists and winners, please visit the awards website.
𝙁𝙚𝙙𝙚𝙧𝙖𝙩𝙞𝙤𝙣 𝙤𝙛 𝘼𝙬𝙖𝙧𝙙𝙞𝙣𝙜 𝘽𝙤𝙙𝙞𝙚𝙨 (𝙁𝘼𝘽) 𝘾𝙤𝙣𝙛𝙚𝙧𝙚𝙣𝙘𝙚 – 𝙊𝙘𝙩𝙤𝙗𝙚𝙧, 𝙇𝙚𝙞𝙘𝙚𝙨𝙩𝙚𝙧
This year’s conference theme was 𝗗𝗲𝗹𝗶𝘃𝗲𝗿𝗶𝗻𝗴 𝗢𝗽𝗽𝗼𝗿𝘁𝘂𝗻𝗶𝘁𝘆 𝗳𝗼𝗿 𝗔𝗹𝗹 𝘁𝗵𝗿𝗼𝘂𝗴𝗵 𝗘𝗾𝘂𝗶𝘁𝘆, 𝗗𝗶𝘃𝗲𝗿𝘀𝗶𝘁𝘆 𝗮𝗻𝗱 𝗜𝗻𝗰𝗹𝘂𝘀𝗶𝗼𝗻.
The opening address, given by 𝗞𝗶𝗿𝘀𝘁𝗶𝗲 𝗗𝗼𝗻𝗻𝗲𝗹𝗹𝘆, CEO of 𝗖𝗶𝘁𝘆 & 𝗚𝘂𝗶𝗹𝗱𝘀 and Co-Chair of FAB, highlighted some key facts about the very diverse FE and Vocational/Technical learning and assessment sector: a sector that in 2020/21 saw 1.5 million learners and 600,000 VTQ certificates issued. In conversation with EDI advocate 𝗙𝗿𝗮𝗻𝗸 𝗗𝗼𝘂𝗴𝗹𝗮𝘀, she also highlighted the difference between 𝗘𝗾𝘂𝗶𝘁𝘆 and 𝗘𝗾𝘂𝗮𝗹𝗶𝘁𝘆 – something I hope the image that accompanies this article helps to articulate – and stressed the importance of ensuring 𝗲𝗾𝘂𝗶𝘁𝘆 in learning and assessment to deliver true inclusion.
There were lots of very interesting sessions throughout the conference, but one that stood out was from 𝗘𝘀𝘁𝗵𝗲𝗿 𝗢’𝗖𝗮𝗹𝗹𝗮𝗴𝗵𝗮𝗻, founder of 𝗵𝘂𝗻𝗱𝗼.𝘅𝘆𝘇, who introduced us to the 𝗟𝗲𝗮𝗿𝗻𝗶𝗻𝗴 𝗠𝗲𝘁𝗮𝘃𝗲𝗿𝘀𝗲 – an immersive, virtual learning environment. I was also really interested to hear about Hundo’s development of a 𝗱𝗶𝗴𝗶𝘁𝗮𝗹 𝘀𝗸𝗶𝗹𝗹𝘀 𝘄𝗮𝗹𝗹𝗲𝘁 secured on the 𝗕𝗹𝗼𝗰𝗸𝗰𝗵𝗮𝗶𝗻, which has to be the future of certification, especially with micro-credentials becoming ever more popular.
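The appeal of anchoring credentials to a blockchain is tamper-evidence: a verifier can recompute a fingerprint of the credential record and compare it with the value written to the ledger. The sketch below is purely illustrative (the data fields and function names are invented, and this is not Hundo’s actual implementation), but it shows the basic idea of fingerprinting a record so any later alteration is detectable.

```python
import hashlib
import json

def fingerprint(credential: dict) -> str:
    """Return a SHA-256 fingerprint of a credential record.

    The record is serialised canonically (sorted keys) so the same
    content always produces the same hash.
    """
    canonical = json.dumps(credential, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# A hypothetical micro-credential record.
cred = {
    "holder": "A. Learner",
    "award": "Digital Skills Micro-credential",
    "year": 2022,
}

anchor = fingerprint(cred)          # this value would be written to a public ledger
assert fingerprint(cred) == anchor  # verification: recompute and compare

# Any tampering with the record changes the fingerprint.
tampered = {**cred, "award": "Doctorate"}
assert fingerprint(tampered) != anchor
```

In a real wallet the anchor would be signed by the awarding body and stored on-chain, but the verification step is essentially this recompute-and-compare check.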
𝙊𝙛𝙦𝙪𝙖𝙡’𝙨 𝘼𝙡𝙡 𝙎𝙩𝙖𝙛𝙛 𝘾𝙤𝙣𝙛𝙚𝙧𝙚𝙣𝙘𝙚 – 𝙉𝙤𝙫𝙚𝙢𝙗𝙚𝙧, 𝘾𝙤𝙫𝙚𝙣𝙩𝙧𝙮
The eAA was invited to deliver a session on the increasing use of 𝗢𝗻𝗹𝗶𝗻𝗲 𝗣𝗿𝗼𝗰𝘁𝗼𝗿𝗶𝗻𝗴 by exam awarding and professional bodies, both here in the UK and internationally. Fellow eAA Board member 𝗣𝗮𝘁𝗿𝗶𝗰𝗸 𝗖𝗼𝗮𝘁𝗲𝘀 and 𝗔𝗻𝗱𝗿𝗲𝘄 𝗖𝗿𝗼𝘆𝗱𝗼𝗻, Director of Examinations, Education and Skills Policy for 𝗧𝗵𝗲 𝗔𝘀𝘀𝗼𝗰𝗶𝗮𝘁𝗶𝗼𝗻 𝗼𝗳 𝘁𝗵𝗲 𝗕𝗿𝗶𝘁𝗶𝘀𝗵 𝗣𝗵𝗮𝗿𝗺𝗮𝗰𝗲𝘂𝘁𝗶𝗰𝗮𝗹 𝗜𝗻𝗱𝘂𝘀𝘁𝗿𝘆 (𝗔𝗕𝗣𝗜), joined me to give this very positively received presentation to over 70 of Ofqual’s staff. My sincere thanks not only to Patrick and Andrew, but also to eAA sponsors 𝗘𝘅𝗮𝗺𝗶𝘁𝘆 and 𝗤𝘂𝗲𝘀𝘁𝗶𝗼𝗻𝗺𝗮𝗿𝗸 for introducing me to Andrew and the ABPI.
Looking ahead, there are a number of conferences and events on the horizon, but one that I’d like to call out is the eAA’s own 𝗘𝗻𝗱 𝗼𝗳 𝗬𝗲𝗮𝗿 𝗪𝗿𝗮𝗽 virtual event on Wednesday 7th December at 2pm (GMT) where leading educationalist 𝗣𝗿𝗼𝗳𝗲𝘀𝘀𝗼𝗿 𝗚𝗲𝗿 𝗚𝗿𝗮𝘂𝘀 𝗢𝗕𝗘 will be looking at how we measure the learning impact of educational technologies – from 𝗔𝗜 𝗮𝘀𝘀𝗶𝘀𝘁𝗲𝗱 𝗹𝗲𝗮𝗿𝗻𝗶𝗻𝗴 𝗮𝗻𝗱 𝗮𝘀𝘀𝗲𝘀𝘀𝗺𝗲𝗻𝘁 𝘁𝗲𝗰𝗵𝗻𝗼𝗹𝗼𝗴𝗶𝗲𝘀 to use of the 𝗹𝗲𝗮𝗿𝗻𝗶𝗻𝗴 𝗺𝗲𝘁𝗮𝘃𝗲𝗿𝘀𝗲 – full details can be found in this newsletter.
Full details of all upcoming events can be found on the eAA’s event calendar.
Developing and delivering e-assessments: the importance of communication
by Helen Claydon
For much of the past 10 years I have been involved in e-assessment content development across a variety of organisations and assessments. In all instances, my experience of e-assessment has involved the assessment ‘owner’ outsourcing various aspects to third parties.
Handing over responsibility for aspects of your assessment to a third party can be a nerve-wracking experience. Establishing a collaborative working relationship is key to successful delivery. In this blog I share some thoughts based upon past experience and hope that they strike a chord with your own experiences.
We clearly share the aim of wanting the assessment to succeed, but to what extent do we agree on how to achieve this? Quite often, compromises will need to be made along the journey. The main areas for discord and compromise relate to quality, timescales and costs. In my view quality is very important, but quality and accuracy are not the same thing. A candidate will probably not be too bothered if a question doesn’t look ‘pretty’.
However, it is a different story if there is an error in a question, which results in the candidate losing a mark and potentially failing the exam. For high stakes examinations, it is no exaggeration to say that one small error in a line of code has the potential to be life altering for a candidate. It is worth agreeing up front who has responsibility for ensuring accuracy and how and when this will be achieved. For example, will it be through:
- up front quality assurance testing prior to launch of the exam, or
- remedial work post-launch to fix issues identified by candidates when they were taking their actual exam?
What are the consequences if the assessments are not ready by the agreed launch date? If the launch of a new high-stakes examination needs to be delayed, particularly if it was due to be administered on a fixed day, the consequences can be significant. All parties need to be clear about the importance of the timeline, with early indication provided of any potential slippage to the critical path. In the event of slippage it is important that all parties work together to agree realistic mitigations. Transparency is key; covering up slippages or downplaying risks has the potential to create bigger problems later that are more difficult to resolve. However, the launch date isn’t always critical; sometimes it is okay for dates to slip to avoid compromising quality.
The test owner will often have a finite budget, so additional spend to address unexpected issues will not always be a realistic option. As test owners, we never want to compromise on quality or timescale. However, there will be instances where we have no choice; it is worth deciding up front where the line lies that we are not prepared to cross.
Collaborative scoping of requirements at the start of a project is invaluable. Making time to talk before commencement of any system development will help to develop a shared and consistent understanding of high level requirements and reduce the likelihood of time and money being invested in developing a solution that doesn’t meet the test owner’s expectations.
At the start of a project, a common pitfall is failing to realise that you do not have a shared vocabulary. The resulting repercussions for a project can be significant, with solutions developed by the e-assessment provider not meeting the expectations of the assessment owner. If you perceive this to be a problem, it can help to work together to develop a shared glossary of terms and definitions. An alternative is to use definitions from an externally published source such as the ‘e-Assessment Glossary’, commissioned by JISC and QCA and published in 2006.
In conclusion …
To use the old BT adage, ‘It’s good to talk.’ The importance of investing time in up-front collaborative discussion should not be underestimated. It can help to increase the likelihood that our e-assessments succeed, with reduced need for compromise along the way.
Standards in e-Testing
I started working in e-assessment sales in the late 1990s and carried on into the mid-2000s. Looking back, perhaps I see this period as the heyday of e-assessment, with the new world of e-assessment providing an opportunity to give learners and candidates a better experience, including instant feedback and on-demand testing. I worked closely with QCA (as was) and with a number of awarding bodies, and even completed my MBA dissertation on the incredibly exciting topic of ‘Barriers to adoption of e-assessment by UK Awarding Bodies’. However, having had a seven-year break from e-assessment, focussing more on the e-learning side of things, I have returned to the e-assessment fold and discovered that things have not really moved on. I recently had a presentation from one of the stalwarts of e-testing – I won’t mention any names – and it was the same presentation I had seen them give 10 years ago.
Working internationally now, I speak to many organisations that are contemplating moving to e-assessment, and they are struggling. They still see e-assessment as being about the technology, whereas I think that’s the easy bit. The real challenge is that, as with any IT project, you need to understand what it is you are trying to achieve strategically, and then align the organisation to achieve it. Once you know this, it will direct the tactical and operational decisions you need to make. This is a process I am hoping to provide a solution for: I am currently working with the International Organization for Standardization (ISO) to put together a standard on high-stakes e-testing, which will cover the more strategic implications of e-testing and how these impact on the operational elements. NB: I am specifically using the terms ‘high stakes’ and ‘e-testing’ as they narrow the scope and make things somewhat easier.
Whilst this may sound straightforward, the actuality is much more difficult, as different types of organisation (for example, academic and commercial testing organisations) and different countries see things differently. And, as with the United Nations, there has to be consensus! This diversity is further exacerbated by the challenge of creating a standard for something as subjective as strategy. The intention is not to recreate work that has already been done in the area of developing and delivering e-tests, of which there is a lot, but to draw this information together into a single body of knowledge, in the context of understanding why e-testing is being implemented within an organisation.
This work is in progress and, whilst I have paraphrased a lot of the activity here, I have put together a schematic which I think neatly brings the different elements of e-testing together and explains them. I would appreciate any feedback, and would be glad to hear from anyone who would like to be involved or who has anything they would like to share.
Author: Patrick Coates, [email protected]
The personal views expressed in this blog are those of the eAA Board member, and not of their employer.
Mobile Technology and e-Assessment
The technology research company Gartner has reported that sales of PCs are down yet again, while the trend for mobile devices is increasing apace, especially in countries with little previous adoption of computer technology, such as across Africa. The number of smartphones and tablet devices I see around the university bears testament to this. If these devices are so common, are there ways we can harness their power for e-assessment?
Socrative
This tool can be used on any device, as you can access it via a web interface or via apps on some devices. The beauty of it is that the interface is identical no matter which way you choose to access the quiz. The tool allows you to set up a variety of question formats through a simple interface, and you can now include images with the questions. When learners take the quiz you can see progress immediately. Another facility it offers is the ability to create an exit poll to ask what has been learned in a session, and also to set a closing problem. www.socrative.com
QuizSlides
This has to be the simplest quiz creator ever: if you can make a PowerPoint, you can do this. You create your quiz in PowerPoint and upload it as a .ppt or .pdf file, then you take the quiz yourself to set the correct answers and you’re done. All quizzes run through a browser, so they can be used on any web-connected device or PC. It’s not sophisticated, only allowing four-choice MCQs, but it does let you create wonderful-looking quizzes simply, based on existing Office templates. When you run these on handheld devices, it simply looks like the quiz was designed for that device. www.quizslides.com
InfuseLearning
This platform offers a great choice of question formats that are easy to set up in advance and run as you need; you can even choose to run spot questions as you go. The really nice option, though, is the ability to use drawing questions. You can upload your own images and then ask learners to indicate specific things, or even draw a simple diagram on a blank canvas. When their images are submitted, you can view them on screen and save them in .pdf format. The slightly frustrating thing is that drawing can be a bit difficult if you want fine detail, so getting people to write labels isn’t viable – it is better to ask them to indicate things as A, B, C.
Another feature, which is wonderful for accessibility, is that InfuseLearning offers the option of text-to-speech question and answer reading. Alongside this, it can convert questions and answers into a limited number of alternative languages. It runs on all web browsers except Internet Explorer, so can be accessed on many different devices. www.infuselearning.com
Nearpod
This is a beautiful tool that can be run via an iPad, iPhone or iPod Touch app available through iTunes. A new function has just been added to allow participation in quizzes via a browser, although the administrator still needs an Apple device to control the session. You can upload a PowerPoint presentation or PDF to create a slide show, to which you can then add a variety of styles of question at relevant points. Originally the ‘teacher’ controlled the pace, deciding when to move each slide forwards and when to move on at each quiz point; the tool lets you see each individual’s progress to determine when to move forwards. Now there is also a ‘homework’ mode that allows learners to work through at their own pace. This tool really lets you test understanding as learners work their way through. www.nearpod.com
There are many other tools out there, but these are my favourites. The beauty of all of them is that you can test them out with no financial outlay; for some there are costs when you go beyond certain limits – number of participants, number of quizzes, or improved reporting. What I especially like is the ease of setting up quizzes on all of them. In a visually savvy world, they also produce professional-looking output to engage learners.
Author: Judy Bloxham | eLearning Adviser FE&SFC | Jisc RSC Northwest
CACHE Award Assessment Delivery Contract to BTL
BTL is pleased to announce that CACHE (Council for Awards in Care, Health and Education) has awarded BTL a contract for the delivery of Assessment Services through Surpass Professional Edition, BTL’s new SaaS-based Assessment Platform.
Through 2013, CACHE ran a major procurement exercise to ‘test the market’, seeking a new supplier and a new system for its online examination programme. From a competitive field, BTL was awarded the contract. The new Surpass product is built on the Surpass Enterprise Edition platform already used by 70% of the major Awarding Organisations in the UK.
Rob Bendelow, Procurement & Project Manager at CACHE comments:
“Four potential suppliers were short-listed. Of course, all four systems could do the job, but the Surpass product from BTL ticked 95% of our boxes. It’s pretty easy to use and has clearly been developed with Awarding Organisations in mind. Creating assessments, scheduling tests, candidates taking on the challenge, co-ordinating the results… it’s all in there.”
Clare Ruiz-Palma, Project Manager for the Surpass implementation at CACHE, added:
“BTL and CACHE have worked well together on this project. They (BTL) have also gone shoulder-to-shoulder with another of our strategic suppliers to help forge a seamless union between Surpass and our back office learner management system ‘Parnassus’. We launched this new e-assessment platform in early April 2014, right on schedule.”
Rob added: “Sourcing a new IT system involves far more than bytes, browsers and bandwidth. People like to do business with people and so far the team from Shipley (BTL) have worked really well with CACHE”.
Keith Myers, Managing Director of BTL, says: “BTL are delighted to be working with CACHE. CACHE were very rigorous in their tendering process and we regard this choice of Surpass as an acknowledgement of the quality of our Assessment System.”
CACHE is a leading UK Awarding Organisation, based in St Albans and focussing primarily on the child care, adult care, health and education sectors.
All CACHE qualifications require the learners to undertake assessment, to measure and confirm their progress towards achievement of the qualification.
Assessment includes assignments, extended essays and examinations under controlled conditions. Online exams, wherein the learner’s answers are auto-marked by the computer system itself, are a growing segment of the assessment landscape.
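The auto-marking of online exams mentioned above is, at its simplest, a matter of comparing each candidate response against a stored answer key. The sketch below is a minimal illustration of that idea (the question IDs and function are invented for this example; it is not the logic of Surpass or any other named platform, which also handle partial credit, item weighting and audit trails).

```python
# Hypothetical answer key for a three-question multiple-choice exam.
ANSWER_KEY = {"Q1": "B", "Q2": "D", "Q3": "A"}

def auto_mark(responses: dict) -> tuple:
    """Return (score, total) for a dict mapping question ID to chosen option.

    Unanswered questions simply score zero, since dict.get returns None
    for missing keys and None never equals a correct option.
    """
    score = sum(
        1 for question, correct in ANSWER_KEY.items()
        if responses.get(question) == correct
    )
    return score, len(ANSWER_KEY)

# A candidate who gets Q2 wrong scores 2 out of 3.
score, total = auto_mark({"Q1": "B", "Q2": "C", "Q3": "A"})
print(f"{score}/{total}")  # prints 2/3
```

Real platforms layer a great deal on top of this comparison step (secure delivery, item banking, results moderation), but the instant feedback that makes online exams attractive rests on marking logic of essentially this shape.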
BTL is a leading UK provider of assessment systems to Awarding Organisations, Regulators and Government Bodies. First established in 1985, BTL is a privately held Limited Company founded by Chairman Dr Bob Gomersall based in Yorkshire, UK. Trusted by clients to deliver high-stakes examinations, BTL’s assessment platform, Surpass, is the most widely-used e-Assessment system on the UK’s education market.
For further information please contact:
CACHE Corporate Communications Manager
Telephone: 01727 818 646
Email: [email protected]
BTL Head of Marketing
Telephone: 07866 317346
Email: [email protected]
Redefining Assessment with New Technology
In learning, the SAMR (Substitution, Augmentation, Modification, Redefinition) model encourages those involved in teaching to move beyond simply transferring what they have always done manually into a technology, towards redesigning their activities to use technology for things that were previously not feasible. Perhaps there is something for assessment to learn here too. E-assessment can simply mean delivering the same assessment online, but here again is an opportunity to think outside the box and aim to redefine how we assess. Here is an example of how assessment was redefined to make use of technological innovation.
Quite often the problem with assessment is that it can appear to the learner to be an artificial process, with tasks that seem bolted on rather than a realistic part of the situation. To show that all the learning outcomes have been met, some ‘extras’ have to be added in – a requirement to write a guide to using a website was one I found myself adding to meet set criteria. At Myerscough College, one tutor’s use of emerging technology enabled outcomes to be met effectively.
Jon, a lecturer in Animal Studies, supervised learners as part of the ASDAN enterprise management unit, a short course supplementary to the main animal management qualification that can make up part of a PDP qualification, or evidence wider key skills. The learners had initially come up with ideas like cake baking, car washes and sock puppets to make money. Jon showed the learners how the ‘student’ ID tags on the bearded dragon vivarium had been augmented. The learners were enthralled and wanted to know how it was possible. He told them that it wasn’t too difficult, and the idea of an augmented reality calendar was born.
The development of the calendar naturally lent itself to meeting all the project process areas: planning, design, research and production. The learners took responsibility for the months, trying to theme each one to suit that point in the year, e.g. February as “Love Birds for Valentine’s Day” and April as “Rabbits for Easter”. The learners then had to research that animal and write a script that included its main care points. They found that this helped focus them, as they were writing for an external audience. The whole process of developing the content for the calendar covered all the unit assessment outcomes.
Each month’s contribution was filmed, photographed, voiceovers added and then edited together. The final results were uploaded to the Aurasma studio with the help of the tutor and each month was ‘brought to life’. Finally all aspects of the calendar were there and ready for printing.
Speaking about the experience Joe said “As students, we wanted to try something new.” Jayne followed this up by adding “It was really cool when you got to see it, you felt proud that you’d actually made something.” From Jon’s point of view, he said it was easier to get the learners to meet all the necessary criteria because “it was like they were doing work, without actually doing work!” He explained that “it was a lot easier to get them to buy into the idea when they were engaged, focused and enthusiastic.”
For the assessment Jon was able to observe the learners involved in the process, capture evidence of them participating and then link to the final outputs for external examiners. The learners themselves had to add their evidence to an electronic book. Although there may appear to be a lot of work involved in this type of assessment activity, if the learners are engaged then it lessens the need to keep pushing them to complete tasks. It may well be that this activity is not scalable to cover a large group, but certainly is possible to replicate with around 15-20 learners. Alongside the assessment benefits it can also contribute to wider skills and confidence building.
The learners recognized that the technology could be utilized again when they were given an assessment in this academic year. They had to design a poster that helped people understand about animal training, but had a limited word count. One way they found to get round this was to add Augmented Reality content. They also noted that it was a great way for someone who wasn’t too confident with words, to ensure their message got across as they wanted, without having to struggle with writing.
This goes to show how well-designed activities that make use of technology can make the assessment process easier. Listen to the students and Jon talk about their experiences: http://youtu.be/n2wgEhUFG-Y
Author: Judy Bloxham