the e-Assessment Association

Developing and delivering e-assessments: the importance of communication

by Helen Claydon

For much of the past 10 years I have been involved in e-assessment content development across a variety of organisations and assessments. In all instances, my experience of e-assessment has involved the assessment ‘owner’ outsourcing various aspects to third parties.

Handing over responsibility for aspects of your assessment to a third party can be a nerve-wracking experience. Establishing a collaborative working relationship is key to successful delivery. In this blog I share some thoughts based upon past experience and hope that they strike a chord with your own experiences.

Aims

We clearly have the shared aim of wanting the assessment to succeed, but to what extent do we agree on how to achieve this? Quite often, compromises will need to be made along the journey. The main areas for discord and compromise relate to quality, timescales and costs. In my view quality matters greatly, but it is important to remember that quality and accuracy are not the same thing. A candidate will probably not be too bothered if a question doesn’t look ‘pretty’.

However, it is a different story if there is an error in a question that results in the candidate losing a mark and potentially failing the exam. For high-stakes examinations, it is no exaggeration to say that one small error in a line of code has the potential to be life-altering for a candidate. It is worth agreeing up front who has responsibility for ensuring accuracy, and how and when this will be achieved. For example, will it be through:

  • up front quality assurance testing prior to launch of the exam, or
  • remedial work post-launch to fix issues identified by candidates when they were taking their actual exam?

What are the consequences if the assessments are not ready by the agreed launch date? If the launch of a new high-stakes examination needs to be delayed, particularly if it was due to be administered on a fixed day, the consequences can be significant. All parties need to be clear about the importance of the timeline, with early indication provided of any potential slippages to the critical path. In the event of a slippage, it is important that all parties work together in agreeing realistic mitigations. Transparency is key; covering up slippages or downplaying risks has the potential to create bigger problems later on which are more difficult to resolve. However, the launch date isn’t always critical; sometimes it is okay for dates to slip to avoid compromising quality.

The test owner will often have a finite budget, so additional spend to address unexpected issues will not always be a realistic option. As a test owner, we never want to compromise on quality or timescale. However, there will be instances when we have no choice; it is worth deciding up front where the line lies that we are not prepared to cross.

Successful collaboration

Collaborative scoping of requirements at the start of a project is invaluable. Making time to talk before commencement of any system development will help to develop a shared and consistent understanding of high-level requirements and reduce the likelihood of time and money being invested in developing a solution that doesn’t meet the test owner’s expectations.

At the start of a project, a common pitfall is failing to realise that you do not have a shared vocabulary. The resulting repercussions for a project can be significant, with solutions developed by the e-assessment provider not meeting the expectations of the assessment owner. If you perceive this to be a problem, it can help to work together to develop a shared glossary of terms and definitions. An alternative is to use definitions from an externally published source such as the ‘e-Assessment Glossary’, commissioned by JISC and QCA and published in 2006.

In conclusion …

To use the old BT adage, ‘It’s good to talk.’ The importance of investing time in up-front collaborative discussion should not be underestimated. It can help to increase the likelihood that our e-assessments succeed, with reduced need for compromise along the way.

Standards in e-Testing

I started working in e-assessment sales in the late 1990s and carried on into the mid-2000s. Looking back, perhaps I see this period as the heyday of e-assessment, with the new world of e-assessment providing an opportunity to give learners and candidates a better experience, including instant feedback and on-demand testing. I worked closely with QCA (as was) and with a number of awarding bodies, and even completed my MBA dissertation on the incredibly exciting topic of ‘Barriers to adoption of e-assessment by UK Awarding Bodies’. However, having had a seven-year break from e-assessment, focussing more on the e-learning side of things, I have returned to the e-assessment fold and discovered that things have not really moved on. I had a presentation from one of the stalwarts of e-testing – I won’t mention any names – and it was the same presentation I had seen them give 10 years ago.

Working internationally now, I speak to many organisations that are contemplating moving to e-assessment, and they are struggling. They still see e-assessment as being about the technology, whereas I think that’s the easy bit. The real challenge is that, as with any IT project, you need to understand what it is that you are trying to achieve strategically, and then align the organisation to achieve this. Once you know this, it will direct you towards the tactical and operational decisions you need to make. This is a process that I am hoping to provide a solution for, and I am currently working with the International Organization for Standardization (ISO) to put together a standard on high-stakes e-testing, which will include the more strategic implications of e-testing and how these impact on the operational elements. NB: I am specifically using the terms ‘high-stakes’ and ‘e-testing’ as they narrow the scope and make things somewhat easier.

Whilst this may sound straightforward, the reality is much more difficult, as different types of organisations (for example, academic and commercial testing organisations) and different countries see things differently. And, as with the United Nations, there has to be consensus! This diversity is further exacerbated by the challenges of creating a standard for something that is subjective, like strategy. The intention is not to recreate work that has already been completed in the area of developing and delivering e-tests, of which there is a lot, but to draw this information together into a single body of knowledge, in the context of understanding why e-testing is being implemented within an organisation.

This work is in progress and, whilst I have paraphrased a lot of the activity here, I have put together a schematic which I think neatly brings together and explains the different elements of e-testing. I would appreciate any feedback, and would welcome anyone interested in being involved or with anything they would like to share.

Author: Patrick Coates, [email protected]

The personal views expressed in this blog are those of the eAA Board member, and not of their employer.

Mobile Technology and e-Assessment

The technology research company Gartner reported that sales of PCs were down yet again, and that the trend for mobile devices was increasing apace, especially in countries where there has previously been little adoption of computer technology, such as across Africa. The number of smartphones and tablet devices I see here around the university bears testament to this. If these devices are so common, are there ways we can harness their power for e-assessment?

Socrative

This tool can be used on any device, as you can access it via a web interface or via apps on some devices. The beauty of it is that the interface is identical no matter which way you choose to access the quiz. The tool allows you to set up a variety of question formats through a simple interface, and you can now include images with the questions. When learners take the quiz you can see progress immediately. Another facility it offers is the ability to create an exit poll to enquire about what has been learned in a session, and also to set a closing problem.
www.socrative.com

QuizSlides

This has to be the simplest quiz creator ever: if you can make a PowerPoint presentation, you can do this. You create your quiz in PowerPoint and upload it as a .ppt or .pdf file, then you take the quiz yourself to set the correct answers and you’re done. All quizzes run through a browser, so they can be used on any web-connected device or PC. It’s not sophisticated, only allowing four-choice MCQs, but it does let you create wonderful-looking quizzes based simply on existing Office templates. When you run these on handheld devices, it simply looks as if the quiz was designed for that device.
www.quizslides.com

InfuseLearning

This platform offers a great choice of question formats that are easy to set up in advance and run as you need. You can even choose to run spot questions as you go. The really nice option, though, is being able to use drawing questions. You can upload your own images and then ask learners to indicate specific things, or even draw a simple diagram on a blank canvas. When their images are submitted, you can view them on screen and save them in .pdf format. The slightly frustrating thing is that drawing can be a bit difficult if you want fine detail, so getting people to write labels isn’t viable; it is better to ask them to indicate things as A, B, C.

Another feature, which is wonderful for accessibility, is that InfuseLearning offers the option of text-to-speech question and answer reading. Alongside this it can convert questions and answers into a limited number of alternative languages. It runs on all web browsers except Internet Explorer, so it can be accessed on many different devices.
www.infuselearning.com

NearPod

This is a beautiful tool that can be run via an iPad, iPhone or iPod Touch app available through iTunes. A new function has just been added to allow access via a browser for participation in the quizzes; however, the administrator still needs to use an Apple device to control the session. You can upload a PowerPoint presentation or PDF to create a slide show, to which you can then add a variety of styles of questions at relevant points. Originally, the ‘teacher’ controlled the pace, deciding when to change each slide and when to move on at each quiz point. The tool lets you see each individual’s progress to determine when to move forwards. Now there is also a ‘homework’ mode that allows learners to work through at their own pace. This tool really lets you test understanding as learners work their way through.
www.nearpod.com

There are many other tools out there, but these are my favourites. The beauty of all of these is that you can test them out with no financial outlay; for some, there are costs when you go beyond certain limits – number of participants, number of quizzes, or improved reporting. But what I especially like is the ease of setting up quizzes on all of these. In a visually savvy world, they also produce professional-looking output to engage learners.

Author: Judy Bloxham | eLearning Adviser FE&SFC | Jisc RSC Northwest

[email protected]

CACHE Award Assessment Delivery Contract to BTL

BTL is pleased to announce that CACHE (Council for Awards in Care, Health and Education) has awarded BTL a contract for the delivery of Assessment Services through Surpass Professional Edition, BTL’s new SaaS-based Assessment Platform.
 
Through 2013, CACHE ran a major procurement exercise to ‘test the market’, seeking a new supplier and a new system for its online examination programme. From a competitive field, BTL was awarded the contract. The new Surpass product is built on the Surpass Enterprise Edition platform already used by 70% of the major Awarding Organisations in the UK.
 
Rob Bendelow, Procurement & Project Manager at CACHE comments:
 
“Four potential suppliers were short-listed. Of course, all four systems could do the job, but the Surpass product from BTL ticked 95% of our boxes. It’s pretty easy to use and has clearly been developed with Awarding Organisations in mind. Creating assessments, scheduling tests, candidates taking on the challenge, co-ordinating the results… it’s all in there.”
 
Clare Ruiz-Palma, Project Manager for the Surpass implementation at CACHE, added:
 
“BTL and CACHE have worked well together on this project. They (BTL) have also gone shoulder-to-shoulder with another of our strategic suppliers to help forge a seamless union between Surpass and our back office learner management system ‘Parnassus’. We launched this new e-assessment platform in early April 2014, right on schedule.”
 
Rob added: “Sourcing a new IT system involves far more than bytes, browsers and bandwidth. People like to do business with people and so far the team from Shipley (BTL) have worked really well with CACHE”.

Keith Myers, Managing Director, BTL, says: “BTL are delighted to be working with CACHE. CACHE were very rigorous in their tendering process and we regard this choice of Surpass as an acknowledgement of the quality of our Assessment System.”
 
About CACHE
CACHE is a leading UK Awarding Organisation, based in St Albans and focussing primarily on the child care, adult care, health and education sectors.
 
All CACHE qualifications require the learners to undertake assessment, to measure and confirm their progress towards achievement of the qualification.
 
Assessment includes assignments, extended essays and examinations under controlled conditions. Online exams, wherein the learner’s answers are auto-marked by the computer system itself, are a growing segment of the assessment landscape.
 
www.cache.org.uk
 
About BTL
BTL is a leading UK provider of assessment systems to Awarding Organisations, Regulators and Government Bodies. First established in 1985, BTL is a privately held limited company, based in Yorkshire, UK, and founded by Chairman Dr Bob Gomersall. Trusted by clients to deliver high-stakes examinations, BTL’s assessment platform, Surpass, is the most widely used e-assessment system in the UK education market.
 
www.btl.com
www.surpass.com
 
For further information please contact:
 
Jo Matthews
CACHE Corporate Communications Manager
Telephone: 01727 818 646
Email: [email protected]
 
Geoff Chapman
BTL Head of Marketing
Telephone: 07866 317346
Email: [email protected]

Redefining Assessment with New Technology

In learning, the SAMR (Substitution, Augmentation, Modification, Redefinition) model is designed to encourage those involved in teaching to move away from simply transferring what they have always done manually into a technology, and toward redesigning their activity by using technology to support completely new things that were previously not feasible. Perhaps there is something to learn for assessment here too. e-Assessment can simply be thought of as delivering the same assessment by putting it online; however, here again is an opportunity to think outside the box and aim to redefine how we assess. Here is an example of how assessment was redefined to make use of technological innovation.

Quite often the problem with assessment is that it can appear to the learner to be an artificial process, with tasks that seem bolted on rather than a realistic part of the situation. To enable all the learning outcomes to be shown to be met, some ‘extras’ have to be added in; the requirement to write a guide on how to use a website was one I found myself having to add to meet set criteria. At Myerscough College, one tutor’s use of emerging technology enabled outcomes to be met effectively.

Jon, a lecturer in Animal Studies, supervised learners as part of the ASDAN enterprise management unit[1], a short course supplementary to the main animal management qualification that can make up part of a PDP qualification or evidence wider key skills. The learners had initially come up with ideas like cake baking, car washes, sock puppets, etc. to make money. Jon showed the learners how the ‘student’ ID tags on the bearded dragon vivarium were augmented. The learners were enthralled and wanted to know how it was possible. He told them that it wasn’t too difficult, and the idea of an augmented reality calendar was born.

The development of the calendar naturally lent itself to meeting all the project process areas: planning, design, research and production. The learners took responsibility for the months, trying to theme each one to suit that point in the year, e.g. February as “Love Birds for Valentine’s Day” and April as “Rabbits for Easter”. The learners then had to research that animal and write a script that included the main care points for it. They found that this helped focus them, as they were writing for an external audience. The whole process of developing the content for the calendar covered all the unit assessment outcomes.

Each month’s contribution was filmed and photographed, voiceovers were added, and everything was edited together. The final results were uploaded to the Aurasma[2] studio with the help of the tutor, and each month was ‘brought to life’. Finally, all aspects of the calendar were there and ready for printing.

Speaking about the experience, Joe said, “As students, we wanted to try something new.” Jayne followed this up by adding, “It was really cool when you got to see it, you felt proud that you’d actually made something.” From Jon’s point of view, he said it was easier to get the learners to meet all the necessary criteria because “it was like they were doing work, without actually doing work!” He explained that “it was a lot easier to get them to buy into the idea when they were engaged, focused and enthusiastic.”

For the assessment, Jon was able to observe the learners involved in the process, capture evidence of them participating and then link to the final outputs for external examiners. The learners themselves had to add their evidence to an electronic book. Although there may appear to be a lot of work involved in this type of assessment activity, if the learners are engaged then there is less need to keep pushing them to complete tasks. It may well be that this activity is not scalable to a large group, but it is certainly possible to replicate with around 15-20 learners. Alongside the assessment benefits, it can also contribute to wider skills and confidence building.

The learners recognised that the technology could be utilised again when they were given an assessment in this academic year. They had to design a poster that helped people understand animal training, but with a limited word count. One way they found to get round this was to add augmented reality content. They also noted that it was a great way for someone who wasn’t too confident with words to ensure their message got across as they wanted, without having to struggle with writing.

This goes to show how well-designed activities that make use of technology can make the assessment process easier. Listen to the students and Jon talk about their experiences: http://youtu.be/n2wgEhUFG-Y

Author: Judy Bloxham

[1] http://www.asdan.org.uk/courses/programmes/enterprise-short-course

[2] http://www.aurasma.com

E-assessment provides greater quality assurance

Over the past decade, when I have been ‘on the road’ talking about electronic assessment, a frequent accusation that I have had to counter is that computers are ‘easier to cheat’ and therefore e-assessment, by inference, is less reliable. This in part explains the continued insistence from some public bodies on ‘wet signatures’, as if these cannot be forged. It would not, for example, be too difficult to ‘forge’ the signature of Vince Cable, the minister who signs off all the funding for the sector. It’s a large C with a dot in the middle!

There is, however, a growing recognition that it is pretty difficult to ‘cheat’ the e-assessment of vocational skills, because every decision and action is date-stamped and ascribed to a person’s password-protected user name. It is just not possible to alter that date, and, for every action that is taken, there is a clear audit trail.
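
To make this concrete, below is a minimal sketch, in Python, of the kind of time-stamped, hash-chained audit trail described above. It is an illustration only, not any particular vendor’s implementation, and all the names in it are invented:

    import hashlib
    import json
    from datetime import datetime, timezone

    class AuditTrail:
        """An append-only log: each entry is time-stamped, ascribed to a
        user name, and chained to the previous entry by a hash."""

        def __init__(self):
            self.entries = []

        def record(self, username, action):
            # Link this entry to the one before it.
            previous_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
            entry = {
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "username": username,
                "action": action,
                "previous_hash": previous_hash,
            }
            payload = json.dumps(entry, sort_keys=True).encode()
            entry["hash"] = hashlib.sha256(payload).hexdigest()
            self.entries.append(entry)
            return entry

        def verify(self):
            # Recompute the whole chain; any edited, reordered or deleted
            # entry breaks it, which is what makes alteration detectable.
            previous_hash = "0" * 64
            for entry in self.entries:
                if entry["previous_hash"] != previous_hash:
                    return False
                body = {k: v for k, v in entry.items() if k != "hash"}
                digest = hashlib.sha256(
                    json.dumps(body, sort_keys=True).encode()
                ).hexdigest()
                if digest != entry["hash"]:
                    return False
                previous_hash = entry["hash"]
            return True

    trail = AuditTrail()
    trail.record("assessor_42", "uploaded evidence for unit 3")
    trail.record("mentor_07", "signed off unit 3")
    print(trail.verify())  # True while the log is intact

The point of the chaining is that each entry’s hash covers the previous entry’s hash, so quietly editing one record invalidates everything recorded after it.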

This assertion of “cheat-proof” is most applicable to those systems that are designed to support rigorous assessment, where, although the portfolio belongs to the learner, they are not, for obvious reasons, in total control of what happens within it. The systems that I am referring to are those that are able to control who has permission to see and do what, depending on the group they belong to and the role they play.
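
As an illustration of the sort of permission model this implies, here is a short sketch assuming a simple role-per-group scheme; the role names, groups and actions are my own examples rather than any real system’s configuration:

    # Each role maps to the set of actions it may perform on a portfolio.
    ROLE_PERMISSIONS = {
        "learner": {"view_own", "add_evidence"},
        "assessor": {"view_group", "mark_evidence", "add_feedback"},
        "verifier": {"view_group", "sample_evidence", "sign_off"},
    }

    # The role each user plays within each group (e.g. a cohort or centre).
    GROUP_ROLES = {
        ("helen", "cohort_2014"): "learner",
        ("jon", "cohort_2014"): "assessor",
    }

    def is_permitted(user, group, action):
        """True if the user's role within the group grants the action."""
        role = GROUP_ROLES.get((user, group))
        if role is None:
            return False  # not a member of the group at all
        return action in ROLE_PERMISSIONS.get(role, set())

    print(is_permitted("helen", "cohort_2014", "add_evidence"))   # True
    print(is_permitted("helen", "cohort_2014", "mark_evidence"))  # False
    print(is_permitted("jon", "cohort_2014", "mark_evidence"))    # True

The key property is that the learner owns the portfolio but cannot grant themselves marking or sign-off rights; those actions belong to other roles within the group.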

The level of transparency that e-assessment provides goes well beyond this level of quality assurance. Electronic assessment brings decisions about assessment methodology to the surface, decisions which in traditional assessment systems often get hidden amongst stacks of paper. Computers are not very good at accommodating the common scenario in human judgement-based assessment of ‘this student looks like a fail to me, but maybe under some often ill-defined conditions the student could pass…’. In e-assessment, all rules of assessment need to be clearly defined and understood in order to work at all, and this is largely a benefit (although it might not always seem so).
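
A toy example makes the point: once the rule has to be executed by a machine, it must be written down exactly. The pass mark and competency requirement below are invented purely for illustration:

    PASS_MARK = 40  # hypothetical pass threshold, out of 100

    def outcome(score, all_competencies_met):
        # An explicit, auditable rule with no room for "ill-defined
        # conditions": a pass requires both the mark and the competencies.
        if score >= PASS_MARK and all_competencies_met:
            return "pass"
        return "fail"

    print(outcome(55, True))   # pass
    print(outcome(55, False))  # fail, whatever the mark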

The process of creating an e-assessment system often results in awkward questions being asked of the assessment methodology itself. This is often true, in my experience, within Higher Education, where assessment methodologies are not always ‘standardised’ across a department (never mind across the wider university). In one sense this is not surprising: a health course at a university often has to be validated by three separate bodies – the university itself, the relevant professional body and the regulator – and so standardisation across these three is likely to draw more attention than standardisation with other departments teaching disciplines that are seen as very different.

However, there is an increasing interest in standardising the processes of courses covering similar vocational subjects. This is driven, in part, by the desire for an electronic approach and the recognition that there is significant extra expense involved in trying to accommodate all the different requirements from individual courses, particularly where those requirements often appear not to be core to the assessment process. It is also driven by a demand from those being assessed, and their assessors, for processes that are clear.

When e-assessment was first introduced for vocational assessment, there was a sense that the early adopters were carried away by all the opportunities that it potentially offered, in terms of the way evidence was captured and the possibilities of using the portfolio beyond the course, etc. As a result, people often lost sight of the need to fully reflect the assessment journey. This is perhaps why, even though there have been lots of electronic assessment projects in HE, very few have secured widespread adoption or had a significant impact.

The real irony is that the very thing perceived to be the Achilles heel of e-assessment – namely, that by using computers the assessment will be less rigorous and open to cheating – turns out to be the opposite of the truth. Well-designed e-assessment systems actually offer universities and other similar institutions the opportunity to strengthen their quality assurance systems. In our experience with Southampton University, the pressure on mentors to make ‘quick assessments’ because the student must leave with their paper portfolio is replaced by an opportunity for the mentor to reflect, in their own time, on the judgements they need to make. From the tutors’ perspective, they are able to see at any time the progress their students are making, rather than waiting until the portfolio lands on their desk at the end of the placement, by which time it is often too late. For the student, the pain of carrying around a 300-page document is replaced by their tablet or mobile.

In this and in many ways e-assessment can be said to strengthen rather than undermine quality assurance.

Author: Chris Peat, Axia Interactive, http://www.axiainteractive.net/