eAA Newsletter Issue 10

Hello to all our members, and welcome to our new-look newsletter.
Our fresh look, combined with our new website, is built with our members in mind. I hope you like it. We have plenty to update you on following a productive last 12 months, with case studies, interviews and news, as well as an election to talk about – the eAA one, naturally! We have great examples of eAssessment practice from the following sectors – Higher Education, Medical, Oil & Gas and Financial Services. We have new Board members and a flagship Q & A with eAA member, Martin Mackain-Bremner of the Association of Accounting Technicians (AAT). As well as being an eAA sponsor, the AAT has been a pioneer in assessment and learning technology and wants to share its experience with the eAA membership.
We’d really appreciate your feedback on the newsletter’s new style, on the website and on what the eAA is doing. I hope you enjoy the newsletter, and I look forward to hearing your views.
Matt Wingfield, Chairman, eAssessment Association
Contents
- Board Election and AGM 2015
- Membership
- Finances
- Marketing
- New members of the eAA Board
- Support the eAA
- Case Study – Peer Review to increase university student engagement
- Public Relations – getting to those hard to reach places
- Q + A with Martin Mackain-Bremner
- Call for blog posts
- Centre Readiness Report
- Case Study – CIPFA global expansion needs flexible eAssessment solutions
- Case Study – Moving to eAssessment with a Royal Medical College
___________________________________________________________________________
Board Election and AGM 2015
The eAA holds an Annual General Meeting (AGM) every March, with hosting facilities kindly supplied by eAA Sponsor Assessment Tomorrow at the e-Assessment Question event.
On 25 March 2015, the AGM reviewed activities across membership, marketing and finance, and concluded with the eAA Board election results. It’s also an opportunity for eAA members to raise points, issues and questions about the eAA’s activities and future plans.
Membership
Our membership continues to grow with over 1,200 individual members and 19 corporate sponsors, who fund our activities. Thank you to our members for your support and also to our sponsors, for enabling us to fulfill our objectives.
Finances
The accounts and budgeting are managed by our Treasurer, Gavin Busuttil-Reynaud, of eAA sponsor AlphaPlus Consultancy. The eAA’s main budget lines are website maintenance and development, marketing/administration, and a presence at events such as the e-Assessment Question. We hold a nominal cash reserve ‘float’ for contingency. All expenditure is subject to controls as specified in the eAA Constitution. Any members who would like details can email Gavin directly.
Marketing
Since the last newsletter, we’ve completed a number of initiatives to help us achieve the eAA’s objectives and provide value to members.
1. A full rebuild of our website, together with our membership database, to increase accessibility and enhance our ability to provide you with content.
2. A revamped newsletter – please let us know what you think of the new design by email or on the Twitter feed.
3. A public relations campaign, which has resulted in eAA gaining coverage in the Guardian, TES, Training Journal and other trade press.
4. A ramping-up of our social media presence, with plenty of active discussions in our LinkedIn group and our Twitter feed.
At each AGM, a number of Board seats become vacant when a Board member completes their three-year term and/or stands down. For 2015, two seats from our institutional/corporate membership became available, as well as four seats for individual eAA members. We welcome the six new members of the eAA Board from 2015:
Heather Chesley (RM)
Helen Claydon (Standards and Testing Agency)
Linda Steedman (eCOM Scotland)
Jay Ashcroft (Learnmaker)
Tim Burnett (BTL)
Matt Wingfield (Digital Assess)
We also want to thank the following Board members, who stepped down during this term, for their efforts:
Grainne Hamilton, Brendon Kavanagh, Kenji Lamb, Denis Saunders, Keith Smyth and Matthew White.
Board member terms run for three years. We hold quarterly Board meetings, with further meetings for specific eAA projects and initiatives. If you’re interested in being elected to the Board next year, please get in touch via the website or by email.
Support the eAA
The eAA relies on its sponsors to help fulfill its mission and objectives. As a membership organisation, we are accountable to our members, so it’s important that we recognise and acknowledge how we are funded. Each January, our sponsors pay an annual subscription that supports the eAA and allows them to promote themselves and their services. The subscriptions help to pay for our presence at conferences, upkeep of the website, public relations services, marketing, administration and many other items.
Could your organisation support the eAA through sponsorship?
Register your interest on our website to discover the big strides the eAA has already made and how we can help your organisation within the eAssessment sector. Visit our website today for details of how to support the eAA.
Case Study – Peer Review to increase university student engagement

This study involves the University of Edinburgh and our sponsor, Digital Assess. The deployment of the CompareAssess product won an award at the 2014 e-Learning Awards.
Objective
The University of Edinburgh was seeking ways of improving student engagement, an area of weakness as measured by its Complete University Guide (CUG) score.
Situation
Digital Assess met the University of Edinburgh Academic Development team to focus on enhancing learner experience and providing richer means of student collaboration. The CompareAssess tool facilitates student peer review and feedback as part of formative assessment.
By acting as assessors of the work within CompareAssess, students gain insight into (and can leave feedback on) the work of their peers, enabling them to make value judgements about their own work in relation to others – inspiring self-improvement and critical judgement skills. Facilitating the capture and dissemination of meaningful and insightful peer feedback not only relieves pressure on tutors but also has a positive impact on attainment. The first school to implement CompareAssess was the School of Physics and Astronomy, which began a pilot in late 2012, headed by Dr Richard Blythe.
The ‘Science’ of Comparison
CompareAssess is based on the Law of Comparative Judgement (L. L. Thurstone, 1927) and was developed to deliver highly reliable summative assessment of essays and portfolios of evidence for exam accreditation bodies, within the context of GCSE and A Level qualifications.
The system presents assessors (‘judges’) with successive pairs of digitised work and asks them to make a professional judgement on which of each pair is ‘better’ at achieving a desired quality (usually holistic in nature, though it can be made up of one or more assessment criteria). A patent-pending adaptive algorithm reacts to the judges’ decisions, assigning each piece of work a value score known as its ‘parameter value’, and places every piece within a highly accurate scaled rank order from which grades can be calculated.
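For readers curious about the mechanics, the sketch below shows one standard way in which pairwise ‘better/worse’ judgements can be turned into a per-item score and a scaled rank order. It is not Digital Assess’s patent-pending adaptive algorithm (which also decides which pairs to present next); it is a simple Bradley-Terry-style iterative fit, offered purely to make the idea of a ‘parameter value’ concrete, and the function name and example data are illustrative.

```python
# Illustrative only: a basic Bradley-Terry-style fit of 'parameter values'
# from pairwise judgements. Not the CompareAssess adaptive algorithm.
from collections import defaultdict

def fit_parameter_values(judgements, n_items, iterations=100):
    """judgements: list of (winner, loser) item-index pairs recorded from the judges."""
    wins = defaultdict(int)          # total 'better' verdicts per item
    pair_counts = defaultdict(int)   # how often each unordered pair was compared
    for winner, loser in judgements:
        wins[winner] += 1
        pair_counts[frozenset((winner, loser))] += 1

    strength = [1.0] * n_items       # starting 'parameter value' for every item
    for _ in range(iterations):
        updated = []
        for i in range(n_items):
            denom = 0.0
            for j in range(n_items):
                if i != j:
                    n_ij = pair_counts[frozenset((i, j))]
                    if n_ij:
                        denom += n_ij / (strength[i] + strength[j])
            updated.append(wins[i] / denom if denom else strength[i])
        total = sum(updated)
        # rescale so values stay comparable; small floor avoids division by zero
        strength = [max(s * n_items / total, 1e-9) for s in updated]
    return strength

# Example: three pieces of work and a handful of judgements (winner, loser)
judgements = [(0, 1), (0, 2), (1, 2), (0, 1), (2, 1)]
values = fit_parameter_values(judgements, n_items=3)
rank_order = sorted(range(3), key=lambda i: values[i], reverse=True)
print(values)      # one 'parameter value' per piece of work
print(rank_order)  # scaled rank order, best first
```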
The Power of Feedback
In addition to deciding which of the two pieces of work is ‘better’, the judges are also asked to leave feedback on each piece, noting its qualities or areas for improvement.
It was this feedback mechanism that was the focus of the University of Edinburgh project: the students themselves acted as judges of the anonymised work of their peers, making judgements about the quality of the work and, most importantly, leaving written feedback on how it could be improved.
The system captures all student feedback for each piece of work, and this is fed back to the appropriate student once the judgements have been completed.
Outcome
The approach proved very popular with both the tutors and students, and was deemed to be very successful by all involved in the project.
From the tutor perspective, the system removed the need to review and give feedback on each student’s plan, freeing time for other duties. Students needed some guidance on the sort of feedback to leave, but were generally found to be capable of giving constructive feedback. The tutors believed that students would naturally be exposed to examples of other students’ work that was either better or worse than their own, so the approach naturally supported formative assessment and had a very positive influence on students’ perceptions of the quality of their own work.
The students were positive about the experience. They saw how the approach gave them a transparent view not only of what represented a good or a bad piece of work, but also of what they were ultimately going to be assessed on – the qualities the tutors were looking for in their work – more easily than if the tutors had tried to explain this verbally. They acknowledged that CompareAssess was beneficial not only because they received multiple sets of feedback (rather than feedback from the tutor alone) but also because it had a positive effect on the way they then tackled their essays, in some cases improving the quality of the final piece.
Impact
In using CompareAssess to support peer feedback, the University of Edinburgh has seen an improvement in its CUG student rating. The University remains convinced of its benefits and has since begun using CompareAssess to facilitate and support student feedback and peer evaluation in areas of study as diverse as Maths, Biological Sciences, Veterinary Medicine and Business Studies.
The University has acknowledged that the system could be rolled out to deliver greatly enhanced student feedback across the institution without a direct impact on tutor workload. The University of Edinburgh also chose CompareAssess to support the assessment of its Edinburgh Award, a co-curricular course that specifically targets ‘employability’ skills.
CompareAssess is now being deployed by the university to manage both peer review and final course assessment of the Edinburgh Award, offering a highly reliable, self-managing and completely scalable solution for the assessment of this innovative programme.
Public Relations – getting to those hard to reach places
A big part of our 2014 – 2015 activity has been generating public relations and lobbying coverage.
Aided by Galibier PR, the eAA has gained significant coverage in mainstream and trade press.
The eAA Board, with help from some eAA members, has given valuable time to craft articles, features and interviews that have made the pages of the Times Educational Supplement, The Guardian, Training Journal and the specialist Gas International magazine.
We’ve also met Parliamentarians such as Simon Wright, to increase awareness of the good practice of the eAA’s members, as well as offering our opinions on how technology can positively impact issues within education.
The eAA is also represented on the eAssessment Research and Development Forum for General Qualifications, as we seek to impart the eAA’s expertise and experience to regulators.
If you would like to get involved with the eAA’s PR activities, such as writing an article response, please contact us.
Q + A with Martin Mackain-Bremner 
The Association of Accounting Technicians (AAT) has made big strides by deploying eAssessment across its qualifications. The eAA caught up with the AAT’s Head of Assessment, Martin Mackain-Bremner, for a Q&A to share the AAT’s story and tackle questions on the bigger issues for the sector.
Q: When you first joined AAT, what were the immediate eAssessment challenges you faced?
MMB: I joined AAT as Head of Assessments in June 2010, just as we were going live with Computer Based Assessments (CBA) in place of paper-based exams. The immediate challenge was to provide support to the 450 or so training providers (there are now more than 600) who were delivering our qualifications and taking on the challenge of online assessment at their centres, so that we could respond effectively to any ‘teething problems’.
The next immediate challenge was to upgrade our eAssessment systems to meet increasing demand from students and training providers as they became more familiar with the interfaces, functionality and potential of the systems. We have found that membership of organisations such as the eAssessment Association, the Federation of Awarding Bodies and the Association of Colleges enables us both to share with others our experiences of transition and deployment of such technologies, and to learn from them in return.
Q: How does eAssessment enable AAT to meet its growth ambitions and commitment to quality?
MMB: Growth: AAT’s stated mission is ‘to ensure that AAT reaches the heart of every business’. Growth, through the acquisition, progression and retention of students, training providers and qualified members, enables AAT to achieve its ambition, whether in the UK, Europe or elsewhere in the world. Growth is also about the personal and professional development of students and members and the organisations within which they work or will work. In short, AAT enables individuals and businesses to achieve their potential in the world of business and finance.
eAssessment plays a key part on the pathway to qualification, by providing flexibility in terms of where, when and how assessments are sat, as well as delivering rapid turnaround of results. In addition, the detailed data arising from the taking of assessments and progression to a qualification enables AAT to focus on both the strong and less strong aspects of assessments and qualifications, and to deliver a flexible, responsive and high-quality portfolio of products and services.
Quality: AAT stakeholders, such as students, training providers and employers, as well as regulators such as Ofqual and SQA, expect and demand high quality products and services. Quality assurance and control are key aspects of the development, delivery, maintenance and management of these products and services, and the eAssessment system used by AAT enables effective and efficient workflow as well as detailed analysis arising from the performance of students, training providers and the assessments themselves, down to the level of individual items (questions).
Over time, this data allows for the identification of trends and risks, and enables pre-emptive action ahead of issues arising.
Q: Given AAT have fundamentally shifted to on-screen assessment, has this shift resulted in a significant change in resource allocation focus?
MMB: The most significant changes in resource allocation have been to the development and management of the systems used to develop, deliver and manage online assessments. This has included the management of our relationship with key suppliers (such as BTL and Kineo, for the provision of eAssessment delivery platforms and services, and Linney Group, for the printing and distribution of certificates) and robust and effective project management of new features and releases.
In addition, attention has been focussed on the integration of the eAssessment system with AAT’s other key business systems: CRM, for qualification, student, training provider and other information; web services, for communicating with stakeholders and providing access to study support and other services; and AX, for financial management.
Resources have also gone into developing statistical analysis and reporting around the behaviour of both the assessments (success rates, individual items and whole qualifications) and the centres and cohorts of students taking them. This allows us to forecast the number of assessments taken throughout a year, as well as to identify and take pre-emptive action in areas of assessments that are not performing as well as they should.
Q: Are you surprised that awarding organisations with a similar mandate to AAT’s have not adopted eAssessment with the same level of commitment and investment? Or do you see similarities in their approach, deployment and success?
MMB: We have been approached by a number of organisations keen to understand what we have learned from our experiences, and a number of these have subsequently gone or are going online. There is no denying that the levels of commitment and investment are very significant, and may be enough to deter some smaller organisations from adopting eAssessment, but equally suppliers such as BTL are making available ‘lighter’ options around SaaS (software as a service).
Equally, some methods of online assessment, such as multiple choice questions (MCQ), are perceived as less demanding and rigorous than paper-based approaches, despite the research demonstrating the efficacy and effectiveness of MCQs, when well-written.
AAT has adopted an approach of matching the best method of assessment to the learning outcomes being tested, and this has included the use of written-answer questions to test communication skills as well as more ‘typical’ accounting knowledge and skills.
Q: 2015 marks ten years since the former head of the QCA, Ken Boston, made his epochal speech on eAssessment, calling exam development and delivery a ‘cottage industry’. Given that standards and quality appear not to have been impaired by eAssessment in that time, do you believe that the improvements in the last ten years have been acknowledged properly?
MMB: It is very clear that assessment development and delivery simply cannot be, and for us has never been, a ‘cottage industry’. Firstly, Awarding Organisations such as AAT are required by regulators (such as Ofqual and SQA) to deliver high quality products and services that are underpinned by robust and transparent policies, procedures and practices. Secondly, training providers, students and employers quite properly have expectations around the value and quality of the qualifications students undertake and the services around those qualifications. This means that we have to build, maintain and progress the capability and capacity to deliver and sustain high quality outputs, including assessments that are, in the words of the Ofqual Conditions of Recognition, valid, reliable, authentic, sufficient and comparable. All of this demands a highly professional approach in all aspects of assessments, not a ‘cottage industry’!
There are occasions on which we are challenged, especially by employers and practising accountants who experienced the paper-based assessment format, to demonstrate the effectiveness of eAssessment, but this is a decreasing occurrence as eAssessment is taken up more widely and shown to be as effective as other methods of testing.
There cannot be many accountants that do not carry out business every day using computers, and eAssessment does reflect these workplace practices, helping to prepare students better to move into the workplace.
Q: As exam validity, fairness and reliability and quality are scrutinised through eAssessment, could the sector do a better job in ‘showing and telling’ this success story to the mainstream media?
MMB: Yes, probably. There are two sides to this coin:
- To describe how eAssessment works in terms of validity and reliability, using research and empirical evidence to demonstrate the effectiveness of well-developed assessments – this builds public confidence in eAssessment as a rigorous and suitable form of testing.
- To describe how eAssessment affords increased flexibility in terms of when and where tests may be taken, how quickly results can be produced and disseminated, and how the performance of the assessments, the candidates and the training centres can be monitored through detailed analysis.
Q: What are the consequences of awarding organisations and other sectors not continuing to innovate and invest in eAssessment, even after successful deployment?
MMB: eAssessment is here to stay, and take-up will only grow, not shrink. New developments in functionality will help drive acceptance and deployment. Organisations that do not innovate and invest will become stagnant and increasingly irrelevant. That said, innovation must be well managed, so that change is not an end in itself but remains a means to an end. Technology is an enabler, not a total solution, and must be applied intelligently, to complement and support non-technology areas of teaching, learning and testing.
Organisations, conferences and ‘ideas exchange’ forums such as ATP and EATP, eAssessment Question, eAssessment Association, Federation of Awarding Bodies and Transforming Assessment offer valuable opportunities to share, investigate and discuss experiences, approaches and options around adopting and developing eAssessment capabilities, for both intending or new adopters and seasoned practitioners.
Q: As exam and item writing are becoming more visible and professionalised pursuits, in what ways do you consider that eAssessment technology has benefited these critical areas?
MMB: The principal benefit arising from the use of eAssessment technology in exam and item writing is the opportunity for distributed, collaborative working within defined workflow structures. This leads to opportunities for rapid development of test items, leading to the associated benefit of large banks of questions for assessments, allowing for increased security through randomisation of questions for assessment sessions.
As part of the process, eAssessment technology allows for effective quality control through distributed scrutiny of parts of complete assessment ‘papers’ before launch, and for subsequent feedback to writing teams arising from analysis of the performance of individual items or whole assessments.
Q: Have exam development teams shed their ‘cottage industry and volunteering’ image? Do you believe the increasing role of statistics and data analysis (or psychometrics) is playing a part in changing perceptions?
MMB: There is no room for ‘cottage industry and volunteering’ within AAT’s systems – regulators, training providers and employers, as well as students, rightly demand a highly professional approach to assessment development, with robust and effective quality assurance and control measures.
Data analysis and statistics are playing an increasingly important part in the development and delivery of assessments, as organisations become more aware of the powerful opportunities afforded by technology to track performance of all aspects of assessments, from user ‘profiling’ (such as student age groups and motivation, training provider status (private or public sector) and time taken to complete a qualification) to individual items (eg facility values, frequency of exposure).
This places additional demand on organisations to train and maintain staff in the skills and techniques associated with assessment design (eg Angoff, Beuk and Hofstee methods for setting a passing score) and reliability and validity analysis (see Ofqual report on the validation of vocational qualifications).
Data analysis and statistics are also key aspects of business information/intelligence to inform decision-making, as well as determining achievement of business objectives and for forecasting income and expenditure.
Who’s next for the eAA Q + A?
Would you like to participate in a future Q + A for the eAA newsletter? We always need examples of good eAssessment practice from practitioners, awarding organisations, question writers, verifiers and others. In fact, as long as you are an eAA member, your contribution is warmly welcomed.
Get in touch by email to contribute to a future eAA Q + A.
Call for blog posts
On our website and via the LinkedIn group, we call on members to contribute a short blog post on an eAssessment topic.
We successfully ran a series of postings last year on a number of topics including accessibility, e-testing standards, adaptive testing and mobile assessment.
It’s a really easy way to give an opinion piece, or to shine a light on an unheralded area of eAssessment. We need postings of between 500 and 1,500 words that can be attributed to an eAA member.
Interested? Send your posting to John Winkley (that’s him in the picture) by email.
Centre Readiness Report
A key research study conducted in 2014 by SQA, CCEA and the Welsh Government (and published recently) focuses on how ready centres such as schools and colleges are to deliver accredited General Qualifications by eAssessment.
Many commentators in our sector consider General Qualifications to be the ‘final frontier’ for summative, high-stakes, eAssessment. A short, independent, review of the research findings was published by the eAA’s Vice-Chairman, Geoff Chapman. A copy of the research report can be downloaded from the eAA website.
Case Study – CIPFA global expansion needs flexible eAssessment solutions
The Chartered Institute of Public Finance and Accountancy (CIPFA) and our sponsor Calibrand have partnered to deliver complex exams in challenging places for the Institute’s new-look Professional Qualification (PQ). Aga Jop, Head of Student Services, gave us some insight into the rationale:
“The project is taking place over a 30-month period from inception, through selection of partners, to full delivery, with various milestones along the way. Certainly the project is ambitious: many eAssessment qualifications are delivered using some of the following functions – online authoring, multiple choice, short answer, essays and case studies, online marking and online invigilation of examinations. This is one of a few, but an increasing number of, projects that will do all of these.”
CIPFA is the professional body for people in Public Finance with its 14,000 members working throughout public services, national audit offices, major accountancy firms and other bodies managing public funds. In all there are 12 exams to complete from Certificate through Diploma to Strategic qualifications.
The Professional Certificate is the entry route into accountancy. No previous experience or knowledge is necessary. Materials and online courses are designed to teach students all they need to know. In fact, more than half the students come from non-relevant backgrounds. In addition to passing exams, a professional experience portfolio must be completed in order to apply for full Membership of the Institute.
“CIPFA, like many similar bodies, understands the importance of effectively identifying and managing risks. In taking up this project we were encouraged by the ‘pull factors’, such as the comfort of Generations X and Y in using technology, academic engagement and the fact that the technology is available,” continues Aga. “For us, though, it was the ‘push factors’ that made us take action: the fact that we are increasing our delivery globally, the fact that we have to operate in war zones, and the risk to our business of not adopting eAssessment. We deliver exams on behalf of the UN, and they operate in some difficult places. Delivering exams in those particular circumstances wasn’t easy. The point is we needed flexible solutions.”
After a very rigorous procurement process and checking of capability, CIPFA chose Calibrand as their official eAssessment partner. Dr Adrian Pulham, CIPFA Director of Education and Membership comments,
“We chose Calibrand as they provided the best overall solution to the challenges CIPFA faces as it expands globally. We are confident that they will deliver an excellent service to CIPFA and those studying with us. We are particularly confident in their delivery as their marking, essay and case-study examination solutions fitted very closely with our existing methodology.”
Aga expanded on this a little further,
“Calibrand are able to deliver all our exam types including complex high level questions and online marking functionality. They were able to seamlessly integrate with our remote invigilation partners and with other CIPFA systems”.
CIPFA sees one of the biggest challenges as educating and communicating with its stakeholders – examiners, tutors, students, authors and administrators – and, of course, the governance of all this.
“It’s ground-breaking thinking for us,” concludes Aga. “CIPFA has undergone a revolution in thinking which allows an evolution in exam delivery.”
For further information on the project, please contact Aga Jop, Head of Student Services, CIPFA. For information on Calibrand solutions please contact Tim Winfield, Business Development Manager, Calibrand the currency for talent®.
Get in touch – help the eAA
Membership is the lifeblood of the eAA. Your time, engagement and promotion of eAssessment and assessment technology have tangible effects on policy, project deployments and new initiatives and, most importantly, provide learners with better assessment of their knowledge, skills and capabilities.
Would you like to help the eAA with a project? We’d like to hear from members who can volunteer to assist the Board in our current plans, or who would like to offer an initiative that will benefit our membership.
Get in touch, we’d love to hear from you. You can reach us via the website, email or by the Twitter feed.
Case Study – Moving to eAssessment with a Royal Medical College
This case study from our sponsor BTL looks at a deployment which involves transitioning from paper-based assessment to eAssessment for three high-stakes medical theory examinations. The Royal College of Paediatrics and Child Health (RCPCH), through its mission to transform child health through knowledge, innovation and expertise, is responsible for training and examining all paediatricians in the United Kingdom.
Over the last 24 months, RCPCH has deployed Surpass (BTL’s summative assessment platform), which has enabled a smooth transition from conventional paper-based testing to an innovative computer-based approach, making it a pioneer within the Royal College medical group. The College has also made use of BTL’s test centre management services.
The secure testing environment and scalability of the platform mean that testing the doctors of tomorrow can be improved to achieve higher standards, without the worry of disruption. In a recent trial, over 70% of candidates said they preferred the on-screen testing environment. Online distribution of assessment materials not only meets the highest security standards, but also removes the pain and cost of traditional test distribution and marking, and the hassle of test centre management. Real-time remote monitoring of test progression and marking means that assessors can quickly and easily analyse performance. Candidates also benefit from instant feedback, giving them a thorough understanding of how their examinations will be marked, which is much more reliable than showing them past papers.
“What would I say to other Royal Colleges…? It (Surpass) is going to provide a better exam for your candidates. It’s going to be more ‘Real World’. You will have an exam that replicates what you are trying to test. We’re improving something that’s already recognised, globally, as being a world quality exam, and the outcome from that is a better outcome for children.” Dr Will Carroll, Chief for Theory Examinations, RCPCH
Next Issue
If you have a case study, article or news that you’d like to submit for the next eAA newsletter, please get in touch. We welcome contributions from all areas and sectors, especially those which appeal to our membership.
Send your contributions or ideas for the next eAA newsletter to us by email.