Computer Adaptive Testing
I first began working in online learning and assessment back in 2000. At the time I was a Content Manager at a start-up in Rio de Janeiro, Brazil. The NASDAQ bubble was still to burst, 9/11 hadn't happened; there was a brave new world feeling. We felt we were on the cusp of something that was about to radically change teaching, learning and assessment as we had always known them. Exciting times indeed!
It was at this time that I first came across computer adaptive testing. The company I worked for had been in conversation with a beverage giant, Ambev, who needed to make sure that candidates for their MBA programme in the US were sufficiently proficient in English to tackle the programme. Their President had witnessed first hand that the previous intake could not follow lectures (in some cases resorting to recording them), much less engage in in-depth discussions with their peers.
The task was complicated by the fact that their cohort was spread out across much of Latin America, with head office not entirely sure which country the candidates would be in at any given time. The final element of the challenge was a rapidly diminishing window in which to select the candidates.
Much of the English language testing that I was familiar with up to that point was divided into levels: Cambridge Key English Test through to Proficiency, and so on. That is great for testing attainment when you know which level to pitch the candidate at. We, however, had a cohort of completely unknown proficiency levels. What we needed was an adaptive test: candidates would all start at a common point and then, depending on their performance in the first few items, be fed more difficult or easier questions. I had heard of this being done on paper, but it is really messy; with online testing it is much easier. As each item or group of items is completed and automatically marked, performance is logged and the next item or group of items is selected based on how well the candidate has done. This adaptive approach has the added benefit of not demotivating candidates by consistently giving them questions which are too difficult or too easy.
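The selection loop described above can be sketched in a few lines of code. This is a minimal illustration only, not the system we built: the item bank, the three difficulty levels, and the one-step up/down rule are all assumptions made for the example.

```python
import random

# Hypothetical item bank: items grouped by difficulty level (1 = easiest).
ITEM_BANK = {
    1: ["easy-q1", "easy-q2", "easy-q3"],
    2: ["mid-q1", "mid-q2", "mid-q3"],
    3: ["hard-q1", "hard-q2", "hard-q3"],
}

def run_adaptive_test(answer_fn, num_items=5, start_level=2):
    """Run a simple adaptive test.

    answer_fn(item) should return True if the candidate answers the
    item correctly. Returns a history of (item, level, correct) tuples.
    """
    level = start_level
    history = []
    for _ in range(num_items):
        item = random.choice(ITEM_BANK[level])
        correct = answer_fn(item)
        history.append((item, level, correct))
        # A correct answer moves the candidate up one difficulty level,
        # an incorrect answer moves them down, clamped to the bank's range.
        level = min(max(level + (1 if correct else -1), 1), max(ITEM_BANK))
    return history

# Example: a candidate who answers everything except the hard items
# correctly climbs to level 3 and then oscillates between 2 and 3.
history = run_adaptive_test(lambda item: not item.startswith("hard"))
```

A real implementation would typically use item response theory to estimate ability rather than a fixed up/down step, but the core idea is the same: each response updates the estimate, which in turn drives the next item selection.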
The upshot of this was that we managed to solve our customer's business needs in a most satisfactory way: within minutes of the final candidate completing the final module, the results were available to the client via the web. Computer adaptive testing can be applied in many ways. Eugene Burke gave a great talk on the work his company SHL did for HSBC and their recruitment programme at the eATP conference in Brussels in 2009.
The eAA is really keen to share good practice and experience in the eAssessment field. If you have had experience, good, bad or indifferent we would love to hear about it and why you chose the CAT route.
Author: Matthew White, International Baccalaureate
The personal views expressed in this blog are those of the eAA Board member, and not of their employer.