the e-Assessment Association

Blog post: English Language Testing – Shaping the e-Assessment Sector

In this blog, Geoff Chapman, Management and Business Development Consultant, former Vice-Chairman of the e-Assessment Association and co-founder of trade publication World Exam Tech, looks at the use of ELT.

English language testing (ELT) is rapidly growing, driven by learners developing language skills for work, entering higher education, seeking settlement, and learning for pleasure. Disruption is also under way, with exam owners attempting innovative e-Assessment delivery and creating new distribution channels.

ELT for higher education is widespread, notably where institutions need to know whether an international student has the language abilities necessary to succeed.


Q: Which Other Sectors Use ELT?

A: Here is a sample of five sectors that actively use ELT:

  1. Aviation

English is aviation’s international language. The International Civil Aviation Organization (ICAO) requires all pilots and air traffic controllers to be competent in spoken and written English. Aviation has distinctive domain-specific English requirements, including key words such as ‘roger’. Pilots have their licences endorsed with their English language capability.

The ICAO’s Aviation English Language Test Service measures the speaking and listening ability of pilots and controllers. For UK pilots, there are five main English exams which are part of the ICAO’s programme: Anglo-Continental’s TEAP; The ELPAC ‘Level 6 test’; RMIT REALTA; The Mayflower College TEA Test, and the Versant for Aviation test.

  2. Military

NATO’s Standardization Agreement deals with languages taught and spoken by NATO members. Language proficiency skills across Listening Comprehension, Speaking, Reading Comprehension, and Writing are codified into six levels, from Nil Proficiency to Highly-Articulate Native. 
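To make the codification concrete, here is a minimal sketch (in Python) of how a six-level, four-skill proficiency profile might be represented in a results system. The level descriptors shown are the commonly cited STANAG 6001 labels, and the data structures are illustrative assumptions rather than anything specified by the agreement itself.

```python
# Illustrative sketch only: modelling a six-level proficiency scale across the
# four skills. Level descriptors follow the commonly cited STANAG 6001 labels;
# the record layout is an assumption for illustration.
from dataclasses import dataclass

LEVELS = {
    0: "No proficiency",
    1: "Survival",
    2: "Functional",
    3: "Professional",
    4: "Expert",
    5: "Highly-articulate native",
}

@dataclass
class LanguageProfile:
    listening: int
    speaking: int
    reading: int
    writing: int

    def describe(self) -> str:
        skills = ("listening", "speaking", "reading", "writing")
        return ", ".join(f"{s}: {LEVELS[getattr(self, s)]}" for s in skills)

# Example: a candidate assessed at level 3 in every skill.
profile = LanguageProfile(listening=3, speaking=3, reading=3, writing=3)
print(profile.describe())
```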

For example, the Benchmark Advisory Test (BAT) is for NATO non-native English-speaking military and civilian personnel. The BAT has a computer-adaptive written part with 60 MCQs, delivered within a two-hour test window. A speaking part, conducted over the phone and administered by a test centre proctor, runs within a 40-minute test window.
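For readers interested in the mechanics, the sketch below illustrates the general idea behind a computer-adaptive MCQ section: after each response the running ability estimate moves up or down, and the next item is chosen close to that estimate. The item bank, the crude up/down update and the fixed item count are simplified assumptions for illustration only, not the BAT’s actual algorithm.

```python
# Minimal sketch of adaptive MCQ item selection (not any real test's algorithm).
import random

ITEM_BANK = [
    {"id": i, "difficulty": d}
    for i, d in enumerate([-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0])
]

def next_item(ability, used_ids):
    """Pick the unused item whose difficulty is closest to the current estimate."""
    candidates = [item for item in ITEM_BANK if item["id"] not in used_ids]
    return min(candidates, key=lambda item: abs(item["difficulty"] - ability))

def run_adaptive_section(answer_item, max_items=5, step=0.5):
    """Administer items one at a time, nudging the ability estimate after each."""
    ability, used = 0.0, set()
    for _ in range(max_items):
        item = next_item(ability, used)
        used.add(item["id"])
        correct = answer_item(item)            # candidate answers the MCQ
        ability += step if correct else -step  # simple up/down ability update
    return ability

# Simulated candidate: more likely to answer easier items correctly.
estimate = run_adaptive_section(lambda item: random.gauss(0.8, 0.5) > item["difficulty"])
print(f"Final ability estimate: {estimate:+.1f}")
```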

Most US federal government agencies rely on the Defense Language Proficiency Test and the Oral Proficiency Interview, delivered via the Defense Language Institute, the language-training wing of the US military. The UK Defence Centre for Languages and Culture (DCLC), based within the Defence Academy at Shrivenham, delivers language training for military personnel through two in-house language delivery functions: the Foreign Language Wing (FLW) and the English Language Wing (ELW).

  3. Medical & Healthcare

This sector has hitherto been the domain of generalist tests. With Covid-19, many sectors requiring rapid staff on-boarding and deployment have explored alternatives.

For example, the UK’s Nursing and Midwifery Council and General Medical Council now recognise the Occupational English Test (OET), which is marketed as a specific test of medical English.

  4. Contact Centre

Corporate employers are frustrated with employees who have good TOEIC/IELTS scores but underperform in the workplace. The need to assess second-language English-speaking contact centre agents is relatively new.

Assessing spoken language performance in specific customer service contexts requires a good understanding of that context and the target texts within it, and a reliance on subject matter expert (SME) input.

This sector tends to use non-proctored, low price point tests, delivered in the workplace.

  5. Legal English

The Legal English language exam sector has two broad areas: assessment for lawyers (practising in a setting where they need a second language), and court interpreters, who are translating and interpreting for speakers of languages other than those used by the courts and police services of the country concerned.

Demand from England-based law firms has seen the development of the Test of Legal English Skills (TOLES) exam suite. Interestingly, the UK Solicitors Regulation Authority does not yet require candidates to provide separate evidence of English language skills.

Q: Clearly there are examples of malpractice within assessment practice, whether this is digitally enabled or not. Do you think the risk of this should hinder innovation within the sector?

A: There are different risk and resource profiles for traditional paper delivery versus on-screen delivery. Some exam owners believe the challenge is how to deliver a standardised assessment in geographies whose cultural norms around exam coaching and exam delivery differ from those of Western societies. Having said that, there is no doubt that Western societies are also affected.

Many solution providers and commentators run the risk of continually flagging malpractice without providing solutions, many of which rely on common consumer-grade technology layers. The fear is that exam owners become numb to the constant ‘noise’ of malpractice, accepting it as an occupational hazard.

Anyone with an interest in malpractice should pre-order Professor Phillip Dawson’s book “Defending Assessment Security in a Digital World” to comprehend the extent of the problem(s), and the proactive, principles-based approaches available.

Q: Do you see any innovations within ELT coming down the line that will help mitigate some of the risks around the examples of malpractice that you highlight?

A: Using data analytics and machine learning to gain a better understanding of instructor, centre, candidate and invigilator performance is important, but not yet mainstream. Learning and test centre risk-rating isn’t new, but the ability to ‘intervene-and-act’ on forward-fraud indicators will become mainstream.
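As a rough sketch of what acting on forward-fraud indicators could look like, the snippet below combines a handful of hypothetical centre-level indicators into a weighted risk score and flags centres that exceed a review threshold. The indicator names, weights and threshold are illustrative assumptions, not any exam owner’s or provider’s actual model.

```python
# Illustrative sketch of test-centre risk-rating. All indicator names, weights
# and thresholds are hypothetical; each indicator is a value between 0 and 1.
WEIGHTS = {
    "score_anomaly":        0.4,  # unusual score distributions vs. norms
    "retake_rate":          0.2,  # unusually high proportion of rapid retakes
    "session_irregularity": 0.2,  # proctoring/session log irregularities
    "identity_mismatch":    0.2,  # ID-check or biometric mismatch rate
}

def risk_score(indicators: dict) -> float:
    """Weighted sum of indicator values; higher means riskier."""
    return sum(WEIGHTS[name] * indicators.get(name, 0.0) for name in WEIGHTS)

def centres_to_review(centres: dict, threshold: float = 0.5) -> list:
    """Return centre IDs whose risk score exceeds the review threshold."""
    return [cid for cid, ind in centres.items() if risk_score(ind) > threshold]

centres = {
    "centre_A": {"score_anomaly": 0.9, "retake_rate": 0.6, "session_irregularity": 0.4},
    "centre_B": {"score_anomaly": 0.1, "retake_rate": 0.2, "identity_mismatch": 0.1},
}
print(centres_to_review(centres))  # -> ['centre_A']
```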

However, exam owners and solution providers are not in the business of punishing learners, but of encouraging and nurturing positive, ethical and community/cohort-driven behaviours. The e-Assessment toolkit has a big impact in this field.

Q: Do you have anything to add on testing of other languages? For example, as China continues to grow in economic power, are we likely to see growth in Chinese language testing (CLT)?

A: From Afrikaans to Zulu, I track 107 languages that have proficiency exams. For example, Chinese language testing is state-controlled through the Ministry of Education’s agency, the Hanban. Non-native Mandarin learners take the Hanyu Shuiping Kaoshi (HSK) exams to prove their language proficiency, mainly through Confucius Institutes. In the UK, the University of Manchester and London South Bank University have Confucius Institute branches.

The combined volume for the HSK, HSKK (spoken Chinese), YCT (youth learners) and the Business Chinese Test is around 660k per annum. http://www.chinesetest.cn/index.do

The HSK suite is delivered on-screen – the Business Test is adaptive. High volume countries for Chinese language testing are South Korea, Thailand, Japan, the Philippines, Indonesia, Vietnam, Pakistan, the US, Russia and Kyrgyzstan.

Arabic, French, German, Italian, and Spanish are among other languages that have multiple tests and test owners.
