Friday 24 May 2019

TOEIC English Language Test and Home Office flaky data




The National Audit Office has just published its Investigation into The Response to Cheating in English Language Tests. The report criticises the Home Office for failing to protect students wrongly accused of cheating in an English language test that they had to sit as part of a visa application process.

Approximately 2,500 students have been forcibly removed from the United Kingdom after being accused of cheating, and another 7,200 left the country after being warned that they faced detention and removal if they remained. So far 12,500 appeals have been heard in the courts, of which 3,600 have been won.

The test was known as TOEIC (Test of English for International Communication) and consisted of written and verbal elements. The aim was to ensure that overseas students applying to study in the UK had the required level of fluency in English. The Home Office contracted the task to a US operation called ETS. ETS in turn contracted with a number of UK language schools to act as test centres. The tests themselves were mediated by computer and the results were assessed back in the United States.
The Home Office received a stream of data from ETS, which it then converted into a "look-up tool"; officials used this in determining whether tests had been passed and whether there had been cheating.
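To make that data flow concrete, here is a minimal sketch of what such a look-up tool amounts to. It is illustrative only: the field names (certificate number, classification and so on) are my assumptions, not the actual ETS schema, and the real tool was built by the Home Office from whatever files ETS supplied.

```python
import csv

def build_lookup(path):
    """Load an ETS-style results file into a dictionary keyed on certificate number.

    Illustrative sketch only: the column names are assumptions, not the real schema.
    """
    lookup = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            lookup[row["certificate_number"]] = {
                "name": row["name"],
                "test_centre": row["test_centre"],
                "test_date": row["test_date"],
                # ETS's verdict on the test, which officials took at face value
                "classification": row["classification"],
            }
    return lookup

def check_applicant(lookup, certificate_number):
    """Return the ETS record quoted on a visa application, or None if absent."""
    return lookup.get(certificate_number)
```

The obvious point is that everything downstream, including removal decisions, depended on the accuracy and completeness of whatever sat behind that single look-up.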

There is no doubt that cheating occurred; indeed, a BBC Panorama programme in February 2014 showed it taking place as students were prompted to input "correct" answers. Large-scale frauds were being carried out by some of the test centres, presumably in return for covert fees. But large numbers of students taking the test were wholly capable of passing it unaided. Everything depended, therefore, on the reliability of the data that ETS supplied to the Home Office.

The full NAO report describes the many obstructions that were put in the way of those who wished to appeal and how many applicants were detained or removed without having any significant opportunity to appeal.

In fact by 2016, if not before, it was clear that the ETS data was insufficiently reliable for the Home Office to be taking the actions that it did.

One element of the testing was about verbal skills, and it was being suggested that large numbers of the voice files thereby collected were not of the actual applicant but of some proxy. Specialists in forensic voice analysis said that the tests used by ETS to link a voice file to a real person could not function reliably at the volume of files that had to be examined. The point was not that the tests were unreliable in absolute terms, but that they could not safely be relied on as the sole basis for important decision-making.
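The scale problem can be illustrated with some simple, entirely hypothetical arithmetic: even a voice-matching method with a modest error rate produces a large absolute number of wrong results when applied to tens of thousands of files. The figures below are assumptions chosen for illustration, not ETS's actual error rates or volumes.

```python
# Hypothetical figures, chosen only to illustrate the effect of scale.
files_examined = 50_000    # assumed number of voice files checked
false_match_rate = 0.01    # assumed 1% chance of wrongly flagging a genuine speaker

expected_wrong_flags = files_examined * false_match_rate
print(f"Expected wrongly flagged candidates: {expected_wrong_flags:.0f}")
# With these assumptions, around 500 genuine candidates would be wrongly
# flagged as having used a proxy, which is why such a test cannot safely be
# the sole basis for detention or removal decisions.
```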

But there was a greater problem with the ETS data, and this is where I became involved. I was asked by solicitors to look at the entire testing procedure in the light of the number of obvious anomalies in the results. How was it possible that large numbers of people who could obviously converse fluently in English were being failed? Moreover, what was the explanation when individuals were recorded as having taken a test at a particular time yet had clear alibi evidence showing they were elsewhere?

By the time I was instructed the test centres which were suspected of acting fraudulently had been closed down and their records and computer systems had more or less vanished. Some of the principals were facing criminal charges.

What I wanted to do was to understand the procedures by which students were registered for tests, what happened when they attended for tests, what sort of computer records were created during the tests, and how that data was sent to ETS. After considerable effort by solicitors it was possible to obtain some of the operational manuals. Few of these documents had any dates associated with them, and we came to understand that some of the processes changed over time. I am of course no expert in English language testing; what I was interested in was the step-by-step processes and the controls against cheating. ETS were concerned about their own reputation, and in any event it turned out that some of their computer processes were outsourced to a third party.

The ETS arrangements anticipated that individual students might cheat but had not really thought through the possibility that much of the cheating would be carried out by the test centres. In my detailed report I looked at a variety of means by which such cheating could be enabled. It was theoretically possible that data files could be directly manipulated, but one also had to acknowledge that this would require relatively high levels of computer skill. It had been seen in other investigations of test centres that use was made of remote control software, so that while a student sat at a computer terminal, that terminal would in fact be controlled by someone else using specialist software such as TeamViewer.

But one very likely method of cheating operated at the stages where students initially registered and then later presented themselves for a test. A further strong possibility was the test centre delaying the sending of test results to ETS in the US and using the delay to substitute faked results. This second method had been highlighted by one of the ETS staff but apparently no follow-up action occurred.

Alas, the difficulty of getting hold of accurate and complete records, either from the test centres or from ETS, meant that one could not identify a definitive fraudulent method. What was also interesting was the difficulty of carrying out any cross-checks on the data: were the test results correctly matched up to an authentic, properly identified applicant? Applicants had a registration number, but each individual test (each student took several) had its own identifying number as well, probably part of an arrangement so that ETS testers would be "blind" to the individuals. Everything then depended on the correct matching up of the tests to the applicants. Simple, low-cost anti-fraud measures, such as the use of webcams to capture the presence of individuals sitting at a particular terminal, had not been used.
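The kind of cross-check that the surviving records could not support can be sketched as a simple join between registration records and per-test records. The structures below are hypothetical, since the real schemas were never fully disclosed; the point is only that every test identifier has to map back to exactly one properly identified applicant.

```python
# Hypothetical registration and test records, used only to illustrate the
# matching problem; the real ETS/test-centre schemas were never disclosed.
registrations = {
    "R1001": {"name": "Applicant A", "date_of_birth": "1990-01-01"},
    "R1002": {"name": "Applicant B", "date_of_birth": "1991-02-02"},
}

tests = [
    {"test_id": "T5001", "registration_number": "R1001", "result": "pass"},
    {"test_id": "T5002", "registration_number": "R9999", "result": "pass"},  # orphaned test
]

def unmatched_tests(registrations, tests):
    """Return tests whose registration number has no corresponding applicant."""
    return [t for t in tests if t["registration_number"] not in registrations]

for t in unmatched_tests(registrations, tests):
    print(f"Test {t['test_id']} cannot be tied to a registered applicant")
```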

What one could conclude was that the ETS data records, like their methods of identifying students by voice pattern, were insufficiently reliable for the Home Office to be making the decisions that it did. As early as 2014 it was clear that a number of test centres were providing the means of cheating by the use of proxies. But why would those aiding cheating limit themselves to just the one method when there were other loopholes ready to be exploited?

Once the presence of significant quantities of anomalies had been shown, the Home Office should not have continued to use the ETS files alone as the basis of decision-making. The most obvious next step was surely to permit those who wished to do so to take a fresh test under stricter controls. The costs, actual and political, would have been low.

The National Audit Office, as the UK's official spending watchdog, makes significant criticisms of the Home Office. I have seen through my own experience the considerable sums of money that were spent in various appeal proceedings. At one stage Home Office lawyers sought to prevent my giving evidence to a tribunal on the basis that the case was a judicial review, which would not normally hear "new" evidence; they were more interested in legal procedure than in reaching a just outcome. Public money was spent not only directly on lawyers and officials employed by the Home Office but also on legal aid where applicants were able to secure it.

The National Audit Office does not involve itself in direct criticism of Home Office politicians and officials though no doubt others will.

An All-Party Parliamentary Group (APPG) is now in existence, chaired by Stephen Timms MP: https://bit.ly/2wmnJRB. I have been asked to give evidence to it on June 11.

The NAO report is at: https://bit.ly/30ERKKe
My own detailed report, plus other statements and relevant law reports are at: https://bit.ly/2IZfN0f
Detailed press coverage in the Financial Times is at: https://on.ft.com/2DihTVK

