UK: Legal action threatened over algorithm used to grade teenagers' exams


Digital rights organisation Foxglove is threatening to take legal action against Ofqual - the government body that regulates qualifications, exams and tests in England - on the grounds that the algorithm being used to determine students' estimated A-level results potentially violates the Data Protection Act. Due to the pandemic, students' final exam results are being estimated based on previous grades, but Foxglove argues that schools, rather than individual students, are being assessed.

Support our work: become a Friend of Statewatch from as little as £1/€1 per month.

Press release issued by Foxglove, 12 August 2020

Grade the student, not the school: threat of legal action unless Ofqual fix unfair A-level and GCSE grading algorithm

For immediate release: 12:30pm Wednesday, 12 August 2020

  • Legal letter sent today on behalf of Ealing A-level student Curtis Parfitt-Ford
  • Ofqual’s grading algorithm reported to ignore teachers’ assessments and set grades based on schools’ prior results
  • System hurts bright pupils at disadvantaged schools, giving private school children an unfair leg up
  • Scotland scrapped a similar system this week in major Scottish Government U-turn
  • Lawyers’ letter demands an additional appeal route and greater weight for teachers’ grade estimates
  • More affected students invited to join potential legal action

An algorithm which will determine the life chances of millions of English students by ‘estimating’ their GCSE and A-level results for this year risks being challenged in the courts unless the government acts swiftly to address concerns. Curtis Parfitt-Ford, an A-level student at a comprehensive school in Ealing, supported by tech-justice group Foxglove, has demanded that Ofqual correct defects in its grading algorithm or potentially be taken to court.

In the wake of Covid, for the first time in history, A-level and GCSE students in England are being assessed not on their individual performance, but by an algorithm.

Ofqual’s algorithm doesn’t really grade the student—it grades the school. Ofqual has disclosed very little about its process, but has very recently let slip that for larger schools, teachers’ predicted grades don’t count at all: “Where a subject has more than 15 entries in a school, teachers’ predicted grades will not be used as part of the final grade calculation.” In practice, then, what matters is the school’s prior results. Millions of kids’ life chances hang on a statistical estimate based on their school’s previous performance compared to other schools.

Great students in poor schools will lose out from this system. A student from a historically underperforming school, who was on track to get their school’s first A, now almost certainly won’t—because of the algorithm.

Smaller, richer schools get more personalised treatment. In theory, teachers’ assessments of students’ likely individual performance factor into the grade. Ofqual have said they will factor in teachers’ assessments for schools with smaller classes, where fewer than five students have grades submitted in a subject. These will tend to be better-resourced, often private schools.
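The two cohort-size rules quoted above can be sketched as a simple decision. This is an illustrative sketch only: the function name and return labels are our own, and the press release does not say how cohorts between 5 and 15 entries are handled, so that range is marked as unspecified.

```python
# Hypothetical sketch of the cohort-size rule described in the press release.
# This is NOT Ofqual's actual implementation; names and labels are illustrative.

def grading_inputs(entries_in_subject: int) -> str:
    """Return which inputs count towards a pupil's grade, per the release:
    more than 15 entries in a subject -> teachers' predicted grades are not
    used at all; fewer than 5 -> teachers' assessments are factored in.
    The release does not describe the intermediate range."""
    if entries_in_subject > 15:
        return "school's historical results only"
    if entries_in_subject < 5:
        return "teacher assessments factored in"
    return "unspecified in the release"

# A large comprehensive-school class vs. a small private-school class:
print(grading_inputs(28))  # large cohort: teacher predictions not used
print(grading_inputs(4))   # small cohort: teacher assessments count
```

The sketch makes the inequity concrete: the same pupil's grade rests on entirely different inputs depending only on how many classmates took the subject.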

Scotland has already had to scrap its system and is fixing it. The Scottish system that First Minister Nicola Sturgeon binned this week was significantly unfair: the algorithm reduced a quarter of all grades below teachers’ assessments. In a major U-turn, the First Minister said she was “sorry” for the system and pledged a reform that would take greater account of pupils’ individual circumstances.

Automating a major decision about pupils in this way also potentially violates the GDPR and UK Data Protection Act. Those laws (e.g., Art. 22 GDPR) provide significant protections from automated decisions about people which may have significant consequences – and this is a significant consequence for every student expecting GCSE/A-Level results this week.

Teachers know best. To fix this system, government should a) institute an appeal route for students to challenge unfair results, and b) give greater weight to teachers’ assessments of their pupils.

Curtis Parfitt-Ford said:

“My friends, my cohort, and all the other students awaiting results this year deserve better than to be graded by a postcode lottery. Our grades should track our capability and effort as individual students, and what’s been proposed quite simply does not do that. Ofqual acknowledges our teachers know best what we can do, so it doesn’t make any sense for their assessments to be ignored, and for the Government to choose to rely on a biased computer programme instead.

Whilst it's progress of a sort that the Government announced they'll allow students to appeal on the basis of mock exam results, that won't work for a lot of us - especially because many mocks were disrupted by the pandemic. In practice right now a lot of us are still at the mercy of this algorithm.

I don’t want to have to comfort friends whose lives have been ruined by an unfair government algorithm. The Government must revamp its system, just as the Scottish Government chose to do yesterday.”

Cori Crider, co-founder and Director of Foxglove, said:

“Ofqual’s algorithm deserves an F for unfairness. This system fails bright kids in bad schools, and treats pupils as statistics, not individuals. This will damage social mobility and undermines the sense that grades award individual effort and achievement. Where’s the meritocracy in doling out grades on the basis of some made-up bell curve?”

Notes to Editors 

Curtis Parfitt-Ford is an A-level student at Elthorne Park High School, a comprehensive school in Ealing, London. He excelled in his GCSEs – achieving six grade 9s and five grade 8s – and is bringing this challenge out of deep concern that the system will further ingrain inequality in education, treating him and his peers, especially those at disadvantaged schools, unfairly. Curtis is also a programmer and has started his own company, Loudspeek, a digital system which aims to help people launch campaigns and write to their MPs. Alongside the potential legal action, he has launched a petition calling on Boris Johnson to intervene to make the system fairer.

Foxglove, which is supporting Curtis in bringing this case, is a new non-profit which exists to make tech fair. We were behind the recent successful legal challenge brought by the Joint Council for the Welfare of Immigrants (JCWI) to the Home Office visa algorithm – another biased algorithm that harmed millions of people.

Legal team: Curtis is represented by solicitor Rosa Curling at Leigh Day. Counsel are David Wolfe QC at Matrix Chambers and Ciar McAndrew at Monckton Chambers.

What system would be better?   

  • A FREE appeal system which allows pupils to challenge a decision as unfair
  • Guidance for how students can take reassessments in the autumn (at no cost)
  • Give greater weight to teachers’ grade assessments

Contact: Martha Dark (Foxglove) info [at] / 0207 1835 926


