University’s use of online proctoring in exams ‘not discriminatory’

Dutch student complained that anti-cheating software did not work due to her skin colour, but university said problems were universal

October 18, 2023

A Dutch university that used anti-cheating software during the pandemic did not discriminate against a black student who alleged she experienced problems taking online exams due to her skin colour, a human rights body has ruled.

Robin Pocornie filed a complaint against Vrije Universiteit Amsterdam because of its use of a programme developed by Proctorio that uses facial recognition tools to check whether the right student is taking a test.

She said she was logged out of the programme several times and had to sit with a lamp shining in her face because the software struggled to recognise her.

In a judgment the board of the Netherlands Institute for Human Rights cleared the university of discrimination in relation to its use of the software but said it should have handled the student’s complaint better.

“The university has shown that this student did not experience more problems during her exams than other students and that in the case of this student these problems were not caused by her skin colour,” the ruling said. “However, the university should have handled her complaint about discrimination more carefully.”

Ms Pocornie, a bioinformatics student, took exams online at VU during the Covid-disrupted academic year of 2020-21.

Her complaint said she regularly received the messages “face not found” or “room too dark”, which disrupted her participation in the exams, costing her time and causing her stress and anxiety.

Proctorio has denied its software works less well for people with darker skin colours. The university claimed its analysis of login data showed that Ms Pocornie did not experience significantly more issues than other students taking the same exam.

“The fact that the student took longer to complete the verification process during one of her exams and had to restart during the exam was due to other things that were not related to her skin colour, such as a poor internet connection or wearing glasses,” the judgment said.

But the institute added that its judgment “does not rule out that the use of Proctorio or similar AI software may lead to discrimination in other situations”.

Reacting to the ruling, Naomi Appelman, a lawyer and chair of the Racism and Technology Center, which supported Ms Pocornie in the case, said it “shows how difficult it is to legally prove that an algorithm discriminates”.

She claimed that there was an “overwhelming amount of scientific evidence that facial detection works less well for people with dark skin”, but they had not been able to conclusively prove that discrimination took place in the specific case.

Ms Pocornie said she was disappointed by the outcome but her efforts to raise awareness of the issues with facial recognition software were already having an impact.

“The facts remain as they are: I had to take my exams with a light in my face, while my white fellow students did not have to do that,” Ms Pocornie said.

“The institute has also indicated that I have been discriminated against by my own university in how they handled my complaint, and that is painful.

“However, I am happy with all the attention the case has received. I have noticed in recent months that, partially as a result of my case, educational institutions have started to think a lot more about whether the technology they use works the same for everyone. I am proud of that.

“As a computer scientist and as a person of colour, I understand very well how institutional racism can manifest itself in technology. So I continue to fight for a society in which everyone is treated equally.”

tom.williams@timeshighereducation.com



Reader's comments (1)

Surely an AI that has difficulty recognising darker-skinned people isn't 'being racist' - it's got bad training data that is not a true reflection of the population it's going to have to examine. Did other dark-skinned students suffer similar issues? Did this student prefer low lighting in the room where they used their computer? Genuine technical questions get left by the wayside as soon as someone yells "racism", and that makes it harder to find and address the actual flaws.
