Proctorio is a remote proctoring service used in some Blackboard courses for quizzes and exams. During an exam session, Proctorio can record your webcam feed, your screen, and other activity, and share that information with your instructor.
Students of color have long argued that Proctorio’s and other exam surveillance companies’ facial recognition algorithms struggle to identify their faces, making it difficult, if not impossible, for them to take high-stakes exams.
A student researcher reverse-engineered the controversial exam software and found that it relies on a face-detection method notorious for failing to detect non-white faces.
The researcher, a college student at a school that uses Proctorio, says he can demonstrate that the software uses a face-detection model that fails to detect Black faces more than half the time.
The researcher, Akash Satheesan, told Motherboard, “I decided to look into it because [Proctorio has] claimed to have heard of ‘fewer than five’ instances where there were issues with face recognition due to race. I knew that from anecdotes to be unlikely … so I set out to find some more conclusive proof and I think I’m fairly certain I did.”
Satheesan demonstrated for Motherboard that, when run on the same set of faces, the facial detection algorithms embedded in Proctorio's tool performed identically to the OpenCV models. A security researcher consulted by Motherboard confirmed Satheesan's findings and was able to replicate his study.
On its website, Proctorio claims to use "proprietary facial detection" technology. It also says it uses OpenCV products, but it doesn't specify which ones or for what. Meredith Shadle, a spokesperson for Proctorio, did not respond directly to Motherboard's question about whether the company uses OpenCV's models for facial detection. Instead, she sent a link to Proctorio's licenses page, which includes an OpenCV license.
"While the public reports don't accurately capture how our technology works in full, we appreciate that the analyses confirm that Proctorio uses face detection (rather than facial recognition)," Shadle wrote to Motherboard in an email.
Proctorio also declined to respond to a number of questions about the technology, including whether it had fine-tuned the OpenCV models for use in its applications.
The inability of computer vision systems to recognize and correctly distinguish darker-skinned faces is well-documented, and Proctorio’s use of OpenCV models has serious consequences for the students who are required to use it. Code based on OpenCV for facial recognition and detection has previously been found to be biased.
Satheesan put the models to the test on nearly 11,000 faces from the FairFaces dataset, a collection of photographs compiled and labeled to represent people of various races and ethnicities.
The models failed to detect faces in images labeled as containing Black faces 57 percent of the time. Some of the failures were glaring: the algorithms detected a white face but failed to detect a Black face posed in a nearly identical position.
The models performed better on other groups, but still far from well. They failed to detect faces in 41 percent of images containing Middle Eastern faces, 40 percent of images with white faces, 37 percent with East Asian faces, 35 percent with Southeast Asian or Indian faces, and 33 percent with Latinx faces.
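A benchmark along these lines is straightforward to sketch. The snippet below is a hedged illustration, not Satheesan's actual code: it assumes images carry a race label (as in FairFaces) and that each one is run through a detector such as OpenCV's bundled Haar cascade (in practice, `cv2.CascadeClassifier("haarcascade_frontalface_default.xml").detectMultiScale(img)` returning zero boxes would count as a failure). Here the detector results are passed in as booleans so the rate computation itself is self-contained.

```python
from collections import Counter

def failure_rates(results):
    """Compute per-group face-detection failure rates.

    results: iterable of (race_label, detected) pairs, where `detected`
    is True if the detector found at least one face in the image.
    In a real run, `detected` would come from a detector call, e.g.
    len(cascade.detectMultiScale(gray_image)) > 0 with OpenCV.
    """
    totals, failures = Counter(), Counter()
    for race, detected in results:
        totals[race] += 1
        if not detected:
            failures[race] += 1
    # Fraction of images per group in which no face was detected.
    return {race: failures[race] / totals[race] for race in totals}

# Toy example with made-up labels, not FairFaces data:
rates = failure_rates([
    ("Black", False), ("Black", True),
    ("White", True), ("White", True),
])
```

In this toy input, `rates` works out to a 0.5 failure rate for the "Black" group and 0.0 for the "White" group; the reported 57 percent figure corresponds to the same ratio computed over thousands of labeled images.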
Black students have described Proctorio's weak facial detection as frustrating and anxiety-inducing. Some say the program consistently fails to detect them when they take a test. Others worry that if they move out of ideal lighting, their exams will abruptly close and lock them out.
According to Satheesan, his findings illustrate a fundamental flaw in proctoring software: the tools, ostensibly designed for educational purposes, actually undermine education—especially for students who are already disadvantaged.