
Big Proctor: Online Proctoring Problems and How FERPA Can Promote Student Data Due Process

Elana Zeide

When the pandemic forced schools to shift to remote education, school administrators worried that unsupervised exams would lead to widespread cheating. Many turned to online proctoring technologies that use facial recognition, algorithmic profiling, and invasive surveillance to detect and deter academic misconduct. It was an “epic fail.”

Intrusive and unproven remote proctoring systems turned out to be inaccurate, unfair—and often ineffectual. The software did not account for foreseeable student diversity, leading to misidentification and false flags that disadvantaged test-takers from marginalized communities. Educators implemented proctoring software without sufficient transparency, training, and oversight. As a result, students suffered privacy, academic, reputational, pedagogical, and psychological harms.

Online proctoring problems prompted significant public backlash but no systemic reform. Students have little recourse under existing legal frameworks, including current biometric privacy, consumer protection, and antidiscrimination laws. Student privacy laws like the Family Educational Rights and Privacy Act (FERPA) also offer minimal protection against schools’ use of education technology. However, FERPA’s overlooked rights of review, explanation, and contestation offer a stop-gap solution to promote algorithmic accountability and due process.

The article recommends a moratorium on online proctoring technologies until companies can demonstrate that they are accurate and fair. It also calls for schools to reject software that relies on prolonged surveillance and pseudoscientific automated profiling. Finally, it recommends technical, institutional, and pedagogical measures to mitigate proctoring problems in the absence of systemic reform. 

Introduction

It’s every student’s worst nightmare: you’ve studied hard for an exam, are about to wrap up a tough essay, and suddenly get locked out of the testing software because the proctoring program flagged your rifling through notes as suspicious. When tech support finally gets you back into the system, the essay is gone, along with most of your time to complete the test. Or you finish, feeling confident, but you looked away from the screen too often, so your professor won’t give you credit unless the dean determines you’re not guilty of cheating.

This scenario is all too real for many test-takers who took the bar exam online in 2020 and 2021. Students worldwide faced similar problems when schools administered tests through online platforms accompanied by remote proctoring software. Online proctoring technology (OPT) vendors promised that extensive surveillance, facial recognition, and algorithmic profiling would prevent and detect cheating. These tools became ubiquitous, especially in U.S. higher education institutions. However, many automated proctoring technologies proved inaccurate, unfair, and invasive, and not even effective at spotting academic misconduct.

This article offers a systemic analysis of the technical, institutional, and pedagogical problems posed by proctoring technologies. It then disambiguates the components of these systems, considering not only their design, features, and computing processes, but also their implementation by schools and teachers. Most online proctoring services rely on controversial technologies: facial recognition, artificial intelligence, and biometric surveillance. But neither vendors nor schools accounted for foreseeable problems by ensuring sufficient accuracy, oversight, or avenues for student appeal.

Online proctoring has become part of the new normal. However, current legal regimes don’t address the harms inflicted by proctoring technologies. Privacy, consumer protection, and antidiscrimination laws offer aggrieved test-takers minimal recourse. They also place negligible pressure on vendors to improve their technologies and on schools to implement better institutional practices.

This article offers a stop-gap solution to promote algorithmic due process using FERPA’s overlooked rights of inspection, explanation, and contestation. The statute gives students the right to inspect personal information in education records held by schools and vendors—and to challenge information they believe to be inaccurate or inappropriate. To support these rights, the U.S. Department of Education (ED) requires schools and vendors to explain the information in the records. In the context of online proctoring, this could include providing students with raw surveillance footage and the basis for algorithm-generated cheating flags and suspicion scores. In light of ED’s recent Agora letter, this strategy offers students a means to pursue due process and promote algorithmic accountability without depending on unlikely agency enforcement.

Given the inadequacy and uncertainty of current legal frameworks, Part III proposes a moratorium on proctoring technologies, rejection of their unproven features, and, at the very least, restriction of their use to situations where it is truly necessary, not merely expedient. The article also suggests technical, institutional, and pedagogical reforms to improve proctoring technologies absent a moratorium. Pandemic proctoring showcases the limits of the student privacy status quo, which allows schools to adopt unproven technologies without sufficient oversight or due process. It offers a cautionary tale, calling on vendors, educators, and policymakers to protect students from problematic education technology.

