I was interviewed recently (via email) by Katie Deighton, from the Wall Street Journal’s Experience Report. She was writing an article on online exam proctoring, and wanted to follow up with me about the categories of proctoring software and to get a university learning technologies perspective.

The article was published yesterday. I’m officially a critic. She wasn’t able to use my entire response, so I’m posting the rest of it here:

Katie:

What are the biggest challenges that online proctoring companies face when it comes to both reputation and customer acquisition? How big of a problem is the user experience, when compared to issues surrounding privacy, security, etc.?

This Educause QuickPoll report (which mentioned your work!) on proctoring services was conducted at the beginning of the pandemic. From what you know about the industry, do you believe the sentiment regarding online proctoring has changed at all now, given the critical media attention and student petitioning that’s occurred since then?

Companies like Proctorio and ProctorU have committed to tweaking the user experience, for instance, by making the set-up instructions easier, improving transparency around where a user’s data goes, improving accessibility, and better educating professors in the various proctoring settings. Do you think these kinds of tweaks will be enough to either bring back lost customers, or improve online proctoring’s reputation at large? Or, do you think security/privacy concerns will still dominate the discourse here, no matter how easy and transparent it is for people to use the software?

D’Arcy:

There is a fundamental problem with how online exam proctoring software is designed, involving issues of power, control, consent, and agency. The concept itself puts students and instructors into an adversarial relationship, with students framed as presumed cheaters and instructors as police or security analysts trying to catch them. This can’t be resolved through interface tweaks or streamlined installation processes - the problem is the nature of the software, not the design of the interface or user experience.

Online exam proctoring software is invasive. It installs surveillance software onto students’ own computers - software designed to integrate with their web browser and operating system in ways that would previously have been described as a “rootkit” and banned in order to mitigate security vulnerabilities. Once installed, the software relies on active surveillance of students in their homes, using video and audio streamed from their computer’s webcam as well as monitoring the software and data that reside on the computer. Once third-party surveillance software has been installed on a computer, it should no longer be treated as a trusted device. Even with some transparency from the software companies, the trust that previously enabled secure uses of that computer - online banking, personal income tax preparation, storage of private content and data - has already been compromised. Compelling students to install this software on their own devices, compromising the security and integrity of those devices, is problematic.

If online exam proctoring software were strictly used on devices provided by an institution for the sole purpose of completing assessments in online courses, it could be a viable solution. But even then, the software relies on video and audio surveillance of students in their own homes, and a number of issues make that a challenging proposition. Software that uses artificial intelligence and computer vision techniques has been shown to be biased against people of colour, because the machine learning models are typically trained on less diverse sets of images and videos to teach them what “cheating” looks like. Software that uses live proctoring - a human watching the student as they take the exam - avoids the problem of AI bias, but adds issues of reliability (do all observers treat behaviours consistently?) as well as privacy, since students are compelled to allow unknown staff members or contractors engaged by the proctoring company to watch them for extended periods of time.

At the beginning of the pandemic, there was a strong push at our institution to shift to remote proctoring, but we put together a team with people from various faculties and roles - including both undergraduate and graduate students - to figure out what was needed. I think the initial hope was that online exam proctoring software would allow online exams to be offered with very little adjustment to existing processes - simply replacing in-class invigilated high-stakes exams with online proctored exams. But that avoids the root issues underlying assessment, which we decided to try to address rather than applying a temporary fix. Our community was very clear that a better investment of university resources (both financial and staffing) would be to support initiatives that foster a stronger culture of academic integrity, and the design and implementation of authentic assessment in online and remotely taught courses. These alternate forms of authentic assessment are more difficult for instructors to create, but they also make it more difficult for students to engage in academic misconduct.