Universities Shouldn’t Use Software to Monitor Online Exams: Here’s Why
Proctoring software monitors a student’s computer or phone while they take exams. These programs have been around for years but became ubiquitous in online learning during the pandemic.
Proctorio, Respondus and ProctorU, the most popular programs, have seen a 500% increase in usage since the onset of COVID-19, and surveillance software is now a US$19 billion global market.
Some monitoring programs work by verifying that the student has only opened the test software and no other programs; others monitor keystrokes. Some use the computer camera or cell phone audio to verify that the student is working alone. A number of South African universities have adopted cell phone monitoring programs.
But this software is far from benign.
I argue in a recent article that the adoption of surveillance software is a symptom of a much larger problem.
Universities have neglected their educational responsibilities in the service of a neoliberal ideology. This positions students as customers and higher education as a business. This is a problem because when universities become businesses selling qualifications, it reduces their potential to be places where students have transformative relationships with knowledge and where knowledge is created to serve people and the planet.
The ability to memorize information and regurgitate it under time pressure is required in only a small number of situations. What most students need is an understanding of how knowledge is built in their field of study, what contributions the field makes to society, and how to find and evaluate information to answer questions and solve problems. They must also learn to be ethical and critical citizens.
Assessment directed towards such ends looks very different from current practices, which are obsessed with memorization and with cheating.
What’s wrong with monitoring
Surveillance raises three issues of concern: privacy, racism and ableism.
Privacy: The vendors of the software insist that students consent to its use. But if students withhold consent, they are excluded from the exam. Universities have ethics committees to ensure that their researchers never use such coercive tactics: researchers must make certain that potential participants fully understand the risks and benefits of a study before they can offer informed consent. Yet universities use these very tactics on their students.
The invasiveness of the software is well documented, and many researchers have argued that it has most of the characteristics of illegal spyware.
Allowing a stranger to eavesdrop on a student’s family home while they take a test is a clear sign that this is the wrong way to conduct assessment.
Racism of facial recognition software: Whether it’s photo tagging suggestions on social media, border security systems, or surveillance software, facial recognition remains poor at recognizing people with darker skin. The artificial intelligence that compares the face on the student ID with the person in front of the computer camera is far more likely to flag a black student as suspicious than a white one.
Ableism of facial recognition software: Anyone whose body or movements do not meet the program’s expectations may find themselves flagged as suspicious. This includes the tics and stimming of people with Tourette syndrome, cerebral palsy, Huntington’s disease and autism.
Many American universities have now chosen not to use surveillance software in response to protests from academics and students.
But opting out deals with the symptom – universities spying on their students – not the causes of these activities.
The underlying cause is that many universities around the world have embraced a neoliberal ideology, whereby the value of any person, object, creature or activity is seen as measurable in terms of contribution to the economy.
A neoliberal university believes, first, that it is a business in the knowledge market. As part of the commercialization of education, universities are increasingly outsourcing educational activities, such as invigilating exams using proctoring software.
When Ian Linkletter, an educational technologist at the University of British Columbia in Canada, tweeted criticism of the surveillance software used at his university, he was sued by the company. The market cannot afford the critical engagement that should be at the heart of a university.
Second, the neoliberal university treats the student as a customer. In a world where knowledge is packaged and sold as a commodity, software companies convince universities that their product – the degrees they award – will be devalued if cheating goes unchecked.
In such an understanding of the university, monitoring software comes into its own.
It’s no surprise that students quickly learn to outsmart the system. The internet is full of tips on how to confuse the software and get help online even while it is running.
The third characteristic of neoliberal ideology is that power is granted on the basis of wealth. This characteristic is also present in most universities in the world. The university, as a relatively wealthy institution, has the power to implement invasive technology without much difficulty. The average student just has to comply.
Universities for the common good
Surveillance software becomes impossible to justify if the university is understood instead as a social structure that brings powerful and principled knowledge to the service of people and the planet.
Such a social structure would expend considerable energy initiating students into their role as creators of knowledge and encouraging them to take on this identity responsibly. This would require changes in the way academics interact with students and in how they articulate the purpose of higher education to students and the public. It would also require rethinking the form and function of assessment.