Online proctoring excludes and discriminates, the Dutch elections and racist tech, and white only soap dispensers


Racism and Technology Center

March 20 · Issue #3

Curated news, research and activism on the intersection between racism and technology.

Online proctoring excludes and discriminates
The use of software to automatically detect cheating on online exams – online proctoring – has been the go-to solution for many schools and universities in response to the COVID-19 pandemic. In this article, Shea Swauger addresses some of the potential discriminatory, privacy and security harms that can impact groups of students across class, gender, race, and disability lines. Swauger critiques how these technologies encode “normal” bodies – cisgender, white, able-bodied, neurotypical, male – as the standard, and how students who do not (or cannot) conform are punished by them.
Online proctoring uses facial recognition technologies to detect cheating based on characteristics of the person and the room; technologies that have consistently been shown to produce racist outcomes. For example, while using Proctorio, Black students have reported that the system was unable to detect their faces. As a result, students with dark skin have had to shine extra light on themselves to verify their identities prior to taking an exam.
In the Netherlands, several universities have utilised online proctoring software, resulting in a court case brought forth by students (who eventually lost). The case, while important, centred on privacy and data protection, leaving out issues of discrimination, inclusion and exclusion. In the shift to online education, it is crucial to recognise that the choice to adopt certain technological pedagogical tools is a political one that can reinforce exclusion and discrimination in our education systems.
Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education
The Dutch elections and racist tech
In last week’s Dutch parliamentary elections, digitisation and the impact of technology on society were definitely part of the political debate. However, racism in technology was, with the exception of BIJ1, hardly explicitly addressed, with most parties focusing on topics such as cybersecurity, the power of big tech, and privacy in their party programmes.
Additionally, a coalition of NGOs and digital rights organisations – Amnesty, Bits of Freedom, Open State Foundation and Waag – organised a ‘digital ballot box agreement’. Eight parties signed the agreement, committing to eleven technology and digitisation-related policy objectives. The agreement does not directly address racist technology, though it does explicitly commit to stopping the government from using risk profiling based on nationality or ethnicity.
That specific point is most likely directly inspired by the recent child benefits scandal which involved immensely damaging ethnic profiling by the Dutch Tax Agency. Even though this scandal led to the fall of the previous government, the three largest parties in Parliament do not mention or allude to ethnic profiling in their party programmes. This really highlights the lack of political attention to racist technology in general. The six signatories of the digital ballot box agreement that gained a seat in parliament have their work cut out for them combatting ethnic profiling by the Dutch state.
Het Digitale Stembusakkoord
IBM is failing to increase diversity while successfully producing racist information technologies
Charlton McIlwain, author of the book Black Software, takes a good hard look at IBM in a longread for Logic magazine. McIlwain writes that:
IBM’s history is instructive because the company has been at the forefront of producing racist information technologies that have disparately harmed the very same people the company has spent decades trying to recruit—a dynamic that also characterizes many of today’s tech giants.
IBM ran its first failed diversity project in 1964, when a US military base was converted to host hundreds of high-school dropouts in a free, months-long training programme that was supposed to turn them into graduates ready for entry-level jobs at tech companies. Now, more than fifty years and many diversity projects later, the percentage of Black people in the tech industry still hasn’t budged (it is around 2.5 percent). Meanwhile, IBM was the largest computer supplier in South Africa during the apartheid years, and has continually delivered carceral technologies to law enforcement from the sixties up until today (for example, by delivering surveillance capabilities to Duterte in the Philippines).
The Fort Rodman Experiment
Racist technology in action: White only soap dispensers
In 2015, when T.J. Fitzpatrick attended a conference in Atlanta, he wasn’t able to use any of the soap dispensers in the bathroom, though whenever his white friend Larry put his hand under a dispenser, it would work:
Whites Only?
Fitzpatrick wasn’t the only one to film this problem. Chukwuemeka Afigbo posted a similar video on Twitter in 2017:
Chukwuemeka Afigbo
If you have ever had a problem grasping the importance of diversity in tech and its impact on society, watch this video
The physical reason for this is easy enough to understand: these dispensers use a near-infrared light probe that needs its light beam reflected back to register that something is underneath it. So a mirror will always work, while anything painted with Vantablack would never trigger the dispenser.
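The mechanism can be sketched as a simple threshold check. Note that the reflectance values, the threshold and the sensor model below are all illustrative assumptions, not specifications of any real dispenser:

```python
# Minimal sketch of a reflective near-infrared proximity sensor, as used
# in touchless soap dispensers. All numbers are illustrative assumptions.

# Assumed near-infrared reflectance of different surfaces (fraction 0.0-1.0).
REFLECTANCE = {
    "mirror": 0.95,
    "light skin": 0.60,
    "dark skin": 0.30,
    "vantablack": 0.00,  # absorbs nearly all incident light
}

EMITTED_INTENSITY = 1.0
# A miscalibrated trigger threshold: set above what darker skin reflects.
TRIGGER_THRESHOLD = 0.5

def dispenser_triggers(surface: str) -> bool:
    """Return True if enough emitted light bounces back to the detector."""
    reflected = EMITTED_INTENSITY * REFLECTANCE[surface]
    return reflected >= TRIGGER_THRESHOLD

for surface in REFLECTANCE:
    print(surface, dispenser_triggers(surface))
```

With these assumed numbers, the mirror and the lighter hand trip the sensor while the darker hand does not; a lower, properly tested threshold would serve everyone.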
But from a socio-technical perspective, the producers of these systems should be ashamed of themselves. They have clearly and unnecessarily miscalibrated their systems and, by failing to test their dispensers properly, have excluded the majority of the people in this world from using their product.
Unfortunately, it is likely that with our current Covid-induced focus on washing our hands (preferably without touching the tap), many more people will have been confronted with this particular form of racist technology.
Events, exhibits and other things to do
Help us…
Do you know somebody who might think this newsletter is worthwhile too? Please tell them to subscribe here.
Did you read something that would be relevant for this newsletter? Then share it with us.
About the Racism and Technology Center
The Racism and Technology Center is based in the Netherlands. It uses technology as a mirror to reflect and make visible existing racist practices in (Dutch) society. As a knowledge center, it provides a platform, resources, knowledge, skills and legitimacy to anti-racism and digital rights organizations, helping them build understanding of how racism manifests in technology, with the goal of dismantling systems of oppression and injustice. Read more about the Center.
This is a monthly newsletter.
In order to unsubscribe, click here.
If you were forwarded this newsletter and you like it, you can subscribe here.
Powered by Revue