
How our world is designed for the ‘reference man’, restricting the future to its racist past, and using race as a (horrible) predictor of student success


Racism and Technology Center

January 22 · Issue #14

Curated news, research and activism on the intersection between racism and technology.


How our world is designed for the ‘reference man’ and why proctoring should be abolished
We believe that software used for monitoring students during online tests (so-called proctoring software) should be abolished because it discriminates against students with a darker skin colour.
In this video, Dutch student Robin explains how the software wouldn’t recognize her at the beginning of an exam that was already stressful enough as it was. Robin shares the painful reality that she is used to not being seen because of her colour. The passport-scanning machines at Schiphol airport also consistently fail to recognize her.
The clip comes from a four-part series exploring how most things in our world are designed for the ‘reference man’: a white male. It is very worthwhile to watch the full episode (and the rest of the series), as it shows many other examples of racist technology.
Reference man: Technologie
Predictive policing constrains our possibilities for better futures
Writing in WIRED about the use of crime prediction software in policing, Chris Gilliard reiterated how data-driven policing systems and programs are fundamentally premised on the assumption that historical crime data determines the future. He highlighted Wendy Chun’s new book Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition, emphasising that these AI-based methods of statistical correlation and machine learning are not equipped to anticipate the myriad possibilities of the future. Rather, these systems “restrict the future to the past” in order to ‘predict’ it. Not only are these systems a self-fulfilling prophecy, they actually work to uphold the status quo: to “cement existing realities rather than change them”. Chun wrote:
If the captured and curated past is racist and sexist, these algorithms and models will only be verified as correct if they make sexist and racist predictions.
It is telling that these technologies are mainly used on certain types of crimes (not white collar crimes), thus disproportionately policing Black and brown bodies. No new and purportedly innovative technologies will solve racism if we remain bound to the same set of existing carceral logics that permeate our societies. Rather, they constrain other possibilities.
Crime Prediction Keeps Society Stuck in the Past
Nani Jansen Reventlow receives Dutch prize for championing privacy and digital rights
The Dutch digital rights NGO Bits of Freedom has awarded Nani Jansen Reventlow the “Felipe Rodriguez Award” for her outstanding work championing digital rights and her crucial efforts in decolonising the field. In this (Dutch language) podcast Nani is interviewed by Bits of Freedom’s Inge Wannet about her strategic litigation work and her ongoing fight to decolonise the digital rights field.
The Felipe Rodriguez Award is named after a Dutch privacy pioneer and is presented to people who stand out for their contribution to our right to privacy. It is the positive counterpart to the Big Brother Awards that, in contrast, highlight the biggest privacy violators of the year. That award was given to the Dutch Ministry of Justice and Security and the Minister of Health, Welfare and Sport.
Nani Jansen Reventlow by Tetsuro Miyazaki
Jansen Reventlow has been awarded this prize for her great efforts in the digital rights field over the past years. Most notably, she founded the strategic litigation NGO Digital Freedom Fund (DFF). Building on her extensive experience as a human rights lawyer, she grew DFF into a leading organisation that also functions as a place to bring the digital rights field together. She has fostered an intersectional approach to human rights within the digital context as an integral part of DFF’s work, which has opened up a broader understanding of digital rights. Importantly, she is at the forefront of a large project to Decolonise Digital Rights, together with European Digital Rights (EDRi). This project aims “to reflect on the way in which uneven power dynamics, exclusion, and privilege play out in our field, including how these shape the way in which digital rights are conceived and how they are protected.” Recently, she left DFF to found a new NGO called Systemic Justice, which “seeks to radically transform how the law works for communities fighting for racial, social, and economic justice.”
In this moving and informative podcast Jansen Reventlow speaks about the strategic litigation work of DFF over the past years, her experience as a woman of colour in the digital rights field, the importance and struggles of the decolonisation process and the essential work still to be done.
Podcast: Felipe Rodriguez award
Nani Jansen Reventlow receives the Felipe Rodriguez Award 2021
Racist Technology in Action: U.S. universities using race in their risk algorithms as a predictor for student success
An investigation by The Markup in March 2021 revealed that some universities in the U.S. are using software with a risk algorithm that treats a student’s race as one of the factors in predicting and evaluating how successful that student may be. Several universities have described race as a “high impact predictor”. The investigation found large disparities in how the software treated students of different races, with Black students up to four times as likely to be labelled high risk as their White peers.
The Markup found that at the University of Massachusetts Amherst, “Black women are 2.8 times as likely to be labeled high risk as White women, and Black men are 3.9 times as likely to be labeled high risk as White men.” The University of Wisconsin–Milwaukee labelled “Black women high risk at 2.2 times the rate of White women, and Black men at 2.9 times the rate of White men,” and Texas A&M University labelled “Black women high risk at 2.4 times the rate of White women, and Black men at 2.3 times the rate of White men.”
Chart by The Markup
The effect of these practices could be that Black and other minority students are pushed into “easier” majors by systems that are neither transparent nor reliable. The risk algorithms also draw on other information, such as test scores, high school percentiles and estimated skills. These models are trained on historical student data, typically spanning from two to more than ten years of student outcomes. Basing decisions on historical data, however, can encode and perpetuate unexamined racist and discriminatory practices. It is also unclear how heavily race is weighted in the algorithms, as the universities have no insight into the proprietary software that they use.
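To make that mechanism concrete, here is a minimal, purely illustrative sketch using synthetic data. It is not the vendors’ actual proprietary model or any university’s real records; it only shows how a classifier trained on historical outcomes, with race included as a feature, ends up assigning different “risk” scores to two students with identical test scores.

```python
# Hypothetical illustration with synthetic data: how a "student success" model
# trained on historical outcomes can reproduce past disparities when race is
# used as a feature. This is NOT the software discussed above.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Synthetic "historical" records: a race indicator (1 = Black, 0 = White) and
# a test score. Suppose structural disadvantage depressed past graduation
# rates for the Black group, independent of individual ability.
race = rng.integers(0, 2, n)
score = rng.normal(60, 10, n)
p_graduate = 1 / (1 + np.exp(-(0.08 * (score - 60) + 1.0 - 1.2 * race)))
graduated = rng.random(n) < p_graduate

# Train a model the way a risk product might: predict "dropout risk" from
# test score and race, using the historical outcomes as labels.
X = np.column_stack([score, race])
model = LogisticRegression().fit(X, ~graduated)

# Two students with the *same* test score get different risk estimates,
# purely because of the race feature mirroring the historical disparity.
same_score = 60.0
risk_white = model.predict_proba([[same_score, 0]])[0, 1]
risk_black = model.predict_proba([[same_score, 1]])[0, 1]
print(f"Predicted dropout risk, White student: {risk_white:.2f}")
print(f"Predicted dropout risk, Black student: {risk_black:.2f}")
```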
A fundamental issue with using someone’s racial background as an indicator of risk is not only that it is stigmatising, but also that it offers nothing to act on: unlike an issue such as financial struggle, race as a category cannot be changed. Hannah Quay-de la Vallee, a senior technologist at the Center for Democracy and Technology, notes:
Using race for any kind of system, even if it is in a very narrow context of trying to end racial disparities in higher education … you can go into that with the best of intentions and then it takes very, very few steps to get you in place where you’re doing further harm.
Events, exhibits and other things to do
Help us…
Do you know somebody who might think this newsletter is worthwhile too? Please tell them to subscribe here.
Did you read something that would be relevant for this newsletter? Then share it with us.
About the Racism and Technology Center
The Racism and Technology Center is a non-profit foundation based in the Netherlands. It uses technology as a mirror to reflect and make visible existing racist practices in (Dutch) society. As a knowledge center, it provides a platform, resources, knowledge, skills and legitimacy to anti-racism and digital rights organizations to help them create understanding of how racism is manifested in technology, with the goal of dismantling systems of oppression and injustice. Read more about the Center.
This is a monthly newsletter.