
Centering social injustice and de-centering tech, Facebook's racial profiling in its targeted ads, and the racial impact of unbiased traffic cameras

Racism and Technology Center

February 19 · Issue #15

Curated news, research and activism on the intersection between racism and technology.


Centering social injustice, de-centering tech
The Racism and Technology Center organised a panel titled Centering social injustice, de-centering tech: The case of the Dutch child benefits scandal and beyond at Privacy Camp 2022, a conference that brings together digital rights advocates, activists, academics and policymakers. Together with Merel Koning (Amnesty International), Nadia Benaissa (Bits of Freedom) and Sanne Stevens (Justice, Equity and Technology Table), we used the Dutch child benefits scandal as an example to highlight issues of deeply rooted racism and discrimination in the public sector. The fixation on algorithms and automated decision-making systems tends to obscure these fundamental problems. Often, the use of technology by governments serves to normalise and rationalise existing racist and classist practices.
As Nadia Benaissa put it: “The system isn’t broken, the system and its algorithms are doing exactly what they are told by people.” Sanne Stevens stressed that we cannot have fair technology in an unfair world: we must resist tech solutionism and be careful not to view tech as the main source of evil, but instead look beyond the tech to the broader ecosystem of discrimination and racism.
Moving away from a technosolutionist focus, the speakers also debated the role of law, acknowledging that legal protection has its limits, particularly for extremely marginalised and disenfranchised communities. The dominance of legal and technical framings in how problems are recognised and conceptualised can create expert-driven spaces that are inaccessible to the very communities in which these injustices occur. The discussion concluded that building and offering safe spaces together with anti-racist activists and communities must begin by truly listening, valuing people’s time, and showing up.
PrivacyCamp22: Event Summary
Watch the recording: Centring social injustice, de-centring tech: The case of the Dutch child benefits scandal
Bits of Freedom speaks to the Dutch Senate on discriminatory algorithms
Through an official parliamentary investigative committee, the Dutch Senate is examining how new regulation or law-making processes can help combat discrimination in the Netherlands. The committee focuses on four broad domains: the labour market, education, social security and policing. As part of these wide-ranging investigative efforts, the Senate is hearing from a range of experts and civil society organisations. One contribution stands out from the perspective of racist technology: Nadia Benaissa from Bits of Freedom highlighted the dangers of predictive policing and other uses of automated systems in law enforcement.
Benaissa explained how the data feeding predictive policing systems are never neutral or objective: they always rely on past events and human judgements and, as such, always tell a story. In the following clip she explains (in Dutch) why, when deciding which data should be omitted to prevent discrimination, it is important not to focus only on specific sensitive attributes such as race or gender, but to take the wider context into account:
She also spoke vividly about the importance of transparency and accountability in the use of data and algorithmic systems, not only in policing but across the entire government. Here, she specifically referred to the proposed EU AI Act, which does not go far enough in this regard, and called upon the Senate to introduce human rights impact assessments for any type of algorithm used by the Dutch government.
Besides Nadia’s contribution, the Senate also heard several other experts who spoke on topics closely related to racist technologies. For example, Sennay Ghebreab talked about possible discrimination in the use of algorithms in social security, and Dionne Abdoelhafiezkhan, from IZI solutions and Controle Alt Delete, spoke about ethnic profiling within the police.
Nadia Benaissa at the Parlementaire onderzoekscommissie effectiviteit antidiscriminatiewetgeving in the Dutch Senate
De discriminatie die in data schuilt (The discrimination hidden in data)
Facebook has finally stopped enabling racial profiling for targeted advertising
Around 2016, Facebook was still proud of its ability to target ads to “Black affinity” and “White affinity” audiences on behalf of its customers. Our Center’s Hans de Zwart wrote an op-ed decrying this form of racial profiling, which was enabled by Facebook’s data lust.
It has taken until this year for Facebook to finally admit that their behaviour was problematic and to do something about it. According to Reuters, Facebook has removed ad-targeting options that refer to sensitive topics “such as ads based on interactions with content around race, health, religious practices, political beliefs or sexual orientation.”
Maybe Facebook’s about-turn has something to do with the legal pressure they are currently under. Their targeted advertising options allowed advertisers to exclude people from viewing housing-related ads based on characteristics that are “protected” under American anti-discrimination law. In simple terms: it was possible to only show housing ads to White people.
Unfortunately, Facebook’s measures will only be a stopgap. There are endless proxies for race that advertisers could use to keep their unlawful practices going. We should therefore completely ban targeted advertising on the basis of people’s characteristics or behaviour.
Facebook plans to remove thousands of sensitive ad-targeting options
Holding Facebook Accountable for Digital Redlining
Racist Technology in Action: “Race-neutral” traffic cameras have a racially disparate impact
Traffic cameras that are used to automatically hand out speeding tickets don’t look at the colour of the person driving the speeding car. Yet, ProPublica has convincingly shown how cameras that don’t have a racial bias can still have a disparate racial impact.
When, in the interest of traffic safety, the city of Chicago lowered the threshold for how much you had to be over the speed limit to receive a ticket, it enlarged the racial gap in who gets these tickets. How could that be?
ProPublica highlights that, due to existing racial inequalities, Black and Latinx people more often had to be physically present at work during the COVID pandemic than their White fellow residents. They were disproportionately on the road, and therefore they were disproportionately ticketed.
Another aspect that created the gap was the difference in the built environment of where these groups of people live. Residential density, for example, is an important factor in how fast people drive. Dense residential areas have more traffic, cars, and pedestrians, leading to people intuitively driving more slowly. Black neighbourhoods in Chicago are far less dense than their White counterparts, leading once again to disproportionate ticketing.
This is a clear example of why debiasing technology will never be a sufficient measure for creating a more just world: even technology without bias can still amplify the structural inequalities that already exist in society.
Chicago’s “Race-Neutral” Traffic Cameras Ticket Black and Latino Drivers the Most
Events, exhibits and other things to do
Help us…
Do you know somebody who might think this newsletter is worthwhile too? Please tell them to subscribe here.
Did you read something that would be relevant for this newsletter? Then share it with us.
About the Racism and Technology Center
The Racism and Technology Center is a non-profit foundation based in the Netherlands. It uses technology as a mirror to reflect and make visible existing racist practices in (Dutch) society. As a knowledge center, it provides a platform, resources, knowledge, skills and legitimacy to anti-racism and digital rights organizations to help them create understanding of how racism is manifested in technology, with the goal of dismantling systems of oppression and injustice. Read more about the Center.
This is a monthly newsletter.
In order to unsubscribe, click here.
If you were forwarded this newsletter and you like it, you can subscribe here.
Powered by Revue