
The case against racist tech in the Dutch municipal elections, abolishing tracking-based advertising, and oxymeters designed for white skin


Racism and Technology Center

March 19 · Issue #16

Curated news, research and activism on the intersection between racism and technology.


Racism and technology in the Dutch municipal elections
Last week, all eyes in the Netherlands were on the municipal elections. On Wednesday, voters chose the city councils that will govern for the next four years. This year's elections were mainly characterised by a historically low turnout and the customary strong showing of local parties. The focus of the Racism and Technology Center, however, is of course on whether the new municipal councils and governments will put issues at the intersection of social justice and technology on the agenda.
In the two biggest cities, Amsterdam and Rotterdam, the programmes of the largest parties show a varied picture. Some parties, such as the VVD and Leefbaar Rotterdam, pay no attention to social justice issues and technology. Others, such as the PvdA, do emphasize the importance of privacy and fair government services, also in the context of automation and technology, but do not make a strong connection to social justice. D66 in Rotterdam and GroenLinks in Amsterdam do make this connection explicitly, zeroing in on discriminatory or sexist algorithms. The case against racist technologies is made most strongly by BIJ1 in Amsterdam, which dedicated an entire chapter of its programme to digital rights and technology.
It seems as if scandals such as the child benefits scandal, and perhaps also the SyRI case, have increased the understanding of how algorithms and risk profiles can be discriminatory. Even though awareness of how automated systems can exacerbate societal inequalities and injustices has grown over the past year, there is still a lot of work to be done to push this topic to the front of the political debate.
See the results of the municipal council elections here
Disinformation and anti-Blackness
In this issue of Logic, issue editor J. Khadijah Abdurahman and André Brock Jr., associate professor of Black Digital Studies at the Georgia Institute of Technology and author of Distributed Blackness: African American Cybercultures, discuss the history of disinformation from Reconstruction to the present and “the unholy trinity of whiteness, modernity, and capitalism”.
They touch on how research on mis- and disinformation has proliferated, yet has only been perceived as deeply problematic when it disrupts the interests of whiteness and white power, even though the phenomenon had already surfaced in the mid-2010s around Gamergate and troll farms attacking Black Lives Matter activists. Much of this research tends to frame disinformation as information misbehaviour rather than as a consequence of racist and xenophobic cultural misbehaviours. Relating this idea to the digital divide and the proliferation of developmental ICT projects in Pakistan, Africa and elsewhere, they underscore that liberal thinking often neglects to turn the gaze around and ask what created the conditions in these non-Western countries, and why they persist.
In the rest of the conversation, Abdurahman and Brock Jr. reckon with the notions of Western liberal rationality, logic and enlightenment that are deeply embedded in fields such as data science and information science, the ways in which anti-Blackness props up capitalism, and the role of emotions in reconstructing our societies. Reorienting the structures and values of whiteness is a slow but crucial process.
(dis)Info Studies: André Brock, Jr. on Why People Do What They Do on the Internet
72 civil society organisations to the EU: “Abolish tracking-based online advertising”
The Racism and Technology Center co-signed an open letter asking the EU member states to make sure that the upcoming Digital Services Act will abolish so-called ‘dark patterns’ and advertising that is based on tracking and harvesting personal data.
Current advertising practices allow for “the manipulation of the public debate, discrimination, and the amplification of harmful content.” Furthermore, …
… tracking-based advertising allows the exploitation of people’s vulnerabilities, perpetuating already existing biases and marginalisation, and can lead to the exclusion of certain groups of people, such as women and older people when it comes to [for example] job ads.
Tracking-based advertising also enables the exclusion of Black people, as we know. Regulation that puts an end to these practices is long overdue.
Open Letter: Abolish manipulative dark patterns and creepy online ads, ask 72 civil society organisations
Racist Technology in Action: Oxygen meters designed for white skin
‘Oximeters’ are small medical devices used to measure the level of oxygen in someone’s blood. The device is clipped over a finger and beams specific frequencies of light through the skin to measure the oxygen saturation of the blood. These small and relatively inexpensive devices have played an important role in the COVID-19 pandemic, as oxygen saturation is one of the main indicators of whether someone suffering from COVID is in immediate need of medical help or can ride out the illness at home. One of the dangers of COVID is that oxygen levels can drop without the patient noticing, so-called ‘silent hypoxia’, which can mean that by the time the patient becomes breathless and notices, there is very little time left to get help. Oximeters can detect dropping oxygen levels sooner and make sure people get help in time.
The problem, however, is that these devices were developed on white skin and do not work as accurately on darker skin. The light beamed through the skin to measure the oxygen saturation is reflected back more strongly by darker skin, which can make the device show higher oxygen levels than are actually present in the blood. This means that people with darker skin may go to the hospital later than people with white skin, which could be part of the explanation for the higher death rates among people with darker skin.
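To make this more concrete, below is a minimal, illustrative Python sketch of the generic ‘ratio of ratios’ calculation that underlies pulse oximetry, using a commonly cited textbook linear approximation of the calibration curve. It is not the algorithm of any actual device, and the size of the perturbation in the example is a made-up number chosen purely for illustration. The point is that a small shift in the measured light-absorption ratio, for instance caused by absorption the calibration never accounted for, translates into a saturation readout that is several points too high.

# Illustrative sketch only (not the algorithm of any real oximeter): the generic
# "ratio of ratios" calculation behind pulse oximetry, with a commonly cited
# textbook linear approximation of the calibration curve.

def ratio_of_ratios(red_ac, red_dc, ir_ac, ir_dc):
    # Pulsatile (AC) over baseline (DC) signal, red versus infrared light.
    return (red_ac / red_dc) / (ir_ac / ir_dc)

def spo2_estimate(r):
    # Textbook approximation: SpO2 (%) is roughly 110 - 25 * R.
    return 110.0 - 25.0 * r

# Made-up signal amplitudes for a patient whose true saturation is about 88%,
# a level that clearly needs medical attention.
r_true = ratio_of_ratios(red_ac=0.022, red_dc=1.0, ir_ac=0.025, ir_dc=1.0)

# Hypothetical perturbation: if the optical signal is distorted by absorption the
# calibration never accounted for, the device works with a shifted ratio.
# The 0.15 shift is an invented number, used only to illustrate the effect.
r_measured = r_true - 0.15

print(f"true saturation:     {spo2_estimate(r_true):.1f}%")      # 88.0%
print(f"reported saturation: {spo2_estimate(r_measured):.1f}%")  # 91.8%
# A small shift in R moves the readout by several percentage points,
# enough to make a dangerously low oxygen level look reassuring.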
What is wrong with these oximeters is the same blind spot that explains automated soap dispensers not working for darker skin: technologies are developed by, and only tested on, white people. In the case of the pandemic, however, this racist technology compounded with other factors of systemic racism, resulting in higher rates of illness, long-term effects and death for people of colour (here for Amsterdam, here for the UK and here for the US).
Covid: Pulse oxygen monitors work less well on darker skin, experts say
Events, exhibits and other things to do
Help us…
Do you know somebody who might think this newsletter is worthwhile too? Please tell them to subscribe here.
Did you read something that would be relevant for this newsletter? Then share it with us.
About the Racism and Technology Center
The Racism and Technology Center is a non-profit foundation based in the Netherlands. It uses technology as a mirror to reflect and make visible existing racist practices in (Dutch) society. As a knowledge center, it provides a platform, resources, knowledge, skills and legitimacy to anti-racism and digital rights organizations to help them create an understanding of how racism is manifested in technology, with the goal of dismantling systems of oppression and injustice. Read more about the Center.
This is a monthly newsletter.
In order to unsubscribe, click here.
If you were forwarded this newsletter and you like it, you can subscribe here.