
Filtering out the “Asians,” the Dutch love affair with ethnic profiling, and Google's mainstream bias


Racism and Technology Center

February 21 · Issue #2

Curated news, research and activism on the intersection between racism and technology.


Filtering out the “Asians”
The article’s title speaks for itself: “Your iPhone’s Adult Content Filter Blocks Anything ‘Asian’”. Victoria Song tested the claims first made by The Independent: if you enable the “Limit Adult Websites” function in your iPhone’s Screen Time settings, you are blocked from seeing any Google search results for “Asian”. Related searches such as “Asian recipes” or “Southeast Asian” are also blocked by the adult content filter. There is no clarity or transparency about how search terms come to be classified as adult content, or whether that process is automated or manual. Regardless of intention, the outcome, and the lack of action by Google or Apple, is unsurprising but disconcerting. This is not so much a mistake as a feature of their commercial practices and of their disregard for the social harms of their business model.
One possible explanation for why “Asian” is deemed restricted content is that the word “Asian”, and relatedly “Asian girls”, is often associated with pornography, an association that Google search further hyper-sexualises. In Algorithms of Oppression, Safiya Noble illustrates how Google’s search engine reinforces existing racial and gender stereotypes while absolving itself of any responsibility. Noble tested a variety of keywords, including “Asian girls,” “Latinas,” and “black girls,” all of which returned sexualised and objectified results.
Google holds around 90% of the global search engine market. As Google remains a key portal through which people, including children, perceive and relate to the world, we worry about the misrepresentation, exclusion and erasure of histories inflicted on individuals and communities.
Your iPhone's Adult Content Filter Blocks Anything 'Asian'
The Dutch government's love affair with ethnic profiling
In his article for One World, Florentijn van Rootselaar shows how the Dutch government uses automated systems to profile certain groups based on their ethnicity. Through several examples he exposes how, even though Western countries are often quick to denounce China’s use of technology to surveil, profile and oppress the Uighurs, these same states use, or contribute to the development of, similar technologies themselves.
Central to the article is the municipality of Roermond’s use of ‘predictive policing’. In 2019, Roermond started the so-called ‘Sensing project’, in which an automated system was used to predict where police controls or police presence would be most needed. Concretely, an automated camera-based system would direct police controls towards cars with an Eastern European number plate. The core, and racist, assumption behind this focus is that theft, drug trafficking and pickpocketing are mostly committed by members of the Roma communities, many of whom come from Eastern Europe. The article references an in-depth report on the Sensing project by Amnesty International that calls the project a clear example of “automated ethnic profiling” amounting to “racial discrimination”.
Hoe Nederland A.I. inzet voor etnisch profileren (How the Netherlands uses A.I. for ethnic profiling)
The internet doesn't have ‘universal’ users
Since 2017, Mozilla – the makers of the Firefox browser – have written a yearly report on the health of the internet. This year’s report focuses on labor rights, transparency and racial justice. The piece about racial justice makes an interesting argument about how the sites we see on the first page of a search engine are a reflection of the general popularity of these sites or their ability to pay for a top result. This leads to a ‘mainstream’ bias.
That is why, over a decade ago, Arnold Brown tried to counter this mainstream bias by launching Blackbird, a version of Firefox intended for an African American audience. On television you could have channels with content geared towards a particular culture, each with a fair chance of being selected, but the internet flattened those options into a single page of search results. With Blackbird, Brown hoped to engender a community around black culture. Nowadays social media platforms provide a way to do this (‘Black Twitter’ being a prime example), but as Brown rightly notes, these platforms offer no community ownership or control.
Racial Justice: Decode the Default
Racist technology in action: Gun, or electronic device?
Image: AlgorithmWatch
The answer to that question depends on your skin colour, apparently. AlgorithmWatch reporter Nicolas Kayser-Bril conducted an experiment, which went viral on Twitter, showing that Google Cloud Vision (an automated image-labelling service built on the subfield of AI known as “computer vision”) labelled an image of a dark-skinned individual holding a thermometer with the word “gun”, while a lighter-skinned individual holding one was labelled with “electronic device”.
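
For readers curious what this kind of automated labelling looks like in practice, below is a minimal sketch of a label-detection request using Google’s official Python client for the Cloud Vision API. The filename is a placeholder and the call requires Google Cloud credentials; this is an illustration of the type of API call behind the experiment, not a reproduction of AlgorithmWatch’s setup.

# Minimal sketch: label detection with the Google Cloud Vision API.
# Assumes the google-cloud-vision package is installed and that the
# GOOGLE_APPLICATION_CREDENTIALS environment variable points at a key file.
from google.cloud import vision

def label_image(path: str) -> None:
    """Print the labels (and confidence scores) Cloud Vision assigns to an image."""
    client = vision.ImageAnnotatorClient()

    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(f"{label.description}: {label.score:.2f}")

label_image("hand_with_thermometer.jpg")  # hypothetical filename
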
Google has since apologised, citing accidental mislabelling, and has updated its algorithm. Yet this is nothing new. In 2015, in another Google incident, a photo of two dark-skinned individuals was “accidentally” tagged as “gorillas”, an error that still does not appear to be fixed. Computer vision has repeatedly been shown to produce discriminatory and racist outputs. In reality, these “accidents” and labelling “errors” have tangible consequences for individuals.
Events, exhibits and other things to do
Help us…
Do you know somebody who might think this newsletter is worthwhile too? Please tell them to subscribe here.
Did you read something that would be relevant for this newsletter? Then share it with us.
About the Racism and Technology Center
The Racism and Technology Center is based in the Netherlands. It uses technology as a mirror to reflect and make visible existing racist practices in (Dutch) society. As a knowledge center, it provides a platform, resources, knowledge, skills and legitimacy to anti-racism and digital rights organizations, helping them build an understanding of how racism manifests in technology, with the goal of dismantling systems of oppression and injustice. Read more about the Center.
This is a monthly newsletter.
If you don't want these updates anymore, please unsubscribe here.
If you were forwarded this newsletter and you like it, you can subscribe here.
Powered by Revue