How Google pushes sexist clichés

"Google Images" gives the world a face: if you want to see what something or someone looks like, you will probably google it. A study by Deutsche Welle, however, shows how distorted the results produced by the search giant's algorithms can be: DW journalists analyzed more than 20,000 images and websites.

The result: image searches in English for the terms "Dominican women", "Brazilian women", "Thai women" or "Ukrainian women" return far more young women in provocative poses and revealing clothing than a search for "American women" does. If, on the other hand, you search Google in English for "German women", you will mainly find politicians and athletes.

This pattern is already visible to the naked eye, and simple searches confirm this. Quantifying and analyzing the results is more difficult, however. The definition of what constitutes a sexualized image is naturally subjective and depends on cultural, moral and social influences.

What makes a picture suggestive?

To classify thousands of images, DW's analysis relied on Google's own "Cloud Vision SafeSearch", an artificial intelligence application. The image-recognition software is trained to detect explicit content. Google uses it to flag images as "racy" when the people depicted wear "skimpy or sheer clothing", when nudity is strategically covered, or when poses are suggestive or provocative. Close-ups of sensitive body areas are also detected.
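As an illustration, a minimal sketch of how such a SafeSearch query might look. The google-cloud-vision Python client, valid API credentials, and the "LIKELY or higher" threshold for counting an image as suggestive are all assumptions here, not details confirmed by DW's published methodology:

```python
# SafeSearch rates each category (adult, racy, ...) on a likelihood scale.
LIKELIHOODS = ["UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
               "POSSIBLE", "LIKELY", "VERY_LIKELY"]

def is_suggestive(racy: str) -> bool:
    """Count an image as suggestive if 'racy' is rated LIKELY or higher
    (assumed threshold)."""
    return LIKELIHOODS.index(racy) >= LIKELIHOODS.index("LIKELY")

def classify_image(path: str) -> bool:
    """Query Cloud Vision SafeSearch for one image file.
    Requires `pip install google-cloud-vision` and API credentials."""
    from google.cloud import vision  # imported lazily; needs credentials
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    annotation = client.safe_search_detection(image=image).safe_search_annotation
    return is_suggestive(annotation.racy.name)
```

Aggregating `classify_image` over the first 100 results per search term would yield the kind of percentages quoted below.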

When searching for women from the Dominican Republic and Brazil, Google delivers up to 40 percent lewd images. In contrast, the rate for searches for "American women" is five percent and for searches for "German women" is four percent.

Results of Google's image search for "German women" and "Brazilian women"

The use of such algorithms is controversial, especially since this type of program is subject to as many prejudices and cultural constraints as a human viewer - if not more. Elsewhere, this particular software has even produced racist results. In addition, Google's AI works as a closed system, which can introduce further distortions. Nevertheless, a manual review of all images that Cloud Vision categorized as presumably explicit showed that the results are still useful: they give an insight into how Google's in-house technology evaluates the images displayed by its own search engine.

Links to relevant websites

Every image shown on the results page also links to the website that originally published it. While some of the images are not overtly suggestive, many of these sites provide content that reduces women to sex objects. Latin American women as well as women from Eastern Europe and Southeast Asia are often portrayed here in a blatantly clichéd manner.

To find out how many image results lead to such websites, DW searched the short description that appears directly under each image in the search results gallery for words such as "marry", "dating", "sex" or "hottest". Every website with at least one of these keywords in its title was then checked manually to see whether it actually contained sexist content.
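The keyword pre-filter described above can be sketched in a few lines of Python. The exact word list and matching rules DW used are not spelled out here, so both are assumptions:

```python
# Hypothetical keyword pre-filter: flag results whose caption contains
# any of the listed terms; flagged sites are then reviewed manually.
KEYWORDS = {"marry", "dating", "sex", "hottest"}

def flag_for_review(caption: str) -> bool:
    """Return True if the caption contains any keyword (case-insensitive)."""
    words = set(caption.lower().split())
    return bool(words & KEYWORDS)
```

A match only queues the page for manual review; it is not by itself treated as evidence of sexist content.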

The results show how women from certain countries are almost completely reduced to sexual objects. Of the first 100 search results that were displayed after an image search for the terms "Ukrainian women", 61 linked to this type of content. The search for women from the Czech Republic, Moldova or Romania as well as women from South America and Southeast Asia produced similar results.

Searching images for Western European women leads to such results, but far less often. Out of 100 images that a search for "German women" produced, only 16 were classified as suggestive, and just six for French women.

Stereotypes fuel web results

While the keywords are telling, they don't tell the whole story: a significant portion of the results can be found on websites with addresses such as "toprussianbrides.com", "hotlatinbrides.org" and "topasiabrides.net". The majority of these pages are presented as international marriage agencies or so-called "mail-order bride" services. These offers promise to match men with women of certain nationalities for a fee.

Others offer stereotypical instructions for finding a partner from certain cultures or come up with reviews of niche apps for finding a partner. It quickly becomes clear who the target audience is: Western men who are either looking for a submissive foreign wife or a sexual partner. "Have you mainly dealt with career-oriented Western women up to now? Then a relationship with a Ukrainian bride will feel very different to you," says one such website, which featured prominently in the first row of Google results.

Tamara Zlobina is editor-in-chief of the Ukrainian internet magazine "Gender in Detail". This type of portrayal, she suspects, is related to a phenomenon that first appeared in her country in the 1990s: "After the collapse of the Soviet Union, Ukraine was an extremely poor country. Many women went to Western Europe to earn money for their families," she says.

In the meantime, this reality is changing rapidly, as the economic situation in the country - and with it the educational and professional prospects of Ukrainian women - has improved significantly. "I would rather see reporting on diplomats, politicians, revolutionaries or the women who fight in our border wars. We have a lot of wonderful women. That should be visible - not this bridal market!"

According to Sirijit Sunanta, Professor of Multicultural Studies at Mahidol University in Bangkok, existing clichés also shape the way Thai women are portrayed on the internet: "Thailand is seen as a kind of Disneyland for prostitution and sex tourism. That carries over to the internet when you do a Google search," Sunanta told Deutsche Welle. And she adds: "When stereotypes are applied to women of a particular nationality, they are harmful on another level: they reduce these women's complexity. Women are different everywhere."

A question of language

The language used for the search can also influence the results. If you search for "Brazilian women" in English or for "mulheres brasileiras" in Portuguese - which means exactly the same thing - Google delivers different amounts of sexualized content: in the English search, 41 out of 100 images are marked as suggestive; for "mulheres brasileiras" the number drops to nine. The situation is similar with other local languages.

"The data fed into the algorithms reflects the perceptions, prejudices and consumption patterns of only a part of humanity," says Renata Avila, a fellow at Stanford's Institute for Human-Centered Artificial Intelligence in the US. "These prejudices are not solely due to technology, but often to cultural factors. Women of certain nationalities are assigned sexual and service roles by a male, English-speaking culture," she told DW.

According to her observations, these are not isolated cases, but part of a deeper, systemic problem. Fairer algorithms simply do not fit the business model of global "big tech" companies, whose primary concern is collecting data and increasing information turnover.

The Brazilian lawyer Joana Varon, founder of the think tank "Coding Rights", sees it similarly. Search engines tend to reproduce the type of content that is already widely available online, according to Varon. White men from developed countries also have more access to the tools and strategies needed to publish content, which in turn drives page views. "If an algorithm does nothing to compensate for this, then it is racist, sexist and patriarchal," says the lawyer, adding: "Commercial algorithms and their providers should take responsibility for what they generate." Otherwise, they amplify an oppressive worldview through a global search tool.

"Big tech companies need to be regulated!"

She calls for more oversight, more transparency and more competition. "There must be no monopoly on all services. Large tech companies need to be regulated." At the same time, alternative tools should be promoted - ones that do not originate from the same paradigm. Renata Avila agrees: "We need a new kind of technology based on transparency and accountability." That, she says, is largely incompatible with today's thinking in Silicon Valley. "So there can only be solutions with the participation of the diverse, global community."

Deutsche Welle presented the Google press office with a list of questions about the biased behavior of the image search algorithm. The company did not answer the questions in detail. Instead, it sent a statement acknowledging that the search results showed "explicit or unsettling content", "including results that reflect negative stereotypes and prejudices that exist on the web". This is a problem because it has "unequal effects on women and women of color".

According to Google, the content that appears in search results is influenced by the way information is organized and tagged on the internet. The company said it was working to find "scalable solutions" to such problems. However, these were not disclosed in detail.

The full explanation from Google, the data, code, and methodology behind this analysis can be found in this GitHub repository.