Artificial intelligence is inherently sexist

Some time ago, hundreds of thousands of people submitted their self-portraits to an online beauty pageant. Why not? Instead of the usual panel of old men, this time the jury consisted of AI: artificial intelligence. It was meant to be the beauty pageant to outclass all other beauty pageants: not as humiliating as a Miss Universe contest, but objective, sober, impartial. The AI software was supposed to be the digital equivalent of the magic mirror in Snow White. There was only one problem: although the participants came from more than 100 countries, only a single person with dark skin was among the 44 winners ultimately chosen.

A similar story played out on Facebook just a few weeks ago. There you could watch friends have their faces automatically edited by an AI program called FaceApp. The images could be aged from young to old or swapped from male to female. The app also offered an option called "hot", which let the software make you more attractive. There was just one problem: the program's enhancements were always the same. The eyes became a little rounder, the skin tone a little lighter, the noses a little smaller.

Both the organizers of the beauty contest and the makers of the app now insist that the preference for white, Western ideals of beauty was never intended, but is merely a consequence of the data collections on which their software is based. There are plenty of such data sets. One contains around 14 million pictures of people of all ages and origins, another some 300,000 photos of everyday objects in their natural surroundings, yet another thousands of videos showing nothing but different facial features and movements. Such collections are the obvious first stop for AI developers who want to show their programs how the world works.

AI trains on data from the web - and learns its prejudices along the way

But whether it is about pictures or words, contexts or associations, the problem always remains the same: the supposed objectivity of machines is fed by the subjectivity of people. For almost 30 years, humanity has been pouring its knowledge into the World Wide Web. When AI software is trained on these data collections, it also learns the prejudices and clichés stored there. In the case of the beauty programs, it was people who told the software that white counts as more beautiful than black. A computer program cannot question such a rule. Like a child, it takes it at face value and judges accordingly.
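
How literally a program takes such a rule can be shown with a few lines of code. The following sketch, written in Python with the scikit-learn library, uses entirely invented data: the features, numbers and "beauty" labels are illustrative assumptions, not material from the actual contest. A classifier trained on ratings that contain a prejudice simply reproduces it.

    # Minimal sketch with invented data: a classifier trained on labels that
    # encode a human prejudice simply reproduces that prejudice.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 1000

    # Two made-up features per face: skin tone (0 = dark, 1 = light) and
    # facial symmetry (a stand-in for everything else raters look at).
    skin_tone = rng.integers(0, 2, size=n)
    symmetry = rng.normal(0.0, 1.0, size=n)
    X = np.column_stack([skin_tone, symmetry])

    # Biased training labels: raters call a face "beautiful" mostly when
    # symmetry is high, but lighter skin gets an extra, unjustified bonus.
    score = symmetry + 1.5 * skin_tone + rng.normal(0.0, 0.5, size=n)
    y = (score > 1.0).astype(int)

    model = LogisticRegression().fit(X, y)

    # The learned weight on skin tone comes out large and positive: the model
    # has absorbed the raters' preference as if it were a fact about beauty.
    print("weight on skin tone:", model.coef_[0][0])
    print("weight on symmetry: ", model.coef_[0][1])

The program never decides that light skin is beautiful; it merely measures, with great precision, that its teachers thought so.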

The problem is far from trivial, and far from merely theoretical. Even today, women are less likely than men to be shown advertisements for better-paid positions on career portals, because the algorithms associate femininity with housework rather than gainful employment.
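
That association is not an abstract accusation; it can be read directly out of the word vectors that many such systems are built on. The sketch below uses the gensim library and the freely downloadable "glove-wiki-gigaword-100" embeddings, trained on Wikipedia and news text; the word lists are illustrative choices, and the exact numbers will differ with other embeddings.

    # Sketch: word vectors learned from large text corpora encode the
    # association between gendered pronouns and career versus household words.
    import gensim.downloader as api

    vectors = api.load("glove-wiki-gigaword-100")

    career_words = ["career", "salary", "office", "business"]
    home_words = ["home", "children", "kitchen", "family"]

    def mean_similarity(word, others):
        # Average cosine similarity between one word and a list of words.
        return sum(vectors.similarity(word, o) for o in others) / len(others)

    for word in ["she", "he"]:
        print(word,
              "career:", round(float(mean_similarity(word, career_words)), 3),
              "home:", round(float(mean_similarity(word, home_words)), 3))

Typically "she" lands measurably closer to the household words and "he" closer to the career words, and that is exactly the pattern an ad-targeting algorithm can then act on.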

Another example: job titles that are gender-neutral in one language acquire a gender once machine translation carries them into another. A sentence that says nothing more than "this person is a doctor" or "this person is a nurse" comes back from the software as "he is a doctor" and "she is a nurse". Our intelligent machines of tomorrow are just as racist and sexist as people are today.
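
Anyone who doubts this can reproduce the translation example with a few lines of Python. The sketch below assumes the Hugging Face transformers library and the publicly available Helsinki-NLP Turkish-English model; Turkish is used here only because its pronoun "o" is genuinely gender-neutral, and the exact wording of the output naturally depends on which model is loaded.

    # Sketch: translating gender-neutral Turkish sentences into English.
    # The Turkish pronoun "o" carries no gender, so any "he" or "she" in the
    # output is the model's own guess, learned from its training data.
    from transformers import pipeline

    translator = pipeline("translation", model="Helsinki-NLP/opus-mt-tr-en")

    for sentence in ["O bir doktor.", "O bir hemşire."]:  # "doctor" / "nurse"
        result = translator(sentence)[0]["translation_text"]
        print(sentence, "->", result)

With many such models the first sentence comes back with a male pronoun and the second with a female one, not because anyone programmed that rule, but because the training texts contain it.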