Algorithms can steer cars, convict criminals and handle banking – and sometimes they get things embarrassingly wrong. The stories are as numerous as they are painful:

An algorithm meant to help Amazon select job candidates suggested mostly men, Google's image recognition confused Black people with gorillas and asked Asian-looking people to open their eyes. And YouTube attached videos of the 9/11 terrorist attacks in New York to footage of the great fire at Notre-Dame. Just a bad algorithm?

The criticism of algorithms has recently been taken up by the women's and equality ministers of the German states. What bothers them is the auto-completion in search engines. Anyone typing “women” into the search field of Microsoft's search engine is offered, as the first suggestion, to complete the phrase with “can't drive”. The federal government is now to examine whether the algorithms are based on “mechanisms of discrimination”.

Lorena Jaume-Palasí considers this approach wrong. She is the founder of the Ethical Tech Society in Berlin, which deals with the social dimension of technology. “The cause of discrimination is always people,” Jaume-Palasí told the German press agency dpa. “Instead of regulating the causes of discrimination, the focus is placed on the technology, which merely reflects human discriminatory behaviour.”

To understand Jaume-Palasí's criticism, one has to understand algorithms. An algorithm is a set of instructions for solving a particular problem – much like a recipe. It defines how a certain process is to be carried out. Artificial intelligence (AI) is based on algorithms. Here, intelligent behaviour is simulated: the machine is supposed to make sound decisions on its own. So that it can, it is fed with data sets. The technology detects relationships and can then, on the basis of this background knowledge, produce decision recommendations, for example.
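
A rough illustration of that difference, with invented numbers and labels rather than any real system: a classic algorithm is a rule written down by a human, while an AI-based model derives its own rule from example records.

# Minimal sketch with made-up data: a hand-written rule versus a learned one.
from sklearn.tree import DecisionTreeClassifier

# 1) A classic algorithm: a fixed rule, like a recipe.
def classic_rule(score):
    return "invite" if score >= 75 else "reject"

# 2) An AI-based approach: the machine is fed records and detects the
#    relationship between the scores and the outcomes on its own.
X_train = [[60], [70], [85], [90], [55], [80]]          # example test scores
y_train = ["reject", "reject", "invite", "invite", "reject", "invite"]

model = DecisionTreeClassifier().fit(X_train, y_train)

print(classic_rule(82))             # rule written by a human
print(model.predict([[82]])[0])     # recommendation learned from the data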

One explanation for the biased results, for example in the search for suitable personnel: “In the past, companies had a recruitment practice in which mostly white men were hired,” says Susanne Dehmel of the digital industry association Bitkom.

Train the algorithm on this historical data, and the selection of candidates turns out accordingly. In the case of the face recognition that discriminated against Black and Asian-looking people, too, chances are that the problem was not the algorithm. What was questionable was probably rather the selection of images with which the machine was trained.
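
How training data shapes the recommendation can be sketched with an entirely hypothetical example: if the historical records fed to the model contain only hired men, the learned recommendation reproduces exactly that pattern.

# Hypothetical hiring records: [qualification score, gender (0 = male, 1 = female)].
# In this invented history, only men were hired, regardless of qualification.
from sklearn.tree import DecisionTreeClassifier

X_train = [[80, 0], [75, 0], [90, 1], [85, 1], [70, 0], [95, 1]]
y_train = ["hire", "hire", "no", "no", "hire", "no"]

model = DecisionTreeClassifier().fit(X_train, y_train)

# The recommendation for a highly qualified woman mirrors the old practice.
print(model.predict([[98, 1]]))     # ['no']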

In many cases, then, artificial intelligence does not yet live up to the claim of solving problems as well as or better than humans. Dehmel can see something positive in these results: the technologies hold up a mirror to society and show how acute discrimination still is. Giving search engine providers an educational mandate to spit out content that is as free of discrimination as possible is something Dehmel considers wrong. They filter by relevance, and the results follow from what people click on. “The demand would overload the function.”

That Google does intervene in the presentation of results is shown by various examples, such as those collected by the US professor Safiya Umoja Noble. In her book “Algorithms of Oppression”, published in 2018, she criticised, among other things, racist and stereotypical suggestions for auto-completing the English phrase “Why are black women so”. Anyone typing that phrase into the search engine today is offered no completions.

The dispute between Bettina Wulff and Google over the combination of her name with defamatory terms in the autocomplete function also drew great attention in Germany in 2015. The company relented and removed the suggestions about the wife of the then federal president.

Independently of this, Google announced in June 2018 that it would pay special attention to ensuring that its software contains no “unfair bias” and does not discriminate by skin colour, gender, sexual orientation or income.

One person who wanted to take matters into her own hands is Johanna Burai. The designer from Sweden was bothered by the results when searching for photos of hands: on Google, Burai found mainly images of white hands. In 2015 she launched a campaign called “World White Web” to take on the “white standard” on the web. On a website she has since uploaded numerous photos of hands of different skin colours that others can share, hoping to flush other images into the system in the long run.

But how can it be ensured that AI technologies will make wiser and fairer decisions in the future? Expert Dehmel sees no need for legislation. “It’s a competence problem. Only if you understand how the technology works can you also address discrimination carefully,” says Dehmel. Examples from the past have shown that it is not enough to remove information on gender and skin colour from the data sets: algorithms found substitute variables in the historical data and arrived at the same discriminatory results. Dehmel therefore proposes a variety of data sets and a careful design of the training and test runs.
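
The substitute-variable problem can be sketched with another invented example: even when the gender column is removed, a correlated proxy column – here a made-up flag for having attended a women's college – lets the model learn the same bias.

# Hypothetical records without a gender column: [test score, attended_womens_college].
# In this invented past, candidates from the women's college were rejected
# regardless of their score, so the proxy column encodes the old discrimination.
from sklearn.tree import DecisionTreeClassifier

X_train = [[90, 1], [85, 1], [88, 0], [70, 0], [92, 1], [65, 0]]
y_train = ["reject", "reject", "invite", "invite", "reject", "invite"]

model = DecisionTreeClassifier().fit(X_train, y_train)

# A strong candidate from the women's college is still rejected,
# even though gender itself never appears in the data.
print(model.predict([[95, 1]]))     # ['reject']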

Jaume-Palasí calls for continuous oversight of algorithmic systems. Behind AI there should be more than one developer and one data scientist. For that oversight, “you need sociologists, anthropologists, ethnologists, political scientists. People who can better contextualise the results in the respective sector in which the technology is applied,” says the data ethicist. “We finally need to say goodbye to the belief that AI is a computer science topic. These are socio-technical systems, and the professional profiles we need are much more diverse.”

dpa