Germany, China, U.S. introduce behavior ratings: Algorithms decide which individuals deserve benefits

by WorldTribune Staff, March 4, 2018

Governments are testing rating systems that score each individual's value and then dole out or withhold benefits based on how people behave, an analyst said.

“We live in a world where judgment is being replaced by numbers – by scores that calculate the value of a human being, with the help of algorithms,” says Gerd Gigerenzer, director of the Harding Center for Risk Literacy at the Max Planck Institute for Human Development in Berlin.

A poor social credit score in China means you can’t book your dream vacation.

China is currently testing a “social credit system” which uses a “combination of mass surveillance and big data to score citizens,” Gigerenzer noted.

The system is currently voluntary but will be mandatory by 2020.

“At that stage, every citizen and company will be ranked whether they like it or not,” Gigerenzer said.

“If you don’t adhere to social conventions, if you search for the wrong website, if you buy too many video games, if you cross the road on a red light or even if you have friends with a low score, then your own score will fall,” Gigerenzer told the publication Der Tagesspiegel.

Citizens with poor ratings may be blocked from booking flights, purchasing high-speed train tickets or getting bank loans.

“Should your score fall too much, your children won’t be able to go to the better schools and many other limitations will apply,” Gigerenzer said.
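The mechanism Gigerenzer describes amounts to a rule-based deduction system. The sketch below illustrates the idea in Python; the rules, point values, thresholds and starting score are all invented for illustration, since the real system's internals are not public.

```python
# A hypothetical sketch of rule-based behavior scoring, per Gigerenzer's
# description. All weights and thresholds below are assumptions.

PENALTIES = {
    "visited_blacklisted_website": 50,
    "excessive_video_game_purchases": 30,
    "jaywalking": 20,
}

LOW_SCORE_FRIEND_PENALTY = 10   # assumed: deduction per low-scoring friend
LOW_SCORE_THRESHOLD = 600       # assumed cutoff for a "low" score
STARTING_SCORE = 1000           # assumed baseline

def social_credit_score(behaviors, friend_scores):
    """Deduct points for flagged behaviors and for low-scoring friends."""
    score = STARTING_SCORE
    for behavior in behaviors:
        score -= PENALTIES.get(behavior, 0)
    # A friend's low score drags yours down, as the article describes.
    score -= LOW_SCORE_FRIEND_PENALTY * sum(
        1 for s in friend_scores if s < LOW_SCORE_THRESHOLD
    )
    return max(score, 0)

# Example: two flagged behaviors plus one low-scoring friend.
print(social_credit_score(
    ["jaywalking", "excessive_video_game_purchases"],
    friend_scores=[820, 540],
))  # 1000 - 20 - 30 - 10 = 940
```

The point of the sketch is how mechanical the judgment becomes: each behavior is a fixed deduction, and a friend's score propagates into yours with no human review.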

In Germany, the private company Schufa (similar to FICO in the U.S.) assesses the creditworthiness of most residents and over 5 million companies.

Any German wanting to rent a house or get a loan is required to produce a Schufa rating. The score can be lowered “if you happen to live in a low-rent neighborhood, or even if a lot of your neighbors have bad credit ratings,” the analyst said.
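This neighborhood-based adjustment is sometimes called geoscoring. A minimal hypothetical sketch of the idea follows; the adjustment factors are invented, as Schufa's actual formula is proprietary.

```python
# A hypothetical sketch of "geoscoring": a personal credit score is
# lowered based on location and neighbors' ratings. All numbers are
# assumptions for illustration only.

def geoscored_credit(base_score, neighborhood_avg, low_rent_area):
    """Adjust a personal credit score using neighborhood-level data."""
    score = base_score
    if low_rent_area:
        score -= 25  # assumed flat penalty for a low-rent district
    if neighborhood_avg < 80:
        score -= 15  # assumed penalty when neighbors' average rating is poor
    return score

# Example: a solid personal record, penalized purely by location.
print(geoscored_credit(base_score=95, neighborhood_avg=72, low_rent_area=True))
# 95 - 25 - 15 = 55
```

Note that nothing in the example depends on the person's own conduct, which is precisely the concern Gigerenzer raises.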

As a warning against relying on such systems, Gigerenzer pointed to COMPAS, a U.S. recidivism-prediction algorithm developed to help judges with sentencing by looking at defendants’ criminal histories and predicting the likelihood that they will commit more crimes.

The algorithm turned out to be wrong in over a third of cases. In recent experiments, ordinary people with no experience in the field predicted re-offending better than the algorithm did, Gigerenzer noted.
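The comparison behind that claim is simple to state: score two sets of predictions against the same known outcomes, where “wrong in over a third of cases” means accuracy below roughly two-thirds. The sketch below uses invented data; the experiments Gigerenzer appears to refer to (Dressel and Farid, Science Advances, 2018) used actual COMPAS cases and crowdsourced human judgments.

```python
# A minimal sketch of an accuracy comparison between an algorithm and
# lay judges on the same cases. The data is invented for illustration.

def accuracy(predictions, outcomes):
    """Fraction of cases where the prediction matched the outcome."""
    hits = sum(p == o for p, o in zip(predictions, outcomes))
    return hits / len(outcomes)

outcomes  = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]  # 1 = re-offended
algorithm = [1, 1, 0, 0, 0, 1, 1, 0, 1, 1]  # hypothetical algorithm calls
laypeople = [1, 0, 0, 0, 0, 1, 1, 0, 1, 0]  # hypothetical human calls

print(accuracy(algorithm, outcomes))  # 0.6 -- wrong in over a third of cases
print(accuracy(laypeople, outcomes))  # 0.8 -- the lay judges do better here
```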

“It would be tragic if somebody’s life was destroyed just because others put blind faith in a commercial algorithm,” Gigerenzer said.

“If we don’t do anything, then one day a corporation or a government institution will pull all the information from different data banks together and come up with a social credit score,” Gigerenzer warned. “And in the end, we will be in the same state as the Chinese. At the moment we are investing billions in digital technologies, when we should be investing just as much in digital education, so that humans are aware of what algorithms really can, and cannot, do. We cannot just stand by as they are used to change our minds and our societies.”

