Just a number?
This blog post is edited from an article by Rachel Botsman published in Wired magazine (http://www.wired.co.uk/article/chinese-government-social-credit-score-privacy-invasion). It paints such a dramatic picture of the way we might soon be living – and, in China, very soon indeed – that we thought we should point you towards the full article. Welcome to the future.
On June 14, 2014, the State Council of China published an ominous-sounding document called “Planning Outline for the Construction of a Social Credit System”.
The system is designed to gather data from practically every activity that a citizen would undertake. All these behaviours are then rated as either positive or negative and distilled into a single number, according to rules set by the government. This creates a Citizen Score, telling everyone whether or not a citizen is trustworthy.
The Chinese government is pitching the system as a desirable way to measure and enhance “trust” nationwide and to build a culture of “sincerity”. As the policy states, “It will forge a public opinion environment where keeping trust is glorious. It will strengthen sincerity in government affairs, commercial sincerity, social sincerity and the construction of judicial credibility.”
By 2020, participation in China’s Citizen Scores will be mandatory. The behaviour of every single citizen and legal person (which includes every company or other entity) in China will be rated and ranked, whether they like it or not. A trial of the system is currently being run by Alibaba and Tencent (which runs the messaging app WeChat, with over 850 million active users) and includes data from many other sources, including the Chinese equivalent of Uber and Baihe, the country’s largest online matchmaking service. A complex algorithm then makes a machine-based judgement on how ‘trustworthy’ a citizen is. Apparently.
This judgement is measured by a score ranging between 350 and 950 points and is based on five key factors. The first is credit history. For example, does the citizen pay their electricity or phone bill on time? Next is fulfilment capacity, which is defined in guidelines as “a user’s ability to fulfil his/her contract obligations”. The third factor is personal characteristics, verifying personal information such as someone’s mobile phone number and address. But the fourth category, behaviour and preference, is where it gets interesting.
Under this system, something as innocuous as a person’s shopping habits becomes a measure of character. Alibaba admits it judges people by the types of products they buy. “Someone who plays video games for ten hours a day, for example, would be considered an idle person,” says Li Yingyun, Sesame Credit’s Technology Director. “Someone who frequently buys diapers would be considered as probably a parent, who on balance is more likely to have a sense of responsibility.” So the system not only investigates behaviour – it shapes it. It “nudges” citizens away from purchases and behaviours the government does not like.
Friends matter, too. The fifth category is interpersonal relationships. What does their choice of online friends and their interactions say about the person being assessed?
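How those five factors are actually combined is not public – neither Alibaba nor the government has disclosed the weightings. Purely as an illustration of the mechanics described above, the sketch below blends five hypothetical factor sub-scores into a single number on the 350–950 scale. Every name and weight in it is an assumption, not Sesame Credit's real algorithm.

```python
# Illustrative only: the real scoring algorithm and its weights are not public.
# This sketch shows how five normalised factor sub-scores (0.0 to 1.0) could be
# blended into one number on the 350-950 scale described in the article.

FACTOR_WEIGHTS = {                      # hypothetical weights, not Alibaba's
    "credit_history": 0.35,
    "fulfilment_capacity": 0.25,
    "personal_characteristics": 0.10,
    "behaviour_and_preference": 0.20,
    "interpersonal_relationships": 0.10,
}

SCORE_MIN, SCORE_MAX = 350, 950

def citizen_score(factors: dict) -> int:
    """Map weighted factor sub-scores (each 0.0-1.0) onto the 350-950 range."""
    weighted = sum(FACTOR_WEIGHTS[name] * factors.get(name, 0.0)
                   for name in FACTOR_WEIGHTS)
    return round(SCORE_MIN + weighted * (SCORE_MAX - SCORE_MIN))

# Example: a prompt bill-payer whose shopping habits and online friends score poorly.
print(citizen_score({
    "credit_history": 0.9,
    "fulfilment_capacity": 0.8,
    "personal_characteristics": 1.0,
    "behaviour_and_preference": 0.3,
    "interpersonal_relationships": 0.4,
}))  # -> 779
```

The point of the sketch is not the arithmetic but the reduction: however the sub-scores are gathered, everything a person does ends up flattened into one number.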
Posting dissenting political opinions or links mentioning Tiananmen Square has never been wise in China, but now it could directly hurt a citizen’s rating. Worse, a person’s own score is also affected by what their online friends say and do, beyond their own contact with them. If someone they are connected to online posts a negative comment, their own score will be dragged down too.
If a citizen’s score reaches 600, they can take out a Just Spend loan of up to 5,000 yuan (around £565) and use it to shop online, as long as it’s on an Alibaba site. Reach 650 points and they may rent a car without leaving a deposit. They are also entitled to faster check-in at hotels and use of the VIP check-in at Beijing Capital International Airport. Those with more than 666 points can get a cash loan of up to 50,000 yuan (£5,700). Get above 700 and they can apply for Singapore travel without supporting documents such as an employee letter. And at 750, they get a fast-tracked application for a coveted pan-European Schengen visa. The best way to understand the system is as a sort of bastard love child of a loyalty scheme.
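The perks above amount to a simple tiered lookup. The sketch below merely restates the thresholds reported in the article as data; the function name and the assumption that each tier uses a “score at or above the threshold” rule are ours, not any real API.

```python
# Illustrative restatement of the score tiers and perks quoted in the article.
PERK_TIERS = [  # (approximate minimum score, perk), in ascending order
    (600, "Just Spend loan of up to 5,000 yuan, spendable on Alibaba sites"),
    (650, "car rental without a deposit; faster hotel and VIP airport check-in"),
    (666, "cash loan of up to 50,000 yuan"),
    (700, "Singapore travel application without supporting documents"),
    (750, "fast-tracked pan-European Schengen visa application"),
]

def perks_for(score: int) -> list[str]:
    """Return every perk unlocked at or above each tier's threshold (assumed rule)."""
    return [perk for threshold, perk in PERK_TIERS if score >= threshold]

print(perks_for(720))  # unlocks the first four tiers, but not the Schengen fast track
```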
Higher scores have already become a status symbol, with almost 100,000 people bragging about their scores on Weibo (the Chinese equivalent of Twitter) within months of launch. A citizen’s score can even affect their odds of getting a date, or a marriage partner, because the higher their Sesame rating, the more prominent their dating profile is on Baihe.
We’re also bound to see the birth of reputation black markets selling under-the-counter ways to boost trustworthiness. In the same way that Facebook Likes and Twitter followers can be bought, individuals will pay to manipulate their score. What about keeping the system secure? Hackers (some even state-backed) could change or steal the digitally stored information.
The new system reflects a cunning paradigm shift. The government is attempting to make obedience feel like gaming: a method of social control dressed up as a points-and-rewards system. It’s gamified obedience.
Currently, people are not penalised for being “untrustworthy” – it’s more effective to lock people in with treats for good behaviour. But the system will eventually mean that untrustworthy people cannot rent a car, borrow money or even find a job. China’s Education Bureau has been approached about sharing a list of its students who cheated on national examinations, in order to make them pay for their dishonesty long into the future.
Penalties are set to change dramatically when the government system becomes mandatory in 2020. Indeed, on September 25, 2016, the State Council General Office updated its policy entitled “Warning and Punishment Mechanisms for Persons Subject to Enforcement for Trust-Breaking”. The overriding principle is simple: “If trust is broken in one place, restrictions are imposed everywhere,” the policy document states.
China’s trust system might be voluntary as yet, but it’s already having consequences. In February 2017, the country’s Supreme People’s Court announced that 6.15 million of its citizens had been banned from taking flights over the past four years for social misdeeds. The ban is being pointed to as a step toward the blacklisting planned under the Social Credit System. “We have signed a memorandum… [with over] 44 government departments in order to limit ‘discredited’ people on multiple levels,” says Meng Xiang, head of the executive department of the Supreme Court. Another 1.65 million blacklisted people cannot take trains.
Where these systems really descend into nightmarish territory is in how unfairly reductive the trust algorithms are. They don’t take context into account. For instance, one person might miss paying a bill or a fine because they were in hospital; another may simply be a freeloader. And therein lies the challenge facing all of us in the digital world, not just the Chinese. If life-determining algorithms are here to stay, we need to figure out how they can embrace the nuances, inconsistencies and contradictions inherent in human beings, and how they can reflect real life.
You could see China’s so-called trust plan as Orwell’s 1984 meets Pavlov’s dogs. Act like a good citizen, be rewarded and be made to think you’re having fun. It’s worth remembering, however, that personal scoring systems have been present in the west for decades.
The majority of Chinese people have never had a credit score, and so they cannot get credit.
This lack of a national credit system is why the government is adamant that Citizen Scores are long overdue and badly needed to fix what it calls a “trust deficit”.
The government also argues that the system is a way to bring in those people left out of traditional credit systems, such as students and low-income households. The guiding principle is that the behaviour of the majority is determined by their world of thoughts: a person who believes in socialist core values, the reasoning goes, behaves more decently.
Indeed, the State Council’s aim is to raise the “honest mentality and credit levels of the entire society” in order to improve “the overall competitiveness of the country”.
Governments around the world are already in the business of monitoring and rating. In the US, the National Security Agency (NSA) is not the only official digital eye following the movements of its citizens. In 2015, the US Transportation Security Administration proposed the idea of expanding the PreCheck background checks to include social-media records, location data and purchase history. The idea was scrapped after heavy criticism, but that doesn’t mean it’s dead. We already live in a world of predictive algorithms that determine if we are a threat, a risk, a good citizen and even if we are trustworthy. We’re getting closer to the Chinese system – the expansion of credit scoring into life scoring – even if we don’t know we are.
So are we heading for a future where we will all be branded online and data-mined? It’s certainly trending that way. Barring some kind of mass citizen revolt to wrench back privacy, we are entering an age where an individual’s actions will be judged by standards they can’t control and where that judgement can’t be erased. The consequences are not only troubling; they’re permanent. Forget the right to delete or to be forgotten, to be young and foolish.
While it might be too late to stop this new era, we do have choices and rights we can exert now. For one thing, we need to be able to rate the raters. In his book The Inevitable, Kevin Kelly describes a future where the watchers and the watched will transparently track each other. “Our central choice now is whether this surveillance is a secret, one-way panopticon – or a mutual, transparent kind of ‘coveillance’ that involves watching the watchers,” he writes.
Our trust should start with individuals within government (or whoever is controlling the system). We need trustworthy mechanisms to make sure ratings and data are used responsibly and with our permission.
It is still too early to know how a culture of constant monitoring plus rating will turn out. What will happen when these systems, charting the social, moral and financial history of an entire population, come into full force? How much further will privacy and freedom of speech (long under siege in China) be eroded? Who will decide which way the system goes? These are questions we all need to consider, and soon. Today China, tomorrow a place near you. The real questions about the future of trust are not technological or economic; they are ethical.
If we are not vigilant, distributed trust could become networked shame. Life will become an endless popularity contest, with us all vying for the highest rating that only a few can attain.
This is an edited version of an article published in the November 2017 issue of WIRED magazine, featuring an extract from Who Can You Trust? How Technology Brought Us Together and Why It Might Drive Us Apart (Penguin Portfolio) by Rachel Botsman, published on October 4. Since this piece was written, The People’s Bank of China delayed the licences to the eight companies conducting social credit pilots. The government’s plans to launch the Social Credit System in 2020 remain unchanged.
LAW Creative is an award-winning marketing agency with proven expertise in digital marketing. To learn more, contact keith.sammels@lawcreative.co.uk.