Big fuss about a big policy plan – and why this matters for corporate social responsibility: the Chinese social credit system

By Dieter Zinnbauer & Hans Krause Hansen.

Few statist policy blueprints on rather technical matters have captured our collective imagination as much as the Chinese Social Credit System (SCS). Announced by China’s State Council on June 14, 2014, and building on experimentation with related mechanisms since the early 2000s, it sets out a hugely ambitious effort, officially described as a way to instil societal trust, integrity and cohesion in a highly complex society. To get there, it seeks to combine cutting-edge technology and vast amounts of data to create extremely granular behavioural profiles of both companies and individuals. Good and bad behaviours are meant to be recorded through elaborate rating systems and blacklists, and made public on digital platforms. The expectation is that punishments and rewards will deter deviance and incentivise good conduct in almost any sphere of life.

With the West in the mirror

After years of relative inattention, the SCS has loudly burst onto the Western media landscape. Here it is typically described in Orwellian terms as a totalitarian system of surveillance and control. On closer inspection, however, the SCS is in fact embryonic, fragmentary and faced with enormous implementation challenges.

But the scale, scope and level of invasiveness associated with the data collection effort currently emerging in China should not look so shockingly unprecedented to Western publics once they begin to scrutinize their own backyards. Take the use of social media in the policing of protests as an example. Here the UK government engages in the analysis of big data to predict, pre-empt and respond in real time to a range of issues, including public dissent. Or take information on someone’s physical whereabouts. As it turns out, the exact location of cell phone owners in 95% of the US is being tracked with the help of all major carriers in close to real time (ok, with a 15-second delay), and the related data is available to nudge people’s behaviour for a wide variety of purposes, e.g. by sending them last-minute campaign pitches while they wait in line outside a particular polling station, or anti-abortion messages when they are found to linger outside health clinics that carry out these procedures.

Or take the most popular new media companies. They are collecting extremely granular dockets of what their users do and say, and of who they socialise with, on their own platforms. But, less in the spotlight, they also track users and non-users alike across millions of other websites and across the bulk of the most popular mobile applications, recording anything from detailed surfing behaviour down to the mode of movement – is the user currently cycling or on the train? What’s more, they increasingly merge these profiles with billions of data points collected by other parties. One leading new media company claims to have access to information on 70% of all credit card purchases, thus approximating a rather totalitarian 360-degree, 24/7 view of user conduct, all the way down to – no kidding – the barometric pressure of the user’s environment.

Public and private entanglements

A special matter of concern in the West relating to the SCS is its fusion of socialist government and private sector capabilities, technical affordances and interests – a fusion that makes such a system feasible in the first place.

However, long gone in the West are the times when governments were the main purveyors and guardians of data about their citizens. Even the holy grail of state information prowess, the census, is not immune to private sector resources and influences. The UK government, for example, is exploring ways to make its census more cost-effective with the help of other big data sources and acknowledges that these will also have to include privately-held ones.

And there is also a proximity of big tech and political actors on a much more fundamental level. Tech companies have evolved into some of the most vocal and most prolific donors and lobbyists on the political scene. This is entirely legitimate democratic engagement, but it raises questions about outsize influence given the scale of these efforts. Much more unnerving, the leading social media and tech companies in the US seconded staff as pro-bono experts to the support teams of most presidential candidates in the run-up to the 2016 presidential election, giving them unique insights into, and connections with, the affairs of some of the leading politicians in the country.

Subtle social sorting and weak institutional safeguards

A factor that might explain the extraordinary attention the SCS has received is the breadth of sanctions and consequences that its early uses have already produced. Bad social credit makes it more difficult for Chinese citizens to travel, find a home or get a job. Unfortunately, this is nothing new and happens all over the world. Under the label of risk management, citizens whose criminal record or financial credit history contains some irregularities have long been subjected to inferior treatment when renting a home, looking for a job or seeking insurance.

In principle, the protection of individual rights and the limits on state over-reach and surveillance in most western countries rely on a host of elaborate institutional safeguards, checks and balances. While some of the egregious examples referenced above were actually remedied when they were exposed, thus attesting to some degree of efficacy of legal and broader societal protections, other incidents have not been resolved and are somehow even seen as acceptable.

So it is more than warranted to shift some of the attention and moral outrage directed at the Chinese SCS back to the home turf, and to investigate the troubling data practices and regulatory gaps that are germinating over here. In the wake of the Facebook and Cambridge Analytica scandals this has begun to happen, and more commentators are noting the troublesome parallels between the Chinese SCS and emergent data surveillance and discrimination issues in the West.

Enter the urgent business of business

And this is where business and its social responsibility come in. One of the fundamental differences between the SCS and many issues in the West is that the disciplinary power, control functions and discriminatory implications of big data-driven social scoring are not primarily organised and instrumentalised through government, but are deployed by the private sector, working their way into everyday lives.

Egged on by a growing populist tech-lash, a whirlwind of new regulatory efforts and, undoubtedly in many cases, a deeper commitment to doing no harm, the new tech companies have begun to take note, moving from denial to a gradual re-examination of some of their working principles, practices and normative anchoring.

Yet it remains to be seen whether this amounts to a substantive change of hearts and minds. The performance of the new tech sector on some standard measures of corporate integrity and transparency is still mediocre, lagging behind many other established industries.

The path to a much more comprehensive, proactive and transformational integration of corporate social responsibilities into the strategy and practice of tech will have to coalesce around a broad range of issues, from responsible stewardship of data, platform power and emergent artificial intelligence capabilities to bread-and-butter CSR issues such as responsible corporate political activity and supply chain and subsidiary integrity.

Think tanks and tech activists are putting forward a sprawling pool of ideas and initiatives, from data collaboratives and privacy-by-design standards to high-profile research endeavours into artificial intelligence ethics. Meanwhile, European regulators are putting trailblazing rules into force as we write this column.

But a big tech embrace of a substantive and comprehensive notion of corporate social responsibility is urgently required to stave off the threat of an even more populist, illiberal, unequal, misogynistic and fragile future in which the tech industry is more a part of the problem than a solution to it.


Dieter Zinnbauer is Governing Responsible Business Research Fellow at Copenhagen Business School in the Department of Management, Society and Communication.

Hans Krause Hansen is Professor at the Department of Management, Society and Communication at Copenhagen Business School. He teaches and researches about various aspects of public and private governance, including corruption, anti-corruption and transparency regimes in the global North and South.

 

Pic by Alias, Flickr.

Droned

by Glen Whelan.

A Military Heritage

A drone is an unmanned aircraft. Long used to refer to male honeybees – whose main function is to fertilize a receptive queen bee (and then die a seemingly horrific death) – the word was first applied to remote-controlled aircraft by the US Navy back in the 1930s. The word was chosen as a homage to ‘the Queen Bee’, a remote-control aircraft that the Royal Navy demonstrated to the US Navy, and that inspired the US Navy to develop similar aircraft.

In the 1990s, the word drone was being used as a verb to describe the act of turning a piloted aircraft into an unpiloted one.[i] And by 2009, it was being used to describe the act of remotely killing someone. As Fatima Bhutto wrote in 2009:

“Droned” is a verb we use now in Pakistan. It turns out, interestingly enough, that those US predator drones that have been killing Pakistani citizens almost weekly have been taking off from and landing within our own country. Secret airbases in Balochistan – what did we ever do before Google Earth? [ii]

Various Civilian Uses

With the development of consumer-market autonomous drones[iii] that can be told to follow you or another person, it seems that the word ‘droned’, or ‘droning’, will soon be used more regularly. Rather than just describing acts of murder (or defense), however, it seems it will come to refer to the act of being filmed or recorded by (autonomous) flying devices more generally.

Such filming will clearly be a good thing for legitimate film-making. And there are possibilities for autonomous drones to be used to improve accountability: as a form of sousveillance in response to surveillance by the powerful. But drones have other uses as well. Indeed, there are already numerous cases around the world of drones being used for stalking. Late last year, for example, it was reported that:

“A group of women living in a rural setting near Port Lincoln on South Australia’s Eyre Peninsula have been woken at night by a drone looking into their home…. One of the women, who like the rest of the group did not want to be identified, was asleep and alone at home on her relatively remote hobby farm. She was woken by a bang on her bedroom window and when she looked out into the darkness was confronted by a camera attached to a drone, hovering within centimetres of her window”.

Technologically Changing Society

Whilst such reports are alarming, Nick Bilton[iv] has used a personal anecdote to suggest that the negatives of being droned could be overstated. As he writes:

“I was sitting in my home office, working on this very column about neighbours getting into arguments over drones, when I heard a strange buzzing sound outside. I looked up and hovering 20 feet (around 6 metres) from my window was a black drone with a beady-eyed camera pointed at me.

At first, I was upset and felt spied upon. But the more I thought about it, the more I came to the opposite conclusion. Maybe it’s because I’ve become inured to the reality of being monitored 24/7, whether it’s through surveillance cameras or Internet browsers. I see little difference between a drone hovering near my window, and someone standing across the street with a pair of binoculars. Both can peer into my office.”

Whether the majority of people would agree or disagree with Bilton’s sentiment is well beyond the present piece. But what should be noted is that he seems correct to emphasize that droning will have a material impact on what we deem (un)acceptable. Thus, as more and more people get droned – and as the capacity to make more sophisticated autonomous drones gathers pace – we should expect social norms and practices regarding privacy and personal (air) space to change as well.


Glen Whelan teaches at McGill, is a Visiting Scholar at York University’s Schulich School of Business, and is the social media editor for the Journal of Business Ethics. He was GRB Fellow at CBS in 2016/2017. His research focuses on the moral and political influence of corporations, and high-tech corporations in particular. He is on Twitter @grwhelan.

Links

[i] Zimmer, B. 2013. The flight of ‘drone’ from bees to planes. The Wall Street Journal, July 26. https://www.wsj.com/articles/SB10001424127887324110404578625803736954968

[ii] Bhutto, F. 2009. Missing you already. New Statesman, March 12. https://www.newstatesman.com/asia/2009/03/pakistan-war-government-terror

[iii] https://www.skydio.com

[iv] Bilton, N. 2016. When your neighbor’s drone pays an unwelcome visit. The New York Times, January 27. https://www.nytimes.com/2016/01/28/style/neighbors-drones-invade-privacy.html

Pic by Cambodia, P.I. Network, Flickr. No changes made.