‘Far more than surveillance’ is already in place and ‘cyberocracy’ could change how government is run

The machinery of government used to mean something entirely different.

Traditionally a way to describe the processes and structures of Parliament, the machinery of government now means the machines gathering, processing and utilising data about voters.

And it could end up changing who is in charge of politics forever.

People have become used to data gathering: Google, Facebook, Twitter, Amazon, Apple and others knowing a lot about you is not a new story.

Big tech companies hold, in the words of consultant Dylan Curran, ‘every message you’ve ever sent or been sent, every file you’ve ever sent or been sent, all the contacts in your phone and all the audio messages you’ve ever sent or been sent’.

The Cambridge Analytica scandal showed the extent to which user data can be harvested and then used to target advertising at social media users.

While this has predominantly been used to sell you products, political parties are quickly catching up, using data to find out what voters are talking about before selling you policies you’re more likely to vote for.

‘The technology used by politicians is the same as what every corporation has to sell products, and they’ve been doing it a lot longer,’ political strategist James Kanagasooriam says.

‘It’s not dark arts, it’s how every company in the world sells products. It finds out who its customers are, builds propositions through data, tests it then feeds it back in.

‘20 or 30 years ago, political decisions were still being taken based on data.

‘It wasn’t called “data”, it was “polling”, but now we’re getting that kind of information more quickly, with a quicker feedback loop.’

The cliche attributed (without clear evidence) to Sir Francis Bacon that ‘knowledge is power’ takes on more significance when data provides potentially limitless information and knowledge.

Facebook, for example, is said to be able to tell if you’ve just broken up with your partner. Google stores 14.72GB of data about this author; Facebook stores 1.93GB.

A lot of it isn’t worrying – the photos and files you choose to store for example.

But Google, if you let it, knows exactly where you’ve been, who you’ve been emailing, what you’ve been listening to and exactly what’s in your calendar.

If data is knowledge and knowledge is power, we’ve given away a lot of it to be bought on the open market in return for free web services.

Google was asked to comment on the data it collects.

As nation-states grapple with big companies to retain control of their population’s information, who owns this data becomes a key question in the future of democracy.

This ‘quicker feedback loop’ the social web has created has led some experts to predict a ‘cyberocracy’, in which democracy gives way to rule by near-limitless information and data.

The more information gathered about every individual, it is argued, the more efficient a government would be as it would know exactly, at any given time, what a majority of voters needed.

Whoever had the most information and/or the best way of using it would be the one in power.

‘Cyberocracy is coming,’ academics David Ronfeldt and Danielle Varda wrote in the book Culture And Civilization: Volume 2.

‘Information and its control will become a dominant source of power, as a natural next step in political evolution.’

The idea of governing by information isn’t a new one.

In East Germany, the secret police, known as the Stasi, had files on more than one-third of the population.

The ‘highly effective’ operation relied on collecting as much information as possible. The key limiting factor was that the filing system for that amount of information was impossible to manage.

With the social web and digital storage, this is much less of a problem.

A sophisticated team from any political party would be using [data] to their advantage to shape policies that will win elections – Matt Navarra

‘While it is too early to say precisely what a cyberocracy will look like, the outcomes include new kinds of democratic, totalitarian and hybrid governments, along with new kinds of state-society relations,’ Ronfeldt and Varda write.

‘Thus, optimism about the information revolution should be tempered by an anticipation of its potential dark side.’

The death of democracy and the transition to an information state? What a light and cheerful idea.

Few (if any) are advocating a return to Stasi rule and it is easy to make Nineteen Eighty-Four or Brave New World references here.

But it seems to be a question of not ‘if’ more data will be used but ‘how’.

One possibility is the Stasi-like doomsday scenario; another, more optimistic one is the opening of governments to true scrutiny, with data democratised and open to all to create a ‘fairer’ society.

This isn’t a question about the future but how data is being used now:

‘There is a growing push toward drawing on big data for “better” decision making,’ says Dr Elke Schwarz, lecturer in political theory at Queen Mary, University Of London.

‘And the excitement about more data equals better information continues to swell.

‘[It’s] probably most notable in matters of security. Think, for example, predictive policing practices already in place in UK policing, or consider the identification of targets [in military strikes] for targeted killing, often determined through algorithmic identification of potential suspects.

‘Both practices are highly controversial for obvious reasons (black box algorithms, what is the data on which decisions are made? etc) and there is heated debate about regulation, oversight, bias, issues of equality and rightfully so.’

Controversial they may be but they’re already here.

And the issue, as Dr Schwarz puts it, is not that the data exists but who or what gets the deciding vote on what to do with it.

It’s a big question: Does big data mean that humans can make ‘better’ decisions or can AI take decisions out of the hands of governments completely?

While ‘algorithmic warfare’ and autonomous weapons are already being funded by the Pentagon in the US (and it’s not the only country), algorithmic politics is less advanced.

‘Algorithms are great at cutting up your messages and targeting them at the voters who will be most engaged by a specific policy or word or image,’ says James Morris, managing director of PR and marketing consultancy Edelman.

‘But we are a long way from algorithms creating the raw materials themselves because they go way beyond messages.

‘They include the character of the leadership; the criticisms a party chooses to make of their opposition; the way parties make decisions; and the brand of the party.

‘A brand is a result of decades of history, not an algorithm. Leaders can try to mask their character but it always escapes.’

There are several different ways data can be used now:

  • To influence voters with better advertising
  • To find out what the electorate really thinks about issues without polling
  • To use big data to help make policy decisions

All three of these are already visible in everyday politics:

‘You’d hope political parties would take social media into account, given that’s how people get and share information now,’ social media consultant Matt Navarra says.

‘A lot of it can be made readily available to political parties. Some obviously can’t be, not legally anyway.

‘Some parties are more savvy than others. Boris Johnson is learning from the Trump camp and how they’ve used big data, on Facebook particularly.

‘A sophisticated team from any political party would be using that to their advantage to shape policies that will win elections.’

In 2016 before the EU referendum, Vote Leave created a site offering a £50m prize if users could guess the outcome of every game in the football European Championships that summer.

The odds of winning were 5,000,000,000,000,000,000,000/1, but that didn’t stop a lot of people handing over their data to analysts. Details included phone numbers, addresses and their views on Brexit.

But this might only be the tip of the iceberg.

As it stands, most data is collected in fragmented networks run by private organisations rather than governments and, as previously discussed, official bodies and the law move significantly more slowly than private businesses can.

This means there is difficulty, for better or worse, in creating a ‘full’ picture of an individual or a society.

Modern society has been likened to a body where the brain is not able to talk to the arms or legs, which then, in turn, cannot talk to the heart.

Being able to collate all this information would turn ‘big data’ into what has been called a ‘sensory apparatus’.

‘That is not to say that societies will increasingly resemble “organisms” that have a central nervous system and brain,’ Ronfeldt and Varda write.

‘That’s going too far. But something is taking shape for which sensory apparatus seems an appropriate term.

‘A term like surveillance system is too limited (and biased).

‘Far more than surveillance is occurring and the diverse array comprising this apparatus are far from forming an integrated system.

‘Bits and pieces have existed for decades – they are normal for complex modern societies – and more pieces are being emplaced. No central master hub exists (and presumably never will).

‘[But] the scope and scale of this apparatus are growing far beyond what government, business and civil society actors have ever had at their disposal or had to cope with.’

In theory, this would mean that everything can be tracked and would be available for AI to make decisions from.

If there was a spike in searches for foodbanks, for example, would the AI react more quickly to that trend and make sure money was available for those in need?
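That kind of reaction presupposes something doing the watching. A minimal version is simple trend detection: flag any day where a count jumps well above its recent average. The function below is an illustrative sketch with made-up numbers, not a description of any system actually in use.

```python
def detect_spike(counts, window=7, threshold=3.0):
    """Return the indices of days where the count exceeds `threshold`
    times the trailing `window`-day average."""
    spikes = []
    for i in range(window, len(counts)):
        baseline = sum(counts[i - window:i]) / window
        if baseline > 0 and counts[i] > threshold * baseline:
            spikes.append(i)
    return spikes

# Hypothetical daily counts of searches for "food bank near me".
daily = [40, 42, 38, 41, 39, 43, 40, 44, 41, 160, 45]
spikes = detect_spike(daily)
```

A real system would of course need to separate genuine need from noise, seasonality and media-driven curiosity – which is exactly where human judgement re-enters the loop.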

In short, would AI do a better job of politics than politicians do?

‘This really depends on the definition of the word “better”,’ Dr Schwarz says.

‘AI systems are faster, more efficient and can optimise with greater ease.

‘What it cannot do is assess how policy might have a beneficial or a detrimental effect on a community.

‘Not only is AI nowhere near sophisticated enough to do such a thing, but it is unlikely that there will be a context within which it will ever be “objectively” better.’

Dr Schwarz points to the evidence that, because of the way that AI is programmed by humans, it builds in existing prejudices to its decision-making.

And she is not the only one warning about the ‘clinical’ decisions of machines having an in-built bias.

‘The biggest risk of algorithms is that they re-enforce the past when politics should be about shaping the future,’ James Morris says.

‘Algorithms learn from the data that exists, which means they tend to replicate whatever biases are in that data.

‘Any use of algorithms in politics and government needs to correct for bias or it will entrench it.’
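Morris’s point is easy to demonstrate. A model that simply learns from historical outcomes reproduces whatever skew those outcomes contain. The toy ‘model’ below – with invented groups and figures, standing in for no real system – predicts each group’s most common past outcome, and so turns yesterday’s bias into tomorrow’s rule.

```python
from collections import Counter

def train_majority_model(history):
    """Naive 'model': for each group, predict whatever outcome was most
    common for that group in the historical data."""
    by_group = {}
    for group, outcome in history:
        by_group.setdefault(group, Counter())[outcome] += 1
    return {g: counts.most_common(1)[0][0]
            for g, counts in by_group.items()}

# Hypothetical history in which group "B" was approved far less often,
# for reasons unrelated to merit.
history = ([("A", "approve")] * 80 + [("A", "reject")] * 20
           + [("B", "approve")] * 20 + [("B", "reject")] * 80)

model = train_majority_model(history)
# The historical skew is now baked into every future decision.
```

Correcting for this – re-weighting the data, auditing the outputs, or keeping a human in the loop – is exactly the intervention Morris is calling for.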

If machines are only as good or bad as those who programme them, who is first to ‘own’ the data is vitally important.

And that ‘who’ in the age of populism and personality politics is important.

Data policy run by Vladimir Putin is very different from Angela Merkel’s and different again to either Donald Trump’s or Boris Johnson’s.

But in an age of ‘making America great again’ or criticising the ‘doubters, the doomsters, the gloomsters’ of Brexit, James Morris’ point about ‘character of the leadership’ and ‘brand of the party’ is clear to see.

This would be difficult to replicate algorithmically, at least for now.

‘The idea of the self-evolving, self-adjusting humanless way of doing politics is far away from how things are really done,’ James Kanagasooriam says.

‘Numbers help you prove or disprove theories but if you don’t know what you’re looking for, the data is basically meaningless.

‘Having more information and having knowledge are two different things.

‘If you’re not able to piece information together, it’s worthless.’

So, at least for now, the machinery of government relies on people to operate it.

But if ‘cyberocracy is coming’, as Ronfeldt and Varda argue – even if they admit it is too early to define fully – then who controls the power that data confers is vital to the future.

‘As humans, we only have a limited capacity to override machine-made decisions,’ Dr Schwarz says.

‘The more we outsource morally relevant decisions to machines, the less attuned we are to what it means to be morally responsible.’

This phenomenon, known as automation bias, means we’re more likely to believe an automated decision over one made by a human.

If that’s the case, whatever happens, it’ll still be the politicians that voters don’t trust.

The Future Of Everything

This piece is part of The Future Of Everything series.

From OBEs to CEOs, professors to futurologists, economists to social theorists, politicians to multi-award winning academics, we think we’ve got the future covered, away from the doom-mongering or easy Minority Report references.

Every week – new pieces every Wednesday morning – we’re explaining what’s likely (or not likely) to happen.

Talk to us using the hashtag #futureofeverything. If you think you can predict the future better than we can, or you think there’s something we should cover that we might have missed, get in touch.


