Facebook has been likened to a “digital gangster” by a critical parliamentary report into disinformation and fake news. One witness to the 18-month inquiry into how digital platforms have transformed the flow of information described the current use of technology as “hijacking our minds and society”.

The report, published by the Digital, Culture, Media and Sport select committee, concludes that “political choices might be influenced by online information”. It goes on to say:

We must use technology, instead, to free our minds and use regulation to restore democratic accountability. We must make sure that people stay in charge of the machines.

It is an indisputable conclusion. But there is far less certainty about what this regulation should look like. I would argue that the algorithms used by Facebook and other digital media can only be controlled by algorithmic regulation.


The report’s two key recommendations are investigations into Facebook’s data use and into its anti-competitive practices. These followed conclusions that “the Cambridge Analytica scandal was facilitated by Facebook’s policies” and that “it is evident that Facebook intentionally and knowingly violated both data privacy and anti-competition laws”. Damian Collins MP, chair of the committee behind the report, went so far as to say: “Companies like Facebook exercise massive market power which enables them to make money by bullying the smaller technology companies and developers who rely on this platform to reach their customers.”

A new approach

Regulatory scrutiny requires large volumes of information and evidence. But platforms such as Facebook evolve and innovate far faster than the authorities can scrutinise them. Hence, evidence-based regulation is necessarily out of sync with the algorithms that control flows of information in the digital age. If there are data breaches or users are manipulated – as the report found took place with regard to voting – it is too late to regulate after the fact.

A new approach to regulation is necessary: one based on using algorithms to monitor digital platforms. These regulatory algorithms should be designed to collect real-time information on how the platforms shape what users see on their news feeds and timelines, and they can adapt as the algorithms used by the platforms adapt.


This requires an understanding of key features of different platforms, such as the role of specific, influential users who are responsible for shaping opinion across the network. If regulators want to encourage digital literacy and get users to slow down (as the report recommended), algorithms can monitor the ways that platforms encourage the opposite.

When it comes to competition, the barriers to entry are high. Platforms such as Facebook and Twitter are built on what economists call network effects – the more people using them, the more useful they are. This makes them natural monopolies. They use algorithms to boost their reach by constantly suggesting new connections to current users, thereby increasing the switching costs for users who want to change to another platform. Regulatory algorithms could be designed to calculate in real time the number of direct and indirect connections for each user, and use this as a proxy for the psychological cost of switching networks.
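To make this concrete, here is a minimal Python sketch of how such a switching-cost proxy might be computed, assuming a regulator had access to an anonymised connection graph. The graph data and the two-hop cut-off are hypothetical choices for illustration, not anything prescribed by the report.

```python
# Illustrative sketch only: a toy connection graph (hypothetical data), where the
# switching-cost proxy for each user is the number of distinct other users
# reachable within two hops (direct connections plus friends-of-friends).
from collections import deque

# Hypothetical undirected connection graph as adjacency sets.
graph = {
    "alice": {"bob", "carol"},
    "bob": {"alice", "dave"},
    "carol": {"alice", "dave", "erin"},
    "dave": {"bob", "carol"},
    "erin": {"carol"},
}

def switching_cost_proxy(graph, user, max_hops=2):
    """Count distinct users reachable from `user` within `max_hops` hops."""
    seen = {user}
    frontier = deque([(user, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue
        for neighbour in graph.get(node, ()):
            if neighbour not in seen:
                seen.add(neighbour)
                frontier.append((neighbour, depth + 1))
    return len(seen) - 1  # exclude the user themselves

for user in graph:
    print(user, switching_cost_proxy(graph, user))
```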

Another issue for competition watchdogs is that the metrics they traditionally use to monitor market power are rendered obsolete by the business models of social media platforms. It is very difficult to measure the market share of a platform. Even if you know the number of registered users, each user should carry a different weight based on their number of connections, and those connections should in turn be weighted by the number of their own connections.

One way to do this is for algorithms to produce synthetic measures of the relevance of each connection. This would follow a logic similar to that used by Google in ranking the relevance of search results: considering not only immediate connections, but also the sequence of their weighted connections, and so measuring how central various users are to the network. Such measures could provide a much better approximation of market shares.
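As a rough illustration, the sketch below computes a PageRank-style centrality by power iteration over a toy follower graph and treats each user’s share of total centrality as their weight. The follower graph, damping factor and iteration count are hypothetical; a real regulatory measure would need far richer data.

```python
# Illustrative sketch only: weighting users by a PageRank-style centrality
# computed with power iteration. A user's share of total centrality could then
# feed into a weighted, rather than head-count, measure of platform size.

def pagerank(graph, damping=0.85, iterations=50):
    """Simple power-iteration PageRank over a directed follower graph."""
    nodes = list(graph)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, followees in graph.items():
            if not followees:
                # Dangling node: spread its rank evenly across the graph.
                for other in nodes:
                    new_rank[other] += damping * rank[node] / n
                continue
            share = damping * rank[node] / len(followees)
            for followee in followees:
                new_rank[followee] += share
        rank = new_rank
    return rank

# Hypothetical "who follows whom" graph: user -> set of accounts they follow.
follower_graph = {
    "alice": {"bob"},
    "bob": {"alice", "carol"},
    "carol": {"alice"},
    "dave": {"carol"},
}

ranks = pagerank(follower_graph)
total = sum(ranks.values())
for user, score in sorted(ranks.items(), key=lambda kv: -kv[1]):
    print(f"{user}: centrality share {score / total:.2%}")
```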

Until now, big tech companies have largely been left to regulate themselves. The UK report rightly challenges this and makes clear that this era of self-regulation must end. But, more than this, a concerted effort is needed to develop new tools to face the scale and, critically, the timing of the threats posed by the enormous market power concentrated in the data centres of a few dominant platforms.

If social media platforms can deploy advanced algorithms to process information on their users in real time, while simultaneously learning from this data, regulators must do the same in monitoring them. The priority for policy makers must switch to prevention, rather than investigating and punishing abuses after society is already damaged. Regulatory algorithms can react in real time to the strategies of various platforms, thereby improving the likelihood of people staying in charge of the machines.

Emanuele Giovannetti, Full Professor in Economics, Anglia Ruskin University

This article is republished from The Conversation under a Creative Commons license. Read the original article.
