Social media regulation, free speech and the rule of law: democratic legislatures are scrambling to regulate and the government’s white paper on online harm needs careful scrutiny
It is not an easy time to be a tech giant. Newspapers are crammed with stories of social media discontents: online hatred driving MPs into fear over Brexit-related threats, and harassing children and vulnerable adults to the point of suicide. The spread of disinformation and fake news, and their corrosive impact on social cohesion. Misuse of tech companies’ platforms by terrorists, paedophiles and puppeteers seeking to circumvent democratic election outcomes. A claim that in March 2018, Facebook failed to prevent its platform morphing into a ‘beast’ which incited vitriol and crimes against Rohingya Muslims. The Cambridge Analytica scandal, which led the Information Commissioner’s Office to fine Facebook the statutory maximum under its powers. Reports that social media algorithms contain inherent gender, racial and other biases. Claims that big tech fails to cooperate adequately with the security services, caught like a rabbit in the headlights between the demands of national security and the inevitability of its customers crying foul over data privacy.
Democratic legislatures have scrambled to castigate and regulate. In October 2017, Germany passed the Network Enforcement Act, with a power to fine tech companies up to €50 million for failure to respond to take-down requests on their platforms. In April 2018, Mark Zuckerberg was hauled before the US Congress for a five-hour face-off over Cambridge Analytica. Here in the UK, the Commons’ Digital, Culture, Media and Sport (DCMS) Committee’s 18-month investigation, Disinformation and Fake News, recommended a system of online regulation in the strongest terms – and pulled no punches in criticising Facebook. Its final report accused the company of using its ‘opaque business structure’ to ‘conceal knowledge and responsibility for specific decisions’ and expressed ‘concerns about the good faith of the business and its capacity to participate in the work… in the public interest, as opposed to its own interests’. In February 2019, the Minister for Digital, Margot James MP, announced an intention to tackle what she described as ‘an absolute indictment of a system that has relied too little on the rule of law’.
In April 2019, the government published a new white paper, Online Harms. It intends to introduce a new system of regulation, ultimately funded by a levy upon tech companies. It also plans to legislate for a statutory ‘duty of care’, forcing tech companies to take responsibility for safeguarding users against harmful content on their servers. A consultation on the white paper’s proposals opened on 8 April and runs until 1 July. The ambition is to legislate in the next Parliamentary session – or as soon after as the Brexit quagmire permits.
The task at hand is complex. Comparison with the regimes for regulating print and broadcast news serves to illustrate how. Both regimes are part-product of historical hangover and riddled with gaps. Broadcast media has always been censored far more heavily than print. One justification for that distinction is the sheer power of moving images to provoke a reaction. Social media newsfeeds are full of moving images and are arguably more compelling than TV – because they are interactive. On the other hand, social media companies ‘select’ rather than ‘generate’ content. Via algorithms they wield enormous power to decide what of the endless internet we consume via newsfeed. But it is the power of a gatekeeper, not the writer – and an interference with the power to curate a newsfeed is not in the same league as an interference with journalistic freedom. What the government is seeking to do is curb the harm that users cause to one another online – and only in that limited way affect the algorithms developed by big tech. Contrary to the wishes of the DCMS Committee, tackling the potentially harmful effects of targeted advertising, for example, has been ruled out of scope.
The white paper’s concept of the statutory duty of care is novel and requires careful thought. If the stated intention is to bring the rule of law to ‘the Wild West’ of the web, the risk is that, unless carefully framed and limited, such a duty could have the opposite effect. The DCMS Committee’s highly political recommendations drew upon a ‘ledger of online harms’ covering a range of behaviour, from the borderline criminal (electioneering which may breach electoral law) to the merely undesirable (loss of the ability to focus without distraction). The white paper has attempted to confine the regulator’s scope further – but nonetheless lists no fewer than 23 forms of online harm, described as ‘neither exhaustive nor fixed’. Much of the listed conduct is already criminalised, such as terrorist activity, organised immigration crime, modern slavery and revenge porn. Other harms, including ‘trolling’ and ‘intimidation’, are not obviously criminal – and are highly vague. The harm likely to crystallise from such conduct is hurt feelings: a highly subjective matter, so unacceptable usage must be closely defined.
The white paper largely leaves the content of the duty of care to be spelled out in codes of practice, to be drafted by the regulator when the system becomes operational. The ability to restrict non-criminal online speech is a significant power to devolve – and if it is to happen, the codes’ drafters must be alive to the delicate considerations involved in balancing the competing interests of freedom of speech, protection of the rights of others, privacy and national security. The risk is that vagueness accompanied by the threat of sanction could drive companies towards a cautious and restrictive approach, resulting in unnecessary online censorship. The sanctions envisaged are serious indeed: as well as significant fines, the government is consulting on whether the regulator should be empowered to disrupt business activities, undertake ISP blocking and implement a regime of senior management liability.
The better approach would be to limit regulation to the codes of practice, rather than attempt to define a broader and unified duty. The harms have different causes. Many are not obviously connected. And as Margot James MP acknowledged, the rule of law is worth protecting.
The reduction in exposure to ideas which shock or offend is one consequence of our reliance on social media. The ‘echo chamber’ era is an online harm in itself – but not the white paper’s target. Drafters of the new codes of practice must be alive to the right (but not the duty) to offend, definitively described by the Strasbourg Court in the seminal case of Handyside v United Kingdom (5493/72): ‘Freedom of expression… is applicable not only to “information” or “ideas” that are favourably received or regarded as inoffensive or as a matter of indifference, but also to those that offend, shock or disturb the State or any sector of the population. Such are the demands of that pluralism, tolerance and broadmindedness without which there is no “democratic society”.’
Given what is at stake, court oversight is essential. The power to judicially review the new regulator’s decisions may not be sufficient to hold it to account. A built-in right to a substantive appeal (as is currently possible in relation to Ofcom’s decisions) may prove critical – and is a matter on which the government is also consulting. For all those who care about the integrity of our democracy, there is still time to respond.
Still time to respond: the Online Harms consultation is open until 1 July 2019 at www.gov.uk/government/consultations/online-harms-white-paper