Facebook, Google and Online Tech Companies Could Face New Regulations

The Digital, Culture, Media and Sport Select Committee has concluded an 18-month investigation into the proliferation of disinformation and fake news on social media platforms such as Facebook and YouTube.

Although the investigation focused particularly on the evidence surrounding Facebook and the Cambridge Analytica scandal, the repercussions of the report will likely have a wide-ranging impact on a number of the world’s top online companies and may affect businesses more widely.

Publisher or Platform?

In the past, social media companies have been keen to stress that they merely offer a platform through which users can upload their own data and content, rather than creating or publishing content themselves. As a result, businesses such as Twitter, Facebook and YouTube have maintained that they are a ‘platform’ rather than a ‘publisher’ and should, therefore, not be held responsible for the content on their websites.

Rather than enter into a protracted debate on whether the term ‘platform’ or ‘publisher’ applies to social media companies, the investigation instead recommended that a new type of tech company be formulated, one which tightens tech companies’ liabilities and which is not necessarily either a platform or a publisher. The report goes on to ask the Government to consider this new type of tech company in its forthcoming White Paper.

Although we will need to wait and see what the White Paper includes, it seems likely that social media companies will be allocated their own ‘type’ and that this will bring with it stricter regulations with which those companies must comply.

Code of Ethics

The report suggests that regulations for online tech companies should be set out in a new Code of Ethics, similar to the Broadcasting Code. The Code would be developed by technical experts and overseen by an independent regulator. It would set out what content is not acceptable for publication on social media, such as harmful or illegal material, and should make it clear to tech companies how to identify such content.

The intention behind such a Code is said to be to establish clear legal liability for tech companies to act against agreed harmful and illegal content on their platforms. Those companies should have relevant systems in place to highlight and remove ‘types of harm’ and to ensure that cyber-security structures are in place.

The overseeing body of the Code would have statutory powers to obtain information from online tech companies, which would include details on a company’s security mechanisms and algorithms.

With such regulations and powers in place, it seems likely that online tech companies will not only have to adhere to stricter rules, but that the regulator policing those rules will be given more effective tools than those currently available.

Electoral Law

It has been widely publicised that many of the ‘fake news’ stories shown on social media relate to politics and elections. The investigation paid particular attention to these issues and called for the Government to examine how UK law can provide guidance to tech companies. Specifically, the report suggests that there should be set definitions for digital campaigning, including agreed definitions of what constitutes online political advertising, such as agreed categories of words that continually arise in adverts not sponsored by a specific political party.

The report goes further than simply providing guidance on definitions, however, stating that electoral law is not fit for purpose and needs to be changed to reflect changes in campaigning techniques and the move from physical leaflets and billboards to online, microtargeted political campaigning.

The report calls for absolute transparency of online political campaigning including clear, persistent banners on all paid-for political adverts and videos, indicating the source and the advertiser.

Further powers for the Electoral Commission are also suggested, such as increasing the maximum fine that the Electoral Commission can impose from £20,000 to a percentage of turnover of the offending business.

Ramifications of the Report

The report will have a far-reaching effect should the Government follow its suggestions.

The regulatory changes set out in the report will affect online tech companies not only by imposing stricter rules on content hosting, but also by setting up a new regulatory body with bespoke tools and remedies to deal with related breaches.

Additionally, it looks increasingly likely that political parties themselves will have to review their electoral policies in line with regulations affecting all forms of digital media campaigning. This would apply not only to campaigning in the UK but also to campaigning at an international level or to allowing campaigns to be influenced by international groups.

If you have an online social media presence, you should certainly keep track of how the Government implements the suggestions in the report. Should the suggestions be put into action, it would be prudent for many companies to review their online presence and seek advice on how to comply with any new regulations which might affect them.

If you have any questions, please do let me know; you can give me a call on 0113 336 3377 or send an email to dominic.higham@clarionsolicitors.com.

Disclaimer: Anything posted on this blog is for general information only and is not intended to provide legal advice on any general or specific matter. Please refer to our terms and conditions for further information. Please contact the author of the blog if you would like to discuss the issues raised.