

Social media companies must be legally obliged to protect young people’s health, say MPs

Science and Technology Committee calls on government to strengthen regulatory framework

Caroline White

Friday, 01 February 2019

Social media companies must be subject to a legal duty of care to protect young people’s health and wellbeing when accessing their sites, concludes a report* from the parliamentary Science and Technology Committee.

Some 70% of 12–15 year olds have a profile on a social media platform, while the Organisation for Economic Co-operation and Development (OECD) reports that 94.8% of 15-year-olds in the UK used social media sites before or after school.

The inquiry looked at whether the growing use of social media and screens among children was healthy or harmful, the evidence base for such claims, and whether any new measures or controls were required.

While the Committee heard that social media can have a positive impact, the evidence submitted also pointed towards the potential negative effects of social media on the health and emotional wellbeing of young people.

These ranged from interfering with sleep and body image, to bullying, grooming and ‘sexting’. Although these risks existed before the advent of social media, its rise has helped to facilitate them—especially child abuse, says the report.

The National Crime Agency reported that referrals to it from the National Center for Missing and Exploited Children had “increased by 700% over the last four years.” Despite these shocking statistics, the quality and quantity of academic evidence on the effects of social media remains low, notes the report.

Social media companies must be willing to share data with researchers, within the boundaries of data protection legislation, especially on those who are at risk from harmful behaviours, it recommends.

And it calls on the government to consider what legislation is required to improve researchers’ access to this type of data, to ensure that social media companies help protect their young users, identify those at risk and help improve current online safety measures.

The report found that regulation and legislation are patchy and loose, resulting in a “standards lottery.” This cannot guarantee that young people are as safe as possible when they go online.

Key areas that are not currently the subject of specific regulation, identified by Ofcom, include: platforms whose principal focus is video sharing, such as YouTube; platforms centred around social networks, such as Facebook and Twitter; and search engines that direct internet users towards different types of information from many internet services, such as Google and Bing.

The report recommends that a comprehensive regulatory framework that clearly sets out the responsibilities of social media companies towards their users must be created.

The government's forthcoming Online Harms White Paper, and subsequent legislation, presents a crucial opportunity to put a world-leading regulatory framework in place, says the report. But the Committee is concerned that the government’s forthcoming framework may not be as coherent as it ought to be.

There’s a need to establish a regulator to provide guidance on how to spot and minimise the harms social media presents, as well as taking enforcement action when warranted. These actions must be supported by a strong sanctions regime, says the report.

The government must take swift action to tackle the current threat young users face, says the report. An effective partnership across civil society, technology companies, law enforcement, the government and non-governmental organisations, aimed at ending child sexual exploitation (CSE) and abuse online, is needed, it says.

The Committee recommends that the government set itself an ambitious target to halve reported online CSE within two years and eliminate it within four years.

Committee chair Norman Lamb said: “It is frustrating that there is not yet a well-established body of research examining the effects of social media on younger users.

“More worryingly, social media companies—who have a clear responsibility towards particularly young users—seem to be in no rush to share vital data with academics that could help tackle the very real harms our young people face in the virtual world.”

He added: “We understand their eagerness to protect the privacy of users but sharing data with bona fide researchers is the only way society can truly start to understand the impact, both positive and negative, that social media is having on the modern world. During our inquiry, we heard that social media companies had openly refused to share data with researchers who are keen to examine patterns of use and their effects. This is not good enough.”

He went on: “We concluded that self-regulation will no longer suffice. We must see an independent, statutory regulator established as soon as possible, one which has the full support of the government to take strong and effective actions against companies who do not comply.”

Dr Max Davie, officer for health promotion for the Royal College of Paediatrics and Child Health, added: “For the vast majority of young people, they live in a world where technology forms a staple part of everyday life, and whilst this needs to be accepted, we all should be mindful of the implications – good and bad – that this can pose. Today’s children and young people are digital natives, and as technology continues to rapidly develop, so too must research on the health implications associated with it.”


*Impact of social media and screen-use on young people’s health. A report prepared by the Science and Technology Committee, 31 January 2019.
