Marshalled List of Amendments
Tuesday 10 February 2026
Item 4 – Order Paper 150/22-27 – Tuesday 10 February 2026
Motion: Social Media Restrictions for Children and Young People
That this Assembly recognises the growing body of evidence linking social media use to harm to the mental health, wellbeing and development of children and young people; notes the increasing concern around exposure to harmful content, addictive design features and online abuse; is aware that several countries are now moving towards stronger age-based restrictions on social media platforms; acknowledges that restrictions alone are not sufficient and that a holistic approach is required, including stronger platform regulation, digital literacy, parental support and effective enforcement; further notes the British Government’s consultation on online harms and child safety; believes that proportionate restrictions should be introduced to prohibit children under the age of 16 from accessing social media platforms, alongside wider reforms to make online spaces safer; and calls on the Secretary of State for Science, Innovation and Technology to introduce these measures at the earliest opportunity.
Leader of the Opposition
Amendment
Leave out all after ‘wellbeing and development of children and young people;’ and insert:
‘acknowledges that, whilst social media presents an opportunity for young people to communicate with their friends and family, a lack of regulation and accountability on the part of social media platforms is allowing young people to be exposed to harmful content, addictive design features and online abuse; further acknowledges that several governments across the world are exploring ways to better protect young people from online harms; further recognises that an outright ban on under-16s, introduced prematurely or in isolation rather than as part of robust regulation, could result in unintended consequences, such as forcing young people onto the dark web, encouraging them to evade a ban through the use of virtual private networks, or having their age-based identification misused or exploited; notes that the starting point for regulation must be to hold large multinational platforms to account for failing to remove explicit and harmful content from their platforms; calls on the British Government to regulate social media platforms properly and to impose proportionate sanctions on those that fail to remove illegal and harmful content or to prevent it being posted; and further calls on the Secretary of State for Science, Innovation and Technology to consider international practice, the United Nations Convention on the Rights of the Child and the broader evidence when assessing all options for regulating internet safety, including stronger platform regulation, digital literacy, parental support and effective enforcement.’
Ms Emma Sheerin