A new set of laws will soon come into effect designed to protect children and adults from harmful material online - including suicide and self-harm content.

The Online Safety Bill completed its final Parliamentary debate on 19 September and is now ready to become law.

The bill will see the introduction of new rules for platforms which host user-generated content and for search engines. 

The Government says the bill takes a "zero-tolerance" approach to protecting children and makes sure social media platforms are held responsible for the content they host. 

If platforms do not act quickly to prevent and remove illegal content and to stop children seeing harmful material, they will face significant fines, and senior managers may even face prison.

Under the bill:

  • All platforms will have to ensure that children (under-18s) cannot access harmful suicide and self-harm content - for example, by restricting their service to adults (e.g. using age-verification tools), by preventing such content from appearing on the platform, or by removing it immediately.
  • All platforms will have to ensure adults and children cannot access any illegal suicide and self-harm content.
  • The biggest platforms - and those posing a particularly high risk - will have to carry out and publish risk assessments, set out in their terms and conditions how they will deal with different types of content, and ensure those terms are enforced.
  • Platforms will also have to provide tools allowing users to opt in or out of seeing content in priority harm areas, including suicide and self-harm.

Technology Secretary Michelle Donelan said: "The Online Safety Bill is a game-changing piece of legislation," adding it is an "enormous step forward in our mission to make the UK the safest place in the world to be online".