Our joint letter to Sir Keir Starmer about the Online Safety Act

Rt Hon Sir Keir Starmer MP 
Prime Minister
10 Downing Street 
London 
SW1A 2AA 

Dear Prime Minister, 

Public letter: dangerous suicide and hate forums 

We write to you as CEOs and senior leaders working across mental health, suicide prevention, the countering of anti-Jewish racism and other forms of hate, and as parents and family members affected by suicide. 

We were encouraged to hear you set out last year that a Labour government would reverse the rise in the number of deaths from suicide, highlighting that suicide is the biggest killer of young lives in this country. 

Your focus on suicide could not be more timely. As you will be aware, the latest figures from the ONS show 6,069 suicides registered in England and Wales in 2023, the highest rate in over 20 years. In Lord Darzi’s recent report to the Secretary of State for Health and Social Care on the state of the NHS, he highlighted that “there has been a worrying increase in suicides of young people” and advised that “suicide rates are now at their highest levels this century, and this is an area where close attention will need to be paid in the years ahead”. 

These shockingly high rates cannot be reduced without tackling harmful online suicide material. A 2017 inquiry into the suicides of young people found suicide-related internet use in 26% of deaths in under-20s, and in 13% of deaths in 20–24-year-olds.1 Three-quarters of people who took part in Samaritans’ research with Swansea University said they had harmed themselves more severely after viewing self-harm content online.2 

We hope, therefore, that you agree that online suicide forums represent a clear threat to the government’s ambitions and must be addressed immediately. 

In this context, we are writing to alert you to Ofcom’s alarming recommendation that the DSIT Secretary of State should not use the full powers available to him under the Online Safety Act to regulate the most dangerous online forums that promote and glorify suicide. This is contrary to the clear will of Parliament - and indeed of Labour’s then front-bench representatives in both the House of Commons and the House of Lords - that these powers be used. 

During the passage of the Online Safety Act, there was a significant strength of feeling in Parliament that the harm caused online was occurring not only on large platforms but also on smaller ones, and that the Bill needed to reflect this. 

The previous government lost a vote in the Lords on an amendment on this topic that had cross-party support - indeed, your colleague Lord Knight of Weymouth called it ‘a no-brainer’. As a result of this defeat, the government subsequently brought forward its own amendment to the Bill in the Commons, which Alex Davies-Jones MP, then Labour’s shadow digital spokesperson, welcomed on the grounds that without it ‘we could have been left in a position where some of the most harmful websites and platforms, including 4chan and BitChute, which regularly host and promote far right, antisemitic content, slipped through the cracks of the legislation. None of us wanted that to happen.’ 

As a result, Schedule 11 of the Act now allows the Secretary of State to determine which providers should be in Category 1 based on functionality (and other ‘characteristics’) alone, rather than requiring that they also be of a certain size. 

This would allow a limited number of small but exceptionally dangerous forums to be regulated to the fullest extent possible. These include forums that are permissive of dangerous and hateful content as well as forums that explicitly share detailed or instructional information about methods of suicide or dangerous eating disorder content. One way to do this using Schedule 11 would be to: 

  • Set a ‘functionality’ that would trigger the possibility of Category 1 categorisation under paragraph 1(1)(b) of Schedule 11. This might specify having question-and-answer comment threads that persist and are searchable by others (including non-registered users), a feature of all the sites of concern. 
  • Set a ‘characteristic’ or ‘factor relating to the service’ under paragraph 1(1)(c) of Schedule 11. This might be the fact that Ofcom or coroners can reasonably link one or more deaths or incidents of serious violent crime to that service. 

Given the cross-party support for such an approach to regulating these platforms, we were dismayed to see that Ofcom, in its recently published advice to the previous Secretary of State on categorisation, explicitly recommended not using this power to address these extremely dangerous sites.3 We see no justification for this, and Ofcom has not provided any. The current Secretary of State is expected to lay secondary legislation in line with this advice, though, as Baroness Jones recently noted in the Lords, he can depart from it. We urge the government to take this course of action, and we provide evidence below to support it. 

In the private appendix attached to this public letter, we have shared examples of the sorts of suicide material found on the specific site we are most concerned about. 

A BBC report has linked this site to at least 50 UK deaths,4 and we understand that the National Crime Agency is investigating 97 deaths in the UK thought to be related to the site. 

Using Schedule 11 in this way would allow sites like this highly dangerous suicide forum - which, as outlined above, has been linked to a very significant number of UK deaths, many of them caused by users accessing legal content5 - to be regulated at the same level as sites like Facebook and Instagram. Whilst this would not shut them down, it would make them accountable in ways they currently are not, and it would force them to give users a choice about the type of content they see, adding friction to the process of accessing extremely dangerous material.6 

There are very similar issues with platforms that host violently antisemitic and Islamophobic content: one such site inspired the shooting in Buffalo in the US, and there is evidence that another small social media service was used to stoke this summer’s racist riots. The cost of ignoring small, high-harm platforms - in human lives, public disorder, cost to the taxpayer and more besides - is significant, and there is both an ethical and a political obligation for the government to act. 

We would argue that the events of the summer, in tandem with the ongoing human cost of a growing number of suicides, are sufficient evidence in themselves to justify the Secretary of State deciding to depart from Ofcom’s advice and set the categorisation thresholds for the regime in the most robust and expansive way the Act allows. 

Ofcom’s current recommendations - which would capture only services that have content recommendation systems, offer functionality for users to forward or re-share content, and are also of a large size - would do nothing at all to address the services we are concerned about. 

We hope that you will take action to address this major oversight in the advice the government has been given by Ofcom. 

Yours sincerely, 

Imran Ahmed 
CEO, Center for Countering Digital Hate 

Andy Bell 
Chief Executive, Centre for Mental Health 

Julie Bentley 
CEO, Samaritans 

Andy Burrows 
CEO, Molly Rose Foundation 

Ellen O’Donoghue 
CEO, James’ Place 

Ged Flynn 
Chief Executive, PAPYRUS Prevention of Young Suicide 

Alice Hendy MBE 
CEO, R;pple Suicide Prevention  

Dr Sarah Hughes 
CEO, Mind 

David Parfett 
Bereaved parent 

Andrew Radford 
Chief Executive, Beat  

Mark Rowland 
CEO, Mental Health Foundation 

Adele Zeynep Walton  
Bereaved sister 

Matthew Smith 
Chief Operating Officer, If U Care Share Foundation 

Danny Stone 
Chief Executive, Antisemitism Policy Trust 

Maeve Walsh 
Director, Online Safety Act Network 

CC 
Baroness Jones of Whitchurch 
Baroness Merron 
Rt Hon Peter Kyle MP 
Rt Hon Wes Streeting MP 
Rt Hon Yvette Cooper MP  
Dame Melanie Dawes 
Lord Knight 
Alex Davies-Jones MP 
Lord Darzi 
Prof. Louis Appleby

  1. Appleby, L. et al. (2017). Suicide by children and young people. National Confidential Inquiry into Suicide and Homicide by People with Mental Illness (NCISH). Manchester: University of Manchester.
  2. Samaritans (2022). How Social Media Users Experience Self-Harm and Suicide Content.
  3. Ofcom (2024). Categorisation: Advice Submitted to the Secretary of State.
  4. Available here: https://www.bbc.co.uk/news/uk-67082224
  5. For example, information about methods of suicide.
  6. These additional duties include transparency reporting, enhanced requirements to carry out risk assessments, and user empowerment duties. Ofcom has set them out here: https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/additional-duties-for-categorised-online-services/