Child safety campaigners yesterday said they are 'disappointed' that Ofcom's new online safety code lacks measures to tackle suicide and self-harm content.
It comes as the regulator published a new set of rules that compel social media companies to take action against illegal and harmful content in order to comply with the Online Safety Act.
The rules come into effect in the spring, and Ofcom said yesterday that online platforms have three months to comply or face enforcement action and large fines.
However, the Molly Rose Foundation - set up in memory of teenager Molly Russell, who took her own life in 2017 after being exposed to self-harm content on social media - said it was 'astonished' and 'disappointed'.
'Ofcom's task was to move fast and fix things but instead of setting an ambitious precedent these initial measures will mean preventable illegal harm can continue to flourish,' the charity's chief executive Andy Burrows said.
He added: 'We are astonished and disappointed there is not one single targeted measure for social media platforms to tackle suicide and self-harm material that meets the criminal threshold.
'Robust regulation remains the best way to tackle illegal content, but it simply isn't acceptable for the regulator to take a gradualist approach to immediate threats to life.
'Today makes clear that there are deep structural issues with the Online Safety Act. The Government must commit to fixing and strengthening the regime without delay.'
Maria Neophytou, acting chief executive at the NSPCC, said the charity was 'deeply concerned that some of the largest services will not be required to take down the most egregious forms of illegal content, including child sexual abuse material'.
She added that Ofcom's proposals will 'at best lock in the inertia to act and at worst create a loophole which means services can evade tackling abuse in private messaging without fear of enforcement'.
However, Ofcom chief executive Dame Melanie Dawes said the spotlight is now on tech firms and 'it's time for them to act' to meet the 'strict safety standards' in the code.
'Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them,' she added.
Technology Secretary Peter Kyle said the publication of the first set of codes under the Online Safety Act was a 'significant step' in making online spaces safer.
He added: 'These laws mark a fundamental reset in society's expectations of technology companies. I expect them to deliver and will be watching closely to make sure they do.'
Ofcom yesterday released the first of its codes of practice, which specifically focus on illegal online harms. More rules are expected next year, including on the response to child sexual abuse material.
The rules released yesterday relate to content such as terrorism material, hate speech, fraud and material assisting or encouraging suicide.
If tech companies fail to comply, Ofcom has the power to fine firms up to £18 million or 10 per cent of their global turnover. In very serious cases it can also apply for sites to be blocked in the UK.
Tech firms now have until March to start putting Ofcom's proposals into place on their sites to ensure they are in line with the Online Safety Act when it comes into force.