New rules have been unveiled to protect children online, including limiting who can send them direct messages and removing them from suggested friend lists.
They form part of Ofcom’s first draft codes of practice under the Online Safety Act, which became law a week ago.
The draft codes focus on illegal content online, such as grooming, fraud and child sexual abuse material.
Platforms will be required by law to keep children’s location data private – and restrict who can send direct messages to them.
Ofcom will publish further codes in the coming months on other aspects of online safety, including the promotion of material related to suicide and self-harm, with each new code requiring parliamentary approval before it takes effect.
Ofcom hopes the codes announced today will be in force by the end of next year.
The code also encourages larger platforms to use hash matching technology to identify illegal images of abuse – and tools to detect websites hosting such material.
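Hash matching works by comparing a digital fingerprint of an uploaded file against a list of fingerprints of material already known to be illegal. The Python sketch below is a minimal illustration of that principle only: it uses an exact SHA-256 digest and a placeholder hash set, whereas production systems typically rely on perceptual hashes (such as PhotoDNA) that tolerate resizing and re-encoding, and on hash lists supplied by recognised bodies.

```python
import hashlib

# Placeholder set of hashes of known illegal images (illustrative value only;
# real deployments use curated hash lists from recognised organisations).
KNOWN_HASHES = {
    "3f79bb7b435b05321651daefd374cdc681dc06faa65e374e38337b88ca046dea",
}

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_image(path: str) -> bool:
    """Return True if the file's hash matches an entry in the known-bad list."""
    return sha256_of_file(path) in KNOWN_HASHES
```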
Ofcom said services should use automatic detection systems to remove posts linked to stolen financial information, and block accounts run by proscribed organisations.
Ofcom also said tech firms must nominate an accountable person who reports to senior management on compliance with the code.
Ofcom chief executive Dame Melanie Dawes told Sky News: “I think without regulation it isn’t getting better fast enough, and in some areas it is going in the wrong direction.
“The more that we see innovation in things like AI, it means I’m afraid it’s easier for the bad guys to create fraudulent material – that ends up cheating us of our money – and it makes it easier to prey on children.”
Technology Secretary Michelle Donelan said the publication of the first codes marked a “crucial” step in making the Online Safety Act a reality by “cleaning up the Wild West of social media and making the UK the safest place in the world to be online”.
She added: “Before the bill became law, we worked with Ofcom to make sure they could act swiftly to tackle the most harmful illegal content first.
“By working with companies to set out how they can comply with these duties, the first of their kind anywhere in the world, the process of implementation starts today.”