New rules have been unveiled to protect children online, which include limiting direct messages and removing them from suggested friend lists.
They form part of Ofcom’s first draft codes of practice under the Online Safety Act, which was signed into law a week ago.
The codes focus on illegal material online, such as grooming content, fraud and child sexual abuse.
Platforms will be required by law to keep children’s location data private – and restrict who can send direct messages to them.
Ofcom will publish further codes in the coming months, covering areas including the promotion of material related to suicide and self-harm, with each new code requiring parliamentary approval before it takes effect.
Ofcom hopes the codes announced today will be in force by the end of next year.
The code also encourages larger platforms to use hash matching technology to identify illegal images of abuse – and tools to detect websites hosting such material.
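Hash matching works by comparing a compact fingerprint of an uploaded image against a list of fingerprints of known abuse imagery supplied by a trusted body such as the Internet Watch Foundation. The sketch below, in Python, is a minimal illustration of the exact-match case using an ordinary SHA-256 digest and a hypothetical hash list; production systems typically use perceptual hashes (for example Microsoft’s PhotoDNA) so that resized or re-encoded copies still match.

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_illegal_image(path: str, known_hashes: set[str]) -> bool:
    """Check an uploaded file's hash against a set of known-image hashes.

    `known_hashes` stands in for a hash list supplied by a trusted body.
    An exact cryptographic hash only matches byte-identical copies, which
    is why real deployments rely on perceptual hashing instead.
    """
    return sha256_of_file(path) in known_hashes
```

In practice a platform would typically queue any match for human review and reporting rather than acting on the hash alone.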
Ofcom said services should use automatic detection systems to remove posts linked to stolen financial information, and block accounts run by proscribed organisations.
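One common signal for posts offering stolen financial information is the presence of card numbers that pass the Luhn checksum. The hypothetical `flag_post` helper below sketches that idea in Python; it illustrates the general approach to automatic detection rather than anything specified in Ofcom’s code.

```python
import re

def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(d) for d in number][::-1]
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:        # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# 13-19 digits, optionally separated by spaces or hyphens
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,19}\b")

def flag_post(text: str) -> bool:
    """Flag a post for review if it contains a plausible card number."""
    for match in CARD_PATTERN.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 19 and luhn_valid(digits):
            return True
    return False
```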
Tech firms must also nominate an accountable person who reports to senior management on compliance with the code, Ofcom said.
Ofcom chief executive Dame Melanie Dawes told Sky News: “I think without regulation it isn’t getting better fast enough, and in some areas it is going in the wrong direction.
“The more that we see innovation in things like AI, it means I’m afraid it’s easier for the bad guys to create fraudulent material – that ends up cheating us of our money – and it makes it easier to prey on children.”
Technology Secretary Michelle Donelan said the publication of the first codes marked a “crucial” step in making the Online Safety Act a reality by “cleaning up the Wild West of social media and making the UK the safest place in the world to be online”.
She added: “Before the bill became law, we worked with Ofcom to make sure they could act swiftly to tackle the most harmful illegal content first.
“By working with companies to set out how they can comply with these duties, the first of their kind anywhere in the world, the process of implementation starts today.”
Susie Hargreaves, chief executive of the Internet Watch Foundation, said: “We stand ready to work with Ofcom, and with companies looking to do the right thing to comply with the new laws.
“It’s right that protecting children and ensuring the spread of child sexual abuse imagery is stopped is top of the agenda.
“It’s vital companies are proactive in assessing and understanding the potential risks on their platforms, and taking steps to make sure safety is designed in.
“Making the internet safer does not end with this bill becoming an act. The scale of child sexual abuse, and the harms children are exposed to online, have escalated in the years this legislation has been going through parliament.
“Companies in scope of the regulations now have a huge opportunity to be part of a real step forward in terms of child safety.”