Ofcom has launched an investigation into TikTok over whether it provided inaccurate information about its parental controls.
While generally complimentary about the steps the platform takes, Ofcom said it had “reason to believe” that TikTok provided “inaccurate” information about its family pairing system.
The feature allows adults to link their accounts to their children’s for control of settings like screen time limits.
Ofcom will now investigate whether the firm “failed to comply with its duties” by not responding appropriately.
TikTok blamed technical issues, saying Ofcom had been made aware of them and that it would provide the necessary data.
A spokesperson said the platform enforced a minimum age requirement of 13, and that Ofcom’s report showed it puts “extensive effort and resources” into “finding and removing” those who do not meet it.
Ofcom’s report comes two years after it issued guidance to video-sharing apps about how they must protect younger users from encountering harmful content.
The report, published on Thursday, found that TikTok, Snapchat, and Twitch all met the requirements laid out in that guidance.
They do not allow sign-ups from children under 13 and use various methods, including AI and human moderators, to identify potentially underage accounts and remove them.
For TikTok, this was around 1% of its monthly active UK user base.
All three platforms also classify and label content to make sure it’s age-appropriate.
But while Snapchat and TikTok require users to have an account to access most content, Twitch’s content is fully open.
On parental controls, Ofcom said Snapchat and TikTok offer them, whereas Twitch’s terms and conditions instead require parents to supervise their children in real time.
Ofcom said while it was satisfied steps were being taken to protect young users, they “can still sometimes face harm while using these platforms”.