TikTok has launched the ability for parents and guardians to filter out videos they don’t want their children to see.
The feature is an addition to the app's family pairing functionality, which allows adults to link their account to their teenager's and control settings such as screen time limits.
TikTok's users could already set content filters for themselves, allowing them to avoid videos associated with specific words or hashtags.
Julie de Bailliencourt, the company's global head of product policy, told Sky News that giving parents the ability to set up such filters was done primarily with user safety in mind.
However, teenagers will at first only be alerted to their parents' selected filters, and can simply choose not to opt in.
“We wanted to make sure we had the right balance of pragmatism and transparency to enable families to choose the best experience for their own family because every family is different,” said Ms De Bailliencourt.
“We also wanted to make sure we respect young people’s right to participate. So by default, teens can view the keywords their parent or caregiver has added.”
It comes after the company faced criticism over children being exposed to self-harm and eating disorder clips, which are sometimes shared using “coded” hashtags – phrases with slightly tweaked spellings – to bypass the platform’s moderation.
Ms De Bailliencourt said she hoped the feature would “spark a conversation” between teens and their parents or guardians about online boundaries.
Watch: TikTok committed to Online Safety Bill
The optional feature comes as the government’s long-awaited Online Safety Bill nears fruition after a final day of scrutiny by a House of Lords committee.
A late amendment added last week to the proposed legislation, which aims to regulate online content to keep people safe, could see coroners and bereaved parents granted access to data on the phones of deceased children.
It comes after a campaign by parents whose children’s deaths were linked to their social media activity.
TikTok would not be drawn on the specific amendment, saying only that it is working closely with the government on the development of the legislation.
The platform has come under mounting pressure over its links to China, as it is owned by Beijing-based ByteDance, and earlier this year it was banned from UK government phones.
Watch: Young users to help form moderation guidelines
TikTok has also announced the formation of a global youth council, made up of young people who use the platform, which will launch later this year.
It will operate similarly to TikTok’s content and safety advisory councils, made up of independent experts who help inform its approach to moderation.
Meanwhile, the company said there had been no change to its policies around election misinformation after rival YouTube’s decision to stop deleting false claims that the 2020 US vote was stolen.
The Google-owned platform announced the change earlier this month, reversing a policy that had been in place since shortly after the 2020 election, which Donald Trump wrongly claims was illegitimate.