TikTok is introducing screen time limits in response to concerns about how minors are using the social media app.
Families have struggled to control the amount of time their children spend on the Chinese-owned video sharing platform.
The changes arrive during a period in which governments across the world are growing increasingly sceptical about the app’s security.
Who it applies to
Under the changes, rolling out in the coming weeks, every account held by a user under the age of 18 will default to a 60-minute daily screen time limit.
Cormac Keenan, head of trust and safety at TikTok, said in a blog post on Wednesday that when the 60-minute limit is reached, minors will be prompted to enter a passcode and make an “active decision” to keep watching.
For accounts where the user is under the age of 13, a parent or guardian will have to set or enter an existing passcode to allow 30 minutes of extra viewing time once the initial 60-minute limit is reached.
TikTok said it came up with the 60-minute threshold by consulting academic research and experts from the Digital Wellness Lab at Boston Children’s Hospital.
TikTok also said that it will begin prompting teenagers to set a daily screen time limit if they opt out of the 60-minute default.
The company will send weekly inbox notifications to teen accounts with a screen time recap.
Some of TikTok’s existing safety features for teenagers’ accounts include setting accounts to private by default for users aged 13 to 15 and restricting direct messaging to users aged 16 or older.
TikTok announced a number of changes for all users, including the ability to set customised screen time limits for each day of the week and allowing users to set a schedule to mute notifications.
The company is also launching a sleep reminder to help people plan when they want to be offline at night.
For the sleep feature, users will be able to set a time, and when it arrives, a pop-up will remind them that it is time to log off.
Concerns over TikTok usage
In 2022, children in the United Kingdom spent an average of 114 minutes per day on TikTok.
A study published by the Center for Countering Digital Hate (CCDH) in December found that TikTok shows eating disorder-related content to some teenagers within eight minutes of them signing up to the video platform.
The report found that the Chinese-owned platform, which has more than one billion users worldwide, recommends videos about body image and mental health to teenagers every 39 seconds on average, and that the algorithm is “more aggressive” for vulnerable users.
TikTok amended its community guidelines earlier this year to toughen up its stance on eating disorder-related content.
Under the updated guidelines, videos that promote unhealthy eating behaviours or habits are “not allowed” on the platform, and moderators work to take them down.
But the CCDH found that users easily evaded controls by dressing up content as recovery-related and using coded hashtags, in some cases co-opting the name of British singer Ed Sheeran.
Social media executives, including those from TikTok, have been called before the US Congress to explain how they are preventing harm to young users.
Beyond excessive use by minors, there are growing concerns around the world about the security of the app.
The European Parliament, the European Commission and the EU Council have banned TikTok from being installed on official devices.
That follows similar actions taken by the US federal government, Congress and more than half of the 50 US states.
Canada has also banned it from government devices.