TikTok became the latest tech company to announce tighter protections for teenagers as social media platforms come under increased scrutiny over their privacy safeguards.
The short-video app will roll out a number of features in the coming months. In-app direct messaging will be disabled by default for 16- and 17-year-olds unless they actively switch it on. Under-16s will see a pop-up message when they publish their first video, asking them to choose who can watch it.
And users aged 16 and 17 will see a pop-up asking them to confirm who can download their videos. Downloads are already disabled on content posted by under-16s.
The Chinese-owned platform will also stop sending push notifications to users aged 13 to 15 from 9pm — and an hour later for 16- and 17-year-olds — with the aim of reducing their screen time at night.
The moves announced by head of child safety public policy Alexandra Evans and global head of privacy Aruna Sharma build on previous measures to protect young users from predators, bullies and other online dangers.
“It’s important to ensure even stronger proactive protections to help keep teens safe, and we’ve continually introduced changes to support age-appropriate experiences on our platform,” Evans and Sharma said.
“We want to help our younger teens in particular develop positive digital habits early on.”
Google-owned YouTube and Facebook-owned Instagram have both recently bolstered defenses for teen users, while critics have been urging Facebook to abandon plans for a children’s version of Instagram.
TikTok was the world’s most downloaded app last year, overtaking Facebook and its messaging platforms, according to market tracker App Annie. The video app surged in popularity despite efforts by former president Donald Trump to ban it or force a sale to US-based investors.