TikTok is making a trio of changes to its mobile app that will make the video-watching experience a tad safer for its young audience. Last year, the U.K.'s broadcasting and telecommunications regulator, the Office of Communications (Ofcom), released new rules requiring sites like TikTok to protect under-18s on their platforms. While there is public demand for similar rules in the U.S., no official regulation has been enacted there. TikTok announced some of these changes a while ago, and they are not exactly unique in their approach.
For example, in July last year, Instagram introduced a Sensitive Content Control system that allows users to specify the level of sensitive content they want to see in the Explore feed. Divided across two tiers – Limit and Limit Even More – the idea is to let users choose the level of content sensitivity they are comfortable with. Facing heated criticism from lawmakers, parents and child safety organizations, TikTok also promised concrete changes to how it handles and pushes content to its young audience in February this year, and it has finally revealed the work it has done so far.
The first in line is a new behind-the-scenes categorization system that will sort videos based on what TikTok calls "Thematic Maturity." The ByteDance-owned social media platform hasn't shared granular details on the metrics dictating the categorization, saying only that it is similar to what audiences are used to seeing with movies, TV shows and games. In the coming weeks, the company will bolster it with an automated system that blocks "content with overtly mature themes" from appearing for users aged between 13 and 17. Aside from flagging videos with overtly mature themes, the system will also target videos with complex themes that are unsuitable for viewers under 18. For example, if a video – even a fictional one – is deemed too scary, a maturity rating will be attached to it, stopping it from reaching users under 18 years of age.
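In spirit, such a gate compares a video's assigned maturity rating against the viewer's age before the video can surface in their feed. The sketch below is purely illustrative: the rating tiers, names, and threshold logic are assumptions, since TikTok has not published how its classification actually works.

```python
# Minimal sketch of an age-based maturity gate (illustrative only).
# The rating tiers below are invented, not TikTok's real categories.

MIN_AGE_FOR_RATING = {
    "all_audiences": 0,   # no restriction
    "teen": 13,           # app's minimum age
    "mature": 18,         # blocked for 13-17 year-olds
}

def can_view(viewer_age: int, video_rating: str) -> bool:
    """Return True if the viewer meets the minimum age for the rating."""
    return viewer_age >= MIN_AGE_FOR_RATING[video_rating]
```

Under this model, a 15-year-old would see "teen"-rated content but never anything tagged "mature", which matches the behavior the company describes for its 13-to-17 audience.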
Another meaningful change TikTok is adding to its video-sharing platform is the ability to filter content based on specific hashtags or words. Much like Twitter's advanced muting options for problematic words and annoying hashtags, TikTok users will be able to do the same for videos appearing in the For You and Following feeds. TikTok says this feature will reach its global audience in the coming weeks. Beyond blocking sensitive or triggering content, it will also spare users the avalanche of related videos they no longer want to see.
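Conceptually, this kind of muting amounts to dropping any video whose caption or hashtags match a user-supplied block list. A rough sketch, with all names (`Video`, `filter_feed`, `muted_terms`) being hypothetical rather than TikTok's actual API:

```python
# Hypothetical sketch of keyword/hashtag-based feed muting.
# Not TikTok's implementation; names are illustrative.

from dataclasses import dataclass, field

@dataclass
class Video:
    caption: str
    hashtags: set = field(default_factory=set)

def filter_feed(feed, muted_terms):
    """Drop videos whose caption words or hashtags match a muted term."""
    muted = {t.lower().lstrip("#") for t in muted_terms}
    kept = []
    for video in feed:
        tags = {t.lower().lstrip("#") for t in video.hashtags}
        words = {w.lower() for w in video.caption.split()}
        if muted & tags or muted & words:
            continue  # skip muted content
        kept.append(video)
    return kept
```

A real system would also need stemming and fuzzy matching (so "#diet" catches "#dieting"), but the core idea is this simple set intersection.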
TikTok is also addressing the problem of videos on a sensitive topic bombarding a user's feed soon after the algorithm detects that the user has liked or interacted with one in any capacity. The company says it won't recommend a series of videos hovering around the same subject matter, especially if it happens to be something specific – and sensitive – such as "dieting, extreme fitness, sadness, and other well-being topics." TikTok says it has seen favorable results from testing the system among its U.S. audience. Work is also underway to fix potential errors in the algorithm that cause it to push a narrow range of content to users, locking them in a bubble of similar ideas, which can turn out to be harmful in the long run. The core objective here is to improve the diversity of content subjects and creators alike.
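One simple way to picture this kind of "dispersal" is a greedy re-ranking pass that keeps a minimum gap between videos sharing a topic, deferring same-topic items further down the queue. This is a sketch of the general technique, not TikTok's disclosed algorithm; the function names and gap size are assumptions.

```python
# Illustrative sketch of topic dispersal in a recommendation queue:
# avoid showing videos on the same topic back-to-back.
# Not TikTok's actual algorithm.

from collections import deque

def disperse(queue, topic_of, gap=2):
    """Greedily re-order `queue` so two videos sharing a topic are
    separated by at least `gap` other videos where possible;
    unplaceable items fall to the end of the feed."""
    pending = deque(queue)
    result, deferred = [], []
    while pending:
        video = pending.popleft()
        recent = {topic_of(v) for v in result[-gap:]}
        if topic_of(video) in recent:
            deferred.append(video)  # too soon; push it further down
            continue
        result.append(video)
        # retry deferred items now that the window has moved on
        still_deferred = []
        for d in deferred:
            if topic_of(d) not in {topic_of(v) for v in result[-gap:]}:
                result.append(d)
            else:
                still_deferred.append(d)
        deferred = still_deferred
    result.extend(deferred)  # whatever couldn't be spaced out goes last
    return result
```

For example, a queue of three dieting videos and two unrelated ones would be interleaved so the dieting content no longer runs consecutively near the top of the feed.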