As social media companies continue to grapple with pressure to improve security on their platforms, TikTok has announced that it will be placing new wide-ranging restrictions on the use of beauty filters on the platform, the Guardian reports.
The social media company, which boasts more than a billion users, many of them young people, announced the changes during a safety forum at its European headquarters in Dublin.
According to the company, there have been widespread concerns that beauty filters pressure teenagers, particularly girls, into presenting a polished physical appearance, with negative emotional repercussions. Some young people have said, for instance, that after using filters they found their real faces ugly.
The restrictions, which will come into effect in the coming weeks, are aimed at stemming the rising tide of anxiety and falling self-esteem that is contributing to mental health challenges for teens and young adults on the platform. Under the new rules, users under the age of 18 will be blocked from artificially enlarging their eyes, plumping their lips, and smoothing or changing their skin tone.
The restricted beauty filters include those provided by TikTok on its platform as well as others created by users themselves. This means the restrictions will apply to filters such as "Bold Glamour", which change children's features in a way that makeup cannot. Obviously comic filters, such as those that add bunny ears or dog noses, are not the target of the change.
There are, however, fears that the restrictions will be undermined by the fact that many teens have falsified their ages on the platform, despite an ongoing effort by the company to block underage users discovered to have done so.
TikTok’s beauty filter restriction for teenage users comes at a time when the world is debating whether teens under the age of 16 should be on social media networks at all. In the United Kingdom, lawmakers are proposing tougher regulation of underage social media use under the Online Safety Act, which could compel social media companies to bar users under that age from their platforms.
Similarly, Australian Prime Minister Anthony Albanese announced last Thursday that a law to ban children under 16 from social media is in the works. Although the bill still needs to be passed by Parliament, it clearly states the age limit and places the responsibility for enforcement on the social media platforms themselves.
TikTok is already taking steps to put its house in order before these laws take effect, announcing that it is tightening its systems to block users under 13 from the platform. The company also said it will be trialling new automated systems that use machine learning to detect people cheating its age restrictions before the end of the year.
The platform said that every quarter it deletes 20 million accounts worldwide belonging to users strongly believed to be underage, and promised more. According to TikTok’s lead on child safety public policy, Chloe Setter:
“We are hoping that this will give us the ability to detect and remove more and more quickly. It can obviously be annoying for some young people but the platform will take a safety-first approach,” she said.
She also noted that users who are wrongly blocked will be able to appeal.
The beauty filter restrictions and tighter age verification are part of a series of online safety adjustments that social media platforms are putting in place before various laws and tougher regulations come into force in the coming months, bringing potentially heavy fines for breaches of online safety rules.
See also: TikTok partners NITDA, Data Science Nigeria to strengthen digital safety across Nigeria