TikTok has announced an important move to protect the mental well-being of its young users. From next month, the social media giant will restrict beauty filters that distort users' features, such as enlarging eyes, smoothing skin or plumping lips, for users under 18. The decision follows growing concerns that such filters can fuel anxiety and low self-esteem among teenagers.
Beauty filters are hugely popular on the platform, but some experts warn that they are harming young people's mental health. Teenagers report feeling pressured and inferior, and many struggle with body image issues. Some users say that once they start using beauty filters, their real faces begin to look "ugly" to them. Concerns like these have pushed social media companies to act to curb the harmful effects of online beauty standards.
TikTok's new restrictions will target filters that dramatically alter a user's appearance, such as the "Bold Glamour" filter. Filters that add playful effects, like bunny ears or dog noses, will not be affected. The company said it wants to reduce the pressure to look a certain way online without limiting fun.
The move is part of a broader effort to make children safer online. TikTok will also make it harder for users under 13 to sign up, as the company works to strengthen its age verification systems. Before the end of the year, it will begin using automated, machine-learning-based systems to detect users who lie about their age. The platform currently removes around 20 million underage accounts every quarter. These changes come just ahead of the enforcement of the UK's Online Safety Act, due in 2024.
TikTok is not the only platform tightening its rules in the name of safety. Other social media networks are also changing how they treat younger users. Meta-owned Instagram, for example, has introduced "teen accounts", which give parents more control over their children's activity, such as limiting access to the app at certain times. The popular gaming platform Roblox is restricting younger users' access to violent and inappropriate content, in response to concerns about children's safety online.
Despite these changes, experts say there is still more to be done to protect young users from harmful content. Andy Burrows, CEO of the Molly Rose Foundation, a charity dedicated to suicide prevention, welcomed TikTok's efforts but argued that the company is responding to new regulations rather than acting to protect its users' mental health. He also demanded greater transparency around TikTok's age verification systems, insisting that the platform has a long way to go in blocking unhealthy content from reaching young users.
The NSPCC also welcomed the changes but insisted that these were only “the tip of the iceberg.” Richard Collard, associate head of policy for child safety online, urged other platforms to follow TikTok’s lead and called on regulators to enforce stricter age limits for social media usage. He said that tech companies must also do more to create an age-appropriate experience for their users.
The debate over the best way to keep children safe online continues as social media platforms face increasing pressure to protect young people. Though TikTok's move is a step in the right direction, experts agree that more comprehensive action is needed to address the vast array of risks that social media poses.