A group of Welsh students interrogated social media bosses from TikTok and Meta over the dangers of their platforms during a school event.
Students at Pontypridd High School in Rhondda Cynon Taf raised concerns about harmful content on the apps and how it can affect users’ mental health.
Both TikTok and Meta responded by saying that they take user feedback seriously and are constantly working to improve safety features on their platforms.
Alex Davies-Jones MP organised the event as part of scrutiny of the UK government’s Online Safety Bill, which is currently going through Parliament.
The crackdown on harmful content comes as studies have suggested that social media can damage teenagers’ mental health. In one recent study, doctors called on social media firms to hand over data to help researchers understand the impact of the platforms on young people’s health.
Caitlin, a student at Pontypridd High School, said she had seen “many challenges aimed at young girls to eat roughly 300-400 calories in a day” on TikTok.
“I think the reporting process should be looked into a bit more since it takes so long for things to be taken down; by the time they are taken down, the harm’s already been made,” she said.
TikTok said it strictly removes content that promotes disordered eating.
Caitlin also faced online abuse after posting about Manchester United footballer Mason Greenwood. She said she was called “fat” and “ugly”, and told to kill herself.
“I just want people to know they’re not alone,” she said. “There are so many people that are going through the same thing.”
Meta, which bills itself as a “social media for good”, was also questioned by the students about the content on its platform.
“We have a team of moderators who review all of the content uploaded to our platform,” said Meta’s head of safety, Kate O’Neill.
“We also have a set of community guidelines which outline what is and isn’t acceptable on Meta, and we take those very seriously.”
O’Neill said Meta was working on a new feature allowing users to report content anonymously.
“We know that sometimes it’s hard to speak up, especially if you’re worried about retaliation or being targeted,” she said.
“So we want to make sure that people have a way to report content anonymously if they don’t feel comfortable doing it directly.”
She suggested that users can take control of their experience on Meta’s platforms through features such as blocking and muting.
“For example, if you don’t like the word ‘hate’ or ‘loser’, whatever it may be, you can set a list of words that you will always get filtered from your comments,” O’Neill said.
Megan Thomas, public policy associate manager at Meta, which owns Facebook and Instagram, said the company has recently developed new features “designed to help prevent people from having to experience any kind of harmful content on our platforms in the first place.”
“We’re always looking for ways to make our platforms safer and more welcoming for everyone,” she said.
The UK government’s Online Safety Bill is expected to become law later this year. It will establish a new regulator with the power to issue heavy fines to social media firms that fail to protect users from harmful content.
Do you think that social media platforms are doing enough to protect users from harmful content? Let us know in the comments below.