Disturbing content on social media could be wreaking havoc on the developing brains of an entire generation.
Last week a video of a US veteran's live-streamed suicide went viral after showing up on TikTok's 'For You' page.
Many of the app's young users say they saw the clip by accident, while others seem to have actively sought it out. Some went further still by reposting it, often disguised with a few seconds of innocent footage before cutting to the suicide.
It's not the first time the popular app has been linked to violent or upsetting content. TikTok has been besieged with propaganda videos uploaded by far-right extremists, many of which involve racist slurs or Holocaust denials.
Even ISIS beheading videos have found their way onto the app.
Approximately 41% of TikTok's users are aged between 16 and 24, with many even younger.
The brain is not considered fully developed until about the age of 25, with the prefrontal cortex — responsible for decision making, complex cognitive behaviour and moderation of social behaviour — incomplete until then.
Repeated exposure to upsetting content, such as violent videos or graphic sexual material, during these important years could alter an adolescent's brain in a way that could be damaging in future.
There are already major concerns for what short video-sharing platforms are doing to people's cognitive function.
Social media use in general is associated with shortened attention spans, and now some TikTok users are reporting they can't even concentrate for the entire length of a TV episode because they're so used to watching clips of just a few seconds.
Concerns about violent video games making youths more aggressive — a moral panic which peaked after the Columbine high school shooting in 1999 — proved largely unfounded.
The American Psychological Association Task Force found that violent video game exposure was associated with increased aggressive behaviour, but not with criminal behaviour.
However, teenagers today are consuming far more media than they were 20 years ago, with the invention of smartphones giving people 24/7 access to just about any online content in the world.
TikTok in particular has experienced enormous growth this year due to the coronavirus pandemic.
Many experts are concerned about a rise in anxiety and depression reported among adolescents that may be linked to social media use.
But there are also fears that the internet could be affecting brain development in other ways, increasing young people's likelihood of engaging in violent behaviour just as being exposed to real-world violence is known to do.
Repeated exposure to violence at a young age can desensitise people to it, making them more likely to engage in violent behaviour themselves.
The notorious Slenderman case is one horrific example. In 2014, two twelve-year-old girls lured their friend into a forest, where they stabbed her 19 times.
When questioned by police, the girls said they'd been inspired by the creepy Slenderman meme, an internet urban legend of a very tall, thin man in a suit who kills children.
Last month Zachary Latham, 18, was charged with killing his neighbour after multiple altercations which Latham would often film and post to TikTok. It's alleged he stabbed William Durham to death, and the victim's family say the teenager did it so he could film the attack and become "TikTok famous".
Other apps also seem to motivate some people to film acts of violence. Clips of lynchings, rapes and other acts of mob brutality are known to circulate on WhatsApp and other social media apps among Indian users.
Far from being condemned, the violent footage — typically targeting Muslim victims due to the country's escalating religious tension — is celebrated and shared more widely, encouraging others to engage in other acts of violence.
In 2018, Metropolitan Police Commissioner Cressida Dick blamed youth violence in the UK on social media sites that "rev people up", fuelling conflicts that often turn deadly in minutes. Fights and assaults are regularly filmed and posted online by their young participants.
Asked about the suicide clip, a TikTok spokesperson told Daily Star Online: "Our deepest sympathies are with the family and friends of the individual. Following an internal review, we found evidence of a coordinated effort by bad actors to spread this video across the Internet and platforms, including TikTok.
"We detected and removed these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide. We also took swift action by banning accounts that were uploading this content."
Regarding graphic content, the spokesperson said TikTok has proposed pooling resources with other social media giants to keep users safe.
"This is an industry-wide challenge, which is why we have proposed to our peers across the industry that we work together on creating a 'hashbank' for such violent, graphic content and warn each other when such content is discovered so that we can all better protect our users, no matter the app they use.
"We will never stop working to make TikTok an even safer platform for our community so they can continue to freely express their creativity."
They said user safety is the app's "top priority", and that the Community Guidelines and Terms of Service make clear what is acceptable on the platform.
"We use a combination of technologies and moderation teams to review, identify and remove any content or account that is in violation of these, including violent and graphic content.
"We have built several tools to help parents manage their child's experience on TikTok. This includes controls on what content they can see, and how long they can spend online. Parents can read about these tools in our Safety Centre."
Tom Madders from adolescent mental health non-profit Young Minds says social media companies have a responsibility to prevent their users from being exposed to disturbing content.
"Graphic images and videos can be distressing and triggering for young people who see them," he told Daily Star Online.
"To minimise distressing content and its effect on young people, social media companies need to take responsibility for the mental health of those who use their platforms.
"That means making sure their rules are enforced, responding quickly to inappropriate content, signposting towards support and taking action to promote positive mental health.
"Alongside this, the Government needs to introduce the Online Harms law urgently so social media companies are accountable and clear on their responsibilities."
Young Minds strongly encourages anyone who saw the suicide video and needs support to speak to someone they trust, or to contact the Crisis Messenger by texting YM to 85258.