How do we protect young people from harmful social media content?
As a dad, I get asked a lot of questions by my kids about stuff they can find out pretty easily themselves if they could be bothered – factual things, like, ‘Who’s the PM of France?’. They’re glued to a device that can answer any question you can possibly dream of. When I’m asked, I turn into my own dad. “When I were a lad, I had to leave the house, get on a bus, go to a library, find the section, find the book, then use the index to find stuff out! Google it.”
Perhaps they’re just checking if I know the answer. Kids can be cunning. BTW, in case you’re wondering, the French PM is Élisabeth Borne. I’ve heard of her, but I had to Google it. Saved some bus fare there.
The accessibility of information we have these days is pretty mind-blowing if you think about it, and unimaginable a generation ago. Kids today are fortunate to have that ease of access to the things they need to know. It’s one of the wonderful things about the internet. But there’s a downside: the stuff they don’t need to know, or at least shouldn’t be finding out about on their own.
Social media can be a fantastic thing and was a lifeline to many people during the pandemic. However, the recent tragic case of Molly Russell throws a damning spotlight on the kind of online social media content that is damaging for children, and which can even be, in poor Molly’s case, fatal. This is not a new phenomenon, and the safeguarding of children online has been a high-profile concern for as long as the internet has been around. So why is it still a huge and growing problem? And where does the responsibility lie to keep children safer?
Anyone with kids who have smartphones or tablets will know that it’s impossible to monitor them all the time. There are so many social media platforms, and as time goes by, kids become much more savvy than adults about social media and the new apps that are released. Those of us in our middle years, on the whole, understand Facebook and Twitter. I’m lost with Snapchat and TikTok, but then again, they’re not aimed at me. They’re aimed at kids – and it’s hard to be the Smartphone Police and monitor something you don’t understand. We hope, as parents, that we have educated our kids enough so they can police themselves and talk to us when they come across something that makes them uncomfortable. Sadly, this is not always the case. So how do we protect young people from harmful social media content?
Who’s the grown-up here?
While parents have to be watchful, a lot of the responsibility has to come from social media companies. A recent Ofcom investigation discovered that a third of children lie about their age to sign up to social media platforms. Of course they do! Kids fib, just as I fibbed when I was 15 that I absolutely wasn’t drinking White Lightning in Millhouses Park and smoking Benson & Hedges that Paul Thornley had nicked from his dad. There was peer pressure in 1986 just as there is peer pressure in 2022, especially as children feel like they should sign up because their mates already have. ‘Fear of missing out’ is still a thing – only the location has changed. They also sign up because they don’t understand or foresee the dangers that will come up.
Most social media platforms require users to be over 13 years old. Ofcom found that 32% of children have accounts that feature adult content. 47% of eight to fifteen year olds have set their age as 16 or over. And 60% of children under 13 have their own profiles. Anna-Sophie Harling, from Ofcom, said, “We need to work both with parents and young people, but also platforms, to make sure that the ages at which those accounts are set are done in an accurate way.” It could be argued that plenty has been done over the years to alert and educate parents and children about the dangers of unsuitable content, and of strangers, that are prevalent online. Yet, here we are, and not much has changed. The other thing that hasn’t changed is social media companies’ willingness to do anything serious about it.
Social media platforms ask for a user’s age so appropriate content can be delivered. Clearly, if a child says they’re 18 when they’re 10, they are at risk of being exposed to material that they don’t understand, that is harmful, and that can alter behaviour over time. ‘Enter your date of birth’ is hardly a robust solution. Add to this targeted marketing, so users end up with suggested material based on their browsing habits. Bad news if what you’re looking at isn’t age appropriate or is damaging. Molly Russell experienced this as ‘binge periods’ of unrequested material generated by social media algorithms featuring self-harm, which ultimately contributed to her terrible death.
After Molly’s inquest, coroner Andrew Walker speaking about Pinterest and Instagram (owned by Meta), said, “In some cases, the content was particularly graphic, tending to portray self-harm and suicide as an inevitable consequence of a condition that could not be recovered from. The sites normalised her condition, focusing on a limited and irrational view without any counterbalance of normality.” Molly’s death occurred because she was able to access, and was sent, content that she shouldn’t have been able to see which exacerbated her underlying depressive illness.
Social media companies need users in order to make money. Young people use social media the most. Why on earth would they make it more difficult for users to sign up? But they do have a responsibility to deal with this, and with the kind of content that certain users are able to see. Their reluctance to do so, while repugnant, is understandable: profit. After the inquest, a spokeswoman for Meta said the company was “committed to ensuring that Instagram is a positive experience for everyone, particularly teenagers” and that it would “carefully consider the coroner’s full report when he provides it”. A Pinterest executive told the court that the platform “should be safe for everyone”, and that “there was content that should have been removed that was not removed” when Molly was using it. Flimsy acceptance, but not a firm commitment to change very much very quickly.
How do we protect young people from harmful social media content? Ultimately, parents cannot do this alone. Molly’s father, Ian, said, “It’s time the toxic corporate culture at the heart of the world’s biggest social media platform changed. It’s time for the government’s Online Safety Bill to urgently deliver its long-promised legislation. It’s time to protect our innocent young people, instead of allowing platforms to prioritise their profits by monetising their misery.”
We live in hope, desperately. But sadly, for this generation, the genie is probably already out of the bottle.
Give us a bell or drop us a line. For website design, digital marketing, web hosting, graphic design for print and SEO, contact us for a chat about how we can help you and your business.