Cigs for Kids
Kids - from toddlers to teens - are exposed to a highly addictive substance every day
People like to make the over-generalised assumption that the rise of social media is what's making us more mentally unwell. There isn't actually any evidence for this: in fact, research has consistently found no correlation between screen time and mental health.
But social media is far from harmless. And that's especially the case for children.
Social platforms have, for years now, turned a blind eye as kids well below their minimum age requirements have signed up.
This is a huge problem. Social media is awash with sexual predators, who face almost no barriers to reaching extremely young children. And on any platform, it takes minutes to find ‘pro-ana’ content that promotes eating disorders.
But even beyond the concrete evidence of wrongdoing, it’s conceptually clear why 10-year-olds should not be on Instagram. These are platforms deliberately designed to be addictive: designed to keep users hooked by altering their brain chemistry.
In other words: getting kids hooked on social media is not a million miles off getting them hooked on nicotine.
Incidentally, over the last five years we’ve done just that. Politicians have only just started to wake up to the fact that those bright pink Elf Bars with flavour names like ‘Watermelon and Candy’ were being marketed towards children. One in ten kids aged 11-15 now vapes regularly.
Vapes, while well intentioned and undoubtedly helpful for adult smokers, have been immensely harmful to children who would otherwise never have picked up a cigarette. One day, I suspect, we’ll look back on social media the same way we look at vapes today: a pretty brutal net harm that took us far too long to notice.
TikTots
TikTok has done a good job of branding itself as part of a class of apps built around passive consumption rather than connection and active use.
In other words, it’s presented itself as being about watching videos (like YouTube or Netflix) rather than about social networking or content creation (like Instagram or Twitter). This gives parents a false sense of security: that it’s not so different from plonking a kid in front of the TV.
That may explain why 16% of three- and four-year-olds in Britain are on TikTok, along with a third of five- to seven-year-olds.
Look up any article about this, and a TikTok spokesperson will give the same response: TikTok is “strictly a 13+ platform”. But when they claim that 12-year-olds shouldn’t be on their platform, they may as well say “whaaaaaa! whaaa?! I never knew that!”
TikTok is supercharged with addictiveness, and has outsized detrimental effects on young people.
Research has confirmed that TikTok is the most addictive of all social media platforms.
In fact, it takes just 35 minutes to become addicted to TikTok. And once users are hooked, TikTok’s own research finds the impacts to be a “loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.”
To come back to the impact on mental health: social media may not always cause mental illness, but it can certainly exacerbate it in ways no other tool or technology has before. If you install TikTok and engage with content related to poor mental health, it takes between 3 and 20 minutes before more than half of the videos you see are about mental illness, including content romanticising, normalising or encouraging suicide.
So what can we do?
Australia has introduced a new bill to ban under-16s from social media, with strict penalties for social media companies found to breach it.
The challenge: the only way to truly enforce an age restriction on social media is to insist on photographic ID. From a data-protection standpoint, that’s a red flag: it’s a scary thought that big tech companies (especially ones based in adversarial countries) could hold billions of people’s government-issued IDs.
Not to mention that, for all its ills, anonymity on social media is not always a bad thing: particularly for whistleblowers.
So, what's the solution?
First: governments should curb the addictiveness of social media. Social media companies measure their success by usage time: in other words, by how addictive they can make their platforms. Governments must intervene, with the help of psychologists and neuroscientists, to curtail the features and design choices that boost addictiveness.
Many tech companies use dark patterns to influence their users. Dark patterns are intentional design choices that push users towards a specific action - usually one that is bad for them, but good for the company. For example, making it difficult to delete your account, or sending a cryptic notification to alarm you into logging in (like “Someone’s talking about you!”).
We should force tech companies to introduce benevolent dark patterns. For example:
The infinite feed that is now a feature of so many platforms could be reworked to have an end.
Late-night scrolling could be disincentivised by reducing the quality of content after midnight.
Profiles could default to ‘private’, with additional obstacles to making them public.
All of these changes would serve to keep kids safer, and reduce the addictiveness of these platforms.
Above all, social media companies should be forced to disclose just how harmful they are to kids.
These companies have done research into the dangers they pose. Meta, for example, knows that Instagram is a force for bad for teenage girls. Yet they can suppress this research and ensure it never sees the light of day - like a cigarette company covering up the link between tobacco and lung cancer.
Governments should force social media companies to conduct and publish research into the harms of social media. If parents - and children too - were armed with that information, they could at least make more informed choices about how much to expose themselves and their children to this highly addictive substance.
And when does it become parents’ responsibility to police their own children’s social media usage?