The proposed social media ban for children under 16 – world-first legislation – has passed through the Australian Senate with support from both sides of the aisle.
But it has already come under fire from the world’s largest social media company, Meta – the company behind Facebook, Instagram, Threads and WhatsApp – which claims the new law was “rushed through without properly considering the evidence”.
The new law won’t come into effect for 12 months; once it does, any user under 16 will be banned from accessing social media platforms like Facebook, Snapchat, TikTok and Instagram.
The proposal was driven by concern about the mental health of young people and the effects of social media on their self-image, self-worth and general wellbeing.
The move was also seen as a possible antidote to bullying after a number of cases in which children as young as 12 took their own lives following relentless attacks on these platforms.
Here is the full statement from Meta:
“Naturally, we respect the laws decided by the Australian Parliament. However, we are concerned about the process which rushed the legislation through while failing to properly consider the evidence, what industry already does to ensure age-appropriate experiences, and the voices of young people.
“Last week, the Parliament’s own committee said the “causal link with social media appears unclear,” with respect to the mental health of young Australians, whereas this week the rushed Senate Committee report pronounced that social media caused harm. This demonstrates the lack of evidence underpinning the legislation and suggests this was a predetermined process.
“The task now turns to ensuring there is productive consultation on all rules associated with the Bill to ensure a technically feasible outcome that does not place an onerous burden on parents and teens and a commitment that rules will be consistently applied across all social apps used by teens.
“One simple option is age verification at the operating system and app store level which reduces the burden and minimises the amount of sensitive information shared.”
Under the new laws, social media platforms like Facebook, Instagram and TikTok could be fined a staggering $50m if they fail to do their part to keep under-16s off their respective platforms.
But on the user side there appears to be no penalty for kids and parents who break the rules.
Mental health and digital wellbeing experts are divided on the ban.
Tech Guide (so me, Stephen Fenech) believes the move is a grab for votes as we approach a Federal election in 2025.
A total across-the-board ban for under-16s is like cracking an egg with a sledgehammer – a huge move to address some aspects of social media that are affecting some teens and their mental health.
The two main areas the Government is trying to address are the mental health and self-image of teens, and bullying.
These issues are so large and wide ranging that they go well beyond just social media.
On the bullying side, much of the bullying happens on messaging apps like WhatsApp and Messenger and via text messages – yet these platforms are not included in the ban.
I think the Government should have flipped the responsibility for the content seen on social media, and the effect it is having, onto the content creators themselves – people whose only goal is to go viral and who go to extremes to achieve it.
Why aren’t they being held to account?
We spoke with digital wellbeing expert Dr Joanne Orlando on Episode 622 of the Tech Guide podcast and she also disagrees with the ban. You can listen to the whole episode through the player below.
“One strategy like banning social media for under 16 isn’t going to fix such a complex problem,” she said.
“And I think that’s the problem with this new proposed legislation. The government needs to think about it a little bit different way.”
Dr Orlando said a lot of positive aspects of social media are also going to be taken away.
“There’s a lot of issues with the content, but there’s also a good side to it,” she said.
“For young people that’s the way they keep in contact with their friends. It’s very central to their social life.
“It’s how they get a lot of new ideas for learning and interests and the platform is really integral to being an adolescent and young adult now.
“So rip that away. And what are they left with? It’s the only world they’ve known. So I think we need to think about a little bit differently.”
The other issue is the perception that, with this ban, the government is trying to parent our children on our behalf.
“There’s a lot of parents who are saying, hey, why is the government telling me how to parent? That’s not their role. I didn’t invite them to,” Dr Orlando told Tech Guide.
“I’m the parent, I’ll make the decision for my own children.
“So there’s a lot of people, a lot of parents who don’t think this is the role of government at all. They want to be able to choose how they bring their child up.”
And on the point of making content creators responsible for the effects of their content, Dr Orlando agreed with our opinion.
“So at this point, the ban is kind of just pointing the finger at the young people.
“You’re not mature enough to use social media. Yet, the content creators and the platforms can kind of do what they want, can’t they?
“Content creators can create anything. It could be full of misinformation. They’re still allowed to put it up and it will still go viral because probably because they’re saying something extreme on there that’s going to get viral.
“So I think absolutely they should be held accountable. Why aren’t they being fined for uploading content that is deliberately misleading or deliberately misinformation?”