Why Canada should avoid Australia’s teen social media ban: A call for better solutions
This article was originally published in The Toronto Star.
As of Wednesday, all youth under 16 in Australia will be banned from major social media platforms like TikTok, Instagram, Snapchat, YouTube, Reddit, Twitch, and X. For over a decade, whistleblowers, politicians, academics, and experts around the world have sounded the alarm about the online harms people of all ages are exposed to.
Our own research with over 1,000 Canadian teenagers highlighted that youth face a wide range of social media harms, from cyberbullying and image-based abuse (aka “revenge porn”) to algorithms that push violent and gory content, to scammers and predators who target teens through social media, and the list goes on.
And while we agree that something needs to be done when it comes to curbing these harms, a ban is not the right move. Instead of focusing on keeping teens off social media platforms, we should be focused on education and support for teens navigating the online world, and stopping the spread of problematic, violent, and hateful content for people of all ages.
The ban does nothing to prepare teens to respond to digital harms. It makes no investments in education, community training, or parental support. Youth will not be magically prepared to address problematic online behaviours or content when they turn 16.
The time and resources devoted to the ban would be better spent on education and support for digital citizenship, media literacy, and privacy rights, or on resource centres.
If social media is problematic for a 13-, 14- or 15-year-old, it is likely still problematic for a 16-, 25-, or 80-year-old. No body of research establishes 16 as a “safe threshold” for social media use, and the age at which use becomes healthy can vary across genders.
Photo caption: The Sydney Harbour Bridge is illuminated on Wednesday, the first day of Australia’s national under-16 social media ban coming into effect. The new Online Safety Amendment laws are aimed at restricting young people’s access to major social media platforms. (Brendon Thorne/Getty Images)
We are also seriously concerned that under the current model, companies will not be inclined to improve their reporting systems for harmful content. In fact, in response to the ban, YouTube is actually removing a feature that would allow teens to report content they find inappropriate.
Youth under 16 who find ways to use these platforms, despite the bans, will be unlikely to come forward and ask for help if things go wrong. After all, they weren’t supposed to be online in the first place.
The answer to mitigating online harms is not kicking teens offline.
Children and teens are active digital citizens and deserve tools and platforms that are fun, safe, and empowering. If what is on offer is clearly none of those things, the onus should be on the platforms to change how they are designed and run. The right, and braver, move is to regulate social media companies and force them to adopt a “safety by design” approach.
This broadly means that, as with the offline world, companies have a responsibility to design products that prioritize children’s safety. For example, many products are not released until after they have undergone rigorous testing for safety and reliability and are subject to independent oversight.
While other countries have these independent auditors, Canada currently doesn’t — although this could change if it reintroduces an Online Harms Bill that proposes a digital ombudsperson.
Safety by design also means social media companies need to be transparent about their own internal research and metrics. These companies collect an incredible amount of data on users and the kinds of content and problematic behaviour happening on these platforms.
In the U.K., companies are required to publish annual transparency reports, and there are calls to make their internal research accessible to academics and others for public scrutiny.
Social media companies also need to be accountable for the ways their platforms are designed and run. These platforms are built in ways that push certain content and elicit particular kinds of engagement.
Unfortunately, rather than taking responsibility for online content, social media platforms like X and Meta seem to be going the other way, getting rid of professional content moderators and slashing the Trust and Safety teams that review and respond to flagged or reported content and accounts. Holding these companies accountable is an important step in ensuring children’s best interests are kept in mind.
Around the world, we can see a desire to prioritize children’s well-being. But the way to do this is not to keep kids off these platforms, but to design them with safety principles at the forefront and children’s best interests in mind.