The online industry has six months to develop a set of rules and processes to protect children from harmful content.
Australia’s eSafety commissioner, Julie Inman Grant, has given notice to social media companies, search engine operators, and pornography websites, requiring them to establish “enforceable codes” to protect children from high-impact content online.
The eSafety Commissioner made the official announcement on 2 June after months of discussion on introducing an age-based social media ban, which the federal government is currently trialling.
The codes will focus on pornography but cover a wide range of other services, including app stores, ISPs, online dating services, and even multiplayer video games.
Inman Grant cited the steadily falling age at which children first encounter pornography as a key concern.
“Our own research shows that while the average age when Australian children first encounter pornography is around 13, a third of these children are actually seeing this content younger and often by accident,” Inman Grant said in a statement.
“We know kids will always be curious and will likely seek out porn as they enter adolescence and explore their sexuality, so, many of these measures are really focused on preventing unintentional exposure to young children.
“And it’s not just porn sites we are talking about here, with 60 per cent of young people telling us they were exposed to pornography on social media. This exposure was often unintentional and happened on popular services, including TikTok, Instagram and Snapchat.
“The last thing anyone wants is children seeing violent or extreme pornography without guidance, context or the appropriate maturity levels because they may think that a video showing a man aggressively choking a woman during sex on a porn site is what consent, sex and healthy relationships should look like.”
The measures the codes adopt to protect children could include age-gating, default safety settings, and better tools to let users filter out unwanted content.
Industry bodies have until 2 October 2024 to present a preliminary draft, with final codes due for registration before 19 December 2024. The process will also include a period of public consultation, and eSafety has released a position paper to help the industry respond.
“We want industry to succeed here, and we will work with them to help them come up with codes that provide meaningful protections for children,” Inman Grant said.
“However, if any code should fall short, under the Online Safety Act, I have the power to set the rules for them by moving to standards.”
David Hollingworth has been writing about technology for over 20 years, and has worked for a range of print and online titles in his career. He is enjoying getting to grips with cyber security, especially when it lets him talk about Lego.