As the old adage goes, “On the internet, nobody knows you’re a dog”. But in Australia it might soon be the case that everything from search engines and social media sites, to app stores and AI chatbots will have to know your age.
The Albanese government trumpeted the passage of its legislation banning under 16s from social media – which will come into effect in December – but new industry codes developed by the tech sector and eSafety commissioner Julie Inman Grant under the Online Safety Act will probably have much larger ramifications for how Australians access the internet.
Measures deployed by online services could include checking your account history, or using facial age assurance and bank card checks. Identity checks using documents such as driver's licences, introduced to keep children under 16 off social media, will also apply to logged-in search engine accounts from December, under an industry code that came into force at the end of June.
The code will require search engines to have age assurance measures for all accounts, and where an account holder is determined to be under 18, the search engine will be required to switch on safe search features to filter content such as pornography out of search results.
Six more draft codes being considered by the eSafety commissioner would bring similar age assurance measures to a wide range of services Australians use every day, including app stores, AI chatbots and messaging apps.
Any service that hosts or facilitates access to content such as pornography, self-harm material, simulated gambling, or very violent material unsuitable for children will need to ensure children are not able to access that content.
In her National Press Club speech last month, Inman Grant flagged that the codes were needed to keep children safe at every level of the online world.
“It’s critical to ensure the layered safety approach which also places responsibility and accountability at critical chokepoints in the tech stack, including the app stores and at the device level, the physical gateways to the internet where kids sign up and first declare their ages,” she said.
The eSafety commissioner announced the intention of the codes during the development process and when they were submitted, but recent media reporting has drawn renewed attention to these aspects of the codes.
Some people will welcome the changes. News this week that Elon Musk’s AI Grok now includes a pornographic chatbot while still being labelled suitable for ages 12+ on the Apple app store prompted child safety groups to call for Apple to review the app’s rating and implement child protection measures in the app store.
Apple and Google are already developing age checks at the device level that can also be used by apps to check the age of their users.
Founder of tech analysis company PivotNine, Justin Warren, says the codes would “implement sweeping changes to the regulation of communication between people in Australia”.
“It looks like a massive over-reaction after years of policy inaction to curtail the power of a handful of large foreign technology companies,” he says.
“That it hands even more power and control over Australians’ online lives to those same foreign tech companies is darkly hilarious.”
One of the industry bodies that worked with the eSafety commissioner to develop the codes, Digi, rejected the notion they would reduce anonymity online, and said the codes targeted specific platforms hosting or providing access to specific kinds of content.
“The codes introduce targeted and proportionate safeguards concerning access to pornography and material rated as unsuitable for minors under 18, such as very violent materials or those advocating or [giving instructions for] suicide, eating disorders or self-harm,” Digi’s director of digital policy Dr Jenny Duxbury says.
“These codes introduce safeguards for specific use cases, not a blanket requirement for identity verification across the internet.”
Duxbury says companies may use inference measures – such as account history or device usage patterns – to estimate a user’s age, which would mean most users may not have to go through an assurance process.
“Some services may choose to adopt inference methods because they can be effective and less intrusive.”
However, many users may be caught by surprise when the codes come into effect, says Electronic Frontiers Australia chair John Pane.
“While most Australians seem to be aware about the discussion about social media, the average punter is blissfully unaware about what’s happening with search engines, and particularly if they go to seek access to adult content or other content that is captured by one of the safety codes, and then having to authenticate that they’re over the age of 18 in order to access that content, the people will not be happy, rightly so.”
Companies that don’t comply with the codes will face a fine similar to that of the social media ban – up to $49.5m for a breach. Other measures such as eSafety requesting sites be delisted from search results are also an option for non-compliance.
Pane says it would be better if the federal government amended the Privacy Act and introduced AI regulation requiring businesses to carry out risk assessments and banning certain AI activities deemed an unacceptable risk.
He says a duty of care for the platforms for all users accessing digital services should be legislated.
“We believe this approach, through the legislature, is far more preferable than using regulatory fiat through a regulatory agency,” he said.
Warren is sceptical the age assurance technology will work, highlighting that the search engine code was brought in before the outcome of the age assurance technology trial, due to be delivered to the government this month.
“Eventually, the theory will come into contact with practice.”
After recent media reporting about the codes, the eSafety commissioner’s office this week defended including age assurance requirements for search engines.
“Search engines are one of the main gateways available to children for much of the harmful material they may encounter, so the code for this sector is an opportunity to provide very important safeguards,” the office said.