Old enough to Google? Why you'll soon need to verify your age to log in to search tools

By the end of the year, all Australian users will need to provide assurance of their age when they sign in to a search engine account.

Search engine providers will have to comply with a range of new rules in Australia as part of an online safety code. Source: Getty / Prykhodov

By the end of this year, the experience of using search engines in Australia won't be as simple as it has always been.

That's thanks to a new online safety code announced by Australia's eSafety Commissioner, Julie Inman Grant.

Among other measures, it will require all Australian users to provide assurance of their age when they sign in to a search engine account.

So what's the new code about? How will it work in practice? And how exactly will it affect kids — and adults — in Australia who use search engines such as Google?

What's in the new code?

The code orders providers of internet search-engine services such as Google and Microsoft (which owns Bing) to "implement appropriate age assurance measures for account holders" within six months.

The code requires providers to review and mitigate "the risk that Australian children will access or be exposed to online pornography, high-impact violence material, and self-harm material" in search engine results.

The industry code of practice defines a "child" as a person under the age of 18. Under the new code, a search engine must apply tools and settings that "at a minimum" filter out online pornography and extremely violent material from search results.

Providers must also ensure advertising in these content areas is not served up in search results to child account holders.

Currently, Google account holders must be at least 13 years old.

The code creates several other rules for search engine providers that will impact everyone.
For example, providers must "prevent autocomplete predictions that are sexually explicit or violent" and prominently display crisis-prevention information, such as helplines, in the results for queries relating to topics such as self-harm, suicide and eating disorders.

Search engine providers will also have to blur some images in search results by default to reduce the risk of kids inadvertently accessing or being exposed to pornographic or violent material.

And they will have to provide parental controls to limit or alter children's access to adult material.

On top of these measures, the code requires search-engine providers to report to eSafety, invest in safety and moderation teams, and engage with community organisations.

The new code has been in development since July 2024. The Digital Industry Group Inc, an industry association representing tech companies including Google, Meta and Microsoft, co-led the drafting of it.

A single breach could result in a search engine provider copping a fine of up to $49.5 million.

How will the code work in practice?

The code does not spell out the measures to be used to assure someone's age.

They could include asking for government-issued ID or be similar to strategies currently being assessed for the Australian government's under-16s social media ban, such as facial recognition technology.

Yet, the government's recent age assurance trials highlighted concerns about the accuracy of age estimation tools, despite claims of their overall effectiveness.

Changing how people search

Once implemented, age assurance requirements will likely change how people engage with search engines and other applications.

Google is used by more than 90 per cent of Australians and for more than just searching.

The Google ecosystem includes Gmail, Google Drive, and Google Maps, providing seamless integration between search and other tools and tasks.

Google has a large suite of popular apps and services, such as Gmail, Google Drive and Google Maps. Source: Getty / Chesnot

Repeated age assurance requests could disrupt the seamlessness of content-sharing across devices that users now experience.

Many people also opt to remain logged in to their accounts on multiple devices, to quickly enable cross-device activities.

This means that within a family, users of different ages may access content through a single account, even when they don't intend to.

Will search engines need to change this functionality, logging users out of their accounts more regularly and reconfirming the account holder's age?

And how will the code affect features such as Google's 'incognito mode', which is used for private searching?

The code will apply to "any features integrated within the search functionality and the user interface" of the service, including results generated by artificial intelligence (AI).

This means results generated by Google's Gemini AI service fall under the code, alongside traditional search results.
However, the code doesn't apply to "standalone applications or tools that are not integrated within the internet search engine service".

This means that while a browser extension such as ChatGPT for Google, which integrates with the search engine service, may fall under the code, the standalone ChatGPT app could be excluded.

This may make searching even more confusing for users, as many people treat generative AI tools as if they were search engines, without understanding the limitations of doing so.

Will the code work?

As with all age assurance checks, there may be ways people can get around these new search engine controls.

For example, they may use VPNs to trick the system into believing they are outside of Australia (and therefore not subject to age assurance checks).

Or, children may access content on older people's accounts and devices.

However, the code does preempt concerns that children might get around controls by simply not logging in to their accounts. And the code's insistence on reporting mechanisms means people of all ages will be able to report material and raise complaints about potential code violations.

In this way, the code seems to reflect the government's previously proposed (but now paused) "digital duty of care" legislation, which aimed to hold technology companies to account for the content they provide.

One crucial question remains: will the steps companies take to comply with the code meet Australians' expectations for seamless, integrated search practices and personal privacy as they access information online?

Lisa M. Given is a professor of information sciences and the director of the Social Change Enabling Impact Platform at RMIT University.

Correction: A previous version of this article incorrectly said the code does not define the age of a child and implied eSafety was involved in the drafting of the code. It has been amended to reflect the fact the code does define a child as being a person under the age of 18 years and was drafted by industry.


By Lisa M. Given
Source: The Conversation

