Google to Block Access to Adult Apps for Under-18s in Singapore

By 2026, teens in Singapore will lose access to explicit mobile applications under Google’s new safety policy. The rules were announced on Friday, 5 October, by Singapore’s Infocomm Media Development Authority (IMDA), a statutory board under the Ministry of Digital Development and Information.
The restrictions stem from Singapore’s Code of Practice for Online Safety for App Distribution Services, which mandates age assurance systems and content safeguards for minors. The code obligates tech giants such as Apple, Microsoft, Huawei, and Samsung to follow IMDA’s guidelines and bar teens from accessing sexual or violent content online.
Some controls to protect teens from explicit material, such as parental controls, already exist. Google said its new safeguards are designed to block teen access by default, without requiring parents or guardians to configure anything.
“This isn’t just about giving parents more tools. It’s about our systems automatically providing an added layer of protection to ensure that every young person has age-appropriate experiences,” said Ben King, managing director of Google Singapore.
The updated framework will require Google to estimate users’ ages and restrict their access accordingly. This can be achieved through trained AI models that infer a user’s age from their online activity. If the system detects that someone is under 18, it will activate enhanced safety filters across Google’s apps and services.
These enhanced safety filters will automatically block users from accessing explicit or violent content. For users identified as under 18, Google will also turn off Location History tracking in Google Maps to reduce non-essential data collection.
YouTube will also see new controls aimed at promoting “digital well-being” for under-18s. The video-sharing platform is expected to introduce reminders prompting users to take a break after extended periods of continuous scrolling, and to limit repeated exposure to potentially harmful content.
For users who are using the platform without signing in, Google said some safety features will still operate, such as blurring explicit visuals in Search and activating YouTube’s Restricted Mode. Adults wrongly flagged as under 18 will be able to confirm their age through an official ID or facial verification.
Rachel Teo, Head of Government Affairs and Public Policy for Google in Singapore, said the company’s goal is to provide consistent protection even when users browse anonymously.
Although the initiative aligns with IMDA’s aim to enhance online safety for young users, similar age verification measures implemented in the United States and Britain have sparked privacy concerns. Critics warn that such systems could misclassify users and misuse behavioural data collected for age estimation.
The IMDA said the new code aims to ensure “safer, more trustworthy digital environments” and will require compliance from all major app distributors operating in Singapore by March 2026.