The UK “Children’s Code” – Laying New Grounds for Age Verification
A 12-month grace period for compliance with a set of standards introduced to protect children online expired in the UK on 2 September 2021. The Age Appropriate Design Code, commonly known as the Children's Code, originally took effect on 2 September 2020. However, the UK's Information Commissioner's Office (ICO), which serves as the country's data protection watchdog, allowed a one-year grace period for organizations to bring their services into compliance.
For apps and digital services that are likely to be accessed by children, the new code means that a child's "best interests" must be taken into account, or organizations risk fines of up to £17.5 million.
Let’s take a deeper dive into what the Children’s Code is, what it means for online businesses, and how businesses can comply with the new age verification code.
What is the Children’s Code in the UK?
While the Children's Code is not a law, it sets out a series of 15 flexible standards that apply to digital services, such as apps, online games, and web and social media sites, that have a high probability of being accessed by children in the UK.
The aim of the ICO’s Children’s Code is to assist online businesses in creating and maintaining a safe, age-appropriate platform that fairly processes children’s data according to the GDPR. The code essentially advocates age verification policies. It states that unless businesses can prove their platforms are not likely to be used by children, they are left with two options: either make their service compliant with the Children’s Code, or have robust age verification processes in place to identify young users.
In addition to this, the Children’s Code forbids the use of “nudge” techniques that lure minors into giving up more information online than required. Listed below are key takeaways from the ICO’s age verification code for children.
The 15 Standards Issued by the UK Children's Code
1- Best Interests of the Child
Organizations have been mandated to consider the needs of minors on their platform as a top priority. While best interests might vary from user to user, the Children's Code gives a stark warning: "it is unlikely … that the commercial interests of an organisation will outweigh a child's right to privacy."
2- Data Protection Impact Assessments (DPIA)
A Data Protection Impact Assessment (DPIA) needs to be integrated into the main design of a digital product or service. This assessment helps businesses identify the risks that children may face through the collection of personal data. Each organization must have a flexible and scalable DPIA in place, which should be published.
3- Age-Appropriate Application
Online businesses need to keep in mind the varying needs of children at different ages and stages of development. This requires organizations to establish age profiles of end-users through any method deemed fit, such as opting for an age verification solution provider, employing self-declaration techniques, or using artificial intelligence models.
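As a minimal illustration of age profiling from a self-declared date of birth, the sketch below groups users into age bands. The band boundaries and labels here are hypothetical examples loosely modelled on the developmental stages the ICO discusses, not the code's official categories:

```python
from datetime import date

# Hypothetical age bands for illustration only; an actual deployment
# would define bands according to the ICO's published guidance.
AGE_BANDS = [
    (0, 5, "0-5: pre-school"),
    (6, 9, "6-9: core primary school years"),
    (10, 12, "10-12: transition years"),
    (13, 15, "13-15: early teens"),
    (16, 17, "16-17: approaching adulthood"),
]

def age_from_dob(dob: date, today: date) -> int:
    """Compute age in whole years from a self-declared date of birth."""
    years = today.year - dob.year
    # Subtract one year if the birthday has not yet occurred this year.
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years

def age_band(dob: date, today: date) -> str:
    """Return the matching band label, or 'adult' for users 18 and over."""
    age = age_from_dob(dob, today)
    for low, high, label in AGE_BANDS:
        if low <= age <= high:
            return label
    return "adult"

print(age_band(date(2012, 6, 1), date(2021, 9, 2)))  # 6-9: core primary school years
```

In practice, self-declaration like this is the weakest of the profiling methods mentioned above and would typically be combined with document-based or AI-driven checks.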
4- Transparency
Any information provided to the end-users must be clear, easily understandable, precise, and child-friendly. Such information should be concisely delivered at all points where data collection is activated.
5- Detrimental Use of Data
This standard prohibits organizations from using a child's personal data in ways that may be harmful to their wellbeing, or that go against global data protection standards.
Suggested Read: Age Verification – Ultimate Online Protection for Minors
6- Policies and Community Standards
All terms and conditions, community standards, and implemented policies must be published by an organization.
7- Default Settings
Organizations should place “high privacy” as a default setting, since a majority of young users simply accept any default settings that are provided. The Children’s Code states, “It is not enough to merely allow children to activate high privacy settings”.
8- Data Minimization
Digital service providers must collect and retain only the minimum amount of Personally Identifiable Information (PII) needed for the parts of the service in which a child is actively engaged. Additionally, end-users must be given separate choices over the different categories of data being collected.
9- Data Sharing
According to this standard, a child's personal information must not be shared with third parties; disclosure may only involve data sharing between different parts of the same organization.
10- Geolocation
As part of the default settings, the geolocation option must be turned off. An obvious sign must appear on the screen while the location of the end-user is being tracked. At the end of each session, the location tracking option must be switched off by default.
11- Parental Controls
Young end-users must be provided with age-appropriate information regarding parental monitoring. If an online platform allows parents to monitor their child's activity online or keep track of their location, the platform must show an obvious sign to the child during the time of active monitoring.
12- Profiling
Profiling, often used as a marketing technique, must be switched off by default and enabled only when adequate measures are in place to protect the child's online wellbeing from any violent or harmful activity.
13- Nudge Techniques
Organizations are advised to refrain from using nudge techniques that are designed to steer end-users toward the provider's preferred choice of path.
14- Connected Toys and Devices
Children’s toys and other digital devices linked to the internet must comply with the Children’s Code.
15- Online Tools
Organizations must have in place “prominent and accessible tools” that facilitate children online in exercising their data protection rights, including streamlined processes of reporting.
Why This Code is Necessary
Personally Identifiable Information and financial data of end-users sit at the heart of digital services today. From the second end-users open an app, their data starts being collected. But who are the end-users? Who is collecting the data? And where is the data being used? Monitoring this information is necessary under global data protection laws, particularly as malicious actors often use digital platforms to commit financial crimes that are frequently directed at minors. Age verification provides an effective safeguard against such threats. The statutory code of practice described above can be readily implemented through AI-powered age verification solutions designed to protect children in the digital world.
Response to the Children’s Code
In the months following the introduction of the ICO's Children's Code last year, several major platforms showed signs of taking the compliance deadline seriously. For instance, TikTok launched a wide array of changes, including restrictions on sharing options for young end-users and the automatic disabling of app notifications past bedtime for those under 18.
How Online Businesses in the UK Can Adhere to the Children's Code
Shufti Pro's age verification solution not only enables organizations to comply with the UK Children's Code, but also streamlines customer onboarding while upholding global data protection standards. Here's how age verification works in four simple steps:
- The end-user adds their name and date of birth in provided fields
- The end-user uploads an image of their official ID document
- The age verification solution uses OCR (Optical Character Recognition) technology to extract data through an automated process
- The end-user is either verified or denied access to the digital service on the basis of the provided age verification threshold
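The four steps above can be sketched roughly as follows. This is an illustrative outline only: `ocr_extracted_dob` stands in for the result of the OCR step, and the 18-year threshold is an assumed example, since the actual cut-off depends on the service and jurisdiction:

```python
from datetime import date

MIN_AGE = 18  # example threshold; the real cut-off depends on the service

def compute_age(dob: date, today: date) -> int:
    """Age in whole years as of `today`."""
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

def verify_age(declared_dob: date, ocr_extracted_dob: date, today: date) -> bool:
    """Steps 1-4 combined: the self-declared DOB (step 1) must match the
    DOB extracted from the uploaded ID document via OCR (steps 2-3), and
    the resulting age must meet the threshold (step 4)."""
    if declared_dob != ocr_extracted_dob:
        return False  # mismatch between declared and documented DOB
    return compute_age(declared_dob, today) >= MIN_AGE

# An adult whose declared and extracted DOBs agree passes verification.
print(verify_age(date(1990, 1, 1), date(1990, 1, 1), date(2021, 9, 2)))  # True
```

A production system would add document authenticity checks and liveness detection on top of this threshold logic, but the pass/fail decision reduces to the comparison shown here.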
Shufti Pro's artificially intelligent age verification solution is currently assisting 500+ online businesses in complying with global age verification laws, including the GDPR (General Data Protection Regulation), the FFDCA (Federal Food, Drug, and Cosmetic Act), COPPA (Children's Online Privacy Protection Act), and the UK's Gambling Act.
Need a walkthrough of our age verification services? Sign up for our 7-day free trial or talk to our experts right away!