Online safety – the ICO’s Children’s Code

Julian Hayes, Partner at BCL Solicitors LLP, examines the new Children’s Code, aimed at tackling online harms, which came into force on 2 September 2021

Neither blessed with a catchy title nor immediately in force, the Age Appropriate Design Code grabbed few headlines when it was issued by the Information Commissioner’s Office (ICO) in 2020. Now re-badged as the Children’s Code and in force from 2 September 2021, it is being feted as an early blow in the UK government’s wider campaign against online harms and, in particular, the risks to the privacy of minors. Broadly drawn, both in terms of the online service providers affected and geographic reach, the Code provides guidance on safeguards for the online treatment of children’s personal data, with compliance underpinned by the potentially severe enforcement powers of the UK GDPR. Sensing the way the wind was blowing, the tech titans had already modified their services, spurring calls for similar measures in other countries. Misgivings over the ambit and practical impact of the Code remain, however, particularly in relation to the thorny issue of age verification.

Long reach of the Code

Built around 15 high-level standards, the Code specifies the requirements which providers of ‘information society services’ (ISS) must meet if their products are likely – that is, more likely than not – to be accessed by under-18s in the UK. The requisite standards, for the most part relatively uncontroversial, include prioritising the best interests of the child when designing and developing online services, establishing the age of individual users with a level of certainty appropriate to the risk, and upholding published terms, policies and community standards.

ISS providers include the majority of online services used by children, from social media platforms, search engines and online marketplaces through to content streaming services, messaging apps and online games. No mere parochial affair, the Code’s geographical sweep takes in not only UK entities but also those elsewhere which offer their services to UK users or monitor their behaviour.

Legal effect & sanctions

Although a product of the Data Protection Act 2018 (DPA), and notwithstanding that courts must have regard to its provisions, the Code does not itself have force of law. Instead, it spells out the measures which ISS providers must meet if they are to fulfil their obligations under key aspects of both the UK GDPR and the less well-known Privacy and Electronic Communications Regulations (PECR), which govern online marketing and brought us cookie banners.

The ICO has issued dire warnings that failure to conform to the Code may invite regulatory audit and will make it more difficult for companies to demonstrate compliance with the UK GDPR and PECR, with the ultimate sanction of heavy – and headline-grabbing – financial penalties for non-compliance. Given the passion which the online safety of children arouses, those suspected of transgressing can expect little indulgence from aggrieved complainants highly motivated to bring their suspicions to the data watchdog’s attention.

Ripple effect

In the months before the Code’s implementation, tech commentators discerned a slew of modifications by social media companies aimed at improving the privacy, as well as the physical and emotional well-being, of young people on their platforms. Instagram disabled targeted advertisements for under-18s and restricted the ability of adults to message children; YouTube defaulted videos uploaded by under-18s to private and introduced ‘bedtime’ reminders; and TikTok ended push notifications for under-18s after a 10pm ‘watershed’ and placed curbs on their direct messaging.
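
By way of illustration only, the sketch below shows how such age-based defaults might be expressed in code. The field names, the age threshold and the 10pm cut-off are assumptions drawn from the changes described above, not any platform’s actual implementation.

```python
from dataclasses import dataclass
from datetime import time
from typing import Optional

@dataclass
class AccountDefaults:
    """Bundle of privacy-affecting settings applied when an account is created."""
    uploads_private: bool            # new uploads visible only to the user by default
    targeted_ads: bool               # whether behavioural advertising is served
    adults_can_message: bool         # whether unknown adults may direct-message the user
    push_quiet_from: Optional[time]  # suppress push notifications from this hour, if set

def defaults_for(age: int) -> AccountDefaults:
    """Hypothetical age-appropriate defaults, loosely modelled on the platform
    changes described above: private uploads, no targeted ads, restricted adult
    messaging and a 10pm push-notification 'watershed' for under-18s."""
    if age < 18:
        return AccountDefaults(
            uploads_private=True,
            targeted_ads=False,
            adults_can_message=False,
            push_quiet_from=time(22, 0),
        )
    return AccountDefaults(
        uploads_private=False,
        targeted_ads=True,
        adults_can_message=True,
        push_quiet_from=None,
    )

# Example: a 15-year-old signing up receives the restrictive defaults.
print(defaults_for(15))
```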

Such measures have been welcomed by UK child safety campaigners and, in the US, have led to cross-party Congressional calls for tech giants to commit to the same standards for young US users of their platforms. Heeding the call, some of the biggest social media companies have applied the changes globally.

Closer to home, the Irish Data Protection Commissioner is consulting on its own ‘Fundamentals for a Child-Oriented Approach to Data Processing’, whilst the French data supervisor, CNIL, has published eight recommendations to enhance the protection of children online. Against this backdrop, the ICO’s Code is in the vanguard of a global trend towards tackling some of the internet’s more harmful effects.

The risks of rushing in

Despite the praise garnered by the Code and the steps already taken by social media companies and other ISS providers to comply, its provisions have not escaped criticism. With likelihood of access by children the trigger for the Code to apply, the breadth of companies potentially drawn within its scope is strikingly wide. Exhortations to adopt a “common sense” approach to whether children are likely to access a service will provide little comfort to well-intentioned smaller providers, who may understandably fear regulatory action if they get it wrong. To avoid regulatory scrutiny, such providers may err on the side of caution and make their entire service child-friendly by default.

Where a service cannot be made child-friendly, providers will be obliged to introduce age-gating and age-verification measures. The Code suggests a variety of techniques which providers might use, including self-declaration, artificial intelligence, third-party age-verification services and hard identifiers such as official photo ID. Quite apart from the costs of such measures for SMEs, they are fraught with privacy risks, including the increased potential for hacking and surveillance. In 2019, the Government dropped a similar proposal for gate-keeping online pornography sites over privacy and data security concerns. The ICO intends to publish further guidance on age verification later this year.
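
As a minimal sketch of how a provider might combine these techniques in a risk-proportionate way, consider the following; the risk tiers, function names and escalation logic are illustrative assumptions rather than anything the Code prescribes.

```python
from enum import Enum
from typing import Callable, Optional

class Risk(Enum):
    """Hypothetical risk tiers: the Code calls for age assurance proportionate
    to the risks of the data processing, but does not prescribe these tiers."""
    LOW = 1   # e.g. minimal data collected, no profiling
    HIGH = 2  # e.g. targeted advertising, geolocation, direct messaging

def assure_age(
    claimed_age: int,
    risk: Risk,
    verify_with_third_party: Optional[Callable[[int], bool]] = None,
) -> bool:
    """Return True if the user may be treated as an adult.

    Self-declaration may suffice for low-risk processing; higher-risk
    processing escalates to a stronger check (a placeholder callable here,
    standing in for third-party verification or hard identifiers, with the
    privacy trade-offs noted above)."""
    if claimed_age < 18:
        return False  # apply child-appropriate settings
    if risk is Risk.LOW:
        return True   # self-declaration accepted for low-risk services
    if verify_with_third_party is None:
        # No stronger check available: default to treating the user as a
        # child, i.e. the 'child-friendly by default' fallback noted above.
        return False
    return verify_with_third_party(claimed_age)
```

Falling back to child-appropriate settings when no stronger check is available mirrors the cautious child-friendly-by-default approach smaller providers may adopt.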

More generally, by being at the forefront of the online safety regulatory wave, the UK may encounter the hazards which inevitably arise when a smaller, nimbler economy leads the way for larger ones. More sizeable markets will have greater sway over tech companies, which are keen to achieve a global regulatory baseline to streamline their compliance obligations. While the Code is laudable in its aims, by running ahead of the pack it risks getting out of step with similar regulation elsewhere, potentially deterring inward investment to the UK by large multinational ISS providers and limiting the ability of home-grown SMEs to scale up and expand overseas.

Conclusion

The ICO characterises the Code as protecting children within the digital world rather than protecting them from it, and there is little in the high-level standards it contains which is objectionable in principle. Inevitably, the practical application of the Code is where the difficulties will lie. But with the ICO expressly training its regulatory sights on the areas of highest potential harm, and with further guidance promised, it is to be hoped that this is just the beginning of a constructive dialogue between providers and regulators, one which addresses online risk proportionately without jeopardising the UK’s vibrant digital economy or the privacy rights of adults. Get this balance right and, in time, it may provide a template for a consistent regulatory model for the online sphere more generally.
