UK app makers must now comply with the 'Children's Code'

Gabby Fernie

After a 12-month grace period, a set of regulations governing how online services should treat children’s data in the UK has come into force. 

The Age Appropriate Design Code (also known as the 'Children's Code') was drawn up by the UK's data protection watchdog, the Information Commissioner's Office (ICO). It mandates that websites and apps take the “best interests” of their child users into account, or face fines of up to 4 per cent of annual global turnover.

The code applies not only to connected toys and games but also to online retail and for-profit online services, such as social media and video-sharing platforms, that have a strong pull for under-18s.

Among the code’s stipulations is that a level of “high privacy” should be applied to settings by default if the user is (or is suspected to be) a child.

The code also prohibits the use of “nudge” techniques aimed at encouraging children to give up more of their privacy than they would otherwise choose to, calls on companies to minimise the data they collect about children, and requires them to offer children privacy options that default to the maximum security.

“Overall, the age-appropriate design code being enforced from today is welcome news - any steps taken by social platforms to protect younger users are a positive thing," said Yuval Ben-Itzhak, Chief of Strategy at Emplifi. “However, the recently suggested method, where kids are expected to type in their birthdate to be restricted from content, is fragile. Without a reliable and advanced validation process, such as ID scanning or facial analysis technology, the data collection can be easily manipulated and the overall purpose will be missed. While well-intentioned platform vendors may claim to serve content per the typed-in age and feel they are doing right, under-18s are inherently digital-savvy, and it won’t take much for them to bypass this rule by setting their 'desired' age to access the content they like.

“Certainly, the user verification process is not an easy one to get right. While content moderation is challenging, it is still something algorithms and humans should label and filter while letting users select the type of content they are interested in. For global online platforms, it’s important to recognise the complexity of this challenge. For example, culture and religion will undoubtedly need to be considered when assessing content.”

The Age Appropriate Design Code came into force on Thursday.