Creating safe digital spaces
How enterprise brands can safeguard young online audiences.
As major brands and publishers strive to connect with younger audiences in today’s digital landscape, they face a dual challenge: engaging effectively while safeguarding the privacy of children.
To successfully navigate this difficult balance, forward-thinking organisations are adopting innovative strategies and technologies that bring together interactive content with robust data protection measures, enabling them to resonate with young readerships without risking their safety. Moreover, a growing number of brands are embracing privacy-by-design principles, integrating privacy considerations into every stage of content creation and distribution. This proactive approach not only ensures compliance with regulations, but also fosters trust and credibility among young audiences and their guardians.
For enterprises looking to achieve this level of safeguarding without compromising content and connection, this guide provides an overview of the most widely observed measures organisations must take to protect minors, covering everything from age verification processes to parental consent mechanisms. Showcasing design elements from Mused – the V&A’s digital playground developed by Big Bite – we also highlight privacy in practice, sharing real-world approaches that can help brands and publishers attract young audiences while upholding their rights and well-being.
Key regulations
Data protection regulations vary across the world, which means achieving compliance for content that can be accessed from anywhere can be a significant challenge, and one that should be considered as early as possible in the build process of any digital content or product. Specifically for the protection of children, key guidelines include:
The ICO Children’s Code
Introduced by the Information Commissioner’s Office and in full effect since 2021, the Children’s Code is a set of 15 standards that all online services must adhere to if they’re likely to be accessed by UK users under the age of 18. Officially referred to as the Age Appropriate Design Code, the standards aim to ensure that digital services used by children are designed with their best interests in mind, and apply to everything from apps and news services to online games and marketplaces.
Developed in response to growing concerns about the impact of digital technologies on children’s privacy and well-being, the Code requires brands to prioritise children’s privacy and rights in the design of their digital services. Its key principles include data minimisation – collecting and retaining only the personal data strictly necessary – and transparency, requiring services to explain to children, in age-appropriate language, how their data is used. Default settings must be set to the highest privacy level, ensuring that children automatically receive the most protective options without needing to opt in.
Additionally, the Code calls for robust measures around profiling and geolocation. Profiling children for marketing or other purposes must be avoided unless there are compelling reasons, and geolocation services should be off by default and require clear, age-appropriate information before they can be activated.
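To illustrate how ‘high privacy by default’ can translate into implementation, the TypeScript sketch below models a child account whose settings start at the most protective level, with geolocation only switchable on after an explicit, age-appropriate prompt. All names and structures here are hypothetical, intended only to show the principle.

```typescript
// Hypothetical privacy-by-default settings for a child-facing service;
// all names and fields are illustrative, not a real API.
interface PrivacySettings {
  geolocationEnabled: boolean; // off by default under the Code
  profilingEnabled: boolean;   // avoided unless there are compelling reasons
  marketingEmails: boolean;    // off by default
  publicProfile: boolean;      // most protective option by default
}

// Every new account starts at the highest privacy level; children never
// need to opt in to protection.
const DEFAULT_CHILD_SETTINGS: PrivacySettings = {
  geolocationEnabled: false,
  profilingEnabled: false,
  marketingEmails: false,
  publicProfile: false,
};

// Geolocation can only be enabled after an explicit, age-appropriate
// prompt has been shown and acknowledged.
function enableGeolocation(
  settings: PrivacySettings,
  promptAcknowledged: boolean
): PrivacySettings {
  if (!promptAcknowledged) return settings; // defaults stay untouched
  return { ...settings, geolocationEnabled: true };
}
```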
Widely recognised as a pioneering step in digital child protection, the ICO Children’s Code has set a global benchmark, and organisations that fail to comply with these standards risk significant fines and reputational damage. As a result, many digital service providers have been driven to redesign their platforms and policies to meet these stringent requirements, reflecting a broader shift towards safer digital environments for young users. The Code not only safeguards children’s data but also fosters a culture of responsibility and ethical design in the digital industry.
Further information about the Children’s Code can be found here.
General Data Protection Regulation
Implemented by the European Union in 2018, the General Data Protection Regulation (GDPR) governs how personal data of individuals in the EU can be collected and processed. Introducing strict requirements for data processing alongside substantial penalties for non-compliance, the comprehensive legal framework also recognises that young people require special protection, and includes clear provisions aimed at safeguarding children.
Specifically, Article 8 of the GDPR sets the age at which children can consent to their personal data being processed by ‘information society services’, a term that covers a wide range of digital applications including educational sites, social media platforms, news sites and online games. It states that only children aged 16 and over can lawfully provide consent for the processing of their personal data; for younger children, consent must be provided by an adult with parental responsibility. Member states are permitted to lower that age to 13, an option the UK exercised via the Data Protection Act 2018. Organisations must also make reasonable efforts to verify that adults providing consent do in fact hold parental responsibility.
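To make the consent thresholds concrete, here’s a minimal TypeScript sketch of the kind of check a sign-up flow might run, using 13 for the UK and the GDPR default of 16 where a member state hasn’t lowered it. The country mapping and function names are illustrative, not a complete legal reference.

```typescript
// Digital age of consent per jurisdiction: 13 for the UK (Data Protection
// Act 2018), 16 as the GDPR default where a member state has not lowered
// it. The entries here are illustrative, not a complete legal mapping.
const AGE_OF_CONSENT: Record<string, number> = {
  GB: 13,
  DE: 16,
  FR: 15,
};

const GDPR_DEFAULT_AGE = 16;

// Returns true when the child can consent themselves; otherwise the
// service must obtain verifiable consent from a holder of parental
// responsibility before processing their data.
function canSelfConsent(age: number, countryCode: string): boolean {
  const threshold = AGE_OF_CONSENT[countryCode] ?? GDPR_DEFAULT_AGE;
  return age >= threshold;
}
```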
The GDPR also mandates that information provided to children about data processing be written in clear, plain language that they can easily understand, so that they know how their data will be used and can make informed decisions. Like the ICO Children’s Code, the regulation emphasises the principle of data minimisation, meaning that organisations should only collect the data strictly necessary for their purposes. This principle is particularly crucial for protecting children, as it limits the potential for misuse of their data.
Additionally, the GDPR encourages the implementation of robust safeguards to protect the privacy of children, such as default privacy settings that are more stringent. Profiling of children is also heavily scrutinised, and organisations must avoid processing children’s data for profiling or marketing purposes unless they can demonstrate compelling legitimate interests and adequate safeguards.
Consolidated text of the GDPR can be found here.
The Children’s Online Privacy Protection Act
A United States federal law enacted in 1998, the Children’s Online Privacy Protection Act (COPPA) applies to the collection of personal data of children under the age of 13, and sets out the responsibilities of organisations to protect children’s privacy. Administered by the Federal Trade Commission (FTC), COPPA applies to any online service – regardless of its country of origin – that’s either directed towards US users or knowingly collects information from children based in the US.
The law places specific requirements on websites and online services that are directed at children, or that knowingly collect personal information from children, with the primary goal of giving parents control over that data. Specifically, it mandates that operators of such websites and services provide clear and comprehensive privacy policies describing the types of data collected, how the data will be used, and whether it will be disclosed to third parties.
One of the cornerstone requirements of COPPA is obtaining verifiable parental consent before collecting, using, or disclosing personal information from children. This consent must be obtained through reliable methods – such as a signed consent form, credit card transaction, or phone call with a trained customer service representative – to ensure that parents are fully aware and agree to the data collection practices affecting their children. COPPA also gives parents the right to review their children’s personal information collected by a service, as well as request its deletion and opt out of further data collection, empowering parents to actively manage their children’s online privacy and mitigate potential risks.
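As a rough illustration of how verifiable consent might be modelled in code, the hypothetical TypeScript sketch below records the consent method used and gates data collection on an unrevoked consent record, reflecting the parental rights to withdraw consent described above.

```typescript
// Hypothetical model of a COPPA-style verifiable parental consent record.
// The method names mirror the examples in the text; this is a sketch,
// not a compliance implementation.
type ConsentMethod =
  | 'signed-form'
  | 'credit-card-transaction'
  | 'phone-call';

interface ParentalConsent {
  childAccountId: string;
  method: ConsentMethod;
  grantedAt: Date;
  revokedAt?: Date; // parents may withdraw consent at any time
}

// Data collection may proceed only while an unrevoked consent record exists.
function mayCollectData(consent: ParentalConsent | undefined): boolean {
  return consent !== undefined && consent.revokedAt === undefined;
}
```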
In addition, COPPA restricts the collection of certain types of information such as geolocation data, photographs, and videos, without explicit parental consent. It also requires that reasonable measures be taken to protect the confidentiality, security, and integrity of the collected information, and issues hefty penalties – sometimes reaching millions of dollars – to organisations that fail to comply.
Full details of the COPPA regulation can be found here.
Beyond data protection
On top of measures designed to prevent children from being ‘datafied’, there are numerous steps that brands and publishers can take as part of a broader strategy to protect children from online harm.
Content moderation and filtering
Developing robust content moderation systems is crucial. Organisations can employ advanced algorithms and human moderators to review and filter content, ensuring that harmful or inappropriate material doesn’t reach young users. Additionally, filtering software can be used to block access to specific websites or content types, helping to create a safer online environment tailored to children’s developmental stages.
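One common pattern is a two-stage pipeline in which an automated classifier handles clear-cut cases and escalates anything ambiguous to a human moderator. The TypeScript sketch below shows this shape with invented thresholds and names; a production system would be considerably more sophisticated.

```typescript
// Illustrative two-stage moderation pipeline: an automated classifier
// handles clear-cut cases, while borderline content is queued for a
// human moderator before it can reach young users.
type Verdict = 'approve' | 'reject' | 'needs-human-review';

interface ModerationResult {
  verdict: Verdict;
  score: number; // 0 (benign) to 1 (harmful), from an automated model
}

function triage(score: number): ModerationResult {
  if (score >= 0.9) return { verdict: 'reject', score };  // clearly harmful
  if (score <= 0.1) return { verdict: 'approve', score }; // clearly safe
  return { verdict: 'needs-human-review', score };        // escalate
}

// Nothing ambiguous is shown to children before a human has reviewed it.
const humanReviewQueue: ModerationResult[] = [];

function moderate(score: number): Verdict {
  const result = triage(score);
  if (result.verdict === 'needs-human-review') humanReviewQueue.push(result);
  return result.verdict;
}
```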
Parental control features
Incorporating parental control features into digital products allows parents to set boundaries on their children’s online activities. These controls can include time limits, content restrictions, and monitoring tools that provide parents with insights into their children’s online behaviour, empowering parents to proactively manage and protect their children’s digital experiences.
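As a simple illustration, a parental controls feature might be backed by a per-child configuration that the service checks before starting a session or exposing a feature. The TypeScript below is a hypothetical sketch of that idea, with invented names throughout.

```typescript
// Hypothetical per-child parental-control settings and the checks a
// service might run against them; names are illustrative.
interface ParentalControls {
  dailyLimitMinutes: number;   // screen-time cap set by the parent
  blockedCategories: string[]; // e.g. 'chat', 'external-links'
}

// A new session may only begin while today's usage is under the cap.
function sessionAllowed(
  controls: ParentalControls,
  minutesUsedToday: number
): boolean {
  return minutesUsedToday < controls.dailyLimitMinutes;
}

// Individual features are hidden when a parent has restricted them.
function featureAllowed(controls: ParentalControls, category: string): boolean {
  return !controls.blockedCategories.includes(category);
}
```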
Age-appropriate experiences
Digital products and services should be designed with the specific needs and cognitive abilities of children in mind, which means using simple, intuitive interfaces and avoiding complex navigation that could confuse younger users. Games and educational tools should be engaging but also safe, steering away from features that could expose children to risks, such as unmoderated chat functions or links to external websites.
Cybersecurity measures
Alongside compliance with data privacy and processing legislation, brands must ensure that stringent cybersecurity measures are in place to protect the personal information and activities of young users. This includes regular security audits, encryption of data, and protection against cyber threats such as hacking and phishing to prevent unauthorised access to children’s data and online activities.
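Encryption of data at rest is one such measure. As a minimal sketch, the TypeScript below uses Node’s built-in crypto module to encrypt a record with AES-256-GCM; key management (secure storage, rotation) is assumed to be handled elsewhere, for example by a key vault.

```typescript
import { randomBytes, createCipheriv } from 'node:crypto';

// Minimal sketch of encrypting a child's personal data at rest with
// AES-256-GCM. Key management is out of scope here and assumed to be
// handled by a secure key store.
function encryptRecord(
  plaintext: string,
  key: Buffer // 32 bytes, e.g. retrieved from a key vault
): { iv: Buffer; ciphertext: Buffer; authTag: Buffer } {
  const iv = randomBytes(12); // unique IV per record
  const cipher = createCipheriv('aes-256-gcm', key, iv);
  const ciphertext = Buffer.concat([
    cipher.update(plaintext, 'utf8'),
    cipher.final(),
  ]);
  // The auth tag lets decryption detect any tampering with the record.
  return { iv, ciphertext, authTag: cipher.getAuthTag() };
}
```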
Safe online communities
Digital spaces can provide children with supportive environments where they can socialise and learn. However, if organisations enable children to interact online via features such as discussion boards, comment sections and chat areas, they must protect users from abuse, bullying, harassment and exploitation. In addition to clear codes of conduct, this means strong moderation at all times.
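A concrete example of always-on moderation is a pre-send check that blocks messages containing contact details, a common vector for exploitation. The TypeScript sketch below uses deliberately simple, illustrative patterns; real systems combine far more robust detection with human oversight.

```typescript
// Illustrative pre-send check for a children's chat feature: blocks
// messages that appear to contain contact details. The patterns are
// deliberately simple and would need hardening in practice.
const EMAIL_PATTERN = /[\w.+-]+@[\w-]+\.[\w.]+/;
const PHONE_PATTERN = /(?:\d[\s-]?){7,}/; // 7+ digits, possibly separated

function messageSafeToSend(text: string): boolean {
  return !EMAIL_PATTERN.test(text) && !PHONE_PATTERN.test(text);
}
```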
Reporting and response mechanisms
Establishing clear, accessible mechanisms for reporting inappropriate content or behaviour is essential. Children of all ages and their guardians should be able to report issues easily, and organisations must respond promptly and effectively. This can include hotlines, in-app/on-site reporting tools, and dedicated support teams trained to handle such reports.
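To show how such a mechanism might be structured, the hypothetical TypeScript sketch below models an in-app report and a triage rule that fast-tracks safety-critical categories to a dedicated response team.

```typescript
// Hypothetical shape of an in-app abuse report and a simple triage rule;
// categories and names are illustrative.
interface AbuseReport {
  reporterId: string;
  contentId: string;
  category: 'bullying' | 'inappropriate-content' | 'grooming' | 'other';
  submittedAt: Date;
}

// Reports that may indicate a child at immediate risk jump the queue
// to a dedicated, trained response team.
function priority(report: AbuseReport): 'urgent' | 'standard' {
  return report.category === 'grooming' ? 'urgent' : 'standard';
}
```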
Policy review and improvement
Finally, organisations should regularly review and update their policies and practices to keep pace with evolving online challenges, legislation and technological advancements. Continuous improvement ensures that child protection measures remain effective and relevant.
Privacy in practice
A firm fixture in the capital since 1852, the V&A is renowned for its diverse art, design and performance collections that span 5,000 years of human creativity. Originally conceived to inspire British designers and manufacturers, today it aims to champion creativity on a much wider scale while also inspiring younger generations.
In line with its objectives, the prestigious organisation has recently opened the doors to its renovated Bethnal Green exhibition space, which is dedicated entirely to children. Formerly known as the Museum of Childhood, the rebranded Young V&A brings together almost 2,000 toys in surroundings that aim to spark imagination, play and design. In tandem with the launch, the V&A has also unveiled Mused – its new online destination powered by a custom WordPress CMS developed by Big Bite.
Robust safeguarding
With online safety also a key requirement for the project, we’ve ensured that the site complies with a number of robust standards, including:
- The ICO Children’s Code – a statutory code of practice that covers how online services aimed at children, or likely to be used by children, should comply with UK data protection legislation.
- General Data Protection Regulation (GDPR) – an EU regulation that governs how personal data of individuals can be collected and processed.
- The Children’s Online Privacy Protection Act (COPPA) – a United States federal law that applies to the collection of personal data of children under the age of 13, and sets out the responsibilities of operators to protect children’s privacy.
As a result, the site protects users, provides peace of mind for parents and carers, and enables the V&A to share content in a safe and secure digital environment while inspiring the next generation of creative minds.
Read the full V&A – Mused case study.
Looking ahead
Protecting children online requires a multifaceted approach that goes beyond data protection policies, which means safeguarding young people must be a key consideration throughout every stage in the development of digital tools, apps, sites and features. By integrating robust content moderation systems, implementing effective parental control features, and designing age-appropriate experiences that cater to the cognitive abilities and developmental needs of children, enterprises can help to create a safer, more engaging online environment for young users.
Moreover, regular reviews and updates of these measures are crucial as online challenges evolve and new legislation emerges. Governments worldwide are increasingly scrutinising internet services and introducing laws such as the UK’s Online Safety Act 2023 to bolster child safety. Organisations must stay vigilant and proactive, ensuring they meet current legal requirements, anticipate future regulatory developments, and have the technical capabilities and platform flexibility to easily implement changes and additions as required.
Looking ahead, it’s evident that adequate safeguarding requires ongoing effort and adaptation. However, by prioritising well-being alongside safety, organisations can create digital environments that not only comply with legal standards, but also allow young people to learn, explore, and thrive online.