Children’s Rights and Wellbeing
The Leadership Position
Children face four key risk categories (the 4Cs): content, contact, conduct, and contract – ranging from ads promoting unhealthy behaviours to illegal targeting of age-restricted products.
If children are your primary audience, make child safety a brand-wide priority rather than merely a function or discipline within a single business department.
Even if you are not marketing directly to children, we strongly recommend considering children in your marketing plans whenever you advertise on services likely to be accessed by them. Prioritise children's safety and privacy even when advertising solely in adult-oriented media.
The Commercial Impact
- Social media platforms generated an estimated $11 billion in annual ad revenue from US-based users under the age of 18, according to a study of 2022 data.
- Respecting child rights and wellbeing enhances company reputation and strengthens risk management and investor confidence.
- Regulations concerning children’s rights are increasing globally, underpinned by regulatory frameworks for children’s online safety in regions such as the UK, EU, US, and Australia as well as corporate sustainability directives.
- A third of children aged 8 to 17 with a social media profile have an adult (18+) user age, having signed up with a false date of birth, according to research commissioned by Ofcom. Two in five (39%) had a user age of 16 or over.
Taking Action: Six Essential Steps
This Guide is designed to be used alongside the CAN Guiding Principles. Please work through the Guiding Principles before implementing the Guides.
These six steps were inspired by the eleven principles of Child Rights by Design from the 5Rights Foundation.
In line with the United Nations Convention on the Rights of the Child (UNCRC), we recommend that you define a child as under 18. While much of this Guide focuses on online advertising, many of these issues also apply to all forms of media.
1. Embed Safety-by-Design in the Development and Distribution of Advertisements
- Produce an ‘Advertising to Children’ guide that outlines best practices based on the principles in this document.
- Consider using the Global Alliance for Responsible Media (GARM)’s Brand Safety Floor + Suitability Framework as a reference.
- Ensure a high level of safety, privacy and security by design and default for children in both your advertising and partner and vendor services.
- Consider requiring platforms to adhere to industry standards such as the IEEE 2089-2021 Standard for an Age Appropriate Digital Services Framework, which is based on the 5Rights principles for children.
- Evaluate both illegal harm and legal-but-harmful content when assessing risk.
- Use definitions contained in legislation such as the UK's Online Safety Act (OSA) or the EU's Digital Services Act (DSA).
- Avoid advertising to children that promotes unsafe or age-inappropriate products or services, e.g., gambling, loot boxes, tobacco, alcohol, pornography, and films or games aimed at adults.
- Ensure your advertising doesn’t exploit insecurities around body image.
- Evaluate platforms’ efforts to prevent hate speech, bullying, misinformation, extremist material, and other harmful content.
- Use and develop tools to screen content before it is published publicly.
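As a concrete illustration of the last point, a screening tool can flag ad copy that mentions age-restricted categories before it is published. This is a minimal, hypothetical sketch: the category names and keyword lists are illustrative assumptions, not an official taxonomy, and a production system would use far richer classification (and human review) than keyword matching.

```python
# Hypothetical sketch of a pre-publication screen for ad creatives.
# Category names and keyword lists are illustrative assumptions only.

UNSAFE_CATEGORIES = {
    "gambling": {"casino", "betting", "jackpot"},
    "alcohol": {"beer", "vodka", "liquor"},
    "tobacco": {"cigarette", "vape", "nicotine"},
}

def screen_creative(copy_text: str) -> list[str]:
    """Return the unsafe categories whose keywords appear in the ad copy."""
    words = {w.strip(".,!?").lower() for w in copy_text.split()}
    return sorted(cat for cat, kws in UNSAFE_CATEGORIES.items() if words & kws)

# Any flagged creative would be routed to human review before publication.
flags = screen_creative("Win big at our new casino - free beer for members!")
```

In practice such a check would sit in the creative approval workflow, so that nothing reaches a child-accessible placement without passing both the automated screen and a human reviewer.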
2. Comply with Legal Frameworks and Conduct a Child Rights Impact Assessment
- Comply with existing legal frameworks such as the US Children’s Advertising Review Unit (CARU) guidelines and the EU’s General Data Protection Regulation (GDPR).
- Follow the Advertising Standards Authority (ASA) guidance for children if targeting ‘family-oriented’ content.
- Consider conducting regular Child Rights Impact Assessments and Human Rights Impact Assessments before implementing digital products, services, or emergent technologies such as AI, and check that platforms, vendors and partners have done the same.
- Consider due diligence as well as impact.
- Report problematic content or practices to internal compliance teams and/or regulatory authorities.
3. Develop and Place Advertising that is Age Appropriate by Design
- Define a child as under 18 when designing marketing plans (in line with UNCRC).
- Adhere to the Information Commissioner’s Office (ICO)’s Age Appropriate Design Code or relevant geographical code when collecting data from children.
- Avoid microtargeting and dark patterns in advertising that might reach children.
- Use licensed characters and celebrities popular with children responsibly.
- Work with audience verification vendors to understand channel demographics and avoid placing age-inappropriate ads in environments with weak age verification.
- Carry out additional risk assessments if using platforms that are set up to be used by all ages.
- Consider unintended consequences of age verification, such as privacy risks from the additional data collection it requires.
- Ask platforms that use age verification to demonstrate adherence to industry standards such as the IEEE 2089.1-2024 Standard for Online Age Verification.
- Use sophisticated targeting mechanisms to reduce children’s exposure to age-restricted products.
- Work with a child-safe technology platform to produce pre-bid segments which exclude child-appealing content without requiring extensive data collection.
- Do not use digital measurement plans for digital campaigns targeted to children under the age of 13 (or higher, as applicable in the market).
- Ensure that family-oriented environments, such as OOH during sports events, avoid featuring age-inappropriate content such as adverts for gambling, alcohol, or games aimed at adults.
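The pre-bid ideas above can be sketched as a simple eligibility check: skip impressions whose context appeals to children, and require age assurance in the environment before bidding with an age-restricted creative. The field names below are hypothetical illustrations, not an OpenRTB schema or any vendor's actual API.

```python
# Illustrative pre-bid filter for age-restricted advertising.
# Field names and category labels are hypothetical assumptions.

from dataclasses import dataclass

@dataclass
class BidRequest:
    contextual_category: str   # e.g. from a verification vendor's taxonomy
    age_assurance: bool        # does the environment verify users are 18+?

# Contexts assumed to appeal primarily to children (illustrative list).
CHILD_APPEALING = {"kids_cartoons", "toys", "children_games"}

def should_bid(req: BidRequest, age_restricted_product: bool) -> bool:
    """Skip child-appealing contexts; require age assurance for restricted ads."""
    if req.contextual_category in CHILD_APPEALING:
        return False
    if age_restricted_product and not req.age_assurance:
        return False
    return True
```

Note that the filter relies only on contextual signals and environment-level age assurance, so it avoids collecting personal data about individual children, in line with the data-minimisation points above.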
4. Support Child Users’ Decision-Making and Reduce Exploitative Features and Business Models that Harm their Agency
- Clearly label sponsored content to distinguish it from authentic, user-generated content.
- Incorporate age-appropriate visual cues and icons to account for literacy at different stages of development.
- Encourage the use of easily recognisable and consistently applied industry-wide standardised labels for different types of audio-visual commercial content.
- Label AI-generated content so children can understand what they are viewing.
- Vet influencers’ suitability based on their audience demographics and content:
- Ensure that a substantial share of their following is not children (under 18), unless the campaign has a positive message that respects children’s wellbeing.
- Tailor language and messages for both comprehension and cultural sensitivity.
- Implement special protections when working with influencers under 18.
- Conduct an internal risk assessment to consider the impact of a digital workload on child influencers’ economic, social, psychological, and physical wellbeing.
- Use advice from the National Network for Children in Employment & Entertainment (NNCEE) to help.
- Create safeguarding policies for your organisation and include them in any contract with a parent or guardian.
- Irrespective of parental authority, contact, or contracts, safeguarding the child is fundamental.
- Turn off behavioural advertising targeting individuals under 18.
- For contextual advertising to under-18s, use appropriate safety filters and vetted inclusion lists.
- Leverage contextual advertising options (Interactive Advertising Bureau Tier 1 and 2 categories) in Demand Side Platforms (DSPs).
- Use custom contextual pre-bid filters by known verification vendors.
- Consider the cumulative impact of algorithmic advertising on children’s mental health.
- Avoid exploiting children’s vulnerabilities, paying attention to the ASA’s rules on ‘Credulity and unfair pressure’ as well as CAN’s Anti Ad-Fraud Guide. Note that legislation such as the EU’s AI Act prohibits AI systems that exploit age-related vulnerabilities.
- Consent is important for children; however, take into consideration their maturity, context and wellbeing.
- Make efforts to ensure that anyone who provides consent is at least 13 years old.
- Adapt informed consent procedures so users can easily access information on how their personal data are collected, used and shared.
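The contextual-delivery points above (behavioural targeting off, safety filters, vetted inclusion lists) reduce to a simple rule: an under-18 campaign serves only on pre-vetted inventory in approved contextual categories. This sketch is purely illustrative; the domain names and category labels are invented assumptions, not real inventory or a standard taxonomy.

```python
# Minimal sketch of contextual-only delivery for under-18 audiences.
# Domains and category names below are invented, illustrative assumptions.

VETTED_INCLUSION_LIST = {"education-site.example", "sports-news.example"}
APPROVED_CATEGORIES = {"education", "sports", "family_entertainment"}

def eligible_placement(domain: str, category: str) -> bool:
    """An under-18 contextual campaign serves only on vetted, approved inventory."""
    return domain in VETTED_INCLUSION_LIST and category in APPROVED_CATEGORIES
```

Because eligibility depends only on the placement (domain and context), no behavioural profile of the child is needed, which is the point of switching from behavioural to contextual targeting for this audience.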
5. Embed Privacy-by-Design and Data Protection in Marketing Development and Distribution
- Apply the highest available standards of data protection consistently across all jurisdictions, in line with UNCRC General Comment 25 and international best practice for data protection.
- Turn off collection of personal data for under-18s by default.
- Collect and retain only the minimum amount of personal data necessary to provide the elements of a product or service.
- Avoid profiling or targeting children based on their characteristics.
- Avoid neuromarketing, emotional analytics, and immersive advertising techniques.
- Ensure AI-created advertising respects children’s privacy.
6. Treat Everyone Fairly and Provide for Diverse Needs and Circumstances
- Avoid stereotypes and negative lifestyle portrayals in advertising content.
- Design algorithms to avoid bias based on protected characteristics.
- Avoid picking up on proxy data points that may also introduce bias based on these characteristics.
- Include diverse positive role models that reflect society.
- Consider accessibility for all types of impairments.
- Prepare for potential trolling and how to support those affected.
- See advice from We Are Social.
Keep up with CAN’s good news
Sign up for our monthly newsletters for our latest good news, event invitations, recommended reads and more.