The Organisation for Economic Co-operation and Development (OECD) proposes a structural shift in the regulation of the digital environment: that platforms take on child protection as a design obligation, rather than a function delegated to parents or schools.
The OECD sends a clear message in its latest report on child well-being: the current digital environment is designed without minors in mind, even though they constitute a growing and particularly vulnerable share of its users. Instead of focusing on prohibitions or family recommendations, the document How's Life for Children in the Digital Age? directly addresses the structural root of the problem: the design of the platforms, the business incentives driving them, and the weak regulatory architecture surrounding them.
In other words, child protection cannot continue to be a task left to parents, teachers, or school psychologists. It must be an obligation built into the operating model of technology companies. And from an economic perspective, this implies costs, responsibilities, and regulatory changes.
A market that monetizes children's attention
The report acknowledges that digital devices offer educational, creative, and social opportunities. But it also warns that business models based on maximizing screen time and data collection directly conflict with child well-being. Platforms are designed to retain users, not to promote healthy use.
The example of algorithmic recommendation systems is revealing: "they can expose minors to inappropriate content, reinforce harmful habits, or lead them to digital communities that worsen symptoms of anxiety, depression, or isolation," warns the OECD. In its current form, digital design does not discriminate by age or vulnerability. And although children access these spaces in large numbers, they do so "in an ecosystem built without them in mind."
The OECD proposes a profound regulatory shift
To correct this systemic flaw, the OECD recommends a paradigm shift: integrating child safety into the core of digital development through five key measures that it argues should be adopted as regulatory standards.
Who pays for digital child safety?
The implementation of these measures has significant economic implications. For platforms, it means modifying products, assuming redesign costs, strengthening moderation teams, and facing possible sanctions for non-compliance. For regulators, it requires creating verifiable technical standards and providing competent agencies with technological oversight tools.
Some countries have already started moving in this direction. The report highlights the case of Australia, whose Online Safety Act requires technology companies, including those integrating generative artificial intelligence, to demonstrate that their systems do not endanger minors, under penalty of financial sanctions. But the OECD cautions that this is the exception, not the norm.
In Europe, the Digital Services Act (DSA) introduces similar obligations, but their impact will depend on how they are applied at the national level. According to the OECD, most countries still lack a public coordinating body that oversees digital risks to minors in a cross-cutting manner.
Banning social media for minors? The OECD is not convinced
Although some jurisdictions have considered banning social media access for minors under 16, the OECD takes a more nuanced stance. It acknowledges the protective intent but warns that blanket bans are difficult to enforce, can infringe on rights, and shift responsibility onto families and schools instead of holding platforms accountable.
"The real change lies in making digital environments safe by design, not by excluding minors from them," the report points out. In addition, teenagers consulted by the OECD state that they do not want to be excluded from the digital world, but to participate in it with protection, autonomy, and clarity about how platforms operate.
A new digital contract with childhood
In economic and regulatory terms, the OECD's message is clear: child safety should not be an optional extra, but a mandatory component of the digital ecosystem. This implies reviewing regulatory frameworks, redefining responsibility standards for technology companies, and designing a new digital contract where minors are at the center, not on the margins of the debate.
What is at stake is not only the protection of children but also the democratic, ethical, and social quality of the digital economy of the future.