The Unfulfilled Promise of Child Protection Online: A Call for Stronger Enforcement and Centralized Oversight

The digital landscape presents immense opportunities for children, offering access to information, education, and entertainment. However, these same spaces also expose them to risks such as harmful content, exploitative commercial practices, and privacy violations. While the European Union has established a legal framework aimed at protecting children online, including the General Data Protection Regulation (GDPR), the Audiovisual Media Services Directive (AVMSD), and the Digital Services Act (DSA), enforcement remains inadequate, leaving children vulnerable.

One of the key weaknesses in the current framework lies in the decentralized nature of enforcement. The GDPR’s “one-stop-shop” principle, which assigns lead enforcement responsibility to the supervisory authority in the Member State where a platform has its main establishment, has proven ineffective in addressing the EU-wide reach of many online platforms. Similarly, the AVMSD’s country-of-origin principle, while suitable for national broadcasters, fails to adequately address the cross-border impact of video-sharing platforms. This fragmentation hinders effective regulation and creates loopholes that platforms can exploit.

The DSA represents a significant step forward by centralizing oversight of Very Large Online Platforms (VLOPs) with the European Commission in Brussels. However, even this legislation has blind spots, most notably its limited coverage of the video game industry, a sector with significant influence on children. Many video and mobile games fall outside the scope of both the DSA and the AVMSD, leaving children exposed to potential harms within these digital environments.

Another challenge lies in the lack of concrete guidelines to support enforcement. While the guidelines under Article 28 of the DSA on the protection of minors are promising, similar guidance is needed across the EU’s other digital laws. The long-delayed guidance on how the GDPR applies to children exemplifies this gap. Clear and actionable guidance is crucial to ensure that apps, games, and AI systems are truly age-appropriate and child-centered.

The rapid development of artificial intelligence (AI) further complicates the landscape. AI systems, often designed with engagement as a primary goal, can pose risks to children similar to those of online platforms, including pressure on their time and attention, exposure to harmful information, and unhealthy interactions. Given the potential for harm, precautionary measures and concrete guidelines for AI systems used by or affecting children are urgently needed.

Addressing these challenges requires a multi-pronged approach. Firstly, centralized oversight for platforms with EU-wide reach is essential. The one-stop-shop and country-of-origin principles should be limited to companies operating only at national level. This would streamline enforcement and ensure consistent application of the rules across the EU.

Secondly, the legal framework must be expanded to cover all relevant sectors, including the video game industry. Children deserve the same level of protection in digital games as they do on social media platforms. Including this sector under existing or future legislation would close a significant gap in child safety.

Thirdly, the development of concrete guidelines for age-appropriate design and child-centered features is crucial. These guidelines should go beyond mere interpretations of existing laws and offer practical guidance on building digital products that prioritize children’s rights and well-being.

Finally, robust, inclusive, and privacy-friendly age verification systems are necessary to ensure that children access age-appropriate content. Prioritizing EU-developed zero-knowledge systems, which can confirm that a user meets an age threshold without revealing their identity or exact date of birth, would uphold EU values and avoid reliance on external technologies.

It is crucial to remember that protecting children online does not stifle innovation. True innovation lies in creating digital spaces that are both engaging and safe for young users. By strengthening enforcement, centralizing oversight, expanding coverage, and developing clear guidelines, the EU can ensure that its digital landscape truly empowers and protects children. This is not just a legal imperative; it is a moral one.
