The Battle for Children’s Well-being in the Digital Age: Legal and Regulatory Approaches to Combating Addictive Technologies

The pervasive influence of screens, social media, and online games on children has sparked growing concern among legal professionals, academics, and policymakers. Last Friday, Seton Hall University School of Law hosted a conference dedicated to exploring legal responses to these addictive technologies, focusing on the impact of these platforms on young users and potential strategies for holding tech companies accountable. The discussions highlighted the urgent need for action to protect children from the harms associated with excessive screen time and exposure to manipulative online content.

New Jersey Attorney General Matthew Platkin delivered a powerful keynote address, emphasizing the importance of holding social media companies responsible for their role in perpetuating these harms. He specifically targeted Meta, accusing the company of knowingly exposing children under 13 to potentially harmful content in violation of the Children’s Online Privacy Protection Act (COPPA). Platkin also cited Meta’s internal characterization of young users as a "valuable but untapped audience," raising concerns about the company’s prioritization of profit over user safety. The Attorney General’s remarks underscored the growing bipartisan effort to address the negative impacts of social media on children, reflected in a lawsuit filed by 41 states and Washington, D.C., against Meta.

This litigation alleges that Meta designed features on Facebook and Instagram that encouraged addictive behaviors known to be harmful to young users’ mental and physical health. The lawsuit draws heavily on evidence leaked by former Facebook employee Frances Haugen, revealing the company’s awareness of these harms. Platkin signaled that Meta is just the first target in this broader campaign, with investigations into TikTok already underway. Concurrently, hundreds of school districts and individuals are pursuing legal action against major social media platforms, including Meta, Snapchat, TikTok, and Google (YouTube), for the alleged role of their products in the youth mental health crisis. These cases, consolidated in federal and California state courts, are poised to establish legal precedents regarding platform design and its impact on young users.

Attorney Laura Marquez-Garrett of the Social Media Victims Law Center presented harrowing accounts of young users who suffered severe consequences from their engagement with social media platforms. These cases included victims of online sexual exploitation, exposure to self-harm and violent content, and even instances of suicide following such exposure. Marquez-Garrett also highlighted Snapchat’s “Quick Add” feature, which allegedly facilitates connections between drug dealers and underage users, pointing to dangerous flaws in both product design and underlying algorithms. She urged parents and caregivers to shift their focus from simply monitoring how much time children spend on social media to understanding the specific content they are exposed to, emphasizing the insidious nature of harmful content recommendations.

Beyond litigation, experts discussed the importance of regulatory approaches to platform design. Fordham University School of Law professor Zephyr Teachout argued that while litigation can establish important standards, it is insufficient to ensure online safety for children. Teachout asserted that legislation is needed to comprehensively address the issue, emphasizing the need to recognize that design features are not protected speech. This view was echoed by other panelists who criticized the tech industry’s attempts to shift responsibility onto parents through parental control features. These efforts were labeled a "wild misdirection strategy" by Corbin Evans of the American Psychological Association, who argued that the onus for change must remain on the tech companies themselves.

The conference also addressed the complex ethical and philosophical implications of regulating addictive design. Gaia Bernstein, a law professor at Seton Hall University, highlighted the tech industry’s tendency to deflect blame onto parents, promoting an individualistic understanding of choice that fails to acknowledge the powerful influence of platform design on user behavior. Elettra Bietti, an assistant professor at Northeastern University School of Law, challenged this notion, arguing for a broader understanding of autonomy that considers the infrastructural pressures and dynamics shaping online behavior. This perspective aims to counter First Amendment arguments that often obstruct platform regulation, emphasizing the need to balance free expression with the protection of vulnerable users.

The conference underscored the ongoing tension between protecting children online and safeguarding freedom of expression. Attorney General Platkin dismissed concerns about free speech restrictions as a red herring, arguing that deceptive business practices and consumer harm are not protected by the First Amendment. The debate surrounding the role of social media in the youth mental health crisis was also acknowledged, with experts emphasizing the complexity of the issue and the need to move beyond polarized positions to find effective solutions. This gathering of legal experts and academics highlighted the growing momentum towards holding tech companies accountable for the impact of their products on children, marking a crucial step in the ongoing battle to ensure children’s well-being in the digital age.
