Australia Grapples with the Thorny Issue of Online Harms: A Call for Stronger Regulation
The digital landscape has become a breeding ground for harmful content, ranging from violent videos and misogynistic material to misinformation and scams. Recent events, such as the circulation of footage from the Sydney church stabbing and the ongoing domestic violence crisis, have intensified calls for greater regulation of online platforms. The defiant stance of Elon Musk, owner of X (formerly Twitter), against complying with Australian legal requests to remove violent content has further fueled the debate. Public concern is mounting, and the government is under pressure to implement effective measures to curb online harms. A newly established parliamentary committee will delve into this complex issue, navigating a minefield of legal, practical, and ethical challenges.
Government Initiatives and Parliamentary Inquiry: A Multi-Pronged Approach
The Australian government has embarked on a series of initiatives to address online harms. In May, the government announced measures to combat violent online pornography and misogynistic content targeting children and young people, including legislation to ban deepfake pornography and a pilot project on age-assurance technologies. Furthermore, a joint parliamentary select committee has been established to investigate the broader influence and impact of social media on Australian society. The committee’s remit encompasses Meta’s withdrawal from the News Media and Digital Platforms Mandatory Bargaining Code, the role of Australian journalism in countering misinformation, the impact of platform algorithms on mental health, and the dissemination of harmful content, including scams, age-restricted material, child sexual abuse, and extremist material.
Reviewing Existing Frameworks and Addressing Misinformation: Refining Regulatory Tools
The parliamentary committee’s mandate includes monitoring the ongoing review of the Online Safety Act and contributing to the development of a bill to combat misinformation and disinformation. The government’s initial draft of this bill, released in 2023, faced criticism from the opposition, and the committee’s deliberations may provide an opportunity to refine and strengthen the proposed legislation. Its examination of Meta’s decision to abandon the News Media Bargaining Code, while seemingly redundant given the minister’s power to designate Meta under the code, could serve to reinforce the principles underpinning it and highlight the precarious financial state of Australian news media.
The Challenge of Online Age Verification: Balancing Safety and Privacy
Online age verification presents a significant challenge. While the concept is straightforward, implementation is complex, especially without robust consequences for non-compliance. Existing approaches, such as requiring minors to upload videos or photographs of their ID, contradict established online safety advice against sharing personal information. Effective age verification often requires parental intervention, either through software or supervision. However, the ease with which children can circumvent these measures by using other devices limits their effectiveness. The International Association of Privacy Professionals underscores the complexities of age verification and data protection, particularly when the age threshold is not tied to existing legal rights, like those conferred at age 18.
Learning from International Models: The European Union’s Digital Markets Act
Australia can draw inspiration from international regulatory models, such as the European Union’s Digital Markets Act. This act designates companies with significant market power as "gatekeepers" and imposes specific obligations on them, including data sharing, algorithm transparency, and interoperability. While Australia cannot replicate the collective power of the EU, elements of this approach could be adapted to the Australian context. The Digital Markets Act aims to create a fairer and more competitive digital marketplace by limiting the dominance of designated gatekeepers such as Alphabet (Google), Amazon, Apple, ByteDance (TikTok), Meta, and Microsoft.
Balancing Public Demand, Legal Obstacles, and Political Consensus: Charting a Path Forward
There is widespread public support for government intervention to address online harms. However, legal and practical obstacles, coupled with the need for political consensus, make the task challenging. The debate surrounding the misinformation bill exemplifies the difficulties of achieving agreement on regulatory measures. Nevertheless, the growing impatience of both citizens and the government with self-regulation by tech companies and the limitations of parental responsibility signal a shift towards more robust and comprehensive online safety regulations. The parliamentary committee’s work will be crucial in shaping the future of online safety in Australia, balancing the need to protect individuals from harm with the preservation of fundamental rights and freedoms.