EU Considers $1 Billion Fine Against Elon Musk’s X, Citing Digital Services Act Violations

Brussels – The European Union is reportedly contemplating a landmark $1 billion fine against X, formerly known as Twitter, owned by Elon Musk. This potential penalty stems from alleged violations of the Digital Services Act (DSA), a comprehensive set of regulations designed to curb illegal and harmful online content and enhance transparency within social media platforms. Unusually, the EU is considering including revenue from Musk’s other ventures, such as Tesla and SpaceX, in the calculation of the fine, a move that could set a significant precedent for how regulators address multi-company ownership in the tech industry. This approach reflects the EU’s apparent determination to hold powerful tech figures accountable for the practices of their companies.

The potential fine, equivalent to 6% of X’s global revenue under that expanded calculation, reflects the seriousness of the alleged violations. Sources familiar with the matter revealed that the EU’s investigation centers on X’s alleged failure to comply with several key provisions of the DSA. These include accusations of insufficient data access for external researchers, lack of transparency regarding advertising practices, and inadequate verification processes for user accounts, particularly those bearing the coveted “verified” status. The EU’s assertive stance underscores its commitment to upholding the DSA and enforcing its rules against even the most prominent tech companies.

X has vehemently denied the allegations, characterizing the EU’s pursuit of penalties as an act of political censorship and an assault on free speech. The company maintains that it has diligently complied with the DSA’s requirements and has gone to great lengths to ensure user safety and protect freedom of expression within Europe. X’s Global Government Affairs team has vowed to explore all available legal avenues to challenge any potential penalties and defend its business operations. This sets the stage for a potentially protracted legal battle between the tech giant and the EU, a confrontation that could have far-reaching implications for the future of online content regulation.

The EU’s investigation into X’s practices began in 2023, culminating in a preliminary ruling in July 2024 that found the platform in violation of the DSA. X responded to the ruling by disputing the EU’s conclusions point-by-point and alleging that regulators had offered a “secret deal” whereby X could avoid fines by suppressing certain content. The EU has denied these claims, asserting that its communication with X followed standard regulatory procedures and involved clarifications about the settlement process. The conflicting narratives presented by both sides highlight the deeply entrenched nature of the dispute and suggest a difficult path towards resolution.

Beyond the immediate financial penalty, the EU is reportedly considering demanding product changes at X to address the identified concerns. The full scope of these potential changes, and the overall penalties, is expected to be announced in the coming months. However, the possibility of a settlement remains open, provided X agrees to implement changes that satisfy the EU regulators. The potential for settlement introduces an element of uncertainty into the outcome of the case, as the specific concessions required by the EU and X’s willingness to accept them remain unclear.

Adding to the complexity of the situation, X is facing a second, separate EU investigation related to its content moderation practices. This investigation focuses on allegations that the platform’s approach to policing user-generated content has inadvertently created a haven for illegal hate speech and disinformation. This second investigation could result in additional penalties for X, further compounding the company’s legal challenges. The dual investigations underscore the breadth of the EU’s concerns regarding X’s practices and signal a determined effort to hold the platform accountable for its role in the online ecosystem. The outcomes of these investigations could significantly impact the way online platforms moderate content and interact with regulatory bodies in the future.
