X Corp. Challenges Minnesota’s Deepfake Law, Claiming Censorship and Constitutional Violations
ST. PAUL, MN – In a legal battle brewing in the heart of the Midwest, social media giant X Corp., formerly known as Twitter, has launched a First Amendment challenge against a Minnesota law aimed at combating the spread of deepfakes during elections. The lawsuit, filed in the U.S. District Court for the District of Minnesota, names Attorney General Keith Ellison as the defendant and claims the statute is unconstitutionally vague and infringes on free speech rights. The law, enacted in 2023, prohibits the dissemination of deepfake content related to elections. X Corp., owned by Tesla and SpaceX CEO Elon Musk, argues that the law's ambiguity forces platforms to censor legitimate political speech for fear of criminal prosecution.
At the heart of X Corp.'s argument are the statute's broad definition of deepfakes and the punitive measures it imposes. The law criminalizes the spread of deepfakes within 90 days of a political party's nominating convention, during the absentee voting period, or during any election. X Corp. contends that this timeframe is excessively restrictive and encompasses a significant portion of the political calendar. Moreover, the law carries penalties of up to 90 days in jail for a first offense and up to five years in prison for subsequent violations, which the company says creates a chilling effect on online discourse. The lawsuit alleges that this threat of criminal liability incentivizes platforms like X to preemptively remove content rather than risk prosecution, effectively leading to censorship.
The complaint, drafted by attorney Erick Kaardal of the Minnesota-based law firm Mohrman, Kaardal & Erickson, argues that the law's vague wording leaves social media companies uncertain about what constitutes prohibited content. This uncertainty, coupled with the threat of criminal penalties, compels platforms to err on the side of caution, suppressing legitimate political speech to avoid legal jeopardy. Kaardal emphasizes that the law lacks clear guidelines for distinguishing genuine political satire, parody, or commentary from malicious deepfake manipulations, thereby jeopardizing protected forms of expression.
To illustrate the potential for abuse, the lawsuit cites an instance where an X user employed AI to generate an image depicting the arrest of then-presidential candidate Donald Trump. The complaint argues that under the Minnesota statute, X Corp. could be held criminally liable simply for hosting this AI-generated image on its platform, even if the image was clearly intended as satire or commentary and not meant to deceive voters. This, X Corp. contends, demonstrates the statute’s overreach and its potential to stifle legitimate political expression. The company argues that its constitutional rights, as well as the rights of its users, are being infringed upon by the law’s vague provisions and potential for arbitrary enforcement.
This isn't the Minnesota law's first encounter with controversy. In a separate legal proceeding concerning an alleged deepfake of former Vice President Kamala Harris, Attorney General Ellison relied on an expert whose citations were later revealed to be AI-generated, raising serious concerns about the state's approach to combating misinformation. A federal judge subsequently barred the expert's testimony. The incident underscores the complexities and potential pitfalls of legislating against deepfakes in a rapidly evolving digital landscape.
While the lawsuit challenges the Minnesota law's constitutionality, it also raises broader questions about the balance between protecting election integrity and upholding free speech in the age of AI-generated content. As deepfake technology becomes more sophisticated and accessible, lawmakers grapple with crafting legislation that addresses the potential for manipulation without unduly restricting legitimate expression. The outcome of this legal battle could have significant implications for the future of online political discourse and the regulation of deepfakes across the country. X Corp. is seeking a declaratory judgment that the law is unconstitutional and an injunction preventing its enforcement. Attorney General Ellison's office has not yet commented on the lawsuit.