
Misinformation Expert’s Testimony in Minnesota Deepfake Case Relies on Fabricated Sources

By Press Room, December 27, 2024

Misinformation Expert’s Credibility Under Fire Amidst Allegations of Fabricated Citations in Minnesota Deep Fake Law Case

A prominent misinformation researcher finds himself embroiled in controversy as allegations surface that he cited non-existent sources in an affidavit supporting Minnesota’s new law against election misinformation. Professor Jeff Hancock, the founding director of the Stanford Social Media Lab and a recognized authority on deception in the digital age, is facing scrutiny after lawyers challenging the law discovered fabricated citations within his expert declaration. The law, which bans the use of "deep fake" technology to manipulate elections, is being contested on First Amendment grounds, and Hancock’s affidavit forms a key pillar of the state’s defense.

The controversy centers on several academic studies cited by Hancock that appear to be entirely fictitious. One such citation references a study titled "The Influence of Deepfake Videos on Political Attitudes and Behavior," purportedly published in the Journal of Information Technology & Politics in 2023. However, a thorough search of the journal, academic databases, and the specific pages referenced by Hancock reveals no such study; entirely different articles occupy those pages. Similarly, another cited study, "Deepfakes and the Illusion of Authenticity: Cognitive Processes Behind Misinformation Acceptance," is absent from any known academic repository. These discrepancies have raised serious questions about the veracity of Hancock's declaration and the methods used in its preparation.

The plaintiffs challenging the law argue that these phantom citations bear the hallmarks of "AI hallucinations," suggesting they were generated by a large language model such as ChatGPT. While it remains unclear exactly how the fabricated citations found their way into Hancock's declaration, the incident casts a shadow over the entire document and raises concerns about the reliability of its contents. The situation is further complicated by the silence of Hancock, the Stanford Social Media Lab, and Minnesota Attorney General Keith Ellison's office, none of which has responded to repeated requests for comment.

The implications of this controversy extend beyond the immediate legal battle over Minnesota’s deep fake law. It strikes at the heart of academic integrity and the credibility of expert testimony, particularly in the rapidly evolving field of misinformation research. Hancock’s expertise lies precisely in understanding the nuances of online deception, making the presence of fabricated citations in his declaration all the more perplexing and damaging. The incident underscores the potential pitfalls of relying on AI tools without proper verification and oversight, especially in high-stakes legal contexts.

The case also highlights the ongoing debate surrounding the regulation of online speech, particularly in the context of emerging technologies like deep fakes. Proponents of the Minnesota law argue that AI-generated content poses a unique threat to democratic processes, requiring specific legal interventions. They contend that unlike traditional forms of misinformation, deep fakes are difficult to debunk and can spread rapidly online, potentially swaying public opinion and influencing election outcomes. However, critics argue that such laws infringe on free speech protections and that existing legal frameworks are sufficient to address the harms caused by deep fakes. They emphasize the importance of countering false information with accurate information, rather than resorting to censorship.

The incident involving Hancock’s affidavit provides ammunition to those who oppose restrictions on online speech. By exposing the fabricated citations, the plaintiffs argue that they have demonstrated the effectiveness of "true speech" in countering falsehoods, thereby reinforcing the argument against censorship. This case serves as a cautionary tale about the potential misuse of AI and the importance of rigorous fact-checking, especially when dealing with information that could have significant societal consequences. It also underscores the need for transparency and accountability within the legal system, particularly when expert testimony plays a crucial role in shaping legal decisions. The ongoing silence from Hancock and the involved institutions only fuels the controversy and raises further questions about the integrity of the legal process.
