Surge in Fake Political Content Observed in Canada Prior to Federal Election

By Press Room | April 19, 2025

Canada’s 2025 Federal Election: A Breeding Ground for Disinformation and Fraud

The 2025 Canadian federal election has witnessed an alarming surge in online disinformation and fraud, impacting over a quarter of the electorate. A new report by the Media Ecosystem Observatory (MEO) reveals a "dramatic acceleration" in sophisticated and polarizing misleading content, including deepfake videos and fraudulent investment ads disguised as news. This new wave of disinformation is harder to detect than in previous elections, raising serious concerns about the integrity of the electoral process and the public’s trust in information sources. The rise of these tactics coincides with the first federal election since Meta, Facebook’s parent company, blocked Canadian news content on its platforms in response to the Online News Act (Bill C-18). While the blockage was intended to address compensation for news publishers, it has inadvertently created a vacuum that malicious actors have filled.

The MEO study highlights the proliferation of Facebook ads impersonating legitimate news brands to promote cryptocurrency scams. These ads employ fake headlines and manipulated videos, luring unsuspecting users to fraudulent websites designed to steal personal financial information. This sophisticated approach, designed to appear credible and trustworthy, marks a shift from simpler forms of misinformation seen in past elections. The production quality and targeted nature of these campaigns raise questions about their origins and intent. The report warns that this isn’t merely low-effort misinformation; it’s a highly organized operation using convincing visuals engineered to resemble legitimate political coverage. This tactic exploits the public’s existing trust in established media outlets to achieve malicious ends.

Paradoxically, despite Meta’s news ban, Facebook remains a primary source of political information for over half of Canadians. This reliance exposes a significant vulnerability: users often unknowingly consume unverified content presented as news. They interact with political memes, commentary pages, and candidate posts, mistaking engagement for informed consumption. This blurring of lines between entertainment, opinion, and verified news contributes to a distorted information landscape, ripe for manipulation. The MEO argues that the absence of credible news sources on major platforms has created an ideal environment for low-quality, polarizing, and fraudulent content to flourish. This void leaves citizens vulnerable to manipulation and undermines the foundation of a well-informed electorate.

One of the most alarming trends identified in the report is the emergence of deepfake videos falsely depicting Prime Minister Mark Carney endorsing cryptocurrency investment schemes. These videos, designed to mimic segments from reputable news organizations like CBC or CTV, feature fabricated interviews and false claims about government policies. One example featured a fabricated headline announcing a retaliatory plan by Carney against Trump’s tariff hikes, which led users to a scam website. Another involved a Facebook page, Money Mindset, running French-language ads featuring a deepfake of Carney that garnered significant impressions despite a short runtime. The MEO report stresses that these imposter ads and deepfake videos undermine the credibility of both political leaders and legitimate news organizations, further eroding public trust in institutions.

While foreign interference remains a concern, with Canada’s Security and Intelligence Threats to Elections (SITE) task force identifying potential threats from China, Russia, and Iran, the MEO report suggests that the majority of the disinformation originates domestically and focuses primarily on financial scams rather than direct electoral manipulation. An operation linked to China on WeChat was flagged by SITE, but its impact was deemed immaterial. This focus on financial exploitation, while not directly aiming to sway votes, still poses a significant threat by eroding public trust and further muddying the information environment during a critical period. The prevalence of domestically-sourced scams highlights the need for greater internal vigilance and regulation of online spaces.

Despite Meta’s stated policies against such ads and its encouragement for users to report them, the MEO report criticizes the inconsistency of enforcement. Many of these fraudulent ads evade detection by avoiding explicit political labeling, which keeps them outside Meta’s public ad library and allows them to proliferate unchecked. The report draws a sharp contrast between the stringent broadcasting standards applied to television content and the lax oversight of online platforms, describing the unchecked spread of fake news ads on Facebook during a federal election as "dystopian." Meta says it continues to invest in technology and enforcement tools to combat scams and impersonation, which it characterizes as an industry-wide challenge, but researchers emphasize the need for stronger oversight, particularly in the absence of reliable news content on major platforms. The report concludes with a stark warning: the current information landscape effectively hands control to unregulated actors, leaving the public to bear the consequences.
