Carnegie Mellon University Tackles Disinformation in Democracy with Cutting-Edge Research and Tools
The integrity of democratic processes, particularly elections, faces a significant threat from the pervasive spread of disinformation in the digital age. To address this critical challenge, Carnegie Mellon University's (CMU) IDeaS center hosted "Disinformation & Democracy," an event showcasing the institution's innovative research and technological advances in combating online harms such as disinformation. The event, featuring presentations by faculty and student researchers, offered insights into the multifaceted nature of disinformation and its impact on democratic discourse. Attendees also had the opportunity to interact with cutting-edge software tools designed to identify and counter the spread of false and misleading information.
The event highlighted several key research projects and software tools developed at CMU. AESOP, a scenario generation tool built on large language models, creates realistic simulations of information environments, allowing researchers to study how individuals identify and respond to influence campaigns. ORA and BEND, powerful network analysis and visualization tools, enable the examination of social media data to understand influence dynamics and the spread of disinformation. NetMapper applies computational linguistics to text, revealing the sentiments, emotions, and moral values embedded in it and offering critical insights into how people react to and characterize misinformation. Demonstrations of bot detection tools underscored the prevalence of automated accounts in spreading disinformation across social media platforms. Finally, Sway, an AI-facilitated group chat platform, took center stage, demonstrating its potential to foster more constructive discussions on controversial topics among students with diverse viewpoints.
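To give a sense of the kind of analysis these tools perform, the sketch below shows a minimal example of measuring influence in a sharing network. It is only an illustration of the general technique, not of ORA's or BEND's actual interfaces or methods; the accounts and edge list are hypothetical.

```python
# Illustrative sketch only: the *kind* of network analysis tools like ORA and
# BEND perform at far greater scale and sophistication. Account names and the
# edge list are hypothetical, not drawn from any CMU tool or dataset.
import networkx as nx

# Hypothetical "who amplified whose post" edges from a small social media sample.
shares = [
    ("account_a", "account_b"),  # account_a shared content from account_b
    ("account_c", "account_b"),
    ("account_d", "account_b"),
    ("account_d", "account_e"),
    ("account_f", "account_e"),
]

# Build a directed graph: an edge u -> v means u amplified content from v.
graph = nx.DiGraph(shares)

# In-degree centrality highlights accounts whose content is widely amplified,
# one simple proxy for influence in a spread network.
influence = nx.in_degree_centrality(graph)

for account, score in sorted(influence.items(), key=lambda kv: -kv[1]):
    print(f"{account}: {score:.2f}")
```

In practice, researchers combine many such network measures with content and temporal signals; centrality alone is just one starting point.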
Two expert panels enriched the event, offering diverse perspectives on the complex interplay between disinformation, technology, and democracy. The first panel, "Countering Disinformation," convened experts from philosophy, economics, computer science, and communication. Simon Cullen, Assistant Professor of Philosophy, discussed his research on AI-guided discussion platforms. Uttara Ananthakrishnan, whose research focuses on the societal impact of technology, examined consumer behavior in digital environments. Evan Williams, a PhD student, shared his work on understanding how users encounter misinformation through search engines. Chris Labash, Associate Professor of Communication and Innovation, explored strategies for designing communication that promotes trust in accurate information.
The second panel, "AI, Elections, and Disinformation," delved into the intricate relationship between artificial intelligence and the democratic process. Hoda Heidari, Assistant Professor of Ethics and Computational Technologies, discussed her work on fairness and accountability in AI. Hong Shen, Assistant Research Professor in the Human-Computer Interaction Institute, explored the ethical and policy implications of digital platforms and algorithms. Christine Lepird, a PhD student, presented her research on social network analysis and the detection of inauthentic online news. Kathleen M. Carley, Director of both the CASOS and IDeaS centers, provided a comprehensive overview of her research, integrating cognitive science, network science, and computer science to tackle complex societal challenges, including online disinformation and social cybersecurity.
The "Disinformation & Democracy" event offered both in-person and virtual participation options. While the panels were livestreamed, the interactive Q&A sessions and software demonstrations were exclusively available to those attending in person. This hybrid format ensured broader reach while providing hands-on experience with the tools and technologies being developed at CMU. The event exemplified the university’s commitment to tackling the complex issue of disinformation through cutting-edge research and innovative tool development.
The software demonstrations provided a tangible glimpse into the potential of technology to combat disinformation. Attendees witnessed firsthand how tools like AESOP, ORA, BEND, NetMapper, and bot detectors can be used to identify, analyze, and understand the spread of false and misleading information. These demonstrations underscored the importance of interdisciplinary collaboration in developing effective solutions, bringing together expertise in computer science, social science, and cognitive science.
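The bot detection demonstrations, in particular, rested on the observation that automated accounts tend to leave behavioral fingerprints. The toy sketch below shows the sort of red-flag signals such detectors often start from; the thresholds, field names, and scoring are hypothetical simplifications, and the actual tools demonstrated at the event rely on machine learning over far richer feature sets.

```python
# Toy illustration of heuristic signals that bot detection often starts from.
# All thresholds and field names here are hypothetical and chosen for
# readability; real detectors use learned models over many more features.
from dataclasses import dataclass

@dataclass
class Account:
    posts_per_day: float   # average posting rate
    account_age_days: int  # time since account creation
    followers: int
    following: int

def bot_likelihood_score(acct: Account) -> float:
    """Return a crude 0-1 score from a few common red-flag signals."""
    score = 0.0
    if acct.posts_per_day > 50:                       # inhumanly high posting volume
        score += 0.4
    if acct.account_age_days < 30:                    # very new account
        score += 0.3
    if acct.followers < 10 and acct.following > 500:  # follows many, followed by few
        score += 0.3
    return min(score, 1.0)

suspect = Account(posts_per_day=120, account_age_days=5, followers=3, following=900)
print(f"bot likelihood: {bot_likelihood_score(suspect):.1f}")
```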
The event served as a powerful call to action, emphasizing the urgent need for continued research, technological innovation, and public awareness to safeguard democratic processes from the corrosive effects of disinformation. The work being done at CMU, as showcased at this event, offers a beacon of hope in the fight against online manipulation and in the effort to preserve a well-informed citizenry. The tools and insights shared at "Disinformation & Democracy" represent significant steps toward fostering a more resilient and informed digital landscape, ultimately strengthening the foundations of democratic societies.