Browse Reports

 

 

The uses and abuses of Deepfake Technology, February 2022

Deepfake technology is essentially artificial intelligence capable of creating realistic but false videos, photos and audio of people. While some deepfakes are harmless, the technology can be, and has been, used to commit fraud, sexually harass women, exacerbate tensions and incite violence. With the increasing dependence on the internet for news and the speed of online communication, deepfakes will pose challenges to national security and public safety, to individuals, especially women, and to the governance of cyber-security.

 

 

Network Contagion Research Institute

This is an independent, data-driven, evidence-based series of reports that the NCRI and select partners release regarding the spread of hostile ideological content. One of the main goals of these reports is to address sensitive social issues around the spread of ideology in an objective and data-driven way. NCRI aims to facilitate honest conversations about the spread of political deception, hate and manipulation, especially on social media.

 

 

Brookings: Report on how to combat fake news and disinformation

In order to maintain an open, democratic system, it is important that government, business, and consumers work together to solve the problems of fake news and disinformation. Governments should promote news literacy and strong professional journalism in their societies. The news industry must provide high-quality journalism in order to build public trust and correct fake news and disinformation without legitimizing them. Technology companies should invest in tools that identify fake news, reduce financial incentives for those who profit from disinformation, and improve online accountability.

 

 

Misinformation in Canada: Research and Policy Options, May 2021

Misinformation refers to false or misleading information. Disinformation, a subcategory of misinformation, is false information spread with intent to deceive. Both mis- and disinformation are ongoing problems that have been exacerbated by COVID-19. Evidence for Democracy completed a research project to characterize the research landscape in Canada and to provide options for addressing misinformation.

 

 

Next-Generation Technology and Electoral Democracy: Understanding the Changing Environment, Centre for International Governance Innovation

Rapid transformation of the digital sphere has created new and ever more insidious threats to democracy and the electoral process — on a global scale. Growing evidence of foreign influence operations combined with mounting worries over corporate surveillance, the power of platform monopolies and the capabilities of the dark web have challenged government and society in unprecedented ways. CIGI convened a transdisciplinary team of experts from fields such as computer science, law, public policy and digital communication to formulate a special report for key government and civil society stakeholders.

 

 

Submission to the UN Special Rapporteur on disinformation and freedom of opinion and expression

Disinformation campaigns are a growing threat to global stability and democratic values, but in some countries, laws ostensibly aimed at countering such activities have been used to crack down on journalists and civil society groups. The increase in dissemination of disinformation by state and non-state actors in pursuit of financial, ideological and political goals is concerning. Manipulation of the information environment through the propagation of disinformation risks constraining the space available to democratic stakeholders, and particularly to marginalised groups, for authentic political expression.

 

 

Carnegie Endowment for International Peace: European Democracy and Counter-Disinformation: Toward a New Paradigm?

European governments are moving into a new phase in their efforts to counter disinformation. The recent project with The Hague Program for Cyber Norms looked at how the governments of several European countries (France, Germany, Hungary, Serbia, Sweden, and the United Kingdom) have adjusted their counter-disinformation strategies during the pandemic. This identified two major trends. First, governments are realizing that the distinction between domestic and foreign disinformation has become increasingly obsolete. Second, alongside their attempts to regulate online platforms, governments are starting to think more about the democratic character of their counter-disinformation measures.

 

 

UNICEF: Digital misinformation/disinformation and children. Rapid analysis | How can we best protect children from the harms that stem from mis/disinformation?

The report goes beyond simply trying to understand the phenomenon of false and misleading information to explain how policymakers, civil society, tech companies, and parents and caregivers can act to support children as they grow up, pushing back against the rising tide of misinformation and disinformation.

 

 

Call for submissions: Challenges to freedom of opinion and expression in times of conflicts and disturbances

United Nations. July 11, 2022.

The United Nations Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, Irene Khan, will focus her next thematic report on challenges to freedom of opinion and expression in times of armed conflict and other disturbances, to be presented at the 77th session of the UN General Assembly in October 2022.

 

 

Kosovo’s Online Media Lack Resources to Combat Fake News: Report

Balkan Insight. July 19, 2022.

A report by the Press Council of Kosovo said that online media don’t have access to fact-checking specialists or enough editors and sub-editors to deal with the problems of disinformation and fake news.

 

 

Report on the sixth expert roundtable – The role of public service media in countering disinformation, 20 June 2022.

Organization for Security and Co-operation in Europe. July 8, 2022.

Presentations by:

  • Teresa Ribeiro, OSCE Representative on Freedom of the Media
  • Minna Aslama Horowitz, Researcher at the Nordic Observatory for Digital Media and Information Disorder (NORDIS)
  • Nicola Frank, Head of Institutional and International Relations, European Broadcasting Union (EBU)
  • Ara Shirinyan, Chair of the Council of Broadcasters of Armenia
  • Luc van Bakel, Editor-in-chief, research unit of VRT NWS (Belgium)
  • Marius Dragomir, Director, Center for Media, Data and Society (CMDS)

 

 

Government censorship rebrands with ‘disinformation’ campaign

Rabble.ca. Yves Engler. July 2022.

The Social Media Lab at Toronto Metropolitan University’s School of Management released a report titled “The reach of Russian propaganda and disinformation in Canada.” According to lead author Anatoliy Gruzd, “the research provides evidence that the Kremlin’s disinformation is reaching more Canadians than one would expect. Left unchallenged, state-sponsored information operations can stoke societal tensions and could even undermine democracy itself.”

 

 

Are Canadians immune to Russian propaganda? New research says you’ve likely already seen it on social media.

Toronto Metropolitan University. July 11, 2022.

Canadians are being exposed to pro-Kremlin propaganda. Slightly over half of Canadians (51%) reported encountering at least one pro-Kremlin claim about the Russia-Ukraine war on social media, according to new research from the Social Media Lab at the Ted Rogers School of Management.

In recent years, Russia has deployed bots, trolls and hackers across social media and the internet as part of its goal to shape public perception of Russia on the world stage. These tactics are an effort to curate a more favorable environment for its agenda in Ukraine, as well as in other areas of geopolitical interest.

 

 

The Reach of Russian Propaganda & Disinformation in Canada

Social Media Lab. July 12, 2022.

This report examines the extent to which Canadians are exposed to and might be influenced by pro-Kremlin propaganda on social media based on a census-balanced national survey of 1,500 Canadians conducted between May 12–31, 2022. Among other questions, the survey asked participants about their social media use, news consumption about the war in Ukraine, political leanings, as well as their exposure to and belief in common pro-Kremlin narratives.

 

 

Mis- and Disinformation Research Agenda Survey: Key Themes

MITRE. July 2022.

In the summer of 2021, MITRE surveyed recent mis- and disinformation research agendas and priorities, as well as related conference and workshop proceedings, from across academia, government, civil society, and industry to create this meta-analysis of priority research needs. We identified several major themes, which can help the nation prioritize future research.

 

 

Hoax in the Machine: Disinformation Against Voting Systems Manufacturers and Technologies in the 2022 US Midterm Elections

Recorded Future. November 7, 2022.

This report presents Recorded Future’s insights and assessments on disinformation and influence efforts targeting United States (US)-deployed voting technologies up to, during, and in the aftermath of the 2022 midterms, including electronic voting systems, voting machines, and various Election Assistance Commission-approved software and hardware used in the administration of US elections at the local, state, and federal levels. While we acknowledge that voting technologies have faced a long history of criticisms in the US, we intentionally focus this report on election infrastructure disinformation and influence efforts generated between the 2020 general election and the 2022 midterm elections.

 

 

Brief: Disinformation Risk in the United States Online Media Market, October 2022

Global Disinformation Index. October 21, 2022.

GDI’s research looked at 69 U.S. news sites, selected on the basis of online traffic and social media followers, as well as geographical coverage and racial, ethnic and religious community representation. The index scores sites across 16 indicators, which themselves contain many more individual data points, and generates a score for the degree to which a site is at risk of disinforming its readers.

The data from the study corroborates today’s general impression that hyperbolic, emotional, and alarmist language is a feature of the U.S. news media landscape.

 

 

What a Pixel Can Tell: Text-to-Image Generation and its Disinformation Potential

DRI Global. September 23, 2022

For years now, a set of disinformation tools and tactics has spread cascades of falsehood across the Internet. This problem could take a turn for the worse with the development of machine learning models, an emerging technology powered by Artificial Intelligence (AI) that hostile actors could use to support false narratives. Consider this: a hostile actor creates a false headline, builds a story around it, and uses AI to design an image that perfectly supports the erroneous narrative. This is what fully synthetic content, such as hyperrealistic images created through text prompts and powered by AI, enables. Also known as text-to-image generation, this impressive technology, with fascinating potential in its legitimate uses, could have daunting effects on our democratic public discourse.

 

 

How to fight misinformation in the post-truth era

Phys.org. Central European University. November 17, 2022.

The article also touches upon the question of the future usage of AI and the fear that it may be used to produce disinformation in the future. "The institutions of epistemic vigilance are indeed challenged by digitization. Perhaps the solution lies in digitization too, in programming AI to curate reliable information based on the principles of epistemic vigilance," the authors claim.

In conclusion, history tells us that institutions that deliver reliable information are fragile, and there is no straightforward strategy to repair them. Szegofi and Heintz believe that while it is not certain that we will have institutions of epistemic vigilance in the future, it is worth saving them, as they allow us to trust.

 

 

90% of People Claim They Fact-Check News Stories As Trust in Media Plummets

Security.org. Aliza Vigderman. November 4, 2022.

As the popularity of social media surpasses traditional news sources, information has grown more unreliable, and “fake news” becomes harder to detect. The same digital platforms that empower global communication seed doubt and spread misinformation.

The misinformation and disinformation that have influenced elections and hampered public health policies also damaged faith in all forms of media. Meanwhile, political attacks on some news sources have divided Americans further into partisan camps.

The nation is united, however, in recognizing the problem. Our second annual study of more than 1,000 people revealed that nine out of 10 American adults fact check their news, and 96 percent want to limit the spread of false information.

 

 

Disinformation is a Regional Economic Problem

Asia-Pacific Economic Cooperation. Emmanuel A. San Andres. October 25, 2022

Information disorder is a catch-all phrase for the spread of false information which may cause negative societal and even economic harm—whether or not the harm was the intent of the creators and spreaders. It can be divided into three categories: misinformation, or the sharing of false information with no intent to harm anyone; disinformation, or the sharing of false information with intent to do harm; and malinformation, or the repurposing or recontextualization of facts, also with harmful intent. All three rely on how fast stories can spread online, to dangerous effect.

 

 

New pro-China disinformation campaign targets 2022 elections: Report

Axios. Sam Sabin. October 26, 2022.

Researchers at Google-owned Mandiant said in a report Wednesday that they've detected a group attempting to sow division in the U.S. and "operating in support of the political interests of the People’s Republic of China."

Mandiant's information adds to growing reports that pro-China actors are interested in influencing and disrupting next month's elections — although there's no evidence they've been successful.

 

 

Pro-PRC DRAGONBRIDGE Influence Campaign Leverages New TTPs to Aggressively Target U.S. Interests, Including Midterm Elections

Mandiant Intelligence. October 26, 2022.

Mandiant has recently observed DRAGONBRIDGE, an influence campaign we assess with high confidence to be operating in support of the political interests of the People’s Republic of China (PRC), aggressively targeting the United States by seeking to sow division both between the U.S. and its allies and within the U.S. political system itself.

 

Resilience Against Disinformation: A New Baltic Way to Follow?

RKK ICDS. October 20, 2022.

The Baltic states, although not immune to disinformation, have accumulated unique experience and developed effective methods to resist and combat this malice.

This report is based on in-depth semi-structured interviews and supplementary surveys conducted with the representatives of several clusters – media, civil society organizations, state institutions, think-tanks/academia and business communities. It aims to assess risks and vulnerabilities, as well as the three nations’ preparedness to counteract foreign-led disinformation. This report also reviews the existing indices that lead to a greater understanding of the intricate nature and interdependence of resilience-shaping factors at various levels, while contributing the unique Baltic perspective to the evolving, global study of disinformation.

 

 

Researchers Analyze the Nuances of Misinformation

MIT Initiative on the Digital Economy. Peter Dizikes. December 2022.

Stopping the spread of political misinformation on social media may seem like an impossible task. But a new study co-authored by MIT scholars finds that most people who share false news stories online do so unintentionally, and that their sharing habits can be modified through reminders about accuracy. When such reminders are displayed, it can increase the gap between the percentage of true news stories and false news stories that people share online, as shown in online experiments that the researchers developed.

 

A Capability Definition and Assessment Framework for Countering Disinformation, Information Influence, and Foreign Interference

NATO Strategic Communications Centre of Excellence. J. Pamment. December 5, 2022.

This report proposes a capability assessment framework for countering disinformation, information influence, and foreign interference. At present, much emphasis is placed on the capability to counter disinformation and other associated phenomena. However, few have attempted to systematically define what those countermeasures are, and how they could be placed within a single, coherent capability assessment framework. Since there is no one-size-fits-all solution to this problem, this report provides a flexible approach to capability assessment based on simple principles that can be applied by different types of actors. In support of this, and drawing upon previous research in this subject area, four capability assessment tools are established to solve different assessment problems.

 

Hard News: Journalists and the Threat of Disinformation.

Pen America. May 2023.

In a nationwide survey, PEN America asked reporters and editors from local, regional, and national outlets how working amid floods of disinformation—content created or distributed with intent to deceive—is altering their profession, their relationships with their sources and audiences, and their lives. Responses from more than 1,000 U.S. journalists reveal that disinformation is significantly changing the practice of journalism, disrupting newsroom processes, draining the attention of editors and reporters, demanding new procedures and skills, jeopardizing community trust in journalism, and diminishing journalists’ professional, emotional, and physical security. Journalists told PEN America how worried they are about the impact of disinformation on their work, the time and effort it takes to keep from inadvertently spreading falsehoods—and how underequipped they and their newsrooms are to effectively counter the torrents of untruths that threaten a free press’s critical role in our democratic process. Only 18 percent of the reporters and editors responding said they were being offered sufficient professional development support on how to detect and report on disinformation.

 

REPORT on foreign interference in all democratic processes in the European Union, including disinformation

European Parliament. May 2023.

The work of the second Special Committee follows seamlessly from the first, and the present INGE 2 Resolution is intended to complement the INGE 1 Resolution. It therefore includes recommendations and updates on the EU’s coordinated strategy against foreign interference; on EU resilience building; on interference using online platforms; on the critical infrastructure and strategic sectors; on interference during electoral processes; on covert funding of political activities by foreign actors and donors; on cybersecurity and resilience of democratic processes; on the impact of interference on the rights of minorities and other vulnerable groups; on deterrence, attribution and collective countermeasures, including sanctions; and on neighbourhood policy, global cooperation, and multilateralism.

 

 

IBERIFIER Reports — Analysis of the Impact of Disinformation on Political, Economic, Social and Security Issues, Governance Models and Good Practices: The cases of Spain and Portugal

Iberifier Media Research & Fact-Checking Hub. June 2023.

This report is the result of a comparative study on the analysis of the impact of disinformation on political, economic, social and security issues, governance models and good practices in the cases of Spain and Portugal. The analysis is one of the strategic objectives of the IBERIFIER project (Iberian Media Research & Fact-Checking).

Scaling Trust on the Web: Comprehensive Report of the Task Force for a Trustworthy Future Web

Atlantic Council. June 2023

The Atlantic Council emphasizes that the escalation of risks and harm threatens the potential of transformative technologies. They assert that there is a crucial opportunity to apply valuable lessons learned over the years and invest in strengthening human dignity and global societal resilience.

According to the Atlantic Council, what happens in the offline world will inevitably manifest in the online realm. They note that the distinction between the "real" and "digital" worlds is increasingly becoming blurred, compelling individuals to engage with online tools even in previously offline domains. The Atlantic Council suggests that as we navigate the digital future, including the development of a trustworthy web, we must acknowledge and address the intricate complexities and challenges that exist both online and offline.

How Americans’ confidence in technology firms has dropped.

Brookings. Sean Kates, Jonathan Ladd & Joshua A. Tucker. June 14, 2023.

The American Institutional Confidence (“AIC”) poll is a national survey that measures confidence in institutions and support for democracy. The authors found a significant decline in Americans' confidence in technology and tech companies between 2018 and 2021, a drop more pronounced than for other institutions. In this piece, they explore the reasons behind this decline and its implications.

The Russian War on Truth: Defending Allied and Partner Democracies Against the Kremlin’s Disinformation Campaigns

NATO Parliamentary Assembly. Joelle Garriaud-Maylam. September 15, 2023.

Russian disinformation poses a serious threat to the security and democracy of Euro-Atlantic area countries. By eroding the distinction between reality and fiction, the Kremlin and its supporters seek to amplify societal divisions and destabilize Allied and partner states. They also seek to undermine citizens’ trust in democratic institutions and systems.

The aim of this draft report is to stimulate avenues of thought to aid in this fight. It delves into the origins, recent developments, objectives and operations of Russian disinformation. Additionally, it provides an overview of the various measures taken by Allied and partner nations to counter it and examines the reasons behind Ukraine’s success in combatting Russian information manipulation during the renewed invasion. Finally, it sets out recommendations for Allied governments and NATO to develop a more cohesive and efficient response to Russian disinformation.

Global Risks Report 2024

World Economic Forum. January 10, 2024.

The Global Risks Report explores some of the most severe risks we may face over the next decade, against a backdrop of rapid technological change, economic uncertainty, a warming planet and conflict. As cooperation comes under pressure, weakened economies and societies may only require the smallest shock to edge past the tipping point of resilience.

Where False Information Is Posing the Biggest Threat

Statista. Anna Fleck. January 19, 2024.

In an unprecedented year for elections, false information is one of the major threats that people around the world will face according to experts surveyed for the World Economic Forum’s 2024 Global Risk Report. The following chart shows the varying degrees to which misinformation and disinformation are rated to be problems for a selection of analyzed countries in the next two years, based on a ranking of 34 economic, environmental, geopolitical, societal and technological risks.

Tracking AI-enabled Misinformation: 725 ‘Unreliable AI-Generated News’ Websites (and Counting), Plus the Top False Narratives Generated by Artificial Intelligence Tools

NewsGuard. McKenzie Sadeghi et al. February 2024.

NewsGuard has so far identified 725 AI-generated news and information sites operating with little to no human oversight and is tracking false narratives produced by artificial intelligence tools. From unreliable AI-generated news outlets operating with little to no human oversight, to fabricated images produced by AI image generators, the rollout of generative artificial intelligence tools has been a boon to content farms and misinformation purveyors alike.

Cyber threats to Canada’s democratic process

Communications Security Establishment Canada. 2023.

Foreign adversaries are increasingly using cyber tools to target democratic processes around the world. Disinformation has become ubiquitous in national elections, and adversaries are now using generative artificial intelligence (AI) to create and spread fake content. This report addresses cyber threat activity targeting elections, and the growing threat that generative AI poses to democratic processes globally and in Canada.