
How COMPROP lifted the veil on political propaganda

An EU-funded project has been making a name for itself with its unique insights into political propaganda and misinformation on social media. Its work comes at a time when our societies need it most: disinformation keeps spreading online, threatening the very foundations of our democracies.

© EvgeniyBobrov #219397132, source:stock.adobe.com 2020


With the Brexit referendum and particularly tense elections in the US, 2016 was arguably a turning point for Western democracies. The outcomes of both campaigns at the polling stations were historic in themselves. But behind the scenes, a whole new kind of web-based political propaganda was born. ‘Fake news’ became the buzzword of the year, and tracking misinformation turned into a full-time job for many researchers.

The COMPROP team, funded via a grant from the European Research Council, was set up in the wake of these events. Their objective: to identify sources of misinformation during critical moments in public life and to map the spread of conspiracy theories. Over four years, the team monitored millions of public Twitter accounts and analysed data from other popular platforms like Facebook and WhatsApp. “We continuously strove to advance the state of the art in examining political news and communication on social media,” says Philip N. Howard, Director of the Oxford Internet Institute and the project’s principal investigator.

The project’s first endeavour was to create innovative methods for this analysis. To do so, the team combined traditional manual coding with automated techniques like sentiment analysis and topic modelling to produce more effective and accurate interpretations. They also developed dedicated techniques to analyse visual content (images, memes, videos) as its importance in conveying disinformation grew. “We used these data science methods to extract and analyse data, while sociologists and political scientists provided valuable insights to help us interpret our findings,” Howard explains.
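As a rough illustration of what combining topic modelling with sentiment analysis can look like in practice, here is a minimal Python sketch. It is not COMPROP’s own pipeline: the example posts, the two-topic model and the tiny word lists are assumptions made purely for demonstration, using scikit-learn.

```python
# Illustrative sketch only: a minimal topic-modelling + sentiment pipeline,
# not the COMPROP project's actual code. Assumes scikit-learn is installed;
# the example posts and the tiny sentiment lexicon are invented for the demo.
import re

from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

posts = [
    "The election was rigged, do not trust the official results",
    "Independent observers confirm the vote count was accurate",
    "Share this meme before they delete it, the truth is out there",
    "Fact-checkers rate the viral claim about the ballots as false",
]

# Topic modelling: group posts by co-occurring vocabulary.
vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(posts)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(doc_term)

# Crude lexicon-based sentiment score (hypothetical word lists; real work
# would use a validated lexicon or a trained classifier).
NEGATIVE = {"rigged", "false", "delete", "not"}
POSITIVE = {"confirm", "accurate", "truth"}

def sentiment(text: str) -> int:
    words = re.findall(r"[a-z']+", text.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

for post, topic_dist in zip(posts, doc_topics):
    print(f"topic={topic_dist.argmax()} sentiment={sentiment(post):+d} :: {post}")
```

In the project itself, automated techniques of this kind were applied alongside manual coding by trained analysts, at the scale of millions of public social media accounts.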

Insidious algorithms

In 2018, the research notably led to a report detailing the extent of Russian interference in the US presidential elections. The report was shared with the US Senate. It showed that pro-Trump messages discrediting the US electoral system were targeted specifically at African-American voters. Meanwhile, other Russian campaigns targeted further specific groups, including Hispanics, Muslims, Christians, the LGBT community and war veterans.

With its pioneering methods, the project contributed greatly to the widespread realisation that “junk news” has itself become a major news story worth researching. “Since our initial results on the US elections were published, computational propaganda and foreign interference in electoral processes have become major themes of research across academic departments. There have been a number of public awareness campaigns that have highlighted how malicious actors target voters based on their demographic profiles,” says Howard.

Each project finding was effectively communicated to larger audiences through short reports carried by various prominent news organisations and media channels. Likewise, the consortium actively engaged with a wider community of stakeholders including government organisations, activists and social media platforms themselves, to disseminate research findings and generate impact.

The range of topics covered is substantial. Besides Brexit and Donald Trump’s election, the team focused on the French presidential elections in 2017, as well as other important elections in Latin America, India and European countries. They also investigated socio-political issues such as climate change, divisive narratives and, most recently, COVID-19-related misinformation.

In one of their latest publications, the team demonstrated how questionable stories on COVID-19 from state-backed outlets in Russia, China, Turkey and Iran were shared more widely than those from major news organisations. Whilst mainstream news generated an average of 25 engagements per post, these questionable stories generated as many as 125 engagements on average, five times as many.

“Our research has been instrumental in highlighting how misinformation on social networks has effectively changed modern political campaigning. It demonstrates how malicious political actors have used computational propaganda to influence democratic processes around the world,” Howard explains. But COMPROP is also a wake-up call for governments. As the world becomes increasingly connected and Artificial Intelligence (AI) continues to rise, online manipulation and political propaganda will remain one of the defining issues of our time.


Project details

Project acronym
COMPROP
Project number
648311
Project coordinator
United Kingdom
Project participants
United Kingdom
Total cost
€ 1 980 112
EU Contribution
€ 1 980 112
Project duration
-

See also

More information about project COMPROP
