When combating influence operations, focusing on discouraging misleading digital marketing techniques is a more versatile and effective strategy than focusing on whether foreign or domestic actors are involved.
James Pamment was a nonresident scholar in the Technology and International Affairs Program at the Carnegie Endowment for International Peace. He is also an associate professor at Lund University in Sweden and co-editor-in-chief of the journal Place Branding and Public Diplomacy.
Pamment’s research examines how states influence one another through strategic communication, diplomacy, intelligence, aid, and propaganda. His most recent book, British Public Diplomacy and Soft Power: Diplomatic Influence and Digital Disruption, covers the evolution of British public diplomacy between 1995 and 2015. His most recent edited volume, Countering Online Propaganda and Violent Extremism (edited with Corneliu Bjola), assesses whether lessons learned from countering violent extremism (CVE) initiatives can be adapted to countering propaganda.
Prior to joining Carnegie, Pamment was a senior analyst at the Centre for Asymmetric Threats Studies (CATS), a governmental think tank at the Swedish National Defence University, providing support to the EU-NATO Hybrid Threats Centre of Excellence in Helsinki. He has consulted extensively for governments and international organizations on questions of countering hostile foreign influence, electoral protection, and public diplomacy. He has previously held research positions at the University of Texas at Austin, the USC Center on Public Diplomacy, and Oxford University.
The EU needs a disinformation strategy that is adaptable and built to last.
EU officials must coordinate better to mount an effective collective response to disinformation campaigns and influence operations throughout Europe.
Amid the coronavirus pandemic, Europe and the West are grappling with a host of thorny dilemmas posed by disinformation and foreign influence operations.
In 2018, Twitter released a large archive of tweets and media from Russian and Iranian troll farms. This archive of information operations has since been expanded to include activity originating from more than 15 countries, and it offers researchers unique insight into how such operations unfold on the platform.
The EU Code of Practice on Disinformation was an important experiment that has now come to an end. But what should follow? Without a renewed focus on stakeholder engagement, efforts could stall, putting everyone at risk of disinformation attacks.
Although media coverage increasingly addresses how information is used to influence target audiences, a common terminology for describing these activities is still lacking.