Russia's disinfo playbook? It's Brexit

Disinformation and counterterrorism expert Martin Innes unveils the inner workings of Russian influence operations and their connection to the Brexit campaign.

Photo: Daniel Sorabji, Getty Images

Influence operations, both online and offline, have been practiced by most, if not all, global and national powers.

However, for geopolitical reasons, Western powers like the US and UK have placed particular emphasis on studying and countering Russian disinformation.
Suffice it to say, there is now ample evidence of attempted election interference by the Russian state in many countries. The shadow of pro-Russian interference in Romania's recent presidential election even led the country's Constitutional Court, controversially, to annul the vote.

I sat down with Professor Martin Innes, the architect of the Disinformation, Strategic Communications and Open Source Research Programme (DISCOS) at Cardiff University, for an in-depth conversation about Russian disinformation techniques, their reach and effectiveness, and the role of "political technologists."

Martin Innes is Director of the Crime and Security Research Institute and the Universities' Police Science Institute, and a Professor in the School of Social Sciences at Cardiff University.
One shocking revelation emerged: in the days leading up to the Brexit referendum, Yevgeny Minchenko, one of these technologists, was in London, where he was granted access to extensive polling data from different sources involved in the referendum campaign. This data later became the foundation of his master’s course in Moscow on information warfare, designed to train the next generation of Russian political technologists.

Why did you start the project and what is emerging?

We had been conducting research using social media to understand public reactions to terror attacks. In 2017, the UK experienced four terror attacks in close succession. When we analyzed some of the responses, we noticed something alarming: several of the accounts posting about these events were the same ones identified as interfering in the 2016 U.S. presidential election. These were accounts attributed to Russia's Internet Research Agency (IRA).

That realisation felt significant. It suggested that the same operatives were not just meddling in U.S. politics but were active in other contexts too, extending their influence beyond elections. This prompted us to set up a research program to explore how social media was being weaponized for disinformation and broader information operations.

As we delved deeper, we developed a growing body of knowledge. Yet, there was this persistent question: where were these operations being directed from? It appeared to be coordinated, not random—a central engineering hub perhaps? But where? Who was translating the Kremlin's geopolitical strategy into these specific campaigns?

Initially, the assumption was it originated from Russia's intelligence services. But as we studied more, that explanation felt incomplete. We were essentially playing a game of whack-a-mole: identifying one operation, then another, and another. It was reactive, and the dots weren’t fully connecting.

We knew about the Kremlin’s overarching strategy, and we could identify individual operations, but there was a gap between the two. The prevailing assumption—that the FSB, GRU, or other intelligence agencies managed all of this—didn’t fully add up. That gap became the focus of our research.

This led us to a critical question: who was translating these high-level geopolitical strategies into the nuts-and-bolts of information operations? The answer, we suspected, lay not only with intelligence agencies but with a specific group of actors: political technologists.

Political technologists have a long history in Russia, deeply tied to its political culture. They were instrumental in Putin’s early rise and heavily involved in managing domestic political narratives. But until recently, they weren’t widely seen as having a major role in foreign information operations.

What prompted you to challenge the assumption about intelligence services managing influence ops? 

For one, we saw figures like Yevgeny Prigozhin—someone who wasn’t an intelligence officer—at the center of operations like the Internet Research Agency. That raised the question of whether there were other similar individuals or groups driving these campaigns.

Second, as we tracked these operations over time, we saw them evolving in sophistication. They became more creative, more adaptive, and more nuanced. This suggested a level of expertise and innovation that didn’t entirely match the modus operandi of traditional intelligence services. Instead, it looked like something rooted in the unique skill set of political technologists.

When we started exploring their role, it became clear that political technologists weren’t confined to managing domestic Russian politics. With events like the 2014 annexation of Crimea and the full-scale invasion of Ukraine in 2022, the scale and intensity of Russian disinformation efforts expanded dramatically.

We also began to notice a shift in the tools being used. The application of artificial intelligence, particularly large language models, became more apparent. These tools allowed for automated generation of sophisticated content, which aligned with the kind of innovative, experimental approaches political technologists are known for.

We began to triangulate these observations, leading us to a broader conclusion: while the Kremlin sets the geopolitical strategy, the tactical execution — the translation of these strategies into targeted disinformation campaigns — falls to these technologists. They occupy the space between high-level policy and ground-level operations, making them central players in Russia’s information warfare apparatus.

This realization opened up a new dimension in understanding how disinformation is designed and executed. It’s not just about spotting operations but identifying the architects behind them.

But who are they? Where do they come from? Were they instrumental in helping Putin with his first political campaign, the one famously boosted by Boris Berezovsky’s TV channel? 

One of the most influential figures in shaping Putin's early reign was Vladislav Surkov, a political technologist and former Deputy Prime Minister of the Russian Federation.

A portrait of Surkov, by Peter Pomerantsev, The Atlantic, Nov 2014

Over time, different generations of technologists have emerged, and we've identified at least 500 individuals working in political technology across various sectors. Some are deeply political, some more technology-driven, working for commercial firms, while others are closely aligned with intelligence agencies.

These technologists play a crucial role in building Russia's geopolitical influence and expanding its network, operating not just in Russia but across Europe, Latin America, and Africa. Their work involves gathering open-source intelligence—polling data, political issues—and developing tailored strategies for exerting influence in different countries. 

How?

One of their key functions is taking directives from the presidential administration and then using those to draw up country plans. So, for instance, when Russia is seeking to establish greater influence in a country, they will do what we would understand as open source research. They will look at what polling data is available, what are the political issues. They work out a plan for how they could get influence in this particular country, given the situation that it's in, and start working on that agenda. It could be political influence, economic influence, or business influence over the media ecosystem. All in play at the same time.

Do you have any evidence that this intelligence gathering is helped and supported by the Russian embassies in different countries? 

We haven't got evidence on that at this point in time. However, we have been trying to map out this particular group, and we are working on linking them to the different parts of a playbook. We're trying to understand their methodology, the big picture, rather than the single operation or goal…

Because by doing that you get the key to the playbook: then all you have to do is just check that it's still working that way, right?

Yes, but it's also necessary, for a whole host of reasons. Identifying suspect accounts has got a lot more difficult and it’s extremely time-consuming. And some of the mistakes Russian operatives were making three years ago, they don't make anymore. So we moved to pattern detection and pattern recognition, where we work on connecting the dots. Political technologists have been a key element of the ways in which Russia's information operations have evolved and of the innovations that have been introduced. In our latest report we deliberately picked up on two particularly innovative aspects. The first was what we refer to in the report as "cyber Zhirinovsky." Vladimir Zhirinovsky was a very important political figure within Russia, understood to have been a key influence on Putin's ideas about nationalism. He died in 2022.

But a particular set of political technologists, called the Social Design Agency, has taken AI large language models and filled one up with Zhirinovsky's speeches and ideological discourse. So now, even though Zhirinovsky is dead, contemporary Russians can log on to the cyber Zhirinovsky. 

He still lives. 

He lives on, exactly that. And so they can log on and ask: what does Zhirinovsky think about the invasion of Ukraine? And it’s so interesting, because Putin has had a long-standing interest in AI, and the Kremlin has been investing quite significantly in AI technologies since 2017. Now, Zhirinovsky is very meaningful within Russian political culture. But once you've done this, you can do it again. There is talk that they might invest in a cyber Lenin and do exactly the same thing: “What does Lenin think of…?”

And then there’s Minecraft, you cover this at length in your work…

Yes, they had gone into Minecraft, the gaming platform for kids, where, in 2023, they built a monument to Zhirinovsky. This gives you a better sense of the breadth of their interests: information operations are not only done through manipulation of social media or by setting up fake websites. There's a whole panoply of techniques, applied across different platforms, and we're only picking up a limited amount of it. We need a much broader radar in order to grasp the magnitude of the influence and audience manipulation.

What you are telling me means what is happening is way more insidious than we thought, because it's a whole cultural influence operation which can take many shapes and forms. 

Yeah, absolutely. This is why we are trying to connect the dots between this contemporary influence-engineering and the Russian doctrine of active measures.

The Russian doctrine of active measures is seen as a strategic, long-term, insidious exercise. I think one of the problems of the Western approach is that we're focused on specific operations and trying to measure the impact. I don't think that is the strategy that's being applied on the other side.

The active measures doctrine says: you do this over a long period of time and you chip away and there's this drip, drip, drip, cumulative effect.

You are a counterterrorism expert. What is the impact of different cultural imprints on psyops? I am thinking of Israel, the USA, China, North Korea, all of them. How do they differ? How is Russian disinformation specific? 

That's a really important question for the future. There are many caveats here, but surely each of the different threat actors we're talking about has a different culture, different objectives, different strategic imperatives. Certainly to us, the Russian approach does look like it tracks back into their whole doctrine of active measures. China, for example, looks like it can operate at quite a phenomenal scale. And the work that we've done would tend to suggest that China, in terms of its technical apparatus, is quite sophisticated and quite good. They haven't quite got the content and the presentational elements right yet, but that will happen. And now we're seeing other countries come onto the scene and start utilizing these techniques, with nuances and differences. It operates in the shadows, but it’s also becoming part of how statecraft in a digital age is being conducted and organized, to the point that some elements of it are reshaping public diplomacy.

And why, according to your findings, is the UK so prominent as a target, compared to other countries? 

The UK has been quite steadfast and quite on the front foot in its opposition to Russia; the Skripal poisonings in 2018 triggered a significant shift away from the Londongrad era. But also, because of our close alignment with the US, we were quite aware of the implications of the exposure of the 2016 presidential election interference efforts: we had seen something similar happen in 2014 during the Scottish independence referendum.

You mentioned Londongrad, that bygone era, not so long ago, when London was a paradise for Russian money, whatever its origin. You would agree that for a long time before 2018 a big chunk of the London elite, politicians included, basically went to bed with the Russians and made a lot of money. The whole of Europe was in bed with Putin, until the sanctions that followed his invasion of Ukraine. But, as you mention in your report, one of the Russians welcomed with open arms in London, in the weeks before the Brexit vote, was Yevgeny Minchenko. What happened here? Who is he? What kind of access did he get?

Minchenko is one of those prominent political technologists shaping the agenda in Moscow. We found out that he had been delivering training on disinformation methods ahead of the last US elections.

Evgeny (Yevgeny) Minchenko is the President and Chief Executive Officer of MINCHENKO CONSULTING Communication Group (credit: MCCG)
But in 2016, before the Brexit referendum, he was going around London meeting a number of significant figures in the UK establishment. He was given data and all sorts of material, on the understanding that he was here to research the Brexit process. And, it has since transpired, he's now involved in teaching a newly established master's course in Moscow. Effectively, it's a course in information warfare, using all of the information he was given to help train the next generation of political technologists. 
Minchenko’s essay, “How elections are won in the USA, Great Britain and the European Union: analysis of political technologies,” includes survey materials provided to him by UK politicians, campaign staff, political consultants, and journalists.

Wait, what did he gather from all the information collected in London? Did he influence the result of the referendum?

We have no evidence to say that he influenced the outcome of the referendum. But what he has got is lots and lots of data on public attitudes within Britain on a range of different issues. And what that data enables you to do is understand the wedge issues if you are trying to influence and manipulate a population. What are the audience segments you want to influence in particular ways? This is part of the playbook on which those political technologists operate and one of the functions they perform. Minchenko collected this data in person, but they also do it remotely. They’ll be reading all of the opinion polls produced in a country they want to influence and identifying the useful segments: the anti-immigration segment, or the people already interested in conspiracy theories. Quite rapidly, you can segment the audience and get a very good sense of the buttons you want to press if you want to run an influence campaign. Whose buttons can we press?

How was it allowed to happen?

I don't know. We're looking back very, very retrospectively, and I am slightly puzzled. We don't have the answer. But it goes back to what we were discussing: this is a long-term strategic effort rather than just little chunks of an operation. It's not just a campaign on social media. We really do need to connect the dots.

What is the role of social media platforms in countering these influence operations?

I think it's difficult to lump them all together. There is a differentiation: some of them take it more seriously and are more concerned, others less so. Also, a lot of the political and public discourse has centered on what they are doing on X (formerly Twitter) and Facebook, but it’s so much more than that.

Russians are creative, they're innovative, and they've been quite good at spotting new opportunities and new ways to reach new audiences via different social media platforms. We need to broaden the radar and have a more comprehensive understanding because they are everywhere, and they adapt very quickly. In the last two years, they have become undetectable. You cannot tell whether a comment is from a Russian troll or not.

I absolutely recommend to anybody who is interested in this area to go and read the material that the U.S. Department of Justice put out about the Social Design Agency, a legal affidavit published prior to the U.S. election. The really interesting material is in the appendices of the affidavit. It gives us a sense of the Social Design Agency, which is famous for running Doppelganger, a very prolific information operation. What it tells us is that they’ve got performance indicators: the number of different kinds of textual, video, or image-based material that they have to put out every month. It tells us that not only are they putting out particular messages, they also attach comments to these messages, with targets for how many. This gives us a rich level of detail about how much they are putting out, where they are putting it out, and how. And once you understand that, you’re much more able to see it.

It looks like they are trying to push a very specific ideology in Russia for Russians. But my understanding is that in the West, for instance, what they push is confusion and chaos. Is this correct?

I think it's wrong to reduce the geopolitical strategy to a single thing. There’s certainly an immediate short-term interest in ensuring that they come out advantaged by the war in Ukraine. There is definitely a sense in which they see the shift to a multipolar kind of world order as something that should be pursued. Beyond that, though, establishing distrust in institutions in the West, degrading public beliefs… that's all part of the armory that allows the achievement of instrumental goals.

What I have noticed is a kind of “pushing one narrative and the opposite of it.”

Absolutely. They’re not necessarily fixed on what they want to happen in the West. It's more about how to achieve a broader strategic set of goals. And again, that's part of how we need to rethink and change our understanding of what these information operations are designed to do.

But if it's a broader cultural operation, a strategic operation, you cannot tell whether they succeeded in, for instance, winning the Brexit referendum or securing Trump's new victory in the last election. You cannot reliably measure the outcome of a broad cultural operation of destabilization.

No, we cannot say that effectively. Our current research agenda is starting to try to think much more seriously about information effects. That may not necessarily be a measurement framework but perhaps an indicator framework of things that you look at and keep track of to gain a sense of how much impact is actually being leveraged over the long term.

It is a really complex set of problems, and we might not get anywhere.