A Kremlin-linked group known for online campaigns to sow falsehoods and distrust among Russia’s foes helped fuel the frenzy of conspiracy theories about Catherine and her health.
The whirl of conspiracy theories that enveloped Catherine, Princess of Wales, before she disclosed her cancer diagnosis last week probably didn’t need help from a foreign state. But researchers in Britain said Wednesday that a notorious Russian disinformation operation helped stir the pot.
Martin Innes, an expert on digital disinformation at Cardiff University in Wales, said he and his colleagues tracked 45 social media accounts that posted a spurious claim about Catherine to a Kremlin-linked disinformation network, which has previously spread divisive stories about Ukraine’s president, Volodymyr Zelensky, as well as about France’s support for Ukraine.
As in those cases, Professor Innes said, the influence campaign appeared calculated to inflame divisions, deepen a sense of chaos in society, and erode trust in institutions — in this case, the British royal family and the news media.
“It provokes an emotional reaction,” he said. “The story was already being framed in conspiracy terms, so you can appeal to those people. And people who support the royal family get angry.”
The motive, he said, was likely commercial as well as political. Social media traffic about Catherine skyrocketed over the last three months, as a dearth of information about her condition created a void that an online army filled with rumors and speculation. For the Russian network, amplifying those posts through its accounts would let it boost its own traffic statistics and follower counts.
It is not clear who might have hired the disinformation network to go after Catherine, but it has a track record of campaigns to undermine countries and people at odds with the Kremlin. Britain's robust support for Ukraine, and London's longstanding antagonism toward Moscow, would make it a tempting target for the Russians.
The Daily Telegraph, a London newspaper, reported on Sunday that British officials were worried that Russia, China and Iran were fueling disinformation about Catherine in an effort to destabilize the country.
Asked about these reports in Parliament on Monday, the deputy prime minister, Oliver Dowden, did not name the countries, but said it was “a reminder to us all that it is important for us to ensure that we deal with valid and trusted information, and are appropriately skeptical about many online sources.”
In 2020, a British parliamentary committee concluded that Russia had mounted a prolonged, sophisticated campaign to undermine Britain’s democracy — using tactics that ranged from disinformation and meddling in elections to funneling dirty money and employing members of the House of Lords. The Russian foreign ministry dismissed the conclusions as “Russophobia.”
Kensington Palace, where Catherine and her husband, Prince William, have their offices, declined to comment on Russia’s role in the recent rumormongering. The palace has appealed to the news media and the public to give Catherine privacy, after she announced she had cancer in a video last Friday.
Professor Innes, who leads a research program exploring the causes and consequences of digital disinformation, said his team noticed a mysterious spike in a certain type of social media post on March 19, a day after video surfaced of Catherine and William leaving a food shop near their home in Windsor.
One widely repeated post on X featured an image from the video, with Catherine’s face clearly altered. It asked, “Why do these big media channels want to make us believe these are Kate and William? But as we can see, they are not Kate or William. …”
Tracing the 45 accounts that recycled this post, Professor Innes said, the researchers found they all originated from a single master account, carrying the name Master Firs. It bore the characteristics of a Russian disinformation operation known in the industry as Doppelgänger, he said.
Since 2022, Doppelgänger has been linked to the creation of fake websites that impersonate actual news organizations in Europe and the United States. Last week, the U.S. Treasury Department’s Office of Foreign Assets Control announced sanctions against two Russians, and their companies, for involvement in cyberinfluence operations. They are believed to be part of the Doppelgänger network.
Catherine is not the only member of the royal family to have become the subject of an online feeding frenzy in Russia. On the same day as the multiple posts about the video, an erroneous report of the death of King Charles III began circulating on Telegram, a social media network popular in Russia.
Those reports were later picked up by Russian media outlets, forcing the British embassies in Moscow and Kyiv, the Ukrainian capital, to deny them as “fake news.” Like Catherine, Charles, 75, is being treated for cancer, though he continues to greet visitors privately and plans to attend church services on Easter.
Beyond the Russian involvement, the rumors and gossip about Catherine’s health sprouted in many corners of the web, including on accounts sympathetic to William’s brother, Prince Harry, and his wife, Meghan. With such a widespread online frenzy, the impact of any state actor might be muted.
“It’s very hard to isolate only one piece,” said Alexandre Alaphilippe, executive director of the EU DisinfoLab, a research organization in Brussels that played a role in identifying the Russia-based disinformation group in 2022 and gave it the name Doppelgänger. “The question is what is being spun by the media, online influencers or inauthentic sources. Everything is interconnected.”
Such campaigns are also particularly hard to measure, he said, because social media companies like X and Meta have restricted access to data that would allow researchers, journalists and civil society groups to get a more granular look at the spread of material on their platforms.
Nor are some disinformation-for-hire outfits very discriminating about what material they spread online, Mr. Alaphilippe said. “You may see bots pushing a Russian narrative on Monday,” he said. “On Tuesday, they may do online gaming. On Wednesday, they can do crypto-scam campaigns.”
Even as awareness of Russian disinformation campaigns has grown since the American presidential election in 2016, the volume of internet trickery and falsehoods has not slowed.
Through bots, online trolls and disinformation peddlers, Russia-linked groups jump on news events to sow confusion and discord. Ukraine has been the major focus of their efforts for the past two years as President Vladimir V. Putin seeks to undermine the West’s resolve to continue supporting the war.
A French government minister recently blamed Russia for artificially amplifying concerns about a bedbug scare last year in Paris. Media monitoring groups said Russia also amplified a false claim that the European Union would allow powdered insects to be mixed into food.
The spreading of rumors about Catherine is a more traditional influence operation, but the Russians have been refining their tactics as governments and independent researchers grow more sophisticated at detecting their activities.
In the United States and Europe, fake news sites have popped up to push Russian propaganda and potentially influence elections in 2024. In YouTube and TikTok videos, people have posed as Ukrainian doctors and movie producers to tell fake tales favorable to Russia’s interests.
“Whether spreading it for profit or for political purposes, these types of actors tend to jump on anything engaging and controversial,” said Rasmus Kleis Nielsen, director of the Reuters Institute for the Study of Journalism at Oxford University. “Not unlike some news media,” he added, though their motives might differ.
“When politically motivated,” Professor Nielsen said, “the point is rarely persuasion as much as attempts to undermine people’s confidence in the media environment.”