Conspiracy theories, misinformation and speculation about coronavirus have flooded social media. But who starts these rumours? And who spreads them?
We’ve investigated hundreds of misleading stories during the pandemic. It’s given us an idea about who is behind misinformation – and what motivates them. Here are seven types of people who start and spread falsehoods:
1. The joker

You’d hope no-one was fooled by a WhatsApp voice note claiming the government was cooking a giant lasagne in Wembley stadium to feed Londoners. But some people didn’t get the joke.
To take a slightly more serious example, a prankster created a screenshot of a fake government text that claimed the recipient had been fined for leaving the house too many times. He thought it would be funny to scare people breaking lockdown rules.
After encouraging his followers to share it on Instagram, it found its way to local Facebook groups, where it was posted by worried residents, some of whom took it seriously.
“I don’t really want to cause panic,” says the prankster, who wouldn’t give us his real name. “But if they believe a screenshot on social media, they really need to sort of re-evaluate the way they consume information on the internet.”
2. The scammer

Other fake texts claiming to be from the government or local councils have been generated by scammers looking to make money from the pandemic.
One such scam investigated by fact-checking charity Full Fact in March claimed that the government was offering people relief payments and asked for bank details.
Photos of the scam text were shared on Facebook, but because it circulated by text message, it’s difficult to get to the bottom of who was behind it.
Scammers started using fake news about the virus to make money as early as February, with emails suggesting people could “click for a coronavirus cure review” or suggesting they were entitled to a tax refund because of the outbreak.
3. The politician

Misinformation doesn’t just come from dark corners of the internet.
Last week President Donald Trump questioned whether exposing patients’ bodies to UV light or injecting bleach could help treat the coronavirus. He was speculating and took facts out of context.
He later claimed the comments were sarcastic. But that didn’t stop people from phoning hotlines to ask about treating themselves with disinfectant.
It’s not just the US President. A Chinese foreign ministry spokesman promoted the idea that Covid-19 might have been brought to Wuhan by the US Army. Conspiracy theories about the outbreak have been discussed in prime time on Russian state TV, and by pro-Kremlin Twitter accounts.
4. The conspiracy theorist

All the uncertainty about the virus has created a perfect breeding ground for conspiracy theories.
A story of murky origins claiming that the first volunteer to take part in a UK vaccine trial had died circulated in big anti-vaccination and conspiracy Facebook groups. It was pure fiction.
Interviews with David Icke on YouTube, which have since been removed, also peddled false claims that 5G is linked to coronavirus. Mr Icke also appeared on a London TV station, which was found to have breached the UK’s broadcasting standards. His Facebook page was later taken down, the company said, for publishing “health misinformation that could cause physical harm”.
Conspiracy theories have led to scores of attacks on 5G masts.
5. The insider

Sometimes misinformation seems to come from a trustworthy source – a doctor, professor or hospital worker.
But often the “insider” is nothing of the sort.
A woman from Crawley in West Sussex was the originator of a panicky voice note predicting dire – and completely unsubstantiated – death tolls for young and healthy coronavirus sufferers. She claimed to have inside information through her work at an ambulance service.
She did not respond to requests for comment or provide proof of her job, so we don’t know whether she actually is a health worker. But we do know that the claims in her voice note were unfounded.
6. The relative

That alarming voice note and many others went viral because they worried people, who then shared the messages with friends and family.
That includes Danielle Baker, a mum of four from Essex, who forwarded a note on Facebook messenger “just in case it was true”.
“At first I was a bit wary because it was sent from a lady that I didn’t know,” she says. “I forwarded it on because myself and my sister have babies the same age and also have older children, and we all have high risk in our households.”
Sharers like her are trying to be helpful, and they think they’re doing something positive. But, of course, that doesn’t make the messages they pass along true.
7. The celebrity

It’s not just your mum or uncle. Celebrities have helped misleading claims go mainstream.
The singer M.I.A. and actor Woody Harrelson are among those who have been promoting the 5G coronavirus theory to their hundreds of thousands of followers on social media.
A recent report by the Reuters Institute found that celebrities play a key role in spreading misinformation online.
Some have huge platforms on traditional media as well. Eamonn Holmes was criticised for appearing to give some credence to the 5G conspiracy theorists on ITV’s This Morning.
“What I don’t accept is mainstream media immediately slapping that down as not true when they don’t know it’s not true,” he said.
Mr Holmes later apologised and Ofcom “issued guidance” to ITV, deeming the comments “ill-judged”.
Illustrations by Simon Martin. Additional reporting by Olga Robinson.