Almost three years into the pandemic, Covid-19 remains stubbornly persistent. So does misinformation about the virus.
As Covid cases, hospitalizations and deaths rise in parts of the country, myths and misleading narratives continue to emerge and spread, exasperating overworked doctors and evading content moderators.
What began in 2020 as rumors casting doubt on the existence or severity of Covid quickly evolved into often outlandish claims about dangerous technology lurking in masks and supposed miracle cures from unproven drugs like ivermectin. The rollout of vaccines last year set off another wave of unfounded concerns. Now, on top of the claims still in circulation, conspiracy theories about the long-term effects of the treatments are taking hold, researchers say.
These ideas continue to thrive on social media platforms, and the constant barrage, now a years-long accumulation, has made accurate guidance increasingly difficult to find, misinformation researchers say. That leaves people already suffering from pandemic fatigue more exposed to the ongoing dangers of Covid and more vulnerable to other harmful medical content.
“It’s easy to forget that health misinformation, including about Covid, can still contribute to people not getting vaccinated or creating stigma,” said Megan Marrelli, the editorial director of Meedan, a nonprofit focused on digital literacy and access to information. “We know for certain that health misinformation contributes to the spread of disease in the real world.”
Twitter is of particular concern to researchers. The company recently gutted the teams responsible for keeping dangerous or inaccurate material off the platform, stopped enforcing its Covid misinformation policy and began basing some content moderation decisions on public polls posted by its new owner and chief executive, the billionaire Elon Musk.
From Nov. 1 to Dec. 5, Australian researchers collected more than half a million conspiratorial and misleading English-language tweets about Covid, using terms such as “deep state,” “hoax” and “bioweapon.” The tweets drew more than 1.6 million likes and 580,000 retweets.
The researchers said the volume of toxic material surged late last month with the release of a film that made unsubstantiated claims that Covid vaccines had set off “the largest orchestrated die-off in world history.”
Naomi Smith, a sociologist at Federation University Australia who helped conduct the research along with Timothy Graham, a digital media expert at Queensland University of Technology, said Twitter’s misinformation policies had helped curb the anti-vaccination content that circulated on the platform in 2015 and 2016. From January 2020 to September 2022, Twitter suspended more than 11,000 accounts for violating its Covid misinformation policy.
Now, Dr. Smith said, the protective barriers are “falling down in real time, which is both academically interesting and downright terrifying.”
“Before Covid, people who believed in medical misinformation generally just talked to each other, locked in their own little bubble, and you had to do a bit of work to find that bubble,” she said. “But now you don’t have to do any work to find that information — it shows up in your feed with all other types of information.”
Several prominent Twitter accounts that had been suspended for spreading unsubstantiated claims about Covid have been restored in recent weeks, including those of Rep. Marjorie Taylor Greene, a Georgia Republican, and Robert Malone, a vaccine skeptic.
Mr. Musk himself has used Twitter to weigh in on the pandemic, predicting in March 2020 that the United States would probably have “close to zero new cases” by the end of April. (More than 100,000 positive tests were reported to the Centers for Disease Control and Prevention in the last week of that month.) This month, he took aim at Dr. Anthony S. Fauci, who will soon step down as President Biden’s chief medical adviser and as the longtime director of the National Institute of Allergy and Infectious Diseases. Mr. Musk said Dr. Fauci should be prosecuted.
Twitter did not respond to a request for comment. Other major social platforms, including TikTok and YouTube, said last week they remain committed to tackling Covid misinformation.
YouTube prohibits content — including videos, comments and links — about vaccines and Covid-19 that contradicts the recommendations of local health authorities or the World Health Organization. Facebook’s policy on Covid-19 content runs more than 4,500 words. TikTok said it had removed more than 250,000 videos containing Covid misinformation and was working with partners such as its content advisory council to develop its policies and enforcement strategies. (Mr. Musk dissolved Twitter’s advisory council this month.)
But platforms are struggling to enforce their Covid rules.
NewsGuard, an organization that tracks online misinformation, found this fall that typing “Covid vaccine” into TikTok prompted it to suggest searches for “Covid vaccine injury” and “Covid vaccine warning,” while the same query on Google led to suggestions for “walk-in Covid vaccine” and “types of Covid vaccines.” A search on TikTok for “mRNA vaccine” turned up five videos with false claims among the first 10 results, according to the researchers. TikTok said in a statement that its community guidelines “make clear that we do not allow harmful misinformation, including medical misinformation, and we will remove it from the platform.”
For years, people have sought medical advice from neighbors or tried to diagnose themselves through Google searches, said Dr. Anish Agarwal, an emergency room physician in Philadelphia. Now, years into the pandemic, he still sees patients who believe “wild” claims on social media, such as the notion that Covid vaccines will implant robots in their arms.
“We fight against this every day,” said Dr. Agarwal, who teaches at the University of Pennsylvania’s Perelman School of Medicine and is an associate director of Penn Medicine’s Center for Digital Health.
Conversations about the coronavirus, online and offline, are constantly shifting, with patients lately asking him about booster shots and long Covid, Dr. Agarwal said. He has received a grant from the National Institutes of Health to study the Covid-related social media habits of different populations.
“In the future, understanding our behaviors and thoughts around Covid will also likely shed light on how individuals interact with other health information on social media and how we can actually use social media to combat misinformation,” he said.
Years of lies and rumors about Covid have had a contagion effect, eroding public acceptance of all vaccines, said Heidi J. Larson, director of the Vaccine Confidence Project at the London School of Hygiene & Tropical Medicine.
“The Covid rumors will not go away – they will be repurposed and they will adapt,” she said. “We can’t erase that. No company can fix that.”
Some efforts to slow the spread of misinformation about the virus have met with First Amendment concerns.
A law that California passed a few months ago, set to take effect next month, would penalize doctors for spreading false information about Covid vaccines. It is already facing lawsuits from plaintiffs who argue that the measure is an unconstitutional infringement on free speech. Tech companies such as Meta, Google and Twitter have faced lawsuits this year from people banned for Covid misinformation who claim the companies went too far in their content moderation efforts, while other suits have accused the platforms of not doing enough to curb misleading narratives about the pandemic.
Dr. Graham Walker, an emergency room physician in San Francisco, said the pandemic rumors circulating online had driven him and many of his colleagues to social media to try to correct inaccuracies. He has posted multiple Twitter threads, containing more than a hundred evidence-backed tweets, attempting to debunk misinformation about the coronavirus.
But this year, he said he feels increasingly defeated by the onslaught of toxic content on a variety of medical issues. He left Twitter after the company abandoned its Covid misinformation policy.
“I was starting to think this wasn’t a winning fight,” he said. “It doesn’t feel like a fair fight.”
Now, Dr. Walker said, he is watching a “triple pandemic” of Covid-19, R.S.V. and influenza bombard the health care system, stretching emergency room wait times from less than an hour to six hours in some hospitals. Misinformation about readily available treatments is at least partly to blame, he said.
“If we had a bigger surge in vaccinations with the latest vaccines, we would probably have fewer people getting extremely ill with Covid, and that would certainly affect the number of hospitalizations,” he said. “Honestly, at this point, we’ll take every dent we can get.”