A Northwestern Medicine study used artificial intelligence to analyse tweets and understand how COVID misinformation on social media can erase scientific truths from public awareness.
Public attitudes to COVID-19 are among the most significant obstacles to vaccinating a population.
The power of misinformation during a pandemic
In the UK, a recent Government report found that vaccine hesitancy remained high in London and among care home workers. Currently, the rate of COVID deaths is at its lowest since November 2020.
However, rumours keep circulating, and they threaten to derail vaccination efforts.
For instance, WhatsApp groups and Facebook posts in closed communities continue to whisper the myth that the AstraZeneca vaccine causes infertility. We dissected this rumour (created by a disgruntled German academic) and more, right here.
The pandemic exposed the weaknesses of social media networks more starkly than ever, with misinformation about COVID circulating faster than it can be stopped. Government information initiatives are often treated with equal suspicion, while a tweet with a lot of likes can become a lighthouse of imagined truth for the thousands who see it and never go anywhere else to fact-check it.
This means that people refrain from taking the vaccine, or try a home-made ‘cure’ for COVID that cannot help them.
Analysing 92,687,660 tweets
Researchers at Northwestern Medicine delved into the underworld of misinformation on Twitter, using artificial intelligence to analyse tweets and understand the impact of political speeches on health beliefs.
A team led by the study’s first author Hanyin Wang, a PhD student in the Driskill Graduate Program, retrospectively collected COVID-19-related tweets using the Twitter API. They examined 92,687,660 tweets, then randomly selected 5,000 of those tweets for deeper analysis.
Each sampled tweet was then checked against the four core constructs of the health belief model: perceived susceptibility, perceived severity, perceived benefits, and perceived barriers.
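To illustrate the kind of classification described above, here is a minimal, hypothetical sketch that tags tweets with health belief model constructs using simple keyword matching. The keyword lists and function names are assumptions for demonstration only; the study itself relied on trained machine-learning classifiers and human annotation, not keyword rules.

```python
# Illustrative sketch only: tag tweets with health belief model (HBM)
# constructs via naive keyword matching. The actual study used AI
# classifiers trained on annotated tweets, not this approach.

HBM_KEYWORDS = {
    "perceived_susceptibility": ["at risk", "exposed", "catch covid"],
    "perceived_severity": ["deadly", "hospitalised", "severe"],
    "perceived_benefits": ["protects", "effective", "vaccine works"],
    "perceived_barriers": ["side effects", "infertile", "unsafe"],
}

def tag_constructs(tweet: str) -> list[str]:
    """Return every HBM construct whose keywords appear in the tweet."""
    text = tweet.lower()
    return [construct
            for construct, keywords in HBM_KEYWORDS.items()
            if any(keyword in text for keyword in keywords)]

sample = "Heard the jab has side effects, but it protects the elderly."
print(tag_constructs(sample))
```

A real pipeline would replace the keyword lookup with a supervised classifier, but the output shape is the same: each tweet maps to zero or more of the four constructs.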
‘Social media has contributed to misinformation’
“In the pandemic, social media has contributed to much of the information and misinformation and bias of the public’s attitude toward the disease, treatment and policy,” said corresponding study author Yuan Luo, chief Artificial Intelligence officer at the Institute for Augmented Intelligence in Medicine at Northwestern University Feinberg School of Medicine.
“Our study helps people to realize and re-think the personal decisions that they make when facing the pandemic. The study sends an ‘alert’ to the audience that the information they encounter daily might be right or wrong, and guide them to pick the information endorsed by solid scientific evidence.
“We also wanted to provide useful insight for scientists or health care providers, so that they can more effectively broadcast their voice to targeted audiences.”
‘Politicians may talk inaccurately’
“Politicians may talk inaccurately about a certain treatment’s effectiveness or say that COVID-19 is no big deal; it’s just like the flu,” said Luo, who is also chief AI officer at Northwestern University Clinical and Translational Sciences Institute.
“These comments have as strong an effect as real scientific evidence and drive people’s beliefs. This is what we are concerned about.
“As a scientist, you need to be aware that you need to get the science out to people. If you don’t put energy into this, your efforts can be easily offset by those who talk irresponsibly. Going forward, we may want to pay more attention to a public information campaign to educate people about the vaccine in order to maximize the inoculation impact.
“A lot of people aren’t aware of how much their beliefs are impacted by tweets, and don’t bother to fact-check what they read and retweet. When the information is biased, they ignore it or do not notice it.”