For-profit Instagram accounts displaying incorrect health information

November 9, 2022 — Nearly 1 in 4 popular Instagram posts about hepatitis B contained misinformation, and these posts were much more likely than accurate posts to come from for-profit accounts or accounts selling a product or service.

That’s according to research recently presented at the 2022 American College of Gastroenterology Annual Scientific Meeting in Charlotte, North Carolina.

“Users who generate hepatitis B misinformation also have greater reach, with more followers, and engagement, with more likes, than users who do not promote misinformation,” said presenter Zachary C. Warner, MD, MPH, an internal medicine resident at the University of Arizona. “It’s possible that patients with chronic diseases — conditions that don’t have simple treatments — are vulnerable to online health misinformation and for-profit users.”

As misinformation and skepticism towards evidence-based medicine have become more prevalent online, patients are turning to social media and other user-generated sites for information and support regarding their health, according to Warner.

“While these sites are useful because they can provide access to social support and information that patients would otherwise miss, medical information on social media is unregulated,” he warned.

Unverified information

Although the consequences of exposure to misinformation online are not well studied, negative effects are possible.

“Adoption of unproven remedies and symptom management may increase patients’ risk for poor health outcomes and financial hardship,” Warner said. “Unproven cures and symptom management approaches are unlikely to be covered by health insurance, potentially leaving patients to pay significant fees.”

Warner and his team limited their Instagram search to a one-month snapshot in December 2021, searching for all publicly available posts that mentioned hepatitis B. After removing duplicates from the top 55 posts for each search term, they coded the remaining 103 posts with a validated tool for assessing misinformation. The tool’s variables included engagement, such as likes and comments; user characteristics, such as number of followers; and claims judged erroneous by medical experts.

Next, the researchers analyzed the relationship between for-profit status and misinformation among the posts. Almost a quarter of posts (23%) contained misinformation about hepatitis B or its treatment. These posts also had higher average engagement, with 1,599 likes, compared with posts containing accurate information about hepatitis B, which averaged 970 likes. Accounts whose posts contained misinformation also followed a higher average number of accounts (1,127) than those with accurate posts about hepatitis B (889 accounts). But accounts posting misinformation had about a third of the average number of followers (22,920) compared with accounts posting accurate information (70,442 followers).

“We think it’s wise to maintain a high level of skepticism for information that promises ‘too good to be true’ results, uses supporting anecdotes, or is experimental,” Warner said. “We recommend the CRAAP test, which guides individuals to assess sources of health information.”

Does it pass the CRAAP test?

  • Consider the Currency of the information,
  • its Relevance to your needs,
  • the Authority of the source,
  • the Accuracy of the content, and
  • the Purpose of the source

The researchers found that just under a third (30%) of hepatitis B posts referenced a conspiracy theory, and a similar proportion came from for-profit accounts (29%). Just over a third (34%) of posts came from accounts that sold a product or service through Instagram.

Overall, more than three times as many posts with misinformation were from for-profit accounts (47%) than posts with accurate information (14%). A similar proportion of posts with misinformation (43%) came from accounts selling a product or service, compared to accurate posts (13%) from accounts selling a product or service.

These results were not surprising to David Gorski, MD, PhD, professor of surgery at Wayne State University School of Medicine.

“While health misinformation is often driven by ideology and belief, it is almost always also driven by the profit motive of practitioners who sell treatments based on the misinformation,” he said.

“In other words, most quacks believe in the quackery they’re selling, and believers are far more effective salespeople than scammers who know what they’re selling is quackery,” Gorski said.

“We strongly encourage patients to do their best to assess the possible motives behind the individuals or organizations that create the health information they view online, especially on social media sites,” Warner said, also advising that clinicians and health organizations openly address misinformation both online and in personal conversations.

“Disinformation die-hards are almost always unreachable, and it’s largely a waste of time trying to change their minds,” Gorski said. “However, the people who are on the fence, who are unsure, are reachable. We should be targeting our educational efforts at them, not at those selling quackery.”
