A recent video, entitled “This Natural Trick Can Cure Your Cancer,” boldly claims that a man named Johan R. Tarjany discovered that a species of moss could kill cancer cells. This amazing cancer cure “has been known since the 1800s,” the video continues, but sadly “this knowledge has been suppressed by big pharmaceutical companies.”
At the 00:49 mark, the video fesses up: Dr. Tarjany is fictitious and so is his supposed cancer cure. The video’s real message: viewers need to be skeptical, use critical thinking and ask questions, particularly when confronted by claims of miracle cures.
“Johan R. Tarjany” is an anagram for Jonathan Jarry, a science communicator at McGill University’s Office for Science and Society (OSS), which is dedicated to “separating sense from nonsense” in the sciences. Mr. Jarry created the video, which was released this past June and quickly went viral, racking up over 10 million views. His goal was to counteract the influence of the many similar videos circulating online. Junk science claims run rampant on the internet.
“Besides videos, there are Facebook pages, meetings, an entire ecosystem online that caters to people with hard-to-treat conditions. These claims are tinged with conspiratorial thinking and provide easy answers to complicated problems,” says Mr. Jarry. “It’s very hard to fight this tide of misinformation.”
“We’ve always had junk science, but social media has accelerated it, along with the growth of fake news and the celebrity pseudo-science advocate,” says Timothy Caulfield, a law professor and health policy expert at the University of Alberta, and the author of Is Gwyneth Paltrow Wrong About Everything?, an exploration of celebrity-backed bunk.
Mr. Caulfield also hosts the documentary series A User’s Guide to Cheating Death, which began streaming on Netflix in late September. The show takes a skeptical, scientific look at various controversial health procedures.
“I think the whole phenomenon of junk science erodes our critical thinking, and that might be the most serious damage in the long run,” says Mr. Caulfield. “All liberal democracies depend on a certain level of critical thinking. Pseudo-science confuses discourse around legitimate science and hurts public trust in it.”
Joe Schwarcz, co-founder of McGill’s OSS, says that part of the confusion comes from the fact that proponents of junk science have no need to be cautious in what they say. “When people want simple solutions to complex problems, they gravitate towards the quacks and charlatans,” he says. “They always seem to have an answer, even though it’s usually an incorrect one. As scientists, we constantly have to pepper our language with maybes, ifs and buts, and don’t knows, and the other side doesn’t need to do that. They are not constrained, as we are, to have to pay attention to the real information that is out there.”
Heather Douglas, who until recently held the Waterloo Chair in Science and Society at the University of Waterloo and is now an associate professor in the department of philosophy at Michigan State University, says a public consensus around certain established facts is essential to scientific inquiry and debate. “Alternative facts are a way of undermining a basis of public discourse, or even of having a sensible conversation in the classroom,” she says. Dr. Douglas cites so-called flat-Earthers as a prime example of this wilful denial of established fact.
Academia, which strives for truth, is a natural enemy of misinformation and deception. Some academics have reacted by getting more involved in speaking out for legitimate research. Others, like Mr. Caulfield and Dr. Schwarcz, have become well-known debunkers of junk science and its proliferation, which itself has become a subject of academic study from a variety of disciplines and viewpoints.
“The proponents of pseudo-science increasingly adopt the language of real science to give themselves a veneer of legitimacy and credibility,” says Mr. Caulfield. “We see a lot of this with stem cells, for example. They will talk about giving stem cell injections or products that stimulate stem cells. They are trying to leverage the excitement of real research, so it becomes increasingly difficult to tease out the real science from the fake. It all becomes a confusing mess.”
Academics are also studying the related phenomenon of vaccine hesitancy – parents who choose not to vaccinate their children as a result of bogus anti-vaccination propaganda. Heather MacDougall, an associate professor of history at the University of Waterloo, studies and writes about the history of vaccination in Canada. She says that vaccination opponents use various tactics, including incomplete information, to present their case.
“The scientific literature discusses both the benefits and limitations of vaccines. The opposition pulls out the paragraphs discussing limitations and uses those for their opposition,” she says. They also frequently cite a long-since debunked link between vaccines and autism.
In November, Dr. MacDougall will participate in a panel discussion entitled “Science fact or fiction? How can science be heard in an age of misinformation?” at the Canadian Science Policy Conference in Ottawa.
The media also has a role to play in the propagation of junk science. Josh Greenberg, director of the school of journalism and communication at Carleton University, says that today’s flood of misinformation has its roots partly in the decline of traditional sources of information, such as newspapers.
“In their place, we’ve seen the emergence of things like blogs, op-ed columns and sponsored posts, which haven’t been subject to the same kind of editorial scrutiny as the Globe and Mail, for example.” As a result, he says, anyone with an axe to grind, a hidden agenda or something to sell now has access to a large audience.
And that audience is being subjected to many deceptive practices, says Bernie Garrett, an associate professor of nursing at the University of British Columbia and part of an interdisciplinary group looking at internet health scams. He and his research group have developed a series of criteria to determine the likelihood that something is a scam.
“Scams often tout miracle cures, boasting overnight results, with little or no empirical evidence – such as large-scale studies – for the product’s efficacy,” says Dr. Garrett. He notes that claiming a “cure,” as opposed to a treatment, is itself suspect. “Claims of miracle cures are a classic warning sign. No physician or nurse will tell you, ‘this is a cure for your condition,’ because we can’t guarantee that.”