The Worried Well Whipped Into A Frenzy

Friday, May 23, 2014

Transcript

 

Gary Schwitzer has devoted his life to reviewing how health news is reported and, more often than not, misreported. On The Media last spoke to him in 2009 when, in the face of continuously ham-handed health reports, he decided to stop examining network television coverage of health altogether. But now, Gary’s back! His website, healthnewsreview.org, gathered a team of experienced reviewers to evaluate 1,899 health stories covering innovations in health science. His study, reviewing stories from 2006 to 2013, appeared earlier this month in JAMA Internal Medicine. Schwitzer and his team judged the success or failure of these stories by developing and asking a set of very focused questions.
 
In reporting about how wonderful something was, did it actually discuss the cost? Did it adequately explain the size of the potential benefit and the size of the potential harm? Did it evaluate the quality of the evidence? Did it have more than a single source, and did it look at conflicts of interest in the source? Did the story rely solely or largely on a news release? Was it reporting on something that was truly available, or was this a phase one study that is years away? In reporting on the new, which is what our job is in journalism, did it also put the “new” into the context of existing alternatives, which, by default, have a longer, more proven track record?
 
The questions cover a variety of possible mistakes in order to uncover misreporting, but many of the stories suffer from a handful of common errors. On the subject of statistical literacy, one error Schwitzer singled out was the confusion of risk reduction in relative terms versus absolute terms. Schwitzer explained the issue in layman’s terms.
 
It's like having a 50% off coupon at Macy's, but you don't know what that 50% off coupon can be applied to. It can be applied to a diamond necklace or to a pack of chewing gum. Until you answer that, you don't know the absolute value; you only know the relative value. So let's not talk about a drug reducing the risk of something by 50%, 50% of what? If we’re talking about a change from 2 in 100 having a problem down to 1 in 100, that, indeed, is a 50% relative risk reduction, but to people with this condition, it's a difference of 1 in 100. And it means that everybody else who didn’t benefit had to take the drug, pay for it, run the risk of side effects and didn't get any result.
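To make the arithmetic concrete, the following is a minimal sketch in Python, not part of the broadcast, that works through the 2-in-100 versus 1-in-100 example Schwitzer cites. The variable names are illustrative, and the number-needed-to-treat figure is a standard derived measure that the quote implies but does not name.

# Minimal sketch (not from the broadcast): relative vs. absolute risk reduction,
# using Schwitzer's 2-in-100 vs. 1-in-100 example. Variable names are illustrative.
control_risk = 2 / 100   # 2 in 100 have the problem without the drug
treated_risk = 1 / 100   # 1 in 100 have the problem with the drug

absolute_risk_reduction = control_risk - treated_risk              # 0.01 -> 1 fewer case per 100
relative_risk_reduction = absolute_risk_reduction / control_risk   # 0.50 -> the headline "50%"
number_needed_to_treat = 1 / absolute_risk_reduction               # ~100 treated for 1 to benefit

print(f"Relative risk reduction: {relative_risk_reduction:.0%}")   # 50%
print(f"Absolute risk reduction: {absolute_risk_reduction:.1%}")   # 1.0%
print(f"Number needed to treat:  {number_needed_to_treat:.0f}")    # 100

The point of the sketch is that the same trial result can be headlined as a 50% risk reduction or as one fewer case per hundred patients; only the absolute figure tells readers how many people must take, and pay for, the drug for one of them to benefit.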
 
Another big problem is the failure to explain the limits of observational studies, namely that causal links cannot be drawn from them because correlation is not causation. Schwitzer notes the frequent use of active verbs like “raised or lowered risk” or “boosted protection” when journalists describe correlations in observational studies, even though those phrases imply causation and are therefore inaccurate.
 
For journalists, putting a human face on an otherwise abstract story gives it a little flesh and blood and works as an easy appeal to the reader’s pathos, but the practice bedevils Schwitzer as a reviewer of health news. He calls such techniques “possibly the leading category of imbalance,” because they tell only the positive patient stories and ignore the many trial dropouts or study participants who could not maintain the regimen of the experiment.
 
One nearly universal problem in Schwitzer’s study was new technology being reported almost uncritically as the next big thing. Here, too, Schwitzer offers his analysis and critique.
 
When you're talking about proton beam radiation therapy machines that require a linear accelerator the size of a football field, it is, gee whiz, it’s all the kind of stuff that gets you on page 1. It also should be the kind of stuff that we’re reporting, whoa, let’s put the brakes on and look at where we have the evidence and where we don't for how this is any better than what we’re already using.
 
But Schwitzer does see some positive examples of health journalism. He points to in-depth investigative and data-driven projects, often foundation-supported efforts, as some of the best healthcare journalism around. The day-to-day misreporting that occurs, however, leads Schwitzer to believe that consumers would be better off with far less healthcare news coverage.
 
Despite his efforts, journalists keep making the same mistakes, but Schwitzer still believes that his work could effect meaningful change. At least temporarily, however, that change may be put on hold. His study and site received foundation funding for eight years, but today Schwitzer keeps the site alive through his own blogging, and finding new sources of funding has proved difficult.
 
I think one of the reasons for that is that almost every day, with almost everything we write, we upset somebody in journalism. But we also upset somebody in healthcare, who liked the way that shoddy journalism was making their idea look better than it really may have been. So you tell me. Who is gonna fund this kind of stuff?
 
Gary Schwitzer is publisher of the website healthnewsreview.org.

Guests:

Gary Schwitzer

Hosted by:

Bob Garfield

Comments

Gary Schwitzer from Minnesota

To knowlengr:

You have captured the essence of our intent all along - that we know "it's difficult to give every story and source the scrutiny it deserves" - but that "even if these are only ideals to be sought," our criteria should be top-of-mind. And it doesn't require much to add a caveat and a disclaimer or two.

We try to remind journalists of simple concepts that could be injected into more stories:

• In health care, more isn't always better, newer isn't always better. Less is often more.

• Any health care decision involves tradeoffs: something you stand to gain but also something you stand to lose.

• To quote Sir Muir Gray: "All screening tests cause harm; some do good as well."

• On surrogate markers: "A difference to be a difference must make a difference."

It doesn't require a PhD and it doesn't require a 1,000-word story.

Gary Schwitzer
Publisher
HealthNewsReview.org

May. 27 2014 02:53 PM

knowlengr

Great piece. It's difficult to give every story and source the scrutiny it deserves -- especially for Twitter RTs of previously covered stories. Still, even if they are only ideals to be sought, they should be top-of-mind. At the least, a caveat or disclaimer or two could serve as a proxy for a more thorough analysis.

At the moment, there's no full transcript of this piece, so I posted the twelve questions that health journalists (really, it's science journalism generally, though health coverage is especially problematic) should be asking at knowlengr.com (http://bit.ly/1w3A4U4).

May. 25 2014 01:28 PM
