
Can you spot the problem with these headlines? (Level 1) - Jeff Leek and Lucy McGowan


TED-Ed Animation

Let’s Begin…

In medicine, there’s often a disconnect between news headlines and the scientific research they cover. While headlines are designed to catch attention, many studies produce meaningful results when they focus on a narrow, specific question. So how can you figure out what’s a genuine health concern and what’s less conclusive? Jeff Leek and Lucy McGowan explain how to read past the headline.

Additional Resources for you to Explore

Sometimes a health headline will focus on a study done in mice, or done with only 2 people, or done by forcing people to run hundreds of miles a day. The farther the study is from who you are and how you act in your real life, the less likely the results will translate directly to you. For example, the Physicians' Health Study is a well-known study conducted in the 1980s that demonstrated that aspirin reduced the risk of having a heart attack. The observed effect was very strong and led to headlines such as “Health; Doctors Confirm Benefits of Aspirin.” However, this study was conducted only among men, and it has since been demonstrated that the effect may not be the same for women. Does this mean the original study was wrong? Of course not! The researchers were explicit in saying their conclusions could only be drawn among men. This leads to important questions to ask when evaluating whether a study's result applies to me: Who was included in the study? Could I fit into that population?

In our videos, we briefly discuss the design of a study, for example, paying attention to whether there is a control group. There are many types of studies, but two big types are randomized, controlled trials (RCTs) and observational studies. RCTs are the gold standard because they let us establish that an exposure, like taking a drug or changing a diet, causes a change in your health. Observational studies can support the hypothesis that the drug or diet causes the change, but you usually need a lot more information. Hill's criteria are a set of additional considerations that can lend more credibility to an observational study (see Lucy's fun blog post on this for more).
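To make that difference concrete, here is a minimal simulation sketch (our own illustration, not part of the lesson). It assumes a hypothetical drug that does nothing at all, with age as the only confounder: older people are both more likely to take the drug and more likely to get sick. A naive observational comparison makes the drug look harmful, while a randomized comparison correctly shows roughly no effect.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# A confounder: age drives both who takes the (hypothetical) drug and who gets sick.
age = rng.uniform(30, 80, n)
p_outcome = 0.05 + 0.004 * (age - 30)   # risk rises with age only; the drug itself does nothing

# Observational "study": older people are more likely to take the drug.
treated_obs = rng.random(n) < (age - 30) / 50
outcome_obs = rng.random(n) < p_outcome

# Randomized "trial": a coin flip decides treatment, breaking the link with age.
treated_rct = rng.random(n) < 0.5
outcome_rct = rng.random(n) < p_outcome

def risk_difference(outcome, treated):
    return outcome[treated].mean() - outcome[~treated].mean()

print("Observational risk difference:", round(risk_difference(outcome_obs, treated_obs), 3))
print("Randomized risk difference:   ", round(risk_difference(outcome_rct, treated_rct), 3))

In the observational comparison the treated group skews older, so the drug appears to raise risk; randomization removes that bias, which is why confounding is the first thing to worry about in non-randomized studies.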

In addition to the design of the study, it is important to notice the sample size. Bigger studies usually let us measure results more accurately, especially when the result is likely to be noisy or hard to get a good handle on. To demonstrate this, a science journalist, John Bohannon, conducted a study that found that eating chocolate led to weight loss. The problem? The study was conducted on only 16 participants, and John purposefully looked at many different outcomes before he found a significant one. (Check out John's write-up of this study.)
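A short simulation sketch (our own illustration, with made-up numbers: two groups of 8 people and 18 unrelated outcomes) shows how easily a small study that measures many things turns up a "significant" result by chance alone, even when nothing is going on.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_per_group = 8       # 16 participants split into two groups
n_outcomes = 18       # many unrelated measurements, none truly affected
n_simulations = 2_000

false_positive_runs = 0
for _ in range(n_simulations):
    # Both groups come from the same distribution: there is no real effect anywhere.
    group_a = rng.normal(size=(n_outcomes, n_per_group))
    group_b = rng.normal(size=(n_outcomes, n_per_group))
    p_values = stats.ttest_ind(group_a, group_b, axis=1).pvalue
    if (p_values < 0.05).any():
        false_positive_runs += 1

print(f"Studies with at least one 'significant' outcome: {false_positive_runs / n_simulations:.0%}")

In runs like this, well over half of the simulated "studies" report at least one p-value below 0.05, which is exactly the trap the chocolate study was built to expose.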

If the impact of the exposure, like a drug or diet, is pretty small, we call that a small effect size. The smaller the impact, the harder it is to reliably find it with a study. When small effect sizes are combined with small sample sizes, it is really hard to figure out whether we found something real in the data. Similarly, in a very large study, we may be able to detect a very small effect that is statistically significant but not actually scientifically meaningful. For example, remember the study we discussed that examined whether a new drug prolonged the life of patients with pancreatic cancer: while the result was statistically significant, the actual effect was only a difference of 10 days.
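The flip side is easy to see in a quick sketch (our own illustration with invented numbers, not the actual pancreatic cancer trial): give a trial enough patients and even an average survival gain of about 10 days produces a very small p-value.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_per_arm = 20_000   # a very large (hypothetical) trial

# Invented survival times in days; the new drug adds about 10 days on average.
control = rng.exponential(scale=180, size=n_per_arm)
treatment = rng.exponential(scale=190, size=n_per_arm)

result = stats.ttest_ind(treatment, control)
print(f"Mean difference: {treatment.mean() - control.mean():.1f} days")
print(f"p-value: {result.pvalue:.2g}")
# With this many patients the p-value comes out far below 0.05, yet the gain is
# only ~10 days: statistically significant is not the same as clinically meaningful.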

Finally, it is important to pay attention to the outcome being analyzed. Many times, surrogate outcomes are used. Our example is based on a real case from the medical literature. In the mid-twentieth century, researchers started publishing studies suggesting that a B-vitamin called niacin might help prevent heart disease by raising levels of beneficial cholesterol. By 2009, more than 700,000 prescriptions for niacin were being written in the U.S. every month. There was just one problem. It turns out that niacin doesn’t help prevent heart disease — but it does increase the risk of serious side effects. The original niacin studies concluded that the vitamin would reduce the risk of heart disease because it raised the levels of the healthy form of cholesterol, and people with higher levels tend to have fewer heart attacks. But they didn’t measure heart attack risk directly. When other scientists eventually did, it turned out that niacin’s effect on cholesterol didn’t actually lead to fewer heart attacks.

A great way to dig deeper is to read the research article behind the headlines. Read Jeff’s blog post for tips on tracking down the original research.

We have only scratched the surface of things to think about when evaluating scientific studies. If you are interested in pursuing this area further, we have a short massive open online course (MOOC) that dives a bit deeper into these issues. The course is designed to be completed by busy students and professionals, with short lessons and quizzes that can be completed on the go. We take real examples from the medical literature and dive into the text to understand the study design, statistical methods, and results. The whole course can be completed in about four hours. We leave out the mathematical detail and focus on the conceptual ideas. Anyone can pick this course up and gain a better understanding of how data and statistics are used in the medical literature. You can find the course on Leanpub: Understanding Data and Statistics in Medical Literature.

Enjoy!

Lucy D’Agostino McGowan and Jeff Leek



Meet The Creators

  • Educator Lucy McGowan, Jeff Leek
  • Director Michael Kalopaidis
  • Narrator Addison Anderson
  • Animator Maria Savva
  • Illustrator Dimitra Dakaki
  • Sound Designer Andreas Trachonitis
  • Producer Zedem Media
  • Director of Production Gerta Xhelo
  • Editorial Producer Alex Rosenthal
  • Associate Producer Bethany Cutmore-Scott
  • Associate Editorial Producer Elizabeth Cox
  • Script Editor Eleanor Nelsen
  • Fact-Checker Brian Gutierrez
