Tuesday, November 1, 2011

Soda and Teen Violence: Hazards of Correlational Analysis Can Put Truth in Harm’s Way

Statistical Ammo for the Twinkie Offense: Guns Don’t Kill People, Soda Kills People?

No one with an interest in public health and a zest for social engineering can go wrong by criticizing soda. That particular cup runneth over with empty calories and plausible links to obesity, as well as tooth decay. The carbonated beverage industry is fighting a pitched battle in municipalities across the US to avoid punitive taxation meant to price soda beyond the reach of citizens who seem most at risk of making poor nutritional choices -- choices that ultimately all of us must pay for. No matter what you think of coercive taxation, it’s really hard to defend soda.

Now, though, thanks to research published by two Harvard-trained researchers, Sara Solnick of the University of Vermont and David Hemenway of the Harvard School of Public Health, soda has come under another sort of fire.

An analysis performed on the Boston Youth Survey database shows a significant correlation between soda consumption and violence in teens, including a propensity to carry guns and exhibit violent behavior toward partners. The paper, which appeared in the journal Injury Prevention, got a two-day flurry of press attention because it has a provocative thesis that keyed off the famous “Twinkie Defense.” It gives everyone even more reason to worry about what dangers may lurk in our refrigerators.

Or does it?

Causality and Correlations

Notably, this kind of ad hoc, post hoc statistical analysis is becoming increasingly commonplace as researchers mine survey and diary data for links between diet and just about everything, hoping to demonstrate in many ways that we are what we eat. But when hunt-and-peck analysis is behind so much nutritional science, it’s important to ask: Are all these correlations worthy of media coverage -- sensational or sober? And although they’re always good press, are they good science?

Scientists and social scientists are trained to recognize that association is not the same as causality. Still, researchers who scavenge databases for statistical links sometimes issue that caveat about causality in an almost pro forma way, then go on to offer up imaginative speculation about what causal relationships there may be.

In this case, Solnick and Hemenway suggest the possibility that all that sugar might be crowding out other nutrients (a concept whose underlying science is not actually clear), as well as the possibility that the stimulating effects of caffeine may be the root cause of the problem. As it happens, the researchers are handicapped in speculating about which of these two is more plausible because the Boston Youth Survey database does not include information on the type of soda consumed (e.g., diet vs. regular; caffeine-free vs. caffeinated). Not a minor detail.

Even more important, they have little in the data to discipline their speculation: the two researchers mined an existing survey database rather than designing one to test their own hypotheses, so it cannot speak to many other relevant considerations that might explain the statistical link. All they had available were socio-economic variables and information about whether the teens ate meals with their parents -- a handy but limited surrogate for parental oversight, family functionality, and social integration.
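This is the classic confounding pattern. As a purely illustrative sketch -- a toy simulation, not the paper's data -- the following Python snippet shows how a single unmeasured factor can manufacture a sizable correlation between two variables that have no direct causal link to each other:

```python
import random

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

n = 5000
# A hidden confounder (standing in for some unmeasured social factor)
# drives both variables; neither has any direct effect on the other.
confounder = [random.gauss(0, 1) for _ in range(n)]
soda = [c + random.gauss(0, 1) for c in confounder]
violence = [c + random.gauss(0, 1) for c in confounder]

r = pearson(soda, violence)
print(f"r = {r:.2f}")  # a sizable correlation with zero direct causation
```

By construction the true correlation here is 0.5 (each variable is half confounder, half independent noise), yet a naive reading of the data alone would suggest that one variable causes the other.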

Demon Coke: Sensationalism Over Plausibility

In the hard sciences, investigations are designed and evaluated on the basis of a priori hypotheses, so as to discourage “shopping” for statistical effects. In the social sciences, analysis of large survey databases allows researchers to browse for findings of interest while often leaving them ill-equipped to test rigorously for cause and effect. That freedom places an added onus on researchers to apply more stringent, science-based criteria before elevating correlations to the status of meaningful findings.
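The hazard of “shopping” can be made concrete with a toy sketch -- hypothetical numbers, not the studies discussed here. Test enough pure-noise diet/outcome pairs at the conventional p < 0.05 threshold, and some will come up “significant” by chance alone:

```python
import random

random.seed(1)

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

n = 40        # subjects per comparison
crit = 0.312  # |r| cutoff for two-sided p < 0.05 when n = 40

hits = 0
for _ in range(100):  # 100 unrelated food/outcome pairs, all pure noise
    xs = [random.gauss(0, 1) for _ in range(n)]
    ys = [random.gauss(0, 1) for _ in range(n)]
    if abs(pearson(xs, ys)) > crit:
        hits += 1

print(f"{hits} 'significant' correlations out of 100 null comparisons")
```

On average about five of the hundred null comparisons clear the threshold, which is exactly what a 5% false-positive rate promises -- and exactly the kind of “finding” a database browse can surface.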

Much as I personally dislike soda and wish that so many Americans did not drink themselves fat, it is only one of many sources of excess carbohydrates in our diet and should not be demonized quite so recklessly. Under the heading of cause and effect, let’s agree that it’s much safer to have science -- real science -- drive politics than the reverse.

Statistics and the Search for Truth

Speaking more broadly, however, researchers who discover correlational links between foods or environmental phenomena and health effects should be more cautious and disciplined before rushing for publicity. Researchers have an obligation to look harder for logical explanations that square with known science before waving correlations at us in the literature and the press. They need to represent ad hoc correlations as something closer to hypothesis than fact.

The same argument can be made on the other side of the ledger, when researchers identify strong associations between healthy lifestyle behaviors and desirable outcomes. For instance, a recent University of Aberdeen study identified a strong negative correlation between self-reports of apple consumption during pregnancy and asthma in children. It’s actually difficult to trace a logical connection between the two -- something the authors freely acknowledge -- but they argue that it is, at the very least, benign to endorse apple consumption in pregnancy, even if the finding is coincidental.

Here is a more cynical take. When a study uncovers only two (negative) correlations (the other was fish consumption and infant eczema), you can bet those findings will be published.

Admittedly, our ability to detect events sometimes gets ahead of our ability to explain them. Quantum physics is a perfect example -- we don’t discard findings because they violate current cosmologies. But nutritional science is not astrophysics. It is played out in the public arena on a daily basis, and people are prone to run with half-baked ideas in the world they wish desperately to control and understand: their own health.

With all of these quasi-findings given legitimacy in the scientific press, it’s no wonder that the public is often so distrustful of scientific data -- and so quick to propagate its own principles of magical causation. We’ve taught them how.
