Sometimes, The Scientists Are Wrong

My wife Jeanie and I, for one reason or another, got to talking the other day about smoking. How annoying and nasty the habit is, how grateful we are that neither of us ever got started on it, and how sad we feel for the people we know who smoke and are now suffering the inevitable health consequences of a life spent smoking. That sort of thing. Then Jeanie related something that took me by surprise: her mother, Claudia, who had smoked for close to fifty years before quitting, once told Jeanie that, back in the days prior to WWII, a doctor had actually prescribed cigarettes for her husband-at-the-time—as a treatment for his asthma. The act of inhaling cigarette smoke, the doctor’s reasoning apparently went, would improve her husband’s breathing. Of course, we both got a good laugh out of that.

It was such a kooky notion—that cigarettes might actually benefit you—that my curiosity was piqued, and so I decided to do some research into the recent history of tobacco in the US. What I discovered wasn’t so funny. It was a decades-long road built by tobacco industry greed, aided and abetted by deft Madison Avenue hustling and outright manipulation of the “facts” (read: lying), combined with a general state of ignorance and stupidity shared equally by the medical establishment and its blindly trusting American public.

Here is just a sampling of the relevant milestones along that road (thanks to Gene Borio of History Net):

  • 1912: The first strong connection between lung cancer and smoking is made when Dr. I. Adler, in a monograph, becomes the first to strongly suggest that the two are related.
  • 1934: The American Medical Association accepts tobacco advertising in its journals. These ads include statements like, “We advertise KOOL cigarettes simply as a pleasant combination of fine tobaccos made even more pleasant by the cooling sensation of menthol. They won’t cure anything. They won’t harm anybody. They will prove enjoyable.”
  • 1948: The Journal of the American Medical Association argues, “more can be said in behalf of smoking as a form of escape from tension than against it . . . there does not seem to be any preponderance of evidence that would indicate the abolition of the use of tobacco as a substance contrary to the public health.”
  • 1964: The first Surgeon General’s Report on smoking and health is released, asserting that cigarette smoking is responsible for a 70 percent increase in the mortality rate of smokers over non-smokers.

What I found particularly stunning was this: it wasn’t until fifty-two years after the first hard evidence linking smoking and lung cancer appeared that the medical establishment finally admitted—publicly—that smoking is likely to kill us. Remember this point. I’ll come back to it later.

Question: What has all of this to do with changing our ideas about nutrition?

Ah, good question. Outside of the fact that we put both into our mouths, there would appear to be little similarity between tobacco and food. This is especially true when we consider that tobacco is a known toxic substance clinically linked (as of this writing) to nearly 440,000 of the more than 2.4 million annual deaths in the US (American Heart Association). But then I ran into an eye-opening article published on About.com, dated June 21, 2007, which revealed that poor diet and physical inactivity were, in fact, the second leading cause of death in the US in 2000 (the latest year for which such figures were available), accounting for 365,000 deaths, or 15.2% of the total. (One might assume that, given the significant rise in the incidence of diabetes, obesity, and other diet-related health issues since 2000, those figures have only increased.)

Evidently—just like tobacco—food can kill us. But obviously, unlike tobacco, we need food to survive. How then does one interpret the simple statement “food can kill us”? Are we talking about all foods, or just some? Which ones? Processed foods? Foods containing dairy? Wheat? Protein? Carbohydrate? Low-fat? And what about quantities?

How are we to eat healthily, you ask, when we have so many choices, and so much conflicting information about those choices to wade through?

Ah, another good question. The answer that immediately pops to mind is: science. We will enlist the aid of science to help us make educated decisions. Science will tell us which foods are good to eat, and in what quantities.

Which brings me, finally, to the meat of this introductory article: Questioning the Science.

I realize you might have concerns about this. You might be thinking, unless we are ourselves scientists, who are we to question the science of nutrition? More important, why would we question it?

(A brief aside: I have always been amused that some people interpret the oft-affixed car bumper sticker exhorting us to “Question Authority” to mean “Reject Authority.” I mean nothing of the kind. In the context of this article, when I use the word “question,” I mean “examine closely.”)

There are a number of reasons why we might question the science we are using to help us live safe, productive and happy lives.

The first reason involves the limitations inherent in the way modern scientific theories are developed and tested: the venerable scientific method. The scientific method is the gold standard of accepted science. It is the only protocol for scientific reasoning recognized virtually everywhere in the scientific community.

It is also flawed.

Robert Pirsig, author of the landmark book Zen and the Art of Motorcycle Maintenance, and a former student of biochemistry, had much to say about the scientific method. Here is a sampling:

  • "The number of rational hypotheses that can explain any given phenomenon is infinite."
  • "If the purpose of scientific method is to select from among a multitude of hypotheses, and if the number of hypotheses grows faster than experimental method can handle, then it is clear that all hypotheses can never be tested. If all hypotheses cannot be tested, then the results of any experiment are inconclusive and the entire scientific method falls short of its goal of establishing proven knowledge."
  • "Traditional scientific method has always been at the very best, 20 – 20 hindsight. It’s good for seeing where you’ve been. It’s good for testing the truth of what you think you know, but it can’t tell you where you ought to go."

Pirsig’s assertion—that it is impossible to know all of the possible hypotheses that might apply to a given phenomenon—means that scientists are forced to come up with ideas to test solely from their (collective or individual) imaginations, which are obviously limited. And for every idea that presents itself, there are ten more lurking just around the next corner. All theories are, at best, educated guesses. And nothing is ever actually proven.

Does this mean that all scientific pronouncements are bogus? Of course not. But many of them are definitely suspect, and for a variety of reasons.

Consider this excerpt from a recently published article on NewScientist.com:

"Most published scientific research papers are wrong, according to a new analysis. Assuming that the new paper is itself correct, problems with experimental and statistical methods mean that there is less than a 50% chance that the results of any randomly chosen scientific paper are true.

"John Ioannidis, an epidemiologist at the University of Ioannina School of Medicine in Greece [and Tufts University in the US], says that small sample sizes, poor study design, researcher bias, and selective reporting and other problems combine to make most research findings false. But even large, well-designed studies are not always right, meaning that scientists and the public have to be wary of reported findings."


One possible way to identify flawed studies is to rigorously examine their adherence to a basic tenet of the scientific method: replication. The question must be asked: is the outcome of a particular hypothesis-test consistently repeatable?

Remember the two characters (oops, I meant scientists) in Utah who came up with cold fusion? The scientific world was abuzz for months over this ostensible panacea for the world’s energy woes. It looked great—on paper. Unfortunately, no one else in the scientific community could replicate the process. Never mind that the theory behind the process couldn’t be proven; it couldn’t even be demonstrated.

(It might be useful to recollect here that the so-called Law of Gravity (a misnomer) has yet to be proven, but it has obviously been replicated.)
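
To see why replication is such a powerful filter, here is a toy simulation in Python (an invented experiment of my own, not a model of the cold fusion work or of any real study). Each simulated experiment averages twenty noisy measurements and declares a “positive” result when that average clears a threshold; a genuine effect comes back positive on nearly every repetition, while a fluke positive obtained by chance almost never repeats:

    import random
    import statistics

    def experiment(true_effect, n=20, threshold=0.5):
        # One simulated experiment: average n noisy measurements of an effect
        # and declare a "positive" result if the sample mean clears the threshold.
        measurements = [random.gauss(true_effect, 1.0) for _ in range(n)]
        return statistics.mean(measurements) > threshold

    def replication_rate(true_effect, repetitions=1000):
        # Fraction of independent repetitions that come back positive.
        return sum(experiment(true_effect) for _ in range(repetitions)) / repetitions

    random.seed(0)
    print("genuine effect:", replication_rate(true_effect=1.0))  # close to 1.0
    print("no real effect:", replication_rate(true_effect=0.0))  # close to 0.0

The particular numbers don’t matter; the asymmetry does: real effects survive repetition, and flukes do not.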

But even studies replicated over and over again can be inherently flawed, this time by an effect difficult to guard against in the scientific community: researcher bias (otherwise known as confirmation bias).

Leo Tolstoy once observed:

"I know that most men, including those at ease with problems of the
greatest complexity, can seldom accept even the simplest and most
obvious truth if it be such as would oblige them to admit the falsity
of conclusions which they have delighted in explaining to colleagues,
which they have proudly taught to others, and which they have woven,
thread by thread, into the fabric of their lives."

The term hadn’t yet been coined in Tolstoy’s era, but the phenomenon he refers to here is well-known in psychological circles as cognitive dissonance. (For an in-depth and entertaining discussion of cognitive dissonance, take a look at Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts (2007 Harcourt Books) by Carol Tavris and Elliot Aronson.) Of course, Tolstoy was likely referring to people in general; but the problem of cognitive dissonance (or any other form of confirmation bias) takes on greater significance when ascribed to research scientists, who are rightly expected to maintain a high degree of impartiality in their research—even when it appears to be leading in an unexpected and/or undesirable direction.

All of which illustrates the second reason to question the results of any scientific inquiry: scientists are people.

This might be hard for many readers to swallow. As a culture, we worship our scientists, our doctors, our researchers, our physicists, chemists, biologists. We put them up on their ivory pedestals, shower them with our adoration, and give to them our undying promise to unquestioningly believe their every utterance and to promulgate it as truth.

Alas, would that God Himself commanded such respect.

So where does this leave us? Obviously, we still must rely on scientific inquiry—conducted by scientists—to help us make our own decisions regarding our health and fitness. But, as was mentioned above, we must be wary of the information we encounter. We must examine it under our own microscopes for procedural flaws. We must demand strict adherence to the scientific method (despite its flaws, still the best research tool around when used properly), and we must demand that our researchers do everything in their power to take their humanity out of the scientific process.

It sounds as if we have to become scientists ourselves to ensure the validity of our scientific information. Of course, that’s hardly practical. But I do believe that, by making ourselves more scientifically aware, and by making the effort to learn some practical techniques for critical examination, we can certainly improve the overall quality and impact of our personal decisions.

Here’s one way to start:

Award-winning science writer Gary Taubes, a correspondent for Science Magazine, has written an exhaustive and enlightening (some might say, depressing) exposé on scientific study, entitled Good Calories, Bad Calories. The book focuses (obviously) on the thinking currently in vogue throughout the nutrition industry, and exposes the hugely convoluted body of information being tossed around without scrutiny—in virtually every corner of the diet movement—as a veritable house of cards, based on a combination of (among other things) unsubstantiated hearsay, post hoc fallacies, and (surprise!) incredibly bad science. If there is any one book that can teach us how to critically evaluate the scientific information we are constantly bombarded with, this is it.

Among the book’s salient points (from the book jacket):

  • For decades we have been taught that fat is bad for us, carbohydrates better, and that the key to a healthy diet is eating less and exercising more. Yet with more and more people acting on this advice, we have seen unprecedented epidemics of obesity and diabetes. Taubes argues persuasively that the problem lies in refined carbohydrates (white flour, sugar, easily digested starches)—via their dramatic effect on insulin, the hormone that regulates fat accumulation—and that the key to good health is the kind of calories we take in, not the number. There are good calories, and bad ones.
  • Taubes traces how the common assumption that carbohydrates are fattening was abandoned in the 1960s when fat and cholesterol were blamed for heart disease and then—wrongly—were seen as the causes of a host of other maladies, including cancer. He shows us how these unproven hypotheses were emphatically embraced by authorities in nutrition, public health, and clinical medicine, in spite of the fact that well-conceived clinical trials have consistently refuted them. He also documents the dietary trials of carbohydrate restriction, which consistently show that the fewer carbohydrates we consume, the leaner we will be.
  • With precise references to the most significant existing clinical studies, he convinces us that there is no compelling scientific evidence demonstrating that saturated fat and cholesterol cause heart disease, that salt causes high blood pressure, or that fiber is a necessary part of a healthy diet. Based on the evidence that does exist, he leads us to conclude that the only healthy way to lose weight and remain lean is to eat fewer carbohydrates or to change the type of carbohydrates we do eat, and, for some of us, perhaps to eat virtually none at all.

In Part 2 of this article, I will discuss some specifics about looking critically at current scientific thought on nutrition.