I’m more than halfway through Taubes’ book that I mentioned about a month back, and the one thing that struck me (in a good way) is that it’s more of a political work than one on nutrition. To take one simple example, the third and fourth chapters carry the titles “Creation of Consensus” and “The Greater Good,” respectively; they delve into the politicization of nutrition science, and into a particular kind of mentality wherein it is considered acceptable to “save” one life by “treating” a thousand people, even if 99.9% of them are not, and never will be, “sick.”
It is perplexing to see people who call themselves scientists ignoring, for some reason, all evidence that causes their hypothesis to fail. Taubes writes the following in the tenth chapter:
Even the diabetes community found it easier to accept Reaven’s science than its dietary implications. Reaven’s observations and data “speak for themselves,” as Robert Silverman of the NIH suggested at a 1986 consensus conference on diabetes prevention and treatment. But they placed nutritionists in an awkward position. “High protein levels can be bad for the kidneys,” said Silverman. “High fat is bad for your heart. Now Reaven is saying not to eat high carbohydrates. We have to eat something.” “Sometimes we wish it would go away,” Silverman added, “because nobody knows how to deal with it.”
This is what psychologists call cognitive dissonance, or the tension that results from trying to hold two incompatible beliefs simultaneously. When the philosopher of science Thomas Kuhn discussed cognitive dissonance in scientific research—“the awareness of an anomaly in the fit between theory and nature”—he suggested that scientists will typically do what they have invariably done in the past in such cases: “They will devise numerous articulations and ad hoc modifications of their theory in order to eliminate any apparent conflict.” And that’s exactly what happened with metabolic syndrome and its dietary implications. The syndrome itself was accepted as real and important; the idea that it was caused or exacerbated by the excessive consumption of carbohydrates simply vanished.
And this is from the fourteenth chapter, “The Mythology of Obesity,” where he discusses the “thrifty gene” hypothesis and its history:
It wasn’t until the late 1970s, just a few years before Neel himself publicly rejected his hypothesis, that obesity researchers began invoking thrifty genes as the reason why putting on weight seems so much easier than losing it. Jules Hirsch of Rockefeller University was among the first to do so, and his logic is noteworthy, because his primary goal was to establish that humans, like every other species of animal, had apparently evolved a homeostatic system to regulate weight, and one that would do so successfully against fluctuations in food availability. We eat during the day, and yet have to supply nutrients to our cells all night long, while we sleep, for example, so we must have evolved a fuel storage system that takes this into account. “To me, it would be most unthinkable if we did not have a complex, integrated system to assure that a fraction of what we eat is put aside and stored,” Hirsch wrote in 1977. To explain why these components might cause obesity so often in modern societies, he assumed as fact something that Neel had never considered more than speculation. “The biggest segment of man’s history is covered by times when food was scarce and was acquired in unpredictable amounts and by dint of tremendous caloric expenditure,” Hirsch suggested. “The long history of food scarcity and its persistence in much of the world could not have gone unnoticed by such an adaptive organism as man. Hoarding and caloric miserliness are built into our fabric.”
But that “scarcity” is a myth more than anything else:
The prevailing opinion among anthropologists, not to be confused with that of nutritionists and public-health authorities, is that hunting and gathering allow for such a varied and extensive diet, including not just roots and berries but large and small game, insects, scavenged meat (often eaten at “levels of decay that would horrify a European”), and even occasionally other humans, that the likelihood of the simultaneous failure of all nutritional resources is vanishingly small. When hunting failed, these populations could still rely on foraging of plant food and insects, and when gathering failed “during long-continued drought,” as the missionary explorer David Livingstone noted of a South African tribe in the mid-nineteenth century, they could relocate to the local water holes, where “very great numbers of the large game” also congregated by necessity. This resiliency of hunting and gathering is now thought to explain why it survived for two million years before giving way to agriculture. In those areas where human remains span the transition from hunter-gatherer societies to farmers, anthropologists have reported that both nutrition and health declined, rather than improved, with the adoption of agriculture. (It was this observation that led Jared Diamond to describe agriculture as “the worst mistake in the history of the human race.”)
Famines were both common and severe in Europe until the nineteenth century, which would suggest that those with European ancestry should be the most likely to have thrifty genes, and the most susceptible to obesity and diabetes in our modern toxic environments. Yet among Europeans there is “a uniquely low occurrence of Type 2 diabetes,” as Diamond puts it, more evidence that the thrifty-gene hypothesis is incorrect.