Andrew’s Blog

Dr. Zimmerman's New Info/Perspective on Vaccines Posted on March 14, 2019, 0 Comments

Milk and Mucus Myths and Misdirection Posted on January 25, 2016, 0 Comments

The more health claims made about a food, the worse it is for you.  

In the case of the dairy industry, the above statement is nothing less than spot on (shameless plug, I'll admit). Sick cows fed everything under the sun other than the grass they were designed to eat are not the ideal source for your dairy consumption.  And if the producers opt to then pasteurize or homogenize or in some shape or form bastardize that dairy, then what was once an incredibly healthy source of nutrition soon becomes udderly unrecognizable as a food.  

However, organic, grass-fed cows (and sheep, goats, etc.) raised the way nature intended can produce quality dairy products which are extremely beneficial for health.  Sufficient quantities of bio-available calcium (i.e. animal sources for those of us who aren't ruminant herbivores) keep parathyroid hormone low while also increasing the likelihood of tryptophan converting to niacin rather than serotonin (and that should make the health conscious happy).  It also helps maintain a favorable calcium to phosphorus ratio in the diet, without which blood pressure, inflammation, and even tumor growth are often increased.  Calcium also downregulates the production of adrenalin.  Oh--and it's involved in muscle contraction and is an essential component in the electrical conduction system of the heart, too.  

But does dairy cause mucus production?

Quite simply--yes.  In those who are sensitive to it, dairy can create an immune response in the body...just like any and every food someone eats that isn't conducive to their specific digestive capabilities.  

As one study from the journal Medical Hypotheses states:

Excessive milk consumption has a long association with increased respiratory tract mucus production and asthma. Such an association cannot be explained using a conventional allergic paradigm and there is limited medical evidence showing causality. In the human colon, β-casomorphin-7 (β-CM-7), an exorphin derived from the breakdown of A1 milk, stimulates mucus production from gut MUC5AC glands. In the presence of inflammation similar mucus overproduction from respiratory tract MUC5AC glands characterises many respiratory tract diseases. β-CM-7 from the blood stream could stimulate the production and secretion of mucus production from these respiratory glands. Such a hypothesis could be tested in vitro using quantitative RT-PCR to show that the addition of β-CM-7 into an incubation medium of respiratory goblet cells elicits an increase in MUC5AC mRNA and by identifying β-CM-7 in the blood of asthmatic patients. This association may not necessarily be simply cause and effect as the person has to be consuming A1 milk, β-CM-7 must pass into the systemic circulation and the tissues have to be actively inflamed. These prerequisites could explain why only a subgroup of the population, who have increased respiratory tract mucus production, find that many of their symptoms, including asthma, improve on a dairy elimination diet.

So I guess it depends on who you are and what you've done to your digestive capacity via nutrition and lifestyle practices.  Let me 'splain:

Oftentimes an inability to digest/assimilate dairy begins secondary to another offending agent.  Gluten is a prime suspect.  Alcohol and medicinal drugs are also common culprits.  Anything which is a stress to the biological system has the potential to inflame the gut wall (for more on this subject, see the Seesaw of Sickness in my book, Spot On: Nutrition, found here: ).  This microtrauma to the intestine causes tiny holes to form, allowing food particles to pass into the bloodstream undigested.  The body then creates antibodies to that particular food, potentially causing you to have an immune response to whatever you're eating.  Additionally, the constant inflammation causes what's termed villous atrophy.  Lining the wall of your intestines, you have little finger-like projections called villi.  These, in turn, have tiny little microvilli covering them--you have about 200 million per square millimeter.  The job of the microvilli is to help you assimilate nutrition from your food by producing various enzymes.  One of these enzymes, in the case of our current discussion, is lactase--the enzyme you need to do anything with the lactose in dairy.  No microvilli equals limited lactase (and other digestive enzymes), which limits your ability to consume dairy without suffering ill effects.  

Of course, raw dairy typically comes with the exact enzymes one needs to safely and effectively consume it.  But pasteurization destroys all those enzymes along with most if not all of the heat-sensitive nutrients.  This is one reason why folks who are "lactose intolerant" often do fine when eating raw dairy.  These people also typically fare better with full fat dairy instead of skim or low fat versions, which have more lactose per serving than the unadulterated milk products.  Sheep and goat dairy are often better tolerated than dairy from cows (ask if you wanna know why); and still others are sensitive to the form--doing fine with hard cheeses or yogurt yet having trouble with milk.  Lastly, quantity and frequency of exposure are also factors which need to be considered in regards to how a person reacts to dairy.  While the healthy digestive system should be able to handle just about anything that's thrown at it or in it (up to a point), the sad truth is most of us have done such damage to ourselves that we need to do some serious rehab of the gut wall and our health in general before we're able to eat whatever we want.

And, personally, I think that level of consciousness--even if forced upon us due to digestive complaints or otherwise--is actually a blessing.  It's a painful signal that we're moving in the wrong direction, and it's time to redirect.  


For those who need their academic mind satisfied, I've included a couple of studies below.  N = 1, however, so I suggest you experiment on yourself to find what works best for your specific biochemistry.

In the first of three studies investigating the widely held belief that "milk produces mucus," 60 volunteers were challenged with rhinovirus-2, and daily respiratory symptoms and milk and dairy product intake records were kept over a 10-day period. Nasal secretion weights were obtained by weighing tissues collected and sealed immediately after use. Information was obtained on 51 subjects, yielding 510 person-days of observation. Subjects consumed zero to 11 glasses of milk per day (mean, 2.7; SE, 0.08), and secretion weights ranged from zero to 30.4 g/day (mean, 1.1; SE, 0.1). In response to an initial questionnaire, 27.5% reported the practice of reducing intake of milk or dairy products with a cold or named milk or dairy products as bad for colds. Of the latter group, 80% stated the reason as "producing more mucus/phlegm." Milk and dairy product intake was not associated with an increase in upper or lower respiratory tract symptoms of congestion or nasal secretion weight. A trend was observed for cough, when present, to be loose with increasing milk and dairy product intake; however, this effect was not statistically significant at the 5% level. Those who believe "milk makes mucus" or reduce milk intake with colds reported significantly more cough and congestion symptoms, but they did not produce higher levels of nasal secretions. We conclude that no statistically significant overall association can be detected between milk and dairy product intake and symptoms of mucus production in healthy adults, either asymptomatic or symptomatic, with rhinovirus infection.

There is a belief among some members of the public that the consumption of milk and dairy products increases the production of mucus in the respiratory system. Therefore, some who believe in this effect renounce drinking milk. According to Australian studies, subjects perceived some parameters of mucus production to change after consumption of milk and soy-based beverages, but these effects were not specific to cows' milk because the soy-based milk drink with similar sensory characteristics produced the same changes. In individuals inoculated with the common cold virus, milk intake was not associated with increased nasal secretions, symptoms of cough, nose symptoms or congestion. Nevertheless, individuals who believe in the mucus and milk theory report more respiratory symptoms after drinking milk. In some types of alternative medicine, people with bronchial asthma, a chronic inflammatory disease of the lower respiratory tract, are advised not to eat so-called mucus-forming foods, especially all kinds of dairy products. According to different investigations the consumption of milk does not seem to exacerbate the symptoms of asthma and a relationship between milk consumption and the occurrence of asthma cannot be established. However, there are a few cases documented in which people with a cow's milk allergy presented with asthma-like symptoms.

Meat and the Environment Posted on December 15, 2015, 0 Comments

Original Source found here:

Not eating red meat won’t save the planet

Asa Wahlquist
Published: December 14, 2015 - 9:00PM

Comment: The future of protein is not meat
It sounds so easy: stop eating red meat to lower greenhouse gas emissions. But nature is far more complicated than that.

There are three critical questions you need to ask before cutting beef and lamb out of your diet for environmental reasons: what will happen to the grasslands that cattle and sheep graze; how will alternate protein be produced; and what will the greenhouse consequences of that production be?

About 60 per cent of the world's agricultural land is grasslands, land that is too poor and too dry to be cropped. In Australia, about 70 per cent of the country is grassland. The only way food can be produced from grasslands is by grazing ruminants. Most mammals cannot digest grass themselves, but ruminants have special stomachs filled with grass-digesting bacteria. The problem is those bacteria produce methane, which the ruminant burps out.

Methane is a potent greenhouse gas with a rating 25 times that of carbon dioxide over 100 years, although it has a lifetime of 9 to 12 years in the atmosphere.
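To make that 25-times figure concrete, here is a minimal sketch of the CO2-equivalent arithmetic. The GWP value of 25 comes from the paragraph above; the per-animal methane figure of 100 kg per year is an assumed round number for illustration only, not a figure from this article.

```python
# Convert a mass of methane into its CO2-equivalent using the
# 100-year global warming potential (GWP) cited in the text.
GWP_100_CH4 = 25  # kg CO2e per kg CH4 over a 100-year horizon

def co2_equivalent(kg_methane: float, gwp: float = GWP_100_CH4) -> float:
    """Return the CO2-equivalent (kg) of a given mass of methane (kg)."""
    return kg_methane * gwp

# Illustrative assumption: ~100 kg CH4 per animal per year
# (a hypothetical round figure, not taken from the article).
annual_cow_ch4_kg = 100
print(co2_equivalent(annual_cow_ch4_kg))  # 2500 kg CO2e, i.e. 2.5 tonnes
```

Note that this simple multiplier hides the point made in the text: unlike fossil CO2, methane decays out of the atmosphere within roughly a decade, which is why the choice of time horizon matters so much.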

The experience worldwide is that if cattle are removed from grasslands, the original ruminants re-establish themselves, or ferals invade.

In Australia the main ferals are goats, as well as camels in drier regions. Contrary to popular belief, kangaroos do produce methane, although the actual quantities, and their alternate pathways for digesting cellulose from grass, are the subject of ongoing research. Even termites produce methane: they are responsible for about three per cent of Australia's greenhouse gas emissions.

What if everyone did go vegetarian and the grasslands were not grazed at all? In Australia, they would most likely burn. Bushfire accounts for about 3 per cent of Australia's net greenhouse gas emissions.

The argument overseas focuses largely on the huge quantities of grain fed to livestock that could otherwise be consumed by humans. This is a practice that is indefensible on environmental grounds. In Australia, most cattle and all sheep are grassfed. Dairy cattle are usually given supplementary feed, which is mostly forage or hay with some grain.

If you decide not to eat meat, where are you going to get your protein, and what are the greenhouse gas consequences? Soy beans, chickpeas, lentils - all the high-protein legumes - are crops that are grown on cleared land, land that is ploughed, fertilised, planted, irrigated and harvested by greenhouse-gas producing machines.

Australia is at its limit of land that can be cleared for cropping, and is in the process of reducing irrigation in its food bowl, the Murray-Darling basin. And talking of irrigation, under Australian conditions soybeans need almost as much water as cotton. Australia produces roughly 15 per cent of the soybeans that it consumes, although much of that is used in stock feed.

Pigs and chickens are monogastric and as a result produce a small fraction, per kilo, of the methane produced by ruminants. Unlike cattle they cannot live on grass. In traditional farm situations they were fed on crop residues and waste, but now significant quantities of grains are grown to feed them.

Meat protein substitutes, ranging from tofu to synthetic meat, are all highly processed and that means more greenhouse gas production.

Estimating methane production is a tricky business. There are a number of figures for the percentage of greenhouse gas emissions agriculture is responsible for, and they are getting better. On Monday, the CSIRO announced methane emissions from Australian cattle were actually 24 per cent lower than previously thought.

Critics of meat consumption like to compare ruminant-produced methane with transport emissions. But fossil fuels are releasing carbon that was sequestered hundreds of millions of years ago that will never be replaced. The methane burped by a cow comes from carbon sequestered in the grass during the last growing season. If that grass keeps growing, or produces seedlings, carbon will be sequestered again next season.

There is no comparison: burning fossil fuels is a one-way street. The methane produced by ruminants is a natural part of an ancient life cycle.

Asa Wahlquist is a rural journalist.

This story was found at:

Does Rheumatoid Arthritis begin in the Gut? Posted on October 16, 2015, 2 Comments

2007 Feb;83(976):128-31.

Is rheumatoid arthritis a disease that starts in the intestine? A pilot study comparing an elemental diet with oral prednisolone.



This pilot study aimed to determine if an elemental diet could be used to treat patients with active rheumatoid arthritis and to compare its effect to that of oral prednisolone.


Thirty patients with active rheumatoid arthritis were randomly allocated to 2 weeks of treatment with an elemental diet (n = 21) or oral prednisolone 15 mg/day (n = 9). Assessments of duration of early morning stiffness (EMS), pain on a 10 cm visual analog scale (VAS), the Ritchie articular index (RAI), swollen joint score, the Stanford Health Assessment Questionnaire, global patient and physician assessment, body weight, erythrocyte sedimentation rate (ESR), C-reactive protein (CRP) and haemoglobin, were made at 0, 2, 4 and 6 weeks.


All clinical parameters improved in both groups (p<0.05) except the swollen joint score in the elemental diet group. An improvement of greater than 20% in EMS, VAS and RAI occurred in 72% of the elemental diet group and 78% of the prednisolone group. ESR, CRP and haemoglobin improved in the steroid group only (p<0.05).


An elemental diet for 2 weeks resulted in a clinical improvement in patients with active rheumatoid arthritis, and was as effective as a course of oral prednisolone 15 mg daily in improving subjective clinical parameters. This study supports the concept that rheumatoid arthritis may be a reaction to a food antigen(s) and that the disease process starts within the intestine.

Original Source:

In Defense of Red Meat Posted on June 23, 2015, 0 Comments

How Americans Got Red Meat Wrong by Nina Teicholz

The idea that red meat is a principal dietary culprit has pervaded our national conversation for decades. We have been led to believe that we’ve strayed from a more perfect, less meat-filled past. Most prominently, when Senator McGovern announced his Senate committee’s report, called Dietary Goals, at a press conference in 1977, he expressed a gloomy outlook about where the American diet was heading.

“Our diets have changed radically within the past 50 years,” he explained, “with great and often harmful effects on our health.” These were the “killer diseases,” said McGovern. The solution, he declared, was for Americans to return to the healthier, plant-based diet they once ate.

The justification for this idea, that our ancestors lived mainly on fruits, vegetables, and grains, comes mainly from the USDA “food disappearance data.” The “disappearance” of food is an approximation of supply; most of it is probably being eaten, but much is wasted, too. Experts therefore acknowledge that the disappearance numbers are merely rough estimates of consumption.

The data from the early 1900s, which is what McGovern and others used, are known to be especially poor. Among other things, these data accounted only for the meat, dairy, and other fresh foods shipped across state lines in those early years, so anything produced and eaten locally, such as meat from a cow or eggs from chickens, would not have been included.

And since farmers made up more than a quarter of all workers during these years, local foods must have amounted to quite a lot. Experts agree that these early availability data are not adequate for serious use, yet they cite the numbers anyway, because no other data are available. And for the years before 1900, there are no “scientific” data at all.

In the absence of scientific data, history can provide a picture of food consumption in the late-18th- to 19th-century in America.

Early American settlers were “indifferent” farmers, according to many accounts. They were fairly lazy in their efforts at both animal husbandry and agriculture, with “the grain fields, the meadows, the forests, the cattle, etc, treated with equal carelessness,” as one 18th-century Swedish visitor described—and there was little point in farming since meat was so readily available.

Settlers recorded the extraordinary abundance of wild turkeys, ducks, grouse, pheasant, and more. Migrating flocks of birds would darken the skies for days. The tasty Eskimo curlew was apparently so fat that it would burst upon falling to the earth, covering the ground with a sort of fatty meat paste. (New Englanders called this now-extinct species the “doughbird.”)

In the woods, there were bears (prized for their fat), raccoons, bobolinks, opossums, hares, and virtual thickets of deer—so many that the colonists didn’t even bother hunting elk, moose, or bison, since hauling and conserving so much meat was considered too great an effort. A European traveler describing his visit to a Southern plantation noted that the food included beef, veal, mutton, venison, turkeys, and geese, but he does not mention a single vegetable.

Infants were fed beef even before their teeth had grown in. The English novelist Anthony Trollope reported, during a trip to the United States in 1861, that Americans ate twice as much beef as did Englishmen. Charles Dickens, when he visited, wrote that “no breakfast was breakfast” without a T-bone steak. Apparently, starting a day on puffed wheat and low-fat milk—our “Breakfast of Champions!”—would not have been considered adequate even for a servant.

Indeed, for the first 250 years of American history, even the poor in the United States could afford meat or fish for every meal. The fact that the workers had so much access to meat was precisely why observers regarded the diet of the New World to be superior to that of the Old.

“I hold a family to be in a desperate way when the mother can see the bottom of the pork barrel,” says a frontier housewife in James Fenimore Cooper’s novel The Chainbearer.

In the book Putting Meat on the American Table, researcher Roger Horowitz scours the literature for data on how much meat Americans actually ate. A survey of 8,000 urban Americans in 1909 showed that the poorest among them ate 136 pounds a year, and the wealthiest more than 200 pounds.

A food budget published in the New York Tribune in 1851 allots two pounds of meat per day for a family of five. Even slaves at the turn of the 18th century were allocated an average of 150 pounds of meat a year. As Horowitz concludes, “These sources do give us some confidence in suggesting an average annual consumption of 150–200 pounds of meat per person in the nineteenth century.”

About 175 pounds of meat per person per year—compared to the roughly 100 pounds of meat per year that an average adult American eats today. And of that 100 pounds of meat, about half is poultry—chicken and turkey—whereas until the mid-20th century, chicken was considered a luxury meat, on the menu only for special occasions (chickens were valued mainly for their eggs).

Yet this drop in red meat consumption is the exact opposite of the picture we get from public authorities. A recent USDA report says that our consumption of meat is at a “record high,” and this impression is repeated in the media.

It implies that our health problems are associated with this rise in meat consumption, but these analyses are misleading because they lump together red meat and chicken into one category to show the growth of meat eating overall, when it’s just the chicken consumption that has gone up astronomically since the 1970s. The wider-lens picture is clearly that we eat far less red meat today than did our forefathers.

Roger Horowitz, Putting Meat On the American Table (Baltimore, MD: Johns Hopkins University Press, 2000): 11-17; Adapted from Carrie R. Daniel et al., "Trends In Meat Consumption in the USA".

Meanwhile, also contrary to our common impression, early Americans appeared to eat few vegetables. Leafy greens had short growing seasons and were ultimately considered not worth the effort. And before large supermarket chains started importing kiwis from Australia and avocados from Israel, a regular supply of fruits and vegetables could hardly have been possible in America outside the growing season. Even in the warmer months, fruit and salad were avoided, for fear of cholera. (Only with the Civil War did the canning industry flourish, and then only for a handful of vegetables, the most common of which were sweet corn, tomatoes, and peas.)

So it would be “incorrect to describe Americans as great eaters of either [fruits or vegetables],” wrote the historians Waverly Root and Richard de Rochemont. Although a vegetarian movement did establish itself in the United States by 1870, the general mistrust of these fresh foods, which spoiled so easily and could carry disease, did not dissipate until after World War I, with the advent of the home refrigerator. By these accounts, for the first 250 years of American history, the entire nation would have earned a failing grade according to our modern mainstream nutritional advice.

During all this time, however, heart disease was almost certainly rare. Reliable data from death certificates is not available, but other sources of information make a persuasive case against the widespread appearance of the disease before the early 1920s.

Austin Flint, the most authoritative expert on heart disease in the United States, scoured the country for reports of heart abnormalities in the mid-1800s, yet reported that he had seen very few cases, despite running a busy practice in New York City. Nor did William Osler, one of the founding professors of Johns Hopkins Hospital, report any cases of heart disease during the 1870s and eighties when working at Montreal General Hospital.

The first clinical description of coronary thrombosis came in 1912, and an authoritative textbook in 1915, Diseases of the Arteries including Angina Pectoris, makes no mention at all of coronary thrombosis. On the eve of World War I, the young Paul Dudley White, who later became President Eisenhower’s doctor, wrote that of his 700 male patients at Massachusetts General Hospital, only four reported chest pain, “even though there were plenty of them over 60 years of age then.”

About one fifth of the U.S. population was over 50 years old in 1900. This number would seem to refute the familiar argument that people formerly didn’t live long enough for heart disease to emerge as an observable problem. Simply put, there were some 10 million Americans of a prime age for having a heart attack at the turn of the 20th century, but heart attacks appeared not to have been a common problem.

Ironically—or perhaps tellingly—the heart disease “epidemic” began after a period of exceptionally reduced meat eating. The publication of The Jungle, Upton Sinclair’s fictionalized exposé of the meatpacking industry, caused meat sales in the United States to fall by half in 1906, and they did not revive for another 20 years.

In other words, meat eating went down just before coronary disease took off. Fat intake did rise during those years, from 1909 to 1961, when heart attacks surged, but this 12 percent increase in fat consumption was not due to a rise in animal fat. It was instead owing to an increase in the supply of vegetable oils, which had recently been invented.

Nevertheless, the idea that Americans once ate little meat and “mostly plants”—espoused by McGovern and a multitude of experts—continues to endure. And Americans have for decades now been instructed to go back to this earlier, “healthier” diet that seems, upon examination, never to have existed.

Original source found here:

The Baneful Consequences of the U.S. Dietary Guidelines Posted on February 02, 2015, 0 Comments

Adele Hite     January 13, 2015     Original source found here:

The next set of Dietary Guidelines for Americans (DGA), the public health nutrition policy that directs all federal nutrition activities “including research, education, nutrition assistance, labeling, and nutrition promotion,”1 are due out in 2015. The DGA are meant to address a simple question: What should Americans eat to be healthy?2 As the 2015 Dietary Guidelines Advisory Committee (DGAC) begins to create the report that will advise any possible changes to the DGA, they appear poised to provide the same answer to that question that has proven largely ineffective for the past thirty-five years.

Although the DGAC has retreated from the recommendation that Americans reduce their intake of total fat, limits on saturated fat and cholesterol from animal products remain firmly in place and these levels may be restricted further. Thus despite the superficial movement away from reduced-fat guidance, in terms of which foods are permitted and which are restricted or forbidden, nothing has changed.

According to the 2015 DGAC, eggs, meat, butter and full-fat dairy are still to be limited or eliminated from the diet altogether. Consumption of whole grains, fruits and vegetables, lowfat or no-fat dairy, fish, and lean cuts of poultry are encouraged, and, with restrictions on fat intake relaxed, Americans will now be allowed to consume even more vegetable oil than before.

While the 2015 DGAC has acknowledged that when Americans replaced dietary fat with starches and sugars obesity rates climbed, there has been no recognition of the relationship between this phenomenon and DGA guidance. Rather, the implication remains that high rates of being overweight and obese in America are due to the fact that Americans have simply failed to comply with what the U.S. Departments of Agriculture (USDA) and Health and Human Services (DHHS)―the two government agencies in charge of the DGA―have determined is best for the public. “Poor diet and physical inactivity are the most important factors contributing to an epidemic of overweight,”3 not poor dietary recommendations based on inadequate science.


In fact, a primary misconception in public health nutrition is that current national nutrition polices are based on scientific agreement about what constitutes a healthy diet. However from the beginning, federal dietary guidance has been based more on ideology, including romantic notions of returning to a “natural” way of eating, than science. Although nutrition science has changed dramatically in the thirty-five years since the first national dietary recommendations were issued, the recommendations themselves have remained virtually unchanged. The historical and cultural influences behind federal dietary recommendations, their controversies and their consequences, warrant a close critical examination. They demonstrate that although science and policy perform very different functions, they can be mutually reinforcing. Though this does serve to make science more political, it does not make policy more scientific.

A cascade of unintended consequences has resulted from those original dietary recommendations, guidance that remains entrenched, held in place by politics, ideology, institutional agendas, and the influence of interested industries.4,5 This entrenchment has resulted in millions of U.S. taxpayer dollars spent on nutrition policies, programs and practices that do not result in good health, while the very same taxpayers are expected to shoulder the blame for these negative outcomes.


When the first national nutrition recommendations for the prevention of chronic disease, the 1977 Dietary Goals for Americans, were originally proposed, not only was the content of the recommendations hotly debated, the very concept of one-size-fits-all, population-wide dietary advice was itself highly controversial. The 1977 Dietary Goals introduced a diet―high in grains and cereals and low in fat, with few animal products, and vegetable oils substituting for animal fats―that was an extreme departure from what Americans were then eating. Not only was the diet recommended by the 1977 Goals a radical change for many Americans, the very idea that the federal government could know what foods were best for any given individual was a dramatic shift in how public health nutrition was understood and administered.

Before the 1977 Goals were created, the determination of which foods were “good” for you and which were “bad” was located within the family and community, rather than with the government. Packaged food did not carry a nutrition label, and government dietary guidance focused on acquisition of adequate essential nutrition, rather than the avoidance of foods that might cause chronic disease. Despite the lack of government guidance on how to prevent chronic disease through nutrition, heart disease rates had been decreasing in America since 1968,6 and in 1975, less than 15 percent of the population was considered obese.7

In many regards, the health of Americans in the 1970s had never been better. However, concerns about “lifestyle-related” diseases permeated the consciousness of much of middle class America, and food manufacturers responded accordingly. The American Heart Association (AHA) had created a national platform for a theory proposed by a physiologist named Ancel Keys, which asserted that dietary fat—especially saturated fat and cholesterol from animal products—led to heart disease. Responding to these interests, manufacturers of “heart-healthy” margarines and meat substitutes began claiming their products could reduce the risk of heart disease, although the federal government remained unconvinced.

Evidence that dietary fat and cholesterol had significant effects on heart disease was elusive, and the Federal Trade Commission repeatedly warned manufacturers not to make false and misleading claims linking food products to the prevention of heart disease.8 Although the AHA primarily aimed its fear-of-fat message at businessmen who might be lucrative donors,8 the counter-culture thinking that emerged from the social upheavals of the 1960s picked up the refrain, marrying concerns about chronic disease to anxiety about the environment and world hunger.

Earlier in the decade, a popular vegetarian cookbook by Frances Moore Lappé, Diet for a Small Planet, suggested that a meat-free diet would be low in saturated fat and cholesterol, thus reducing risk of obesity, heart disease and cancer; furthermore, Lappé asserted, a vegetarian way of life would reduce world hunger, energy costs, and environmental impacts of agriculture.9

While Frances Moore Lappé’s Diet for a Small Planet popularized vegetarian ideology, then-Secretary of Agriculture Earl Butz, an economist with many ties to large agricultural corporations, was enacting policies that encouraged the planting of large-scale, monoculture crops on all arable land.10

The “fencerow to fencerow” policies Butz initiated helped to shift farm animals from pasture land to feed lots. Making room for government-subsidized corn and soybeans would increase efficiency of food production; what didn’t go into cows could go into humans, including the oils that were a by-product of turning crops into animal feed.

The agenda of vegetarians and health reformers who urged Americans to consume fewer animal products, eat more grain and cereal products, and substitute the polyunsaturated oils found in corn and soybean oil for saturated animal fats like butter and lard fit neatly into large agribusiness efforts to increase the market for processed foods, which carry a wider profit margin than eggs and meat.11

These cultural forces coalesced around Senator George McGovern’s Senate Select Committee on Nutrition and Human Needs, which was first created in order to address malnutrition in America. The work of the Select Committee had been so successful that it shifted its attention from malnutrition to “overnutrition” and focused on the creation of a report that was meant to do for diet and chronic disease what the 1964 Surgeon General’s Report had done for cigarettes and cancer.12 This work took on renewed urgency and significance as the committee’s tenure seemed about to come to an end.13 Such a report would address the public’s growing fears about obesity and chronic disease and policymakers’ concerns about rising health care costs―and perhaps extend the lifespan of the committee itself.14

During the summer of 1976, the committee conducted a series of hearings, entitled “Diet Related to Killer Diseases,” taking testimony from doctors and scientists specifically chosen for their willingness “to talk about eating less fat, eating less sugar, eating less meat.”15 The title of the hearings and the experts chosen to testify set the direction for the findings. In early 1977, the committee released the Dietary Goals for Americans, blaming what it saw as an “epidemic” of killer diseases—obesity, diabetes, heart disease and cancer—on changes in the American diet that had occurred in the previous fifty years, specifically the increase in “fatty and cholesterol-rich foods.”16

The report claimed that in order to reduce their risk of chronic disease, Americans should reduce their intake of food that contained fat, particularly saturated fat and cholesterol from animal products like meat, whole milk, eggs and butter, and instead consume more grains, cereals, vegetable oils, fruits, and vegetables. These particular recommendations reflected not only concerns related to health, but the “back-to-nature” ideology that was becoming increasingly popular with regard to food and diet. The committee used material from Diet for a Small Planet, along with research on vegetarian diets, to argue that a shift to plant-based protein could reduce intake of calories, cholesterol and saturated fat, as well as reduce blood pressure, risk of cancer, use of natural resources, and food costs.16 This message gave official sanction to the romantic notion that a plant-based diet could not only prevent chronic disease, but feed the hungry and save the planet.

These recommendations were met with vehement objections from scientists, doctors, and public health professionals, who argued that the recommendations were scientifically unsound and potentially harmful.17 Those who supported the Dietary Goals felt the proposed radical change in the American diet presented no risk to the health of the American people.16 In contrast, the American Medical Association said, “The evidence for assuming that benefits to be derived from the adoption of such universal dietary goals . . . is not conclusive and there is potential for harmful effects from a radical long-term dietary change as would occur through adoption of the proposed national goals.”18 Yet this warning went unheeded, and the controversy over the Dietary Goals had little effect on future USDA/DHHS recommendations. With few changes, the 1977 Goals became the first Dietary Guidelines for Americans in 1980. The DGA have since become a powerful policy document, although the limitations that have afflicted them since the beginning have resulted in several unintended negative consequences.


The controversy surrounding the original 1977 Dietary Goals took shape along several lines. Critics raised doubts regarding the appropriateness of a single, population-wide dietary prescription, applied to all individuals regardless of level of risk, to prevent diseases that were not established as nutritional in nature.19 In addition, they made strenuous objections to the fact that these recommendations had not been tested for safety or efficacy and would be the equivalent of conducting a population-wide dietary experiment.20

Critics of the report pointed to the report’s “new age, neo-naturalist” stance, noting that the nutrition scientists at the Department of Health, Education, and Welfare (now the DHHS), who urged caution in the face of the limited science on nutrition and chronic disease, could not compete with this popular ideology either for public support or for government funds for additional research.21

That the creators of the 1977 Goals had used a thin veneer of science to support their preconceived notions of what diet was best for Americans was evident in the contradictory nature of the report’s own data. For example, the 1977 Goals suggested consumers should increase vegetable oil consumption. However, dissenting scientists pointed out that increased consumption of vegetable oils and decreased consumption of saturated fats were, according to data supplied by the 1977 Goals themselves, associated with increased levels of heart disease.17 As a result of this shaky scientific foundation, significant controversy continues about some of the original and current assertions upon which the DGA recommendations are built. These can be seen generally as an ongoing inability to firmly establish the connections between dietary patterns and chronic disease with available methodology. More specifically, controversy continues to surround the theories that 1) dietary fat, saturated fat, and cholesterol cause heart disease, obesity, diabetes and cancer and should be replaced in the diet with polyunsaturated vegetable oils; 2) a diet high in carbohydrates will reduce the risk of chronic disease; and 3) excessive sodium intake is the primary variable in the etiology of hypertension, a risk factor for heart disease.

The case against saturated fat and cholesterol has been particularly difficult to maintain in the face of evidence to the contrary that has accumulated in the past three decades. When the first DGA were created, there was no agreement regarding the relationship of diet to blood lipids and atherosclerosis. The reasons given then for the difficulty in clarifying the relationship were “the complicated nature of this disease, as well as the multitude of contributing factors and their relationships.”22 Large observational and intervention studies conducted early in the history of the DGA, such as the Framingham study, Multiple Risk Factor Intervention Trial, and the National Diet-Heart Study, are frequently cited as proving that a low-fat, low-cholesterol diet reduces risk of heart disease, yet the results from these studies are weak or inconclusive with regard to the relationship between diet and the development of heart disease.23-26 The science since that time remains inconsistent, limited, and open to question.

In 1997, Ancel Keys, the scientist whose theories about dietary cholesterol and heart disease first warned Americans away from meat and eggs, acknowledged, “There’s no connection whatsoever between cholesterol in food and cholesterol in the blood. None. And we’ve known that all along.”27 Studies cited by the 2010 DGAC Report demonstrate varied metabolic responses to lowered dietary saturated fat, with certain subpopulations exhibiting adverse rather than improved health outcomes.3 Two recent comprehensive meta-analyses indicate that saturated fat is not linked to heart disease.28,29 In fact, in a definitive review of forty-eight clinical trials, with over sixty-five thousand participants, the reduction or modification of dietary fat had no effect on mortality, cardiovascular mortality, heart attacks, stroke, cancer, or diabetes.30 Yet, avoiding saturated fat remains a cornerstone of national dietary guidance. Surveys show that the vast majority of Americans have come to believe that consuming animal fats increases one’s risk of heart disease, and many try to limit their intake of foods that contain these fats.31


The 1977 Dietary Goals did more than change the health beliefs of Americans. They affected all aspects of the food environment. That the 1977 Goals would have a powerful effect on the food industry was apparent even before they were finalized, but it is unlikely that the result was the intended one. While the initial hearings were being held, members of McGovern’s committee were warned that the food industry would respond with an explosion of products designed to meet whatever new dietary standards were established.32 With the creation of the 1977 Goals, the federal government had unmistakably designated who the “winners” and “losers” in the food sector would be. The “winners” would be manufacturers of breads, cereals, margarine, cooking oils, and soy products; “losers” would be producers of meats, butter, eggs and cheese.

Experts recognized at the time that many processed food manufacturers could “reformulate existing products to remove their allegedly deleterious nutritional effects,” something that would be very difficult for farmers who produced eggs and meat.33 To compound the advantage, for “food producers and processors whose product categories are favored by the goals, greater promotional emphasis on the nutrition value of these products may be expected. In effect products can be promoted using the national dietary goals as a ‘stamp of approval’ to gain greater acceptance in an increasingly nutrition-conscious marketplace.”33 The group most likely to be hurt by the new paradigm was not food processors but farmers: “The farmers feel especially threatened . . . because their livelihood could be most directly affected by the recommended changes. As the primary element in the food chain, farmers tend to be the most specialized and do not enjoy the flexibility and insulation of a multi-product line food processor.”33

Indeed, since the advent of the first DGA, the amount of money farmers receive for food produced has fallen by half.34 As consumers adopted eating patterns recommended in the DGA, a much larger share of their food dollar went to increased processing and marketing and the labor costs associated with these activities. Since the DGA encourages Americans to consume fewer of the products that generate a higher farm value―in other words, what the farmer is paid for the product that leaves the farm―and more of the products that generate a lower farm value, farmers overall receive less of each dollar spent on food in America. For example, the farm value of eggs, a food the DGA tells Americans to limit, is 54 percent of the consumer’s dollar. In contrast, cereal, which the DGA recognizes as a preferred “healthy” breakfast, has a farm value of only 8 percent of the consumer’s dollar.

Conventional arguments that promote plant-based diets as the most beneficial for health, the environment, and feeding the world neglect to address the way in which those diets are compatible with the agricultural policies that benefit large agricultural corporations and undermine the interests of farmers. Creating a more “democratic, socially and economically just, and environmentally sustainable” food system that supports farmers may need to begin with a reassessment of what foods may be considered nourishing.35


With federal nutrition directives to avoid saturated fat and cholesterol driving food manufacturing and consumer demand, eating patterns in America have changed dramatically since the first DGA were created. Consumers, whether they were interested in reducing the saturated fat content of their diet or not, were faced with food choices that had changed according to the DGA. As a result, despite accusations that they have ignored federal dietary advice, Americans have increased their intake of flour and cereal products and the vegetable oils that could be added to them, changes that are in line with DGA recommendations. Consumption data gathered from national health surveys indicate that virtually all of the increase in calories in the past 30 years has come from carbohydrate foods (starches and sugars such as would be found in flour and cereal products), while calories from saturated (animal) fats have decreased.36 While these changes are in line with recommendations from the DGA, they may have transformed the American diet in ways incompatible with good health.

In 1988, a vegetarian-oriented food activist group, Center for Science in the Public Interest (CSPI), warned the American public against the dangers of saturated fat and campaigned for the food industry to switch from beef tallow and lard to partially hydrogenated vegetable oil—specifically soybean oil. This is the kind of oil that is now associated with harmful trans fats. But in 1988, CSPI insisted trans fats were an improvement over saturated fat from animals.37 Oil seed companies were prepared with the technology to make this switch; Earl Butz’s agricultural policies provided plenty of the soybeans needed to create the oils that would be partially hydrogenated. Thus, far from resisting this change, “nearly all targeted firms responded by replacing saturated fats with trans fats.”37 For consumers, CSPI’s successful campaign meant that natural animal fats that cause no danger to health were replaced with highly-processed and harmful trans fats―whether the public wanted those changes or not.

Surplus corn provided another substitute for saturated fats in the form of high-fructose corn syrup (HFCS). As Dr. Robert Lustig, an endocrinologist specializing in obesity has noted, “When you take the fat out of a recipe, food tastes like cardboard, and you need to replace it with something— that something being sugar.”38 HFCS offered a cheap, plentiful, sugary replacement for the animal fats that Americans were now told to avoid. For example, “fat-free” yogurt, sweetened with HFCS, appeared on grocery store shelves, as a “healthy” alternative to full-fat yogurt.

In time, scientists on the 2000 DGAC realized that the emphasis on reducing fat in the diet could lead to “adverse metabolic consequences” resulting from a high intake of sugars and starches.39 They went on to note that “an increasing prevalence in obesity in the United States has corresponded roughly with an absolute increase in carbohydrate consumption.”32 At least some of that increase in carbohydrate consumption came from the HFCS that replaced saturated fats in food.

Obesity was not the only thing that increased in prevalence after the creation of the first DGA. In fact, trends indicate that, since 1980, the rates of many chronic diseases have increased dramatically. Prevalence of heart failure and stroke has increased significantly.6 Rates of new cases of all cancers have gone up.40 Rates of diabetes have tripled.41 In addition, although body weight is not in itself a measure of health, as the 2000 DGAC noted, rates of overweight and obesity have increased as Americans have adopted the eating patterns recommended by the DGA.7

In all of these categories, the health divide between black and white Americans has persisted or worsened, with black Americans especially negatively affected by the increase in diabetes. When following DGA recommendations, African-American adults gain more weight than their Caucasian counterparts, and low-income individuals have increased rates of diabetes, hypertension, and high cholesterol.42,43 Despite adherence to healthy eating patterns as determined by the DGA, studies have shown that African-American children remain at higher risk for development of diabetes and prediabetic conditions.44 African-Americans are almost twice as likely to have diabetes as non-Hispanic white Americans, and these differences in health outcomes have not been adequately explained by social and economic disparities between these populations.45 Long-standing differences in environmental, genetic, and metabolic characteristics may mean that recommendations that are merely ineffective in preventing chronic disease in white, middle-class Americans are in fact detrimental to the long-term health of black and low-income Americans.


While on the one hand the DGA have failed to prevent chronic disease, on the other hand they have also failed to provide Americans with guidance for obtaining adequate essential nutrition. Before the 1977 Dietary Goals were created, federal dietary recommendations focused on foods Americans were encouraged to eat in order to acquire adequate nutrition, not on food components to limit or avoid in order to prevent chronic disease.46 Meat, eggs, butter and whole milk were considered important sources of essential nutrients, and avoiding saturated fat in food was considered a “questionable dietary practice” adopted by “food faddists.”47 During World War II, meat and fats were considered such valuable sources of nutrition that Americans back home were asked to save them for the troops and eat fish and vegetables instead. In fact, prior to the creation of the DGA, Americans got about 36 percent of their calories from grains, fruits, and vegetables and over 50 percent of their calories from meat, eggs, cream, cheese, and fat.48

From the beginning, scientists were concerned that recommendations warning people to limit their intake of foods that were traditionally considered to be highly nutritious would adversely affect intake of essential nutrients. In response to the 1977 Dietary Goals, one scientist argued that “there are serious nutritional problems that affect many Americans that are clearly related to dietary inadequacies, particularly of high-quality protein . . . implementation of your recommendations could have a negative effect on these problems.”17

In fact, research has found that following DGA recommendations can have a detrimental impact on intake of essential nutrition. A 2013 study demonstrated that sodium restrictions in the 2010 DGA are “incompatible with potassium guidelines and with nutritionally adequate diets, even after reducing the sodium content of all foods by 10 percent.”49 The reduced-fat diet recommended by the DGA has also been linked to lower intakes of several important essential nutrients. In one study, lower fat intake was associated with lower intake of nine out of fourteen important micronutrients, independent of calorie intake.50

Choline, which was not recognized as an essential nutrient until after the first DGA were created, plays an important role in brain development in fetuses, and adequate amounts are important for the prevention of liver disease, atherosclerosis, and neurological disorders.51 Current average intakes of choline are far below established adequate levels.40 Scientists have suggested that, “Given the importance of choline in a wide range of critical functions in the human body, coupled with the less than optimal intakes among the population, dietary guidance should be developed to encourage the intake of choline-rich foods.”40 However, consumption of eggs and meat, two foods that are rich in choline, is restricted by current DGA recommendations that limit intake of cholesterol and saturated fat.


In 1977, the Dietary Goals acknowledged that “genetic and other individual differences mean that these guidelines may not be applicable to all.”16 However, this qualification has been muted in subsequent DGA. Although it is clear that good nutrition plays an important role in long-term health, when the first DGA were created the particular dietary pattern that would be optimal for achieving lifelong health was unclear; that is still the case today. Early critics of the Guidelines felt that the scientific model used to address nutrient deficiencies did not apply to chronic diseases such as heart disease and cancer.52 Scientists thirty years later express similar concerns, adding that “nutrient-based metrics [of current recommendations] are hampered by imprecise definitions and inconsistent usage,” and “few individuals can accurately gauge daily consumption of calories, fats, cholesterol, fiber or salt.”53 Nevertheless, current Guideline recommendations urge Americans to track food and calorie intake as a means of achieving a healthy diet.3

Furthermore, the DGA have institutionalized the idea that overweight and obese people are different from “normal”—establishing, as part of national dietary policy, the notion that they are less likely to accurately or honestly report on their own eating habits. The 2010 DGA indicate that, on the basis of national survey data, Americans do not seem to be consuming excessive amounts of calories. Thus the inexplicably high rates of obesity in America must be due to the fact that people who are overweight or obese lie about how much they eat: “[T]he numbers are difficult to interpret because survey respondents, especially individuals who are overweight or obese, often underreport dietary intake.”3

This moralistic approach to obesity and weight loss has contributed to extensive and unrecognized “collateral damage” in the form of fat-shaming, eating disorders, weight discrimination, and poor health from restrictive food habits. At the same time, researchers at the Centers for Disease Control have shown that overweight and obese people are often as healthy as their “normal” weight counterparts.54

Finally, the emphasis on plant-based nutrition and the demonization of animal-based foods is a culturally biased perspective. Although the 2010 DGA claim that the recommendations they contain “accommodate the varied food preferences, cultural traditions and customs of the many and diverse groups who live in the United States,”27 this is most certainly not the case. Animal products containing saturated fat are an important part of many food cultures: sausages of Eastern European and Chinese cuisine; ghee, the clarified butter of Indian cuisine; chorizo and eggs from Latin America; liver patés eaten by Jewish Americans; greens and fatback of Southern and soul food traditions.

As a dietitian, I was taught to respect the preferences of those who choose vegetarian or vegan diets. However, when it comes to animal products, dietitians, in accordance with the DGA, are encouraged to engage in “pork-shaming,” counseling people on how to eliminate, limit, or modify traditional foods in order to avoid saturated fat and cholesterol. As a dietitian, I found that people who were told to give up their traditional dishes, or to change them in ways that reduced saturated fat and cholesterol, were very likely to give up those dishes altogether; substitutions were not as good as the “real thing,” and for good reason. For example, in Southern U.S. cooking, salt pork cuts the bitter taste of greens and fatback provides a vehicle for flavor as well as for fat-soluble vitamins. Greens made with little or no fat may actually be less nutritious; certainly they are if people don’t eat them.


The first DGA, created in 1980 without a specific legislative mandate, began as a very simple twenty-page, one-column booklet directed at consumers. However, it became apparent in the decade following the release of the first DGA that obesity rates in America had increased, despite the fact that Americans were making alterations to their diets in line with its recommendations.55,56 In light of these circumstances, the DGA needed not only to explain the noted discrepancies between behavior and outcome, but also to attempt to prevent further negative changes in the health of Americans. In 1990, Congress passed a law indicating that the DGA should be reviewed and reissued every five years, emphasizing that: “Each such report shall contain nutritional and dietary information and guidelines for the general public, . . . and shall be based on the preponderance of the scientific and medical knowledge which is current at the time the report is prepared [emphasis mine].”57

However, the DGA have never been able to overcome their original shaky scientific foundations. They have grown in size, right along with the waistlines of Americans, but have failed to improve health outcomes. Over the years, the seven recommendations from the 1980 DGA became twenty-three complicated instructions to micromanage food components in the 2010 DGA. As a result, the DGA are considered too complex for consumers to use and are instead meant for policymakers and healthcare professionals, who “translate” the DGA for consumers.

Both the lack of science and the lack of simplicity that current DGA exhibit are violations of their legislative mandate. At the same time, the DGA have become a powerful and influential document that goes far beyond providing information to consumers. These recommendations shape all government dietary guidance, dictate nationwide nutrition standards, influence agricultural policies and nutrition research protocols, direct how food manufacturers target consumer demand, guide healthcare practices, and affect how the American public thinks about diet, weight, and health. They can be considered the most influential health-related pronouncements in the world.


The 2015 DGAC has made sustainability and environmental concerns part of its agenda, indicating that one of their goals is to “develop dietary guidance that supports human health and the health of the planet.”58 There is no mistaking the fact that protecting the environment and ensuring a sustainable food supply are important issues. In fact, they are far too important to be entrusted to a committee of nutrition scientists with little knowledge or expertise in the vast and complex interactions that make up the American agriculture and food production system. The American public has already been subject to the unintended effects of policy established by the USDA and DHHS without the support of sufficient evidence. The world simply cannot withstand the consequences if the DGA’s impact on the environment is similar to its impact on obesity and chronic disease.


In 1977, the Dietary Goals presented a single perspective on food and health to the public as if it were a commonsense approach to nutrition grounded firmly in science and applicable to all Americans. This was not the case. However, there is such an approach available to the leadership at USDA and DHHS. Dietary recommendations that focus on food-based guidance to assist Americans in acquiring adequate essential nutrition are based on solid, non-controversial science and are equally applicable to all Americans. Although scientific understanding of essential nutrition is not complete by any means, it is nevertheless supported by evidence that has stood the test of time with little controversy. All Americans require essential nutrition; without exception, inadequate intake results in diseases of deficiency. It is not necessary to eliminate, restrict or modify culturally traditional foods under the essential nutrition paradigm.

Focusing on essential nutrition is an approach that includes and celebrates a wide variety of food traditions. Such guidance would shift the focus of public health nutrition towards general health and wellness, and away from weight and other surrogate markers like cholesterol levels and blood pressure, leaving those areas of concern for the healthcare setting. Importantly, guidance that emphasizes adequate essential nutrition would be clear, concise, and useful to the general public. Contradictory messages about nutrition―unavoidable when most dietary guidance lacks a strong scientific basis because it simply echoes the DGA―have led to widespread general confusion and a lack of confidence in the science of nutrition.59 The proliferation of “food rules” stemming from DGA guidance has left many consumers frustrated by the feeling that the standards for “healthy eating” are unreachable, even as they strive to meet those standards.60 DGA recommendations based on adequate essential nutrition from wholesome, nourishing foods would not only provide the foundation for good health, they would finally provide what has been missing from the past thirty-five years of federal nutrition policy: dietary guidance that works―for all Americans.


1. U.S. Department of Agriculture, Center for Nutrition Policy and Promotion. 2010 Dietary Guidelines for Americans Backgrounder: History and Process [Internet]. 2011 [cited 2011 Jan 31]. Available from: Backgrounder.pdf
2. Kennedy E. United States Department of Agriculture Public Meeting [Internet]. Mar 10, 2000. Available from:
3. U.S. Department of Agriculture and U.S. Department of Health and Human Services. Dietary Guidelines for Americans, 2010 [Internet]. 7th ed. Washington, DC: U.S. Government Printing Office; 2011 [cited 2010 Jan 31]. Available from:
4. Taubes G. Good calories, bad calories: challenging the conventional wisdom on diet, weight control, and disease. New York: Knopf; 2007.
5. Teicholz N. The Big Fat Surprise: Why meat, butter, and cheese belong in a healthy diet. New York: Simon & Schuster; 2014.
6. National Heart, Lung, and Blood Institute. Morbidity and Mortality: 2007 Chart Book on Cardiovascular, Lung, and Blood Diseases [Internet]. Bethesda, MD: U.S. Department of Health and Human Services, National Institutes of Health; 2007 [cited 2011 Sep 24]. Available from: http://
7. Ogden CL, Carroll MD. Prevalence of overweight, obesity, and extreme obesity among adults: United States, trends 1976-1980 through 2007-2008. [Internet]. Hyattsville, MD: National Center for Health Statistics; 2010 Jun [cited 2011 Sep 1]. Available from: hestat/obesity_adult_07_08/obesity_adult_07_08.pdf
8. Levenstein H. Fear of Food: A history of why we worry about what we eat. Chicago: Univ Of Chicago Press; 2013.
9. Lappé FM. Diet for a Small Planet. 10th anniversary ed., completely rev. & updated. New York: Ballantine Books; 1982. 496 p.
10. Butz EL. An Emerging, Market-Oriented Food and Agricultural Policy. Public Adm Rev. 1976 Mar;36(2):137.
11. Pyle G. Raising less corn, more hell: the case for the independent farm and against industrial food.1st ed. New York: Public Affairs; 2005. 229 p.
12. Oppenheimer GM, Benrubi ID. McGovern’s Senate Select Committee on Nutrition and Human Needs Versus the Meat Industry on the Diet-Heart Question (1976–1977). Am J Public Health. 2013 Nov 14;104(1):59–69.
13. Austin JE, Hitt C. Nutrition intervention in the United States: cases and concepts. Cambridge, Mass: Ballinger Pub. Co; 1979. 387 p.
14. Hegsted M. Washington – Dietary Guidelines [Internet]. 1990 [cited 2011 Jan 24]. Available from:
15. Peretti J, Sahota M. The Men Who Made Us Fat. BBC Two; 2012.
16. Select Committee on Nutrition and Human Needs of the United States Senate. Dietary goals for the United States [Internet]. 2nd ed. Washington: U.S. Government Printing Office; 1977 [cited 2013 Aug 1]. Available from:
17. Select Committee on Nutrition and Human Needs, United States Senate. Dietary Goals for the United States: Supplemental Views. Washington, D.C.: U.S. Government Printing Office; 1977.
18. American Medical Association. Dietary goals for the United States: statement of The American Medical Association to the Select Committee on Nutrition and Human Needs, United States Senate. R I Med J. 1977 Dec;60(12):576–81.
19. Harper AE. Dietary goals-a skeptical view. Am J Clin Nutr. 1978 Feb;31(2):310–21.
20. Weil WB Jr. National dietary goals. Are they justified at this time? Am J Dis Child 1960. 1979 Apr;133(4):368–70.
21. Broad W. Jump in funding feeds research on nutrition. Science. 1979 Jun 8;204(4397):1060–1.
22. Jacobson NL. The Controversy over the Relationship of Animal Fats to Heart Disease. BioScience. 1974 Mar;24(3):141–8.
23. Smil V. Coronary Heart Disease, Diet, and Western Mortality. Popul Dev Rev. 1989 Sep;15(3):399. 24. Truswell AS. Some problems with Cochrane reviews of diet and chronic disease. Eur J Clin Nutr. 2005 Aug;59 Suppl 1:S150–4; discussion S195–6.
25. Multiple risk factor intervention trial. Risk factor changes and mortality results. Multiple Risk Factor Intervention Trial Research Group. JAMA 1982 Sep 24;248(12):1465–77.
26. The National Diet-Heart Study Final Report. Circulation. 1968 Mar;37(3 Suppl):I1–428.
27. Rosch PJ. Cholesterol does not cause coronary heart disease in contrast to stress. Scand Cardiovasc J. 2008 Jan 1;42(4):244–9.
28. Chowdhury R, Warnakula S, Kunutsor S, Crowe F, Ward HA, Johnson L, et al. Association of Dietary, Circulating, and Supplement Fatty Acids With Coronary Risk. Ann Intern Med. 2014 Mar 18;160(6):398–407.
29. Siri-Tarino PW, Sun Q, Hu FB, Krauss RM. Saturated fat, carbohydrate, and cardiovascular disease. Am J Clin Nutr. 2010 Mar 1;91(3):502–9.
30. Hooper L, Summerbell CD, Thompson R, Sills D, Roberts FG, Moore H, et al. Reduced or modified dietary fat for preventing cardiovascular disease. In: The Cochrane Collaboration, Hooper L, editors. Cochrane Database of Systematic Reviews [Internet]. Chichester, UK: John Wiley & Sons, Ltd; 2011 [cited 2013 May 29]. Available from:
31. Eckel RH, Kris-Etherton P, Lichtenstein AH, Wylie-Rosett J, Groom A, Stitzel KF, et al. Americans’ Awareness, Knowledge, and Behaviors Regarding Fats: 2006-2007. J Am Diet Assoc. 2009 Feb;109(2):288–96.
32. Taubes G. What if It’s All Been a Big Fat Lie? The New York Times [Internet]. 2002 Jul 7 [cited 2014 Oct 3]; Available from:
33. Austin JE, Quelch JA. US national dietary goals: Food industry threat or opportunity? Food Policy. 1979 May;4(2):115–28.
34. Sexton R. Market Consolidation Poses Challenges for Food Industry. Calif Agric. 2002 Oct;56(5):146.
35. Wilkins JL. Eating Right Here: Moving from Consumer to Food Citizen. Agric Hum Values. 2005 Sep 1;22(3):269–73.
36. Wright J, Kennedy-Stephenson J, Wang C, McDowell M, Johnson C. Trends in Intake of Energy and Macronutrients —- United States, 1971—2000. Morb Mortal Wkly Rep. 2004 Feb 6;53(4):80–2.
37. Schleifer D. The perfect solution. How trans fats became the healthy replacement for saturated fats. Technol Cult. 2012 Jan;53(1):94–119.
38. Peretti J. Why our food is making us fat [Internet]. The Guardian. [cited 2014 Dec 5]. Available from:
39. Dietary Guidelines Advisory Committee. Report of the Dietary Guidelines Advisory Committee on the Dietary Guidelines for Americans, 2000 [Internet]. Washington, D.C.: U.S. Department of Agriculture and U.S. Department of Health and Human Services; 2000 Feb [cited 2012 Apr 12]. Available from:
40. Jemal A, Murray T, Ward E, Samuels A, Tiwari RC, Ghafoor A, et al. Cancer Statistics, 2005. CA Cancer J Clin. 2005;55(1):10–30.
41. Centers for Disease Control and Prevention. Number (In Millions) of Civilian, Noninstitutionalized Persons with Diagnosed Diabetes, United States, 1980-2011 [Internet]. National Center for Chronic Disease Prevention and Health Promotion, Division of Diabetes Translation; [cited 2013 Apr 12]. Available from:
42. Zamora D, Gordon-Larsen P, Jacobs DR Jr, Popkin BM. Diet quality and weight gain among black and white young adults: the Coronary Artery Risk Development in Young Adults (CARDIA) Study (1985-2005). Am J Clin Nutr. 2010 Oct;92(4):784–93.
43. Ben-Shalom Y, Fox MK, Newby PK. Characteristics and Dietary Patterns of Healthy and Less- Healthy Eaters in the Low-Income Population. U.S. Department of Agriculture, Food and Nutrition Service, Office of Research and Analysis; 2012 Feb.
44. Lindquist CH, Gower BA, Goran MI. Role of dietary factors in ethnic differences in early risk of cardiovascular disease and type 2 diabetes. Am J Clin Nutr. 2000 Mar;71(3):725–32.
45. Kurian AK, Cardarelli KM. Racial and ethnic differences in cardiovascular disease risk factors: a systematic review. Ethn Dis. 2007;17(1):143–52.
46. McNutt K. Dietary Advice to the Public: 1957 to 1980. Nutr Rev. 1980 Oct;38(10):353–60. 47. Jalso SB, Burns MM, Rivers JM. Nutritional beliefs and practices. J Am Diet Assoc. 1965 Oct;47(4):263–8.
48. LeBovit C, Cofer E, Murray J, Clark F. Dietary Evaluation of Food Used in Households in the United States. Household Economic Research Division, Agricultural Research Service, U.S. Department of Agriculture; 1961. Report No.: 16.
49. Maillot M, Monsivais P, Drewnowski A. Food pattern modeling shows that the 2010 Dietary Guidelines for sodium and potassium cannot be met simultaneously. Nutr Res N Y N. 2013 Mar;33(3):188–94.
50. Obarzanek E, Hunsberger SA, Van Horn L, Hartmuller VV, Barton BA, Stevens VJ, et al. Safety of a fat-reduced diet: the Dietary Intervention Study in Children (DISC). Pediatrics. 1997 Jul;100(1):51–9.
51. Zeisel SH, da Costa K-A. Choline: An Essential Nutrient for Public Health. Nutr Rev. 2009 Nov;67(11):615–23.
52. Harper A. Killer French Fries. Sciences. 1988;28:21–7. 53. Mozaffarian D, Ludwig DS. Dietary guidelines in the 21st century—a time for food. JAMA. 2010 Aug 11;304(6):681–2.
54. Flegal KM, Kit BK, Orpana H, Graubard BI. Association of all-cause mortality with overweight and obesity using standard body mass index categories: A systematic review and meta-analysis. JAMA. 2013 Jan 2;309(1):71–82.
55. Kuczmarski RJ, Flegal KM, Campbell SM, Johnson CL. Increasing prevalence of overweight among us adults: The national health and nutrition examination surveys, 1960 to 1991. JAMA. 1994 Jul 20;272(3):205–11.
56. Nestle M, Porter DV. Evolution of federal dietary guidance policy: from food adequacy to chronic disease prevention. Caduceus Spring. 1990;6(2):43–67.
57. 101st Congress. National Nutrition Monitoring and Related Research Act of 1990. Sect. 301, 101- 445 Oct 22, 1990.
58. Nelson M, Abrams S, Brenna T, Hu F, Millen B. Subcommittee 5: Food Sustainability and Food Safety. 2015 Dietary Guidelines Advisory Committee; 2014 Jan.
59. Nagler RH. Steady diet of confusion: Contradictory nutrition messages in the public information environment. Diss Available ProQuest. 2010 Jan 1;1–301.
60. Brenton J. In Pursuit of Health: Mothers, Children, and the Negotiation of an Elusive Ideal [Internet]. [Raleigh, North Caroline]: North Carolina State University; 2014. Available from:

Props from one of my PGA clients, Jason Dufner Posted on January 22, 2015, 0 Comments

"I talked to a pretty specialized guy in Atlanta, his name is Andrew Johnston, a friend of mine had had some chronic back issues and I got with Andrew in Atlanta and Andrew's kind of a holistic guy, he does the whole thing, diet, PT, working out, he does the holistic approach to your health.
"I kind of specified that I was interested in what he had to offer as far as eating better and the diet.  My friend Lane Savoy had great results with him changing his diet.  With his back he had some really chronic back issues, so I gave it a go and feeling pretty good about it." --Jason Dufner

The mad scientist that put all of this together is actually a young personal trainer in Atlanta who does something called Triumph Training and had worked with a friend of Dufner’s.

...but consultation with Dr. Andrew Johnston in Atlanta led to an approach to reduce inflammation.

And for the record, I'm not a doctor, nor do I play one on TV.

A Question about Constipation Posted on January 02, 2014, 1 Comment

Thanks for writing.

Constipation can have several etiologies, but here's what I'd consider the most helpful in restoring your health in full:

1) Thyroid function.  Thyroid health has an impact on all of the body's systems, including elimination and detoxification.  Nutrition is a key factor in thyroid function, and many politically correct diet recommendations adversely impact the health of this critical organ.  Specifically:
--PUFAs (Polyunsaturated Fatty Acids): vegetable oils are the prime culprits here.  They are pro-inflammatory, down regulate the thyroid, and actually inhibit immunity.
--most processed foods will use PUFAs as they are cheap alternatives to better quality ingredients.
--Cruciferous veggies when eaten raw (e.g. broccoli, cauliflower, cabbage, etc.).  They act as goitrogens and actually down regulate the thyroid.  Cook them well and eat them with a saturated fat (an animal product or even coconut oil--the latter of which is pro-thyroid and has anti-bacterial and anti-viral properties).
2) Nutrition
--anything which you may be intolerant of (gluten, for example) can cause inflammation and all the resulting dysfunctions
--many of the foods touted as being high in fiber (e.g. beans, grains, green leafy veggies) are not optimal for human digestion (we're omnivores, not ruminant herbivores--thus, we have a difficult time breaking down these foods and utilizing them effectively).  Anytime digestion is impaired, fueling suffers while inflammation increases.
--Gums (locust bean, xanthan, etc.) are particularly bad and will literally gum up the intestines.
3) Hydration.  You need enough but not too much.  A good rule of thumb is half your body weight in pounds in ounces of water each day (e.g. 150 lbs = 75 oz of water).  But this can easily dilute electrolyte status, which is really what hydration is predicated upon, so I recommend adding a bit of salt (sea or pickling, with no anti-caking agents) to everything I drink.  Not only is this pro-thyroid, it also down regulates the production of aldosterone--a stress hormone.  And anytime you see stress, think inflammation/dysfunction on some level and extra demand on the body's resources.
4) Movement.  It should be full body.  Think of movements which take you into and out of the fetal position.  Squats would be a good example.  But swimming could work, too.  And running or even walking is great for getting things moving.  At the very least, bouncing up and down (on a mini trampoline, for example) would help with lymphatic drainage and promote peristalsis.  A word of caution here: exercise (especially cardio as typically performed) in excess of your current training status can easily down regulate the thyroid, so I might cap duration at 45 minutes.
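The hydration rule of thumb in point 3 above boils down to a one-line calculation. For anyone who likes to see it spelled out, here's a minimal sketch (the function name and code are my own illustration, not part of the original advice):

```python
def daily_water_oz(body_weight_lbs: float) -> float:
    """Rule of thumb: drink half your body weight (in lbs) in ounces of water per day."""
    return body_weight_lbs / 2

# e.g. a 150 lb person would aim for 75.0 oz of water per day
print(daily_water_oz(150))
```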

Additional strategies would include exposure to sunlight or at least bright (250+ watt) incandescent lighting to stimulate the mitochondria; adequate protein intake (preferably from animal sources, with liberal use of gelatin/bone broth, which has minimal tryptophan and can be used to balance the amino acid profile so that it doesn't perpetuate inflammation or a sluggish thyroid); enough dietary carbohydrate (fruit and below-ground veggies being the best choices); minding natural circadian rhythms; proper breathing mechanics; and awareness of your thinking and how each thought/idea/belief impacts your physiology secondary to activation of specific parts of the autonomic nervous system.  There are lots of other possible ideas, but the above should be more than enough to get you moving in the right direction (pun intended).  More info can be found in my book.  And I'm working as quickly as my schedule will allow on my next book, which will explore these subjects in even greater detail.

Good luck and know that health is your birthright. 
Go claim it.
Much Chi

Feedback from a Client who is Free at Last Posted on August 29, 2013, 0 Comments

I just had to write to you and tell you how great I am feeling.  I am trying to not be dramatic when I say that going GF is changing my life.  I feel SO much clearer headed and have an outrageous amount of energy.  I can't stop cleaning and organizing things.  This has NEVER been my idea of fun.  I feel almost as if I have had an awakening.  So excited to see what happens next! Just thought you would like to know. :)

A Woman Seeking Advice about the Progression of Osteopenia Posted on August 11, 2013, 0 Comments

Off the top of my head:

--fluoride is critical to minimize/eliminate.  It's in water (unless you use a reverse osmosis filter), toothpaste (of course--though you can easily find alternatives), tea (even organic ones), and anything packaged/canned/made with water.
--Here's some info about gluten and osteoporosis--
--Estrogen stunts growth, including bone growth.  It does so in a number of ways, including increasing prolactin, which accelerates bone loss.  I hope you're not using it.  Avoiding xenoestrogens will help (see the list attached).  And supplementing with progesterone (via Progest E Complex) would likely serve you well and help keep you from being estrogen dominant.
--Serotonin is problematic for bone mass, too, and can be increased by anything which irritates the intestines (where 90% of it is produced and triggers peristalsis).  SSRIs of any sort should be suspect.  Serotonin stimulates osteoprotegerin (just like prolactin does), reducing bone resorption, along with PTH and cortisol--both of which remove calcium from bone.
--decreasing the production of nocturnal stress hormones (night is when most bone loss occurs) would be a good strategy.  Blood sugar maintenance is one component you can easily manipulate which falls under this heading. 
--Anything pro-thyroid (coconut oil, sunlight, salt, etc.) is beneficial for bone health (and health in general).  As such, anything which inhibits the thyroid (PUFAs being at the top of the list, but there are MANY more) should be minimized.
--Zinc is an important co-factor in the stimulation of bone building osteoblasts, even helping to stimulate the production of new osteoblasts. On the other hand, zinc suppresses the excessive activity of osteoclasts, cells that are responsible for bone resorption, demineralization and ultimately bone loss. Zinc helps to regulate the key inflammatory gene signal in bone marrow, NF-kappaB, which is required for optimal balancing of osteoblast and osteoclast formation and function.  See the link below:
--zinc and osteoporosis--
--Copper is essential for both formation of bone and maintenance of bone structure
--Also, diets high in garlic and related vegetables such as onions and leeks have been shown to reduce the risk of developing osteoporosis.  And K2 (not K1, which comes from green, leafy vegetables) has been shown to increase bone density in people already diagnosed with osteoporosis.  It does so by blocking the removal of calcium from bone caused by parathyroid hormone.  Add D3, too, like we talked about.
--The Calcium Lie by Robert Thompson, MD might be an interesting read for you.
--oral bisphosphonates and femur fractures--
--and some info about oral bisphosphonates and cancer--
--It’s important to realize that these types of drugs do NOT build any new bone. Rather, they are metabolic poisons that kill off your osteoclasts, which halts the normal bone repair process since you now lack the cells that break bone down.  Your bones will indeed get denser.  However, denser bones are NOT stronger, which is the part they don’t tell you.  Eventually your bones become weaker and more prone to fracture.  In women who have been taking a bisphosphonate-type drug for five or more years, the bones have literally lost the ability to regenerate, and this is why many may be faced with more brittle bones and fractures.

LOTS of info, I know, so ask if you have questions or want to dial in a strategy. 
But all this should at least get you (and your doc?) thinking.
Much Chi


Why Grass Fed is more Expensive than Conventional Posted on June 14, 2013, 0 Comments

1--Government subsidies of corn and soy (neither of which a cow/sheep/goat/etc. is designed to eat), along with other waste products which the USDA says are lawful--though not necessarily healthy--to put in feed.

2--Grass fed animals require more room to move and graze and give back to the earth.

3--Grass fed animals require more time to mature and gain weight since they're eating the way Nature intended for them to eat.

4--Modern cattle are bred to be bigger and produce larger quantities of milk (which often causes mastitis leading to overuse of antibiotics).

5--Conventional milk is often split into various products, with the original cream removed and sold back to people separately since "whole milk is bad for you."

6--Pasteurization allows for longer shelf life yet at the cost of destroying all the enzymes necessary to make milk a healthy food.

7--Grass fed animals produce milk which has more CLA, vitamin A, vitamin D, and vitamin K2, along with all the naturally occurring enzymes and other heat-sensitive nutrition.


In Summary: Healthy Cows Produce Healthy Products which keep People Healthy


I was wondering what is a good way to start a gluten free diet? Do u stop cold turkey? What is a good gluten free shopping list? Posted on May 26, 2012, 0 Comments

from A.Headley


Thanks for writing, Alicia.
A good way to start a gluten free diet is to begin minimizing the various ways it appears in your diet: breads, cereals, pastas, baked goods, etc.  But since as little as one gram of gluten can create an immune response in the body, you really have to avoid it completely to benefit fully.  I suggest a gradual decrease in the amount of gluten in your diet, perhaps replacing some of your favorite/problem foods with gluten free alternatives (e.g. rice pasta, millet breads, etc.--though these are often laden with "gums" and other additives you cannot digest).

Ultimately, however, one of the goals of a gluten free diet is to minimize your dependency on man-made carbohydrate.  For example, have your spaghetti sauce over spaghetti squash--it's gluten free, tastes great, and actually delivers nutrition instead of just providing calories. is a great resource for you as you evolve into a gluten free diet.  And give yourself a solid two weeks of absolutely no gluten, preferably two months, as that's how long it takes the gut wall to begin to heal.  So be careful with ingredients, as some hidden sources of gluten include:

–hydrolyzed or textured protein
–beef or dairy from cows fed grains along with chickens on a grain fed diet
–most soy sauces
–vinegar (unless it specifically states wine vinegar or balsamic vinegar, etc)
–battered or fried anything
--alcohol made from grains

As for my shopping list, this is what a typical week looks like:

Spaghetti Squash
Sweet Potatoes


Dried Cranberries
Dried Pineapple
Frozen Fruit (for smoothies!)

Fruit Roll Up Bars (for my son)
DARK chocolate (with no soy lecithin)
Ice Cream
Raw Cocoa Powder
Coconut Oil

Tomato Sauce and Stewed Tomatoes in glass (for Spaghetti sauce over the spaghetti squash)
I wish WF carried stewed tomatoes in glass since I know the cans leach BPA, so I usually make my own or get a tomato sauce out of a glass container.
Corn Tortillas (for Fish Tacos tonight)
Goat’s Milk Yogurt
Goat’s Milk (raw from a local source)

Low fat fish (cod, halibut, etc)
Eggs (locally/privately sourced and AWESOME!)
Pulled Pork
Cheese of all sorts, mostly unpasteurized and often from goat/sheep
Oxtail (for Gelatin)

I hope that helps, Alicia. Good luck, and I know you can do this. Not one of my clients who has decided to go gluten free has found it to be difficult, especially after a couple of weeks. And all of them have noted some positive benefit, even if they didn't think they had a problem with gluten: better skin, clearer thinking, improved bowel habits, weight loss, more energy, etc. And since gluten-containing grains contain phytic acid, which inhibits the absorption of zinc (which is necessary for you to taste the sweetness of a food), the desire for sweets and other desserts usually lessens. And you take control of your diet instead of your diet controlling you!

I'm glad to hear you're taking responsibility for yourself--that's the true definition of health.
Much chi

Q: A lot of us have switched to whole wheat products because we’ve been told complex carbohydrates are heart healthy and good for us. Are you saying that’s not true? Posted on October 17, 2011, 0 Comments

A: The research that indicates whole grains are healthy is all conducted the same way: white flour is replaced with whole wheat flour, which, no question, is better for you. But taking something bad and replacing it with something less bad is not the same as research that directly compares what happens to health and weight when you eliminate wheat altogether. There’s a presumption that consuming a whole bunch of the less bad thing must be good for you, and that’s just flawed logic. An analogy would be to say that filtered cigarettes are less bad for you than unfiltered cigarettes, and therefore, a whole bunch of filtered cigarettes is good for you. It makes no sense. But that is the rationale for increasing our consumption of whole grains, and that combined with the changes in wheat itself is a recipe for creating a lot of fat and unhealthy people.

--Dr. William Davis, author of Wheat Belly

Preschool nutrition advice Posted on July 08, 2011, 0 Comments

Here's a quote from one of my son's favorite bedtime stories.

"Charlie the Chick eats lots of Barley.
That's why he has such a big fat TUMMY!!!!"

My son is only 4 years old. Yet he's reading books that speak directly against a lot of the "health" advice most people take as god given fact. But let me ask you a question: How do you spell god?

What? It's G-O-D? And all this time I thought it was U-S-D-A.

That must mean the USDA's recommendation to eat 6-11 servings of grains isn't gospel? Then how'd that idea start? Luise Light, the creator of the Food Pyramid which was introduced to Americans in 1992, says that her original pyramid was corrupted. Instead of fruits and vegetables making up the base of the diet, cereals and wheat products were pushed. Her recommendation for starchy foods was only 2-4 servings/day.

She goes on to say, quote “…the health consequences of encouraging the public to eat so much refined grain, which the body processes like sugar, was frightening! But our exhortations to the political heads of the agency fell on deaf ears. The new food guide, replacing the ‘Basic Four,’ would be a promotional tool to get the public to buy and consume more calories, sugar and starch.”

Thank you Grain Lobby for your role in the sickening of America!

In previous posts you've read how the Standard American Diet (also known as S.A.D.) contributes to everything from obesity to Type 2 Diabetes. Now let me address how grains specifically create unhealthy bodies.

See, every living thing in Nature wants to survive, including plants. While animals rely on speed or armor or camouflage for protection, plants generally rely on chemical defenses. For example, the cocoa plant is naturally high in caffeine, and caffeine inhibits short term memory. So when an animal eats the cocoa plant, it often can't remember where it got that last meal.

The defense for grains is a substance called phytic acid. Phytates inhibit the absorption of calcium, magnesium, iron, and zinc. You need calcium for your heart to function, magnesium for carbohydrate metabolism, iron to transport oxygen, and zinc to procreate. Now I don't know about you, but I kinda like it when my heart beats and I'm able to breathe. And while I might be able to live without procreating...,practicing is a heck of a lot of fun, you gotta admit.

What's more, Biohealth Diagnostics estimates that 60% of Caucasians have gluten sensitivity. Many experts out there will tell you it's as high as 90% of white-skinned people, with other races close behind. When a person is gluten intolerant, ingesting any grain other than rice, buckwheat, millet, or corn will inflame the gut wall. This micro-trauma to the intestine causes tiny holes to form, allowing food particles to pass into the bloodstream undigested. The body then creates antibodies to that particular food, and you have an immune response to whatever you're eating. This means that the brain fog, the lethargy, the bad skin, and any other symptom you're having due to gluten consumption, you'll now have with almost any food in your diet.

Additionally, this constant inflammation causes what's termed villous atrophy. Lining the wall of your intestines, you have little finger-like projections called villi. These, in turn, are covered by tiny microvilli--about 200 million per square millimeter. The job of the microvilli is to help you assimilate nutrition from your food by producing various enzymes. Unfortunately, with villous atrophy, the gut wall gets blasted and the intestine ends up looking more bare than Old Mother Hubbard's cupboard. Less surface area = fewer microvilli = less nutrient absorption. This leads people down the road of sickness and obesity as they're forced to eat more to maintain nutrient status.

Now, many of my clients will tell me they don't have a problem with gluten. I guess they think it's normal to fart every 3rd step or something. And I'll admit it's often difficult to figure out the true etiology of the symptoms, as average retention time in the body is 56-72 hours. This means you can eat something with gluten in it on Monday and not have a reaction until Thursday. Thus, it can be hard to correlate cause with effect. My suggestion is to cut out all gluten from the diet for at least 2 weeks and see how you feel. Most people will have a very hard time with this approach, as they are literally addicted to gluten. Or more specifically, they're addicted to the dopamine created to counter the pain of an inflamed intestinal wall. But if they can last a good two weeks, they'll often notice a marked increase in energy and vitality.

So in conclusion, if you want to be healthy, don't follow the herd. In fact, move 180 degrees opposite of everyone else. Swim against the stream, people! Indeed, my advice: GO...against the grain!

JKeating Posted on January 31, 2011, 0 Comments

31 days with gluten gone--a New Year's resolution which has led to a drop in weight, a deflated spare tire, and health benefits he won't even recognize for some time...

JHoward Posted on January 31, 2011, 0 Comments

Gluten's gone

CCarlisi Posted on January 31, 2011, 0 Comments

After several "experts" told her the answer could be found with a scalpel and exploratory surgery, this client decided she'd give up gluten and get her health back. 20+ years of intestinal pain and inflammation have since become a distant memory...

RWeller Posted on January 31, 2011, 0 Comments

11 months gluten free (after being urged to give it up for over 2 years...) and 25+ lbs lost, several dress sizes down, medication minimized, and more.

Going Against the Grain Posted on January 31, 2011, 1 Comment

An appropriate title for a category of posts highlighting the achievements of my clientele who have given up gluten, don't you think? And I want these entries to outnumber the ones in my 365 category. So if you want to see your name in the hallowed halls of the Triumph Training Blog; or if you have a "gut" sensation that gluten in your diet is keeping your health from manifesting in all its glory and you've decided to make a change, let me know.

365 Ways #310–to B12 or not to B12 Posted on December 01, 2010, 0 Comments

#310--B12 is a vitamin essential for cellular energy. It is found only in animal foods, and its release is predicated on HCL (hydrochloric acid) production. Thus, older people, who typically see a decline in the ability to produce HCL for a variety of reasons including dehydration, will often be deficient in this critical nutrient. Interestingly enough, poor B12 status can mimic senility. That's not surprising when one considers that this nutrient is responsible for promoting normal nerve growth by maintaining the fatty sheaths which cover the nerve endings. So eating quality meat, and maintaining HCL production at adequate levels, may be one of the key strategies in maintaining cognitive functions as we age. Fire up the grill, Grandpa!