
People Die Faster After Eating These Foods

By Dr. Joseph Mercola | February 27, 2019*

The struggle with weight gain and obesity is a common and costly health issue, raising the risk of heart disease, Type 2 diabetes and cancer, to name just a few of the associated conditions.

According to CDC figures for 2017-2018, 19.3% of American children1 and 42.4% of adults2 are now obese, not just overweight. That’s a significant increase over the rates for 1999-2000, when just under 16% of children ages 6 to 19,3 and 30.5% of adults, were obese.

Research has linked growing waistlines to a number of different causes, including processed foods, sodas and high-carbohydrate diets. In aging adults, belly fat is also associated with an elevated risk of cardiovascular disease and cancer.4

Researchers have actually predicted obesity will overtake smoking as a leading cause of cancer deaths,5 and recent statistics suggest we’re well on our way to seeing that prediction come true as obesity among our youth is triggering a steep rise in obesity-related cancers at ever-younger ages.

Millennials More Prone to Obesity-Related Cancers

As obesity rates rise, so do related health problems, including cancer. According to a report6 published in 2014 on the global cancer burden, obesity is already responsible for an estimated 500,000 cancer cases worldwide each year, and that number is likely to rise further in coming decades.

As reported in a Lancet study7 by the American Cancer Society, rates of obesity-related cancers are rising at a far steeper rate among millennials than among baby boomers. According to the authors,8 this is the first study to systematically examine obesity-related cancer trends among young Americans.

What’s more, while six of 12 obesity-related cancers (colorectal, endometrial, gallbladder, kidney, multiple myeloma and pancreatic cancer) are on the rise, only two of 18 cancers unrelated to obesity are increasing. As noted in the press release:9

“The obesity epidemic over the past 40 years has led to younger generations experiencing an earlier and longer lasting exposure to excess adiposity over their lifetime than previous generations.

Excess body weight is a known carcinogen, associated with more than a dozen cancers and suspected in several more … Investigators led by Hyuna Sung, Ph.D., analyzed 20 years of incidence data (1995-2014) for 30 cancers … covering 67 percent of the population of the U.S. …

Incidence increased for 6 of the 12 obesity-related cancers … in young adults and in successively younger birth cohorts in a stepwise manner. For example, the risk of colorectal, uterine corpus [endometrial], pancreas and gallbladder cancers in millennials is about double the rate baby boomers had at the same age …

‘Although the absolute risk of these cancers is small in younger adults, these findings have important public health implications,’ said Ahmedin Jemal, D.V.M., Ph.D., scientific vice president of surveillance [and] health services research and senior/corresponding author of the paper.

‘Given the large increase in the prevalence of overweight and obesity among young people and increasing risks of obesity-related cancers in contemporary birth cohorts, the future burden of these cancers could worsen as younger cohorts age, potentially halting or reversing the progress achieved in reducing cancer mortality over the past several decades.

Cancer trends in young adults often serve as a sentinel for the future disease burden in older adults, among whom most cancer occurs.'”

Changes in Diet Are Driving the Obesity Epidemic

Studies10,11,12 have repeatedly demonstrated that when people switch from a traditional whole food diet to processed foods (which are high in refined flour, processed sugar and harmful vegetable oils), disease inevitably follows.

Below are just a few telling statistics. For more, see nutrition researcher Kris Gunnars’ 11 graphs published in Business Insider showing “what’s wrong with the modern diet.”13

  • Over the past 200 years, our sugar intake has risen from 2 pounds to 152 pounds per year.14 While Americans are advised to get only 10% of their calories from sugar,15 equating to about 13 teaspoons a day for a 2,000-calorie diet (see the quick arithmetic after this list), the average intake is 42.5 teaspoons per day.16 It’s important to realize that staying within the recommended limit is nearly impossible on a processed food diet.
  • Not only that, you can’t exercise off the excess calories. For example, to burn off the calories in a single 12-ounce soda, you’d have to walk briskly for 35 minutes. To burn off a piece of apple pie, you’d be looking at a 75-minute walk.17
  • Soda and fruit juice consumption is particularly harmful, studies18,19 show, raising a child’s risk of obesity by 60% per daily serving.20 Research has also shown refined high-carb diets in general are as risky as smoking, increasing your risk for lung cancer by as much as 49%.21
  • Between 1970 and 2009, daily calorie intake rose by an average of 425 calories, a 20% increase, according to Stephan Guyenet, Ph.D.,22 who studies the neuroscience of obesity. This rise is largely driven by increased sugar and processed food consumption, and the routine advertising of junk food to children.23
  • To attract customers and compete with other restaurants, companies often add salt, sugar, fat and flavor chemicals to trigger your appetite. Unfortunately, it turns out the additives and chemicals used in processing kill off beneficial gut bacteria, which further exacerbates the problems created by a processed food diet.24
  • According to epidemiology professor Tim Spector, even a relatively small number of highly processed ingredients is toxic to your gut microbiome, whose beneficial bacteria start to die off just days into a fast-food-heavy diet. This suggests excess calories from fast food may not be the only factor to blame for rising weight.
  • Processed vegetable oils, which are high in damaged omega-6 fats, are another important factor in chronic ill health. Aside from sugar, vegetable oils are a staple in processed foods, which is yet another reason why processed food diets are associated with higher rates of heart disease and other diseases.
  • Soybean oil is the most commonly consumed fat in the U.S.25
  • “Ultraprocessed diets cause excess calorie intake and weight gain,” research26 concludes, showing that when people are allowed to eat as much as they want of either ultraprocessed foods or unprocessed food, their energy intake is far greater when eating processed fare.
  • In just two weeks, participants gained between 0.3 and 0.8 kilos (0.66 and 1.76 pounds) on the ultraprocessed diet, and lost 0.3 to 1.1 kilos (0.66 to 2.42 pounds) when eating unprocessed food.
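
As for the quick arithmetic promised above: using the common approximations of 4 calories per gram of sugar and about 4 grams of sugar per teaspoon, 10% of a 2,000-calorie diet is 200 calories, and 200 ÷ 4 ÷ 4 ≈ 12.5, or roughly 13 teaspoons a day. By the same math, the average intake of 42.5 teaspoons works out to 42.5 × 4 × 4 = 680 calories, about a third of a 2,000-calorie diet.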

As These Foods Became the Norm, so Did Chronic Illness

Unfortunately, Americans not only eat a preponderance of processed food, but 60% of it is ultraprocessed27 — products at the far end of the “significantly altered” spectrum, or what you could typically purchase at a gas station.

The developed world in general eats significant amounts of processed food, and disease statistics reveal the inherent folly of this trend. There’s really no doubt that decreasing your sugar consumption is at the top of the list if you’re overweight, insulin resistant, or struggle with any chronic disease.

It’s been estimated that as much as 40% of American health care expenditures are for diseases directly related to the overconsumption of sugar.28 In the U.S., more than $1 trillion is spent on treating sugar and junk food-related diseases each year.29

Any foods that aren’t whole foods directly from the vine, ground, bush or tree are considered processed. Depending on the amount of change the food undergoes, processing may be minimal or significant. For instance, frozen fruit is usually minimally processed, while pizza, soda, chips and microwave meals are ultraprocessed foods.

The difference in the amount of sugar between foods that are ultraprocessed and minimally processed is dramatic. Research30 has demonstrated that over 21% of the calories in ultraprocessed foods come from sugar, while unprocessed foods contain no refined or added sugar.

In a cross-sectional study31 using data from the National Health and Nutrition Examination Survey of over 9,000 participants, researchers concluded that “decreasing the consumption of ultraprocessed foods could be an effective way of reducing the excessive intake of added sugars in the USA.”

Definition of Ultraprocessed Food

As a general rule, ultraprocessed foods can be defined as food products containing one or more of the following:

  • Ingredients that are not traditionally used in cooking.
  • Unnaturally high amounts of sugar, salt, processed industrial oils and unhealthy fats.
  • Artificial flavors, colors, sweeteners and other additives that imitate sensorial qualities of unprocessed or minimally processed foods (examples include additives that create textures and pleasing mouth-feel).
  • Processing aids such as carbonating, firming, bulking, antibulking, defoaming, anticaking, glazing agents, emulsifiers, sequestrants and humectants.
  • Preservatives and chemicals that impart an unnaturally long shelf-life.
  • Genetically engineered ingredients, which in addition to carrying potential health risks also tend to be heavily contaminated with toxic herbicides such as glyphosate, 2,4-D and dicamba.

As described in the NOVA classification of food processing,32 “A multitude of sequences of processes is used to combine the usually many ingredients and to create the final product (hence ‘ultraprocessed’).” Examples include hydrogenation, hydrolysation, extrusion, molding and preprocessing for frying.

Ultraprocessed foods also tend to be far more addictive than other foods, thanks to high amounts of sugar (a substance shown to be more addictive than cocaine33), salt and fat. The processed food industry has also developed “craveability” into an art form. Nothing is left to chance, and by making their foods addictive, manufacturers ensure repeat sales.

Processed Food Diet Linked to Early Death

In related news, recent research34 involving more than 44,000 people followed for seven years warns that ultraprocessed foods raise your risk of early death. The French team looked at how much of each person’s diet was made up of ultraprocessed foods, and found that for each 10% increase in the amount of ultraprocessed food consumed, the risk of death rose by 14%.

This link remained even after taking confounding factors such as smoking, obesity and low educational background into account. As you’d expect, the primary factors driving the increased death rate were chronic diseases such as heart disease and cancer.
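
To see what that dose-response implies, consider a simplified illustration (assuming the reported 14% applies multiplicatively to each successive 10% increment, as such risk figures are commonly modeled): a diet containing 30 percentage points more ultraprocessed food than another would correspond to roughly 1.14 × 1.14 × 1.14 ≈ 1.48 times the risk, or about 48% higher, over the follow-up period.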

Nita Forouhi, a professor at the MRC Epidemiology Unit at the University of Cambridge, who was not part of the study, told The Guardian:35

“The case against highly processed foods is mounting up, with this study adding importantly to a growing body of evidence on the health harms of ultraprocessed foods … [W]e would ignore these findings at public health’s peril.

A vital takeaway message is that consumption of highly processed foods reflects social inequalities — they are consumed disproportionately more by individuals with lower incomes or education levels, or those living alone.

Such foods are attractive because they tend to be cheaper, are highly palatable due to high sugar, salt and saturated fat content, are widely available, highly marketed, ready to eat, and their use-by dates are lengthy, so they last longer. More needs to be done to address these inequalities.”

Ultraprocessed Foods Linked to Cancer

Another French study36,37 published last year also found that those who eat more ultraprocessed food have higher rates of obesity, heart problems, diabetes and cancer. Nearly 105,000 study participants, a majority of whom were middle-aged women, were followed for an average of five years.

On average, 18% of their diet was ultraprocessed, and the results showed that each 10% increase in ultraprocessed food raised the cancer rate by 12%, which worked out to nine additional cancer cases per 10,000 people per year.

The risk of breast cancer specifically went up by 11% for every 10% increase in ultraprocessed food. Sugary drinks, fatty foods and sauces were most strongly associated with cancer in general, while sugary foods had the strongest correlation to breast cancer.
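
For perspective, the study’s two figures can be cross-checked against each other: if nine additional cancer cases per 10,000 people per year represent a 12% relative increase, the implied baseline rate in this cohort is roughly 9 ÷ 0.12 = 75 cases per 10,000 people per year (a rough inference from the numbers given here, not a figure reported by the study itself).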

According to the authors, “These results suggest that the rapidly increasing consumption of ultraprocessed foods may drive an increasing burden of cancer in the next decades.” Study co-author Mathilde Touvier told CNN:38

“It was quite surprising, the strength of the results. They were really strongly associated, and we did many sensitivity analyses and adjusted the findings for many cofactors, and still, the results here were quite concerning.”

Diet Is a Key Factor Determining Your Health and Longevity

Research39 published in 2017 linked poor diet to an increased risk of cardiometabolic mortality (death resulting from Type 2 diabetes, heart disease and stroke).

According to the authors, suboptimal intake of key foods such as fruits, vegetables, nuts and seeds, and animal-based omega-3 fats, along with excessive consumption of processed meats and sugar-sweetened beverages, accounted for more than 45% of all cardiometabolic deaths in 2012. In other words, the more processed foods you eat, and the fewer whole foods you consume, the greater your risk of chronic disease and death.

Other research published that same year found that eating fried potatoes (such as french fries, hash browns and potato chips) two or more times per week may double your risk of death from all causes.40 Eating potatoes that were not fried was not linked to an increase in mortality risk, suggesting frying — and most likely the choice of oil — is the main problem.

In a 2013 presentation41 at the European Ministerial Conference on Nutrition and Noncommunicable Diseases, Dr. Carlos Monteiro,42 professor of nutrition and public health at the University of Sao Paulo, Brazil, stressed the importance of creating “policies aiming the reformulation of processed foods” and of limiting children’s exposure to junk food marketing in order to tackle the rise in diet-related noncommunicable diseases.

In my view, eating a diet consisting of 90% real food and only 10% or less processed foods is an achievable goal for most that could make a significant difference in your weight and overall health. You simply need to make the commitment and place a high priority on it. To get started, consider the following guidelines:

  • Focus on raw, fresh foods, and avoid as many processed foods as possible (if it comes in a can, bottle or package, and has a list of ingredients, it’s processed).
  • Severely restrict carbohydrates from refined sugars, fructose and processed grains.
  • Increase healthy fat consumption. (Eating dietary fat isn’t what’s making you pack on pounds. It’s the sugar/fructose and grains that add the padding.)
  • You may eat an unlimited amount of nonstarchy vegetables. Because they are so low in calories, the majority of the food on your plate should be vegetables.
  • Limit protein to less than 0.5 gram per pound of lean body weight (see the worked example after this list).
  • Replace sodas and other sweetened beverages with pure, filtered water.
  • Shop around the perimeter of the grocery store where most of the whole foods reside, such as meat, fruits, vegetables, eggs and cheese. Not everything around the perimeter is healthy, but you’ll avoid many of the ultraprocessed foods this way.
  • Vary the whole foods you purchase and the way you eat them. For instance, carrots and peppers are tasty dipped in hummus. You get the crunch of the vegetable and smooth texture of the hummus to satisfy your taste, your brain and your physical health.
  • Stress creates a physical craving for fats and sugar that may drive your addictive, stress-eating behavior. If you can recognize when you’re getting stressed and find another means of relieving the emotion, your eating habits will likely improve.
  • The Emotional Freedom Techniques (EFT) can help reduce your perceived stress, change your eating habits around stress and help you create new, healthier eating habits that support your long-term health. To discover more about EFT, watch the video at this referenced link on substack.43
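
As a worked example of the protein guideline above (illustrative numbers only): a 160-pound person with 25% body fat has about 160 × 0.75 = 120 pounds of lean body mass, so the suggested ceiling would be 0.5 × 120 = 60 grams of protein per day.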

* This article has been updated with new information.

2 Comments

  1. “Fast Food” is not Good food. Just as your television lies to you about just about everything, so too do the advertisements urging you to eat hamburgers, fries, buckets of chicken, etc etc……Here’s a good idea, buy a cookbook that teaches you how to feed yourself properly and healthily. Snacking between meals is also ridiculous. When you have three meals a day, don’t fret, you’ll make it to the next meal without dying.

    If you’re a bricklayer, builder, WWF wrestler, etc etc, you NEED three good meals/day, but if you tap a computer all day, work as a cashier, drive a bus, etc, you will not die if you miss one meal, in fact, it will be good for you……

    Comment by brianharryaustralia | May 3, 2022

  1. GRAIN – A DOUBLE-EDGED SWORD
    The globe-spanning presence of wheat and its exalted status among secular and sacred institutions alike differentiate it from all other foods presently enjoyed by this planet’s human inhabitants. And yet the unparalleled rise of wheat as the very catalyst for the emergence of ancient civilization has not occurred without a great price. While wheat was the engine of civilization’s expansion and was glorified as a “necessary food,” both in the physical (staff of life) and spiritual sense (the body of Christ), those suffering from celiac disease are living testimony to the lesser-known dark side of wheat. A study of celiac disease may help unlock the mystery of why modern man, who dines daily at the table of wheat, is the sickest animal yet to have arisen on this strange planet of ours.

    THE CELIAC ICEBERG
    Celiac disease (CD) was once considered an extremely rare affliction, limited to individuals of European origin. Today, however, a growing number of studies1 indicate that Celiac disease is found throughout the world at a rate of up to 1 in every 133 persons, which is several orders of magnitude higher than previously estimated. These findings have led researchers to visualize CD as an iceberg.2 The tip of the iceberg represents the relatively small number of the world’s population whose diagnosis of celiac disease depends on the gross presentation of clinical symptoms. This is the classical case of CD characterized by gastrointestinal symptoms, malabsorption and malnourishment, and confirmed with the “gold standard” of an intestinal biopsy. The submerged middle portion of the iceberg is largely invisible to classical clinical diagnosis, but not to modern serological screening methods such as antibody testing.3 This middle portion is composed of asymptomatic and latent celiac disease as well as “out of the intestine” varieties of wheat intolerance. Finally, at the base of this massive iceberg sits approximately 20-30% of the world’s population – those who have been found to carry the HLA-DQ locus of genetic susceptibility to celiac disease on chromosome 6.4
    The “Celiac Iceberg” may not simply illustrate the problems and issues associated with diagnosis and disease prevalence, but may represent the need for a paradigm shift in how we view both CD and wheat consumption among non-CD populations. First let us address the traditional view of CD as a rare, but clinically distinct species of genetically-determined disease, which I believe is now running itself aground upon the emerging, post-Genomic perspective, whose implications for understanding and treating disease are Titanic in proportion.

    THE GENES ARE NOT TO BE BLAMED, BUT WHAT WE CHOOSE TO EXPOSE THEM TO
    Despite common misconceptions, monogenic diseases, or diseases that result from errors in the nucleotide sequence of a single gene, are exceedingly rare. Perhaps only 1% of all diseases can be considered to fall within this category, and Celiac disease is not one of them. In fact, following the completion of the Human Genome Project (HGP) in 2003, it is no longer accurate to say that our genes “cause” disease, any more than it is accurate to say that DNA is sufficient to account for all the proteins in our body (which it is not!). Despite initial expectations, the HGP revealed that there are only 30,000-35,000 genes in human DNA (genome), rather than the 100,000+ believed necessary to encode the 100,000+ proteins found in the human body (proteome).

    The “blueprint” model of genetics: one gene → one protein → one cellular behavior, which was once the holy grail of biology, has now been supplanted by a model of the cell where epigenetic factors (literally: “beyond the control of the gene”) are primary in determining how DNA will be interpreted, translated and expressed. A single gene can be used by the cell to express a multitude of proteins and it is not the DNA itself that determines how or what genes will be expressed. Rather, it is to the epigenetic factors that we must look to understand what makes a liver cell different from a skin cell or brain cell. All of these cells share the exact same 3 billion base pairs that make up our DNA code, but it is the epigenetic factors, e.g. regulatory proteins and post-translational modifications, that make the determination as to which genes to turn on and which to silence, resulting in each cell’s unique phenotype. Moreover, epigenetic factors are directly and indirectly influenced by the presence or absence of key nutrients in the diet, as well as exposures to chemicals, pathogens and other environmental influences. In a nutshell, what we eat, and what we are exposed to in our environment, directly affects our DNA and its expression.
    Within the horizon of this new perspective even classical monogenic diseases like Cystic Fibrosis (CF) can be viewed in a new, more promising light. In CF many of the adverse changes that result from the defective expression of the Cystic Fibrosis Transmembrane Conductance Regulator (CFTR) gene may be preventable or reversible, owing to the fact that the misfolding of the CFTR gene product has been shown to undergo partial or full correction (in the rodent model) when exposed to phytochemicals found in turmeric, cayenne and soybean.5 Moreover, nutritional deficiencies of selenium, zinc, riboflavin, vitamin E, etc. in the womb or early in life may “trigger” the faulty expression or folding patterns of the CFTR gene in Cystic Fibrosis which otherwise might not have undergone epigenetic activation.6 This would explain why it is possible to live into the late seventies with this condition, as was the case for Katherine Shores (1925-2004). The implications of these findings are rather extraordinary: epigenetic and not genetic factors are primary in determining disease outcome. Even if we were to exclude the possibility of the reversible correction of certain monogenic diseases, the basic lesson from the post-Genomic era is that we can’t blame our DNA for causing disease; rather, disease may have more to do with what we choose to expose our DNA to.

    CELIAC DISEASE REVISITED
    What all of this means for CD is that if the genetic susceptibility locus, HLA DQ, does not determine the exact clinical outcome of the disease7, or, if the HLA genes are activated not as a cause, but as a consequence of the disease process8, we may need to shift our epidemiological focus from viewing this as a classical “disease” involving the passivity of a subject controlled by aberrant genes to viewing it as an expression of a natural, protective response to the ingestion of something that the human body was not designed to consume.9

    If we view Celiac disease not as an unhealthy response to a healthy food, but as a healthy response to an unhealthy food, classical CD symptoms like diarrhea may make more sense. Diarrhea can be the body’s way to reduce the duration of exposure to a toxin or pathogen, and villous atrophy can be the body’s way to prevent the absorption and systemic effects of chronic exposure to wheat.
    New insights into the genetic differences between humans and diverse species such as mouse, rat, chicken and turkey, which share leptin genes with us, indicate that the seeds of cereal grasses were not introduced into the human diet until as recently as 500 generations ago. Within this context, arguments against eating wheat take on greater relevance.

    I believe we would be better served to view the symptoms of CD as expressions of bodily intelligence rather than bodily deviancy. We must shift the focus back to the disease trigger, which is wheat itself.
    People with celiac may actually have an advantage over the unafflicted because those who are “non-symptomatic” and whose wheat intolerance goes undiagnosed or misdiagnosed for lacking classical symptoms may suffer in ways that are equally or more damaging, but expressed more subtly, or in distant organs. Within this view Celiac disease would be redefined as a protective (healthy?) response to exposure to an inappropriate substance, whereas “asymptomatic” ingestion of the grain with its concomitant “out of the intestine” and mostly silent symptoms, would be considered the unhealthy response insofar as it does not signal in an obvious and acute manner that there is a problem with consuming wheat.

    It is possible that Celiac disease represents both an extreme reaction to a global, species-specific intolerance to wheat that we all share as a matter of degree, and that CD symptoms reflect the body’s innate intelligence when faced with the consumption of something that is inherently toxic. Let us illustrate this point using Wheat Germ Agglutinin (WGA) as an example.

    WGA is a glycoprotein classified as a lectin and is known to play a key role in kidney pathologies, such as IgA nephropathy. In the article “Do dietary lectins cause disease?” the allergist David L J Freed points out that WGA binds to “glomerular capillary walls, mesangial cells and tubules of human kidney and (in rodents) binds IgA and induces IgA mesangial deposits,” indicating that wheat consumption may lead to kidney damage in susceptible individuals.10

    Indeed, a study from the Mario Negri Institute for Pharmacological Research in Milan, Italy, published in 2007 in the International Journal of Cancer, looked at bread consumption and the risk of kidney cancer. They found that those who consumed the most bread had a 94% higher risk of developing kidney cancer compared to those who consumed the least bread.11 Given the inherently toxic effect that WGA may have on kidney function, it is possible that in certain genetically predisposed individuals, e.g. HLA-DQ positive, the body – in its innate intelligence – makes an executive decision: either continue to allow damage to the kidneys (if not also other organs) until kidney failure and rapid death result, or launch an autoimmune attack on the villi to prevent the absorption of the offending substance, which results in a prolonged though relatively malnourished life. Is this not the explanation given for the body’s reflexive formation of mucous following exposure to certain highly allergenic or potentially toxic foods, e.g. dairy products, sugar, etc.?

    The mucous coats the offending substance, preventing its absorption and facilitating safe elimination via the gastrointestinal tract. From this perspective the HLA-DQ locus of disease susceptibility in the Celiac is not simply activated but utilized as a defensive adaptation to continual exposure to an inappropriate substance. In those who do not have the HLA-DQ locus, an autoimmune destruction of the villi will not occur as rapidly, and exposure to the universally toxic effects of WGA will in all likelihood go unabated until silent damage to distant organs precipitates into the diagnosis of a classical disease that seems unrelated to wheat consumption.

    Loss of kidney function may only be the “tip of the iceberg” when it comes to the possible adverse effects that wheat proteins and wheat lectin can generate in the body. If kidney cancer is a likely possibility, then other cancers may eventually be linked to wheat consumption as well. This correlation would fly in the face of globally sanctioned and reified assumptions about the inherent benefits of wheat consumption. It would require that we suspend cultural, socio-economic, political and even religious assumptions about its inherent benefits. In many ways, the reassessment of the value of wheat as a food requires a William Burroughs-like moment of shocking clarity when we perceive “in a frozen moment … what is on the end of every fork.” Let’s take a closer look at what is on the end of our fork.

    OUR BIOLOGICALLY INAPPROPRIATE DIET

    In a previous article,12 I discussed the role that wheat plays as an industrial adhesive (e.g. in paints, papier-mâché and bookbinding glue) in order to illustrate the point that it may not be such a good thing for us to eat. The problem is implicit in the word gluten, which literally means “glue” in Latin, and in words like pastry and pasta, which derive from wheatpaste, the original concoction of wheat flour and water that made such good plaster in ancient times. What gives gluten its adhesive and difficult-to-digest qualities are the high levels of disulfide bonds it contains.

    These same sulfur-to-sulfur bonds are found in hair and vulcanized rubber products, which we all know are difficult to decompose and are responsible for the sulfurous odor they give off when burned. Some 676 million metric tons of wheat will be produced this year alone, making it the primary cereal of temperate regions and the third most prolific cereal grass on the planet. This global dominance of wheat is emblemized by the fact that the Food and Agriculture Organization (FAO), the United Nations’ international agency for defeating hunger, uses a head of wheat as its official symbol. Any effort to indict the credibility of this “king of grains” will prove challenging. As Rudolf Hauschka once remarked, wheat is “a kind of earth-spanning organism.” It has vast socio-economic, political and cultural significance. For example, in the Catholic Church, a wafer made of wheat is considered irreplaceable as the body of Christ.

    Our dependence on wheat is matched only by its dependence on us. As the human lifeform has spread across the planet, so has the grain. We have assumed total responsibility for all phases of the wheat life cycle: from fending off its pests, to providing its ideal growing conditions, to facilitating reproduction and expansion into new territories. We have grown so inextricably interdependent that neither species is sustainable at current population levels without this symbiotic relationship. This mutual envelopment and codependence may explain why our culture has for so long consistently confined wheat intolerance to categorically distinct, “genetically-based” diseases like “Celiac.” These categorizations may be needed in order to protect us from the realization that wheat exerts a vast number of deleterious effects on human health, in the same way that “lactose intolerance” distracts attention from the deeper problems associated with the casein protein found in cow’s milk. Rather than see wheat for what it very well may be: a biologically inappropriate food source, we “blame the victim” and look for genetic explanations for what’s wrong with the small subsections of our population who have the most obvious forms of intolerance to wheat consumption, e.g. Celiac, Dermatitis Herpetiformis, etc.

    The medical justification for these classifications may be secondary to economic and cultural imperatives that require the inherent problems associated with wheat consumption be minimized or occluded. In all probability the Celiac genotype represents a surviving vestigial branch of a once universal genotype, which through accident or intention has had, through successive generations, only limited exposure to wheat. The Celiac genotype, no doubt, survived through numerous bottlenecks or “die-offs” represented by a dramatic shift from hunted and foraged/gathered foods to gluten-grain consumption, and for whatever reason simply did not have adequate time to adapt or select out the gluten-grain-incompatible genes. The Celiac response may indeed reflect back to us what was once a species-wide intolerance to a novel food source: the seed storage form of the monocotyledonous cereal grasses, which our species only began consuming some 500 generations ago at the advent of the Neolithic transition (10-12,000 BC). Let us return to the image of the Celiac iceberg for greater clarification.

    OUR SUBMERGED GRAIN-FREE METABOLIC PREHISTORY
    The iceberg metaphor is an excellent way to expand our understanding of what was once considered to be an extraordinarily rare disease into one that has statistical relevance for us all, but it has a few limitations. For one, it reiterates the commonly held view that Celiac is a numerically distinct disease entity or “disease island,” floating alongside other numerically distinct disease “ice cubes” in the vast sea of normal health. Though accurate in describing the sense of social and psychological isolation many of the afflicted feel, the Celiac iceberg/condition may not be a distinct disease entity at all. Although the HLA-DQ locus of disease susceptibility on chromosome 6 offers us a place to project blame, I believe we need to shift the emphasis of responsibility for the condition back to the disease “trigger” itself: namely, wheat and other prolamine-rich grains, e.g. barley, rye, spelt and oats, without which the typical afflictions we call Celiac would not exist. Within the horizon of this view the “Celiac iceberg” is not actually free-floating but an outcropping from an entire submerged subcontinent, representing our long-forgotten (in cultural time) but relatively recent (in biological time) metabolic prehistory as hunter-gatherers, when grain consumption was in all likelihood non-existent, except in instances of near-starvation.

    WHEAT: AN EXCEPTIONALLY UNWHOLESOME GRAIN
    Wheat presents a special case insofar as wild and selective breeding have produced variations which include up to six sets of chromosomes (three genomes’ worth!), capable of generating a massive number of proteins, each with a distinct potential for antigenicity. Common bread wheat (Triticum aestivum), for instance, has over 23,788 proteins cataloged thus far.13 In fact, the genome for common bread wheat is actually 6.5 times larger than that of the human genome!14
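
    To put that ratio in perspective, taking the figure of roughly 3 billion base pairs for the human genome cited earlier in this comment, 6.5 × 3 billion works out to on the order of 20 billion base pairs for common bread wheat (a rough cross-check of the comparison, not a figure from the cited reference).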

    With up to 50% gluten content in some varieties of wheat, it is amazing that we continue to consider “glue-eating” a normal behavior, whereas wheat-avoidance is left to the “celiac,” whose reaction is still perceived by the majority of health care practitioners as a “freak” response to the consumption of something intrinsically wholesome.
    Thankfully we don’t need to rely on our intuition, or even (not so) common sense, to draw conclusions about the inherently unhealthy nature of wheat. A wide range of investigation has occurred over the past decade revealing the problems with the alcohol-soluble protein component of wheat known as gliadin, the glycoprotein known as lectin (Wheat Germ Agglutinin), the exorphin known as gliadorphin, and the excitotoxic potential of the high levels of aspartic and glutamic acid found in wheat. Add to these the normal anti-nutrient content found in grains (phytates, enzyme inhibitors, etc.) and you have a substance that we may consider the farthest thing from wholesome.

    The remainder of this article will demonstrate the following adverse effects of wheat on both celiac and non-celiac populations:
    1) wheat causes damage to the intestines
    2) wheat causes intestinal permeability
    3) wheat has pharmacologically active properties
    4) wheat causes damage that is “out of the intestine” affecting distant organs
    5) wheat exhibits molecular mimicry
    6) wheat contains high concentrations of excitotoxins.

    1) WHEAT GLIADIN CREATES IMMUNE-MEDIATED DAMAGE TO THE INTESTINES

    Gliadin is classified as a prolamin, which is a wheat storage protein high in the amino acids proline and glutamine and soluble in strong alcohol solutions. Gliadin, once deamidated by the enzyme Tissue Transglutaminase, is considered the primary epitope for T-cell activation and subsequent autoimmune destruction of intestinal villi. And yet, gliadin does not need to activate an autoimmune response, e.g. Celiac disease, in order to have a deleterious effect on intestinal tissue.
    In a study published in Gut in 2007, a group of researchers asked themselves the question: “Is gliadin really safe for non-coeliac individuals?” In order to test the hypothesis that an innate immune response to gliadin is common in patients with Celiac disease and without Celiac disease, intestinal biopsy cultures were taken from both groups and challenged with crude gliadin, the gliadin synthetic 19-mer (19-amino-acid-long gliadin peptide) and 33-mer deamidated peptides. Results showed that all patients, with or without Celiac disease, when challenged with the various forms of gliadin produced an interleukin-15-mediated response. The researchers concluded: “The data obtained in this pilot study supports the hypothesis that gluten elicits its harmful effect, throughout an IL15 innate immune response, on all individuals [my italics].”15

    The primary difference between the two groups is that the Celiac disease patients experienced both an innate and an adaptive immune response to the gliadin, whereas the non-Celiacs experienced only the innate response. The researchers hypothesized that the difference between the two groups may be attributable to greater genetic susceptibility at the HLA-DQ locus for triggering an adaptive immune response, higher levels of immune mediators or receptors, or perhaps greater permeability in the Celiac intestine. It is possible that over and above the possibility of greater genetic susceptibility, most of the differences are from epigenetic factors that are influenced by the presence or absence of certain nutrients in the diet. Other factors such as exposure to NSAIDs like naproxen or aspirin can profoundly increase intestinal permeability in the non-Celiac, rendering them susceptible to gliadin’s potential for activating secondary adaptive immune responses. This may explain why in up to 5% of all cases of classically defined Celiac disease the typical HLA-DQ haplotypes are not found. However, determining the factors associated with greater or lesser degrees of susceptibility to gliadin’s intrinsically toxic effect should be secondary to the fact that it has been demonstrated to be toxic to both non-Celiacs and Celiacs.

    2) WHEAT GLIADIN CREATES INTESTINAL PERMEABILITY

    Gliadin upregulates the production of a protein known as zonulin, which modulates intestinal permeability. Over-expression of zonulin is involved in a number of autoimmune disorders, including Celiac disease and Type 1 diabetes. Researchers have studied the effect of gliadin on increased zonulin production and subsequent gut permeability in both Celiac and non-Celiac intestines, and have found that “gliadin activates zonulin signaling irrespective of the genetic expression of autoimmunity, leading to increased intestinal permeability to macromolecules.”16 These results indicate, once again, that a pathological response to wheat gluten is a normal, human-species-specific response, and not based entirely on genetic susceptibilities. Because intestinal permeability is associated with a wide range of disease states, including cardiovascular illness, liver disease and many autoimmune disorders, I believe this research indicates that gliadin (and therefore wheat) should be avoided as a matter of principle.

    3) WHEAT GLIADIN HAS PHARMACOLOGICAL PROPERTIES

    Gliadin can be broken down into various amino acid lengths or peptides. Gliadorphin is a 7-amino-acid-long peptide, Tyr-Pro-Gln-Pro-Gln-Pro-Phe, which forms when the gastrointestinal system is compromised. When digestive enzymes are insufficient to break gliadorphin down into 2-3 amino acid lengths, and a compromised intestinal wall allows for the leakage of the entire 7-amino-acid-long fragment into the blood, gliadorphin can pass through to the brain through circumventricular organs and activate opioid receptors, resulting in disrupted brain function.

    There have been a number of gluten exorphins identified: gluten exorphin A4, A5, B4, B5 and C, and many of them have been hypothesized to play a role in autism, schizophrenia, ADHD and related neurological conditions. In the same way that the Celiac Iceberg illustrated the illusion that intolerance or susceptibility to wheat’s ill effects is the exceptionally rare case, it is possible if not probable that wheat exerts pharmacological activity in everyone, and that what distinguishes the schizophrenic or autistic from the functional wheat consumer is the degree to which they are affected. Beneath the tip of the “Gluten Iceberg,” if you will, we might find these opiate-like peptides to be responsible for bread’s general popularity as a “comfort food,” and our use of phrases like “I love bread” or “this bread is to die for” to be indicative of wheat’s narcotic properties.

    I believe a strong argument can be made that the agricultural revolution that occurred approximately 10-12,000 years ago as we shifted out of the Paleolithic into the Neolithic era was precipitated as much by environmental necessities and human ingenuity, as it was by the addictive qualities of psychoactive peptides in the grains themselves. The world-historical reorganization of society, culture and consciousness accomplished through the symbiotic relationship with cereal grasses, may have had as much to do with our ability to master agriculture, as to be mastered by it. The presence of pharmacologically active peptides would have further sweetened the deal, making it hard to distance ourselves from what became a global fascination with wheat.

    An interesting example of wheat’s addictive potential pertains to the Roman army. The Roman empire was once known as the “Wheat Empire,” with soldiers being paid in wheat rations. Rome’s entire war machine, and its vast expansion was predicated on the availability of wheat. Forts were actually granaries, providing up to a year’s worth of grain in order to lay siege upon their enemies. Historians describe how punishment for the misbehavior of soldiers included being deprived of wheat rations and being given barley instead. The Roman Empire would go on to facilitate the global dissemination of wheat cultivation which fostered a form of imperialism with biological as well as cultural roots.

    The Roman appreciation for wheat, like our own, may have less to do with its nutritional value as “health food” than its ability to generate a unique narcotic reaction. It may fulfill our hunger while generating repetitive, ceaseless craving for more of the same, and by doing so, enabling the surreptitious control of human behavior. Other researchers have come to similar conclusions. According to the biologists Greg Wadley & Angus Martin: “Cereals have important qualities that differentiate them from most other drugs. They are a food source as well as a drug, and can be stored and transported easily. They are ingested in frequent small doses (not occasional large ones), and do not impede work performance in most people. A desire for the drug, even cravings or withdrawal, can be confused with hunger. These features make cereals the ideal facilitator of civilisation (and may also have contributed to the long delay in recognising their pharmacological properties).” 17

    4) WHEAT LECTIN (WGA) DAMAGES OUR TISSUE.

    Wheat contains a lectin known as Wheat Germ Agglutinin which is responsible for causing direct, non-immune mediated damage to our intestines, and subsequent to entry into the bloodstream, damage to distant organs in our body. Lectins are sugar-binding proteins which are highly selective for their sugar moieties. It is believed that wheat lectin, which binds to the monosaccharide N-acetyl glucosamine (NAG), provides defense against predation from bacteria, insects and animals. Bacteria have NAG in their cell wall, insects have an exoskeleton composed of polymers of NAG called chitin, and the epithelial tissue of mammals, e.g. gastrointestinal tract, have a “sugar coat” called the glycocalyx which is composed, in part, of NAG. The glycocalyx can be found on the outer surface (apical portion) of the microvilli within the small intestine.
    There is evidence that WGA may cause increased shedding of the intestinal brush border membrane, reduction in surface area, acceleration of cell losses and shortening of villi, via binding to the surface of the villi.18 WGA can mimic the effects of epidermal growth factor (EGF) at the cellular level, indicating that the crypt hyperplasia seen in celiac may be due to a mitogenic response induced by WGA.19 WGA has been implicated in obesity and “leptin resistance” by blocking the receptor in the hypothalamus for the appetite-satiating hormone leptin.20 WGA has also been shown to have an insulin-mimetic action, potentially contributing to weight gain and insulin resistance.21 And as we discussed earlier, wheat lectin has been shown to induce IgA-mediated damage to the kidney, indicating that nephropathy and kidney cancer may be associated with wheat consumption.

    5) WHEAT PEPTIDES EXHIBIT MOLECULAR MIMICRY

    Gliadorphin and gluten exorphins exhibit a form of molecular mimicry that affects the nervous system, but other wheat proteins affect different organ systems. The digestion of gliadin produces a 33-amino-acid-long peptide known as 33-mer, which has a remarkable homology to the internal sequence of pertactin, the immunodominant sequence in the Bordetella pertussis bacteria (whooping cough). Pertactin is considered a highly immunogenic virulence factor, and is used in vaccines to amplify the adaptive immune response. It is possible the immune system may confuse this 33-mer with a pathogen, resulting in either or both a cell-mediated and adaptive immune response against Self.

    6) WHEAT CONTAINS HIGH LEVELS OF EXCITO-TOXINS

    John B. Symes, D.V.M. (http://www.dogtorj.net) is responsible for drawing attention to the potential excitotoxicity of wheat, dairy and soy, due to their exceptionally high levels of the non-essential amino acids glutamic and aspartic acid. Excitotoxicity is a pathological process where glutamic and aspartic acid cause an over-activation of nerve cell receptors (e.g. the NMDA and AMPA receptors), leading to calcium-induced nerve and brain injury. Of all the cereal grasses commonly consumed, wheat contains the highest levels of glutamic acid and aspartic acid. Glutamic acid is largely responsible for wheat’s exceptional taste. The Japanese coined the word umami to describe the extraordinary “yummy” effect that glutamic acid exerts on the tongue and palate, and invented monosodium glutamate (MSG) to amplify this sensation. Though the Japanese first synthesized MSG from kelp, wheat can also be used due to its high glutamic acid content. It is likely that wheat’s popularity, alongside its opiate-like activity, has everything to do with the natural flavor-enhancers already contained within it. These amino acids may contribute to neurodegenerative conditions such as multiple sclerosis, Alzheimer’s and Huntington’s disease, and other nervous disorders such as epilepsy, attention deficit disorder and migraines.

    CONCLUSION

    In this article we have proposed that celiac disease be viewed not as a rare “genetically determined” disorder, but as an extreme example of our body communicating to us a once universal, species-specific affliction: severe intolerance to wheat. Celiac disease reflects back to us how profoundly our diet has diverged from what was until only recently a grain-free diet, and even more recently, a wheat-free one. We are so profoundly distanced from that dramatic Neolithic transition in cultural time that “missing is any sense that anything is missing.” The body, on the other hand, cannot help but remember a time when cereal grains were alien to the diet, because in biological time it was only moments ago.
    I encourage everyone to look upon the plight of the person with celiac disease not as a condition alien to our own. Rather, the celiac disease sufferer gives us a glimpse of how profoundly wheat may distort and disfigure our health if we continue to expose ourselves to its ill effects. I hope this article will provide inspiration for non-Celiacs to try a wheat-free diet and judge for themselves whether it is really worth eliminating. Only your body can discover firsthand whether or not going wheat-free will be of benefit. Considering what is at stake, it’s probably worth a try!
    For hyper-linked references to the footnotes in this article please visit the following link.

    Sayer Ji is the founder of greenmedinfo.com

    Comment by Buddy Silver | May 3, 2022

