Reading: «Wheat Belly Total Health: The effortless grain-free health and weight-loss plan», page 3

Even more remarkably, Dr Price specifically sought out members of these cultures who had recently transitioned to consuming ‘white man’s food’ – people who were bartering for the breads, pastries and sweets of Westerners visiting or bordering their land. In every instance, he observed an astounding increase in tooth decay, affecting 25 to 50 per cent of teeth examined, along with gingivitis, periodontitis, tooth loss, infectious abscesses, crooked and crowded teeth, and reductions in the size of the maxillary (midfacial) bone and mandible (jawbone). Nearly toothless mouths in teenagers and young adults were not uncommon.

The traditional diets of these societies were typically fish, shellfish and kelp among coastal cultures, and animal flesh and organs, raw dairy products, edible plants, nuts, mushrooms and insects among inland cultures. With only two exceptions (the Lötschental Valley Swiss, isolated by the Alps, who consumed a coarse rye bread, and the Gaelic people of the islands of the Outer Hebrides, who consumed crude oats), grains, sugars and processed foods were notably absent. (The Swiss showed an intermediate rate of dental caries, higher than in the other cultures studied, while the Gaelic population showed no such increase.)

What is even more startling about Dr Price’s observations of the rarity of tooth decay and deformity is that none of these cultures practised any sort of dental hygiene: no toothbrushes, no toothpaste, no fluoridated water, no dental floss and no dentists or orthodontists. While Dr Price’s observations cannot be used to precisely pinpoint the nutritional distinctions between modern and traditional cultures, they nonetheless make a powerful point. Anyone wishing to read Dr Price’s account can find it reproduced in a recent reprint.34

This social ‘experiment’ has also occurred in the opposite direction: a return to traditional diet and lifestyle after a period of Westernization. In 1980, Dr Kerin O’Dea, while at the Royal Children’s Hospital in Melbourne, conducted an extraordinary experiment: she asked 10 diabetic, overweight Aboriginal individuals living Western lifestyles, all of whom retained memories of prior lifestyles, to move back to their origins in the wilds of northwestern Australia and follow their previous hunter-gatherer diet of kangaroo, freshwater fish and yams. They began their adventure with high blood glucose levels of (on average) 209 mg/dl, high triglycerides of 357 mg/dl, as well as abnormal insulin levels. After seven weeks of living in the wild, killing animals and eating familiar gathered foods, the 10 lost an average of 17.6 pounds of body weight and dropped their blood glucose to 119 mg/dl and triglycerides to 106 mg/dl.35 Of the original 10, five returned nondiabetic. In a 2005 lecture, Dr O’Dea remarked: ‘I was struck by the change in people when they were back in their own country: they were confident and assertive, and proud of their local knowledge and skills. At the time we were not able to measure markers of psychosocial state, however observation suggested a very positive change.’36

Search the four corners of the earth today and you will find that the only surviving hunter-gatherer population that’s untouched by modern diet is the Sentinelese of the North Sentinel Island in the Indian Ocean. Because their language is strikingly different from all languages in neighbouring lands, it is thought that the Sentinelese have been isolated since anatomically modern humans first migrated to this part of the world 60,000 years ago.37 Attempts to visit their island have been met with volleys of arrows, spears and rocks, so observations are limited. From what has been observed, however, they are lean and healthy, hunting, fishing and gathering foods without the ‘benefit’ of agriculture.

We have to be careful not to regard the life of the hunter-gatherer human as idyllic or problem-free: they had plenty of problems. While it is widely believed that stress is a modern phenomenon, this is absurd. Which is more stressful: struggling to pay your bills or having a marauding, bloodthirsty tribe of humans slaughter your friends, seize the women and enslave the children? We need to observe some of the practices of primitive cultures, such as head shrinking by the Jivaro Indians of the Amazon or cannibalism by the Carib of the Lesser Antilles and Venezuela, to remind ourselves that the world of humans can be an inhospitable place. Violence inflicted by and upon humans has characterized our existence from the start. While violence is certainly still a part of modern life, legal and political constraints that became necessary as human populations developed greater reliance on the practice of agriculture make it far less a part of day-to-day life than it was, say, 50,000 years ago. Yes, there is a bright side to agriculture and civilization.

The development of civilization and the cultivation of the seeds of grasses: two processes that ran in parallel over the past 10,000 years and led to concepts such as sedentary, non-nomadic life, land ownership, centralized government and many other phenomena we now accept as part of modern life. But when we observe what happens to cultures unexposed to the seeds of grasses who are then compelled to consume them, we observe an exaggerated microcosm of what the rest of the world is now experiencing.

Eat Like an Egyptian

Tooth decay, dental infections, crooked teeth, iron and folate deficiencies, diabetes, degenerated joints, weight gain, obesity: I’ve just described the average modern person. Take a member of a primitive culture following their traditional diet and feed them the processed foods of modern man – complete with the enticing products of the seeds of grasses – and within a few years, we’ve given them all the same problems we have, or worse. Yes, without ‘modern civilization’ they might succumb to the greedy ambitions of a violent neighbouring clan, but with grain in their lives they’ll have to engage in battle while sporting a 44-inch waist, two bad knees and a mouth that’s missing half its teeth.

While obesity and the diseases associated with it are virtually absent from hunter-gatherer cultures, neither are they entirely new. Diseases of affluence developed even before geneticists introduced changes into grains. Hippocrates, a Greek doctor of the 5th to 4th centuries BC, and Galen, a Roman doctor of the 2nd century AD, both made detailed studies of obese people. William Wadd, an early-19th-century London doctor and a lifelong observer of the ‘corpulent’, made this observation after the autopsy of an obese man:

The heart itself was a mass of fat. The omentum [a component of the intestines] was a thick fat apron. The whole of the intestinal canal was imbedded in fat, as if melted tallow had been poured into the cavity of the abdomen; and the diaphragm and the parietes [walls of organs] of the abdomen must have been strained to their very utmost extent, to have sustained the extreme and constant pressure of such a weighty mass. So great was the mechanical obstruction to the functions of an organ essential to life, that the wonder is, not that he should die, but that he should live.38

What is new is that overweight and obesity have been transformed from curiosity to epidemic. The situation we confront in the 21st century is all the more astounding because modern epidemiologists and health officials declare that the causes of the epidemic of overweight, obesity and their accompanying diseases are either unclear, or that the burden of blame should be placed on the gluttonous and sedentary shoulders of the public. But the answers can be discerned through observations of primitive societies troubled by none of the issues plaguing us.

More than the presence of grains distinguishes primitive from modern life, of course. Hunter-gatherers also drank no soft drinks; consumed no processed foods laced with hydrogenated fats, food preservatives or food colourings; and consumed no high-fructose corn syrup or sucrose. They were not exposed to endocrine-disruptive chemicals released by industry into our groundwater and soil, and which taint our food. The civilizations of ancient Greece and Rome and of 19th-century Europe also did not consume these components of the modern diet (except for increasing consumption of sucrose beginning in the 19th century). No Coca-Cola, hydrogenated fats, brightly coloured sweets lit up by FD&C Red No. 3 (E127) or polychlorinated biphenyl (PCB)-laced water graced their tables. But they did consume the seeds of grasses.

So just how much can we blame on the adoption of the seeds of grasses into the human diet? Let’s consider that question next. Each variety of seeds of grasses poses its own unique set of challenges to nonruminants who consume them. Before we get under way in our discussion of regaining health in the absence of grains, let’s talk about just how they ruin the health of every human who allows them to adorn his or her plate.

Chapter 2
Let Them Eat Grass

I asked the waiter, ‘Is this milk fresh?’ He said, ‘Lady, three hours ago it was grass.’ Phyllis Diller

Grasses are everywhere.

They grow on mountains, along rivers and lakes, in valleys, vast steppes, savannahs, prairies, golf courses and your garden. And they now reign supreme in the human diet.

Grasses are wonderfully successful life forms. They are geographically diverse, inhabiting every continent, including Antarctica. They are a study in how life can adapt to extremes, from the tundra to the tropics. Grasses are prolific and hardy, and they evolve rapidly to survive. Even with the explosive growth of the human population, worldwide expansion of cities and suburbs, and tarmac spanning the US coast-to-coast, grasses still cover 20 per cent of the earth’s surface area. Just as insects are the most successful form of animal life on the planet, grasses are among the most successful of plants. Given their ubiquity, perhaps it’s not unexpected that we would try to eat them. Humans have experimented with feasting on just about every plant and creature that ever inhabited the earth. After all, we are creatures who make food out of tarantulas and poisonous puffer fish.

While grasses have served as food for many creatures (they’ve even been recovered from fossilized dinosaur faeces), they were not a food item on our dietary menu during our millions of years of adaptation to life on this planet. Pre-Homo hominids, chimpanzee-like australopithecines that date back more than 4 million years, did not consume grasses in any form or variety, nor did any species of Homo prior to sapiens. Grasses were simply not instinctively regarded as food. Much as you’d never spot a herbivorous giraffe eating the carcass of a hyena or a great white shark munching on sea kelp, humans did not consume any part of this group of plants, no matter how evolutionarily successful, until the relatively recent past.

The seeds of grasses are a form of ‘food’ added just a moment ago in archaeological time. For the first 2,390,000 years of our existence on earth, or about 8,000 generations, we consumed things that hungry humans instinctively regarded as food. Then, 10,000 years or just over 300 generations ago, in times of desperation, we turned to those darned seeds of grasses. They were something we hoped could serve as food, since they were growing from every conceivable environmental nook and cranny.

So let us consider what this stuff is, the grasses that have populated our world, as common as ants and earthworms, and been subverted into the service of the human diet. Not all grasses, of course, have come to grace your dinner plate – you don’t save and eat the clippings from cutting your lawn, do you? – so we’ll confine our discussion to the grasses and seeds that humans have chosen to include on our dinner plates. I discuss this issue at some length, because it’s important for you to understand that consumption of the seeds of grasses underlies a substantial proportion of the chronic problems of human health. Accordingly, removing them yields unexpected and often astounding relief from these issues and is therefore an absolutely necessary first step towards regaining health, the ultimate goal of this book. We will spend a lot of time talking about how recovering full health as a non-grass-consuming Homo sapiens of the 21st century – that means you – also means having to compensate for all of the destruction that has occurred in your body during your unwitting grain-consuming years. You’ve consumed what amounts to a dietary poison for 20, 30 or 50 years, a habit that your non-grain-accustomed body partially – but never completely – adapts to, endures or succumbs to. You then remove that poison and, much as a chronic alcoholic needs to recover and heal his liver, heart, brain and emotional health after the flow of alcohol ceases, so your body needs a bit of help to readjust and regain health minus the destructive seeds of grasses.

So what makes the grasses of the world a food appropriate for the ruminants of the earth, but not Homo sapiens? There is no single factor within grains responsible for their wide array of bowel-destroying effects – there is an arsenal.

Non-Wheat Grains: You Might As Well Eat Jelly Beans

There is no question that, in this barrel of rotten apples, wheat is the rottenest. But you still may not want to make cider with those other apples.

What I call ‘non-wheat grains’, such as oats, barley, rye, millet, teff, sorghum, corn and rice, are nonetheless seeds of grasses with potential for curious effects in nonruminant creatures not adapted to their consumption. I would classify non-wheat grains as less bad than the worst – modern wheat – but less bad is not necessarily good. (That extraordinarily simple insight – that less bad is not necessarily good – is one that will serve you well over and over as you learn to question conventional nutritional advice. You will realize that much of what we have been told by the dietary community, the food industry and even government agencies violates this basic principle of logic again and again.) Less bad can mean that a variety of undesirable health effects can still occur with that seed’s consumption – those effects will just not be as bad as those provoked by modern wheat.

So what’s the problem with the seeds of non-wheat grasses? While none achieve the nastiness of the seeds of modern wheat, they each have their own unique issues. For starters, they’re all high in carbohydrates. Typically, 60 to 85 per cent of the calories from the seeds of grasses are in the form of carbohydrates. This makes sense, since the carbohydrate stored in the seed was meant to provide nutrition to the sprouting plant as it germinates. But the carbohydrate in seeds, called amylopectin A, is rapidly digested by humans and raises blood sugar, gram for gram, higher than table sugar does.

For instance, a 125 g (4½ oz) serving of cooked organic, stoneground oatmeal has nearly 50 grams of net carbohydrates (total carbohydrates minus fibre, which we subtract because it has no glycaemic potential), or the equivalent of slightly more than 11 teaspoons of sugar, representing 61 per cent of the calories in oatmeal. This gives it a glycaemic index (GI, an index of blood sugar-raising potential) of 55, which is enough to send blood sugar through the roof and provoke all the phenomena of glycation, i.e., glucose modification of proteins that essentially acts as biological debris in various organs. This irreversible process leads to conditions such as cataracts, hypertension, the destruction of joint cartilage that results in arthritis, kidney disease, heart disease and dementia. (Note that a glycaemic index of 55 falls into what dietitians call the ‘low’ glycaemic index range, despite the potential to generate high blood sugars. We discuss this common fallacy in Chapter 5.) All non-wheat grasses, without exception, raise blood sugar and provoke glycation to similar degrees.
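The arithmetic behind these oatmeal figures can be sketched in a few lines of Python. The total-carbohydrate, fibre and calorie values used here are illustrative assumptions chosen to be consistent with the text (they are not quoted from it), and a teaspoon of table sugar is taken as roughly 4.2 g, a commonly cited figure that varies slightly by source.

```python
# Back-of-the-envelope check of the oatmeal figures above.
# Serving values are hypothetical, chosen to match the text's totals.

def net_carbs(total_carbs_g: float, fibre_g: float) -> float:
    """Net carbohydrates: total carbs minus fibre (fibre has no glycaemic potential)."""
    return total_carbs_g - fibre_g

def sugar_teaspoons(net_carbs_g: float, g_per_tsp: float = 4.2) -> float:
    """Express net carbs as the equivalent number of teaspoons of table sugar."""
    return net_carbs_g / g_per_tsp

def carb_calorie_share(net_carbs_g: float, total_kcal: float) -> float:
    """Fraction of total calories supplied by carbohydrate (4 kcal per gram)."""
    return net_carbs_g * 4 / total_kcal

# A 125 g serving of cooked oatmeal: assume ~54 g total carbs, ~4 g fibre, ~330 kcal.
carbs = net_carbs(54, 4)                          # 50 g net carbohydrate
print(round(sugar_teaspoons(carbs), 1))           # prints 11.9
print(round(carb_calorie_share(carbs, 330), 2))   # prints 0.61, i.e. 61% of calories
```

Depending on the teaspoon weight assumed, 50 g of net carbohydrate works out to roughly 11 to 12 teaspoons of sugar, which is the equivalence the text describes.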

Human manipulation makes it worse. If corn is not consumed as intact kernels but instead is pulverized into fine cornflour, the surface area for digestion increases exponentially and accounts for the highest blood sugars possible from any food. This is why the glycaemic index of cornflour is 90 to 100, compared with 60 for corn on the cob and 59 to 65 for sucrose or table sugar.

For years, we’ve been told that ‘complex’ carbohydrates are better for us than ‘simple’ sugars because the lengthy carbohydrate molecules of amylopectin A and amylose in grains don’t raise blood sugar as high as sugars with one or two sugar molecules, such as glucose (one sugar) or sucrose (two sugars: glucose and fructose), do. But this is simply wrong, and this silly distinction is therefore being abandoned: the GI of complex carbohydrates is the same as or higher than that of simple sugars. The GI of whole wheat bread: 72; the GI of millet as a hot cereal: 67. Neither is any better than the GI of sucrose: 59 to 65. (Similar relationships hold for the glycaemic load, a value that factors in typical portion size.) The World Health Organization (WHO) and the Food and Agriculture Organization of the United Nations have both advised dropping the complex versus simple distinction, and rightly so, as grains, from a blood sugar viewpoint, are the same as or worse than sugar.
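The glycaemic load mentioned above has a simple standard definition: GL = GI × grams of available carbohydrate per serving ÷ 100. A short sketch, using the GI of 72 for whole wheat bread from the text and a portion size of about 14 g of carbohydrate per slice (an assumed figure, not one from the text):

```python
# Glycaemic load combines a food's glycaemic index with its portion size:
#   GL = GI * grams of available carbohydrate per serving / 100

def glycaemic_load(gi: float, carbs_per_serving_g: float) -> float:
    """Standard glycaemic load formula."""
    return gi * carbs_per_serving_g / 100

# One slice of whole wheat bread: GI 72 (from the text), ~14 g carbs (assumed).
print(round(glycaemic_load(72, 14), 1))   # prints 10.1
```

This is why glycaemic load rankings can differ from glycaemic index rankings: a high-GI food eaten in a small portion can still carry a moderate load.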

And the problems with non-wheat grains don’t end with blood sugar issues.

Lectins: Good Enough for the KGB

The lectin proteins of grains are, by design, toxins. Lectins discourage creatures, such as moulds, fungi and insects, from eating the seeds of a plant by sickening or killing them. After all, the seed is the means by which plants continue their species. When we consume plants, we consume defensive lectins. Lectin proteins’ effects on humans vary widely, from harmless to fatal. Most plant lectins are benign, such as those in spinach and white mushrooms, which cause no adverse effects when consumed as a spinach salad. The lectin of castor beans is an entirely different story: ricin is highly toxic and fatal even in small quantities. Ricin has been used by terrorists around the world. Georgi Markov, a Bulgarian dissident and critic of his country’s communist government, was assassinated in London in 1978 when an agent, reportedly aided by the KGB, jabbed him with the tip of an umbrella laced with ricin.

The lectin of the seed of wheat is wheat germ agglutinin (WGA). It is neither as benign as the lectin of spinach nor as toxic as the lectin of ricin; it is somewhere in between. WGA wreaks ill effects on everyone, regardless of whether you have coeliac disease, gluten sensitivity or no digestive issues at all. The lectins of rye, barley and rice are structurally identical to WGA and share all of its properties and are also called ‘WGA’. (The only substantial difference is that rye, barley and rice express a single form of lectin, while genetically more complex wheat expresses up to three different forms.) Interestingly, 21 per cent of the amino acid structure of WGA lectins overlaps with ricin, including the active site responsible for shutting down protein synthesis, the site that accounts for ricin’s exceptional toxicity.1

Lectin proteins have the specific ability to recognize glycoproteins (proteins with a sugar side chain). This makes plant lectins effective in recognizing common glycoproteins on, say, the surface of a fungal cell. But that same process can occur in humans. When intestinal tissue is exposed to even a minute quantity of purified WGA, such as 1 milligram, intestinal glycoproteins are bound and severe damage resembling the effects of coeliac disease results.2 We also know that WGA compounds the destructive intestinal effects of coeliac disease started by gliadin and other grain prolamin proteins.3 If you have an inflammatory bowel disease, such as ulcerative colitis or Crohn’s disease, grain lectins intensify the inflammation, making cramps, diarrhoea, bleeding and poor nutrient absorption worse.

WGA is oddly indestructible. It is unaffected by cooking, boiling, baking or frying. WGA is also untouched by stomach acid. Though the acid produced in the human stomach is powerfully corrosive (dip your finger in a glass full of stomach acid and you won’t have a finger for very long), WGA is impervious to it, entering the stomach and passing through the entire gastrointestinal tract unscathed, undigested and free to do what it likes to any glycoproteins exposed along the way.

While most WGA remains confined to the intestine, doing its damage along the 30-foot length of this organ, we know that a small quantity gets into your bloodstream. (We know this because people commonly develop antibodies to this protein.) Once WGA enters the bloodstream, odd things happen: red blood cells clump (or ‘agglutinate’, the basis for WGA’s name), which can, under certain circumstances (obesity, smoking, sedentary living, dehydration, etc.), increase the tendency of blood to clot – the process that leads to heart attack and stroke. WGA is often called a mitogen because it activates cell division, or mitosis (a concept familiar to anyone who studies cancer, a disease characterized by unrestrained mitosis). WGA has indeed been demonstrated to cause mitosis in lymphocytes (immune system cells) and cells lining the intestine.4 We know that such phenomena underlie cancer, such as the intestinal lymphoma that afflicts people with coeliac disease.5 WGA also mimics the effects of insulin on fat cells. When WGA encounters a fat cell, it acts just as if it were insulin, inhibiting activation of fat release and blocking weight loss while making the body more reliant on sugar sources for energy.6 WGA also blocks the hormone leptin, which is meant to shut off appetite when the physical need to eat has been satisfied. In the presence of WGA, appetite is not suppressed, even when you’re full.7

All in all, grain lectins are part of a potent collection of inflammatory factors. Indigestible or only partially digestible, they fool receptors and thwart hormonal signals after gaining entry to our bodies through the seeds of grasses.

VIP: Very Important Peptide

The lectin found in wheat, rye, barley and rice (WGA) also blocks the action of another very important hormone called vasoactive intestinal peptide, or VIP.8 While studies have been confined mostly to experimental models, not humans, the blocking of VIP has the potential to explain many of the peculiar phenomena that develop in people who consume grains but do not have coeliac disease or gluten sensitivity.

VIP plays a role in dozens of processes. It is partly responsible for:

• Activating the release of cortisol from the adrenal glands9

• Modulating immune defences against bacteria and parasites in the intestine10

• Protecting against the immune destruction of multiple sclerosis11

• Reducing phenomena that can lead to asthma and pulmonary hypertension (increased pressure in the lungs)12

• Maintaining healthy balance of the immune system that prevents inflammatory bowel diseases, Crohn’s disease and ulcerative colitis13

• Promoting sleep and maintaining circadian rhythms (day–night cycles)14

• Participating in determining taste in the tongue15

• Modulating the immune and inflammatory response in skin that protects us from psoriasis16

In other words, the diseases that are at least partially explained by blocking VIP sure look and sound like the collection of conditions that we witness, day in, day out, in wheat-consuming people: low cortisol levels responsible for low energy, worsening of asthma and pulmonary hypertension, worsening of Crohn’s disease and ulcerative colitis, disruption of sleep, distortions of taste such as the reduced sensitivity to sweetness (meaning you need more sugar for sweetness) and psoriasis. The VIP pathway may prove to be one of the important means by which grains disrupt numerous aspects of health.

Grains and a Mouthful of Bacteria

Grains affect the microorganisms that inhabit our bodies. These microbiota live on your skin and in your mouth, vagina and gastrointestinal tract.

Over the last few years, there has been a new scientific appreciation for the composition of human microbiota. We know, for instance, that experimental animals raised in an artificial sterile environment, and thereby with gastrointestinal tracts containing no microorganisms, have impaired immunity, are prone to infections, are less efficient at digestion and even develop structural changes of the gastrointestinal tract that differ from those of creatures that harbour plentiful microorganisms. The microorganisms that inhabit our bodies are not only helpful; they are essential for health.

The bacteria that share in this symbiotic relationship with our bodies today are not the same as those carried by our ancestors. Human microorganisms underwent a shift 10,000 years ago when we began to consume the seeds of grasses. DNA analyses of dental plaque from ancient human teeth demonstrate that the oral flora of primitive non-grain-consuming humans was different from that of later grain-consuming humans. Alan Cooper, PhD, of the University of Adelaide Centre for Ancient DNA, and Keith Dobney, PhD, of the University of Aberdeen, analysed bacterial DNA from the teeth of hunter-gatherers who predated grains. They then compared it with that of early grain-adopting humans and later Neolithic, Bronze Age and medieval populations – periods when agriculture flourished. Pre-grain hunter-gatherers demonstrated a wide diversity of oral bacterial species, dominated by species unassociated with dental decay. Grain-consuming humans, in contrast, demonstrated reduced diversity, with what the researchers called a ‘more disease causing configuration’, a pattern that worsened the longer humans consumed grains.17 Mouth bacteria underwent another substantial shift 150 years ago during the Industrial Revolution, with the proliferation of even greater disease-causing species, such as Streptococcus mutans, coinciding perfectly with the mechanical milling of flours. Disease-causing species of oral flora are now ubiquitous and dominate the mouths of modern humans, sustained by modern consumption of grains and sugars.18 Dr Dobney comments: ‘Over the past few hundred years, our mouths have clearly become a substantially less diverse ecosystem, reducing our resilience to invasions by disease-causing bacteria.’19

This study rounds out what anthropologists have been telling us for years: when humans first incorporated grains into our diets, we experienced an explosion of tooth decay, tooth loss and tooth abscesses.20 We now know that grains, from einkorn and barley to maize and millet, were responsible for this marked shift in dental health, because they caused disturbances in oral microorganisms.

Insights into oral flora do not necessarily tell us what happened to bowel flora, though there is some overlap. Even though we all begin our lives with sterile gastrointestinal tracts ripe to be populated with organisms provided at birth from the vaginal canals of our mothers, many events occur during our development that lead to divergences between the organisms in our mouths and those in our bowels – such as the appearance of teeth, stomach acidification, the hormonal surge of puberty and antibiotics. Nonetheless, we can still take some lessons about human diet and bowel flora by studying . . .

The Science of Scatology

In addition to knowing that the oral flora of humans changed once we chose to consume grains, we also know that primitive humans had different bowel flora than modern humans. The ancient remains of human faeces, or coprolites, have been recovered from caves and other locations where humans congregated, ate, slept, died and, of course, moved their bowels.

Though we have to make allowances for the inevitable degeneration of faecal material over time, we can make observations on the varieties of bacterial species present in coprolites, and thereby in primitive human intestinal tracts. We know, for instance, that bacteria of the genus Treponema, important for the digestion of fibrous foods and for their anti-inflammatory effects, are widely present in coprolites of pre-grain cultures but are nearly absent from modern humans.21

These observations are important because we know that abnormal conditions of the gastrointestinal tract, such as irritable bowel syndrome, peptic ulcers and ulcerative colitis, are associated with changes in bowel flora composition.22 We may uncover a connection between these changes in flora and autoimmune diseases, weight control, cancer and other conditions.

We don’t know how many of these changes are due to diet and how many are due to the diseases themselves, but we do know with certainty that the composition of human oral and bowel flora underwent changes over time. And the facts are clear: when humans began to consume the seeds of grasses, the microorganisms cohabiting our bodies changed, and they changed in ways that affect our health.

Let’s now discuss each non-wheat grain individually and explain why, like wheat, they do your health no favours.

Maybe We’ll Chew a Cud: Adaptations to Consuming the Seeds of Grasses

It would be wrong to argue that no human adaptations have evolved over the several thousand years we’ve consumed the seeds of grasses. There have indeed been several changes in the human genetic code that developed in grain-consuming societies and that are therefore notably absent in non-agricultural native North and South American, South Pacific and Australian populations.

• Genes for increased expression of the salivary enzyme amylase, determined by the AMY1 gene, allow increased digestion of the amylopectin starches of grains.23


Age limit:
0+
Publication date on Litres:
27 December 2018
Volume:
587 pp., 12 illustrations
ISBN:
9780008145880
Copyright:
HarperCollins