Vitamin C: The Real Story

  The Inuit ate a high-protein, high-fat diet. In an environment of stark landscapes shaped by glacial temperatures, they traditionally lived on a diet with little plant food and no farm or dairy products, subsisting mostly on hunting and fishing. Coastal Inuit exploited the sea, while those further inland took advantage of caribou, including the predigested vegetation in the animals’ stomachs, consisting of mosses, lichens, and the available tundra plants. Yet the Inuit were not subject to high levels of heart disease, despite this diet high in saturated fat and low in fruits and vegetables. Similarly, people on the Atkins diet have not shown the suggested increased risk of heart disease. These diets are hardly balanced in the conventional sense and do not include the mix of grains, fruits, vegetables, meat, eggs, and dairy in the government-recommended food pyramids. Conventionally, such diets should not be adequate. While the Inuit needed only a few milligrams of vitamin C to prevent acute scurvy, people are expected to succumb rapidly to ill health on a diet of little but fat and animal protein—at least that’s what the so-called experts have been telling us.

  The Inuit retain reasonable health on an apparently poor diet.8 The Inuit and Atkins diets have something in common that provides adequacy while being less than optimal for health. The Inuit diet modifies the antioxidant profile and may reduce free radical damage and the need for high levels of vitamin C.9 Both diets contain a relatively high ratio of vitamin C to sugar: while the intake of vitamin C is low in the Inuit diet, the drop in carbohydrates is far larger. A typical Western diet may include 500 grams of carbohydrate each day but less than 50 milligrams (mg) of vitamin C. Importantly, sugar inhibits the absorption of vitamin C into cells. Although the Inuit have a lower intake of vitamin C, they use the molecule more efficiently, as the competition with sugars, particularly glucose, is correspondingly lower. The low-carbohydrate Inuit diet thus partly compensates for a low vitamin C intake.

  The principal benefit of fruits and vegetables is an increased intake of antioxidants, particularly vitamin C. This book explains why eating more vegetables, while good advice, will not provide the benefits of vitamin C supplementation. Some doctors claim that high-dose vitamin C acts as a powerful anti-infective agent, potentially to eradicate heart disease and to prevent or treat cancer. No one claims that eating a few additional vegetables will provide the massive benefits ascribed to vitamin C.

  Straight from the Horse’s Mouth

  The controversy over vitamin C became well-known when Nobel Prize–winning chemist Linus Pauling, Ph.D., advocated megadoses to prevent and treat diseases, such as the common cold. Dr. Pauling suggested that people need doses 100 times greater than those recommended by doctors and other nutritional experts. The medical profession’s response was a devastating attack on Dr. Pauling’s scientific competence, some even labeling him a “quack.” After Dr. Pauling’s death in 1994, the medical establishment alleged they had shown that he was wrong and that people needed only small amounts of vitamin C. If people consumed more, they suggested, it would not be absorbed and so could not have the health effects claimed by Dr. Pauling and others. As we shall see, recent scientific evidence does not support this position.

  Historically, vitamins were considered micronutrients, essential to good health in small amounts. Without them, a person could become sick or even die. A micronutrient is a substance, such as a vitamin or mineral, needed in minute amounts for the proper growth and metabolism of a living organism. By this definition, larger amounts of micronutrients are unnecessary and might even be toxic.

  Vitamin C was named before the substance that prevents its associated deficiency disease, scurvy, was discovered and isolated. This was premature, as its properties could not be determined before its chemical identity was established. The designation “vitamin C” presupposes that only small amounts of the substance are needed. When Albert Szent-Györgyi, M.D., Ph.D., first isolated ascorbic acid and identified it as vitamin C, in the period from 1927 to 1933, he was aware that this preconception could prejudice subsequent scientific investigation. From the start, Dr. Szent-Györgyi suspected that, for optimal health, people might need gram levels of vitamin C.

  As other vitamins were isolated and investigated, the amounts needed to prevent acute disease appeared to be small, so the idea of vitamins as micronutrients became nutritional dogma. Since then, scientific opinion on most vitamins has diverged into two camps. The first has governmental and official support, largely for historical reasons. This official grouping considers that the intake of vitamins should be just enough to prevent acute deficiency symptoms, such as scurvy. According to the conventional view, intakes above this minimum level are considered unnecessary and may have some theoretical dangers. For vitamin C, these dangers are currently unsupported by evidence.

  A second set of scientists and physicians, whom we call the orthomolecular group, takes the view that the evidence is incomplete. Orthomolecular is a word coined by Linus Pauling to describe the use of nutrients and normal (“ortho”) constituents of the body in optimum amounts as the primary treatment. On this view, optimal health may require more than the minimum intake. Scientists in this group consider the evidence on the health effects of vitamin and nutrient intake woefully inadequate—in other words, we do not have the data to determine optimal intakes. If the orthomolecular scientists are correct, optimal nutrition might prevent much chronic human disease.

  Surprisingly, for most vitamins and minerals, the difference between conventional and orthomolecular recommendations is not large. The government Recommended Dietary Allowance (RDA) for vitamin E is 22 International Units (IU) per day, while orthomolecular-oriented physicians typically recommend higher levels, in the range of 100–1,000 IU per day (5–50 times the RDA). By comparison, the corresponding discrepancy for vitamin C is huge. The RDA for vitamin C in the United States is 90 mg per day for an adult male, whereas scientists such as Dr. Pauling have recommended 2–20 grams (2,000–20,000 mg) per day. The difference is even greater for people who are ill. The official position is that levels of vitamin C higher than the 90-mg RDA are not beneficial in illness. However, physicians such as Robert F. Cathcart III, a pioneering vitamin C researcher, have been using doses of up to 200 grams (200,000 mg) per day to treat disease, an intake over 2,000 times the RDA.

  There is a pertinent story attributed to Francis Bacon (1561–1626), a leading figure in natural philosophy who worked in the transitional period between the Renaissance and the early modern era.10 In 1432, some friars had a quarrel about the number of teeth in a horse’s mouth. The argument raged for thirteen days, as the scholars consulted ancient books and manuscripts in an effort to obtain a definitive answer. Then, on the fourteenth day, a young friar asked innocently if he should find a horse and look inside its mouth. With a mighty uproar, the others attacked him and cast him out. Clearly, Satan had tempted the neophyte to declare unholy ways of finding the truth, contrary to the teaching of the fathers!

  Bacon’s story sounds quaint in our current technological age. Unfortunately, the friars’ way of hiding from reality, by stipulating how people should search for the truth, is prevalent in modern medicine. As the story of vitamin C unfolds, this simple nutrient will expose modern medicine as a craft, dominated by institutional authorities, rather than a scientific discipline. For example, clinical trials have used unscientific myths about placebo function to negate nutritional effects. Medical traditionalists have misrepresented low doses of vitamin C as corresponding to the massive doses claimed to be effective. Mainstream medicine sidelines and ignores the clinical observations on high-dose vitamin C to the detriment of people’s health.

  A Matter of Survival

  Although vitamin C is essential to life, most animals do not need to consume it because they manufacture it within their bodies. However, some animals, including humans, have lost the ability to synthesize vitamin C. They have become, in effect, ascorbate mutants, reliant on vitamin C in their diet. Without it, they die—deficiency in humans, apes, and guinea pigs causes a fatal disease, scurvy.

  About 40 million years ago, the ancestors of human beings were small, furry mammals. One such creature lost the gene for an enzyme necessary to synthesize ascorbic acid, perhaps because of a radiation-induced genetic mutation.11 Offspring of this mutant were consequently unable to make vitamin C. Presumably, they were largely vegetarian, consuming a diet rich in ascorbic acid, so loss of the enzyme was not catastrophic.

  Evolutionary fitness is the ability of an organism to leave viable offspring. Surprisingly, loss of the gene for making vitamin C did not have a hugely detrimental effect on the evolutionary fitness and survival of our ancestors. We know this because, otherwise, animal species with this mutation would have died out, and they did not. It is probable that some animals, including humans, gained an evolutionary advantage by losing the gene for vitamin C.

  Humans are not the only creatures that need to consume vitamin C. Others include guinea pigs, apes, some bats, and several bird species. These animals have all evolved successfully, surviving in the struggle for existence for millions of years. If the ability to manufacture vitamin C had been lost only once during evolution, we might dismiss it as an interesting oddity. However, on the tree of life, birds and mammals diverged much earlier than the time our ancestors lost the gene. Birds appear to have originated from reptiles in the Upper Jurassic and Lower Cretaceous periods (about 150 million years ago). The mammalian lineage split from the reptiles much earlier, in the Carboniferous and Permian periods (about 250–350 million years ago). This suggests that birds and mammals lost their genes for making vitamin C separately and independently.

  In humans, lack of vitamin C causes scurvy, which leads to bleeding and bruising throughout the body. Gums swell, teeth fall out, and, within a few months, the sufferer dies a horrible death. On early sea voyages, scurvy killed many sailors. Strangely, some people were more resistant to the disease than others, which might indicate that a few people retain some biochemical ability to make the vitamin or sustain its levels in the body. Fortunately, even a few milligrams of vitamin C each day will prevent acute scurvy. We might wonder why early humans did not die of scurvy and become extinct. However, herbivorous animals, including apes, live largely on a diet of vegetables; their vitamin C intake is high. By studying the diet of the great apes, Linus Pauling estimated that early humans probably had an intake of between 2.5 and 9 grams of vitamin C a day.12 If an animal ate a diet with plenty of vitamin C, loss of the gene to make it would not have caused loss of evolutionary fitness. Consequently, we can reasonably assume that our early ancestors were largely vegetarian.

  Evolutionary success also depends on reproduction. Provided the young could consume enough vitamin C to prevent acute scurvy, absence of the gene might not have lowered the early humans’ evolutionary fitness. There would have to be sufficient vitamin C to prevent disease and maintain fitness levels throughout the period of conceiving and bringing up children.

  In times of plenty, loss of the vitamin C gene may have had only a marginal effect. Indeed, vegetarian animals without the gene might have a slight energetic advantage, as they did not need to manufacture the substance internally. Animals with the gene and mutants that had lost it could have coexisted for long periods in the same population. However, when food supplies became short, those that did not waste vital energy making vitamin C may have had a survival advantage. In the words of Dr. Cathcart, the mutants could “out starve” those with the gene. During periods of severe evolutionary stress, animals without the vitamin C gene could predominate to the point that those with the gene became extinct.

  An Evolutionary Advantage

  Several lines of evidence suggest that the human population has crashed in the past. For many species, evolutionary bottlenecks are surprisingly common because a species exists for only as long as it can compete for its place in the ecosystem. Most species that have existed on earth are already extinct. A typical species has a lifetime of about 10 million years.13 Current evidence suggests humans almost became extinct about 150,000 years ago. Genetic studies suggest that all humans arose from a small population in Africa only 150,000 to 200,000 years ago.14 A creative interpretation of the scientific facts traces all human life back to a single woman, living about 150,000 years ago in East Africa, the area that now encompasses Ethiopia, Kenya, and Tanzania. “Mitochondrial Eve,” as she is known, is the most recent common female, or matrilineal, ancestor of all humans.15

  To understand the significance of Mitochondrial Eve, remember that human cells contain small particles called mitochondria that hold the biochemical machinery to supply us with energy. Mitochondria have their own genetic material (DNA), which is transferred to children by way of the mother’s egg (ovum).16 The male sperm is much smaller than the ovum and does not supply mitochondria to the fetus. Scientists have shown that all humans contain mitochondrial DNA originating from a single individual. This Eve did not live alone but probably dwelt in a small village or community, where her children had some evolutionary advantage over others in the tribe.

  There is a corresponding common male ancestor, called “Y-chromosomal Adam,” who lived 60,000 to 90,000 years ago. Chromosomes are packages of genes that transfer DNA to the cells of the offspring. Male children get a Y chromosome from their father, which pairs up with an X chromosome from the mother, creating the XY pair that determines the male sex. Females get one X chromosome from each parent, forming an XX pair. Scientists have traced mutations in the Y chromosome back in time to identify Y-chromosomal Adam. Unlike the biblical Adam, Y-chromosomal Adam lived many tens of thousands of years after Mitochondrial Eve. Our Adam and Eve should not be considered as scientific facts, but as stories to illustrate one possible interpretation of the available evidence.

  A possible explanation for Y-chromosomal Adam is a super-volcanic event that occurred 70,000 to 75,000 years ago at Lake Toba in Indonesia, which might have devastated the human population.17 Humans could have been reduced to a few thousand breeding pairs, creating a bottleneck in human evolution. The geological event was perhaps thousands of times greater in magnitude than the 1980 eruption of Mount St. Helens and may have lowered the global temperature for several years, potentially triggering an ice age. It is possible that Y-chromosomal Adam was simply the most successful survivor of the Toba super-volcanic catastrophe.

  This narrative illustrates how selection pressures on humans can be severe. If loss of the vitamin C gene provided increased ability to survive periods of starvation, it may even have secured the ultimate survival of the human race. It is possible to explain our Mitochondrial Eve by assuming that a mutation in her mitochondrial DNA gave her a large advantage over other humans. In this case, people with Eve’s mitochondria would have increased in the population and could have eventually replaced all other forms. We could explain Y-chromosomal Adam in a similar way.

  Unlike apes and many other mammals, humans have little genetic diversity; this may have been caused by population bottlenecks.18 Any gene that is represented in only a small number of individuals is at risk of elimination. There have been many times when lack of the gene for vitamin C could have bestowed an evolutionary advantage. Population bottlenecks may have ensured that individuals without the gene came to dominate. We carry the consequences of this evolutionary accident in our genes.

  Cost of the Lost Gene

  Despite this evolutionary benefit, the loss of the gene for vitamin C may have left older people facing severe deficits and disease. Once an animal has reproduced, evolutionary selection is less effective. In modern humans and some animal groups, grandparents may be involved in raising the young, but, in evolutionary terms, this is a secondary factor. In the wild, older animals can be rare and extended family groups are the exception. The loss of the gene for vitamin C could lead to numerous problems, including arthritis, cardiovascular disease, cancer, and decreased immune response. However, death of an animal that has completed its reproductive phase does not prevent its successful offspring from forming the next generation. Provided these chronic diseases occurred later in life, their effect on evolutionary fitness would be small. In evolutionary terms, it does not matter if an old guinea pig suffers, provided it has left a large number of healthy young offspring.

  Thus, human evolution may have provided the ability to survive periods of food shortage, but at the expense of chronic disease. Such disease would only be an issue if the intake of vitamin C in the diet were insufficient for long-term needs. We know little about our mammalian ancestors at the time the vitamin C gene was lost. Forty million years ago is shortly after the dinosaurs became extinct and we have only a sparse record of fossilized bones from that era. More importantly, we have little information about the diet of our ancestors.

  The typical diet of modern humans does not consist predominantly of vitamin C–rich vegetables. Although people live reasonably long lives with this limited intake, they increasingly suffer degenerative diseases and a lower quality of life. This unnecessary suffering might not happen if the lost gene were still present. Our inability to make our own ascorbate means that each newborn baby comes factory-equipped with, if not an inborn vitamin C deficiency, a vitamin C dependency.

  The Health Benefits of Vitamin C

  The genuine vitamin C story has become clearer in recent years, as the claims for unique health benefits of vitamin C have proved consistent with the available evidence. There is little scientific support for the idea that low (RDA-level) doses of ascorbate are optimal for humans. Antioxidants like vitamin C are essential for life because disease processes almost invariably involve free radical attack, which antioxidant defenses can counteract.