Over time cities all over the world built up an infrastructure for transferring nutrients from place to place. Cultures came into conflict with one another over resources, land, and food. In the nineteenth and early twentieth centuries, synthetic fertilizers were developed, laying the groundwork for the massively intensified production of industrialized agriculture. Soils now yield more crops than they naturally could, but with some severe effects: they are eroding at an unprecedented rate, and they are drained of nutrient-rich humus. Very few small farmers return local biological wastes to the soil as a primary source of nutrients any longer, and industrialized farming almost never does. Moreover, the synthetic fertilizers were often heavily contaminated with cadmium and radioactive elements from phosphate rocks, a hazard of which farmers and residents were generally unaware.

  Yet certain traditional cultures have well understood the value of nutrient flows. For centuries in Egypt, the Nile River overflowed its banks each year, leaving a rich layer of silt across the valleys when the waters withdrew. Beginning about 3200 B.C., farmers in Egypt constructed a series of irrigation ditches that channeled the Nile’s fertile waters to their fields. They also learned to store food surpluses for periods of drought. The Egyptians maximized these nutrient flows for centuries without overtaxing them. Gradually, as British and French engineers entered the country during the nineteenth century, Egypt’s agriculture shifted to Western methods. Since the completion of the Aswan High Dam in 1971, the silt that enriched Egypt for centuries now accumulates behind concrete, and people in Egypt build housing on once fertile areas originally reserved for crops. Houses and roads compete dramatically with agriculture for space. Egypt produces less than 50 percent of its own food and depends on imports from Europe and the United States.

  Over thousands of years, the Chinese perfected a system for fertilizing rice paddies with biological wastes, including sewage, while preventing pathogens from contaminating the food chain. Even today some rural households expect dinner guests to “return” nutrients in this way before they leave, and it is a common practice for farmers to pay households to fill boxes with their bodily wastes. But today the Chinese, too, have turned to systems based on the Western model. And, like Egypt, they are growing more dependent on imported foods.

  Humans are the only species that takes from the soil vast quantities of nutrients needed for biological processes but rarely puts them back in a usable form. Our systems are no longer designed to return nutrients in this way, except on small, local levels. Harvesting methods like clear-cutting precipitate soil erosion, and chemical processes used in both agriculture and manufacture often lead to salinization and acidification, helping to deplete more than twenty times as much soil each year as nature creates. It can take approximately five hundred years for soil to build up an inch of its rich layers of microorganisms and nutrient flows, and right now we are losing five thousand times more soil than is being made.

  In preindustrial culture, people did consume things. Most products would safely biodegrade once they were thrown away, buried, or burned. Metals were the exception: these were seen as highly valuable and were melted down and reused. (They were actually what we call early technical nutrients.) But as industrialization advanced, the consumption mode persisted, even though most manufactured items could no longer actually be consumed. In times of scarcity, a recognition of the value of technical materials would flare up; people who grew up during the Great Depression, for example, were careful about reusing jars, jugs, and aluminum foil, and during World War II, people saved rubber bands, aluminum foil, steel, and other materials to feed industrial needs. But as cheaper materials and new synthetics flooded the postwar market, it became less expensive for industries to make a new aluminum, plastic, or glass bottle or package at a central plant and ship it out than to build up local infrastructures for collecting, transporting, cleaning, and processing things for reuse. Similarly, in the early decades of industrialization, people might pass down, repair, or sell old service products like ovens, refrigerators, and phones to junk dealers. Today most so-called durables are tossed. (Who on Earth would repair a cheap toaster today? It is much easier to buy a new one than it is to send the parts back to the manufacturer or track down someone to repair it locally.) Throwaway products have become the norm.

  There is no way, for example, that you are going to consume your car; and although it is made of valuable technical materials, you can’t do anything with them once you are finished with it (unless you are a junk artist). As we have mentioned, these materials are lost or degraded even in “recycling” because cars are not designed from the beginning for effective, optimal recycling as technical nutrients. Indeed, industries design products with built-in obsolescence—that is, to last until approximately the time customers typically want to replace them. Even things with a real consumable potential, such as packaging materials, are often deliberately designed not to break down under natural conditions. In fact, packaging may last far longer than the product it protected. In places where resources are hard to get, people still creatively reuse materials to make new products (such as using old tire rubber to make sandals) and even energy (burning synthetic materials for fuel). Such creativity is natural and adaptive and can be a vital part of material cycles. But as long as these uses are ignored by current industrial design and manufacturing, which typically take no account of a product’s further life, such reuse will often be unsafe, even lethal.

  Monstrous Hybrids

  Mountains of waste rising in landfills are a growing concern, but the quantity of these wastes—the space they take up—is not the major problem of cradle-to-grave designs. Of greater concern are the nutrients—valuable “food” for both industry and nature—that are contaminated, wasted, or lost. They are lost not only for lack of adequate systems of retrieval; they are lost also because many products are what we jokingly refer to as “Frankenstein products” or (with apologies to Jane Jacobs) “monstrous hybrids”—mixtures of materials both technical and biological, neither of which can be salvaged after their current lives.

  A conventional leather shoe is a monstrous hybrid. At one time, shoe leather was tanned with vegetable chemicals, which were relatively safe, so the wastes from shoe manufacture posed no real problem. The shoe could biodegrade after its useful life or be safely burned. But vegetable tanning required that trees be harvested for their tannins. As a result, shoes took a long time to make, and they were expensive. In the past forty years, vegetable tanning has been replaced with chromium tanning, which is faster and cheaper. But chromium is rare and valuable for industries, and in some forms it is carcinogenic. Today leather is often tanned in developing countries where few if any precautions are taken to protect people and ecosystems from chromium exposure; manufacturing wastes may be dumped into nearby bodies of water or incinerated, either of which distributes toxins (often disproportionately in low-income areas). Conventional rubber shoe soles, moreover, usually contain lead and plastics. As the shoe is worn, particles of it degrade into the atmosphere and soil. It cannot be safely consumed, either by you or by the environment. After use, its valuable materials, both biological and technical, are usually lost in a landfill.

  A Confusion of Flows

  There may be no more potent image of disagreeable waste than sewage. It is a kind of waste people are happy to get “away” from. Before modern sewage systems, people in cities would dump their wastes outside (which might mean out the window), bury them, slop them into cesspools at the bottom of a house, or dispose of them in bodies of water, sometimes upstream from drinking sources. It wasn’t until the late nineteenth century that people began to make the connection between sanitation and public health, which provided the impetus for more sophisticated sewage treatment. Engineers saw pipes taking storm water to rivers and realized this would be a convenient way to remove waterborne sewage. But that didn’t end the problem. From time to time the disposal of raw sewage in rivers close to home became unbearable; during the Great Stink of London in 1858, for example, the reek of raw sewage in the nearby Thames disrupted sittings of the House of Commons. Eventually, sewage treatment plants were built to treat effluents and sized to accommodate waterborne sewage combined with added storm water during major rains.

  The original idea was to take relatively active biologically based sewage, principally from humans (urine and excrement, the kind of waste that has interacted with the natural world for millennia), and render it harmless. Sewage treatment was a process of microbial and bacterial digestion. The solids were removed as sludge, and the remaining liquid, which had brought the sewage to treatment in the first place, could be released essentially as water. That was the original strategy. But once the volume of sewage overwhelmed the waterways into which it flowed, harsh chemical treatments like chlorination were added to manage the process. At the same time, new products were being marketed for household use that were never designed with sewage treatment plants (or aquatic ecosystems) in mind. In addition to biological wastes, people began to pour all kinds of things down the drain: cans of paint, harsh chemicals to unclog pipes, bleach, paint thinners, nail-polish removers. And the waste itself now carried antibiotics and even estrogens from birth control pills. Add the various industrial wastes, cleaners, chemicals, and other substances that join household wastes, and you have highly complex mixtures of chemical and biological substances that still go by the name of sewage. Antimicrobial products—like many soaps currently marketed for bathroom use—may sound desirable, but they are a problematic addition to a system that relies on microbes to be effective. Combine them with antibiotics and other antibacterial ingredients, and you may even set in motion a process that creates hyperresistant superbacteria.

  Recent studies have found hormones, endocrine disrupters, and other dangerous compounds in bodies of water that receive “treated” sewage effluents. These substances can contaminate natural systems and drinking-water supplies and, as we have noted, can lead to mutations of aquatic and animal life. Nor have the sewage pipes themselves been designed for biological systems; they contain materials and coatings that could degrade and contaminate effluents. As a result, even efforts to reuse sewage sludge for fertilizer have been hampered by farmers’ concern over toxification of the soil.

  If we are going to design systems of effluents that go back into the environment, then perhaps we ought to move back upstream and think of all the things that are designed to go into such systems as part of nutrient flows. For example, the mineral phosphate is used as a fertilizer for crops around the world. Typical fertilizer uses phosphate that is mined from rock, however, and extracting it is extremely destructive to the environment. But phosphate also occurs naturally in sewage sludge and other organic wastes. In fact, in European sewage sludge, which is often landfilled, phosphate occurs in higher concentrations than it does in some phosphate rock in China, where much of it is mined to devastating effect on local ecosystems. What if we could design a system that safely captured the phosphate already in circulation, rather than discarding it as sludge?

  From Cradle-to-Grave to Cradle-to-Cradle

  People involved in industry, design, environmentalism, and related fields often refer to a product’s “life cycle.” Of course, very few products are actually living, but in a sense we project our vitality—and our mortality—onto them. They are something like family members to us. We want them to live with us, to belong to us. In Western society, people have graves, and so do products. We enjoy the idea of ourselves as powerful, unique individuals; and we like to buy things that are brand-new, made of materials that are “virgin.” Opening a new product is a kind of metaphorical defloration: “This virgin product is mine, for the very first time. When I am finished with it (special, unique person that I am), everyone is. It is history.” Industries design and plan according to this mind-set.

  We recognize and understand the value of feeling special, even unique. But with materials, it makes sense to celebrate the sameness and commonality that permit us to enjoy them—in special, even unique, products—more than once. What would have happened, we sometimes wonder, if the Industrial Revolution had taken place in societies that emphasize the community over the individual, and where people believed not in a cradle-to-grave life cycle but in reincarnation?

  A World of Two Metabolisms

  The overarching design framework we exist within has two essential elements: mass (the Earth) and energy (the sun). Nothing goes in or out of the planetary system except for heat and the occasional meteorite. Otherwise, for our practical purposes, the system is closed, and its basic elements are valuable and finite. Whatever is naturally here is all we have. Whatever humans make does not go “away.”

  If our systems contaminate Earth’s biological mass and continue to throw away technical materials (such as metals) or render them useless, we will indeed live in a world of limits, where production and consumption are restrained, and the Earth will literally become a grave.

  If humans are truly going to prosper, we will have to learn to imitate nature’s highly effective cradle-to-cradle system of nutrient flow and metabolism, in which the very concept of waste does not exist. To eliminate the concept of waste means to design things—products, packaging, and systems—from the very beginning on the understanding that waste does not exist. It means that the valuable nutrients contained in the materials shape and determine the design: form follows evolution, not just function. We think this is a more robust prospect than the current way of making things.

  As we have indicated, there are two discrete metabolisms on the planet. The first is the biological metabolism, or the biosphere—the cycles of nature. The second is the technical metabolism, or the technosphere—the cycles of industry, including the harvesting of technical materials from natural places. With the right design, all of the products and materials manufactured by industry will safely feed these two metabolisms, providing nourishment for something new.

  Products can be composed either of materials that biodegrade and become food for biological cycles, or of technical materials that stay in closed-loop technical cycles, in which they continually circulate as valuable nutrients for industry. In order for these two metabolisms to remain healthy, valuable, and successful, great care must be taken to avoid contaminating one with the other. Things that go into the organic metabolism must not contain mutagens, carcinogens, persistent toxins, or other substances that accumulate in natural systems to damaging effect. (Some materials that would damage the biological metabolism, however, could be safely handled by the technical metabolism.) By the same token, biological nutrients are not designed to be fed into the technical metabolism, where they would not only be lost to the biosphere but would weaken the quality of technical materials or make their retrieval and reuse more complicated.

  The Biological Metabolism

  A biological nutrient is a material or product that is designed to return to the biological cycle—it is literally consumed by microorganisms in the soil and by other animals. Most packaging (which makes up about 50 percent of the volume of the municipal solid waste stream) can be designed as biological nutrients, what we call products of consumption. The idea is to compose these products of materials that can be tossed on the ground or compost heap to safely biodegrade after use—literally to be consumed. There is no need for shampoo bottles, toothpaste tubes, yogurt and ice-cream cartons, juice containers, and other packaging to last decades (or even centuries) longer than what came inside them. Why should individuals and communities be burdened with downcycling or landfilling such material? Worry-free packaging could safely decompose, or be gathered and used as fertilizer, bringing nutrients back to the soil. Shoe soles could degrade to enrich the environment. Soaps and other liquid cleaning products could be designed as biological nutrients as well; that way, when they wash down the drain, pass through a wetland, and end up in a lake or river, they support the balance of the ecosystem.

  In the early 1990s the two of us were asked by DesignTex, a division of Steelcase, to conceive and create a compostable upholstery fabric, working with the Swiss textile mill Röhner. We were asked to focus on creating an aesthetically unique fabric that was also environmentally intelligent. DesignTex first proposed that we consider cotton combined with PET (polyethylene terephthalate) fibers from recycled soda bottles. What could be better for the environment, they thought, than a product that combined a “natural” material with a “recycled” one? Such hybrid material had the additional apparent advantages of being readily available, market-tested, durable, and cheap.

  But when we looked carefully at the potential long-term design legacy, we discovered some disturbing facts. First, as we have mentioned, upholstery abrades during normal use, and so our design had to allow for the possibility that particles might be inhaled or swallowed. PET is coated with synthetic dyes and chemicals and contains other questionable substances—not exactly what you want to breathe or eat. Furthermore, the fabric would not be able to continue after its useful life as either a technical or a biological nutrient. The PET (from the plastic bottles) would not go back to the soil safely, and the cotton could not be circulated in industrial cycles. The combination would be yet another monstrous hybrid, adding junk to a landfill, and it might also be dangerous. This was not a product worth making.

  We made clear to our client our intention to create a product that would enter either the biological or the technical metabolism, and the challenge crystallized for both of us. The team decided to design a fabric that would be safe enough to eat: it would not harm people who breathed it in, and it would not harm natural systems after its disposal. In fact, as a biological nutrient, it would nourish nature.

  The textile mill that was chosen to produce the fabric was quite clean by accepted environmental standards, one of the best in Europe, yet it had an interesting dilemma. Although the mill’s director, Albin Kaelin, had been diligent about reducing levels of dangerous emissions, government regulators had recently defined the mill’s fabric trimmings as hazardous waste. The director had been told that he could no longer bury the trimmings in Switzerland or burn them in Swiss hazardous-waste incinerators but had to export them to Spain for disposal. (Note the paradox here: the trimmings of a fabric cannot be buried or disposed of without expensive precautions, and may even have to be exported “safely” to another location, yet the material itself can still be sold as safe for installation in an office or home.) We hoped for a different fate for our trimmings: to provide mulch for the local garden club, with the help of sun, water, and hungry microorganisms.