Cradle to Cradle

  This and other advances made possible the mass production of the universal car, the Model T, from a centralized location, where many vehicles were assembled at once. Increasing efficiency pushed costs of the Model T down (from $850 in 1908 to $290 in 1925), and sales skyrocketed. By 1911, before the introduction of the assembly line, sales of the Model T had totaled 39,640. By 1927, total sales reached fifteen million.

  The advantages of standardized, centralized production were manifold. Obviously, it could bring greater, quicker affluence to industrialists. On another front, manufacturing was viewed as what Franklin D. Roosevelt called “the arsenal of democracy”: its productive capacity was so huge that it could (as in the two world wars) mount an undeniably potent response to war conditions. Mass production had another democratizing aspect: as the Model T demonstrated, when the price of a previously unattainable item or service plummeted, more people had access to it. New work opportunities in factories improved standards of living, as did wage increases. Ford himself assisted in this shift. In 1914, when the prevailing wage for factory workers was $2.34 a day, he hiked it to $5, pointing out that cars cannot buy cars. (He also reduced the workday from nine hours to eight.) In one fell swoop, he actually created his own market, and raised the bar for the entire world of industry.

  Viewed from a design perspective, the Model T epitomized the general goal of the first industrialists: to make a product that was desirable, affordable, and operable by anyone, just about anywhere; that lasted a certain amount of time (until it was time to buy a new one); and that could be produced cheaply and quickly. Along these lines, technical developments centered on increasing “power, accuracy, economy, system, continuity, speed,” to use the Ford manufacturing checklist for mass production.

  For obvious reasons, the design goals of early industrialists were quite specific, limited to the practical, profitable, efficient, and linear. Many industrialists, designers, and engineers did not see their designs as part of a larger system, outside of an economic one. But they did share some general assumptions about the world.

  “Those Essences Unchanged by Man”

  Early industries relied on a seemingly endless supply of natural “capital.” Ore, timber, water, grain, cattle, coal, land—these were the raw materials for the production systems that made goods for the masses, and they still are today. Ford’s River Rouge plant epitomized the flow of production on a massive scale: huge quantities of iron, coal, sand, and other raw materials entered one side of the facility and, once inside, were transformed into new cars. Industries fattened as they transformed resources into products. The prairies were overtaken for agriculture, and the great forests were cut down for wood and fuel. Factories situated themselves near natural resources for easy access (today a prominent window company is located in a place that was originally surrounded by giant pines, used for the window frames) and beside bodies of water, which they used both for manufacturing processes and to dispose of wastes.

  In the nineteenth century, when these practices began, the subtle qualities of the environment were not a widespread concern. Resources seemed immeasurably vast. Nature itself was perceived as a “mother earth” who, perpetually regenerative, would absorb all things and continue to grow. Even Ralph Waldo Emerson, a prescient philosopher and poet with a careful eye for nature, reflected a common belief when, in the early 1830s, he described nature as “essences unchanged by man; space, the air, the river, the leaf.” Many people believed there would always be an expanse that remained unspoiled and innocent. The popular fiction of Rudyard Kipling and others evoked wild parts of the world that still existed and, it seemed, always would.

  At the same time, the Western view saw nature as a dangerous, brutish force to be civilized and subdued. Humans perceived natural forces as hostile, so they attacked back to exert control. In the United States, taming the frontier took on the power of a defining myth, and “conquering” wild, natural places was recognized as a cultural—even spiritual—imperative.

  Today our understanding of nature has dramatically changed. New studies indicate that the oceans, the air, the mountains, and the plants and animals that inhabit them are more vulnerable than early innovators ever imagined. But modern industries still operate according to paradigms that developed when humans had a very different sense of the world. Neither the health of natural systems nor an awareness of their delicacy, complexity, and interconnectedness has been part of the industrial design agenda. At its deepest foundation, the industrial infrastructure we have today is linear: it is focused on making a product and getting it to a customer quickly and cheaply without considering much else.

  To be sure, the Industrial Revolution brought a number of positive social changes. With higher standards of living, life expectancy greatly increased. Medical care and education greatly improved and became more widely available. Electricity, telecommunications, and other advances raised comfort and convenience to a new level. Technological advances brought the so-called developing nations enormous benefits, including increased productivity of agricultural land and vastly increased harvests and food storage for growing populations.

  But there were fundamental flaws in the Industrial Revolution’s design, and they resulted in some crucial omissions. Devastating consequences have been handed down to us, along with the dominant assumptions of the era in which the transformation took shape.

  From Cradle to Grave

  Imagine what you would come upon today at a typical landfill: old furniture, upholstery, carpets, televisions, clothing, shoes, telephones, computers, complex products, and plastic packaging, as well as organic materials like diapers, paper, wood, and food wastes. Most of these products were made from valuable materials that required effort and expense to extract and make, billions of dollars’ worth of material assets. The biodegradable materials such as food matter and paper actually have value too—they could decompose and return biological nutrients to the soil. Unfortunately, all of these things are heaped in a landfill, where their value is wasted. They are the ultimate products of an industrial system that is designed on a linear, one-way cradle-to-grave model. Resources are extracted, shaped into products, sold, and eventually disposed of in a “grave” of some kind, usually a landfill or incinerator. You are probably familiar with the end of this process because you, the customer, are responsible for dealing with its detritus. Think about it: you may be referred to as a consumer, but there is very little that you actually consume—some food, some liquids. Everything else is designed for you to throw away when you are finished with it. But where is “away”? Of course, “away” does not really exist. “Away” has gone away.

  Cradle-to-grave designs dominate modern manufacturing. According to some accounts, more than 90 percent of materials extracted to make durable goods in the United States become waste almost immediately. Sometimes the product itself scarcely lasts longer. It is often cheaper to buy a new version of even the most expensive appliance than to track down someone to repair the original item. In fact, many products are designed with “built-in obsolescence,” to last only for a certain period of time, to allow—to encourage—the customer to get rid of the thing and buy a new model. Also, what most people see in their garbage cans is just the tip of a material iceberg; the product itself contains on average only 5 percent of the raw materials involved in the process of making and delivering it.

  One Size Fits All

  Because the cradle-to-grave model underlying the design assumptions of the Industrial Revolution was not called into question, even movements that were formed ostensibly in opposition to that era manifested its flaws. One example has been the push to achieve universal design solutions, which emerged as a leading design strategy in the last century. In the field of architecture, this strategy took the form of the International Style movement, advanced during the early decades of the twentieth century by figures such as Ludwig Mies van der Rohe, Walter Gropius, and Le Corbusier, who were reacting against Victorian-era styles. (Gothic cathedrals were still being proposed and built.) Their goals were social as well as aesthetic. They wanted to globally replace unsanitary and inequitable housing—fancy, ornate places for the rich; ugly, unhealthy places for the poor—with clean, minimalist, affordable buildings unencumbered by distinctions of wealth or class. Large sheets of glass, steel, and concrete, and cheap transportation powered by fossil fuels, gave engineers and architects the tools for realizing this style anywhere in the world.

  Today the International Style has evolved into something less ambitious: a bland, uniform structure isolated from the particulars of place—from local culture, nature, energy, and material flows. Such buildings reflect little if any of a region’s distinctness or style. They often stand out like sore thumbs from the surrounding landscape, if they leave any of it intact around their “office parks” of asphalt and concrete. The interiors are equally uninspiring. With their sealed windows, constantly humming air conditioners, heating systems, lack of daylight and fresh air, and uniform fluorescent lighting, they might as well have been designed to house machines, not humans.

  The originators of the International Style intended to convey hope in the “brotherhood” of humankind. Those who use the style today do so because it is easy and cheap and makes architecture uniform in many settings. Buildings can look and work the same anywhere, in Reykjavík or Rangoon.

  In product design, a classic example of the universal design solution is mass-produced detergent. Major soap manufacturers design one detergent for all parts of the United States or Europe, even though water qualities and community needs differ. For example, customers in places with soft water, like the Northwest, need only small amounts of detergent. Those in places with hard water, like the Southwest, need more. But detergents are designed so they will lather up, remove dirt, and kill germs efficiently the same way anywhere in the world—in hard, soft, urban, or spring water, in water that flows into fish-filled streams and water channeled to sewage treatment plants. Manufacturers just add more chemical force to wipe out the conditions of circumstance. Imagine the strength a detergent must have to strip day-old grease from a pan. Now imagine what happens when that detergent comes into contact with the slippery skin of a fish or the waxy coating of a plant. Treated and untreated effluents as well as runoff are released into lakes, rivers, and oceans. Combinations of chemicals, from household detergents, cleansers, and medicines along with industrial wastes, end up in sewage effluents, where they have been shown to harm aquatic life, in some cases causing mutations and infertility.

  To achieve their universal design solutions, manufacturers design for a worst-case scenario; they design a product for the worst possible circumstance, so that it will always operate with the same efficacy. This aim guarantees the largest possible market for a product. It also reveals human industry’s peculiar relationship to the natural world, since designing for the worst case at all times reflects the assumption that nature is the enemy.

  Brute Force

  If the first Industrial Revolution had a motto, we like to joke, it would be “If brute force doesn’t work, you’re not using enough of it.” The attempt to impose universal design solutions on an infinite number of local conditions and customs is one manifestation of this principle and its underlying assumption, that nature should be overwhelmed; so is the application of the chemical brute force and fossil fuel energy necessary to make such solutions “fit.”

  All of nature’s industry relies on energy from the sun, which can be viewed as a form of current, constantly renewing income. Humans, by contrast, extract and burn fossil fuels such as coal and petrochemicals that have been deposited deep below the Earth’s surface, supplementing them with energy produced through waste-incineration processes and nuclear reactors that create additional problems. They do this with little or no attention to harnessing and maximizing local natural energy flows. The standard operating instruction seems to be “If too hot or too cold, just add more fossil fuels.”

  You are probably familiar with the threat of global warming brought about by the buildup of heat-trapping gases (such as carbon dioxide) in the atmosphere due to human activities. Increasing global temperatures result in global climate change and shifts of existing climates. Most models predict more severe weather: hotter hots, cooler colds, and more intense storms, as global thermal contrasts grow more extreme. A warmer atmosphere draws more water from oceans, resulting in bigger, wetter, more frequent storms, rises in sea level, shifts in seasons, and a chain of other climatic events.

  The reality of global warming has gained currency not only among environmentalists but among industry leaders. But global warming is not the sole reason to rethink our reliance on the “brute force” approach to energy. Incinerating fossil fuels releases particulates—microscopic particles of soot—into the environment, where they are known to cause respiratory and other health problems. Regulations for airborne pollutants known to threaten health are growing more severe. As mounting research into the health threats of airborne toxins from incinerating fossil fuels drives new regulations, industries invested solely in continuing the current system will be at a serious disadvantage.

  Even beyond these important issues, brute force energy doesn’t make good sense as a dominant strategy over the long term. You wouldn’t want to depend on savings for all of your daily expenditures, so why rely on savings to meet all of humanity’s energy needs? Clearly, over the years petrochemicals will become harder (and more expensive) to get, and drilling in pristine places for a few million more drums of oil isn’t going to solve that problem. In a sense, finite sources of energy, such as petrochemicals derived from fossil fuels, can be seen as a nest egg, something to be preserved for emergencies, then used sparingly—in certain medical situations, for example. For the majority of our simple energy needs, humans could be accruing a great deal of current solar income, of which there is plenty: thousands of times the amount of energy needed to fuel human activities hits the surface of the planet every day in the form of sunlight.

  A Culture of Monoculture

  Under the existing paradigm of manufacturing and development, diversity—an integral element of the natural world—is typically treated as a hostile force and a threat to design goals. Brute force and universal design approaches to typical development tend to overwhelm (and ignore) natural and cultural diversity, resulting in less variety and greater homogeneity.

  Consider the process of building a typical universal house. First, builders scrape away everything on the site until they reach a bed of clay or undisturbed soil. Several machines then come in and shape the clay to a level surface. Trees are felled, natural flora and fauna are destroyed or frightened away, and the generic mini McMansion or modular home rises with little regard for the natural environment around it—the ways the sun might come in to heat the house during the winter, the trees that might protect it from wind, heat, and cold, and how soil and water health can be preserved now and in the future. A two-inch carpet of a foreign species of grass is placed over the rest of the lot.

  The average lawn is an interesting beast: people plant it, then douse it with artificial fertilizers and dangerous pesticides to make it grow and to keep it uniform—all so that they can hack and mow what they encouraged to grow. And woe to the small yellow flower that rears its head!

  Rather than being designed around a natural and cultural landscape, most modern urban areas simply grow, as has often been said, like a cancer, spreading more and more of themselves, eradicating the living environment in the process, blanketing the natural landscape with layers of asphalt and concrete.

  Conventional agriculture tends to work along these same lines. The goal of a midwestern commercial corn operation is to produce as much corn as possible with the least amount of trouble, time, and expense—the Industrial Revolution’s first design goal of maximum efficiency. Most conventional operations today focus on highly specialized, hybridized, and perhaps genetically modified species of corn. They develop a monocultural landscape that appears to support only one particular crop that’s likely not even a true species but some over-hybridized cultivar. Planters remove other species of plant life using tillage, which leads to massive soil erosion from wind and water, or no-till farming, which requires massive applications of herbicide. Ancient strains of corn are lost because their output does not meet the demands of modern commerce.

  On the surface, these strategies seem reasonable to modern industry and even to “consumers,” but they harbor both underlying and overlying problems. Elements that are removed from the ecosystem to make the operation yield more grain more quickly (that is, to make it more efficient) would otherwise actually provide benefits to farming. The plants removed by tillage, for example, could have helped to prevent erosion and flooding and to stabilize and rebuild soil. They would have provided habitat for insects and birds, some of them natural enemies of crop pests. Now, as pests grow resistant to pesticide, their numbers increase because their natural enemies have been wiped out.

  Pesticides, as typically designed, are a perennial cost both to farmers and to the environment and represent a less than mindful use of chemical brute force. Although chemical companies warn farmers to be careful with pesticides, they benefit when more of them are sold. In other words, the companies are unintentionally invested in profligacy with—even the mishandling of—their products, which can result in contamination of the soil, water, and air.