Uncle John's Bathroom Reader Golden Plunger Awards



  SEA CHANGE

  Look around any college campus and you’ll see a lot of “peacoats”: short, double-breasted navy blue woolen jackets with slash pockets and large, simple lapels. They’re warm, flatter a variety of figures, easy to get on and off, and long lasting.

  The British Navy originally developed this jacket for its sailors in the early 1700s. The coat’s length allowed sailors to move quickly and easily around decks, and the double-button closure kept the coats from flying open in a gale or while climbing a mast. The pockets made it easy to quickly warm frozen hands, and the lapels could be flipped up for extra warmth. The U.S. Navy officially adopted peacoats in the early 1900s.

  But why are they called “peacoats”? (They’re also sometimes known as “pea jackets” and “reefer coats.”) The name comes from a Dutch–West Frisian word for the heavy woolen cloth used to make them: pij. But have no fear—they’re heavy no longer. The cloth originally weighed 30 ounces; today’s peacoats can be anywhere from 22 to 32 ounces.

  IN THE TRENCHES

  One of the most enduring fashions for both men and women is the trench coat. The traditional trench, a rather severe shade of olive brown called khaki, usually comes to the mid-calf. It has fringed shoulder decorations (called epaulets), a buckled belt with several metal D rings along the bottom, straps at both wrists, and a button-in liner.

  High-end designer Burberry makes coveted trench coats, and with good reason—Thomas Burberry invented the coat. He invented a fabric called gabardine, which is tightly woven wool. In 1901, he submitted a gabardine coat design to the British War Office. The Brits latched onto it immediately: Burberry’s fabric was lighter and more serviceable than the heavy wool typically used for officers’ greatcoats. By World War I, gabardine coats were common among British (and French) officers. The coat’s D rings were handy for hanging canteens, utility knives, and more, and the epaulets allowed everyone to see an officer’s rank while he remained dry and warm. (Too bad for the lower ranks, who weren’t allowed to wear—or even purchase—them.) The coats got their name from their appearance in this war, too: officers wore them in the trenches all over Europe.

  BLENDING IN TO STAND OUT

  Camouflage has been around for millions of years . . . if you count all of the animal species whose coats and coverings blend in with different surroundings. But the idea of soldiers in Western societies blending into their surroundings didn’t take hold until the 19th century. British, French, and even some American armies usually wore brightly colored uniforms, probably to distinguish themselves from their enemies.

  The first move toward making military uniforms less visible came during the mid-1800s when British soldiers in India started dyeing their red coats brown. Other armies picked up the trend, making their uniforms brown, green, khaki, gray, or some combination (grigio-verde, or gray-green, worked well in the Italian Alps). Widespread use of camouflage began during World War I, when the French created a “Camouflage Division” of 20,000 camoufleurs (largely made up of former theater stage designers and artists) whose job was to alter or disguise equipment and locations with painted canvases so the Germans couldn’t find them. (Camoufler means “to disguise” or “to veil.”)

  The camouflage worked, but back then, mass-producing printed fabrics was expensive, so hand-painted camouflage uniforms were at first restricted to snipers and other exposed soldiers. During World War II, when mass-produced printed fabrics were more widely available, armies of several nations began producing battle fatigues in the splotchy pattern we now call camouflage.

  Perhaps because of its connection to battlefield action, wearing camouflage took a long time to catch on among most civilians (though hunters have worn it for years). However, in the past decade “camo” has become a popular fashion style and is available in shades from greens and browns to fluorescent pinks and purples—and in garments from baggy combat-inspired trousers to skintight tank tops and cargo skirts.

  THE CLASSIC RIVALRY AWARD

  Army–Navy Football

  The annual Army–Navy matchup remains one of the few major football games played for the sheer love of the sport . . . or perhaps for sheer love of rivalry. Uncle John thinks both are award worthy.

  BIRTH OF A TRADITION

  The first intercollegiate football game took place in 1869 between Rutgers and Princeton universities, but in 1890 a historic sports rivalry began:

  Army (the United States Military Academy at West Point)

  vs.

  Navy (the United States Naval Academy at Annapolis)

  The matchup now takes place on the first Saturday in December (it used to be played on the Saturday after Thanksgiving). Most of the players aren’t thinking of playing in the NFL, since they have a five-year minimum military commitment after graduation. Instead, they play for the love of the game . . . and to beat Army or Navy, as the case may be.

  THE LINEUP

  On November 29, 1890, two squads of football players faced off on the Great Plain, the field at West Point, New York. The Navy team, which had been playing organized football since 1879, had challenged Army Cadet Dennis Mahan Michie (for whom Army’s Michie Stadium was later named) to the game, and he accepted. West Point didn’t even have a team, so Michie had to organize and coach the players in a matter of weeks. As might be expected, that first meeting was a rout: Navy shut out Army 24–0. The next year, Army came back to beat Navy. And so began a passionate rivalry.

  TIME-OUT

  The football game between America’s two biggest military academies has been played ever since with only four interruptions:

  • 1894–1898: After the 1893 Navy victory, a Navy rear admiral and an Army brigadier general had an argument that nearly ended in a duel. To keep things peaceful, President Grover Cleveland decreed that for a period of five years, the Army and Navy football teams be confined to their home turf (other schools could visit them). Thus, they couldn’t compete against each other. In 1899, with tempers no longer hot, the Army–Navy game resumed, but this time neither team had the home advantage: they played in Philadelphia, “neutral territory” because it was about halfway between the schools.

  • 1909: Army canceled its entire schedule because one of its players was killed during a game against Harvard.

  • 1917–1918: Called on account of World War I.

  • 1928–1929: The academies couldn’t agree on player-eligibility standards, so the teams didn’t play again until 1930.

  The game was canceled again in 1963 because it was scheduled for just eight days after President John F. Kennedy was assassinated. But his widow, Jacqueline Kennedy, insisted that it be rescheduled and played because her husband (a former Navy officer who’d been rejected by the Army) had so enjoyed the contest. That year, Navy won.

  A GOAT AND A MULE WALK INTO A STADIUM . . .

  The traditions surrounding Army–Navy have grown with the years—from the phrases that each academy chants (“Beat Navy!” and “Beat Army!”) to the practical jokes played on the teams’ mascots. The Navy goat and the Army mule have both been kidnapped, sometimes repeatedly. In 2007, three Navy goats were nabbed, and a video of the “crime” was posted on YouTube. (The goats were later returned, unharmed.)

  At the end of each game, both schools get to hear their fight songs—first the losing team’s and then the winner’s. In a show of respect and solidarity, the winning team stands alongside the losing team and faces the losing academy’s students. Then the losing team accompanies the winning team as it faces its own students.

  INSTANT REPLAY

  Here are a few fun facts about the Army–Navy rivalry:

  • Navy’s team name is the Midshipmen, and its colors are gold and blue.

  • Navy’s mascot is a goat named Bill; the animal was selected because of its traditional inclusion on ships as a source of fresh food for sailors.

  • Army’s team name is the Black Knights, and its colors are black, gold, and gray.

  • Army’s mascot, a mule whose name has changed over the years, was selected for its usefulness in military operations, where mules hauled supplies, ammunition, and guns (although the first mascot pulled an ice wagon).

  • In 1963, the Army–Navy game was the first televised football game to use instant replay during the broadcast.

  • The game was the subject of a 1973 M*A*S*H episode in which the soldiers, stationed in Korea during the war, tried to watch the game but thought they’d been bombed by the enemy. The “bomb” that landed in the middle of their camp turned out to be a propaganda shell filled with CIA leaflets.

  SUCCESS STORIES

  A few Army–Navy players did go on to the NFL:

  • Roger Staubach (Navy 1965) had a stellar career as a quarterback with the Dallas Cowboys.

  • Phil McConkey (Navy 1979) was a wide receiver and kickoff/punt returner for the New York Giants in their Super Bowl XXI victory.

  • Napoleon McCallum (Navy 1985) was allowed to serve in the Navy while playing for the Los Angeles Raiders. He joined the team full-time when his military service was over, but a knee injury ended his football career in 1994.

  THE LET THE CHIPS FALL WHERE THEY MAY AWARD

  The Toll House Cookie

  Uncle John can’t verify whether the chocolate-chip cookie was an accidental or an intentional creation, but he can proclaim it the most incredibly delicious cookie ever.

  HOME IS WHERE THE COOKIE IS

  Apple pies were first baked in England, hot dogs and hamburgers come from Germany, but the chocolate-chip cookie is an all-American creation. In fact, chocolate-chip cookies are such a fixture in American culture that it’s hard to believe they’ve only been around since the 1930s . . . invented by accident (perhaps) in Whitman, Massachusetts.

  In 1930, Kenneth and Ruth Graves Wakefield bought an old tollhouse in Whitman. The house had once been a way station for passing travelers—they paid tolls, rested, and could get home-cooked meals. The young couple christened it the Toll House Inn and set about turning the 1709 building into a modern lodge. Ruth decorated with colonial furnishings and cooked homemade meals for their guests. She was a good cook, especially of desserts, and soon her creations attracted crowds. One of Ruth’s trademarks was sending guests away with a full meal and her homemade Butter Drop Do cookies—plain sugar or a chocolate version she made by adding powdered baker’s chocolate to the dough.

  THE BATTER THICKENS

  Here’s where Ruth Wakefield’s chocolate-chip cookie story gets tricky. There are two versions of what happened. In Nestlé’s official account (Nestlé now owns the Toll House Cookie brand), which is also supported by the Wakefield descendants, Ruth ran out of powdered chocolate and substituted a semi-sweet chocolate bar, which she cut into pieces and then added to the batter. The chocolate had come from a friend, Andrew Nestlé—yes, that Nestlé. Ruth assumed that the chocolate would melt during baking and produce chocolate cookies. Instead, the pieces retained their shape and studded the Butter Drop Do cookies with bits of chocolate.

  Different though they were, the cookies were a huge hit with visitors to the inn. Andrew Nestlé saw their potential and made a deal: Nestlé would buy the rights to Ruth’s cookie recipe and sell its semi-sweet chocolate bars (complete with a small chopper to make chocolate chips) with the recipe printed on the back. In exchange, Ruth would receive a lifetime supply of chocolate to make cookies at her inn. The deal was a success; as word spread about Ruth Wakefield’s Toll House Cookies, Nestlé sold a lot of chocolate and eventually developed packages of individual chocolate bits to simplify the cookie-making process. Today, Nestlé still makes chocolate chips, and Ruth Wakefield’s recipe is still printed on the bags.

  NOT EXACTLY HOW THE COOKIE CRUMBLES

  That official tale is all well and good, but George Boucher, the former head chef at the Toll House Inn, and his daughter Carol Cavanaugh tell a slightly different story of the chocolate-chip cookie’s birth. Ruth Wakefield was a graduate of what is now Framingham State College and had worked as a dietitian and food lecturer. Cavanaugh claims that Ruth, an accomplished chef and cookbook author, would have known that chocolate chunks wouldn’t melt completely into batter and wouldn’t have used them as a substitute for chocolate powder.

  According to Cavanaugh and Boucher, the chocolate bar in question accidentally fell into Ruth’s Butter Drop Do batter and cracked into bits due to the vibrations of the industrial-sized mixer she was using to make the cookies. Ruth was going to throw the batter away; she thought it was ruined. But Boucher convinced her to cook the dough instead. The result was a rich chocolate-chip cookie that was an instant hit with the guests.

  Boucher also claims that Wakefield never sold her recipe to Nestlé—only the rights to reproduce it on the chocolate wrapper. According to him, Nestlé’s lawyers found loopholes in the initial agreement that let the company take ownership of the recipe.

  CHIPS OFF THE OLD BLOCK

  Ruth continued to cook and write cookbooks, including a series called Ruth Wakefield’s Recipes: Tried and True that went through 39 printings. In 1966, the Wakefields sold their inn, and over the years, the property was used as a nightclub and then as another bed and breakfast. It burned down in 1984. We may never know whether Ruth Wakefield’s Toll House Cookie was the result of canny culinary forethought or a happy accident, but at least we’ll always have her original recipe.

  SOME MORSELS OF INFORMATION

  • About two-thirds of Americans prefer their chocolate chip cookies without nuts.

  • Half the cookies baked in American kitchens are chocolate chip.

  • On July 9, 1997, Massachusetts designated the chocolate chip cookie as the Official State Cookie after a third-grade class from Somerset proposed it.

  BIGGEST COOKIE EVER

  In 2003, the Immaculate Baking Company made the world’s biggest chocolate chip cookie: 100 feet wide (about the length of a Boeing 737) and 40,000 pounds in weight. The recipe included the following ingredients:

  • 6,000 pounds of semi-sweet chocolate chunks

  • 12,200 pounds of unbleached flour

  • 6,525 pounds of unsalted butter

  • 184 pounds of salt

  • 5,000 pounds of granulated sugar

  • 3,370 pounds of brown sugar

  • 79 pounds of baking soda

  • 30,000 eggs

  • 10 gallons of pure vanilla

  THE GLOBAL WARMING SOLUTION AWARD

  The Tesla Roadster

  It’s the first eco-friendly vehicle that could appeal equally to conservationists and muscle-car fans, which qualifies it for an award in itself, but it also gets points for helping us save the planet.

  ECO ENGINE

  Sports cars’ speed, style, sass, and power have captivated people for decades. Who cares about fuel economy when you’ve got prestige and sex appeal? That was before the Tesla Roadster. This first-ever all-electric sports car goes from 0–60 in four seconds and doesn’t depend on gas. It’s no suburban hybrid car, either. The Tesla is as curvy and appealing as any Italian coupe. And this car gets 135 MPG on the highway.

  Let’s say that again: 135 MPG. That’s almost two and a half times the mileage of some of the most efficient hybrids, such as the Toyota Prius (55 MPG) and the Honda Civic (50 MPG). And it’s nearly five times the mileage of popular compacts like the Mini Cooper (28 MPG).

  The Tesla can be fully charged in three and a half hours on a 240-volt circuit, but it can also be charged overnight using a regular household outlet. Even though that takes longer, it’s an amazing step ahead for the electric car world. The Tesla goes about 220 miles on a single charge.

  CARS ON CURRENTS

  The Roadster is an innovation, but “electric carriages” have been in the works since 1832, when Scotland’s Robert Anderson tried to create one. In 1835, Professor Stratingh of the University of Groningen in the Netherlands made his own attempt. There were various other tries throughout Europe, but the United States didn’t have electric vehicles until the late 1890s.

  In 1897, a fleet of electric taxicabs was established in Manhattan, and for a decade or so, it seemed that electric cars might be the wave of the future. Electric cars didn’t have the smell, vibrations, or noise that gasoline-powered cars had, and they also didn’t require cumbersome gear changes.

  But Americans wanted cars that could travel long distances, and the early electric cars couldn’t go very far on a single charge. That drawback might have been overcome, but in the early part of the 20th century, there was also no place an electric-car driver could stop en route from New York to Las Vegas to plug in and recharge. This, combined with the discovery of Texas crude, meant that gas-powered cars seemed like the smartest way to motor.

  SLEEKER STYLING

  From the 1930s until 1990, a few companies fueled by America’s growing environmental conscience tried to market boxy little electric cars to alterna-Americans, but they had little success because the cars just didn’t look appealing to the mainstream. By the early 1990s, though, concern about the environment had finally generated enough public interest in finding a good electric-car solution. Manufacturers made strides, but heavy battery packs and ignition/acceleration problems plagued the new models, and people dismissed electric cars as a fad.