Consider the Fork: A History of How We Cook and Eat
Basic Books, 2012, 352 pp., $26.99
Harry Brearley of Sheffield, England, invented stainless steel in 1913 in an effort to find a corrosion-resistant metal for gun barrels in a country preparing itself for the yet-to-be-properly appreciated horrors of World War I. What he did not anticipate was his lasting improvement to the world’s cutlery. Percy Spencer, creator of the microwave oven, was working on naval radar systems in 1945 when he realized that he had unwittingly discovered a new way of cooking food, even if it would take decades to really catch on. For better or for worse, it caught on.
Histories of technology rarely focus on how advances and inventions apply to the preparation and eating of food. At the same time, culinary histories as a subspecies of social history have become extremely (if not excessively) popular over the past few decades, and they have focused overwhelmingly on the ingredients rather than on the tools, on the “what” of cooking rather than on the “how.” So Bee Wilson, a British food writer, decided to reverse the lens of literary tradition to fill the gap. Her entertaining and illuminating book Consider the Fork concentrates on the history of the domestic kitchen and its many appliances. There is as much invention in a nutcracker as in a bullet, she rightly insists. So instead of recounting the recipes for various gastronomic delights or revealing the secrets of famous restaurant chefs, she looks at the tools that were and are used in an ordinary kitchen.
Some of the more basic tools, like the mortar and pestle, have been around for thousands of years. Others are constantly evolving. Look at the blender, the food processor or the latest Thermomix, advertised as being more than ten kitchen appliances in one. Then there are the imaginative utensils that never caught on (alas for the water-powered egg whisk and the magnet-operated spit roaster), as well as those we think we can’t live without. The melon baller may have largely passed into techno-culinary history, but not the mandoline slicer, at least not yet. And while Wilson doesn’t go into detail on this, entire sections of appliance companies and their consultants are consumed (no pun intended) by excruciatingly detailed consideration of aesthetic and ergonomic design.
Wilson offers more than mere skilled observation. She knows how to see broader meaning in quotidian artifacts. “Kitchen gizmos offer a fascinating glimpse into the preoccupations of any given society”, Wilson writes. “If you walk around our own kitchenware shops, you would think that the things we are really obsessed with in the West right now are espressos, panini and cupcakes.”
And for every revolution, there is a counter-revolution. The Slow Food movement started in Italy in 1989 to counter the rise of fast food and fast living. “To the woman who has just acquired an electric blender, the whole world looks like soup”, Wilson jokes in a riff on Abraham Maslow’s famous observation about hammers and nails. But these days it’s fashionable to use more coarsely chopped ingredients, a sort of anti-food processor protest meant to offer proof of human effort and care. This tells us, perhaps, that high-strung, high-speed Western societies are begging deep down to hit the brakes, an instinct that naturally shows up in the kitchen.
The irony does not escape Wilson. We crave time-saving devices and tend to cook at home less often, yet we spend plenty of time creating the absolutely perfect home kitchen. “How can life be complete without a built-in fuchsia pink bean-to-cup espresso machine?”, Wilson quips. We can’t control the big stuff buffeting our lives, but we damned well can command our kitchens as a commodore would his flagship.
Having suggested the deeper significance of the subject, Wilson assiduously traces the changes in cooking technology—as well as the consistencies—over the centuries. She takes us from the original clay pot to the metal cauldron to the brass skillet to copper pans to cast iron to enamel to stainless steel to non-stick Teflon, originally marketed in the United States as “the happy pan!” All have their uses, and cooks their individual or cultural preferences. None are perfect, yet many are loved.
Often the most basic and cherished utensils are the most versatile. The wooden spoon, for example, is certainly not a sophisticated utensil. But it continues to hold onto its place in even the most high-tech, whiz-bang kitchen that is otherwise chock full of mixers and microwaves and gleaming electronic instruments supposedly essential for modern cuisine.
The functionality of the wooden spoon is obvious. It is non-abrasive and therefore gentle on pans. It is non-reactive, so there is no risk of transferring a metallic taste. It is a poor conductor of heat, so it never burns the hand during a protracted stirring of soup. And while a metal spoon is clearly a feasible alternative in an age of stainless steel pans, its harder angles tend to smash diced vegetables, it clanks more disagreeably than wood and its handle is not as easy to hold. Even synthetic spatulas have proven unable to dislodge the ancient and often battered wooden spoon.
We also cook with wooden spoons because, well, we always have. “Tools are first adopted because they meet a certain need or solve a particular problem”, Wilson observes, “but over time the utensils we feel happy using are mainly determined by culture.” Indeed, and that general theme runs throughout Consider the Fork. Its various chapters on everything from the technology of pots and pans to the giant kitchens of medieval England to the adoption of domestic refrigeration as led by the United States illustrate the point.
And the point, by the way, very much includes consideration of differences in national cultures as well as the more universal values that accompany cooking and eating everywhere. The Japanese, for example, came to chopstick culture later than the Chinese, from whom they adopted the idea. For Japanese commoners, it was only around the 8th century that chopsticks replaced hands. But chopstick use quickly acquired very specific rules of polite Japanese behavior, including taboos about the sharing of chopsticks. Wilson notes that the horror of impurity or defilement that is part of the Shinto religion goes some way toward explaining the phenomenon of waribashi—the disposable chopsticks made from cheap, light wood and partly split so that restaurant customers can pull them apart themselves.
Rather than being a modern adaptation similar to the polystyrene cup or the plastic cutlery of fast food restaurants, as is commonly thought in the West, waribashi began with the development of the Japanese restaurant industry in the 18th century. Fresh chopsticks were the only way a Japanese restaurateur could persuade his customers that what they were putting in their mouths had not been defiled by others’ use. Japan now uses and throws away about 23 billion pairs a year and has exported the custom back to China, which now manufactures around 63 billion pairs of disposable chopsticks annually. Even that is not enough to meet the demand. These days a plant in Georgia—with access to plenty of suitable wood—exports billions of disposable “Made in USA” chopsticks to supermarket chains in China, Japan and Korea.
Sometimes cultural differences are less dramatic, but no less telling. Take the way Britons and Americans use the fork. A civilized British eater holds the fork in the left hand, tines down, and the knife in the right. When cutting and eating, say, a cutlet, the operation of getting a chosen morsel from plate to palate does not require transferring a utensil from one hand to the other. It is more than efficient; it is elegant, as such things go. A typical right-handed American, however, tends to hold the fork in the left hand pointed downward like a dagger into said cutlet while sawing away with the knife in the right hand. Having severed the object, the eater will put down the knife, switch the fork from left hand to right, raise and shove. Having been a foreign correspondent in Washington, DC for many years, I know whereof I speak. This is plainly a barbaric use of identical tools, though there is hope yet for the provinces.
But what of the origin of said fork? As Wilson explains, the table fork we take for granted turns out to be a relatively recent invention. Fork-like instruments did exist in ancient Rome, but their use was limited to specific purposes and foods, such as eating shellfish or lifting food from the fire. The first true fork on historical record was a two-pronged gold one used by a Byzantine princess who married the doge of Venice in the 11th century. This excessive royal delicacy in preferring a fork over God-given hands created a scandal still referred to in Church circles a couple of hundred years later.
But the Italians did adopt the table fork ahead of any other country in Europe. The reason was pasta. By the Middle Ages, trade in macaroni and vermicelli was well established in Italy, and it became obvious to the Italians that three prongs were better than the single spike or pole originally used for twirling slippery threads of pasta.
Medieval and Tudor diners in England had also used tiny “sucket” forks—with a spoon at one end and a two-pronged fork at the other—for sugary sweetmeats, their version of candies. Even so, Queen Elizabeth I preferred to use her fingers for sweetmeats because she thought the spearing motion crude.
When an Elizabethan traveller named Thomas Coryate travelled to Italy in the first years of the 17th century, he noted the curious Italian custom of not touching food with the fingers. He was bemused by the Italian preference for a fork for holding meat as well as for twirling pasta. He brought this fork habit for meat back to England despite the teasing of friends like the poet John Donne and the playwright Ben Jonson. By 1700, forks had become commonplace throughout Europe. Even the stern Puritans were willing to adopt them. Especially after the restoration of Charles II, forks became well established on the table, along with the newly fashionable and more elaborately handled trifid spoons. “Not wanting to dirty your fingers with food, or to dirty food with your fingers, had become the polite thing to do. The fork had triumphed, though knives and spoons continued to outsell forks until the early 19th century”, Wilson writes.
The history of the knife is both longer and, of course, more universal, though it takes different shapes and sizes according to centuries and countries. “It is the oldest tool in the cook’s armoury, older than the management of fire by between one million and two million years, depending on which anthropologist you believe”, Wilson observes. “Cutting with some implement or other is the most basic way of processing food.”
The earliest examples of stone cutting tools, found in Ethiopia, date back around 2.6 million years. Even in the Stone Age, humans were fashioning various cutting devices—from sharp choppers to scrapers to hammer stones and spheroids for beating food. They experimented with granite and quartz, obsidian and flint. Over the millennia came the knives of the Bronze and Iron Ages, followed by the use of steel, carbon steel and stainless steel, and now titanium and laminates. In medieval and Renaissance Europe, almost every male carried his own knife, usually in a sheath dangling from a belt. It could be used equally for cutting food or enemies. By the 18th century, carbon steel was increasingly used for making a range of specialized knives, which were particularly popular among the French as their chefs developed haute cuisine based on refined sauces and perfect cuts of meat.
According to Wilson, however, no knife was as multifunctional or as essential to an entire food culture as the wide-bladed Chinese tou, or cleaver. Cast iron was discovered in China in about 500 BCE, and the tou suited a frugal peasant culture where fuel was scarce. It could cut ingredients into pieces small enough that their flavors would meld as they cooked quickly over a portable brazier. Combined with the wok, it meant nothing was wasted and the maximum flavor could be extracted with minimum cooking energy.
This practice contrasted with European countries where fuel was more plentiful, especially England. “English cooks chose to roast great carcasses by the heat of great fires in part because—in contrast to other nations—the English were abundantly well endowed with firewood”, Wilson tells us. As they experimented with their delicate sauces and cuts, the French dismissed the English predilection. “To the French we are still ‘les Rosbifs’”, wrote Dr. Hunter of York in 1806.
But as well as being more energy efficient, the Chinese tou also spared diners any knife work at the table, regarded in Chinese culture as a form of uncouth butchery. This, again, contrasted with the European custom of carving chunks at the table for each lord to then cut into bite-sized pieces with his own personal knife.
By the 17th century, however, sets of identical knives, along with forks, were increasingly laid at table. What’s more, the knives were now becoming blunter rather than sharper, despite the improvements in metals.
The French started this fashion, though whether because of the perceived danger of sharp knives as weapons or because of perceptions of vulgarity is not clear. What is clear is that culture drove the uses of technology at least as much as the other way around. By the 18th century, the table knife had become a utensil more useful for spreading butter, placing things on the fork and cutting food that was already relatively soft. That is also why a new generation of serrated steak knives emerged, most famously those pioneered in the southern French town of Laguiole, to do the job the table knife had relinquished.
Knives, blunt or sharp, also presented other problems, at least until the invention of stainless steel. Cutting anything acidic destroyed the taste of the food and turned steel knives black. “Vinaigrette and steel knives were a particularly bad combination, hence the French prejudice that persists to this day, against cutting salad leaves”, Wilson points out. One must tear them instead. And the idea of silver knives for fish, at least for the upper classes, was not just an affectation; it was because the lemony tang would react with the steel in ordinary knives.
Nor is the way we cut food just a question of manners or diet or resources. It also affects the body and the alignment of our jaws. Wilson points out that modern orthodontics devotes considerable effort and even more money to creating the perfect overbite, so that the top layer of teeth hangs neatly over the bottom layer. “What the orthodontists don’t tell you is that the overbite is a very recent aspect of human anatomy and probably results from the way we use our table knives”, she reveals. She credits an American anthropologist, Charles Loring Brace, with establishing that the edge-to-edge bite, comparable to that of apes, had persisted much longer than people had thought. The change only occurred in the late 18th century in Western Europe, starting with “high status individuals.”
According to this theory, the real purpose of the incisors had not been to cut but to clamp down on food in the mouth, in what Brace called the “stuff and cut” method. Once people began cutting their food into small morsels, the need for the clamping function ceased, but the incisors continued to erupt until the top layer no longer met the bottom layer. In China, Brace discovered that, with the exception of peasants, the Chinese overbite had generally emerged 800 to 1,000 years earlier, corresponding to the development of a highly chopped style of Chinese food and of chopsticks. So it would appear that we are not only what we eat but also, to some limited extent, how we eat.
More generally, Wilson notes that the great change of the 20th century was the creation of new middle-class kitchens designed for people who would be doing both the eating and the cooking. “These new spaces were different from either the squalid one-room kitchen/living rooms of the pre-industrial masses or the servant-run kitchens of the privileged”, she writes. “They were hygienic, floored with linoleum and powered by gas or electricity.” Compare that to the era of vast open fires in the medieval kitchens of the British aristocracy, where the smoky heat was so intense that male cooks worked naked or in underwear, women had to be careful not to set their dresses alight, and dogs on a wheel were used to turn the spit.
By the mid-19th century in both Britain and North America, the closed range or “kitchener” had become the essential domestic fitting. Gas and then electric cookers became normal both in Europe and North America by the 1920s as their price decreased and efficiency improved. It was a liberation from what had once been “one of the defining activities of human life—starting and maintaining a fire.” Cookouts or barbecues are now voluntary, if extremely enjoyable, forms of preparing food. Few of us appreciate the novelty of the shift.
The shift in cooking with heat was accompanied by another almost as profound: cooling. By the mid-20th century, the United States was the obvious world leader in this, its way of life underpinned by the joys of refrigeration and the sound of ice clinking in glasses. (In 1959, 96 percent of American households owned refrigerators, compared to only 13 percent in Britain—though the Brits still bested them in utensil etiquette.)
That fridge culture included frozen foods. Clarence Birdseye was a fur trapper who had worked as a biologist. He, his wife Eleanor and their baby son Kellogg were on a fur-trapping mission in Labrador from 1912 to 1915, with their game food frozen by the Arctic winds. They decided their fast-frozen winter food tasted better than fresh food or food frozen slowly. Clarence experimented with quick-freezing green vegetables, plunging them first into barrels of salt water, using Kellogg’s baby bath to assist. When he returned to New Jersey, he was set on developing a new method for quick-freezing fish. In 1929, he sold his company to Goldman Sachs and the Postum Company for $22 million. That was a lot of money in 1929, even for Goldman Sachs.
The technique still wasn’t perfect. It took until 1930 to work out that peas and other vegetables needed to be blanched in hot water first to deactivate the enzymes that made them spoil. The Birdseye company also embarked on a PR campaign to rename frozen goods “frosted foods.” The linguistic rebranding did not stick, but the campaign worked: By 1955, the frozen food market in the United States was worth $1.5 billion—that’s $13 billion in 2013 dollars.
The gulf between a refrigerated America and the rest of the world owed something to the abundance of capital, energy and wealth in America, no doubt. But this, too, was a question of culture. Europeans actively resisted the technology of cold storage. The French, typically, had a name for it: frigoriphobie, or fear of refrigerators. Refrigeration was also foreign to the French way of shopping for fresh food daily at the local markets, rather than once a week at giant supermarkets. Now, of course, the whole world aspires to be American when it comes to fridges. Supermarkets in France today, even in places like Avignon, stock no smaller a proportion of frozen foods than supermarkets in Ashtabula.
Thus, observes Wilson, “A fridge rather than a stove now tends to be the starting point—what designers call the statement—around which the rest of a kitchen is constructed.” That is perhaps because, as Wilson jests, “When we can’t think what else to do, we open the fridge door and stare into it long and hard as if it will provide the answers to life’s great questions.” And sometimes, it seems to. At least until we’re hungry again.
Jennifer Hewett is the national affairs columnist for the Australian Financial Review and is based in Sydney, Australia. She has yet to master all her kitchen implements and has been known to occasionally hurl a wooden spoon.