Issue Title: Special Issue: Evolutionary Studies of Cooperation; Guest Editor: John Q. Patton
Hum Nat (2009) 20:447–449
DOI 10.1007/s12110-009-9075-3
Published online: 16 September 2009
© Springer Science + Business Media, LLC 2009
It seems as though fire, hearths, and cooking have suddenly come in from the archaeological cold and taken on an evolutionary life of their own. Suddenly, they are hot property, with probably as many different views as there are authors. In Catching Fire, Richard Wrangham elaborates a view on the origins and function of cooking that has been bubbling away on his particular stove for the past several years. The burden of his argument is that cooking food is a very much odder thing than we have supposed. Indeed, it would probably not be unfair to suggest that cultural anthropologists (likely the only people outside nutritionists to show any interest at all in the topic) have simply regarded cooking as the rather dull but unavoidable part of the ritual of eating together, itself something of purely social significance. Not so, is the claim here: it is central to the story of becoming human, and as such it is something desperately in need of explanation.
Consider the fact that most of the machinery of human digestion, from teeth and jaw muscles to intestine length, is very small for a primate of our body size. Not to put too fine a point on it, these parts are just too feeble to be of much help in processing most decent primate diets, from tough vegetable foods right through to meat. Some of these diminutive parts might be seen as adaptations to meat-eating (as has been argued, for example, for intestine length). Yet an all-meat diet leads to protein poisoning in humans. So how could early hominins with such intestines, he asks, have coped with either a coarse vegetable diet or a meat diet without cooking? So, was cooking an early innovation, much earlier than anyone ever imagined?
Wrangham marshals a number of arcane and unexpected facts to support the suggestion of a long association with cooking. One of these is the dreaded Maillard
R. I. M. Dunbar (*)
Institute of Cognitive & Evolutionary Anthropology, University of Oxford, 64 Banbury Rd, Oxford OX2 6PN, UK
e-mail: [email protected]
Richard Wrangham, Catching Fire: How Cooking Made Us Human
Basic Books, New York, 2009, $26.95
compound. Maillard compounds are the burned bits you get with your BBQ meat and chargrills. Owing to changes that occur in meat during burning, Maillard compounds are bad news: they are highly carcinogenic for most mammals. But humans seem unusually immune to their effects. So does that mean we have had a long evolutionary association with burning our food? It's a seemingly compelling argument, although it is only evidence for an early origin for cooking if we accept that early Homo was stuck and had to eat meat because it had already made the intestinal switch away from an apelike digestion.
Wrangham devotes a lot of effort to showing how bad raw food is for you, even when it is vegetables: humans invariably lose weight on raw food diets (unlike most other primates), but everyone (including rats) puts on weight very easily with cooked food. The key is that cooking gelatinises starches, allowing the mammalian stomach to process and extract nutrients significantly more efficiently. There are other benefits to cooking, too: cooking massively reduces the amount of time needed for chewing. We spend less than 10% of our day chewing, but, extrapolating from wild chimpanzees, a human-sized chimp on a raw food diet would end up spending more than 40% of its day chewing. More time to lounge around, then . . . and gossip, perhaps?
So when might all this have happened? Wrangham offers us three possibilities, each associated with a major anatomical transition: 1.8 million years ago (the transition into Homo erectus), 800,000 years ago (the transition into archaic humans) and 200,000 years ago (the transition into anatomically modern humans). His pitch is in favor of the first, partly on the grounds that the move into a more open habitat would, for the first time, have engendered a significant rise in the risk of predation at night for a species committed to having to sleep on the ground (not least because of the scarcity of large trees, never mind the loss of arboreal adaptations). Control of fire, he suggests, must have played a major role in this, allowing early erectus to keep nocturnal predators at bay. I agree: being out on the savanna at night is a serious problem, as I can attest from a 5-mile walk two of us once did accompanied all the way by hyenas loping alongside, just out of range of our torches. Not nice. Then at some point, presumably, the odd tuber or hunk of meat dropped in the fire by accident, and, presto, cooking and the delicious taste of Maillard compounds were discovered! Later still, the argument runs, cooked food being more easily stolen than uncooked food (or more worthwhile stealing?), hominins were prompted to invent containers for storage and safekeeping, and, though I do feel this one is stretching it a bit, females (presumably even then the cooks) eventually bonded to males who could protect their food and pots.
Two other points are brought into the frame in support of an early use of cooking: humans have lost many of their capacities to cope with toxins (no longer needed because heat kills them off) and the oft-cited marked reduction in tooth size. Is the fact that the habiline/erectine transition is where we find a reduction in tooth size indirect evidence of the very early adoption of cooking? Well, yes, I can see the logic, but I always get uncomfortable when anyone starts to claim a trait was lost because it wasn't useful any more. I'm just not entirely convinced evolution works quite like that. In any case, there might be more of a consensus favouring the much later reduction in tooth size that coincides with the appearance of modern humans (but, then, that would be much later than would suit Wrangham's story).
As always, of course, the archaeological record leaves us stranded. Too often, the double jeopardy of fossilization and the serendipities of excavation mean that at best we can only ever know the latest possible time by which a trait must have evolved; it could have been a lot earlier. The archaeologists have often dug their heels in at this point in defense of the hard evidence and some degree of certainty, understandably, having been chastened far too often in the past for too much speculation off the back of minimal evidence. But this is a more important issue than such a view suggests. The fact that some traits don't leave nice neat archaeological traces just means that we have to be a bit more creative, and use a little more lateral thinking. We have to look for knock-on consequences elsewhere in the biological system that do produce an archaeological signature. We are, I believe, just about at the point where our theories are becoming sufficiently robust that we can do this. In effect, it requires building the jigsaw of evidence so tightly that only one conclusion can be drawn. And in some sense, this is exactly what Wrangham offers us here. Whether or not he is right will turn on the evidence that can finally be brought to bear on the suggestions he makes. But irrespective of the outcome, this is a brave stab at the problem, and one that will at least generate a number of testable hypotheses, and that's probably about as good as it gets in science.