Our large brain, long life span and high fertility are key elements of human evolutionary success and are often thought to have evolved in interplay with tool use, carnivory and hunting. However, the specific impact of carnivory on human evolution, life history and development remains controversial. Here we show in quantitative terms that dietary profile is a key factor influencing time to weaning across a wide taxonomic range of mammals, including humans. In a model encompassing a total of 67 species and genera from 12 mammalian orders, adult brain mass and two dichotomous variables reflecting species differences regarding limb biomechanics and dietary profile accounted for 75.5%, 10.3% and 3.4% of variance in time to weaning, respectively, together capturing 89.2% of total variance. Crucially, carnivory predicted the time point of early weaning in humans with remarkable precision, yielding a prediction error of less than 5% with a sample of forty-six human natural fertility societies as reference. Hence, carnivory appears to provide both a necessary and sufficient explanation as to why humans wean so much earlier than the great apes. While early weaning is regarded as essentially differentiating the genus Homo from the great apes, its timing seems to be determined by the same limited set of factors in humans as in mammals in general, despite some 90 million years of evolution. Our analysis emphasizes the high degree of similarity of relative time scales in mammalian development and life history across 67 genera from 12 mammalian orders and shows that the impact of carnivory on time to weaning in humans is quantifiable, and critical. Since early weaning yields shorter interbirth intervals and higher rates of reproduction, with profound effects on population dynamics, our findings highlight the emergence of carnivory as a process fundamentally determining human evolution.
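The way such percentages are read, as sequential contributions to explained variance (incremental R²) in a nested regression, can be sketched with synthetic data. The variable names, coefficients and seed below are illustrative assumptions, not the study's actual data or method:

```python
import numpy as np

# Illustrative sketch of sequential variance partitioning (incremental R^2)
# in a nested linear regression. All data here are synthetic.
rng = np.random.default_rng(0)
n = 67
brain = rng.normal(size=n)            # stand-in for (log) adult brain mass
limb = rng.integers(0, 2, size=n)     # dichotomous: limb biomechanics
diet = rng.integers(0, 2, size=n)     # dichotomous: carnivory
weaning = 2.0 * brain + 0.8 * limb + 0.4 * diet + rng.normal(scale=0.5, size=n)

def r2(regressors, y):
    """R^2 of an ordinary least-squares fit with intercept."""
    X = np.column_stack([np.ones(len(y))] + list(regressors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

# Each predictor's share is the gain in R^2 when it enters the model.
r2_brain = r2([brain], weaning)
r2_limb = r2([brain, limb], weaning) - r2_brain
r2_diet = r2([brain, limb, diet], weaning) - r2_brain - r2_limb
print(f"brain: {r2_brain:.1%}, +limb: {r2_limb:.1%}, +diet: {r2_diet:.1%}")
```

Because the models are nested, each increment is nonnegative, and the three shares sum to the full model's R² (89.2% in the study's case).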
There are a lot more ex-vegans than there are vegans. It is just a stylish fad. We hear about some celebrity like Natalie Portman, Angelina Jolie, or Gwyneth Paltrow adopting a vegan diet, but we don't hear about it when they start listening to the 911 calls their bodies are sending them and drop the noble experiment. I always say one of the most damning pieces of evidence against a vegan diet is the fact that it simply does not exist in nature. If we observe man in his most natural state, that of the hunter-gatherer, we see examples of diets that have evolved over millennia, allowing man to adapt most efficiently to his environment. Hunter-gatherers on average get 70% of their calories from animal sources. Some, like the Maasai, Inuit and Plains Indians, get or got almost 100% of their calories from animal sources. Doctors, explorers and missionaries encountering these peoples for the first time universally noted a virtual absence of all modern diseases until they started eating modern foods, i.e. white flour, sugar and vegetable oils. All of the biological, historical, nutritional, social and economic arguments for a vegan diet are false. The proponents of veganism wrongly point to animal products as the cause of the modern degenerative disease epidemic. The use of various animal products, including fats, has remained the same or actually declined during the last century, while the use of sugars, white flour, added chemicals and vegetable oils has increased dramatically.
The intestinal length argument is a false one, a vegan myth. Intestinal length is irrelevant; more important is intestinal and digestive function. Multistage fermentation is an herbivore's primary digestive function. That's why a cow has four stomachs. An herbivore has to eat constantly, as its diet of grasses and plants is not nutrient dense. Herbivores spit up their cud and redigest it multiple times, and bacterial action converts a nutrient-sparse diet into a high-fat one (rich in short-chain saturated fatty acids). Herbivores cannot go for extended periods without eating. Does this sound like the human digestive system?
More fuzzy and hysterical thinking. Animal foods are not the factor in the modern diet responsible for the epidemics of modern diseases. This is another case of vegan/PETA misdirection. In the last century, the use of animal foods, especially animal fats, has declined. The macronutrients that have increased coincident with the rise of modern diseases are refined carbohydrates and vegetable seed oils. Pesticides and GMOs apply to plants. If animal foods caused disease, native populations that consume large amounts of animal foods should be rife with heart disease and cancer. In fact, the opposite is true: doctors, explorers and missionaries encountering these peoples for the first time universally noted a virtual absence of all modern diseases until they started eating modern foods, i.e. white flour, sugar and vegetable oils. Again, answer the question: why are there no examples of a human vegan diet in the natural world?
“I wouldn’t claim to be an expert on human evolution, but even I know that the majesty of the human hand is not best summed up by saying it was to help us climb trees for fruit. With some past shared ancestor with other opposable thumb-havers, yes, but you are ignoring millions of years of evolution on what makes us different from our better-climbing cousins.
Remember how every time you watch a nature show, they mention how other great apes have so much more arm strength than us, pound-for-pound? Ever wonder what the trade off was, or did you just think humans didn’t need advantages in strength? The trade off is for benefits in precision and motor control, categories humans beat all other great apes in, including the other ones that came down to the ground. I think you need to find other things than fruit to explain it, considering our cousins’ superiority in fruit-gathering; tool use, for one.
On the feet, I agree distance over sprinting, but are you sure it’s for crops? There is certainly modern anthropological evidence of traditional tribesmen using endurance running to chase sprinting animals to exhaustion and then killing and eating them. Google “persistence hunt.” They have videos. I imagine the benefits of our feet are pretty varied, no?”
“Sorry, that’s not a valid argument at all. Almost any source you can find puts the use of fire to cook before modern humans even came about (Homo erectus used it), with some estimates dating cooked food to more than a million years ago. No one questions it being at least 100,000 years old, more than enough to adapt evolutionarily. Cutting tools that could be used for butchering date to WAY before that, as would tenderizing tools (we are talking a million years would be super-conservative as a guess). Pretending that the use of fire and tools has not contributed to our present evolutionary state is plain false. Thus, arguing that the evolutionary argument requires not using them is totally wrong.
You might have a point about not having evolved much for the fork, but that kinda doesn’t leave much power in your retort.”
Near the end of each winter, the Nisg̱a’a prepared for one of the most important fishing activities of the year: the annual oolichan run up the Nass River. Beginning around the end of February, they started to fish vast quantities of oolichan, a small and very oily member of the smelt family. One sign of the oolichans’ arrival would be the sudden appearance of large numbers of sea mammals at the mouth of the river, as these animals hunted the oolichan.
Oolichan were not usually eaten for their meat. The vast majority of the catch was rendered for its edible grease, which could be stored for many months. The Nisg̱a’a boiled the oolichan in large cedar bentwood boxes until the grease separated and rose to the top. They then skimmed the grease and poured it into other boxes to store it for trade with other tribes or for eating throughout the year. Oolichan contained so much grease that they could be burned like candles when dried, earning them the nickname “candlefish”.
The Nisg̱a’a traditionally fished oolichan with either nets or an oolichan rake called a k’idaa. The k’idaa was a long pole with comb-like wooden teeth on the end that was used to “rake” the oolichan from shallow areas of the river. For example, the ancient village of Ank’idaa took its name from the practice of catching oolichan with the k’idaa at that location.
Oolichan were also caught with nets. Sometimes a simple dip-net was used to scoop the oolichan out of the river, and other times a long oblong-shaped net would be anchored with tall stakes in the river and allowed to hang in the current. Every few minutes, the net would fill with oolichan and attendants would gather the net and empty it before setting it in the current again.
Dip-nets were made of twine spun from the fibrous pith of tall stinging nettles. The larger oblong-shaped net (called a “hlist”) was made from twine that was spun from fireweed pith. The fireweed twine was known as “w̓ahaas”, or “thread-made-from-fire-weed”.
Contemporary accounts of gladiator life sometimes refer to the warriors as hordearii—literally, “barley men.” Grossschmidt and collaborator Fabian Kanz subjected bits of the bone to isotopic analysis, a technique that measures trace chemical elements such as calcium, strontium, and zinc, to see if they could find out why. They turned up some surprising results. Compared to the average inhabitant of Ephesus, gladiators ate more plants and very little animal protein. The vegetarian diet had nothing to do with poverty or animal rights. Gladiators, it seems, were fat. Consuming a lot of simple carbohydrates, such as barley, and legumes, like beans, was designed for survival in the arena. Packing in the carbs also packed on the pounds. “Gladiators needed subcutaneous fat,” Grossschmidt explains. “A fat cushion protects you from cut wounds and shields nerves and blood vessels in a fight.” Not only would a lean gladiator have been dead meat, he would have made for a bad show. Surface wounds “look more spectacular,” says Grossschmidt. “If I get wounded but just in the fatty layer, I can fight on,” he adds. “It doesn’t hurt much, and it looks great for the spectators.”
The existence of the four-pointed dagger (replica pictured here) was known from inscriptions, but its function was a mystery until this crippling quadruple knee wound was identified. (Courtesy Karl Grossschmidt)
But a diet of barley and vegetables would have left the fighters with a serious calcium deficit. To keep their bones strong, historical accounts say, they downed vile brews of charred wood or bone ash, both of which are rich in calcium. Whatever the exact formula, the stuff worked. Grossschmidt says that the calcium levels in the gladiator bones were “exorbitant” compared to the general population. “Many athletes today have to take calcium supplements,” he says. “They knew that then, too.”
Why and how did humankind become “unusually successful”? And what, to an evolutionary biologist, does “success” mean, if self-destruction is part of the definition? Does that self-destruction include the rest of the biosphere? What are human beings in the grand scheme of things anyway, and where are we headed? What is human nature, if there is such a thing, and how did we acquire it? What does that nature portend for our interactions with the environment? With 7 billion of us crowding the planet, it’s hard to imagine more vital questions.
One way to begin answering them came to Mark Stoneking in 1999, when he received a notice from his son’s school warning of a potential lice outbreak in the classroom. Stoneking is a researcher at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. He didn’t know much about lice, but as a biologist he found it natural to noodle around for information about them. The most common louse found on human bodies, he discovered, is Pediculus humanus. P. humanus has two subspecies: P. humanus capitis—head lice, which feed and live on the scalp—and P. humanus corporis—body lice, which feed on skin but live in clothing. In fact, Stoneking learned, body lice are so dependent on the protection of clothing that they cannot survive more than a few hours away from it.
It occurred to him that the two louse subspecies could be used as an evolutionary probe. P. humanus capitis, the head louse, could be an ancient annoyance, because human beings have always had hair for it to infest. But P. humanus corporis, the body louse, must not be especially old, because its need for clothing meant that it could not have existed while humans went naked. Humankind’s great coverup had created a new ecological niche, and some head lice had rushed to fill it. Evolution then worked its magic; a new subspecies, P. humanus corporis, arose. Stoneking couldn’t be sure that this scenario had taken place, though it seemed likely. But if his idea were correct, discovering when the body louse diverged from the head louse would provide a rough date for when people first invented and wore clothing.
The subject was anything but frivolous: donning a garment is a complicated act. Clothing has practical uses—warming the body in cold places, shielding it from the sun in hot places—but it also transforms the appearance of the wearer, something that has proven to be of inescapable interest to Homo sapiens. Clothing is ornament and emblem; it separates human beings from their earlier, un-self-conscious state. (Animals run, swim, and fly without clothing, but only people can be naked.) The invention of clothing was a sign that a mental shift had occurred. The human world had become a realm of complex, symbolic artifacts.
With two colleagues, Stoneking measured the difference between snippets of DNA in the two louse subspecies. Because DNA is thought to pick up small, random mutations at a roughly constant rate, scientists use the number of differences between two populations to tell how long ago they diverged from a common ancestor—the greater the number of differences, the longer the separation. In this case, the body louse had separated from the head louse about 70,000 years ago. Which meant, Stoneking hypothesized, that clothing also dated from about 70,000 years ago.
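The molecular-clock logic behind this date can be sketched with toy numbers. The sequence length, number of differences and mutation rate below are illustrative assumptions chosen to land on 70,000 years, not the values from Stoneking's study:

```python
# Toy molecular-clock estimate: time since divergence from DNA differences.
# All numbers below are illustrative assumptions, not the study's data.

def divergence_time_years(num_differences, sequence_length,
                          mutations_per_site_per_year):
    """Estimate years since two lineages split, assuming mutations
    accumulate at a constant rate. Differences accumulate along BOTH
    lineages after the split, hence the factor of 2."""
    diffs_per_site = num_differences / sequence_length
    return diffs_per_site / (2 * mutations_per_site_per_year)

# Hypothetical example: 14 differences in a 1,000-site snippet,
# at an assumed rate of 1e-7 mutations per site per year.
t = divergence_time_years(14, 1000, 1e-7)
print(round(t))  # 70000
```

The key design point is the factor of 2: after the split, each lineage mutates independently, so the observed per-site difference between them grows at twice the per-lineage rate.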
And not just clothing. As scientists have established, a host of remarkable things occurred to our species at about that time. It marked a dividing line in our history, one that made us who we are, and pointed us, for better and worse, toward the world we now have created for ourselves.
In this book I argue that the origins of human intelligence are linked to the acquisition of meat, especially through the cognitive capacities necessary for the strategic sharing of meat with fellow group members. Important aspects of the behavior of some higher primates—hunting and meat sharing and the social and cognitive skills that enable these behaviors—are shared evolved traits with humans and point to the origins of human intelligence. This does not mean that there is an instinctive desire to hunt on the part of all modern humans; only a small percentage of people in industrialized countries have ever hunted for anything that’s alive. Instead, the intellect required to be a clever, strategic, and mindful sharer of meat is the essential recipe that led to the expansion of the human brain.
Chimpanzees hunt and eat the meat of a variety of mammals. They are skilled makers and users of tools. These apes and their closest relatives have large brains and an intellect that surpasses that of all other nonhuman animals. They are funhouse mirrors of our ancestry; the same stock produced us, but with a filter of millions of years of adaptations that occurred during the history of each lineage. Chimpanzees, along with the other great apes—the bonobo, gorilla, and orangutan—illustrate how evolution can mold a highly intelligent animal that lives in a complex forest environment and an even more complex society. There is only one animal of greater intelligence, and it also lives in an incredibly intricate web of social relationships, navigating its way through life using group-mates as support systems and as tools to be manipulated. This other animal, of course, is humankind.
A second key piece of evidence about the behavior patterns that made us human is that our ancestors foraged for meat. The fossil record contains evidence of increasingly sophisticated tool manufacturing beginning some two and a half million years ago, just as the human brain began to approach the size threshold that is considered human. Researchers believe that this tool use facilitated an increase in the importance of meat in the early human diet. Exactly when did meat become an important part of the diet, and how was it obtained? Were early humans savage and cunning hunters, or clever but weak scavengers? How important was meat in the diet as our ancestors’ lineages evolved and diversified, and how could the eating and sharing of animal prey have contributed to the expansion and reorganization of the human brain and cognition? What are the nutritional and social roles of meat in traditional human societies? These are questions to which anthropologists studying the fossil record have few answers.
I also examine traditional human societies using the same Darwinian paradigm that has provided answers to key questions about animal behavior. Comparing the behavioral ecology of humans living in very traditional settings to nonhuman animal ecology is an inquiry into whether both are driven by the same principles of natural selection. Because the direct evidence of early humanity in the fossil record is and always will be scanty—full of bones but lacking in flesh, both literally and figuratively—information on living human and nonhuman meat eaters is very important. There is much to be learned from modern hunting people in this regard. Modern foraging people are not relicts of the past. They have lives and societies with as much cultural sophistication as any other group of modern humans. But technologically they tend to be simpler, allowing us to see how people who need to subsist from their forest or savannah worlds can do so. This interaction of ecology and behavior provides a backdrop against which the potential range of ecological adaptations of ancient humans can be considered.
If not for the anthropocentrism of the earliest taxonomists—the scientists who devised the naming system we still use to classify living things—humans and apes would be grouped together because of our many shared traits. We three hunting apes—chimpanzees, ancestral hominids, and modern foraging people such as the !Kung or Ache—provide a frame of reference for our evolutionary history and therefore the roots of human behavior. I am not the first anthropologist to address these issues, although this is the first account to integrate modern evidence from my three areas of interest. The role of meat in the lives of early hominids has been viewed at times as crucial, at other times as minor, and at still other times as nonexistent in different eras of anthropological thought. With fossils and human foragers providing supporting evidence for what we know about great apes, we can consider a detailed triptych of hunting, scavenging, and meat sharing, all aimed at exploring the origins of human behavior.