Olli Arjamaa & Timo Vuorisalo. American Scientist. Volume 98, Issue 2. Mar/Apr 2010.
Few would argue against the proposition that in the animal kingdom adaptations related to food choice and foraging behavior have a great impact on individuals’ survival and reproduction—and, ultimately, on their evolutionary success. In our own species, however, we are more inclined to view food choice as a cultural trait not directly related to our biological background. This is probably true for variations in human diets on small scales, manifested both geographically and among ethnic groups. Some things really are a matter of taste rather than survival.
On the other hand, some basic patterns of our nutrition clearly are evolved characters, based on between-generation changes in gene frequencies. As Charles Darwin cautiously forecast in the last chapter of On the Origin of Species, his theory of natural selection has indeed shed “some light” on the evolution of humans, including the evolution of human diet. The long transition from archaic hunter-gatherers to post-industrial societies has included major changes in foraging behavior and human diet.
The traditional view holds that our ancestors gradually evolved from South and East African fruit-eaters to scavengers or meat-eaters by means of purely biological adaptation to changing environmental conditions. Since the 1970s, however, it has become increasingly clear that this picture is too simple. In fact, biological and cultural evolution are not separate phenomena, but instead interact with each other in a complicated manner. As Richard Dawkins put it in The Selfish Gene, what is unusual about our species can be summed up in one word: culture.
A branch of theoretical population genetics called gene-culture coevolutionary theory studies the evolutionary phenomena that arise from the interactions between genetic and cultural transmission systems. Some of this work relies on the sociobiologically based theoretical work of Charles J. Lumsden and E. O. Wilson, summarized in Genes, Mind, and Culture. Another branch of research focuses on the quantitative study of gene-culture coevolution, originated among others by L. L. Cavalli-Sforza and M. W. Feldman. Mathematical models of gene-culture coevolution have shown that cultural transmission can indeed modify selection pressures, and culture can even create new evolutionary mechanisms, some of them related to human cooperation. Sometimes culture may generate very strong selection pressures, partly due to its homogenizing influence on human behavior.
A gene-culture coevolutionary perspective helps us to understand the process in which culture is shaped by biological imperatives while biological properties are simultaneously altered by genetic evolution in response to cultural history. Fascinating examples of such gene-culture coevolution can be found in the evolution of human diet. Richard Wrangham’s recent book, Catching Fire: How Cooking Made Us Human, focused on the taming of fire and its consequences for the quality of our food. Some scholars favor a memetic approach to this and other steps in the evolution of human diet. Memetics studies the rate of spread of the units of cultural information called memes, a term coined by Dawkins as an analogy to the more familiar concept of the gene. A meme can be, for instance, a particular method of making fire that makes its users better adapted to utilize certain food sources. As a rule, such a meme spreads in the population if it is advantageous to its carriers. Memes are transmitted between individuals by social learning, which has certainly been (and still is) very important in the evolution of human diet.
In the following paragraphs, we will review the biological and cultural evolution of hominid diets, concluding with three examples of cultural evolution that led to genetic changes in Homo sapiens.
The First Steps in the Savanna
The first hominid species arose 10 to 7 million years ago in late Miocene Africa. In particular, Sahelanthropus tchadensis, so far the oldest described hominid, has been dated to between 7.2 and 6.8 million years. Hominids probably evolved from an ape-like tree-climbing ancestor, whose descendants gradually became terrestrial bipeds with a substantially enlarged brain. The overall picture of human evolution has changed rather dramatically in recent years, and several alternative family trees for human origins have been proposed.
The major ecological setting for human evolution was the gradually drying climate of late-Miocene and Pliocene Africa. Early hominids responded to the change with a combination of biological and cultural adaptations that together enhanced their survival and reproduction in the changing environment. This adaptive complex probably included increasingly sophisticated bipedality, complex social behavior, making of tools, increased body size and a gradual change in diet. In part, the change in diet was made possible by stone tools used to manipulate food items. The oldest known stone tools date back to 2.6 million years. Stone tool technologies were certainly maintained and spread by social learning, and very likely the same was true for changes in foraging tactics and choice of food.
The main data sources on hominid paleodiets are fossil hominid remains and archaeological sites. Well-preserved fossils allow detailed analyses of dental morphology and microwear, as well as the use of paleodietary techniques that include stable isotope analysis of bone and dentine collagen and of enamel apatite. Other useful and widely applied methods include comparisons of fossils with extant species with known dental morphology and diets. The main problem with dental morphology and wear analyses is that they indicate the predominant type of diet rather than its diversity. Thus, it is always useful to combine paleodietary information from many sources. Archaeological sites may provide valuable information on refuse fauna, tools and home-range areas of hominids, all of which have implications for diet.
Much recent attention has been focused on stable isotope analysis of bone and dentine collagen. These techniques allow comparisons of animals consuming different types of plant diets. This is important because plant remains seldom fossilize, so the proportion of animals in the diets of early hominids is easily exaggerated. Stable isotope analysis makes it possible to distinguish between diets based on C3 plants and those based predominantly on C4 plants. C3 and C4 are two different biochemical pathways for carbon fixation in photosynthesis. Plants that utilize the C3 photosynthetic pathway discriminate against 13C, and as a result C3 plants have clearly depleted 13C/12C ratios. In contrast, plants that utilize the C4 photosynthetic pathway discriminate less against 13C and are, therefore, in relative terms, enriched in 13C. C4 plants are physiologically better adapted than C3 plants to conditions of drought, high temperatures and nitrogen limitation. Thus it is very likely that the drying climate of Africa increased the abundance and diversity of C4 plants relative to C3 plants.
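The logic of this comparison can be sketched numerically. The snippet below is an illustration only, not a method from the article: it back-calculates the C4 fraction of a diet from a collagen 13C measurement using a simple two-end-member linear mixing model. The end-member values, the diet-to-collagen offset and the VPDB standard ratio are typical literature figures assumed here for the sketch.

```python
# Illustrative sketch: inferring the C4 share of a diet from collagen
# delta-13C with a two-end-member linear mixing model. All constants
# below are assumed, typical literature values, not data from the article.

C3_ENDMEMBER = -27.0           # typical delta-13C (per mil) of C3 plant tissue
C4_ENDMEMBER = -13.0           # typical delta-13C (per mil) of C4 plant tissue
DIET_TO_COLLAGEN_OFFSET = 5.0  # approximate enrichment from diet to bone collagen

def delta13c(r_sample, r_standard=0.0112372):
    """delta-13C in per mil, relative to the VPDB standard 13C/12C ratio."""
    return (r_sample / r_standard - 1.0) * 1000.0

def c4_fraction(collagen_delta13c):
    """Fraction of C4-derived carbon in the diet, from a collagen delta-13C value."""
    diet = collagen_delta13c - DIET_TO_COLLAGEN_OFFSET  # back out the trophic offset
    frac = (diet - C3_ENDMEMBER) / (C4_ENDMEMBER - C3_ENDMEMBER)
    return min(1.0, max(0.0, frac))  # clamp to the physically meaningful range

# A hypothetical collagen value of -12 per mil implies a heavily C4-based diet:
print(round(c4_fraction(-12.0), 2))  # -> 0.71
```

In practice, real analyses must also handle mixed animal and plant food sources, which is one reason the isotope signal alone cannot separate eating C4 plants from eating animals that grazed on them.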
The traditional view on early hominids separated them into australopithecines that were considered predominantly fruit eaters, and species of the genus Homo—that is, H. habilis and H. erectus—who were either scavengers or hunters. This traditional separation has been challenged by paleodietary techniques that have highlighted the importance of changes in the makeup of the plant diet outlined above. While the ancestral apes apparently continued to exploit the C3 plants abundant in forest environments, the australopithecines broadened their diet to include C4 foods, which together with bipedalism allowed them to colonize the increasingly open and seasonal African environment.
This emerging difference in diet very likely contributed to the ecological diversification between apes and hominids, and was an important step in human evolution. The C4 plants foraged by australopithecines may have included grasses and sedges, although the topic is rather controversial. Interestingly, the use of animals as food sources may also result in a C4-type isotopic signature, if the prey animals have consumed C4 plants. Many researchers believe that a considerable proportion of the diet of australopithecines and early Homo consisted of arthropods (perhaps largely termites), bird eggs, lizards, rodents and young antelope, especially in the dry season.
Brain Size, Food, and Fire
Progressive changes in diet were associated with changes in body size and anatomy. As Robert Foley at the University of Cambridge has pointed out, increased body size may broaden the dietary niche by increasing home-range area (thus providing a higher diversity of possible food sources) and by enhancing tolerance of low-quality foods. A large mammal can afford to subsist on lower-quality foods than a small mammal can. Moreover, increased body size enhances mobility and heat retention, and may thus promote the ability to adapt to cooler climates. All these possibilities were realized in the hominid lineage.
In particular, the origin of H. erectus about 1.8 million years ago appears to mark a major adaptive shift in human evolution. H. erectus was larger than its predecessors and was apparently the first hominid species to migrate out of Africa. It also showed a higher level of encephalization (brain size relative to body size) than is seen in any living nonhuman primate species. Increased brain size, in turn, was associated with a change in diet. The increase in brain size probably started about 2.5 million years ago, with the gradual transition from Australopithecus to Homo. Because of the proportionately high energy requirements of brain tissue, the evolution of large human brain size has had important implications for the nutritional requirements of hominid species. According to the Expensive-Tissue Hypothesis, proposed in 1995 by Leslie Aiello of University College London and Peter Wheeler of Liverpool John Moores University, the high costs of large human brains are in part supported by energy- and nutrient-rich diets that in most cases include meat.
Increased use of C4 plants was indeed gradually followed by increased consumption of meat, either scavenged or hunted. Several factors contributed to increased meat availability. First, savanna ecosystems with several modern characteristics started to spread about 1.8 million years ago. This benefited East African ungulates, which increased both in abundance and in species diversity. For top predators such as H. erectus this offered more hunting and scavenging possibilities. The diet of H. erectus appears to have included more meat than that of the australopithecines and early Homo. H. erectus probably acquired mammalian carcasses by both hunting and scavenging. Archaeological evidence shows that H. erectus used stone tools and probably had a rudimentary hunting and gathering economy. Sharp-edged stone tools were important because they could slice through hide and thus allowed access to meat. These tools also made available tissues such as bone marrow and brain. Greater access to animal foods seems to have provided the increased levels of fatty acids that were necessary for supporting rapid hominid brain evolution.
As Richard Wrangham has persuasively argued, the domestication of fire had a great influence on the diet of our ancestors. Fire could be used in cooperative hunting, and to cook meat and plants. According to hominid fossil records, cooked food may have appeared as early as 1.9 million years ago, although reliable evidence of the controlled use of fire does not appear in the archaeological record until after 400,000 years ago. The routine use of fire probably began around 50,000 to 100,000 years ago. Regular use of fire had a great impact on the diet of H. erectus and later species, including H. sapiens. For instance, the cooking of savanna tubers and other plant foods softens them and increases their energy and nutrient bioavailability. In raw form, the starch in roots and tubers is not absorbed in the intestine and passes through the body as nondigestible carbohydrate. Cooking increases the nutritional quality of tubers by making more of the carbohydrate energy available for biological processes. It also decreases the risk of microbial infections. Thus, the use of fire considerably expanded the range of possible foods for early humans. Not surprisingly, the spread of our own species to all the main continents coincides with the beginning of the routine use of fire.
In relative terms, consumption of meat seems to have peaked with our sister species H. neanderthalensis. As Matt Sponheimer and Julia A. Lee-Thorp of Rutgers University and the University of Cape Town have pointed out on the basis of extensive evidence, “there can be little doubt that Neanderthals consumed large quantities of animal foods.” Remains of large and medium-sized mammals dominate Neanderthal sites. Neanderthals probably both hunted and foraged for mammal carcasses. Perhaps unromantically, they had a preference for small prey animals when hunting. And in northern areas colonized by the Neanderthals, there was probably no competition for frozen carcasses. The control of fire by the Neanderthals (and archaic modern humans) allowed them to defrost and use such carcasses.
The Carbohydrate Revolution
The Neolithic or Agricultural Revolution, a gradual shift to plant and animal domestication, started around 12,000 years ago. For our species this cultural innovation meant, among many other things, that the proportion of carbohydrates in our diet increased considerably. Cereal grains have accounted for about 35 percent of the energy intake of hunter-gatherer societies, whereas they make up about one-half of energy intake in modern agricultural societies—for example, among Finnish young adults. The Neolithic Revolution also included the domestication of mammals, which in favorable conditions guaranteed a constant supply of meat and other sources of animal protein.
Although fire likely played a role in the early utilization of carbohydrates, the big shift in diet brought about by plant domestication has its roots in the interplay of cultural change and biological evolution. Sweet-tasting carbohydrates are energy rich and therefore vital for humans. In the environment of Paleolithic hunter-gatherer populations, carbohydrates were scarce, so the ability to find and recognize sweet foods efficiently was important. When eaten, large polymers such as starch are partly hydrolyzed by the enzyme amylase in the mouth and further cleaved into sugars, whose sweet taste might have functioned as a signal for identifying nutritious food sources. (It is interesting to note that the fruit fly Drosophila melanogaster perceives as sweet the same compounds that we do.) Later, in Neolithic agriculture, when humans shifted to a starch-rich diet, the role of the amylase enzyme in the digestive tract became even more important in breaking down starch.
Salivary amylase is a relatively recent development that first originated from a pre-existing pancreatic amylase gene. A duplication of the ancestral pancreatic amylase gene developed salivary specificity independently both in rodents and in primates, emphasizing its importance in digestion. Additionally, its molecular biology gives us a new insight into how evolution has made use of copy number variations (CNVs, which include deletions, insertions, duplications and complex multisite variants) as sources of genetic and phenotypic variation; single nucleotide polymorphisms (SNPs) were once thought to have this role alone. CNVs may also involve complex gains or losses of homologous sequences at multiple sites in the genome, and structural variants can comprise millions of nucleotides with heterogeneity ranging from kilobases to megabases in size.
Analyses of copy number variation in the human salivary amylase gene (AMY1) found that copy number correlated with the protein level, and that isolated human populations with a high-starch diet had more copies of AMY1. Furthermore, copy number and diet did not share a common ancestry; local diets created a strong positive selection on the copy number variation of amylase, and this selective sweep may have coincided with the dietary change during the early stages of agriculture in our species. It is interesting to note that the copy number variation appears to have increased in the evolution of the human lineage: Salivary amylase protein levels are about six to eight times higher in humans than in chimpanzees and bonobos, which are mostly frugivorous and ingest little starch compared with humans.
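The statistical shape of such a comparison is simple to illustrate. The sketch below is not the study’s data or code; the copy-number lists are invented for illustration. It asks whether diploid AMY1 copy numbers differ between a hypothetical high-starch and a hypothetical low-starch sample, using a basic permutation test.

```python
# Minimal sketch of a copy-number comparison between two diet groups.
# The AMY1 copy-number lists are hypothetical, invented for illustration.
import random

high_starch = [6, 7, 8, 6, 9, 7, 6, 8]  # hypothetical diploid AMY1 copy numbers
low_starch = [4, 5, 6, 5, 4, 6, 5, 5]

def mean(xs):
    return sum(xs) / len(xs)

def permutation_p(a, b, n_iter=10_000, seed=0):
    """One-sided p-value for mean(a) > mean(b) under random relabeling."""
    rng = random.Random(seed)
    observed = mean(a) - mean(b)
    pooled = a + b
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)  # randomly reassign individuals to the two groups
        if mean(pooled[:len(a)]) - mean(pooled[len(a):]) >= observed:
            hits += 1
    return hits / n_iter

print(mean(high_starch) - mean(low_starch))          # -> 2.125
print(permutation_p(high_starch, low_starch) < 0.05)  # -> True
```

A permutation test is used here only because it needs no distributional assumptions; the published analyses of AMY1 variation were, of course, far more elaborate.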
Transition to Dairy Foods
A classic example of gene-culture coevolution is lactase persistence (LP) in human adults. Milk contains a sugar named lactose, which must be digested by the enzyme lactase before it can be absorbed in the intestine. The ability to digest milk as adults (lactose tolerance) is common in inhabitants of Northern Europe, where ancient populations are assumed to have used milk products as an energy source to survive the cold and dark winters, whereas in southern Europe and much of Asia, drinking milk after childhood often results in gastrointestinal problems. If the intestine is unable to break down lactose to glucose and galactose—due to lack of the lactase, or lactase-phlorizin hydrolase (LPH), enzyme, normally located in the villi of enterocytes of the small intestine—bacterial processing of lactose causes diarrhea, bloating and flatulence that can lead to fatal dehydration in infants. On the other hand, milk provides adults with a fluid and rich source of energy without bacterial contamination, enhancing their survival and fitness. Therefore, in the past the phenotype of lactase persistence undoubtedly increased the relative reproductive success of its carriers.
Recent findings of molecular biology show that a single-nucleotide polymorphism that makes isolated populations lactase persistent has been “among the strongest signals of selection yet found for any gene in the genome.” Lactase persistence emerged independently about 10,000 to 6,000 years ago in Europe and in the Middle East, two areas with a different history of adaptation to the utilization of milk. The earliest historical evidence for the use of cattle as providers of milk comes from ancient Egypt and Mesopotamia and dates from the 4th millennium B.C. Still today there are large areas of central Africa and eastern Asia without any tradition of milking, and many adults in these countries are physiologically unable to absorb lactose. The ancient Romans did not drink milk, and this is reflected in the physiology of their Mediterranean descendants today.
The first evidence for a SNP as a causative factor in LP came from a group of Finnish families. A haplotype analysis of nine extended Finnish families revealed that a DNA variant (C/T-13910) located in the enhancer element upstream of the lactase gene associated perfectly with lactose intolerance and, because it was observed in distantly related populations, suggested that this variant was very old. Later it was shown that this allele had emerged independently in two geographically restricted populations, in the Urals and in the Caucasus, the first time between 12,000 and 5,000 years ago and the second time 3,000 to 1,400 years ago. Yet Saudi Arabian populations that have a high prevalence of LP carry two different variants, introduced in association with the domestication of the Arabian camel about 6,000 years ago. In Africa, a strong selective sweep in lactase persistence produced three new SNPs about 7,000 years ago in Tanzanians, Kenyans and Sudanese, reflecting convergent evolution under a similar type of animal domestication and adult milk consumption.
All these facts indicate that there has been a strong positive selection pressure in isolated populations at different times to introduce lactose tolerance, and that this has taken place through several independent mutations, implying adaptation to different types of milking culture. Lactase persistence was practically nonexistent in early European farmers, based on the analysis of Neolithic human skeletons, but when dairy farming started in the early Neolithic period, the frequency of lactase persistence alleles rose rapidly under intense natural selection. The cultural shift towards dairy farming apparently drove the rapid evolution of lactose tolerance, making it one of the strongest pieces of evidence for gene-culture coevolution in modern humans. In other words, the meme for milking had local variants, which spread rapidly due to the positive effects they had on their carriers.
We must bear in mind, however, that the transcription of a gene is under complex regulation, as is the case for the C/T-13910 variant: It lies in an enhancer element through which several transcription factors probably contribute to the regulation of the lactase gene in the intestine. In addition, lactose tolerance in humans and the frequencies of milk protein genes in cattle appear to have coevolved. When the geographical variation in genes encoding the most important milk proteins in a number of European cattle breeds was compared with the prevalence of lactose tolerance in Europe, the high diversity of milk genes correlated geographically with lactose tolerance in modern Europeans and with the locations of Neolithic cattle-farming sites in Europe. This correlation suggests a gene-culture coevolution between cattle and human culture, leading towards larger herds with a wider distribution of gene frequencies and resulting in selection for increased milk production and a changed composition of milk proteins more suitable for human nutrition. In the future we will know even more about the geographical evolution of LP, as it has become possible to rapidly genotype large numbers of individuals harboring lactose tolerance-linked polymorphisms that produce various gastrointestinal symptoms after lactose ingestion.
We Are Still Evolving
As shown above, culture-based changes in diet (which can be called memes) have repeatedly generated selective pressures in human biological evolution, demonstrated for instance by the single nucleotide polymorphism of lactase persistence and the copy number variation of amylase. These selective sweeps took place 10,000 to 6,000 years ago, when animal and plant domestication started, marking the transition from the Paleolithic to the Neolithic era. Much earlier, genetic changes were certainly associated with the dietary changes of the australopithecines and H. erectus.
What about the future? Can we, for instance, see any selection pressure at loci of susceptibility to diet-associated diseases? The answer seems to be yes. The risk of Type II diabetes (T2D) has been suggested to be a target of natural selection in humans, as it has strong impacts on metabolism and energy production, and therefore on human survival and fitness. Genome-wide and hypothesis-free association studies have revealed a variant of the transcription factor 7-like 2 (TCF7L2) gene conferring risk of T2D. Later, a similar genome-wide T2D study in Finns increased the number of variants near TCF7L2 to 10. When the effects of TCF7L2 gene variants on T2D were further refined, a new variant of the same gene was identified that has been selected for in East Asian, European and West African populations. Interestingly, this variant, which appears to have originated approximately during the transition from Paleolithic to Neolithic culture, showed associations both with body mass index and with the concentrations of leptin and ghrelin, the hunger-satiety hormones. In support of the notion that selection is an ongoing process in human physiological adaptation, analysis of worldwide samples of human populations showed that loci associated with the risk of T2D have experienced recent positive selection, whereas susceptibility to Type I diabetes showed little evidence of being under natural selection.
In the near future, genome-wide scans for recent positive selection will increase our understanding of the coevolution between the ancient genome and diet in different populations, and of how that coevolution bears on modern nutritional problems. As has been suggested here, that understanding is likely to be considerably more nuanced than the simple “hunter-gatherer-genes-meet-fast-food” approach so often put forward.