Evolution

Concept

Among the dominant concepts of the modern world in general, and biology in particular, few are as powerful—or as misunderstood—as evolution. Even the name is something of a misnomer, since it almost implies some sort of striving to reach a goal, as though the "purpose" of evolution were to produce the most intelligent species, human beings. In fact, what drives evolution is not a quest for biological greatness but something much more down to earth: the need for organisms to survive in their environments. Closely tied to evolution are two processes, mutation and natural selection. Natural selection is a process whereby survival is related directly to the ability of an organism to fit in with its environment, while mutation involves changes in the genetic instructions encoded in organisms.

Although the English naturalist Charles Darwin (1809-1882) often is regarded as the father of evolutionary theory, he was not the first thinker to suggest the idea of evolution as such; however, by positing natural selection as a mechanism for evolution, he provided by far the most convincing theory of evolutionary biological change up to his time. In the years since Darwin, evolutionary theory has evolved, but the essential idea remains a sound one, and it is a "theory" only in the sense that it is impossible to subject it to all possible tests. The idea that evolution is somehow still open to question is another pervasive misconception, and it often appears hand in hand with the most pervasive misconception of all—that evolution is in some way anti-Christian, anti-religion, or anti-God.

How It Works

The "watch Analogy" and What It (Unintentionally) Teaches About Evolution

Throughout this essay, we discuss misconceptions relating to evolution. Such misconceptions have had such a strong impact on modern civilization that it is important to begin by setting aside a few misguided ideas that strike at the very heart of the evolutionary process. Many of these misconceptions are embodied in a popular "argument" against evolution that goes something like this: Suppose you took a watch apart and laid the pieces on the ground. If you came back in a billion years, would you really expect the watch to have assembled itself?

This argument is a virtual museum of all the fallacies associated with evolution. First of all, a watch (or any of the other variations used in similar arguments) is mechanical, not organic or biological, which is the class of objects under discussion within the framework of evolution. In that sense, the answer to this question is easy enough: No, a watch probably never would assemble itself, because it is not made of living material and it has no need for survival.

Another problem with the watch argument is that it starts with impossibly large pieces. Let us assume that the watch is a living being; even so, one would not expect its dials and gears to assemble themselves. But evolution does not make such claims: there is nothing in the theory of evolution to lead one to believe that a collection of organs lying around on a beach eventually would piece themselves together to make a whale.

According to what paleontologists (see Paleontology) and other scientists can deduce, over the course of three billion years life-forms evolved from extremely simple self-replicating carbon-based molecules to single-cell organisms. This is hardly what one would call breakneck speed. The more visible or "exciting" part of evolution, with the proliferation of species that produced the dinosaurs and (much later) humans, took place in the past billion years. In fact, the pace of change was still very, very slow until about half a billion years ago, and it has been accelerating ever since. For the vast majority of evolutionary history, however, change has been so slow that, by contrast, watching paint dry would be like playing a high-speed video game.

Ironically, for the watch scenario to be truly analogous to anything in evolution, one would have to start with atoms and molecules, not whole gears and dials. Opponents of evolutionary theory might take this fact as being favorable to their cause, but if the watch were made of living, organic material rather than metal, it is possible that the molecules would have some reason to join in the formation of organelles and, later, cells. Or perhaps they would not. Therein lies another problem with the watch analogy and, indeed, with many of the attempts to argue against evolution on a religious basis. This might be called the "fallacy of intention," or the idea that evolution is driven by some overall purpose.

The "fallacy of Intention."

Hidden in the watch analogy is the idea of the watch itself, the finished product, as a "goal." By the same analogy, the single-cell eukaryotes of a billion or two billion years ago were forming themselves for the purpose of later becoming pine trees or raccoons or people. This is not a valid supposition, as can be illustrated by analogies to human history.

The history of human beings, of course, has taken place over a much, much shorter span than evolutionary history. (The Paleontology essay contains several comparisons between the span of human life on Earth and Earth's entire existence.) Moreover, unlike cells, people do form goals and act on intentions, so if there were any good example of change with a goal in mind, it would have to come from human beings. Yet even in the few thousand years that humans have existed in organized societies, most trends have occurred not as part of a major plan but as a means of adapting to conditions.

Consider the situation of a group of nomads who lived in what is now southern Russia about 5,000 years ago. At some point, this vast collection of tribes began to migrate outward, some moving into an area that is now central Asia and the Indian subcontinent and others migrating westward. No sane person would argue that the westward-traveling members of this group knew that in moving to the geographically advantageous territory of Europe, they were putting in place conditions that would help give their descendants dominance over most of the planet some 4,500 years later. Rather, they were probably just trying to find better land for grazing their horses.

We cannot say what the Indo-Europeans, as they are known to history, were looking for. Our only evidence that they existed is the similarities between the languages of Europe, India, and Iran, similarities studied systematically by the German philologist and folklorist Jacob Grimm (1785-1863) at about the same time that Darwin was formulating his theory of evolution. Grimm, in fact, used methods not unlike those of Darwin, but instead of fossils he studied words and linguistic structures. Along the way, he found remarkable links, such as the Sanskrit word agni, cousin to the Latin term ignis and such modern English words as ignite.

In contrast to the Indo-Europeans, we know a great deal about another group of westward-moving nomads, the Huns of the fourth century A.D., who were indeed looking for better grazing lands. Dislocated from their native areas by the building of China's Great Wall, the Huns moved westward, displacing the Ostrogoths and driving other Gothic peoples across the Danube into Roman territory. This migration set in motion a domino effect that would bring an end to the Western Roman Empire in A.D. 476.

Did the Huns intend to destroy the Roman Empire and bring about the Middle Ages? No reasonable person would adopt such a conspiratorial view of history. Even more absurd, did the Chinese build the Great Wall with the idea of precipitating this entire chain of events? Again, no one would assert such a premise. If those trends in the evolution of societies were not goal-directed, why would we assume that cells and organisms would have to be striving toward a particular end to obtain certain results?

Confusing Evolution With God

In fact, there is no driving "purpose" to evolution—no scientifically based substitute for God operating from behind the scenes and manipulating the evolutionary process to achieve its ultimate aims. Evolution is not guided by any one large aim but by a million or a billion small aims—the need for a particular species of mollusk to survive, for instance.

As we discuss in the course of this essay, the idea of an underlying conflict between evolution and Christianity (or any other religion, for that matter) is almost entirely without merit. On the other hand, it is theoretically possible that all the processes of evolution took place without a creator—but this still should not pose a threat to anyone's idea of God.

There is nothing in evolution that would lead to the conclusion that there is no God, that the universe is not God's handiwork, or that God does not continue to engage in a personal relationship with each human. Neither is there anything in evolution that would lead to the conclusion that God does exist. Rather, the matter of God is simply not relevant to the questions addressed by evolution. In other words, evolution leaves spiritual belief where it should be (at least, according to Christianity): in the realm of individual choice.

Natural Selection

As we noted earlier, one of the principal mechanisms of evolutionary processes is natural selection. This in itself illustrates the lack of intention, or "goal orientation," in evolution. Like the name evolution itself, the term natural selection can be deceptive, implying that nature selects certain organisms to survive and condemns others to extinction. In fact, something quite different is at work.

Species tend to overproduce, meaning that the number of field mice, for instance, born in any year is so large that this entire population cannot possibly survive. The reason is that there is never enough of everything—food, water, or living space—for all members of the population to receive what they need. Therefore, only those best adapted to the environment are likely to survive.

Faster, Furrier Mice

Suppose, for instance, that the climate in the area where a population of field mice lives is very cold, and suppose that some of the field mice have more protective fur than others; obviously, the furrier mice are more likely to survive. If there are many speedy predators around, then judging on the basis of that factor alone, it would be easy to predict that the swiftest of the field mice would survive. Thus, faster-running, furrier mice would be "selected" over the slower or less furry mice.

Natural selection is not simply a matter of one particular mouse surviving in an environment. Instead, it involves the survival of specific strains, or lines of descent, that are more suited to the environment in question. Individuals adapted to an environment are more likely to live and reproduce and then pass on their genes to the next generation, while those less adapted are less likely to reproduce and pass on their traits. The genetic strains that survive are not "better" than those that do not—they are only better adapted.

The process of natural selection is ongoing. For example, in generation A, the furrier field mice survive and pass on their "furriness" gene to their offspring. Some of the offspring may still not be furry, and these mice will be less likely to survive and reproduce. In addition, since there are almost always several survival factors affecting natural selection, it is likely that other traits also will determine the survivability of certain individual mice and their genes.

For instance, there may be furry but slow mice in generation B, which despite their adaptation to temperature conditions are simply not fast enough to get away from predators. Therefore, the mice in generation C are likely to be furrier and faster than their ancestors. Additional survival factors may come into the picture, to ensure that the average member of generation D has sharper teeth in addition to swifter feet and a furrier body.

Although this illustration depicts evolutionary changes as taking place over the course of four generations, they are more likely to occur over the span of 400 or 4,000 or four million generations. In addition, the process is vastly more complicated than it has been portrayed here, because numerous factors are likely to play a part. The essential mechanism outlined here, however, prevails: certain traits are "naturally selected" because individuals possessing those traits are more capable of survival.
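To make the mechanism concrete, the following short simulation, written in Python, replays the field-mouse example with entirely invented numbers: the population size, the survival odds of furry and non-furry mice, and the starting rarity of the furry variant are all illustrative assumptions rather than measured values. The point of the sketch is simply that a variant conferring even a modest survival advantage tends to spread through a population over many generations.

```python
import random

# Minimal sketch of natural selection, with invented numbers: a "furry"
# variant gives field mice slightly better odds of surviving the winter,
# and its frequency in the population is tracked across many generations.

POP_SIZE = 1000
SURVIVAL_ODDS = {"furry": 0.60, "not_furry": 0.50}  # hypothetical values

def next_generation(population):
    # Only survivors reproduce; each offspring inherits its parent's trait.
    survivors = [mouse for mouse in population if random.random() < SURVIVAL_ODDS[mouse]]
    return [random.choice(survivors) for _ in range(POP_SIZE)]

population = ["furry"] * 100 + ["not_furry"] * 900  # the variant starts out rare

for _ in range(200):
    population = next_generation(population)

print("frequency of the furry variant:", population.count("furry") / POP_SIZE)
# No individual mouse ever changes; the population's makeup shifts because
# furrier mice leave slightly more descendants, generation after generation.
```

Run repeatedly, the final frequency varies somewhat because survival is random, but with these settings the furry variant almost always ends up dominating the population.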

The "survival of the Fittest."

The concept of natural selection sometimes is rendered popularly as the "survival of the fittest." Scientists are less likely to use this phrase for several reasons, including the fact that it has been associated with distasteful social philosophies or murderous political ideologies—for example, Nazism. Additionally, the word fittest is a bit confusing, because it implies "fitness," or the quality of being physically fit.

This implication, in turn, might lead a person to believe that natural selection entails the survival of the strongest, which is not the case. Yet this is precisely what proponents of a loosely defined philosophy known as social Darwinism claimed. Popular among a wide range of groups and people in the late nineteenth and early twentieth centuries, social Darwinism could be used in the service of almost any belief. Industrialists and men of wealth asserted that those who succeeded financially did so because they were the fittest, while Marxists claimed that the working class ultimately would triumph for the same reason. Across the political spectrum, social Darwinism confused the meaning of "fittest" with that of other concepts: "strongest," "most advanced," or even "most moral." All of this, it need hardly be said, is misguided, not least because evolutionary theory has nothing to do with race, ethnicity, or social class.

In fact, "survival of the fittest," in a more accurate interpretation, means that individuals that "fit," or "fit in with," their environments are those most likely to survive. This is a far cry from any implication of strength or superiority. Imagine a group of soldiers in combat: Which type of soldier is most likely to survive? Is it the one who scores highest on physical training tests, looks the finest in a uniform, comes from a more socially upper-class home, and has the most advanced education? Or is it the one who keeps his head low, acts prudently, does not rush into dangerous situations without proper reconnaissance, and obeys instruction from qualified leaders?

Clearly, the second set of characteristics has much more to do with survival, even though these qualities may seem less "noble" than the first set. Yet it is by adapting, or proving his or her adaptability, to the environment of war that a soldier survives—not by displays of strength or other types of "fitness" that simply appear impressive. In the same way, the fitness of a species does not necessarily have anything to do with strength: after all, the lion, the "king of beasts," would die out in a polar climate or a desert or an aquatic environment.

Mutation

Although natural selection is of principal importance in evolution, mutation also plays a pivotal role. Mutation is the process whereby changes take place in the genetic blueprint for an organism as a result of alterations in the physical structure of an organism's DNA (deoxyribonucleic acid). DNA is a molecule in all cells and in many viruses that contains genetic codes for inheritance. DNA carries genetic information that is transmitted from parent to offspring; when a mutation occurs, this new genetic information—often quite different from the genetic code received by the parent from the grandparent—is passed on instead.

Under normal conditions of reproduction, a copy of the DNA from the parent is replicated and transmitted to the offspring. The DNA from the parent normally is copied exactly, but every once in a while errors arise during replication. These errors usually originate in noncoding regions of the DNA and therefore have little effect on the observable traits of the offspring. On the other hand, some mutations may be lethal, in which case the offspring does not survive long enough for the mutation to become apparent. In a very few cases, however, offspring with a slightly modified genetic makeup manage to survive.

Contrast With Acquired Characteristics

Mutation is not to be confused with the inheritance of acquired characteristics, a fallacious doctrine that had its adherents when Darwin was a young man. Taken to an extreme, this doctrine would imply that a lumberjack who loses his arm cutting down a tree and later conceives a child with his wife would father a child missing an arm. This notion is absurd, and attempts to put forward a workable theory of acquired characteristics in the late eighteenth and early nineteenth centuries involved much greater subtlety. Still, the idea is misguided.

The French natural philosopher Jean Baptiste de Lamarck (1744-1829), one of the leading proponents of acquired characteristics, maintained that giraffes had gained their long necks from the need to stretch and reach leaves at the top of tall trees. In other words, if a giraffe parent had to stretch its neck, a giraffe baby would be born with a stretched neck as well. Later, Darwin's natural selection provided a much more plausible explanation for how the giraffe might have acquired its long neck: assuming that the nutrients it needed were at the highest levels of the local trees, the traits of tallness, long necks, and the ability to stretch would be selected naturally among the giraffe population.

Mutations and Survival

Unlike the idea of acquired characteristics, mutation does not entail the inheritance of anatomical traits acquired in the course of an organism's life; rather, it is changes in the DNA that are passed on. For example, when mind-altering drugs became popular among young people in the 1960s, concerns were raised that the offspring of drug takers might suffer birth defects as a result of alterations in their DNA. For the most part, this did not happen. Conditions such as Huntington disease and cystic fibrosis, however, are the result of mutations in DNA; so, too, is albinism, which eliminates skin pigment.

Although mutations often are regarded as undesirable because they can affect the health of individuals adversely, they also can have positive effects for the population in question. Suppose a group of bacteria is exposed to an antibiotic, which rapidly kills off the vast majority of the bacteria. A small fraction of the bacteria, however, may carry a mutation that makes them resistant to the medication. These mutant bacteria will survive and reproduce, creating more mutants and in time yielding an entire population resistant to the antibiotic.

This is the reason why antibiotics can lose their effectiveness over time: bacteria carrying resistance mutations can eventually render a once useful antibiotic ineffective. The same often happens with insect sprays, as roaches and other pests develop into mutant strains that are capable of surviving exposure to these pesticides. Such species, with their short cycles of birth, reproduction, and death, are extremely well equipped for survival as a group, which explains why many an unpleasant "bug" (whether a bacterium or an insect) has long been with us. (See Mutation for more on this subject.)
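The bacterial case can be sketched in the same spirit. In the following toy model, again with invented figures, a small number of resistant cells are already present when antibiotic exposure begins; because susceptible cells rarely survive a round of treatment, the resistant strain takes over within a handful of generations.

```python
import random

# Toy model of antibiotic resistance with invented numbers: a rare resistant
# strain is present when treatment begins, and under continued antibiotic
# exposure only resistant cells reliably survive to repopulate.

CAPACITY = 100_000
# Hypothetical per-generation survival odds under antibiotic exposure.
SURVIVAL = {"resistant": 0.99, "susceptible": 0.05}

population = ["resistant"] * 10 + ["susceptible"] * 99_990  # mutants start at 0.01%

for generation in range(1, 9):
    survivors = [cell for cell in population if random.random() < SURVIVAL[cell]]
    # Survivors divide until the population is back at carrying capacity.
    population = [random.choice(survivors) for _ in range(CAPACITY)]
    fraction = population.count("resistant") / CAPACITY
    print(f"generation {generation}: resistant fraction = {fraction:.3f}")

# With these made-up numbers, the resistant strain dominates within a few
# generations; no individual bacterium "learned" resistance, the antibiotic
# simply removed its competition.
```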

Real-Life Applications

"proving" Evolution

Later in this essay, we look at examples of evolution in action and other phenomena that support the ideas of evolutionary theory. But before examining these many "proofs" of evolution, a few words should be said about the very fact that evolution seems to require so much more proof than most other scientific theories.

All scientific ideas must be capable of being proved or disproved, of course, but the demand for proof in the case of evolution goes far beyond the usual rigors of science. In fact, at this point, the people demanding proof are not scientists but certain sectors of the population as a whole—in particular, religious groups or individuals who fear evolution as a challenge to their beliefs.

Quantum Mechanics: A Much More Difficult Idea

By contrast, quantum mechanics, though it encompasses ideas completely opposed to common sense, has not sustained anything approaching the same challenge or demand for proof that evolution has encountered from nonscientists. A theory in physics and chemistry that details the characteristics of energy and matter at the subatomic level, quantum mechanics goes against such common assumptions as the idea that we can know both the position and the speed of an object at the same time. It is as though science had proved that down was up and up was down. If there were ever a "dangerous" theory, inasmuch as it undermines all our assumptions about the world, it is quantum mechanics, not evolution, which is a fairly straightforward idea by comparison.

Quantum mechanics has gone virtually unchallenged (at least on a social or moral, as opposed to a scientific, basis), whereas even today there are many people who refuse to accept the idea of evolution. Granted, quantum mechanics is a much younger idea, having originated only in the 1920s, and it is vastly more difficult to understand. But the real reason why evolution has come under so much more challenge, of course, has to do with the fact that it is perceived (mistakenly) as challenging the primacy of God.

Just a Theory?

One of the aspects of evolution often cited by opponents is the fact that it is, after all, the theory of evolution. The implication is that if it is still just a theory, it must be open to question. In a sense, this is accurate: for scientific progress to continue, ideas should never be accepted as absolute, unassailable truths. But this is not what opponents of evolution are getting at when they cite its status as a "mere" theory. In fact, their use of this point as a basis for attack only serves to illustrate a misunderstanding with regard to the nature of scientific knowledge.

The word theory in "theory of evolution" simply means that evolutionary ideas have not been and, indeed, cannot be tested in every possible circumstance. Most ideas in science are simply theories rather than laws because in few cases is it possible to say with absolute certainty that something always will be the case. One of the few actual scientific laws is the conservation of energy, which holds that for all natural systems the total amount of energy remains the same, though transformations of energy from one form to another take place. This has been tested in such a wide variety of settings and circumstances that there is no reason to believe that it would ever not be the case.

By contrast, there probably never will be enough tests on evolution to advance it to the status of a law. The reason is quite simply that evolution takes a long time. Some examples, such as the instances of industrial melanism that we discuss later, unfold within a short enough period of time that humans can observe them. In general, however, evolutionary processes take place over such extraordinarily long spans of time that it would be impossible to subject them to direct observation.

None of this, however, does anything to discredit evolutionary theory. For that matter, the idea that the entire physical world is made of atoms is still technically a theory, though there is no significant movement of people attempting to discredit it. The reason, of course, is that atomic theory does not seem to contradict anyone's idea of God. (This was not always the case, however. Almost 2,500 years ago, a Greek philosopher named Democritus developed the first atomic theory, but because his ideas were associated with atheism, atomic theory was largely rejected for more than two millennia.)

Facing the Facts

If people really understood the word theory, they would give it a great deal more respect. Unfortunately, the word so often is misused and applied to anything that has not been proved that it has begun to seem almost like an insult to call evolution a theory. After all, in the present essay, we refer to acquired characteristics as a theory, and in everyday life one often hears much less respectable ideas given the status of theory. For this reason, it is worth taking note of the process, from observation to hypothesis to the formulation of general statements, that goes into the development of a truly scientific theory.

In forming his theory of evolution, Darwin began with several observations about the natural world. Among the things he observed is the fact, which we noted earlier, that for a particular species, more individuals are born than can possibly survive with available resources. On the basis of this observation, he formed a hypothesis, or inference. His inference was that because populations are greater than resources, the members of a population must compete for resources.

A theory is made up of many hypotheses, but to proceed from a collection of hypotheses to a true theory, these inferences must be subjected to rigorous testing. Thus, Darwin, in effect, said to himself, "Is what I have said true? Are there more individuals of a species than there are available resources?" Then he began looking for examples, and like a true scientist, he did so with the attitude that if he found examples that contradicted his hypothesis, he would reject the hypothesis and not the facts.

As it turns out, of course, there are always more members of a population than there are resources. This can be illustrated in a small way by observing a litter of puppies or piglets struggling to obtain milk from their mother. Chances are that the mother will not have enough teats for all her babies, and the "runt," unless it is able to force its way through the others to the milk source, may die. Only after testing this hypothesis and other hypotheses, such as that of natural selection, did Darwin formulate his theory.

Evolution and Religion

The fact that some puppies or piglets die for lack of milk is not a nice or pleasant thought, but it is the truth. Again, like a true scientist, Darwin accepted reality, without attempting to mold it to fit his personal beliefs about how things should be.

As a great thinker from the generation that preceded Darwin's, the Scottish philosopher David Hume (1711-1776), wrote in his Enquiry Concerning Human Understanding: "There is no method of reasoning more common, and yet more blamable, than, in philosophical disputes, to endeavor the refutation of a hypothesis, by a pretense of its dangerous consequences to religion and morality." In other words, there is an understandable, but nonetheless inexcusable, human tendency to evaluate ideas not on the basis of whether they are true but rather on the basis of whether they fit with our ideas about the world.

A scientist may be a Christian, or an adherent of some other religion, and still approach the topic of evolution scientifically—as long as he or she does not allow religious convictions to influence acceptance or nonacceptance of facts. The scientist should start with no preconceived notions and no allegiance to anything other than the truth. If that person's religious conviction is strong enough, it can weather any new scientific idea.

Confusing Atheism With Science

This brings up an important point regarding the alleged conflict between religion and science. Not all the blame for this belongs with religious groups or individuals who shut their minds to scientific knowledge. Many scientists over the years likewise have adopted the fallacy of maintaining that religion and science are somehow linked, in this case using scientific facts as a basis for rejecting religion.

One such scientist was Darwin himself, who embraced agnosticism because his own findings had convinced him that the biblical account of creation could not be literally true. In this religious choice, he was following in a family tradition: his grandfather, the physician and naturalist Erasmus Darwin (1731-1802), belonged to the mechanist school, a muddle of atheism, bad theory, and genuine science.

The mechanists claimed that humans were mere machines whose activities could be understood purely in terms of physical and chemical processes. Claims such as these ultimately led to the discrediting of their movement, whose ideas failed to explain such biological processes as growth. At the same time, such mechanist philosophers as the French physician and philosopher Julien de La Mettrie (1709-1751) went far beyond the territory of science, teaching that atheism was the only road to happiness and that the purpose of human life was to experience pleasure.

The thinker who perhaps did the most to confuse science and atheism was one of Darwin's most significant early followers, the German natural scientist and philosopher Ernst Haeckel (1834-1919). It was Haeckel, not Darwin, who first proposed an evolutionary explanation for the origin of human beings, which, of course, was a major step beyond even Darwin's claim that all of life had evolved over millions of years.

In the course of developing this idea, Haeckel, who was a practicing Christian until he read Darwin's On the Origin of Species by Means of Natural Selection, renounced his faith and adopted a belief system he called monism, which is based on the idea that there is only a physical realm and no spiritual one. Technically, Haeckel was not an atheist but a pantheist, since his philosophy included the idea of a single spirit that lives in all things, both living and nonliving. Whatever the case, Haeckel's monism is no more scientific than Christianity.

Humans and "monkeys."

It is interesting that the man who put forward the notorious idea that humans and apes are related also would attempt to turn evolution into a sort of "proof" of atheism. In fact, the evolutionary connection between humans and lower primates, or "monkeys," has long been the most powerful point of contention between religion and evolution.

This, in fact, remains one of the most challenging aspects of evolutionary theory—not because it is hard to see how the human body is similar to an ape's body but because there is such a vast difference between a human mind and that of an ape. Whereas our physical similarity to primates is easy to establish, the fact is that no other animal—ape, dolphin, pig, or dog—comes close to humans in terms of reasoning ability. Nor is it reasoning ability alone that separates humans from other animals. Humans possess a propensity for conceptualization and a level of self-awareness that set them completely apart from other creatures, so much so that the brains of apes, cats, birds, and even frogs seem more or less alike compared with that of a human.

Animals are concerned with a few things: eating, sleeping, eliminating waste, and procreating. Some mammals have the ability to engage in play, but there is still no comparison between even the most advanced mammalian brains and that of a human. Other primates have the ability to use sticks or stones as tools, but only humans—practically from the first appearance of the human line some 2.5 million years ago—have the ability to fashion tools. Only humans are gifted, or cursed, with restless minds ever in search of new knowledge.

Does any of this disprove evolution? It does not. Does it pose a significant challenge to the idea that humans and other primates evolved from a common ancestor? Not as it has been stated here. All that has been said in the preceding paragraphs is simply a matter of everyday observation; it is not a scientific hypothesis, let alone a theory. Clearly, there are some questions still to be answered as to why and how humans developed brains so radically different from those of other primates, but the place for such questioning is within the realm of science, not outside it.

Creationism

Another thing we can say about the human mind is that it has a tendency to mold ideas toward its own preconceptions as to how things should be. As Hume observed, there is a great temptation, in the minds of all people, to demand that scientific facts conform to a particular set of religious or political beliefs. Such is the case with creationism and "intelligent design theory," two purportedly scientific belief systems whose adherents have attempted to challenge evolutionary theory.

Creationism, which sometimes goes by the name of creation science, is based on the belief that God created the universe and did so in a very short period of time. This claim, creationists maintain, can be supported by scientific evidence. Scientific evidence, however, is not really what drives creationism, which is based on a literal reading of the first two chapters of the Book of Genesis. Taken to an extreme, this means that God created the universe about 6,000 years ago in six days of 24 hours each.

Adherents of creationism begin with the premise of a six-day Creation (or at least, a very young Earth) and then look for facts to support the premise—exactly the opposite of the approach taken by true science. The findings of creationists do not change much over the years, unlike evolutionary science, which has continued to develop with new discoveries.

Sometimes creationists attempt to use the findings of evolutionary science against it. For instance, they may interpret industrial melanism (the adaptation of moths to discoloration in the environment caused by pollution, discussed later in this essay) as proof that organisms can change very quickly. This, of course, overlooks the fact that moths have very short generation times compared with humans; many moth generations, and therefore measurable evolutionary change, can pass within a few decades, whereas comparable change in longer-lived organisms requires far longer spans of time. Creationists also point to areas of evolutionary theory where all scientists are not in agreement, citing these as "proof" that the whole theory is unsound.

Intelligent Design Theory and the Court Battle

In contrast to creationism, intelligent design theory is not based on any particular religious position. Instead, it begins with an observation that would find a great deal of agreement among many people, including those who support evolutionary theory. The idea is that evolution alone does not explain fully how life on Earth came to exist as it does, with all its complexity and order. According to intelligent design theory, there must have been some intelligence behind the formation of the universe.

There is another contrast between intelligent design theory and creationism. Whereas it is hard to imagine a genuine scientist embracing creationism, it is not difficult at all to picture a scientific thinker adopting the viewpoint of intelligent design. In fact, this has happened, though long before the "movement" had a name.

Darwin's contemporary, the English naturalist Alfred Russel Wallace (1823-1913), who published his own theory of evolution at about the same time as Darwin's Origin of Species, parted ways with Darwin because he maintained that there must be a spiritual force guiding evolution. Only such a force, he maintained, could explain the human soul. From a philosophical and theological standpoint, this idea has a great deal of merit, but because it cannot be tested, it cannot truly be regarded as science.

Neither creationism nor intelligent design has received any support in the scientific community—nor, during court battles over the teaching of creationism in the public schools during the 1980s, did that idea receive the support of the United States justice system. Creationism, the courts ruled, is a religious and not a scientific doctrine. Evolutionary theory is based on an ever increasing body of evidence that is both observable and reproducible. To teach these other doctrines alongside evolution in the public schools would convey the impression that creationism and intelligent design had been subjected to the same kinds of rigorous tests that have been applied to evolution, and this is clearly not the case.

Evidence for Evolution

A great deal of evidence for evolution appeared in the seminal text of evolutionary theory (mentioned previously), On the Origin of Species by Means of Natural Selection, which Darwin published in 1859. In fact, he had collected much of the evidence he discusses in this volume nearly three decades earlier, from 1831 to 1836, aboard a scientific research vessel off the coast of South America. (He delayed publication because he rightly feared the controversy that would ensue and resolved to present his ideas only when he learned that Wallace had developed his own theory of evolution.)

Just 22 years old, Darwin traveled on the HMS Beagle, from which he collected samples of marine life. His most significant work was done on the Galápagos Islands some 563 mi. (900 km) west of Ecuador. As he studied organisms there, Darwin found that they resembled species in other parts of the world, but they were also unique and incapable of interbreeding with similar species on the mainland. He began to suspect that for any particular environment, certain traits came to the forefront, favored for survival by nature.

Back in England, he already had seen such a mechanism at work in the artificial breeding of pigeons, whereby breeders favored certain gene pools—for instance, white-tailed birds—over others. (Breeders of dogs and other animals today still employ artificial-selection techniques to produce desirable strains.) Darwin posited a similar process of selection in nature, only this one was not artificial, directed by a goal-oriented human intelligence, but natural and guided by the need for survival.

The Spread of Species

Among the phenomena Darwin observed in the Galápagos was the differentiation among the 13 varieties of finch (a type of bird) on the islands as well as the contrasts among these finches and their counterparts on the mainland. As Darwin began to discover, they shared many characteristics, but each variety had its own specific traits (for instance, the ability to crack tough seeds for food) that allowed it to fill a particular niche in its own environment.

From the beginning Darwin was influenced by the recent findings in geology, a newly emerging science whose leading figures maintained that Earth was very, very old. (These scientists included the Scottish geologist Charles Lyell [1797-1875], whose Principles of Geology, published between 1830 and 1833, Darwin read aboard the Beagle.) The relationship between geology and evolution has persisted, and findings in the earth sciences continue to support evolutionary theory.

Among the leading ideas in geology and other geosciences since the mid-twentieth century is plate tectonics, which indicates (among other things) that the continents of Earth are constantly moving. (See Paleontology for further discussion of this topic.) This idea of continental drift provided a mechanism for species differentiation of the kind Darwin had observed.

It appears that in the past, when the landmasses were joined, organisms spread over all available land. Later, this land moved apart, and the organisms became isolated. Eventually, different forms evolved, and in time these distinct organisms became incapable of interbreeding. This is what occurred, for instance, when the Colorado River cut open the Grand Canyon, separating groups of squirrels that lived in the high-altitude pine forest. Eventually, the separated populations ceased to interbreed, and today the Kaibab squirrel of the northern rim and the Abert squirrel of the southern rim are separate species.

Common Ancestry

Darwin recognized that some of the best evidence for evolution lies hidden within the bodies of living creatures. If organisms have a history, he reasoned, then vestiges of that history will linger in their bodies—as studies in comparative anatomy show. An example is a phenomenon that sounds as if it is made up, but it is very real: snake hips. Though their ancestors ceased to walk on four legs many millions of years ago, snakes still possess vestigial hind limbs as well as reduced hip and thigh bones.

In some cases widely divergent organisms possess a common structure, adapted to their individual needs over countless generations yet reflective of a shared ancestor. A fascinating example of this is the pentadactyl limb, a five-digit appendage common to mammals and found, in modified form, among birds. The cat's paw, the dolphin's flipper, the bat's wing, and the human hand are all versions of the same original, an indication of a common four-footed ancestor that likewise had limbs with five digits at the end.

The embryonic forms of animals also reflect common traits and shared evolutionary forebears. This is why most mammals look remarkably similar in early stages of development. In some cases animals in fetal form will manifest vestigial features reflective of what were once functional traits of their ancestors. Thus, fetal whales, while still in their mothers' wombs, produce teeth after the manner of all vertebrates (creatures with an internal framework of bones), only to reabsorb those teeth, which they will not need in a lifetime spent filtering plankton through the baleen in their jaws.

The molecular "language" of DNA also provides evidence of shared evolutionary lineage. When one studies the DNA of humans and chimpanzees, very close similarities rapidly become apparent. Likewise, there are common structures in the hemoglobin, or red blood cells, of different types of organisms. Comparisons of hemoglobin make it possible to pinpoint the date of the last common ancestor of differing species. For example, hemoglobin analysis reveals an ancestor common to humans and frogs dating back 330 million years, whereas the common human and mouse ancestor lived 80 million years ago, and the ancestor we share with the rhesus monkey walked the earth "only" 26 million years ago.

The Fossil Record

The fossil record also provides an amazing amount of evidence concerning common ancestors. Fossilized remains of invertebrates (animals without an internal skeleton), vertebrates, and plants appear in the strata, or layers, of Earth's surface in the same order that the complexities of their anatomy suggest. The evolutionarily older organisms lie deeper, in the older layers, beneath the remains of more recent organisms. Geologists are able to date rock strata with reasonable accuracy, and the age of a layer always correlates with the fossils discovered there. In other words, there would never be a stratum dating back 400 million years that contained fossils of mastodons, which evolved much later.

A fossil is the remains of any prehistoric life-form, especially those preserved in rock before the end of the last ice age, about 10,000 years ago. The process by which a once living thing becomes a fossil is known as fossilization. Generally, fossilization involves changes in the hard portions, including bones, teeth, and shells. This series of changes, in which minerals are replaced by different minerals, is known as mineralization.

Fossilized remains of single-cell organisms have been found in rock samples as old as 3.5 billion years, and animal fossils have been located in rocks that date to the latter part of Precambrian time, as long ago as one billion years. Certain fossil types, known as index fossils or indicator species, have been associated strongly with particular intervals of geologic time. An example is the ammonoid, a mollusk that proliferated for about 350 million years, from the Devonian period to the end of the Cretaceous, before experiencing mass extinction.

The fossil record is far from an open book, however, and interpreting fossil evidence requires a great deal of judgment. All manner of natural phenomena such as earthquakes can destroy fossil beds, rendering the evidence unreadable or at least unreliable. Nor is it a foregone conclusion that the animals who left behind fossils are fully representative of the species existing at a given time. Fossils are far more likely to be preserved in certain kinds of protected aquatic environments, for instance, than on land (particularly at higher elevations, where erosion is a significant factor), and therefore paleontologists' knowledge of life forms in the distant past is heavily weighted toward marine creatures.

Faunal Succession and Other Forms of Dating

Key to the demonstration of evolution is the age of samples and the idea that many of the processes described took place a long, long time ago. This raises the question of how scientists know the age of things. In fact, they have at their disposal several techniques, both relative and absolute, for dating objects.

One of the earliest ideas of dating in geology was faunal dating, or the use of bones from animals (fauna) to determine age. This was the brainchild of the English engineer and geologist William Smith (1769-1839), whose work is an example of the fact that evolutionary ideas were "in the air" long before Darwin. While surveying land for a set of canals in southern England, Smith discovered that any given stratum contains the same types of fossils, and therefore strata in two different areas can be correlated. Smith stated this in what became known as the law of faunal succession: all samples of any given fossil species were deposited on Earth, regardless of location, at more or less the same time. As a result, if a geologist finds a stratum in one area that contains a particular fossil and a stratum in a distant area containing the same fossil, it is possible to conclude that the two strata were deposited at about the same time.

Faunal succession is relative, meaning that it does not provide clues as to the actual age in years of a particular sample. Since the mid-twentieth century, however, scientists have had at their disposal several means of absolute dating, which make it possible to determine the rough age of samples in years. Most of these methods rest on the fact that, over time, a radioactive substance converts into another substance at a known rate. By comparing the ratio between the two, it is possible to estimate the amount of time that has elapsed since the organism died or the rock formed.

Chief among the techniques for absolute dating is radiometric dating, which uses the ratio between the stable and radioactive isotopes in a sample. Isotopes are atoms of the same element that differ in their number of neutrons, or electrically neutral subatomic particles; radioactive isotopes are ones that spontaneously decay over time, ejecting high-energy particles as they do so. Because scientists know how long it takes for half of the radioactive isotopes in a given sample to decay (the half-life), they can judge the age of the sample by examining how much of the radioactive material remains relative to the stable material. In the case of uranium, one isotopic form, uranium 238, has a half-life of 4,470 million years, which is very close to the age of Earth itself.
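The half-life arithmetic can be made concrete with a short calculation. Because radioactive decay follows a fixed exponential schedule, the fraction of the original radioactive isotope still present determines the elapsed time; in practice that fraction is inferred from the measured ratio of the remaining isotope to its stable decay products. Apart from the uranium 238 half-life quoted above, the numbers below are invented for illustration.

```python
import math

# Sketch of the arithmetic behind radiometric dating. A radioactive parent
# isotope decays on a fixed exponential schedule, so the fraction of it
# still remaining fixes the elapsed time:
#
#     remaining_fraction = (1/2) ** (elapsed / half_life)
#     =>  elapsed = half_life * log2(1 / remaining_fraction)

HALF_LIFE_U238 = 4470.0  # half-life of uranium 238, in millions of years

def age_from_remaining_fraction(remaining_fraction, half_life):
    """Elapsed time, in the same units as the half-life."""
    return half_life * math.log2(1.0 / remaining_fraction)

# Made-up measurement: if 90% of the original uranium 238 in a mineral is
# still present, the mineral formed roughly 680 million years ago.
age = age_from_remaining_fraction(0.90, HALF_LIFE_U238)
print(f"estimated age: about {age:.0f} million years")
```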

Evolution at Work

Every creature that exists today is the result of an incredibly complex, lengthy series of changes brought about by mutation and natural selection, changes that shaped the evolution of that life-form. Take, for instance, the horse, whose evolutionary background is as well documented as that of any creature.

The horse family, or Equidae, had its origins at the beginning of the Eocene epoch about 54 million years ago. This first ancestor, known as Hyracotherium or eohippus ("dawn horse"), was extremely small—only about the size of a dog. In addition, it had four hooves on its front feet and three on each rear foot, with all of its feet being padded, which is quite a contrast with the four unpadded, single-hoofed feet of the modern horse. These and other features, such as head size and shape, constitute such a marked difference from what we know about horses today that many scientists have questioned the status of eohippus as an equine ancestor. However, comparison with fossils from later, also extinct, horses shows a clear line of descent marked by an increase in body size, a decrease in the number of hooves, an elimination of foot pads, lengthening of the legs and fusion of the bones within, development of new teeth suited for eating grass, an increase in the length of the muzzle, and a growth in both the size and development of the brain.

Of course, this was not a clear-cut, neat, and steadily unfolding process, and some features appeared abruptly; still, the progression is there to be observed in the fossil record. Over the course of the many millions of years since eohippus, species have emerged that were distinguished by a particular feature—for example, teeth size and shape—only to disappear if conditions favored species with other traits. Evolutionary lines have branched off, with some dead-ending, and others continuing.

Thus, during the Miocene epoch, which lasted from about 26 million to 7 million years ago, various evolutionary branches competed for a time until the emergence of Parahippus. This species had teeth adapted for eating grass, in contrast to those of earlier horse ancestors, which grazed on leaves and other types of vegetation that did not require strong teeth. After Parahippus came Merychippus, which resembled a modern pony, and from which came numerous late-Miocene evolutionary lines. Most of these were three-toed, but Pliohippus had one toe per foot, and it was from this form that the genus Equus (which today includes horses, donkeys, and zebras) began to emerge in the late Pliocene epoch about 3 million years ago.

Industrial Melanism and the Peppered Moth

Despite the staggering spans of time involved in evolution, one need not look back billions of years to see evolution at work. Both natural selection and mutation play a role in industrial melanism, a phenomenon whereby the processes of evolution can be witnessed within the scale of a human lifetime. Industrial melanism is the high level of occurrence of dark, or melanic, individuals from a particular species (usually insects) within a geographic region noted for its high levels of dark-colored industrial pollution.

With so much pollution in the air, trees tend to be darkened, and thus a dark moth stands a much greater chance of surviving, because predators will be less able to see it. At the same time, there is a mutation that produces dark-colored moths, and in this particular situation, these melanic varieties are selected naturally. On the other hand, in a relatively unpolluted region, the lighter-colored individuals of the same species tend to have the advantage, and therefore natural selection does not favor the mutation.

The best-known example of industrial melanism occurred in a species known as the peppered moth, or Biston betularia, which usually lives on trees covered with lichen. (An example of a lichen is reindeer "moss"; see Symbiosis.) Prior to the beginnings of the Industrial Revolution in England during the late eighteenth century, the proportion of light-colored peppered moths was much higher than that of dark-colored ones, both forms being members of the same species, differentiated only by appearance.

As the Industrial Revolution got into full swing during the 1800s, factory smokestacks put so much soot into the air in some parts of England that it killed the lichen on the trees, and by the 1950s, most peppered moths were dark-colored. It was at that point that Bernard Kettlewell (1907-1979), a British geneticist and entomologist (a scientist who studies insects), formed the hypothesis that the peppered moths' coloration protected them from predators, namely birds.

Kettlewell therefore reasoned that, before pollution appeared in mass quantities, light-colored moths had been the ones best equipped to protect themselves because they were camouflaged against the lichen on the trees. After the beginnings of the Industrial Revolution, however, the presence of soot on the trees meant that light-colored moths would stand out, and therefore it was best for a moth to be dark in color. This in turn meant that natural selection had favored the dark moths.

In making his hypothesis, Kettlewell predicted that he would find more dark moths than light moths in polluted areas, and more light than dark ones in places that were unpolluted by factory soot. As it turned out, dark moths outnumbered light moths two-to-one in industrialized areas, while the ratios were reversed in unpolluted regions, confirming his predictions. To further test his hypothesis, Kettlewell set up hidden cameras pointed at trees in both polluted and unpolluted areas. The resulting films showed birds preying on light moths in the polluted region, and dark moths in the unpolluted one—again, fitting Kettlewell's predictions.

Angiosperms and Gymnosperms

A final interesting example of natural selection at work lies in the comparative success rates of angiosperms and gymnosperms. An angiosperm is a type of plant that produces flowers during sexual reproduction, whereas a gymnosperm reproduces sexually through the use of seeds that are exposed, for instance in a cone. Angiosperms are a beautiful example of how a particular group of organisms can adapt to its environment and do so in a much more efficient way than that of its evolutionary forebears. On the other hand, gymnosperms, with their much less efficient form of reproduction, perhaps one day will go the way of the dinosaur.

Flowering plants evolved only about 130 million years ago, by which time Earth long since had been dominated by another variety of seed-producing plant, the gymnosperm, of which pines and firs are examples. Yet in a relatively short period of time, geologically speaking, angiosperms have become the dominant plants in the world. In fact, about 80% of all living plant species are flowering plants. Why did this happen? It happened because angiosperms developed ways of coexisting more favorably than gymnosperms with the insect and animal life in their environments.

Gymnosperms produce their seeds on the surface of leaflike structures, and this makes the seeds vulnerable to physical damage and drying as the wind whips the branches back and forth. Furthermore, insects and other animals view gymnosperm seeds as a source of nutrition. In an angiosperm, by contrast, the seeds are tucked safely away inside the ovary. Moreover, the evolution of the flower not only has added a great deal of beauty to the world but also has provided a highly successful mechanism for sexual reproduction. This sexual reproduction makes it possible for new genetic variations to develop, as genetic material from two individuals of differing ancestry comes together to produce new offspring. (For more about angiosperms and gymnosperms, see Ecosystems and Ecology.)

Where to Learn More

Campbell, Neil A., Lawrence G. Mitchell, and Jane B. Reece. Biology: Concepts and Connections. 2nd ed. Menlo Park, CA: Benjamin/Cummings, 1997.

Darwin, Charles, and Richard E. Leakey. The Illustrated Origin of Species. New York: Hill and Wang, 1979.

Dennett, Daniel Clement. Darwin's Dangerous Idea: Evolution and the Meanings of Life. New York: Simon and Schuster, 1996.

Evolution and Natural Selection (Web site). <http://www.sprl.umich.edu/GCL/paper_to_html/selection.html>.

Evolution. British Broadcasting Corporation (Web site). <http://www.bbc.co.uk/education/darwin/index.shtml>.

"Evolution FAQs." Talk Origins (Web site). <http://www.talkorigins.org/origins/faqs-evolution.html>.

Evolution. Public Broadcasting System (Web site). <http://www.pbs.org/wgbh/evolution/>.

Evolution. University of California, Berkeley, Museum of Paleontology (Web site). <http://www.ucmp.berkeley.edu/history/evolution.html>.

Levy, Charles K. Evolutionary Wars: A Three-Billion-Year Arms Race. The Battle of Species on Land, at Sea, and in the Air. New York: W. H. Freeman, 1999.

Starr, Cecie, and Ralph Taggart. Biology: The Unity and Diversity of Life. 7th ed. Belmont, CA: Wadsworth, 1995.


The Evolution of the Human Diet

Jean-Louis Flandrin, in his introduction to Food: A Culinary History, sets out many of the crucial questions basic to our understanding of the evolution of the human diet:

When and how did the eating behavior of human beings diverge from that of other animal species? Did humans distinguish themselves by the type of variety of foods they ate? By the fact that they prepared their food before eating it? By the ceremonial forms with which they surrounded the act of eating? Or by the conviviality of dining and its characteristic social forms? (p. 14)

These questions, as they relate to the evolution of human foodways, remain unanswerable. A major reason is the vast gulf that separates the living from earlier ancestors. Today, virtually all humans subsist on the products of agricultural activities, which include the raising of domestic animals for food. However, this way of life developed very late in the course of human evolution, with the domestication of plants appearing in several locations around the world at some point after 12,000 years ago; the domestication of food animals followed somewhat later. The vast earlier time, during which humans evolved from more primitive beings, was marked by other forms of subsistence. This time span, more than six million years in duration, witnessed dramatic changes in human biology, behavior, and adaptation. Although we have a treasure trove of fossil bones and archaeological materials that document much of this development, there is little in the record that can inform us of the precise dietary items consumed by these remote ancestors of ours, or enable us to answer the questions posed by Flandrin. There are, however, tantalizing hints of the ways of life followed by these earliest members of the human family, and in this essay, this record will be described, and the available evidence for the evolution of human foodways evaluated.

The data at our disposal for this investigation include the fossil bones and teeth of our ancestors, testaments to their evolving biological structures. There are also the residues of their activities, in the very earliest deposits often preserved as parts of natural accumulations of organic and inorganic remains, jumbled in with the fossil bones of very early human ancestors. Later in time, we find the archaeological remains of actual living areas, where our ancestors slept, made tools, prepared and ate their food, and often buried or left their dead. All this varied information provides important insights into our evolutionary past, but it constitutes very incomplete data for reconstructing dietary patterns. For example, very little in the way of actual food remains is recovered during archaeological excavations, and only relatively durable items, like animal bones, are preserved. Such bones may indicate that meat was present in the diet, but it is not clear how much of the total subsistence pattern meat represented and how much consisted of other foods, like vegetables and insects, which leave no archaeological traces. Similarly, the bones and teeth of our ancestors may preserve chemical and other traces of the sorts of foods that were emphasized in their diets, but these signs are often complex and must be carefully evaluated.

Given the difficulties in deciphering the actual residues, other, more indirect, sources of information have come to play an important role in reconstructing the foodways of our ancestors. These data come from the study of our closest living primate relatives, the chimpanzees, and observations recorded from the anthropological studies of those few modern human groups, called gatherers and hunters, who did not practice agriculture, but subsisted on an assortment of gathered vegetable foods, the collection of small animals, such as insects and small vertebrates, and the occasional successful hunting of larger animals. Comparisons with these living examples are often used to furnish clues to what sorts of foods our ancestors consumed. However, correlations of this sort have numerous limitations, and they must be used with caution. Chimpanzees and humans have had separate evolutionary pathways for at least six million years, and it is possible that during this time, chimpanzees have changed as much as humans in their biology and adaptation, making comparisons of living chimpanzees with our earliest ancestors tenuous at best (we have no fossil record of the specific evolutionary history of chimpanzees). Further, those few living gatherers and hunters who have been studied exist in environments that may be dramatically different from the locales of our ancestors. Finally, and perhaps most importantly, our early ancestors were neither bipedal apes nor humans in fur suits, but a series of biologically and behaviorally unique species whose way of life and biology are now wholly extinct.

Modern chimpanzees and those gatherers and hunters who have been studied and who do not live in very specialized environments (like the Arctic, for example) have somewhat similar diets. Field research by Jane Goodall and her associates on chimpanzees living in Gombe National Park in western Tanzania, together with observations from other chimpanzee study sites in Africa, indicates that these animals are overwhelmingly vegetarian, with a broadly based diet composed, at Gombe, of the fruits, leaves, stems, blossoms, and gums of more than eighty different plants. Chimpanzees, however, emphasize a variety of fruits as the major part of their diet. Chimpanzees have also been observed consuming insects, sometimes using twigs, specially broken off and trimmed as tools, to obtain termites. Chimpanzees (often males), behaving together in a cooperative fashion, also deliberately hunt, kill, and eat a variety of small vertebrates, including bush pigs, monkeys, and antelopes. Meat, however, makes up a very small percentage of their total diet.

Human gatherers and hunters in tropical or subtropical areas also subsist on a diet that emphasizes a broad array of vegetable food sources, with smaller amounts of insects and vertebrate animals. The exact percentage of each of these elements differs seasonally or yearly, as well as varying between specific groups.

Like living gatherers and hunters, until the advent of agriculture, our ancestors probably lived an unsettled existence, regularly shifting their encampments to new locales in search of resources. Food storage would have been very difficult, and consumption of collected and hunted foods was probably immediate. Groups would have been small, with the social organization flexible enough to allow group size to fluctuate with the seasonal availability of food and other resources.

These comparisons provide only a very limited insight, and for more information, it is necessary to examine the direct evidence from the archaeological and fossil records.

Diet and Human Evolution

A variety of comparative genetic studies document that chimpanzees are our closest living relative. It has been estimated, for example, that humans and chimpanzees share about 98.5 percent of their genetic material. Calculations of the rate of genetic change over time indicate that humans last shared a common ancestor with this African ape between five and eight million years ago. This is the period when the evolutionary line that eventually led to living humans split from the line that led to chimpanzees, representing the beginnings of human evolution. The living and extinct members of this human evolutionary lineage are traditionally grouped into a biological family, the Hominidae, members of which are known as hominids.
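
The five-to-eight-million-year figure comes from molecular-clock reasoning: if genetic changes accumulate at a roughly steady rate, the amount of difference between two species can be converted into the time since they last shared an ancestor. The short Python sketch below shows only the arithmetic; the 1.5 percent divergence figure is taken from the paragraph above, while the substitution rate is an assumed, illustrative value rather than one drawn from the studies the text summarizes.

# Molecular-clock sketch (illustrative assumptions only).
# Differences accumulate along BOTH lineages after a split, so the observed
# divergence d is roughly 2 * rate * time, giving time = d / (2 * rate).

def divergence_time_years(fraction_different, substitutions_per_site_per_year):
    """Approximate years since two lineages split, under a constant-rate clock."""
    return fraction_different / (2.0 * substitutions_per_site_per_year)

if __name__ == "__main__":
    d = 0.015      # ~1.5% of sites differ, i.e., ~98.5% of genetic material shared
    rate = 1.0e-9  # assumed substitutions per site per year (hypothetical value)
    t = divergence_time_years(d, rate)
    print(f"Estimated divergence: {t / 1e6:.1f} million years ago")
    # With these inputs the estimate is about 7.5 million years, which falls
    # within the five-to-eight-million-year range cited in the text.

In practice, such rates are calibrated against fossils of known age, which is one reason published estimates span a range of dates rather than a single figure.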

We have no fossil or other evidence of the earliest members of the hominid family, just after they split off from the lineage leading to chimpanzees. We do not know what sorts of environments they lived in or what sorts of foods they ate. Because chimpanzees are native to Africa, and the earliest known hominid fossils are limited to Africa, it seems reasonable to place the homeland of the human family on that continent.

The Earliest Hominids

The recognition of Africa as the human homeland first came in 1924, with the discovery of the fossilized skull and jaw of a young child at T'aung, in the Cape Province of South Africa. Its discoverer, Raymond Dart, named it Australopithecus africanus, and hundreds of additional fossil specimens of this group, known collectively as the australopithecines, have subsequently been uncovered in south, east, and central Africa. There are now at least eight species of australopithecines, sometimes placed in other genera, like Paranthropus or Kenyanthropus. The australopithecines lived in Africa from about four million to perhaps as late as one million years ago. Like all members of the hominid family, they walked upright, allowing them to carry objects and food efficiently; chimpanzees, by contrast, habitually walk on all fours. However, the australopithecines were apelike in many of their biological features, possessing small, chimpanzee-sized brains in an apelike skull with a large, projecting face positioned out in front of the braincase. Their teeth were humanlike in form, but they possessed massive back chewing teeth, the premolars and molars, that were much larger than those of living humans. The australopithecines, like all hominids, possessed nonprojecting canine teeth, in marked contrast to the large, tusklike canines of the apes. Like gorillas, australopithecines also seem to have been sexually dimorphic in body size, with the males considerably larger than the females.

Fossil bones of still earlier creatures have been found in East Africa: Orrorin tugenensis, at six million years possibly the earliest hominid yet discovered, and Ardipithecus ramidus, which lived about four and a half million years ago. Little is currently known about these creatures and their biology.

The fossil bones of the australopithecines are most often discovered in natural accumulations that are the result of various sorts of geological activities. These fossil bones may have been transported by water over long distances before they were deposited in their final location. They are only infrequently discovered in a context that represents the locale where they actually lived. Thus, little is known about the kinds of environments in which the australopithecines lived, or how the various australopithecine species may have differed in habitat usage or in food choice and general diet.

For many years after the initial discoveries of the australopithecines, there was a prevalent idea that these creatures lived on the open grasslands or savannas of eastern Africa. According to this theory, such a habitat would have provided only a limited selection of foods, and this scarcity was the selective factor responsible for the development of hunting and meat eating. More recent reconstructions, however, have revealed a much more complex environmental context for these early hominids, with evidence for the use of forests and woodlands. Just how important hunting and meat eating have been in human evolution continues to be debated, and their importance in the ultimate appearance of modern humans remains unclear.

Australopithecine fossil bones have been carefully examined in a number of ingenious ways, in order to learn more about their dietary patterns, but thus far with only limited success.

For example, on the basis of comparisons with the teeth of other mammals, it is clear that these early hominids were not specifically adapted to meat eating. As in modern humans, the chewing surfaces of the teeth are covered with thick layers of enamel. Some australopithecine species, known as the ''robust" australopithecines, possessed truly massive back teeth, along with very large jawbones to house them, and large chewing muscles, sometimes so large that they formed a crest on the top of the skull. These general biological features of australopithecine jaws and teeth suggest that they emphasized the chewing of coarse vegetable food sources, but not the consumption of grasses, whose high cellulose content would have been very difficult for these creatures to digest.

Other studies of the dentition have attempted to determine more specific aspects of the dietary patterns of the australopithecines. One series of studies utilized scanning electron microscopy to examine the minute scratches and pits left by food particles on the chewing surface of the teeth. The results of these observations suggest that some of the australopithecines ate a diet rich in fruits, while others were consuming a more varied, but basically vegetarian, diet. One problem with these sorts of studies is that they tend to focus on the final meals the creatures ate before they died, providing a somewhat limited view of their overall diet, especially if they were seasonally exploiting a variety of different habitats and foods.

Other studies have examined the chemical composition of australopithecine fossil bones. One study employed the ratio of strontium to calcium in the fossil bones to determine whether the australopithecines were generally herbivorous, carnivorous, or omnivorous; the results suggested that at least some of them were including animal foods in their diet.

Another chemical analysis, based on the stable carbon isotopes 13C and 12C, has reached a conclusion similar to that from the strontium-calcium analyses: some australopithecines, at least, were consuming animal foods, though the identity of these animals, and whether they were vertebrates or invertebrates, has not been determined.
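
To make the isotope method concrete, the sketch below shows how a delta-13-C value is conventionally calculated from the ratio of 13C to 12C in a sample, expressed relative to a reference standard. The sample ratio used here is invented for illustration, and the interpretive note at the end is a general rule of thumb, not a restatement of the australopithecine results.

# Stable-carbon-isotope sketch: computing a delta-13-C value in per mil.
# delta13C = (R_sample / R_standard - 1) * 1000, where R = 13C/12C.

VPDB_R = 0.0112372  # commonly cited 13C/12C ratio of the VPDB reference standard

def delta_13c(sample_ratio, standard_ratio=VPDB_R):
    """Return delta-13-C in per mil (parts per thousand) relative to the standard."""
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

if __name__ == "__main__":
    sample_r = 0.011115  # hypothetical 13C/12C ratio measured in fossil material
    print(f"delta 13C = {delta_13c(sample_r):.1f} per mil")
    # Prints roughly -10.9 per mil for this invented sample. As a rule of thumb,
    # less negative values point toward C4-based foods (grasses, sedges, or the
    # animals that eat them), while more negative values point toward C3 plant
    # foods such as fruits and leaves; the cutoffs used vary from study to study.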

These studies continue to support a variety of opinions about the dietary patterns of these early hominids, with some anthropologists suggesting a diet based primarily on fleshy fruits, nuts, and seeds, while others advocate a more broadly based diet, including some animal foods.

There is no direct evidence that the australopithecines collected foods to be brought back to some central camp to be consumed as part of a group activity. Rather, like chimpanzees, it appears likely that they consumed food continuously as they foraged in their environment.

The Evolution of the Genus Homo

Good evidence of the evolution of members of our genus, Homo, begins to appear around two million years ago at sites in East Africa. There was a dramatic increase in brain size, from the 500 ml common in the australopithecines to brains as large as 800 ml in these early humans (though still about half the size of those of living people). They also possessed smaller back chewing teeth. Chipped stone tools, first used about two-and-a-half million years ago, now became more common. These durable tools, made from water-rounded pebbles, are known as Oldowan tools. They were made by striking two stones together, knocking off chips to produce a cutting edge or point. Though crudely made, their development represented a major advance in the ability of the early hominids to exploit a wider variety of food sources. Hominids lacked sharp and hardened claws, as well as projecting and pointed canine piercing teeth, making them inefficient in dealing with many potential food sources. For example, without a digging tool or claws, many subterranean foods like insects, small burrowing mammals, tubers, and rhizomes, would have been impossible to obtain. The australopithecines are only rarely found in association with these chipped pebble tools, and most anthropologists believe the first stone tool makers were early members of the human genus Homo.

Also found at this time are animal bones, mainly from antelopes, with butchery marks made by a sharp stone edge. Although isotopic studies have indicated that the earlier australopithecines may have consumed animal foods, these cut marks represent definitive evidence of early meat eating. What is still being debated is the origin of these bones. They may have been the result of hunting activities, which is entirely reasonable given our knowledge of the cooperative hunting patterns of chimpanzees, but some scientists have suggested that they may also have been the result of scavenging activities. A safe way, it is said, to obtain bones with scraps of meat still adhering to them would be to claim animal bones from a predator kill after primary scavengers, such as hyenas and jackals, have finished with them. Thus, the initial meat eating in human evolution, according to this view, consisted of using stone tools to scrape bits of rotting tissue from the bones of predator kills. One major flaw in this notion is that no primate is equipped with digestive mechanisms to protect it from the serious consequences of eating spoiled meat.

By about 1.8 million years ago, there were a number of different species of early Homo coexisting in eastern Africa. In addition, several species of robust australopithecines were also living at this time. What dietary differences, if any, existed among all these hominids is unknown.

Expansion Out of Africa

At some point after 1.8 million years ago, in one of the most momentous events in human evolution, hominids began to move out of Africa. One site along the Jordan River Valley in Israel, dated to about one and a half million years ago, is located along what must have been a major route into Eurasia. Along with stone tools similar to those from Africa were found numerous bones of African mammals, suggesting that the hominids were not the only creatures moving out of that continent.

Hominid sites in the Republic of Georgia and on the island of Java also testify to this dramatic increase in range. Although the reasons the hominids left Africa at this moment are unclear, one reasonable explanation is that stone tools enabled hominids to expand the range of dietary items open for exploitation, allowing them to move into new habitats.

During the course of the next million years, hominid brain size increased, so that by about 300,000 to 400,000 years ago, the volume of the braincase reached 1,200 ml, within the range of living humans. It may be that there was an associated increase in body size during this period as well. Increasing brain size would have required greater intakes of oxygen, as well as nutrients. It has been suggested that this brain size expansion relied on increased amounts of dietary fats. Hunted animals could have supplied these fats, but gathered insects, many of which are richly endowed with this nutrient (especially the essential fatty acid, linoleic acid), are equally likely sources. Larger body size also necessitated a greater number of calories.

The occupation of the European subcontinent appears to have taken place later than human expansion into more hospitable habitats in Asia. This is no doubt related to the presence of glaciers, which, beginning about two million years ago, periodically covered major parts of Europe. The earliest occupation site in Europe, dating to about 800,000 years ago, is located in northern Spain, near the present city of Burgos. From that time onwards, hominid presence in Europe was closely tied to the advance and retreat of the glaciers, with the continent relatively uninhabited during times of maximal glacial activity.

By 500,000 years ago, hominids, placed in the category Homo erectus, were intermittently occupying a large cave on the outskirts of what is now the village of Zhoukoudian, about twenty-five miles from Beijing, in northern China. Although there was no glacial activity in this part of Asia, winters would have been severe (Zhoukoudian is about as far north as Philadelphia). While it remains unclear whether hominids actually wintered this far north, the earliest well-documented evidence of fire has been found here. Fire allowed hominids to use food sources that would be inedible, or actually toxic, without cooking. Burned deer bones, as well as those with cut marks, testify to the use of meat by the inhabitants of the cave, but whether the meat was obtained by hunting or scavenging remains unknown.

A hominid skull of about the same age, found in Ethiopia, bears cut marks on its frontal bone, suggesting skinning or scalping. Cannibalism has been documented at a number of other, later hominid sites; was the flesh a part of the diet, or was eating a dead friend or relative part of a ritual?

Modern Human Origins

The last 200,000 years of human evolution are much richer in data because actual living places have been located and excavated. Prior to this time, only a very few sites, like Zhoukoudian, represented the remains of an encampment, where the evidence of hominid activities is directly preserved. By about 115,000 years ago, our ancestors had begun the practice of deliberately burying their dead, thereby reducing the risk that a body would be destroyed by scavengers. Burying the dead resulted in a vast increase in the number of ancient skeletons preserved for study.

There continues to be debate about the precise way in which living humans emerged from our earlier ancestry. Some anthropologists suggest that modern humans evolved from these earlier hominids in place and, thus, are the culmination of a very long evolutionary history in each geographic area; in this view, for example, living Asians are the descendants of ancestors who reached Asia more than a million years ago.

Most anthropologists support another theory, that all modern humans originated in Africa some 100,000 to 300,000 years ago and, subsequently, spread out from there to populate the rest of the planet, replacing the earlier hominids who were already living in these areas, descendants of the much earlier initial expansion.

One extinct fossil group that has figured prominently in these theories is the Neanderthals, a group of hominids who lived in Europe and the Middle East from about 130,000 to about 30,000 years ago, when they disappeared from the scene. Because they lived in Europe, where the most intensive archaeological investigations have taken place over the last 150 years, we have much more evidence about these creatures than about any other fossil hominids. This has provided a rich data source, but it also has a number of serious limitations. The most important is that emphasizing the Neanderthals gives a very Eurocentric view of human origins. The final glaciation occurred during much of the time Neanderthals were in Europe; this made major portions of the continent uninhabitable. Those parts that could be occupied by humans represented marginal environments that would have limited population density to extremely low levels.

Given the harsh environments of Europe in which the Neanderthals were living, vegetable foods were probably relatively scarce through much of the year, and meat was almost certainly a major dietary resource. This is confirmed by chemical analyses of their bones, which indicate that for some Neanderthals, fully 80 percent of their diet came from meat. The bones of numerous large animals, such as deer, aurochs, wild boar, and horses are preserved at Neanderthal sites, along with smaller animals. At sites along the Mediterranean, shells testify to the consumption of seafoods. Our evidence for the diet of peoples contemporary with the Neanderthals, but living in Africa and southern Asia, remains limited. At one site, located on the very southern coast of Africa, Klasies River Mouth Cave, there is abundant evidence of the use of a variety of food resources, including land and sea animals and shellfish. Because much of our current evidence comes from humans, like the Neanderthals, who lived in a harsh environment, the emphasis on hunting and meat eating that has come to characterize the diets of earlier hominids may represent a very biased picture.

Although the precise evolutionary relationships of the Neanderthals to living humans remain shadowy, excavation of their sites has revealed a complex picture. Often, living areas with hearths and signs of social spaces around them have been uncovered. The bones of selected parts of animals, often with butchery marks on them, are scattered about. Clearly, Neanderthals, like living human gatherers and hunters, were carrying chosen pieces of animals back to a central camp. They may also have brought back other dietary items from their foraging and hunting activities, but the relative absence of small animal bones suggests that such items may have been consumed immediately where they were found. It is quite possible that they sat around a fire sharing and consuming food, perhaps engaging in the uniquely human dinnertime interactions of storytelling and discussions of the day's activities. It is unclear, however, whether the Neanderthals were actually able to use language, so this reconstruction remains a tentative one.

Sometime after 40,000 years ago, modern human-like peoples appeared in Europe, perhaps migrating there from their origins in Africa, or developing from ancestors already living in Europe. These modern humans brought with them new sorts of tool-making technologies, based on a broader array of raw materials, such as ivory, bone, and wood, with a wider assortment of beautifully made stone tools that show far greater sophistication than those made by the Neanderthals. The first artistic expressions also made their appearance at this time, with plastic art in the form of ivory and bone carvings of animals and people. Deep inside caves, they produced engravings and painted images of animals, and occasionally humans, some of them of great genius.

The sites occupied by these modern humans are littered with the bones of the same sorts of animals the earlier Neanderthals hunted, but the concentrations of bones indicate greater skill in hunting and a correspondingly larger number of captured animals. The same is true of the much larger accumulations of shellfish at coastal sites.

These early modern humans continued this sort of hunting activity to the end of the last glacial period, about 12,000 years ago. In Europe, the retreat of the glaciers resulted in the spread of forests and a major change in dietary habits, with peoples hunting forest animals, like deer and rabbit, and utilizing to a much greater extent the riches of the sea. By this time, however, peoples in the Middle East and along the Yangtze River Valley in southern China were beginning to experiment with the cultivation of plants, which represented the beginnings of the agricultural revolution, and formed the foundations of settled urban life and the origins of civilization.

Although this sketch brings together much of our current knowledge of the evolution of human foodways, much clearly remains to be learned. For one thing, it tells us little about how human diet changed from eating what was necessary for nutritional needs to consuming what was enjoyable and pleasant to eat. Perhaps our ancestors always selected those foods that were enjoyable to eat, laying the basis for the consumption of food as a central focus in the social life of humans.

Bibliography

Eaton, S. Boyd, and Melvin Konner. "Paleolithic Nutrition." The New England Journal of Medicine 312 (1985): 283–289.

Flandrin, Jean-Louis, and Massimo Montanari, eds. Food: A Culinary History from Antiquity to the Present. New York: Columbia University Press, 1999. (English edition edited by Albert Sonnenfeld; first published as Histoire de l'alimentation; Rome, 1996.)

Goodall, Jane. The Chimpanzees of Gombe: Patterns of Behavior. Cambridge, Mass.: Harvard University Press, 1986.

Hayden, Brian. "Cultural Capacities of Neandertals: A Review and a Re-evaluation." Journal of Human Evolution 24 (1993): 113–146.

Kelly, Robert L. The Foraging Spectrum: Diversity in Hunter-Gatherer Lifeways. Washington, D.C.: Smithsonian Institution Press, 1994.

Klein, Richard. The Human Career. 2d ed. Chicago: University of Chicago Press, 2002.

Mann, Alan. "Diet and Human Evolution." In Omnivorous Primates: Gathering and Hunting in Human Evolution. Edited by R. Harding and G. Teleki. New York: Columbia University Press, 1981.

Somer, Elizabeth. The Origin Diet. New York: Henry Holt, 2001.

Stiner, Mary C. Honor Among Thieves: A Zooarchaeological Study of Neandertal Ecology. Princeton, N.J.: Princeton University Press, 1994.

Stringer, Chris, and Clive Gamble. In Search of the Neanderthals. New York: Thames and Hudson, 1993.

Wolpoff, Milford H. Paleoanthropology. 2d ed. New York: McGraw-Hill, 1999.

—Alan Mann