An A-Z of Unanswered Objections to Christianity: H. Human Origins

[For a brief explanation of my “An A-Z of Unanswered Objections to Christianity” series, and for the skeptical tone of this article, please see here. I’m starting my A-Z series with the letter H, and I’ll be zipping around the alphabet, in the coming weeks.]

In this article, I take aim at the Christian teaching that there was a special moment in history at which humans, who were made in the image of God, came into existence, and that a sharp line can be drawn between man and beast. I argue that on purely scientific grounds, it can be shown that such a view is highly unlikely. If the scientific arguments I put forward here are correct, then Christianity is in very big trouble.

KEY POINTS

Christianity is committed to the view that humans, who are the only animals made in the image and likeness of God, came into existence at a fixed point in time – a “magic moment,” if you like. However, anthropologists find this picture wildly implausible, for two main reasons.

First, science has not identified any “sudden spurts” in the evolution of the human brain over the past four million years. While there appear to have been a couple of periods of accelerated evolution, these lasted for no less than 300,000 years. As far as we can tell from the fossil record, there were no overnight improvements in human intelligence.

Second, when scientists examine the archaeological record for signs of the emergence of creatures with uniquely human abilities, what they find is that there are no less than ten distinct abilities that could be used to draw the line between humans and their pre-human forebears. However, it turns out that these ten abilities emerged at different points in time, and what’s more, most of them emerged gradually, leaving no room for a “magic moment” when the first true human beings appeared.

Finally, some Christian theologians have recently tried to evade the force of this problem by redefining the “image of God” not in terms of our uniquely human abilities (the substantive view of the Divine image), but in terms of our being in an active relationship with God (the relational view) or our special calling to rule over God’s creation (the vocational or functional view). These attempts fail completely, because neither our relationship with God nor our domination over other creatures defines what it means to be human: both are consequences of being human. What’s more, humans only possess these qualities because of underlying abilities which make them possible (e.g. the ability to form a concept of God or the ability to invent technologies that give us control over nature), so we are back with the substantive view, which, as we have seen, supports gradualism. The verdict is clear: contrary to Christian teaching, no sharp line can be drawn between man and beast, which means that the Biblical story of a first human being is a myth, with no foundation in reality. And if there was no first Adam of any kind, then the Christian portrayal of Jesus as the second Adam (1 Corinthians 15:45-49) makes no sense, either.

I conclude that the difficulty posed above is a real one, which Christian apologists ignore at their peril.

One of the key problem areas for Christian belief relates to human origins. Let me state upfront that I will not be discussing human evolution in this post, as Christians of all stripes (Catholic, Protestant and Orthodox) readily agree that God could have used the natural evolutionary process to make the first human beings (or their bodies, at any rate), had He wished to do so. The only point of disagreement is an exegetical one: namely, whether the Biblical account of God creating Adam from the dust of the ground was intended to be taken literally, in the first place. (Most Christians now think it wasn’t.) Nor am I going to wade into the debate as to whether humanity sprang from a single original couple (Adam and Eve) as Christians have traditionally believed, or a much larger stock of several thousand people, as some genetic studies have suggested. (For a fair-minded summary of the current state of the evidence, see the article “Adam and Eve: lessons learned” [April 14, 2018] by Dr. Richard Buggs, Professor of Evolutionary Genomics at Queen Mary University of London.) In recent years, Christians have proposed many ingenious ways of reconciling Adam and Eve with the findings of evolutionary biologists, and Dr. Joshua Swamidass’s recent book, The Genealogical Adam and Eve: The Surprising Science of Universal Ancestry (IVP Academic, 2019), provides an excellent overview of the solutions that have been proposed to date, including a new proposal by Swamidass himself. On top of that, there is an ongoing controversy (see also here) as to whether the author of Genesis 2 ever intended to teach the descent of human beings from a single original couple. The name “Adam,” for instance, is generic: it means “man,” while “Eve” means “life.” For these reasons, I maintain that the issues of human evolution and monogenesis (the descent of all humans from Adam and Eve) pose no existential threat to Christianity: there is room for a legitimate diversity of interpretation of the Scriptures on these matters.

Instead, the problem I’m going to discuss in this post is a much more fundamental one: namely, the question of whether there really was a first generation of human beings, or in other words, whether there was a fixed point in time at which humans came into existence – a “magic moment,” if you like. According to Judaism, Christianity and Islam, there must have been one. But according to science, there couldn’t have been one. And if the scientists are right, then Christianity is in very big trouble. Please allow me to explain.

MAIN MENU

Introduction. Humans and beasts: a clear divide? The Bald Man paradox

A. The steady increase in human brainpower over the past four million years: where do you draw the line?
(i) The arrival of Homo erectus: a great leap forward in human mental capacities?
(ii) Neanderthals and Homo sapiens: in a league of their own?

B. The “Ten Adams” problem
1. Acheulean Adam

An Archaeological excursus
(i) Why can’t Oldowan pebble tools be used to identify the first true human beings, instead of Acheulean hand-axes?
(ii) Can “Acheulean Adam” be equated with any particular species of Homo in the fossil record?
(iii) Do Acheulean hand-axes appear suddenly in the archaeological record, or gradually?
(iv) Were Acheulean hand-axes the product of genetically programmed instincts or cultural transmission?
(v) What kind of minds did Acheulean toolmakers possess, and did they use language?

2. Fire-controller Adam
(a) The earliest evidence for the use of fire
(b) When did hominins learn to control fire?
(c) When did the earliest habitual use of fire begin?
(d) The Use and Control of Fire: Where the evidence currently stands

3. Aesthetic Adam

4. Geometrical Adam

5. Spear-maker Adam
(a) The first step: wooden spears
(b) The advent of stone-tipped spears, and what it tells us about their makers
(c) The next step: thrown spears
(d) The final step: the use of compound adhesives to attach stone points to wooden shafts

6. Craftsman Adam

7. Symbolic Adam

8. Linguistic Adam
(a) What is language? Two views
(b) The broad view, and the case for language use by Homo erectus, Heidelberg man and Neanderthal man
(c) The narrow view of language: the “single mutation” hypotheses proposed by Noam Chomsky and Andrey Vyshedskiy
(d) Why human language is almost certainly NOT the result of a single mutation

9. Ethical Adam
(a) Altruism
(b) Self-sacrifice for the good of the group

10. Religious Adam

C. Another way out for Christian apologists: Redefining the image of God?

D. Conclusion

Postscript: Was Adam an apatheist?

Introduction. Humans and beasts: a clear divide? The Bald Man paradox

RETURN TO MAIN MENU

The gentleman pictured above surely needs no introduction. I have chosen him because he exemplifies a paradox from antiquity: the Bald Man paradox, first posed by the Greek philosopher Eubulides of Miletus. It goes like this:

A man with a full head of hair is obviously not bald. Now the removal of a single hair will not turn a non-bald man into a bald one. And yet it is obvious that a continuation of that process must eventually result in baldness.

The paradox arises because the word “bald,” like a host of other adjectives (“tall,” “rich,” “blue” and so on), is a vague one. Most scientists would say that the word “human” is equally vague, and, like the word “bald,” has no clear-cut boundaries. On this view, there never was a “first human” in the hominin lineage leading to modern humans, for the same reason that there never was a definite moment at which Prince William went bald. Evolutionary biologist Richard Dawkins has explained this point with admirable lucidity.

Or as Charles Darwin succinctly put it in his work, The Descent of Man, and Selection in Relation to Sex (1871, London: John Murray, Volume 1, 1st edition, Chapter VII, p. 235):

“Whether primeval man, when he possessed very few arts of the rudest kind, and when his power of language was extremely imperfect, would have deserved to be called man, must depend on the definition which we employ. In a series of forms graduating insensibly from some ape-like creature to man as he now exists it would be impossible to fix on any definite point when the term ‘man’ ought to be used.”

However, Judaism, Christianity and Islam all insist on a clear, black-and-white division between human beings and other animals. Humans are made in the image and likeness of God; beasts are not. Humans have spiritual, immortal souls that are made to enjoy eternity with their Creator, in Heaven; beasts will never go to Heaven. (Even Christians who believe in some sort of immortality for animals nevertheless acknowledge that only humans will get to behold God’s glory, face-to-face.) There are moral and political differences between humans and other animals, as well. Humans have inalienable rights, and in particular, an inviolable right to life; beasts, on the other hand, may be killed for food in times of necessity. (Indeed, most Christians would say that animals may be killed for food at any time.) Humans, especially when they are mature and fully developed, are morally responsible for their actions; beasts are not. We don’t sue chimps, as we don’t consider them liable for their actions, even when they do nasty things like kill individuals from neighboring communities, because we presume they can’t help it: they are merely acting on innate tendencies. And for the same reason, we believe God doesn’t punish them for the destructive things they do. There is no hell for chimpanzees – even vicious ones that cannibalize infants (as some do). Finally, human beings are believed to possess certain special powers which other animals lack. For some Christians, such as Aquinas, what distinguishes humans from other animals is the godlike faculty of reason; for others, such as John Wesley, it is not reason, but the ability to know, love and serve God that makes us special. However, all Christians agree that humans are in a league of their own, mentally and spiritually, and that they have certain powers which the beasts lack. In other words, there is a clear-cut distinction, on a metaphysical level, between man and beast.

What this means is that even Christians who believe in evolution are nonetheless mind creationists, to borrow a term from the philosopher Daniel Dennett, who used it in his book, Darwin’s Dangerous Idea (Simon & Schuster, 1995) and his more recent paper, “Darwin’s ‘strange inversion of reasoning'” (PNAS, June 2009, 106 (Supplement 1) 10061-10065) to refer to thinkers (both theistic and atheistic) who refuse to accept that the human mind is the product of a blind, algorithmic process: natural selection. Christians believe that on a spiritual level, humanity literally sprang into existence overnight, due to the creative action of God. And since what makes us human is the highest and noblest part of us (call it mind, spirit, the human soul or what you will), it follows that if humans are radically distinct from the beasts, then they cannot have emerged from the beasts gradually – which means that the distinctively human part of our nature must have appeared instantly. Thus, on the Christian view, the human mind or spirit appeared suddenly, and there was a first generation of creatures possessing a human mind or spirit. These were the first true human beings. Before that, there may have been other creatures looking very similar to us, but it is not merely our physical appearance that makes us human. What distinguishes human beings is the possession of certain unique, God-given powers that other creatures lack. And it is these powers that appeared in the blink of an eye, according to Christian doctrine.

Catholic physicist Stephen Barr is a sophisticated exponent of the modern Christian view, which seeks to combine biological evolution with the religious belief in a spiritual soul. Writing in the journal First Things (April 2017), Barr acknowledges that biological species (such as cats and Homo sapiens) do not appear overnight, but posits a special intervention by God in the course of prehistory, in which God infused a spiritual soul into some anatomically modern humans, endowing them with rationality (and, he later suggests, language):

It is quite consistent to suppose that a long, slow evolutionary development led to the emergence of an interbreeding population of “anatomically modern humans,” as paleo-archeologists call them, and that when the time was ripe, God chose to raise one, several, or all members of that population to the spiritual level of rationality and freedom.

I’d now like to explain why biologists find this picture utterly incredible. There are two major reasons: first, although human brainpower has increased fairly rapidly over the past four million years, science has not identified any “sudden spurts” in the evolution of the brain (unless you call 300,000 years sudden); and second, there are no less than ten distinct human abilities that could be used to draw the line between man and beast, but it turns out that all of them emerged at different points in time (so which one do you pick?), and in any case, all of them (including language) emerged gradually, over many thousands of years. (Acheulean tools are the only exception, and even these may have been invented on multiple occasions.)

At this point, some Christians may object that science is, by virtue of its very methodology, incapable of identifying the emergence of the human mind or spirit during the prehistoric past. Souls don’t leave fossils, after all. That may be so, but it is indisputable that humans (who are alleged to possess spiritual souls) behave in a strikingly different way from other creatures: after all, it is humans who hunt chimps, perform research on them and cage them in zoos, not the other way round. If the human soul is real, then, we should certainly expect to find traces of its behavioral manifestations in the archaeological record. And during the past few decades, archaeologists have become remarkably proficient at deciphering those traces. They can tell us, for instance, what level of planning would have been needed to make prehistoric tools, and whether the makers of those tools would have required language in order to teach others how to fashion them. They can tell us about the origin of human co-operation, and whether prehistoric communities engaged in altruistic behavior. They can tell us a lot about the origins of art and religion, as well. So I would invite Christians to look at the evidence with an open mind, and draw their own conclusions.

================================================================

A. The steady increase in human brainpower over the past four million years: where do you draw the line?

RETURN TO MAIN MENU

Magnetic resonance imaging, showing areas of the brain comprising the default mode network, which underlies our concept of self and is involved in thinking about other individuals. Image courtesy of John Graner, Neuroimaging Department, National Intrepid Center of Excellence, Walter Reed National Military Medical Center and Wikipedia.

The first argument I’d like to put forward is that the fossil evidence bearing on the evolution of the human brain strongly suggests that human intelligence has increased steadily over the past several million years, without any of the “quantum leaps” you would expect if human mental capacities had literally appeared overnight. Now, I am sure that some readers will be thinking: “The mind is more than just the brain. Just because the human brain increased gradually in size, it doesn’t follow that our mental capacities increased gradually.” But regardless of whether you’re a dualist or a materialist, the fact remains that a great thinker requires a great brain. In his book, Intellect: Mind Over Matter (Chapter 4), the Aristotelian-Thomist philosopher Mortimer Adler (1902-2001) argued that “we do not think conceptually with our brains,” as an individual organ such as the brain is incapable of embodying universal meanings, such as the meaning of “human” or “triangle” or “machine”; but at the same time, Adler was happy to acknowledge that “we cannot think conceptually without our brains.” Adler’s first point invites the obvious objection that an individual soul would be in no better position to apprehend universal meanings than an individual brain; however, his second point is surely a valid one. Allow me to explain why.

Regardless of whether or not it is capable of embodying semantic meaning, it is certainly true that the human brain, with its 86 billion neurons, is a very powerful information processor. Your brain has to handle a vast amount of information every second, in order that you can remain alive and consciously aware of what’s going on around you. You would be unable to think rationally unless your brain possessed the ability to process sensory data from your surroundings, organize that data as information, and store that information in your working memory and (if it’s useful) your long-term memory. As a rational human being, you also need to be able to plan ahead – a task that requires you to keep several things in mind at once: the sequence of steps you need to follow, in order to achieve your goal. Multi-tasking is often required, as well: very few tasks follow a simple, linear A -> B -> C -> D sequence. The ability to actively recall information you’ve previously learned whenever you need it is also vital. In other words, if you want to survive as a rational human being, then any old animal brain won’t do; you’ll need a pretty high-powered one, with a high information-processing and storage capacity, and a high degree of inter-connectivity as well.

“What about the concept of God?” the reader may object. “Surely neural hard-wiring is no help at all when it comes to entertaining the concept of an immaterial being. Why should you need a special kind of brain for that?” However, it turns out that having a concept of God (or gods) requires a highly sophisticated brain, too. To believe in God, you first need to have a concept of self. You also need to possess a theory of mind: a belief in other agents, having desires and intentions of their own. Additionally, you need to be able to entertain the notion of spirits or invisible agents, who are capable of controlling events from the top down. Finally, you need to have the concept of a Master Invisible Agent, Who made the entire universe (“heaven and earth”) and Who keeps the whole show running. While the neural underpinnings of belief in God are still being investigated, scientists already know that there are several specific brain regions associated with simply having a concept of self and a theory of mind (which most animals lack). Together, these regions comprise what is known as the brain’s default mode network (pictured above). The default mode network is the neurological basis for one’s concept of self: it stores autobiographical information, mediates self-reference (referring to traits and descriptions of oneself) and self-reflection (reflecting about one’s emotional states). It’s also involved in thinking about others – in particular, in having a theory of mind, understanding other people’s emotions, moral reasoning, social evaluations and social categories. On top of that, it’s utilized in tasks such as remembering the past, imagining the future, episodic memory and story comprehension. The default mode network is comprised of an interconnected set of brain regions, with the major hubs being located in the posterior cingulate cortex (PCC) and precuneus, the medial prefrontal cortex (MPFC) and the angular gyrus. Scientists have discovered, for instance, that increased neural activity in the medial prefrontal cortex is related to better perspective-taking, emotion management, and increased social functioning, while reduced activity in the above-mentioned areas is associated with social dysfunction and lack of interest in social interaction. And according to a recent neuroimaging study conducted by Raymond L. Neubauer (Religion, Brain and Behavior, Volume 4, 2014, Issue 2, pp. 92-103), “Results show an overlap between prayer and speaking to a loved one in brain areas associated with theory of mind, suggesting that the brain treats both as an interpersonal relationship. These brain areas are also associated with the default mode network, where the mind evaluates past and possible future experiences of the self.”

The point I’m making here is that if the ancestors of modern human beings underwent some kind of “Great Leap Forward” in their mental or spiritual capacities at some point in the past (as Christians suppose), their brains would have had to have been radically transformed as well. That holds true, regardless of whether you’re a materialist or a dualist.

Putting Christian “mind creationism” to the test


Above: The philosopher Daniel Dennett, who coined the term “mind creationists” to describe thinkers who either doubt or deny that the human mind is the product of natural selection (Jerry Fodor, Thomas Nagel, and the Christian evolutionary paleontologist, Simon Conway Morris), or who accept that the human mind is a biological machine, but deny that it arose as a result of purely algorithmic, step-by-step processes (John Searle, Roger Penrose). Image courtesy of Dmitry Rozhkov and Wikipedia.

——————–

I’d now like to present two hypotheses: the hypothesis that the human mind, spirit or rational soul (i.e. whatever capacity it is that distinguishes us from the beasts) appeared overnight, and the hypothesis that this capacity evolved gradually. I’m going to refer to these hypotheses as Hypothesis 1 and Hypothesis 2. The two hypotheses make different predictions. A quantum leap, or saltation, in the human brain’s information-processing capacity at some point during the past few million years, would tend to favor Hypothesis 1, as it would suggest that at some point in the past, the brains of hominins suddenly became suitable for the exercise of human agency. [Note: Hominins are defined as creatures such as Ardipithecus, Australopithecus, Paranthropus and Homo, which belong to the lineage of primates that broke away from the chimpanzee line and led eventually to modern humans, although many branches died out along the way, without leaving any descendants. The term ‘hominid’ used to have the same meaning that ‘hominin’ now has, but now includes humans, chimpanzees, gorillas and orang-utans, as well as all their immediate ancestors.] Even if you believe that our highest human capacities transcend the brain, you would still be heartened by the discovery of a sudden increase in the brain’s information-processing ability over the course of time, because a great thinker still requires a great brain, even if it’s not the brain itself that does the thinking. By contrast, a gradual increase in the human brain’s information-processing capacity over time would tend to favor Hypothesis 2, that our ancestors didn’t become human overnight, and that there’s no sharp line dividing man from beast, but a continuum instead.

Please note that I am not speaking of “proof” or “disproof” here, but merely of evidence. It’s certainly possible that the capacities that make us human emerged suddenly, even though the human brain’s information-processing capacity increased gradually: maybe there was a “critical point” at which our capacities suddenly manifested themselves, for instance. Who knows? Nevertheless, the discovery that there have been no “sudden spurts” in human brainpower over the past few million years favors Hypothesis 2: Hypothesis 1 can account for this fact only by making the ad hoc assumption that the human brain has a critical mass, whereas Hypothesis 2 requires no arbitrary assumptions in order to explain it.
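For readers who like to see the logic of “evidence without proof” made explicit, here is a minimal sketch in Bayesian terms. The probability figures are purely illustrative placeholders of my own, not measurements of anything: the point is only that data which Hypothesis 2 predicts naturally, and which Hypothesis 1 accommodates only via an ad hoc auxiliary assumption, shift the odds toward Hypothesis 2 without ruling Hypothesis 1 out.

```python
# Illustrative Bayes-factor calculation. All numbers are invented placeholders,
# chosen only to show how evidence can favor a hypothesis without disproving its rival.

prior_odds_h1_vs_h2 = 1.0   # start by treating the two hypotheses as equally likely

# How strongly does each hypothesis predict the observed pattern
# (a gradual rise in brainpower, with no sudden spurts)?
p_data_given_h2 = 0.9       # gradualism predicts this pattern naturally
p_data_given_h1 = 0.2       # a "magic moment" fits it only via an ad hoc "critical mass" assumption

bayes_factor = p_data_given_h1 / p_data_given_h2
posterior_odds = prior_odds_h1_vs_h2 * bayes_factor

print(f"Odds of Hypothesis 1 vs Hypothesis 2 after seeing the data: {posterior_odds:.2f}")
# ~0.22: the data favor Hypothesis 2, but Hypothesis 1 has not been "disproved".
```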

Finding a good yardstick to measure the intelligence of our prehistoric forebears

So, which hypothesis does the scientific evidence support? To answer that question, we need to find a good yardstick to measure the brain’s information-processing capacity. Just as a high-capacity computer needs a lot of power to stay running, so too, a human brain needs a high metabolic rate, in order to continue functioning normally. Now, for any species of animal, the brain’s metabolic rate is mainly related to the energetic cost of the activity occurring in its synapses. For that reason, metabolic rate is widely thought to be a better measure of an animal’s cognitive ability than simply measuring its brain size.

The carotid canal (shown in green, bottom right) is the passageway in the temporal bone through which the internal carotid artery enters the skull from the neck. There is one on each side of the base of the skull. Professor Roger Seymour contends that the width of this canal in the skulls of prehistoric hominins serves as a useful measure of the information processing capacity of their brains – or in other words, how smart they were. Image courtesy of Wikipedia.

It turns out that human brains have a pretty high metabolic rate: indeed, the human brain uses no less than 20% of the body’s energy, despite the fact that it makes up only 2% of the body’s mass. If we look at other primates, we find that apart from some small primates, which are known to have a high brain mass to body mass ratio (for example, ring-tailed lemurs [average weight: 2.2 kg] and pygmy marmosets [0.1 kg]), the brain of a typical primate uses only 8 to 10% of its body’s energy, while for most other mammals, it’s just 3 to 5%. So, what about the brains of human ancestors? How much energy did they use, and what were their metabolic rates? We need no longer speculate about these questions; we have the answers. As Roger Seymour, Emeritus Professor of Physiology at the University of Adelaide, Australia, explains in an online article on Real Clear Science titled, “How Smart Were Our Ancestors? Blood Flow Provides a Clue” (January 27, 2020), we now possess a handy metric for measuring the metabolic rate of the brains of human ancestors over the last several million years. In a nutshell: the arteries that carry blood to the brain pass through small holes in the base of the skull. Bigger holes mean bigger arteries and more blood to power the brain. By measuring the size of the holes in the base of the skulls of fossil humans, we can estimate the rate of blood flow to their brains, which in turn tells us how much information they were capable of processing, just as the size of the cables indicates how much information a computer is capable of processing.
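As a rough illustration of why the size of these holes is so informative, here is a minimal sketch based on Murray’s law, the classical textbook approximation on which volumetric blood flow through an artery scales with the cube of its internal radius. To be clear, this is a simplification of my own for illustration only: Seymour and his colleagues use their own empirically calibrated relationship between carotid canal size and flow, and the radii below are invented placeholders, not measurements from fossil skulls.

```python
# Toy illustration: relative blood flow inferred from arterial radius via Murray's law (Q ~ r^3).
# The radii below are invented placeholders, NOT measurements from fossil skulls.

def relative_flow(radius_mm: float, reference_radius_mm: float) -> float:
    """Flow relative to a reference artery, assuming flow scales with radius cubed."""
    return (radius_mm / reference_radius_mm) ** 3

reference = 2.0  # hypothetical internal carotid radius (mm) for a reference skull

for label, radius in [("smaller canal", 1.6), ("reference canal", 2.0), ("larger canal", 2.4)]:
    print(f"{label:16s} radius {radius} mm -> {relative_flow(radius, reference):.2f}x the reference flow")

# A canal only 20% wider implies roughly 1.7x the flow: small differences in hole size
# translate into large differences in the energy available to power the brain.
```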

Professor Seymour and his team performed these measurements for Ardipithecus, various species of Australopithecus, Homo habilis, Homo erectus and his descendant, Heidelberg man, who’s believed by some experts to be the ancestor of both Neanderthal man and Homo sapiens. (Others think it was Homo antecessor [see also here], who was older and somewhat smaller-brained than Heidelberg man, but whose face was more like ours. Unfortunately, we don’t yet have a complete skull of this species.) Seymour summarizes the findings: “The rate of blood flow to the brain appears to have increased over time in all primate lineages. But in the hominin lineage [that’s the lineage leading to human beings – VJT], it increased much more quickly than in other primates.” He adds: “Between the 4.4 million year old Ardipithecus and Homo sapiens, brains became almost five times larger, but blood flow rate grew more than nine times larger. This indicates each gram of brain matter was using almost twice as much energy, evidently due to greater synaptic activity and information processing.” So, how do our ancestors stack up, when compared to us? And is there any evidence of quantum leaps?
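Seymour’s “almost twice as much energy” figure follows directly from the two multiples he quotes, as this quick back-of-the-envelope check shows (the round numbers below are simply the approximate ratios from the passage above):

```python
# Quick check of the per-gram energy claim, using the approximate multiples quoted above.
brain_size_ratio = 5.0   # Homo sapiens brain roughly 5x the mass of Ardipithecus' brain
blood_flow_ratio = 9.0   # roughly 9x the internal carotid blood flow

energy_per_gram_ratio = blood_flow_ratio / brain_size_ratio
print(f"Energy use per gram of brain tissue: ~{energy_per_gram_ratio:.1f}x higher")
# ~1.8x, i.e. "almost twice as much energy" per gram of brain matter.
```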

How rapidly has intelligence increased over the past few million years?

Over time, the brains of our hominin ancestors required more and more energy. This is the best gauge we have of their information-processing capacity. Image courtesy of Professor Roger Seymour (author) and theconversation.com/au.

Seymour’s 2019 study, which was conducted with colleagues at the Evolutionary Studies Institute of the University of the Witwatersrand in South Africa and reported in Proceedings of the Royal Society B (13 November 2019, https://doi.org/10.1098/rspb.2019.2208), found that for 4.4-million-year-old Ardipithecus, the internal carotid artery blood flow was less than 1 cubic centimeter per second, or about one third that of a modern chimpanzee. That suggests it wasn’t too bright. What about Australopithecus? Although Australopithecus had a brain bigger than a chimp’s, and about the size of a gorilla’s (despite having a much lighter body), it turns out that the brain of Australopithecus had only two-thirds the carotid artery blood flow of a chimp’s brain, and half the flow of a gorilla’s brain. Seymour concludes that Australopithecus was probably less intelligent than a living chimpanzee or gorilla. How about Homo habilis? Its carotid artery blood flow was about the same as a modern chimpanzee’s, but less than a gorilla’s, at just under 3 cubic centimeters per second. For early Homo erectus, which appeared only 500,000 years after Homo habilis, it was about 4.5 cubic centimeters per second (compared to about 3.5 for a gorilla), while for late Homo erectus, it was about 6. Surprisingly, it was a little less than 6 for Heidelberg man, who’s widely considered to be the next species on the lineage leading to modern man. And for Neanderthal man and Homo sapiens, it was around 8 cubic centimeters per second, suggesting that the Neanderthals’ intelligence roughly matched ours.
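Before turning to the question of sudden spurts in detail, here is a small sketch that lays the figures just quoted against rough dates and computes the increase per million years between successive species. The flow values are the approximate numbers from the paragraph above; the dates marked “assumed” are round-number placeholders of my own (only Ardipithecus, Australopithecus, Homo habilis and early Homo erectus are explicitly dated in the text), so the output should be read as an illustration of the trend, not as Seymour’s dataset.

```python
# Approximate internal carotid blood flow (cm^3/s) vs. rough age (millions of years ago).
# Flow values are taken from the discussion above; dates marked "assumed" are placeholders.
series = [
    ("Ardipithecus",       4.4, 1.0),  # "less than 1" in the text
    ("Australopithecus",   3.0, 2.0),  # about two-thirds of a chimp's ~3
    ("Homo habilis",       2.3, 3.0),
    ("early Homo erectus", 1.9, 4.5),
    ("late Homo erectus",  0.8, 6.0),  # age assumed for illustration
    ("Heidelberg man",     0.5, 6.0),  # age assumed; flow "a little less than 6"
    ("Homo sapiens",       0.1, 8.0),  # recent
]

print(f"{'transition':42s}{'increase (cm^3/s per million years)':>38s}")
for (name1, age1, flow1), (name2, age2, flow2) in zip(series, series[1:]):
    rate = (flow2 - flow1) / (age1 - age2)
    print(f"{name1 + ' -> ' + name2:42s}{rate:>38.1f}")

# Even the steepest segments work out to a few cm^3/s per million years:
# rapid in geological terms, but nothing like an overnight jump.
```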

Are there any sudden spurts in human intelligence? Probably not

If we plot these figures on a graph (as shown above [image courtesy of Professor Seymour; an even better image of the graph can be found here]), we see that three-million-year-old Australopithecus afarensis (a.k.a. “Lucy”) and Homo erectus both had somewhat higher carotid artery blood flows than would be expected, for the time at which they lived. Was there a sudden spurt in our ancestors’ brainpower when Homo erectus appeared? And what about the surge in carotid artery blood flow from Heidelberg man (6 cubic centimeters per second) to modern man (8 cubic centimeters per second), over the last 300,000 to 400,000 years? What does the evidence indicate?

(i) The arrival of Homo erectus: a great leap forward in human mental capacities?


Above: Model of the face of an adult female Homo erectus.
Image courtesy of Smithsonian Museum of Natural History, Washington, D.C. and Wikipedia.

In particular, the jump from Homo habilis to Homo erectus appears fairly steep. It is worth bearing in mind, however, that this “jump” would have occurred over about 400,000 years (which is hardly overnight), as the oldest remains of what’s believed to be Homo habilis (from Lokalalei, West Turkana, Kenya) go back 2.34 million years, while the oldest Homo erectus remains (KNM-ER 2598, from Koobi Fora, Kenya) date back to 1.9 million years.

What’s more, if one examines the original data (see Supplementary Materials) on which the paper was based, the carotid artery blood flow figure for Homo habilis turns out to rest on a single specimen, which means that the “sudden jump” may well be a statistical artifact.

Anthropologists are also well-aware that the hominin fossil record is patchy and incomplete, with new species being discovered all the time, which means it is quite possible that some other hominin apart from Homo habilis (perhaps Homo rudolfensis) was ancestral to Homo erectus. [It appears that Homo georgicus is not a likely ancestor: as the graph reveals, its carotid artery blood flow was too small, despite its having a brain size intermediate between Homo habilis and Homo erectus.]

Finally, in a 2016 paper titled, “From Australopithecus to Homo: the transition that wasn’t” (Philosophical Transactions of the Royal Society B vol. 371, issue 1698, https://doi.org/10.1098/rstb.2015.0248), authors William H. Kimbel and Brian Villmoare take aim at the idea that the transition from Australopithecus to Homo was a momentous one, arguing instead that “the expanded brain size, human-like wrist and hand anatomy [97,98], dietary eclecticism [99] and potential tool-making capabilities of ‘generalized’ australopiths root the Homo lineage in ancient hominin adaptive trends, suggesting that the ‘transition’ from Australopithecus to Homo may not have been that much of a transition at all.” In Figure 5 of their article, the authors graph early hominin brain sizes (endocranial volumes, or ECVs) over time, from 3.2 to 1.0 million years ago, for various specimens of Australopithecus (labeled as A), early Homo (labeled H) and Homo erectus (labeled E). From the graph, it can be readily seen that there is a considerable overlap in brain size between Homo erectus and early Homo, shattering the myth of a quantum leap between the two species. Kimbel and Villmoare add that “brain size in early Homo is highly variable—even within fairly narrow time bands—with some early H. erectus crania (e.g. D4500) falling into the Australopithecus range” and conclude that “a major ‘grade-level’ leap in brain size with the advent of H. erectus is probably illusory.”

(ii) Neanderthals and Homo sapiens: in a league of their own?


Above: Reconstruction of a Neanderthal woman (makeup by Morten Jacobsen). Image courtesy of PLOS ONE (Creative Commons Attribution License, 2004), Bacon Cph, Morten Jacobsen and Wikipedia.

There was also a fairly large and rapid increase in carotid artery blood flow from Heidelberg man to his presumed descendants, Neanderthal man and Homo sapiens, from about 6 to 8 cubic centimeters per second. Once again, this increase was not instantaneous, but took place over a period of about 300,000 years. Also, the data relating to Heidelberg man is based on just two specimens; more complete data may reveal a more gradual picture. Moreover, if one examines the original data (see Supplementary Materials) on which the carotid blood flow rates are calculated for different species, it can be seen that the internal carotid artery blood flow rates are significantly smaller for the older specimens of Homo sapiens, suggesting a steady increase over time rather than a sudden leap. Finally, the distinctive globular brain shape that characterizes modern Homo sapiens is now known to have evolved gradually within the Homo sapiens lineage, according to a recent article by Simon Neubauer et al., titled, “The evolution of modern human brain shape” (Science Advances, 24 Jan 2018: Vol. 4, no. 1, eaao5961). This does not square well with the hypothesis of a “magic moment” at which this lineage became truly human.

The clinching evidence for gradualism: the increase in hominin endocranial volume

Finally, a 2018 article titled, “Pattern and process in hominin brain size evolution are scale-dependent” by Andrew Du et al. (Proceedings of the Royal Society B 285:20172738, http://doi.org/10.1098/rspb.2017.2738) provides clinching evidence against any sudden spurts in brain size. In the article, the authors make use of endocranial volume (ECV), which they refer to as “a reliable proxy for brain size in fossils.” Looking at hominins overall, they find that “the dominant signal is consistent with a gradual increase in brain size,” adding that this gradual trend “appears to have been generated primarily by processes operating within hypothesized lineages,” rather than at the time when new species emerged. Du et al. considered various possible models of ECV change over time for hominins, including random walk, gradualism, stasis, punctuated equilibrium, stasis combined with random walk and stasis combined with gradualism. What they found was that gradualism was the best model for explaining the trends observed, by a long shot:

Results from fitting evolutionary models at the clade level show gradualism is by far the best fit for describing hominin ECV change over time (figure 3a and electronic supplementary material, table S3). All mean ECV estimates of the observed time series fall within the 95% probability envelope predicted by the gradualism model (figure 3b and electronic supplementary material figure S1), and model R2 is 0.676 (electronic supplementary material, table S4). Multiple sensitivity analyses demonstrate that support for the gradualism model is robust to bin size or location (electronic supplementary material, figure S5).

The only times when macroevolutionary (as opposed to microevolutionary) processes exhibited a significant influence on trends in the increase in hominin endocranial volume over time were between 2 and 2.3 million years ago (when the larger-brained Homo appeared), between 1.7 and 2 million years ago (when the smaller-brained Australopithecus and Paranthropus disappeared from southern Africa) and between 1.1 and 1.4 million years ago, when Paranthropus (popularly known as “Zinj”) disappeared from east Africa. However, Du et al. found that “within-lineage ECV increase is the primary driver of clade-level change at 1.7–1.4, 1.1–0.8 and 0.5–0.2 Ma [million years ago].”

This finding invites an obvious objection: how could natural selection have possibly favored such a slow rate of brain size (ECV) increase as the one that actually occurred in the hominin line, over millions of years? An increase of 800 cubic centimeters over 3 million years might sound fast, especially when we compare hominins to other lineages of primates, but in reality, it works out at just one extra cubic centimeter of brain matter every 3,750 years! In response, the authors propose that “selection was for larger ECV on average but must have fluctuated and included episodes of stasis and/or drift,” adding that “[a]ll of this occurs on too fine a timescale to be resolved by the current hominin ECV fossil record, resulting in emergent directional trends within lineages.” What needs to be borne in mind, however, is that even punctuated equilibrium models of evolution (which the authors rejected as unnecessary, in order to account for the trends observed) posit changes occurring over tens of thousands of years, rather than overnight. In plain English, what this means is that even if there were any relatively sudden increases in brain size, they would have been fairly modest (say, 20 or 30 cubic centimeters), and they would have taken place over a time span of millennia, at the very least. After that, there would have been long periods when brain size did not increase at all.
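The rate quoted above is easy to verify from the round numbers in the paragraph, as this quick check shows:

```python
# Back-of-the-envelope check of the rate quoted above.
ecv_increase_cc = 800        # approximate total ECV increase across the hominin line
timespan_years = 3_000_000   # over roughly three million years

years_per_cc = timespan_years / ecv_increase_cc
cc_per_million_years = ecv_increase_cc / (timespan_years / 1_000_000)

print(f"{years_per_cc:.0f} years per extra cubic centimeter of brain matter")   # 3750
print(f"~{cc_per_million_years:.0f} cc per million years, on average")          # ~267
```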

Du et al. conclude that the overall trend toward increasing endocranial volume (ECV) within the hominin line “was generated primarily by within-lineage mechanisms that likely involved both directional selection and stasis and/or drift,” as well as “directional speciation producing larger-brained lineages” and “higher extinction rates of smaller-brained lineages.”
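For readers who want a feel for what “fitting evolutionary models at the clade level” involves, here is a deliberately simplified sketch: it generates a synthetic ECV-like time series with a gentle upward drift, then compares a gradualism-style model (a linear trend) against a stasis-style model (a constant mean) by least squares and AIC. This is not Du et al.’s code, data, or model set (they fit random walks, punctuated equilibrium and mixed models to binned fossil ECVs); it only illustrates the general idea of scoring competing models of change against the same time series.

```python
# Toy model comparison: gradualism (linear trend) vs. stasis (constant mean)
# on a synthetic brain-size-like time series. Not Du et al.'s data or methods.
import numpy as np

rng = np.random.default_rng(0)

time_ma = np.linspace(3.2, 1.0, 23)                                   # time points, millions of years ago
ecv = 450 + 150 * (3.2 - time_ma) + rng.normal(0, 40, time_ma.size)   # gentle upward drift plus noise

def aic(residual_ss: float, n_obs: int, n_params: int) -> float:
    """Akaike information criterion for a least-squares fit with Gaussian errors."""
    return n_obs * np.log(residual_ss / n_obs) + 2 * n_params

# Stasis-style model: a single constant mean (1 parameter).
ss_stasis = np.sum((ecv - ecv.mean()) ** 2)

# Gradualism-style model: a linear trend in time (2 parameters).
slope, intercept = np.polyfit(time_ma, ecv, 1)
ss_gradual = np.sum((ecv - (slope * time_ma + intercept)) ** 2)

print(f"AIC, stasis model:     {aic(ss_stasis, ecv.size, 1):.1f}")
print(f"AIC, gradualism model: {aic(ss_gradual, ecv.size, 2):.1f}")
# The lower AIC wins; with a genuine upward drift in the data,
# the gradualism-style model is preferred despite its extra parameter.
```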

So, can we draw a line between humans and non-humans, in the hominin lineage?

In the light of these findings, Christian apologists need to squarely address the question: “Where do you draw the line between true human beings and their bestial forebears?” The fact is, there isn’t a good place to draw one. If you want to say that only Neanderthals and Homo sapiens were truly human, then an awkward consequence follows: their common ancestor, Heidelberg man, wasn’t human, which means that God created two distinct races of intelligent beings – or three if you include Denisovan man, another descendant of Heidelberg man. (Currently, we don’t have any complete skulls of Denisovan man.) Two or three races of intelligent beings? That doesn’t comport with what the Bible teaches or with what the Christian Church has taught, down the ages: only one race of beings (human beings) was made in God’s image (see for instance Genesis 3:20, Malachi 2:10 and Acts 17:26). If you insist that Heidelberg man must have been human as well, then you also have to include late Homo erectus, whose brain had a metabolic rate equal to that of Heidelberg man. But if you are willing to concede that late Homo erectus was truly human, then why not early Homo erectus, who belonged to the same species, after all? However, if you include early Homo erectus within your definition of “truly human,” then you have to address the question: why are you setting the bar so low, by including a species that was not much smarter than a gorilla when it first appeared, used only pebble tools for the first 200,000 years of its existence (from 1.9 to 1.7 million years ago), and only gradually became smarter, over a period of one-and-a-half million years?

In short: the anatomical evidence suggests that while human intelligence evolved relatively rapidly, in geological terms, climbing from the level of a chimp or gorilla to its modern human level in a little over two million years, there is no warrant for supposing that it sprang into existence overnight. The nearest events that we can find to a saltation in the evolution of the human brain are (i) the “sudden” appearance of Homo erectus and (ii) the equally “sudden” evolution of Neanderthal man and Homo sapiens from their presumed ancestor, Heidelberg man, but since the increases in brainpower that occurred here may have taken place over a period as long as 300,000 years, and since there is tentative evidence that they were gradual (at least, in the second case), they provide no support for the Judaeo-Christian hypothesis of a “magic moment” when the human mind, spirit or rational soul was created. The best one can say is that the “magic moment” hypothesis has not been ruled out by the fossil evidence.

================================================================

B. The “Ten Adams” problem


RETURN TO MAIN MENU

Note: The ten neutral faces shown above were taken from the Princeton Faces database. The software used for face generation was FaceGen 3.1, developed by Todorov et al. 2013, and Todorov & Oosterhof 2011.

But if the evidence from human brain anatomy does nothing to diminish your conviction that there was a “magic moment” at which our ancestors became human, maybe the archaeological evidence of their technical, linguistic, moral and spiritual capacities will change your mind. This brings me to my second reason why scientists reject the view that human intelligence appeared overnight. I’ve decided to call it the “Ten Adams” problem. Please note that the “ten Adams” whom I refer to below are purely hypothetical figures, intended to designate the inventors (whoever they were) of ten cultural breakthroughs that changed our lives as human beings. For the purposes of my argument, it is not necessary to suppose that these figures were particular individuals; the “inventors” of these breakthroughs could well have been entire communities of people.

When we look at the record, we find that there are not one, but ten different points in the past where one might try to draw the line between humans and their sub-human forebears, based on their mental abilities. I’m going to give each of these Adams a special name: first, Acheulean Adam, the maker of hand-axes; second, Fire-controller Adam, who was able to not only make opportunistic use of the power of fire, but also control it; third, Aesthetic Adam, who was capable of fashioning elegantly symmetrical, and finely finished tools; fourth, Geometrical Adam, who carved abstract geometrical patterns on pieces of shell; fifth, Spear-maker Adam, who hunted big game with stone-tipped spears – a feat which required co-operative, strategic planning; sixth, Craftsman Adam, who was capable of fashioning a wide variety of tools, known as Mode III tools, using highly refined techniques; and seventh, Modern or Symbolic Adam, who was capable of abstract thinking, long-range planning and behavioral innovation, and who decorated himself with jewelry. There’s an eighth Adam, too: Linguistic Adam, the first to use human language. And we can also identify a ninth Adam: Ethical Adam, the first hominin to display genuine altruism. Lastly, there’s a tenth Adam: Religious Adam, the first to worship a Reality higher than himself. And what do we find? If we confine ourselves to the first seven Adams, we find that the first Adam (Acheulean Adam) appeared 1.8 million years ago, while the last (Modern or Symbolic Adam) appeared as recently as 130,000 years ago, and certainly no more than 300,000 years ago. It’s harder to fix a date for Linguistic Adam: briefly, it depends on how you define language. On a broad definition, it seems to have appeared half a million years ago; on a narrower definition, it probably appeared between 200,000 and 70,000 years ago. Ethical Adam, if he existed, lived at least half a million years ago, while Religious Adam likely appeared some time after 100,000 years ago, if we define religion as belief in an after-life and/or supernatural beings. So the question is: which of these ten Adams was the first true human? Let’s look at each case in turn.

First, here’s a short summary of my findings, in tabular form:

The TEN ADAMS

1. Acheulean Adam, the maker of Acheulean hand-axes
Which species? Homo ergaster (Africa), Homo erectus (Eurasia). (Hand-axes were later used by Heidelberg man and even early Homo sapiens.)
When? 1.76 million years ago in Africa; over 350,000 years later in Eurasia. By 1 million years ago, the shape and size of the tools were carefully planned, with a specific goal in mind. [N.B. Recently, a study using brain-imaging techniques has shown that hominins were probably taught how to make Acheulean hand-axes by non-verbal instructions, rather than by using language.]

2. Fire-controller Adam, the first hominin to control fire
Which species? Homo ergaster (Africa), Homo erectus (Eurasia).
When? 1 million years ago (control of fire; opportunistic use of fire goes back 1.5 million years); 800,000 to 400,000 years ago (ability to control fire on a regular and habitual basis; later in Europe). Date unknown for the ability to manufacture fire, but possibly less than 100,000 years ago, as the Neanderthals evidently lacked this capacity.

3. Aesthetic Adam, the first to make undeniably aesthetic objects
Which species? Late Homo ergaster/erectus.
When? 750,000-800,000 years ago (first elegantly symmetric hand-axes; sporadic); 500,000 years ago (production of aesthetic hand-axes on a regular basis).

4. Geometrical Adam, maker of the first geometrical designs
Which species? Late Homo erectus.
When? 540,000 years ago (zigzags); 350,000-400,000 years ago (parallel and radiating lines); 290,000 years ago (cupules, or circular cup marks carved in rocks); 70,000-100,000 years ago (cross-hatched designs).

5. Spear-maker Adam, the maker of stone-tipped spears used to hunt big game
Which species? Heidelberg man.
When? 500,000 years ago (first stone-tipped spears; wooden spears are at least as old, if not older); 300,000 years ago (projectile stone-tipped spears, which could be thrown); 70,000 years ago (compound adhesives used for hafting stone tips onto a wooden shaft).

6. Craftsman Adam, the maker of Mode III tools requiring highly refined techniques to manufacture
Which species? Heidelberg man (first appearance); Homo sapiens and Neanderthal man (production on a regular basis).
When? 500,000-615,000 years ago (first appearance; sporadic); 320,000 years ago (production on a regular basis).

7. Modern or Symbolic Adam
Which species? Homo sapiens and Neanderthal man (modern human behavior, broadly defined); Homo sapiens and Neanderthal man (symbolic behavior, in the narrow sense).
When? 300,000 years ago (modern human behavior – i.e. abstract thinking; planning depth; behavioral, economic and technological innovativeness; and possibly, symbolic cognition); 130,000 years ago (symbolic behavior, in the narrow sense). (Note: the pace of technical and cultural innovation appears to have picked up between 40,000 and 50,000 years ago, probably for demographic reasons: an increase in the population increased the flow of ideas.)

8. Linguistic Adam, the first to use language
Which species? Heidelberg man(?), Homo sapiens and Neanderthal man (language in the broad sense); Homo sapiens (language in the narrow sense).
When? 500,000 years ago (language in the broad sense: sounds are assigned definite meanings, but words can be combined freely to make an infinite number of possible sentences); 70,000 to 200,000 years ago (language in the narrow sense: hierarchical syntactical structure).

9. Ethical Adam, the first to display genuine altruism and self-sacrifice
Which species? Homo ergaster (altruism); late Homo ergaster/erectus or Heidelberg man (self-sacrifice).
When? Altruism: 1,500,000 years ago (long-term care of seriously ill adults); at least 500,000 years ago (care for children with serious congenital abnormalities). Self-sacrifice for the good of the group: up to 700,000 years ago.

10. Religious Adam, the first to have a belief in supernatural religion
Which species? Homo sapiens.
When? 90,000 to 35,000 years ago (belief in an after-life); 35,000 to 11,000 years ago (worship of gods and goddesses). (N.B. As these ideas and beliefs are found in virtually all human societies, they must presumably go back at least 70,000 years, when the last wave of Homo sapiens left Africa.)

1. Acheulean Adam


RETURN TO MAIN MENU

Left: An Acheulean biface from St. Acheul, France. Date: between 500,000 and 300,000 BP. Image courtesy of Félix Régnault, Didier Descouens and Wikipedia.
Right: Drawing of a hand holding a hand-axe. Image courtesy of Locutus Borg, José-Manuel Benito Álvarez and Wikipedia.

Summary: Until recently, it was widely assumed that Acheulean hand-axes (which humans can learn how to make but chimps cannot) were cultural artifacts. That all changed with the publication of a provocative paper by Raymond Corbey et al. in 2016, arguing that the manufacture of these hand-axes could have been genetically driven, at least in part, and that the hand-axes were no more remarkable than intricately built birds’ nests. Corbey’s paper drew a flurry of indignant responses from scientists working in the field, who argued that there were several features of Acheulean hand-axes which pointed to their being culturally transmitted from generation to generation, and that their makers (or at least, the makers of the later Acheulean hand-axes) had an eye for symmetry, and spent an inordinate amount of time making them – much more than would have been required just to make a tool that did the job of cutting up animal carcasses. That suggests they were interested in perfection for its own sake, which makes them a lot like us. After reviewing the evidence, I find that the cultural hypothesis is likely correct; nevertheless, it is highly doubtful whether “Acheulean Adam,” or the first hominin to make these axes (Homo ergaster, also known as African Homo erectus), was truly human. In the first place, language was not required to teach others how to make the hand-axes; basic teaching would have done the job. Even if language was used for instructive purposes, it was likely very rudimentary. And in the second place, the fact that it took Homo erectus more than one million years to get from making a simple hand-axe to making the first stone-tipped spear suggests to me that he wasn’t “bright,” in the human sense of the word. The pace of technological change seems to have picked up around one million years ago, however, which is roughly when Homo antecessor appeared. Subsequent to this date, Acheulean hand-axes became more refined and (in a few cases) aesthetically pleasing.

Go to Section 2

Acheulean tools: Recommended Reading

Coolidge, F. and Wynn, T. (2015). “The Hand-axe Enigma”. Psychology Today. Posted April 2, 2015.
Corbey, R. et al. (2016). “The acheulean handaxe: More like a bird’s song than a beatles’ tune?” Evolutionary Anthropology, Volume 25, Issue 1, pp. 6-19.
de la Torre, I. (2016). “The origins of the Acheulean: past and present perspectives on a major transition in human evolution”. Philosophical Transactions of the Royal Society of London B: Biological Sciences, 371(1698): 20150245.
Diez-Martín, F. et al. (2016). “The Origin of The Acheulean: The 1.7 Million-Year-Old Site of FLK West, Olduvai Gorge (Tanzania)”. Scientific Reports, volume 5, Article number: 17839.
Gallotti, R. and Mussi, M. (2017). “Two Acheuleans, two humankinds: From 1.5 to 0.85 Ma at Melka Kunture (Upper Awash, Ethiopian highlands)”. Journal of Anthropological Sciences, Vol. 95, 137-181.
Hosfield, R. et al. (2018). “Less of a bird’s song than a hard rock ensemble”. Evolutionary Anthropology, 27(1), 9-20. ISSN 1520-6505. doi: https://doi.org/10.1002/evan.21551.
Li, X. et al. (2017). “Early Pleistocene occurrence of Acheulian technology in North China”. Quaternary Science Reviews, 156, 12-22.
Morgan, T. et al. (2015). “Experimental evidence for the co-evolution of hominin tool-making teaching and language”. Nature Communications 6, 6029, https://doi.org/10.1038/ncomms7029.
Putt, S. et al. (2017). “The functional brain networks that underlie Early Stone Age tool manufacture”. Nature Human Behaviour 1(6):0102, DOI: 10.1038/s41562-017-0102.
Semaw, S., Rogers, M. J. and Stout, D. (2009). “The Oldowan-Acheulian Transition: Is there a ‘Developed Oldowan’ Artifact Tradition?” In Camps, M. and Chauhan, P. (eds.), Sourcebook of Paleolithic Transitions: Methods, Theories, and Interpretations. Springer Science+Business Media, LLC, 173-193.
Shipton, C. et al. (2018). “Were Acheulean Bifaces Deliberately Made Symmetrical?” Cambridge Archaeological Journal, July 2018. DOI: 10.1017/S095977431800032X.
Stout, D. et al. (2015). “Cognitive demands of Lower Palaeolithic toolmaking”. PLOS ONE, 10, e0121804, doi: 10.1371/journal.pone.0121804.
Wynn, T. and Gowlett, J. (2018). “The handaxe reconsidered”. Evolutionary Anthropology, Volume 27, Issue 1, 21-29.

What are Acheulean tools?

Oldowan tools, which first appear in the archaeological record 2.6 million years ago, are typified by pebble “choppers”: crudely worked stone cores made from pebbles that were chipped in two directions, in order to remove flakes and create a sharpened edge suitable for cutting, chopping, and scraping. Acheulean tools, which first appear 1.76 million years ago, are quite different, being distinguished by a preference for large flakes (>10 cm), used as blanks for making large cutting tools (handaxes and cleavers). The best-known Acheulean tools are the highly distinctive handaxes, which were pear shaped, teardrop shaped, or rounded in outline, usually 12–20 cm long and flaked over at least part of the surface of each side (which is why they are commonly described as bifacial). (For more information, readers are invited to view the online article, “Oldowan and Acheulean stone tools” by the Museum of Anthropology, University of Missouri.)

Before making an assessment of “Acheulean Adam,” there are several questions we need to address:

(i) Why can’t Oldowan pebble tools be used to identify the first true human beings, instead of Acheulean hand-axes?
(ii) Can “Acheulean Adam” be equated with any particular species of Homo in the fossil record?
(iii) Do Acheulean hand-axes appear suddenly in the archaeological record, or gradually?
(iv) Were Acheulean hand-axes the product of genetically programmed instincts or cultural transmission?
(v) What kind of minds did Acheulean toolmakers possess, and did they use language?

Answering these questions in detail will require an in-depth review of the current scientific literature on the subject. For the benefit of those readers who would rather skip over the technical details, here are some very brief answers.

(i) Why can’t Oldowan pebble tools be used to identify the first true human beings, instead of Acheulean hand-axes?

Answer: Most scientists don’t think that anything we’d call human intelligence was required to make these tools. Indeed, some experts think the maker of these tools was mentally no more advanced than a chimp, although others believe that he/she was more like one of us. In any case, the intelligence that fashioned these tools still fell far short of our own, as shown by the fact that even bonobos can be taught how to fashion, from small round stones, Oldowan-style flakes that can be used for cutting food. (That said, the bonobos didn’t do a very good job of making them.) By contrast, no non-human primate has ever manufactured anything like an Acheulean hand-axe.

(ii) Can “Acheulean Adam” be equated with any particular species of Homo in the fossil record?

Answer: Probably not. The first Acheulean hand-axes in Africa were made by Homo ergaster some 1.76 million years ago, while the first hand-axes in Europe and Asia were made by Homo erectus. (Note: Some experts classify both of these species as Homo erectus.) However, Homo erectus lived in Asia for hundreds of thousands of years before it started making Acheulean hand-axes, while Homo ergaster may have lived in Africa for over 100,000 years before starting to create these tools, so it would be unwise to equate either of these species with “Acheulean Adam,” tout court. Incidentally, Acheulean technology was remarkably long-lived: later Acheulean hand-axes were made by Heidelberg man and even early Homo sapiens, as late as 200,000 years ago.

(iii) Do Acheulean hand-axes appear suddenly in the archaeological record, or gradually?

Answer: The oldest Acheulean hand-axes date back to 1.76 million years ago in eastern Africa, and they appear fairly suddenly in the archaeological record; however, the record for the next 150,000 years in this region of Africa is patchy, so the apparent suddenness should be treated with caution. There are, nevertheless, strong theoretical reasons for believing that Acheulean technology arose very suddenly, although it has been hypothesized that this abrupt appearance could well be due to the confluence, at a particular point in time, of several enabling factors which themselves arose gradually. Also, we do not know whether Acheulean technology was invented once and only once, or whether it was invented independently on multiple occasions.

(iv) Were Acheulean hand-axes the product of genetically programmed instincts or cultural transmission?

Answer: Acheulean tools were most likely the product of cultural transmission. In 2016, Raymond Corbey and his colleagues sparked an academic furore when they argued that the explanation for Acheulean tool-making could be largely (but not entirely) genetic, and that Acheulean hand-axes were no more complex than birds’ nests in their level of intricacy. Several scholars took issue with Corbey’s paper (see here, here and here), pointing out that tool-making is a cultural phenomenon in all tool-making primate species studied to date, and that the makers of Acheulean tools went to extraordinary lengths to make them symmetrical as well as pleasing to the eye. From a strictly functional point of view, this beautification would have been a complete waste of time. Also, Acheulean hand-axes evolved over the course of time: while the earliest ones were fairly crude, those manufactured half a million years ago were much more refined. Indeed, some experts have even proposed that there were two Acheuleans, the second one starting some time after one million years ago. Finally, there were indeed local variations in Acheulean tools. Taken together, these facts suggest that a cultural process was at work, even if the pace of technological change was a glacial one by our standards.

(v) What kind of minds did Acheulean toolmakers possess, and did they use language?

Answer: The makers of Acheulean hand-axes were highly skilled toolmakers. From studies of modern human volunteers, we now know that it would have taken hundreds of hours of practice for them to learn how to make hand-axes and other tools. They also had an eye for symmetry, and after about 750,000 years ago, some of them had an eye for beauty as well, although the production of aesthetic artifacts did not become common until 500,000 years ago.

But did they use language when teaching one another to make these tools? The short answer is that we don’t know for sure, but it probably wasn’t required. Some researchers (Morgan et al., 2015) argue that the use of language would have made it easier for individuals to learn how to make Acheulean tools, but at the same time, they acknowledge that if any language was used in instructing novice tool-makers, it would probably have been pretty rudimentary. As they put it, “simple forms of positive or negative reinforcement, or directing the attention of a learner to specific points” would have been enough; neither a large number of symbols nor a complex system of grammar would have been required. Other researchers (Shelby Putt et al., 2017) argue that language would have actually been a hindrance, as internal verbalization while carrying out instructions would have prevented early hominins from attending to the sound of stone hitting against stone, while tools were being shaped. Instead of possessing linguistic skills, these hominins had highly sophisticated motor, memory and action planning skills, rather like those of a piano player.

At this point, some readers may wish to go on to Section 2, in which I discuss “Fire-controller Adam.” Those who would like to learn more about Acheulean tools are welcome to continue reading, as I proceed to address each of the five issues raised above, in detail.

—————————————————————-

An Archaeological Excursus

(i) Why can’t Oldowan pebble tools be used to identify the first true human beings, instead of Acheulean hand-axes?

RETURN TO MAIN MENU

Oldowan choppers dating to 1.7 million years BP, from Melka Kunture, Ethiopia. Image courtesy of Didier Descouens and Wikipedia.

Some readers may be wondering why I am focusing on Acheulean hand-axes, rather than the earlier (and rather large) Lomekwian anvils, cores, and flakes produced in Kenya 3.3 million years ago (reported in Nature, vol. 521, 21 May 2015, doi:10.1038), or the somewhat more sophisticated Oldowan “pebble tools,” produced by Australopithecus garhi and, later, Homo habilis and early Homo erectus, between 2.6 million and 1.3 million years ago. The reason why I am ignoring these tools here is that they are not sufficiently distinctive to be considered a hallmark of the human race: arguably, chimpanzees could have made them. In 1989, Wynn and McGrew published an influential paper titled, “An ape’s view of the Oldowan” (Man 24(3):383-398), in which they concluded:

“There is nothing about the Oldowan which demands human-like behavior such as language, ritual or shared knowledge of arbitrary design, or other sophisticated material processes. At most one can argue that the Oldowan pushed the limits of ape-grade adaptation; it did not exceed them… In its general features Oldowan culture was ape, not human. Nowhere in this picture need we posit elements such as language, extensive sharing, division of labor, or pair-bonded families, all of which are part of the baggage carried by the term human.” (1989, p. 394)

Wynn and McGrew claimed that only two behavioral patterns distinguished the makers of Oldowan tools from apes: transporting tools or food over thousands of meters, and competing with large carnivores for prey. Even these behaviors they considered to be “still within the ape adaptive grade” and accountable in terms of “differences in habitat.” In a 2011 follow-up paper (Evolutionary Anthropology 20: 181-197), Wynn, Hernandez-Aguilar, Marchant and McGrew reiterate their conclusion, adding: “The Oldowan was not a new adaptive grade, but a variation on an old one… Human-like technical elements made their appearance after the Oldowan… In a techno-behavioral sense, Homo erectus sensu lato [i.e. defined broadly, to include Homo ergaster – VJT] was the intermediate form between ape and human.” The primitive nature of pebble tools becomes especially apparent when we compare the complexity of the sequential chains of steps (chaînes opératoires) required to make Oldowan tools (7 phases, 4 foci, 14 steps, 4 shifts of attention) with the level of behavioral sophistication shown by chimpanzees when making tools to extract termites for food (7 phases, 4 foci, 9 steps, 4 shifts of attention). As can be seen, the numbers are comparable.


Video above: Chimps using twigs to dig for ants. Authors: Koops K, Schöning C, Isaji M, Hashimoto C. Video courtesy of Wikipedia.

Wynn and McGrew have a valid point. Scientists now know that modern chimpanzees follow an operational sequence when using stone implements to crack nuts, just as the makers of Oldowan pebble tools did. Sometimes they even create sharp-edged flakes in the process, albeit unintentionally. Additionally, chimpanzees in some communities in Guinea, West Africa, will transport stone tools (hammers and anvils) for the purpose of cracking nuts, reuse a stone tool that has been accidentally produced from another stone tool, and follow a sequence of repeated actions leading to the goal of cracking open nuts (for further details, see Susana Carvalho et al., “Chaînes opératoires and resource-exploitation strategies in chimpanzee (Pan troglodytes) nut cracking”, Journal of Human Evolution [2008] 55, 148-163).

Even more disconcertingly for “human supremacists,” recent research has shown that capuchin monkeys from Brazil are capable, while bashing rocks into dust, of unintentionally producing sharp-edged stone flakes that resemble ancient stone tools produced by hominins in eastern Africa some two to three million years ago, according to an article in Nature News, 19 October 2016 (“Monkey ‘tools’ raise questions over human archaeological record” by Ewen Callaway).

Other scientists have a different take on the tools produced by our hominin ancestors. In a recent paper titled, “An overview of the cognitive implications of the Oldowan Industrial Complex” (Azania: Archaeological Research in Africa, 2018, Vol. 53, No. 1, 3–39), Nicholas Toth and Kathy Schick take a more sanguine view of Oldowan tools, concluding that “a variety of archaeological and palaeoneurological evidence indicates that Oldowan hominins represent a stage of technological and cognitive complexity not seen in modern great apes (chimpanzees, bonobos, gorillas, orangutans), but transitional between a modern ape-like cognition and that of later Homo (erectus, heidelbergensis, sapiens).” Toth and Schick acknowledge the impressive tool-making feats of chimpanzees, but point out that in experiments conducted in 2006, highly intelligent bonobos which had previously been shown how to use stones (made of lava, quartzite and flint) to produce flakes for accessing and cutting food, and which were then given small round stones (cobbles) to make their own tools from, managed to produce usable flakes. However, Toth and Schick note that the quality of the flakes they made was poor: “The [2.6-million-year-old Oldowan] Gona cores and flakes were intermediate in skill levels between those of bonobos and modern humans, but much closer to the human sample.” What’s more, the bonobos weren’t as selective about their raw materials as the hominins living at Gona were.

Toth and Schick add that while the fractured debris unintentionally resulting from chimpanzees’ tool-making activities in the wild is reminiscent of prehistoric Oldowan assemblages, “[i]t should be kept in mind… that Oldowan assemblages are, in contrast, clearly the result of the intentional controlled fracture of stone by Oldowan hominins.” They also argue that the rapid increase in brain size that occurred in early Homo must surely mean something: “The difference in brain size between early Homo (650 cm3) and chimpanzees/bonobos and early australopithecines (∼400 cm3) shows an increase in Homo of about 60 percent within one million years. The authors contend that there must have been strong selective forces for this to happen, and that selection was almost certainly involving higher cognitive abilities in foraging, social interaction and communication.” Nevertheless, they candidly acknowledge that not all authorities agree with their upbeat assessment of Oldowan tools, adding that it is their opinion that “the hominins responsible for Oldowan sites herald a new and more complex form of cognition and behaviour”:

“There is clearly a wide variety of opinions regarding the cognitive abilities of early hominins, ranging from the view that hominins were essentially like modern apes to that which sees them as having evolved to a new, more human-like threshold of cognitive abilities.”

In view of the vigorous disagreements among experts in the field regarding the level of cognitive sophistication and behavioral complexity required to manufacture Oldowan tools, it would be highly unwise, in my opinion, to use them as a litmus test of true humanity.

By contrast, no non-human primate has ever created anything like an Acheulean hand-axe, which makes the Acheulean a better place to commence our investigation of when the first true humans appeared. In a recent groundbreaking study involving brain images taken of human volunteers while learning how to fashion Oldowan and Acheulean tools (“Cognitive demands of Lower Palaeolithic toolmaking”, PLOS ONE 10, e0121804 (2015), doi:10.1371/journal.pone.0121804), Dietrich Stout et al. demonstrated convincingly that the manufacture of Acheulean tools would have required more complex neurophysiological skills than the manufacture of Oldowan tools. The study authors collected structural and functional brain imaging data as volunteers made “technical judgments (outcome prediction, strategic appropriateness) about planned actions on partially completed tools” and found that performing these tasks measurably altered neural activity and functional connectivity in a particular region of the brain which is commonly regarded as the “central executive” of working memory, on account of its role in the selection, monitoring and updating of incoming information (the dorsal prefrontal cortex). The magnitude of this alteration correlated with the success of the strategies employed by the novice toolmakers. A correlation was also observed between the frequency of correct strategic judgments and the volunteers’ success in fashioning Acheulean tools; however, no such correlation was found for Oldowan tools, indicating that a lower level of cognitive skill would have been required to manufacture the latter.

—————————————————————-

(ii) Can “Acheulean Adam” be equated with any particular species of Homo in the fossil record?

RETURN TO MAIN MENU


Above: Facial reconstruction of the so-called “Turkana boy”, based on the skeleton of a 1.5 million-year-old Homo ergaster specimen found at Lake Turkana, Kenya. Homo ergaster was the hominin which fashioned the first Acheulean tools, some 1.76 million years ago. Image courtesy of Wolfgang Sauber (photograph), E. Daynes (sculpture) and Wikipedia.

The first hand-axes appear in the archaeological record about 1.76 million years ago. This era in tool-making is called the Acheulean era, so I’ll use the term “Acheulean Adam” to describe the hypothetical first individual or group of people who mastered this technology. (I should point out here that the term “Acheulean tools” includes not only hand-axes but also picks, cleavers and other large cutting tools, but in what follows, I’ll be focusing on hand-axes.)

The very first Acheulean tools were made by Homo ergaster (known in Europe and Asia as Homo erectus), the first hominin with a recognizably human body and a significantly larger brain than an ape’s. Homo ergaster first appeared in east Africa about 1.9 million years ago; his Asian counterpart, Homo erectus, is believed to have appeared around 1.8 million years ago. It is currently uncertain whether Homo ergaster started making Acheulean tools as soon as he first appeared in Africa, or whether he initially continued making and using only Oldowan pebble tools, like those made by Homo habilis, before inventing Acheulean tools. In the judgement of one eminent researcher, Ignacio de la Torre, “there is now evidence that the Acheulean appeared at least 1.75 Ma [million years ago] in the East African Rift Valley, which on an evolutionary scale coincides with the emergence of H. erectus” – a term he uses to include Homo ergaster (“The origins of the Acheulean: past and present perspectives on a major transition in human evolution”, Philosophical Transactions of the Royal Society of London B: Biological Sciences, 2016 Jul 5; 371(1698): 20150245).

To be sure, the appearance of Acheulean tools did not immediately kill off Oldowan pebble toolmaking: authors J. A. Catt and M. A. Maslin, in their 2012 article “The Middle Paleolithic” (in Chapter 31, section 31.5.2 of The Geologic Time Scale 2012, Elsevier, ed. Felix M. Gradstein et al.) note that “the Acheulian and Oldowan cultures are found together in East and South Africa between about 1.9 Ma [million years ago] and 1.5 Ma” – a fact which they attribute to “the long overlap in time between late H. habilis and early H. erectus.” Although we have no direct evidence for the hypothesis that early Acheulean tools were only manufactured by Homo erectus, authors Rosalia Gallotti and Margherita Mussi articulate the common view in their article, “Two Acheuleans, two humankinds: From 1.5 to 0.85 Ma at Melka Kunture (Upper Awash, Ethiopian highlands)” (Journal of Anthropological Sciences, Vol. 95 (2017), pp. 137-181), when they state: “Even if multiple hominin genera and species co-existed at the time, there is a general understanding that the new [Acheulean] techno-complex’s only knapper was Homo erectus/ergaster...” (p. 138).

However, F. Diez-Martín et al. question the popular view that Acheulean technology is the signature trait of Homo erectus/ergaster in their article, The Origin of The Acheulean: The 1.7 Million-Year-Old Site of FLK West, Olduvai Gorge (Tanzania) (Scientific Reports, volume 5, Article number: 17839 (2015)):

The Acheulean technology has been argued to be the hallmark of H. erectus. However, at present this interpretation must be nuanced in the light of hominin types chronologically co-occurring with this and other technologies. First, the presence of H. erectus (or H. erectus-like) fossils antedate the earliest evidence of this technology by at least 200 Ka [200,000 years – VJT] (e.g., the 1.9 Ma KNMER 3228 or OH86)[54] and they occur at a time in which only classical Oldowan is documented. Secondly, there are H. erectus remains directly associated with typologically and technologically Oldowan assemblages (e.g., Dmanisi at 1.7 Ma)[55]. Thirdly, the traditional association of classical Oldowan and H. habilis from Olduvai Bed I has been challenged by the presence of a H. erectus-like hominin (OH 86) at this time[54].”

The authors attribute the relatively sudden appearance of Acheulean tools in the archaeological record to “major climatic changes towards aridity” occurring around 1.7 million years ago in east Africa, which would have favored the appearance of new and more flexible patterns of hominin behavior, and conclude: “The coincidence in time of these climatic changes and the occurrence of the earliest Acheulean would suggest a climatic trigger for this technological innovation and its impact in human evolution.”

Finally, in their article, Early Pleistocene occurrence of Acheulian technology in North China (Quaternary Science Reviews 156 (2017) 12-22), Xingwen Li et al. summarize the evidence for the earliest appearance of Acheulian tools outside East Africa:

“Current thinking is that Acheulian technology originated in East Africa (possibly West Turkana, Kenya) at least 1.76 million years ago (Ma) (Lepre et al., 2011), that it became distributed somewhat widely across Africa (e.g., Vaal River Valley and Gona) at ~1.6 Ma (Gibbon et al., 2009; Semaw et al., 2009), and then spread to the Levant at ~1.4 Ma (Bar-Yosef and Goren-Inbar, 1993), South Asia at 1.5-1.1 Ma (Pappu et al., 2011), and Europe at 1.0-0.9 Ma (Scott and Gibert, 2009; Vallverdú et al., 2014) (Fig. 1). The 0.8-0.9 Ma Acheulian stone tools from South and central China (Hou et al., 2000; de Lumley and Li, 2008) (Fig. 1) suggest that Acheulian technology arose in China at least during the terminal Early Pleistocene.”

The authors also report the recent discovery of Acheulean tools in North China, dating back to 900,000 years ago, which is about the same time that they appeared in Europe. As early as these dates may be, they are much later than the dates for the first appearance of Homo erectus in Europe and Asia. Homo erectus appears in Dmanisi, Georgia around 1.8 million years ago, in Yuanmou, China, 1.7 million years ago, in Sangiran, on Java, Indonesia, at least 1.66 million years ago, and in Europe at least 1.4 million years ago. Thus in many areas outside Africa, Homo erectus continued using Oldowan pebble tools for several hundred thousand years after he first appeared, and in some parts of east and south-east Asia, Homo erectus seldom or never produced hand-axes. (The Movius line, shown on the map below, indicates the boundary between areas where hand-axes were and were not created by Homo erectus.) What this research suggests is that the very first Homo erectus hominins to leave Africa might not have taken Acheulean culture with them, which would make sense if they did not yet possess the technology when they left.

Taken together, these findings make it difficult for us to equate “Adam” with the earliest members of Homo ergaster (or Homo erectus), as some creationists and/or Intelligent Design theorists have attempted to do. The picture is much more complicated than that. Although Homo ergaster/erectus was the first hominin to create Acheulean tools, this species didn’t do so right away. Over 100,000 years may have elapsed before Homo ergaster invented this technology, while its appearance in Asia and Europe was much later.

A short note on the Movius line

Map showing the approximate distribution of cultures using Acheulean bifaces during the Middle Pleistocene. In the brownish areas, Homo erectus used Acheulean hand-axes; in the blue-green areas, more primitive Oldowan pebble tools were used. (Australia and New Guinea were uninhabited at the time.) The line dividing the two areas is called the Movius line. In recent years, however, hand-axes have been found about 1,500 kilometers east of the Movius line, in China and South Korea, as well as at locations in Indonesia, causing some authorities to suggest that we should throw out the concept of the Movius line altogether. Other experts continue to defend the validity of the concept. Image courtesy of Locutus Borg, José Manuel Benito Álvarez and Wikipedia.

The concept of the Movius line is somewhat controversial among archaeologists today. Cassandra M. Turcotte, of the Center for the Advanced Study of Hominid Paleobiology, neatly summarizes the reasons that led archaeologist Hallam Movius to draw this boundary back in 1948, in her online article, Stone Tools (2016), written for the Bradshaw Foundation of Paleoanthropology:

“Interestingly, the Acheulean failed to permanently penetrate much of Asia and the boundary dividing the parts of Asia with and without handaxes was termed the ‘Movius’ line. The archaeologist for which the line was named, Movius, determined that this was because Asia was a continent of cultural stagnation. Other researchers have variably attributed the absence to availability of organic tools (i.e., bamboo shoots) or the continued use of Mode 1 technologies in eastern Asia while Europe, Africa and the Asian subcontinent moved on to the Acheulean (Lycett and Bae, 2010). Eventually, Acheulean-type technologies were discovered in eastern Asia, although always at low frequencies. Such sites include the Hantan River Basin in Korea and in the Bakosa Valley in Indonesia, where handaxes are found at higher frequencies but mostly less than 5 percent of the assemblage. Lycett and Bae (2010) propose five explanations for the possible low frequency, including the possibility that human dispersal into eastern Asia came before the innovation of the Acheulean… There are also raw material constraints, the possibility of geographical and topographical barriers, the issue of bamboo availability and questions of social transmission. In short, there are many reasons why the Acheulean never made a big splash in eastern Asia, although it certainly made it to the rest of the Old World and stayed there for over a million years.”

However, Professor Robin Dennell argues vigorously for a contrary position in his paper, Life without the Movius line: the structure of the east and Southeast Asian Early Palaeolithic (Quaternary International 400:14-22, 2 May 2016), where he concludes that “the Movius Line is no longer an appropriate way of studying the Early Palaeolithic of East and Southeast Asia, and should be disregarded.” Dennell’s key point is that although much of East and Southeast Asia would have been cut off from other Eurasian populations during periods of glaciation in the Pleistocene period, “East Asia was not isolated throughout the entire Early and Middle Pleistocene, but open to immigration during interglacials, as is indicated by its fossil hominin record.” Surprisingly, he calculates that the 1,500-kilometer distance from the Movius line to East Asia could have been traversed by a band of hominins within the space of a single generation, which means that there is no reason to suppose that technical skills (such as the ability to fashion an Acheulean hand-axe) would have been lost by falling out of living memory during the long trek: older individuals would still have possessed technical know-how, and could have passed it on to younger members of the tribe, allowing the technology to reappear in East Asia. Dennell also argues that Acheulean tools have been found in China and Korea (although as Turcotte notes above, they are relatively rare), and suggests that Movius made an unfortunate choice in selecting the archaeological site of Zhoukoudian (near Beijing) as a typical example of Chinese toolmaking during the Middle Pleistocene period, as it is a cave, and most Acheulean tools were made in open-air localities. All across Eurasia, there are many other cave sites that lack Acheulean bifaces, or where bifaces are rare. In other words, Movius was comparing apples and oranges: he should have compared open-air sites in Europe with those in East Asia.

However, Emeritus Professor Fumiko Ikawa-Smith (Department of Anthropology, McGill University) argues that the Movius line remains a genuine paradox, in an article on the subject written for the Encyclopedia of Global Archaeology (ed. Claire Smith, Springer, New York, 2013). Notwithstanding the discovery of Acheulean tools in China and Southeast Asia, Ikawa-Smith contends that the concept of the Movius line is a valid one, citing three reasons: the rarity of hand-axe sites in Eastern Asia; the fact that hand-axes account for less than 5% of artifacts at Korean sites and less than 7% at Indonesian sites; and finally, the fact that “hand-axes in Eastern Asia are morphologically different from those in the west: they are thicker, often with a plano-convex cross section.” She examines four possible explanations for the Movius line (namely, lack of available flint for making hand-axes at East Asian sites; barriers to migration and cultural diffusion; availability of alternative materials for toolmaking, such as bamboo; and loss of technical knowledge), rejecting the first and second explanations as inadequate while leaving the third and fourth possibilities open before concluding: “After decades of debate and a series of recent discoveries, the Movius Line still serves as an important demarcation line in the Pleistocene world.” Time will tell if Movius’s boundary was a valid one.

I should also note for the record that Oldowan-style tools persist in Europe until well after one million years ago.

—————————————————————-

(iii) Do Acheulean hand-axes appear suddenly in the archaeological record, or gradually?

RETURN TO MAIN MENU

Acheulean biface (trihedral), from Amar Merdeg, Zagros foothills, Lower Paleolithic. Image courtesy of National Museum of Iran and Wikipedia.

The first appearance of Acheulean tools in the archaeological record is relatively sudden. Author John Reader, in his highly acclaimed book, Missing Links: In Search of Human Origins (Oxford University Press, 2011) remarks on the sudden appearance of the Acheulean tool industry in east Africa:

“Since there is no sign of the Acheulean having been developed at Olduvai, its sudden appearance is generally credited to the arrival of more technically adept outsiders (though the innovation and perfection of the new technology could easily have been achieved at Olduvai. It need only have taken a few months; but even if it had taken years, or decades, the time involved was still too brief to have left any evidence of its development in the archaeological record). It could have been developed by Oldowans who split away from the main group; it could have been brought in by a different group that migrated into the region. But this is just speculation. All that is known for certain is that the Acheulean industry appears around 1.6 million years ago, almost simultaneously, at a number of sites across east Africa…” (pp. 306-307)

Reader’s book was written in 2011. Since then, the first appearance of Acheulean tools has been pushed back by an additional 160,000 years, from 1.6 million to 1.76 million years ago. However, we do not see a simultaneous flowering of the Acheulean tool industry at multiple sites, 1.76 million years ago. Instead, what we see is a slow takeoff of the new technology. In a recent paper titled, “Two Acheuleans, two humankinds: From 1.5 to 0.85 Ma at Melka Kunture (Upper Awash, Ethiopian highlands)” (Journal of Anthropological Sciences, Vol. 95 (2017), pp. 137-181), authors Rosalia Gallotti and Margherita Mussi acknowledge the somewhat sketchy nature of the early Acheulean archaeological record, between the first appearance of Acheulean tools 1.76 million years ago and their subsequent proliferation, 1.5 to 1.6 million years ago:

“The longest Acheulean Industrial Complex sequences are found in East Africa, where it emerged between 1.76 and ~1.50 Ma [million years ago – VJT]. It is known from a number of well-dated sites. The oldest ones are Kokiselei 4, in West Turkana (1.76 Ma; Lepre et al., 2011); KGA6-A1(1.75 Ma), and KGA4-A2 (1.6 Ma) in Konso (Beyene et al., 2013); FLK West (~1.7 Ma) at Olduvai (Diez-Martín et al., 2015); and BSN-12 and OGS-12 at Gona (1.6 Ma; Quade et al., 2004).

“Around 1.5 Ma, the number of Acheulean sites increases, notably including Olduvai, Peninj, Nyabusosi, Melka Kunture and Gadeb … While not all the data from the earliest Acheulean sites (~1.7-1.6 Ma) have been published, all the lithic collections from the ~1.5 Ma Acheulean sites have been investigated in systematic technological studies.”

Given the dearth of Acheulean archaeological sites prior to 1.6 million years ago, it would be premature to conclude that Acheulean tools abruptly appeared 1.76 million years ago. What we can say instead is that during the 150,000 years subsequent to their appearance in the archaeological record, Acheulean tools were slow to proliferate across east Africa, for reasons which elude us.

Nevertheless, there are strong theoretical reasons for suspecting that Acheulean technology appeared on the scene very quickly. As authors S. Semaw, M. J. Rogers and D. Stout note in their 2009 article, The Oldowan-Acheulian Transition: Is there a “Developed Oldowan” Artifact Tradition? (chapter from Sourcebook of Paleolithic Transitions: Methods, Theories, and Interpretations, M. Camps, P. Chauhan (eds.), Springer Science+Business Media, LLC 2009, pp. 173-193), “Our preliminary evaluation of the archaeological record at Gona, Ethiopia and elsewhere suggests a fairly abrupt appearance of the Acheulian after a temporally rapid transition from the Oldowan.” The authors add:

“The Oldowan and Acheulian entities appear to have been separated by a comparatively rapid change dependent on a single technical step which by its very nature could not have been taken gradually (G. Ll. Isaac 1969, 21).

“…Large flake production in the Early Acheulian thus involves different objectives, different raw materials, and different means of support, as well as much greater force, possibly involving different percussive techniques such as throwing (Toth 2001). This mode of flaking is qualitatively different from the production of Oldowan flakes and clearly represents a novel technological invention

“In sum, qualitative technological, behavioral, and cognitive differences between the industries make a ‘transitional’ industry difficult to envision. Essential neural, somatic, and behavioral preconditions must have been in place to afford the invention of this new technology; however, the technology itself represents a clear discontinuity.”

However, the authors qualify their remarks later, when they suggest that the sudden appearance of Acheulean toolmaking technology may be due to the confluence of several different changes, each of which may have occurred gradually, during the late Pliocene and early Pleistocene epochs:

“Moreover, our discussion above and our work at Gona have led us to consider that the way the ‘Oldowan–Developed Oldowan–Acheulian’ transition has traditionally been conceived may be conflating separate cultural/technological/ecological changes occurring in the Late Pliocene/Early Pleistocene that may or may not be interconnected, such as: (1) the ability to knock off large flakes, (2) the ability to flake invasively and shape tools purposefully with predetermination or preconception of form, (3) the standardization of tool shape and/or technique, (4) changing diet and ranging patterns, (5) possible changes in group size and/or organization, and (6) possible changes in learning styles and abilities. Early Pleistocene hominins may have ‘experimented’ with these developments initially until all elements came together with the classic Acheulian.

To sum up: there are good reasons to believe that the appearance of Acheulean tools represents a sudden technological breakthrough. As we’ll see below, however, Acheulean tools continued to evolve and become more sophisticated during the next one million years after their appearance.

—————————————————————-

(iv) Were Acheulean hand-axes the product of genetically programmed instincts or cultural transmission?

RETURN TO MAIN MENU

It might seem obvious that prehistoric hominins had to be taught how to make Acheulean hand-axes: in other words, the skills required to fashion them were acquired by cultural transmission. However, this assumption has recently been questioned, on the grounds that some animals (e.g. certain species of birds) are capable of creating complex structures without the need for any kind of instruction, thanks to their genetically programmed instincts. It will therefore be instructive to examine the reasons why anthropologists are convinced that Acheulean hand-axes are cultural artifacts.

Raymond Corbey’s bombshell paper on Acheulean tools: are they really any more remarkable than what birds are capable of making?

The elaborate nest of a Baya weaver bird from Yelagiri, India. Some authors have questioned whether Acheulean hand-axes are any more impressive than birds’ nests, in their elaborate construction. Image courtesy of McKay Savage, Flickr and Wikipedia.

Until recently, it was generally assumed by anthropologists that Acheulean hand-axes were purely cultural artifacts, and that the techniques for making them were learned by being carefully handed down from generation to generation. On the surface, this looks like a reasonable hypothesis: after all, hand-axe construction would have required multiple decisions to be made at many different stages. Acheulean toolmakers needed to decide on the size, shape, and quality of the raw material, as well as on the finishing touches. But as Raymond Corbey et al. point out in an article provocatively titled, “The Acheulean handaxe: More like a bird’s song than a Beatles’ tune?” (Evolutionary Anthropology, Jan/Feb. 2016, Volume 25, Issue 1, pp. 6-19), many birds have to do much the same thing when making their nests (for instance, British long-tailed tits build their nests from thousands of pieces of lichen, moss and spiderweb, plus numerous feathers), and it’s now generally agreed that nest-building in birds is largely (though not entirely) genetic. And if hand-axes were cultural artifacts, then why do they vary so little from one place to another around the world? One would have expected local traditions to spring up, but they didn’t. Another difficulty for the view that hand-axes were cultural artifacts is that after the Acheulean era ended around 300,000 years ago, the rate of technological change suddenly increased. Why is this? Finally, anthropologists once believed that Acheulean hand-axes were only produced in Africa, Europe, the Middle East, India and parts of South-east Asia, but never in East Asia. The so-called Movius line, named after archaeologist Hallam Movius, runs through northern India and marks the traditional eastern limit of hand-axe finds in Asia. Recently, however, Acheulean handaxes were found in China and South Korea. Despite lying some 1,500 kilometers beyond the Movius line, they look pretty much like the ones found elsewhere in the world. If hand-axes were cultural artifacts, we would expect some variation. Corbey et al. conclude that the techniques used to fashion Acheulean hand-axes had a substantial genetic component.

The case for Acheulean tools having been cultural artifacts

One of the hundreds of handaxes discovered by archaeological excavation at Boxgrove, West Sussex, U.K. This one, which was probably fashioned by Heidelberg man, was found in 2011. Age: approximately 500,000 years. Image courtesy of Midnightblueowl and Wikipedia.

On the other hand, psychology professor Frederick Coolidge and anthropology professor Thomas Wynn of the University of Colorado argue in an article in Psychology Today (April 2, 2015) titled, “The Hand-axe Enigma”, that the Acheulean hand-axe was not merely a functional artifact, but a thing of beauty, as the hand-axe in the photo above reveals. The authors point out that much more time was spent making hand-axes than would have been required to make them functional, that very few hand-axes show evidence of being used despite having been manufactured in their thousands, that some gigantic hand-axes were too big to have been used, and that even though the basic design of hand-axes remained unchanged for one-and-a-half million years, hand-axes themselves became more elegant over the course of time: more regular in their proportions, more symmetrical, and thinner.

There’s more. In a thoughtful response to Corbey et al., Thomas Wynn and John Gowlett contend that the Acheulean hand-axe doesn’t require a genetic explanation. Instead, they propose a mixed ergonomic-esthetic explanation. In their paper, titled, “The handaxe reconsidered” (Evolutionary Anthropology, Volume 27, Issue 1, January/February 2018, pages 21-29), Wynn and Gowlett argue that in a society which lacked the technology to make handles for things like knives, axes and spears – a technique known as hafting – but which nonetheless possessed the more rudimentary ability to break off pieces of stone from a larger rock – a technique known as knapping – the Acheulean hand-axe was a sturdy, hand-held tool that did the job: it was ideal for cutting up animal carcasses and removing the meat. Additionally, prehistoric humans had what Wynn and Gowlett describe as “an esthetic preference for regular forms with gradual curves and pleasing proportions.” In particular, they had a built-in preference for objects with bilateral symmetry. They also liked big beautiful objects more than small ones – which is why some hand-axes were deliberately designed to be over-sized. Finally, they tended to like objects that closely matched an ideal form or prototype, and they made every effort to deviate from that prototype as little as possible.

Wynn and Gowlett also highlight several reasons for believing that Acheulean hand-axes are best explained culturally, rather than genetically. Some of these reasons have been discussed above, so I won’t repeat them, but here are a few more: among primates in general, toolmaking is cultural behavior. Modern humans can learn how to make hand-axes from other individuals by cultural tradition, although as Dietrich Stout et al. (2015) point out, it typically takes hundreds of hours of practice to become adept. Most importantly, “handaxes can be represented as a package of ideas that occur in subsets in other artifacts in different combinations (e.g., in bifacial choppers, or pointed picks)… This ultimate independence of characteristics that appear to belong together in a package is one of the most compelling reasons for thinking of the handaxe as a cultural object like other cultural objects.”

The authors discount Corbey’s proposal that hand-axe production by Homo erectus might have been driven by a new gene that appeared 1.8 million years ago, on the grounds that virtually the same areas of the primate brain are activated by humans when manipulating tools as by apes and monkeys when manipulating objects. The authors point out that when making hand-axes, a special part of the brain’s parietal lobe, associated with planning of sequential action, is activated. This part of the brain receives input from another region, which is associated with recognizing shapes. Extra demand is placed on the brain’s working memory, as well. However, the authors qualify their remarks somewhat by acknowledging that on the whole, there are only minor differences in brain activation when individuals manufacture hand-axes as opposed to when they fashion simple (Oldowan-style) core and flake tools, which (as we’ve seen above) chimps are capable of making.

Finally, Wynn and Gowlett contend that what was most important about hand-axe production is that for the first time, prehistoric humans began paying attention to the objects they made, trying to refine their features and even carrying them around when they traveled, unlike other primates, who discard their tools when they have done the job they were designed for. From an early stage, makers of Acheulean hand-axes even differentiated their tools from other tool types, such as picks and cleavers. Taken together, all of this suggests that hand-axes were primarily a cultural phenomenon.

How well does cultural transmission explain patterns of variability in Acheulean hand-axes?

In a reply to Raymond Corbey et al. (2016) titled, “Less of a bird’s song than a hard rock ensemble” (Evolutionary Anthropology, 27 (1), pp. 9-20, ISSN 1520-6505, doi: https://doi.org/10.1002/evan.21551), R. Hosfield et al. contest Corbey’s claim that Acheulean technology was geographically uniform and monolithic. Corbey et al. had argued that there should be a lot more variability in Acheulean hand-axes if they are the products of cultural transmission, because different environments should be generating different solutions and because social learning down the generations tends to lead to an accumulation of copying errors. In reply, Hosfield et al. argue that “the Acheulean handaxe record shows a mixture of short-term and local variability, alongside long-term and ‘global’ similarities.” They show that Acheulean hand-axes display not only “site-specific modal forms,” but also “locally expressed, short-lived, idiosyncratic traits,” and they attribute large-scale convergences in the form of the Acheulean hand-axe to a combination of two factors: “shared social learning and the limitations imposed by the mechanics of handaxe production.” The authors also query whether functional implements such as knives or multi-purpose tools should necessarily be expected to track environmental variation: one would not expect a knife made in Africa to look any different to one made in Europe, for instance.

Corbey et al. had also argued that the recent discovery of Acheulean tools in China, some 1,500 kilometers east of the Movius line, was more easily accounted for by the hypothesis that their manufacture was largely due to genetic programming. In reply, Hosfield et al. argue that Dennell (2016) has convincingly rebutted the entire concept of the Movius Line (a somewhat contentious assertion – see note above). More to the point, the authors cite evidence (discussed by Dennell) of interglacial connections between West Asia and East Asia, which prehistoric hominins could easily have crossed within a single generation. Consequently, there is no reason to suppose that culturally acquired knowledge would have been lost during the long trek to China. Corbey et al. had hypothesized that Homo erectus may have possessed a gene for hand-axe making that went into abeyance when early hominins migrated east of the Movius line and were unable to find suitable materials for making hand-axes, but was reactivated when they finally reached East Asia and discovered these materials. However, Hosfield et al. dismiss this hypothesis as implausible and argue that it does a poor job of accounting for the data available to us.

The symmetry of the Acheulean biface is a deliberate feature, not an accident

In a recent paper titled, “Were Acheulean Bifaces Deliberately Made Symmetrical?” (Cambridge Archaeological Journal, July 2018, DOI: 10.1017/S095977431800032X), Ceri Shipton et al. carefully examine the hypothesis (favored by Raymond Corbey and numerous other authors cited in the article) that the symmetry of Acheulean bifaces (i.e. stone implements flaked on both faces) was an unintentional product of the manufacturing process, and present their grounds for rejecting it, in favor of the hypothesis that the symmetry of these tools was a feature deliberately imposed on them. In their conclusion, the authors summarize their reasons for rejecting the null hypothesis that the flaking of bifaces could have automatically rendered them symmetrical:

“The archaeological comparison of Acheulean bifaces and Middle Palaeolithic bifacial cores indicated that the high symmetry levels in the former are not the inevitable result of bifacial flaking around the perimeter. The archaeological data also indicated that there is no relationship between symmetry and reduction intensity among Acheulean bifaces, so bifaces do not inevitably get more symmetrical as they are worked more. Symmetry was achieved on Acheulean cleavers using a variety of reduction methods, including unifacial reduction, and sometimes with different reduction methods on different sides and different surfaces of the bifaces. Symmetry is therefore not an epiphenomenon of knapping technique, but is an independent property of these tools.

The authors adduce several considerations which point to the conclusion that these tools were deliberately designed to be symmetrical:

“In conjunction, these analyses show that symmetry was deliberately imposed on Acheulean bifaces. Similar patterns in biface symmetry are evident whether we are considering bifaces from Britain, East Africa, or India. Despite the diversity of hominin species producing Acheulean bifaces, symmetry appears to have been a consistent goal. Over the course of such a long-lived culture we should expect to see improvements in symmetry if it was a goal of hominin knappers, and enhanced knapping skills were afforded by traits such as increases in the size of brain regions involved in biface knapping; the invention of new knapping techniques; or the addition of adolescence as a life-history stage in which knapping could be mastered (Shipton 2018). On relatively short timescales there are not discernible improvements in Acheulean biface symmetry (White & Foulds 2018). However, when considering the overall duration of the Acheulean, symmetry in multiple planes and deliberately warped symmetry appear to emerge during its later stages (Wynn 1979;2000;2002).”

In view of the evidence marshaled by Raymond Corbey’s critics and summarized above, I think it is fair to conclude that Corbey’s daring proposal that the Acheulean hand-axe was largely the product of genetic programming has been refuted.

Two Acheuleans?

Left: An early Acheulean stone tool (left) compared with an Oldowan style stone tool from Dmanisi, Georgia (right, 1.8 mya, replica). Image courtesy of Gerbil and Wikipedia.
Right: A late Acheulean biface from St. Acheul, France. Date: between 500,000 and 300,000 BP. Image courtesy of Félix Régnault, Didier Descouens and Wikipedia.

As the picture above reveals, not all Acheulean tools are created equal. In a recent paper titled, “Two Acheuleans, two humankinds: From 1.5 to 0.85 Ma at Melka Kunture (Upper Awash, Ethiopian highlands)” (Journal of Anthropological Sciences, Vol. 95 (2017), pp. 137-181), authors Rosalia Gallotti and Margherita Mussi contest the longstanding assumption that the Lower Pleistocene Acheulean (LPA), which lasted from about 1,500,000 to 800,000 years ago, was a period of cultural stasis, with little or no innovation. Based on their excavations at Melka Kunture, in the Ethiopian highlands, the authors distinguish “two Acheuleans” during this period, which are sharply defined by clear-cut differences in various aspects of prehistoric hominins’ techno-economic behaviors. The oldest Acheulean tools (circa 1.5 million years ago) principally consist of massive scrapers, retouched large cores and cleavers, in addition to unmodified large flakes. By 1.0 to 0.85 million years ago, a different culture had emerged:

“From an archeological standpoint, we underline the emergence of new concepts and behaviors at this time, as exemplified by the record discussed above. The main ones are morphometric predetermination [i.e. the shape and size of the tools were planned with a specific goal in mind – VJT], shaping, systematic procurement at primary sources, systematic fragmentation of the LCT [large cutting tool] chaînes opératoires [i.e. a more complex and refined chain of procedures involved in making these tools – VJT], and greater knowledge of the paleolandscape as related to resource exploitation.”

It seems to me that the authors make a convincing case, here.

Left: “Rhodesian man” [based on the Kabwe 1 cranium], an African subspecies of Heidelberg man, a species thought by some to have given rise to Homo sapiens. Museo de la Evolución Humana (Burgos), Spain. Image courtesy of Élisabeth Daynès (forensic reconstruction), Zarateman (photograph) and Wikipedia.
Right: Reconstruction of Homo antecessor, based on fragments from Gran Dolina, Atapuerca, Spain. Some experts believe that Homo antecessor is a more likely ancestor of Homo sapiens than Heidelberg man. Museo de la Evolución Humana (Burgos), Spain. Image courtesy of Élisabeth Daynès (forensic reconstruction) and Wikipedia.

——————–

More controversially, the authors propose that this sharp cultural discontinuity is related to the transition from Homo ergaster/erectus to Homo heidelbergensis (commonly known as Heidelberg man). I should point out, however, that the taxon Homo heidelbergensis is poorly defined, and some authorities even question its validity. Note also that the taxon Homo antecessor is used to designate archaic humans with a “unique mix of modern and primitive traits” living in Europe between about 1.2 million and 0.8 million years ago. Perhaps this taxon would be more appropriate, given the time period under consideration: most fossils of Heidelberg man date from between 700,000 and 300,000 years ago.

—————————————————————-

(v) What kind of minds did Acheulean toolmakers possess, and did they use language?

RETURN TO MAIN MENU

One thing we do know for sure: the hominins who manufactured Acheulean tools were skilled toolmakers. From studies on human volunteers, we now know that it would have taken hundreds of hours of practice for them to learn how to make Acheulean hand-axes.

Dietrich Stout et al. discuss the cognitive demands of Acheulean toolmaking in a 2015 study of human volunteers (archaeology students from the University of Exeter, U.K.) who were taught how to fashion Oldowan and Acheulean tools (“Cognitive demands of Lower Palaeolithic toolmaking”, PLOS ONE 10, e0121804 (2015), doi:10.1371/journal.pone.0121804):

“Stone toolmaking is a demanding technical skill that can take years to master. With an average of 167 hours practice over 22 months, our subjects gained competence in [Oldowan] flake production but showed less improvement in handaxe-making. This provides a reference point for estimating the learning investments of Paleolithic toolmakers [17, 34].”

Additionally, it has been shown above that the hominins who manufactured Acheulean hand-axes had an eye for symmetry, and, after about 750,000 years ago, an eye for beauty as well.

But the question that will surely be uppermost in the minds of most readers is: were Acheulean toolmakers capable of using language?

The case FOR language use by Acheulean toolmakers

In a recent paper titled, “An overview of the cognitive implications of the Oldowan Industrial Complex” (Azania: Archaeological Research in Africa, 2018, Vol. 53, No. 1, 3–39), Nicholas Toth and Kathy Schick cite a 2015 study by T. Morgan et al., titled, “Experimental evidence for the co-evolution of hominin tool-making teaching and language” (Nature Communications 6, 6029 (2015), https://doi.org/10.1038/ncomms7029), suggesting that whereas the skills involved in making Oldowan tools could be imparted without the use of language, a much stronger case can be made that language (or at least, proto-language) would have been required in order to teach an absolute novice how to make an Acheulean hand-axe:

“In a study of teaching novices to produce Oldowan-like artefacts, Morgan et al. (2015) examined the premise that, in view of its probable social transmission, stone toolmaking spurred the evolution of teaching and language in our lineage. Using experiments in teaching novices to make stone tools, they explored five different avenues of learning the task: reverse engineering (from observation of the final artefact product); imitation/emulation (simple observation of the knapping operation); basic teaching (soliciting attention from trainee during the knapping); gestural teaching (emphasising aspects of the task with gestures); and verbal teaching (accompanying the knapping procedure with verbal instructions). They found that teaching with language was far superior to either imitation or emulation in transferring toolmaking skills among individuals. They concluded, however, that Oldowan toolmaking may have depended on imitation and emulation (observational learning) for transmission among groups and across generations, which they refer to as “low-fidelity social transmission” and suggest this as a reason for the relatively low rate of change in the Oldowan over many hundreds of thousands of years, while contending that Acheulean technology may have required teaching or ‘proto-language.’

However, in their 2015 paper, Morgan et al. are highly tentative about their proposal that Acheulean technology may have required the use of language. After summarizing their reasons for rejecting the view that Oldowan tool-makers used language, they continue:

“This leaves open the possibility that the transmission of Acheulean technology was reliant on a form of (gestural or verbal) proto-language (12,60,61). This need not imply that Acheulean hominins were capable of manipulating a large number of symbols or generating complex grammars. Our findings imply that simple forms of positive or negative reinforcement, or directing the attention of a learner to specific points (as was common in the gestural teaching condition), are considerably more successful in transmitting stone knapping than observation alone.”

The authors conclude by observing that their results imply that “hominins possessed a capacity for teaching — and potentially simple proto-language — as early as 1.7 mya [million years ago],” but caution that this capacity arose very gradually, as the demands of Acheulean tool-making became more complex over the course of time and relied increasingly on what they call “long sequences of hierarchically organized actions,” thereby generating selection for increasingly complex modes of communication. They believe that this selective pressure could explain the evolution of human language:

“Under this continued selection, teaching, symbolic communication and eventually verbal language may have been favoured, allowing the ready transmission of abstract flaking concepts, such as the role of the exterior platform angle in choosing where to strike(38), which our findings show are effectively transmitted by language.”

The case AGAINST language use by Acheulean toolmakers

We have seen that the case for language use by Acheulean toolmakers is far from compelling. Is there any evidence against the claim that the makers of Acheulean tools were capable of using language? As it turns out, there is.

In 2017, Shelby Putt et al. conducted a groundbreaking study, reported in a journal article titled, “The functional brain networks that underlie Early Stone Age tool manufacture” (Nature Human Behaviour 1(6):0102, May 2017, DOI: 10.1038/s41562-017-0102), on two groups of subjects who were taught to make Oldowan and Acheulean artefacts. One group was taught verbally, while the other group was taught non-verbally. The study employed a brain-imaging technique called functional near-infrared spectroscopy (fNIRS). The reason why one group was given non-verbal instructions was that previous brain imaging studies of toolmaking had suggested that internal verbalization (which occurs when people listen to spoken instructions) might actually be interfering with the task of toolmaking, and the researchers wanted to prevent this from happening. Intriguingly, Putt et al. found that verbal instructions did indeed interfere with the task of making Oldowan tools: “The Oldowan task also appears to come under increased cognitive control when it has been learned in the absence of verbal instruction.” Differences were also observed in the task of making Acheulean tools: “Only participants in the nonverbal group emphasized sound and tactile sensation as important to their thought process while knapping.” (The term “knapping” refers to the action of shaping a piece of stone by striking it, so as to make a tool or weapon.) The authors explain why the ability to discriminate sounds is so important, when making Acheulean tools:

“We propose that, like the processing of an auditory speech stream, Acheulian knapping requires the knapper to discriminate between knapping sounds and to assign meaning to those sounds based on how they relate to the hierarchy of goals involved in making a handaxe (for example, how does this strike and its associated sound get me closer to setting up an ideal platform to remove a flake that will be long and thin enough to remove this nearby convexity; how does this strike and its associated sound relate to the overall shape of the handaxe that I am trying to achieve).”

In support of this hypothesis, the authors note that “fossil and neuroarchaeological evidence now show that a major shift in hominin auditory processing occurred after Homo diverged from Australopithecus and Paranthropus and before the appearance of H. heidelbergensis.”

Toth and Schick (2018) elaborate on the findings of Putt et al.:

“Interestingly, the cognitive network seen in Acheulean handaxe production was associated with the visual working memory network (of the middle and superior temporal cortex) and was almost identical to that of trained pianists playing the piano (as opposed to speech), leading the researchers to suggest that this cognitive network was critical in audiomotor integration (the Oldowan artefact production was much weaker in these regions). They go on to suggest that Oldowan toolmakers prior to 1.8 Mya may have had more ape-like cognitive abilities, primarily involving the co-ordination of visual attention and motor control, while Acheulean toolmakers (probably the larger-brained Homo erectus) had more human-like cognitive abilities, requiring ‘the integration of higher-order motor planning, working memory and auditory feedback mechanisms’ (Putt et al. 2017: 4). This experimental work supports a working memory hypothesis rather than a language area hypothesis.

Putt et al. summarized their findings:

“Here we show that Acheulian tool production requires the integration of visual, auditory and sensorimotor information in the middle and superior temporal cortex, the guidance of visual working memory representations in the ventral precentral gyrus, and higher-order action planning via the supplementary motor area, activating a brain network that is also involved in modern piano playing. The right analogue to Broca’s area — which has linked tool manufacture and language in prior work — was only engaged during verbal training. Acheulian toolmaking, therefore, may have more evolutionary ties to playing Mozart than quoting Shakespeare.

The findings of Putt et al. make it clear that Acheulean toolmaking was a highly sophisticated cognitive task. However, in my judgement, it would be unwarranted to conclude that the makers of early Acheulean tools (which were much less sophisticated than their later Acheulean counterparts) possessed what we would call a proper language. “Simple forms of positive and negative reinforcement” (Morgan et al., 2015) do not qualify as language, and neither does directing someone’s attention to something. Additionally, the findings of Putt et al. constitute strong prima facie evidence against the hypothesis that Acheulean toolmakers relied on verbal instructions either to fashion their tools, or to teach other individuals how to do so.

—————————————————————-

Conclusion

To sum up: “Acheulean Adam” appears to have emerged in east Africa, at least 1.76 million years ago. The emergence of this hypothetical figure probably did not coincide with that of Homo ergaster, but occurred over 100,000 years later, and the first Homo erectus in Asia lacked the capacity to make these tools. The technology which gave rise to Acheulean tools seems to have appeared suddenly, possibly as a fortuitous confluence of several different technical skills that evolved gradually.

But was “Acheulean Adam” truly human? A very strong case can certainly be made for the claim that Acheulean tools were genuinely cultural artifacts, and that they were not simply mindlessly copied from generation to generation. Nevertheless, some troubling questions remain. For instance, why did it take Homo erectus (and his descendants) more than one million years to come up with the simple idea of hafting a stone point onto a wooden shaft, in order to give the weapon extra range? Granting that hand-axes became more elegant and aesthetically pleasing with the passage of time, isn’t it true nonetheless that the earliest hand-axes were quite crude and unsophisticated? Finally, it appears that the techniques for manufacturing early Acheulean hand-axes could have been imparted without the use of language – and if any language were used, it would have been very rudimentary, and it would have taken hundreds of thousands of years to evolve into what we would recognize as a proper language. For these reasons, we might reasonably hesitate to count “Acheulean Adam,” the first hand-axe maker, as being truly one of us.

————————————————————————————————————

2. Fire-controller Adam

RETURN TO MAIN MENU

A diorama showing Homo erectus, the earliest human species that is known to have controlled fire, from inside the National Museum of Mongolian History in Ulaanbaatar, Mongolia. Image courtesy of Nathan McCord (U.S. Marine Corps) and Wikipedia.

Overview: The ability to control fire didn’t emerge overnight. There is evidence that Homo ergaster (African Homo erectus) was able to use fire opportunistically as early as 1.5 million years ago, but it appears that Homo ergaster did not learn how to control fire until about 1 million years ago. And it was only between 800,000 and 400,000 years ago that humans learned to control fire on a regular and habitual basis – almost a million years after their ancestors first started using fire, on an irregular basis. What’s more, the discovery of fire-manufacturing technology (which may have been quite late in human prehistory) probably occurred in multiple places and at multiple times, and may well have been lost and rediscovered multiple times, as well. The evidence from fire control therefore militates strongly against the view that our ancestors literally became human overnight. Fire control and firemaking were skills acquired only gradually, and their acquisition certainly wasn’t a single, linear process.

Mastery of fire: Recommended Reading

Gowlett, J. A. J. (2016). “The discovery of fire by humans: a long and convoluted process”. Philosophical Transactions of the Royal Society of London B (Biological Sciences), 371(1696): 20150164.
Sandgathe, D. (2017). “Identifying and Describing Pattern and Process in the Evolution of Hominin Use of Fire”. Current Anthropology, Volume 58, Supplement 16.
Sandgathe, D. and Dibble, H. (2017). “Who Started the First Fire?” Sapiens blog article, January 26, 2017.
Shimelmitz, R. et al. (2014). “‘Fire at will’: the emergence of habitual fire use 350,000 years ago”. Journal of Human Evolution, 77:196-203.

Background

After the invention of Acheulean tools, the second point where one might attempt to draw the line between humans and non-humans is the control of fire – a discovery which Charles Darwin, in chapter IV of his work, The Descent of Man, hailed as “probably the greatest, excepting language, ever made by man.” I’ll give the name “Fire-controller Adam” to the first human individual (or group) that learned to control fire. Fire is useful in many ways: as a source of warmth; as a means to ward off predators at night; as a way of killing animals en masse while hunting, by panicking them into stampedes over cliffs; as an engineering tool, for hardening spears and other weapons; and last but not least, for cooking: as Darwin pointed out, fire makes hard and stringy roots digestible, and poisonous roots or herbs innocuous.

Richard Wrangham of Harvard University controversially maintains that our ancestors started using fire to cook food as early as 2 million years ago, with the appearance of Homo ergaster (or African Homo erectus). Wrangham believes that cooking explains why humans evolved smaller teeth and guts; however, one major drawback to his hypothesis is that currently there’s no archaeological evidence for cooking having taken place at such an early date – a point admitted by Wrangham himself in a 2013 paper titled, The evolution of human nutrition (Current Biology 2013 May 6; 23(9):R354-5. doi: 10.1016/j.cub.2013.03.061), where he observes that “[b]iological evidence suggests cooking might have been practised first by Homo around 2 million years ago, while archaeological evidence of the control of fire tapers gently away between 250,000 and 1 million years ago.”

Other scientists argue that cooking is not needed to explain the increase in hominin brain size. In a 2016 paper titled, Human Brain Expansion during Evolution Is Independent of Fire Control and Cooking (Frontiers in Neuroscience, 25 April 2016, https://doi.org/10.3389/fnins.2016.00167), authors Alianda M. Cornélio et al. critically examine the hypothesis that the cooking of food drove the expansion of brain size in early hominins, and find the arguments in its favor wanting. Instead, the authors believe that the increase in hominin brain size resulted from improvements in foraging efficiency – e.g. “including new sources of food in their diet, especially seeds and meat,” coupled with “the rise in the use of tools.” Control of fire did not occur until hominins had reached the required cognitive level. Cornélio et al. summarize their findings:

“In conclusion, the appealing hypothesis of thermal processing of food as a pre-requisite to brain expansion during evolution is not supported by archeological, physiological, and metabolic evidence. Most likely, the control of fire and cooking are rather a consequence of the emergence of a sophisticated cognition among hominins.”

Use, controlled use and habitual use of fire: some terminology

Anthropologists make a distinction between the ability to use fire, the ability to control fire, and the habitual and repeated use of fire. However, Dennis Sandgathe, in his paper, Identifying and Describing Pattern and Process in the Evolution of Hominin Use of Fire (Current Anthropology Volume 58, Supplement 16, 2017), criticizes terms such as “controlled use” and “habitual use” of fire, which he regards as unacceptably vague:

“Both terms might be appropriate and useful in certain circumstances, but neither has been well defined so that other researchers understand what is meant or intended by their use.”

We will return to these terminological issues below. For the time being, let us examine the claims made for hominins’ ability to use and control fire.

(a) The earliest evidence for the use of fire

The earliest evidence for the use of fire can be found at two sites in Kenya (Koobi Fora and Chesowanja) dating back to 1.5 and 1.42 million years ago, respectively, where fire may have been used by Homo ergaster (or African Homo erectus). Another site with evidence of fire use is Gadeb, in Ethiopia. However, it should be noted that some scholars contest these claims. In a 2019 article titled, “Hominin fire use in the Okote member at Koobi Fora, Kenya: New evidence for the old debate” (Journal of Human Evolution 133 (2019) 214-229), Sarah Hlubik et al. present “multiple lines of evidence from new fieldwork and experimentation indicating archaeological remains … from the site complex at FxJj20, Koobi Fora, Kenya,” which “show the possibility of human controlled fire” at the site, but the evidence is hardly conclusive: all it establishes is “the association of combustion and human behavior on at least one site… dated to 1.5 Mya [million years ago].” The authors conclude by acknowledging: “The question of whether and how human ancestors in the early Pleistocene used pyrotechnology cannot yet be answered.” It should also be reiterated that the opportunistic use of fire is not the same as the ability to control it.

(b) When did hominins learn to control fire?

The earliest control of fire 1 million years ago, by Homo ergaster

The earliest date at which Homo ergaster is known to have controlled fire is 1 million years ago. Wonderwerk Cave, in the Northern Cape province of South Africa, contains burnt bone and plant remains dating from that time, which provide solid evidence that deliberate burning took place inside the cave. Announcing the discovery in a research article titled, “Microstratigraphic evidence of in situ fire in the Acheulean strata of Wonderwerk Cave, Northern Cape province, South Africa” (PNAS May 15, 2012 109 (20) E1215-E1220; https://doi.org/10.1073/pnas.1117620109), Francesco Berna et al. note:

“The ability to control fire was a crucial turning point in human evolution, but the question when hominins first developed this ability still remains. Here we show that micromorphological and Fourier transform infrared microspectroscopy (mFTIR) analyses of intact sediments at the site of Wonderwerk Cave, Northern Cape province, South Africa, provide unambiguous evidence — in the form of burned bone and ashed plant remains — that burning took place in the cave during the early Acheulean occupation, approximately 1.0 Ma [1 million years ago – VJT]. To the best of our knowledge, this is the earliest secure evidence for burning in an archaeological context.

However, it’s worth noting that at another South African site, Swartkrans, there are fragments of bones (some of which appear to have been butchered), possibly dating as far back as 1.5 million years ago, that were heated to high temperatures (around 600 degrees Celsius) of the kind usually found only in hearths. (The temperature of most forest fires is only about 300 degrees.)

Controlled use of fire 790,000 years ago by Homo erectus at Gesher Benot Ya’aqov, Israel

Excavations at the Gesher Benot Ya‘aqov site. Image courtesy of www.archaeology.wiki.

Our next piece of evidence comes from a site called Gesher Benot Ya’aqov in Israel, dating from 790,000 years ago, where there are multiple layers of burnt materials, many containing charcoal or burnt wood, and some containing burnt tools. Some of these tools are thought to have been used by Homo erectus for hunting. The evidence for the controlled use of fire was first reported in a paper by Naama Goren-Inbar et al., titled, “Evidence of Hominin Control of Fire at Gesher Benot Ya’aqov, Israel” (Science, 30 Apr 2004: Vol. 304, Issue 5671, pp. 725-727). The abstract of the paper reads as follows:

“The presence of burned seeds, wood, and flint at the Acheulian site of Gesher Benot Ya`aqov in Israel is suggestive of the control of fire by humans nearly 790,000 years ago. The distribution of the site’s small burned flint fragments suggests that burning occurred in specific spots, possibly indicating hearth locations. Wood of six taxa was burned at the site, at least three of which are edible—olive, wild barley, and wild grape.”

In a follow-up paper (Out of Africa and into Eurasia with controlled use of fire: Evidence from Gesher Benot Ya’aqov, Israel, Archaeology Ethnology and Anthropology of Eurasia 28(1):63-78, December 2006), co-authors Nira Alperson-Afil and Naama Goren-Inbar explained that they relied on burned flint artifacts, including a small fraction that was found to be spatially clustered, to determine whether the burnt materials found at the site were natural or artificial in origin. The key assumption they used was that “natural wildfires result in extensive burning, while anthropogenic fires in the form of hearths result in spatially discernible clusters of burned material, specifically small-sized material” – an assumption based on “ethnographic and archaeological observations of hearth-related activities and discard patterns.” Across the world, humans tend to carry out a large range of activities, including social interactions, tool production, food processing, food consumption, and ritual ceremonies, in close proximity to hearths. This pattern of clustering is what enabled the researchers to conclude that as far back as 790,000 years ago, hominins living in Israel knew how to control fire. The authors considered various natural explanations for the burnt remains, but rejected them, concluding instead that the fires were artificial in origin:

“The paucity of burned items, their clustered distributions, and the fact that these are observed in two different occupation levels call for an interpretation other than naturally caused fire. Rather, they suggest that hominids were the agent responsible.
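
To make the logic of this clustering argument more concrete, here is a minimal, purely illustrative sketch in Python. It does not reproduce Alperson-Afil and Goren-Inbar’s actual analysis or data: the excavation grid, coordinates and sample sizes below are invented, and the statistic used (the Clark-Evans nearest-neighbour ratio) is simply a standard textbook measure of spatial clustering, chosen here for illustration only.

```python
# Toy illustration of the hearth-versus-wildfire inference: tightly clustered
# burned flints suggest fixed hearths, while dispersed burning is more
# consistent with a natural fire sweeping across the site.
# All coordinates and sample sizes are hypothetical.
import math
import random

def nearest_neighbour_ratio(points, area):
    """Clark-Evans ratio: observed mean nearest-neighbour distance divided by
    the value expected for a random scatter over the same area.
    Values well below 1.0 indicate clustering."""
    n = len(points)
    observed = sum(
        min(math.dist(p, q) for q in points if q is not p) for p in points
    ) / n
    expected = 0.5 * math.sqrt(area / n)  # expectation under complete spatial randomness
    return observed / expected

random.seed(1)
# Hypothetical 10 m x 10 m excavation square.
dispersed_burning = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(60)]
clustered_burning = [(5 + random.gauss(0, 0.4), 5 + random.gauss(0, 0.4)) for _ in range(60)]

print("dispersed (wildfire-like):", round(nearest_neighbour_ratio(dispersed_burning, 100.0), 2))
print("clustered (hearth-like):  ", round(nearest_neighbour_ratio(clustered_burning, 100.0), 2))
```

A ratio close to 1 is what site-wide scorching would tend to produce, while a ratio well below 1 is the kind of signature one would expect from repeatedly used hearths – which, in very rough outline, is the shape of the inference drawn at Gesher Benot Ya’aqov.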

Control of fire by Peking man

Sculpture of Peking Man at the Zhoukoudian Museum, near Beijing. Image courtesy of BleachedRice and Wikipedia.

Over in China, the regular association of artifacts with burnt bone between 700,000 and 400,000 years ago provides strong evidence that Peking man (Asian Homo erectus) was able to control fire. New discoveries near Beijing (see also here, here and here) indicate that Peking man was able not only to use but also to control fire, as far back as 770,000 years ago. Commenting on the site, Gowlett (2016) writes:

“Zhoukoudian near Beijing in China has been known for more than 80 years as a fire site [113,114]. Critiques have been made of its context, and on the nature of the ‘burnt’ material [115–118], much of which resulted from other natural processes. Nonetheless, the site is a record of the activities of Homo erectus in the period 0.4–0.7 Ma, with more than 100,000 artefacts, and preserving burnt bone [117,119,120]. The repeated associations argue for controlled fire [120].”

The new findings appear to preclude a natural origin for the burnt material found at Zhoukoudian, near Beijing, and the discovery in 2009 that Peking man lived during an ice age lends further support to the claim that he was able to control fire. Additionally, the existence of several cinder layers seems to point to the habitual and prolonged use of fire by Homo erectus in East Asia. However, it should be borne in mind that anthropological discoveries from China tend to get highly politicized, and this is particularly the case for Peking man. (For further details, see the article, “Is Peking Man Still Our Ancestor?” — Genetics, Anthropology, and the Politics of Racial Nationalism in China by Yinghong Cheng, Professor of History at Delaware State University, in The Journal of Asian Studies, Volume 76, Issue 3, August 2017, pp. 575-602.)

Fire was not used by Europeans until 300,000-400,000 years ago

In a ground-breaking paper titled, “On the earliest evidence for habitual use of fire in Europe” (PNAS March 29, 2011 108 (13) 5209-5214; https://doi.org/10.1073/pnas.1018116108), Wil Roebroeks and Paola Villa argue that the earliest hominins to enter Europe did so without relying on the habitual use of fire to stay warm. The authors point out that while the earliest evidence for hominins in southern Europe dates back to more than one million years ago, with indications that hominins reached England as far back as 800,000 years ago, the evidence for fire use in Europe is much more recent:

“…[S]urprisingly, evidence for use of fire in the Early and early Middle Pleistocene of Europe is extremely weak. Or, more exactly, it is nonexistent, until ∼300–400 ka [300,000 to 400,000 years ago – VJT]. Our review of the early European sites (Dataset S1) shows that the earliest possible evidence of fire comes from two sites dated to ∼400 ka [400,000 years ago – VJT], Beeches Pit in England and Schöningen in Germany. At Schöningen, the evidence consists of some heated flints (although mostly natural pieces) (22) and charred wood, including a wooden tool, with the studies of possible remains of former hearths still in progress (23). At Beeches Pit, dated to Marine Isotope Stage (MIS) 11 [424,000 to 374,000 years ago – VJT] (Dataset S1), the evidence consists of heated lithics and heated sediments (24, 25), interpreted as the remains of hearths.”

At the European site of Schöningen, in Germany, the evidence of fire cited above in fact dates back only to around 337,000 years ago (not 400,000 years ago, as previously believed). [Intriguingly, the discovery at Schöningen of short wooden staves with deep notches in their ends (discussed in Gowlett, 2016, cited above) suggests that stone points may have been hafted onto them, with the aid of twine heated in a fire.]

One possible reason why both the use and the control of fire came so late to Europe, suggested by Sandgathe, is the relatively low frequency of lightning strikes there, compared with subtropical latitudes. Lightning strikes often start small fires, which hominins living hundreds of thousands of years ago could have learned to harness; European hominins would have had far fewer such opportunities.

(c) When did the earliest habitual use of fire begin?

Surprisingly, it is not until 350,000 years ago that we possess unequivocal evidence for the habitual use of fire. In their paper, “‘Fire at will’: the emergence of habitual fire use 350,000 years ago” (Journal of Human Evolution, December 2014; 77:196-203), Ron Shimelmitz et al. contend that “only when fire use became a regular part of human behavioral adaptations could its benefits be fully realized and its evolutionary consequences fully expressed,” and they argue that this did not take place until the second part of the Middle Pleistocene period. They write:

“Frequencies of burnt flints from a 16-m-deep sequence of archaeological deposits at Tabun Cave, Israel, together with data from the broader Levantine archaeological record, demonstrate that regular or habitual fire use developed in the region between 350,000-320,000 years ago.

Deposits in the cave date from 420,000 to 220,000 years ago. In layers older than 350,000 years, burnt flints are very rare, but in every layer deposited after that date, they are abundant.

Shimelmitz et al. contend that this is when humans learned to control fire on a regular and habitual basis. They argue that Tabun Cave is the oldest site attesting to regular and habitual control of fire, although there is another site nearby (Qesem Cave) where a hearth was used repeatedly to burn bones and heat stone flints 300,000 years ago. The authors’ conclusions are telling:

“Initial dates for any historical phenomenon are minimum estimates and new discoveries almost inevitably push the ages of ‘firsts’ back in time. The archaeological signal is much stronger when behaviors become widespread and habitual. While the earliest evidence of fire associated with hominin activities is much older, the data presented here indicate that fire became a regular and constant part of hominin behavioral adaptations in Eurasia only after 350 kya. The benefits of fire for processing food, altering raw materials or enhancing social interactions would be fully realized only when use of fire shifted from opportunistic and occasional to habitual and regular.
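
The underlying inference here is a simple one: tabulate the proportion of burnt flints layer by layer and look for the point at which that proportion jumps and then stays high. The Python sketch below is purely illustrative – the counts, ages and threshold are hypothetical placeholders, not Shimelmitz et al.’s data – but it shows the shape of the argument.

```python
# Toy illustration of the Tabun-style inference: a sustained jump in the
# share of burnt flints, layer after layer, is read as the onset of regular
# fire use. All figures below are hypothetical placeholders.

layers = [
    # (approximate age in thousands of years, burnt flints, total flints)
    (420, 3, 500),
    (390, 5, 480),
    (360, 4, 510),
    (340, 90, 470),
    (300, 110, 520),
    (250, 95, 450),
    (220, 120, 500),
]

THRESHOLD = 0.10  # arbitrary cut-off: >10% burnt pieces taken as a signal of regular fire use

for age_ka, burnt, total in layers:
    share = burnt / total
    flag = "<-- regular fire use?" if share > THRESHOLD else ""
    print(f"{age_ka:>4} ka: {share:6.1%} burnt {flag}")
```

On the real data, the crucial point is not the exact cut-off but the fact that the high proportions persist in every subsequent layer, which is what distinguishes habitual use from occasional, opportunistic burning.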

While Shimelmitz et al. contend that fire use became habitual around 350,000 years ago, Sandgathe (2017) contends that the habitual use of fire occurred somewhat earlier in subtropical latitudes: around 800,000 to 400,000 years ago. However, he acknowledges that the evidence for this earlier date is less than compelling:

“…[T]he first appearance of examples of repeated fire use within a site rightly take on major significance in the history of development of pyrotechnology.

“The oldest of these is the open air site Gesher Benot Ya’akov (Israel) dated to approximately 800 kya, which appears to have a few superimposed layers with fire residues (Alperson-Afil 2017). However, the earliest unquestionable examples of hominin use of fire and long-term, continuous fire use occur in cave sites in Israel dating from the latter half of the Middle Pleistocene. Between 350 and 300 kya we have the notable examples of Hayonim Cave, Qesem Cave, and Tabun Cave, where the sequences have recorded what appear to reflect regular and successive use of fire over much of this period.”

Among the sites mentioned by Sandgathe dating from between 800,000 and 400,000 years ago, most are relatively late. Shimelmitz et al. date most of these to around 300,000 to 400,000 years ago, with the exception of Gesher Benot Ya‘akov, which goes back to 790,000 years ago. (Curiously, Sandgathe does not discuss the site of Zhoukoudian, near Beijing in China, apart from a brief mention, citing a 2001 paper by Goldberg et al., discussing what he describes as “the problems we often have in even identifying genuine anthropogenic fire residues in early sites.” The paper questions the evidence for a 4-6 meter-thick layer of ash at the site.)

(d) The use and control of fire: where the evidence currently stands

In the conclusion of his paper, Identifying and Describing Pattern and Process in the Evolution of Hominin Use of Fire (Current Anthropology Volume 58, Supplement 16, August 2017), Dennis Sandgathe candidly acknowledges that scientists “do not yet have proper evidence to make big claims about either the earliest fire use or about when fire use became a regular component of technological repertoires and hominin adaptations came to depend on it.” He concludes by putting forward some default assumptions that he suggests scientists should make, regarding the development of fire technology over the course of history:

“• The development of pyrotechnology must be assumed to have been a long, drawn-out process that was probably relatively complex.
• Initial fire use was probably intermittent with frequent fits and starts, and this might have been the situation for a significant part of subsequent prehistory.
• Initial fire use was probably based on the exploitation of natural fire sources (mainly lightning-ignited vegetation where and when available) and perhaps included simple fire maintenance at some times and in some places.
• Before the development and discovery of fire-making technology, it is unlikely that regular (“habitual”?) use of fire appeared among all hominins in all regions or even a single region at the same time — it probably became more regular in certain regions or with certain populations for periods of time.
• The discovery of fire-manufacturing technology probably occurred in multiple places and potentially even multiple times in any one region.
• Fire-manufacturing technology could well have been a relatively late development.
• Such technology may very well have been lost and rediscovered multiple times as well, either through group fissioning events or through local or regional extinction events of hominin populations.
• The evidence might suggest that fire had come to be used repeatedly and successively at a single site over a significant period of time, but this cannot be seen as de facto evidence for the regular, constant use by a population over an entire region, never mind a species.”

Sandgathe (2017) summarizes the current state of the evidence for the first use of fire, and the first regular and repetitive use of fire:

“Assuming all the current claims for very early hominin use of fire in these regions are correct, the data still reflect a very spotty, intermittent record (e.g., Koobi Fora FxJj20 [Hlubik et al. 2017], Chesowanja in Kenya, Gadeb in Ethiopia, and Swartkrans and Wonderwerk Cave in South Africa). Furthermore, the first evidence of regionally based, repetitive or successive fire use is again restricted to subtropical latitudes beginning only between 800 and 400 kya (Gesher Benot Ya‘akov, Tabun, Qesem, and Hayonim, all in Israel, and Cueva Negra in Spain). In latitudes above 35 degrees north, the earliest potential evidence for any fire use is quite late, at ca. 400 kya (e.g., Beeches Pit in the United Kingdom, Bilzingsleben in Germany, and Vértesszőlős in Hungary).”

But who was the first hominin to make fire? In a controversial article titled, Who Started the First Fire? on the Sapiens blog (January 26, 2017), Sandgathe and Dibble (2017) argue that although Neanderthals were able to harness the power of fire, they lacked the technology for starting fires. Since lightning strikes would have been relatively rare in the glacial conditions of Ice Age Europe, opportunities to steal fire from natural fires would have been few and far between – which meant that if they were unable to start a fire by themselves, they would have been forced to endure long periods without the benefits of fire, such as warmth and cooking. According to Sandgathe and Dibble, this is precisely what we find: “The evidence from [the sites of] Pech IV and Roc de Marsal clearly shows that the Neanderthals at these sites lived without fire not only for long periods but also during the coldest periods.” The authors suggest that the Neanderthals were able to survive on raw meat and other uncooked foods, and speculate that they may have been exceptionally hairy, while acknowledging that there is some evidence that they were capable of making clothing (probably simple capes, adorned with necklaces or feathered ornaments). In other words, Neanderthals did not depend on fire: Sandgathe and Dibble think it may have been Homo sapiens, in the Upper Paleolithic (40,000 to 10,000 years ago), who first acquired this dependence.

(Incidentally, it has sometimes been alleged that the Tasmanian Aboriginals lost the ability to manufacture fire because they were a small population that had been isolated from the Australian mainland for several millennia, but as Rebe Taylor convincingly demonstrates in an article titled, The polemics of making fire in Tasmania: the historical evidence revisited (Aboriginal History Journal, vol 32, 2008), it is overwhelmingly probable that they were able to make fire, in the light of the historical testimony of chroniclers and scientific observers from the eighteenth and nineteenth centuries.)

Conclusion

To sum up: the first use of fire probably goes back to over 1,500,000 years ago, in Africa. Humans may have learned to control fire as early as 1 million years ago, but they did not do so habitually until between 800,000 and 400,000 years ago in subtropical latitudes – and even later in Europe. The ability to manufacture fire may have emerged much later – perhaps it was first acquired by Homo sapiens, in the Upper Paleolithic, but we cannot be sure. Moreover, this ability may have been acquired and lost in multiple locations and on multiple occasions. In any case, the ability to control fire does not appear to have emerged suddenly – a fact which weakens the hypothesis that our ancestors literally became human overnight.

————————————————————————————————————

3. Aesthetic Adam


RETURN TO MAIN MENU

A late Acheulean hand-axe from Egypt. Note the English words on the left lower part of this axe. Found on a hill-top plateau, 1400 feet above sea level, 9 miles NNW of the city of Naqada, Egypt. Paleolithic. The Petrie Museum of Egyptian Archaeology, London. With thanks to the Petrie Museum of Egyptian Archaeology, UCL. Image courtesy of Osama Shukir Muhammed Amin FRCP(Glasg) and Wikipedia.

Overview: The ability to create undeniably aesthetic objects first seems to have arisen around 750,000 years ago, in the Acheulean period, in Southern Africa. At first, however, these aesthetic objects were the striking exception, rather than the rule. It was not until 500,000 years ago that the creation of beauty became a common activity. It is therefore difficult to propose an instant in time to which we can date the creation of beauty for beauty’s sake, which is often said to be a defining trait of humans. All the evidence suggests that our sense of beauty emerged gradually.

The evolution of the human aesthetic sense: Recommended Reading
Berleant, R. (2007). “Paleolithic Flints: Is an Aesthetics of Stone Tools Possible?” Contemporary Aesthetics, Volume 5, 2007. (This paper began as a plenary presentation at the conference on the Aesthetics of Stone and Rock, 6th International Conference on Environmental Aesthetics, held from 11-14 June 2007 at Koli, Finland.)
Coolidge, F., Wynn, T. and Kelly, A. (2015). “The Neuroaesthetics of Hand-axes”. Psychology Today, May 12, 2015.
Cope, M. (2013). “On the Master Hand Axe from Kathu Pan”. Blog article, March 29, 2013.
Hodgson, D. (2011). “The First Appearance of Symmetry in the Human Lineage: Where Perception Meets Art”. Symmetry 2011, 3, 37-53; doi:10.3390/sym3010037.

When did the human sense of beauty first emerge?

The third cut-off point for the transition from human-like beings to true human beings is the time when the first undeniably aesthetic artifacts were manufactured, about 750,000 years ago. I’ll bestow the appellation “Aesthetic Adam” on the hominin who first manufactured these artifacts. The first tool which clearly manifests the human capacity to make a thing of beauty is the ‘Master Hand-Axe’, which was found embedded in a stratigraphic sequence in a sinkhole at Kathu Pan in the Northern Cape, South Africa. It was dated to 750,000 years ago because it was found near some tooth-plates of an extinct elephant called Reck’s Elephant. What makes it special is that it is the oldest artifact that was obviously designed for the purpose of exhibiting beauty and symmetry, perfectly oriented, and fashioned with a lot more care than one would need in order to make a functional hand-axe, which could easily have been produced with half as many blows. This hand-axe is thought to have been produced by Homo erectus, known in Africa as Homo ergaster. Writer and jeweller Michael Cope describes the Kathu Pan handaxe in a blog article here.

However, the Master Hand-Axe was the exception rather than the rule, as professors Frederick Coolidge and Thomas Wynn of the University of Colorado and co-author Anne Kelly acknowledge in an article in Psychology Today (May 12, 2015) titled, “The Neuroaesthetics of Hand-axes”:

“As it turns out, beautiful examples such as the Kathu Pan hand-axe above are actually fairly rare. There are literally thousands of hand-axes from the site of Kathu Pan, but almost all are drab affairs that are vaguely symmetrical but not highly finished.”

According to Coolidge and Wynn, the earliest site where the hand-axes are all highly finished is Boxgrove, in West Sussex, England. This site is just 500,000 years old. In a similar vein, archaeologist and researcher Derek Hodgson, in his article, The First Appearance of Symmetry in the Human Lineage: Where Perception Meets Art (Symmetry 2011, 3, 37-53; doi:10.3390/sym3010037), argues that the human preoccupation with symmetry intensified around 500,000 years ago, while acknowledging the occasional occurrence of elegant artifacts from an earlier period, such as the Kathu Pan hand-axe (Figure 1 in the quote below – the copyright photo is from Michael Cope’s blog, which is why I can’t reproduce it here):

A typical Acheulean stone handaxe consists of the classic teardrop shape that displays an obvious concern for symmetry (see Figures 1 and 2) with considerable quantities having been found throughout the world, including Africa and the Middle East [34]. Although first appearing around 1.6 million years ago, such tools endured up until approximately 200,000 BP and, in some instance, even later, and therefore represent the longest known “tradition” [1]. The shape of earlier Acheulean tools seem, however, to be less refined than those from about 500,000 BP (although what are termed “occasionals” appear before this period [35] (see, for example, Figure 1), when not only was there a greater concern for mirror symmetry but also more complex kinds of symmetries began to appear, such as twisted and broken symmetry [4].

At this point, one might ask: does one swallow make a summer? Which date marks the true dawn of aesthetic artifacts: 750,000 years ago, when they were freak occurrences, or 500,000 years ago, when they finally became the norm? Once again, there appears to be no black-and-white dividing line.

There are, of course, many other instances of aesthetic artifacts that were produced during the Paleolithic, and for a more extensive discussion of this subject, I would refer the curious reader to Riva Berleant’s highly illuminating article, Paleolithic Flints: Is an Aesthetics of Stone Tools Possible? in Contemporary Aesthetics (Volume 5, 2007), which traces the development of Paleolithic flints from French archaeological sites dating from about 500,000 years ago to about 11,000 years ago. The article concludes on a modest note:

“We can be certain that the senses employed in the making and using of stone tools were culturally embedded and in turn were cultural shapers. Stone tools carry and signal sensory patterns to their makers and users, patterns that are transmitted through the teaching and learning of tool-making. We can also be sure that the making of stone tools and hence an aesthetics of stone tools was a part of social life. We can be almost sure that the attributes of stone influenced its selection as raw material, and that the senses activated and employed in the making and use of tools were important parts of the sensory range of Paleolithic peoples. We can even make tools ourselves, but the interpretation of the sense experiences in that making is ours and cannot be projected into the past. We can try to reconstruct the sequences of operations, the social processes and relationships, in which aesthetic values were created, but our hypotheses cannot be confirmed. In sum, we can appreciate that an aesthetics of stone tools existed in the prehistoric past, but we cannot be certain in reconstructing it.”

Although we will never know the meaning of the Paleolithic tool aesthetic, we can still speculate about the causes of human beings’ fixation with symmetry. Derek Hodgson, at the conclusion of his above-cited 2011 article, suggests that evolutionary factors (such as the simple fact that biologically significant objects tend to be symmetrical whereas inanimate objects are not) gave rise to an initial preference for symmetry that gradually strengthened over time, in a self-reinforcing process:

Acheulean tools represent the first occasion when symmetry became detached from adaptive perceptual constraints or functional determinants… The procedure whereby symmetry came to transcend functional constraints can thus be summarized as follows: (1) Positive affect deriving from the incidental production of symmetrical handaxes resulting in perceptual fluency that led to, (2) increased synchronization in neural responses that gave rise to, (3) sensory exploitation of symmetry that engendered, (4) a rudimentary aesthetic sense that was, (5) integrated into social signaling. The very beginning of visual culture, which formed the basis for much later “art”, therefore appears to have deep roots, and began with an interest in symmetry that went beyond mere functional considerations as is testified by the detached concern for the shape of Acheulean handaxes.

To sum up: any history of the evolution of the human sense of beauty is necessarily a speculative one. Nevertheless, from what we can tell, it appears that this aesthetic sense first appeared some 750,000 years ago, but does not seem to have become well-established until 500,000 years ago. After that, abstract art emerged, as we’ll see in section 4, while hundreds of thousands of years elapsed until the emergence of what we would properly call symbolic art (to be discussed below, in section 7). All the evidence suggests that the human aesthetic sense emerged gradually.

————————————————————————————————————

4. Geometrical Adam


RETURN TO MAIN MENU

Overview: Hominins (in this case, late Homo erectus) were inscribing simple geometrical patterns (zigzags) onto the shells of freshwater mussels they ate, in Java, Indonesia, perhaps as early as 540,000 years ago. By 350,000 to 400,000 years ago, they were marking parallel and radiating lines on pieces of bone in Germany, and by 290,000 years ago, they were making circular cup marks (cupules) on slabs of rock, in India. By 70,000 years ago, however, Homo sapiens in South Africa was creating much more sophisticated cross-hatched designs and parallel incised lines, suggesting that the human capacity to create abstract patterns evolved gradually.

The evolution of geometrical designs: Recommended Reading

Bednarik, R. (1988). “Comment on D. Mania and U. Mania: Deliberate engravings on bone artefacts of Homo erectus”. Rock Art Research 5,2: 96-100.
Bednarik, R. (1993). “About cupules”. Rock Art Research 10,2: 138-139.
Bednarik, R. (1995). “Concept-mediated marking in the Lower Paleolithic”. Current Anthropology 36,4: 605-634.
Bednarik, R. (2002). “Cupules – the oldest surviving rock art”. Online article.
Cornelius, K. (2020). “When did modern human behavior evolve?” SciShow Psych video, posted by Steve Pomeroy at Real Clear Science, May 5, 2020.
Feliks, J. (2006). “The Graphics of Bilzingsleben: Sophistication and Subtlety in the Mind of Homo erectus”. Paper presented at the XVth UISPP Congress, September 7, 2006.
Harrod, J. (2002). “Later Acheulian Marking Motifs II – Other Sites / a)bhim”. Online article.
Harrod, J. (2004, 2007). “Deciphering Later Acheulian Period Marking Motifs (LAmrk): Impressions of the Later Acheulian Mind”. Version 3, updated 11.25.2007 (original: 2004). Online article.
Harrod, J. (2014). “Palaeoart at Two Million Years Ago? A Review of the Evidence”. Arts 2014, 3, 135-155; doi:10.3390/arts3010135.
Hawks, J. (2014). “The art of Homo erectus”. Blog article, December 3, 2014.
Joordens, J. et al. (2015). “Homo erectus at Trinil on Java used shells for tool production and engraving”. Nature 518, 228–231.
Mania, D. and Mania, U. (1988). “Deliberate engravings on bone artefacts of Homo erectus”. Rock Art Research 5: 91-97.
Texier, P. et al. (2010). “A Howiesons Poort tradition of engraving ostrich eggshell containers dated to 60,000 years ago at Diepkloof Rock Shelter, South Africa”. PNAS 107 (14) 6180-6185; https://doi.org/10.1073/pnas.0913047107.
White, R. (1992). “Beyond Art: Toward an Understanding of the Origins of Material Representation in Europe”. Annual Review of Anthropology 21: 537-564.

The fourth mental breakthrough that could be regarded as a hallmark of humanity relates to a hominin whom I’ll call “Geometrical Adam”: the first hominin to create abstract geometrical designs. As we’ll see, the first species of hominin to achieve this feat appears to have been late Homo erectus, around half a million years ago. Before I discuss the evidence, however, I’d like to make a brief comment on controversial claims that the earliest evidence for human designs actually goes back 2 million years.

On the need for scholarly caution

In an article provocatively titled, Palaeoart at Two Million Years Ago? A Review of the Evidence, James Harrod, of the Center for Research on the Origins of Art and Religion, Maine College of Art, Portland, proposes that “Homo habilis/rudolfensis or a very early Homo erectus had substantially more advanced cognitive, design and symbolic competencies than inferred in current theories,” and that “the earliest palaeoart actually is evident around 2 million years ago.” Harrod bases his contention on the discovery of “nine Oldowan artifacts that have been proposed as possible non-utilitarian and possibly symbolic behavior.” However, Harrod’s claims have yet to win widespread scholarly acceptance, so I will be following a more cautious approach.

Harrod also asserts that “symbolic behavior, including palaeoart, first emerged in human evolution around 1 million years ago.” Elsewhere in the article, he acknowledges a wide diversity of scholarly views on this question:

Hypotheses for the date for the emergence of intentionally-worked hominin “art” or “symbolism” range from the Upper Paleolithic/Later Stone Age, ~50 thousand years ago (ka), to the Middle Paleolithic/Middle Stone Age, e.g., beads, incised ochre at Blombos Cave, South Africa, ~75 ka, and deeper in time to the Middle Acheulian period around 1 million years ago (Ma).

However, as the following quotes reveal, much of the evidence Harrod cites is contestable:

In southwest Asia, Gesher Benot Ya’aqov, ~750–800 ka [42], … there are a few cleavers among hundreds of bifacial tools in which the knapper appears to have preserved and enhanced the basalt vesicles … While there is as yet no other archaeological support for intentionality and none for symbolism, these bifaces appear, at least to me, to have pareidolic face-like features evoked by the knappers unusual preservation of two vesicles for “eyes” and other flaking features adding to the impression of a “nose” or “mouth”…

Apparently non-utilitarian, patterned incisions on bones occur in Kozarnika, Bulgaria, Layer 12, ~1.4–1.6 Ma, with “core-and-flake” non-pebble tools (Guadelli 2004; Sirakov et al. 2010) [46,79], though dating is faunal with no numerical age (Parés 2013) [76]…

… Highly aesthetic later Acheulian bifaces occur at multiple sites in Africa, Europe and Asia. Europe, e.g., cordiform with worked “eye” on vertical axis upper, La Morandiére, Loire-et-Cher, ~520–730 ka (Despriée et al. 2009) [28], and I note that this figure also illustrates an apparently hexagonally flaked core, though this may be simply a non-intentional by-product of ordinary flaking

Left: The Venus of Berekhat Ram, discovered in the Golan Heights (administered by Israel), in 1981, and dated to between 230,000 and 700,000 years ago. Right: The “Venus of Tan-Tan” (replica), an alleged artifact found in Morocco, and dated to between 300,000 and 500,000 years ago. Critics contend that the shape of these pebbles is the result of natural weathering and erosion, which coincidentally produced a remotely human-like object. Others concede that the marks on the pebbles were man-made, but question whether they had any representational intent – or if they did, whether they were intended to represent a woman. Museum of Human Evolution, Burgos, Spain. Image courtesy of Dbachmann and Wikipedia.

For the time being, I think it would be best to refrain from discussing these claims, until a scholarly consensus exists on their worth. I should also point out that Harrod’s treatment of other evidence for early art (e.g. the Venus of Berekhat Ram, dated to between 230,000 and 700,000 years ago, and the Venus of Tan-Tan, dated to between 300,000 and 500,000 years ago; pictured above), is not as balanced as it could be: many scholars still consider these alleged artifacts to be the products of natural weathering and erosion, while others argue that they were deliberately made by humans but may not have been intended to represent anything – and even if they were, that it is problematic to identify these pebbles as representations of a woman’s body. (For a brief summary of contemporary scholarly views, see here.)

The world’s first zigzags: the evidence from Trinil, Java (430,000-540,000 BP)

Zigzags carved by Homo erectus on a shell at Trinil, Java, approximately 500,000 years ago. Photographer: Henk Caspers/Naturalis Biodiversity Center. Image courtesy of Wikipedia.

Scientists now know that between 540,000 and 430,000 years ago, Homo erectus created the first known abstract geometrical patterns: it carved zigzags on the shells of freshwater mussels, at an archaeological site in Trinil, Java. These shells were also used as cutting tools. The mussel shells were opened by boring into them at the point where the adductor muscle joins the ligament, causing them to spring open. The mussels inside were then eaten as raw seafood. The discovery, which was made by Josephine Joordens of Leiden University in the Netherlands and a team of scientists, was reported in a letter to the journal Nature titled, “Homo erectus at Trinil on Java used shells for tool production and engraving” (Nature volume 518, pages 228–231 (2015)). The authors comment: “Although it is at present not possible to assess the function or meaning of the engraved shell, this discovery suggests that engraving abstract patterns was in the realm of Asian Homo erectus cognition and neuromotor control.”

Parallel and radiating lines from Bilzingsleben, Germany (350,000-400,000 BP)

The Bilzingsleben bone fragment (Germany), an elephant tibia, has some groups of incised parallel lines and might represent an early example of art. Image courtesy of José-Manuel Benito and Wikipedia.

The next piece of evidence we possess comes from the site of Bilzingsleben, in Germany, which is famous for its abundant Palaeolithic human fossils and artifacts. One bone fragment found at the site, an elephant tibia, has two groups of 7 and 14 incised parallel lines. Lines appearing to radiate from a single point are also visible (see here). It has been suggested that these markings might represent an early example of art. Certainly, the regular spacing of the incisions, their approximately equal lengths and their V-like cross-sections all point to their having been created at the same time, with a single stone tool (see Mania, D. and Mania, U., 1988, “Deliberate engravings on bone artefacts of Homo erectus,” Rock Art Research 5, 91-97). Indeed, it has even been proposed that the tibia, which dates to between 350,000 and 400,000 years ago, may have been used as an early calendar.

However, anthropologist John Hawks takes a less sanguine view, in a blog article titled, The art of Homo erectus. In his article, Hawks lauds the pioneering work of Robert Bednarik, who examined prehistoric artifacts with engravings in his 1995 journal article, “Concept-mediated marking in the Lower Paleolithic” (Current Anthropology 36,4:605-634). Commenting on the discoveries at Bilzingsleben, Bednarik writes:

As I have noted before (e.g., Bednarik 1994a), all of the markings of the Lower and Middle Palaeolithic resemble modern doodling, which is spontaneous and subconscious. Contemporary doodling, the scientific value of which remains almost entirely ignored, could well have its neuropsychological roots in our early cognitive history.

Hawks agrees with Bednarik’s assessment that the engravings at Bilzingsleben are intentional, but doubts whether they are particularly meaningful. Hawks is actually more impressed with the shell marks from Trinil, which he thinks required “deliberate precision”:

It is precisely these kinds of regularities that enable us to recognize the objects as “concept-mediated”, that is, guided by an intentional mind in some way. Even though they are not representational – at least not in an iconic, pictorial sense – the markings can be recognized as intentional. Most of these objects are much more ambiguous than the shell from Trinil. The shell marks required the maker to match the beginning and ending points of lines with each other, requiring deliberate precision.

I do not think we can dismiss doodling as meaningless. But neither do I think we can promote it as highly meaningful.

On a more skeptical note, Randall White, of the Department of Anthropology at New York University, roundly dismisses Mania and Mania’s claim that the Bilzingsleben bone markings are intentional and possibly symbolic, in an article titled, Beyond Art: Toward an Understanding of the Origins of Material Representation in Europe (Annual Review of Anthropology 21:537-564), where he declares: “I doubt this interpretation and find no greater patterning in these marks than on the wooden cutting board in my kitchen.”

It is worth noting, however, that in his earlier, 1988 article, “Comment on D. Mania and U. Mania: Deliberate engravings on bone artefacts of Homo erectus” (Rock Art Research 5,2: 96-100), Robert Bednarik highlighted the geometrical regularities of the engravings at Bilzingsleben:

“I see them as unequivocal responses to physical aspects of the artefacts. Psychologically they are responses to the shape of surfaces, perhaps to their edges… The configuration of the convergent lines on Artefact 3 reflects the outline of the implement and clearly focuses on its upper end. The trapezial form of the longitudinal surface on [Bilzingsleben] Artefact 1 is mirrored in the perfectly balanced arrangement of the markings. The seven lines near the pointed end of the object are about parallel to the trapezium’s oblique side, and the lines near the centre of the decorated facet are roughly perpendicular to its longitudinal edges.

Controversially, inter-disciplinary scholar John Feliks goes even further: he argues that the markings at Bilzingsleben reflect “graphic skills far more advanced than those of the average modern Homo sapiens,” including “abstract and numeric thinking; rhythmic thinking; ability to duplicate not only complex, but also, subtle motifs; iconic and abstract representation; exactly duplicated subtle angles; exactly duplicated measured lines; innovative artistic variation of motifs including compound construction, doubling, diminution, and augmentation; understanding of radial and fractal symmetries; impeccably referenced multiple adjacent angles; and absolute graphic precision by high standard and, practically, without error.” However, Feliks’s views find little support within the scholarly community, and appear to rest on the observation that the sets of markings on two bones (Artifacts 1 and 3) are roughly mirror images of one another. Feliks comments: “Duplicated motifs are the hallmark of language.” For my part, I find Feliks’s claim of mirror symmetry to be massively overstated, but I would invite readers to check out the evidence for themselves.

Cupules from Auditorium Cave, India (more than 290,000 years ago)

Entrance to the Auditorium Cave, India. Image courtesy of Sanmarga Mitra and Wikipedia.

The next piece of evidence we have comes from Auditorium Cave, India, and dates to the late Acheulean (more than 290,000 years ago). The cave forms part of the Bhimbetka rock shelters, which are located in the Raisen District in the Indian state of Madhya Pradesh about 45 kilometres (28 mi) southeast of Bhopal. Bednarik describes the rock markings as follows, in his 1993 paper, “About cupules” (Rock Art Research 10,2:138-139):

“There is a large, circular cup mark on a massive floor boulder, and a pecked meandering line that approaches the cupule, then runs parallel to its edge for a distance before it veers off again and peters out.”

Further details about the markings (with a featured illustration by Bednarik) can be found here. In a 2002 article titled, “Cupules — the oldest surviving rock art”, Bednarik cautions: “Cupules (Fig. 1) are the earliest surviving rock art we know about in the world, but this does not necessarily make them the first rock art produced.” The point he is making here is that there may well have been other, perishable forms of art that were invented long before cupules. More specifically, “if the oldest art being found in a region happens to be of a type that is most likely to survive the longest, then there is only a very slim chance that it is indeed the oldest art historically made in that region. It is simply the type of art that had the best prospect of surviving.” Bednarik adds that cupules have also been found at the site of Daraki-Chattan, in central India [these have been recently dated to between 200,000 and 500,000 years ago], and that cupules were also made by Neanderthal man in Europe, between 70,000 and 40,000 years ago. Whether all of these rock markings deserve to be called “art” is more debatable, however – especially as Bednarik himself concedes that their purpose is unknown. He continues, “It is also very doubtful that all cupules were made for similar purposes, and it is even possible that some of those found on horizontal surfaces were used for some utilitarian process.” For my part, I think that the cupules from Daraki-Chattan (Figure 4 in Bednarik’s article) and the Neanderthal cupules from La Ferrassie, France (Figure 5), look artistic. What is indisputable, however, is that these are designs that were deliberately and repeatedly made by early humans.

The cross-hatched design from Blombos Cave, South Africa (70,000-100,000 BP)

Rock art by modern Homo sapiens from Blombos Cave, South Africa. Date: 70,000 to 100,000 years BP. Image courtesy of Wikipedia.

It is interesting to compare the zig-zags at Trinil and the doodlings at Bilzingsleben with the cross-hatched design and parallel incised lines found on two pieces of ochre at the Blombos Cave in South Africa, dating to between 70,000 and 100,000 years ago, which makes them the work of Homo sapiens. Here, the design is unmistakably an abstract geometrical pattern: it’s considerably more sophisticated than the zig-zags from Trinil. Some authorities refer to this design as the world’s earliest instance of abstract art.

In a 2020 study, scientists used the Blombos Cave ochres to investigate how symbolic behavior may have evolved. The researchers took pieces of ochre and ostrich eggshell that our ancestors had engraved with lines and cross-hatched (“hashtag”) patterns between 110,000 and 52,000 years ago, showed these ancient engravings to modern humans, and asked them to memorize and reproduce them. They found that the more recent the engravings were, the more attention participants paid to them, the better they memorized them, and the more accurately they drew them. The scientists concluded that as the engravings evolved, they became more effective at provoking a cognitive response in the viewer.

Finally, Pierre-Jean Texier et al., in their paper, A Howiesons Poort tradition of engraving ostrich eggshell containers dated to 60,000 years ago at Diepkloof Rock Shelter, South Africa (PNAS 107 (14) 6180-6185; https://doi.org/10.1073/pnas.0913047107), discuss the evidence for engravings on ostrich shells 60,000 years ago, at a rock shelter in South Africa:

“Engraved abstract patterns are widely accepted as evidence for the presence of symbolic thought (1 –11). The number of EOES [engraved ostrich eggshells – VJT] at Diepkloof is exceptional (n = 270) and has no equivalent in the current archaeological record…

“This unique collection demonstrates not merely the engraving of a single geometric pattern but the development of a graphic tradition (24) and the complex use of symbols to mediate social interactions. The large number of marked pieces shows that there were rules for composing designs but room within the rules to allow for individual and/or group preferences. In effect there were a number, albeit a limited number, of alternative patterns that could be transferred to ostrich eggs, transforming them from ordinary items into specifically and uniquely marked ostrich eggshells.”

Some of the patterns may be seen here and here.

Summing up: the ability to create these geometrical patterns seems to have gradually emerged over the course of time. The earliest geometrical patterns appeared over 500,000 years ago, but one cannot really describe them as “abstract art” until 70,000 years ago – some 430,000 years later. This gradual evolution in geometrical designs constitutes evidence against the hypothesis that our uniquely human mental capacities appeared overnight.

————————————————————————————————————

5. Spear-maker Adam

RETURN TO MAIN MENU

Overview: We do not know when hominins invented the first wooden spears. They are known to have been used over 300,000 years ago at the site of Schöningen, Germany, and there is good evidence that they were used half a million years ago at the site of Boxgrove, England. Some anthropologists even believe that hominins began making wooden spears as early as five million years ago, since chimps seem to be capable of fashioning something similar.

The next step forward was the invention of stone-tipped spears, in which a stone tip had to be hafted onto a wooden shaft and kept in place with twine. Some researchers believe that as early as 500,000 years ago, humans living at the site of Kathu Pan in South Africa, who probably belonged to Heidelberg man, attached stone tips to wooden spears, which they used for big-game hunting – an activity that required co-operative, strategic planning. Other archaeologists are skeptical about the evidence from Kathu Pan, but what is not in dispute is that by 300,000 years ago, the use of stone-tipped spears was widespread. The task of making these stone-tipped spears would have required the ability to multi-task while focusing on one’s goal. Since different kinds of resources had to be collected, prepared and combined, the task would have placed demands on working memory as well.

The earliest stone-tipped spears were probably thrust into the sides of the animals they were used to kill, exposing the hunters who wielded them to the risk of being gored to death. It is not until 280,000 years ago that we find the first evidence of stone-tipped spears being used as projectiles, in Ethiopia. Hurling spears from a distance would have reduced the risk to the hominins who used them. As far as we can tell, Neanderthals never mastered this technology with stone-tipped spears, although many authorities believe they hurled wooden spears at their prey.

Finally, around 70,000 years ago, we observe the first use of compound adhesives to attach stone tips to wooden shafts, in order to keep them securely in place. At around the same time, the first bows and arrows were invented, probably in Africa.

The fifth dividing line between man and beast occurs 500,000 years ago, with the fashioning of stone-tipped spears for big-game hunting, which required co-operative, strategic planning – surely a hallmark trait of human beings. The spears probably looked like this. I’ll call the hominin who made this technological leap “Spear-maker Adam.”

The evolution of spears: Recommended Reading
Barras, C. (2012). First stone-tipped spear thrown earlier than thought. New Scientist, November 15, 2012.
BBC news. (2007). Chimpanzees ‘hunt using spears’. February 22, 2007.
Chazan, M. (2019). The Reality of Artifacts: An Archaeological Perspective. Routledge.
Downes, N. (2019). Neanderthal hunting spears could kill at a distance. UCL News, January 25, 2019.
Jha, A. (2012). Stone me! Spears show early human species was sharper than we thought. The Guardian, November 15, 2012.
Roach, J. (2007). Chimps Use “Spears” to Hunt Mammals, Study Says. National Geographic News, February 27, 2007.
Rots, V. and Plisson, H. (2014). Projectiles and the abuse of the use-wear method in a search for impact. Journal of Archaeological Science, DOI: 10.1016/j.jas.2013.10.027.
Ruta, G., Rots, V. and Peresani, M. (2018). Looking for a standard method: impact fractures analysis of the lithic materials from Riparo Villabruna (Belluno – Italy). Conference paper, DOI: 10.15160/1824-2707/13/0.
Sahle, Y. et al. (2013). “Earliest Stone-Tipped Projectiles from the Ethiopian Rift Date to >279,000 Years Ago”. PLoS ONE 8(11): e78092.
Than, K. (2012). Stone Spear Tips Surprisingly Old—”Like Finding iPods in Ancient Rome”. National Geographic News, November 16, 2012.
Wadley, L. et al. (2009). Implications for complex cognition from the hafting of tools with compound adhesives in the Middle Stone Age, South Africa. PNAS, 106 (24) 9590-9594; https://doi.org/10.1073/pnas.0900957106.
Weiss, R. (2007). For First Time, Chimps Seen Making Weapons for Hunting. Washington Post, February 23, 2007.
Wilkins, J. and Schoville, B. (2012). “Evidence for Early Hafted Hunting Technology”. Science 338:942, DOI: 10.1126/science.1227608.
Wilkins, J. et al. (2014). “An Experimental Investigation of the Functional Hypothesis and Evolutionary Advantage of Stone-Tipped Spears”. PLoS One 9(8): e104514.
Wilkins, J. et al. (2015). “Kathu Pan 1 points and the assemblage-scale, probabilistic approach”. Journal of Archaeological Science 54, 294-299.

(a) The first step: wooden spears

A fire-hardened wooden spear found in situ at Schöningen, Germany, dating to 300,000 years BP. Image courtesy of P. Pfarr NLD (Niedersächsisches Landesamt für Denkmalpflege) and Wikipedia.

Many readers will be surprised to learn that chimpanzees are now known to use wooden “spears” (but not stone-tipped spears) to hunt other primates. Chimps usually break off a living tree branch to make their tools. After that, they trim the side branches and leaves. Occasionally, they also trim the ends of the branch and strip it of bark. Finally, some chimps even sharpen the tip of the tool with their teeth. Unfortunately, the researchers who made this discovery were unable to provide photographic evidence that these spears were actually used to kill other primates, rather than merely to probe for them. They did, however, observe the spears being jabbed into tree hollows, where small primates called lesser bush babies often sleep. The spears, which average 60 centimeters long and 11 millimeters in circumference, are too fragile to be used for throwing, but strong enough for jabbing, although whether they could actually penetrate an animal’s hide remains uncertain. The discovery that chimps are capable of fashioning these implements has led Craig Stanford, a primatologist and professor of anthropology at the University of Southern California, to propose that early hominins may have used wooden spears as early as five million years ago. Wood, however, does not preserve well in the fossil record, so Stanford’s proposal remains speculative. (Stanford prefers to describe the jabbing implements used by chimpanzees as “bludgeons” rather than spears, as he believes the word “spear” exaggerates their resemblance to the weapons made by early man.)

At any rate, it is commonly acknowledged that wooden spears were being used by Heidelberg man in Europe as early as 500,000 years ago. At the site of Boxgrove, in West Sussex, England, archaeologists uncovered a horse’s shoulder blade with a semi-circular hole in it, which pathologist Bernard Knight found to be consistent with a blow from a thrown spear.

Additionally, two-meter-long wooden spears dating to over 300,000 years ago (and fashioned either by Heidelberg man or by early Neanderthals) have been found in Europe, at the site of Schöningen, Germany. Although these spears lacked stone points, recent experiments have shown that they could have been thrown fairly accurately over a distance of around 20 meters. It has been suggested that these spears were used in ambush attacks on horses. The findings add to the evidence that Neanderthal man was capable of fashioning projectile weapons.

Not everyone is convinced, however. In chapter two of his book, The Reality of Artifacts: An Archaeological Perspective (Routledge, 2019), Michael Chazan lists several reasons for doubting that the wooden spears unearthed at Schöningen, Germany, were used for throwing: they are much heavier than modern projectile spears, they lack symmetry along the long axis, and most of them are not straight but show pronounced bowing, which would have made them difficult to throw accurately.

(b) The advent of stone-tipped spears, and what it tells us about their makers

In 2012, Jayne Wilkins of the University of Toronto, Canada, and her colleague, Benjamin Schoville, published an article titled “Evidence for Early Hafted Hunting Technology” (Science 338, 942 (2012), DOI: 10.1126/science.1227608), announcing their discovery of a collection of stone points, ranging from 4 to 9 centimeters long, which they believe were attached (or hafted) to the ends of wooden shafts, using resin. The stone points are the right shape and size for use in spears, and some even have fractured tips, indicating they were used as weapons. The points also show signs of having been re-sharpened to maintain their symmetry, which indicates that they were used multiple times. The spear points were discovered in 500,000-year-old deposits at Kathu Pan in South Africa – the same place where the Master Hand-Axe (fashioned by “Aesthetic Adam”) was found. Wilkins et al. discussed the significance of this new technique in an August 2014 paper in PLoS One (9(8): e104514), titled, “An Experimental Investigation of the Functional Hypothesis and Evolutionary Advantage of Stone-Tipped Spears.” In the authors’ words:

Hafting a stone tip to a wooden shaft was a significant innovation for Middle Pleistocene hominins and may represent the origin of new cognitive and social capacities within the human lineage. Part of human cognition is the ability to hold in attention multiple tasks and conduct goal-oriented behavior. The concept of ‘working memory’ has been used to highlight this capacity… The manufacture of hafted technologies is one type of behavior that requires working memory because it requires the collection, preparation, and combination of different kinds of resources – wood, stone, and binding material.”

Wilkins et al. attribute this cognitive breakthrough to Heidelberg man, as the stone points found at Kathu Pan predate the advent of Homo sapiens.

Paleoanthropologist John Shea, who was not part of the study, remarked in astonishment that discovering spear points at a Heidelberg man site was “like finding an iPod in a Roman Empire site.” He added that there’s “no question” that hafting a stone point to a wooden shaft would have involved speech. “It would probably not be something that could be taught by imitation. This is a technology that is so complex that it absolutely, positively requires language.” In an interview with Ker Than of National Geographic News (November 16, 2012), lead study author Jayne Wilkins agreed that the hafting process would have required forethought. “You have to plan days in advance before actually being able to use your weapons to hunt,” she said. “And you’d want to teach your comrades to do the same, presumably by talking.” The article acknowledged, however, that the sediments around the stone points were dated using optically stimulated luminescence, a relatively new technique that can yield differing dates, although it added that a fossil zebra found next to the points was dated to 500,000 years ago using another technique: electron spin resonance dating.

But are they really spear points? A note of caution

I should point out that some scholars have questioned the validity of Wilkins’ work, which is based on the observation that certain types of breaks recur in projectile use and rarely result from other processes (such as knapping), and can therefore be considered diagnostic of impact. In a strongly worded critique of Wilkins’ 2012 paper, Rots and Plisson (2014) argue that the use-wear analysis employed by Wilkins and her colleagues relies on analogical reasoning, since it involves making inferences about how stone tools were used, based on similarities they share with tools in an experimental reference collection. Or as Wilkins put it in an interview with Alok Jha of The Guardian (November 15, 2012), “We know from experimental studies that, when a point is used as a spear tip, the concentration of damage is greater at the tip of the point than along the edges… That’s the same pattern we saw in the Kathu Pan point.” Rots and Plisson found this kind of reasoning highly questionable, arguing that only microscopic studies could properly determine whether a tool was actually used as a projectile or not. (Unfortunately, most of the tools found at Kathu Pan are made of coarse-grained, patinated material, making detailed microscopic analysis impossible.) In a follow-up study in 2018, Rots and colleagues added that they “currently do not know of any reliable method for distinguishing tip fractures caused by shooting into bodies from accidental contact with inorganic targets during shooting or from accidental damage.”

In a 2015 article titled, “Kathu Pan 1 points and the assemblage-scale, probabilistic approach” (Journal of Archaeological Science 54 (2015), 294-299), Wilkins et al. vigorously defended the validity of their method, arguing that their data constitute “reliable evidence for early hafted hunting technology.” The authors explained that their methodology involves: (i) an examination of assemblages of stone tools, rather than individual tools; (ii) quantitative analysis of the way in which macroscopic wear features are distributed across a tool; and (iii) probabilistic, statistical comparisons of the wear-feature characteristics of experimental and archaeological stone tools. The authors argued that, taken as an ensemble, there was less than a 5% chance that the wear found on the tools unearthed at Kathu Pan could be accounted for by processes other than projectile impact.
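
To make the logic of that assemblage-scale approach concrete, here is a minimal sketch in Python, using invented numbers rather than Wilkins et al.’s actual data or methods. It simply asks how likely it would be to see a given number of impact-diagnostic fractures across a whole assemblage if each point only carried the low “background” rate of such damage seen in a non-projectile (e.g. knapping or trampling) reference collection:

```python
from math import comb

def binomial_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance of seeing at least k
    impact-diagnostic fractures among n points if each point independently
    shows such damage with probability p."""
    return sum(comb(n, i) * (p ** i) * ((1 - p) ** (n - i)) for i in range(k, n + 1))

# Hypothetical figures, for illustration only (not the published Kathu Pan data):
n_points = 210       # points examined in the archaeological assemblage
k_diagnostic = 32    # points with fracture types treated as diagnostic of impact
p_background = 0.05  # rate of such fractures in a non-projectile reference collection

p_value = binomial_tail(n_points, k_diagnostic, p_background)
print(f"Chance of >= {k_diagnostic} diagnostic fractures from background damage alone: {p_value:.2e}")
```

The point of reasoning at the assemblage scale is precisely this: no single broken tip proves anything, but a damage pattern repeated across hundreds of points can be assigned a probability under the rival, non-projectile explanation.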

At any rate, what is not in dispute is that by around 300,000 years ago, stone-tipped spears were being used all over Africa, Europe and western Asia. The evidence for these spears comes from triangular stone tips found at archaeological sites. According to Wilkins, stone-tipped spears manufactured in Europe and Asia at this time would have been made by Neanderthals, while those found in Africa would have been made either by Homo sapiens or its near ancestors. In their 2014 paper, Rots and Plisson (who adopt a more conservative chronology than that proposed by Wilkins) suggested that stone-tipped spears first appeared in southern Africa during the transition from the Early Stone Age (ESA) to the Middle Stone Age (MSA), around 250,000 years ago.

(c) The next step: thrown (projectile) spears

Despite Wilkins’ claim that Heidelberg man in South Africa was hunting with stone-tipped spears as early as 500,000 years ago, there is no evidence to suggest that he ever used them as projectiles. Indeed, the evidence suggests otherwise, for reasons summarized by Michael Chazan in his book, The Reality of Artifacts: An Archaeological Perspective (Routledge, 2019):

None of the early spear points seem particularly well designed for throwing. The stone points at Kathu Pan 1, for example, are heavy, and no effort has been made to create an aerodynamic design. Moreover, none of these early spears show evidence of barbs that would keep it embedded in the animal. Barbs are particularly important for tools launched from a distance as barbs not only make for a more effective weapon, but also raise the potential of recovering the weapon once the animal has been dispatched.

However, in 2013, Yonatan Sahle et al. reported the discovery of stone-tipped spears that were used as projectiles in Ethiopia around 280,000 years ago, in an article titled, “Earliest Stone-Tipped Projectiles from the Ethiopian Rift Date to >279,000 Years Ago” (PLoS ONE 8(11): e78092). In the Introduction, the authors explain the significance of projectile weapons for hunting, and the advantage they provided over hand-held weapons:

“A key component in prehistoric subsistence strategies, the invention of projectile weapons was a decisive advance over the thrusting spear [1]–[3]. The ability to wound or kill dangerous animals or enemies at a distance is considered one of the most significant adaptive advantages for Paleolithic hunters, reducing the likelihood of injury and increasing prey breadth [1]–[3]. In the Late Pleistocene, complex projectiles such as the bow-and-arrow probably contributed to the technological advantage enabling Homo sapiens to expand out of Africa and outcompete Neanderthals [3].

“At Kathu Pan, in South Africa, Middle Pleistocene hominins made hafted stone-tipped hunting spears ∼500 thousand years ago (ka); these were, however, not projectiles but hand-delivered thrusting weapons [4]. In addition, the stratigraphic placement of the studied artefacts from Kathu Pan relative to the dated layers remains as yet controversial. Pointed wooden spears from Schöningen, Germany, dating to ∼400 ka were likely used in hunting large game [5]. These were initially described as ranged weapons, but it has not been possible to definitively identify their mode of delivery [1], [2].”

(d) The final step: the use of compound adhesives to attach stone points to wooden shafts

Initially, rope or twine was probably used to attach stone tips to wooden shafts, to make spears. The next step was the use of adhesives, and later compound adhesives, to make the attachment more secure. This advance required sophisticated chemistry and involved a complicated series of steps. As far as scientists can tell, the invention of compound adhesives took place around 70,000 years ago. Lyn Wadley et al. explain the significance of this breakthrough in their paper, Implications for complex cognition from the hafting of tools with compound adhesives in the Middle Stone Age, South Africa (Proceedings of the National Academy of Sciences, June 16, 2009, 106 (24) 9590-9594; https://doi.org/10.1073/pnas.0900957106):

The use of simple (1-component) adhesives is ancient; for example, birch-bark tar was found on 2 flakes from ≈200,000 years (200 ka) ago at a site in Italy (3). At ≈40 ka, bitumen was found on stone tools in Syria (4), and a similarly aged site in Kenya yielded tools with red ochre stains that imply the use of multicomponent glue (5). Traces of even earlier (≈70 ka) compound adhesives occur, together with microfractures consistent with hafting, on Middle Stone Age (MSA) stone tools from Sibudu Cave, South Africa (see SI Text and Table S1). Several recipes are evident: sometimes plant gum and red ochre (natural iron oxide–hematite–Fe2O3) traces (Fig. 1) occur on tool portions that were once inserted in hafts (6⇓⇓⇓–10). Other tools have brown plant gums and black or white fat, but no ochre (Fig. 1 and SI Text)….

Artisans living in the MSA [Middle Stone Age] must have been able to think in abstract terms about properties of plant gums and natural iron products, even though they lacked empirical means for gauging them. Qualities of gum, such as wet, sticky, and viscous, were mentally abstracted, and these meanings counterpoised against ochre properties, such as dry, loose, and dehydrating. Simultaneously, the artisan had to think about the correct position for placing stone inserts on the hafts. Successful mental rotation requires advanced working memory capacity (36) and, in turn, complex cognition. Capacity for multilevel operations, abstract thought, and mental rotation are all required for the process of compound adhesive manufacture.”

The authors add that the invention of the bow and arrow marks the next major advance in hunting, somewhere between 100,000 and 50,000 years ago. The earliest stone arrowheads known to science come from Sibudu Cave, in South Africa, and date back to 64,000 years ago, while the earliest bone arrow point is 61,000 years old.

The invention of spears: a sudden or a gradual event?

So, did spears appear suddenly or evolve gradually? It appears that wooden spears go back at least half a million years, and possibly earlier. (These spears may well have been used as projectiles.) Stone-tipped spears appear to have been invented in Africa around 500,000 years ago, although this date is not certain. The hafting process would have required extensive planning and would almost certainly have involved the use of speech. By around 300,000 years ago, stone-tipped spears were being used all over Africa, Europe and western Asia. These spears were made by both Neanderthals and Homo sapiens. Stone-tipped spears that were used as projectiles were invented in Africa around 280,000 years ago, dramatically enhancing the effectiveness of hunting. Adhesives were first used to attach stone tips to wooden shafts around 200,000 years ago, but it was not until 70,000 years ago that compound adhesives, with their complex chemistry, were invented. The manufacture of these adhesives required a sophisticated capacity for multilevel operations, abstract thought, and mental rotation. The overall picture that emerges is that the technology used to make spears evolved over a period of tens, if not hundreds, of thousands of years. The case for an overnight appearance of Spear-maker Adam thus appears untenable.

————————————————————————————————————

6. Craftsman Adam


RETURN TO MAIN MENU

Overview: Somewhere between about 500,000 years ago and 320,000 years ago, a technological breakthrough occurred in Africa. Humans began fashioning a wide variety of tools, which were smaller and more carefully shaped. In addition, they appear to have used sophisticated adhesives (made from red ochre and plant gum) to attach stone points to spear shafts. As there is a gap of roughly 200,000 years in the African archaeological record, we cannot say exactly how rapidly this breakthrough occurred. However, the fact that even in the late Acheulean, between 615,000 and 500,000 years ago, we occasionally find smaller tools, more sophisticated designs, and materials transported from many kilometers away suggests that “Craftsman Adam” did not arrive on the scene overnight.

The Levallois technique of flint-knapping. Image courtesy of José-Manuel Benito Álvarez and Wikipedia.

The evolution of highly sophisticated craftsmanship: Recommended Reading
AAAS and sciencemag.org. “Signs of symbolic behavior emerged at the dawn of our species in Africa” (video). Science, doi:10.1126/science.aat5893.
Chatterjee, R. (2018). Scientists Are Amazed By Stone Age Tools They Dug Up In Kenya. NPR, Goats and Soda series, March 15, 2018.
Hirst, K. Kris. (2019, May 30). “The Evolution of Stone Tools”. Retrieved from https://www.thoughtco.com/the-evolution-of-stone-tools-171699.
Hirst, K. Kris. (2020, February 11). “Levallois Technique – Middle Paleolithic Stone Tool Working”. Retrieved from https://www.thoughtco.com/levallois-technique-stone-tool-working-171528.
Yong, E. (2018). “A Cultural Leap at the Dawn of Humanity”. The Atlantic, March 25, 2018.

The advent of highly sophisticated craftsmanship at Olorgesailie, Kenya

The sixth mental milestone could be described as the advent of highly sophisticated craftsmanship, so I’ll give the name “Craftsman Adam” to the first hominin exhibiting this behavior. It roughly coincides with the appearance of Homo sapiens in Africa, some 320,000 years ago. Excavations at the site of Olorgesailie, in Kenya, led by Rick Potts, director of the human origins program at the Smithsonian Institution in Washington D.C., have revealed that by that time, humans had switched from making heavy, clunky Acheulean hand-axes, which changed very little over hundreds of thousands of years, to fashioning a dazzling variety of tools, such as blades, scrapers and spear heads, which were designed not merely for scavenging animal carcasses but for hunting mammals such as hares, rabbits and springbok, and even a couple of species of birds and fish. (By this time, larger mammals had disappeared from the scene in eastern Africa, following a period of massive climate change and dramatic swings between dry and wet seasons, beginning some 360,000 years ago.) The diversity of the tools found at Olorgesailie, together with their smaller size, intricate detail and high degree of refinement, is indicative of advanced thinking and planning. At the same time, humans set up trading networks that allowed rocks and minerals, including obsidian for advanced tools and ochre for pigment, to be imported from sites dozens of miles away. The technical advances that occurred are described in the following YouTube video, courtesy of www.sciencemag.org and AAAS, titled, “Signs of symbolic behavior emerged at the dawn of our species in Africa” (Science, March 15, 2018, doi:10.1126/science.aat5893):

The video identifies three key features of human behavior uncovered at the Middle Stone Age site of Olorgesailie, Kenya, dating back to 320,000 years ago: the use of sophisticated tools, social networks and pigments. At the site, archaeologists found small, carefully shaped tools made of obsidian, which is valued for its sharpness. The tools were made by striking small blades and points off an asymmetrical core into a desired shape and thickness, which suggests complex planning. What’s more, the opposite ends of the points were sometimes modified, which allowed them to be attached to shafts. This projectile technology was an important step in the development of hunting, because the spears could be flung from a distance, reducing the danger to the hunter. Some of the obsidian used was transported from as far as 90 kilometers away, pointing to the existence of elaborate trading networks. The archaeologists also found rocks used for coloring – rocks streaked with black manganese on one side and processed red ochre on the other. It has been suggested that these pigments may have been used for bodily ornamentation and also to identify people, enhancing trustworthiness. Taken together, the findings suggest that some aspects of modern human behavior emerged early in Africa and then developed gradually over time. (I discuss this further in Section 7 below.)

In an NPR report titled, Scientists Are Amazed By Stone Age Tools They Dug Up In Kenya (March 15, 2018), Rhitu Chatterjee interviewed paleoanthropologist Marta Mirazon Lahr about the significance of the findings. Lahr cautioned against the naive assumption that the technology used at Olorgesailie emerged suddenly:

The new studies also show that by 320,000 years ago this technology was well established in the region, suggesting that human ancestors likely started developing it even earlier, she says.

“The technology they have is not a crude, early version of the Middle Stone Age. It is the full-blown Middle Stone Age,” Lahr says.

At any rate, what we now know is that as early as the dawn of Homo sapiens 320,000 years ago, humans possessed a high degree of mental sophistication. I should also mention that Neanderthal man was capable of making some fairly sophisticated weapons as well, including stone-tipped spears (see also here). Neanderthal man also hunted birds (either by climbing high cliffs to their nests or by using snares) and ate seals and dolphins. It was once believed that only Homo sapiens had the nous to hunt these creatures. It also appears that Neanderthals developed the ability to make highly sophisticated tools (known as Mode III tools) at around the same time as Homo sapiens, if not earlier. Archaeologist K. Kris Hirst summarizes the current state of the evidence in her online article, Levallois Technique – Middle Paleolithic Stone Tool Working (ThoughtCo., Feb. 11, 2020, thoughtco.com/levallois-technique-stone-tool-working-171528):

Levallois, or more precisely the Levallois prepared-core technique, is the name archaeologists have given to a distinctive style of flint knapping, which makes up part of the Middle Paleolithic Acheulean and Mousterian artifact assemblages. In his 1969 Paleolithic stone tool taxonomy (still widely used today), Grahame Clark defined Levallois as “Mode 3”, flake tools struck from prepared cores. Levallois technology is thought to have been an outgrowth of the Acheulean handaxe. The technique was reckoned a leap forward in stone technology and behavioral modernity: the production method is in stages and requires forethought and planning...

The Levallois technique was traditionally thought to have been invented by archaic humans in Africa beginning about 300,000 years ago, and then moved into Europe and perfected during the Mousterian of 100,000 years ago. However, there are numerous sites in Europe and Asia which contain Levallois or proto-Levallois artifacts dated between Marine Isotope Stage (MIS) 8 and 9 (~330,000-300,000 years bp), and a handful as early as MIS 11 or 12 (~400,000-430,000 bp): although most are controversial or not well-dated.

In another article, titled, “The Evolution of Stone Tools” (ThoughtCo., May 30, 2019), Hirst defines the term “Mode III tools,” which is widely used by archaeologists:

Mode 3: Flake tools struck from prepared cores, with an overlapping sequence of flake removal (sometimes referred to as façonnage) system – including the Levallois technology, Middle Paleolithic, Levallois, Mousterian, arose during the Late Acheulean at the onset of the Middle Stone Age/Middle Paleolithic, about 300,000 years ago.

What we can say is that at least 330,000 years ago, the Neanderthals were capable of creating highly advanced Mode III tools. This is about the same time that such tools became widely prevalent in Africa, although, as we’ll see below, they first appeared there over 500,000 years ago. The Neanderthals, for their part, may have invented their technology over 400,000 years ago. So, did the Neanderthals borrow from us, or did they develop the technique independently? At the present time, we cannot say.

Was the breakthrough rapid or gradual?

Unfortunately, there is a gap in the Kenyan archaeological record between 500,000 years ago and 320,000 years ago, so we cannot say for certain whether the sophisticated craftsmanship exhibited by early Homo sapiens emerged suddenly or gradually. However, the limited evidence available suggests that it emerged gradually, over a period of hundreds of thousands of years. Reporting on recent discoveries at the site of Olorgesailie, science writer Ed Yong summarizes the evidence in an article in The Atlantic titled, “A Cultural Leap at the Dawn of Humanity” (March 25, 2018):

“Between 500,000 and 615,000 years ago, Acheulean technology still dominated at Olorgesailie. But there are occasional signs of smaller tools, more sophisticated designs, and materials being imported from long distances. ‘In the late Acheulean, we see the precursors of what became crystallized in the Middle Stone Age,’ says Potts. It’s almost as if humanity already had the capacity for our later leap, but were missing some kind of trigger — something that precipitated a break with hundreds of thousands of years of cultural stagnation. But what?”

That trigger, argues Yong, was environmental change:

“Around 500,000 years ago, the relatively stable lake basin at Olorgesailie turned into an etch-a-sketch landscape that was continuously remodeled by earthquakes. By 360,000 years ago, the climate had become incredibly unstable, with big swings between dry and wet seasons, and large changes in the layout of rivers, lakes, and floodplains.”

Yong quotes excavation team leader Rick Potts as suggesting that these changes may have led to the origin of our own species, Homo sapiens, at around the same time. At any rate, what is important for our purposes is that “Craftsman Adam” seems to have emerged gradually, rather than overnight. Highly sophisticated craftsmanship seems to have first appeared between 500,000 and 615,000 years ago, but only became prevalent some 320,000 years ago: a transition time of hundreds of thousands of years.

————————————————————————————————————

7. Modern or Symbolic Adam


RETURN TO MAIN MENU

Overview: If “modern human behavior” is defined broadly, as Sally McBrearty and Alison Brooks propose, in terms of a combination of four traits – abstract thinking (the ability to act with reference to abstract objects not limited in time or space); planning depth (the ability to formulate strategies based on past experience and to act upon them in a group context); behavioral, economic and technological innovativeness; and symbolic cognition (the ability to represent objects, people and abstract concepts with arbitrary symbols) – then it goes back about 300,000 years, to the advent of the Middle Stone Age in Africa. It was at about this time that Homo sapiens emerged; however, Francisco d’Errico points out that many of the behaviors that archaeologists classify as “modern” can also be found among the Neanderthals. If we define modern human behavior more narrowly, as behavior that is mediated by socially constructed patterns of symbolic thinking, then it appears to date back at least 130,000 years. Around that time, both Neanderthal man and Homo sapiens were making jewelry. Symbolic behavior was once believed to date back no earlier than 50,000 years; the recent discovery of archaeological sites predating the so-called “Upper Paleolithic Revolution” shows that its origins go back far earlier than anyone previously believed. At any rate, the myth of a sudden cultural explosion has been eviscerated by recent discoveries in Africa and the Middle East. These discoveries cast further doubt on the hypothesis that our humanity is something which appeared overnight.

The evolution of modern human behavior: Recommended Reading
D’Errico, F. (2003). The Invisible Frontier. A Multiple Species Model for the Origin of Behavioral Modernity. Evolutionary Anthropology Issues News and Reviews 12(4):188-202.
McBrearty, S. and Brooks, A. (2000). The Revolution That Wasn’t: A New Interpretation of the Origin of Modern Human Behavior. Journal of Human Evolution 39(5):453-563.
Radovčić, D. et al. (2015). Evidence for Neandertal Jewelry: Modified White-Tailed Eagle Claws at Krapina. PLoS One, 10(3): e0119802. https://doi.org/10.1371/journal.pone.0119802.
Randolph-Quinney, P. and Sinclair, A. (2018). The revolution that wasn’t: African tools push back the origins of human technological innovation. The Conversation, March 16, 2018.
Vanhaeren, M. et al. (2006). Middle Paleolithic Shell Beads in Israel and Algeria. Science Vol. 312, Issue 5781, 1785-1788.
Vince, G. (2020). Ancient yet cosmopolitan. Aeon, June 18, 2020.
Wurz, S. (2012). The Transition to Modern Behavior. Nature Education Knowledge 3:10(15).

Modern human behavior: what is it?

The seventh alleged quantum leap in human prehistory relates to what’s commonly known as modern human behavior. However, this term is defined in various ways by different authors. One popular textbook defines behavioral modernity in terms of what it calls “the five behavioral B’s: blades, beads, burials, bone tool-making, and beauty” (Mark Jobling et al., Human Evolutionary Genetics, CRC Press, 2014, p. 344). Yet this definition is too elastic to be scientifically useful: blades, which were long thought to have emerged in the Upper Paleolithic, some 40,000 years ago, were later pushed back to 200,000 years ago, and are now known to have been used in Kenya more than 500,000 years ago; beads have been found in Israel and North Africa dating back to between 100,000 and 135,000 years ago; the first known burials date back to somewhere between 300,000 and 100,000 years ago (see the discussion in Section 10 below); bone tools are known to have been used in Africa some 1.5 million years ago, although they became much more common around 70,000 years ago, when awls and polished bone points were manufactured at the Blombos Cave site in South Africa; and finally, “beauty” is a very loose term, which as we have seen could be said to go back as far as 750,000 years ago, to the time when the first undeniably aesthetic artifacts were manufactured.

(a) The myth of the “human revolution”: did modern human behavior evolve gradually? The case for abstract thinking in the African Middle Stone Age

Some of the Middle Stone Age stone tools from Jebel Irhoud (Morocco), dated to 300,000 years ago. These African Middle Stone Age tools were made by early Homo sapiens. Image courtesy of Mohammed Kamal, MPI EVA Leipzig and Wikipedia.

Sally McBrearty and Alison Brooks argue for the antiquity of modern human behavior in their 2000 paper, The Revolution That Wasn’t: A New Interpretation of the Origin of Modern Human Behavior (Journal of Human Evolution 39(5):453-563, December 2000). The authors state their thesis up-front:

In fact, many of the components of the “human revolution” claimed to appear at 40-50 ka [40,000-50,000 years ago] are found in the African Middle Stone Age tens of thousands of years earlier. These features include blade and microlithic technology, bone tools, increased geographic range, specialized hunting, the use of aquatic resources, long distance trade, systematic processing and use of pigment, and art and decoration. These items do not occur suddenly together as predicted by the “human revolution” model, but at sites that are widely separated in space and time. This suggests a gradual assembling of the package of modern human behaviors in Africa, and its later export to other regions of the Old World.

McBrearty and Brooks propose that modern behavior can be characterized by four distinctive traits relating to both adaptation and cognition: abstract thinking (the ability to act with reference to abstract objects not limited in time or space); planning depth (the ability to formulate strategies based on past experience and to act upon them in a group context); behavioral, economic and technological innovativeness; and symbolic cognition (the ability to represent objects, people and abstract concepts with arbitrary symbols). They contend that the roots of all of these traits are much more ancient than the Upper Paleolithic (which began around 50,000 years ago). Instead, McBrearty and Brooks propose an earlier date of around 250,000 to 300,000 years ago for the appearance of what they describe as “Middle Stone Age technology and the first signs of modern behavior,” and they argue that if the Florisbad skull and other similar-looking fossils are classified as Homo sapiens (as they now are by many anatomists), then this date would roughly coincide with the emergence of Homo sapiens. On this point, the authors were remarkably prescient: the oldest fossils of Homo sapiens, discovered at Jebel Irhoud in Morocco, are now dated to approximately 315,000 years ago. Tools found at the site belong to the Middle Stone Age. The authors conclude:

It does seem that the “human revolution” that made us modern never was – archaeological evidence for modern behaviours arose much earlier, starting in groups that predated our own species. Every criterion that has historically been used to differentiate modern humans from archaic humans – culture, art, treatment of the dead, ornamentation and abstract symbolism – has much older examples.

McBrearty and Brooks are not alone. In his paper, The Invisible Frontier. A Multiple Species Model for the Origin of Behavioral Modernity (Evolutionary Anthropology Issues News and Reviews 12(4):188-202, August 2003), which draws upon the work of McBrearty and Brooks, Francisco d’Errico takes aim at the widely held view that behavioral modernity appeared suddenly, with the appearance in the archeological record of “previously unseen carvings, personal ornaments, musical instruments, depictions on cave walls, and new stone and bone technology” – events which are said to indicate a cultural “revolution” among modern humans living in Europe around 40,000 years ago, possibly triggered by a brain mutation (in Europe or Africa). D’Errico contends that this definition of “behavioral modernity” is Eurocentric, and that it reflects a naive belief that human culture evolved according to a simple, linear sequence:

…[T]he traits used to identify behavioral modernity are no more than a list of the major archeological features that characterize the Upper Paleolithic in Europe. The problem is that this behavior is highly derived within Homo sapiens. It does not consistently characterize the behavior of the earliest Homo sapiens populations nor does it appear in many parts of the world (much of Africa, most of Asia, all of Australia) until long after its appearance in Europe ca. 40,000 years ago. Rather than accept that this complex of behaviors reflects adaptive strategies that were unique to the problem of colonizing Europe, many archeologists cling to the notion that the course of human behavioral evolution can be modeled in terms of a simple progression from archaic to modern behavior.

D’Errico also points out that many of the behaviors that archaeologists classify as “modern” can be found among the Neanderthals. Summarizing the results of his research, d’Errico rejects the view that behavioral modernity arose among only one species, Homo sapiens, and concludes that it arose among populations of both Neanderthals and modern humans, in different regions:

The application of the criteria used so far to identify behavioral modernity in the material culture of Neandertals and contemporary anatomically modern humans does not seem to support the single-species or single-population model for the origin of these modern traits. Neandertal subsistence strategies and technological and symbolic traditions do not significantly differ from those of contemporary human populations in Africa and in the Near East

“Modern” traits may have appeared in different regions and among different groups of humans, much as happened later in history with the inventions of agriculture, writing, and state society. Two hypotheses, which are not mutually exclusive, may explain both convergences and differences between the two populations on which I have focused here. The first is that the two populations [of modern humans and Neanderthals – VJT] reacted in comparable ways to comparable ecological pressures. The other is that, as their similar lithic technology in the Near East suggests, cultural barriers, and perhaps biological ones, between these populations were permeable.

Lastly, in a recent article titled, The revolution that wasn’t: African tools push back the origins of human technological innovation (The Conversation, March 16, 2018), Professors Patrick Randolph-Quinney and Anthony Sinclair roundly criticize the once-popular view that there was a “human revolution” 40,000-50,000 years ago, around the time when humans left Africa, during which modern behaviors such as symbolism, innovation and art emerged. In the authors’ opinion, not only is this view Eurocentric, it is also at odds with mounting evidence that some components of the alleged “human revolution” originated much earlier, during the African Middle Stone Age, which lasted from 280,000 to 50,000 years ago. Randolph-Quinney and Sinclair credit McBrearty and Brooks with being the originators of the new paradigm, which they defend in their article.

(b) Symbolic thinking: the narrow view of “modern human behavior”

Blombos Cave shell beads, dating to over 70,000 years ago. These beads were strung on a cord and worn as personal ornamentation. Image created by Chris Henshilwood & Francesco d’Errico, and courtesy of Wikipedia.

On a narrow view, modern human behavior could be defined as behavior that is mediated by socially constructed patterns of symbolic thinking, actions, and communication that allow for material and information exchange and cultural continuity between and across generations and contemporaneous communities. I’ll give the name “Symbolic Adam” to the earliest hominin in whom this behavior appeared.

But how do we identify symbolic thinking in the archaeological record?

In her 2012 paper, The Transition to Modern Behavior (Nature Education Knowledge 3:10(15)), Sarah Wurz (Institute for Human Evolution, University of Witwatersrand) acknowledges that “[t]here is lively debate and divergent points of view on the appropriate markers for modern behavior in the archaeological record (Nowell 2010),” but adds that “when the standard of symbolism is applied, it can be shown that artifacts of a clearly symbolic nature appear only after 100,000 years ago (Henshilwood et al. 2002, Henshilwood et al. 2004, d’Errico et al. 2009, Texier et al. 2010).” Wurz hails the discovery of symbolic artifacts at the Blombos Cave in South Africa – in particular, shell beads, which she describes as “the earliest evidence for personal ornaments, dating to between 100,000 – 70,000 years ago.” Wurz concludes:

The extraordinary discoveries from Blombos Cave and the other examples discussed here show that the most fruitful ways of identifying modern behavior in the archaeological record have been through artifacts that demonstrate symbolism and complex planning abilities. The challenge for future research is to expand archaeological criteria for modern behavior that are fully integrated with neuro-evolutionary theory and cognitive science.

However, it turns out that the earliest shell beads may predate those found at the Blombos Cave by more than 30,000 years. According to a report in Science magazine (23 June 2006: Vol. 312, Issue 5781, pp. 1785-1788) by Marian Vanhaeren and her colleagues, shells of ocean-dwelling snails found at Skhul in Israel and at Oued Djebbana in Algeria were in fact used as decorative beads by modern Homo sapiens. The ones found in Israel are at least 100,000 years old, and possibly as much as 135,000 years old. The shells had tiny holes drilled into them, suggesting that they were strung together to make necklaces. What this suggests is that modern human forms of behavior, including the use of symbolism, go back over 100,000 years.

Evidence for Neandertal Jewelry: Modified White-Tailed Eagle Claws at Krapina. Figure 6. Articulation of Krapina 385.4 and 386.18. Dorsal (a) and lateral (b) view; arrow in (b) points to highly polished area on 385.4. Image courtesy of PLoS One.

Surprisingly, however, the earliest evidence for symbolic ornamentation comes from the Neanderthals, not Homo sapiens. Neanderthals at Krapina, Croatia, may have manipulated white-tailed eagle talons (shown above) to make jewelry as early as 130,000 years ago, according to a study published on March 11, 2015 in the open-access journal PLoS One by Davorka Radovčić et al. The authors write:

Some have argued that Neandertals lacked symbolic ability or copied this behavior from modern humans. These remains clearly show that the Krapina Neandertals made jewelry well before the appearance of modern humans in Europe, extending ornament production and symbolic activity early into the European Mousterian…

In any case, these talons provide multiple new lines of evidence for Neandertals’ abilities and cultural sophistication. They are the earliest evidence for jewelry in the European fossil record and demonstrate that Neandertals possessed a symbolic culture long before more modern human forms arrived in Europe.

Although geometrical designs are known to go back about half a million years, as we saw in Section 4 above, Blombos Cave in South Africa is the site of what is said to be the earliest drawing by human hands, estimated to be 73,000 years old. The drawing consists of a cross-hatched pattern made up of nine fine lines, whose abrupt termination is said to indicate that the pattern originally extended over a larger surface:

The oldest claimed drawing by human hands, discovered in Blombos Cave in South Africa, and estimated to be 73,000 years old. Image courtesy of Henshilwood, C.S. et al. and Wikipedia.

However, the first examples of clearly symbolic painting that we are aware of are animal designs from the Lubang Jeriji Saléh cave in Indonesia, dating back to 40,000 years ago:

A painting of a bull, made with ochre, discovered in Lubang Jeriji Saléh cave, East Kalimantan, Borneo, Indonesia, dating to 40,000 years ago. It is currently the oldest figurative cave painting in the world. Image courtesy of Wikipedia.

There are no known instances of figurative painting by Neanderthal man. However, there are a couple of instances of cupules – circular man-made hollows on the surface of a rock or a rock slab – which are known to have been produced by Neanderthal man some 60,000 years ago at La Ferrassie, France. These are described here. The world’s oldest cupules (described in Section 4 above), however, come from two ancient quartzite caves in the Madhya Pradesh region of central India: Bhimbetka (which includes the prehistoric Auditorium Cave) and Daraki-Chattan. These go back at least 290,000 years, but it is not at all certain that they constitute clear evidence of symbolic thinking.

Additionally, it appears that Neanderthal man was responsible for the creation of hand stencils in Spain, some 67,000 years ago, according to an article on the Sapiens blog (May 30, 2018) by Derek Hodgson and Paul Pettitt, who put forward the proposal that Homo sapiens stole the Neanderthals’ insight that a graphic mark could serve as a representation, and then took the final step of “creating the first complex figurative representations, with all the ramifications that followed for art and culture.” This may be the case, but at the same time, I would like to point out that Indonesia (the site of the world’s oldest symbolic painting) is a long way from Spain, where the Neanderthal hand stencils were discovered.

Although the notion of a cognitive revolution occurring 40,000 to 50,000 years ago has been debunked by recent scientific advances, something significant does seem to have happened around that time: the pace of change picked up. What drove this cultural acceleration was a demographic boom, taking place sometime between 41,000 and 52,000 years ago, as science journalist Gaia Vince explains in a recent article in Aeon (Ancient yet cosmopolitan, June 18, 2020):

“Geneticists recently discovered that the greatest population boom in prehistory occurred 40,000 to 50,000 years ago, which helps to explain a swathe of cultural explosions seen at this time, from present-day Germany to Indonesia. Researchers modelling ancient population densities during periods of rapid cultural acceleration found that the connection seemed to bear out. For instance, there was great similarity between the demographics of Palaeolithic Europe 45,000 years ago and Sub-Saharan Africa 90,000 years ago, both times of local explosions in cultural activity. Bigger, more culturally diverse populations can call upon a greater resource of potential solutions as the physical or social environment changes. They have more opportunities to adapt their cultural practices, so they are more resilient. So these societies could survive longer, giving their technologies and cultural practices longer to evolve.”

The basic idea, according to Vince, is that “once group size increases enough, cultural innovation accelerates, because the group then holds a diversity of cultural practices that can be combined to produce further practices, and so on exponentially.” This explains how larger, connected populations can experience cultural explosions, as groups acquire new cultural practices and technologies from other groups.
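
As a way of visualizing this idea, here is a toy simulation in Python. It is not the demographic model Vince describes, and all of the parameters are invented; it merely illustrates the qualitative claim that bigger groups both invent more and can recombine more of what they already have, so their cultural repertoires grow disproportionately fast:

```python
import random

def simulate_culture(pop_size, generations=200, invention_rate=1e-4, seed=0):
    """Toy model: each generation, every individual has a small chance of
    inventing a new cultural trait, and existing traits can be recombined
    into further traits, but only insofar as the population is large enough
    to carry them. Returns the number of traits after the given generations."""
    rng = random.Random(seed)
    traits = 1  # start with a single cultural practice
    for _ in range(generations):
        inventions = sum(1 for _ in range(pop_size) if rng.random() < invention_rate)
        # Recombination opportunities grow with the existing repertoire,
        # but are capped by how many people there are to sustain them.
        recombinations = min(traits // 2, pop_size // 100)
        traits += inventions + recombinations
    return traits

for n in (200, 2_000, 20_000):
    print(f"population {n:>6}: ~{simulate_culture(n):,} cultural traits after 200 generations")
```

Run with these made-up settings, the largest toy population ends up with a repertoire orders of magnitude bigger than the smallest, which is the qualitative pattern the demographic explanation appeals to: scale, rather than a new kind of mind, drives the “explosion.”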

The driver of the demographic explosion 40,000 to 50,000 years ago might well have been environmental change. Scientists know, for instance, that it was a rare wet spell that enabled our African ancestors to migrate into the Middle East and across the globe, some 80,000 years ago. At any rate, the cultural and artistic changes occurring around the world 40,000 to 50,000 years ago are not signs of a cognitive breakthrough but of a sudden (and demographically driven) acceleration in the rate of change. Likewise, cultural stagnation within certain communities may have been caused by demographic isolation, which would make innovation more hazardous, as it would threaten the survival of the group.

At any rate, what the foregoing data indicates is that we are unable to identify a sudden beginning of symbolic behavior in the human species. All we can say is that whenever it began, it was more than 100,000 years ago. What’s more, it occurred in both modern humans and Neanderthals, suggesting that it either arose independently in the two species over a period of tens of thousands of years or spread from one species to the other, over 100,000 years ago. What happened 40,000 to 50,000 years ago was that the pace of change picked up, driven by demographic factors: a prehistoric population boom.

Readers who are curious to learn more about the history of prehistoric art might like to have a look at the following online article: Top 10 Oldest Pieces of Art Ever Discovered, by Saugat Adhikari, writing for ancienthistorylists.com.

Summary: Regardless of whether we define behavioral modernity broadly (in terms of abstract thinking) or narrowly (in terms of symbolic behavior), it did not spring up overnight. Abstract thinking can certainly be said to have appeared by 300,000 years ago, with the advent of the Middle Stone Age in Africa, while symbolic behavior seems to have arisen more than 130,000 years ago, and to have occurred not only in our own species, but also in Neanderthal man.

————————————————————————————————————

8. Linguistic Adam


RETURN TO MAIN MENU

Overview: The ongoing academic controversy over exactly when the ancestors of modern humans acquired language boils down to a difference of opinion regarding what language is. On a broad view, language is a communication system in which symbols (such as sounds) are assigned definite meanings, but in which words can be combined freely to make an infinite number of possible sentences. On this view, a very powerful case can be made that not only Homo sapiens but also Neanderthal man was an adept language user, and that Heidelberg man, who lived half a million years ago, probably used language as well. On a narrow view, language (in the strict sense of the word) is defined in terms of sets of rules governing the way we construct sentences: more specifically, syntax and recursion. On this conception, hierarchical syntactical structure is a core feature of language. If we adopt this view, then language is most likely confined to Homo sapiens, and probably emerged during a 130,000-year window, some time between 200,000 and 70,000 years ago. However, recent research has shown that a single mutation is unlikely to have given rise to human language, and that it is much more likely that multiple mutations with smaller fitness effects were required to fix it within the human population. The key point here is that no matter which conception of language you happen to favor, it most likely did not magically emerge all at once, but over a period of tens of thousands of years.

The evolution of language: Recommended Reading
Alex, B. (2018). “Could Neanderthals speak? The Ongoing Debate over Neanderthal language”. Discover, November 6, 2018.
Berwick, R. and Chomsky, N. (2016). Why Only Us: Language and Evolution. Cambridge, Massachusetts, U.S.: MIT Press.
Chomsky, N., et al. (2010). “Some simple evo-devo theses: How true might they be for language”. In The evolution of human language. Cambridge University Press.
Chomsky, N. (2012). The Science of Language: Interviews with James McGilvray. Cambridge University Press.
de Boer, B., et al. (2020). “Evolutionary Dynamics Do Not Motivate a Single-Mutant Theory of Human Language”. Scientific Reports, volume 10, 451, https://doi.org/10.1038/s41598-019-57235-8.
Dediu, D. and Levinson, S. (2013). “On the antiquity of language: the reinterpretation of Neandertal linguistic capacities and its consequences”. Frontiers in Psychology, 4:397, https://doi.org/10.3389/fpsyg.2013.00397.
Knapton, S. (2018). “Language started 1.5m years earlier than previously thought as scientists say Homo erectus were first to talk”. The Telegraph, February 20, 2018.
Pomeroy, R. (2020). “Did Language Evolve With a Single Mutation? A New Study Says That’s Unlikely”. Real Clear Science, January 17, 2020.
Ponce de León, M. S., et al. (2016). “Brain development is similar in Neanderthals and modern humans”. Current Biology, Volume 26, Issue 14, Pages R665-R666.
Vyshedskiy, A. (2019). “Language evolution to revolution”. Research Ideas and Outcomes, doi: 10.3897/rio.5.e38546, version 9, July 19, 2019.

And now we come to Adam number eight: “Linguistic Adam”, or the first hominin to use language. Some readers may be wondering why I’ve said very little so far about the emergence of language. That’s because it’s hard to define and even harder to pinpoint in time: languages leave no fossils.

(a) What is language? Two views

So let’s begin with the first question: what is language? An article in Discover magazine titled, “Could Neanderthals speak? The Ongoing Debate over Neanderthal language” (November 6, 2018) by Bridget Alex sums up the current state of play, in the following quote:

“Without straying too far into academic debates over the nature of language, let’s just say there are broad and narrow theories when it comes to what actually constitutes language.

“A broad view defines language as a communication system in which arbitrary symbols (usually sounds) hold specific meanings, but are not fixed or finite. Words can be invented, learned, altered and combined to convey anything you can think.

“Narrow definitions focus on syntax and recursion, structural properties shared by all human languages today. These both refer broadly to the set of rules that guides how statements can be formulated in any given language, and they are thought to be hardwired into our brains. By this view language is ‘a computational cognitive mechanism that has hierarchical syntactic structure at its core,’ in the words of biologist Johan Bolhuis and colleagues.”

Alex goes on to explain that those who favor the narrow definition tend to restrict language to Homo sapiens, and to propose that it appeared fairly recently, around 100,000 years ago. Scientists who prefer a broader definition, such as Dan Dediu (whose views I shall discuss below), are willing to allow that other human species were also able to use language, and that it evolved gradually, over hundreds of thousands of years.

(b) The broad view, and the case for language use by Homo erectus, Heidelberg man and Neanderthal man

Mousterian points, Grotte du Placard, Charente, France. Date: 300,000 to 30,000 BP. These tools were made by Neanderthal man. Some authorities contend that the ability to make these points presupposes the use of language. Image courtesy of Didier Descouens and Wikipedia.

Some experts have even claimed that Homo erectus possessed language. However, as we saw above, language was probably not required in order to manufacture Acheulean hand-axes. It has also been argued, notably by Daniel Everett, Professor of Global Studies at Bentley University, Massachusetts, in his book, How Language Began, that Homo erectus would have needed language in order to build boats to colonize remote islands such as Flores in Indonesia and Crete in the Mediterranean, but Professor Chris Stringer, head of human origins at the Natural History Museum of London, is skeptical: he points out that tsunamis could have transported early humans to these islands on floating rafts made of vegetation, without them needing to possess the skill of steering a boat (which would have required language). For his part, Stringer believes that Heidelberg man, a descendant of Homo erectus who lived from 700,000 to 300,000 years ago, would have had a complex enough life to require speech, though not at the level of modern human language.

Be that as it may, a very strong case can be made that Neanderthal man was capable of language. In their paper, “On the antiquity of language: the reinterpretation of Neandertal linguistic capacities and its consequences” (Frontiers in Psychology, 4:397, 5 July 2013, https://doi.org/10.3389/fpsyg.2013.00397), authors Dan Dediu and Stephen Levinson explain why they believe that Neanderthal man would have used language:

“The Neandertals managed to live in hostile sub-Arctic conditions (Stewart, 2005). They controlled fire, and in addition to game, cooked and ate starchy foods of various kinds (Henry et al., 2010; Roebroeks and Villa, 2011). They almost certainly had sewn skin clothing and some kind of footgear (Sørensen, 2009). They hunted a range of large animals, probably by collective driving, and could bring down substantial game like buffalo and mammoth (Conard and Niven, 2001; Villa and Lenoir, 2009).

“Neandertals buried their dead (Pettitt, 2002), with some but contested evidence for grave offerings and indications of cannibalism (Lalueza-Fox et al., 2010). Lumps of pigment — presumably used in body decoration, and recently found applied to perforated shells (Zilhao et al., 2010) — are also found in Neandertal sites. They also looked after the infirm and the sick, as shown by healed or permanent injuries (e.g., Spikins et al., 2010), and apparently used medicinal herbs (Hardy et al., 2012). They may have made huts, bone tools, and beads, but the evidence is more scattered (Klein, 2009), and seemed to live in small family groups and practice patrilocality (Lalueza-Fox et al., 2010)…”

The authors also contend that the complexity of Neanderthal tools, which require months of practice for modern humans to learn how to make, points to their ability to use language:

“Complex tool making of the Mousterian kind involves hierarchical planning with recursive sub-stages (Stout, 2011) which activates Broca’s area [of the brain] just as in analogous linguistic tasks (Stout and Chaminade, 2012). The chain of fifty or so actions and the motor control required to master it are not dissimilar to the complex cognition and motor control involved in language (and similarly takes months of learning to replicate by modern students).”

Summarizing their case, Dediu and Levinson conclude that “nothing like Neandertal culture, with its complex tool assemblages and behavioral adaptation to sub-Arctic conditions, would have been possible without recognizably modern language.”

In addition, Dediu and Levinson argue that Neanderthals possessed the genes required for language use in human beings. “Neandertals and Denisovans had the basic genetic underpinnings for recognizably modern language and speech, but it is possible that modern humans may outstrip them in some parameters (perhaps range of speech sounds or rapidity of speech, complexity of syntax, size of vocabularies, or the like).” Based on their understanding of language as “the full suite of abilities to map sound to meaning, including the infrastructure that supports it (vocal anatomy, neurocognition, ethology of communication, theory of mind, etc.),” the authors propose that “speech and language are ancient, being present in a modern-like form over half a million years ago in the common ancestor of Neandertals and modern humans” (i.e. Heidelberg man), whose linguistic abilities evolved gradually from the more rudimentary abilities of H. erectus. Although they espouse the view that language evolved over deep time, Dediu and Levinson are not diehard gradualists: for instance, they are willing to allow that “the assemblage of the prerequisites for speech and language in the transition from H. erectus to H. heidelbergensis may well have been punctuated at times by relatively large changes in language-related features.”

Not everyone is impressed, however. Dediu and Levinson’s paper arguing for the existence of fully-fledged language capabilities in Neanderthal man has been critiqued by Berwick, Hauser and Tattersall, who argue in their commentary that: (i) “hominids can be smart without implying modern cognition”; (ii) “smart does not necessarily mean that Neanderthals had the competence for language or the capacity to externalize it in speech”; (iii) the earliest unambiguous evidence for symbolic communication dates from less than 100,000 years ago; and (iv) although they may have had the same FOXP2 genes as we do, “[n]either Neanderthals nor Denisovans possessed human variants of other putatively ‘language-related’ alleles such as CNTAP2, ASPM, and MCPH1.” On point (iii), it seems that Berwick, Hauser and Tattersall are incorrect, as we saw above in section 7: according to a 2015 study, Neandertals in Krapina, Croatia, were capable of making jewelry as far back as 130,000 years ago, before any members of Homo sapiens were doing so.

(c) The narrow view of language: the “single mutation” hypotheses proposed by Noam Chomsky and Andrey Vyshedskiy

According to linguist Noam Chomsky, language arose with a single mutation in the ancestral human genome. Image courtesy of Wikipedia.

In order to understand where the proponents of the narrow view of language are coming from, it is important to grasp the fundamental differences between human language and animal communication. These are ably summarized by Professors Robert Berwick and Noam Chomsky in their 2016 book, Why Only Us: Language and Evolution (Cambridge, Massachusetts, U.S.: MIT Press). In a nutshell: animal communication is linear (A -> B -> C) whereas human language has a hierarchical syntax which is capable of generating, by a recursive procedure, an unlimited number of hierarchically structured sentences, as shown by the example, “How many cars did you tell your friends that they should tell their friends . . . that they should tell the mechanics to fix?”, where the number of levels in the hierarchy can be extended without limit. Animals such as birds are capable of producing long and complex songs, but they are utterly incapable of learning songs with hierarchical syntax. Apes share the same inability: back in the 1970s, Nim Chimpsky’s “sentences” employed hundreds of signs, but never got beyond memorized two-word combinations, which lacked any hierarchical structure.

The brain’s ability to generate hierarchical syntax remains an unsolved mystery. There are, however, two intriguing hypotheses, known as the Promethean hypothesis (proposed by computational linguist and computer scientist Robert Berwick and linguist Noam Chomsky) and the Romulus and Remus hypothesis (proposed by Andrey Vyshedskiy), which suggest that human language, with its unique capacity for hierarchical syntax, may be the product of a single mutation. Ingenious as these hypotheses are, they are almost certainly wrong, as we’ll see in part (d) below.

(i) Berwick and Chomsky’s Promethean hypothesis

Let’s begin with Berwick and Chomsky’s hypothesis. Berwick and Chomsky claim to have identified a very simple procedure, which they call “Merge,” which is capable of generating the hierarchies found in human language. Merge takes two linguistic units, call them X and Y, and combines them into an unordered pair {X,Y}. By successive application, or recursion, hierarchical sentences of unlimited length and complexity can be built up. Here’s how Chomsky describes Merge in layperson’s language, in his book, The Science of Language: Interviews with James McGilvray (Cambridge University Press, 2012):

You got an operation that enables you to take mental objects [or concepts of some sort], already constructed, and make bigger mental objects out of them. That’s Merge. As soon as you have that, you have an infinite variety of hierarchically structured expressions [and thoughts] available to you. (2012, p. 14)
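To make the idea of Merge concrete, here is a minimal sketch in Python (my own illustration, not Berwick and Chomsky’s formalism; the function names and the use of frozensets are simply convenient stand-ins for their set-theoretic notation) of how repeated application of a two-place combining operation yields hierarchically nested structures:

```python
# A minimal illustration of binary Merge: combine two syntactic objects into an
# unordered pair, then feed the result back into Merge. This is an informal
# sketch, not Berwick and Chomsky's formal system.

def merge(x, y):
    """Combine two syntactic objects into a single, unordered pair {x, y}."""
    return frozenset([x, y])

def depth(obj):
    """Count the levels of hierarchical embedding in a merged object."""
    if isinstance(obj, frozenset):
        return 1 + max(depth(part) for part in obj)
    return 0

# Build {read, {the, book}} by merging twice.
dp = merge("the", "book")   # {the, book}
vp = merge("read", dp)      # {read, {the, book}}

print(vp)         # frozenset({'read', frozenset({'the', 'book'})})
print(depth(vp))  # 2 -- and nothing stops us from merging again, without limit
```

Because the output of merge can itself serve as an input to merge, the same simple operation generates structures of unbounded depth, which is the sense in which a finite procedure can yield an infinite variety of hierarchically structured expressions.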

In the book, Chomsky presents his case that Merge arose suddenly, in a single individual, some time before 60,000 years ago, and that it spread rapidly around the globe:

Around seventy, sixty thousand years ago, maybe as early as a hundred thousand, you start getting symbolic art, notations representing astronomical and meteorological events, complex social structures … just an outburst of creative energy that somehow takes place in an instant of evolutionary time – maybe ten thousand years or so, which is nothing… So it looks as if — given the time involved — there was a sudden ‘great leap forward.’ Some small genetic modification somehow that rewired the brain slightly [and] made this human capacity available. And with it came an entire range of creative options that are available to humans within a theory of mind — a second-order theory of mind, so you know that somebody is trying to make you think what somebody else wants you to think…

Well, mutations take place in a person, not in a group. We know, incidentally, that this was a very small breeding group — some little group of hominids in some corner of Africa, apparently. Somewhere in that group, some small mutation took place, leading to the great leap forward. It had to have happened in a single person. Something happened to a person that that person transmitted to its offspring. And apparently in a very short time, it [that modification] dominated the group; so it must have had some selectional advantage. But it could have been a very short time in a small breeding group. Well, what was it? The simplest explanation – we have no reason to doubt it – is that what happened is that we got Merge. (2012, pp. 13-14)

If the “single mutation” hypothesis is correct, then human language arose instantly, in one individual, in the course of a single generation. Nevertheless, Professors Berwick and Chomsky argue strenuously in Why Only Us that the account they propose is fully compatible with a Darwinian understanding of evolution. The authors hypothesize that language emerged at some time between the appearance of the first anatomically modern human beings, some 200,000 years ago, and the last human exodus from Africa, some 60,000 to 70,000 years ago, and that the gene that enabled it rapidly came to dominate the human population worldwide. Commenting on this 130,000-year interval of time, Berwick and Chomsky write:

This is not ‘overnight in one generation’ as some have (incorrectly) inferred — but neither is it on the scale of geological eons. It’s time enough — within the ballpark for what Nilsson and Pelger (1994) estimated as the time required for the full evolution of a vertebrate eye from a single cell, even without the invocation of any ‘evo-devo’ effects. (2016, p. 157)

Later, the authors propose that the single mutation required for human language involved only a “small rewiring of the brain” (in Brodmann’s areas 44 and 45B) resulting in “a fully working syntactical system” (p. 157).

Such a change takes place in an individual — and perhaps, if fortunate, in all of their siblings too, passed on from one or (less likely) both parents. Individuals so endowed would have advantages, and the capacity might proliferate through a small breeding group over generations. (2016, p. 108)

In an article titled, “Some simple evo-devo theses: How true might they be for language” in The evolution of human language (Cambridge University Press, 2010), Chomsky et al. suggested that “roughly 100,000+ years ago, … a rewiring of the brain took place in some individual, call him Prometheus, yielding the operation of unbounded Merge, applying to concepts with intricate (and little understood) properties.” However, in his 2019 article, Language evolution to revolution (Research Ideas and Outcomes, doi: 10.3897/rio.5.e38546), neuroscientist Andrey Vyshedskiy, who (like Chomsky) endorses a “narrow” view of language, identifies a fatal flaw in Chomsky’s Promethean hypothesis:

We argue however, that Prometheus could not have evolved alone. Modern children who are not exposed to recursive language before puberty cannot acquire PFS [Prefrontal synthesis – VJT] later in life(4). If parents did not expose Prometheus to recursive language, the only way for Prometheus to acquire PFS was to invent recursive language himself and then use it to train his own dialog-dependent PFS. This fit (sic – should be “feat”) can only be accomplished in a group of children. Consequently, Prometheus at the early age must have had a peer companion(s) to invent recursive elements of language and to carry out recursive conversations.

In a nutshell: “it is not enough to be [a] fully genetically modern individual to acquire PFS, one needs to be exposed to recursive language early in childhood.” Elsewhere in his article, Vyshedskiy points out that children are known to be capable of inventing a recursive language, without the assistance of adults: Nicaraguan sign language was invented in precisely this way, by deaf children in the 1980s. He goes on to argue that “[a] group of children better fits the role of the patriarchs of ‘unbounded Merge,’ than the lone Prometheus.” This is what he proposes in his Romulus and Remus hypothesis, to which we now turn.

(ii) Vyshedskiy’s Romulus and Remus hypothesis

By contrast, according to the Romulus and Remus hypothesis which Andrey Vyshedskiy proposes in the same 2019 article, Language evolution to revolution, there was a mental leap, over the course of a single generation, from a non-recursive communication system employing a rich vocabulary to a recursive, hierarchical language system. Vyshedskiy believes that this transition took place 70,000 years ago, and that it was associated with the acquisition of a brand-new component of the human imagination, called Prefrontal Synthesis, which allows us to imagine several objects or persons in a novel combination. Vyshedskiy suggests that this new imaginative capacity was enabled by a single mutation that slowed down the development of the human prefrontal cortex, so that instead of taking several months to mature, it now took five years. He hypothesizes that this mutation occurred simultaneously in two or more children (probably twins – hence the names Romulus and Remus) belonging to the same family. These children would have created their own recursive language and taught it to each other. Of course, one big drawback of the new mutation was that since it dramatically slowed down the maturation of a critical component of the human brain, it would have made human infants helpless for a much longer period of time, increasing the burden on parents. It is a remarkable fact that unlike three-year-old primates of other species, three-year-old human children are utterly incapable of assessing risks, which is why they cannot be left alone near a fire, an open apartment window, or a busy road, or in a forest. Children with this mutation would therefore have been at an increased risk of infant mortality. However, the payoff, on Vyshedskiy’s hypothesis, was that the children, once they had grown up, would have been able to imagine new ways of solving everyday problems: for instance, Vyshedskiy suggests that they may have invented the world’s first animal traps, making the task of feeding a tribe a lot easier, thanks to their newfound ability to think outside the box. In short, what Vyshedskiy is proposing is that the simultaneous occurrence of this mutation in two or more children of the same family triggered a Cognitive Revolution that swept the world approximately 70,000 years ago, resulting in a rash of new inventions: (1) composite figurative arts, (2) bone needles with an eye, (3) construction of dwellings, and (4) elaborate burials.

In his article, Vyshedskiy cites the work of Liu et al., who reported (Genome Research 2012, 22: 611-622) that the mutation causing the delay in the development of the human prefrontal cortex (PFC) occurred sometime within the last 300,000 years, after the separation of the human and Neanderthal lineages. Vyshedskiy’s bold proposal is that this was the mutation that could have triggered what he terms the “Great leap forward” in human cognition, some 70,000 years ago. However, more recent research tells a different story. In their article, “Brain development is similar in Neanderthals and modern humans” (Current Biology, Volume 26, Issue 14, 25 July 2016, Pages R665-R666), Marcia S. Ponce de León et al. summarize the results of their research:

Earlier research suggested that Neanderthals followed an ancestral mode of brain development, similar to that of our closest living relatives, the chimpanzees (2, 3, 4). Modern humans, by contrast, were suggested to follow a uniquely derived mode of brain development just after birth, giving rise to the characteristically globular shape of the adult human brain case (2, 4, 5). Here, we re-examine this hypothesis using an extended sample of Neanderthal infants. We document endocranial development during the decisive first two years of postnatal life. The new data indicate that Neanderthals followed largely similar modes of endocranial development to modern humans. These findings challenge the notion that human brain and cognitive development after birth is uniquely derived (2, 4).

Vyshedskiy’s interpretation of the archaeological data is also tendentious at certain points. He writes:

Unequivocal PFS [Prefrontal synthesis – VJT] evidence is completely missing from the archeological evidence before 70,000 years ago but is abundantly present after 62,000 years ago. Clear PFS evidence include (1) composite figurative arts, (2) bone needles with an eye, (3) construction of dwellings, and (4) elaborate burials. Together with (5) exceptionally fast colonization of the globe and migration to Australia (presumably by boats) at around 62,000 years ago and (6) demise of the Pleistocene megafauna (presumably with the aid of animal traps) this multitude of the archeological evidence indicates acquisition of PFS by some individuals around 70,000 years ago and their relentless conquest of the planet.

Unfortunately, the archaeological data does not square very well with Vyshedskiy’s hypothesis of a “Great Leap Forward,” 70,000 years ago.

The 37,000-year-old Löwenmensch (Lion-man) figurine, originally discovered in 1939 in the Hohlenstein-Stadel cave, Germany, shown here after its restoration in 2013. Dagmar Hollmann / Wikimedia Commons. License: CC BY-SA 4.0.

(1) As Vyshedskiy himself admits, the oldest composite figurative object, the Löwenmensch (“lion-man”) sculpture (shown above) from the Lone Valley caves in Germany, dates back to 37,000 years ago, not 70,000 years ago.

(2) The world’s oldest known needle was made at least 50,000 years ago, in Siberia, but it was created by Denisovan man, not by Homo sapiens. (Recall that Vyshedskiy maintains that language is unique to Homo sapiens.) Vyshedskiy also claims that the oldest needles go back even further, to 61,000 years ago at Sibudu Cave, South Africa, but this identification remains controversial: the artifact in question is a point that might be from a bone needle.

(3) As Vyshedskiy acknowledges in his article, the oldest solid evidence of human dwellings dates back to 27,000 years ago, not 70,000 years ago. Nor is there anything significant about the date of 70,000 years ago as far as dwellings are concerned, since there are claimed instances of dwellings going back hundreds of thousands of years before that. For instance, the French archaeologist Henry de Lumley claimed that shelters were built at the site of Terra Amata in France, some 380,000 years ago, after uncovering what he believed to be low walls of stones and vestiges of fireplaces, although another archaeologist, Paola Villa, has cast doubt on this interpretation, arguing that the stones may have been naturally deposited through stream flow, soil creep or some other natural process. Meanwhile, the oldest claimed evidence of a dwelling (and campfire) in Europe comes from Přezletice, Czech Republic, and dates to 700,000 years ago, during the Cromerian Interglacial (corresponding to Marine Isotopic Stages 19 to 13). The base of the dwelling measured about 3 m × 4 m on the outside and 3 m × 2 m on the inside, and it is thought to have been a hut with a vaulted roof (made either of thick branches or thin poles), supported by an earth and rock foundation. It has been suggested that the structure may have functioned as a winter base camp. (For more details, see Sklenář, K. (1987). “The Lower Paleolithic Dwelling Structure at Přezletice and its Significance”. Anthropologie. 25 (2): 101–103. JSTOR 26294864.) Finally, in Olduvai Gorge, Tanzania, rocks piled up in a circular structure, 15–23 cm high, were discovered in 1962. Anthropologist Mary Leakey later pointed out that the stone circle resembled temporary structures built by present-day nomadic peoples, and suggested that the rocks may have been used, some 1.75 million years ago, to support either a windbreak or a base for branches covered with skins or grass. In short: it is highly misleading of Vyshedskiy to claim that dwellings were invented no earlier than 70,000 years ago. Almost certainly, dwellings are far older than that.

(4) The oldest known adorned burials date back to 40,000 years ago – again, much later than 70,000 years ago, as Vyshedskiy posits for his “Great Leap Forward.” However, the earliest known ritual burial took place 92,000 years ago, at Qafzeh, Israel, where as many as 15 individuals were buried with 71 pieces of red ochre and ochre-stained stone tools. The skeleton of a ten-year-old boy was also found, with deer antlers laid on his hands. Once again, the evidence tells against Vyshedskiy’s hypothesis of an overnight cognitive revolution, 70,000 years ago.

(5) Australia appears to have been colonized around 65,000 years ago, so on this point, Vyshedskiy is correct.

(6) The extinction of the Australian megafauna took place around 46,000 years ago, and the role of humans in this event is disputed. Recent evidence suggests that environmental change was largely responsible.

In addition, as we’ll see below, there are very strong genetic grounds for doubting any “single mutation” scenario for the origin of human language.

(d) Why human language is almost certainly NOT the result of a single mutation

Does either Chomsky’s Promethean scenario or Vyshedskiy’s Romulus and Remus scenario for the acquisition of human language lend support to the hypothesis of a “Linguistic Adam”? In his review of Berwick and Chomsky’s book, Catholic physicist Stephen Barr (who supports an overnight emergence of true human beings) draws upon Berwick and Chomsky’s findings to argue that the language capacity arose very suddenly, most likely in a single member of the species Homo sapiens. That sure sounds like “Linguistic Adam” to me. However, several esteemed linguists have recently argued that Berwick and Chomsky’s proposal is highly unlikely, on scientific grounds. The arguments they put forward are equally telling against Vyshedskiy’s Romulus and Remus hypothesis.

In an article in Scientific Reports (volume 10, 451 (2020), https://doi.org/10.1038/s41598-019-57235-8) titled, “Evolutionary Dynamics Do Not Motivate a Single-Mutant Theory of Human Language”, Bart de Boer et al. make a cogent case against Chomsky’s proposal that a single mutation, which conferred a huge fitness advantage, gave rise to human language, which would mean that “our modern language capacity emerged instantaneously in a single hominin individual who is an ancestor of all (modern) humans.” Using the mathematical tools of diffusion analysis and extreme value theory, the authors argue that such a scenario is highly unlikely:

We find that although a macro-mutation is much more likely to go to fixation if it occurs, it is much more unlikely a priori than multiple mutations with smaller fitness effects. The most likely scenario is therefore one where a medium number of mutations with medium fitness effects accumulate. This precise analysis of the probability of mutations occurring and going to fixation has not been done previously in the context of the evolution of language. Our results cast doubt on any suggestion that evolutionary reasoning provides an independent rationale for a single-mutant theory of language.

Ross Pomeroy explains the authors’ reasoning in an article over at Real Clear Science (January 17, 2020) titled, Did Language Evolve With a Single Mutation? A New Study Says That’s Unlikely:

A big factor undergirding this result is that many, smaller mutations could spread at the same time. Moreover, even if some fail to become fixed in the human population, other mutations that produce similar phenotypic effects could emerge and take their place. This is not the case for Chomsky’s proposed single mutation – it either becomes fixed or fades out.
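For readers who want a rough feel for the tradeoff, the following toy calculation is my own sketch, not the model used by de Boer et al.: it assumes a Wright-Fisher population of a particular size, an exponential distribution of fitness effects (so that large-effect mutations are a priori rare), and Kimura’s diffusion approximation for the probability that a new beneficial mutation goes to fixation.

```python
import math

# Toy comparison in the spirit of de Boer et al.'s argument (my own illustrative
# sketch, not their actual analysis). Assumptions: (i) the fitness effects of new
# mutations follow an exponential distribution, so large-effect "macro-mutations"
# are a priori much rarer than small-effect ones; (ii) fixation probability follows
# Kimura's diffusion approximation for a new mutant in a Wright-Fisher population.

N = 10_000          # assumed effective population size
MEAN_EFFECT = 0.01  # assumed mean selective advantage of new beneficial mutations

def p_fix(s, n=N):
    """Kimura's fixation probability for a new mutant starting at frequency 1/(2N)."""
    return (1 - math.exp(-2 * s)) / (1 - math.exp(-4 * n * s))

def p_effect_at_least(s, mean=MEAN_EFFECT):
    """Probability that a new mutation's fitness effect is at least s (exponential tail)."""
    return math.exp(-s / mean)

for s in (0.001, 0.01, 0.1):
    occurrence = p_effect_at_least(s)   # how rare such an effect is a priori
    fixation = p_fix(s)                 # how likely it is to fix, given that it occurs
    print(f"s = {s:5.3f}: P(effect >= s) = {occurrence:.2e}, "
          f"P(fix | occurs) = {fixation:.2e}, joint = {occurrence * fixation:.2e}")
```

In this toy setting, the joint probability of a mutation both occurring and fixing peaks at intermediate fitness effects: the macro-mutation’s high fixation probability cannot compensate for its a priori rarity. This is only an illustration, but it matches the qualitative conclusion that de Boer et al. reach with far more rigorous diffusion analysis and extreme value theory.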

De Boer et al. point out that while their finding that language arose gradually is compatible with a narrow definition of language in terms of a unique ability (Merge), it does not square with the classic Chomskyan view that this ability is something you either have or you don’t:

By questioning the proposed evolutionary rationale for a single, un-decomposable computational innovation, we challenge one of the central theory-external motivations for Merge. Although our scenario allows the possibility of an ability for Merge that is based on an accumulation of a larger number of mutations, this is not in line with Berwick and Chomsky’s view of Merge as an atomic ability[3], i.e. an ability one either has or does not have.

The authors of the study conclude:

Combined with mounting, compelling evidence from the archeological record that goes against a very recent ‘great leap forward’[22,23], against the very possibility of formulating a single origin of modern humans[23] and that indicates that the ability for language is older and evolved over a longer time than hitherto thought[24,25,26], as well as genetic evidence that shows the rarity of truly fixed mutations in the modern human lineage[27,28,29] (none of them specifically tied to the language phenotype), we are inclined to argue that evidence favors the view that language emerged through a gradual accumulation of mutations.

The evidence speaks for itself. Mysterious as the origin of human language may be, it almost certainly didn’t emerge overnight.

————————————————————————————————————

9. Ethical Adam


RETURN TO MAIN MENU

Overview: Two features which are said to distinguish human morality are altruism (disinterested and selfless concern for the well-being of others) and self-sacrifice for the good of the group. Neither feature appears to have emerged suddenly. Field reports indicate that altruism can be observed in chimpanzees as well as humans, although care for individuals with serious congenital abnormalities (e.g. brain-damaged children) appears to be unique to humans. This behavior goes back at least 500,000 years, but may date as far back as nearly two million years ago. In any case, experts in the field maintain that there is no hard-and-fast distinction between human altruism and that of apes such as chimpanzees: instead, we should speak of a continuum. Self-sacrifice can be identified by two behaviors which became established features of human life: big-game hunting (which requires hunters to risk their lives for the benefit of the group) and life-long monogamy (which requires a long-term commitment for the benefit of one’s children). Once again, the evidence indicates that these behaviors did not emerge overnight, but over hundreds of thousands of years: in other words, co-operation for the greater good emerged only gradually in the human lineage. Taken together, the evidence strongly suggests that human morality did not suddenly appear at some “magic moment” in the past.

The evolution of morality: Recommended Reading
Dubreuil, B. (2009). “Paleolithic public goods games: why human culture and cooperation did not evolve in one step”. Biology and Philosophy 25(1): 53-73.
Hublin, J. (2009). “The prehistory of compassion”. PNAS 106 (16): 6429-6430.
Spikins, P., Rutherford, H. and Needham, A. (2010). “From Homininity to Humanity: Compassion from the Earliest Archaics to Modern Humans”. Time & Mind: The Journal of Archaeology, Consciousness and Culture, pp. 303-325. ISSN 1751-696X.

But what about morality? Here, it might be argued, human beings are unique. Two features in particular are said to distinguish us from other animals: first, our capacity for altruism, or selfless concern for the well-being of others, even if they are total strangers; and second, our capacity to control our selfish instincts and sacrifice our own individual good for the good of the group.

(a) Altruism

Belisarius begging for alms, by Jacques-Louis David. Palais des Beaux-Arts de Lille. Oil on canvas, 1781. Public domain. Image courtesy of Wikipedia.

Let’s begin with altruism. Humans display altruism towards complete strangers, helping them even at considerable cost to themselves. A man might give his life to save the life of an unrelated child, for instance. From a purely evolutionary perspective, one would expect assistance to be given only to individuals carrying the same genes (i.e. kin) or those who are able to reciprocate the favor: you scratch my back and I’ll scratch yours. So we might posit the existence of an individual called “Ethical Adam,” the first human being to display this kind of altruism. This hominin, if he existed, would be Adam number nine. However, paleoanthropologist Jean-Jacques Hublin pours cold water on the notion that altruism is a defining characteristic of human beings, in a provocative paper titled, “The prehistory of compassion” (PNAS April 21, 2009, 106 (16) 6429-6430). As he points out, altruism has been observed in chimpanzees:

“…[C]laims have been made that the level of altruism displayed by chimpanzees could be much higher than what was once thought (12). For example, there have been reported cases of captive chimpanzees rescuing companions from drowning (13). Boesch and Boesch-Achermann (14) have also described a case of a wild adult male chimpanzee adopting an unrelated orphan.”

If these reports are correct, then altruism cannot be considered unique to humans, after all. On the other hand, Hublin acknowledges that there is one kind of altruistic behavior which has been hitherto unobserved in chimpanzees. They don’t help infants with birth defects:

“What seems to be lacking in the ape repertoire is the survival of individuals with serious congenital abnormalities.”

Penny Spikins et al. argue that there is something genuinely unique about human compassion in their article, “From Homininity to Humanity: Compassion from the Earliest Archaics to Modern Humans” (Time & Mind: The Journal of Archaeology, Consciousness and Culture, pp. 303-325):

Compared with our own species, compassion in non-human species is typically fleeting – chimpanzees never make allowances for individuals who are slow or who cannot keep up with the group, nor do they ‘think through’ how to help others in the long term (Silk et al 2005, Jensen et al 2006). Human compassion seems to be qualitatively and quantitatively different from that in other species and though the existence of compassion cannot be taken as a symbol of ‘humanity’ its construction and expression are indeed unique…

As a result of widespread investments in the wellbeing of others, compassion in our own species is far more integral to how all of society works than in other species. Compassion is fundamental to human social life and Baron-Cohen and Wheelwright call it ‘the glue that holds society together’ (2004: 63). Indeed, it is fair to say that compassionate responses and reciprocal altruism forms the basis of all close human social relationships...

Unlike in other primates, compassionate motivations in humans also extend into the long term. Humans show a capacity to ‘regulate’ compassion, to bring compassionate motivations to help others into rational thought and plan ahead for the long term good of someone we care for (Gross and Thompson 2006).

However, Spikins et al. are not “mind creationists”; they do not believe that human compassion appeared suddenly. In their essay, they propose a four-stage model of the evolution of human compassion. In the first stage (which Spikins et al. date to between 6.0 and 1.8 million years ago), individuals were aware of one another’s feelings and immediate intentions, and assisted one another. However, their compassion was not yet rational; it was a short-lived emotion in response to another individual’s distress. In the second stage, which was manifested by Homo erectus and later Heidelberg man, meat was shared extensively, especially with pregnant females and those with young infants. Compassion was “extended widely into non-kin and in potentially extensive investments in caring for offspring and equally for ill individuals,” with the result that “[t]hose who were incapacitated might be provisioned with food for at least several weeks if not longer.” “By around 400,000 bp [years ago] with the emergence of mortuary treatment such compassion, and grief at the loss of someone cared for, emotions which bind us to others might be able to be symbolised in communication and recognisable as something akin to ‘love’.” Spikins et al. go on to posit a third stage (dated from 300,000 to 50,000 years ago) in the evolution of human compassion, exhibited by Neanderthal man, during which human existence was characterized by “a long period of adolescence and a dependence on collaborative hunting”; in this stage, “compassion extends into deep seated commitments to the welfare of others,” including “the routine care of the injured or infirm over extended periods,” making the abandonment of a needy individual, such as a disabled child, “unthinkable.” Spikins et al. opine that “Neanderthals seem to have been no strangers to ‘love’ but it may not have been as we would know it,” as there was little contact between groups, which were relatively isolated. In the fourth and final stage (dated to 120,000 years ago in Africa and typical of Homo sapiens), human compassion took a new turn: it was now directed at objects, in addition to being directed at people – “the capacity for compassion extends into strangers, animals, objects and abstract concepts,” such as symbolic art, which was protected. Long distance relationships were maintained through a trade in objects, such as personal ornaments (e.g. marine shells).

One might ask: when did human beings start exhibiting compassion, in the fossil record? Apparently, it goes back at least to Heidelberg man, over 500,000 years ago. Hublin (2009) describes the touching case of a brain-damaged prehistoric child, aged five, who was looked after by the other members of his/her tribe, in Spain:

“In this issue of PNAS, Gracia et al. (1) provide new evidence on the survival of an abnormal individual with possible cognitive deficits from a group of pre-Neandertal Pleistocene hunter-gatherers, currently assigned to a geological age of >500 ka. The cranium SH14 from the Sima de los Huesos (Sierra de Atapuerca, Spain) is the earliest documented case of human neurocranial and brain deformity in the fossil record to date. Despite her/his pathological condition, this individual was not rejected at birth and survived until at least 5 years of age, apparently receiving the same attention as other children from the group.”

There is very good evidence that care for the incapacitated goes back even further in human prehistory, to Homo ergaster. One of the earliest known cases of long-term support for an incapacitated individual comes from KNM-ER 1808, a female Homo ergaster dated to around 1.5 million years ago, who suffered from hypervitaminosis A, a disease caused by excessive intake of vitamin A. Spikins et al. (2010) remark that the symptoms would have included “abdominal pain, nausea, headaches, dizziness, blurred vision, lethargy, loss of muscular coordination and impaired consciousness,” which “would have greatly hindered this individual’s capacity for independent survival.” The fact that she managed to survive into the advanced stages of her incapacitating disease indicates that “someone must have been feeding her, protecting her from carnivores.” Summing up, the authors quote the verdict of an acknowledged authority in the field (Cameron and Groves, 2004): “The group dynamics of early Homo must have been based on some form of mutual support.”

Spikins et al. (2010) also describe an even earlier example of possible long-term care, from Dmanisi in Georgia, 1.77 million years ago: an early Homo female had lost all but one tooth several years before her death, with all the sockets except for the canine teeth having been re-absorbed. This individual could only have consumed soft plant or animal foods, and was probably reliant on support from others. However, it should be pointed out that chimpanzees can sometimes survive the loss of some teeth for a while, too.

In the light of this evidence, can we say that there is a sharp ethical divide between humans and other animals? Apparently Hublin (2009) doesn’t believe in a clear boundary between man and beast, judging from his conclusion, where he espouses the idea of a continuum:

“Finally, the divide between apes and early humans might not be as large as one tends to think. Rather than considering ancient human altruism as proof of the moral values of our predecessors, one should instead see it as merely part of the spectrum of adaptations that have made humans such a prolific and successful species.”

At the very least, we may justifiably conclude, on the basis of the evidence, that humanlike compassion goes back to Homo ergaster / erectus, and that there is no evidence for a sudden emergence of this compassion. Subsequently, compassion evolved through further stages in Neanderthal man and early Homo sapiens, reaching a stage where it even extended to animals, objects and abstract concepts, in Africa some 120,000 years ago.

(b) Self-sacrifice for the good of the group

Homo erectus ate primarily large game, such as the straight-tusked elephant (above), whose extinction in the Levant is thought by some authorities to have been caused by hunting by Homo erectus. Image courtesy of Apotea and Wikipedia.

What about the other distinguishing feature of human morality, self-sacrifice for the good of the group? The first reason for our ancestors’ acquiring a large brain may have been self-control, or more specifically, the ability to inhibit one’s impulses and stay focused on one’s goals. In his article, “Paleolithic public goods games: why human culture and cooperation did not evolve in one step” (Biology and Philosophy 25(1): 53-73, January 2009), Dr. Benoit Dubreuil argues that around 700,000 years ago, big-game hunting and life-long monogamy became established features of human life. Big-game hunting is highly rewarding in terms of food, if successful, but it is also very dangerous for the hunters, who might easily get gored by the animals they are trying to kill; hence a hunter participating in this activity would have required a capacity to master his natural instinct to run away from impending danger – in other words, self-control. Some anthropologists have also argued that with the advent of Heidelberg man, whose brain was nearly as large as ours, it would have been impossible for mothers to rear human children, with their large, energy-demanding brains and prolonged infancy, without the assistance of a committed husband who could provide for the family. That means monogamy – another behavior that would have required a capacity for self-mastery and long-term thinking. Dubreuil refers to these two activities as “cooperative feeding” and “cooperative breeding,” and describes them as “Paleolithic public good games” (PPGGs), because they both required an individual to be able to put the good of the group above his own individual good. However, Dubreuil insists that the emergence of this co-operative ability was not an instantaneous affair. As he puts it in his abstract:

“It is widely agreed that humans have specific abilities for cooperation and culture that evolved since their split with their last common ancestor with chimpanzees. Many uncertainties remain, however, about the exact moment in the human lineage when these abilities evolved. This article argues that cooperation and culture did not evolve in one step in the human lineage and that the capacity to stick to long-term and risky cooperative arrangements evolved before properly modern culture. I present evidence that Homo heidelbergensis became increasingly able to secure contributions from others in two demanding Paleolithic public good games (PPGGs): cooperative feeding and cooperative breeding. I argue that the temptation to defect is high in these PPGGs and that the evolution of human cooperation in Homo heidelbergensis is best explained by the emergence of modern-like abilities for inhibitory control and goal maintenance. These executive functions are localized in the prefrontal cortex and allow humans to stick to social norms in the face of competing motivations. This scenario is consistent with data on brain evolution that indicate that the largest growth of the prefrontal cortex in human evolution occurred in Homo heidelbergensis and was followed by relative stasis in this part of the brain.”
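To see why the “temptation to defect” looms so large in any public goods game, here is a minimal sketch of the standard payoff structure (my own generic illustration of the game-theoretic setup that Dubreuil’s Paleolithic public goods games borrow, not his specific model; the endowment, multiplier, and group size are arbitrary assumptions):

```python
# A minimal public goods game: each player either contributes an endowment to a
# common pool or keeps it; the pool is multiplied and shared equally among all.
# This is a generic illustration of the game-theoretic structure, not Dubreuil's model.

ENDOWMENT = 10   # assumed value each player can contribute (e.g. hunting effort)
MULTIPLIER = 1.6 # assumed productivity of cooperation (the pool is multiplied by this)

def payoffs(contributions):
    """Return each player's payoff: kept endowment plus an equal share of the pool."""
    pool = sum(contributions) * MULTIPLIER
    share = pool / len(contributions)
    return [ENDOWMENT - c + share for c in contributions]

everyone_cooperates = payoffs([ENDOWMENT] * 4)                # [16.0, 16.0, 16.0, 16.0]
one_defector = payoffs([0, ENDOWMENT, ENDOWMENT, ENDOWMENT])  # defector earns 22.0
print(everyone_cooperates)
print(one_defector)
```

In a one-shot game the defector out-earns every cooperator (22 versus 16 in this example), even though universal cooperation beats universal defection (16 versus 10 each). On Dubreuil’s account, sustaining cooperative hunting and breeding in the face of that temptation is precisely what required modern-like inhibitory control and goal maintenance.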

It is worth noting that Dubreuil explicitly rejects the view that we became human overnight: in his own words, “cooperation and culture did not evolve in one step in the human lineage.” On the contrary, Dubreuil maintains that while co-operation for the sake of the greater good goes back at least to Heidelberg man and is associated with changes in the prefrontal cortex of the human brain, “symbolism, art, and properly cumulative culture” arose much later, subsequent to the appearance of modern Homo sapiens. These latter abilities were made possible by what Dubreuil describes as the “globularization of Homo sapiens’ cranium … between 300[,000] and 100,000 years ago,” which was caused by “the relative enlargement of the temporal and/or parietal lobes,” rather than any changes in the brain’s prefrontal cortex.

Additionally, public co-operation did not emerge overnight, but over hundreds of thousands of years. Dubreuil acknowledges in his paper that “an argument can be made that Plio-Pleistocene hominins were already cooperating about future goals on an unprecedented scale” and he adds: “As for cooperative feeding, the evidence presented in this section does not imply that cooperative breeding evolved suddenly in Homo heidelbergensis. Homo erectus’s feeding and breeding pattern, for instance, were probably already quite different from that of australopiths and earlier hominins.” In other words, Dubreuil believes that co-operation for the greater good emerged gradually in the human lineage.

Finally, Dubreuil’s claim that big-game hunting and life-long monogamy became established features of human life during the time of Heidelberg man is itself a highly questionable one. As excavations at the Israeli site of Gesher Benot Ya’aqov have revealed, big-game hunting goes back at least 700,000 years, to the time of Homo erectus, while the date when monogamy became the human norm is hotly debated: some paleoanthropologists suggest that it goes back more than three million years, to the time of Australopithecus afarensis (but see here for a contradictory evaluation of the evidence), whereas genetic studies seem to indicate that it may have first appeared as recently as ten to twenty thousand years ago, with the research suggesting that “over much of human prehistory, polygyny was the rule rather than the exception.”

The overall picture that emerges is that whether we define morality in terms of altruism or public co-operation, we find that it arose over a period of hundreds of thousands of years, rather than instantaneously.

————————————————————————————————————

10. Religious Adam


RETURN TO MAIN MENU

Left: The “Venus of Tan-Tan” (replica), an alleged artifact found in Morocco and dated to between 300,000 and 500,000 years ago. Critics contend that the rock’s shape is the result of natural weathering and erosion, which coincidentally produced a remotely human-like object. Museum of Human Evolution, Burgos, Spain. Image courtesy of Wikipedia.
Middle: Two views of the Venus of Hohle-Fels, dated to between 40,000 and 35,000 years ago, which may have been worn as an amulet and is the earliest undisputed example of a depiction of a human being in prehistoric art. Image courtesy of Wikipedia.
Right: The Venus of Willendorf, dated to between 28,000 and 25,000 B.C. Image courtesy of Matthias Kabel and Wikipedia.

Overview: Ritual cannibalism may go back to the time of late Homo erectus, some 450,000 years ago, while intentional human burial dates back to somewhere between 300,000 and 100,000 years ago, and was almost certainly practiced by Neanderthal man, as well as Homo sapiens. The earliest instances of ritual burial of the dead (namely, burial with red ochre) first occur 100,000 years ago, and appear to be unique to Homo sapiens, although it is possible that the Neanderthals practiced burial rituals as well. However, none of these practices constitute solid evidence for religion as such. Burials with grave goods could properly be called religious, as they would appear to indicate belief in an afterlife, but burials of this sort date back no further than 35,000 years ago (tens of thousands of years after the last migration of Homo sapiens out of Africa, 70,000 years ago), and ornate burials (such as the one at Sunghir, in Russia) are the exception rather than the norm: most Paleolithic burials by Homo sapiens were fairly plain, and not too different from Neanderthal burials. The earliest evidence for religious worship is even more recent. Contrary to popular belief, there was no cave bear worship in the middle Paleolithic period. The earliest undisputed “Venus figurine” dates back to around 35,000 to 40,000 years ago, but we cannot be sure whether these figurines possessed a religious significance. The world’s oldest temple dates back to 11,000 years ago, making it the earliest definitive evidence for religion in the strict sense of the word. However, since the worship of gods, spirits, and supernatural forces is found in all human societies, it must have emerged far earlier than 11,000 years ago, but we are unable to say when. Finally, it should be borne in mind that there is no single definition of “religion” which satisfies the majority of experts in the field; indeed, some have even questioned the very validity of the category.

The evolution of religion: Recommended Reading
Balter, M. (1996). “Cave Structure Boosts Neandertal Image”, Science, 271, 5248: 449.
Bello, S.M., Wallduck R., Parfitt S.A, Stringer C.B. (2017). “An Upper Palaeolithic engraved human bone associated with ritualistic cannibalism.” PLoS One 12(8): e0182127. https://doi.org/10.1371/journal.pone.0182127.
Cole, J. (2017). Assessing the calorific significance of episodes of human cannibalism in the Palaeolithic. Scientific Reports 7, 44707. https://doi.org/10.1038/srep44707.
Coqueugniot, H. et al. (2014). Earliest Cranio-Encephalic Trauma from the Levantine Middle Palaeolithic: 3D Reappraisal of the Qafzeh 11 Skull, Consequences of Pediatric Brain Damage on Individual Life Condition and Social Care. PLoS One, 9(7), e102822. https://doi.org/10.1371/journal.pone.0102822.
Egeland, C. et al. (2018). Hominin skeletal part abundances and claims of deliberate disposal of corpses in the Middle Pleistocene. PNAS 115 (18) 4601-4606; DOI: 10.1073/pnas.1718678115.
Ember, C.R., Ember, M. and Peregrine, P. (2019). Anthropology. 15th ed. New York: Pearson.
Geggel, L. (2018). This Ancient Society Buried Disabled Children Like Kings. Live Science, February 13, 2018.
Geggel, L. (2020). 70,000-year-old Neanderthal remains may be evidence that ‘closest human relative’ buried its dead. Live Science, February 19, 2020.
Kubota, T. (2016). Neanderthals Likely Built These 176,000-Year-Old Underground Ring Structures. Live Science, May 27, 2016.
Morton, G. (2019). When Did Adam Live? Part 1 Religion. The Migrant Mind blog, June 15, 2019.
Museum of Evolution Husnjakovo, Krapina. (2009). The World’s Largest Neanderthal Finding Site. Online article.
Pettitt, P. (2002). The Neanderthal dead: exploring mortuary variability in Middle Paleolithic Eurasia. Before Farming. 2002. 1-19.
Pettitt, P. (2011). Religion and Ritual in the Lower and Middle Palaeolithic. The Oxford Handbook of the Archaeology of Ritual and Religion (ed. Timothy Insoll), chapter 21. OUP.
Pomeroy, E. et al. (2020). New Neanderthal remains associated with the ‘flower burial’ at Shanidar Cave. Antiquity 2020. Vol. 94(373): 11-26. DOI: https://doi.org/10.15184/aqy.2019.207.
Rendu, W. et al. (2014). “Evidence supporting an intentional Neandertal burial at La Chapelle-aux-Saints.” PNAS 111 (1) 81-86; DOI: 10.1073/pnas.1316780110.
Smith, K. (2018). Machine-learning says Homo naledi may not have buried its dead. Ars Technica. April 6, 2018.
Times of Israel staff. (2020). New evidence points to Neanderthal burial rituals. The Times of Israel. February 20, 2020.
Trinkaus, E. (1985). Cannibalism and burial at Krapina. Journal of Human Evolution. Volume 14, Issue 2. Pages 203-216.
Whitehouse, D. (2003). Cave colours reveal mental leap. BBC News report, December 11, 2003.
Wunn, I. (2000). “Beginning of Religion”. Numen 47(4):417-452.

How should we define religion?

The tenth Adam whom I’ll identify is “Religious Adam,” the first hominin to have practiced some sort of religion. The dawn of religion is generally held to have occurred somewhere between 300,000 and 40,000 years ago, although a few authors (such as Glenn Morton) have proposed earlier dates. However, it needs to be borne in mind that there is no single definition of “religion” which satisfies the majority of scholars in the field; indeed, some scholars have even questioned the very validity of the category, as many societies do not have a word for “religion” as a practice distinct from ordinary life. Religion may be defined as “any set of attitudes, beliefs, and practices pertaining to supernatural power, whether that power be forces, gods, spirits, ghosts, or demons” (Ember, Ember, and Peregrine, 2019, p. 500), but even this definition is problematic, as not all societies make a distinction between the natural and the supernatural.

The key evidence cited for the earliest signs of religion falls into three main categories: ritualistic cannibalism, religious worship and the respectful treatment of the dead. Accordingly, it is to these practices that we now turn.

(a) Cannibalism in the Paleolithic

Model of a female Homo antecessor practicing cannibalism (Ibeas Museum, Burgos, Spain). Image courtesy of Jose Luis Martinez Alvarez and Wikipedia.

James Cole discusses episodes of Palaeolithic cannibalism in considerable detail in his 2017 paper, Assessing the calorific significance of episodes of human cannibalism in the Palaeolithic (Scientific Reports 7, 44707). After noting that cannibalism has been practiced by a range of hominin species from at least the early Pleistocene, he continues:

Our understanding of prehistoric cannibalism has increased exponentially over the last few years thanks to methodological advances and increasing interpretive rigour when examining and recognising anthropogenically modified hominin remains[2,10,11,12]. In the majority of studies, the interpretation is that cannibalism was practiced for nutritional reasons[2,5,6,13] although there has never been a way to quantify how nutritional these episodes may be. For example, while varied practices of consumption have been identified amongst Neanderthal populations from Moula-Guercy (France)[14], Cueva del Sidrón (Spain)[15], Cueva del Boquete de Zafarraya (Spain)[16], Pradelles (France)[17,18], and Troisième caverne of Goyet (Belgium)[11], all are broadly interpreted as nutritional. A small number of studies also invoke ritual motivations to, for example, the Upper Palaeolithic episodes of cannibalism associated with Homo sapiens at Gough’s Cave (UK)[9,10,19] and, less certainly, at the potential Homo erectus site of Caune de l’Arago (France)[5,20]. Some sites, such as Krapina (Croatia), Brillenhöhle (Germany) and Monte Circeo (Italy), have served as useful cautionary tales, with initial behavioural interpretations of cannibalism being overturned once additional analyses were carried out on the hominin remains[21,22,23,24] (although the cases of Krapina and Brillenhöhle remain controversial in that they may well now be cannibalism sites[25,26]).

Cole’s key research finding is that humans aren’t so nutritious to eat, after all: their nutritional value is “significantly lower than a range of fauna often found in association with anthropogenically modified hominin remains” – a result which weakens the hypothesis that cannibalism was purely nutritionally motivated, and strengthens more socially or culturally driven interpretations of Paleolithic cannibalism. Cole concludes:

…[I]t is more likely that the motivations for cannibalistic episodes lay within complex cultural systems involving both intra- and inter-group dynamics and competition[6,13,20]. Certainly, this conclusion would support interpretations from Gran Dolina relating to Homo antecessor[6,13]. The intriguing possibility of Homo erectus ritual cannibalism from l’Arago[20] could further suggest that even the oldest episodes of cannibalism were social acts that had some cultural meaning for the consumers beyond an easy meal.

… We know that modern humans have a range of complex motivations for cannibalism that extend from ritual, aggressive, and survival to dietary reasons. Why then would a hominin species such as the Neanderthals, who seem to have had varying attitudes to the burial and treatment of their dead[22,60,61,62,72], not have an equally complex attitude towards cannibalism?

Cole’s point is a valid one, and it is entirely likely that from the time of late Homo erectus onwards, cannibalism was practiced for a variety of social and cultural reasons. At the same time, it needs to be noted that ritual is not the same thing as religion. An anthropologist might describe the behavior of crowds at British football games as highly ritualistic, but that does not make it religious.

(A) Location of Gough’s Cave (UK). (B) Photo of the engraved radius M54074 (anterior, medial, posterior and lateral sides). (C) Drawing of the preserved portion of the radius showing the location of the engraving marks (in red), human tooth marks (blue dots) and percussion damage (blue arrows). (D) Sketch of the location of muscles and muscle insertions on a human radius. Image courtesy of PLoS One.

Perhaps the best evidence for ritualistic cannibalism during the Paleolithic comes from the site of Gough’s Cave, England. Silvia M. Bello et al. evaluate the latest evidence in their 2017 article, “An Upper Palaeolithic engraved human bone associated with ritualistic cannibalism” (PLoS One 12(8): e0182127).

Cut-marked and broken human bones are a recurrent feature of Magdalenian (~17–12,000 years BP, uncalibrated dates) European sites. Human remains at Gough’s Cave (UK) have been modified as part of a Magdalenian mortuary ritual that combined the intensive processing of entire corpses to extract edible tissues and the modification of skulls to produce skull-cups. A human radius from Gough’s Cave shows evidence of cut marks, percussion damage and human tooth marks, indicative of cannibalism, as well as a set of unusual zig-zagging incisions… The new macro- and micro-morphometric analyses of the marks, as well as further comparisons with French Middle Magdalenian engraved artefacts, suggest that these modifications are the result of intentional engraving. The sequence of the manipulations suggests that the engraving was a purposeful component of the cannibalistic practice, implying a complex ritualistic funerary behaviour that has never before been recognized for the Palaeolithic period.

However, it is worth noting that while Bello et al. believe the evidence clearly points to “a complex funerary cannibalistic behaviour that has never been recognized before for the Palaeolithic period” and even describe the marks on the bones as “engraving marks, produced with no utilitarian purpose apart from an artistic representation,” they nowhere argue for the existence of Palaeolithic religion in their paper. I conclude that the occurrence of cannibalism in our prehistoric past does not constitute clear evidence of religion.

(b) Is there any evidence for religious worship in the Paleolithic?

The evidence from Bilzingsleben, Germany

The museum building at the archaeological site of Bilzingsleben, Germany, which won the Thuringian Timber Construction Award in 2009. Image courtesy of Methylsteiner and Wikipedia.

Evidence for religious worship in the Paleolithic is thin on the ground. One site which is often mentioned in the literature is that of Bilzingsleben, in Germany, where the discovery of shattered human bones in an area paved with stones has been cited as evidence for some sort of prehistoric ritual. Paul Pettitt reviews the evidence in his 2011 article, Religion and Ritual in the Lower and Middle Palaeolithic (The Oxford Handbook of the Archaeology of Ritual and Religion (ed. Timothy Insoll), chapter 21, OUP):

Whatever their specific nature, each ring-like structure at Bilzingsleben appears to have been associated with an area of burning, and an activity area usually consisting of elephant bones and a large travertine block interpreted as an ‘anvil’. Of particular interest is Zone V (Mania and Mania 2005:102), which consists of a sub-circular ‘pavement’ formed from a single layer of flat stones trodden into the soft sediments of the lake edge. An area of burning was located towards its centre, a large travertine block severely affected by heat at its eastern periphery, and a quartzite ‘anvil’ in its western periphery next to a large bovid skull retaining its horn cores. The lack of splintered bone, hammerstones, or other tools on the pavement contrasts markedly with the rest of the settlement area, and Mania and Mania (ibid.: 102) have suggested that the area around the anvil and bovid skull was intentionally cleaned. The two refitting cranial fragments of Hominin 2 were recovered over 3 m apart on this ‘pavement’, in close proximity to the anvil/skull, and in ‘smashed and macerated condition’ (ibid.: 113), in addition to the juvenile mandible. Small fragments of bone preserved in the natural crevices of the quartzite ‘anvil’ show that bones were smashed upon it, but rather than forward a prosaic interpretation (marrow acquisition, for example) the excavators suggest that this was an area of ‘special cultural activities’ (ibid.: 102) which ‘probably played a role in some kind of ritual behaviour’ (ibid.: 113), further evidenced by a ‘linear structure of large pebbles, which seem to run towards the circular area and which ends appear to have been marked by …elephant tusks’.

However, Dr. Ina Wunn, of Leibniz University in Hannover, Germany, casts doubt on claims of cultic activity at Bilzingsleben in her article, “Beginning of Religion” (Numen 47(4):417-452, November 2000). She suggests that interference by animals and/or normal taphonomic processes (relating to how organisms decay and fossilize) could account for the distribution of bones found at the site, and sounds a note of warning against over-interpreting limited data:

In this connection it is necessary to emphasise that scholars can only come to a decision based on a series of complex investigations using a scanning electron microscope, whether scratches on fossil bones are due to violence caused by a stone tool or the teeth of a predatory animal. Since there are no archaeological findings for the entire Palaeolithic or Neolithic period to prove the opening of the skull by humans, none of the speculations about possible cult practise (sic) connected with human skulls is based on facts.[50]

Wunn notes in a footnote that experiments conducted on animal bones have revealed that scratches made by stone tools are indistinguishable from scratches caused naturally by sand, during the process of embedding, and that it is also very hard to distinguish traces of human activities performed on bones from the marks left by animal bites. In the end, it is only with the help of a scanning electron microscope that disputes about the significance of marks left on bones can be resolved.

In any case, even supposing the arrangement of bones at the site to be the result of some ritual activity, one need not suppose that a religious ritual was behind it all. An alternative explanation, proposed by Gamble and discussed by Pettitt, is that Bilzingsleben was the site of large gatherings of people, who spontaneously performed what Gamble describes as “attachment rituals” (rituals of greeting). However, as Pettitt (2011, p. 337) points out, that is as far as science can take us:

To Gamble, the attaching rituals involved setting up the anvils, a ‘structured activity, the start of rhythmic gesture,’ with each individual contributing materials. Although, however, his interpretation is convincing, from the perspective of this chapter we arrive at the inevitable interpretative limits; we can observe specific uses of space and perhaps confidently assume that meaning was given to it by the temporary performance of an attachment ritual, but we can go no further.

The key point here is that whatever significance such rituals may have had, there is no proof that they were religious. While the evidence from Bilzingsleben is intriguing, it provides no indication of religion.

Underground rituals 176,000 years ago at Bruniquel Cave, France?

Structure built by Neanderthal man, 176,500 years ago, at the bottom of the Bruniquel cave, France, from 400 broken and arranged stalagmites. Image courtesy of Luc-Henri Fage/SSAC and Wikipedia.

Another site that deserves mention is the Bruniquel Cave in southwestern France, where ringed structures, made of stacked stalagmites and dating back to 176,000 years ago, have been uncovered, some 336 meters inside the cave. These are the earliest known constructions based on stalagmites. Were these circular structures used for rituals of some sort? Nobody knows. All we can say at present is that their construction must have been carefully organized, and that given the early date, it could only have been the work of Neanderthal man. Certainly, the site demonstrates a highly sophisticated use of lighting and fire on the part of Neanderthal man, as well as the ability to design elaborate constructions.

According to a report by Taylor Kubota in Live Science (May 27, 2016), there is a possibility that the circular structures found deep in the cave served a religious function, but we cannot be certain, as other uses are also possible:

Given their distance from the cave entrance and daylight, the [research] team said it was unlikely the circles were used as shelters. They didn’t rule out the possibility that they could have been used for technical purposes, such as water storage, or religious or ceremonial purposes. Jaubert said the next steps in studying the cave will include further examination of the structures, a more extensive survey of the cave’s interior to uncover any additional archaeological remains, and a closer look at the cave’s entrance.

The find has added extensively to scientists’ knowledge of Neanderthal social organization and human cave dwelling in prehistory, the researchers said. Even so, they added, the question remains: What were the structures used for?

I conclude that the evidence from Bruniquel cave is, at best, ambiguous: it suggests the possibility of religious ceremonies, but no more.

Did prehistoric man worship cave bears?

Reconstruction of a European cave bear (Ursus spelaeus). Image courtesy of Sergiodlarosa and Wikipedia.

It has been claimed that prehistoric man worshiped cave bears, based on the distribution of their bones in Paleolithic caves in Europe. Dr. Ina Wunn pours cold water on such claims in her above-cited article (2000). She points out that cave bears and brown bears often lived in caves in Europe, at that time – and they sometimes died there as well, for reasons including age, illness and lack of food, which is why their bone fossils are often found in such places. Wunn adds: “…[T]he assortment of bear skulls is not due to human activities, but to the flowing water or other transport mediums in the caves. It cannot be said clearly enough: There was no cave bear worship in the middle Palaeolithic period at all. The bear caves show exactly what a palaeontologist would expect.” Wunn concludes: “Conceptions of rituals during the middle Palaeolithic, of cannibalism or bear worship belong to the realm of legend.”

Goddess worship in the Paleolithic?

The Venus of Willendorf, a statue thought to have had a religious function for Paleolithic peoples. Museum of Natural History, Vienna. Image courtesy of Oke and Wikipedia.

Venus figurines are another oft-cited piece of evidence for religious worship. The earliest undisputed “Venus figurine,” the Venus of Hohle Fels, dates back to around 35,000 to 40,000 years ago. The Venus of Willendorf, shown above, dates to between 28,000 and 25,000 B.C. However, archaeologists are uncertain whether these figurines possessed a religious significance: they may have simply been an expression of health and fertility, or even self-depictions by female artists. (Incidentally, there have been claims for much older Venus figurines, such as the Venus of Tan-Tan, a piece of quartzite found in Morocco, dating back to between 300,000 and 500,000 years ago, which looks vaguely like a woman. However, many experts believe that the rock is simply the product of natural weathering, and that its resemblance to a female figure is purely accidental. Judging from its appearance, I have to say I agree.)

As we will see below, the earliest solid evidence of religious worship is surprisingly recent, dating back to only 11,000 years ago. However, religious worship is found in all modern societies, and since the last major migration of Homo sapiens out of Africa took place some 70,000 years ago, worship presumably predates that dispersal. We are therefore forced to conclude that religious worship had been evolving gradually for at least 60,000 years before leaving clear traces in the archaeological record.

(c) Evidence for intentional human burial in the Paleolithic

It should be noted at the outset that humans are not the only species which bury their dead: the practice has been observed in elephants, and a dog that was recently observed burying a dead puppy may simply have been caching it as food. Although chimpanzees are not known to bury their dead, scientists recently observed a chimpanzee cleaning the teeth of a dead companion. Commenting on the discovery in a Live Science article titled Cleaning Corpses: Chimpanzee Funerary Rites Seen for 1st Time (March 21, 2017), science journalist Megan Gannon writes:

…[S]cientists now have a growing body of evidence about some unusual animal kingdom mortuary practices. Crows seem to hold vigil over their dead. Elephants, dolphins and whales have been known to stick by their dead companions.

Chimpanzees, our closest living relatives, had also been seen engaging in some mourning behaviors in the past, like returning to, dragging and perhaps even trying to resuscitate corpses. But using tools to clean the dead is something new to science.

Keeping these caveats in mind, we can now investigate the antiquity of the practice of intentional human burial.

The earliest disputed cases of human burials

The earliest disputed evidence of intentional burial of the dead, dating from between 427,000 and 600,000 years ago, comes from a deep shaft known as the Sima de los Huesos (“Pit of the Bones”) at the site of Atapuerca in Spain, where the bones of at least 28 individuals – formerly assigned to Heidelberg man and now believed to be ancestral to the Neanderthals – lie at the bottom of the cave system. Were they intentionally buried there, or were they transported into the chamber by big cats (e.g. lions or leopards) after being collected and abandoned by hominins elsewhere, or were they deposited there by a mudflow? At the present time, there is no consensus among experts in the field.

It has also been suggested that a very small-brained hominin named Homo naledi may have deliberately placed the bodies of dead or dying people in the nearly inaccessible Dinaledi Chamber at the back of the Rising Star Cave in South Africa, about 300,000 years ago, but even if the placement of the bodies was deliberate, we can’t be sure whether this was a proper burial or simply a way of disposing of decomposing corpses.

In 2018, Charles P. Egeland and colleagues attempted to resolve the question by using a machine learning algorithm, which analyzed two groups of bone assemblages: the first consisting of known prehistoric burials and undisturbed modern corpses that had been buried, and the second consisting of the scavenged remains of human corpses, as well as the remains of baboons that had died in a cave, and baboons that had been eaten by leopards. The algorithm tried to identify features of intentional burials and apply these to disputed prehistoric burials. Egeland et al. found that the tests they employed “consistently cluster the SH [Sima de los Huesos] and DC [Dinaledi Chamber] assemblages with the remains of scavenged human corpses, leopard-consumed baboons, and baboons that died naturally within a cave.” Regarding the Sima de los Huesos remains, the team concluded: “…[T]he skeletal element abundance data suggest that the SH corpses did not find their way into the cave chamber as complete skeletons and/or that they experienced a substantial level of disturbance after their deposition. While other taphonomic factors may have been at play, we consider the feeding activities of carnivores to be a likely source of this disturbance.” Commenting on the Dinaledi Chamber remains, Egeland et al. wrote: “Even in the absence of direct carnivore involvement, our machine-learning results for an assemblage composed of baboon remains accumulated by natural die-off in a cave demonstrate that an assemblage composed almost exclusively of a single, large-bodied primate with skeletal patterning like that seen in the DC need not necessarily require deliberate disposal by conspecifics [members of the same species – VJT].” A further possibility, mentioned by Egeland in an interview with Ars Technica, is that the hominins themselves ventured into the chamber, only to become trapped there.
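
To make the logic of this kind of approach concrete, here is a minimal, purely illustrative sketch in Python. It is not Egeland et al.’s actual code, data or feature set (they used several statistical and machine-learning methods); the feature names and numbers below are hypothetical placeholders, chosen only to show how a classifier trained on assemblages of known origin can be asked which group a disputed assemblage most resembles.

```python
# Illustrative sketch only: NOT Egeland et al.'s method or data.
# Train a classifier on bone assemblages of known origin, then ask
# which reference group a disputed assemblage most resembles.
# All feature names and values below are hypothetical placeholders.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical taphonomic features per assemblage:
# [proportion of skeleton represented, fraction of bones with carnivore
#  tooth marks, fraction of bones broken, fraction of small elements preserved]
known_assemblages = np.array([
    [0.90, 0.02, 0.10, 0.80],   # experimentally buried modern corpse
    [0.85, 0.05, 0.15, 0.75],   # known prehistoric burial
    [0.35, 0.60, 0.70, 0.20],   # scavenged human corpse
    [0.30, 0.55, 0.80, 0.15],   # leopard-consumed baboon
    [0.45, 0.20, 0.50, 0.30],   # baboon natural die-off in a cave
])
labels = ["burial", "burial", "scavenged", "carnivore", "natural_death"]

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(known_assemblages, labels)

# A disputed assemblage, again with made-up feature values:
disputed = np.array([[0.40, 0.10, 0.60, 0.25]])
print(clf.predict(disputed))        # the reference group it resembles most
print(clf.predict_proba(disputed))  # probabilities, in the order of clf.classes_
```

The point of spelling this out is that the verdict of any such method depends entirely on which taphonomic features are measured and which reference assemblages are used for training, and that is precisely where critics such as John Hawks (see the next paragraph) direct their objections.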

Anthropologist John Hawks is skeptical of the results obtained by the machine-learning algorithm, pointing out that it also grouped some human remains found at Skhul in Israel, dated to between 100,000 and 135,000 years ago, with predator kills rather than burials, despite widespread expert agreement that at least some of these remains were intentionally buried. “This study places Skhūl together with known cases of leopard predation. That tells you that the method doesn’t work for distinguishing burial from carnivore activity,” he told Ars Technica in an interview on the team’s findings. Egeland replies that intentional burial is a remarkable claim requiring strong evidence, and that his study simply shows there is not yet enough evidence to prove that the Sima de los Huesos and Dinaledi Chamber remains were intentionally buried. Regarding Skhul, his study lists it as a case of “possible primary hominin interment,” rather than as an open-and-shut case of burial.

I conclude that for the time being, it would be prudent not to treat these remains as evidence of intentional human burial. I should add that even authorities such as Berger, who hypothesized that the remains deposited in the Dinaledi Chamber were placed there as part of some ritual, were careful to point out that by “ritual” they simply meant an intentional and repeated practice (the disposal of dead bodies in the cave), rather than a religious ritual.

Burials conducted by Neanderthal man

The earliest alleged Neanderthal burial dates from 130,000 years ago. The shattered bones of at least 24 adults discovered at the site of Krapina in Croatia back in 1899 have been interpreted as evidence that Neanderthals were burying their dead at that time – an interpretation argued for by Erik Trinkaus in his 1985 article, Cannibalism and burial at Krapina (Journal of Human Evolution, Volume 14, Issue 2, February 1985, pp. 203-216), which contends that “the frequencies of skeletal part preservation indicate that the Krapina Neanderthals were buried, by natural or human processes, soon after death.” However, a 2009 online article published by the Museum of Evolution Husnjakovo, Krapina, is more cautious in its assessment:

Some researchers think that the charred and fragmented bones are a proof of cannibalism, and that the bones themselves were intentionally torn apart and broken in order to get to the bone marrow inside of them. On some of the parietal bones there are visible scratching signs, which is also related to the act of cannibalism. However, other scientists think that the Krapina cave could have been a ritual site, where the dead were buried, and that is why a large number of human bones was found there.

Finally, an Encyclopaedia Britannica article on the Krapina remains puts forward another possibility: “Trampling by animals is another possible cause for the shattered bones.” Fascinating as the remains are, I think a prudent assessment would be that the evidence for intentional human burial at Krapina is inconclusive.

There are many other alleged Neanderthal burial sites in Europe and the Middle East, including La Chapelle-aux-Saints, originally excavated in 1908 and further excavated between 1999 and 2012. There, the well-preserved skeleton first uncovered at the site [known as LCS1], coupled with the recent discovery of the remains of two children and one adult in what appears to be a man-made pit, points strongly to burial as the most likely explanation of how the human remains got there. The recent excavations are discussed in a 2013 National Geographic report by Ker Than (Neanderthal Burials Confirmed as Ancient Ritual, December 16, 2013) and a 2013 Live Science report by Charles Q. Choi (Neanderthals May Have Intentionally Buried Their Dead, December 16, 2013).

Summarizing the results of their investigation in a 2014 article in Proceedings of the National Academy of Sciences (“Evidence supporting an intentional Neandertal burial at La Chapelle-aux-Saints”, PNAS 111 (1) 81-86; DOI: 10.1073/pnas.1316780110), study authors William Rendu et al. remark:

More than a century after the discovery of the La Chapelle-aux-Saints skeleton, we have corroborated the information provided in the original excavation reports concerning the finding of an articulated, complete human skeleton within a depression in the bedrock. Microstratigraphic observation of the edges of the depression indicates that it postdates both the accumulation of Quina Mousterian deposits and their postdepositional cryoturbation and, therefore, that, originally, it cut through sediment fill, first, and then the bedrock itself. The anthropic origin of the excavation of this feature is the parsimonious reading of the evidence; a geogenic [i.e. natural geological – VJT] origin can be excluded, and there is no evidence that cave bears used the site for hibernation (and the site is too shallow for that to be possible in the first place). The taphonomy [i.e. process of fossilization – VJT] of the human remains sets them clearly apart from the site’s fauna, because no carnivore modification is apparent, indicating rapid burial, as one would expect in a funerary context. No reason exists to question the interpretation of the LCS1 burial [i.e. the skeleton originally found at the site – VJT].

Inside the Shanidar cave in Iraq. Image courtesy of Hardsarf and Wikipedia.

Undoubtedly the most famous Neanderthal burial site is the Shanidar Cave site in the Zagros mountains of Iraq, where a team led by anthropologist Ralph Solecki of Columbia University discovered the buried remains of eight adult and two infant Neanderthals, dating from around 65,000–35,000 years ago, in excavations conducted between 1957 and 1961. Of all the skeletons found at the cave, it is Shanidar IV which provides the best evidence for Neanderthal burial ritual. The skeleton of an adult male aged from 30–45 years was carefully laid on his left side in a partial fetal position. Later, routine soil samples revealed the existence of whole clumps of pollen throughout the site, suggesting that flowers had been deposited in the grave. Furthermore, a study of the particular flower types suggested that these flowers may have been chosen for their medicinal properties, leading to the bold proposal that the man buried there could have been a shaman. However, more recent work into the flower burial has suggested that the pollen may have been introduced to the burial by animals – most likely a gerbil-like rodent known as the Persian Jird (Meriones persicus), which is known to amass seeds and flowers at certain points in its burrows. Archaeologist Paul B. Pettitt has acknowledged that “the deliberate placement of flowers has now been convincingly eliminated” in a 2002 article on Neanderthal mortuary traditions (The Neanderthal dead: exploring mortuary variability in Middle Paleolithic Eurasia, in Before Farming, 2002, 1-19); however, he is scornful of attempts by one skeptical researcher (Gargett, 1989) to explain away all of the Shanidar burials as natural, rather than intentional:

What is the likelihood that seven adult Neanderthals were, on separate occasions over at least 15 ka, all ‘killed and buried by ceiling collapse’ as Gargett (1989:18) has argued? True, the deliberate placement of flowers has now been convincingly eliminated[8], but on grounds of parsimony it seems more likely that the individuals were deposited here deliberately by their kin groups.

Pettitt’s conclusion is a measured one:

After 70 ka BP [70,000 years ago – VJT] some Neanderthal groups buried infants, or parts of them, in pits, infants and adults in shallow grave cuttings and indulged in primary corpse modification and subsequent burial. It may have been on occasion too that certain enclosed sites served as mortuary centres, and that their function as such was perpetuated in the memory of Neanderthal groups either through physical grave markers or social tradition. In all it would seem that at least in some Neanderthal groups the dead body was explored and treated in socially meaningful ways.

After carefully weighing up the evidence, Pettitt is inclined towards the view that “at least some transmission of mortuary tradition occurred among some Neanderthal groups, centred around a fixed point in the landscape which could be used, if not exclusively, to hide, process and bury the dead.” Even so, he adds that “it must be acknowledged that no convincing example of grave goods is known from Neanderthals.”

Recently, the discovery of a new skeleton at Shanidar, a middle-aged adult who has been dubbed Shanidar Z, as well as bones of other Neanderthal individuals beneath Z’s remains, lent further support to the claim that the individuals found at the site of Shanidar were buried there. The new research suggests that “some of these bodies were laid in a channel in the cave floor created by water, which had then been intentionally dug to make it deeper,” according to study senior author Graeme Barker, director of the Shanidar Cave project and professor in the Department of Archaeology at the University of Cambridge. “There is strong early evidence that Shanidar Z was deliberately buried.”

While it is now widely accepted that the Neanderthals intentionally buried their dead and observed certain post-mortem traditions, many experts continue to argue that these burials lacked the ritual and symbolic behavior that characterized Homo sapiens burials in the Upper Paleolithic (e.g. Tattersall, 1995). We cannot, therefore, conclude that the Neanderthals practiced religion; such an inference goes beyond the currently available evidence.

Ritual human burials by Homo sapiens

Partial view of the Qafzeh 11 burial showing the deposit of the red deer antlers in close contact with the child skeleton (cast). Image courtesy of PLoS One.

The oldest ritual burial of modern humans (Homo sapiens) occurred 92,000 years ago at Qafzeh in Israel, where the remains of as many as 15 individuals (including a mother and her child) were found in a cave, along with 71 pieces of red ochre and ochre-stained stone tools lying near the bones. “We found 71 pieces of ochre and established a clear link between the red ochre and the burial process, it seems to have been used as part of a ritual,” Dr Erella Hovers of the Hebrew University of Jerusalem told BBC News Online (Whitehouse, 2003). Red, black and yellow ochre-painted seashells were found around the cave. Curiously, the practice of making red pigment seems to have been lost later on, for it does not reappear until 13,000 years ago.

The skeleton of a ten-year-old boy was also found, with his arms folded alongside his body and his hands placed on either side of his neck. Deer antlers were laid on his hands, probably constituting one of the offerings put in the grave. The boy’s skull bears signs of head trauma. Commenting on the discovery in an article in PLoS One (2014; 9(7): e102822), authors Hélène Coqueugniot et al. conclude:

At Qafzeh several other burials occur [59]–[62], but Qafzeh 11 represents a unique case of differential treatment with convincing evidence for ritual behavior. We interpret the Qafzeh 11 burial as resulting from a ritual practice applied to a young individual who experienced a severe cranial trauma most probably followed by significant neurological and psychological disorders, including troubles in social communication. These biological and archaeological evidences reflect an elaborate social behavior among the Qafzeh Middle Palaeolithic people.

The evidence of ritual behavior, including the careful placement of the deer antlers (probably a grave offering) on the skeleton of the young boy, could fairly be described as proto-religious. If one were looking for the earliest evidence of religion in the true sense of the word, one might begin here. Nevertheless, it is not clear that the people who interred the boy believed in an afterlife. Maybe they did, but we cannot be sure.

By 35,000 years ago, elaborate burials were becoming common in Europe. Burial of the dead with grave goods strongly suggests that a religious significance was attached to their deaths.

Man in an Upper Paleolithic burial in Sunghir, Russia. The site is approximately 34,000 years old. Image courtesy of José-Manuel Benito Álvarez and Wikipedia.

Perhaps the most famous ritual burials from the Paleolithic era come from the archaeological site of Sunghir, in Russia, where an older man and two adolescent children were buried along with elaborate grave goods that included ivory-beaded jewelry, clothing, and spears. More than 13,000 beads were found (which would have taken an estimated 10,000 hours to produce). Red ochre, an important ritual material associated with burials at this time, covered the burials, which date to around 34,000 years ago. Laura Geggel describes the burials in a Live Science article (This Ancient Society Buried Disabled Children Like Kings, February 13, 2018). Ten men and women are buried at the site, but the burials of two boys are the most remarkable, for the grave goods they contain:

About 34,000 years ago, a group of hunters and gatherers buried their dead — including two boys with physical conditions — using the utmost care. However, these dead were buried in fairly different ways, a new study finds.

The roughly 10- and 12-year-old boys were buried head to head in a long, slender grave filled with riches, including more than 10,000 mammoth ivory beads, more than 20 armbands, about 300 pierced fox teeth, 16 ivory mammoth spears, carved artwork, deer antlers and two human fibulas (calf bones) laid across the boys’ chests, the researchers said.

Surprisingly, the skeleton of an older but physically robust man of around 40 was buried with far fewer treasures: “about 3,000 mammoth ivory beads, 12 pierced fox canines, 25 mammoth ivory arm bands and a stone pendant,” according to the article. The difference in grave goods points to a degree of social complexity in the community that lived at Sunghir: “people were treated differently in death, and probably in life,” according to Erik Trinkaus, one of the lead researchers on the study.
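
As a rough sanity check on the labour these figures imply, here is a trivial back-of-the-envelope calculation. It uses only the numbers quoted above (roughly 13,000 beads and an estimated 10,000 hours of production time); the eight-hour working day is an arbitrary modern yardstick, introduced purely for scale.

```python
# Back-of-the-envelope check on the Sunghir bead figures quoted above.
# The bead and hour totals are the commonly cited estimates; the 8-hour
# "working day" is an arbitrary modern assumption used only for scale.

beads_total = 13_000   # beads recovered from the Sunghir burials
hours_total = 10_000   # estimated total production time

minutes_per_bead = hours_total * 60 / beads_total
working_days = hours_total / 8

print(f"~{minutes_per_bead:.0f} minutes per bead")      # roughly 46 minutes
print(f"~{working_days:.0f} eight-hour working days")   # roughly 1,250 days
```

On any reasonable assumptions, then, the Sunghir grave goods represent years of working life, which is a striking investment in the dead, whatever its precise motivation.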

(d) The world’s oldest rock-solid evidence for religious worship

However, the earliest evidence for the worship of gods and goddesses is far more recent.

The ruins of Göbekli Tepe, the world’s oldest temple, built before 9,000 B.C. Image courtesy of Teomancimit and Wikipedia.

The earliest clear evidence for religious worship comes from the site of Göbekli Tepe, in modern-day Turkey, where archaeologists have uncovered the world’s oldest known temple, dating from prior to 9,000 B.C. Consisting of at least 20 circular enclosures, each surrounded by T-shaped pillars, the temple may have been used for the worship of the dog star, Sirius. Alternatively, it may have been the center of a local cult of the dead. What is particularly interesting about this temple is that it predates the dawn of agriculture in the region, showing that, in this region at least, religion came before agriculture. Since the 11,000-year-old temple is much more recent than the first human burials, one might conclude that worship of gods and goddesses arose relatively recently in human history. On the other hand, the worship of gods, spirits, and supernatural forces is found in all human societies; consequently, it must have emerged far earlier than 11,000 years ago, even if we are unable to say when. The origin of religion appears to be lost in the dim mists of antiquity.

Once again, the evidence suggests that the mental capacities underlying religious behavior did not emerge overnight in human beings, but in a series of stages: mortuary rituals, burial with grave goods, statues of goddesses and finally, temples. To sum up: all we can say is that “Religious Adam” gradually emerged between 92,000 and 11,000 years ago, and that to date, religious behavior has only been confirmed in Homo sapiens, although it may have occurred in Neanderthal man as well.

Summarizing the evidence

We’ve looked at ten possible cut-off points for the appearance of the first human being. So which cut-off point is the right one? There’s no obvious way to resolve this question. And to make matters worse, none of these points correspond to instantaneous changes: as far as we can tell, they all took place over tens of thousands of years. Even if you were to pick one of the ten Adams I’ve listed above as the first true human, the point is that none of these Adams appeared overnight, as Christianity insists the first true humans did.

To sum up: whether we look at the evidence from prehistoric human brains, or the tools and implements made by prehistoric people, or the origin of language, or the appearance of religion and morality, one conclusion seems inescapable: there was no magic moment at which humans first appeared. And if that’s right, the Christian doctrine of the Fall of our first parents (Adam and Eve) goes out the window, because we didn’t have a set of first parents, or even a first human community of (say) 10,000 people, from whom we are all descended. And without the Fall, what becomes of the doctrine of the Redemption?

================================================================

C. Another way out for Christian apologists: Redefining the image of God?

RETURN TO MAIN MENU

At this point, some Christians might object that the argument I have been making rests on the assumption that our being made in the image of God consists in our having certain unique capacities that distinguish us from the beasts. That’s known in theological circles as the substantive or structural view of the image of God, and it’s a view that has been dominant during most of the history of Christian theology. In recent years, however, Christians have proposed alternative views. One is the relational view, advocated by Emil Brunner and Karl Barth, which grounds the image of God in our experience of being in an active relationship with God. Another is the vocational or functional view (espoused in recent times by Richard Middleton), which has become the dominant view among Biblical scholars since it was proposed in the early twentieth century, and which identifies the image of God with our special calling to rule over God’s creation as God’s vice-regents, just as ancient Middle Eastern kings were often designated as the image of a particular god (e.g. Marduk). Do these views lessen the tension between the findings of science and the clear teaching of Jews and Christians down the ages?

In “The Creation of Adam,” Michelangelo provides a great example of the substantive view of the image of God, through the mirroring of the human and the divine. Image courtesy of Wikipedia.

I should point out, first of all, that there are philosophical and theological problems associated with all three views, which are discussed in Millard Erickson’s book, Introducing Christian Doctrine (Grand Rapids, Baker Academic, 1992). On the substantive view, a person with greater spiritual and/or intellectual capacities would be a better image of God than one with minimal capacities, which means that we’re not all equally human: some of us (for instance, Isaac Newton, who possessed a gigantic intellect as well as being a very profound Christian thinker) are more human than others, and all of us are more human than our prehistoric forebears, who may have lacked some of these capacities. For example, 300,000 years ago, early Homo sapiens was a pretty sophisticated toolmaker who appears to have possessed a sense of morality and cared for the sick, but he probably lacked the capacity for art or religion, as we know them. On the substantive account, he would be less human than we are today. Or he might not be human at all.

The relational account has its problems, too, as it would imply that those who don’t have a relationship with God are not made in the image of God. As Erickson points out, this goes against the idea of universality, which states that everyone is made in the image of God. Consider, for instance, the Biblical story of Yahweh revealing Himself to Moses in the burning bush. Would anyone say that prior to such a revelation, Moses was not made in God’s image? And what about atheists, who reject the very idea of God? Surely, too, the human capacity to have a relationship with God is special in and of itself, regardless of whether that capacity is exercised or not. And if it is our capacity to have a relationship with God that makes us human, then we are back with the substantive view, which defines our humanity in terms of our abilities.

What about the functional view? Erickson remarks that while the dominion we exercise over creation is important, in the Genesis account, humans are created in the image of God before they are given dominion over the fish of the sea, the birds of the air and the beasts of the earth. So the functional view puts the cart before the horse. One might also ask: why should humans be given dominion over the beasts, if they don’t possess any special capacities that differentiate them from the beasts? Or as the Christian philosopher William Lane Craig expresses it, functions have to be grounded in ontology. Additionally, Genesis 1 clearly distinguishes between man’s being created in God’s image and man’s being given dominion over the beasts: a second blessing is bestowed on man, after he is created, so that he can exercise dominion over creatures. Finally, the analogy between humans and ancient Middle Eastern kings is a very poor one: the latter were literally believed to be incarnations of the gods whom they represented – a view which, as William Lane Craig has pointed out, is completely at odds with the Judaeo-Christian understanding of human beings as creatures, not embodiments of the divine. Judaism categorically rejects the notion that human beings are embodiments of God, their Creator.

In the end, Erickson endorses the substantive view – as he puts it, “The image … refers to something we are rather than something we have or do” (ibid., 2nd edition, 2001, p. 176) – and he locates this image in our capacity for fellowship with God. But as we have seen from our study of “Religious Adam,” this ability did not emerge overnight.

In any case, neither the relational view nor the functional view of God’s image satisfactorily resolves the problem I’ve been discussing – namely, that according to science, there was no magic moment at which we became human. Proponents of the relational view might argue that in principle, science is incapable of detecting the moment at which God first spoke to one of our prehistoric forebears; hence, there is no conflict. I reply: having a conversation with God presupposes the ability to understand what God wants of us (which means having a language), as well as the ability to recognize that there is a God (which means having the concept of an Agent Who cannot be perceived by the senses and Who transcends the cosmos). And scientists tell us that neither of these abilities emerged overnight. Moreover, the relational view would imply that if there were two prehistoric hominins, both possessing identical abilities and both able to hear God’s voice, and if God decided to reveal Himself to one but ignore the other, the latter would never acquire human status. Such a view seems arbitrary in the extreme.

The functional view fares no better. There was certainly no magic moment at which our ancestors acquired dominion over the beasts: it was a gradual affair. By about 46,000 years ago, humans had become so dominant that in some parts of the world, they were driving other large animals into extinction. But as we go further back in history, we find that it took some time for man to become the apex predator, starting with the emergence of big-game hunting over 700,000 years ago among populations of late Homo erectus: at any rate, it certainly didn’t happen overnight.

In short: attempts to redefine the “image of God” (Genesis 1:26) in terms of our relationship with God or our domination over nature both fail, because neither our relationship with God nor our domination over creation defines what it means to be human: they are both consequences of being human. What’s more, humans only possess these attributes because of underlying abilities which make them possible (e.g. the ability to form a concept of God or the ability to invent technologies that give us control over nature), so we are back with the substantive view, which, as we have seen, supports gradualism.

We have seen that attempts by Christian apologists to evade the force of the argument that our uniquely human capacities evolved gradually, by redefining the “image of God” in relational or functional terms, are unsuccessful. The conclusion argued for above still stands: there was no magic moment at which our hominin ancestors became human.

================================================================

D. Conclusion

RETURN TO MAIN MENU

It has been argued above that even if we take account of other views of what it means to be made in the image and likeness of God, we are no nearer to resolving the problem posed at the beginning: that of squaring the verdict of science that humans appeared gradually with the insistence of Christians that they literally appeared overnight. Anatomy, technology, culture and language all developed over thousands, or even hundreds of thousands of years. So did morality and religion, and so did our dominance over nature. And while God’s first revelation to humans may have been a sudden event, the capacity to understand that revelation was not. The verdict is clear: the Biblical story of Adam is a myth, and there was no first human being, after all. And if that’s true, then the Christian understanding of Jesus as the second Adam (1 Corinthians 15:45-49) makes no sense, either.

Human origins therefore pose a potentially fatal problem to the credibility of Christianity – a problem to which Christian apologists have paid insufficient attention. They ignore it at their peril.

Postscript: Was Adam an apatheist?

In a recent panel discussion at the Human Origins Workshop 2020, hosted by Dr. Anjeanette Roberts and attended by Dr. Steve Schaffner, Dr. Nathan Lents and Dr. Fazale Rana, the Christian apologist Dr. William Lane Craig cited a paper by Dr. Sally McBrearty and Dr. Alison Brooks, arguing that modern human behavior did not suddenly emerge 40,000 to 50,000 years ago, but actually goes back to the African Middle Stone Age, around 300,000 years ago. Dr. Craig urged Dr. Rana to acknowledge this fact. I was pleased to see that Dr. Craig is au fait with the literature on the subject, and I personally agree with his view that truly human beings have existed for hundreds of thousands of years, but I would also point out that as far as I have been able to ascertain, religion (as opposed to mere ritual behavior) goes back no more than 100,000 years. What that means is that if the first true humans emerged at least 300,000 years ago, then for at least two-thirds of human history, humans have been areligious – that is, devoid of any belief, disbelief or even interest in the supernatural. People like this are sometimes called apatheists.

There’s a beautiful passage in Chapter XIX of Huckleberry Finn, where Huck describes the joys of living on a raft and floating down the Mississippi, with Jim, and discusses whether the stars were made or “just happened”:

“We had the sky up there, all speckled with stars, and we used to lay on our backs and look up at them, and discuss about whether they was made or only just happened. Jim he allowed they was made, but I allowed they happened; I judged it would have took too long to make so many. Jim said the moon could a laid them; well, that looked kind of reasonable, so I didn’t say nothing against it, because I’ve seen a frog lay most as many, so of course it could be done. We used to watch the stars that fell, too, and see them streak down. Jim allowed they’d got spoiled and was hove out of the nest.”

It would appear that the earliest true humans were not capable of a conversation like that. Humans were behaviorally modern, and moral, long before they were religious.

150 thoughts on “An A-Z of Unanswered Objections to Christianity: H. Human Origins”

  1. Mung: That would be an incorrect answer.

    https://en.wikipedia.org/wiki/New_World_monkey

    Seems partially correct,

    “At the time the New World monkeys split off, the Isthmus of Panama had not yet formed, ocean currents and climate were quite different, and the Atlantic Ocean was less than the present 2,800 km (1,700 mi) width by about a third; possibly 1,000 km less, based on the current estimate of the Atlantic mid-ocean ridge formation processes spreading rate of 25 mm/year.”

    What is the design explanation?

  2. newton: That would be against the rules. And not very funny.

    Even so, objecting to being compared to the man invites the thought that I have no objection to being compared to the dog. Went right over poor Adapa’s head of course.

  3. newton: What is the design explanation?

    It probably depends on who you ask and which ID model they are working from. Frankly, I think if monkey’s can evolve once they can evolve twice.

    And if monkeys can find their way from Africa to the America’s, humans can find a way to get to Tasmania. Perhaps they took an indirect route. Consider Columbus trying to get to India and landing in America by mistake.

  4. Mung: It probably depends on who you ask and which ID model they are working from. Frankly, I think if monkey’s can evolve once they can evolve twice.

    Since it is not proposed that monkeys evolved ex nihilo, it would follow that we might find evidence of species monkeys evolved from in South America.

    Sounds like a concrete prediction.

    What are the main ID models?

    And if monkeys can find their way from Africa to the America’s, humans can find a way to get to Tasmania.

    Reading the article you linked, that is not debated. There is evidence that humans crossed during an Ice Age. The question was could humans during the time frame of the Biblical Adam traverse the Tasman Sea with the known sea going vessels that have been discovered.

    It is all about original sin. If you have a human population that does not descend from Adam, chaos ensues.

    Perhaps they took an indirect route. Consider Columbus trying to get to India and landing in America by mistake.

    Or God just did His thing, and they just woke up one morning with Original Sin. Makes about as much sense as inherited Original Sin.

  5. Jay313: On the analogy that the man’s (ha’adam‘s) first act was linguistic, I speculate that the first “speaker of words” (likely H. erectus) was the first member of the human family. I also freely admit that this is speculation, and the answer may remain a secret hidden in only God’s knowledge.

    The “moment of conception” for humanity may not have been gradual, but just like you and me, humanity wasn’t fully formed into the “people we are today” at conception, whether that’s defined as erectus or heidelbergensis. Brain, language, moral, and cultural development over many years were required in both cases.

    My only major reservation is that the evolution of language was itself (probably) a gradual process, with lots of different components emerging at different times. Bickerton, in his recent More Than Nature Needs, suggests that the first step was joint displaced reference: being able to convey information about something that is not perceptually present to either communicant. (Bickerton suggests that the need to scavenge carcasses over long distances would have been a selective pressure for joint displaced reference, so the hominids that can do this more easily have an ecological advantage over those who don’t.)

    But this could have emerged as a relatively stable “proto-language” for hundreds of thousands of years before anything more complicated (like recursive grammar or a topic/comment distinction) was needed.

    This is to say that I don’t think one could identify any extinct species of Homo as “ha’adam, as “the speaker of words,” without an excursion into the evolution of language itself as well as some explanation as to which features of language are theologically significant.

    That is, if one has an argument as to which features of language are most significant for upholding a ‘made in the image of God’/’not made in the image of God’, and one also has a reasonable conjecture about when those features emerged, then one has an argument for “the first human (in the Biblical sense)”.

  6. Kantian Naturalist,

    “This is to say that I don’t think one could identify any extinct species of Homo as “ha’adam, as “the speaker of words,” without an excursion into the evolution of language itself as well as some explanation as to which features of language are theologically significant. … That is, if one has an argument as to which features of language are most significant for upholding a ‘made in the image of God’/’not made in the image of God’, and one also has a reasonable conjecture about when those features emerged, then one has an argument for “the first human (in the Biblical sense)”.”

    Just curious KN, following up on your recent fair question to me about the DI’s meanings of ID theory. Given that you have shared here your Jewish background, the only one afaik who has done so here at TSZ, and since ha’adam is treated in Jewish religious history as real & historical, cf. Jewish (Hebrew) Calendar, can you please confirm if you accept ha’adam likewise as real & historical? I’m guessing “No” is your answer. https://en.wikipedia.org/wiki/Hebrew_calendar

    If I understand the social (ir)religious landscape properly, secular Judaism, the current worldview majority of N. American Jews, does not require, believe in or promote a real, historical Adam & Eve. But religious Judaism still does accept and believe in them. Is that right?

  7. Flint: He’s presented his tentative conclusions. There is no compelling new evidence.

    Which is bullshit.

    Flint: And once again, one must ask: what evidence would change YOUR beliefs? Can you suggest anything, even hypothetically?

    I asked first, so a reply was due before a question. But I don’t mind answering: experimental evidence of “evolution” like here: http://nonlin.org/evotest/
    Now will you return the favor?

  8. Jay313: Did I catch you on a bad day, or is this your normal mode of communication?

    Gregory can be a bit spiky. I don’t think he realises how counterproductive to dialogue it can be.

  9. Alan Fox,
    Since I’m recommending that this “unanswered” series be abandoned – for Torley’s own mental & spiritual health – or at least re-located to a more appropriate and edifying venue, yes, my aim in this thread is counter-productive to dialogue. It is realized here and intentional.

    The dialogue with Torley over at PS is much more interesting and valuable than here among skeptics.

    E.g. this one:

    “So, I realize that Catholic hylomorphism might be a problem here, but supposing substance dualism and God at some point in history gives the first human at a mostly arbitrary point in physical development, but still meeting the minimum requirements, endows him with a spirit. Might that not be the first man and might that not have essentially nothing to do with your wall of text?”

    For all the skepticism of Vincent, he cannot fathom this possibility & continues with his reader unfriendly walls of self-doubting text.

  10. Hi everyone,

    The comment below is one I made over at Peaceful Science, but I think it’s equally relevant here.

    I presented good scientific evidence in my OP that human language did not emerge overnight, but over many thousands of years. Depending on how you define it, language either emerged about 600,000 years ago (with Heidelberg man) or between 200,000 and 70,000 years ago (in Homo sapiens). But it certainly wasn’t an overnight affair.

    I also presented strong evidence for the occurrence of symbolic behavior in Neanderthals, some 130,000 years ago. Unless you believe that God created two races of intelligent beings, that means the common ancestor of both (say, Heidelberg man) must have been truly human, too – but if we go that far back (at least 600,000 years ago), we can find no trace of an overnight breakthrough in human abilities.

    I presented further evidence that human morality – in particular, altruism and self-sacrifice for the good of the group – go back at least 500,000 years, with evidence for long-term care of the sick going back 1,500,000 years and big-game hunting (which required a high degree of public co-operation and a willingness to put one’s life at risk for the good of the group) going back 700,000 years. In my book, anyone who’s willing to lay down their life for others is displaying truly human moral behavior. Once again, we can identify no sharp beginnings here.

    Finally, I presented evidence that religious worship did not appear suddenly in the archaeological record, but that it probably arose less than 100,000 years ago.

    Taken together, the scientific evidence suggests that whenever true humans emerged, it didn’t happen instantaneously. Please note that I’m not saying that true humans actually appeared gradually; I’m just saying it sure looks that way, judging from the evidence.

    I’m aware that some readers want to shoehorn human beginnings into the last 100,000 years. But that ignores the evidence for symbolic behavior in Neanderthals (not to mention the evidence for modern human behavior), as well as the evidence for language (in the broad sense) in Neanderthals (and presumably Heidelberg man), and finally, the evidence for morality in Heidelberg man and late Homo erectus – all of which point to a much earlier beginning, over half a million years ago. And if I had to choose a date for “Adam,” I’d nominate that time (but see below).

    I also don’t think that an individual (such as Heidelberg man or late Homo erectus) that’s prepared to engage in long-term care of sick adults and children, and even lay down their life for others, deserves to be mentally categorized as child-like. Those sound like very adult-like capabilities to me.

    But as I point out in my postscript, humans have only been religious for less than 100,000 years. For most of human existence, humans have been apatheists. “An apatheist is someone who is not interested in accepting or rejecting any claims that gods exist or do not exist. The existence of God(s) is not rejected, but may be designated irrelevant.” (Wikipedia) See also here. That in itself is an alarming conclusion for Christians.

    Could there have been a brief flowering of belief in God some 600,000 years ago, when God first revealed Himself to humans (Heidelberg man), only to be rejected (in what we now call the Fall) and subsequently forgotten for 500,000 years, only to be rediscovered some 100,000 years ago? I suppose it’s possible. But any honest-minded believer must admit that this is pure speculation that runs counter to the evidence available.

    On the other hand, if we insist on confining human history to the past 100,000 years, then we have to explain the puzzling spectacle of a species (Neanderthal man) that was like our own in so many ways, including even symbolic behavior, not to mention a pretty complex language (even if it wasn’t recursive), a capacity for abstract thought (modern human behavior), and a sense of morality that was much like ours, except that it didn’t extend to objects, such as symbolic art – which hardly matters in my book. To categorize Neanderthal man as less than truly human seems to me to be arguing in the teeth of the evidence.

    So there’s the puzzle that I’ve laid out before you. And for a Christian, it is a genuine puzzle – not an insoluble one, to be sure, but one whose solution defies our best intellectual efforts. And it’s a puzzle that would certainly make me hesitate to embrace Christianity, if I were an outsider. I think it’s fair to categorize it as an unanswered objection to Christianity, because no-one has come up with a convincing answer yet.

    That was what I wanted to say. Cheers.

  11. Gregory,

    Since I’m recommending that this “unanswered” series be abandoned – for Torley’s own mental & spiritual health – or at least re-located to a more appropriate and edifying venue, yes, my aim in this thread is counter-productive to dialogue. It is realized here and intentional.

    So you’re trying to shut down a discussion. Thank you for your honesty.

    The dialogue with Torley over at PS is much more interesting and valuable than here among skeptics.

    There are sincere and intelligent interlocutors in both forums. There’s no need to choose between them.

    “So, I realize that Catholic hylomorphism might be a problem here, but supposing substance dualism and God at some point in history gives the first human at a mostly arbitrary point in physical development, but still meeting the minimum requirements, endows him with a spirit. Might that not be the first man and might that not have essentially nothing to do with your wall of text?” (Gregory’s quote from a post on Peaceful Science)

    The objection I have to this scenario is that it supposes that our ancestors’ becoming human could make no visible difference to their behavior. And given the nature of what a spirit is and what it can do, that’s a supposition I find frankly incredible. A spirit is something which is capable of abstract thought, language, an understanding of right and wrong, a knowledge of God, and free choice. The emergence of a being with a spirit would have undoubtedly been the most momentous event in the history of life on Earth. And you think it happened without a trace. Really?

  12. Nonlin.org: Which is bullshit.

    I asked first, so a reply was due before a question. But I don’t mind answering: experimental evidence of “evolution” like here: http://nonlin.org/evotest/
    Now will you return the favor?

    Sure. You linked to bullshit. And this is why nobody can engage in any kind of discussion with you. You insult, you preach, you lie, and you NEVER reference actual science. Your blog rationalizes a foregone conclusion. So you run along, pat yourself on the back for excreting propaganda at the atheistic evolutionists, and we’ll call it a day.

  13. vjtorley,

    “the nature of what a spirit is” = a sure recipe for a naturalistic answer to spirituality.

    “There’s no need to choose between them.”

    Yet you chose between them, choosing to post your series here. And then added a shortened version as requested there. So obviously there is reason to discern & distinguish them, since you did. Trying to make a bed with Swamidass, however, as I’m sure Jay would agree, is likewise far from the best option available. But it’s still far better than here for uplifting conversation.

    My recommendation is to abandon this thread and series as nothing will come of it other than wasted time. I’m sure you won’t agree. If not, then at least try to move it entirely to PS & forget about posting here. Skeptical Catholic apologetics among atheists and agnostics is simply a futile project.

    Many times over the years, Vincent, have I suggested that you find better venues to publish in, instead of spending all your time with atheists & agnostics. This is not an insult, but encouragement. It is simply a wish for wiser use of your talents and skills. Bye.

  14. newton: What are the main ID models?

    Front Loading
    Special Creation
    Guided Evolution
    Interventionist

    Perhaps others

  15. Mung: Front Loading
    Special Creation
    Guided Evolution
    Interventionist

    None of those “models” can be said to be scientific however:
    https://en.wikipedia.org/wiki/Scientific_modelling

    That page suggests that models can be evaluated like so:

    Ability to explain past observations
    Ability to predict future observations
    Cost of use, especially in combination with other models
    Refutability, enabling estimation of the degree of confidence in the model
    Simplicity, or even aesthetic appeal

    It’s clear that none of the models listed by you can be evaluated.

  16. Mung: Even so, objecting to being compared to the man invites the thought that I have no objection to being compared to the dog. Went right over poor Adapa’s head of course.

    Actually he found his own interpretation.

  17. OMagain: It’s clear that none of the models listed by you can be evaluated.

    And yet people evaluate the Special Creation model and reject it. They must do so for reasons other than scientific.

  18. Mung: And yet people evaluate the Special Creation model and reject it. They must do so for reasons other than scientific.

    Who are people?

  19. Mung: And yet people evaluate the Special Creation model and reject it. They must do so for reasons other than scientific.

    Is the Special Creation model scientific? If not, it does not rise to the level of rejection on a scientific basis. It does not need to be so rejected.

    Those are your “other reasons”. It is rejected because it is not a model that can be assessed scientifically.

    Can you link to a scientific evaluation of the special creation model?

  20. vjtorley,

    To be honest, I’m sympathetic to Gregory’s conclusion that commenting here is going to be a waste of time for you — though not for the reasons he gives.

    It seems to me that your motive for commenting here is that you’re hoping for productive intellectual engagement. By engaging with self-described “skeptics” (so the story goes), one might improve Christian apologetics; one arrives at a better argument, after all, if one has taken the objections of one’s opponents seriously.

    The reason why I think this will not work for you here at TSZ is that none of us really know any theology or care to learn. We’re not the interlocutors you need to improve apologetics. I think it’s more apt to say that when it comes to theology and to apologetics, we just don’t care. We’re mostly apatheists and ignostics, together with one Anthroposophist and a few YECs.

    I’m not saying you shouldn’t post here. In fact I enjoy your posts and responding to them. I’m only saying that if your motivation for posting here is to improve your apologetics by exposing them to serious intellectual engagement, you will be disappointed.

    That said, here are a few places where I think your view could be refined further.

    vjtorley: The objection I have to this scenario is that it supposes that our ancestors’ becoming human could make no visible difference to their behavior. And given the nature of what a spirit is and what it can do, that’s a supposition I find frankly incredible. A spirit is something which is capable of abstract thought, language, an understanding of right and wrong, a knowledge of God, and free choice. The emergence of a being with a spirit would have undoubtedly been the most momentous event in the history of life on Earth. And you think it happened without a trace. Really?

    The emergence of a being with spirit on Earth would be the most momentous event in the history of the Universe, assuming it had not already happened on other planets elsewhere.

    Be that as it may, a few responses that occur to me:

    1. I think one should distinguish the metaphysical question “is spirit causally efficacious?” from the epistemological question, “how likely is it that evidence of spirit from hundreds of thousands of years ago will be discernible to us today?”, and I still haven’t seen you engage much with the epistemological question. There still seems to be plenty of reason to be skeptical that conclusive evidence of the emergence of spirit would be detectable in the archeological and paleontological traces that have survived to the present day.

    2. Spirit is here glossed as “A spirit is something which is capable of abstract thought, language, an understanding of right and wrong, a knowledge of God, and free choice”. If these are distinct capacities that evolved at different times in hominid evolution and human history, then the word “spirit” does not track an empirically detectable natural kind. But you seem to be assuming that it does, or that (at any rate) a certain kind of Christian apologetics depends on that assumption.

    3. At times you use Aristotelian vocabulary and at times you sound more like a Cartesian. This is confusing. You sound like an Aristotelian when you put the emphasis on human persons as “rational animals”. But of course Aristotle was perfectly clear that non-rational animals have souls, because the soul is the form of a living thing, and the form is just whatever unchanging pattern or organization maintains the persistence of the thing in being. (In fact it follows from Aristotle’s logic that all human beings actually and literally have the same soul because we are rational animals, the same type or kind of thing.)

    But when you define “spirit” as “something which is capable of abstract thought, language, an understanding of right and wrong, a knowledge of God, and free choice” it seems that you are using “spirit” in the way that Descartes used anima: as what only minds have and that machines lack.

    Finally, the big one:

    vjtorley: Please note that I’m not saying that true humans actually appeared gradually; I’m just saying it sure looks that way, judging from the evidence.

    I find this really quite baffling, so much so that I am puzzled as to why you felt the need to say this at all, and as to what it could even mean.

    Consider it this way: apart from the evidence as to how things look, do we have any basis at all for making judgments about how things really are?

    This goes back to why TSZ might not be an ideal place for you: there is no one else here who would have thought it necessary to issue such a caveat. We are, more or less, naive empiricists and dogmatic naturalists. Some of us are instrumentalists about scientific theories, and some of us (like myself) hold out the hope that scientific progress is an asymptotic approximation to things-in-themselves.

    I can see why a theologian or theological metaphysician would need to hold open a conceptual space between “how things really are” and “how things appear to be to us”, but no one else here would (I think — maybe I’m wrong about this) think it necessary or important to insist upon that distinction.

  21. Gregory: You made a statement: “Torley’s asking the hardest questions he can and throwing down the gauntlet for Christian apologists to respond.”

    Did you not really mean this or did you? Or was it not simply a misinterpretation because you’re new here and made a wrong assumption?

    I asked a question about the OP: “Why post something ‘for Christian apologists’ where there are none?”

    That’s a pretty logical question sequence. Maybe you just didn’t know that unlike BioLogos & Peaceful Science, “Christian apologists” don’t come here, or when they do, don’t stay long?

    I don’t understand why you care or why you’re making a thing out of this. You questioned Torley’s motives for posting here, and I simply restated what he’d already said. If you don’t believe that was his motivation, there’s a real simple solution: Ask Torley! He’s right here! It really doesn’t matter to me if this is a lousy venue for his stuff. Maybe he’s just testing the idea here and has other plans for it down the road. Ask him, if you want to know.

    Kantian Naturalist: My only major reservation is that the evolution of language was itself (probably) a gradual process, with lots of different components emerging at different times. Bickerton, in his recent More Than Nature Needs, suggests that the first step was joint displaced reference: being able to convey information about something that is not perceptually present to either communicant. … But this could have emerged as a relatively stable “proto-language” for hundreds of thousands of years before anything more complicated (like recursive grammar or a topic/comment distinction) was needed.

    Sorry I wasn’t clear. My reference to erectus or heidelbergensis as the first “speaker of words” meant the first species literally capable of speech. I agree language evolved gradually. Bickerton’s speculation makes sense. Since erectus had an “intermediate” hyoid bone, I suspect the first words were hummed as much as “spoken.” Steven Mithen proposes that idea in The Singing Neanderthals. Here’s an interesting 20-min. video where he discusses it along with the difference between the Neanderthal and the sapiens mind: Is the Human Mind Unique?

    I also agree that proto-language required hundreds of thousands of years to become fully modern human language. Most likely, language evolution followed a path similar to what we see in childhood language acquisition: 1) One-word stage, 2) Two-word stage, 3) Hierarchical structure but lacking subordinate clauses and other forms of embedding, 4) Flexibility/Recursivity (fully modern grammar). Most human children acquire the full grammar of their native language by age 5. Humanity required approximately 900,000 years — from 1 Mya to 100 kya.

    Coming back around to ha’adam/”the man”, I take that as a literary archetype (not a literal “first man”) representing the whole of humanity and every individual. Thus, when I identify erectus as the first speaker of words and the first member of the “human family,” I don’t mean that humanity was “fully human” at that point. Torley is right in that regard. Pick your category: brain evolution, language development, symbolic reference, innovation. Evolutionary roots can be identified, but everything was rudimentary and in its infancy. A long road still lay ahead. That’s why WLC’s identification of a literal “Adam and Eve” as heidelbergensis borders on the absurd. (There’s also the real possibility that heidelbergensis is the direct ancestor of Neanderthal and not sapiens, but that’s a different conversation.)

    Kantian Naturalist:
    That is, if one has an argument as to which features of language are most significant for upholding a ‘made in the image of God’/’not made in the image of God’, and one also has a reasonable conjecture about when those features emerged, then one has an argument for “the first human (in the Biblical sense)”.

    Good observation. We (meaning Christians) say an infant is “born in the image of God.” Every child is human in the biblical (and every other) sense, and every child is “made” in the image of God. But is an infant capable of any of the capacities that Torley mentions here:

    vjtorley: A spirit is something which is capable of abstract thought, language, an understanding of right and wrong, a knowledge of God, and free choice. The emergence of a being with a spirit would have undoubtedly been the most momentous event in the history of life on Earth. And you think it happened without a trace. Really?

    A newborn — surely “made in the image of God” and possessing a soul/spirit — isn’t capable of abstract thought, hasn’t learned to speak, has no knowledge of right and wrong, no knowledge of God, and is totally dependent on its parents. Even if one takes the imago Dei and the soul in the strictest Catholic sense as ultimately resolving to “structural” aspects of the human being, they still require “normal” development and growth to achieve their potential. Likewise, I see no reason why the same couldn’t be true of humanity writ large. In its infancy (erectus), humanity could be spoken of as “created in the image of God” and endowed with a soul/spirit, but abstraction, language, mature morality (capable of abstract categories such as “good” and “evil”), knowledge of God, and free will (choice without compulsion) all required millennia of development before they achieved their full human potential.

    If Jesus had to “grow in wisdom and stature” before he took up his earthly calling, why shouldn’t the same be true of all of us, including ha’adam?

  22. Kantian Naturalist: I’m not saying you shouldn’t post here. In fact I enjoy your posts and responding to them. I’m only saying that if your motivation for posting here is to improve your apologetics by exposing them to serious intellectual engagement, you will be disappointed.

    Give me a chance before you write me off. haha

    vjtorley: I also don’t think that an individual (such as Heidelberg man or late Homo erectus) that’s prepared to engage in long-term care of sick adults and children, and even lay down their life for others, deserves to be mentally categorized as child-like. Those sound like very adult-like capabilities to me.

    Here’s a 6-yr-old who jumped in front of his 4-yr-old sister to save her from a dog attack. Was he morally and mentally an adult because he was capable of such action? Chris Evans praises boy who saved sister from dog attack

    vjtorley: But as I point out in my postscript, humans have only been religious for less than 100,000 years.

    I’m still catching up and still intend to return to your OP. (The “10 Adams” was a great concept, by the way. One of those “wish I’d thought of that” moments. Lol) Meantime, here’s another pertinent video. I don’t endorse her every conclusion, but she connects a lot of threads that should pique your interest regarding working memory and when humanity acquired a “religious” sense. (Gregory may have to skip this part.) The Roots of Religion: Genevieve Von Petzinger. Since the time this video was made (2012), she has finished her research and published it in The First Signs: Unlocking the Mysteries of the World’s Oldest Symbols.

  23. Hi Jay313,

    A newborn — surely “made in the image of God” and possessing a soul/spirit — isn’t capable of abstract thought, hasn’t learned to speak, has no knowledge of right and wrong, no knowledge of God, and is totally dependent on its parents. Even if one takes the imago Dei and the soul in the strictest Catholic sense as ultimately resolving to “structural” aspects of the human being, they still require “normal” development and growth to achieve their potential. In its infancy (erectus), humanity could be spoken of as “created in the image of God” and endowed with a soul/spirit, but abstraction, language, mature morality (capable of abstract categories such as “good” and “evil”), knowledge of God, and free will (choice without compulsion) all required millennia of development before they achieved their full human potential.

    If Jesus had to “grow in wisdom and stature” before he took up his earthly calling, why shouldn’t the same be true of all of us, including ha’adam?

    As I see it, the big difference between (i) the development of an infant into a rational, language-using, moral and God-conscious adult and (ii) the evolution of Homo erectus into modern human beings who possess reason, language, morality and religion is that in the former case, there is no loss of nature (the infant does not become a new kind of being, but is at all stages a human being) and no loss of individuality (the infant remains the same individual human being as he/she matures), whereas in the latter case, there is a loss of both: Homo erectus evolves into a different kind of being (Homo sapiens), and of course, it’s not Homo erectus but his distant descendant who acquires the traits in question. Your analogy would make sense if you believed in something like orthogenesis, with Homo erectus being somehow programmed or destined to evolve into Homo sapiens. Then (and only then) you might say that the former contained the seed of the latter. But on the modern understanding, evolution has no direction.

    Here’s a 6-yr-old who jumped in front of his 4-yr-old sister to save her from a dog attack. Was he morally and mentally an adult because he was capable of such action? Chris Evans praises boy who saved sister from dog attack

    Fair question. The six-year-old boy certainly showed outstanding bravery, but it was a spur-of-the-moment thing. That’s different from getting up every morning to hunt wild animals which you know full well might kill you, because they’re a lot bigger and stronger than you are, and highly aggressive when cornered and attacked, but choosing to fight all the same, because you have to feed your family. It’s the anticipation of death, day after day, and the choice to confront that risk, that make late Homo erectus and Heidelberg man more adult-like than child-like in their moral qualities, in my humble opinion.

    Meantime, here’s another pertinent video. I don’t endorse her every conclusion, but she connects a lot of threads that pique your interest regarding working memory and when humanity acquired a “religious” sense… The Roots of Religion: Genevieve Von Petzinger.

    Thanks very much for the link. It seems that Dr. Von Petzinger believes religion arose around 125,000 years ago, with the advent of symbolism, but that it didn’t enter its full flowering until about 40,000 to 50,000 years ago – about the time she thinks the last wave of modern humans left Africa. (The currently favored date for the last wave is 75,000 years ago, but let that pass.) I agree with her that symbolic behavior dates from around 125,000 years ago. But what she doesn’t point out (because it wasn’t known in 2012, when she gave her talk) is that around the same time, Neanderthals also started practicing symbolic behavior. So we have two species of human beings, last sharing a common ancestor at least 600,000 years ago, displaying symbolic behavior. What do you make of that? Cheers.

  24. vjtorley: Thanks very much for the link. It seems that Dr. Von Petzinger believes religion arose around 125,000 years ago, with the advent of symbolism, but that it didn’t enter its full flowering until about 40,000 to 50,000 years ago – about the time she thinks the last wave of modern humans left Africa. (The currently favored date for the last wave is 75,000 years ago, but let that pass.) I agree with her that symbolic behavior dates from around 125,000 years ago. But what she doesn’t point out (because it wasn’t known in 2012, when she gave her talk) is that around the same time, Neanderthals also started practicing symbolic behavior. So we have two species of human beings, last sharing a common ancestor at least 600,000 years ago, displaying symbolic behavior. What do you make of that? Cheers.

    You seem to be following the Dalai Lama’s advice, Vincent.

    If scientific analysis were conclusively to demonstrate certain claims in Buddhism to be false, then we must accept the findings of science and abandon those claims.

    ― Dalai Lama XIV, The Universe in a Single Atom: The Convergence of Science and Spirituality

  25. Hi Kantian Naturalist,

    I’m not saying you shouldn’t post here. In fact I enjoy your posts and responding to them. I’m only saying that if your motivation for posting here is to improve your apologetics by exposing them to serious intellectual engagement, you will be disappointed.

    I understand what you’re saying, and I enjoy interacting with the committed Christians (some of whom are apologists) over at Peaceful Science as well. But The Skeptical Zone is willing to let me post, without moderating what I write, whereas over at Peaceful Science, even when I’m posting a short comment in reply to someone, I often find that it’s placed in moderation. To me, that suggests they don’t quite trust me. Here, I’m trusted. May I also point out that quite a few Christians – many of them very sincere ones – have already commented on this thread.

    There still seems to be plenty of reason to be skeptical that conclusive evidence of the emergence of spirit would be detectable in the archeological and paleontological traces that have survived to the present day.

    I wasn’t looking for conclusive evidence. Tentative evidence would be fine with me. I’ll take what I can get.

    … [W]hen you define “spirit” as “something which is capable of abstract thought, language, an understanding of right and wrong, a knowledge of God, and free choice” it seems that you are using “spirit” in the way that Descartes used anima: as what only minds have and that machines lack.

    As far as I can tell, the Aristotelian belief that the rational soul is the form of the human body is quite compatible with the belief (shared by both Aquinas and Descartes) that the human soul is a spirit, and that the human power to reason and/or choose is an immaterial capacity. Aristotle himself insisted that understanding was not a bodily act, for reasons that I’m sure you’re familiar with. (Of course, Aristotle did not believe in personal immortality, as Aquinas and Descartes did.)

    I can see why a theologian or theological metaphysician would need to hold open a conceptual space between “how things really are” and “how things appear to be to us”, but no one else here would (I think — maybe I’m wrong about this) think it necessary or important to insist upon that distinction.

    I was simply acknowledging here that we only have a limited amount of evidence, which we are often capable of misinterpreting – hence, how things appear to us may not be how they really happened. Science evolves. I was not intending to propose an invisible beginning of the human species, as I hold that human abilities are sufficiently remarkable that we should expect to find traces of them in the archaeological record. I hope that clears things up. Cheers, and thanks for your comment.

  26. I guess if a snake can talk, a man can talk, without regard to anything whatsoever to do with evolution.

    There’s nothing in the Bible that I am aware of that indicates that God ever took away the ability for snakes to talk.

    Now if Vincent really wants to know when the first Adam appeared on the scene, I suggest he go look for scientific evidence of the first talking snakes.

  27. Hi Vincent. Sorry I’m slow. Real life got in the way. I need to quit saying “tomorrow” when I don’t know what tomorrow will bring. haha. I wanted to reply to some of your OP before replying to your most recent comments.

    Finally, the distinctive globular brain shape that characterizes modern Homo sapiens is now known to have evolved gradually within the Homo sapiens lineage, according to a recent article by Simon Neubauer et al., titled, “The evolution of modern human brain shape” (Science Advances, 24 Jan 2018: Vol. 4, no. 1, eaao5961). This does not square well with the hypothesis of a “magic moment” at which this lineage became truly human.

    Globularity also wasn’t a magic moment, but it was momentous. I don’t think it crossed your radar as much as mine because you were focused on endocranial volume and arterial blood flow to the brain, and I was focused on language and morality.

    Globularity may have represented an incremental decrease in overall brain size, if anything. Most of the figures that I’ve seen have Neanderthal slightly ahead of sapiens in volume, but my memory could be off. The reason is that they have elongated brain cases, along with every other hominin prior to us. (See upload for comparison.) In fact, even sapiens are born with an elongated brain case, but the rapid growth of the frontal, parietal, and especially cerebellar areas reshapes the skull into our familiar globe by age 1. (Continuing my “ontogeny” theme.) Here’s the reference: A uniquely modern human pattern of endocranial development. Insights from a new cranial reconstruction of the Neandertal newborn from Mezmaiskaya

    You hit the right note with the Neubauer article on The evolution of modern human brain shape. The sapiens from Jebel Irhoud has our face but the elongated skull, and by ~130,000-100,000 years ago our skulls were globular, though not quite as “finished” as they were by 30,000 years ago. As you rightly point out, the change was gradual and not overnight. But I’d like to highlight a couple of other points Neubauer makes:

    The cerebellum is associated not only with motor-related functions like the coordination of movements and balance but also with spatial processing, working memory, language, social cognition, and affective processing (52–55). The developmental and evolutionary shape changes of the posterior cranial fossa are linked to the rapid cerebellar expansion during perinatal brain growth. Clinical neuroimaging data from modern humans show that in the first 3 months of life, the cerebellum grows at the highest rate of all brain parts (more than doubling in 90 days) (56). The cerebellum is frequently implied in childhood onset disorders and known to be vulnerable to environmental influences during early childhood (57). It is intriguing that the evolutionary brain globularization in H. sapiens parallels the emergence of behavioral modernity documented by the archaeological record.

    This article spells out in more detail the implications of a globular brain: Reconstructing the Neanderthal brain using computational anatomy. From the article (my emphasis):

    There is now strong evidence that the cerebellar hemispheres are important for both motor-related function and higher cognition including language, working memory, social abilities and even thought. Further, whole cerebellar size is correlated with cognitive abilities, especially in the verbal and working memory domain. Thus, we examined the relationship between cerebellar volumes and various cognitive task performances … Note that the functions such as attention, inhibition, cognitive flexibility, working memory, are thought to be main components of executive functions [36]. These results indicate that the cerebellar hemispheres are involved in the abilities of executive functions, language processing, and episodic memory function.

    In short, neither endocranial volume nor arterial blood flow to the brain is an adequate measure of the difference between modern human “intelligence” and the intelligence of prior humans. The globular human brain has been rewired and reorganized. That doesn’t nullify your “gradual” observation, though.

    More tomorrow, assuming “tomorrow” ever comes. Be well!

  28. Mung: I guess if a snake can talk, a man can talk, without regard to anything whatsoever to do with evolution.

    There’s nothing in the Bible that I am aware of that indicates that God ever took away the ability for snakes to talk.

    Wasn’t it the devil who did the talking? After all, why would a snake care what humans ate?

  29. vjtorley: I was not intending to propose an invisible beginning of the human species, as I hold that human abilities are sufficiently remarkable that we should expect to find traces of them in the archaeological record.

    I don’t doubt that “human abilities are sufficiently remarkable that we should expect to find traces of them in the archaeological record”!

    What I doubt is that we can find evidence of a precise demarcation between abilities that reliably demonstrate the presence of spirit and those that do not. I think you yourself basically concede this much with no fewer than ten different “Adams”!

    It seems relatively clear to me that what you call spirit emerged gradually, over millions of years, with different components added at different times and due to different selective pressures. That’s not a problem for my Deweyan naturalism, since the integration of Darwin and Hegel is crucial to the whole enterprise. Whether it’s a problem for Christian apologetics is above my pay-grade.

  30. Hi Vincent. Tomorrow actually arrived for once. Lucky you! haha. Responding to your comments this time. I’d like to get back to your OP again next. I don’t want to spend all my time/words on my view rather than yours.

    vjtorley: As I see it, the big difference between (i) the development of an infant into a rational, language-using, moral and God-conscious adult and (ii) the evolution of Homo erectus into modern human beings who possess reason, language, morality and religion is that in the former case, there is no loss of nature (the infant does not become a new kind of being, but is at all stages a human being) and no loss of individuality (the infant remains the same individual human being as he/she matures), whereas in the latter case, there is a loss of both: Homo erectus evolves into a different kind of being (Homo sapiens), and of course, it’s not Homo erectus but his distant descendant who acquires the traits in question. Your analogy would make sense if you believed in something like orthogenesis, with Homo erectus being somehow programmed or destined to evolve into Homo sapiens. Then (and only then) you might say that the former contained the seed of the latter. But on the modern understanding, evolution has no direction.

    Taking things in order …

    There is no loss of nature … the infant is at all stages a human being.
    If erectus is defined as an immature human, albeit one who could not “mature” beyond certain inborn limited capacities, the analogy holds. (Keeping in mind that it is just an analogy.) As a special ed teacher, I had students who would never qualify as possessing abstract thought or mature human morality. They were forever “children,” both in the eyes of the state and in the eyes of God, I would venture to say. Like erectus, they would never achieve what we consider “normal” human maturity, yet we (Christians) still can say that they are made in the image of God and possess a soul/spirit.

    (There is) no loss of individuality (the infant remains the same individual human being as he/she matures)
    Assuming they mature. Not every individual achieves the full potential of “normal” human beings. Consider the case of a child who dies as a toddler. The child never achieves “fully human” development of reason, language, morality, or religion, yet we nevertheless credit it with full humanity. The same holds true for erectus. Whether individually or collectively, if they never achieved fully modern development of reason, language, morality, or religion, we nevertheless may credit them with full humanity.

    But on the modern understanding, evolution has no direction.

    On the Christian understanding, God had a purpose in creating humanity (Gen. 1:26). I see no reason why God couldn’t have incorporated seemingly “random” means to achieve his ultimate ends. After all, our God is a hidden God, as both Isaiah and Pascal point out.

    vjtorley: It’s the anticipation of death, day after day, and the choice to confront that risk, that make late Homo erectus and Heidelberg man more adult-like than child-like in their moral qualities, in my humble opinion.

    Good observation. No analogy is a perfect match in every detail. Erectus and heidelbergensis may have had child-like language capabilities, but they certainly were a mix of adult experience and child-like limitations. Anyway, one of the characteristics of childhood moral development is that children may have a “gut feeling” about what’s right or wrong, but they can’t articulate the reasons for those judgments. They “just know.” Without abstract language and fully symbolic thought, individual behaviors cannot be abstracted and generalized into categories and “rules.” For instance, I once had an autistic student who couldn’t generalize from a specific to an abstraction. If he cussed out a girl and got into trouble for it, he knew not to cuss her out again. Everyone else was still fair game.

    vjtorley: But what she doesn’t point out (because it wasn’t known in 2012, when she gave her talk) is that around the same time, Neanderthals also started practicing symbolic behavior. So we have two species of human beings, last sharing a common ancestor at least 600,000 years ago, displaying symbolic behavior. What do you make of that?

    Back to that paper on Diepkloof and Blombos. Individual symbols aren’t the same as full symbolicity. Decoding the Blombos Engravings, Shell Beads and Diepkloof Ostrich Eggshell Patterns. From the article: “This suggests different kinds of relationships existed with various artefacts during the Middle Stone Age in southern Africa in that some reflected more complex patterns of ‘symbolic’ engagement than others…”

    Neanderthal was a smart guy, so to speak. Individual symbol usage doesn’t necessarily indicate full symbolicity.

  31. Hi Vincent et al. Just a few quick thoughts on your discussion of Acheulean Adam. Excellent discussion overall, but I admit to skipping the excursus on pebbles. “Tool maker” was long ago discarded as a demarcation of human behavior, but I suppose it needed to be said. BTW, I’m one of those who calls ergaster “early erectus.” I wish the anthropologists would make up their minds!

    Homo erectus lived in Asia for hundreds of thousands of years before it started making Acheulean hand-axes, while Homo ergaster may have lived in Africa for over 100,000 years before starting to create these tools, so it would be unwise to equate either of these species with “Acheulean Adam,” tout simple. Incidentally, Acheulean technology was remarkably long-lived: later Acheulean hand-axes were made by Heidelberg man and even early Homo sapiens, as late as 200,000 years ago.

    It’s a truism that new species appear long before any cultural advances that can be attributed to them. As you point out, early erectus took 100,000 years to improve upon Oldowan tools, and sapiens required 100,000 years to improve upon Acheulean tools. The same holds true for every technological and symbolic advance. This in itself is a pretty powerful argument against a “magic moment” Adam & Eve parachuting into history as fully-realized modern humans. You addressed this in different language in your conclusion, but you also raised the question of why erectus required 1 million years to attach a handle to a spear. The slow pace of innovation is what clearly stands out in early hominin evolution.

    In any case, the intelligence that fashioned these tools still fell far short of our own, as shown by the fact that even bonobos can be taught how to fashion Oldowan-style flakes that can be used for cutting food from small round stones. (That said, the bonobos didn’t do a very good job of making them.)

    I was surprised to learn this! Cool stuff. Still, I’m not shocked. Chimps and bonobos make tools and have a form of culture, but their learning process is as dyadic as their communication. (Human communication is triadic and referential, relying upon shared frames of reference.) A mother may teach a young chimp how to strip leaves off a branch to make a tool to fish for termites, but the learning process is painstakingly slow for the simple reason that chimps are emulating the final product, rather than imitating the process. In evolutionary terms, an increase in the ratio of fronto-parietal vs. fronto-temporal connectivity from monkeys to apes to modern humans provides a possible neurological substrate for this shift from emulation to imitation. (Process Versus Product in Social Learning: Comparative Diffusion Tensor Imaging of Neural Systems for Action Execution–Observation Matching in Macaques, Chimpanzees, and Humans)

    Globularity and social learning (mimesis) explain the “uptick” in the pace of innovation. In their seminal article The shape of the human language-ready brain, the authors say, “It is imitation that is likely to underlie the possibility of cultural innovation that is so characteristic of modern humans.”

  32. OMG sir Torley keeps getting worser. Now he wants Christianity in trouble, while his scientific evidence is a misreading of facts as always.

    Just one point on this:

    vjtorley: I presented further evidence that human morality – in particular, altruism and self-sacrifice for the good of the group – goes back at least 500,000 years, with evidence for long-term care of the sick going back 1,500,000 years and big-game hunting (which required a high degree of public co-operation and a willingness to put one’s life at risk for the good of the group) going back 700,000 years. In my book, anyone who’s willing to lay down their life for others is displaying truly human moral behavior. Once again, we can identify no sharp beginnings here.

    How do you identify someone who is willing to lay down their life for others? You can identify someone who laid down their life for others, but it is a whole different matter to demonstrate whether this was done intentionally and willingly instead of accidentally. There is a rather sharp line between the two, but it is psychological, not empirically visible for everybody to see.

    Whatever animals do, they do instinctively. Whatever humans do, they can exercise will and practise enhancement of rational choice over instincts. Or they can practise suppression of rationality, sinking into animality.

    The brain mass (or whatever your “scientific” measure is) will not provide even tentative evidence to the particular path the particular human being is treading, but this intentionality and will is the sharp line you are looking for. The sharp line is there, but you are looking in the wrong place, as usual.

  33. Hi Kantian Naturalist,

    It seems relatively clear to me that what you call spirit emerged gradually, over millions of years, with different components added at different times and due to different selective pressures. That’s not a problem for my Deweyan naturalism, since the integration of Darwin and Hegel is crucial to the whole enterprise. Whether it’s a problem for Christian apologetics is above my pay-grade.

    I’m not disagreeing with you, as my OP is concerned purely with the implications of the gradual emergence of spirit for Christian apologetics. I don’t doubt that some version of naturalism (such as Dewey’s) can explain the known scientific data regarding human emergence. The point I was making in my OP is that Christian thinking is very binary, regarding its attitudes towards humans vs. other animals, and neither the fossil record nor the archaeological record sits well with this binary mindset. For all I know, it might be possible to thread the needle, but I haven’t seen an attempt that I’d regard as convincing. The vast majority of Christian apologists remain blissfully unaware of the enormity of the problem. Much to his credit, however, Jay313 is genuinely engaging the evidence.

    Hi Jay313,

    Sorry for not replying sooner. Thank you for the links to the various articles, especially the one titled, “Reconstructing the Neanderthal brain using computational anatomy”. I think the point it makes about the size of the cerebellum is a valid one. Regarding the Neanderthals, I’m inclined to think that there was certainly a large difference between our capacities and theirs, but I don’t see any reason to believe there was a qualitative difference.

    You also write: “If erectus is defined as an immature human, albeit one who could not “mature” beyond certain inborn limited capacities, the analogy holds.” My problem is: can one legitimately speak of an entire species as immature? In Aristotelian terms, each species is defined by its own telos. If “fully human” development of reason, language, morality, or religion was part of the telos of Homo erectus, and no members of this species attained this telos, then what you’re saying is that all members of the species Homo erectus were defective. In Aristotelian terms, that does not compute: you can call individuals malformed, but not entire species.

    Now, you may reject the Aristotelian notion that each biological species has its own telos. You may prefer to say that humans (who comprise several biological species) have a unique telos. Fair enough. But in that case, you still have to regard erectus as sapiens-in-the-making, which is a form of orthogenesis. Invisible, perhaps, if God uses “random” events, but it’s still directed evolution.

    It seems you regard Homo erectus as having had a spirit, albeit an immature one. Was there, on your view, a first individual into whom God infused a spirit? If I read you aright, your answer would be “yes.” Is that correct?

    Hi Erik,

    “Worser” is not a word.

    Whatever animals do, they do instinctively. Whatever humans do, they can exercise will and practise enhancement of rational choice over instincts…

    …[T]his intentionality and will is the sharp line you are looking for.

    1. Define “instinct.”

    2. Was Betty the crow acting instinctively when she fashioned a fish hook with her beak (a novel feat)? Are you aware of recent scientific evidence showing that crows are capable of planning three steps ahead when solving problems? If you don’t regard this as evidence of rational choice, then tell me: why not?

    3. What kind of behavior by an animal would falsify your sweeping claim that humans are rational, but animals act entirely on instinct? If you say “None,” then you’re basically acknowledging that nothing an animal does, no matter how clever, could ever prove it to be rational. That strikes me as an absurdly pig-headed position. But if you’re happy to propose a behavioral test for rationality, then your approach is not so very different from mine, after all. In my OP, I examined no less than ten proposed behavioral criteria, in section B (The Ten Adams). If you’d like to put forward an eleventh behavioral criterion, then by all means, let’s hear it.

    4. May I remind you of a remark made by the late economist, Paul Samuelson, in an interview in 1970: “Well when events change, I change my mind. What do you do?”

    Are you willing to change your mind?

  34. vjtorley: 1. Define “instinct.”

    This is one easy way to highlight your problems for you: You do not have the necessary definitions to be able to draw any conclusions.

    Your inability to identify the first humans is due to the fact that you humanise animals so much that you cannot distinguish between animals and humans. And of course you insist on keeping it up. Good job!

    vjtorley: 2. Was Betty the crow acting instinctively when she fashioned a fish hook with her beak (a novel feat)? Are you aware of recent scientific evidence showing that crows are capable of planning three steps ahead when solving problems? If you don’t regard this as evidence of rational choice, then tell me: why not?

    Is an ants’ nest a problem that ants solve by thinking a gazillion steps ahead? Is each of them thinking individually or are they planning collectively? Why/Why not?

    They are animals because they do not think. As soon as you assume that they do, you have completely lost it. But yeah, your entire series is carried by this misconception. No help for you.

  35. Erik: Is an ants’ nest a problem that ants solve by thinking a gazillion steps ahead?

    Crows are not insects. Are crows capable of planning?

  36. vjtorley: I’m not disagreeing with you, as my OP is concerned purely with the implications of the gradual emergence of spirit for Christian apologetics. I don’t doubt that some version of naturalism (such as Dewey’s) can explain the known scientific data regarding human emergence. The point I was making in my OP is that Christian thinking is very binary, regarding its attitudes towards humans vs. other animals, and neither the fossil record nor the archaeological record sits well with this binary mindset. For all I know, it might be possible to thread the needle, but I haven’t seen an attempt that I’d regard as convincing. The vast majority of Christian apologists remain blissfully unaware of the enormity of the problem. Much to his credit, however, Jay313 is genuinely engaging the evidence.

    Fair enough! I guess I don’t see why it would be a problem to say that the image of God was implemented in stages. If one is willing to say that it took a few billion years for Creation to be fully actualized, then why not say that it took almost two million years for the spiritual aspect of human nature to become fully actualized?

  37. OMagain: Crows are not insects. Are crows capable of planning?

    How do you know crows are not insects? But if there is a clear, sharp difference between the two, according to you, then why not between humans and animals also?

  38. Kantian Naturalist: Fair enough! I guess I don’t see why it would be a problem to say that the image of God was implemented in stages. If one is willing to say that it took a few billion years for Creation to be fully actualized, then why not say that it took almost two million years for the spiritual aspect of human nature to become fully actualized?

    Maybe it’s just me, but this stretch sounds a lot like post hoc reasoning. So we have this religious text, written thousands of years ago, by people trying to flatter themselves. Now, they were just winging it then, and by today we have enough knowledge to see that there’s nothing remotely plausible about their illusions. So should we accept the religious text for what it was? NO! What we do is find some way to rationalize what we know to force-fit it to what we know ain’t so.

    The simple approach, to recognize gods as pseudo-explanations for what wasn’t understood millennia ago and replace them with modern efforts at explanation, simply cannot cross the minds of the religious. Can we never outgrow our irrational NEED for some ineffable overall plan for reality, with us at the top?

  39. Flint: Maybe it’s just me, but this stretch sounds a lot like post hoc reasoning.

    Oh, sure. I wasn’t endorsing any of it. Just trying to see if I can help Torley at all with his problem.

    So we have this religious text, written thousands of years ago, by people trying to flatter themselves. Now, they were just winging it then, and by today we have enough knowledge to see that there’s nothing remotely plausible about their illusions. So should we accept the religious text for what it was? NO! What we do is find some way to rationalize what we know to force-fit it to what we know ain’t so.

    The simple approach, to recognize gods as pseudo-explanations for what wasn’t understood millennia ago and replace them with modern efforts at explanation, simply cannot cross the minds of the religious. Can we never outgrow our irrational NEED for some ineffable overall plan for reality, with us at the top?

    I don’t see religions as being in the explanation business at all to begin with.

  40. Kantian Naturalist:

    I don’t see religions as being in the explanation business at all to begin with.

    There are two ways to interpret this:
    1) Religions do not actually explain anything about our universe (no matter how assiduously they purport to do so)
    2) You are somehow unaware of millennia of human attempts to use gods and religion to explain everything from biology to astronomy to weather.

    But I can’t believe 2) is true. Science is accepted as being in the business of closing the gaps that gods live in — that is, finding accurate explanations for things that, not understood, were attributed to gods. Religion has been in the business of providing bogus explanations since forever. How did life get started? Goddidit. Where does “human spirit” come from? Goddidit. What is the nature of consciousness? Goddidit. How many times have you yourself observed that something that purports to explain everything, explains nothing?

    So I don’t understand your comment.

  41. Flint,

    I took KN to be expressing his opinion that explaining nature is not a proper role for religion. I don’t think he was denying that they do attempt such explaining.

  42. Erik: They are animals because they do not think. As soon as you assume that they do, you have completely lost it.

    You don’t seem to understand what “will” or “thinking” entails, nor how those ideas apply to insects, reptiles, birds, mammals, and humans, as Dazz and OMagain also pointed out. Insects may be “preprogrammed” stimulus/response creatures, but that doesn’t mean even they lead a predetermined existence. The environment (stimulus) is always changing and unpredictable, even on the insect level of existence.

    Reptiles and birds (imagine that!) have a completely different brain structure than mammals. Songbirds and parrots have forebrain neuronal densities comparable to some primates, so they’re not unintelligent. Nevertheless, their intelligence is different from that of mammals. Mammalian brains are characterized by the presence of the prefrontal cortex. This is the center of working memory and decision-making. I don’t know about you, but I’ve owned many dogs, and I’ve seen them make choices all the time. You could say that they’re completely driven by instinct, but you’d be wrong. Humans also have instincts. But in our case, only a few basic instincts are inborn; the rest are “preferences” that can be biological or cultural. If I offer you a cake or pie for dessert, your decision may be as determined as the dog’s, depending on whether your taste buds prefer sweet (chocolate) or tart (key lime) and whether you grew up in a cake-eating or pie-eating family. Does this mean I’m endorsing determinism? No. I’m simply saying that animals make choices, and they can think about those choices even though they lack language.

    vjtorley: Regarding the Neanderthals, I’m inclined to think that there was certainly a large difference between our capacities and theirs, but I don’t see any reason to believe there was a qualitative difference.

    I draw the opposite conclusion. Your evidence and mine agree that there was no quantitative difference between our present brain and theirs. (Roughly the same volumes and arterial blood flow.) The difference was qualitative in the sense that globularity and the growth of the cerebellum allowed for a different type of thought than was possible for previous hominins. If you remember the article I shared on Pleistocene trade networks, the Neanderthals were an oddity because their networks never extended beyond 75 km. Whatever the reasons, they seemed only to trade with nearby kin. That’s a distinct disadvantage in times of scarcity.

    vjtorley: My problem is: can one legitimately speak of an entire species as immature? In Aristotelian terms, each species is defined by its own telos. If “fully human” development of reason, language, morality, or religion was part of the telos of Homo erectus, and no members of this species attained this telos, then what you’re saying is that all members of the species Homo erectus were defective. In Aristotelian terms, that does not compute: you can call individuals malformed, but not entire species

    In evolutionary terms, every species is unique and serves its purpose. If Homo erectus was the necessary first step toward “fully human” development of reason, language, morality, and religion, then this entire species attained and performed its God-ordained telos even if it didn’t realize the final step.

    vjtorley: You may prefer to say that humans (who comprise several biological species) have a unique telos. Fair enough. But in that case, you still have to regard erectus as sapiens-in-the-making, which is a form of orthogenesis. Invisible, perhaps, if God uses “random” events, but it’s still directed evolution.

    I don’t take orthogenesis in the Wikipedia sense of a biologically determined, continually onward-and-upward march to “fully human.” But as a Christian, I’m not ashamed to admit a form of orthogenesis, if by that you simply mean directed evolution. The goal was stated in Gen. 1:26 following the creation of the animals: “Let us create adam/humanity in our image …” Such a telos is empirically unprovable and accepted by faith. But if the God and Father of Jesus Christ exists, then he had a reason for creating, and that reason reflects the heart of the Trinity — the love and fellowship (communication) that existed between Father, Son, and Holy Spirit from all eternity. A creature made of earth, like the animals, required certain structural qualities and communication abilities before it could “commune” and have any sort of fellowship with the Most High.

    Getting more “down to earth,” I think that evolutionary constraints come into play along certain paths. It’s somewhat along the lines of biological “fine tuning.” (Although I should add that the fine-tuning argument isn’t decisive “proof,” as some think.) For instance, I ran across this interesting conclusion in an article, “The language-ready head: Evolutionary considerations” (emphasis added):

    “Most accounts of language evolution take as their starting point a behavioral (e.g., “communication”), or cognitive (e.g., “recursion”), or neurological (e.g., “direct corticolaryngeal connection”) phenotype. [The Globular Language-Ready Brain] takes seriously Lieberman’s (2011) arresting statement that we all too often think that apes could evolve to become humanlike without much selection acting on head shape, as our cartoons and movies about talking animals reveal. Yet, in order to think like us, and express themselves like us, other species would have to evolve a brain like ours. Growing such a brain would significantly reshape their craniofacial development and, ultimately, lead to a different head shape, making them look like us in relevant respects.”

    vjtorley: It seems you regard Homo erectus as having had a spirit, albeit an immature one. Was there, on your view, a first individual into whom God infused a spirit? If I read you aright, your answer would be “yes.” Is that correct?

    Tricky question. I expect nothing less from you. haha. To be honest, my view of biblical anthropology is in flux. I lean toward the biblical scholars who say that we don’t so much “possess” a soul as that we are a soul. A complex unity, as I’ve heard others put it. Nevertheless, the traditional body/soul dichotomy claims a hold on my thoughts, and I can’t rule it out.

    To answer your question directly, I conceive of ha’adam/humanity as a population. Still, it’s hard to conceive of any “first” — whether making a stone tool or speaking a word or committing a morally culpable sin — happening simultaneously to a population in the tens of thousands. Someone logically had to be the “first,” but ideas spread so quickly that it’s also hard to imagine any innovation being confined for long.

    That aside, I was arguing that the Catholic point of view, on which Gen. 2:7 reflects God breathing a soul/spirit into ha’adam at his creation, can’t be ruled out. How many children are born into the world every day? Google says 360,000. No “natural” process can create a spirit, so that’s 360,000 souls that God creates every day. I see no reason to presume that God couldn’t have created and infused souls into a much smaller number of erectus individuals over a period of years.

    The question then is whether the “infusion” of a spirit into an individual (and eventually a population) would create a dramatic change in cognition and behavior. The only comparison I can think of is being “born of the spirit,” as Jesus describes it in John 3. However you experienced that event within your own faith tradition, it should have resulted in a change in your thinking and behavior, yet not of the type that would leave a mark in the “archaeological record.”

    Kantian Naturalist: I wasn’t endorsing any of it. Just trying to see if I can help Torley at all with his problem.

    I’ve appreciated that about you from the start.

  43. Erik: How do you know crows are not insects?

    All labels are invented. If you want to call crows insects you can, but don’t expect anybody else to be interested.

  44. Neil Rickert:
    Flint,

    I took KN to be expressing his opinion that explaining nature is not a proper role for religion. I don’t think he was denying that they do attempt such explaining.

    Let me put it this way: I don’t think that religions evolved to play an explanatory role, and I don’t think that is their primary or most important function.

    I think that we don’t really have a good sense of what a religion is, or if the term marks out a robust social kind at all. Consider: what do Madhyamaka Buddhism, Pentecostal Christianity, Yoruba religion, Aztec religion, Sunni Islam, and Australian Aboriginal myth all have in common such that they are all distinct species of the same genus?

    I’m skeptical that any plausible answer to that question — if there is one! — would give priority to explanation.

  45. Kantian Naturalist: Let me put it this way: I don’t think that religions evolved to play an explanatory role, and I don’t think that is their primary or most important function.
    I’m skeptical that any plausible answer to that question — if there is one! — would give priority to explanation.

    OK, I won’t disagree with that, but initially you wrote:

    I don’t see religions as being in the explanation business at all to begin with.

    Which is quite different, and quite clearly incorrect. ALL religions propose explanations for some aspect(s) of life. Even if we say an important part of religion is to guide us in living better lives (for some values of “better”), there is an underlying implication of WHY any religion’s precepts are (or should be) persuasive.

    I attended a short lecture at the Acoma reservation in New Mexico, on their origin myth. It was fascinating, and much more complex than Genesis. One attendee asked whether the Acoma believed their myth literally. And the answer was, “of course not, but we study our myths to gain a better understanding of ourselves and our history.” And at least for me, the central purpose of explanation is to provide a better understanding. Of anything.

  46. Kantian Naturalist: I think that we don’t really have a good sense of what a religion is, or if the term marks out a robust social kind at all. Consider: what do Madhyamaka Buddhism, Pentecostal Christianity, Yoruba religion, Aztec religion, Sunni Islam, and Australian Aboriginal myth all have in common such that they are all distinct species of the same genus?

    Emotion!
