The remains of Homo floresiensis, discovered at Liang Bua on the Indonesian island of Flores in 2003, and of Homo naledi, discovered inside the Rising Star Cave in South Africa’s Cradle of Humankind, have played an important part in helping us understand the diversity and complexity of our hominin past.
H. floresiensis, dubbed ‘The Hobbit’ by the media because of its diminutive size (it had a brain capacity of around 380 cm3 and stood about a metre tall), was considered by many scientists to be a deformed or microcephalic H. sapiens. However, strong physical evidence such as humeral torsion[i] and a set of teeth unique among hominins[ii] has pretty well ended the debate about its status as a species in its own right. The main disagreement now, considering the size of its brain, is whether or not it should be included in the genus Homo.
And speaking of small brains …
H. naledi was half again as tall as H. floresiensis – about the same height as a large chimpanzee – and although its cranial capacity (between 460 cm3 and 610 cm3) was considerably bigger than the Hobbit’s, it was still well short of a modern human or any of our immediate cousins such as H. neanderthalensis.
As I wrote in a previous post, however, brain size is not necessarily a reliable indicator of intelligence.[iii] H. floresiensis almost certainly made and used stone tools[iv], and recently the University of Witwatersrand’s Lee Berger announced that researchers had found evidence of fire being used by H. naledi[v]. This last was probably something of a given, since the remains of H. naledi were found in a chamber of the Rising Star Cave that could only be reached through a long, dark and twisting route that was difficult and dangerous to follow even with artificial light – without some kind of illumination it would have been virtually impossible. Still, this recent evidence adds weight to the case that this species was capable of making and using fire.
As friend and palaeoanthropologist Debbie Argue asks, however, when and how did H. naledi learn to make fire? Could they possibly have acquired the skill from a contemporary hominin, such as H. sapiens? Or was it the other way around? Or did both species learn the trick from a third hominin group?
We’ll probably never know the answer to this question, but it is fun to think about, and – at the risk of stretching a metaphor almost to breaking point – throws another log on the fire of re-evaluating exactly what it means to be human.
One of the great palaeoanthropological bombshells of the last generation was the discovery of Homo floresiensis on the Indonesian island of Flores. For years scientists debated what ancestor this new and somewhat diminutive hominin – dubbed the ‘Hobbit’ by the media – had come from, or indeed if it should even be included in our genus.
While now generally accepted as a member of our broader tribe, its origins are still fiercely argued, with many insisting it’s nothing more than H. erectus after undergoing insular dwarfism. But I think a 2017 paper by Colin Groves, Debbie Argue, Michael Lee and William Jungers convincingly demonstrates that H. floresiensis is derived neither from H. erectus nor from a diseased example of H. sapiens, but rather from a much earlier hominin such as H. habilis or a sister species.[i]
A second paper, published in 2020[ii], backs up this hypothesis, and concludes with this statement:
‘ … something which on account of our inadequate current taxonomic framework we have to call “early Homo” differentiated in Africa, possibly as early as 2.8 (mya) … Subsequently, one or more members of this group reached the Mediterranean fringe and spread Out of Africa at 2.5 Ma. After successfully expanding over Asia, at least one of those hominins … gave rise to new species that reached the Caucasus by around 1.8 (mya), and thence Europe by ca. 0.9 (mya) … (the) eastward expansion (or occupation) in Asia of small-bodied and archaically-proportioned hominins continued, possibly in multiple waves; and, by ca. 0.8 (mya), representatives of this group had penetrated as far as insular southeast Asia, where H. floresiensis ultimately emerged … ’
Indeed, some scientists considered this possibility as early as 2005. A report about the brain of H. floresiensis published in Science in that year[iii] concludes with these lines: ‘Although it is possible that H. floresiensis represented an endemic island dwarf that, over time, became subject to unusual allometric constraints, an alternative hypothesis is that H. erectus and H. floresiensis may have shared a common ancestor that was an unknown small-bodied and small-brained hominin.’
I think an increasing weight of evidence strongly suggests that the first major exodus of our genus from Africa was carried out by H. habilis or one or more of her sisters. Furthermore, I think it’s possible that these closely related species then gave rise to H. erectus, H. pekinensis, H. luzonensis[iv] and H. floresiensis in Eurasia, while those remaining in Africa gave rise to H. ergaster. This does not preclude the possibility, or perhaps probability, of any or all of these species crossbreeding if they ran across each other.
But what of H. sapiens, our own species? As with H. ergaster and H. erectus, the evidence here is convoluted, confusing and often contradictory.
For those, like Colin Groves, who think H. ergaster is a species in its own right, the line of descent works something like the following.
About 600,000 years ago, H. ergaster, either directly or through an intermediary species called H. rhodesiensis, gave rise to H. heidelbergensis. This species was our size physically, and its brain capacity was well inside the range of Anatomically Modern Humans (AMH). Following the great tradition of hominin migration, something that seems as ingrained in our genus as bipedalism, some members of this new species moved to Europe[v]. About 400,000 years ago, they gave rise to H. neanderthalensis. In a case of ‘well, we’ll show you’, those who stayed behind in Africa gave rise to H. sapiens at least 300,000 years ago, and possibly as long as 350,000 years ago.[vi]
I can’t stress this enough. Homo sapiens are Africans. It is where our archaic ancestors and AMH first appear[vii]. (Let me also stress that this story, as complicated as it gets from now on, does not resurrect the Multiregional Model for our evolution, where H. erectus gave rise to H. sapiens across its whole range at the same time, from Africa to Asia. This is an old theory, now largely discredited by the extensive fossil and DNA evidence that our species first evolved in Africa.[viii])
What happened next has been slowly and painstakingly uncovered by palaeoanthropologists doing field work throughout Africa and Eurasia, and by the outstanding work performed at the Max Planck Institute’s Department of Evolutionary Genetics, headed up by Svante Pääbo, into hominin DNA.[ix]
What the DNA evidence strongly suggests is that H. sapiens successfully left Africa between 70,000 and 100,000 years ago. (Although this wasn’t the first migration into Eurasia by our species. It is usually held that previous attempts left no trace in the DNA of AMH outside of Africa, but see these earlier posts, here and here.)
Members of the most recent migration interbred with H. neanderthalensis, probably in what is now the Middle East, and later with the Denisovans, another possible descendant of H. heidelbergensis, deeper in Eurasia[x]. To this day, the average ex-African H. sapiens carries between 1% and 2% of the Neanderthal genome; but it is not the same 1–2%: we overlap. Collectively, we carry up to 40% of the Neanderthal genome in our own genes. But the story gets more complex still: the genomes of people from Oceania, such as Papua New Guineans and Australian Aborigines, can contain between 5% and 6% Denisovan DNA[xi]; indeed, recent research suggests that Ayta Magbukon Negritos in the Philippines have Denisovan ancestry 30–40% higher than either of these two groups.
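The arithmetic behind this overlap is worth spelling out: if each person carries a different ~2% slice drawn from the ~40% of the Neanderthal genome that survives in modern populations at all, then the union of everyone’s slices approaches that 40% ceiling. Here is a toy simulation of the idea – every number in it is illustrative, not real genomic data:

```python
import random

def union_coverage(genome_size=10_000, survivable_frac=0.40,
                   per_person_frac=0.02, n_people=2_000, seed=1):
    """Toy model of overlapping Neanderthal inheritance.

    Each simulated person inherits a random ~2% slice of a notional
    Neanderthal genome, drawn only from the ~40% of sites assumed to
    survive in modern humans. Returns the fraction of the whole
    genome covered by the union of everyone's slices."""
    rng = random.Random(seed)
    # The pool of sites that survive anywhere in modern populations.
    survivable = rng.sample(range(genome_size),
                            int(genome_size * survivable_frac))
    covered = set()
    for _ in range(n_people):
        # Each person gets a random 2% of the genome from that pool.
        covered.update(rng.sample(survivable,
                                  int(genome_size * per_person_frac)))
    return len(covered) / genome_size
```

With a couple of thousand simulated people, virtually every survivable site turns up in someone, so the union sits right at the 40% ceiling even though no single individual carries more than 2%.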
The Natural History Museum of London’s Professor Chris Stringer says, ‘It is now clear there was a lot more interbreeding between ancient species, including early Homo sapiens and others, and that there was a lot more movement of populations both in the distant past – and relatively recently.’[xii]
Speaking of recent research, in June last year Chinese scientists announced that a cranium first discovered in China almost a century ago belongs to a new species of Homo, with a brain easily the equal of any AMH in size, carried inside a skull more massive than ours. Those making the announcement have named the new species H. longi (‘Dragon man’); and just as Denisovans are sometimes described as a sister species to Neanderthals, so H. longi is being claimed as a sister species to H. sapiens[xiii].
As Lee Berger, from the University of Witwatersrand and the discoverer of Australopithecus sediba and H. naledi, has suggested, perhaps the different paths of human evolution are not best thought of as branches spreading from a single tree trunk, or even a messy, many-twigged bush, but rather a braided stream[xiv] with tributaries constantly running across each other before separating, rejoining and separating once more.
We, Anatomically Modern Humans, are the result of all this evolution. We are nothing more than a mongrel species.
And this from the Australian Museum: ‘Most scientists that accept H. floresiensis as a legitimate species now think its ancestor may have come from an early African dispersal by a primitive Homo species similar in appearance to H. habilis or the Dmanisi hominins. This means that it shared a common ancestor with Asian H. erectus but was not descended from it. Cladistic analysis supports the lack of a close relationship with H. erectus.’
[v] The first H. heidelbergensis fossils were found near Heidelberg in 1907.
[vi] Although this paper suggests the split between our two species might be found much further back … up to 800,000 years ago or more!
[vii] Recent research from scientists at Australia’s Garvan Institute of Medical Research reveals that southern Africa is home to the oldest evidence for AMH: ‘… to contemporary populations that represent the earliest branch of human genetic phylogeny.’ The date they arrive at is 200,000 years ago.
As well, a report in the February issue of Science describes how thousands of genome sequences were collected from modern and ancient humans to create a family tree. In the words of the report’s first author, Anthony Wilder Wohns, ‘… we definitely see overwhelming evidence of the Out-of-Africa event …’
[viii] See Stringer, C. & Andrews, P. The Complete World of Human Evolution. London, 2011, p. 140 ff for a discussion of the two main theories for the evolution of Homo sapiens: ‘Multiregional’ and ‘Out of Africa’.
[ix] And now, besides DNA, they are using protein analysis to identify ancient hominins, most recently the first Denisovan found outside of the Denisova Cave in Siberia … on the Tibetan Plateau of all places! See https://www.nature.com/articles/s41586-019-1139-x, 16 May 2019.
[x] Very recently, H. sapiens remains were discovered in the Grotte Mandrin rock shelter in the Rhône Valley in France that date back 54,000 years, pushing back our species’ arrival in Europe by at least 10,000 years from previous estimates.
[xi] Please watch this fascinating talk Svante Pääbo gave at the University of California in 2018 after receiving the Nierenberg Award for Science in the Public Interest. It goes into all of this in much more detail. As Pääbo points out in the talk, the DNA evidence indicates humans ‘have always mixed’.
Humans walk upright, gorillas and chimpanzees walk on all fours, resting their weight on their knuckles, and orangutans can do just about anything – they hang and swing by their arms from branches, sometimes with the help of their oddly-shaped feet, and on the ground they can walk either upright or on all fours. The structure of the postcranial skeleton in all four animals is very different and reflects these locomotor patterns. Non-human great apes have short legs and long arms, whereas we have very long legs. With the gorilla and chimpanzee it is the shortness of the legs that differs from humans, the arms being much more similar in length compared to the torso; only the orangutan has enormously lengthened arms. When other great apes stand upright, their legs are straight from hip to ground, whereas humans are ‘knock-kneed’, as the thighs slope inward from the hip to the knee. The pelvis is very different in appearance: in humans the hip bone (ilium) is low and very broad, but in great apes it is high and fairly narrow. In humans the great toe is long and stout and aligned with the other toes, but in great apes it is divergent from the other toes (less in the gorilla), and in the orangutan it is very short.
In most great apes, the spinal column is more or less straight, but in humans the spine is curved into a double-S: the cervical (neck) vertebrae curve forward, the thoracic (chest) vertebrae curve backward, the lumbar vertebrae (those in the small of the back) curve forward again, the sacral vertebrae (which are fused together, and form the back wall of the pelvis) curve back again, and the coccyx (the partially fused vertebrae which are the tiny remnant of the tail) curves forward once more. The ribs (which are very variable in number, but average 12 in humans and orangutans, and 13 in chimpanzees and gorillas) together form the thorax; in humans the thorax is barrel-shaped (narrow at the top, broad in the middle, narrower again at the bottom), whereas in great apes it is funnel-shaped (narrow at the top, and broadening towards the bottom).
All of these differences between humans and the other great apes are developments stemming from bipedalism. So why did humans adopt bipedalism? Well, walk with me and we’ll take a brief look at the major theories.
Doing a runner
There seems to be a growing consensus among many scientists that our ancestors evolved bipedalism for several reasons rather than one overriding factor. What many of the competing theories do agree on, however, is that rainforest giving way to savannah because of climate change around 7–5 mya was a strong influence. Grassland with only scattered trees and no closed canopy meant tree-climbing primates had much more open territory to cover. Walking on two legs freed hands to carry infants, food or tools, including weapons. Walking on two legs made us taller, meaning we could locate food, potential predators and safe havens from further away; it also made it easier to pick low-hanging ripe fruit from trees. And walking reduced the amount of body surface area we exposed to the sun while in the open.
Of course, in some circumstances some of these ‘advantages’ could become disadvantages. For example, although bipedalism meant we could locate a predator from further away, it also meant if it was looking in the right direction, a predator could see us from further away as well (and our chief predators – leopards, hyenas and lions – all have good eyesight, not to mention excellent hearing and sense of smell). On the other hand, when our ancestors became active hunters, our extra height gave us an advantage over prey animals, many of whom rely on their sense of smell rather than their eyesight.
More recently, one of the major arguments for the successful adaptation of bipedalism has been that it is a much more energy-efficient method of locomotion[i]. Whatever the arguments for or against all these hypotheses regarding the origins of walking, when it comes to running there is no denying our bodies evolved to make us one of nature’s supreme endurance runners[ii]. This seems to have happened about two million years ago and was a real game-changer when it came to predation: our ancestors evolved into persistence hunters, able to wear down much larger animals such as kudu and oryx[iii]. Basically, humans ran their prey into the ground, and much of our body shape is particularly adapted to long-distance running.
In other words, the characteristics that make us superb walkers and runners are the characteristics that most set us apart from other great apes. As Chris Stringer and Peter Andrews write in The Complete World of Human Evolution, ‘at present … (bipedalism) is taken as the earliest adaptation by which we can recognise human ancestors in the fossil record.’[iv]
The odd-sock drawer
Now it’s time to deal with one of the most controversial species in the human lineage – Homo ergaster. This species was described by Colin Groves and Vratislav Mazák in 1975[v]. Since then, palaeoanthropologists have been divided on whether H. ergaster is a distinct species or a subspecies of H. erectus, palaeoanthropology’s pin-up boy and all-purpose species.
Once they learned to walk, our ancestors just kept on walking. In fact, they walked right out of Africa, into the Middle East, then east into Asia and Sahul, north to Europe, and eventually across the Bering Strait and into the Americas. On the way they continued evolving into new species that seemed to interbreed with each other at every opportunity, creating yet more new species, and eventually discovering agriculture, television and the internet. And interestingly, it’s the use of technology that provides us one piece of evidence that H. ergaster and H. erectus were two different species.
But first, let’s talk more about bones, specifically those belonging to the original H. erectus, parts of which were first discovered in 1891 by Eugène Dubois, a Dutch doctor working for the army in Java. In fact, he went to Java with the objective of discovering evidence supporting the theory that H. sapiens evolved in Asia, an idea most determinedly supported by German naturalist Ernst Haeckel. Haeckel had hypothesised that our species’ progenitor, which he named Pithecanthropus alalus, had evolved on Lemuria, a mythical continent that subsequently sank beneath the Indian Ocean (thereby conveniently leaving no fossils behind to prove – or for that matter, disprove – his theory).
Although Dubois had discovered ancient hominin fossils, he found little or no support among scientists in Europe that they amounted to anything significant. It wasn’t until Sinanthropus pekinensis was discovered in China over a quarter-century later that enthusiasm for Dubois’s discovery really picked up. In the early 1950s, Ernst Mayr reclassified both Dubois’s Pithecanthropus erectus and S. pekinensis as H. erectus[vi]. Since then, hominin fossils with roughly the same estimated brain size as H. erectus and aged between 2 million years old and just over 100,000 years old have been thrown in with H. erectus like differently coloured socks thrown into an odd-sock drawer. It has become the species to have when you want to cover all of Africa and Eurasia and two million years of history.
In the early 1970s, for example, Richard Leakey and Alan Walker described two partial skulls found in Kenya as belonging to an African offshoot of H. erectus based on the fact that their calculated brain capacities (848 cc and 803 cc) were not dramatically smaller than that of some H. erectus skulls (around 950 cc), which is like arguing that since the Volvo S60 and the Volkswagen Passat have similar interior space, they’re both examples of a Toyota Camry.
However, in 1975, Colin Groves and Czech colleague Vratislav Mazák, after a comprehensive metric analysis of fossils from Koobi Fora, concluded they had uncovered a new species, which they named H. ergaster. Their argument was that there was no African version of H. erectus; further, Colin Groves believed that H. ergaster evolved in Africa and then migrated into Eurasia, eventually giving rise to H. erectus.[vii] The earliest dates for the new species go back 1.9 million years[viii], as opposed to 1.6 million years (or 1.8 according to some estimates) for H. erectus, making H. ergaster the first truly human-looking hominin to stride the planet – tall, thin, decidedly bipedal, with a flatter face than its ancestors, and an active hunter, fire-user and tool-maker.
Now, nearly fifty years after the initial paper by Groves and Mazák, a fierce debate still continues between those who think the two hominins are separate if linked species, or just subspecies. In common parlance, it’s a debate between splitters and lumpers.[ix]
But besides the obvious difference in the skull shapes of H. ergaster and H. erectus, another line of evidence convinces me that Colin was right in his opinion that we are talking about two species. This evidence involves tool making.
Out with the old, in with the new
Until the appearance of H. sapiens and H. neanderthalensis, stone age technology is divided into two broad and overlapping stages: Oldowan and Acheulean (sometimes called Modes 1 and 2). Oldowan technology was first discovered in the 1930s by Louis Leakey at Olduvai Gorge in Tanzania. The oldest examples have been found at Gona in Ethiopia, and date back about 2.5 million years[x]. The technology seems to have spread very quickly, and recent discoveries have found stone tools in Jordan dated at 2.5 mya and in China at 2.1 mya[xi]. This technology, the use of very simple flakes and rocks, had been developed before the appearance of H. habilis, possibly by Australopithecus garhi. Acheulean technology, which started about 1.76 mya, is closely associated with the appearance of H. ergaster and involves more refined knapping and the development of specialised tools such as hand axes.
This doesn’t imply that Oldowan technology suddenly evaporated and every hominin adopted the new style of knapping chert. In some places, Oldowan and Acheulean stone tools are found at the same site from the same period, suggesting that while H. ergaster or one of its descendants employed the improved technology, one of our cousins continued using the older method. But Acheulean technology clearly conferred a significant advantage over the old style. It didn’t take long for it to spread beyond Africa, either because H. ergaster itself started spreading beyond Africa, or because it spread by ‘word-of-mouth’: neighbouring hominins picked up on the new fashion of making tools and copied it. Acheulean tools appear in what is now India, for example, by 1.5 mya, and in Europe by about 900 kya.
However, Acheulean technology did not seem to reach Java, where our friend H. erectus resided.
Which presents lumpers with a problem. If H. ergaster is indeed nothing more than a subspecies of H. erectus, then fossil evidence suggests this single species arose in Africa before spreading throughout Eurasia. Yet if this is also the species that developed Acheulean technology soon after evolving, why didn’t the technology travel with it to the Far East?
On the other hand, if we are talking about two species, then it’s quite possible for Acheulean technology to be developed by H. ergaster in Africa, spread slowly throughout Eurasia, but never quite reach the home of H. erectus in Java.
If this was in fact the case, it raises a more important question: even if we accept that H. ergaster is a separate and earlier species than H. erectus, does it necessarily follow that H. ergaster gave rise to H. erectus? What if the two species are cousins rather than mother and daughter?
This is something we’ll discuss in the next, and final, post of ‘Us’.
‘(He) was the first to suggest that the genus Pithecanthropus should be subsumed into Homo, and in the same paper he proposed that fossils recovered from what was then called Choukoutien (now called Zhoukoudian), which were initially assigned to Sinanthropus pekinensis, should also be transferred to H. erectus.’
‘A growing number of scientists have redefined the species Homo erectus so that it now contains only east Asian fossils. Many of the older African fossils formerly known as Homo erectus have now been placed into a separate species, Homo ergaster and this species is considered to be ancestral to Homo erectus. The redefined Homo erectus is now generally believed to be a side branch on our family tree whereas Homo ergaster is now viewed as one of our direct ancestors.’
[viii] Oldest fossil dates according to the Australian Museum for H. ergaster here and for H. erectus here. Recent work reported in the journal Science may push the dates even further back, to between 1.95 and 2.04 mya (although in this paper the discussed specimen is described as preserving ‘characters that align it morphologically with H. erectus sensu lato (including Homo ergaster)’). Go figure.
[ix] For a fuller description of the often heated debate about what makes a species, see here.
[x] Stringer, C. & Andrews, P. The Complete World of Human Evolution. London, 2011, p. 208.
A new kind of stone age technology – Lomekwian – has been suggested after the recent discovery of stone tools at Lomekwi that predates Oldowan by more than 700,000 years. See the previous post for more details.
Forgive the pun, but for decades it seemed a no-brainer that the chief qualification to be considered human was the size of your brain. Obviously, it had to be of a certain respectable capacity, never quite defined, but a degree or two larger than a chimpanzee’s organ was a good start. There was some embarrassment when it was determined that the average brain capacity of Homo neanderthalensis was larger than our own[i], but that misgiving aside, it was assumed that if a directly comparable intelligence was not a prerequisite, then certainly something within shooting distance of it was.
One mistaken assumption here is the equation of brain size with intelligence, something made very clear in recent years by the discovery of the stone tool-making H. floresiensis (with a brain the size of a chimpanzee’s). Recent work done on corvids, for example, suggests that ravens and crows possess a Theory of Mind[ii] – the capacity to imagine that another crow might have its own thoughts – which in turn suggests a reasonably developed sense of self-awareness, an emergent property traditionally associated with intelligence[iii].
Another mistaken assumption is that our brain size is extraordinary among our cousins; in fact, average brain capacity has not increased dramatically since H. heidelbergensis, a species that first saw the light of day 600,000 years ago.[iv]
Indeed, Homo species sit comfortably on the line that matches a generic primate’s brain size to its body size. In other words, if you’re a primate, the bigger you are the bigger your brain gets. (This isn’t peculiar to primates, of course, and applies to many mammalian groups, eg rodents, elephants and aardvarks, but primates do have larger brains than mammals of similar body mass).[v]
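The ‘line’ in question is the familiar log-log regression of brain mass against body mass: brain mass scales roughly as a power of body mass. A minimal sketch of the relation – the coefficient and exponent below are purely illustrative placeholders, since real fitted values vary by dataset and taxon:

```python
import math

# Illustrative allometric power law: brain ∝ body^K.
# A (coefficient) and K (exponent) are hypothetical values,
# chosen only to show the shape of the relation.
A, K = 0.06, 0.76

def predicted_brain_mass(body_mass_kg):
    """Expected brain mass (kg) for a generic primate of the given
    body mass, under the toy power law above."""
    return A * body_mass_kg ** K

# On a log-log plot this is a straight line of slope K:
#   log(brain) = log(A) + K * log(body)
```

A species that sits ‘below the curve’, as the gorilla does, simply has an observed brain mass smaller than this prediction for its body mass.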
Interestingly, there are three exceptions to this general rule, all three of which are closer to us genetically than any other primates: the orangutan, the chimpanzee and the gorilla. The orangutan falls just below the curve, the chimpanzee falls a little further, and the gorilla furthest of all. Extensive studies with chimpanzees and gorillas, however, show that both species are intelligent and self-aware enough to have developed a Theory of Mind.
Demonstrably, brain size is not irretrievably married to a set physical size, just as brain size is not irretrievably married to a set level of intelligence.
I know that you know that I know …
It does seem self-awareness, or sentience, is an emergent property of intelligence.[vi] In other words, as an animal increases in intelligence, at some point it will become aware of its own existence. This is more than simply being able to experience pleasure or pain, but the ability to experience life subjectively.
Objects found with the remains of H. floresiensis strongly suggests they made stone-age level weapons and tools.[vii] Obviously, such complex toolmaking suggests an active intelligence capable of learning new skills and – as importantly – passing those skills on to the next generation. This in turn suggests H. floresiensis possessed a language; if not a spoken language such as ours, with a huge vocabulary and complex rules of grammar, then at least some way to transmit a limited amount of information effectively and efficiently.
Evidence also exists that H. floresiensis hunted and scavenged animals such as the dwarf stegodon, a kind of elephant. To be clear, a dwarf elephant could still grow to more than two metres in height. For something the size of H. floresiensis to hunt stegodon strongly suggests they hunted in groups, which in turn strongly suggests their language was something more than a series of grunts.
With H. naledi, we are on somewhat less firm ground. Although they were larger-brained hominins than H. floresiensis, the remains of at least 15 individuals from the Rising Star Cave in South Africa were discovered without any tools or evidence of tool making. However, the species possessed a hand not dissimilar to our own, and would probably have been capable of tool-making. It is hard to imagine a hominin species living in Africa in this period, between 236,000 and 335,000 years ago, and not picking up the skill from one of the other hominin species occupying southern Africa at the same time (including, quite possibly, our own).
Furthermore, palaeoanthropologist Lee Berger, who led the expedition to recover the H. naledi remains from the Rising Star Cave, believes bodies were intentionally and repeatedly deposited there. This implies two things: first, ritual behaviour on the part of the species, and second, that they were capable of making fire, since the chamber the bones were discovered in is at the end of a long, dark, dangerous and narrow route.[viii]
I may not know much about art, but …
In his influential work on human development, The Ascent of Man, Jacob Bronowski wrote, ‘Man is not the most majestic of creatures. Long before the mammals even, the dinosaurs were far more splendid. But he has what no other animal possesses, a jig-saw of faculties which alone, over three thousand million years of life, make him creative. Every animal leaves traces of what it was; man alone leaves traces of what he created.’[ix]
Whether or not Bronowski used the term man to mean, specifically, H. sapiens, or more broadly to mean humans in general, we know that our cousins left behind more than traces of what they created. We have hundreds of stone tools, the tailings and debris of stone-tool manufacturing, and even examples of art.
I would suggest this equates to culture.
But what if there are no physical signs of culture – does that mean culture did not exist? Absence of evidence is not necessarily evidence of absence, and for cultural endeavours such as language or dance there can be no direct evidence before the invention of writing and art.
Simple language can be identified in many primates. Vervet monkeys, for example, have distinct calls for each of their four main predators: pythons, baboons, leopards and eagles. But we will probably never know which human species was the first to communicate with what we would describe as a complex language, one capable of conveying abstract thought. Vervet monkeys may be able to tell their fellows that a leopard is approaching, but they cannot say the leopard is hiding behind that bush or over that hill, let alone discuss the rights and wrongs of predation.
We see culture operating among our more social hominid cousins, the chimps and gorillas. Long-term field studies suggest, for example, that cultural variation exists among different chimpanzee groups, including differences in grooming, courtship and tool usage.[x] It is the ‘combined repertoire’ of chimp behaviours that is significant, demonstrating a range of cultural behaviours, a diversity that once was attributed only to our own species.
It is with the application and development of tool usage that the first signs of a distinct ‘human’ culture are found in palaeoanthropology. Whereas chimps and some bird species, like humans, use tools made from plants to gather food or build shelter, humans are the first animals to make stone tools, improving on the original material through knapping. Later, humans combined stone with other materials, such as wooden handles, to improve their effectiveness; in other words, using tools to make better tools. Indeed, the making of stone tools was once considered the boundary marker between members of Homo and earlier genera. Since then, the boundary for stone-tool making has been pushed well beyond those species traditionally grouped under our own genus.
The oldest crafted stone tools found so far are from Lomekwi in Kenya, dating back 3.3 mya[xi]. First discovered in 2011, they were probably made by a species belonging to either Australopithecus or Kenyanthropus. The tools were found in an area where Kenyanthropus platyops fossils had been found earlier.
But we have to wait more than 700,000 years before there is clear evidence of stone-tool making on a large scale, something we’ll cover in detail in a later post.
Eventually some hominins were not simply making stone tools: ‘The people who made the hand axes clearly had a specific shape in mind, and often went far beyond a purely utilitarian form in the care with which they produced them.’[xii] This is an example of humans crafting tools for aesthetic appeal, not just knapping to produce a sharp edge or a convenient grip.
It is with H. erectus that we find the first real example of an attempt at making what we would now call ‘art’. In 2014, scientists from the Netherlands’ Leiden University announced the discovery of a sea shell that had been engraved with a zigzag pattern 500,000 years ago, an engraving identified by ANU scientist Dr Stephen Munro (who did his PhD under Colin Groves!). The shell was originally collected with others at the end of the 19th century by Eugène Dubois – the discoverer of H. erectus in Java – but had not been closely examined since the 1930s. The scientists demonstrated not only that the engraving was not the result of natural forces, but that the pattern was made by ‘a strong and skillful tool-maker’[xiii]. The new date pushed back the first evidence for art by 400,000 years.
With language we’re on much shakier ground. Research suggests the physiological requirements for language exist in at least some monkeys. The stumbling block seems to arise in the way the brain is wired[xiv].
Nonetheless, as noted above with vervet monkeys, a language with a basic vocabulary exists among many primate species. It has even been shown that different species of monkey may understand some of each other’s vocabulary[xv]. Some species have even developed a basic grammar[xvi].
Extensive work has been done on language among the great apes, both in the wild and under controlled conditions. For example, the remarkable success scientists have had teaching American Sign Language to Washoe, a chimpanzee, and Koko, a lowland gorilla, demonstrates their capacity to learn quite a complex vocabulary, often using it to express emotions such as sadness.
But even the most optimistic view of these experiments shows that non-human great apes never demonstrate a level of intelligence found in a three-year-old human child. No chimpanzee or gorilla, for example, has ever used their acquired vocabulary to ask a question.[xvii]
There is genetic evidence to suggest that the development of the capacity for language accelerated in humans after we split from the chimpanzees some seven to eight million years ago[xviii], but precisely when humans started speaking in a way that we would describe as ‘human’ is unknown; it may never be known. As with so many things in evolution, the development of a complex language capable of expressing abstract thoughts almost certainly occurred along a spectrum.
Between them, language and craft handed humans a huge advantage in the evolutionary stakes. Making stone tools, for example, minimised our weaknesses: knives and hammers made up for our lack of sharp claws and fangs. Later, bows and throwing spears made up for our lack of speed in the chase.
Language allowed us to magnify our strengths, especially the ability to learn new things and pass that learning on to succeeding generations.
Language, and culture generally, seems to be something we share with other members of our genus, and indeed, as they are presently classified, earlier genera.
In the next post we’ll talk about bipedalism and one of the most controversial of hominin species – H. ergaster.
[i] Specifically, larger on average than the modern human brain, although the brains of archaic H. sapiens were in fact comparable to those of H. neanderthalensis. The following excerpt is from here.
‘To measure fossil brain volume, anthropologists have traditionally filled skulls with beads or seeds, and dumped the contents into a graduated cylinder (a precise measuring cup). They’ve also submerged molds of skulls into water, measuring the volume displaced. Today CT (computed tomography) scanning methods offer more accurate (and less-messy) measurements, but much of the data in textbooks and other references was collected the old fashioned way.
‘Based on these values, we can confidently say fossil Neanderthals and modern humans from the same time period had similar brain sizes. Twenty-three Neanderthal skulls, dating between 40,000 and 130,000 years ago, had endocranial volumes between 1172 to 1740 cm3. A sample of 60 Stone Age Homo sapiens ranged from 1090 to 1775 cm3.’
[xvii] Some scientists argue that Koko’s language skills were a result of ‘operant conditioning’, whereas others state she was indeed capable of simple questions. See Wikipedia entry here for more information and references.
Most of us did some biology at school, and most of us came out with the idea that species are groups of populations that cannot interbreed. When we’re reminded of mules, which are the offspring of horses and donkeys, we think ‘Ah, but they are sterile, aren’t they?’ Almost invariably they are, although there have been a few cases of fertile mules. And when cattle and bison interbreed, while the male offspring are sterile, the female offspring are fertile. All the big cats can also mate with each other, producing hybrids (in which the females are fertile), and in the case of the leopon, the hybrid between a leopard and a lion, even the males might be fertile.
Now that we can trace the ancestries not only of individual people, but whole populations and whole species, through DNA, it turns out that there has been a whole lot of successful – that is, fertile – interspecies breeding in the past. And it sometimes turns out that different species even today may interbreed with their neighbours on the quiet. For example, the primatologist Kate Detwiler discovered that two species of small monkeys in Tanzania’s Gombe National Park, the red-tailed monkey (Cercopithecus ascanius) and the blue monkey (Cercopithecus mitis), are found living in separate troops in some of the forested valleys, but in other valleys interbreed – in fact, in one or two valleys the monkey population consists entirely of hybrids.[i]
The idea that different species don’t interbreed is simply not true. They may not do so usually – but that is another thing entirely. We cannot use non-interbreeding as a criterion for species.
How then, can we define species?
For over 150 years now, the basic guiding principle of biology has been evolution – so the question we should be asking is: what is the evolutionary status of species? The palaeontologist George Gaylord Simpson (1902-1984) suggested in 1961 that the essence of species is that they are evolutionary lineages. He got little reaction at the time because his colleagues were largely hung up on the non-interbreeding criterion, but since the late 1990s his insight has been more and more appreciated. The best way to recognise an evolutionary lineage is, quite simply, that it differs from other evolutionary lineages. Horses and donkeys differ consistently and therefore represent two separate evolutionary lineages, and are therefore two different species. Similarly, blue monkeys and red-tailed monkeys differ consistently and therefore constitute separate evolutionary lineages, and again represent two different species.
If there are whole populations which consist of hybrids between two species, then what? Sometimes hybrid populations remain isolated for a good length of time and become homogeneous – and a new species is born. At least one species of monkey, the stumptailed macaque (Macaca arctoides) of mainland Southeast Asia, is thought to have arisen about 1 million years ago from a hybrid population between two other species.[ii]
For a more detailed discussion about the arguments about how to define species, especially the contest between the Biological Species Concept and the Phylogenetic Species Concept, go here.
So … generally speaking, what are genera?
So what about genera, families and other taxa?
While the taxa at both ends of the ranking are pretty straightforward – ‘species’ is eminently useful, and ‘domain’ and ‘kingdom’ are irresistibly sensible – all the ranks in between can get awfully confusing. And they are actually rather arbitrary. When, for example, do we know that a group of organisms constitutes a genus rather than a family?[iii]
One simple solution would be to organise those in-between ranks chronologically. In other words, the order Primates would include all those monkey, ape and human-like species which existed from the Palaeocene epoch, and the family Bovidae would include all those antelope, buffalo, cattle, sheep and goat species which existed from the early Miocene epoch.
This is an idea first forcefully proposed by German biologist Willi Hennig (1913-1976), considered the founder of cladistics – or ‘phylogenetic systematics’ if your thesaurus is turned on.
In 1966, Hennig proposed linking the taxonomic rank of a clade to its time of origin. He argued that if taxa are to mean anything they must represent monophyla – that everything in that group must be descended from a common ancestor. He also argued that taxa had to be characterised chronologically.
Hennig was an entomologist, and he realised that while many genera of insects separated from one another tens of millions of years ago, the genera of mammals and birds separated much more recently.
The idea was taken up by American scientist Morris Goodman (1925-2010), one of the founders of molecular genetics. He set about constructing a consistent scheme for the group of mammals with which he was most familiar – the primates. In 1997, he suggested that a reasonable time depth for a primate genus would be seven million years, partly because this would do the least violence to the presently accepted system of determining genera.
Colin surveyed many of the mammalian genera that taxonomists had recognised and found that most had separated from each other less than seven million years ago. Subsequently, he proposed that five million years was a more appropriate time depth for mammalian genera: the Miocene-Pliocene boundary.
Furthermore, Colin suggested that the taxonomic rank of ‘family’ had a time depth of 24 million years, separate families splitting around the time of the Oligocene-Miocene boundary. Going up one more ranking, the different ‘orders’ separated around the time of the Cretaceous-Tertiary boundary (the famous K-T boundary that marks the arrival of the asteroid that wiped out non-avian dinosaurs[v]).
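The time-depth scheme described above can be sketched as a simple rule that maps a clade’s divergence time to a taxonomic rank. This is a toy illustration only: the function name and the exact cut-offs are this sketch’s own rendering of the proposals (genus at the Miocene-Pliocene boundary, family at the Oligocene-Miocene boundary, order at the K-T boundary), not any published tool.

```python
def rank_for_divergence(mya: float) -> str:
    """Assign a Linnaean rank to a clade from its divergence time,
    following (roughly) Colin Groves's proposed time depths."""
    if mya <= 5:       # Miocene-Pliocene boundary: proposed genus depth
        return "genus"
    if mya <= 24:      # Oligocene-Miocene boundary: proposed family depth
        return "family"
    if mya <= 66:      # K-T (K-Pg) boundary: proposed order depth
        return "order"
    return "higher than order"

# Under this scheme the Australopithecines, splitting off ~4.2 mya,
# would fall inside a single genus with us...
print(rank_for_divergence(4.2))   # -> genus
# ...while the human-chimp split (~7-8 mya) sits at family level,
# which is why Goodman's earlier 7-million-year genus depth would
# have pulled chimpanzees into Homo.
print(rank_for_divergence(7.5))   # -> family
```

The point of the sketch is the design choice Hennig argued for: rank follows time of origin, not degree of anatomical difference.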
One of the consequences of Goodman’s proposals for palaeoanthropology is that most if not all members of the human lineage would belong to a single genus. Indeed, using his original suggested time depth of seven million years, Goodman even included chimpanzees in Homo. Overall, the later modifications devised by Colin wreak less havoc with the established order, but they would still require that most human fossils be placed in the same genus as ourselves.
Arguments about when hominins evolved into a genus that can be described as wholly human traditionally revolved around the relative importance of different physical characteristics: brain size, dentition, general morphology (body size, especially the extent of sexual dimorphism), and primary form of locomotion.
Other more controversial factors sometimes taken into consideration include tool-making, art and other signs of culture, and evidence of community living.
For example, some experts such as Ian Tattersall, curator emeritus with New York’s American Museum of Natural History, argue that the cranium of Homo floresiensis (the Hobbit, see here, here and here) is too archaic for it to be included in our genus.
This leads us to our second, and more controversial, opinion: following Colin’s plan our genus would include not only H. floresiensis but even older and more archaically featured species traditionally belonging to other genera, such as the Australopithecines, which include the Taung Child and Mrs Ples.
Colin argued that the Miocene-Pliocene boundary more or less corresponds to the onset of the only characteristic definitely belonging solely to our genus and to no other genera among the great apes – bipedalism. By bipedalism we mean that the main form of locomotion is walking or running on two legs, with the big toe aligned with the other toes in the foot.[vi]
Accepting this argument has two major implications and several minor ones for palaeoanthropology. First, and least controversially, brain size is not by itself a qualification for membership of the human genus. Specifically, a small brain does not exclude membership.
The discovery of H. floresiensis and H. naledi in the 21st century, with average brain sizes of around 420 cm3 (about the size of a modern chimpanzee’s) and 500 cm3 respectively, clearly demonstrates that some humans were small-brained compared to H. sapiens but possibly still capable of sophisticated tool-making and ritual behaviour.
Secondly, accepting a time criterion for determining which species do and do not belong to the genus Homo means that strictly morphological traits are no longer decisive in determining human status.
In the next post, we’ll look in more detail at brain size, culture and bipedalism as criteria for determining whether or not a species is human.
[iv] Groves, Colin. ‘Time and taxonomy’. Ludus Vitalis. Vol IX. No 15. 2001.
& Groves, Colin. ‘Speciation in hominin evolution’. African Genesis: Perspectives on Hominin Evolution. Ed Reynolds, Sally & Gallagher, Andrew. Cambridge University Press. 2012.
& Groves, Colin. ‘Current taxonomy and diversity of crown ruminants above the species level’. Zitteliana B32, International Conference on Ruminant Phylogenetics, ed. Prof. Dr G Worheide, Bavarian State Collection for Paleontology and Geology, Munich.
[v] Now also sometimes referred to as the Cretaceous-Palaeogene (K-Pg) boundary.
‘Burdalone’ is an old Scottish word meaning the last bird in the nest, the one left when all the other chicks have flown or all the other chicks have died. It’s a sad and lonely word, and perfectly describes Homo sapiens.
When one of the first members of our own species studied the world around her, most of what she saw would be familiar to us today, whether from personal experience or from watching nature documentaries about Africa. Extensive grasslands dotted with acacias, watering holes and narrow rivers with crumbling banks, herds of large grazing animals such as wildebeest and zebra, black herons and lizards, secretarybirds and crocodiles, a lion pride or two, and our deadliest predators – a leopard and a pack of hyenas.
What she also saw, and which none of us will ever see, is other groups of human beings that were not H. sapiens. Like our ancestor, they were striding on two legs and using their large brains and opposable thumbs to harvest nuts and berries, sometimes to hunt or scavenge for meat, and to fend off predators. They looked very similar to us, used tools, and some may even have created art and used language to talk to one another.
Around 350,000 years ago – currently the earliest date at which we think H. sapiens might have first strode the planet[i] – there was no reason to think things would ever change.
But to say that we are human today is to say that we are members of a single worldwide species. This is extraordinary because for millions of years to be human meant that you could be a member of any one of a number of different but related species.
It should not be contentious to say that all members of the genus Homo are human – after all, this is what the Latin word ‘homo’ means – but it is contentious to suggest, as I will later in these posts, that all bipedal great apes are human.
But first it’s important to state that it’s currently impossible, and may forever be impossible, to finally determine when we stopped simply being hominids – that is, all the apes except for the gibbon – and became hominins as well – that branch of the hominids exclusive to us and our human cousins; this is the point at which chimpanzees went their way and we went ours.[ii] We may never know exactly when the thousands of physical and psychological characteristics that distinguish us from other great apes evolved; what we can be sure about is that almost all of them were shared with at least some of our hominin ancestors.
We are so close to our cousins, genetically and historically, that making a distinction between whether or not they are human seems farcical. Indeed, the same argument can be made for any two species close to each other in the hominin line.
In 2005, British celebrity Alan Titchmarsh allowed professional make-up artists to disguise him as a Neanderthal; he then walked along the streets of London, almost completely ignored by everyone.[iii]
At some point we need to demarcate between those species we consider human and those we consider pre-human, and to date the only specific marker that distinguishes all of us from all of them is bipedalism, not some arbitrarily determined measurement of brain capacity, morphology or dentition.
As well, research into the workings of the human brain, and into animal intelligence generally, has thrown into doubt those psychological characteristics we traditionally considered to be peculiarly human, characteristics that made us special and put us above the rest of the animal kingdom. Once upon a time, we were considered the only animal to make tools, then the only animal to make tools and smile, then the only animal to make tools, smile and do handstands.
It’s similar to a town building a bridge and claiming it’s the only bridge in the world, only to discover that a nearby town has one as well. So the first town now claims it has the only single-span bridge in the world, until it learns there is another single-span bridge in the next county. The first town now claims it has the only single-span bridge in the world with green arches, and so on, every new definition increasingly trivialising what makes its bridge special.
There is strong evidence that intelligence has arisen many times in the animal kingdom: in primates; cetaceans; elephants; larger carnivores such as dogs, hyenas and the big cats; birds, particularly corvids and parrots; and some molluscs such as octopuses and possibly squids.
There is also growing evidence that self-awareness and even a theory of mind[iv] exist in other primates such as chimpanzees and in some birds such as crows.
So what are the characteristics that separate humans from our nearest living relatives, the chimpanzee and bonobo?
Before we answer that, we have to talk taxonomy and cladistics – how scientists classify living things.
Life is a spectrum
In his book The Vital Question, biochemist Nick Lane writes that ‘the distinction between a “living planet” – one that is geologically active – and a living cell is only a matter of definition … Here is a living planet giving rise to life, and the two can’t be separated without splitting a continuum.’[v]
Different scientists may employ different markers or waypoints in determining the start of life on Earth, all of which are subject to controversy and disagreement, but the truth is that there is no precise point in time when anyone could claim that a given chemical process for the first time was created by life rather than geology; it would be an arbitrary decision.
The same principle applies throughout evolution. There is no precise point in time where we can say fish gave rise to amphibians or basal reptiles to dinosaurs.
Change in life brought about by evolution through natural selection isn’t episodic; it’s a spectrum.
But evolution does present a handful of events when with some certainty we can say a new direction had begun – a direction with significant ramifications for all life that follows.
The first of these, and covered in some detail in The Vital Question, concerns the creation of the eukaryotic cell: a morphologically complex cell that contains a separate nucleus and mitochondria, each surrounded by a double membrane. As far as we know this remarkable event occurred only once in all history[vi]. An archaeon, a single-celled prokaryote, absorbed another kind of prokaryote – a bacterium – and instead of consuming it established a symbiotic relationship.
At some point later in history, some of the descendants of that first complex cell started a symbiotic relationship with a second prokaryote invader, creating chloroplasts and starting the line that would eventually lead to plants and green algae.
More recently, the arrival of the first human was an event with tremendous ramifications for all life on earth.
But when did this happen?
The king of Spain did what?
Before we go any further, we need to talk about a subject that normally works like a sedative on anyone not interested in taxonomic detail: the organisation of the taxa themselves.
I promise to keep this short and to the point, but it’s important to cover because we need to reconsider how and where our human family fits in with other living things. And, of course, when it all happened.
Mnemonics are as much a part of school science classes as microscopes and Bunsen burners. For example, one mnemonic frequently used in the last century for memorising the different taxonomic ranks was ‘King Philip Came Over From Great Spain’, standing for the main taxa in the Linnaean system: Kingdom, Phylum, Class, Order, Family, Genus, Species.
Taxonomic ranking has been around for as long as humans have been curious about the natural world, but the above ranking developed from a system introduced by the Swedish naturalist Carl Linnaeus in the 18th century. He wanted to organise living things so their biological relationship to each other was made very clear. He did this by using shared characteristics to lump things together.
For example, these days all animals that have fur and warm blood and that suckle their young with milk are put into one group, the class called mammals. The mammals themselves are grouped together with all other animals with a backbone to form a phylum called the chordates. The chordates and animals without a backbone are thrown together into a kingdom called Animalia. More formally, taxa – the collective term for the groups at each ranking – sharing a more recent common ancestor are more closely related than they are to taxa which share a more remote common ancestor: in other words, a wombat is more closely related to a dog than it is to a crocodile, and more closely related to a crocodile than it is to a flatworm.
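The wombat-dog-crocodile principle can be made concrete with a toy model (the animal names and simplified rank chains below are chosen purely for this sketch): give each animal its chain of taxa from kingdom down, and count how many ranks two animals share. More shared ranks means a more recent common ancestor, and therefore a closer relationship.

```python
# Simplified lineages, kingdom downwards (illustrative only).
lineages = {
    "wombat":    ["Animalia", "Chordata", "Mammalia"],
    "dog":       ["Animalia", "Chordata", "Mammalia"],
    "crocodile": ["Animalia", "Chordata", "Reptilia"],
    "flatworm":  ["Animalia", "Platyhelminthes"],
}

def shared_depth(a: str, b: str) -> int:
    """Count how many ranks two animals share, from the top down."""
    depth = 0
    for x, y in zip(lineages[a], lineages[b]):
        if x != y:
            break
        depth += 1
    return depth

# A wombat is closer to a dog than to a crocodile,
# and closer to a crocodile than to a flatworm.
assert shared_depth("wombat", "dog") > shared_depth("wombat", "crocodile")
assert shared_depth("wombat", "crocodile") > shared_depth("wombat", "flatworm")
```

Real classifications have many more ranks, of course, but the ordering logic is the same.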
Linnaean taxonomy also introduced the binomial, the familiar two-name identifier used in science to classify an organism at the most detailed commonly used level, that of species. Homo sapiens, for example, is the binomial for human beings, just as Panthera leo is the binomial for lions and Quercus robur is the binomial for the English oak. The first word is the genus (plural genera), the second the species.
Taxonomy is a lovely idea, and appeals to anyone who thinks good old common sense is all you need when sorting bookshelves and tidying kitchen cupboards. For over two hundred years it was regarded as an almost fool-proof system: a place for every living thing and every living thing in its place.
But our knowledge of the natural world is not like that of our kitchen. Like the natural world itself, it is messy, chaotic, growing and constantly evolving.
In 1990, American microbiologist Carl Woese (1928-2012) suggested a new step was needed at the top of the taxonomic ladder to reflect the discovery of a whole branch of life whose existence was never suspected until the 1970s. The archaea, single-celled prokaryotes, were long thought to be a kind of bacteria, but work by Woese and other scientists revealed they are as chemically different from bacteria as we are.
The commonly accepted taxonomic ranks now start with ‘domain’, leaving us with cumbersome and self-defeating mnemonics such as ‘Determined, Kind People Can Often Follow Ghostly Screams’ or ‘Do Kings Prefer Chess On Fridays, Generally Speaking’.
Domain isn’t the only extra rank added over the decades. We also have ‘subfamily’, ‘tribe’, and sometimes ‘subtribe’, ‘subgenus’ and ‘subspecies’, and that’s just in the field of zoology.
In the story of ‘Us’ we’ll be dealing mainly with genus and species, and in the next post we’ll discuss what makes up both taxa.
[ii] Some palaeoanthropologists include chimps and bonobos in the hominins. Rather than outlining all the arguments for or against, I’ll err on the side of caution and include only our immediate family in the hominins.
[iv] This is the ability to attribute mental states similar to your own to other members of at least your own species and possibly other species as well.
[v] Lane, Nick; The Vital Question; London, 2015; p 27.
[vi] This may have happened a second time. A single-celled organism with a nucleus, and possibly mitochondria, dubbed Parakaryon myojinensis, was retrieved a few years ago from the foot of a sea creature found off a coral atoll not far from Japan.
The main difference between P. myojinensis and all other eukaryotes is that its nucleus and mitochondria are surrounded by a single membrane instead of a double one, and its DNA is stored in filaments (as in bacteria) suggesting it is the result of a different line of evolution from all other eukaryotes. Indeed, there is some argument as to whether it is a true eukaryote at all. The only thing that can be said with some certainty is that it is definitely not a prokaryote.
No other example of this creature, or anything similar, has since been recovered. Nonetheless, when it comes to science, hope springs eternal …
This and the following five posts will be about Us. Not Uncle Sam or Ultra Sound or Ultimate Spas. Not you and me. But all of Us. Every single human alive today and every single human who has existed in the past. And by every human I mean every member of the genus Homo, and every member of the genera Australopithecus, Kenyanthropus and Paranthropus, a lineage that stretches back nearly five million years and is still going strong today.
I considered another title for this series of posts – ‘Mongrel’ – because Homo sapiens are mongrels. I don’t mean in the way an Australian might call you a ‘mongrel’ if you rear-end his ute or support a different footie team, but in the sense that we are animals of mixed breeding.
I want to write about the revelation made by palaeoanthropology over the last 25 or so years that Anatomically Modern Humans (AMH) have no single direct ancestor. The different species that gave rise to us bred with each other again and again, cross-pollinating over millions of years. We are, each and every one of us, a mulatto, a crossbreed, a cafuzo, a zambo … in short, a mongrel. This is something for us to crow about. We are the beneficiaries of millions of years of striving, surviving and thriving by many other members of our hominin tribe. Having said that, recognising that we owe our existence to a plethora of species, and not to one single predestined or divinely sanctioned line of descent, may also help us shed our belief in the exceptionalism of H. sapiens.
These posts are also a way for me to record a project I long dreamt of doing and eventually started some six years ago but can no longer complete, a book about hominin evolution I was writing with my friend, the late Colin Groves[i]. I cannot write that book without Colin – his knowledge and experience were unique even in the rather rarefied circle of palaeoanthropology – but what I can do is finally record as faithfully as possible some of his ideas about hominin evolution.
To start with, I’d like you to meet a small child. A child named Taung.
Darwin was right
The child first came to attention in 1924 when its tiny skull was discovered by Raymond Dart in one of two boxes of tufa and sandstone debris he received as he was dressing to attend a wedding as best man.
Dart, an Australian doctor and anatomist, had only recently taken up the post of professor at the University of Witwatersrand in Johannesburg, and had spread the word he was interested in any fossils his students or acquaintances might uncover to help stock his fledgling laboratory. In this case, the debris was from a limestone quarry in Taung, a small mining town in South Africa’s Northwest Province.
When the boxes arrived he hurriedly inspected them. In the second box he saw something that changed his life and the history of palaeoanthropology.
In his own words, a thrill of excitement shot through him.
‘On the very top of the rock heap was what was undoubtedly an endocranial cast or mold of the interior of the skull. Had it been only the fossilised brain cast of any species of ape it would have ranked as a great discovery, for such a thing had never before been reported … a brain three times as large as that of a baboon and considerably bigger than that of an adult chimpanzee …
‘But was there anywhere among this pile of rocks, a face to fit the brain? I ransacked feverishly through the boxes. My search was rewarded, for I found a large stone with a depression into which the cast fitted perfectly … Here I was certain was one of the most significant finds ever made in the history of anthropology.
‘Darwin’s largely discredited theory that man’s early progenitors probably lived in Africa came back to me.’[ii]
Indeed, Dart’s discovery eventually switched the focus of palaeoanthropology’s search for the origin of our species from Eurasia to Africa, an origin Charles Darwin had predicted in The Descent of Man in 1871.
Using his wife’s knitting needles, it took Dart weeks to separate the Taung Child (Taung 1) from its breccia matrix. The paper[iii] he wrote about the discovery appeared in Nature in early 1925, and in that paper he named the specimen Australopithecus africanus, Africa’s southern ape.
At first, the scientific establishment reacted negatively to Dart’s hypothesis that the Taung Child represented an ancestor of modern humans. Until then it had been believed humans must have evolved in Europe or Asia, a belief reinforced by the discovery of H. neanderthalensis in 1829 (but not recognised as a different species from us until 1856) and H. erectus in Java in 1891 (a story we’ll come back to later in this series of posts).
Over the following decades, however, the number and diversity of fossils uncovered in southern and eastern Africa have overwhelmingly supported the ‘Out of Africa’ hypothesis for human origins.[iv]
The Taung Child itself was thought to be about three years old when it died. Not only was its life short, it ended violently. In 2006, the University of Witwatersrand’s Lee Berger wrote that marks in the Taung Child’s eye sockets and on its skull suggested it was probably killed by a large bird of prey.[v]
Even though it was the first described member of the genus, it turned out A. africanus was not its oldest member, and may not even have been one of our direct ancestors.
Meet the great-great-great-grandparents
At the risk of making a bad rhyme, exactly what does it mean to be an Australopithecine?
This is a matter of debate. Some scientists merge a chronologically older primate genus, Ardipithecus, with Australopithecus to make the subtribe Australopithecina. Others leave out Ardipithecus and include Paranthropus and Kenyanthropus with the Australopithecines. While they’re at it, some scientists consider the Australopithecines to be members of the human family, while others think the family starts much later – with the first species in the genus Homo.
It gets very confusing very fast, especially since every new discovery – and over the last 25 years there have been many of those – seems to generate a new species and subsequently a new debate about what it means to be human, hominin, or hominini (generally accepted to be humans plus chimpanzees). Or for that matter, what should be included in the genera Australopithecus, Homo, Paranthropus and so on and so forth.
For the sake of these posts, I’m assuming at this point that Australopithecines are fine and upstanding members of our human family. Great-great-great-grandparents (or cousins to the nth degree), in a manner of speaking. At a later point I’ll be examining more deeply what makes a genus … but we’ll paddle that delta when we get to it.
The oldest species belonging to this genus is A. anamensis[vi], kicking off just over four million years ago (mya). Other Australopithecines include A. garhi, A. afarensis (Lucy is probably the most famous example of this species, if not the most famous human fossil of all), A. bahrelghazali, A. deyiremeda, A. prometheus and A. sediba. A. sediba is the last known of the genus as well as the most recently discovered[vii], existing as recently as 1.8 mya, making it a contemporary of one of our ancestors, H. ergaster.
Over the two million plus years the genus existed, cranial capacity jumped from around the 360 cm3 mark (slightly smaller than the average for a chimpanzee) to nearly 440 cm3, an increase of more than 20%.
The Australopithecines are generally thought to have given rise to our genus around 2.4 mya. Occasionally one Australopithecine or another is nominated as materfamilias, but the truth is no one really knows which species – if any of those so far discovered – gave rise to our side of the family. As well, there is constant toing and froing about how many species there actually are (and as we’ll see the same toing and froing goes on in discussions about the members of our own genus).
In the next post I’ll discuss what lies at the heart of all of these debates: the big question, a question that may never be satisfactorily answered.
The late Australian philosopher and ecofeminist Val Plumwood was attacked and almost killed by a saltwater crocodile in 1985. The fact she survived three ‘death rolls’ is down to her sheer determination to escape and a good amount of luck. Severely injured, with one leg exposed to the bone, she somehow managed to walk and finally crawl to the nearest ranger station, some three kilometres away.
In her essay ‘Prey to a crocodile’, Plumwood writes that during the attack ‘I glimpsed the world for the first time “from the outside”, as a world no longer my own, an unrecognizable bleak landscape composed of raw necessity, indifferent to my life or death.
‘ … It was a shocking reduction, from a complex human being to a mere piece of meat.’
Human exceptionalism is the belief that we as individuals and as a species are separate from and superior to all other life on earth. It is a belief innate in almost every human, especially those belonging to so-called developed societies, that stems from our almost complete domination of the planet’s landscapes and ecologies. We are the world’s most numerous large animal, and our technology has enabled us to travel from the deepest abyss to the surface of the moon. Some aspects of our technology are overwhelmingly prolific and invasive: plastic, for example, is now found from the highest point to the lowest point on Earth’s surface and throughout our own food chain.
Human exceptionalism partly stems from the way we historically treat the animals and plants with which we share the planet. They are the resources we need to survive and thrive, and we reshape entire ecosystems to sustain industries that provide those resources as cheaply, efficiently and abundantly as possible. This has been at the expense of vast swathes of rainforest, wetlands and temperate forests, environments essential to the health of life on earth.
But as Val Plumwood discovered, it doesn’t take much to reduce a single human being from a member of the planet’s dominant animal to just another source of food.
In 2020, in the middle of South Africa’s first and strongest COVID-19 lockdown, I wrote a short story called ‘Speaker’ for a competition run by Sapiens Plurum, an organisation created to ‘inspire (humans) to aspire beyond what was humanly possible’.
The competition’s theme was ‘how can technology increase empathy and connection?’ They wanted authors to imagine ways technology can improve how we relate to each other and bring us closer, even across species.
The idea for ‘Speaker’ came from one of those moments of serendipity – or perhaps synchronicity is a better term – when two ideas fuse to create a third idea. The first idea was based on the development of protein microchips, a scientific endeavour that had its research heyday in the 80s; one objective of the research was finding a way to help people suffering from brain injury to regain full health. The second idea is a personal fantasy: to one day communicate with one of our hominin cousins, such as Homo neanderthalensis or H. ergaster. The fusion of these two ideas created the third idea: using linked protein microchips for communication between two modern species, Homo sapiens and, in this case, Crocuta crocuta – the spotted hyena[i].
The story won the competition, and subsequently Sapiens Plurum asked Slate Magazine to consider publishing it. Slate agreed, and in January published it in Future Tense, a partnership between Slate, New America (a Washington-based think tank), and Arizona State University’s Center for Science and Imagination.[ii] Specifically, my story was part of a series sponsored by the Learning Futures initiative out of Mary Lou Fulton Teachers College at ASU.
Stories appearing in Future Tense have a ‘response essay’ written by someone who is an expert in the field or issue covered by the story. In my case, I was fortunate to have Iveta Silova, an expert in global futures and learning, write the response in a piece called ‘If Nonhumans Can Speak, Will Humans Learn to Listen?’
As an extra bonus, Mary Lou Fulton Teachers College then arranged for an online discussion between Iveta, Punya Mishra, a professor and Associate Dean of Scholarship and Innovation at the college, and myself, on the creation of ‘Speaker’ and the issues covered by it and Iveta’s response. That discussion was recorded and subsequently uploaded to YouTube.
The discussion’s central issue turned out to be about human exceptionalism. As Iveta explains in her essay:
‘Today … we are forced to acknowledge that we are not so special after all. On the one hand, we wonder and worry whether artificial intelligence will become conscious, leading us down a dystopian spiral of human irrelevance. On the other hand, we see a major shift in scientific thinking about plant intelligence and animal consciousness, suggesting that the difference between human and nonhuman species is just a matter of degree, not of kind. Meanwhile, our hyperseparation from the natural world is threatening every species on Earth—including humans.’
Iveta goes on to write that ‘Overcoming the modernist assumption of human exceptionalism and reconfiguring our relationship with a more-than-human world is a complex and long-term project.’
In ‘Speaker’, linking humans with different species is an attempt to overcome human exceptionalism, but the exercise itself is fraught with difficulties, especially the hurdles imposed by our own innate prejudices and assumptions about what it means to be human in a world that seems to be so completely dominated by humans.
And this is where our hubris kicks in. For the most part life on Earth is dominated by viruses, archaea and bacteria, but we are so coddled by civilisation that even if we understand this intellectually, it is usually impossible to acknowledge it instinctively. The COVID-19 pandemic, for example, has demonstrated that for all our technological and cultural achievements, our entire civilisation can be put on hold by a virus so small that all the world’s COVID-19 particles could be contained in a single soft drink can. It is well to remember that in ancient Greek tragedies, hubris comes before a great fall.
Linked to that hubris is the assumption in the story that given the capacity to link our own minds with those of other animals, we will go ahead and do it. The story doesn’t engage with the ethical issues of communicating in such a way with another species. For example, what repercussions would there be for the recipient species? How do we stop the link resulting in one species overly influencing or even dominating the other? In fact, how would we even begin to estimate what impact there might be? And if the decision was made to go ahead and make the link, how do we deal with the issue of privacy? How do the two linked intelligences stop invading each other’s most private thoughts? Can thoughts be turned on and off like a tap, or would the link open a floodgate that would drown both parties in a wave of facts, emotions and random thoughts?
Perhaps most importantly of all, and in the context of ‘Speaker’ the most relevant, is how do we interpret those thoughts? How do we know for sure that our brains won’t ‘mistranslate’ the thoughts they receive, and vice versa? In the story this is handled with the ‘joking’ subtext, the way Akata and Samora try to find a way around their very different life experiences to reach a common understanding for the concept of humour, something humans but not hyenas possess (at least in the story).
And yet, despite all of these issues, I see linking with another species as a wonderful opportunity and a positive action at so many levels. In her responding essay, Iveta actually quotes Val Plumwood:
‘According to … Val Plumwood, we must reimagine “the world in richer terms that will allow us to find ourselves in dialogue with and limited by other species’ needs, other kinds of minds.” This is, she argues, “a basic survival project in our present context.”’
It’s time for humans to put aside their exceptionalism and hubris. Apart from the damage to the planet such an attitude encourages, it damages us, keeping us artificially apart from the rest of life on earth. We cannot flourish as a species by ignoring the fact that we, like spotted hyenas and saltwater crocodiles and for that matter centipedes and flies, are animals. We aren’t the endpoint of evolution, just one of its offshoots.
[i] An animal seriously misrepresented in human culture. The spotted hyena is an intelligent and extraordinarily social predator that lives in large troops dominated by females. And I do mean ‘predator’; despite its historic image as a scavenger, almost all its food comes from actively hunted prey and not from stealing some other animal’s kills.
‘We need to codesign programs that move away from disempowering communities and indigenous people to giving them the power to be strong stewards of the natural resources and the lands,’ says Dr Patricia Mupeta-Muyamwa, Strategy Director for the African Indigenous Landscape program at The Nature Conservancy, a charitable environmental organisation with its base in the US.
Her job involves working with local communities to protect and nurture the natural environment. Patricia says she fell into the work more by accident than design.
‘I did my undergraduate degree in wildlife ecology at the University of Zambia in Lusaka, and in my last six months did an internship monitoring wildlife and vegetation in a national park. The job involved interacting with the park scouts, and after listening to their experiences I realised that it was people and not wildlife that were the problem, and I asked myself how do we empower people to make them better stewards of nature?
‘I did my Masters in conservation and tourism in the UK, and learned about different models of conservation. Because of the chequered history between national park administration and local communities, which left a great deal of animosity towards the state, my work promotes the importance of getting the rights to land and natural resources to the people that live closest to them.
‘Historically, African national parks and nature reserves were created for aesthetic reasons using an American model first developed for Yellowstone National Park.
‘Up until the 1990s, the state and not the local people ran national parks and conservation areas; it was a relic of Africa’s colonial past, and part of my work is to help address this injustice by reconciling local people so they’re a part of the conservation solution.
‘Local communities were forced out. People were seen as part of the conservation problem and not as part of the solution. For example, in South Africa national parks are still state run in a very centralised way; there are many communities around Kruger but few are getting any real benefit from it except a few people that find employment.’
Patricia says her long job title came from her work as it evolved.
‘A large part of the job is focused on protecting wildlife corridors spanning across parks, private and community-owned lands.
‘The work itself has three main objectives. First, giving land and resource rights back to the local community. Second, developing community skills to manage natural resources, for example protecting and monitoring wildlife. Third, helping develop community opportunities for making a living from conservation, for example with tourism and programs that empower women.’
Patricia stresses this is a bottom-up approach. ‘A big part of my job is to consult with communities and their leaders to find the best conservation solution. I listen to their stories about living and interacting with the land.’
Patricia leads teams that are managing four big landscape projects, one in Kenya involving 39 separate communities, two in Tanzania and one in Zambia.
‘We’ll soon be starting a fifth one in Angola, based around the headwaters of the Okavango River.’
As an example of what these projects can achieve, Patricia cites the work done with a local partner, the Northern Rangelands Trust, and the 39 communities in Kenya.
‘Establishing wildlife corridors between these communities has been successful in increasing numbers of previously threatened animals such as elephants.’
Patricia was born and raised in Kitwe, a mining town in Zambia’s Copper Belt on the Kafue River, Zambia’s third largest river. This is also where she first met her husband Andrew, now a Maths Studies high school teacher. The two of them have fond memories of growing up in this small, quaint mining town.
‘My parents worked for a mining conglomerate. My father worked for 27 years as a human resources manager for a copper mining company. He was a real people person, and connected with people from all walks of life.
‘My mother was a teacher, training first in Zambia to teach home economics, but later she studied in Liverpool in the UK to become a Montessori teacher; and was the first Zambian to achieve this.’
Patricia grew up in a one-party state created by independent Zambia’s first president, Kenneth Kaunda. Following a period of instability, the 1973 signing of the Choma Declaration banned all parties except Kaunda’s own, the United National Independence Party (UNIP). He remained in power until he was ousted after being forced to hold multi-party elections in 1991.
‘Kitwe’s British-South African owned mining company was nationalised by the Zambian government, so I grew up thinking it was normal to live in a black-run black society. It was a source of pride for us that Zambians were in charge of the company.’
Patricia says that even though she grew up in a one-party state, she only became aware of that as she finished high school.
‘But living in it as a child you don’t necessarily feel authoritarian measures, for example restricted access to the world outside Zambia. We were cocooned, but that didn’t feel bad. In some ways I would rather live in that state than what exists now. Things worked: there was infrastructure that worked, equity for all seventy-two tribes and a sense of security. I believe Kenneth Kaunda was motivated for the greater good of society. He created an environment that allowed everyone to have access to healthcare, education and employment regardless of background.
‘Kaunda created a system where we didn’t feel black, but Zambian. My father’s generation, which grew up under colonial rule in what was then Northern Rhodesia, was taught British, European and American history at school; my generation was taught pre- and post-colonial African history.
‘Kaunda led the way in institutionalising a Zambian identity. As a kid I didn’t really appreciate the gravity of this, but looking back now I see that it helped me navigate through life as a Zambian. Kaunda called this philosophy “humanism” – in the sense that the core values were about recognising our common humanity, and that we should always be aware that history was judging us and so be peaceful, respectful and good to each other.’
But things started to change in the late 1980s and early 1990s.
‘The economy was stalling and there were food shortages. Up to then the majority of Zambians had been politically passive; there wasn’t a lot of collective activism. The system that existed helped make it that way. But at that point the multi-party democracy movement challenging Kaunda was slowly taken up by the people.
‘When I was sixteen I was apolitical, but then my dad took me and my older brother to my first political rally just before Kaunda left. It wasn’t simply an anti-Kaunda rally, but more about a wind of change. It was huge and exciting – there was a great desire for change – and when it came I was hopeful. Everything felt new and that at last we were going places and fighting for a better Zambia. There was a sense of entrepreneurship in the early 90s, and new markets were opening up. The mines were privatised, for example, and different assets were being sold, like the mining homes, and many Zambians became home owners for the first time.
‘But in the euphoria we forgot what Kaunda had done for Zambia. The current political system in Zambia is not as effective as the old political system. There is less equity and less access to health, work and education. The Zambian economy is on life support.’
The one great source of stability for Patricia is her family.
‘I come from a very strong nuclear family, which is not the norm for families in Africa. It is a central part of who I am. My husband, parents, siblings and my maternal grandmother have all influenced my life in different aspects.’
Patricia says her grandmother, Dorika, was independent, strong-minded, political and entrepreneurial. Born in the early 1920s, she witnessed her country move from a colonial to a post-colonial era.
‘She was a Kaunda supporter and freedom fighter from the colonial era. She later became a strong organizer in the women’s league of the United National Independence Party (UNIP).
‘Towards the end of the colonial period she accompanied her husband, a community development officer, to different postings all over the country. In one posting he was sent to a district in the northwest at the same time as the colonial authorities imprisoned Kaunda there; when Dorika saw Kaunda being taken for his daily walk she would go up and talk with him, much to the distress of the local British officials. During one encounter she was reprimanded by the District Governor for this action. She held her ground, and continued with her actions. This upset the Governor and he later transferred my grandfather away from the district because of his “troublesome wife”.
‘During the time when there was a call for change from Kaunda’s rule, she said “No! No change!”’
After her husband died, Dorika supported her family of eight children by selling bread and other baked goods from home and at the market.
‘With two other women she set up one of the first female trading markets in Kabwe, a small mining town in central Zambia; it’s still operating to this day.’
Patricia says she drew a great deal from her grandmother.
‘I admired the way she navigated through life and survived as a woman and as a leader. She did so much in her life and in her own way. The older she got the stronger she got, and she was a great female model. She really lived life in her own terms.’
Patricia’s father, David, was the biggest male influence on her life. ‘My love of reading came from him. I loved going into his library. I read his 12-volume encyclopedias over and over.’
Patricia says growing up she never gave her mother the same attention she gave her father.
‘I was a “daddy’s girl”, and she wasn’t in my “cool space” back then. Now I realise just how similar we were. She was a trail-blazer. She was the first Zambian to study and teach Montessori; that took a lot of initiative and courage.’
Perhaps the biggest influence her mother had on her life was her decision to send Patricia and her sister, Edith, to an all-female boarding school run by German nuns, one of the country’s oldest and best schools from its establishment in the early 1900s. She remembers the school was run under a very strict regime.
‘I did not like it at all. The nuns worked us very hard. When I tell people I went there they ask me if my parents hated me! But in hindsight, the education I gained from that time was invaluable.’
Patricia says she wasn’t really conscious of her skin colour until she travelled to the UK and, especially, the US, for study.
‘I’m not sure whether or not that was a peculiarly Zambian experience. I’ve heard very different stories about encounters with racism from other black people, many of them heartbreaking.
‘Up to then I never thought of myself as a “black” person. My first racist encounter was in the UK when I was in my early 30s, when a hobo at a train station yelled at me to go “home”. I was shocked more than hurt by it because for the first time I became truly aware that this society was different from the one I grew up in.’
She says that while studying for her masters at the University of Kent she felt she was living in a bit of a bubble because she was very familiar with the British tradition and culture that had been such a part of Zambia before independence.
‘Growing up in Kitwe I had many encounters with non-racist and progressive Brits. It wasn’t until I was studying in the US that racism really hit me.
‘Soon after I arrived at the town where I was going to study I started looking for accommodation and came across a poor black neighbourhood. I began to understand how placing a community like this, separated from better-off communities, institutionalised racism.
‘US culture was strange and interesting. I was living in a diverse and liberal university town in northern Florida, but you didn’t have to drive far from the town to find Confederate flags flying in front yards. It was a totally different society.
‘For the first time I felt and identified as “black”. I found myself gravitating towards black student unions and organisations helping black communities.’
Patricia was saddened to see great poverty in some black communities in the US. ‘I had seen poverty in Africa, of course, but here it was like the lights had gone out. There was a lot of hurt and anger in that tribe – a tribe I can relate to – but the hurt and anger also existed in the academic environment which was so different from my previous experience it threw me off guard somewhat.
‘What I also found interesting was the way the black community was divided among African Americans, Caribbeans and Africans. It could be hard to cross the divide, but I’m not sure how much that was due to my own naivety. The black student union had a good ethos, for example, but its leadership was African American; they defined the union’s agenda, and this is where a lot of the union’s energy was spent. I had to think about what it meant to be an African in this situation. My initial enthusiasm at being part of the union started to wane because I couldn’t see what my role might be.’
Patricia says she identifies as Zambian but feels African.
‘I especially feel broadly connected to sub-Saharan Africa. African countries like Zambia, Kenya, Botswana and South Africa have more in common than not.
‘There is a connection around tradition, culture and how we think about family. There is a very strong “oneness” around family events that goes with a sense of community. This means there is still an especially strong tie in many countries between urban and rural communities; people working in the big cities still go back to their families living in rural areas for important occasions.’
Patricia hopes those values will see sub-Saharan Africa through to a better future. ‘Right now, for example, that rural link for urban dwellers means many of them have a comparatively safe refuge during the current COVID-19 pandemic.
‘Strangely, this isn’t what’s happening in Zambia, where the rush to urbanise seems to have cut many of those ties to the country. I don’t know the village where dad came from, for example.
‘Africa needs to reconnect to its core identity. I believe we lost this connection as we urbanised. My hope is that we will see those links repaired in Zambia and other parts of Africa.’
Some Australians take perverse pride in the legion of venomous animals infesting the continent and its surrounding seas, from the very small members of the Irukandji group of box jellyfish[i] up to the very large mulga snake[ii].
On the face of it, Australia seems to have had the bum run when it comes to its snakes, spiders, ants, octopuses, cone shells and jellyfish, and this hardly exhausts the list of venomous creatures that call Australia home. If venomous wildlife is your thing, then you should be calling Australia home, too.
(As an unpleasant aside, Australia’s venomous biota is not even restricted to its animals; I dare you to read this with the lights off: Australia’s venomous trees.)
If we exclude the 120 kg drop bear[iii], which is sometimes erroneously claimed to use venomous claws to subdue its prey, then the big three that dominate most conversations after a few beers at the pub are the inland taipan, the box jellyfish (particularly the sea wasp), and the Sydney funnel-web spider.
For a timid and rarely seen snake, in recent years the inland taipan has garnered a fearsome reputation for itself. In fact, one of its alternative names is the fierce snake, but this is entirely due to its venom, milligram for milligram the most lethal of any of the world’s reptiles. It is often reported that the venom from a 110 mg bite, if carelessly (or maliciously) injected, could kill 100 adult men. The fact that the average dose delivered by an inland taipan is about 44 mg is rarely mentioned, although since this is still enough to kill at least 40 adult men it could be argued I’m being pedantic. Compare this to the most lethal member of the saw-scaled vipers[v], which can reportedly kill six adult males with the amount of venom it delivers with one bite. (We’ll be returning to the saw-scaled viper a little later.)
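A quick back-of-envelope check shows why the quibble is pedantic. This sketch uses only the figures quoted above (the post’s numbers, not toxicology reference values):

```python
# Sanity-check of the inland taipan venom arithmetic quoted above.
# All figures come from the post, not from a toxicology reference.

max_delivery_mg = 110    # largest reported venom yield from a single bite
avg_delivery_mg = 44     # average venom delivery per bite
adults_per_max_bite = 100

# Implied lethal dose for one adult: 110 mg / 100 adults
lethal_dose_mg = max_delivery_mg / adults_per_max_bite  # 1.1 mg

# Theoretical victims of an *average* bite
print(round(avg_delivery_mg / lethal_dose_mg))  # prints 40
```

In other words, even the average bite carries roughly forty lethal doses, which puts the saw-scaled viper’s six-adult yield in perspective.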
The chance of encountering the inland taipan, which inhabits that semiarid corner of hell-on-earth between Queensland and South Australia, is vanishingly small. Indeed, in Australia your chance of dying from thirst or a camel stampede is probably greater than dying from a snake bite from any species. It’s also worth noting that the inland taipan has been described as placid and reluctant to strike; of course, if cornered or mishandled it will not hesitate to bite with remarkable speed and precision, and more fool you.
The sea wasp is another matter altogether, not because it is remotely vicious, but because it just doesn’t give a damn. All envenomations are accidental. The largest of the box jellyfish, it spends its life floating in the warm tropical waters off northern Australia, Papua New Guinea and Southeast Asia. Well, floating isn’t entirely correct. The sea wasp does swim, but not in the determined way that would get it a place in Australia’s Olympic swimming team; apparently at full pelt they can cover about six metres in a minute. In the right season and the right place, the chance of accidentally bumping into one of these almost transparent jellyfish is depressingly high. Beaches all along the northern, tropical shorelines of Australia have signs warning swimmers of the danger.
An adult sea wasp is made up of a roughly square-shaped bell about 30 centimetres in diameter; 15 tentacles trail from each of the bell’s corners, and each tentacle can be up to three metres long and is covered in around 5,000 cells called cnidoblasts, each of which in turn houses a nematocyst, which is Latin for ‘this will hurt’.[vii]
Nematocysts are the business end of a sea wasp’s venom delivery mechanism. When its prey, usually prawns or small fish, brushes against the tentacles, the cnidoblasts release the nematocysts. The nematocysts penetrate the skin of the victim like miniature harpoons and then release their venom. Despite having actual eyes, the sea wasp seems incapable of restraining the cnidoblasts from releasing their load if the tentacles accidentally brush against something which isn’t prey, such as a human. Since this means the sea wasp is missing out on a meal and must now spend what I assume is a lot of energy to rearm the cnidoblasts, this is a serious design fault. Admittedly, that’s small comfort for anyone writhing in the water in unbearable pain, but one can only imagine the cuss words going through what passes for a sea wasp brain.[viii]
According to one study[ix], a sea wasp carries enough venom to kill 60 adults, which considering its size compared to, say, the inland taipan, is some achievement. Nonetheless, most encounters with a sea wasp don’t end with a fatality. The quick application of vinegar to neutralise any nematocysts still attached to the skin, and ice to relieve the pain, is often all that’s necessary. Having said that, one study[x] shows that 8% of envenomations require hospitalisation:
‘Because of the rapidity of fatal C. fleckeri envenoming, the critical window of opportunity for potentially life-saving use of antivenom is much smaller than that for snake envenoming, possibly only minutes. Furthermore, from animal study data, it was calculated that around 12 ampoules of antivenom may be required to counter the effects of a theoretical envenoming containing twice the human lethal dose of venom.’
The lesson here is if you come across a sign at a beach that says beware of box jellyfish (or for that matter crocodiles) consider something marginally safer and decidedly less painful for your daily outing, like jumping off a cliff.
I’m an arachnophobe, and this spider pretty well defines the content of my worst nightmares.
I readily admit I’m scared of vampires, malevolent ghosts, land sharks, Brussels sprouts and omelettes – for that matter, any food made mainly from eggs – but my fear of spiders is on a whole other level. Even if I catch a glimpse from the corner of my eye of the completely innocuous daddy longlegs a long shiver will pass down my spine. I don’t know what it is about arachnids that gets me all goosebumpy or triggers my fight or flight instinct (to be honest, my fly or fly-twice-as-fast instinct), but it might have something to do with spiders like huntsmen, wolf spiders, tarantulas and funnel-webs being so damn hairy. It just isn’t right; it’s as if they’d killed a dog or cat, skinned it and donned the fur. Then there’s the eight legs. Six legs on creatures such as ants and earwigs are hard enough to put up with, but eight seems a serious case of overengineering.
Anyway, of all the world’s spiders, the Sydney funnel-web ticks every yuck box: wears dog fur, tick; eight legs, tick; lives in a hole in the ground, tick; likes entering human households, tick; has more than two eyes, tick; has fangs long enough to pierce your toe nail to get to the vulnerable flesh underneath, tick; can kill you with a single bite, tick.
Indeed, I cowrote a short story about the Sydney funnel-web with good friend, colleague and fellow-arachnophobe Sean Williams. The story, ‘Atrax’, must have hit a nerve with quite a few people: it won the Aurealis Award for best horror short story in 1999.
The Sydney funnel-web’s lethality can be put down to an extraordinary compound in its venom called δ-atracotoxin (sometimes referred to as delta-hexatoxin[xii]), which bizarrely is brilliant at killing its normal prey of insects, but in small doses causes no harm to mammals … with the single exception of primates. And humans, regrettably in this single instance, are primates. Why the venom should be so damn selective is anyone’s guess, and there have been a few.[xiii]
The other peculiar fact about the Sydney funnel-web is that the male’s venom is up to six times more toxic than the female’s[xiv]. The best theory to explain this is that the male goes wandering during the mating season looking for females and has to defend itself against hungry predators, as hard as it is to imagine any predator being so hard up it needs to feed on such an ugly, hairy and extraordinarily venomous assassin. Admittedly, this doesn’t quite explain why the venom is so effective against primates; I assume almost every human on the continent, like me, would go to great lengths to avoid antagonising any spider, let alone one that can kill you, and as far as I know, humans are the only primates to have made their home in Australia.
Ultimately, the venom’s ability to kill humans is just an accidental byproduct of its evolutionary development.
But, and this is a big ‘but’, no human has died from the bite of a Sydney funnel-web spider since an antivenom became available in 1981.
Most venomous versus most dangerous
And this is where we return to the saw-scaled viper. One of these smallish snakes (the largest grow no bigger than 90 cm) may only be able to knock off six fully grown adults, as opposed to the inland taipan’s potential 100 victims, but nonetheless, to my mind the viper is the more dangerous of the two snakes.
Before I set out my reasons for this, we should remember the saw-scaled viper and the inland taipan only have to kill you once to ruin your day, not six or a hundred times, which would seem – and please excuse the pun – something of overkill. As far as the average human is concerned, a bite from either of these snakes will see your life flashing before your eyes.
And why do I think the saw-scaled viper is the more dangerous of the two?
First, your chance of encountering a saw-scaled viper on its home turf – anywhere dry in Africa, the Middle East and southern Asia – is dramatically higher than your chance of encountering the inland taipan on its home turf.
Second, the saw-scaled viper is a much testier beast than the inland taipan, and seems inclined to bite anyone passing within striking distance, something the inland taipan is not inclined to do.
Third, your chance of getting good medical care through much of the saw-scaled viper’s range, let alone the appropriate antivenom, can be very small.
Indeed, the saw-scaled viper may be responsible for more human deaths than any other snake, whether we’re talking about other vipers, adders, taipans, cobras, rattlesnakes, kraits or mambas. It’s reported to be responsible for up to 90% of all snakebites in Africa.[xv]
But rather than picking on any one snake, it’s important to understand that snakebites are a serious health problem in most developing countries. According to the World Health Organization[xvi]:
‘Worldwide, up to five million people are bitten by snakes every year. Of these, poisonous (envenoming) snakes cause considerable morbidity and mortality. There are an estimated 2.4 million envenomations (poisonings from snake bites) and 94 000–125 000 deaths annually, with an additional 400 000 amputations and other severe health consequences, such as infection, tetanus, scarring, contractures, and psychological sequelae. Poor access to health care and scarcity of antivenom increases the severity of the injuries and their outcomes.’
It seems to me these statistics, which barely reflect the pain, misery and social desolation that can be caused by a snakebite, are the ones we should obsess over, rather than how many humans can be killed by a single and remarkably shy Australian snake.
One final point. On average, more Australians die each year from the stings and bites of ants, wasps, bees and ticks than from snakebite, largely thanks to anaphylactic shock (and not prophylactic shock as I once tipsily declaimed). From 2000 to 2013, 27 Australians died from snakebite; over the same period, 32 Australians died from animals that fly and crawl around us every day of our lives without us giving them a second thought. In the same period, no one died from a spider, scorpion or centipede bite, and only three people died as a result of envenomation from a marine creature[xvii].
To put these statistics into proper perspective, horses were responsible for the deaths of 77 Australians between 2000 and 2010[xviii]. To make the perspective even sharper, consider that between 2000 and 2013, more than 21,000 Australians died in car accidents[xix].
By the way, in those same thirteen years, two people were recorded to have died from an unknown animal or plant. I’m betting it was a drop-bear.
[vii] Disappointingly, and rather mundanely, nematocyst is Latin for ‘a cell with threads’.
[viii] In fact, sea wasps don’t have a brain as such, or anything else we might recognise as a central nervous system. But they do have something: ‘The box jellyfish’s nervous system is more developed than that of many other jellyfish. They possess a nerve ring around the base of the bell that coordinates their pulsing movements … ’ See https://en.wikipedia.org/wiki/Box_jellyfish.
[xv] James Cook University toxinologist Professor Jamie Seymour carefully lays out what makes one venomous animal more dangerous than another in the National Geographic documentary World’s Worst Venom, not only comparing and ranking the inland taipan with other snakes, but also including sea stingers, spiders, scorpions and many other venomous creatures. Well worth a look if you can get your hands on it. See: