Science

16 March 2024: Life and the universe

In 1977 two American microbiologists discovered a whole new branch of life: the archaea. Single cells without a nucleus, archaea are prokaryotes that for a long time were considered to be bacteria. The work of Carl Woese and George E. Fox, however, revealed that archaea are in some respects closer to eukaryotes (cells with a nucleus, a group that includes all multicellular life such as you and me) than to bacteria.

As a result of their discovery, the traditional taxonomic tree, starting with kingdom and ending with species, was – so to speak – re-capped. A new level was placed at the top of the tree – the domain – with three members: bacteria, archaea and eukaryotes.

‘So where do viruses fit in?’ I hear you ask.

Good question. Unfortunately, there isn’t a good answer. The issue is confusing because there is no clear understanding of their evolutionary development, or even whether they share a common ancestor. Some may have developed from small, free-floating bits of DNA called plasmids, while others may have evolved from bacteria. It’s generally easier to hedge around the issue of whether or not viruses are actually alive than to confront it, and they are often simply called infectious agents or biological entities. The main sticking point is that viruses cannot replicate without infecting a host cell, and are therefore entirely dependent for their existence on ‘biological entities’ that are undoubtedly living. An example of how contentious this argument can be, however, is that the bacterium that causes the STD chlamydia can only exist in a host cell … so should this bacterium be considered technically alive?

Now, to complicate matters even further, a preprint article published in bioRxiv earlier this year announced the discovery of ‘viroid-like colonists’ the authors call ‘Obelisks’, which sounds awfully like the proposed title for a 1950s sf monster movie set somewhere in Death Valley.

However, Obelisks are minute, not monstrous. In fact, they’re small enough to fit inside bacteria. The article’s first sentence describes them as a ‘previously unrecognised class of viroid-like elements that we first identified in human gut metatranscriptomic data.’ Basically, they were found in samples of human poo. I think I’d prefer to research giant creatures rampaging through Death Valley, but to each their own.

In size they fit somewhere between viruses and viroids (infectious strands of RNA), and are rod-shaped, hence their name. And despite only being discovered recently, there are a lot of them. As the authors report: ‘Large scale searches identified 29,959 Obelisks … with examples from all seven continents and in diverse ecological niches.’

At this point, researchers don’t know if Obelisks are truly alive or not, what they evolved from, or if they are harmful or beneficial to their host organisms. In other words, stay tuned.

At the other end of the scale, a paper by two physicists from University College London suggests that dark energy and dark matter may not exist. To put this in perspective, under the currently most popular model of how the universe works – the lambda-CDM model (or, more simply, ΛCDM) – dark energy and dark matter make up nearly 95% of the universe. It’s like suggesting the theory of evolution through natural selection is fine except for the bit about natural selection … and maybe the bit about evolution.

Authors Jonathan Oppenheim and Andrea Russo ‘… consider a proposed alternative to quantum gravity, in which the spacetime metric is treated as classical, even while matter fields remain quantum.’ Making sense of this is way above my pay grade, but The Guardian’s science correspondent, Hannah Devlin, explains the theory this way: ‘(It) envisages the fabric of space-time as smooth and continuous (classical), but inherently wobbly. The rate at which time flows would randomly fluctuate … space would be haphazardly warped and time would diverge in different patches of the universe.’

If this sounds a bit like Doctor Who trying to explain time to Carey Mulligan’s character in the episode ‘Blink’, it may be because the universe is indeed ‘inherently wobbly’. It’s rather unsettling to think that the Weeping Angels might feel quite at home in Oppenheim and Russo’s universe.

Neither paper has been peer-reviewed at this point, but that hasn’t stopped them garnering media attention and commentary from other scientists. At the very least, ‘Obelisks’ and ‘wobbly spacetime’ have stirred the often lethargic currents of scientific orthodoxy; at best, they demonstrate that all scientific knowledge is provisional.

As FBI Special Agent Dana Scully says, ‘Mulder, the truth is out there.’

Which is why we keep on searching.

22 March 2023: Birds got brains

In part 4 of a thematic series of posts called Us, I said this about toolmaking:

‘It is with the application and development of tool usage that the first signs of a distinct ‘human’ culture are found in palaeoanthropology. Whereas chimps and some bird species, like humans, use tools made from plants to gather food or build shelter, humans are the first animals to make stone tools, improving on the original material through knapping.’[i]

Goffin’s cockatoo.
Image courtesy of Creative Commons.

I short-changed chimps, it seems. Not only do they use tools, they use tool sets: in other words, they prepare different tools for different jobs.[ii]

More surprisingly, I now discover that I also short-changed birds. I was pleasantly surprised by the new(ish) information about chimps, but astounded by the news[iii] that at least one species of bird – Goffin’s cockatoo[iv] from Indonesia – also makes and uses different tools for different jobs.

I shouldn’t be astounded, of course. In a much earlier post I wrote about research providing evidence that corvids possessed a Theory of Mind. And as the article in The Conversation points out, an Australian bird – the palm cockatoo – is already known to regularly make drumsticks to beat against hollow trunks during courtship. I suppose it’s not a giant leap from all that to learning that at least one avian dinosaur could do with a tool box to keep its implements tidy and dust-free.

It seems that Goffin’s cockatoo actually manufactures three different tools – for wedging, cutting and spooning.

Again, as the article points out, this means the cockatoo’s cognitive skills can be compared directly with a chimp’s. Importantly, they have ‘… been confirmed as the third species that can not only use tools, but can carry toolsets in anticipation of needing them later on.’

The original research paper leading to the article in The Conversation can be found here in the journal Current Biology.


[i] https://simonbrown.co/2022/02/28/28-february-2022-us-part-4-using-your-noggin/

[ii] https://www.nature.com/articles/srep34783#:~:text=In%20addition%2C%20chimpanzees%20use%20two,probe%20to%20fish%20for%20termites.

[iii] From this article in The Conversation:

https://theconversation.com/goffins-cockatoo-named-third-species-that-carries-toolsets-around-in-preparation-for-future-tasks-199408

[iv] Also known as the Tanimbar corella (Cacatua goffiniana).

20 March 2023: New bug stomper … maybe

No, not another instalment in the Starship Troopers media franchise, but an exciting development in the war against the bugs that make us sick … and sometimes kill us.

Alexander Fleming’s discovery of penicillin in 1928 was a turning point in our struggle against bacteria-caused infection. Research carried out by Fleming, and subsequently by Cecil George Paine, Howard Florey and Ernst Chain, marked the start of the systematic production and use of antibiotics, at first in developed countries and later worldwide.

Salmonella bacteria.
Image courtesy Creative Commons.

But after eighty years of use, antibacterial resistance is increasingly common. A 2014 report from the World Health Organisation states it is a threat ‘to global public health.’ The report found ‘high rates of resistance … in all WHO regions in common bacteria … ‘[i]

A 2016 review of antimicrobial resistance commissioned by the UK Prime Minister estimated that 700,000 people died each year from resistant infections, and that by 2050 ‘ … 10 million lives a year and a cumulative 100 trillion USD of economic output (could be) at risk … ’[ii]

So you’d think a paper published in February in eBioMedicine with the comparatively catchy title ‘A broad-spectrum synthetic antibiotic that does not evoke bacterial resistance’[iii] might garner some attention in the media. But there’s been hardly any attention at all.

These few lines indicate why the paper may prove to be very important indeed in the future:

‘ … a promising compound, COE2-2hexyl, (exhibits) broad-spectrum antibacterial activity. (It) effectively-treated mice infected with bacteria derived from sepsis patients … including a CRE K. pneumoniae strain resistant to nearly all clinical antibiotics tested. Notably, (it) did not evoke drug resistance in several pathogens tested. (It) has specific effects on multiple membrane-associated functions … that may act together to abrogate bacterial cell viability and the evolution of drug-resistance.’

So not only did COE2-2hexyl treat mice infected with bacteria from sepsis patients, including a highly resistant strain, it did not evoke resistance and – importantly – might act against the evolution of drug resistance.

Hell, maybe COE2-2hexyl should feature in the next Starship Troopers movie. Sounds like it could take on any bug.


[i] https://web.archive.org/web/20150706105859/http://apps.who.int/iris/bitstream/10665/112647/1/WHO_HSE_PED_AIP_2014.2_eng.pdf?ua=1

[ii] https://amr-review.org/sites/default/files/160525_Final%20paper_with%20cover.pdf

[iii] https://www.thelancet.com/journals/ebiom/article/PIIS2352-3964(23)00026-9/fulltext

08 January 2023: Homo floresiensis and Homo naledi, the species that keep on giving

The remains of Homo floresiensis, discovered at Liang Bua on the Indonesian island of Flores in 2003, and of Homo naledi, discovered inside the Rising Star Cave in South Africa’s Cradle of Humankind, have played an important part in helping us understand the diversity and complexity of our hominin past.

Homo floresiensis. Photo courtesy of Creative Commons. Created by ATOR.

H. floresiensis, dubbed ‘The Hobbit’ by the media because of its diminutive size, with a brain capacity of around 380 cm3 and standing around a metre tall, was considered by many scientists to be a deformed or microcephalic H. sapiens. However, strong physical evidence such as humeral torsion[i] and a set of teeth unique among hominins[ii] has pretty well ended the debate about its status as a species in its own right. The main disagreement now, considering the size of its brain, is whether or not it should be included in the genus Homo.

And speaking of small brains …

H. naledi was half again as tall as H. floresiensis – about the same height as a large chimpanzee – and although its cranial capacity (between 460 cm3 and 610 cm3) was considerably bigger than the Hobbit’s, it was still well short of a modern human or any of our immediate cousins such as H. neanderthalensis.

Homo naledi. Photo courtesy of Creative Commons.

As I wrote in a previous post, however, brain size is not necessarily a reliable indicator of intelligence.[iii] H. floresiensis almost certainly made and used stone tools[iv], and recently the University of Witwatersrand’s Lee Berger announced that researchers had found evidence of fire being used by H. naledi[v]. This last was probably something of a given, since the remains of H. naledi were found in a chamber of the Rising Star Cave that could only be reached through a long, dark and twisting route that was difficult and dangerous to follow even with artificial light – without some kind of illumination it would have been virtually impossible. Still, this recent evidence adds weight to the case that this species was capable of making and using fire.

As friend and palaeoanthropologist Debbie Argue asks, however, when and how did H. naledi learn to make fire? Could they possibly have acquired the skill from a contemporary hominin, such as H. sapiens? Or was it the other way around? Or did both species learn the trick from a third hominin group?

We’ll probably never know the answer to this question, but it is fun to think about, and – at the risk of stretching a metaphor almost to breaking point – throws another log on the fire of re-evaluating exactly what it means to be human.


[i] https://doc.rero.ch/record/15287/files/PAL_E2586.pdf

[ii] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4651360/

[iii] (And in an even earlier post I wrote about evidence suggesting corvids, with comparatively lightweight brains (c. 20-25 grams), may have a Theory of Mind.)

[iv] https://www.nature.com/articles/nature17179

[v] https://www.sciencenews.org/article/homo-naledi-fire-hominid-cave-human-evolution

30 December 2022: Maybe we’re not living in a special universe after all: just a beige one

According to a recent article in Quanta Magazine by staff writer Charlie Wood, a calculation by two physicists, Neil Turok and Latham Boyle, suggests our universe is the most likely of all possible universes.

Wood quotes Boyle’s analogy of a sack of marbles, each marble representing a different universe. ‘ … the overwhelming majority of the marbles have just one color — blue, say — corresponding to one type of universe: one broadly like our own, with no appreciable curvature and just a touch of dark energy. Weirder types of cosmos are vanishingly rare.’

The observable universe. Created by Andrew Z. Colvin. Courtesy of Creative Commons.

Turok and Boyle published their calculation in October this year under the extraordinarily catchy heading of ‘Thermodynamic solution of the homogeneity, isotropy and flatness puzzle (and a clue to the cosmological constant)’; but the introductory paragraph contains this killer sentence:

‘The gravitational entropy favors universes like our own which are spatially flat, homogeneous, and isotropic, with a small positive cosmological constant.’

The calculation stems from working with ‘a clock that ticks with imaginary numbers’, enabling Turok and Boyle to calculate the quantity of entropy that corresponds to our universe.

What this might mean for physics is being hotly debated. What is also interesting is the effect it might have on those who think the teleological argument for the existence of god or gods – especially the particular flavour of the argument called the ‘fine-tuned universe’ – has a strong case. This argument states that the universe is special because it is so finely tuned – especially for the existence of life – and that in turn this is evidence of the work of a creator. But if Turok and Boyle are right, then this universe is not so special after all – it is rather common and ordinary. I’m not suggesting this completely negates the argument for a fine-tuned creation, but I think it certainly dilutes it.

However, it is something of a letdown to discover we’re living in a beige universe.

10 March 2022: ‘Us’ Part 6 – Kith and kin

Hobbits and their ancestors

One of the great palaeoanthropological bombshells of the last generation was the discovery of Homo floresiensis on the Indonesian island of Flores. For years scientists debated which ancestor this new and somewhat diminutive hominin – dubbed the ‘Hobbit’ by the media – descended from, or indeed whether it should even be included in our genus.

Homo floresiensis reconstruction. Courtesy of Creative Commons. This image created by ATOR.

While now generally accepted as a member of our broader tribe, its origins are still fiercely argued, with many insisting it’s nothing more than a H. erectus that underwent insular dwarfism. But I think a 2017 paper written by Colin Groves, Debbie Argue, Michael Lee and William Jungers convincingly demonstrates that H. floresiensis is not derived from H. erectus (nor is it a diseased example of H. sapiens), but rather from a much earlier hominin such as H. habilis or a sister species.[i]

A second paper, published in 2020[ii], backs up this hypothesis, and concludes with this statement:

‘ … something which on account of our inadequate current taxonomic framework we have to call “early Homo” differentiated in Africa, possibly as early as 2.8 (mya) … Subsequently, one or more members of this group reached the Mediterranean fringe and spread Out of Africa at 2.5 Ma. After successfully expanding over Asia, at least one of those hominins … gave rise to new species that reached the Caucasus by around 1.8 (mya), and thence Europe by ca. 0.9 (mya) … (the) eastward expansion (or occupation) in Asia of small-bodied and archaically-proportioned hominins continued, possibly in multiple waves; and, by ca. 0.8 (mya), representatives of this group had penetrated as far as insular southeast Asia, where H. floresiensis ultimately emerged … ’

Indeed, some scientists considered this possibility as early as 2005. A report about the brain of H. floresiensis published in Science in that year[iii] concludes with these lines: ‘Although it is possible that H. floresiensis represented an endemic island dwarf that, over time, became subject to unusual allometric constraints, an alternative hypothesis is that H. erectus and H. floresiensis may have shared a common ancestor that was an unknown small-bodied and small-brained hominin.’

Homo habilis. Courtesy of Creative Commons. Photographer unknown.

I think an increasing weight of evidence strongly suggests that the first major exodus of our genus from Africa was carried out by H. habilis or one or more of her sisters. Furthermore, I think it’s possible that these closely related species then gave rise to H. erectus, H. pekinensis, H. luzonensis[iv] and H. floresiensis in Eurasia, while those remaining in Africa gave rise to H. ergaster. This does not preclude the possibility, or perhaps probability, of any or all of these species crossbreeding if they ran across each other.

But what of H. sapiens, our own species? As with H. ergaster and H. erectus, the evidence here is convoluted, confusing and often contradictory.

Mongrel

For those, like Colin Groves, who think H. ergaster is a species in its own right, the line of descent works something like the following.

Homo heidelbergensis. Courtesy of Creative Commons. This image created by ATOR.

About 600,000 years ago, H. ergaster, either directly or through an intermediary species called H. rhodesiensis, gave rise to H. heidelbergensis. This species was our size physically, and its brain capacity was well inside the standards of Anatomically Modern Humans (AMH). Following the great tradition of hominin migration, something that seems as ingrained in our genus as bipedalism, some members of this new species moved to Europe[v]. About 400,000 years ago, they gave rise to H. neanderthalensis. In a case of ‘well, we’ll show you’, those who stayed behind in Africa gave rise to H. sapiens at least 300,000 years ago, and possibly as long as 350,000 years ago.[vi]

I can’t stress this enough. Homo sapiens are Africans. It is where our archaic ancestors and AMH first appear[vii]. (Let me also stress that this story, as complicated as it gets from now on, does not resurrect the Multiregional Model for our evolution, where H. erectus gave rise to H. sapiens across its whole range at the same time, from Africa to Asia. This is an old theory, now largely discredited by the extensive fossil and DNA evidence that our species first evolved in Africa.[viii])

What happened next has been slowly and painstakingly uncovered by palaeoanthropologists doing field work throughout Africa and Eurasia, and by the outstanding work performed at the Max Planck Institute’s Department of Evolutionary Genetics, headed up by Svante Pääbo, into hominin DNA.[ix]

What the DNA evidence strongly suggests is that H. sapiens successfully left Africa between 70,000 and 100,000 years ago. (Although this wasn’t the first migration into Eurasia by our species. It is usually held that previous attempts left no trace in the DNA of AMH outside of Africa, but see these earlier posts, here and here.)

Female Homo neanderthalensis. Courtesy of PLOS ONE.

Members of the most recent migration interbred with H. neanderthalensis, probably in what is now the Middle East, and later with the Denisovans, another possible descendant of H. heidelbergensis, deeper in Eurasia[x]. To this day, the average ex-African H. sapiens carries between 1% and 2% of the Neanderthal genome; but it is not the same one or two percent: we overlap. Collectively, we carry up to 40% of the Neanderthal genome in our own genes. But the story gets more complex still: the genome of people from Oceania, such as Papua New Guineans and Australian Aborigines, can have between 5% and 6% Denisovan DNA[xi]; indeed, recent research suggests that Ayta Magbukon Negritos in the Philippines have Denisovan ancestry 30-40% higher than either of these two groups.
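How do individual shares of 1-2% add up to 40% across the population? The arithmetic is easier to see with a toy simulation – a minimal sketch of my own, not anything from the genetics literature, and with made-up numbers apart from the figures above of ~2% per person and a 40% collective ceiling. Assume only 40% of the Neanderthal genome survives anywhere in modern humans, and that each person carries a different ~2% slice of that surviving pool; the union across many people then climbs toward the ceiling:

```python
import random

GENOME_SEGMENTS = 10_000      # toy model: the genome chopped into equal segments
SURVIVING_FRACTION = 0.40     # assumption: only these segments persist anywhere today
PER_PERSON_FRACTION = 0.02    # each individual carries ~2% of the genome

# the pool of Neanderthal segments that survive in modern humans at all
surviving = random.sample(range(GENOME_SEGMENTS),
                          int(SURVIVING_FRACTION * GENOME_SEGMENTS))

def sample_person():
    """One individual: a random ~2% slice drawn from the surviving pool."""
    k = int(PER_PERSON_FRACTION * GENOME_SEGMENTS)
    return set(random.sample(surviving, k))

union, sampled = set(), 0
for target in (1, 10, 100, 1000):
    while sampled < target:
        union |= sample_person()
        sampled += 1
    print(f"{target:>4} people -> {len(union) / GENOME_SEGMENTS:.1%} of the genome")
# one person carries ~2%; a thousand people together approach the 40% ceiling
```

The point of the sketch is only that overlap matters: any two people share some Neanderthal segments but not all of them, so the population as a whole tiles far more of the Neanderthal genome than any individual does.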

The Natural History Museum of London’s Professor Chris Stringer says, ‘It is now clear there was a lot more interbreeding between ancient species, including early Homo sapiens and others, and that there was a lot more movement of populations both in the distant past – and relatively recently.’[xii]

Homo sapiens (Oase 2) reconstructed from bones 37,000-42,000 years old discovered in the cave of Peştera cu Oase in Romania. Around 7.3% of his DNA is from H. neanderthalensis, from an ancestor 4-6 generations back. Courtesy of Creative Commons. Photo: Daniela Hitzemann.

Talking about recent research, in June last year Chinese scientists announced that a cranium first discovered in China almost a century ago belongs to a new species of Homo, with a brain easily the equal of any AMH in size, carried inside a skull more massive than ours. Those making the announcement have named the new species H. longi (‘Dragon Man’); just as Denisovans are sometimes described as a sister species to the Neanderthals, so H. longi is being claimed as a sister species to H. sapiens.[xiii]

As Lee Berger, from the University of Witwatersrand and the discoverer of Australopithecus sediba and H. naledi, has suggested, perhaps the different paths of human evolution are not best thought of as branches spreading from a single tree trunk, or even a messy, many-twigged bush, but rather a braided stream[xiv] with tributaries constantly running across each other before separating, rejoining and separating once more.

The Waimakariri River in New Zealand is braided along almost its entire length. A good metaphor for hominin interbreeding? Courtesy of Creative Commons. Photo: Greg O’Beirne.

We, Anatomically Modern Humans, are the result of all this evolution. We are nothing more than a mongrel species.

What a splendid, exhilarating thought.

Other posts in this series can be found here:

‘Us’ Part 1 – Out of Africa

‘Us’ Part 2 – Burdalone

‘Us’ Part 3 – The devil in the detail

‘Us’ Part 4 – Using your noggin

‘Us’ Part 5 – Feet and socks


[i] https://www.sciencedirect.com/science/article/abs/pii/S0047248417300866

And this from the Australian Museum: ‘Most scientists that accept H. floresiensis as a legitimate species now think its ancestor may have come from an early African dispersal by a primitive Homo species similar in appearance to H. habilis or the Dmanisi hominins. This means that it shared a common ancestor with Asian H. erectus but was not descended from it. Cladistic analysis supports the lack of a close relationship with H. erectus.’

[ii] https://onlinelibrary.wiley.com/doi/abs/10.1002/evan.21863

[iii] Falk, Dean, et al. ‘The Brain of LB1, Homo floresiensis’. Science, 308, 242 (2005).

[iv] https://www.nature.com/articles/s41586-019-1067-9

[v] The first H. heidelbergensis fossils were found near Heidelberg in 1907.

[vi] Although this paper suggests the split between our two species might be found much further back … up to 800,000 years ago or more!

[vii] Recent research from scientists at Australia’s Garvan Institute of Medical Research reveals that southern Africa is home to the oldest evidence for AMH: ‘… to contemporary populations that represent the earliest branch of human genetic phylogeny.’ The date they arrive at is 200,000 years ago.

As well, a report in the February issue of Science describes how thousands of genome sequences were collected from modern and ancient humans to create a family tree. In the words of the report’s first author, Anthony Wilder Wohns, ‘ … we definitely see overwhelming evidence of the Out-of-Africa event … ‘

[viii] See Stringer, C. & Andrews, P. The Complete World of Human Evolution. London, 2011. P 140 ff for a discussion of the two main theories for the evolution of Homo sapiens: ‘Multiregional’ and ‘Out of Africa’.

[ix] And now, besides DNA, they are using protein analysis to identify ancient hominins, most recently the first Denisovan found outside of the Denisova Cave in Siberia … on the Tibetan Plateau of all places! See https://www.nature.com/articles/s41586-019-1139-x, 16 May 2019.

[x] Very recently, H. sapiens remains were discovered in the Grotte Mandrin rock shelter in the Rhône Valley in France that date back 54,000 years, pushing back our species’ arrival in Europe by at least 10,000 years from previous estimates.

[xi] Please watch this fascinating talk Svante Pääbo gave at the University of California in 2018 after receiving the Nierenberg Award for Science in the Public Interest. It goes into all of this in much more detail. As Pääbo points out in the talk, the DNA evidence indicates humans ‘have always mixed’.

[xii] https://www.theguardian.com/science/2017/nov/19/human-evolution-dna-revolution-mapping-genome

[xiii] See here and here.

[xiv] See Berger talk about this towards the end of this Nova documentary, the Dawn of Humanity.

04 March 2022: ‘Us’ Part 5 – Feet and socks

One foot in front of the other

Humans walk upright, gorillas and chimpanzees walk on all fours, resting their weight on their knuckles, and orangutans can do just about anything – they hang and swing by their arms from branches, sometimes with the help of their oddly-shaped feet, and on the ground they can walk either upright or on all fours.  The structure of the postcranial skeleton in all four animals is very different and reflects these locomotor patterns.  Non-human great apes have short legs and long arms, whereas we have very long legs. With the gorilla and chimpanzee it is the shortness of the legs that differs from humans, the arms being much more similar in length compared to the torso; only the orangutan has enormously lengthened arms.  When other great apes stand upright, their legs are straight from hip to ground, whereas humans are ‘knock-kneed’, as the thighs slope inward from the hip to the knee.  The pelvis is very different in appearance: in humans the hip bone (ilium) is low and very broad, but in great apes it is high and fairly narrow. In humans the great toe is long and stout and aligned with the other toes, but in great apes it is divergent from the other toes (less in the gorilla), and in the orangutan it is very short.

Courtesy of Creative Commons. Artist unknown.

In most great apes, the spinal column is more or less straight, but in humans the spine is curved into a double-S: the cervical (neck) vertebrae curve forward, the thoracic (chest) vertebrae curve backward, the lumbar vertebrae (those in the small of the back) curve forward again, the sacral vertebrae (which are fused together, and form the back wall of the pelvis) curve back again, and the coccyx (the partially fused vertebrae which are the tiny remnant of the tail) curves forward once more.  The ribs (which are very variable in number, but average 12 in humans and orangutans, and 13 in chimpanzees and gorillas) together form the thorax; in humans the thorax is barrel-shaped (narrow at the top, broad in the middle, narrower again at the bottom), whereas in great apes it is funnel-shaped (narrow at the top, and broadening towards the bottom).

All of these differences between humans and the other great apes are developments stemming from bipedalism. So why did humans adopt bipedalism? Well, walk with me and we’ll take a brief look at the major theories.

Doing a runner

There seems to be a growing consensus among many scientists that our ancestors evolved bipedalism for several reasons rather than one overriding factor. What many of the competing theories do agree on, however, is that rainforest giving way to savannah because of climate change around 7.0 – 5.0 mya was a strong influence. Grassland with only scattered trees and no closed canopy meant tree-climbing primates had much more open territory to cover. Walking on two legs freed hands to carry infants, food or tools, including weapons. Walking on two legs made us taller, meaning we could locate food, potential predators and safe havens from further away; it also made it easier to pick low-hanging ripe fruit from trees. Walking reduced the amount of body surface area we exposed to the sun while in the open.

Early morning on the savannah. The change in the landscape from rainforest to savannah between 7 mya and 5 mya probably helped kickstart bipedalism in hominins. Photo: Simon Brown.

Of course, in some circumstances some of these ‘advantages’ could become disadvantages. For example, although bipedalism meant we could locate a predator from further away, it also meant if it was looking in the right direction, a predator could see us from further away as well (and our chief predators – leopards, hyenas and lions – all have good eyesight, not to mention excellent hearing and sense of smell). On the other hand, when our ancestors became active hunters, our extra height gave us an advantage over prey animals, many of whom rely on their sense of smell rather than their eyesight.

Our genus has evolved to become a natural endurance runner, and through that a natural persistence hunter. Courtesy of Creative Commons. Photographer unknown.

More recently, one of the major arguments for the successful adaptation of bipedalism was that it is a much more energy efficient method of locomotion[i]. Whatever the arguments for or against all these hypotheses regarding the origins of walking, when it came to running there is no denying our bodies evolved to make us one of nature’s supreme endurance runners[ii]. This seems to have happened about two million years ago and was a real game-changer when it came to predation: our ancestors evolved into persistence hunters, able to wear down much larger animals such as kudu and oryx[iii]. Basically, humans ran their prey into the ground, and much of our body shape is particularly adapted to long-distance running.

In other words, the characteristics that make us superb walkers and runners are the characteristics that most set us apart from other great apes. As Chris Stringer and Peter Andrews write in The Complete World of Human Evolution, ‘at present … (bipedalism) is taken as the earliest adaptation by which we can recognise human ancestors in the fossil record.’[iv]

The odd-sock drawer

Now it’s time to deal with one of the most controversial species in the human lineage – Homo ergaster. This species was described by Colin Groves and Vratislav Mazák in 1975[v]. Since then, palaeoanthropologists have been divided on whether H. ergaster is a distinct species, or a subspecies belonging to H. erectus, palaeoanthropology’s pin-up boy and all-purpose species.

Once they learned to walk, our ancestors just kept on walking. In fact, they walked right out of Africa, into the Middle East, then east into Asia and Sahul, north to Europe, and eventually across the Bering Strait and into the Americas. On the way they continued evolving into new species that seemed to interbreed with each other at every opportunity, creating yet more new species, and eventually discovering agriculture, television and the internet. And interestingly, it’s the use of technology that provides us one piece of evidence that H. ergaster and H. erectus were two different species.

But first, let’s talk more about bones, specifically those belonging to the original H. erectus, parts of which were first discovered in 1891 by Eugène Dubois, a Dutch doctor working for the army in Java. In fact, he went to Java with the objective of discovering evidence supporting the theory that H. sapiens evolved in Asia, an idea most determinedly supported by German naturalist Ernst Haeckel. Haeckel had hypothesised that our species’ progenitor, which he named Pithecanthropus alalus, had evolved on Lemuria, a mythical continent that subsequently sank beneath the Indian Ocean (thereby conveniently leaving no fossils behind to prove – or for that matter, disprove – his theory).

Eugène Dubois, discoverer of Homo erectus. Courtesy of Creative Commons. Photographer unknown.

Although Dubois had discovered ancient hominin fossils, he found little or no support among scientists in Europe that they amounted to anything significant. It wasn’t until Sinanthropus pekinensis was discovered in China over a quarter-century later that enthusiasm for Dubois’s discovery really picked up. In the early 1950s, Ernst Mayr reclassified both Dubois’s Pithecanthropus erectus and S. pekinensis as H. erectus[vi]. Since then, hominin fossils with roughly the same estimated brain size as H. erectus and aged anywhere between 2 million and just over 100,000 years old have been thrown in with H. erectus like differently coloured socks thrown into an odd-sock drawer. It has become the species to have when you want to cover all of Africa and Eurasia and two million years of history.

In the early 1970s, for example, Richard Leakey and Alan Walker described two partial skulls found in Kenya as belonging to an African offshoot of H. erectus based on the fact that their calculated brain capacities (848 cc and 803 cc) were not dramatically smaller than that of some H. erectus skulls (around 950 cc), which is like arguing that since the Volvo S60 and the Volkswagen Passat have similar interior space, they’re both examples of a Toyota Camry.

However, in 1975, Colin Groves and Czech colleague Vratislav Mazák, after a comprehensive metric analysis of fossils from Koobi Fora, concluded they had uncovered a new species, which they named H. ergaster. Their argument was that there was no African version of H. erectus; further, Colin Groves believed that H. ergaster evolved in Africa and then migrated into Eurasia, eventually giving rise to H. erectus.[vii] The earliest dates for the new species go back 1.9 million years[viii], as opposed to 1.6 million years (or 1.8 according to some estimates) for H. erectus, making H. ergaster the first truly human-looking hominin to stride the planet – tall, thin, decidedly bipedal, with a flatter face than its ancestors, and an active hunter, fire-user and tool-maker.

KNMER 3733, possible cranium of a female Homo ergaster. Photo: Simon Brown.

Now, nearly fifty years after the initial paper by Groves and Mazák, a fierce debate still continues between those who think the two hominins are separate if linked species, or just subspecies. In common parlance, it’s a debate between splitters and lumpers.[ix]

But besides the obvious difference in the skull shapes of H. ergaster and H. erectus, another line of evidence convinces me that Colin was right in his opinion that we are talking about two species. This evidence involves tool making.

Out with the old, in with the new

Until the appearance of H. sapiens and H. neanderthalensis, stone age technology is divided into two broad and overlapping stages: Oldowan and Acheulean (sometimes called Modes 1 and 2). Oldowan technology was first discovered in the 1930s by Louis Leakey at Olduvai Gorge in Tanzania. The oldest examples have been found at Gona in Ethiopia, and date back about 2.5 million years[x]. The technology seems to have spread very quickly, and recent discoveries have found stone tools in Jordan dated at 2.5 mya and in China at 2.1 mya[xi]. This technology, the use of very simple flakes and rocks, had been developed before the appearance of H. habilis, possibly by Australopithecus garhi. Acheulean technology, which started about 1.76 mya, is closely associated with the appearance of H. ergaster and involves more refined knapping and the development of specialised tools such as hand axes.

This doesn’t imply that Oldowan technology suddenly evaporated, and every hominin adopted the new style of knapping chert. In some places, Oldowan and Acheulean stone tools are found at the same site from the same period, suggesting that while H. ergaster or one of its descendants employed the improved technology, one of our cousins continued using the older method. But Acheulean technology clearly conferred a significant advantage over the old style. It didn’t take long for it to spread beyond Africa, either because H. ergaster itself started spreading beyond Africa, or because it spread by ‘word-of-mouth’: neighbouring hominins picked up on the new fashion of making tools and copied it. Acheulean tools appear in what is now India, for example, by 1.5 mya, and in Europe by about 900 kya.

Acheulean hand axes. Compare the careful knapping done here to the more primitive Oldowan tools illustrated in the previous post. Courtesy of Creative Commons. Photographer unknown.

However, Acheulean technology did not seem to reach Java, where our friend H. erectus resided.

Which presents lumpers with a problem. If H. ergaster is indeed nothing more than a subspecies of H. erectus, then fossil evidence suggests this single species arose in Africa before spreading throughout Eurasia. Yet if this is also the species that developed Acheulean technology soon after evolving, why didn’t the technology travel with them to the far east?

On the other hand, if we are talking about two species, then it’s quite possible for Acheulean technology to be developed by H. ergaster in Africa, spread slowly throughout Eurasia, but never quite reach the home of H. erectus in Java.

If this was in fact the case, it raises a more important question: even if we accept H. ergaster is a separate and earlier species than H. erectus, does it necessarily follow that H. ergaster gave rise to H. erectus? What if the two species are cousins rather than mother and daughter?

This is something we’ll discuss in the next, and final, post of ‘Us’.

Other posts in this series can be found here:

‘Us’ Part 1 – Out of Africa

‘Us’ Part 2 – Burdalone

‘Us’ Part 3 – The devil in the detail

‘Us’ Part 4 – Using your noggin

‘Us’ Part 6 – Kith and kin


[i] https://www.newscientist.com/article/dn12269-walking-on-two-feet-was-an-energy-saving-step/

But then again, see https://www.sciencedirect.com/science/article/abs/pii/S0047248412001443

[ii] https://www.nature.com/articles/nature03052

[iii] https://en.wikipedia.org/wiki/Persistence_hunting

[iv] Stringer, C. & Andrews, P. The Complete World of Human Evolution. London, 2011. P 19.

[v] https://www.irmng.org/aphia.php?p=taxdetails&id=10031853

[vi] According to Britannica, Mayr did this in 1944. But see Bernard Wood who writes it was Franz Weidenreich who first came up with the idea in 1940:

‘(He) was the first to suggest that the genus Pithecanthropus should be subsumed into Homo, and in the same paper he proposed that fossils recovered from what was then called Choukoutien (now called Zhoukoudian), which were initially assigned to Sinanthropus pekinensis, should also be transferred to H. erectus.’

[vii] From the Australian Museum:

‘A growing number of scientists have redefined the species Homo erectus so that it now contains only east Asian fossils. Many of the older African fossils formerly known as Homo erectus have now been placed into a separate species, Homo ergaster, and this species is considered to be ancestral to Homo erectus. The redefined Homo erectus is now generally believed to be a side branch on our family tree whereas Homo ergaster is now viewed as one of our direct ancestors.’

[viii] Oldest fossil dates according to the Australian Museum for H. ergaster here and for H. erectus here. Recent work reported in the journal Science may push the dates even further back, to between 1.95 and 2.04 mya – although in this paper the discussed specimen is described as preserving ‘characters that align it morphologically with H. erectus sensu lato (including Homo ergaster)’. Go figure.

[ix] For a fuller description of the often heated debate about what makes a species, see here.

[x] Stringer, C. & Andrews, P. The Complete World of Human Evolution. London, 2011. P 208.

A new kind of stone age technology – Lomekwian – has been suggested after the recent discovery of stone tools at Lomekwi that predate Oldowan by more than 700,000 years. See the previous post for more details.

[xi] https://onlinelibrary.wiley.com/doi/abs/10.1002/evan.21863

21 February 2022: ‘Us’ Part 2 – Burdalone

‘Burdalone’ is an old Scottish word meaning the last bird in the nest, the one left when all the other chicks have flown or all the other chicks have died. It’s a sad and lonely word, and perfectly describes Homo sapiens.

When one of the first members of our own species studied the world around her, most of what she saw would be familiar to us today, whether from personal experience or from watching nature documentaries about Africa. Extensive grasslands dotted with acacias, watering holes and narrow rivers with crumbling banks, herds of large grazing animals such as wildebeest and zebra, black herons and lizards, secretarybirds and crocodiles, a lion pride or two, and our deadliest predators – a leopard and a pack of hyenas.

The South African Highveld, the kind of savannah our ancestors evolved in over four million years ago. Photo: Simon Brown.

What she also saw, and what none of us will ever see, was other groups of human beings that were not H. sapiens. Like our ancestor, they were striding on two legs and using their large brains and opposable thumbs to harvest nuts and berries, sometimes to hunt or scavenge for meat, and to fend off predators. They looked very similar to us, used tools, and some may even have created art and used language to talk to one another.

Around 350,000 years ago – at this stage the earliest date we know of for H. sapiens striding the planet[i] – there was no reason to think things would ever change.

But to say that we are human today is to say that we are members of a single worldwide species. This is extraordinary because for millions of years to be human meant that you could be a member of any one of a number of different but related species.

It should not be contentious to say that all members of the genus Homo are human – after all, this is what the Latin word ‘homo’ means – but it is contentious to suggest, as I will later in these posts, that all bipedal great apes are human.

But first it’s important to state that it’s currently impossible, and may forever be impossible, to finally determine when we stopped simply being hominids – that is, all the apes except for the gibbon – and became hominins as well – that branch of the hominids exclusive to us and our human cousins; this is the point at which chimpanzees went their way and we went ours.[ii] We may never know exactly when the thousands of physical and psychological characteristics that distinguish us from other great apes evolved; what we can be sure about is that almost all of them were shared with at least some of our hominin ancestors.

We are so close to our cousins, genetically and historically, that making a distinction between whether or not they are human seems farcical. Indeed, the same argument can be made for any two species close to each other in the hominin line.

In 2005, British celebrity Alan Titchmarsh allowed professional make-up artists to disguise him as a Neanderthal; he then walked along the streets of London, almost completely ignored by everyone.[iii]

Homo sapiens or H. neanderthalensis?
Courtesy of Creative Commons. Photographer unknown.

At some point we need to demarcate between those species we consider human and those we consider pre-human, and to date the only specific marker that distinguishes all of us from all of them is bipedalism, not some arbitrarily determined measurement of brain capacity, morphology or dentition.

As well, research into the workings of the human brain, and into animal intelligence generally, has thrown into doubt those psychological characteristics we traditionally considered to be peculiarly human, characteristics that made us special and put us above the rest of the animal kingdom. Once upon a time, we were considered the only animal to make tools, then the only animal to make tools and smile, then the only animal to make tools, smile and do handstands.

It’s similar to a town building a bridge and claiming it’s the only bridge in the world, only to discover that a nearby town has one as well. So the first town now claims it’s the only single-span bridge in the world, until it learns there is another single-span bridge in the next county. The first town now claims it has the only single-span bridge in the world with green arches, and so on, every new definition increasingly trivialising what makes its bridge special.

There is strong evidence that intelligence has arisen many times in the animal kingdom: in primates, cetaceans, elephants, larger carnivores such as dogs, hyenas and the big cats; birds, particularly corvids and parrots; and some molluscs such as octopuses and possibly squids.

There is also growing evidence that self-awareness and even a theory-of-mind[iv] exists in other primates such as chimpanzees and some birds such as crows.

There is growing evidence that animals other than humans, such as chimpanzees, have a Theory of Mind. Courtesy of Creative Commons. Photographer unknown.

So what are the characteristics that separate humans from our nearest living relatives, the chimpanzee and bonobo?

Before we answer that, we have to talk taxonomy and cladistics – how scientists classify living things.

Life is a spectrum

In his book The Vital Question, biochemist Nick Lane writes that ‘the distinction between a “living planet” – one that is geologically active – and a living cell is only a matter of definition … Here is a living planet giving rise to life, and the two can’t be separated without splitting a continuum.’[v]

Different scientists may employ different markers or waypoints in determining the start of life on Earth, all of which are subject to controversy and disagreement, but the truth is that there is no precise point in time at which anyone could claim that a given chemical process was, for the first time, driven by life rather than geology; any such decision would be arbitrary.

The same principle applies throughout evolution. There is no precise point in time where we can say fish gave rise to amphibians or basal reptiles to dinosaurs.

Change in life brought about by evolution through natural selection isn’t episodic; it’s a spectrum.

But evolution does present a handful of events when we can say with some certainty that a new direction had begun – a direction with significant ramifications for all life that follows.

The first of these, and covered in some detail in The Vital Question, concerns the creation of the eukaryotic cell: a morphologically complex cell that contains a separate nucleus and mitochondria, each surrounded by a double membrane. As far as we know this remarkable event occurred only once in all history[vi]. An archaeon, a single-celled prokaryote, absorbed another kind of prokaryote – a bacterium – and instead of consuming it established a symbiotic relationship.

The eukaryotic cell – morphological complexity derived from the synthesis of two prokaryotes, an archaeon and a bacterium. After the creation of life itself, this synthesis is perhaps the most significant event in the history of our planet. Courtesy of Creative Commons.

At some point later in history, some of the descendants of that first complex cell started a symbiotic relationship with a second prokaryote invader, creating chloroplasts and starting the line that would eventually lead to plants and green algae.

More recently, the arrival of the first human was an event with tremendous ramifications for all life on earth.

But when did this happen?

The king of Spain did what?

Before we go any further, we need to talk about a subject that normally works like a sedative on anyone not interested in taxonomic detail: the organisation of the taxa themselves.

I promise to keep this short and to the point, but it’s important to cover because we need to reconsider how and where our human family fits in with other living things. And, of course, when it all happened.

Mnemonics are as much a part of school science classes as microscopes and Bunsen burners. For example, one mnemonic frequently used in the last century for memorising the different taxonomic ranks was ‘King Philip Came Over From Great Spain’, a mnemonic for the main taxa in the Linnaean system:

Courtesy of Creative Commons.
  • Kingdom,
  • Phylum,
  • Class,
  • Order,
  • Family,
  • Genus, and
  • Species.

Taxonomic ranking has been around for as long as humans have been curious about the natural world, but the above ranking developed from a system introduced by the Swedish naturalist Carl Linnaeus in the 18th century. He wanted to organise living things so their biological relationship to each other was made very clear. He did this by using shared characteristics to lump things together.

For example, these days all animals that have fur and warm blood and that suckle their young with milk are put into one group, the class called mammals. The mammals themselves are grouped together with all animals with a backbone to form a phylum called the chordates. The chordates and animals without a backbone are thrown together into a kingdom called Animalia. More formally, taxa – the plural of taxon, the name for these groupings – sharing a more recent common ancestor are more closely related than they are to taxa which share a more remote common ancestor: in other words a wombat is more closely related to a dog than it is to a crocodile, and more closely related to a crocodile than it is to a flatworm.

Linnaean taxonomy also introduced the binomial, the familiar two-name identifier used in science to classify an organism at the most detailed commonly used level, that of species. Homo sapiens, for example, is the binomial for human beings, just as Panthera leo is the binomial for lions and Quercus robur is the binomial for the English oak. The first word is the genus (plural genera), the second the species.
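To make the nesting and the binomial concrete, here’s a minimal sketch in Python (my own illustration – the classifications are real, but the code is just a toy). It stores the classic seven ranks for a few animals, reads the binomial off the last two fields, and counts shared ranks from the top down as a crude proxy for relatedness:

```python
# Fields, top down: kingdom, phylum, class, order, family, genus, species.
ANIMALS = {
    "human":     ("Animalia", "Chordata", "Mammalia", "Primates",
                  "Hominidae", "Homo", "sapiens"),
    "wombat":    ("Animalia", "Chordata", "Mammalia", "Diprotodontia",
                  "Vombatidae", "Vombatus", "ursinus"),
    "dog":       ("Animalia", "Chordata", "Mammalia", "Carnivora",
                  "Canidae", "Canis", "familiaris"),
    "crocodile": ("Animalia", "Chordata", "Reptilia", "Crocodilia",
                  "Crocodylidae", "Crocodylus", "porosus"),
}

def binomial(name):
    """The two-name identifier: genus plus species."""
    genus, species = ANIMALS[name][-2:]
    return f"{genus} {species}"

def shared_ranks(a, b):
    """Count ranks shared from the top down - more shared ranks, closer kin."""
    n = 0
    for x, y in zip(ANIMALS[a], ANIMALS[b]):
        if x != y:
            break
        n += 1
    return n

print(binomial("human"))                    # Homo sapiens
print(shared_ranks("wombat", "dog"))        # 3: kingdom, phylum and class
print(shared_ranks("wombat", "crocodile"))  # 2: kingdom and phylum only
```

(Real cladistics works from shared derived characteristics rather than by counting named ranks, but the principle of nested groups is the same.)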

Taxonomy is a lovely idea, and appeals to anyone who thinks good old common sense is all you need when sorting bookshelves and tidying kitchen cupboards. For over two hundred years it was regarded as an almost fool-proof system: a place for every living thing and every living thing in its place.

But our knowledge of the natural world is not like that of our kitchen. Like the natural world itself, it is messy, chaotic, growing and constantly evolving.

In 1990, American microbiologist Carl Woese (1928-2012) suggested a new step was needed at the top of the taxonomic ladder to reflect the discovery of a whole branch of life whose existence was never suspected until the 1970s. The archaea, single-celled prokaryotes, were long thought to be a kind of bacteria, but work by Woese and other scientists revealed they are as chemically different from bacteria as we are.

The commonly accepted taxonomic ranks now start with ‘domain’, leaving us with cumbersome and self-defeating mnemonics such as ‘Determined, Kind People Can Often Follow Ghostly Screams’ or ‘Do Kings Prefer Chess On Fridays, Generally Speaking’.

Domain isn’t the only extra rank added over the decades. We also have ‘subfamily’, ‘tribe’, and sometimes ‘subtribe’, ‘subgenus’ and ‘subspecies’, and that’s just in the field of zoology.

In the story of ‘Us’ we’ll be dealing mainly with genus and species, and in the next post we’ll discuss what makes up both taxa.

Other posts in this series can be found here:

‘Us’ Part 1 – Out of Africa

‘Us’ Part 3 – The devil in the detail

‘Us’ Part 4 – Using your noggin

‘Us’ Part 5 – Feet and socks

‘Us’ Part 6 – Kith and kin


[i] https://simonbrown.co/2017/10/07/07-october-2017-new-evidence-suggest-we-are-much-older-than-300000-years/

[ii] Some palaeoanthropologists include chimps and bonobos in the hominins. Rather than outlining all the arguments for or against, I’ll err on the side of caution and include only our immediate family in the hominins.

[iii] Titchmarsh did this in the wonderful natural history series The British Isles: a Natural History. See https://www.bbc.co.uk/programmes/b01fkhdx.

[iv] This is the ability to attribute mental states similar to your own to other members of at least your own species and possibly other species as well.

[v] Lane, Nick; The Vital Question; London, 2015; p 27.

[vi] This may have happened a second time. A single-celled organism with a nucleus, and possibly mitochondria, dubbed Parakaryon myojinensis, was retrieved a few years ago from the foot of a sea creature found off a coral atoll not far from Japan.

The main difference between P. myojinensis and all other eukaryotes is that its nucleus and mitochondria are surrounded by a single membrane instead of a double one, and its DNA is stored in filaments (as in bacteria), suggesting it is the result of a different line of evolution from all other eukaryotes. Indeed, there is some argument as to whether it is a true eukaryote at all. The only thing that can be said with some certainty is that it is definitely not a prokaryote.

No other example of this creature, or anything similar, has since been recovered. Nonetheless, when it comes to science, hope springs eternal …

See here for more information.

04 February 2021: Family, community and conservation – a conversation with Dr Patricia Mupeta-Muyamwa

‘We need to codesign programs that move away from disempowering communities and indigenous people to giving them the power to be strong stewards of the natural resources and the lands,’ says Dr Patricia Mupeta-Muyamwa, Strategy Director for the African Indigenous Landscape program at The Nature Conservancy, a charitable environmental organisation with its base in the US.

Her job involves working with local communities to protect and nurture the natural environment. Patricia says she fell into the work more by accident than design.

Dr Patricia Mupeta-Muyamwa (Photo: Simon Brown)

‘I did my undergraduate degree in wildlife ecology at the University of Zambia in Lusaka, and in my last six months did an internship monitoring wildlife and vegetation in a national park. The job involved interacting with the park scouts, and after listening to their experiences I realised that it was people and not wildlife that was the problem, and I asked myself how do we empower people to make them better stewards of nature?

‘I did my Masters in conservation and tourism in the UK, and learned about different models of conservation. Because of the chequered history between national park administration and local communities, which left a great deal of animosity towards the state, my work promotes the importance of getting the rights to land and natural resources to the people that live closest to them.

‘Historically, African national parks and nature reserves were created for aesthetic reasons using an American model first developed for Yellowstone National Park.

‘Up until the 1990s, the state and not the local people ran national parks and conservation areas; it was a relic of Africa’s colonial past, and part of my work is to help address this injustice by reconciling local people so they’re a part of the conservation solution.

‘Local communities were forced out. People were seen as part of the conservation problem and not as part of the solution. For example, in South Africa national parks are still state run in a very centralised way; there are many communities around Kruger but few are getting any real benefit from it except a few people that find employment.’

Patricia says her long job title came from her work as it evolved.

‘A large part of the job is focused on protecting wildlife corridors spanning across parks, private and community-owned lands.

‘The work itself has three main objectives. First, giving land and resource rights back to the local community. Second, developing community skills to manage natural resources, for example protecting and monitoring wildlife. Third, helping develop community opportunities for making a living from conservation, for example with tourism and programs that empower women.’

Patricia stresses this is a bottom-up approach. ‘A big part of my job is to consult with communities and their leaders to find the best conservation solution. I listen to their stories about living and interacting with the land.’

Patricia leads teams managing four big landscape projects: one in Kenya involving 39 separate communities, two in Tanzania and one in Zambia.

Patricia with two young Hadza girls in Tanzania (Photo: Dr Patricia Mupeta-Muyamwa)

‘We’ll soon be starting a fifth one in Angola, based around the headwaters of the Okavango River.’

As an example of what these projects can achieve, Patricia cites the work done with a local partner, the Northern Rangelands Trust, across the 39 communities in Kenya.

‘Establishing wildlife corridors between these communities has been successful in increasing the numbers of previously threatened animals such as elephants.’

#

Patricia was born and raised in Kitwe, a mining town in Zambia’s Copper Belt on the Kafue River, the country’s third largest river. This is also where she first met her husband Andrew, now a high school Maths Studies teacher. The two of them have fond memories of growing up in this small, quaint mining town.

‘My parents worked for a mining conglomerate. My father worked for 27 years as a human resources manager for a copper mining company. He was a real people person, and connected with people from all walks of life.

‘My mother was a teacher, training first in Zambia to teach home economics. Later she studied in Liverpool in the UK to become a Montessori teacher – the first Zambian to achieve this.’

Patricia grew up in a one-party state created by independent Zambia’s first president, Kenneth Kaunda. Following a period of instability, the Choma Declaration, signed in 1973, banned all parties except Kaunda’s own, the United National Independence Party (UNIP). He remained in power until 1991, when he was forced to hold multi-party elections and was voted out.

‘Kitwe’s British-South African owned mining company was nationalised by the Zambian government, so I grew up thinking it was normal to live in a black-run society. It was a source of pride for us that Zambians were in charge of the company.’

Kenneth Kaunda, Zambia’s first president (1964-1991) (Photo: Creative Commons)

Patricia says that even though she grew up in a one-party state, she only became aware of that as she finished high school.

‘But living in it as a child you don’t necessarily feel authoritarian measures – for example, restricted access to the world outside Zambia. We were cocooned, but that didn’t feel bad. In some ways I would rather live in that state than what exists now. Things worked: there was infrastructure that worked, equity for all seventy-two tribes and a sense of security. I believe Kenneth Kaunda was motivated for the greater good of society. He created an environment that allowed everyone to have access to healthcare, education and employment regardless of background.

‘Kaunda created a system where we didn’t feel black, but Zambian. My father’s generation, which grew up under colonial rule in what was then Northern Rhodesia, was taught British, European and American history at school; my generation was taught pre- and post-colonial African history.

‘Kaunda led the way in institutionalising a Zambian identity. As a kid I didn’t really appreciate the gravity of this, but looking back now I see that it helped me navigate through life as a Zambian. Kaunda called this philosophy “humanism” – in the sense that the core values were about recognising our common humanity, and that we should always be aware that history was judging us and so be peaceful, respectful and good to each other.’

But things started to change in the late 1980s and early 1990s.

‘The economy was stalling and there were food shortages. Up to then the majority of Zambians had been politically passive; there wasn’t a lot of collective activism. The system that existed helped make it that way. But at that point the multi-party democracy movement challenging Kaunda was slowly taken up by the people.

‘When I was sixteen I was apolitical, but then my dad took me and my older brother to my first political rally just before Kaunda left. It wasn’t simply an anti-Kaunda rally, but more about a wind of change. It was huge and exciting – there was a great desire for change – and when it came I was hopeful. Everything felt new and that at last we were going places and fighting for a better Zambia. There was a sense of entrepreneurship in the early 90s, and new markets were opening up. The mines were privatised, for example, and different assets were being sold, like the mining homes, and many Zambians became home owners for the first time.

‘But in the euphoria we forgot what Kaunda had done for Zambia. The current political system in Zambia is not as effective as the old political system. There is less equity and less access to health, work and education. The Zambian economy is on life support.’

#

The future Dr Patricia Mupeta-Muyamwa in 1976 with older brother Chris, left, and younger brother Michael, centre (Photo: Dr Patricia Mupeta-Muyamwa)

The one great source of stability for Patricia is her family.

‘I come from a very strong nuclear family, which is not the norm for families in Africa. It is a central part of who I am. My husband, parents, siblings and my maternal grandmother have all influenced my life in different aspects.’

Patricia says her grandmother, Dorika, was independent, strong-minded, political and entrepreneurial. Born in the early 1920s, she witnessed her country move from a colonial to a post-colonial era.

‘She was a Kaunda supporter and freedom fighter from the colonial era. She later became a strong organiser in the women’s league of the United National Independence Party (UNIP).

‘Towards the end of the colonial period she accompanied her husband, a community development officer, to different postings all over the country. In one posting he was sent to a district in the northwest at the same time as the colonial authorities imprisoned Kaunda there; when Dorika saw Kaunda being taken for his daily walk she would go up and talk with him, much to the distress of the local British officials. The District Governor reprimanded her for this, but she held her ground and kept talking with Kaunda. This so upset the Governor that he later transferred my grandfather away from the district because of his “troublesome wife”.

‘During the time when there was a call for change from Kaunda’s rule, she said “No! No change!”’

Dorika and Bilson Muzi, Patricia’s grandparents, taken in 1963 at Kabompo, North Western Province, where Dorika upset local authorities by talking with the imprisoned Kenneth Kaunda. Patricia’s future mother is standing on the right. (Photo: Dr Mupeta-Muyamwa)

After her husband died, Dorika supported her family of eight children by selling bread and other baked goods from home and at the market.

‘With two other women she set up one of the first female trading markets in Kabwe, a small mining town in central Zambia; it’s still operating to this day.’

Patricia says she drew a great deal from her grandmother.

‘I admired the way she navigated through life and survived as a woman and as a leader. She did so much in her life and in her own way. The older she got the stronger she got, and she was a great female role model. She really lived life on her own terms.’

Patricia’s father, David, was the biggest male influence on her life. ‘My love of reading came from him. I loved going into his library. I read his 12-volume encyclopedia set over and over.’

Patricia says growing up she never gave her mother the same attention she gave her father.

‘I was a “daddy’s girl”, and she wasn’t in my “cool space” back then. Now I realise just how similar we were. She was a trail-blazer. She was the first Zambian to study and teach Montessori; that took a lot of initiative and courage.’

Perhaps the biggest influence her mother had on her life was the decision to send Patricia and her sister, Edith, to an all-female boarding school run by German nuns – one of the oldest and best schools, established in the early 1900s. She remembers the school was run under a very strict regime.

‘I did not like it at all. The nuns worked us very hard. When I tell people I went there they ask me if my parents hated me! But in hindsight, the education I gained from that time was invaluable.’

#

Patricia says she wasn’t really conscious of her skin colour until she travelled to the UK and, especially, the US, for study.

‘I’m not sure whether or not that was a peculiarly Zambian experience. I’ve heard very different stories about encounters with racism from other black people, many of them heartbreaking.

‘Up to then I never thought of myself as a “black” person. My first racist encounter was in the UK when I was in my early 30s, when a hobo at a train station yelled at me to go “home”. I was shocked more than hurt by it because for the first time I became truly aware that this society was different from the one I grew up in.’

She says that while studying for her masters at the University of Kent she felt she was living in a bit of a bubble because she was very familiar with the British tradition and culture that had been such a part of Zambia before independence.

‘Growing up in Kitwe I had many encounters with non-racist and progressive Brits. It wasn’t until I was studying in the US that racism really hit me.

‘Soon after I arrived at the town where I was going to study I started looking for accommodation and came across a poor black neighbourhood. I began to understand how placing a community like this, separated from better-off communities, institutionalised racism.

‘US culture was strange and interesting. I was living in a diverse and liberal university town in northern Florida, but you didn’t have to drive far from the town to find Confederate flags flying in front yards. It was a totally different society.

‘For the first time I felt and identified as “black”. I found myself gravitating towards black student unions and organisations helping black communities.’

Patricia was saddened to see great poverty in some black communities in the US. ‘I had seen poverty in Africa, of course, but here it was like the lights had gone out. There was a lot of hurt and anger in that tribe – a tribe I can relate to – but the hurt and anger also existed in the academic environment, which was so different from my previous experience that it threw me off guard somewhat.

‘What I also found interesting was the way the black community was divided among African Americans, Caribbeans and Africans. It could be hard to cross the divide, but I’m not sure how much that was due to my own naivety. The black student union had a good ethos, for example, but its leadership was African American, and they defined the union’s agenda and where a lot of the union’s energy was spent. I had to think about what it meant to be an African in this situation. My initial enthusiasm at being part of the union started to wane because I couldn’t see what my role might be.’

#

Patricia says she identifies as Zambian but feels African.

‘I especially feel broadly connected to sub-Saharan Africa. African countries like Zambia, Kenya, Botswana and South Africa have more in common than not.

Patricia at the farm she owns with her husband Andrew in the village of Chifwema, southeast of Lusaka, Zambia’s capital. (Photo: Dr Mupeta-Muyamwa)

‘There is a connection around tradition, culture and how we think about family. There is a very strong “oneness” around family events that goes with a sense of community. This means there is still an especially strong tie in many countries between urban and rural communities; people working in the big cities still go back to their families living in rural areas for important occasions.’

Patricia hopes those values will see sub-Saharan Africa through to a better future. ‘Right now, for example, that rural link for urban dwellers means many of them have a comparatively safe refuge during the current COVID-19 pandemic.

‘Strangely, this isn’t what’s happening in Zambia, where the rush to urbanise seems to have cut many of those ties to the country. I don’t know the village where Dad came from, for example.

‘Africa needs to reconnect to its core identity. I believe we lost this connection as we urbanised. My hope is that we will see those links repaired in Zambia and other parts of Africa.’

11 November 2020: Venomous statistics

Some Australians take perverse pride in the legion of venomous animals infesting the continent and its surrounding seas, from the very small members of the Irukandji group of box jellyfish[i] up to the very large mulga snake[ii].

On the face of it, Australia seems to have had the bum run when it comes to its snakes, spiders, ants, octopuses, cone shells and jellyfish, and this hardly exhausts the list of venomous creatures that call Australia home. Indeed, if venomous wildlife is your thing then you should be calling Australia home, too.

(As an unpleasant aside, Australia’s venomous biota is not even restricted to its animals; I dare you to read this with the lights off: Australia’s venomous trees.)

If we exclude the 120 kg drop bear[iii], which is sometimes erroneously claimed to use venomous claws to subdue its prey, then the big three that dominate most conversations after a few beers at the pub are the inland taipan, the box jellyfish (particularly the sea wasp), and the Sydney funnel-web spider.

The inland taipan[iv]

For a timid and rarely seen snake, the inland taipan has garnered a fearsome reputation in recent years. In fact, one of its alternative names is the fierce snake, but this is entirely due to its venom, milligram for milligram the most lethal of any of the world’s reptiles. It is often reported that the 110 mg of venom from a single bite, if carelessly (or maliciously) injected, could kill 100 adult men. The fact that the average dose delivered by an inland taipan is about 44 mg is rarely mentioned, although since this is still enough to kill at least 40 adult men it could be argued I’m being pedantic. Compare this to the most lethal member of the saw-scaled vipers[v], which can reportedly kill six adult men with the venom it delivers in a single bite. (We’ll be returning to the saw-scaled viper a little later.)
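As a quick sanity check of those numbers (a back-of-envelope calculation, taking the commonly quoted figures at face value):

\[
\frac{110\ \text{mg}}{100\ \text{men}} = 1.1\ \text{mg per man},
\qquad
\frac{44\ \text{mg}}{1.1\ \text{mg per man}} = 40\ \text{men}
\]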

The chance of encountering the inland taipan, which inhabits that semiarid corner of hell-on-earth between Queensland and South Australia, is vanishingly small. Indeed, in Australia your chance of dying from thirst or a camel stampede is probably greater than your chance of dying from the bite of any snake species. It’s also worth noting that the inland taipan has been described as placid and reluctant to strike; of course, if cornered or mishandled it will not hesitate to bite with remarkable speed and precision, and more fool you.

The sea wasp[vi]

The sea wasp is another matter altogether, not because it is remotely vicious, but because it just doesn’t give a damn. All envenomations are accidental. The largest of the box jellyfish, it spends its life floating in the warm tropical waters off northern Australia, Papua New Guinea and Southeast Asia. Well, floating isn’t entirely correct. The sea wasp does swim, but not in the determined way that would get it a place in Australia’s Olympic swimming team; apparently at full pelt they can cover about six metres in a minute. In the right season and the right place, the chance of accidentally bumping into one of these almost transparent jellyfish is depressingly high. Beaches all along the northern, tropical shorelines of Australia have signs warning swimmers of the danger.

Sea wasp (Photo: Creative Commons)

An adult sea wasp is made up of a roughly square-shaped bell about 30 centimetres in diameter; 15 tentacles trail from each of the bell’s corners, each tentacle up to three metres long and covered in around 5,000 cells called cnidoblasts, each of which in turn houses a nematocyst, which is Latin for ‘this will hurt’.[vii]
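Taking those figures at face value, the armament adds up quickly:

\[
4\ \text{corners} \times 15\ \text{tentacles} = 60\ \text{tentacles},
\qquad
60 \times 5{,}000 = 300{,}000\ \text{cnidoblasts}
\]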

Nematocysts are the business end of a sea wasp’s venom delivery mechanism. When its prey, usually prawns or small fish, brushes against the tentacles, the cnidoblasts release the nematocysts. The nematocysts penetrate the skin of the victim like miniature harpoons and then release their venom. Despite having actual eyes, the sea wasp seems incapable of restraining the cnidoblasts from releasing their load if the tentacles accidentally brush against something which isn’t prey, such as a human. Since this means the sea wasp is missing out on a meal and must now spend what I assume is a lot of energy to rearm the cnidoblasts, this is a serious design fault. Admittedly, that’s small comfort for anyone writhing in the water in unbearable pain, but one can only imagine the cuss words going through what passes for a sea wasp brain.[viii]

According to one study[ix], a sea wasp carries enough venom to kill 60 adults, which considering its size compared to, say, the inland taipan, is some achievement. Nonetheless, most encounters with a sea wasp don’t end with a fatality. The quick application of vinegar to neutralise any nematocysts still attached to the skin, and ice to relieve the pain, is often all that’s necessary. Having said that, one study[x] shows that 8% of envenomations require hospitalisation:

‘Because of the rapidity of fatal C. fleckeri envenoming, the critical window of opportunity for potentially life-saving use of antivenom is much smaller than that for snake envenoming, possibly only minutes. Furthermore, from animal study data, it was calculated that around 12 ampoules of antivenom may be required to counter the effects of a theoretical envenoming containing twice the human lethal dose of venom.’

The lesson here is if you come across a sign at a beach that says beware of box jellyfish (or for that matter crocodiles) consider something marginally safer and decidedly less painful for your daily outing, like jumping off a cliff.

The Sydney funnel-web spider[xi]

I’m an arachnophobe, and this spider pretty well defines the content of my worst nightmares.

I readily admit I’m scared of vampires, malevolent ghosts, land sharks, Brussels sprouts and omelettes – for that matter, any food made mainly from eggs – but my fear of spiders is on a whole other level. Even if I catch a glimpse from the corner of my eye of the completely innocuous daddy longlegs, a long shiver will pass down my spine. I don’t know what it is about arachnids that gets me all goosebumpy or triggers my fight or flight instinct (to be honest, my fly or fly-twice-as-fast instinct), but it might have something to do with spiders like huntsmen, wolf spiders, tarantulas and funnel-webs being so damn hairy. It just isn’t right; it’s as if they’d killed a dog or cat, skinned it and donned the fur. Then there’s the eight legs. Six legs on creatures such as ants and earwigs are hard enough to put up with, but eight seems a serious case of overengineering.

Sydney funnel-web (Photo: Creative Commons)

Anyway, of all the world’s spiders, the Sydney funnel-web ticks every yuck box: wears dog fur, tick; eight legs, tick; lives in a hole in the ground, tick; likes entering human households, tick; has more than two eyes, tick; has fangs long enough to pierce your toenail to get to the vulnerable flesh underneath, tick; can kill you with a single bite, tick.

Indeed, I cowrote a short story about the Sydney funnel-web with good friend, colleague and fellow-arachnophobe Sean Williams. The story, ‘Atrax’, must have hit a nerve with quite a few people: it won the Aurealis Award for best horror short story in 1999.

The Sydney funnel-web’s lethality can be put down to an extraordinary compound in its venom called δ-atracotoxin (sometimes referred to as delta-hexatoxin[xii]), which bizarrely is brilliant at killing its normal prey of insects, but in small doses causes no harm to mammals … with the single exception of primates. And humans, regrettably in this single instance, are primates. Why the venom should be so damn selective is anyone’s guess, and there have been a few.[xiii]

The other peculiar fact about the Sydney funnel-web is that the male’s venom is up to six times more toxic than the female’s[xiv]. The best theory to explain this is that the male goes wandering during the mating season looking for females and has to defend itself against hungry predators, as hard as it is to imagine any predator being so hard up it needs to feed on such an ugly, hairy and extraordinarily venomous assassin. Admittedly, this doesn’t quite explain why the venom is so effective against primates; I assume almost every human on the continent, like myself, would go to great lengths to avoid antagonising any spider let alone one that can kill you, and as far as I know, humans are the only primates to have made their home in Australia.

Ultimately, the venom’s ability to kill humans is just an accidental byproduct of its evolutionary development.

But, and this is a big ‘but’, no human has died from the bite of a Sydney funnel-web spider since an antivenom became available in 1981.

Most venomous versus most dangerous

And this is where we return to the saw-scaled viper. One of these smallish snakes – the largest grows no bigger than 90 cm – may only be able to knock off six fully grown adults, as opposed to the inland taipan’s potential 100 victims, but nonetheless, to my mind the viper is the more dangerous of the two snakes.

Before I set out my reasons for this, we should remember the saw-scaled viper and the inland taipan only have to kill you once to ruin your day, not six or a hundred times, which would seem – and please excuse the pun – something of an overkill. As far as the average human is concerned, a bite from either of these snakes will see your life flashing before your eyes.

And why do I think the saw-scaled viper is the more dangerous of the two?

First, your chance of encountering a saw-scaled viper on its home turf – anywhere dry in Africa, the Middle East and southern Asia – is dramatically higher than your chance of encountering the inland taipan on its home turf.

Saw-scaled viper (Photo: Creative Commons)

Second, the saw-scaled viper is a much testier beast than the inland taipan, and seems inclined to bite anyone passing within striking distance, something the inland taipan is not inclined to do.

Third, your chance of getting good medical care through much of the saw-scaled viper’s range, let alone the appropriate antivenom, can be very small.

Indeed, the saw-scaled viper may be responsible for more human deaths than any other snake, whether we’re talking about other vipers, adders, taipans, cobras, rattlesnakes, kraits or mambas. It’s reported to be responsible for up to 90% of all snakebites in Africa.[xv]

But rather than picking on any one snake, it’s important to understand that snakebites are a serious health problem in most developing countries. According to the World Health Organization[xvi]:

‘Worldwide, up to five million people are bitten by snakes every year. Of these, poisonous (envenoming) snakes cause considerable morbidity and mortality. There are an estimated 2.4 million envenomations (poisonings from snake bites) and 94 000–125 000 deaths annually, with an additional 400 000 amputations and other severe health consequences, such as infection, tetanus, scarring, contractures, and psychological sequelae. Poor access to health care and scarcity of antivenom increases the severity of the injuries and their outcomes.’

It seems to me these statistics, which barely reflect the pain, misery and social desolation that can be caused by a snakebite, are the ones we should obsess over, rather than how many humans can be killed by a single and remarkably shy Australian snake.

One final point. On average, more Australians die each year from the stings and bites of ants, wasps, bees and ticks than from snakebite, largely thanks to anaphylactic shock (and not prophylactic shock as I once tipsily declaimed). From 2000 to 2013, 27 Australians died from snakebite; over the same period, 32 Australians died from animals that fly and crawl around us every day of our lives without us giving them a second thought. In the same period, no one died from a spider, scorpion or centipede bite, and only three people died as a result of envenomation from a marine creature[xvii].

To put these statistics into proper perspective, horses were responsible for the deaths of 77 Australians between 2000 and 2010[xviii]. To make the perspective even sharper, consider that between 2000 and 2013, more than 21,000 Australians died in car accidents[xix].
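Normalising those tallies to deaths per year makes the comparison easier (a rough back-of-envelope using the figures above; note the horse figure covers a shorter, ten-year window):

\[
\text{snakebite: } \frac{27}{13} \approx 2,\qquad
\text{stings and ticks: } \frac{32}{13} \approx 2.5,\qquad
\text{horses: } \frac{77}{10} \approx 8,\qquad
\text{cars: } \frac{21{,}000}{13} \approx 1{,}600 \text{ per year}
\]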

By the way, in those same thirteen years, two people were recorded to have died from an unknown animal or plant. I’m betting it was a drop bear.


[i] Family Carukiidae.

[ii] Pseudechis australis.

[iii] Thylarctos plummetus – in my humble opinion, the best species name ever.

[iv] Oxyuranus microlepidotus.

[v] Echis carinatus.

[vi] Chironex fleckeri.

[vii] Disappointingly, and rather mundanely, nematocyst actually derives from the Greek for ‘thread cell’.

[viii] In fact, sea wasps don’t have a brain as such, or anything else we might recognise as a central nervous system. But it does have something: ‘The box jellyfish’s nervous system is more developed than that of many other jellyfish. They possess a nerve ring around the base of the bell that coordinates their pulsing movements … ’ See https://en.wikipedia.org/wiki/Box_jellyfish.

[ix] http://emedicine.medscape.com/article/769538-overview

[x] https://www.mja.com.au/journal/2005/183/11/prospective-study-chironex-fleckeri-and-other-box-jellyfish-stings-top-end#authors

[xi] Atrax robustus.

[xii] For example, see:

https://theconversation.com/i-didnt-mean-to-hurt-you-new-research-shows-funnel-webs-dont-set-out-to-kill-humans-146406

[xiii] For an explanation that makes sense to me, see: https://biology.stackexchange.com/questions/8825/why-is-funnel-web-spider-venom-so-lethal-to-humans-and-not-so-much-for-other-mam

[xiv] https://en.wikipedia.org/wiki/Delta_atracotoxin

[xv] James Cook University toxinologist Professor Jamie Seymour carefully lays out what makes one venomous animal more dangerous than another in the National Geographic documentary World’s Worst Venom, not only comparing and ranking the inland taipan with other snakes, but also including sea stingers, spiders, scorpions and many other venomous creatures. Well worth a look if you can get your hands on it. See:

https://www.imdb.com/title/tt1132196/?ref_=rvi_tt

[xvi] https://www.who.int/en/news-room/fact-sheets/detail/animal-bites

[xvii] https://biomedicalsciences.unimelb.edu.au/news-and-events/archive-news/professor-daniel-hoyer-and-dr-ronelle-welton-featured-academics-in-pursuit-article

[xviii] https://www.australiangeographic.com.au/topics/wildlife/2016/03/here-are-the-animals-really-most-likely-to-kill-you-in-australia/

[xix] https://en.wikipedia.org/wiki/List_of_motor_vehicle_deaths_in_Australia_by_year