The ethics of prehistoric hunter-gatherers: caretaking in the Stone Age
Maarten Peels
Utrecht, June 2024
Written as part of a research internship at Hanno Sauer's Moral Progress research team, Faculty of Humanities, Utrecht University.
Abstract
This paper explores the ethics of prehistoric hunter-gatherer societies, focusing specifically on the concept of caretaking. The research combines archaeological findings, evolutionary studies, and ethnoarchaeological comparisons to provide insights into ethical behaviours, particularly those related to caring for the vulnerable. The paper discusses the epistemic challenges of using analogies with modern hunter-gatherer societies and emphasizes the need for cautious interpretation. Hard evidence is provided through an analysis of healed bone fractures and evidence of medical procedures such as trepanation. General inferences about ethics are made based on the social evolution of humans. The study highlights how prehistoric communities likely invested in the care of their members. Despite the inherent difficulties in drawing definitive conclusions from these three branches of science, the interdisciplinary approach offers a nuanced understanding of prehistoric caretaking behaviours. This patchwork of evidence provides a foundation for further exploration of the ethical dimensions of our ancestors' lives.
Introduction
What can be said of the ethics of prehistoric people? This question is quite broad, as prehistory is a rather long period. Moreover, it seems no sure answer can ever be given, since the ethical choices made by people in the deep past were fleeting phenomena. As Nelson and Jurmain (1988, 460) put it, "behaviour does not fossilize."[1] However, there are three main approaches from which prehistoric behaviour can be inferred: 1) evolution, 2) archaeological finds and 3) anthropological comparisons (also called ethnoarchaeology). Focusing on caretaking as an ethical phenomenon, this paper will explore the findings of these three approaches and discuss their epistemic peculiarities. Importantly, in each case it will be established where the science stops and speculation begins. Though the examples together form a patchy picture at best, it is better than nothing.
People studying prehistory almost without exception combine data from two distinct scientific fields: archaeology and anthropology. Though the way practitioners gather information differs greatly between these disciplines, the literature gives the impression that they are inextricably linked. Archaeologists try to explain their findings by using the observations of anthropology or ethnography (the difference between ethnography and anthropology is only subtle, and I will use both terms interchangeably). At the same time, anthropologists studying hunter-gatherers or traditional farmers sometimes claim that the observations of today say something about prehistoric times. Critics have pointed out that this combinatory approach (sometimes called comparatism) has some major pitfalls, and some question whether any valid conclusions can be drawn this way at all (Gosselain 2012). Nevertheless, combining archaeology and anthropology is a practice as old as the two fields themselves, and I will discuss their data in order to get a grip on the ethics of prehistoric people.
Forming a theory about the hunter-gatherer past is not just a standalone endeavour. The scientific picture we form of our distant ancestors shapes how we look at ourselves today (and vice versa). Claude Blanckaert (2022) calls this the "dialectic of modernity in prehistory." Labelling certain archaeological remains as belonging to 'modern' humans means to identify with these humans, to say they were one of us and to put our roots at a certain point in time. Blanckaert points out that the first scientists studying prehistory, following the discovery of unmistakably ancient skeletons in the Dordogne in 1868, thought these people must have been violent and uncivilised. They based their view on the numerous antemortem injuries found on the remains; this view still informs clichés about prehistory today. But in the decades after, as increasingly sophisticated artefacts were found, some scholars started seeing the later phase of European prehistory instead as a prelude to the great societies that came after (including their own). Were humans indeed very violent and brutish, and should we consequently form a cynical view of human nature today? Or, as Rousseau's cliché holds, did people mostly live in harmony with each other and with nature too, agriculture subsequently ruining that Eden and the Industrial Revolution giving the final blow? Neither claim is right; in fact, both questions are ill-posed. Using 'modernity' to distinguish one phase of the deep past from another is rooted in the habit of seeing societies as following certain phases of development, with an agricultural-industrial complex as the inevitable outcome. This view, at the heart of most historical narratives since the Victorian era, has been increasingly challenged in recent decades, notably by Graeber & Wengrow in The Dawn of Everything (2022). They and other scholars in this vein have pointed out many examples of societies consciously choosing not to adopt agriculture or other innovations in order to maintain their way of life. Moreover, though subsistence methods obviously changed in most places, this developmental process was not linear. Indeed, the tribes that still live according to their traditional lifestyles stand as continuing counterevidence to the claim of inevitable progress.
The point of the above is that we should be wary of making sweeping claims about life in prehistory: not only is a lawlike, development-oriented view of history highly problematic, the period under discussion is also incredibly long. Within that timeframe, so many cultural and environmental factors could have differed between groups that most general claims about life in prehistory are necessarily misleading, if not outright false. The topic of this paper in particular, the ethics of prehistoric people, is riddled with problems of generalization. But there are interesting things to say nonetheless.
But first a note on context. It is difficult to grasp the vast expanses of time that form prehistory. In order to assess what can be said of the ethics of prehistoric people, I first have to introduce the reader to the giant puzzle of the last Ice Age: the pieces of the puzzle are the enigmatic evolution of hominids, wild climate swings, a host of curious flora and fauna and finally the rise of Homo sapiens. As will become clear, the first 'modern' humans and the world they lived in sometimes remain elusive, though new scientific techniques are slowly forming a better picture from the data that has piled up. The study of prehistory is in the end still a young branch of science, one that is regularly upturned by new finds and research methods. Though they lived so long ago, the people of the Stone Age keep surprising us. Since all of humanity has its origins in the hunter-gatherer past, it is one of the most fascinating stories to tell.
Below, I will first discuss the geology of the last Ice Age (approx. 2.58 million-11,800 years ago). I will also provide an overview of the jungle of terms and timelines which scholars of the related fields use by highlighting them in bold. After thus establishing the context, I will dive into the main topic: the ethics of prehistoric hunter-gatherers. As stated above, this might seem like an impossible thing to research, because it immediately springs to mind that ethical decision-making leaves little to no evidence (let alone from so long ago). And indeed, hard facts are difficult to come by, since "[t]here are no certainties" about human behaviour in prehistory, "just some possible realities, which are a matter of instinct and extrapolation rather than scientific fact." (Fagan 2013, 15) But there are quite a few hints to help us along. I have categorized these hints into three vantage points:
- Evolution: what aspects of caretaking we share with our apelike cousins and how hypersociability expanded our moral circle
- Ethnoarchaeology: using anthropological studies on recent hunter-gatherer tribes as a model to explain archaeological finds
- Archaeology: looking at archaeological finds directly for evidence of ethical decision-making, e.g. medical treatments or the social position of wounded individuals.
These vantage points partially overlap and inform each other. Also, as will become clear later on, they each have their epistemological quirks and disadvantages. I will close each section by describing the exact epistemic framing of the knowledge under discussion. Together, evolution, archaeology and ethnoarchaeology provide a patchy but decent picture of what ethical life was like for our remote ancestors.
To limit the scope of this research, I will mainly discuss one category of ethical behaviour: caretaking, and sometimes specifically medical caretaking. This gives us a neat, identifiable set of behaviours for which to find evidence. In section 2 I will briefly explain what this type of ethical behaviour entails.
1. The World of the Hunter-Gatherers
Geologists and archaeologists, like most academics, are keen to think up confusing and idiosyncratic terms. Below, I will introduce the reader to the puzzle of the last Ice Age and provide an overview of relevant terms.
Ice Ages, Stone Ages and Dealing With Time
In the last 4.5 billion years of the Earth's existence, there have been several periods known as Ice Ages, which is when "ice has accumulated at the poles and the continents have been glaciated repeatedly." (Allaby 2013)[2] Some of them have lasted tens of millions of years, with massive ice sheets (or glaciers) growing across the northern continents in some periods (called glacial periods or glacials) and retreating in others (interglacials). Interglacials are usually much shorter, lasting several millennia, versus glacials of a hundred thousand years or more. To establish the length and severity of these periods, scientists have taken to studying geological formations left by the ice sheets, but also exotic phenomena like Arctic ice cores and the inner layers of cave stalagmites. These have built up over thousands to millions of years, leaving subtle chemical variations between layers of ice or mineral. By closely analysing these layers, researchers can infer the approximate temperature and composition of the atmosphere in a given period.
Judging from these data, the last Ice Age started around 2.58 million years ago (mya) and coincides with the geological epoch called the Pleistocene. Some researchers state that this Ice Age ended around 11.8 thousand years ago (kya), the current warmer temperatures signalling a new geological epoch (the Holocene). Others argue that we're now actually in a typical, brief interglacial, meaning that the Earth could return to a glaciated state somewhere in the next few millennia. In any case, both epochs fall within the larger geological period called the Quaternary (2.58 mya-present), and so the last Ice Age is also called the Quaternary Ice Age or the Quaternary Glaciation. Judging from the ice cores and cave stalagmites, the climate oscillated wildly before the current, continuously warm period started, especially during the millennia in which Homo sapiens developed. As Brian Fagan (2013, 29) writes, "About 90 percent of the past five hundred thousand years have been colder than today, and the world's climate has been in transition from cold to warm or back again for about three quarters of that time." During glaciations, sea levels became much lower because of the water locked up in glaciers, sometimes 100 meters lower or more. As a result, huge swathes of land were freed up from the sea: for example, Indonesia was mostly a continuous landmass and the British Isles were connected to the European mainland as far as Denmark (an area called Doggerland). At the same time the encroaching ice sheets would make more northern areas uninhabitable, crushing the upper crust as they went and creating harsh tundra landscapes south of them. Reindeer, mammoths and woolly rhinoceros inhabited these cold ecosystems, fiercely hunted by humans. When the Earth warmed during interglacials, sometimes lasting only a millennium or less, coastal areas flooded again, and meltwater rivers swept the landscape. Hunter-gatherers almost certainly preferred living close to water (both fresh and salt) since it offers both hydration and sustenance. Considering this, it begins to dawn on us just how many traces of their existence must have been lost through all this geological violence.
An almost parallel timescale is used by archaeologists (who study the human past) and palaeontologists (who study all living beings of the past). Working in the context of human evolution, they use the phases of humanity's technological development to demarcate time periods. Their timeframe therefore opens with the Palaeolithic (literally Old Stone Age). This period started around 3.3 mya (so before the start of the Ice Age and Pleistocene). From this moment onwards, the first crude stone tools appear in the archaeological record (the body of physical evidence about the past). These are often no more than flint rocks smashed so as to make a sharp edge. The Palaeolithic is commonly subdivided into the Lower, Middle and Upper Palaeolithic, with evidence of ever more sophisticated toolmaking from one subdivision to the next. The end of the Palaeolithic coincides with the end of the Pleistocene, c. 11.8 kya (and so the end of the Ice Age as well). This paper will focus mostly on the Middle and Upper Palaeolithic, which is when Homo sapiens came to be and, according to many in the field, developed distinctive signs of ethical behaviour.
In the Upper Palaeolithic there is a further subdivision based on toolmaking. Archaeologists have identified distinctive 'toolkits' in the archaeological record. The toolkits comprise artefacts made in a typical way, such as flint blades or spear points. Archaeologists surmise that these toolkits, which often remain unchanged for thousands of years, each belonged to a distinctive culture that also lasted thousands of years. The main toolkits are: the Aurignacian (43-26 kya), the Gravettian (33-21 kya), the Solutrean (22-17 kya) and the Magdalenian (17-12 kya). Note that these ranges overlap, as sketched below.
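For readers who find it easier to see the chronology as data, here is a minimal sketch (in Python, purely illustrative; the date ranges are only the approximate ones just listed) showing that these toolkit cultures were not strictly sequential but coexisted for millennia:

```python
# Illustrative sketch only: the main Upper Palaeolithic toolkits with their
# approximate date ranges, given as (oldest, youngest) in thousands of years ago (kya).
TOOLKITS = {
    "Aurignacian": (43, 26),
    "Gravettian": (33, 21),
    "Solutrean": (22, 17),
    "Magdalenian": (17, 12),
}

def overlap_kya(a: str, b: str):
    """Return the span (oldest, youngest) during which two toolkits coexisted, or None."""
    oldest = min(TOOLKITS[a][0], TOOLKITS[b][0])
    youngest = max(TOOLKITS[a][1], TOOLKITS[b][1])
    return (oldest, youngest) if oldest > youngest else None

print(overlap_kya("Aurignacian", "Gravettian"))  # (33, 26): roughly 7,000 years of coexistence
```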
After the Palaeolithic the Mesolithic (or Middle Stone Age) begins. The Mesolithic is characterized by very refined and efficient toolkits compared to what went before. It forms the prelude to the Neolithic (or New Stone Age), which is when agriculture developed and people invented polished stone axes. The Mesolithic and Neolithic ended at different times in different places, depending on when a given region underwent the transition to farming. They are followed by the Copper, Bronze and Iron Age, which also offer a patchwork of transitional cultures starting from around 7 kya. Prehistory formally ends when humans started writing down events around 5 kya. But judging purely from their tool use, some societies lived in a Stone Age until long after 5 kya, or even still do so today. And though their number is dwindling, there remain quite a few tribes that do not use written language to any meaningful degree, which means that for them 'prehistory' effectively lasts until today.[3] For clarity, however, I will take prehistory to refer to the Palaeolithic, Mesolithic and Neolithic. 'A prehistoric human group' for the purposes of this paper means a group from one of these three periods. This is to distinguish such groups from ones that are described in ethnography, i.e. tribes that were described ('recorded') in the last couple of centuries.
2. Caretaking as a Category of Ethical Behaviour
Ethics is a very broad phenomenon, so to keep a coherent narrative I have to limit the scope of this paper. I chose the following type of ethical behaviour as my focus: 'taking care of the weak', or 'caretaking' for short. For the purposes of this paper, it suffices to briefly discuss this concept, so that it can be used as a clear reference point. As will become clear later, caretaking has left a discernible mark in the archaeological record and is often observed in recorded hunter-gatherer tribes as well.
The concept of caretaking, or providing care to those who are vulnerable or in need, encompasses a wide range of behaviours and ethical considerations. In essence, caretaking involves actions aimed at supporting the well-being, growth, and survival of others, which can include both humans and other entities such as creative ideas. Mayeroff (1971) published the first philosophical work focused on the concept of care, and since then there has been a small current of academic publications devoted to the topic. However, care is a bit of a slippery concept. According to one author (Gaut 1981, 19), it has more of "a family of meanings, broad in scope" than one clear definition. A commentator on Mayeroff (Nasr 2019) summarizes his attempt to define care as follows: "Mayeroff defines 'care' as aiding the growth and self-realization of others (including humans or creative ideas) and claims that care can bring order, meaning, and direction to human lives." Unsurprisingly, many authors link their conceptualization of care to the practice of medical caretaking, seeing nursing as its primary example. This gives a clear conceptual frame after all: broadly associating 'care' with 'growth' as well as 'medical treatment' gives us the type of concept we are looking for. Now, what evidence do we have for caretaking in prehistory?
3. Human Evolution: Cognition, Culture and Climate Disasters
In this section, we will take a look at what evolution teaches us about human morality. The main question is: what sets us apart from other animals? And what does this teach us about caretaking in prehistory?
What apes can teach us
To put it briefly, what sets us apart is not ethical behaviour per se, but the social complexity in which it takes place, and the high degree to which we rein in our animal drives in order to behave more ethically. Caretaking happens among non-human animals as well, but we do it for a much wider group of individuals. To understand how this phenomenon came about, I will sketch the context of what scholars have named the rise of behavioural modernity (Sterelny 2014). To be clear, this is still a hotly debated topic, and I can only present a few of the ideas here. There are still many mysteries to be solved.
It is common knowledge that there are many social animals, and that we share some characteristics with them. Chimpanzees, our closest evolutionary relatives, live in groups of several dozen individuals, with a clearly identifiable social order. The basic point of comparing humans to such great apes is that if a trait is shared between us and a closely related species, it probably evolved earlier along the evolutionary line, meaning our prehistoric ancestors shared that trait too. For example, ever since Frans de Waal (1982) studied a colony of chimps in the Arnhem Burgers Zoo, we know that chimp behaviour in a group is strikingly similar to what humans do. It turned out that apes have a sense of morality.
Studying chimps, it soon becomes clear that we deeply share one basic aspect of caretaking: taking care of children, with mothers and fathers feeding and protecting their offspring. It should be obvious that prehistoric people did this too. Babies are highly dependent on caretaking at birth, a rule which holds for all mammals. Perhaps some tribes were as tough on their kids as Spartans, but even they, in the end, took care of them. This gives us the first piece of the puzzle of prehistoric caretaking.
Morality and social networks
However, apes cannot teach us everything. The important question is: where do we differ in our ethical behaviour? What makes humans different? The answer to this question will provide the next piece of the puzzle. The search for the distinguishing trait was described as follows by Hanno Sauer (2023, 49): "Our morals are specifically human. Primates can show us (…) which skills do not explain the core of our morals. If apes possess a certain characteristic, it is automatically disqualified as an explanation (…)" He advocates the view that cooperation in groups larger than those of great apes has been our evolutionary adaptive advantage, the crucial difference between us and them. This idea is called hypersociability.
To illustrate, take the following hypothetical situation. We're all familiar with travelling on an airplane. It's not the most comfortable thing to do, but it works, generally. Several hundred strangers are peaceful and docile while flying. Now imagine that the airplane is full of chimpanzees. In no time, ears and faeces would be flying around, all of them screaming, and by the end of the flight many of them would be dead. Individuals in a small social group would function along similar lines in both species, but working together in larger groups, ones that require extensive altruism and especially stricter behaviour, is something only humans can do. Sarah Hrdy, who first came up with the above example, wrote that it's not that "humans don't display similar propensities toward jealousy, indignation, rage, xenophobia, or homicidal violence. But compared with our nearest ape relations, humans are more adept at forestalling outright mayhem." (Hrdy 2009, 3)
However, from the perspective of competitive evolution, altruism is not the obvious outcome: being selfish pays. Altruism is in fact so difficult to evolve that our moral circle didn't expand beyond the members of our own small groups until quite recently on the evolutionary timescale. There is indeed still much disagreement on how this adaptation came about. The problem lies in the complex origins of our species. Consider the following quote from The Dawn of Everything, in which the authors argue that the differences between humans today are superficial compared to the mosaic of populations from which we originated in Africa:
"Ancestral humans were not only quite different from each other; they also coexisted with smaller-brained, more ape-like species such as Homo naledi. What were these ancestral societies like? At this point, at least, we should be honest and admit that, for the most part, we don't have the slightest idea. There's only so much you can reconstruct from cranial remains and the occasional piece of knapped flint – which is basically all we have. Most of the time we don't even really know what was going on below the neck, let alone with pigmentation, diet or anything else. What we do know is that we are composite products of this original mosaic of human populations, which interacted with one another, interbred, drifted apart and came together mostly in ways we can still only guess at. It seems reasonable to assume that behaviours like mating and child-rearing practices, the presence or absence of dominance hierarchies or forms of language and proto-language must have varied at least as much as physical types, and probably far more.
Perhaps the only thing we can say with real certainty is that, in terms of ancestry, we are all Africans."
(Graeber & Wengrow 2022, 81)
A common, purely Homo sapiens origin can only be spoken of after around 30,000 years ago, which is when the other hominin species had all but died out. However, Homo sapiens had been distinct from its cousins well before this time. The oldest anatomically modern skeletons, i.e. humans with a skeletal build identical to ours, are as old as 300,000 years. But at that point their tools and subsistence methods weren't yet much different from those of the others. From around 100,000 years ago, however, archaeologists start to find evidence of culture: artefacts such as perforated sea shells, ostrich shells and beads used for necklaces, ochre used as pigment for body decoration, clay figurines, and all sorts of new tools. These phenomena are hailed as indicators of the 'rise of behavioural modernity' (Gamble et al. 2011). For around a million years before, other hominin species had made the exact same stone handaxes without any change in design, and pretty much no other tools. Now, within millennia, people started inventing all sorts of new implements and decorations.
On top of the new artefacts, we also find hints of trade networks, as some of those shells or tools start travelling hundreds of kilometres from where they originated. It is in behavioural modernity that we find the next puzzle piece to add to our picture of prehistoric caretaking. The finds I described above have led researchers to theorize that people developed the capacity for a new type of social identity (Sterelny 2014): they expanded their moral and social circle beyond the few dozen individuals with whom they woke up every day. Now, people could form bonds with people they saw only seldom. The necklaces and body decoration might have aided this expanded network as visual markers of identity. People could thus pool efforts and resources with many more individuals than before. As anyone who has studied history or set up a project will know, in cooperation lies the greatest power.
To be clear, I'm rushing through many theories here, and this is just one of many views on the development of humanity towards modern behaviour. Getting back to the main topic, I picked this perspective of social networks because it involves an interesting aspect of caretaking.
Based on the archaeological finds of the late Palaeolithic, some scientists believe there were indeed large networks of hunter-gatherer groups. In the following quote, Graeber and Wengrow make a comparison between prehistoric finds and the networks of Native Americans and Aboriginal Australians. On both continents, many societies were organized in terms of clans with animal names, like Bear Clan or Wolf Clan, and they argue late Palaeolithic Europe must have had a similar situation.
"…we know almost nothing about the languages people were speaking in the Upper Palaeolithic, their myths, initiation rituals, or conceptions of the soul; but we do know that, from the Swiss Alps to Outer Mongolia, they were often using remarkably similar tools, playing remarkably similar musical instruments, carving similar female figurines, wearing similar ornaments and conducting similar funeral rites. What's more, there is reason to believe that at certain points in their lives, individual men and women often travelled very long distances. Surprisingly, current studies of hunter-gatherers suggest that this is almost exactly what one should expect. (…)
In [recent] centuries, forms of regional organization might extend thousands of miles. Aboriginal Australians, for instance, could travel halfway across the continent, moving among people who spoke entirely different languages, and still find camps divided into the same kinds of totemic [groups] that existed at home. What this means is that half the residents owed them hospitality (…). Similarly, a North American 500 years ago could travel from the shores of the Great Lakes to the Louisiana bayous and still find settlements – speaking languages entirely unrelated to their own – with members of their own Bear, Elk or Beaver clans who were obliged to host and feed them.4
It's difficult enough to reconstruct how these forms of long-distance organization operated just a few centuries ago, before they were destroyed by the coming of European settlers. So we can really only guess how analogous systems might have worked some 40,000 years ago. But the striking material uniformities observed by archaeologists across very long distances attest to the existence of such systems. 'Society', insofar as we can comprehend it at that time, spanned continents."
(Graeber & Wengrow 2022, 122-3)
Note that Graeber & Wengrow combine all three viewpoints discussed in this paper: they use an anthropological comparison to explain archaeological findings, which then point to a stage of human evolution. It is a compelling explanation, but we cannot be sure it is true, simply because too much of late Palaeolithic society is lost to confirm it unambiguously. If this view is correct, however, travellers could count on caretaking if they met related tribes, something modern humans facilitated with their hypersociability. And so we would have another aspect of caretaking in prehistory.
Conclusion
Our picture of hunter-gatherer ethics now has two parts: evolutionary studies first showed us that we have no reason to doubt that prehistoric people took care of their children. This first type of caretaking follows from a general, broad theory of behavioural biology. Then, following from another broad theory, namely studies into the development of behavioural modernity, we know that Homo sapiens probably developed large social networks, in which members could count on caretaking when travelling long distances.
I must add as a final note that, compelling as this last explanation is, it is not without issues. Graeber and Wengrow make what is called an anthropological comparison, but there are some caveats with using recorded tribes for such a comparison. In section 5, I will return to this topic in depth.
4. Archaeology: The Puzzling Finds of the Palaeolithic
Before looking at what recent hunter-gatherers can teach us about prehistoric people, let us first ask: is there any direct evidence in the archaeological record for caretaking?
Palaeopathology and its pitfalls
The main point I will make in this section is that it can be inferred from healed bone fractures in skeletal remains that people in the past took care of incapacitated individuals at least some of the time. There is also evidence of a successful type of head surgery named skull trepanation. This leads us to concrete examples of caretaking in prehistory, i.e. medical caretaking. The conclusion of this section is epistemically framed as follows. First, the finds of palaeopathology, though informative in their own way, cannot be generalized. Second, to effectively interpret these finds, we cannot escape ethnoarchaeological comparisons.
Palaeopathology is the study of injuries and diseases of the past. This academic field, using archaeological, biological and cultural data, "provides primary evidence for the state of health of our ancestors." (Roberts and Manchester 2007, 1) Its practitioners have developed an impressive range of analytical methods to 'read' the skeletons that archaeologists find. I will mention three ways in which palaeopathologists identify ailments in prehistoric skeletons: 1) identifying signs of wear and tear, 2) identifying visual or chemical signs of disease and 3) identifying bone fractures, healed or unhealed. But only one of these, the identification of healed fractures, in my view gives fully clear evidence of caretaking. This holds both for simple broken bones and so-called skull trepanations. Why this is so will become clear below.
First, it is important to note that like any science, palaeopathology has its limitations. We have to consider the low representativeness of palaeopathological finds. Roberts and Manchester (2007) write:
"The 'populations' being studied in palaeopathology are dead and therefore may not be representative of the living group; biological anthropologists are dealing with a sample of a sample of a sample … of the original living population (…)." (p. 12)
Basically, we are dealing with a major problem of representativeness. Consider the following. From the entire Upper Palaeolithic (c. 40-12 kya), the fossils of only 804 individuals have been found in Europe (del Amo et al. 2024). Even with the lowest demographic estimate for this period, around 1,500 inhabitants on the continent of Europe (Maier & Zimmermann 2017), this is an extremely small proportion of the period's living population. Many of the finds are incomplete skeletons or even just fragments, but for the sake of argument we can make a rough calculation: if we take generations in this period to last 30 years, the remains of the 804 individuals represent about 0.057% of all individuals who lived during the period, or around 6 people per 10,000.[4] The figure is probably even lower for other continents, since the period before the end of the Ice Age has been most extensively studied in Europe.
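To make the arithmetic behind this figure explicit (a rough sketch, assuming a constant census population of 1,500 and non-overlapping 30-year generations over the period's c. 28,000 years):

$$ 1500 \times \frac{28{,}000\ \text{years}}{30\ \text{years/generation}} \approx 1.4 \times 10^{6}\ \text{individuals}, \qquad \frac{804}{1.4 \times 10^{6}} \approx 0.057\% \approx 6\ \text{per}\ 10{,}000. $$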
Not only is the sample size very small, the remains we find are not an average sample either. This is so because "the differential disposal of males, females, children and people with particular diseases, and their subsequent excavation, means biases in the produced data are inevitable." (Roberts and Manchester 2007, 12) Consider as well that there are methods of disposing of the deceased that leave no trace, such as total cremation, and that burials could have been destroyed by geological processes; thus, certain populations can escape scrutiny altogether. Such a small and biased sample means we cannot make valid generalizations about prehistoric populations based on these remains.
Still, if there is evidence of caretaking in the skeletons we have found, we could at least conclude that caretaking took place some of the time. Though we do not get generalized claims this way, we might obtain another, more piecemeal type of knowledge. To find out what this entails, we first need to dive into the evidence. So how do palaeopathologists 'read' human remains?
Palaeopathological methods of study rely primarily on the visual identification and description of abnormalities in skeletons (Roberts & Manchester 2007, 7). Studying bones is necessarily the main source of information, since ailments that affect only the soft tissues (wounds, some diseases) naturally leave no trace after the soft tissues decompose (Roberts & Manchester 2007, 12-3, 85). But some diseases do leave traces on bones (e.g. arthritis and bone cancer). Because of this, palaeopathology has been able to ascertain the presence of these diseases in prehistory. But it is questionable whether we can determine the attitude towards disease or infer any caretaking from this evidence (Dettwyler 1991). If someone suffered from severe arthritis, the best we can say is that maybe someone took care of them, which we could say about any deceased individual.
Only very rarely are soft tissues with traces of damage preserved, though these rare cases are certainly interesting. Perhaps the most famous example is Ötzi, the Neolithic man who was found in the Alps in 1991 (other prehistoric examples include North-Western European 'bog bodies' and some early Egyptian mummies). Ötzi was found with his skin, clothing and tools almost fully intact, along with fragments of hair. Researchers found a number of deliberately placed cuts on his limbs treated with ground charcoal. These 'tattoos' do not seem decorative. They were probably meant as a type of acupuncture, the charcoal most likely an anti-inflammatory treatment (Kean et al. 2013). Presumably, Ötzi did not make all of these himself, meaning that we find evidence of medical caretaking on his soft tissues. However, not only is Ötzi a unique find, he is also from the Neolithic. As far as I'm able to tell, no human soft tissues have been found from before the end of the last Ice Age, only a few samples of non-human animals.[5] This means that, in terms of soft tissues, we have no data from before the end of the Ice Age, when humans lived solely as hunter-gatherers. Therefore, from here on this paper will focus on the 'hard' evidence provided by bones, of which there are many Palaeolithic examples.
By studying signs of wear and tear on bones, palaeopathologists can deduce the former lifestyle of a person to a remarkable degree. By studying skeletons of people whose lifestyle is known, researchers learn which signs to look for in cases where the lifestyle of the person is not known in advance. The adult skeleton of a hunter-gatherer will show distinctive signs of wear in its joints and bones compared to, say, that of a farmer or bureaucrat. One famous example of this method comes from Berger and Trinkaus (1995), who found that Neanderthal skeletons show bone fractures very similar to those found in modern-day rodeo performers. They deduced from this that Neanderthals must have hunted large animals by engaging with them at close quarters. As we will see below, in an example from the Italian Romito finds, certain signs of wear and tear can contribute to a picture of caretaking as well.
A palaeopathologist can also look for bone fractures (sometimes called trauma) and determine what caused them, similar to how a present-day autopsy is performed. A fracture can be defined as "the result of any traumatic event that leads to a complete or partial break of a bone" (Roberts & Manchester 2007, 89). It can be difficult to distinguish between a fracture that happened just before death and one that happened sometime after, since in both cases the body cannot enter into the healing process. But healed fractures are less problematic, since they are almost always recognizable, and so "[i]t is with the healed fracture (…) that the palaeopathologist is usually concerned (…)." Interestingly, this is so because "the majority of fractures observed from archaeological contexts are healed." (Roberts & Manchester 2007, 89) A fracture heals according to a standard three-phase process with a recognizable timeline. In the cellular phase (3-9 weeks), the damaged bone tissue dies off and is replaced by a soft, new tissue. In the second phase, called the metabolic phase, the soft, immature bone is replaced by more sturdy and mature bone tissue. The final, mechanical phase "contributes over two-thirds of the total healing time and involves realignment and remodelling of bone along the lines of stress." (Roberts and Manchester 2007, 91-2) The remodelling occurs over a span of many years, and full healing time differs per type of bone. Using radiographic images, palaeopathologists can determine how far along this timeline a fracture was at the time of death. In this way they can, for example, tell whether a person lived for many years after the fracture occurred or died shortly after it.
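As a minimal sketch of this reasoning (illustrative only, not a palaeopathological tool: the 3-9 week figure for the cellular phase comes from Roberts & Manchester as quoted above, while the coarse 'months' and 'years' brackets for the later phases are my own assumptions for illustration):

```python
# Illustrative sketch: translate the healing phase identified on a radiograph
# into a coarse estimate of how long the individual survived after the fracture.
# The cellular-phase duration (3-9 weeks) follows Roberts & Manchester (2007);
# the "months" and "years" brackets are assumptions made for illustration.
PHASE_TO_SURVIVAL = {
    "cellular":   "died within roughly 3-9 weeks of the fracture (soft immature tissue present)",
    "metabolic":  "survived on the order of months (immature bone being replaced by mature bone)",
    "mechanical": "survived for years (bone remodelled along the lines of stress)",
}

def read_fracture(phase: str) -> str:
    """Return a coarse elapsed-time reading for an observed healing phase."""
    return PHASE_TO_SURVIVAL[phase]

# A fully remodelled fracture, like the one discussed below for Romito 8,
# implies years of survival after the injury:
print(read_fracture("mechanical"))
```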
Palaeopathology and medical caretaking
I will now discuss two types of healed fractures: those resulting simply from injury and those resulting from successful skull trepanation. Both provide evidence for medical caretaking.
Skull trepanation is a puzzling phenomenon. It is sometimes described as "the oldest surgical procedure known to humanity" (Gross 2021). Hundreds of skulls with distinctive incisions have been found, from widely separated times and places. There is some discussion about the exact starting period of this practice, which perhaps began as early as the Ukrainian Mesolithic (Lillie 1998), but it definitely took off in the Neolithic (i.e. still in prehistory, but during the transition to agriculture). It is unknown why exactly these holes were made, perhaps as a ritual practice or to 'let out the spirits', but a genuine medical treatment is often seen as the most likely explanation. Roberts and Manchester (2007) write:
"The operation involves, for whatever reason, incision of the scalp and the cutting through and removal of an area of the skull. The result is the exposure of the membranes (dura) covering the brain. Survival of the patient, for he or she must be regarded as a patient in this surgical operation, probably depended on the skilful avoidance of perforation of these membranes and the avoidance, either by luck or good judgment, of the major blood vessels within the skull. It should be noted that the proportion of survivors of this operation in antiquity was high (…). The evidence for survival is, of course, the healing and remodelling of bone around the operation site." (p. 124)
Some possible medical reasons for trepanation might be treatment of a previous fracture, e.g. from a blow to the head; relief of intracranial pressure; or treatment of migraine. Given its prevalence, the procedure was presumably effective. One skull from Cuzco, Peru, showed no fewer than seven trepanations, all healed. Whatever its motivation, the trepanation of skulls was a practice that required dedication and skill from fellow humans. Since the skulls show signs of healing, and the practice was popular, we can assume that it contributed to the health and growth of the patients. As such, trepanned skulls provide evidence for instances of medical caretaking in prehistory. Again, we cannot be sure of the statistical implications of these finds, but we have concrete examples nonetheless.
Now I will discuss the other main archaeological source of evidence for medical caretaking: healed fractures resulting from injury.
Broken bones would have been highly problematic injuries in the Palaeolithic. Roberts and Manchester (2007) write:
"Satisfactory healing of fractures in adults is a long process taking perhaps weeks or months, during which time total rest of the injured part is essential. This may not have been too difficult to do in the case of an upper limb fracture in the past, when the afflicted individual would have been able to perform some tasks with the opposite limb. It would have been much more difficult in the case of a person from a small community who fractured a lower limb bone. There must have been little room for 'passengers' in the work-centred groups of the past, particularly in hunter-gatherer populations. In such cases the injured, if he or she survived, would have been dependent for welfare upon kinspeople. And what reorganization of lifestyle of the group would be required if the person was a member of a nomadic group, for surely he or she would be totally incapacitated for perhaps three months with a badly fractured femur [i.e. hip]?" (p. 128; my italics)
The authors highlight here a universal ethical dilemma facing nomadic peoples: do we take care of an injured person, leave them behind or, if necessary, perform euthanasia? The question, from a purely rationalist perspective, comes down to an investment of time and resources. As noted above, many skeletons are found with healed fractures, which means that in these cases the group was apparently willing to make that investment in its members. Whenever we see such a heavy fracture healed, we know that the person had been taken care of.
Note, however, that there is still a basic analogy being made to recently observed phenomena. A bone in the ground is just that, a bone; it might show certain structures or patterns like splinters or seams, but interpreting those as a healed fracture still requires a comparison to medical observations made in recent times. The interpretation of the object as a bone with a healed fracture is thus not a standalone scientific claim like a chemical analysis. The interpretation is based on the proposition that there is a biological continuity between the organism that created the bone and humans living today, whose biological processes have been observed. That proposition is a very reasonable one, since in all likelihood the biological processes that result in healed bone fractures today are basically the same as they were for people in the Palaeolithic. So the conclusion is valid. I merely wish to point out that when interpreting these bones, we cannot escape a comparison similar to the one made in ethnoarchaeology. At the same time, the comparison made in palaeopathology is a lot safer, since the propositions underlying it concern biological processes that are shared by all humans. A bone heals regardless of the cultural or ecological context of its owner, and so palaeopathology avoids the difficulties of continuity that plague the comparisons of ethnoarchaeology.
I will close this section with an example. A clear case of an injured individual who lived past his injury comes from Southern Italy. In the Grotta e Riparo del Romito, nine buried individuals were found dating from 14-10.8 kya, along with engraved rock art and symbolic artefacts (Museo Fiorentino di Preistoria, 2024a). One of the individuals, a male hunter known as Romito 8, had suffered a serious trauma from a fall, paralysing his left arm. Yet, as the archaeologists could tell from his healed fracture, he survived for several years. Since he was, in all likelihood, no longer able to provide all of his own food and materials, his group members must have taken care of him. Moreover, he was probably able to fulfil a contributing role within the community:
"This impairment, however, did not prevent him from living within the group, making himself fit for several years with only the activities permitted by his handicap. The participation of Romito 8 in the life of the community is demonstrated by the exceptional wear of the occlusal surfaces of the teeth, which were used to work leather or soft wood, perhaps to produce baskets or containers. The situation demonstrates, together with other cases from various eras, that in the Palaeolithic the care of disabled people was a frequent practice."
(Museo Fiorentino di Preistoria, 2024b)[6]
As illustrated below, the archaeologists hypothesized that Romito 8 managed to live with his paralysed arm thanks to a splint:
In this hypothetical reconstruction, the artist shows Romito 8 processing twigs with his teeth. The twigs would have been used for basket weaving by other group members. Source: Museo Fiorentino di Preistoria, photo taken by me (March 2024).
Photo of Romito 8's jaw, showing a distinctive worn tooth at (a). Source: https://www.grottaromito.com/it/la-grotta-del-romito/le-sepolture/romito-8
A Lack of Moral Certainty: Dettwyler's Critique of Interpretations of Compassion
Though it is hard to deny that individuals like Romito 8 were taken care of in a minimal sense (since their health condition made it highly unlikely that they could obtain their own food), we should be wary of fleshing out the moral context of this caretaking any further. Anthropologist Katherine Dettwyler (1991) points out that many archaeologists make invalid claims about the moral treatment and social standing of the impaired individuals whose remains they find. Her core point is that in the end we do not know how an impaired person was treated. Some scholars are quick to project sympathy upon these lives. But any of the impaired people we find could have been treated with hatred and ridicule all the same: "I suggest that this dichotomy between survival/compassion and nonsurvival/noncompassion is not at all clear-cut." (1991, 382) As a counterpoint to the previous section, it is worthwhile to discuss Dettwyler's arguments in detail.
Dettwyler's critique is specifically aimed at those archaeologists who "claim that the survival of these individuals provides evidence for compassion and moral decency among prehistoric peoples, who would have had to provide food, medical care (however rudimentary), and perhaps transportation for these "handicapped" individuals." (1991, 376) For example, she discusses claims made about Romito 2, a person with dwarfism and deformed arms found in the same cave as Romito 8. She cites Frayer et al. (1987, 60-1): "…this skeleton provides evidence of tolerance and care for a severely deformed individual in the Palaeolithic (…)." And Gould (1988, 16): "If we consider care of the handicapped (particularly at some cost to caretakers) as a key attribute of humanity, then the Romito people surely practiced compassion at this level." Dettwyler's point is that though some basic caretaking might have taken place, any other moral aspect of the impaired person's life is pure speculation.
The problem stems from the fact that archaeologists only find objects, not the behaviour that created these objects. Sometimes behaviour can be inferred directly from an object: if a mineral that geologically appears only in Swabia ends up at a site in Southern France, we can infer that people must have moved it. But we can make little more than such basic inferences without resorting to speculation, which is prone to bias. Dettwyler argues that interpretations attributing compassion to prehistoric people based on objects alone indeed entail several implicit and biased assumptions. These assumptions are "about the number of nonproductive members normally present in any population, about the abilities of disabled individuals to contribute to society, about the treatment of disabled individuals by other members of the group, and about the "moral rightness" of facilitating the survival of a disabled individual under all circumstances." (1991, 375) I will discuss these points separately below. Not all of these assumptions underlie all claims archaeologists make (in fact, Dettwyler attacks a strawman in assumption 2), but her points provide important footnotes to the discussion. After reading these assumptions, it becomes clear what to avoid when trying to make truthful claims about the ethical life of prehistoric humans.
Assumption 1: The vast majority of a population's members are productive and self-sufficient most of the time (1991, 379-80). When describing the survival of an impaired and nonproductive person as a sign of compassion by group members, archaeologists make it seem as if the presence of a nonproductive person is exceptional and a rare burden on the community. In fact, judging from the ethnographic record, any group of hunter-gatherers usually has quite a few such members: children, the elderly, pregnant women and the sick. This is a relatively safe ethnoarchaeological comparison to make, since any society has children, older members, pregnant women and sick people. Dettwyler concludes that "all successful human groups have extensive experience in taking care of nonproductive members." (1991, 380)
Assumption 2: Individuals who do not show skeletal/fossil evidence of impairments were not disabled (1991, 380). As noted earlier, many impairments of the soft tissues do not show up in skeletons. Dettwyler mentions that e.g. "deafness, blindness (or even near-sightedness), mental retardation, and mental illness have (…) only soft-tissue manifestations (…)." Since skeletons showing impairments are rare and soft-tissue impairments do not show up, "[w]e have no basis for concluding that disabled individuals or disabling conditions were either more frequent or less frequent in the past than they are today." Though this is an interesting point, it might seem like a strawman when looking at the interpretations under discussion. Sure, finds like Romito 2 are seen as exceptional, but none of the scholars Dettwyler cites makes claims about the statistical implications of such finds (at least not in the citations). It seems that Dettwyler rather wants to make the point that the outsized attention given to bone-related impairments gives a wrong impression of the types and frequency of ailments in prehistory.
Assumption 3: A person with a physical impairment is, necessarily, nonproductive (1991, 380). Dettwyler points out that, by contrast, disabled individuals observed in the ethnographic record often try to be as productive as they can in their own way. Though not yet discovered at the time Dettwyler wrote her text, Romito 8, whom we met above, is a good example of how that could work: since he was no longer able to hunt, he used his teeth to process leather and twigs for his group, wearing them down in the process. Caretaking still takes place, but the relation between the giver and receiver of the care is more balanced, since the receiver builds up social credit through his mundane work. Having had a serious injury or two myself, I can relate to the feeling of restlessness that comes with being laid up; when I'm forced to lie down, the main thing on my mind is finding ways to do useful things nevertheless.
Assumption 4: "Survival" of disabled individuals is indicative of "compassion" (1991: 382) When we find evidence of an individual surviving an impairment or injury, we cannot conclude that this person was treated nicely:
"There is a wide gap between "survival" and being treated nicely. For all we know, these individuals were ridiculed, teased, taunted, beaten, treated as slaves, physically and emotionally abused, constantly reminded of their differences and shortcomings, and threatened with bodily harm or abandonment. If they were viewed as a "burden" to the group, they may have been reminded of this daily. They may have had to sit by while others debated whether or not to keep them alive. (…) As with compassion, cruelty and indifference leave few traces in the archaeological record.
I am not suggesting that we ought to take a pessimistic view of earlier human societies or that we know with any certainty that disabled individuals were treated poorly. Unfortunately, the ethnographic record, including that of our own society, supports the more dismal view of human nature more often than it supports the view of society as compassionate. The archaeological record, however, cannot tell us whether disabled individuals were treated with compassion, or with tolerance, or with cruelty." (1991: 382)
Dettwyler again makes use of ethnography to argue against the 'compassionate' interpretations of archaeologists. As discussed in the section on ethnoarchaeology below, it is a slippery approach to cite examples of recently observed behaviour to make claims about the behaviour of prehistoric people. Oftentimes, there are major disanalogies between the populations compared (i.e. prehistoric and recent) in terms of ecology, culture and level of contact with farmers. However, Dettwyler is aware of such epistemic limits. Concerning Romito 2 (the person with dwarfism), she cites Gould (1988, 18): "perhaps his social standing engendered his acceptance; but then we might also conjecture, in direct contrast, that he achieved his high status because his differences were valued or because his deeds or intelligence won respect despite his physical handicaps." Dettwyler comments: "This is certainly possible, but, again, we have no evidence to argue for or against this speculation." (1991, 381) This comment points to the line between speculation and fact, a line that can easily get blurred in archaeological discourse when assumptions and biases creep in. We can certainly speculate about prehistoric cultures based on ethnographic observations, but in the end we cannot be sure that those speculations are true, because the prehistoric populations are gone (and will remain so).
Assumption 5: Providing for, caring for, and facilitating the survival of a disabled individual is always the "compassionate" thing to do (1991, 382-3). When an archaeologist claims that keeping disabled individuals alive is a sign of compassion, kindness or moral decency, it implies that the alternative is necessarily the immoral thing to do. However, this is an ethical standpoint on which people can widely differ. Dettwyler mentions the discussion around impaired newborns in our own society: if a baby has the prospect of a terrible life, isn't it the compassionate thing to kill it or let it die? When attributing compassion to prehistoric people based on skeletal finds, archaeologists make their own ethical judgements seem like scientific reality: "It suggests a certainty about the moral issues in these situations, which as "objective" scientists, we are not justified in claiming." (1991, 383) In light of this, it seems that speculation about the moral milieu around disabled individuals in prehistory should be left to artists and storytellers. I don't think there is anything wrong with wanting to tell a story set in the Palaeolithic, but one has to be aware that there are important differences between establishing a scientific fact and suggesting a life story. A scientific fact gives us knowledge. But the facts given by the science of archaeology alone are usually rather dry when compared to the richness of a human life; they cannot tell us much about it. Unsurprisingly, people are quickly inclined to project a story onto these dry facts. And whereas a fictional story lacks scientific validity, it can work really well on the emotions. If the story is well informed about the science, it can convey some knowledge too. Indeed, as Pruitt (2022) has argued, "archaeologists can better convey information about the past by using well-structured stories that align with how our brains receive and process information. A compelling story engages the aesthetic sense and aids the consumer in empathising with, comprehending and recalling subject matter."
Conclusion
Judging from the fact that some skeletons show healed bone fractures, but taking into account that these are not representative of complete past populations, we can conclude that people were taken care of after breaking a bone at least some of the time. Moreover, skull trepanation gives us clear cases of prehistoric medical caretaking from the start of the Neolithic. The same holds for individuals with other impairments (e.g. dwarfism, stunted arms) that would have prevented them from gathering food for themselves. However, since behaviour does not fossilize, we cannot know for sure whether a disabled person was treated nicely.
Still, in cases of healed fractures, we can state that the group members of the injured person were willing to invest time and resources in healing their group member, even if they otherwise treated them badly. Considering all this, we arrive by induction at an argument for the presence of a basic level of caretaking in prehistory. Just to spell it out clearly, here is the argumentative structure lined up, followed by a minimal formal sketch:
- A fractured bone needs weeks if not months to heal, during which time a person is partially or fully incapacitated.
- To survive while incapacitated, the incapacitated person is highly if not fully dependent on their group members for food, medicine and protection.
- Some prehistoric skeletons show healed fractures.
C. In prehistoric times, some people with bone fractures were taken care of while their injuries healed.
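To make the logical form explicit, here is a minimal formalization sketch (in Lean, with hypothetical predicate names such as healedFracture and caredFor standing in for the premises above; this illustrates only the argument's validity and is not itself archaeological evidence):

```lean
-- A minimal sketch of the argument above, with hypothetical predicates.
example
    (Person : Type)
    (healedFracture caredFor : Person → Prop)
    -- Premises 1 and 2 combined: a healed fracture implies the person
    -- was cared for during the weeks or months of incapacitation.
    (h_care : ∀ p, healedFracture p → caredFor p)
    -- Premise 3: some prehistoric skeletons show healed fractures.
    (h_some : ∃ p, healedFracture p) :
    -- Conclusion: some people with fractures were taken care of.
    ∃ p, caredFor p :=
  Exists.elim h_some (fun p hp => Exists.intro p (h_care p hp))
```

Once the premises are granted, the conclusion follows; the empirical burden lies entirely in supporting the premises.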
The conclusion might not seem a sweeping one. Because the skeleton samples are not representative, we cannot validly make a claim such as "most people with fractured bones were taken care of", let alone "50-80% of people with fractured bones were taken care of". But our inference rests on 'hard' physical evidence, i.e. prehistoric skeletons. Moreover, for anyone interested in recreating the world of a specific tribe in a specific time (say, for a film script), this piecemeal evidence is highly valuable: not a grand theory, but an in-depth look at one individual. A life story can be constructed around a certain skeleton, which is highly useful for educational storytelling (Pruitt 2022).
5. Ethnoarchaeology: Comparing With Care
The use of anthropological or ethnographic data to explain archaeological finds is an approach as old as archaeology itself (Stiles 2001, 42). In fact, almost every text I've come across during research for this paper contained at least one analogy with recorded hunter-gatherer tribes. Making such analogies can be useful, but it is also epistemically problematic. Ethnoarchaeology provides a rich source of hypotheses on medical caretaking in the Palaeolithic, but the claims made about prehistory this way are not easily validated. Comparisons between ancient tribes and recent ones have to be made with caution, as the contexts of these societies can differ greatly. In the end, only very specific, technical approaches to archaeology can avoid making any analogy at all.
Principles of Ethnoarchaeology
According to Stiles (1977, 88) ethnoarchaeology is defined as "the use of ethnographic methods and information to aid in the interpretation and explanation of archaeological data." Formulated more extensively, the aim is (1977, 90) "to make use of the information gathered in the historical present that has relevance in interpreting and explaining archaeologically revealed residues of prehistoric human behaviour. One of the most specific aims is improving the quality of the gathered information to make it more useful to archaeologists in formulating models and applying analogies." To this end, a scholar can undertake field research or (s)he can make use of existing publications. One of the motivations for making the ethnographic analogy that Stiles describes in a later article comes from the field of evolutionary ecology:
"Evolutionary ecology (EE) is based on the neo-Darwinian principle that the behavior of organisms within a given species is determined by natural selection acting on the species over its evolutionary history. Beginning with ethological studies of non-human animals it has extended itself through primates to humans (…). Since humans have spent the vast majority of their evolutionary history as h-gs, EE assumes that h-g behavior today can inform us about human behavioral evolution." (Stiles 2001, 43)
For the archaeologist, it is important that anthropological field studies are aimed specifically at artefacts and the traces they leave. Many field studies do not mention the behaviours that are of interest to the archaeologist, such as object scattering or artefact production. Stiles notes (1977, 93-4):
"By observing how people manufacture, use, and discard artefacts and how they perceive the artefacts from their particular socio-cultural perspective, archaeologists are provided with a wealth of valuable information unobtainable from any other source for the interpretation of archaeological patterning. (…) The overall aim of these studies (…) is to provide the archaeologist with a number of alternative hypotheses with which to test his data. These studies have also shown the variety of ways man can adapt to his environment, and thus an archaeologist must be careful to consider as many alternative explanations as possible of his data."
It can seem that ethnoarchaeology concerns mainly tool use and manufacture, but the field can be taken more broadly. Recorded hunter-gatherer tribes can be used as models in terms of their culture as well. Around many objects a colourful world of morals, behaviours and ideas existed, all of it lost. It is tempting to look at recent cultures to try and form a picture of what might have been. Looking at the anthropological record is equally seductive if we are searching specifically for examples of caretaking. Anthropologists visiting a living tribe observe the full range of caretaking behaviours: not just parents taking care of their children, not just people with broken bones receiving care, but also medicine men performing dozens of treatments, both medical and spiritual, food sharing, and attitudes towards euthanasia, pregnancy, sickness and madness. These observations allow us to start to colour in the schematics given to us by archaeology (as Diamond (2013) is very keen to do). However, the 'cultural analogy' is substantially more difficult than an analogy made with basic tool use. This approach will be discussed in detail below.
Though anthropological studies are the main source of information for the ethnoarchaeologist, Stiles in an earlier article (1977) mentions three other sources, each with their own shortcomings. These are:
1) Early accounts of travellers that show the way of life of preindustrial societies before Western colonizers changed their patterns of behaviour (or, I should add, exterminated them). However, these accounts are unscientific and fragmentary;
2) Museum collections. These can usefully show cultural artefacts and how they are made. However, museums often do not have (or give) information on how the artefacts functioned in a socio-economic system and why they fell into disuse.
3) Experimental archaeology. In this last field of research, scientists try to make (pre)historic artefacts the way they were made at the time. For example, an experimental archaeologist might try to recreate the flint handaxes used by Homo erectus with the materials and techniques available to these hominids. This approach is informative on production methods, but "taken out of the context of the original living milieu no information can be acquired about the social and cultural aspects of the artefacts and their preserved spatial arrangement [i.e. how they would show up in the archaeological record]." (1977, 92)
As stated above, the clearest analogies can be made with tools. A good example is the spear thrower, basically a stick with a hook, used to throw a spear further and harder; effectively, it lengthens the arm of the thrower. Until recently the spear thrower was the main hunting weapon of Aboriginal Australians, who know it as a woomera. Now, many hooked sticks have been found at prehistoric European sites, some richly decorated with animal carvings. But for a researcher unfamiliar with spear throwers, the use of the artefact is not obvious from its design (compare this with a jar, which is, I presume, always used to contain things); the researcher would be left with a mysterious object. The Aboriginal Australians' use of the woomera, however, enables the researcher to make a model or analogy. Since the pre-Ice Age artefact is very similar in shape and size to the Australian woomera, the artefact is interpreted as a spear thrower.
An Aboriginal hunter throwing a spear with a woomera, south of Arnhem Land (1983). Source: yarn.com
A hooked stick carved out of deer antler, decorated with an ibex, found at the Mas d'Azil Cave. It is interpreted as a spear thrower like the woomera. Source: donsmaps.com
The Ethnographic Analogy and its Pitfalls
Making a comparison with anything more complex than a tool like the spear thrower brings major difficulties. The key concept for ethnographic analogies is the context of the society that is used as a model. If an archaeologist wants to make a claim about a prehistoric society by using observations made of a recent, recorded tribe, (s)he has to take the social, historical and ecological context of both groups into account (Stiles 2001). Since these variables can differ greatly between any two tribes (let alone ones that lived tens of thousands of years apart, or on different continents), any comparison can soon run into problems of validity.
Let's first return to our simple example: the decorated hooked stick from the Mas d'Azil cave. I oversimplified the comparison at first in order to convey the basic logic of ethnoarchaeology. We can in fact doubt the comparison with the Aboriginal woomera based on shape and size alone. Couldn't the decorated stick be simply that, a decorated stick, used for unknown rituals instead of hunting? The object itself does not speak to us unambiguously, and there is no contemporary text to argue for its use. Now let's take some more context into account. The Aboriginal Australians used the woomera as a hunting weapon, i.e. to throw spears at medium to large-sized animals. As far as we know, the people living in pre-Ice Age France also subsisted by hunting, meaning that the decorated hooked stick pictured above was made by a society that hunted. Intact spears from the Palaeolithic are extremely rare, but judging from the signs of wear on flint points, no one doubts that spears were common weapons in the era. This all makes the analogy with the Australian woomera more likely: we are looking at two societies that hunted animals with spears. By taking context into account, the interpretation of the decorated hooked stick is now better substantiated.
To systematize this approach, Stiles (2001) proposed a conceptual framework for judging whether a recent hunter-gatherer society can serve as a valid model for prehistoric human behaviour. He developed (2001, 41) "[a] method for classifying hunter-gatherer groups according to progressive stages of historical contact and interrelations with agricultural neighbours." In short, the more a hunter-gatherer (hg) tribe has been in contact with agriculturalists (ags), the less it can be used as an analogy: its culture, social system and subsistence methods slowly change as it adapts to the growing power of the agriculturalists. Stiles' "Contextual Classification System" has six stages, starting with stage 0; a schematic sketch follows the stage descriptions below.
Stage 0: Precontact – the hgs have had no contact with ags (2001, 44-5). These societies "will obviously be most representative of prehistoric socio-economic systems, at least of modern Homo sapiens." However, by the 20th century there were probably no precontact tribes left, and they were not well described at the time. Some groups that come close to a stage-0 hg system are hinterland Australian Aborigines, some North and South American groups and the Andaman Island hgs. Though not all groups came into direct contact with industrialized nations in the 20th century, most likely all of them had at least encountered local ags. As noted above, early accounts of travellers might describe 'true' precontact groups, but these accounts are unscientific and fragmentary, so they do not provide a clear picture. If an early anthropologist did describe such a society, his account, according to Stiles, must have "consisted of postcontact reconstructions." Nevertheless, early accounts give us enough information about stage-0 hgs to present a problem of generalization: "It appears that there were significant variations in the degrees of egalitarian norms, sharing, territoriality, mobility, settlement patterns, group sizes, kinship systems, social organization and so on (…)."[7]
Stage 1: Contact – Hgs have their first encounters with ags, characterized as hostile or at least unfriendly and mistrustful (2001, 45). "Stage 1 h-g systems are still traditional at the time of contact, but they can change very rapidly, depending upon the intensity of interaction with the newcomers. (…) Coastal Australian Aborigines, Tasmanians, and most Native Americans were stage 1 with respect to European settlers up to the early 19th century."
Stage 2: Sporadic exchange – the hgs establish friendly relations with the encroaching newcomers, though they still keep to their original culture. They do so because the ags are almost always more powerful: they have more food, better technology, and simply more people. The hunters start to trade for desired goods, such as iron tools, pottery, cloth, beads, tobacco, tea and grain. The San people of the Kalahari desert were in this stage from the 17th to the 19th century.
Stage 3: Accommodation – the trade links get stronger, and cultural exchange gets under way as well. The hunter-gatherers thus end up in a socio-economic position subordinate to the farmers; Stiles writes of a formalization of a client-patron relationship. Intermarriage starts to happen, and hunter-gatherer groups start to settle near farmers for parts of the year. The hunters are seen as a steady source of 'wild goods' the farmers themselves cannot obtain, like certain meats or plants. Moreover, the cultural exchange leads to an ideological opposition, whereby the agriculturalists define the hunters as 'wild' and themselves as 'civilised'. The Inuit were in this stage until about 1960.
Stage 4/5: Acculturation – Assimilation. The hgs are increasingly acculturated and eventually assimilated into the farmer communities. Sometimes they switch to farming themselves; sometimes they are annihilated. In some cases they remain in the same subordinate position to farmers indefinitely, as has happened to some hunter-gatherer groups still active today, like the San in the Kalahari desert.
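For readers who find a schematic summary helpful, here is a minimal illustrative sketch of the classification (in Python; the stage names follow Stiles 2001, but the heuristic function and the example values are my own hypothetical additions, not part of Stiles' framework):

```python
from enum import IntEnum

class ContactStage(IntEnum):
    """Stiles' (2001) Contextual Classification System: stages of
    hunter-gatherer (hg) contact with agriculturalists (ags)."""
    PRECONTACT = 0         # no contact with ags
    CONTACT = 1            # first, often hostile or mistrustful encounters
    SPORADIC_EXCHANGE = 2  # friendly trade; original culture intact
    ACCOMMODATION = 3      # client-patron ties, intermarriage
    ACCULTURATION = 4      # culture increasingly absorbed by the ags
    ASSIMILATION = 5       # fully absorbed into farming communities

def analogical_value(stage: ContactStage) -> float:
    """Toy heuristic (mine, not Stiles'): the later the contact stage,
    the weaker the group's value as a model for prehistoric behaviour."""
    return 1.0 - stage / 5

# Hypothetical examples based on the stages described above:
print(analogical_value(ContactStage.PRECONTACT))     # 1.0: the best model
print(analogical_value(ContactStage.ACCOMMODATION))  # 0.4: e.g. the Inuit up to ~1960
```

The point of the sketch is only that analogical value decreases with contact stage; Stiles himself offers no such numerical measure.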
Given the ways in which the context of a hg group can vary, it should be clear that we cannot use just any recent group as an analogy for prehistoric groups. There are differences in contact with agriculturalists, but also differences in ecological context. Consider that the San, like many other recent groups, live on the periphery of the habitable earth, in areas that were never exploited by farmers simply because they are unfarmable. Any hunter-gatherer group living before the advent of agriculture would almost certainly have preferred the richest and most hospitable areas. What you would get in such cases are probably not small groups like those of the San, but larger societies; Graeber and Wengrow are keen to use the societies that existed on the Pacific Coast of North America as a comparison.
Indeed, some consider the shortcomings of the anthropological comparison so serious that they reject the discipline altogether. Gosselain, in an article aptly named To hell with ethnoarchaeology! (2016), argues for exactly that. I would agree that claiming a certain tribe represents an unchanged way of life from 30,000 years ago is just plain wrong: it is not just that the context is different, there is simply no way cultural continuity can stretch that far back.
In the end, what is valuable in anthropology for reconstructing prehistory is not that we find living fossils and hold them up against the light. Instead, recent hunter-gatherers give us hypotheses to test; they present us with ways of thinking about hunter-gatherer life that our agricultural, industrial brains could not have figured out by themselves (Stiles 2001; Graeber and Wengrow 2021). Anthropological observations introduced science to things like seasonal patterns of living, certain spiritual beliefs, and many types of social and political experimentation. You cannot reason your way to such things from stones and bones alone.
Conclusion
The use of anthropological and ethnographic data to interpret archaeological findings, though long-standing, presents significant epistemic challenges. Ethnoarchaeology, which leverages ethnographic methods to interpret archaeological data, provides hypotheses about Palaeolithic caretaking, but these claims are difficult to validate due to the differing contexts of ancient and recent societies. Stiles' framework helps judge which recent groups can serve as models, yet the variability among hunter-gatherer groups and their histories of contact with agricultural societies complicate direct comparisons. Therefore, while anthropological studies offer insights into social behaviours, including caretaking, they should be used cautiously, serving as sources of hypotheses rather than definitive models of prehistoric life.
6. Conclusion: a patchwork of prehistoric ethics
In examining the ethics of prehistoric people, particularly through the lens of caretaking, it becomes clear that a multifaceted approach is necessary. The evidence, though fragmentary, offers valuable insights when pieced together. Evolutionary studies indicate that prehistoric humans likely exhibited caretaking behaviours similar to those observed in modern hunter-gatherer societies, particularly in terms of caring for children and injured individuals. Archaeological findings, such as healed fractures and evidence of medical procedures like trepanation, suggest that prehistoric communities invested time and resources in the care of their members. However, interpreting these findings requires caution, as the archaeological record is incomplete and biased towards those individuals whose remains have been preserved.
Ethnoarchaeology, while providing a rich source of hypotheses, must be approached with care due to the significant differences in context between ancient and modern societies. Analogies drawn from contemporary hunter-gatherer groups offer a window into potential prehistoric behaviours but are not definitive. These analogies should serve as starting points for hypotheses rather than conclusive evidence.
Ultimately, the study of prehistoric ethics through caretaking behaviours presents a complex but illuminating picture. By integrating data from evolutionary biology, archaeology, and ethnoarchaeology, we can form a more nuanced understanding of how prehistoric humans might have cared for one another. This approach highlights the inherent difficulties and limitations in drawing definitive conclusions, but also underscores the value of interdisciplinary research in piecing together the mosaic of our distant past. Through this patchwork of evidence, we gain glimpses into the ethical lives of our ancestors.
- Allaby, M. (2013). A dictionary of geology and earth sciences (4th ed.). Oxford University Press. ISBN 9780199653065. Retrieved September 17, 2019.
- Blanckaert, C. (2022). Un autre monde ethnique: L'homme de Cro-Magnon, l'idée de progrès et les dialectiques de la modernité en préhistoire. Revue d'histoire des sciences, 75(1), 71-104.
- Dettwyler, K. A. (1991). Can Palaeopathology provide Evidence for "Compassion"? American Journal of Physical Anthropology, 84, 375-384.
- Diamond, J. (2013). The world until yesterday: What can we learn from traditional societies? Penguin.
- Fagan, B. (2011). Cro-Magnon: How the Ice Age gave birth to the first modern humans. Bloomsbury Publishing USA.
- Fry, S. T. (1990). The philosophical foundations of caring. Ethical and moral dimensions of care, 13-24.
- Gamble, C., Gowlett, J., & Dunbar, R. (2011). The Social Brain and the Shape of the Palaeolithic. Cambridge Archaeological Journal.
- Gosselain, O. P. (2016). To hell with ethnoarchaeology! Archaeological Dialogues, 23(2), 215-228.
- Gould, S. J. (1988). Honorable men and women. Natural History, 3, 16-20.
- Graeber, D., & Wengrow, D. (2021). The dawn of everything: A new history of humanity. Penguin UK.
- Hitchcock, D. (2024). Schöningen spears and throwing sticks. donsmaps.com https://www.donsmaps.com/schoningen.html (last visited June 16, 2024).
- Hrdy, S. B. (2009). Mothers and others. Harvard University Press.
- Kean, W. F., Tocchio, S., & Kean, M. et al. (2013). The musculoskeletal abnormalities of the Similaun Iceman ("Ötzi"): clues to chronic pain and possible treatments. Inflammopharmacology, 21, 11–20. https://doi.org/10.1007/s10787-012-0153-5
- Lillie, M. (1998). Cranial surgery dates back to Mesolithic. Nature, 391, 854. https://doi.org/10.1038/36023
- Maier, A., & Zimmermann, A. (2017). Populations headed south? The Gravettian from a palaeodemographic point of view. Antiquity, 91(357), 573-588.
- Museo Fiorentino di Preistoria (a). La grotta e il riparo [The cave and the shelter]. grottaromito.com. https://www.grottaromito.com/it/la-grotta-del-romito/il-sito/la-grotta-e-il-riparo (last visited June 16, 2024).
- Museo Fiorentino di Preistoria (b) Ricostruzione ipotetica delle attività di Romito 8 (display case). Museum visited in March 2024.
- Nasr Isfahani, M. (2019). Book Criticism: On Caring by Milton Mayeroff. Pizhuhish nāmah-i intiqādī-i mutūn va barnāmah hā-yi ̒ulūm-i insāni (Critical Studies in Texts & Programs of Human Sciences), 19(1), 287-305.
- Nelson, H., & Jurmain, R. (1988). Introduction to physical anthropology. St. Paul, MN: West Publishing Company.
- Pruitt, T. (2022). Engaging the science of storytelling in archaeological communication. Archaeological Review from Cambridge, 37(2).
- Roberts, C. A., & Manchester, K. (2007). The archaeology of disease. Cornell University Press.
- Robson, B., & Baek, O. K. (2009). The engines of Hippocrates: From the dawn of medicine to medical and pharmaceutical informatics. John Wiley & Sons.
- Sauer, H. (2023). Moraal. Amsterdam, De Bezige Bij.
- Souvatzi, S. G., Baysal, A., & Baysal, E. L. (Eds.). (2019). Time and history in prehistory. Routledge.
- Sterelny, K. (2014). A paleolithic reciprocation crisis: symbols, signals, and norms. Biological Theory.
- Stiles, D. (1977). Ethnoarchaeology: A discussion of methods and applications. Man, 87-103.
- Stiles, D. (2001). Hunter-gatherer studies: The importance of context. African Study Monographs. Supplementary Issue, 26, 41-65.
- Verano, J. W. (2016). Differential diagnosis: trepanation. International Journal of Paleopathology, 14, 1-9.
- Verano, J. W., & Finger, S. (2009). Ancient trepanation. In Handbook of Clinical Neurology (Vol. 95, pp. 3-14). Elsevier.
- Waal, F. de (1982). Chimpanzee politics: Power and sex among apes (25th anniversary ed., 2007). Baltimore, MD: JHU Press. ISBN 978-0-8018-8656-0.
- Willis, K. J., & Van Andel, T. H. (2004). Trees or no trees? The environments of central and eastern Europe during the Last Glaciation. Quaternary Science Reviews, 23(23-24), 2369-2387.
[1] Cited from Dettwyler (1991, 375)
[2] Allaby, M. (2013). "Ice ages" in A dictionary of geology and earth sciences (4th ed.). Oxford University Press. ISBN 9780199653065. Retrieved September 17, 2019.
[3] The fact that these ages overlap points to the problematic nature of generalizing about the deep past: because we are talking about huge swathes of time, but little evidence remains of what humans did, we risk losing sight of the diversity of human societies in the past. This point will be discussed extensively in the section on ethnoarchaeology.
[4] Assumptions are 1) generations lasting 30 years and 2) the Upper Palaeolithic lasting 28,000 years. The calculation is as follows:
28,000 / 30 ≈ 933.33 generations
933.33 × 1,500 = 1,400,000 people in the living population
The percentage of skeletons found from this population is (804 / 1,400,000) × 100 ≈ 0.057%
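For transparency, here is the same calculation as a minimal script (in Python; all input values are the assumptions stated in this footnote, not independent data):

```python
# Assumptions stated in this footnote:
upper_palaeolithic_years = 28_000  # assumed duration of the Upper Palaeolithic
generation_years = 30              # assumed length of one generation
population_per_generation = 1_500  # assumed living population per generation
skeletons_found = 804              # skeletons recovered from the period

generations = upper_palaeolithic_years / generation_years    # ~933.33
total_population = generations * population_per_generation   # ~1,400,000
percentage_found = skeletons_found / total_population * 100  # ~0.057%

print(f"{generations:.2f} generations")
print(f"{total_population:,.0f} people in the living population")
print(f"{percentage_found:.3f}% of skeletons found")
```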
[5] Many almost fully intact mammoths have been found in the Siberian permafrost. One exceptionally well-preserved woolly mammoth was the Yuka Mammoth: discovered in 2010, it lived around 39 kya (so well before the end of the last Ice Age). Its remains included soft tissues, fur, and the brain.
[6] Translated from Italian by Google Translate.
[7] This aligns well with the view proposed by Graeber & Wengrow in The Dawn of Everything. It is also interesting to mention the Inuit, who felt their culture had not changed except for their hunting weapons; Stiles mentions them as an exception to the rule.