Author: Wahid Ahmad
Millions of years ago, early human ancestors of the Pliocene and early Pleistocene likely faced threats from large predators. However, direct evidence—such as tooth marks or other signs of carnivore attacks on their bones—is uncommon. Known examples include bite marks on fossils of Australopithecus anamensis, Australopithecus africanus, and Paranthropus robustus, as well as crocodile bite marks on Homo habilis fossils.
A 1.45-million-year-old
human tibia (or shinbone) from Turkana, Kenya, shows signs of being cut with
stone tools. These marks were found on a well-preserved part of the bone and
are consistent with butchering, similar to what’s seen on animal bones from the
same area.
The bone’s identity is
uncertain—it may belong to Homo erectus, Homo habilis, or another
early human species—so it's referred to simply as a hominin.
The marks suggest the
body was processed for meat, either due to starvation or as part of the diet.
Two tooth marks, likely from a large carnivore, were also found, but it's
unclear if the animal fed on the body before or after the butchering.
This is one of the
earliest possible signs of human flesh consumption, though such evidence is
rare and hard to interpret without more context.
Human cannibalism—the act of eating the flesh of another human, also termed anthropophagy—has been practiced throughout both prehistoric and historic periods by different human species, including Homo sapiens. Cannibalism has taken place in many
different contexts—social, political, economic, and religious—across various
cultures and regions. Archaeological, historical, and ethnographic evidence
indicates that cannibalism was a complex behavior with a variety of purposes,
including survival, ritual, and nutritional reasons.
There are three main
types of cannibalism. Exocannibalism
involves eating individuals from outside one’s group, often enemies. Endocannibalism refers to eating
members of one’s own community, often as part of funerary rituals. Survival cannibalism happens in
extreme situations like famine, where eating human flesh is necessary to stay
alive. These categories help researchers interpret archaeological findings,
although the lines between them can be blurry.
In Europe, evidence of
prehistoric cannibalism ranges from the early Pleistocene to the Iron Age.
Human bones found at archaeological sites often show signs of intense
processing, such as cut marks for defleshing, broken bones for marrow
extraction, and even tooth marks—strong indicators that the remains were
consumed. Despite this, there is debate among researchers about whether these
acts were primarily ritualistic or driven by nutritional needs.
Controversy over the
historical reality of cannibalism remains. In the 20th century, some scholars argued that reports of cannibalism were exaggerated or fabricated, often used by European colonizers to justify conquest and slavery. For example, Queen Isabel of Spain once decreed that only Native Americans who were cannibals could be enslaved. Cannibalism may well have occurred during times of famine, but some question whether such acts of necessity should be called cannibalism at all.
Although today
cannibalism is considered taboo and often linked to mental illness, traces of
it remain in culture and religion. Fairy tales like Hansel and Gretel
depict it as a dark myth, and Christian rituals like the Eucharist symbolically
represent the consumption of human flesh and blood through bread and wine. This
shows that cannibalism, while largely rejected, still holds symbolic meaning in
modern society.
The oldest known case
of cannibalism comes from the TD6 level of the Gran Dolina site in Sierra de
Atapuerca, Spain. This site, dating to the end of the early Pleistocene,
contains remains of at least 11 individuals from the species Homo antecessor.
These remains include mostly children and a few young adults. The bones were
scattered throughout the cave and mixed with animal bones and stone tools.
About 45% of the human bones showed signs of cutting, breaking, and biting, indicating cannibalistic activity. These modifications suggest that the bodies were skinned, dismembered, and defleshed, that skulls were broken open to extract the brain, and that long bones were cracked for marrow.
Researchers debate the
reasons behind this cannibalism. Some argue it was not due to starvation but was instead a routine dietary practice, termed "gastronomic" or "cultural" cannibalism. Ongoing
excavations show that such acts happened repeatedly, likely as part of a
tradition among groups living in the cave. The high number of child remains led
some scholars to compare this pattern to chimpanzee behavior, where weaker
individuals are often targeted in intergroup conflict. This suggests the
cannibalism could be related to territorial disputes or resource competition,
though it's too early to call it "warfare" in the modern sense.
The TD6.2 assemblage
has been interpreted in various ways—nutritional, cultural, or exocannibalism.
However, researchers agree it involved repeated events without symbolic burial,
likely tied to conflicts over land and resources.
Another early case of
cannibalism comes from the Arago Cave in Tautavel, France. Here, remains of at
least 30 individuals from the Middle Pleistocene have been found. These remains
showed signs of systematic bone breakage and cut marks made while butchering.
Only specific body parts—like skulls, limbs, and the pelvis—were found, while
bones from the torso, hands, and feet were mostly missing. This selective
treatment of the bodies led some researchers to suggest that the cannibalism
may have had ritualistic elements.
From about 130,000 to
40,000 years ago, there is strong evidence that Neanderthals engaged in
cannibalism at several sites across Europe. At Moula-Guercy in France,
researchers found remains of six individuals with cut marks and broken bones,
indicating that their bodies were butchered in the same way as animals. These
human bones were mixed with animal remains and tools, suggesting that
Neanderthals removed flesh and marrow for consumption. This site is considered
a clear case of cannibalism, though no specific reason—such as ritual or
survival—has been confirmed.
Human
and animal bones from Goyet Cave in Belgium offer important insights into Neandertal behavior. Researchers identified at least five individuals—four
adults or adolescents and one child. Long bones like tibias and femurs were the
best preserved.
Radiocarbon
dating placed these individuals between 44,000 and 45,500 years ago. Many of
the bones showed cut marks, marrow extraction damage, and signs of being used as
tools, indicating they were processed similarly to animal carcasses. This
strongly suggests cannibalism, likely for survival or ritual purposes.
Comparable patterns were seen in horse and reindeer bones from the cave, though
Neandertal bones had more percussion marks due to their density. There were no
signs of burning, and the preservation of DNA makes extensive cooking unlikely.
This
is the first confirmed evidence of Neanderthal cannibalism in Northern Europe.
As no modern humans were present at the time, other Neandertals likely carried
out the processing. Whether the use of bones as tools had symbolic meaning
remains unclear.
Although
the remains are from the same era as certain stone tools, poor excavation
records prevent linking them to a specific culture. Other nearby Neandertal
sites show different treatment of the dead, highlighting the behavioral
diversity among late Neandertal groups—ranging from possible burials to
cannibalism—despite their genetic similarities.
In
Spain, at El Sidrón Cave, the remains of at least 13 Neanderthals were found
with similar signs of human processing—cut marks and smashed bones. Unlike
Moula-Guercy, this site had very few animal bones, making it unusual. The
evidence suggests survival cannibalism, likely during a time of food shortage,
though detailed studies are still needed to confirm this.
Krapina
in Croatia presents a more debated case. Over 800 Neanderthal bones were found,
and while some researchers believe the bones were cleaned for burial, others
argue the cut marks and broken bones show clear evidence of cannibalism. Some
even found possible human tooth marks, adding to the idea that the bodies were
eaten.
Other
sites, such as Pradelles in France and Boquete de Zafarraya in Spain, also show signs of cannibalism—cut marks and fractures on bones—but provide little additional
context. At Combe-Grenal, there’s disagreement about whether the cut marks came
from funerary practices or cannibalism, though many argue that, due to
similarities with animal remains, cannibalism is more likely.
Neanderthal
cannibalism appears to have been practiced for various reasons, most often
likely for nutrition, but possibly also for cultural or ritualistic purposes.
However, without more evidence, especially symbolic artefacts or burial
structures, it is difficult to determine their exact motivations.
From
the Upper Palaeolithic to the Bronze Age, there is evidence that anatomically
modern humans in Europe practiced cannibalism, although the reasons and nature
of these acts are not always clear. Human remains from this period, especially
before the Magdalenian Era, are often very fragmented and unusual. In France,
it’s estimated that 40% of Magdalenian human remains show signs of being cut or
butchered, while only 5% were found in formal burials. These signs include
slicing and scraping marks that suggest the bodies were defleshed, possibly as
part of funeral rituals.
However,
because remains are scarce, it’s hard to say whether this involved eating the
bodies or just processing them for other reasons.
At Santa Maira in Spain, human bite marks on ribs suggest that at least some body parts were
eaten, though it's not clear if this was done in a ritual or simply for food.
Another site, Le Placard Cave in France, has nine skulls with cut marks and
intentional breaks, suggesting they were made into skull cups. This points
toward ritual defleshing, though the full meaning is still debated, especially
since new animal bones found at the site may change previous interpretations.
Gough’s Cave in
Britain is another key example. Dated to around 14,700 years ago, it contains
human remains with clear signs of both eating and ritual treatment. Skulls were
carefully shaped into cups, and tooth marks were found on bones, showing they
were chewed. Researchers believe this is a strong case of ritual
cannibalism—meaning the bodies were both eaten and treated in a meaningful or
symbolic way.
The Mesolithic site
of Grotte Perrats in France also shows clear evidence of cannibalism. The bones
of at least eight people were found with many cut marks and broken bones,
similar to how animal bones were processed for food. Over 40% of the human
bones showed signs of cutting, breaking, and even scalping. The researchers interpret this as cannibalism but do not commit to whether it was driven by survival, ritual, or aggression toward outsiders.
The Brillenhöhle
site in Germany includes human bones with a high number of cut marks,
particularly on foot bones. Although one interpretation proposed these were
from secondary burials, later studies found human bite marks and signs of
marrow extraction. These findings support the idea that the individuals were
consumed, likely through cannibalistic practices.
In the Neolithic
period, the Fontbrégoua site in France revealed remains from 13 people
processed like animals. Notably, skulls, hands, and feet were missing from some
bone piles, possibly indicating war trophies or ritual use. Similarly, at
Herxheim in Germany, the remains of over 1,000 people showed signs of cutting,
bone breaking, cooking, and even human tooth marks. Some skulls were made into
cups. While some scholars suggested complex funeral rituals, others concluded
it was exocannibalism during wartime, supported by evidence like strontium
isotopes showing distant origins of some individuals.
Additional
Neolithic examples from Spain, such as the Cueva de Malalmuerzo and Cueva de
Carigüela, show clear parallels with earlier sites. Human remains were found
mixed with animals, bearing cut marks and deliberate bone breakage, again
pointing to cannibalism. Skull cups further support this interpretation.
During the Bronze
Age, Cueva del Mirador in Spain provides evidence of what was originally
interpreted as brain extraction for food, but this may have also had a ritual
aspect. Many bones show signs of boiling, cut marks, and human bites, strongly
suggesting cannibalism occurred. Other proposed Bronze Age cases from Central
Europe lack detailed analysis, making firm conclusions difficult.
By the Iron Age,
cannibalism appears much less common. A few UK sites show human bones with cut
marks and green bone breakage, but no thorough studies have been done. In later
history, cannibalism became a social taboo, associated with barbarism. Most
modern European cases of cannibalism relate to extreme necessity or mental
illness rather than cultural practice.
Defining
cannibalism in prehistoric Europe is complex, especially when trying to
identify it in archaeological contexts or understand its causes. Some scholars
attempt to distinguish between “anthropophagy” and “cannibalism.”
Anthropophagy refers to
occasional acts of eating human flesh, possibly by individuals, while
cannibalism is seen as a cultural or social practice that may involve group
participation. However, many researchers argue that these terms are essentially
synonymous in archaeology and should be treated as such. The simplest
definition, which most agree on, is the consumption of human tissues (flesh, marrow, blood, and so on) by other
humans, though pinpointing the motivations behind this behavior is far more
difficult.
Historically, scholars have categorized cannibalism into several types: gastronomic cannibalism, for food value; ritual cannibalism, for spiritual purposes; and medicinal cannibalism, the use of human tissue to treat illness. Survival cannibalism involves eating human flesh in conditions of extreme hunger, while aggressive cannibalism covers revenge-driven or antisocial acts. Dietary cannibalism, focused purely on nutrition, is often considered the easiest type to identify archaeologically.
Cannibalism is not a single, uniform
practice but a complex behavior influenced by social, religious, political, and
economic factors. It can be grouped into
two main types: exceptional cannibalism, driven by immediate needs like
survival, and socially-instituted cannibalism, which is embedded in
cultural practices such as rituals, warfare, or beliefs about death.
There are many subtypes of
cannibalism, including ritual, medicinal, self-cannibalism, legal, symbolic,
and even "gourmet" cannibalism. These categories, often based on
ethnographic or historical data, reflect a wide range of motivations—from honouring
the dead to humiliating enemies. However, such classifications are difficult to
apply to prehistoric cases where motives can't be directly observed.
Ethnographic studies show that
cannibalism often relates to a group’s worldview, spirituality, and social
customs. For example, in some societies, eating human flesh may have been part
of managing life and death or expressing dominance over enemies. These acts
weren’t always considered barbaric but were integrated into social and
religious life.
In archaeology, it's challenging
to determine why cannibalism occurred because many of these complex cultural
meanings leave no physical trace. Analogies from ethnographic studies can help,
but they must be used carefully, as prehistoric societies might not have had
the same symbolic systems.
Terms like
"nutritional" or "gastronomic" cannibalism are often used
to describe cases where the primary goal seems to be food. However, these
labels can be misleading. Eating human flesh always involves some nutritional
value, but that doesn’t rule out symbolic or ritual aspects. Even so-called
nutritional cannibalism might follow social rules or customs, blurring the line
between practical and ritual behavior.
Examples from prehistoric Europe,
such as the Gran Dolina site and Herxheim, suggest cannibalism linked to
intergroup violence. While Gran Dolina might show survival-based or violent
cannibalism, Herxheim displays signs of more structured, possibly ritualized
practices that reflect cultural and symbolic meanings.
The identification of cannibalism
in prehistoric European contexts relies heavily on taphonomic analysis—the
study of processes affecting organisms after death, particularly bone
modifications. Archaeologists distinguish cannibalism from other cultural
practices (like funerary rituals or mutilation) by identifying specific
anthropogenic changes such as cut marks, bone breakage for marrow extraction,
human tooth impressions, cooking evidence, and spatial associations with animal
remains processed in similar ways. While these signs can sometimes overlap with
those resulting from ritualistic or mortuary practices, the consistency and
pattern of modifications provide strong indicators of cannibalistic behavior.
There has been considerable
debate within the academic community about interpreting such evidence. Some
argue that mortuary practices can leave similar marks to those attributed to
cannibalism, citing ethnographic parallels like defleshing rituals. Others
counter that these claims ignore critical contextual distinctions and fail to
account for parallels in the treatment of human and animal remains. When human
bones are processed identically to food animals—defleshed, broken for marrow
extraction, and even cooked—the evidence points more convincingly to
nutritional cannibalism.
Still, the interpretation remains
complex. Sites like Brillenhöhle and Fontbrégoua illustrate this tension. Some
scholars have dismissed cannibalism at Brillenhöhle, reading the cut marks as traces of secondary burial, while others argue that the intensity and nature of the
modifications indicate consumption. Fontbrégoua is frequently cited as a robust
case for prehistoric cannibalism due to its extensive evidence of butchering.
However, even this interpretation has been questioned, highlighting the
subjectivity and evolving nature of taphonomic interpretations.
To refine
identification, researchers have developed methodological frameworks focusing
on the type, frequency, and anatomical location of bone modifications. Studies
compare human remains to those of animals processed at the same site. Human
tooth marks, although shallow and sometimes hard to differentiate from those of
other carnivores, are considered strong evidence when found alongside cut marks
and perimortem bone fractures. Their presence in sites such as Gough’s Cave and
El Mirador strengthens arguments for cannibalism.
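As a purely illustrative aid, the sketch below (in Python) shows one way the frequency-based criteria just described could be tallied for a single assemblage: modification frequencies are computed separately for human and faunal specimens, and a case is flagged only when several independent indicators co-occur. Everything here is hypothetical: the Specimen record, the modification labels, the toy data, and the 20 percent cut-mark threshold (echoing the figure discussed below) are assumptions for demonstration, not a published protocol.

```python
from collections import Counter
from dataclasses import dataclass, field

# Hypothetical record for one bone specimen and the perimortem
# modifications observed on it (all names are illustrative).
@dataclass
class Specimen:
    taxon: str                      # "human" or a faunal taxon, e.g. "cervid"
    modifications: set = field(default_factory=set)

def modification_frequencies(specimens, taxon):
    """Share of specimens of a given taxon bearing each modification type."""
    group = [s for s in specimens if s.taxon == taxon]
    counts = Counter(m for s in group for m in s.modifications)
    return {m: n / len(group) for m, n in counts.items()} if group else {}

def consistent_with_cannibalism(specimens, threshold=0.20):
    """Rough screen: flag an assemblage only when several independent
    criteria co-occur and the human remains are processed at least as
    intensively as the fauna. The 0.20 threshold is an assumption."""
    human = modification_frequencies(specimens, "human")
    faunal_taxa = {s.taxon for s in specimens} - {"human"}
    faunal = [modification_frequencies(specimens, t) for t in faunal_taxa]

    criteria = [
        human.get("cut_mark", 0) >= threshold,       # systematic defleshing
        human.get("percussion", 0) > 0,              # marrow extraction
        human.get("human_tooth_mark", 0) > 0,        # direct consumption
        bool(faunal) and any(                        # parity with food animals
            human.get("cut_mark", 0) >= f.get("cut_mark", 0) for f in faunal
        ),
    ]
    return sum(criteria) >= 3

# Toy example, not real site data.
sample = [
    Specimen("human", {"cut_mark", "percussion"}),
    Specimen("human", {"cut_mark", "human_tooth_mark"}),
    Specimen("human", set()),
    Specimen("cervid", {"cut_mark", "percussion"}),
    Specimen("cervid", {"cut_mark"}),
]
print(consistent_with_cannibalism(sample))  # True for this toy assemblage
```

Such a screen cannot replace contextual and spatial analysis; the point is only to illustrate how independently scored criteria are combined rather than weighed one at a time.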
Ultimately, a
holistic approach is necessary—one that examines the entire archaeological
context, compares human and faunal assemblages, and carefully considers the
spatial and taphonomic evidence. While some assemblages still yield ambiguous
interpretations due to small sample sizes or poor preservation, many European
sites—spanning from the Lower Paleolithic to the Bronze Age—share enough
taphonomic characteristics to support the occurrence of cannibalism. These
include systematic butchering, evidence of marrow extraction, burning, and
sometimes human tooth impressions, all of which collectively differentiate
cannibalism from other cultural treatments of the dead.
Evidence for
prehistoric cannibalism in Europe has grown, yet it remains relatively scarce,
limiting broad generalizations. Across 18 archaeological assemblages ranging
from the early Pleistocene to the Bronze Age, signs of human cannibalism have
been documented. The increasing number of such findings suggests cannibalism was
practiced intermittently over long periods, especially intensifying after the
Upper Palaeolithic.
Most cannibalized
assemblages share distinctive taphonomic features, such as a high frequency of
anthropogenic marks—often over 20 percent—which is higher than the frequencies reported in North American contexts. Contrary to the view that such cut marks indicate
mortuary practices rather than cannibalism, these marks more accurately reflect
full butchering sequences not typically seen in funerary contexts. The
processing includes defleshing, dismemberment, evisceration, bone breakage,
burning or boiling, and, in many cases, the presence of human tooth marks.
These practices go beyond what is seen in ritual or secondary burial and point
to actual consumption.
The pattern of butchering,
breaking, and thermal processing mirrors sequences recorded in other global
contexts like the American Southwest. Common features include long bone and
skull breakage for marrow and brain extraction, disarticulated skeletons, and
the occasional anatomical association of segments like hands or feet. Human
tooth marks, bone crushing, and percussion marks are widely observed, although
these require more experimental validation. The use of human bones as tools, though rare, has been documented in specific periods such as the Magdalenian.
For the more
debated assemblages, the presence of human tooth marks and consistent
processing methods strengthens the argument for cannibalism. These assemblages
should be revisited with modern techniques, including DNA, isotopic, and
chronological analyses, to develop new interpretations and insights. A holistic
analytical framework—including demographic data, tool associations, and
stratigraphy—can help clarify whether cannibalism was a rare event or a
routine, institutionalized behavior.
Finally, the
motivation behind cannibalism remains elusive. Traditional labels like
"nutritional" or "ritual" may oversimplify a complex
behavior that could have occurred in contexts of violence, survival, or even
affection. Ethnographic parallels help interpret these findings but must be
applied with caution. Ultimately, only through integrated, multidisciplinary
approaches can we hope to understand the frequency, causes, and cultural
meanings of cannibalism in prehistoric Europe.