Gough’s Cave in Somerset was thought to have given up all its secrets when excavations ended in 1992, yet research on human bones from the site has continued in the decades since. After its discovery in the 1880s, the site was developed as a show cave and largely emptied of sediment, at times with minimal archaeological supervision. The excavations uncovered intensively processed human bones intermingled with abundant butchered large mammal remains and a diverse range of flint, bone, antler, and ivory artefacts.
New radiocarbon techniques have revealed that the remains were deposited over a very short period, possibly during a series of seasonal occupations, about 14,700 years ago.
Dr Silvia Bello, from the Natural History Museum’s Department of Earth Sciences and lead researcher on the work, said: “The human remains have been the subject of several studies. In a previous analysis, we determined that the cranial remains had been carefully modified to make skull-cups. During this research, however, we’ve identified a far greater degree of human modification than recorded in earlier studies. We’ve found unequivocal evidence for defleshing, disarticulation, human chewing, crushing of spongy bone, and the cracking of bones to extract marrow.”
The presence of human tooth marks on many of the bones provides incontrovertible evidence for cannibalism, the team found. In a wider context, the treatment of the human corpses and the manufacture and use of skull-cups at Gough’s Cave has parallels with other ancient sites in central and western Europe. But the new evidence from Gough’s Cave suggests that cannibalism during the ‘Magdalenian period’ was part of a customary mortuary practice that combined intensive processing and consumption of the bodies with the ritual use of skull-cups.
Simon Parfitt, of University College London, said, “A recurring theme of this period is the remarkable rarity of burials and how commonly we find human remains mixed with occupation waste at many sites. Further analysis along the lines used to study Gough’s Cave will help to establish whether the type of ritualistic cannibalism practiced there is a regional (‘Creswellian’) phenomenon, or a more widespread practice found throughout the Magdalenian world.”
Natural History Museum – Header Image: Skull Bowl – © The Trustees of the Natural History Museum, London
The researchers found that the biochemical composition of teeth that were forming in the womb and during a child’s early years not only provided insight into the health of the baby’s mother, it even showed major differences between those infants who died and those who survived beyond early childhood.
Earlier work led by Dr Janet Montgomery and Dr Mandy Jay from Durham’s Department of Archaeology found similar results in people living in the Iron Age on the Isle of Skye and in Neolithic Shetland.
These archaeological findings – published in the American Journal of Physical Anthropology – are now being tested in baby teeth from children born recently in Bradford and Sudan. If similar patterns can be seen in current day mothers and children, the researchers hope this could lead to a simple test on baby teeth to predict potential health problems in adulthood.
Lead researcher Dr Julia Beaumont from Bradford’s School of Archaeological Sciences explains: “We know that stress and poor diet in mothers, both during pregnancy and after birth, can have an impact on a child’s development. In the past that could mean a child didn’t survive; now it’s more likely to mean a child has a greater risk of health issues in later life. While sometimes there are obvious signs of maternal stress in the baby at birth, such as a low birth weight, that isn’t always the case. So a simple test on teeth that are naturally shed by children as they grow could provide useful information about future health risks.”
Levels of carbon and nitrogen isotopes within bone and teeth, and the relationship between the two, change with different diets, so baby teeth can reveal clues about the diet of the mother during pregnancy and the diet of the child immediately after birth. The first permanent molar also forms around birth and is retained into adulthood. Each layer of the tooth relates to around four months’ growth, starting in the womb, enabling it to be linked to a specific period of a baby’s life.
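As a rough illustration (not the study’s own code), the four-months-per-layer growth described above implies a simple mapping from a tooth’s sequential increments to age windows. The function name and the assumption that the first increment begins four months before birth are both hypothetical, chosen only for the sketch:

```python
# Illustrative sketch, not the authors' method: map sequential dentine
# increments to approximate age windows, assuming each increment records
# roughly four months of growth, starting in the womb.

MONTHS_PER_INCREMENT = 4  # approximate growth captured by one sampled layer

def increment_age_window(index, months_before_birth=4):
    """Return (start, end) age in months for the nth increment (0-based).

    Negative ages are months before birth; increment 0 covers the
    in-utero portion of crown formation under this assumption.
    """
    start = index * MONTHS_PER_INCREMENT - months_before_birth
    return start, start + MONTHS_PER_INCREMENT

# The first few increments span late gestation through early infancy:
for i in range(4):
    print(i, increment_age_window(i))
```

Sampling each increment separately is what lets an isotope measurement be tied to a specific window of pregnancy or infancy rather than to the tooth as a whole.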
These indicators have also been thought to show when a baby has been breastfed – seen as a healthy start in life. Nitrogen isotope levels are higher in people on protein rich diets and in breastfed babies, and lower for vegetarian diets.
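The “nitrogen isotope levels” discussed here are conventionally reported as delta values (δ¹⁵N): the per-mil deviation of a sample’s ¹⁵N/¹⁴N ratio from the atmospheric-nitrogen standard. The sketch below uses the standard delta formula; the sample ratios are invented for illustration and are not the study’s data:

```python
# Hedged sketch: delta-15-N expresses a measured 15N/14N ratio as a
# per-mil deviation from the AIR standard. Sample values are illustrative.

AIR_15N_14N = 0.0036765  # 15N/14N of atmospheric N2, the reference standard

def delta15N(sample_ratio):
    """Express a measured 15N/14N ratio in per mil relative to AIR."""
    return (sample_ratio / AIR_15N_14N - 1) * 1000

# A breastfed infant typically measures a few per mil above its mother,
# the usual trophic-level enrichment (values below are made up):
mother = delta15N(0.0037133)   # roughly 10 per mil, plausible omnivore
infant = delta15N(0.0037243)   # roughly 13 per mil
print(round(mother, 1), round(infant, 1))
```

The counterintuitive finding described next is precisely about this offset: a high infant δ¹⁵N at birth turned out not to signal a healthy breastfed start.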
However, in the samples taken from the famine cemetery, the results were counterintuitive. The babies who showed higher nitrogen isotope levels at birth didn’t survive into adulthood. Those who did survive had lower and more stable nitrogen isotope levels throughout early childhood.
Similar results were found amongst Victorians buried in the London cemetery who lived during a period of high rates of infant death and amongst the prehistoric people in Scotland. Dr Beaumont believes that, far from being an indicator of a good start in life, the higher nitrogen isotope levels showed that the mothers were malnourished and under stress.
“At the period we studied, it’s likely that most babies were breastfed, but only some showed the spike in nitrogen isotope levels normally associated with it,” she says. “Where pregnant and breastfeeding mothers are malnourished, however, they can recycle their own tissues in order for the baby to grow and then to produce milk to feed it. We believe this produces higher nitrogen isotope levels and is what we’re seeing in the samples from the 19th-century cemeteries. Babies born to and breastfed by malnourished mothers do not receive all the nutrients they need, and this is possibly why these babies didn’t survive.”
Dr Beaumont now hopes that the insights she’s gained from the historical graves can be used to help children in the future. She is currently testing teeth from children through the Born in Bradford project, a long term study of a cohort of 13,500 children, born between 2007 and 2010, whose health is being tracked from pregnancy through childhood and into adult life. She hopes to be able to correlate nitrogen and carbon isotope levels to the medical history of the mother and the future health of the children.
“We currently cannot analyse any other tissue in the body where the stress we are under before birth and during early childhood is recorded,” says Dr Beaumont. “If we can show that baby teeth, which are lost naturally, provide markers for stress in the first months of life, we could have an important indicator of future health risks, such as diabetes and heart disease.”
University of Bradford – Header Image: Dr Julia Beaumont, University of Bradford – University of Bradford
The results, published in PLOS ONE, knock another chip off theories that Stone Age hand axes are simple tools that don’t involve higher-order executive function of the brain.
“For the first time, we’ve shown a relationship between the degree of prefrontal brain activity, the ability to make technological judgments, and success in actually making stone tools,” says Dietrich Stout, an experimental archeologist at Emory University and the leader of the study. “The findings are relevant to ongoing debates about the origins of modern human cognition, and the role of technological and social complexity in brain evolution across species.”
The skill of making a prehistoric hand axe is “more complicated and nuanced than many people realize,” Stout says. “It’s not just a bunch of ape-men banging rocks together. We should have respect for Stone Age tool makers.”
The study’s co-authors include Bruce Bradley of the University of Exeter in England; Thierry Chaminade of Aix-Marseille University in France; and Erin Hecht and Nada Khreisheh of Emory University.
Stone tools – shaped by striking a stone “core” with a piece of bone, antler, or another stone – provide some of the most abundant evidence of human behavioral change over time. Simple Oldowan stone flakes are the earliest known tools, dating back 2.6 million years. The Late Acheulean hand axe goes back 500,000 years. While it’s relatively easy to learn to make an Oldowan flake, the Acheulean hand axe is harder to master, due to its lens-shaped core tapering down to symmetrical edges.
“We wanted to tease apart and compare what parts of the brain were most actively involved in these stone tool technologies, particularly the role of motor control versus strategic thinking,” Stout says.
The researchers recruited six subjects, all archeology students at Exeter University, to train in making stone tools, a skill known as “knapping.” The subjects’ skills were evaluated before and after they trained and practiced. For Oldowan evaluations, subjects detached five flakes from a flint core. For Acheulean evaluations, they produced a tool from a standardized porcelain core.
At the beginning, middle and end of the 18-month experiment, subjects underwent functional magnetic resonance imaging (fMRI) and diffusion tensor imaging (DTI) scans of their brains while they watched videos. The videos showed rotating stone cores marked with colored cues: A red dot indicated an intended point of impact, and a white area showed the flake predicted to result from the impact. The subjects were asked the following questions:
“If the core were struck in the place indicated, is what you see a correct prediction of the flake that would result?”
“Is the indicated place to hit the core a correct one given the objective of the technology?”
The subjects responded by pushing a “yes” or “no” button.
Answering the first question, how a rock will break if you hit it in a certain place, relies more on reflexive, perceptual and motor-control processes, associated with posterior portions of the brain. Stout compares it to the modern-day rote reflex of a practiced golf swing or driving a car.
The second question – is it a good idea to hit the core in a certain spot if you want to make a hand axe – involves strategic thinking, such as planning the route for a road trip. “You have to think about information that you have stored in your brain, bring it online, and then make a decision about each step of the trip,” Stout says.
This so-called executive control function of the brain, associated with activity in the prefrontal cortex, allows you to project what’s going to happen in the future and use that projection to guide your action. “It’s kind of like mental time travel, or using a computer simulation,” Stout explains. “It’s considered a high level, human cognitive capacity.”
The researchers mapped the skill level of the subjects onto the data from their brain scans and their responses to the questions.
Greater skill at making tools correlated with greater accuracy on the video quiz for predicting the correct strategy for making a hand axe, which was itself correlated with greater activity in the prefrontal cortex. “These data suggest that making an Acheulean hand axe is not simply a rote, auto pilot activity of the brain,” Stout says. “It requires you to engage in some complicated thinking.”
Most of the hand axes produced by the modern hands and minds of the study subjects would not have cut it in the Stone Age. “They weren’t up to the high standards of 500,000 years ago,” Stout says.
A previous study by the researchers showed that learning to make stone tools creates structural changes in fiber tracts of the brain connecting the parietal and frontal lobes, and that these brain changes correlated with increases in performance. “Something is happening to strengthen this connection,” Stout says. “This adds to evidence of the importance of these brain systems for stone tool making, and also shows how tool making may have shaped the brain evolutionarily.”
Stout recently launched a major, three-year archeology experiment that will build on these studies and others. Known as the Language of Technology project, the experiment involves 20 subjects who will each devote 100 hours to learning the art of making a Stone Age hand axe, and also undergo a series of MRI scans. The project aims to home in on whether the brain systems involved in putting together a sequence of words to make a meaningful sentence in spoken language overlap with systems involved in putting together a series of physical actions to reach a meaningful goal.
The research, by an international team of scientists, confirmed that the Earth’s first crust had formed around 4.5 billion years ago.
The team measured the amount of the rare elements hafnium and lutetium in the mineral zircon in a meteorite that originated early in the solar system.
“Meteorites that contain zircons are rare. We had been looking for an old meteorite with large zircons, about 50 microns long, that contained enough hafnium for precise analysis,” said Dr Yuri Amelin, from The Australian National University (ANU) Research School of Earth Sciences.
“By chance we found one for sale from a dealer. It was just what we wanted. We believe it originated from the asteroid Vesta, following a large impact that sent rock fragments on a course to Earth.”
The heat and pressure in the Earth’s interior mixes the chemical composition of its layers over billions of years, as denser rocks sink and less dense minerals rise towards the surface, a process known as differentiation.
Determining how and when the layers formed relies on knowing the composition of the original material that formed into the Earth, before differentiation, said Dr Amelin.
“Meteorites are remnants of the original pool of material that formed all the planets,” he said.
“But they have not had planetary-scale forces changing their composition throughout their 4.5 billion years orbiting the sun.”
The team accurately measured the ratio of the isotopes hafnium-176 and hafnium-177 in the meteorite, to give a starting point for the Earth’s composition.
The team were then able to compare the results with the oldest rocks on Earth, and found that their chemical composition had been altered, showing that a crust had already formed on the surface of the Earth around 4.5 billion years ago.
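The comparison described above can be sketched with the standard lutetium–hafnium decay equation: ¹⁷⁶Lu decays to ¹⁷⁶Hf, so a reservoir’s ¹⁷⁶Hf/¹⁷⁷Hf ratio grows over time at a rate set by its Lu/Hf ratio. The decay constant and reservoir values below are commonly cited textbook figures, not the team’s measurements, and the numbers are illustrative only:

```python
import math

# Illustrative sketch (assumed textbook values, not the study's data):
# 176Lu -> 176Hf decay makes measured 176Hf/177Hf a clock. A rock whose
# Lu/Hf was changed early (i.e. an early-formed crust) evolves away from
# the chondritic "original pool" growth curve.

DECAY_LU176 = 1.867e-11      # per year, commonly cited 176Lu decay constant
CHUR_HF_INITIAL = 0.27978    # approx. solar-system initial 176Hf/177Hf
CHUR_LU_HF = 0.0336          # approx. chondritic 176Lu/177Hf

def hf_ratio(initial_hf, lu_hf, age_years):
    """176Hf/177Hf after age_years of in-situ 176Lu decay."""
    return initial_hf + lu_hf * math.expm1(DECAY_LU176 * age_years)

# Undifferentiated (chondritic) reservoir after 4.5 billion years:
chondritic_today = hf_ratio(CHUR_HF_INITIAL, CHUR_LU_HF, 4.5e9)

# Crust extracted very early has a lower Lu/Hf, so it accumulates less
# radiogenic Hf; the measurable offset fingerprints early differentiation.
crust_today = hf_ratio(CHUR_HF_INITIAL, 0.01, 4.5e9)
print(chondritic_today, crust_today)
```

Pinning down the starting ¹⁷⁶Hf/¹⁷⁷Hf from meteoritic zircon is what lets the offset in ancient terrestrial rocks be read as evidence of an early crust.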