Who first buried the dead?


Paige Madison writes:

A mysterious cache of bones, recovered from a deep chamber in a South African cave, is challenging long-held beliefs about how a group of bipedal apes developed into the abstract-thinking creatures that we call “human.” The fossils were discovered in 2013 and were quickly recognized as the remains of a new species unlike anything seen before. Named Homo naledi, it has an unexpected mix of modern features and primitive ones, including a fairly small brain. Arguably the most shocking aspect of Homo naledi, though, concerned not the remains themselves but rather their resting place.

The chamber where the bones were found is far from the cave entrance, accessible only through a narrow, difficult passage that is completely shrouded in darkness. Scientists believe the chamber has long been difficult to access, requiring a journey of vertical climbing, crawling, and tight squeezing through spaces only 20 centimeters across. It would be an impossible place to live, and a highly unlikely location for many individuals to have ended up by accident. Those details pushed the research team toward a shocking hypothesis: despite its puny brain, Homo naledi purposefully interred its dead. The cave chamber was a graveyard, they concluded.

For anthropologists, mortuary rituals carry an outsize importance in tracing the emergence of human uniqueness—especially the capacity to think symbolically. Symbolic thought gives us the ability to transcend the present, remember the past, and visualize the future. It allows us to imagine, to create, and to alter our environment in ways that have significant consequences for the planet. Use of language is the quintessential embodiment of such mental abstractions, but studying its history is difficult because language doesn’t fossilize. Burials do.

Burials provide a hard, material record of a behavior that is deeply spiritual and meaningful. This record allows scientists to trace the emergence of beliefs, values, and other complex ideas that appear to be uniquely human. Homo sapiens is unquestionably unlike any other species alive today. Pinpointing what separates us from the rest of nature is surprisingly difficult, however. [Continue reading…]


There’s no scientific basis for race — it’s a made-up label

 

Elizabeth Kolbert writes:

In the first half of the 19th century, one of America’s most prominent scientists was a doctor named Samuel Morton. Morton lived in Philadelphia, and he collected skulls.

He wasn’t choosy about his suppliers. He accepted skulls scavenged from battlefields and snatched from catacombs. One of his most famous craniums belonged to an Irishman who’d been sent as a convict to Tasmania (and ultimately hanged for killing and eating other convicts). With each skull Morton performed the same procedure: He stuffed it with pepper seeds—later he switched to lead shot—which he then decanted to ascertain the volume of the braincase.

Morton believed that people could be divided into five races and that these represented separate acts of creation. The races had distinct characters, which corresponded to their place in a divinely determined hierarchy. Morton’s “craniometry” showed, he claimed, that whites, or “Caucasians,” were the most intelligent of the races. East Asians—Morton used the term “Mongolian”—though “ingenious” and “susceptible of cultivation,” were one step down. Next came Southeast Asians, followed by Native Americans. Blacks, or “Ethiopians,” were at the bottom. In the decades before the Civil War, Morton’s ideas were quickly taken up by the defenders of slavery.

“He had a lot of influence, particularly in the South,” says Paul Wolff Mitchell, an anthropologist at the University of Pennsylvania who is showing me the skull collection, now housed at the Penn Museum. We’re standing over the braincase of a particularly large-headed Dutchman who helped inflate Morton’s estimate of Caucasian capacities. When Morton died, in 1851, the Charleston Medical Journal in South Carolina praised him for “giving to the negro his true position as an inferior race.”

Today Morton is known as the father of scientific racism. So many of the horrors of the past few centuries can be traced to the idea that one race is inferior to another that a tour of his collection is a haunting experience. To an uncomfortable degree we still live with Morton’s legacy: Racial distinctions continue to shape our politics, our neighborhoods, and our sense of self.

This is the case even though what science actually has to tell us about race is just the opposite of what Morton contended. [Continue reading…]


Rights of the dead and the living clash when scientists extract DNA from human remains

Who gets to decide for the dead, such as this Egyptian mummy?
AP Photo/Ric Feld

By Chip Colwell, University of Colorado Denver

The remains of a 6-inch-long mummy from Chile are not those of a space alien, according to recently reported research. The tiny body with its strange features – a pointed head, elongated bones – had been the subject of fierce debate over whether a UFO might have left it behind. The scientists gained access to the body, which is now in a private collection, and their DNA testing proved the remains are those of a human fetus. The undeveloped girl suffered from a bone disease and was the child of an unknown local Atacama woman.

This study was supposed to end the mummy’s controversy. Instead, it ignited another one.

The mummified fetus from the Atacama region of Chile.
Bhattacharya S et al. 2018, CC BY

Authorities in Chile have denounced the research. They believe a looter plundered the girl from her grave and illegally took her from the country. The Chilean Society of Biological Anthropology issued a damning statement. It asked, “Could you imagine the same study carried out using the corpse of someone’s miscarried baby in Europe or America?”

As an archaeologist, I share in the excitement around how technology and techniques to study DNA are leaping ahead. As never before, the mysteries of our bodies and histories are finding exciting answers – from the revelation that humans interbred with Neanderthals, to how Britain was populated, to the enigma of a decapitated Egyptian mummy.

But I have also closely studied the history of collecting human remains for science. I am gravely concerned that the current “bone rush” to make new genetic discoveries has set off an ethical crisis.

[Read more…]

Neanderthals cared for each other and survived into old age – new research

Shutterstock

By James Ohman, Liverpool John Moores University and Asier Gomez-Olivencia, University of the Basque Country

When we think of Neanderthals, we often imagine these distant relatives of ours to be rather brutish, dying at a young age and ultimately becoming extinct. But new findings show that at least some of these ancient Neanderthals survived into old age – despite suffering from sickness and disease.

Neanderthals were hunter-gatherers, living in harsh environments, mostly colder than today. And of course they had to face different dangers to modern humans – not only during the hunt, but also because they shared ecosystems with large carnivores such as lions, leopards and hyenas.

But despite this harsh life of the hunter-gatherer, our research indicates that some Neanderthals lived to be fairly old and even had some of the signs of age-related illnesses – such as degenerative lesions in the spine, consistent with osteoarthritis. Our research also found that an adult male Neanderthal survived bone fractures. And when he died, he was buried by members of his group.

[Read more…]

The unwelcome revival of ‘race science’

Gavin Evans writes:

One of the strangest ironies of our time is that a body of thoroughly debunked “science” is being revived by people who claim to be defending truth against a rising tide of ignorance. The idea that certain races are inherently more intelligent than others is being trumpeted by a small group of anthropologists, IQ researchers, psychologists and pundits who portray themselves as noble dissidents, standing up for inconvenient facts. Through a surprising mix of fringe and mainstream media sources, these ideas are reaching a new audience, which regards them as proof of the superiority of certain races.

The claim that there is a link between race and intelligence is the main tenet of what is known as “race science” or, in many cases, “scientific racism”. Race scientists claim there are evolutionary bases for disparities in social outcomes – such as life expectancy, educational attainment, wealth, and incarceration rates – between racial groups. In particular, many of them argue that black people fare worse than white people because they tend to be less naturally intelligent.

Although race science has been repeatedly debunked by scholarly research, in recent years it has made a comeback. Many of the keenest promoters of race science today are stars of the “alt-right”, who like to use pseudoscience to lend intellectual justification to ethno-nationalist politics. If you believe that poor people are poor because they are inherently less intelligent, then it is easy to leap to the conclusion that liberal remedies, such as affirmative action or foreign aid, are doomed to fail.

There are scores of recent examples of rightwingers banging the drum for race science. In July 2016, for example, Steve Bannon, who was then Breitbart boss and would go on to be Donald Trump’s chief strategist, wrote an article in which he suggested that some black people who had been shot by the police might have deserved it. “There are, after all, in this world, some people who are naturally aggressive and violent,” Bannon wrote, evoking one of scientific racism’s ugliest contentions: that black people are more genetically predisposed to violence than others. [Continue reading…]


DNA from more than 900 ancient people traces the prehistoric migrations of our species

Carl Zimmer writes:

David Reich wore a hooded, white suit, cream-colored clogs, and a blue surgical mask. Only his eyes were visible as he inspected the bone fragments on the counter.

Dr. Reich, a geneticist at Harvard Medical School, pointed out a strawberry-sized chunk: “This is from a 4,000-year-old site in Central Asia — from Uzbekistan, I think.”

He moved down the row. “This is a 2,500-year-old sample from a site in Britain. This is Bronze Age Russian, and these are Arabian samples. These people would have never met each other in time or space.”

Dr. Reich hopes that his team of scientists and technicians can find DNA in these bones. Odds are good that they will.

In less than three years, Dr. Reich’s laboratory has published DNA from the genomes of 938 ancient humans — more than all other research teams working in this field combined. The work in his lab has reshaped our understanding of human prehistory.

“They often answer age-old questions and sometimes provide astonishing unanticipated insights,” said Svante Paabo, the director of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.

Dr. Reich, Dr. Paabo and other experts in ancient DNA are putting together a new history of humanity, one that runs in parallel with the narratives gleaned from fossils and written records. In Dr. Reich’s research, he and his colleagues have shed light on the peopling of the planet and the spread of agriculture, among other momentous events.

In a book to be published next week, “Who We Are and How We Got Here,” Dr. Reich, 43, explains how advances in DNA sequencing and analysis have helped this new field take off. [Continue reading…]

Evidence of complex cognitive abilities in humans more than 300,000 years ago

Gemma Tarlach writes:

Three papers, published together in Science today, add up to a paradigm-shoving conclusion: Key aspects of what we think of as modern human behavior evolved more than 300,000 years ago, a radical revision to the evolutionary timeline.

To understand the significance of the trio of studies, let’s take a brisk walk through recent changes in our understanding of human evolution. For decades, the consensus was that Homo sapiens evolved around 200,000 years ago in Africa, with anatomically modern humans emerging 100,000 years ago and leaving their ancestral continent around 50,000 years ago.

In the past few years, however, a series of surprising fossil finds, as well as advanced genetic analysis, has smashed that old paradigm like an angry Hulk.

The new picture of human evolution just emerging is that our species is more than 300,000 years old and left Africa much earlier than we thought, beginning at least 120,000 years ago and possibly as early as 200,000 years ago. Genetic analysis suggests our species may be even older, and was already interbreeding with Neanderthals, up to 470,000 years ago.

Fossilized bones, and the ancient DNA researchers are sometimes able to extract from them, can only tell us so much, however. Often missing from the picture is how the individuals lived. While the timeline for the physical evolution of our species gets pushed further and further back, Homo sapiens are defined as much by their modern human behavior as by the shape of their skulls. Any early Homo sapiens running around might have looked a lot like us, the thinking went, but they didn’t act like us and weren’t capable of the same cognitive feats, such as symbolic thinking.

While there was no definitive date for the emergence of modern human behavior — after all, it was a process, not the flip of a switch — the nice, round number of 100,000 years ago has often been cited as a landmark. That’s about when some of the earliest evidence of modern human behavior, such as assembling ochre-based compounds to create paint kits, appears.

Enter today’s studies, which take a multifaceted look at the Kenyan site of Olorgesailie. Taken in sum, the papers present evidence that gradual climate change drove humans to adapt to a new environment and altered food supply; the adaptations appear to include far-flung social and trade networks, advanced tool technology and manipulation of ochre, a variety of high-iron rock used throughout the human story as pigment. [Continue reading…]


The ancient hunt in which the tracker’s skill united reason and imagination

“The San people of the Kalahari desert are the last tribe on Earth to use what some believe to be the most ancient hunting technique of all: the persistence hunt; they run down their prey,” says David Attenborough:

 

“The hunter pays tribute to his quarry’s courage and strength, with ceremonial gestures that ensure that its spirit returns to the desert sands from which it came. While it was alive, he lived and breathed with it and felt its every movement in his own body, and at the moment of its death, he shared its pain. He rubs its saliva into his own legs to relieve the agony of his own burning muscles, and he gives thanks for the life he has taken so that he may sustain the lives of his family waiting for him back in their settlement.”

Louis Liebenberg, author of The Art of Tracking: The Origin of Science, argues that the rational skills required by the ancient tracker provided the basis of scientific reasoning.

The first creative science, practiced by possibly some of the earliest members of Homo sapiens who had modern brains and intellects, may have been the tracking of game animals…

In easy tracking terrain, trackers may follow a trail simply by looking for one sign after the other, but in difficult terrain this can become so time-consuming that they may never catch up with their quarry. Instead of looking for one sign at a time, the trackers place themselves in the position of their quarry in order to anticipate the route it may have taken. They then decide in advance where they can expect to find signs, instead of wasting time looking for them. To reconstruct an animal’s activities, specific actions and movements must be seen in the context of the animal’s whole environment at specific times and places…

Since tracks may be partly obliterated or difficult to see, they may only exhibit partial evidence, so the reconstruction of these animals’ activities must be based on creative hypotheses. To interpret the footprints, trackers must use their imagination to visualize what the animal was doing to create such markings. Such a reconstruction will contain more information than is evident from the tracks, and will therefore be partly factual and partly hypothetical. As new factual information is gathered in the process of tracking, hypotheses may have to be revised or substituted by better ones. A hypothetical reconstruction of the animal’s behaviors may enable trackers to anticipate and predict the animal’s movements. These predictions provide ongoing testing of the hypotheses.

Perhaps the most significant feature of creative science is that a hypothesis may enable the scientist to predict novel facts that would not otherwise have been known.

Implicit in this interpretation of tracking is also a view of science broader than its conventional placement within the sphere of human rationality. From this perspective, reason and imagination work hand in hand.

Thus, when the hunter hypothesizes about the movements of his quarry, he is also engaging in a wild leap of imagination: he becomes the quarry by entering its mind and seeing the world through its eyes.

From this vantage point, there is no conquest or victory in the hunt. Hunter and hunted are one, inseparable in life and death.

This way of knowing non-human life, through a creative identification in which animal “spirits” are experienced, seems to be universal among indigenous peoples, strongly suggesting it is something we have lost rather than advanced above. In a most fundamental way, it signals the degree to which collectively our observational and empathic skills have withered as we withdrew from the natural world.


Across human history, there’s little evidence large-scale social organization necessitates enduring inequality

David Graeber and David Wengrow write:

Stonehenge, it turns out, was only the latest in a very long sequence of ritual structures, erected in timber as well as stone, as people converged on the plain from remote corners of the British Isles, at significant times of year. Careful excavation has shown that many of these structures – now plausibly interpreted as monuments to the progenitors of powerful Neolithic dynasties – were dismantled just a few generations after their construction. Still more strikingly, this practice of erecting and dismantling grand monuments coincides with a period when the peoples of Britain, having adopted the Neolithic farming economy from continental Europe, appear to have turned their backs on at least one crucial aspect of it, abandoning cereal farming and reverting – around 3300 BC – to the collection of hazelnuts as a staple food source. Keeping their herds of cattle, on which they feasted seasonally at nearby Durrington Walls, the builders of Stonehenge seem likely to have been neither foragers nor farmers, but something in between. And if anything like a royal court did hold sway in the festive season, when they gathered in great numbers, then it could only have dissolved away for most of the year, when the same people scattered back out across the island.

Why are these seasonal variations important? Because they reveal that from the very beginning, human beings were self-consciously experimenting with different social possibilities. Anthropologists describe societies of this sort as possessing a ‘double morphology’. Marcel Mauss, writing in the early twentieth century, observed that the circumpolar Inuit, ‘and likewise many other societies . . . have two social structures, one in summer and one in winter, and that in parallel they have two systems of law and religion’. In the summer months, Inuit dispersed into small patriarchal bands in pursuit of freshwater fish, caribou, and reindeer, each under the authority of a single male elder. Property was possessively marked and patriarchs exercised coercive, sometimes even tyrannical power over their kin. But in the long winter months, when seals and walrus flocked to the Arctic shore, another social structure entirely took over as Inuit gathered together to build great meeting houses of wood, whale-rib, and stone. Within them, the virtues of equality, altruism, and collective life prevailed; wealth was shared; husbands and wives exchanged partners under the aegis of Sedna, the Goddess of the Seals.

Another example was the indigenous hunter-gatherers of Canada’s Northwest Coast, for whom winter – not summer – was the time when society crystallised into its most unequal form, and spectacularly so. Plank-built palaces sprang to life along the coastlines of British Columbia, with hereditary nobles holding court over commoners and slaves, and hosting the great banquets known as potlatch. Yet these aristocratic courts broke apart for the summer work of the fishing season, reverting to smaller clan formations, still ranked, but with an entirely different and less formal structure. In this case, people actually adopted different names in summer and winter, literally becoming someone else, depending on the time of year.

Perhaps most striking, in terms of political reversals, were the seasonal practices of 19th-century tribal confederacies on the American Great Plains – sometime, or one-time farmers who had adopted a nomadic hunting life. In the late summer, small and highly mobile bands of Cheyenne and Lakota would congregate in large settlements to make logistical preparations for the buffalo hunt. At this most sensitive time of year they appointed a police force that exercised full coercive powers, including the right to imprison, whip, or fine any offender who endangered the proceedings. Yet as the anthropologist Robert Lowie observed, this ‘unequivocal authoritarianism’ operated on a strictly seasonal and temporary basis, giving way to more ‘anarchic’ forms of organisation once the hunting season – and the collective rituals that followed – were complete.

Scholarship does not always advance. Sometimes it slides backwards. A hundred years ago, most anthropologists understood that those who live mainly from wild resources were not, normally, restricted to tiny ‘bands.’ That idea is really a product of the 1960s, when Kalahari Bushmen and Mbuti Pygmies became the preferred image of primordial humanity for TV audiences and researchers alike. As a result we’ve seen a return of evolutionary stages, really not all that different from the tradition of the Scottish Enlightenment: this is what Fukuyama, for instance, is drawing on, when he writes of society evolving steadily from ‘bands’ to ‘tribes’ to ‘chiefdoms,’ then finally, the kind of complex and stratified ‘states’ we live in today – usually defined by their monopoly of ‘the legitimate use of coercive force.’ By this logic, however, the Cheyenne or Lakota would have had to be ‘evolving’ from bands directly to states roughly every November, and then ‘devolving’ back again come spring. Most anthropologists now recognise that these categories are hopelessly inadequate, yet nobody has proposed an alternative way of thinking about world history in the broadest terms.

Quite independently, archaeological evidence suggests that in the highly seasonal environments of the last Ice Age, our remote ancestors were behaving in broadly similar ways: shifting back and forth between alternative social arrangements, permitting the rise of authoritarian structures during certain times of year, on the proviso that they could not last; on the understanding that no particular social order was ever fixed or immutable. Within the same population, one could live sometimes in what looks, from a distance, like a band, sometimes a tribe, and sometimes a society with many of the features we now identify with states. With such institutional flexibility comes the capacity to step outside the boundaries of any given social structure and reflect; to both make and unmake the political worlds we live in. If nothing else, this explains the ‘princes’ and ‘princesses’ of the last Ice Age, who appear to show up, in such magnificent isolation, like characters in some kind of fairy-tale or costume drama. Maybe they were almost literally so. If they reigned at all, then perhaps it was, like the kings and queens of Stonehenge, just for a season. [Continue reading…]

Are we smart enough to know how smart animals are?


Frans de Waal asks: are we smart enough to know how smart animals are?

Just as attitudes of superiority within segments of human culture are often expressions of ignorance, humans collectively — especially when subject to the dislocating effects of technological dependence — tend to underestimate the levels of awareness and cognitive skills of creatures who live mostly outside our sight. This tendency translates into presuppositions that need to be challenged by what de Waal calls his “cognitive ripple rule”:

Every cognitive capacity that we discover is going to be older and more widespread than initially thought.

In a review of de Waal’s book, Are We Smart Enough to Know How Smart Animals Are?, Ludwig Huber notes that there are a multitude of illustrations of the fact that brain size does not correlate with cognitive capacities.

Whereas we once thought of humans as having unique capabilities in learning and the use of tools, we now know these attributes place us in a set of species that also includes bees. Our prior assumptions about seemingly robotic behavior in such creatures turn out to have been an expression of our own anthropocentric prejudices.

Huber writes:

Various doctrines of human cognitive superiority are made plausible by a comparison of human beings and the chimpanzees. For questions of evolutionary cognition, this focus is one-sided. Consider the evolution of cooperation in social insects, such as the Matabele ant (Megaponera analis). After a termite attack, these ants provide medical services. Having called for help by means of a chemical signal, injured ants are brought back to the nest. Their increased chance of recovery benefits the entire colony. Red forest ants (Myrmica rubra) have the ability to perform simple arithmetic operations and to convey the results to other ants.

When it comes to adaptations in animals that require sophisticated neural control, evolution offers other spectacular examples. The banded archerfish (Toxotes jaculatrix) is able to spit a stream of water at its prey, compensating for refraction at the boundary between air and water. It can also track the distance of its prey, so that the jet develops its greatest force just before impact. Laboratory experiments show that the banded archerfish spits on target even when the trajectory of its prey varies. Spit hunting is a technique that requires the same timing used in throwing, an activity otherwise regarded as unique in the animal kingdom. In human beings, the development of throwing has led to an enormous further development of the brain. And the archerfish? The calculations required for its extraordinary hunting technique are based on the interplay of about six neurons. Neural mini-networks could therefore be much more widespread in the animal kingdom than previously thought.

Research on honeybees (Apis mellifera) has brought to light the cognitive capabilities of minibrains. Honeybees have no brains in the real sense. Their neuronal density, however, is among the highest in insects, with roughly 960 thousand neurons—far fewer than any vertebrate. Even if the brain size of honeybees is normalized to their body size, their relative brain size is lower than most vertebrates. Insect behavior should be less complex, less flexible, and less modifiable than vertebrate behavior. But honeybees learn quickly how to extract pollen and nectar from a large number of different flowers. They care for their young, organize the distribution of tasks, and, with the help of the waggle dance, they inform each other about the location and quality of distant food and water.

Early research by Karl von Frisch suggested that such abilities cannot be the result of inflexible information processing and rigid behavioral programs. Honeybees learn and they remember. The most recent experimental research has, in confirming this conclusion, created an astonishing picture of the honeybee’s cognitive competence. Their representation of the world does not consist entirely of associative chains. It is far more complex, flexible, and integrative. Honeybees show configural conditioning, biconditional discrimination, context-dependent learning and remembering, and even some forms of concept formation. Bees are able to classify images based on such abstract features as bilateral symmetry and radial symmetry; they can comprehend landscapes in a general way, and spontaneously come to classify new images. They have recently been promoted to the set of species capable of social learning and tool use.

In any case, the much smaller brain of the bee does not appear to be a fundamental limitation for comparable cognitive processes, or at least their performance. Jumping spiders and cephalopods are similarly instructive. The similarities between mammals and bees are astonishing, but they cannot be traced to homologous neurological developments. As long as the animal’s neural architecture remains unknown, we cannot determine the cause of their similarity. [Continue reading…]
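As an aside on the archerfish example above: the “compensation for refraction” that Huber mentions can be made concrete with a minimal back-of-the-envelope sketch (not taken from the review itself; it uses only the standard textbook refractive indices of air and water). Light from prey above the surface bends toward the vertical as it enters the water, so the prey’s apparent position, seen from below, is shifted toward the zenith. Snell’s law describes the shift the fish must effectively undo:

\[
n_{\mathrm{air}} \sin\theta_{\mathrm{air}} = n_{\mathrm{water}} \sin\theta_{\mathrm{water}}
\quad\Longrightarrow\quad
\theta_{\mathrm{water}} = \arcsin\!\left(\frac{\sin\theta_{\mathrm{air}}}{1.33}\right) < \theta_{\mathrm{air}},
\]

where the angles are measured from the vertical at the point where the line of sight crosses the surface, with \(n_{\mathrm{air}} \approx 1.00\) and \(n_{\mathrm{water}} \approx 1.33\). Because \(\theta_{\mathrm{water}} < \theta_{\mathrm{air}}\), the prey appears closer to straight overhead than it really is, so the fish must aim farther from the vertical than the apparent image suggests, a correction the review attributes to the interplay of about six neurons.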

Don’t miss the latest posts at Attention to the Unseen: Sign up for email updates.