Rights of the dead and the living clash when scientists extract DNA from human remains

Who gets to decide for the dead, such as this Egyptian mummy?
AP Photo/Ric Feld

By Chip Colwell, University of Colorado Denver

The remains of a 6-inch-long mummy from Chile are not those of a space alien, according to recently reported research. The tiny body with its strange features – a pointed head, elongated bones – had been the subject of fierce debate over whether a UFO might have left it behind. The scientists gained access to the body, which is now in a private collection, and their DNA testing proved the remains are those of a human fetus. The undeveloped girl suffered from a bone disease and was the child of an unknown local Atacama woman.

This study was supposed to end the mummy’s controversy. Instead, it ignited another one.

The mummified fetus from the Atacama region of Chile.
Bhattacharya S et al. 2018, CC BY

Authorities in Chile have denounced the research. They believe a looter plundered the girl from her grave and illegally took her from the country. The Chilean Society of Biological Anthropology issued a damning statement. It asked, “Could you imagine the same study carried out using the corpse of someone’s miscarried baby in Europe or America?”

As an archaeologist, I share in the excitement around how technology and techniques to study DNA are leaping ahead. As never before, the mysteries of our bodies and histories are finding exciting answers – from the revelation that humans interbred with Neanderthals, to how Britain was populated, to the enigma of a decapitated Egyptian mummy.

But, I have also closely studied the history of collecting human remains for science. I am gravely concerned that the current “bone rush” to make new genetic discoveries has set off an ethical crisis.

[Read more…]

Neanderthals cared for each other and survived into old age – new research


By James Ohman, Liverpool John Moores University and Asier Gomez-Olivencia, University of the Basque Country

When we think of Neanderthals, we often imagine these distant relatives of ours to be rather brutish, dying at a young age and ultimately becoming extinct. But new findings show that at least some of these ancient Neanderthals survived into old age – despite suffering from sickness and disease.

Neanderthals were hunter-gatherers, living in harsh environments, mostly colder than today. And of course they had to face different dangers from those faced by modern humans – not only during the hunt, but also because they shared ecosystems with large carnivores such as lions, leopards and hyenas.

But despite this harsh life of the hunter-gatherer, our research indicates that some Neanderthals lived to be fairly old and even had some of the signs of age-related illnesses – such as degenerative lesions in the spine, consistent with osteoarthritis. Our research also found that an adult male Neanderthal survived bone fractures. And when he died, he was buried by members of his group.

[Read more…]

The unwelcome revival of ‘race science’

Gavin Evans writes:

One of the strangest ironies of our time is that a body of thoroughly debunked “science” is being revived by people who claim to be defending truth against a rising tide of ignorance. The idea that certain races are inherently more intelligent than others is being trumpeted by a small group of anthropologists, IQ researchers, psychologists and pundits who portray themselves as noble dissidents, standing up for inconvenient facts. Through a surprising mix of fringe and mainstream media sources, these ideas are reaching a new audience, which regards them as proof of the superiority of certain races.

The claim that there is a link between race and intelligence is the main tenet of what is known as “race science” or, in many cases, “scientific racism”. Race scientists claim there are evolutionary bases for disparities in social outcomes – such as life expectancy, educational attainment, wealth, and incarceration rates – between racial groups. In particular, many of them argue that black people fare worse than white people because they tend to be less naturally intelligent.

Although race science has been repeatedly debunked by scholarly research, in recent years it has made a comeback. Many of the keenest promoters of race science today are stars of the “alt-right”, who like to use pseudoscience to lend intellectual justification to ethno-nationalist politics. If you believe that poor people are poor because they are inherently less intelligent, then it is easy to leap to the conclusion that liberal remedies, such as affirmative action or foreign aid, are doomed to fail.

There are scores of recent examples of rightwingers banging the drum for race science. In July 2016, for example, Steve Bannon, who was then Breitbart boss and would go on to be Donald Trump’s chief strategist, wrote an article in which he suggested that some black people who had been shot by the police might have deserved it. “There are, after all, in this world, some people who are naturally aggressive and violent,” Bannon wrote, evoking one of scientific racism’s ugliest contentions: that black people are more genetically predisposed to violence than others. [Continue reading…]


DNA from more than 900 ancient people traces the prehistoric migrations of our species

Carl Zimmer writes:

David Reich wore a hooded, white suit, cream-colored clogs, and a blue surgical mask. Only his eyes were visible as he inspected the bone fragments on the counter.

Dr. Reich, a geneticist at Harvard Medical School, pointed out a strawberry-sized chunk: “This is from a 4,000-year-old site in Central Asia — from Uzbekistan, I think.”

He moved down the row. “This is a 2,500-year-old sample from a site in Britain. This is Bronze Age Russian, and these are Arabian samples. These people would have never met each other in time or space.”

Dr. Reich hopes that his team of scientists and technicians can find DNA in these bones. Odds are good that they will.

In less than three years, Dr. Reich’s laboratory has published DNA from the genomes of 938 ancient humans — more than all other research teams working in this field combined. The work in his lab has reshaped our understanding of human prehistory.

“They often answer age-old questions and sometimes provide astonishing unanticipated insights,” said Svante Paabo, the director of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.

Dr. Reich, Dr. Paabo and other experts in ancient DNA are putting together a new history of humanity, one that runs in parallel with the narratives gleaned from fossils and written records. In Dr. Reich’s research, he and his colleagues have shed light on the peopling of the planet and the spread of agriculture, among other momentous events.

In a book to be published next week, “Who We Are and How We Got Here,” Dr. Reich, 43, explains how advances in DNA sequencing and analysis have helped this new field take off. [Continue reading…]

Evidence of complex cognitive abilities in humans more than 300,000 years ago

Gemma Tarlach writes:

Three papers, published together in Science today, add up to a paradigm-shoving conclusion: Key aspects of what we think of as modern human behavior evolved more than 300,000 years ago, a radical revision to the evolutionary timeline.

To understand the significance of the trio of studies, let’s take a brisk walk through recent changes in our understanding of human evolution. For decades, the consensus was that Homo sapiens evolved around 200,000 years ago in Africa, with anatomically modern humans emerging 100,000 years ago and leaving their ancestral continent around 50,000 years ago.

In the past few years, however, a series of surprising fossil finds, as well as advanced genetic analysis, have smashed that old paradigm like an angry Hulk.

The new picture of human evolution just emerging is that our species is more than 300,000 years old and left Africa much earlier than we thought, beginning at least 120,000 years ago and possibly as early as 200,000 years ago. Genetic analysis suggests our species may be even older, and was already interbreeding with Neanderthals, up to 470,000 years ago.

Fossilized bones, and the ancient DNA researchers are sometimes able to extract from them, can only tell us so much, however. Often missing from the picture is how the individuals lived. While the timeline for the physical evolution of our species gets pushed further and further back, Homo sapiens are defined as much by their modern human behavior as by the shape of their skull. Any early Homo sapiens running around might have looked a lot like us, the thinking went, but they didn’t act like us and weren’t capable of the same cognitive feats, such as symbolic thinking.

While there was no definitive date for the emergence of modern human behavior — after all, it was a process, not the flip of a switch — the nice, round number of 100,000 years ago has often been cited as a landmark. That’s about when some of the earliest evidence of modern human behavior, such as assembling ochre-based compounds to create paint kits, appears.

Enter today’s studies, which take a multifaceted look at the Kenyan site of Olorgesailie. Taken in sum, the papers present evidence that gradual climate change drove humans to adapt to a new environment and altered food supply; the adaptations appear to include far-flung social and trade networks, advanced tool technology and manipulation of ochre, a variety of high-iron rock used throughout the human story as pigment. [Continue reading…]


The ancient hunt in which the tracker’s skill united reason and imagination

“The San people of the Kalahari desert are the last tribe on Earth to use what some believe to be the most ancient hunting technique of all: the persistence hunt; they run down their prey,” says David Attenborough:

 

“The hunter pays tribute to his quarry’s courage and strength. With ceremonial gestures that ensure that its spirit returns to the desert sands from which it came. While it was alive, he lived and breathed with it and felt its every movement in his own body, and at the moment of its death, he shared its pain. He rubs its saliva into his own legs to relieve the agony of his own burning muscles, and he gives thanks for the life he has taken so that he may sustain the lives of his family waiting for him back in their settlement.”

Louis Liebenberg, author of The Art of Tracking: The Origin of Science, argues that the rational skills required by the ancient tracker provided the basis of scientific reasoning.

The first creative science, practiced by possibly some of the earliest members of Homo sapiens who had modern brains and intellects, may have been the tracking of game animals…

In easy tracking terrain, trackers may follow a trail simply by looking for one sign after the other, but in difficult terrain this can become so time-consuming that they may never catch up with their quarry. Instead of looking for one sign at a time, the trackers place themselves in the position of their quarry in order to anticipate the route it may have taken. They then decide in advance where they can expect to find signs, instead of wasting time looking for them. To reconstruct an animal’s activities, specific actions and movements must be seen in the context of the animal’s whole environment at specific times and places…

Since tracks may be partly obliterated or difficult to see, they may only exhibit partial evidence, so the reconstruction of these animals’ activities must be based on creative hypotheses. To interpret the footprints, trackers must use their imagination to visualize what the animal was doing to create such markings. Such a reconstruction will contain more information than is evident from the tracks, and will therefore be partly factual and partly hypothetical. As new factual information is gathered in the process of tracking, hypotheses may have to be revised or substituted by better ones. A hypothetical reconstruction of the animal’s behaviors may enable trackers to anticipate and predict the animal’s movements. These predictions provide ongoing testing of the hypotheses.

Perhaps the most significant feature of creative science is that a hypothesis may enable the scientist to predict novel facts that would not otherwise have been known.

Implicit in this interpretation of tracking is a view of science broader than its conventional placement within the sphere of human rationality. From this perspective, reason and imagination work hand in hand.

Thus, when the hunter hypothesizes about the movements of his quarry, he is also engaging in a wild leap of imagination: he becomes the quarry by entering its mind and seeing the world through its eyes.

From this vantage point, there is no conquest or victory in the hunt. Hunter and hunted are one, inseparable in life and death.

This way of knowing non-human life, through a creative identification in which animal “spirits” are experienced, seems to be universal among indigenous peoples, strongly suggesting it is something we have lost rather than advanced beyond. In a most fundamental way, it signals the degree to which our collective observational and empathic skills have withered as we have withdrawn from the natural world.


Across human history, there’s little evidence that large-scale social organization necessitates enduring inequality

David Graeber and David Wengrow write:

Stonehenge, it turns out, was only the latest in a very long sequence of ritual structures, erected in timber as well as stone, as people converged on the plain from remote corners of the British Isles, at significant times of year. Careful excavation has shown that many of these structures – now plausibly interpreted as monuments to the progenitors of powerful Neolithic dynasties – were dismantled just a few generations after their construction. Still more strikingly, this practice of erecting and dismantling grand monuments coincides with a period when the peoples of Britain, having adopted the Neolithic farming economy from continental Europe, appear to have turned their backs on at least one crucial aspect of it, abandoning cereal farming and reverting – around 3300 BC – to the collection of hazelnuts as a staple food source. Keeping their herds of cattle, on which they feasted seasonally at nearby Durrington Walls, the builders of Stonehenge seem likely to have been neither foragers nor farmers, but something in between. And if anything like a royal court did hold sway in the festive season, when they gathered in great numbers, then it could only have dissolved away for most of the year, when the same people scattered back out across the island.

Why are these seasonal variations important? Because they reveal that from the very beginning, human beings were self-consciously experimenting with different social possibilities. Anthropologists describe societies of this sort as possessing a ‘double morphology’. Marcel Mauss, writing in the early twentieth century, observed that the circumpolar Inuit, ‘and likewise many other societies . . . have two social structures, one in summer and one in winter, and that in parallel they have two systems of law and religion’. In the summer months, Inuit dispersed into small patriarchal bands in pursuit of freshwater fish, caribou, and reindeer, each under the authority of a single male elder. Property was possessively marked and patriarchs exercised coercive, sometimes even tyrannical power over their kin. But in the long winter months, when seals and walrus flocked to the Arctic shore, another social structure entirely took over as Inuit gathered together to build great meeting houses of wood, whale-rib, and stone. Within them, the virtues of equality, altruism, and collective life prevailed; wealth was shared; husbands and wives exchanged partners under the aegis of Sedna, the Goddess of the Seals.

Another example were the indigenous hunter-gatherers of Canada’s Northwest Coast, for whom winter – not summer – was the time when society crystallised into its most unequal form, and spectacularly so. Plank-built palaces sprang to life along the coastlines of British Columbia, with hereditary nobles holding court over commoners and slaves, and hosting the great banquets known as potlatch. Yet these aristocratic courts broke apart for the summer work of the fishing season, reverting to smaller clan formations, still ranked, but with an entirely different and less formal structure. In this case, people actually adopted different names in summer and winter, literally becoming someone else, depending on the time of year.

Perhaps most striking, in terms of political reversals, were the seasonal practices of 19th-century tribal confederacies on the American Great Plains – sometime, or one-time farmers who had adopted a nomadic hunting life. In the late summer, small and highly mobile bands of Cheyenne and Lakota would congregate in large settlements to make logistical preparations for the buffalo hunt. At this most sensitive time of year they appointed a police force that exercised full coercive powers, including the right to imprison, whip, or fine any offender who endangered the proceedings. Yet as the anthropologist Robert Lowie observed, this ‘unequivocal authoritarianism’ operated on a strictly seasonal and temporary basis, giving way to more ‘anarchic’ forms of organisation once the hunting season – and the collective rituals that followed – were complete.

Scholarship does not always advance. Sometimes it slides backwards. A hundred years ago, most anthropologists understood that those who live mainly from wild resources were not, normally, restricted to tiny ‘bands.’ That idea is really a product of the 1960s, when Kalahari Bushmen and Mbuti Pygmies became the preferred image of primordial humanity for TV audiences and researchers alike. As a result we’ve seen a return of evolutionary stages, really not all that different from the tradition of the Scottish Enlightenment: this is what Fukuyama, for instance, is drawing on, when he writes of society evolving steadily from ‘bands’ to ‘tribes’ to ‘chiefdoms,’ then finally, the kind of complex and stratified ‘states’ we live in today – usually defined by their monopoly of ‘the legitimate use of coercive force.’ By this logic, however, the Cheyenne or Lakota would have had to be ‘evolving’ from bands directly to states roughly every November, and then ‘devolving’ back again come spring. Most anthropologists now recognise that these categories are hopelessly inadequate, yet nobody has proposed an alternative way of thinking about world history in the broadest terms.

Quite independently, archaeological evidence suggests that in the highly seasonal environments of the last Ice Age, our remote ancestors were behaving in broadly similar ways: shifting back and forth between alternative social arrangements, permitting the rise of authoritarian structures during certain times of year, on the proviso that they could not last; on the understanding that no particular social order was ever fixed or immutable. Within the same population, one could live sometimes in what looks, from a distance, like a band, sometimes a tribe, and sometimes a society with many of the features we now identify with states. With such institutional flexibility comes the capacity to step outside the boundaries of any given social structure and reflect; to both make and unmake the political worlds we live in. If nothing else, this explains the ‘princes’ and ‘princesses’ of the last Ice Age, who appear to show up, in such magnificent isolation, like characters in some kind of fairy-tale or costume drama. Maybe they were almost literally so. If they reigned at all, then perhaps it was, like the kings and queens of Stonehenge, just for a season. [Continue reading…]

Are we smart enough to know how smart animals are?


Frans de Waal asks: are we smart enough to know how smart animals are?

Just as attitudes of superiority within segments of human culture are often expressions of ignorance, humans collectively — especially when subject to the dislocating effects of technological dependence — tend to underestimate the levels of awareness and cognitive skills of creatures who live mostly outside our sight. This tendency translates into presuppositions that need to be challenged by what de Waal calls his “cognitive ripple rule”:

Every cognitive capacity that we discover is going to be older and more widespread than initially thought.

In a review of de Waal’s book, Are We Smart Enough to Know How Smart Animals Are?, Ludwig Huber notes that there are a multitude of illustrations of the fact that brain size does not correlate with cognitive capacities.

Whereas we once thought of humans as having unique capabilities in learning and the use of tools, we now know these attributes place us in a set of species that also includes bees. Our prior assumptions about seemingly robotic behavior in such creatures turn out to have been expressions of our own anthropocentric prejudices.

Huber writes:

Various doctrines of human cognitive superiority are made plausible by a comparison of human beings and the chimpanzees. For questions of evolutionary cognition, this focus is one-sided. Consider the evolution of cooperation in social insects, such as the Matabele ant (Megaponera analis). After a termite attack, these ants provide medical services. Having called for help by means of a chemical signal, injured ants are brought back to the nest. Their increased chance of recovery benefits the entire colony. Red forest ants (Myrmica rubra) have the ability to perform simple arithmetic operations and to convey the results to other ants.

When it comes to adaptations in animals that require sophisticated neural control, evolution offers other spectacular examples. The banded archerfish (Toxotes jaculatrix) is able to spit a stream of water at its prey, compensating for refraction at the boundary between air and water. It can also track the distance of its prey, so that the jet develops its greatest force just before impact. Laboratory experiments show that the banded archerfish spits on target even when the trajectory of its prey varies. Spit hunting is a technique that requires the same timing used in throwing, an activity otherwise regarded as unique in the animal kingdom. In human beings, the development of throwing has led to an enormous further development of the brain. And the archerfish? The calculations required for its extraordinary hunting technique are based on the interplay of about six neurons. Neural mini-networks could therefore be much more widespread in the animal kingdom than previously thought.

Research on honeybees (Apis mellifera) has brought to light the cognitive capabilities of minibrains. Honeybees have no brains in the real sense. Their neuronal density, however, is among the highest in insects, with roughly 960 thousand neurons—far fewer than any vertebrate. Even if the brain size of honeybees is normalized to their body size, their relative brain size is lower than most vertebrates. Insect behavior should be less complex, less flexible, and less modifiable than vertebrate behavior. But honeybees learn quickly how to extract pollen and nectar from a large number of different flowers. They care for their young, organize the distribution of tasks, and, with the help of the waggle dance, they inform each other about the location and quality of distant food and water.

Early research by Karl von Frisch suggested that such abilities cannot be the result of inflexible information processing and rigid behavioral programs. Honeybees learn and they remember. The most recent experimental research has, in confirming this conclusion, created an astonishing picture of the honeybee’s cognitive competence. Their representation of the world does not consist entirely of associative chains. It is far more complex, flexible, and integrative. Honeybees show configural conditioning, biconditional discrimination, context-dependent learning and remembering, and even some forms of concept formation. Bees are able to classify images based on such abstract features as bilateral symmetry and radial symmetry; they can comprehend landscapes in a general way, and spontaneously come to classify new images. They have recently been promoted to the set of species capable of social learning and tool use.

In any case, the much smaller brain of the bee does not appear to be a fundamental limitation for comparable cognitive processes, or at least their performance. Jumping spiders and cephalopods are similarly instructive. The similarities between mammals and bees are astonishing, but they cannot be traced to homologous neurological developments. As long as the animal’s neural architecture remains unknown, we cannot determine the cause of their similarity. [Continue reading…]


The white men seemed like lightning from heaven but their shit smelled just like ours

 

Sean Flynn writes:

Long after missionaries and Europeans settled on the coast of New Guinea in the 19th century, the mountainous interior remained unexplored. As recently as the 1920s, outsiders believed the mountains, which run the length of the island from east to west, were too steep and rugged for anyone to live there. But when gold was discovered 40 miles inland, prospectors went north across the Coral Sea to seek their fortunes. Among them were three brothers from Queensland, Australia: Michael, James and Daniel Leahy, the children of Irish immigrants, who in the early 1930s hiked to the top of the ridges with a group of native porters and gun bois (or armed guards) from the coast.

In the highlands the Leahys found wide, fertile valleys, groomed with garden plots that were later estimated to feed a million inhabitants sorted into hundreds of tribes and clans. The highlanders lived in huts of timber and kunai grass, used stone tools and fought with wooden spears and arrows. Just as white settlers had been unaware of their existence, the highlanders had no idea that anyone lived beyond the mountains.

At first, they suspected the white men were spirits, or maybe lightning come to earth. More curious than afraid, they traded with the white men, sweet potatoes and pigs and women in exchange for steel axes and shells (plentiful on the coast, but rare and highly prized in the highlands). When the expedition encountered new tribes, Michael “Mick” Leahy, the oldest brother and acknowledged leader, would shoot a pig to demonstrate his superior firepower. If a tribal “big man” tried to rally his warriors into a raiding party, Mick and his gun bois would shoot a few of them, too.

The Leahys traipsed through the highlands until, in 1933, they struck a claim near what is now Mount Hagen. There they built an airstrip, with friendly locals stamping the dirt flat in endless sing-sings, and settled in to make a modest fortune dredging shiny rocks from the streams. In time, the Leahys became famous for “opening” the interior to the outside world, and Mount Hagen grew into one of the country’s largest cities.

Nearly 50 years later, Bob Connolly, then a young journalist in Sydney, met Robin Anderson at the Australian Broadcasting Corporation, where they both worked making documentaries for television. The two fell in love and began looking for independent projects. One evening, at dinner with a friend who happened to be working on an oral history of the colonization of New Guinea, they learned that Mick Leahy, besides being a prospector and explorer, was also an amateur photographer—and not only had he brought a still camera and a movie camera on his expeditions, but his films and photographs were rumored to have survived.

Robin tracked down one of Mick Leahy’s sons in the coastal Papua New Guinea town of Lae, who went into his attic and retrieved 11 canisters of film. What was more, Robin heard stories that there were people in the highlands who still remembered when the white men first came. After flying back to Sydney, she soon returned—this time with Bob, along with camera equipment, and the two spent months retracing the Leahys’ original route.

The resulting film, First Contact, was released in 1983. It was a remarkable achievement, combining Mick’s jerky black-and-white footage and photographs of his encounters with the highlanders with Bob and Robin’s interviews with native men and women who had been there. Intercut with those were lengthy sit-downs with the two surviving Leahy brothers, both by then very old men and long settled in the highlands. (Mick had died in 1978.) Bob and Robin produced the film in the format to which they were accustomed: 54 minutes, television length, shot by a hired crew, narrated by a professional voice actor. A reconstruction of an archetypal story like this one—among the last encounters between two different cultures who had no knowledge of the other—must usually be dredged from diaries and ships’ logs and other centuries-old accounts and recreated from fragments, from dust. In this case, though, there was no need to guess at how the highlanders and the white men saw each other, to puzzle it out from stray clues: they all looked into a camera and spoke for themselves. “We not only have Cortez on the Aztecs,” Bob told friends. “We have the Aztecs on Cortez!” [Continue reading…]


The first peoples in the Americas were not from Europe

Jennifer Raff writes:

Last month’s release of The Ice Bridge, an episode in the Canadian Broadcasting Corporation series The Nature of Things, has once again revived public discussion of a controversial idea about how the Americas were peopled, known as the “Solutrean hypothesis”. This idea suggests a European origin for the peoples who made the Clovis tools, the first recognized stone tool tradition in the Americas. As I was one of the experts appearing on the documentary, I want to share my thoughts about it and why I see the ideas portrayed within as unsettling, unwise, and scientifically implausible.

First, in addition to the scientific problems with the Solutrean hypothesis which I’ll discuss shortly, it’s important to note that it has overt political and cultural implications in denying that Native Americans are the only indigenous peoples of the continents. The notion that the ancestors of Native Americans were not the first or only people on the continent has great popularity among white nationalists, who see it as a means of denying Native Americans an ancestral claim on their land. Indeed, although this particular iteration is new, the idea behind the Solutrean hypothesis is part of a long tradition of Europeans trying to insert themselves into American prehistory; justifying colonialism by claiming that Native Americans were not capable of creating the diverse and sophisticated material culture of the Americas. Unfortunately, the producers of the documentary deliberately chose not to address this issue head-on, nor did they include any critical perspectives from indigenous peoples. While supporting the agenda of white nationalists was not the intent of the producers or of the scientists involved, it would have been appropriate for the documentary to take a stand against it, and I and many archaeologists are disappointed that they did not. [Continue reading…]
