Have we forgotten how to die?

In a review of seven books on death and dying, Julie-Marie Strange writes:

James Turner was twenty-five when his four-year-old daughter Annice died from a lung condition. She died at home with her parents and grandmother; her sleeping siblings were told of her death the next morning. James did everything to soothe Annice’s last days but, never having encountered death before, he didn’t immediately recognize it. He didn’t know what to do or expect and found it hard to discuss things with his wife Martha. The family received many condolences but kept the funeral private. Losing a child, often described as the hardest bereavement to bear, changed James Turner forever.

Death in the twenty-first century is typified by the paradox contained in this story. Although we greedily consume death at a distance through fiction, drama and the media, we are hamstrung by it up close and personal. In 1955, the commentator Geoffrey Gorer declared that death had become more pornographic than sex. It was, he said, the new taboo, and mourning had become “indecent”. Since then, matters have arguably got worse. The decline of institutional Christianity has left a spiritual and existential vacuum, while the rise of individual materialism has fragmented family networks and communities. Shared rites of passage that publicly validated grief have receded, and the space of death has moved increasingly from the home to the hospital.

Focusing on the US and, to a lesser extent, Northern Europe, Haider Warraich’s Modern Death: How medicine changed the end of life identifies how far-reaching these changes are. A physician and clinical researcher, Warraich is well placed to observe the dubious implications of an expanded medicalization of death. Most people want to die at home, but the majority continue to die in hospital, surrounded by medical equipment. In general, life expectancy in the past century has increased, but so has the use of medicine to prolong it artificially. Definitions of death have grown more complicated – does it lie in brain function or in the heart and lungs? – and are openly contested. And despite what Warraich calls medicine’s “obsession” with preventing or delaying death, there is no clear provision for bereaved families. That task waits to be taken up. Kathryn Mannix agrees in With the End in Mind: Dying, death and wisdom in an age of denial, suggesting that it “has become taboo to mention dying”. Through a “gradual transition”, Mannix says, we have lost the vocabulary for talking about death and depend instead on euphemism, lies and ambiguity; she wants us to “reclaim” a language of death.

This is a recurring theme among these seven books. For some, our inability to talk straight about death and dying is partly about the mystery of the end. Andrew Stark, in The Consolations of Mortality: Making sense of death, identifies the decline in religion in the West and the idea of the afterlife as pivotal to our lack of confidence in confronting death. Robert McCrum, in Every Third Thought: On life, death and the endgame, speculates that ageing and death present a particular conundrum to self-assured baby boomers, who try to give death the slip (“let’s talk about it another time . . .”). In From Here to Eternity: Travelling the world to find the good death, Caitlin Doughty expands the problem into a generic Western culture of death “avoidance” – we duck awkward conversations with the dying, hand our corpses to corporate professionals and, worst of all, treat grief with embarrassment and shame. Kevin Toolis, in My Father’s Wake: How the Irish teach us to live, love and die, describes a veritable “Western Death Machine”, in which public services, health professionals, the media and corporate bodies all conspire towards the removal of death and dying from the purview of ordinary people. A former war correspondent, Toolis has seen more than his fair share of death and is here to shake us out of our complacency. [Continue reading…]

How Lebanon transformed Anthony Bourdain

Kim Ghattas writes:

Growing up in Beirut during Lebanon’s 15-year civil war, I wished for someone like Anthony Bourdain to tell the story of my country: a place ripped apart by violence, yes, but also a country where people still drove through militia checkpoints just to gather for big Sunday family lunches, or dodged sniper fire to get to their favorite butcher across town to sample some fresh, raw liver for breakfast. Bourdain, the legendary roving chef and master storyteller who committed suicide on Friday in France at the age of 61, would have approved of such excursions in search of the perfect morsel—he probably would have come along.

Coming of age during conflict made me want to become a journalist. I hoped to tell the story of my country and the Middle East—a place rife with conflicts, sure, but also layered with complexities, a place of diverse peoples full of humanity. In the summer of 2006, I was the BBC’s Beirut correspondent when war erupted between Israel and Hezbollah, the pro-Iran Shia militant group. Hezbollah had kidnapped two Israeli soldiers, triggering the month-long conflict. Within a day, the Israelis had bombed Beirut’s airport out of action. I worked 34 days in a row, 20 hours a day, reporting live on television and radio, alongside dozens of colleagues who’d flown in to help cover the conflict.

I didn’t know it then, but Bourdain was there too, filming an episode of his show No Reservations. And perhaps he didn’t know it then, but Lebanon would change him forever. In the episode, he talked about how he had come to Beirut to make a happy show about food and culture in a city that was regaining its reputation as the party capital of the Middle East. Instead, he found himself filming a country that had tipped into war overnight. Filming on the day the violence broke out, he managed to capture that split second when people’s faces fell as they realized their lives had been upended. [Continue reading…]

How music can fight prejudice

Tom Jacobs writes:

The outpouring of hostility toward immigrants and refugees has reminded us that ethnocentrism remains a fact of life in both Europe and the United States. Combating it will require teaching a new generation to view members of different cultures as potential friends rather than threatening outsiders. But what mode of communication has the power to stimulate such a shift?

New research from Portugal suggests the answer may be music.

It reports that schoolchildren around age 11 who learned about the music and culture of a faraway land expressed warmer feelings toward immigrants from that country than those who did not. What’s more, those positive emotions were still evident three months after this exposure to the foreign culture.

“Music can inspire people to travel to other emotional worlds,” writes a research team led by psychologist Felix Neto of the University of Porto. Their work suggests songs can serve as an emotional bridge between cultures, revealing feelings that are common to both. [Continue reading…]

Human society is unprepared for the rise of artificial intelligence

Henry Kissinger writes:

The internet age in which we already live prefigures some of the questions and issues that AI will only make more acute. The Enlightenment sought to submit traditional verities to a liberated, analytic human reason. The internet’s purpose is to ratify knowledge through the accumulation and manipulation of ever-expanding data. Human cognition loses its personal character. Individuals turn into data, and data become regnant.

Users of the internet emphasize retrieving and manipulating information over contextualizing or conceptualizing its meaning. They rarely interrogate history or philosophy; as a rule, they demand information relevant to their immediate practical needs. In the process, search-engine algorithms acquire the capacity to predict the preferences of individual clients, enabling the algorithms to personalize results and make them available to other parties for political or commercial purposes. Truth becomes relative. Information threatens to overwhelm wisdom.

Inundated via social media with the opinions of multitudes, users are diverted from introspection; in truth many technophiles use the internet to avoid the solitude they dread. All of these pressures weaken the fortitude required to develop and sustain convictions that can be implemented only by traveling a lonely road, which is the essence of creativity.

The impact of internet technology on politics is particularly pronounced. The ability to target micro-groups has broken up the previous consensus on priorities by permitting a focus on specialized purposes or grievances. Political leaders, overwhelmed by niche pressures, are deprived of time to think or reflect on context, contracting the space available for them to develop vision.

The digital world’s emphasis on speed inhibits reflection; its incentive empowers the radical over the thoughtful; its values are shaped by subgroup consensus, not by introspection. For all its achievements, it runs the risk of turning on itself as its impositions overwhelm its conveniences. [Continue reading…]

Islam in Eastern Europe

Jacob Mikanowski writes:

There has never been an Eastern Europe without Islam. Eastern Europe owes its existence to the intermingling of languages, of cultures, and, perhaps above all, of faiths. It is the meeting place of the Catholic West and the Orthodox East, of Ashkenazi and Sephardic Jewry, of militant Islam and crusading Christianity, of Byzantine mystics and Sufi saints.

Once, this plurality would have been obvious. A visitor to Vilnius in the 17th century would have heard six languages spoken in the streets; they could have heard prayers conducted in at least five more. The city had churches belonging to five denominations, as well as a synagogue and a mosque. Some examples of “Lithuanian” mosques still exist in Poland and Belarus. Wooden and square, they look just like parish churches, with the minor exception of the ornament at the top: a slim silver crescent instead of a cross.

If anything marks Eastern Europe as a place of its own, and not someone else’s periphery, it is this function as gateway and bridge between and among different traditions. And yet, again and again, the role of Islam in the making of this tapestry has been forgotten or disavowed. That is a grave mistake. Islam is the silver thread holding the whole together. Thirty years ago, the historian Larry Wolff argued that Eastern Europe was a product of the Enlightenment. When Western (principally French) intellectuals began to fashion their countries as realms of progress and rationality, they created the “East” as a flattering foil for their ambitions, filled as it was (in their eyes at least) with backwardness and superstition.

It seems to me that Wolff is only partially right. I think a notion of a separate Eastern Europe predates the Enlightenment by a few hundred years. I think, moreover, that its genesis is intimately tied to the introduction of Islam to the Balkans and southern steppes and, with it, the creation of a shatter-zone between empires stretching from the Adriatic to the Black Sea. This shatter-zone consisted of a sharp border and a soft frontier. Armies and lone warriors fought along the border. People, stories, and miracles crossed the frontier. So many of the legends that came to define the nations of the region stem from this space of contact. And everywhere you look, relationships that appear at first to be based on enmity turn out instead to be characterized by mutual influence, mimicry, friendship, and even love. [Continue reading…]

How a Eurasian steppe empire coped with decades of drought

By Diana Crow

The bitterly cold, dry air of the Central Asian steppe is a boon to researchers who study the region. The frigid climate “freeze-dries” everything, including centuries-old trees that once grew on lava flows in Mongolia’s Orkhon Valley. A recent study of the tree-ring record from some of these archaic logs, published in March, reveals a drought that lasted nearly seven decades—one of the longest in a 1,700-year span of steppe history—from A.D. 783 to 850.

Decades of prolonged drought would have killed much of the grass that the Orkhon Valley’s domesticated horses relied upon. Yet the dominant steppe civilization of the era, an empire of Turkic horse nomads called the Uyghurs, somehow survived nearly 60 years of the drought (until the empire’s collapse around A.D. 840), a period about seven times longer than the Dust Bowl that devastated the central U.S. in the 1930s.

Based on surviving Chinese and Uyghur documents from the drought years, the study’s authors concluded that the Uyghurs survived by diversifying their economy and using international diplomacy to boost trade.

Rather than driving the Uyghurs to plunder neighboring territories—as other steppe empires tended to do—the drought led them to take advantage of their location on the Silk Road and reinvent their economy. The Uyghurs’ relatively peaceful strategies seem to have staved off total collapse for a surprisingly long time. “They were champs,” says physical geographer and study co-author Amy Hessl of West Virginia University.

Prior to this paper, no one knew that the Uyghurs faced an “epic drought,” Hessl says. The recognition that they did may change the way historians interpret the social, political, and economic strategies of the Uyghurs.

[Read more…]

The Dreamtime, science and narratives of Indigenous Australia

Lake Mungo and the surrounding Willandra Lakes of NSW were established around 150,000 years ago. (Image: www.shutterstock.com)

David Lambert, Griffith University

This article is an extract from the essay “Owning the science: the power of partnerships”, published in First Things First, the 60th edition of Griffith Review.

We’re publishing it as part of our occasional series Zoom Out, where authors explore key ideas in science and technology in the broader context of society and humanity.

Scientific and Indigenous knowledge systems have often been in conflict. In my view, too much is made of these conflicts; the two systems have a lot in common.

For example, Indigenous knowledge typically takes the form of a narrative, usually a spoken story about how the world came to be. In a similar way, evolutionary theories, which aim to explain why particular characters are adapted to certain functions, also take the form of narratives. Both narratives are mostly focused on “origins”.

From a strictly genetic perspective, progress on origins research in Australia has been particularly slow. Early ancient DNA studies were focused on remains from permafrost conditions in Antarctica and cool temperate environments such as northern Europe, including Greenland.

But Australia is very different. Here, human remains are very old, and many are recovered from very hot environments.

While ancient DNA studies have played an important role in informing our understanding of the evolution of our species worldwide, little is known about the levels of ancient genomic variation in Australia’s First Peoples – although some progress has been made in recent years. This includes the landmark recovery of genomic sequences from both contemporary Aboriginal Australians and ancient remains.

[Read more…]

The crisis in modern masculinity

Pankaj Mishra writes:

Many straight white men feel besieged by “uppity” Chinese and Indian people, by Muslims and feminists, not to mention gay bodybuilders, butch women and trans people. Not surprisingly they are susceptible to [Jordan] Peterson’s notion that the ostensible destruction of “the traditional household division of labour” has led to “chaos”. This fear and insecurity of a male minority has spiralled into a politics of hysteria in the two dominant imperial powers of the modern era. In Britain, the aloof and stiff upper-lipped English gentleman, that epitome of controlled imperial power, has given way to such verbally incontinent Brexiters as Boris Johnson. The rightwing journalist Douglas Murray, among many elegists of English manhood, deplores “emasculated Italians, Europeans and westerners in general” and esteems Trump for “reminding the west of what is great about ourselves”. And, indeed, whether threatening North Korea with nuclear incineration, belittling people with disabilities or groping women, the American president confirms that some winners of modern history will do anything to shore up their sense of entitlement.

But gaudy displays of brute manliness in the west, and frenzied loathing of what the alt-rightists call “cucks” and “cultural Marxists”, are not merely a reaction to insolent former weaklings. Such manic assertions of hyper-masculinity have recurred in modern history. They have also profoundly shaped politics and culture in Asia, Africa and Latin America. Osama bin Laden believed that Muslims “have been deprived of their manhood” and could recover it by obliterating the phallic symbols of American power. Beheading and raping innocent captives in the name of the caliphate, the black-hooded young volunteers of Islamic State were as obviously a case of psychotic masculinity as the Norwegian mass-murderer Anders Behring Breivik, who claimed Viking warriors as his ancestors. Last month, the Philippines president Rodrigo Duterte told female rebels in his country that “We will not kill you. We will just shoot you in the vagina.” Tormenting hapless minorities, India’s Hindu supremacist chieftains seem obsessed with proving, as one asserted after India’s nuclear tests in 1998, “we are not eunuchs any more”.

Morbid visions of castration and emasculation, civilisational decline and decay, connect Godse and Schlesinger to Bin Laden and Trump, and many other exponents of a rear-guard machismo today. They are susceptible to cliched metaphors of “soft” and “passive” femininity, “hard” and “active” masculinity; they are nostalgic for a time when men did not have to think twice about being men. And whether Hindu chauvinist, radical Islamist or white nationalist, their self-image depends on despising and excluding women. It is as though the fantasy of male strength measures itself most gratifyingly against the fantasy of female weakness. Equating women with impotence and seized by panic about becoming cucks, these rancorously angry men are symptoms of an endemic and seemingly unresolvable crisis of masculinity.

When did this crisis begin? And why does it seem so inescapably global? Writing Age of Anger: A History of the Present, I began to think that a perpetual crisis stalks the modern world. It began in the 19th century, with the most radical shift in human history: the replacement of agrarian and rural societies by a volatile socio-economic order, which, defined by industrial capitalism, came to be rigidly organised through new sexual and racial divisions of labour. And the crisis seems universal today because a web of restrictive gender norms, spun in modernising western Europe and America, has come to cover the remotest corners of the earth as they undergo their own socio-economic revolutions. [Continue reading…]

Neanderthals developed art earlier than modern humans

Carl Zimmer writes:

The two new studies don’t just indicate that Neanderthals could make cave art and jewelry. They also establish that Neanderthals were making these things long before modern humans — a blow to the idea that they simply copied their cousins.

The earliest known cave paintings made by modern humans are only about 40,000 years old, while Neanderthal cave art is at least 24,000 years older, putting it at 64,000 years or more. The oldest known shell jewelry made by modern humans is about 70,000 years old, but Neanderthals were making it 45,000 years before then, roughly 115,000 years ago.

“These results imply that Neanderthals were not apart from these developments,” said João Zilhão, an archaeologist who co-authored both studies. “For all practical purposes, they were modern humans, too.”

The new studies raise another intriguing possibility, said Clive Finlayson, director of the Gibraltar Museum: that the capacity for symbolic thought was already present 600,000 years ago in the ancestors of both Neanderthals and modern humans. [Continue reading…]

New paper links ancient drawings and the origins of language

Peter Dizikes, Massachusetts Institute of Technology:

When and where did humans develop language? To find out, look deep inside caves, suggests an MIT professor.

More precisely, some specific features of cave art may provide clues about how our symbolic, multifaceted language capabilities evolved, according to a new paper co-authored by MIT linguist Shigeru Miyagawa.

A key to this idea is that cave art is often located in acoustic “hot spots,” where sound echoes strongly, as some scholars have observed. Those drawings are located in deeper, harder-to-access parts of caves, indicating that acoustics was a principal reason for the placement of drawings within caves. The drawings, in turn, may represent the sounds that early humans generated in those spots.

In the new paper, this convergence of sound and drawing is what the authors call a “cross-modality information transfer,” a convergence of auditory information and visual art that, the authors write, “allowed early humans to enhance their ability to convey symbolic thinking.” The combination of sounds and images is one of the things that characterizes human language today, along with its symbolic aspect and its ability to generate infinite new sentences.

“Cave art was part of the package deal in terms of how Homo sapiens came to have this very high-level cognitive processing,” says Miyagawa, a professor of linguistics and the Kochi-Manjiro Professor of Japanese Language and Culture at MIT. “You have this very concrete cognitive process that converts an acoustic signal into some mental representation and externalizes it as a visual.”

Cave artists were thus not just early-day Monets, drawing impressions of the outdoors at their leisure. Rather, they may have been engaged in a process of communication.

“I think it’s very clear that these artists were talking to one another,” Miyagawa says. “It’s a communal effort.”

The paper, “Cross-modality information transfer: A hypothesis about the relationship among prehistoric cave paintings, symbolic thinking, and the emergence of language,” is being published in the journal Frontiers in Psychology. The authors are Miyagawa; Cora Lesure, a Ph.D. student in MIT’s Department of Linguistics; and Vitor A. Nobrega, a Ph.D. student in linguistics at the University of São Paulo, in Brazil. [Continue reading…]