Do you see what I see?

Nicola Jones writes:

In a Candoshi village in the heart of Peru, anthropologist Alexandre Surrallés puts a small colored chip on a table and asks, “Ini tamaara?” (“How is it?” or “What is it like?”). What Surrallés would like to ask is, “What color is this?” But the Candoshi, a tribe of some 3,000 people living on the upper banks of the Amazon River, don’t have a word for the concept of color. Nor are their answers to the question he does ask familiar to most Westerners. In this instance, a lively discussion erupts between two Candoshi about whether the chip, which Surrallés would call amber or yellow-orange, looks more like ginger or fish spawn.

This moment in July 2014 was just one among many similar experiences Surrallés had during a total of three years living among the Candoshi since 1991. His fieldwork led Surrallés to the startling conclusion that these people simply don’t have color words: reliable descriptors for the basic colors in the world around them. Candoshi children don’t learn the colors of the rainbow because their community doesn’t have words for them.

Though his finding might sound remarkable, Surrallés, who is with the National Center for Scientific Research in Paris, isn’t the first to propose that this cultural phenomenon exists. Anthropologists in various corners of the world have reported on other small tribes who also don’t seem to have a staple vocabulary for color. Yet these conclusions fly in the face of those found in the most influential book on the topic: The World Color Survey, published in 2009, which has at its very heart the hypothesis that every culture has basic color words for at least part of the rainbow.

The debate sits at the center of an ongoing war in the world of color research. On the one side stand “universalists,” including the authors of The World Color Survey and their colleagues, who believe in a conformity of human perceptual experience: that all people see and name colors in a somewhat consistent way. On the other side are “relativists,” who believe in a spectrum of experience and who are often offended by the very notion that a Westerner’s sense of color might be imposed on the interpretation of other cultures and languages. Many researchers, like Surrallés, say they stand in the middle: While there are some universals in human perception, Surrallés argues, color terms don’t seem to be among them. [Continue reading…]

How culture works with evolution to produce human cognition

Cecilia Heyes writes:

The conventional view, inside and outside academia, is that children are ‘wired’ to imitate. We are ‘Homo imitans’, animals born with a burning desire to copy the actions of others. Imitation is ‘in our genes’. Birds build nests, cats miaow, pigs are greedy, while humans possess an instinct to imitate.

The idea that humans have cognitive instincts is a cornerstone of evolutionary psychology, pioneered by Leda Cosmides, John Tooby and Steven Pinker in the 1990s. ‘[O]ur modern skulls house a Stone Age mind,’ wrote Cosmides and Tooby in 1997. On this view, the cognitive processes or ‘organs of thought’ with which we tackle contemporary life have been shaped by genetic evolution to meet the needs of small, nomadic bands of people – people who devoted most of their energy to digging up plants and hunting animals. It’s unsurprising, then, that today our Stone Age instincts often deliver clumsy or distasteful solutions, but there’s not a whole lot we can do about it. We’re simply in thrall to our thinking genes.

This all seems plausible and intuitive, doesn’t it? The trouble is, the evidence behind it is dubious. In fact, if we look closely, it’s apparent that evolutionary psychology is due for an overhaul. Rather than hard-wired cognitive instincts, our heads are much more likely to be populated by cognitive gadgets, tinkered and toyed with over successive generations. Culture is responsible not just for the grist of the mind – what we do and make – but for fabricating its mills, the very way the mind works. [Continue reading…]

Global WEIRDing is a trend we can’t ignore

By Kensy Cooperrider

For centuries, Inuit hunters navigated the Arctic by consulting wind, snow and sky. Now they use GPS. Speakers of the aboriginal language Gurindji, in northern Australia, used to command 28 variants of each cardinal direction. Children there now use the four basic terms, and they don’t use them very well. In the arid heights of the Andes, the Aymara developed an unusual way of understanding time, imagining the past as in front of them, and the future at their backs. But for the youngest generation of Aymara speakers – increasingly influenced by Spanish – the future lies ahead.

These are not just isolated changes. On all continents, even in the world’s remotest regions, indigenous people are swapping their distinctive ways of parsing the world for Western, globalised ones. As a result, human cognitive diversity is dwindling – and, sadly, those of us who study the mind had only just begun to appreciate it. 

In 2010, a paper titled ‘The Weirdest People in the World?’ gave the field of cognitive science a seismic shock. Its authors, led by the psychologist Joe Henrich at the University of British Columbia, made two fundamental points. The first was that researchers in the behavioural sciences had almost exclusively focused on a small sliver of humanity: people from Western, educated, industrialised, rich, democratic societies. The second was that this sliver is not representative of the larger whole: people in London, Buenos Aires and Seattle were, in an acronym, WEIRD.

But there is a third fundamental point, and it was the psychologist Paul Rozin at the University of Pennsylvania who made it. In his commentary on the 2010 article, Rozin noted that this same WEIRD slice of humanity was ‘a harbinger of the future of the world’. He had seen this trend in his own research. Where he found cross-cultural differences, they were more pronounced in older generations. The world’s young people, in other words, are converging. The signs are unmistakable: the age of global WEIRDing is upon us.

[Read more…]

The empathetic humanities have much to teach our adversarial culture

By Alexander Bevilacqua, Aeon, January 15, 2019

As anyone on Twitter knows, public culture can be quick to attack, castigate and condemn. In search of the moral high ground, we rarely grant each other the benefit of the doubt. In her Class Day remarks at Harvard’s 2018 graduation, the Nigerian novelist Chimamanda Ngozi Adichie addressed the problem of this rush to judgment. In the face of what she called ‘a culture of “calling out”, a culture of outrage’, she asked students to ‘always remember context, and never disregard intent’. She could have been speaking as a historian.

History, as a discipline, turns away from two of the main ways of reading that have dominated the humanities for the past half-century. These methods have been productive, but perhaps they also bear some responsibility for today’s corrosive lack of generosity. The two approaches have different genealogies, but share a significant feature: at heart, they are adversarial.

One mode of reading, first described in 1965 by the French philosopher Paul Ricœur and known as ‘the hermeneutics of suspicion’, aims to uncover the hidden meaning or agenda of a text. Whether inspired by Karl Marx, Friedrich Nietzsche or Sigmund Freud, the reader interprets what happens on the surface as a symptom of something deeper and more dubious, from economic inequality to sexual anxiety. The reader’s task is to reject the face value of a work, and to plumb for a submerged truth.

A second form of interpretation, known as ‘deconstruction’, was developed in 1967 by the French philosopher Jacques Derrida. It aims to identify and reveal a text’s hidden contradictions – ambiguities and even aporias (unthinkable contradictions) that eluded the author. For example, Derrida detected a bias that favoured speech over writing in many influential philosophical texts of the Western tradition, from Plato to Jean-Jacques Rousseau. The fact that written texts could privilege the immediacy and truth of speech was a paradox that revealed unarticulated metaphysical commitments at the heart of Western philosophy.

Both of these ways of reading pit reader against text. The reader’s goal becomes to uncover meanings or problems that the work does not explicitly express. In both cases, intelligence and moral probity are displayed at the expense of what’s been written. In the 20th century, these approaches empowered critics to detect and denounce the workings of power in all kinds of materials – not just the dreams that Freud interpreted, or the essays by Plato and Rousseau with which Derrida was most closely concerned.

They do, however, foster a prosecutorial attitude among academics and public intellectuals. As a colleague once told me: ‘I am always looking for the Freudian slip.’ He scours the writings of his peers to spot when they trip up and betray their problematic intellectual commitments. One poorly chosen phrase can sully an entire work.

[Read more…]

The hidden resilience of ‘food desert’ neighborhoods

Barry Yeoman writes:

Even before Ashanté Reese and I reach the front gate, retired schoolteacher Alice Chandler is standing in the doorway of her brick home in Washington, D.C. She welcomes Reese, an anthropologist whom she has known for six years, with a hug and apologizes for having nothing to feed us during this spontaneous visit.

Chandler, 69 years old, is a rara avis among Americans: an adult who has lived nearly her entire life in the same house. This fact makes her stories particularly valuable to Reese, who has been studying the changing food landscape in Deanwood, a historically black neighborhood across the Anacostia River from most of the city.

When Chandler was growing up, horse-drawn wagons delivered meat, fish, and vegetables to her doorstep. The neighborhood had a milkman, as did many U.S. communities in the mid-20th century. Her mother grew vegetables in a backyard garden and made wine from the fruit of their peach tree.

Food was shared across fence lines. “Your neighbor may have tomatoes and squash in their garden,” Chandler says. “And you may have cucumbers in yours. Depending on how bountiful each one was, they would trade off.” Likewise, when people went fishing, “they would bring back enough for friends in the neighborhood. That often meant a Saturday evening fish fry at home.”

Around the corner was the Spic N Span Market, a grocery with penny candy, display cases of fresh chicken and pork chops, and an old dog who slept in the back. The owner, whom Chandler knew as “Mr. Eddie,” was a Jewish man who hired African-American cashiers and extended credit to customers short on cash. Next door was a small farm whose owner used to give fresh eggs to Chandler’s mother.

Chandler was born into this architecturally eclectic neighborhood. On the basis of oral histories found in archives, Reese mapped 11 different groceries that were open in Deanwood during its peak years, the 1930s and ’40s. African-Americans owned five. Jews, excluded by restrictive covenants from living in some other D.C. neighborhoods, owned six. For much of the mid-20th century, there was also a Safeway store.

Today there are exactly zero grocery stores. The only places for Deanwood’s 5,000 residents to buy food in their neighborhood are corner stores, abundantly stocked with beer and Beefaroni but nearly devoid of fruit, vegetables, and meat. At one of those stores, which I visited, a “Healthy Corners” sign promised fresh produce. Instead, I found two nearly empty wooden shelves sporting a few sad-looking onions, bananas, apples, and potatoes. The nearest supermarket, a Safeway, is a hilly 30-minute walk away. A city council member who visited last year found long lines, moldy strawberries, and meat that appeared to have spoiled.

The common name for neighborhoods like these is “food deserts,” which the U.S. Department of Agriculture defines as areas “where people have limited access to a variety of healthy and affordable food.” According to the USDA, food deserts tend to offer sugary, fatty foods; the department also says that poor access to fruits, vegetables, and lean meats could lead to obesity and diabetes. A map produced by the nonpartisan D.C. Policy Center puts about half of Deanwood into a desert.

But Reese, an assistant professor of anthropology at Spelman College in Atlanta, Georgia, has joined a number of scholars who are pushing back against the food desert model. She calls it a “lazy” shorthand to describe both a series of corporate decisions and a complex human ecosystem. [Continue reading…]

The steward of Middle-earth

Hannah Long writes:

Around the time Christopher [Tolkien] was commissioned an officer in the RAF in 1945, [J.R.R.] Tolkien was calling his son “my chief critic and collaborator.” Christopher would return from flying missions to pore over another chapter of his father’s work. He also joined the informal literary club known as the Inklings. At 21, he was the youngest—and is now the last surviving—member. The band of friends—J.R.R. Tolkien, C.S. Lewis, Owen Barfield, Hugo Dyson, and Charles Williams, among others—would meet at Oxford’s Eagle and Child pub or Lewis’s rooms in Magdalen College to chat about literature and philosophy and to read aloud portions of works in progress.

Christopher was recruited to narrate his father’s stories. The group considered his clear, rich voice a marked improvement over his father’s dithering, mumbling delivery. Lewis had recognized the brilliance of J.R.R. Tolkien’s work from the first moment he encountered it, and for years remained Tolkien’s only audience. Dyson, not so appreciative, exclaimed during one reading, “Oh, not another f—ing elf!”

Poet and scholar Malcolm Guite argues that the Inklings, despite their profound differences (Tolkien was an English Roman Catholic, Lewis an Ulster Protestant, Williams a hermetic mystic), refined and supported each other in their common literary mission.

“They’re not often noticed by literary historians because . . . in terms of English literature, the self-defining mainstream of 20th-century literature supposedly was high modernism, shaped by Joyce and Eliot,” Guite said in a 2011 lecture. But “there was actually . . . something quite radical going on in that group. Together, they were able to form a profoundly alternative and countercultural vision.” Guite emphasizes, in particular, the Inklings’ shared desire to respond to the materialist, largely atheistic cohort whose voices dominated the world of letters.

Although the Inklings are often accused of escapism, nearly all culture was engaged in a sort of dissociation because of the carnage and devastation of the First World War. Tolkien scholar Verlyn Flieger writes that Tolkien was “a traveler between worlds,” from his Edwardian youth to his postbellum disillusionment. It was this “oscillation that, paradoxically, makes him a modern writer, for . . . the temporal dislocation of his ‘escape’ mirrored the psychological disjunction and displacement of his century.”

High modernism found that escape in science, creating a stark divide between the material and the spiritual. This technical, technological, atomizing approach turns up in The Lord of the Rings with the villainous wizard Saruman, whose materialist philosophy dismisses the transcendent. Early in the book, Saruman changes his robe from white to multicolored. He explains, “White cloth may be dyed. The white page can be overwritten; and the white light can be broken.”

“In which case it is no longer white,” Gandalf replies. “And he that breaks a thing to find out what it is has left the path of wisdom.”

Saruman ignores that his dissection of color has eliminated something greater than the sum of its parts; he has lost sight of the transcendent white light. For the Inklings, the medium of fantasy restored—or rather revealed—the enchantment of a disenchanted world. It reinstated an understanding of the transcendent that had been lost in postwar alienation. [Continue reading…]

China has placed hundreds of thousands of Muslims in cultural extermination camps

The New York Times reports:

On the edge of a desert in far western China, an imposing building sits behind a fence topped with barbed wire. Large red characters on the facade urge people to learn Chinese, study law and acquire job skills. Guards make clear that visitors are not welcome.

Inside, hundreds of ethnic Uighur Muslims spend their days in a high-pressure indoctrination program, where they are forced to listen to lectures, sing hymns praising the Chinese Communist Party and write “self-criticism” essays, according to detainees who have been released.

The goal is to remove any devotion to Islam.

Abdusalam Muhemet, 41, said the police detained him for reciting a verse of the Quran at a funeral. After two months in a nearby camp, he and more than 30 others were ordered to renounce their past lives. Mr. Muhemet said he went along but quietly seethed.

“That was not a place for getting rid of extremism,” he recalled. “That was a place that will breed vengeful feelings and erase Uighur identity.”

This camp outside Hotan, an ancient oasis town in the Taklamakan Desert, is one of hundreds that China has built in the past few years. It is part of a campaign of breathtaking scale and ferocity that has swept up hundreds of thousands of Chinese Muslims for weeks or months of what critics describe as brainwashing, usually without criminal charges.

Though limited to China’s western region of Xinjiang, it is the country’s most sweeping internment program since the Mao era — and the focus of a growing chorus of international criticism. [Continue reading…]

Remembering 1968

Jackson Lears writes:

The religious dimension of American radicalism was what separated it from the student uprisings in Paris and other European cities during the spring of 1968. American radicals lacked the anticlerical animus of Europeans; priests, rabbis, and ministers enlisted in the front ranks of the civil rights and antiwar movements. King’s decision to bear witness against the war was central to legitimating resistance to it, while provoking government counterattacks as well as denunciations from both liberals and conservatives.

“Religion” may be too solemn a word for many 1960s radicals, but it helps to capture the depth of their motives: above all their longing for a more direct, authentic experience of the world than the one on offer in midcentury American society. What made radicals mad, what drove their deepest animus against the war, was their sense that it was a product of the same corporate technostructure—as John Kenneth Galbraith called it in The New Industrial State (1967)—that reduced everyday life to a hamster cage of earning and spending. The tribunes of the technostructure were men like Robert McNamara, who shuttled from the Ford Motor Company to the Defense Department to the World Bank, and who seemed to know everything about managerial techniques but nothing about their ultimate purpose, if indeed there was one. Elite managers were the high priests of an orthodoxy with a blankness, a vacancy, at its center.

The fundamental expression of this vacuity was the war machine that multiplied corpses in Vietnam and nuclear weapons throughout the world. King acknowledged the connection between managerialism and militarism at Arlington Cemetery in February 1968, when he said, “Somewhere along the way we have allowed the means by which we live to outdistance the ends for which we live.” A society of means without ends was a society without a soul.

Antiwar radicals, recoiling from soullessness, challenged the church of technocratic rationality. Taking this challenge seriously, recovering the mood of an extended moment, requires beginning earlier and ending later than 1968. [Continue reading…]

Have we forgotten how to die?

In a review of seven books on death and dying, Julie-Marie Strange writes:

James Turner was twenty-five when his four-year-old daughter Annice died from a lung condition. She died at home with her parents and grandmother; her sleeping siblings were told of her death the next morning. James did everything to soothe Annice’s last days but, never having encountered death before, he didn’t immediately recognize it. He didn’t know what to do or expect and found it hard to discuss things with his wife Martha. The family received many condolences but kept the funeral private. Losing a child, often described as the hardest bereavement to bear, changed James Turner forever.

Death in the twenty-first century is typified by the paradox contained in this story. Although we greedily consume death at a distance through fiction, drama and the media, we are hamstrung by it up close and personal. In 1955 the commentator Geoffrey Gorer declared that death had become more pornographic than sex. It was, he said, the new taboo and mourning had become “indecent”. Since then, matters have arguably got worse. The decline in institutional Christianity left a spiritual and existential vacuum, while the rise in individual materialism has fragmented family networks and communities. Shared rites of passage that publicly validated grief have receded, and the space of death has moved increasingly from the home to the hospital.

Focusing on the US and, to a lesser extent, Northern Europe, Haider Warraich’s Modern Death: How medicine changed the end of life identifies how far-reaching these changes are. A physician and clinical researcher, Warraich is well placed to observe the dubious implications of an expanded medicalization of death. Most people want to die at home, but the majority continue to die in hospital, surrounded by medical equipment. In general, life expectancy in the past century has increased, but so has the use of medicine to prolong it artificially. Definitions of death have grown more complicated – does it lie in brain function or in the heart and lungs? – and are openly contested. And despite what Warraich calls medicine’s “obsession” with preventing or delaying death, there is no clear provision for bereaved families. That task waits to be taken up. Kathryn Mannix agrees in With the End in Mind: Dying, death and wisdom in an age of denial, suggesting that it “has become taboo to mention dying”. Through a “gradual transition”, Mannix says, we have lost the vocabulary for talking about death and depend instead on euphemism, lies and ambiguity; she wants us to “reclaim” a language of death.

This is a recurring theme among these seven books. For some, our inability to talk straight about death and dying is partly about the mystery of the end. Andrew Stark, in The Consolations of Mortality: Making sense of death, identifies the decline in religion in the West and the idea of the afterlife as pivotal to our lack of confidence in confronting death. Robert McCrum, in Every Third Thought: On life, death and the endgame, speculates that ageing and death present a particular conundrum to self-assured baby boomers, who try to give death the slip (“let’s talk about it another time . . .”). In From Here to Eternity: Travelling the world to find the good death, Caitlin Doughty expands the problem into a generic Western culture of death “avoidance” – we duck awkward conversations with the dying, hand our corpses to corporate professionals and, worst of all, treat grief with embarrassment and shame. Kevin Toolis, in My Father’s Wake: How the Irish teach us to live, love and die, describes a veritable “Western Death Machine”, in which public services, health professionals, the media and corporate bodies all conspire towards the removal of death and dying from the purview of ordinary people. A former war correspondent, Toolis has seen more than his fair share of death and is here to shake us out of our complacency. [Continue reading…]

How Lebanon transformed Anthony Bourdain

Kim Ghattas writes:

Growing up in Beirut during Lebanon’s 15-year civil war, I wished for someone like Anthony Bourdain to tell the story of my country: a place ripped apart by violence, yes, but also a country where people still drove through militia checkpoints just to gather for big Sunday family lunches, or dodged sniper fire to get to their favorite butcher across town to sample some fresh, raw liver for breakfast. Bourdain, the legendary roving chef and master storyteller who committed suicide on Friday in France at the age of 61, would have approved of such excursions in search of the perfect morsel—he probably would have come along.

Coming of age during conflict made me want to become a journalist. I hoped to tell the story of my country and the Middle East—a place rife with conflicts, sure, but also layered with complexities, a place of diverse peoples full of humanity. In the summer of 2006, I was the BBC’s Beirut correspondent when war erupted between Israel and Hezbollah, the pro-Iran Shia militant group. Hezbollah had kidnapped two Israeli soldiers, triggering the month-long conflict. Within a day, the Israelis had bombed Beirut’s airport out of action. I worked 34 days in a row, 20 hours a day, reporting live on television and radio, alongside dozens of colleagues who’d flown in to help cover the conflict.

I didn’t know it then, but Bourdain was there too, filming an episode of his show No Reservations. And perhaps he didn’t know it then, but Lebanon would change him forever. In the episode, he talked about how he had come to Beirut to make a happy show about food and culture in a city that was regaining its reputation as the party capital of the Middle East. Instead, he found himself filming a country that had tipped into war overnight. Filming on the day the violence broke out, he managed to capture that split second when people’s faces fell as they realized their lives had been upended. [Continue reading…]