The myth of the eight-hour sleep

Stephanie Hegarty writes:

We often worry about lying awake in the middle of the night – but it could be good for you. A growing body of evidence from both science and history suggests that the eight-hour sleep may be unnatural.

In the early 1990s, psychiatrist Thomas Wehr conducted an experiment in which a group of people were plunged into darkness for 14 hours every day for a month.

It took some time for their sleep to regulate but by the fourth week the subjects had settled into a very distinct sleeping pattern. They slept first for four hours, then woke for one or two hours before falling into a second four-hour sleep.

Though sleep scientists were impressed by the study, among the general public the idea that we must sleep for eight consecutive hours persists.

In 2001, historian Roger Ekirch of Virginia Tech published a seminal paper, drawn from 16 years of research, revealing a wealth of historical evidence that humans used to sleep in two distinct chunks. [Continue reading…]

Could a lack of humility be at the root of what ails America?

[Image: What happens when everyone thinks they’re smarter than everyone else? Ljupco Smokovski/Shutterstock.com]

By Frank T. McAndrew, Knox College

There are a lot of reasons behind the political polarization of the country and the deterioration of civic discourse.

I wonder if a lack of humility is one of them.

In his recent book, “The Death of Expertise,” national security expert Tom Nichols described a type of person each of us probably knows:

“They are young and old, rich and poor, some with education, others armed only with a laptop or a library card. But they all have one thing in common: They are ordinary people who believe they are actually troves of knowledge. [They are] convinced they are more informed than the experts, more broadly knowledgeable than the professors, and more insightful than the gullible masses…”

Interestingly, intellectual humility has become a hot topic in the field of personality psychology. In recent years, a spate of studies has emerged highlighting the important role it plays in our knowledge, relationships and worldview.

So what happens when everyone thinks they’re smarter than everyone else?

[Read more…]

Muslims lived in America before Protestantism even existed

Sam Haselby writes:

Muslims came to America more than a century before the Puritans founded the Massachusetts Bay Colony in 1630. Muslims were living in America not only before Protestants, but before Protestantism existed. After Catholicism, Islam was the second monotheistic religion in the Americas.

The popular misunderstanding, even among educated people, that Islam and Muslims are recent additions to America tells us important things about how American history has been written. In particular, it reveals how historians have justified and celebrated the emergence of the modern nation-state. One way to valorise the United States of America has been to minimise the heterogeneity and scale – the cosmopolitanism, diversity and mutual co-existence of peoples – in America during the first 300 years of European presence.

The writing of American history has also been dominated by Puritan institutions. It might no longer be quite true, as the historian (and Southerner) U B Phillips complained more than 100 years ago, that Boston had written the history of the US, and largely written it wrong. But when it comes to the history of religion in America, the consequences of the domination of the leading Puritan institutions in Boston (Harvard University) and New Haven (Yale University) remain formidable. This ‘Puritan effect’ on seeing and understanding religion in early America (and the origins of the US) brings real distortion: as though we were to turn the political history of 20th-century Europe over to the Trotskyites. [Continue reading…]

Do you see what I see?

Nicola Jones writes:

In a Candoshi village in the heart of Peru, anthropologist Alexandre Surrallés puts a small colored chip on a table and asks, “Ini tamaara?” (“How is it?” or “What is it like?”). What Surrallés would like to ask is, “What color is this?” But the Candoshi, a tribe of some 3,000 people living on the upper banks of the Amazon River, don’t have a word for the concept of color. Nor are their answers to the question he does ask familiar to most Westerners. In this instance, a lively discussion erupts between two Candoshi about whether the chip, which Surrallés would call amber or yellow-orange, looks more like ginger or fish spawn.

This moment in July 2014 was just one among many similar experiences Surrallés had during a total of three years living among the Candoshi since 1991. His fieldwork led Surrallés to the startling conclusion that these people simply don’t have color words: reliable descriptors for the basic colors in the world around them. Candoshi children don’t learn the colors of the rainbow because their community doesn’t have words for them.

Though his finding might sound remarkable, Surrallés, who is with the National Center for Scientific Research in Paris, isn’t the first to propose that this cultural phenomenon exists. Anthropologists in various corners of the world have reported on other small tribes who also don’t seem to have a staple vocabulary for color. Yet these conclusions fly in the face of those found in the most influential book on the topic: The World Color Survey, published in 2009, which has at its very heart the hypothesis that every culture has basic color words for at least part of the rainbow.

The debate sits at the center of an ongoing war in the world of color research. On the one side stand “universalists,” including the authors of The World Color Survey and their colleagues, who believe in a conformity of human perceptual experience: that all people see and name colors in a somewhat consistent way. On the other side are “relativists,” who believe in a spectrum of experience and who are often offended by the very notion that a Westerner’s sense of color might be imposed on the interpretation of other cultures and languages. Many researchers, like Surrallés, say they stand in the middle: While there are some universals in human perception, Surrallés argues, color terms don’t seem to be among them. [Continue reading…]

How culture works with evolution to produce human cognition

Cecilia Heyes writes:

The conventional view, inside and outside academia, is that children are ‘wired’ to imitate. We are ‘Homo imitans’, animals born with a burning desire to copy the actions of others. Imitation is ‘in our genes’. Birds build nests, cats miaow, pigs are greedy, while humans possess an instinct to imitate.

The idea that humans have cognitive instincts is a cornerstone of evolutionary psychology, pioneered by Leda Cosmides, John Tooby and Steven Pinker in the 1990s. ‘[O]ur modern skulls house a Stone Age mind,’ wrote Cosmides and Tooby in 1997. On this view, the cognitive processes or ‘organs of thought’ with which we tackle contemporary life have been shaped by genetic evolution to meet the needs of small, nomadic bands of people – people who devoted most of their energy to digging up plants and hunting animals. It’s unsurprising, then, that today our Stone Age instincts often deliver clumsy or distasteful solutions, but there’s not a whole lot we can do about it. We’re simply in thrall to our thinking genes.

This all seems plausible and intuitive, doesn’t it? The trouble is, the evidence behind it is dubious. In fact, if we look closely, it’s apparent that evolutionary psychology is due for an overhaul. Rather than hard-wired cognitive instincts, our heads are much more likely to be populated by cognitive gadgets, tinkered and toyed with over successive generations. Culture is responsible not just for the grist of the mind – what we do and make – but for fabricating its mills, the very way the mind works. [Continue reading…]

Global WEIRDing is a trend we can’t ignore

By Kensy Cooperrider

For centuries, Inuit hunters navigated the Arctic by consulting wind, snow and sky. Now they use GPS. Speakers of the Aboriginal language Gurindji, in northern Australia, used to command 28 variants of each cardinal direction. Children there now use the four basic terms, and they don’t use them very well. In the arid heights of the Andes, the Aymara developed an unusual way of understanding time, imagining the past as in front of them, and the future at their backs. But for the youngest generation of Aymara speakers – increasingly influenced by Spanish – the future lies ahead.

These are not just isolated changes. On all continents, even in the world’s remotest regions, indigenous people are swapping their distinctive ways of parsing the world for Western, globalised ones. As a result, human cognitive diversity is dwindling – and, sadly, those of us who study the mind had only just begun to appreciate it.

In 2010, a paper titled ‘The Weirdest People in the World?’ gave the field of cognitive science a seismic shock. Its authors, led by the psychologist Joe Henrich at the University of British Columbia, made two fundamental points. The first was that researchers in the behavioural sciences had almost exclusively focused on a small sliver of humanity: people from Western, educated, industrialised, rich, democratic societies. The second was that this sliver is not representative of the larger whole: people in London, Buenos Aires and Seattle are, in an acronym, WEIRD.

But there is a third fundamental point, and it was the psychologist Paul Rozin at the University of Pennsylvania who made it. In his commentary on the 2010 article, Rozin noted that this same WEIRD slice of humanity was ‘a harbinger of the future of the world’. He had seen this trend in his own research. Where he found cross-cultural differences, they were more pronounced in older generations. The world’s young people, in other words, are converging. The signs are unmistakable: the age of global WEIRDing is upon us.

[Read more…]

The empathetic humanities have much to teach our adversarial culture

By Alexander Bevilacqua, Aeon, January 15, 2019

As anyone on Twitter knows, public culture can be quick to attack, castigate and condemn. In search of the moral high ground, we rarely grant each other the benefit of the doubt. In her Class Day remarks at Harvard’s 2018 graduation, the Nigerian novelist Chimamanda Ngozi Adichie addressed the problem of this rush to judgment. In the face of what she called ‘a culture of “calling out”, a culture of outrage’, she asked students to ‘always remember context, and never disregard intent’. She could have been speaking as a historian.

History, as a discipline, turns away from two of the main ways of reading that have dominated the humanities for the past half-century. These methods have been productive, but perhaps they also bear some responsibility for today’s corrosive lack of generosity. The two approaches have different genealogies, but share a significant feature: at heart, they are adversarial.

One mode of reading, first described in 1965 by the French philosopher Paul Ricœur and known as ‘the hermeneutics of suspicion’, aims to uncover the hidden meaning or agenda of a text. Whether inspired by Karl Marx, Friedrich Nietzsche or Sigmund Freud, the reader interprets what happens on the surface as a symptom of something deeper and more dubious, from economic inequality to sexual anxiety. The reader’s task is to reject the face value of a work, and to plumb for a submerged truth.

A second form of interpretation, known as ‘deconstruction’, was developed in 1967 by the French philosopher Jacques Derrida. It aims to identify and reveal a text’s hidden contradictions – ambiguities and even aporias (unthinkable contradictions) that eluded the author. For example, Derrida detected a bias that favoured speech over writing in many influential philosophical texts of the Western tradition, from Plato to Jean-Jacques Rousseau. The fact that written texts could privilege the immediacy and truth of speech was a paradox that revealed unarticulated metaphysical commitments at the heart of Western philosophy.

Both of these ways of reading pit reader against text. The reader’s goal becomes to uncover meanings or problems that the work does not explicitly express. In both cases, intelligence and moral probity are displayed at the expense of what’s been written. In the 20th century, these approaches empowered critics to detect and denounce the workings of power in all kinds of materials – not just the dreams that Freud interpreted, or the essays by Plato and Rousseau with which Derrida was most closely concerned.

They do, however, foster a prosecutorial attitude among academics and public intellectuals. As a colleague once told me: ‘I am always looking for the Freudian slip.’ He scours the writings of his peers to spot when they trip up and betray their problematic intellectual commitments. One poorly chosen phrase can sully an entire work.

[Read more…]

The hidden resilience of ‘food desert’ neighborhoods

Barry Yeoman writes:

Even before Ashanté Reese and I reach the front gate, retired schoolteacher Alice Chandler is standing in the doorway of her brick home in Washington, D.C. She welcomes Reese, an anthropologist whom she has known for six years, with a hug and apologizes for having nothing to feed us during this spontaneous visit.

Chandler, 69 years old, is a rara avis among Americans: an adult who has lived nearly her entire life in the same house. This fact makes her stories particularly valuable to Reese, who has been studying the changing food landscape in Deanwood, a historically black neighborhood across the Anacostia River from most of the city.

When Chandler was growing up, horse-drawn wagons delivered meat, fish, and vegetables to her doorstep. The neighborhood had a milkman, as did many U.S. communities in the mid-20th century. Her mother grew vegetables in a backyard garden and made wine from the fruit of their peach tree.

Food was shared across fence lines. “Your neighbor may have tomatoes and squash in their garden,” Chandler says. “And you may have cucumbers in yours. Depending on how bountiful each one was, they would trade off.” Likewise, when people went fishing, “they would bring back enough for friends in the neighborhood. That often meant a Saturday evening fish fry at home.”

Around the corner was the Spic N Span Market, a grocery with penny candy, display cases of fresh chicken and pork chops, and an old dog who slept in the back. The owner, whom Chandler knew as “Mr. Eddie,” was a Jewish man who hired African-American cashiers and extended credit to customers short on cash. Next door was a small farm whose owner used to give fresh eggs to Chandler’s mother.

Chandler was born into this architecturally eclectic neighborhood. On the basis of oral histories found in archives, Reese mapped 11 different groceries that were open in Deanwood during its peak years, the 1930s and ’40s. African-Americans owned five. Jews, excluded by restrictive covenants from living in some other D.C. neighborhoods, owned six. For much of the mid-20th century, there was also a Safeway store.

Today there are exactly zero grocery stores. The only places for Deanwood’s 5,000 residents to buy food in their neighborhood are corner stores, abundantly stocked with beer and Beefaroni but nearly devoid of fruit, vegetables, and meat. At one of those stores, which I visited, a “Healthy Corners” sign promised fresh produce. Instead, I found two nearly empty wooden shelves sporting a few sad-looking onions, bananas, apples, and potatoes. The nearest supermarket, a Safeway, is a hilly 30-minute walk away. A city council member who visited last year found long lines, moldy strawberries, and meat that appeared to have spoiled.

The common name for neighborhoods like these is “food deserts,” which the U.S. Department of Agriculture defines as areas “where people have limited access to a variety of healthy and affordable food.” According to the USDA, food deserts tend to offer sugary, fatty foods; the department also says that poor access to fruits, vegetables, and lean meats could lead to obesity and diabetes. A map produced by the nonpartisan D.C. Policy Center puts about half of Deanwood into a desert.

But Reese, an assistant professor of anthropology at Spelman College in Atlanta, Georgia, has joined a number of scholars who are pushing back against the food desert model. She calls it a “lazy” shorthand to describe both a series of corporate decisions and a complex human ecosystem. [Continue reading…]

The steward of Middle-earth

Hannah Long writes:

Around the time Christopher [Tolkien] was commissioned an officer in the RAF in 1945, [J.R.R.] Tolkien was calling his son “my chief critic and collaborator.” Christopher would return from flying missions to pore over another chapter of his father’s work. He also joined the informal literary club known as the Inklings. At 21, he was the youngest—and is now the last surviving—member. The band of friends—J.R.R. Tolkien, C.S. Lewis, Owen Barfield, Hugo Dyson, and Charles Williams, among others—would meet at Oxford’s Eagle and Child pub or Lewis’s rooms in Magdalen College to chat about literature and philosophy and to read aloud portions of works in progress.

Christopher was recruited to narrate his father’s stories. The group considered his clear, rich voice a marked improvement over his father’s dithering, mumbling delivery. Lewis had recognized the brilliance of J.R.R. Tolkien’s work from the first moment he encountered it, and for years remained Tolkien’s only audience. Dyson, not so appreciative, exclaimed during one reading, “Oh, not another f—ing elf!”

Poet and scholar Malcolm Guite argues that the Inklings, despite their profound differences (Tolkien was an English Roman Catholic, Lewis an Ulster Protestant, Williams a hermetic mystic), refined and supported each other in their common literary mission.

“They’re not often noticed by literary historians because . . . in terms of English literature, the self-defining mainstream of 20th-century literature supposedly was high modernism, shaped by Joyce and Eliot,” Guite said in a 2011 lecture. But “there was actually . . . something quite radical going on in that group. Together, they were able to form a profoundly alternative and countercultural vision.” Guite emphasizes, in particular, the Inklings’ shared desire to respond to the materialist, largely atheistic cohort whose voices dominated the world of letters.

Although the Inklings are often accused of escapism, nearly all culture was engaged in a sort of dissociation because of the carnage and devastation of the First World War. Tolkien scholar Verlyn Flieger writes that Tolkien was “a traveler between worlds,” from his Edwardian youth to his postbellum disillusionment. It was this “oscillation that, paradoxically, makes him a modern writer, for . . . the temporal dislocation of his ‘escape’ mirrored the psychological disjunction and displacement of his century.”

High modernism found that escape in science, creating a stark divide between the material and the spiritual. This technical, technological, atomizing approach turns up in The Lord of the Rings with the villainous wizard Saruman, whose materialist philosophy dismisses the transcendent. Early in the book, Saruman changes his robe from white to multicolored. He explains, “White cloth may be dyed. The white page can be overwritten; and the white light can be broken.”

“In which case it is no longer white,” Gandalf replies. “And he that breaks a thing to find out what it is has left the path of wisdom.”

Saruman ignores that his dissection of color has eliminated something greater than the sum of its parts; he has lost view of the transcendent white light. For the Inklings, the medium of fantasy restored—or rather revealed—the enchantment of a disenchanted world. It reinstated an understanding of the transcendent that had been lost in postwar alienation. [Continue reading…]

China has placed hundreds of thousands of Muslims in cultural extermination camps

The New York Times reports:

On the edge of a desert in far western China, an imposing building sits behind a fence topped with barbed wire. Large red characters on the facade urge people to learn Chinese, study law and acquire job skills. Guards make clear that visitors are not welcome.

Inside, hundreds of ethnic Uighur Muslims spend their days in a high-pressure indoctrination program, where they are forced to listen to lectures, sing hymns praising the Chinese Communist Party and write “self-criticism” essays, according to detainees who have been released.

The goal is to remove any devotion to Islam.

Abdusalam Muhemet, 41, said the police detained him for reciting a verse of the Quran at a funeral. After two months in a nearby camp, he and more than 30 others were ordered to renounce their past lives. Mr. Muhemet said he went along but quietly seethed.

“That was not a place for getting rid of extremism,” he recalled. “That was a place that will breed vengeful feelings and erase Uighur identity.”

This camp outside Hotan, an ancient oasis town in the Taklamakan Desert, is one of hundreds that China has built in the past few years. It is part of a campaign of breathtaking scale and ferocity that has swept up hundreds of thousands of Chinese Muslims for weeks or months of what critics describe as brainwashing, usually without criminal charges.

Though limited to China’s western region of Xinjiang, it is the country’s most sweeping internment program since the Mao era — and the focus of a growing chorus of international criticism. [Continue reading…]