We are more rational than we are told

Steven Poole writes:

Humanity’s achievements and its self-perception are today at curious odds. We can put autonomous robots on Mars and genetically engineer malarial mosquitoes to be sterile, yet the news from popular psychology, neuroscience, economics and other fields is that we are not as rational as we like to assume. We are prey to a dismaying variety of hard-wired errors. We prefer winning to being right. At best, so the story goes, our faculty of reason is at constant war with an irrational darkness within. At worst, we should abandon the attempt to be rational altogether.

The present climate of distrust in our reasoning capacity draws much of its impetus from the field of behavioural economics, and particularly from work by Daniel Kahneman and Amos Tversky in the 1980s, summarised in Kahneman’s bestselling Thinking, Fast and Slow (2011). There, Kahneman divides the mind into two allegorical systems, the intuitive ‘System 1’, which often gives wrong answers, and the reflective reasoning of ‘System 2’. ‘The attentive System 2 is who we think we are,’ he writes; but it is the intuitive, biased, ‘irrational’ System 1 that is in charge most of the time.

Other versions of the message are expressed in more strongly negative terms. You Are Not So Smart (2011) is a bestselling book by David McRaney on cognitive bias. According to the study ‘Why Do Humans Reason?’ (2011) by the cognitive scientists Hugo Mercier and Dan Sperber, our supposedly rational faculties evolved not to find ‘truth’ but merely to win arguments. And in The Righteous Mind (2012), the psychologist Jonathan Haidt calls the idea that reason is ‘our most noble attribute’ a mere ‘delusion’. The worship of reason, he adds, ‘is an example of faith in something that does not exist’. Your brain, runs the now-prevailing wisdom, is mainly a tangled, damp and contingently cobbled-together knot of cognitive biases and fear.

This is a scientised version of original sin. And its eager adoption by today’s governments threatens social consequences that many might find troubling. A culture that believes its citizens are not reliably competent thinkers will treat those citizens differently to one that respects their reflective autonomy. Which kind of culture do we want to be? And we do have a choice. Because it turns out that the modern vision of compromised rationality is more open to challenge than many of its followers accept. [Continue reading…]

Yes, determinists, there is such a thing as free will

In an interview with Nautilus, Christian List says:

I think the mistake in the standard arguments against free will lies in a failure to distinguish between different levels of description. If we are searching for free will at the fundamental physical level, we are simply searching in the wrong place.

Let’s go through these arguments one by one. What do you say to those who consider the idea that humans are beings with goals and intentions, and that we act on them, a prescientific holdover?

If you try to make sense of human behavior, not just in ordinary life but also in the sciences, then the ascription of intentionality is indispensable. It’s infeasible and not illuminating to explain human behavior at the level of astronomically complex neural firing patterns that take place in the brain.

Suppose I ask a taxi driver to take me to Paddington Station. Twenty-five minutes later, I’m there. The next day, I tell the driver to take me to St. Pancras Station. Now the driver takes me to St. Pancras. If I look at the underlying microphysical activity, it would be very difficult to pinpoint what those two events have in common. If we switch to the intentional mode of explanation, we can very easily explain why the taxi driver takes me to Paddington on the first day, and what the difference is on the second day that leads the driver to take me to St. Pancras Station. The taxi driver understands our communication, forms the intention to take me to a particular station, and is clearly incentivized to do so because this is the way for the driver to earn a living.

The neuroscientific skeptic is absolutely right that, at the fundamental physical level, there is no such thing as intentional goal-directed agency. The mistake is to claim that there is no such thing at all. Intentional agency is an emergent higher-level property, but it is no less real for that. [Continue reading…]

Social rejection doesn’t only hurt — it kills

Elitsa Dermendzhiyska writes:

The psychologist Naomi Eisenberger describes herself as a mutt of a scientist. Never quite fitting the mould of the fields she studied – psychobiology, health psychology, neuroscience – she took an unusual early interest in what you might call the emotional life of the brain. As a doctoral student at the University of California, Los Angeles (UCLA), Eisenberger found it curious that we often describe being rejected in terms of physical pain: ‘My heart was broken’, ‘I felt crushed’, ‘He hurt my feelings’, ‘It was like a slap in the face’. More than metaphors, these expressions seem to capture something essential about how we feel in a way that we can’t convey directly. And you’ll find similar ones not just in English but in languages all over the world. Eisenberger wondered why. Could there be a deeper connection between physical and emotional pain?

In a landmark experiment in 2003, Eisenberger and her colleagues fitted test subjects with virtual-reality headsets. Peering through goggles, the participants could see their own hand and a ball, plus two cartoon characters – the avatars of fellow participants in another room. With the press of a button, each player could toss the ball to another player while the researchers measured their brain activity through fMRI scans. In the first round of CyberBall – as the game became known – the ball flew back and forth just as you’d expect, but pretty soon the players in the second room started making passes only to each other, completely ignoring the player in the first room. In reality, there were no other players: just a computer programmed to ‘reject’ each participant so that the scientists could see how exclusion – what they called ‘social pain’ – affects the brain.
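
The exclusion schedule at the heart of the game is simple enough to sketch in code. The Python below is a hypothetical reconstruction of the protocol as described above, not the researchers’ actual software; the throw counts and player names are illustrative assumptions.

```python
import random

# Hypothetical sketch of the CyberBall exclusion protocol described
# above -- not the original experiment's software. The throw counts
# are illustrative assumptions.
PARTICIPANT, AVATAR_A, AVATAR_B = "participant", "avatar_a", "avatar_b"

def cyberball_schedule(total_throws=60, fair_throws=10):
    """Return the sequence of players who receive the ball."""
    holder = AVATAR_A
    receivers = []
    for i in range(total_throws):
        if i < fair_throws:
            # Inclusion phase: anyone except the current holder can receive.
            candidates = [p for p in (PARTICIPANT, AVATAR_A, AVATAR_B)
                          if p != holder]
        else:
            # Exclusion phase: the avatars throw only to each other, so
            # the participant never receives the ball again.
            candidates = [p for p in (AVATAR_A, AVATAR_B) if p != holder]
        holder = random.choice(candidates)
        receivers.append(holder)
    return receivers

print(cyberball_schedule().count(PARTICIPANT), "passes reach the participant")
```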

Physical pain involves several brain regions, some of which detect its location, while others, such as the anterior insula (AI) and the dorsal anterior cingulate cortex (dACC), process the subjective experience, the unpleasantness, of pain. In fMRI scans of people playing CyberBall, Eisenberger’s team saw both the AI and the dACC light up in participants excluded from the game. Moreover, those who felt the most emotional distress also showed the most pain-related brain activity. In other words, being socially rejected triggered the same neural circuits that process physical injury and translate it into the experience we call pain. [Continue reading…]

Rich guys are most likely to have no idea what they’re talking about, study suggests

Christopher Ingraham writes:

Researchers embarked on a novel study intent on measuring what a Princeton philosophy professor contends is one of the most salient features of our culture — the ability to play the expert without being one.

Or, as the social scientists put it, to BS.

Research by John Jerrim and Nikki Shure of University College London, and Phil Parker of Australian Catholic University attempted to measure the pervasiveness of this trait in society and identify its most ardent practitioners.

Study participants were asked to assess their knowledge of 16 math topics on a five-point scale ranging from “never heard of it” to “know it well, understand the concept.” Crucially, three of those topics were complete fabrications: “proper numbers,” “subjunctive scaling” and “declarative fractions.” Those who said they were knowledgeable about the fictitious topics were categorized as BSers.
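
The scoring rule implied here is mechanical, and a short sketch makes it concrete. In the Python below, the three fabricated topic names come from the study itself, but the score function, cutoff and example respondent are my illustrative assumptions rather than the researchers’ actual coding scheme.

```python
# Hypothetical scoring sketch, not the study's actual coding scheme.
# The three fabricated topics are real; the threshold is an assumption.
FAKE_TOPICS = ("proper numbers", "subjunctive scaling", "declarative fractions")

def bs_score(ratings):
    """Mean self-rated familiarity (1-5) with topics that do not exist."""
    return sum(ratings[t] for t in FAKE_TOPICS) / len(FAKE_TOPICS)

def is_bser(ratings, threshold=4.0):
    # Illustrative cutoff: consistently claiming knowledge of fake topics.
    return bs_score(ratings) >= threshold

respondent = {
    "proper numbers": 5,         # "know it well, understand the concept"
    "subjunctive scaling": 4,
    "declarative fractions": 2,  # closer to "never heard of it"
}
print(bs_score(respondent))  # 3.666...
print(is_bser(respondent))   # False under this illustrative threshold
```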

Using a data set spanning nine predominantly English-speaking countries, researchers delineated a number of key findings. First, men are much more likely than women to master the art of hyperbole, as are the wealthy relative to the poor or middle class. North Americans, meanwhile, tend to slip into this behavior more readily than English speakers in other parts of the globe. And if there were a world championship, as a true devotee might appreciate, the title would go to Canada, data show. [Continue reading…]

Experiments that make quantum mechanics directly visible to the human eye

Rebecca Holmes writes:

I spent a lot of time in the dark in graduate school. Not just because I was learning the field of quantum optics – where we usually deal with one particle of light, or photon, at a time – but because my research used my own eyes as a measurement tool. I was studying how humans perceive the smallest amounts of light, and I was the first test subject every time.

I conducted these experiments in a closet-sized room on the eighth floor of the psychology department at the University of Illinois, working alongside my graduate advisor, Paul Kwiat, and psychologist Ranxiao Frances Wang. The space was equipped with special blackout curtains and a sealed door to achieve total darkness. For six years, I spent countless hours in that room, sitting in an uncomfortable chair with my head supported in a chin rest, focusing on dim, red crosshairs, and waiting for tiny flashes delivered by the most precise light source ever built for human vision research. My goal was to quantify how I (and other volunteer observers) perceived flashes of light from a few hundred photons down to just one photon.

As individual particles of light, photons belong to the world of quantum mechanics – a place that can seem totally unlike the Universe we know. Physics professors tell students with a straight face that an electron can be in two places at once (quantum superposition), or that a measurement on one photon can instantly affect another, far-away photon with no physical connection (quantum entanglement). Maybe we accept these incredible ideas so casually because we usually don’t have to integrate them into our daily existence. An electron can be in two places at once; a soccer ball cannot.
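
For readers who want the symbols, the two ideas mentioned above can be written in standard textbook notation (my addition, not drawn from the article): a single photon in a superposition of two paths, and a pair of photons whose polarizations are entangled.

```latex
% Standard textbook notation (added for reference, not from the article).
% A single photon in a superposition of two paths, A and B:
\[
  |\psi\rangle = \frac{1}{\sqrt{2}}\left(|A\rangle + |B\rangle\right)
\]
% Two photons entangled in polarization (H = horizontal, V = vertical);
% a measurement on photon 1 instantly fixes the outcome for photon 2:
\[
  |\Phi^{+}\rangle = \frac{1}{\sqrt{2}}\left(|H\rangle_{1}|H\rangle_{2}
                   + |V\rangle_{1}|V\rangle_{2}\right)
\]
```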

But photons are quantum particles that human beings can, in fact, directly perceive. Experiments with single photons could force the quantum world to become visible, and we don’t have to wait around – several tests are possible with today’s technology. The eye is a unique biological measurement device, and deploying it opens up exciting areas of research where we truly don’t know what we might find. Studying what we see when photons are in a superposition state could contribute to our understanding of the boundary between the quantum and classical worlds, while a human observer might even participate in a test of the strangest consequences of quantum entanglement. [Continue reading…]

How culture works with evolution to produce human cognition

Cecilia Heyes writes:

The conventional view, inside and outside academia, is that children are ‘wired’ to imitate. We are ‘Homo imitans’, animals born with a burning desire to copy the actions of others. Imitation is ‘in our genes’. Birds build nests, cats miaow, pigs are greedy, while humans possess an instinct to imitate.

The idea that humans have cognitive instincts is a cornerstone of evolutionary psychology, pioneered by Leda Cosmides, John Tooby and Steven Pinker in the 1990s. ‘[O]ur modern skulls house a Stone Age mind,’ wrote Cosmides and Tooby in 1997. On this view, the cognitive processes or ‘organs of thought’ with which we tackle contemporary life have been shaped by genetic evolution to meet the needs of small, nomadic bands of people – people who devoted most of their energy to digging up plants and hunting animals. It’s unsurprising, then, that today our Stone Age instincts often deliver clumsy or distasteful solutions, but there’s not a whole lot we can do about it. We’re simply in thrall to our thinking genes.

This all seems plausible and intuitive, doesn’t it? The trouble is, the evidence behind it is dubious. In fact, if we look closely, it’s apparent that evolutionary psychology is due for an overhaul. Rather than hard-wired cognitive instincts, our heads are much more likely to be populated by cognitive gadgets, tinkered and toyed with over successive generations. Culture is responsible not just for the grist of the mind – what we do and make – but for fabricating its mills, the very way the mind works. [Continue reading…]

The interplay that brings together order and disorder

Alan Lightman writes:

Planets, stars, life, even the direction of time all depend on disorder. And we human beings as well. Especially if, along with disorder, we group together such concepts as randomness, novelty, spontaneity, free will and unpredictability. We might put all of these ideas in the same psychic basket. Within the oppositional category of order, we can gather together notions such as systems, law, reason, rationality, pattern, predictability. While the different clusters of concepts are not mirror images of one another, like twilight and dawn, they have much in common.

Our primeval attraction to both order and disorder shows up in modern aesthetics. We like symmetry and pattern, but we also relish a bit of asymmetry. The British art historian Ernst Gombrich believed that, although human beings have a deep psychological attraction to order, perfect order in art is uninteresting. ‘However we analyse the difference between the regular and the irregular,’ he wrote in The Sense of Order (1979), ‘we must ultimately be able to account for the most basic fact of aesthetic experience, the fact that delight lies somewhere between boredom and confusion.’ Too much order, we lose interest. Too much disorder, and there’s nothing to be interested in. My wife, a painter, always puts a splash of colour in the corner of her canvas, off balance, to make the painting more appealing. Evidently, our visual sweet-spot lies somewhere between boredom and confusion, predictability and newness.

Human beings have a conflicted relationship to this order-disorder nexus. We are drawn alternately to one and then the other. We admire principles and laws and order. We embrace reasons and causes. We seek predictability. Some of the time. On other occasions, we value spontaneity, unpredictability, novelty, unconstrained personal freedom. We love the structure of Western classical music, as well as the free-wheeling runs or improvised rhythms of jazz. We are drawn to the symmetry of a snowflake, but we also revel in the amorphous shape of a high-riding cloud. We appreciate the regular features of pure-bred animals, while we’re also fascinated by hybrids and mongrels. We might respect those who manage to live sensibly and lead upright lives. But we also esteem the mavericks who break the mould, and we celebrate the wild, the unbridled and the unpredictable in ourselves. We are a strange and contradictory animal, we human beings. And we inhabit a cosmos equally strange. [Continue reading…]

Can we get better at forgetting?

Benedict Carey writes:

Whatever its other properties, memory is a reliable troublemaker, especially when navigating its stockpile of embarrassments and moral stumbles. Ten minutes into an important job interview and here come screenshots from a past disaster: the spilled latte, the painful attempt at humor. Two dates into a warming relationship and up come flashbacks of an earlier, abusive partner.

The bad timing is one thing. But why can’t those events be somehow submerged amid the brain’s many other dimming bad memories?

Emotions play a role. Scenes, sounds and sensations leave a deeper neural trace if they stir a strong emotional response; this helps you avoid those same experiences in the future. Memory is protective, holding on to red flags so they can be waved at you later, to guide your future behavior.

But forgetting is protective too. Most people find a way to bury, or at least reshape, the vast majority of their worst moments. Could that process be harnessed or somehow optimized?

Perhaps. In the past decade or so, brain scientists have begun to piece together how memory degrades and forgetting happens. A new study, published this month in the Journal of Neuroscience, suggests that some things can be intentionally relegated to oblivion, although the method for doing so is slightly counterintuitive.

For the longest time, forgetting was seen as a passive process of decay and the enemy of learning. But as it turns out, forgetting is a dynamic ability, crucial to memory retrieval, mental stability and maintaining one’s sense of identity.

That’s because remembering is a dynamic process. At a biochemical level, memories are not pulled from the shelf like stored videos but pieced together — reconstructed — by the brain. [Continue reading…]

Good Samaritans aren’t the exception

Melanie McGrath writes:

A few years ago, I was assaulted on a busy street in London by a man who came up behind me. Some details of the assault are hazy, others pin-sharp. I recall exactly what my attacker did, and that the assault was witnessed by rush-hour drivers sitting at a red light. If there were pedestrians nearby, I do not remember them, though the situation suggests that there were people at hand. I do remember that no one came to my aid.

On the face of it, this looks like a textbook case of bystander apathy – the failure of onlookers to intervene in troubling, violent or even murderous events when others are present. The effect was first described in 1968 by the social psychologists Bibb Latané at Columbia University in New York and John Darley at New York University. Their research was prompted by the murder of Kitty Genovese outside her home in Queens in 1964. In The New York Times’s report of the killing, which was rehashed by news media across the world, only one of 38 witnesses was said to have done anything to intervene.

Latané and Darley’s research suggested that the greater the number of onlookers, the less likely anyone was to step in, especially if others around them appeared calm or unconcerned. Whereas lone bystanders stepped forward to help a victim 85 per cent of the time, only 31 per cent of witnesses intervened when they were part of a group of five. Latané and Darley labelled this phenomenon ‘diffusion of responsibility’, which, along with ‘evaluation apprehension’ (concern about how any intervention might be interpreted) and ‘pluralistic ignorance’ (if everyone else seems calm, there’s nothing to worry about), makes up what has become known as the bystander effect or bystander apathy.
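
A back-of-envelope calculation (mine, not Latané and Darley’s) shows how stark that drop is. If five witnesses each acted independently at the lone-bystander rate, someone would intervene almost every time; at the observed per-witness rate of 31 per cent, there is a meaningful chance that nobody acts at all.

```python
# Back-of-envelope arithmetic (not from the study): probability that at
# least one of n witnesses intervenes, assuming each acts independently
# with per-person intervention rate p.
def p_any_help(p, n):
    return 1 - (1 - p) ** n

# If five witnesses each behaved like lone bystanders (85% rate),
# help would be all but guaranteed:
print(p_any_help(0.85, 5))  # ~0.9999

# At the observed per-witness rate of 31% in groups of five, there is
# roughly a 16% chance that nobody steps in at all:
print(p_any_help(0.31, 5))  # ~0.844
```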

In the half-century since it was first described, the bystander effect has been widely studied and elaborated upon, but never fundamentally challenged. [Continue reading…]

Teens have less face time with their friends – and are lonelier than ever

[Image: Teens aren’t necessarily less social, but the contours of their social lives have changed. Credit: pxhere]

By Jean Twenge, San Diego State University

Ask a teen today how she communicates with her friends, and she’ll probably hold up her smartphone. Not that she actually calls her friends; it’s more likely that she texts them or messages them on social media.

Today’s teens – the generation I call “iGen,” also known as Gen Z – are constantly connected with their friends via digital media, spending as much as nine hours a day with screens.

How might this influence the time they spend with their friends in person?

Some studies have found that people who spend more time on social media actually have more face time with friends.

But studies like this are only looking at people already operating in a world suffused with smartphones. They can’t tell us how teens spent their time before and after digital media use surged.

What if we zoomed out and compared how often previous generations of teens spent time with their friends to how often today’s teens are doing so? And what if we also saw how feelings of loneliness differed across the generations?

To do this, my co-authors and I examined trends in how 8.2 million U.S. teens spent time with their friends since the 1970s. It turns out that today’s teens are socializing with friends in fundamentally different ways – and also happen to be the loneliest generation on record.

[Read more…]