Russians interacted with at least 14 Trump associates during the campaign and transition

The Washington Post reports:

The Russian ambassador. A deputy prime minister. A pop star, a weightlifter, a lawyer, a Soviet army veteran with alleged intelligence ties.

Again and again and again, over the course of Donald Trump’s 18-month campaign for the presidency, Russian citizens made contact with his closest family and friends, as well as figures on the periphery of his orbit.

Some offered to help his campaign and his real estate business. Some offered dirt on his Democratic opponent. Repeatedly, Russian nationals suggested Trump should hold a peacemaking sit-down with Vladimir Putin — and offered to broker such a summit.

In all, Russians interacted with at least 14 Trump associates during the campaign and presidential transition, public records and interviews show. [Continue reading…]

If dying could be sweet

When former presidents or other famous people die, the news of such events is always dominated by recollections of their lives. Generally we learn only the most abbreviated details of the circumstances in which life came to an end.

The final days of George H.W. Bush’s life were unusual in that they were shared with his lifelong friend James Baker and other friends and family members who then graciously provided the New York Times with an account that conveys tenderness and humanity. It resonates on a level that makes irrelevant the political views any of us might have about the Bush dynasty.

When Mr. Baker came to the house early on Friday morning, Mr. Bush seemed to rally a bit, and it appeared that he would defy death one more time. He began to eat again. He had three five-minute soft-boiled eggs, a favorite, as well as a bowl of yogurt and two fruit drinks. “Everybody thought this is going to be a great day and he’s back and he’s bounced back again,” Mr. Baker said.

Mr. Baker left around 9:15 a.m. but decided to return in the evening when he and Mrs. Baker were on the way to dinner with some friends. “He was sitting up in bed and was able to converse with people,” Mr. Baker said.

But in the car on the way home from dinner, the Bakers received a phone call urging them to come back to Mr. Bush’s house. They arrived about 8:15 p.m. “He had slipped considerably,” Mr. Baker said.

Ronan Tynan, the Irish tenor, had called earlier in the day to ask if he could drop by, and when he showed up, Ms. Becker [the former president’s longtime chief of staff] asked him to sing to the president. Mr. Tynan sang two songs, the first “Silent Night” and the second a Gaelic song.

As he sang “Silent Night,” Mr. Baker said, “Believe it or not, the president was mouthing the words.”

Mr. Baker held Mr. Bush’s hand and rubbed his feet for nearly a half-hour. The other children, who live around the country, were called so they could tell their father goodbye.

Dr. Levenson [rector of St. Martin’s Episcopal Church in Houston], who arrived at 9:15 p.m., led those in the room in prayer. “We all knelt around him and placed our hands on him and prayed for him and it was a very graceful, gentle death,” he said. “It was very evident that that man was so deeply loved.”

There was no struggle, no prolonged period of labored breathing. At 10:10 p.m., the former president slipped away.

“If those things could be sweet,” Mr. Baker said, “it was sweet.”

The insect apocalypse is here. What does it mean for the rest of life on Earth?

Brooke Jarvis reports:

Sune Boye Riis was on a bike ride with his youngest son, enjoying the sun slanting over the fields and woodlands near their home north of Copenhagen, when it suddenly occurred to him that something about the experience was amiss. Specifically, something was missing.

It was summer. He was out in the country, moving fast. But strangely, he wasn’t eating any bugs.

For a moment, Riis was transported to his childhood on the Danish island of Lolland, in the Baltic Sea. Back then, summer bike rides meant closing his mouth to cruise through thick clouds of insects, but inevitably he swallowed some anyway. When his parents took him driving, he remembered, the car’s windshield was frequently so smeared with insect carcasses that you almost couldn’t see through it. But all that seemed distant now. He couldn’t recall the last time he needed to wash bugs from his windshield; he even wondered, vaguely, whether car manufacturers had invented some fancy new coating to keep off insects. But this absence, he now realized with some alarm, seemed to be all around him. Where had all those insects gone? And when? And why hadn’t he noticed?

Riis watched his son, flying through the beautiful day, not eating bugs, and was struck by the melancholy thought that his son’s childhood would lack this particular bug-eating experience of his own. It was, he granted, an odd thing to feel nostalgic about. But he couldn’t shake a feeling of loss. “I guess it’s pretty human to think that everything was better when you were a kid,” he said. “Maybe I didn’t like it when I was on my bike and I ate all the bugs, but looking back on it, I think it’s something everybody should experience.”

I met Riis, a lanky high school science and math teacher, on a hot day in June. He was anxious about not having yet written his address for the school’s graduation ceremony that evening, but first, he had a job to do. From his garage, he retrieved a large insect net, drove to a nearby intersection and stopped to strap the net to the car’s roof. Made of white mesh, the net ran the length of his car and was held up by a tent pole at the front, tapering to a small, removable bag in back. Drivers whizzing past twisted their heads to stare. Riis eyed his parking spot nervously as he adjusted the straps of the contraption. “This is not 100 percent legal,” he said, “but I guess, for the sake of science.”

Riis had not been able to stop thinking about the missing bugs. The more he learned, the more his nostalgia gave way to worry. Insects are the vital pollinators and recyclers of ecosystems and the base of food webs everywhere. Riis was not alone in noticing their decline. In the United States, scientists recently found the population of monarch butterflies fell by 90 percent in the last 20 years, a loss of 900 million individuals; the rusty-patched bumblebee, which once lived in 28 states, dropped by 87 percent over the same period. With other, less-studied insect species, one butterfly researcher told me, “all we can do is wave our arms and say, ‘It’s not here anymore!’ ” Still, the most disquieting thing wasn’t the disappearance of certain species of insects; it was the deeper worry, shared by Riis and many others, that a whole insect world might be quietly going missing, a loss of abundance that could alter the planet in unknowable ways. “We notice the losses,” says David Wagner, an entomologist at the University of Connecticut. “It’s the diminishment that we don’t see.”

Because insects are legion, inconspicuous and hard to meaningfully track, the fear that there might be far fewer than before was more felt than documented. People noticed it by canals or in backyards or under streetlights at night — familiar places that had become unfamiliarly empty. The feeling was so common that entomologists developed a shorthand for it, named for the way many people first began to notice that they weren’t seeing as many bugs. They called it the windshield phenomenon. [Continue reading…]

Humanity is destroying life on Earth


The Guardian reports:

Humanity has wiped out 60% of mammals, birds, fish and reptiles since 1970, leading the world’s foremost experts to warn that the annihilation of wildlife is now an emergency that threatens civilisation.

The new estimate of the massacre of wildlife is made in a major report produced by WWF and involving 59 scientists from across the globe. It finds that the vast and growing consumption of food and resources by the global population is destroying the web of life, billions of years in the making, upon which human society ultimately depends for clean air, water and everything else.

“We are sleepwalking towards the edge of a cliff,” said Mike Barrett, executive director of science and conservation at WWF. “If there was a 60% decline in the human population, that would be equivalent to emptying North America, South America, Africa, Europe, China and Oceania. That is the scale of what we have done.”

“This is far more than just being about losing the wonders of nature, desperately sad though that is,” he said. “This is actually now jeopardising the future of people. Nature is not a ‘nice to have’ – it is our life-support system.” [Continue reading…]

The New York Times reports:

The Chinese government, reversing a 25-year ban, announced on Monday that it would allow the use of rhinoceros horns and tiger bones in medicine, a move that environmentalists described as a significant setback for efforts to protect the animals from extinction.

The State Council, China’s cabinet, said in a policy directive that it would legalize the use of rhino horns and tiger bones for “medical research or in healing,” but only by certified hospitals and doctors, and only from rhinos and tigers raised in captivity, excluding zoo animals.

Still, environmentalists said the decision would likely help fuel a black market for wild rhino and tiger parts, which are revered in traditional Chinese medicine for supposed healing powers, and could lead to increased poaching of the fewer than 30,000 rhinos and 3,900 tigers still in the wild. [Continue reading…]

The rhythm of life


Living in ignorance about our ignorance

Kaidi Wu and David Dunning write:

In 1806, entrepreneur Frederic Tudor sailed to the island of Martinique with a precious cargo. He had harvested ice from frozen Massachusetts rivers and expected to make a tidy profit selling it to tropical customers. There was only one problem: the islanders had never seen ice. They had never experienced a cold drink, never tasted a pint of ice cream. Refrigeration was not a celebrated innovation, but an unknown concept. In their eyes, there was no value in Tudor’s cargo. His sizable investment melted away unappreciated and unsold in the Caribbean heat.

Tudor’s ice tale contains an important point about human affairs. Often, human fate rests not on what people know but on what they fail to know. Often, life’s outcomes are determined by hypocognition.

What is hypocognition? If you don’t know, you’ve just experienced it.

Hypocognition, a term introduced to modern behavioral science by anthropologist Robert Levy, means the lack of a linguistic or cognitive representation for an object, category, or idea. The Martinique islanders were hypocognitive because they lacked a cognitive representation of refrigeration. But so are we hypocognitive of the numerous concepts that elude our awareness. We wander about the unknown terrains of life as novices more often than experts, complacent of what we know and oblivious to what we miss.

In financial dealings, almost two thirds of Americans are hypocognitive of compound interest, unaware of how much saving money can benefit them and how quickly debt can crush them. In health, a full third of people suffering from Type II diabetes remain hypocognitive of the illness. They fail to seek needed treatment—despite recognizing blurry vision, dry mouth, frequent urination—because they lack the underlying concept that would unify the disparate warning signals into a single alarm.

Hypocognition is about the absence of things. It is hard to recognize precisely because it is invisible. To recognize hypocognition requires a departure from the reassuring familiarity of our own culture to gain a grasp of the unknown and the missing. After all, it is difficult to see the culture we inhabit from only within. [Continue reading…]

We are more than our brains

Alan Jasanoff writes:

Brains are undoubtedly somewhat computer-like – computers, after all, were invented to perform brain-like functions – but brains are also much more than bundles of wiry neurons and the electrical impulses they are famous for propagating. The function of each neuroelectrical signal is to release a little flood of chemicals that helps to stimulate or suppress brain cells, in much the way that chemicals activate or suppress functions such as glucose production by liver cells or immune responses by white blood cells. Even the brain’s electrical signals themselves are the products of chemicals called ions that move in and out of cells, causing tiny ripples that can spread independently of neurons.

Also distinct from neurons are the relatively passive brain cells called glia (Greek for glue) that are roughly equal in number to the neurons but do not conduct electrical signals in the same way. Recent experiments in mice have shown that manipulating these uncharismatic cells can produce dramatic effects on behaviour. In one experiment, a research group in Japan showed that direct stimulation of glia in a brain region called the cerebellum could cause a behavioural response analogous to changes more commonly evoked by stimulation of neurons. Another remarkable study showed that transplantation of human glial cells into mouse brains boosted the animals’ performance in learning tests, again demonstrating the importance of glia in shaping brain function. Chemicals and glue are as integral to brain function as wiring and electricity. With these moist elements factored in, the brain seems much more like an organic part of the body than the idealised prosthetic many people imagine.

Stereotypes about brain complexity also contribute to the mystique of the brain and its distinction from the body. It has become a cliché to refer to the brain as ‘the most complex thing in the known Universe’. This saying is inspired by the finding that human brains contain something on the order of 100,000,000,000 neurons, each of which makes about 10,000 connections (synapses) to other neurons. The daunting nature of such numbers provides cover for people who argue that neuroscience will never decipher consciousness, or that free will lurks somehow among the billions and billions.

But the sheer number of cells in the human brain is unlikely to explain its extraordinary capabilities. Human livers have roughly the same number of cells as brains, but certainly don’t generate the same results. Brains themselves vary in size over a considerable range – by around 50 per cent in mass and likely number of brain cells. Radical removal of half of the brain is sometimes performed as a treatment for epilepsy in children. Commenting on a cohort of more than 50 patients who underwent this procedure, a team at Johns Hopkins in Baltimore wrote that they were ‘awed by the apparent retention of memory after removal of half of the brain, either half, and by the retention of the child’s personality and sense of humour’. Clearly not every brain cell is sacred.

If one looks out into the animal kingdom, vast ranges in brain size fail to correlate with apparent cognitive power at all. Some of the most perspicacious animals are the corvids – crows, ravens, and rooks – which have brains less than 1 per cent the size of a human brain, but still perform feats of cognition comparable to chimpanzees and gorillas. Behavioural studies have shown that these birds can make and use tools, and recognise people on the street, feats that even many primates are not known to achieve. Within individual orders, animals with similar characteristics also display huge differences in brain size. Among rodents, for instance, we can find the 80-gram capybara brain with 1.6 billion neurons and the 0.3-gram pygmy mouse brain with probably fewer than 60 million neurons. Despite a greater than 100-fold difference in brain size, these species live in similar habitats, display similarly social lifestyles, and do not display obvious differences in intelligence. Although neuroscience is only beginning to parse brain function even in small animals, such reference points show that it is mistaken to mystify the brain because of its sheer number of components.

Playing up the machine-like qualities of the brain or its unbelievable complexity distances it from the rest of the biological world in terms of its composition. But a related form of brain-body distinction exaggerates how the brain stands apart in terms of its autonomy from body and environment. This flavour of dualism contributes to the cerebral mystique by enhancing the brain’s reputation as a control centre, receptive to bodily and environmental input but still in charge.

Contrary to this idea, our brains themselves are perpetually influenced by torrents of sensory input. The environment shoots many megabytes of sensory data into the brain every second, enough information to disable many computers. The brain has no firewall against this onslaught. Brain-imaging studies show that even subtle sensory stimuli influence regions of the brain, ranging from low-level sensory regions where input enters the brain to parts of the frontal lobe, the high-level brain area that is expanded in humans compared with many other primates.

Many of these stimuli seem to take direct control of us. For instance, when we view illustrations, visual features often seem to grab our eyes and steer our gaze around in spatial patterns that are largely reproducible from person to person. If we see a face, our focus darts reflexively among eyes, nose and mouth, subconsciously taking in key features. When we walk down the street, our minds are similarly manipulated by stimuli in the surroundings – the honk of a car’s horn, the flashing of a neon light, the smell of pizza – each of which guides our thoughts and actions even if we don’t realise that anything has happened.

Even further below our radar are environmental features that act on a slower timescale to influence our mood and emotions. Seasonal low light levels are famous for their correlation with depression, a phenomenon first described by the South African physician Norman Rosenthal soon after he moved from sunny Johannesburg to the grey northeastern United States in the 1970s. Colours in our surroundings also affect us. Although the idea that colours have psychic power evokes New Age mysticism, careful experiments have repeatedly linked cold colours such as blue and green to positive emotional responses, and hot red hues to negative responses. In one example, researchers showed that participants performed worse on IQ tests labelled with red marks than on tests labelled with green or grey; another study found that subjects performed better on computerised creativity tests delivered on a blue background than on a red background.

Signals from within the body influence behaviour just as powerfully as influences from the environment, again usurping the brain’s command and challenging idealised conceptions of its supremacy. [Continue reading…]

What matters

Owen Flanagan writes:

In “The Strange Order of Things,” Antonio Damasio promises to explore “one interest and one idea … why and how we emote, feel, use feelings to construct our selves; how feelings assist or undermine our best intentions; why and how our brains interact with the body to support such functions.”

Damasio thinks that the cognitive revolution of the last 40 years, which has yielded cognitive science, cognitive neuroscience and artificial intelligence, has been, in fact, too cognitive, too rationalist, and not concerned enough with the role that affect plays in the natural history of mind and culture. Standard stories of the evolution of human culture are framed in terms of rational problem solving, creative intelligence, invention, foresight and linguistically mediated planning — the inventions of fire, shelters from the storms, agriculture, the domestication of animals, transportation systems, systems of political organization, weapons, books, libraries, medicine and computers.

Damasio rightly insists that a system with reason, intelligence and language does nothing unless it cares about something, unless things matter to it or, in the case of the emerging world of A.I., things matter to its makers. Feelings motivate reason and intelligence, then “stay on to check the results, and help negotiate the necessary adjustments.”

In an earlier book, “Looking for Spinoza,” Damasio developed the concept of conatus — drive, will, motive, urge — as the taken-for-granted force or catalyst that puts reason, creative intelligence and language to work. If there were no feelings, he adds now, there would be no art, no music, no philosophy, no science, no friendship, no love, no culture and complex life would not aim to sustain itself. “The complete absence of feeling would spell a suspension of being.” [Continue reading…]

Have we forgotten how to die?

In a review of seven books on death and dying, Julie-Marie Strange writes:

James Turner was twenty-five when his four-year-old daughter Annice died from a lung condition. She died at home with her parents and grandmother; her sleeping siblings were told of her death the next morning. James did everything to soothe Annice’s last days but, never having encountered death before, he didn’t immediately recognize it. He didn’t know what to do or expect and found it hard to discuss things with his wife Martha. The family received many condolences but kept the funeral private. Losing a child, often described as the hardest bereavement to bear, changed James Turner forever.

Death in the twenty-first century is typified by the paradox contained in this story. Although we greedily consume death at a distance through fiction, drama and the media, we are hamstrung by it up close and personal. In 1955 the commentator Geoffrey Gorer declared that death had become more pornographic than sex. It was, he said, the new taboo and mourning had become “indecent”. Since then, matters have arguably got worse. The decline in institutional Christianity left a spiritual and existential vacuum, while the rise in individual materialism has fragmented family networks and communities. Shared rites of passage that publicly validated grief have receded, and the space of death has moved increasingly from the home to the hospital.

Focusing on the US and, to a lesser extent, Northern Europe, Haider Warraich’s Modern Death: How medicine changed the end of life identifies how far-reaching these changes are. A physician and clinical researcher, Warraich is well placed to observe the dubious implications of an expanded medicalization of death. Most people want to die at home, but the majority continue to die in hospital, surrounded by medical equipment. In general, life expectancy in the past century has increased, but so has the use of medicine to prolong it artificially. Definitions of death have grown more complicated – does it lie in brain function or in the heart and lungs? – and are openly contested. And despite what Warraich calls medicine’s “obsession” with preventing or delaying death, there is no clear provision for bereaved families. That task waits to be taken up. Kathryn Mannix agrees in With the End in Mind: Dying, death and wisdom in an age of denial, suggesting that it “has become taboo to mention dying”. Through a “gradual transition”, Mannix says, we have lost the vocabulary for talking about death and depend instead on euphemism, lies and ambiguity; she wants us to “reclaim” a language of death.

This is a recurring theme among these seven books. For some, our inability to talk straight about death and dying is partly about the mystery of the end. Andrew Stark, in The Consolations of Mortality: Making sense of death, identifies the decline in religion in the West and the idea of the afterlife as pivotal to our lack of confidence in confronting death. Robert McCrum, in Every Third Thought: On life, death and the endgame, speculates that ageing and death present a particular conundrum to self-assured baby boomers, who try to give death the slip (“let’s talk about it another time . . .”). In From Here to Eternity: Travelling the world to find the good death, Caitlin Doughty expands the problem into a generic Western culture of death “avoidance” – we duck awkward conversations with the dying, hand our corpses to corporate professionals and, worst of all, treat grief with embarrassment and shame. Kevin Toolis, in My Father’s Wake: How the Irish teach us to live, love and die, describes a veritable “Western Death Machine”, in which public services, health professionals, the media and corporate bodies all conspire towards the removal of death and dying from the purview of ordinary people. A former war correspondent, Toolis has seen more than his fair share of death and is here to shake us out of our complacency. [Continue reading…]

Aristotle’s lessons on happiness

Edith Hall writes:

In the Western world, only since the mid-18th century has it been possible to discuss ethical questions publicly without referring to Christianity. Modern thinking about morality, which assumes that gods do not exist, or at least do not intervene, is in its infancy. But the ancient Greeks and Romans elaborated robust philosophical schools of ethical thought for more than a millennium, from the first professed agnostics such as Protagoras (fifth century BCE) to the last pagan thinkers. The Platonists’ Academy at Athens was not finally closed down until 529 CE, by the Emperor Justinian.

That longstanding tradition of moral philosophy is an invaluable legacy of ancient Mediterranean civilisation. It has prompted several contemporary secular thinkers, faced with the moral vacuum left by the decline of Christianity since the late 1960s, to revive ancient schools of thought. Stoicism, founded in Athens by the Cypriot Zeno in about 300 BCE, has advocates. Self-styled Stoic organisations on both sides of the Atlantic offer courses, publish books and blogposts, and even run an annual Stoic Week. Some Stoic principles underlay Dale Carnegie’s self-help classic How to Stop Worrying and Start Living (1948). He recommended Marcus Aurelius’ Meditations to its readers. But authentic ancient Stoicism was pessimistic and grim. It denounced pleasure. It required the suppression of emotions and physical appetites. It recommended the resigned acceptance of misfortune, rather than active engagement with the fine-grained business of everyday problem-solving. It left little room for hope, human agency or constructive repudiation of suffering.

Less familiar is the recipe for happiness (eudaimonia) advocated by Aristotle, yet it has much to be said for it. Outside of philosophy departments, where neo-Aristotelian thinkers such as Philippa Foot and Rosalind Hursthouse have championed his virtue ethics as an alternative to utilitarianism and Kantian approaches, it is not as well known as it should be. At his Lyceum in Athens, Aristotle developed a model for the maximisation of happiness that could be implemented by individuals and whole societies, and is still relevant today. It became known as ‘peripatetic philosophy’ because Aristotle conducted philosophical debates while strolling in company with his interlocutors.

The fundamental tenet of peripatetic philosophy is this: the goal of life is to maximise happiness by living virtuously, fulfilling your own potential as a human, and engaging with others – family, friends and fellow citizens – in mutually beneficial activities. Humans are animals, and therefore pleasure in responsible fulfilment of physical needs (eating, sex) is a guide to living well. But since humans are advanced animals, naturally inclining to live together in settled communities (poleis), we are ‘political animals’ (zoa politika). Humans must take responsibility for their own happiness since ‘god’ is a remote entity, the ‘unmoved mover’ who might maintain the universe’s motion but has neither any interest in human welfare, nor any providential function in rewarding virtue or punishing immorality. Yet purposively imagining a better, happier life is feasible since humans have inborn abilities that allow them to promote individual and collective flourishing. These include the inclinations to ask questions about the world, to deliberate about action, and to activate conscious recollection. [Continue reading…]