Inside the struggle to define life

Ian Sample writes:

All the brain cells of life on Earth still cannot explain life on Earth. Its most intelligent species has uncovered the building blocks of matter, read countless genomes and watched spacetime quiver as black holes collide. It understands much of how living creatures work, but not how they came to be. There is no agreement, even, on what life is.

The conundrum of life is so fundamental that to solve it would rank among the most important achievements of the human mind. But for all scientists’ efforts – and there have been plenty – the big questions remain. If biology is defined as the study of life, on this it has failed to deliver.

But enlightenment may come from another direction. Rather than biology, some scientists are now looking to physics for answers, in particular the physics of information. Buried in the rules that shape information lie the secrets of life and perhaps even the reason for our existence.

That, at least, is the bold proposal from Paul Davies, a prominent physicist who explores the idea in his forthcoming book, The Demon in the Machine. Published next week, it continues a theme of thinking that landed Davies the $1m Templeton prize for contributions to religious thought and inquiry.

As director of the Beyond Center for Fundamental Concepts in Science at Arizona State University, Davies is well placed to spot the next wave that will crash over science. What he sees on the horizon is a revolution that brings physics and biology together through the common science of information.

“The basic hypothesis is this,” Davies says. “We have fundamental laws of information that bring life into being from an incoherent mish-mash of chemicals. The remarkable properties we associate with life are not going to come about by accident.”

The proposal takes some unpacking. Davies believes that the laws of nature as we know them today are insufficient to explain what life is and how it came about. We need to find new laws, he says, or at least new principles, which describe how information courses around living creatures. Those rules may not only nail down what life is, but actively favour its emergence. [Continue reading…]

Emergence: How complex wholes arise from simple parts

John Rennie writes:

You could spend a lifetime studying an individual water molecule and never deduce the precise hardness or slipperiness of ice. Watch a lone ant under a microscope for as long as you like, and you still couldn’t predict that thousands of them might collaboratively build bridges with their bodies to span gaps. Scrutinize the birds in a flock or the fish in a school and you wouldn’t find one that’s orchestrating the movements of all the others.

Nature is filled with such examples of complex behaviors that arise spontaneously from relatively simple elements. Researchers have even coined the term “emergence” to describe these puzzling manifestations of self-organization, which can seem, at first blush, inexplicable. Where does the extra injection of complex order suddenly come from?

Answers are starting to come into view. One is that these emergent phenomena can be understood only as collective behaviors — there is no way to make sense of them without looking at dozens, hundreds, thousands or more of the contributing elements en masse. These wholes are indeed greater than the sums of their parts.

Another is that even when the elements continue to follow the same rules of individual behavior, external considerations can change the collective outcome of their actions. For instance, ice doesn’t form at zero degrees Celsius because the water molecules suddenly become stickier to one another. Rather, the average kinetic energy of the molecules drops low enough for the repulsive and attractive forces among them to fall into a new, more springy balance. That liquid-to-solid transition is such a useful comparison for scientists studying emergence that they often characterize emergent phenomena as phase changes.
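A toy model makes the point concrete. Below is a minimal Python sketch of the two-dimensional Ising model, the textbook example of collective order arising from purely local rules; the lattice size, temperatures and sweep counts are illustrative choices, not values from the article.

```python
# A minimal sketch of emergence from simple parts: the 2D Ising model.
# Each site follows one local rule (align with its four neighbours,
# subject to thermal noise), yet the lattice as a whole undergoes a
# phase change near T ~ 2.27 in these units, much like the freezing
# transition described above. Lattice size, temperatures and sweep
# counts are illustrative choices, not values from the article.
import numpy as np

rng = np.random.default_rng(0)

def sweep(spins, T):
    """One Metropolis sweep: each update looks only at nearest neighbours."""
    n = spins.shape[0]
    for _ in range(n * n):
        i, j = rng.integers(n, size=2)
        nb = (spins[(i + 1) % n, j] + spins[(i - 1) % n, j] +
              spins[i, (j + 1) % n] + spins[i, (j - 1) % n])
        dE = 2 * spins[i, j] * nb           # energy cost of flipping site (i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1

def magnetization(T, n=24, sweeps=300):
    spins = np.ones((n, n), dtype=int)      # start from the fully ordered state
    for _ in range(sweeps):
        sweep(spins, T)
    return abs(spins.mean())

for T in (1.5, 2.27, 3.5):
    print(f"T = {T}: |magnetization| ~ {magnetization(T):.2f}")
# Below the transition the collective order survives the noise (|m| near 1);
# above it the same local rule yields a disordered lattice (|m| near 0).
```

No single spin "decides" to order; the ordered phase exists only as a property of the whole lattice, which is the sense in which the whole exceeds the sum of its parts.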

Our latest In Theory video on emergence explains more about how throngs of simple parts can self-organize into a more extraordinary whole:


The crisis inside the physics of time

Marcia Bartusiak writes:

Poets often think of time as a river, a free-flowing stream that carries us from the radiant morning of birth to the golden twilight of old age. It is the span that separates the delicate bud of spring from the lush flower of summer.

Physicists think of time in somewhat more practical terms. For them, time is a means of measuring change—an endless series of instants that, strung together like beads, turn an uncertain future into the present and the present into a definite past. The very concept of time allows researchers to calculate when a comet will round the sun or how a signal traverses a silicon chip. Each step in time provides a peek at the evolution of nature’s myriad phenomena.

In other words, time is a tool. In fact, it was the first scientific tool. Time can now be sliced into slivers as thin as one ten-trillionth of a second. But what is being sliced? Unlike mass and distance, time cannot be perceived by our physical senses. We don’t see, hear, smell, touch, or taste time. And yet we somehow measure it. As a cadre of theorists attempt to extend and refine the general theory of relativity, Einstein’s momentous law of gravitation, they have a problem with time. A big problem.

“It’s a crisis,” says mathematician John Baez, of the University of California at Riverside, “and the solution may take physics in a new direction.” Not the physics of our everyday world. Stopwatches, pendulums, and hydrogen maser clocks will continue to keep track of nature quite nicely here in our low-energy earthly environs. The crisis arises when physicists attempt to merge the macrocosm—the universe on its grandest scale—with the microcosm of subatomic particles. [Continue reading…]

Studying time is like holding a snowflake

Brian Gallagher writes:

In April, in the famous Faraday Theatre at the Royal Institution in London, Carlo Rovelli gave an hour-long lecture on the nature of time. A red thread spanned the stage, a metaphor for the Italian theoretical physicist’s subject. “Time is a long line,” he said. To the left lies the past—the dinosaurs, the big bang—and to the right, the future—the unknown. “We’re sort of here,” he said, hanging a carabiner on it, as a marker for the present.

Then he flipped the script. “I’m going to tell you that time is not like that,” he explained.

Rovelli went on to challenge our common-sense notion of time, starting with the idea that it ticks everywhere at a uniform rate. In fact, clocks tick slower when they are in a stronger gravitational field. When you move nearby clocks showing the same time into different fields—one in space, the other on Earth, say—and then bring them back together again, they will show different times. “It’s a fact,” Rovelli said, and it means “your head is older than your feet.” Also a non-starter is any shared sense of “now.” We don’t really share the present moment with anyone. “If I look at you, I see you now—well, but not really, because light takes time to come from you to me,” he said. “So I see you sort of a little bit in the past.” As a result, “now” means nothing beyond the temporal bubble “in which we can disregard the time it takes light to go back and forth.”
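Rovelli's head-and-feet remark can be checked with the weak-field time-dilation formula, in which the fractional rate difference between two clocks separated by a height difference Δh is roughly gΔh/c². The short Python estimate below uses an assumed height difference and lifespan, not numbers from the lecture.

```python
# A back-of-the-envelope check of Rovelli's "your head is older than
# your feet" remark, using the weak-field formula
#   delta_tau / tau ~ g * delta_h / c**2.
# The height difference and lifespan are illustrative assumptions.

g = 9.81                                  # surface gravity, m/s^2
c = 2.998e8                               # speed of light, m/s
delta_h = 1.7                             # head-to-feet height, m (assumed)
lifetime_s = 80 * 365.25 * 24 * 3600      # ~80 years in seconds (assumed)

fractional_shift = g * delta_h / c**2
extra_seconds = fractional_shift * lifetime_s

print(f"fractional rate difference: {fractional_shift:.1e}")
print(f"extra aging of the head over ~80 years: {extra_seconds * 1e9:.0f} ns")
```

The head comes out a few hundred nanoseconds older over a lifetime: far too small to feel, though modern optical clocks can resolve rate differences of this size.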


Michio Kaku: Why there are higher dimensions


The peculiar numbers that could underlie the laws of nature


Natalie Wolchover writes:

In 2014, a graduate student at the University of Waterloo, Canada, named Cohl Furey rented a car and drove six hours south to Pennsylvania State University, eager to talk to a physics professor there named Murat Günaydin. Furey had figured out how to build on a finding of Günaydin’s from 40 years earlier — a largely forgotten result that supported a powerful suspicion about fundamental physics and its relationship to pure math.

The suspicion, harbored by many physicists and mathematicians over the decades but rarely actively pursued, is that the peculiar panoply of forces and particles that comprise reality spring logically from the properties of eight-dimensional numbers called “octonions.”

As numbers go, the familiar real numbers — those found on the number line, like 1, π and -83.777 — just get things started. Real numbers can be paired up in a particular way to form “complex numbers,” first studied in 16th-century Italy, that behave like coordinates on a 2-D plane. Adding, subtracting, multiplying and dividing is like translating and rotating positions around the plane. Complex numbers, suitably paired, form 4-D “quaternions,” discovered in 1843 by the Irish mathematician William Rowan Hamilton, who on the spot ecstatically chiseled the formula into Dublin’s Broome Bridge. John Graves, a lawyer friend of Hamilton’s, subsequently showed that pairs of quaternions make octonions: numbers that define coordinates in an abstract 8-D space.

There the game stops. Proof surfaced in 1898 that the reals, complex numbers, quaternions and octonions are the only kinds of numbers that can be added, subtracted, multiplied and divided. The first three of these “division algebras” would soon lay the mathematical foundation for 20th-century physics, with real numbers appearing ubiquitously, complex numbers providing the math of quantum mechanics, and quaternions underlying Albert Einstein’s special theory of relativity. This has led many researchers to wonder about the last and least-understood division algebra. Might the octonions hold secrets of the universe? [Continue reading…]
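The pairing procedure Wolchover describes is the Cayley-Dickson construction, and it is simple enough to run. The Python sketch below builds complex numbers, quaternions and octonions as nested pairs of reals and multiplies a few basis units; the nested-tuple representation and the sign convention are illustrative choices, and the snippet only demonstrates that quaternions stop commuting and octonions stop associating, nothing about division.

```python
# Cayley-Dickson doubling: a number in the 2n-dimensional algebra is a
# pair (a, b) of numbers from the n-dimensional one. The representation
# and sign convention here are illustrative, not taken from the article.

def conj(x):
    if isinstance(x, (int, float)):
        return x
    a, b = x
    return (conj(a), neg(b))

def neg(x):
    if isinstance(x, (int, float)):
        return -x
    return (neg(x[0]), neg(x[1]))

def add(x, y):
    if isinstance(x, (int, float)):
        return x + y
    return (add(x[0], y[0]), add(x[1], y[1]))

def mul(x, y):
    if isinstance(x, (int, float)):
        return x * y
    a, b = x
    c, d = y
    # One standard convention: (a, b)(c, d) = (ac - conj(d) b, da + b conj(c))
    return (add(mul(a, c), neg(mul(conj(d), b))),
            add(mul(d, a), mul(b, conj(c))))

def unit(k, dim):
    """Basis element e_k of the dim-dimensional algebra as nested pairs."""
    coeffs = [1 if i == k else 0 for i in range(dim)]
    def build(c):
        return c[0] if len(c) == 1 else (build(c[:len(c) // 2]), build(c[len(c) // 2:]))
    return build(coeffs)

def flat(x):
    """Flatten a nested-pair number back into its coefficient list."""
    return [x] if isinstance(x, (int, float)) else flat(x[0]) + flat(x[1])

# Quaternions (4-D): multiplication is no longer commutative.
i, j = unit(1, 4), unit(2, 4)
print(flat(mul(i, j)), flat(mul(j, i)))        # i*j = k, but j*i = -k

# Octonions (8-D): multiplication is no longer even associative.
e1, e2, e4 = unit(1, 8), unit(2, 8), unit(4, 8)
print(flat(mul(mul(e1, e2), e4)))              # (e1*e2)*e4
print(flat(mul(e1, mul(e2, e4))))              # e1*(e2*e4): differs by a sign
```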

The Standard Model of particle physics: The absolutely amazing theory of almost everything

[Image: How does our world work on a subatomic level? Credit: Varsha Y S, CC BY-SA]

By Glenn Starkman, Case Western Reserve University

The Standard Model. What a dull name for the most accurate scientific theory known to human beings.

More than a quarter of the Nobel Prizes in physics of the last century are direct inputs to or direct results of the Standard Model. Yet its name suggests that if you can afford a few extra dollars a month you should buy the upgrade. As a theoretical physicist, I’d prefer The Absolutely Amazing Theory of Almost Everything. That’s what the Standard Model really is.

Many recall the excitement among scientists and media over the 2012 discovery of the Higgs boson. But that much-ballyhooed event didn’t come out of the blue – it capped a five-decade undefeated streak for the Standard Model. Every fundamental force but gravity is included in it. Every attempt to overturn it, to demonstrate in the laboratory that it must be substantially reworked – and there have been many over the past 50 years – has failed.

In short, the Standard Model answers this question: What is everything made of, and how does it hold together?

[Read more…]
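For readers who want the inventory spelled out, the summary below lists the Standard Model's particle content as given in standard references, not as presented in Starkman's article; the Python dictionary is just one convenient way to organise it.

```python
# A compact summary of the Standard Model's particle content, drawn from
# standard references; the layout here is an illustrative choice.
STANDARD_MODEL = {
    "quarks": ["up", "down", "charm", "strange", "top", "bottom"],
    "leptons": ["electron", "electron neutrino",
                "muon", "muon neutrino",
                "tau", "tau neutrino"],
    "gauge bosons": {                      # force carriers; gravity is absent
        "photon": "electromagnetism",
        "W and Z bosons": "weak nuclear force",
        "gluons": "strong nuclear force",
    },
    "scalar boson": ["Higgs boson"],       # the 2012 discovery
}

matter = len(STANDARD_MODEL["quarks"]) + len(STANDARD_MODEL["leptons"])
print(f"{matter} matter particles, before counting antiparticles and colour")
```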

What are the limits of manipulating nature?

In Scientific American, Neil Savage writes:

Matt Trusheim flips a switch in the darkened laboratory, and an intense green laser illuminates a tiny diamond locked in place beneath a microscope objective. On a computer screen an image appears, a fuzzy green cloud studded with brighter green dots. The glowing dots are color centers in the diamond—tiny defects where two carbon atoms have been replaced by a single atom of tin, shifting the light passing through from one shade of green to another.

Later, that diamond will be chilled to the temperature of liquid helium. By controlling the crystal structure of the diamond on an atom-by-atom level, bringing it to within a few degrees of absolute zero and applying a magnetic field, researchers at the Quantum Photonics Laboratory run by physicist Dirk Englund at the Massachusetts Institute of Technology think they can select the quantum-mechanical properties of photons and electrons with such precision that they can transmit unbreakable secret codes.

Trusheim, a postdoctoral researcher in the lab, is one of many scientists trying to figure out just which atoms embedded in which crystals under what conditions will give them that kind of control. Indeed, scientists around the world are tackling the hard problem of controlling nature at the level of atoms and below, down to electrons or even fractions of electrons. Their aim is to find the knobs that control the fundamental properties of matter and energy and turn those knobs to customize matter and energy, creating ultrapowerful quantum computers or superconductors that work at room temperature.

These scientists face two main challenges. On a technical level, the work is extremely difficult. Some crystals, for instance, must be made to 99.99999999 percent purity in vacuum chambers emptier than space. The more fundamental challenge is that the quantum effects these researchers want to harness—for example, the ability of a particle to be in two states at once, à la Schrödinger’s cat—happen at the level of individual electrons. Up here in the macro world, the magic goes away. Researchers manipulating matter at the smallest scales, therefore, are trying to coax nature into behaving in ways that strain at the limits imposed by fundamental physics. The degree to which they succeed will help determine our scientific understanding and technological capacity in the decades to come. [Continue reading…]
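The "two states at once" behaviour mentioned above can be stated in miniature. The sketch below prepares a single generic qubit in an equal superposition and samples measurement outcomes with the Born rule; it is textbook quantum mechanics, not a model of the MIT group's diamond devices.

```python
# A generic two-level system (qubit) in superposition, with measurement
# statistics from the Born rule. Purely illustrative textbook quantum
# mechanics, not a model of the diamond colour centers described above.
import numpy as np

rng = np.random.default_rng(1)

ket0 = np.array([1.0, 0.0])              # definite state |0>
ket1 = np.array([0.0, 1.0])              # definite state |1>

psi = (ket0 + ket1) / np.sqrt(2)         # equal superposition of |0> and |1>

probs = np.abs(psi) ** 2                 # Born rule: outcome probabilities
samples = rng.choice([0, 1], size=10_000, p=probs)

print("P(0), P(1) =", probs)                              # [0.5 0.5]
print("observed frequencies:", np.bincount(samples) / samples.size)
```

Here |0> and |1> stand in for any two-level degree of freedom, such as the spin of a single electron in one of those colour centers.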

Free will, video games, and the most profound quantum mystery

David Kaiser writes:

The word “predictable” first entered the English language two centuries ago. Its début came in neither a farmer’s almanac nor a cardsharp’s manual but in The Monthly Repository of Theology and General Literature, a Unitarian periodical. In 1820, one Stephen Freeman wrote a dense treatise in which he criticized the notion that human behavior—seemingly manifest “amidst the conflicting, boisterous, unreasonable wills of men, all acting, as they feel they do, their various parts with complete freedom of choice”—somehow existed outside the domain of cause and effect. Freeman (“free man,” no less!) argued, instead, that human consciousness and our perception of free will must be subject to chains of causation. “What but this certainty, this necessity, can render any event, even such as depends on the free-will of intelligent agents, predictable?” he asked.

This week, in the journal Nature, a collaboration of more than a hundred quantum physicists, distributed across twelve laboratories in eleven countries on five continents, turned Freeman’s formulation on its head. With the help of high-powered lasers, superconducting magnets, and state-of-the-art machine-learning algorithms, they concluded that “if human will is free, there are physical events . . . that are intrinsically random, that is, impossible to predict.” The group dubbed their experiment the Big Bell Test, after the renowned twentieth-century physicist John S. Bell.

The question at the center of Bell’s work is whether objects in the real world, including elementary particles, have definite properties of their own, independent of whether anyone happens to measure them. Quantum theory holds that they do not—that the act of performing a measurement doesn’t so much reveal a preëxisting value as summon it forth. (It is as though you had no definite weight until you stepped on your bathroom scale.) The Danish physicist Niels Bohr, writing in the nineteen-thirties, argued that the outcomes of quantum measurements were thus truly, inherently random. [Continue reading…]
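The kind of test Bell proposed, and that the Big Bell Test scaled up, is captured by the standard CHSH inequality. The Python sketch below uses the textbook singlet-state correlation and a common choice of measurement angles; it is a generic illustration, not the collaboration's analysis, which drew its measurement settings from choices made by human volunteers.

```python
# A textbook CHSH illustration. For two particles in the singlet state,
# quantum mechanics predicts the correlation E(a, b) = -cos(a - b) for
# spin measurements along angles a and b. Any local hidden-variable
# theory obeys |S| <= 2; quantum mechanics reaches 2*sqrt(2).
import numpy as np

def E(a, b):
    """Quantum-mechanical correlation for the singlet state."""
    return -np.cos(a - b)

# Measurement angles (radians) that maximise the quantum violation.
a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
print(f"|S| = {abs(S):.3f}  (local-realist bound 2, quantum maximum {2 * np.sqrt(2):.3f})")
```

A measured |S| above 2 is what rules out the picture of particles carrying pre-set answers, provided the choice of measurement settings is genuinely free, which is where the question of free will enters.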

Taming the multiverse: Stephen Hawking’s final theory about the big bang

University of Cambridge, April 27, 2018

Professor Stephen Hawking’s final theory on the origin of the universe, which he worked on in collaboration with Professor Thomas Hertog from KU Leuven, has been published today in the Journal of High Energy Physics.

The theory, which was submitted for publication before Hawking’s death earlier this year, is based on string theory and predicts the universe is finite and far simpler than many current theories about the big bang say.

Professor Hertog, whose work has been supported by the European Research Council, first announced the new theory at a conference at the University of Cambridge in July of last year, organised on the occasion of Professor Hawking’s 75th birthday.

Modern theories of the big bang predict that our local universe came into existence with a brief burst of inflation – in other words, a tiny fraction of a second after the big bang itself, the universe expanded at an exponential rate. It is widely believed, however, that once inflation starts, there are regions where it never stops. It is thought that quantum effects can keep inflation going forever in some regions of the universe so that globally, inflation is eternal. The observable part of our universe would then be just a hospitable pocket universe, a region in which inflation has ended and stars and galaxies formed.
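For a sense of what "an exponential rate" means here, the textbook description of that inflationary burst (standard illustrative numbers, not figures from Hawking and Hertog's paper) has the scale factor grow as

$$a(t) = a_0\, e^{Ht}, \qquad N \equiv H\,\Delta t \gtrsim 60 \quad\Rightarrow\quad \frac{a_{\text{end}}}{a_{\text{start}}} \gtrsim e^{60} \approx 10^{26},$$

so an episode lasting a tiny fraction of a second can stretch every distance by a factor of roughly 10^26 or more.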

“The usual theory of eternal inflation predicts that globally our universe is like an infinite fractal, with a mosaic of different pocket universes, separated by an inflating ocean,” said Hawking in an interview last autumn. “The local laws of physics and chemistry can differ from one pocket universe to another, which together would form a multiverse. But I have never been a fan of the multiverse. If the scale of different universes in the multiverse is large or infinite, the theory can’t be tested.”

In their new paper, Hawking and Hertog say this account of eternal inflation as a theory of the big bang is wrong. “The problem with the usual account of eternal inflation is that it assumes an existing background universe that evolves according to Einstein’s theory of general relativity and treats the quantum effects as small fluctuations around this,” said Hertog. “However, the dynamics of eternal inflation wipes out the separation between classical and quantum physics. As a consequence, Einstein’s theory breaks down in eternal inflation.” [Read more…]