Earth’s mammals have shrunk dramatically, and meat-eating hominids are to blame

The Washington Post reports:

Life on Earth used to look a lot more impressive. Just a little more than 100,000 years ago, there were sloths as long as a giraffe is tall, monstrous bears whose shoulders were six feet off the ground, and Bunyanesque beavers that weighed as much as an NFL linebacker. But over time, all of these creatures disappeared in a manner so rapid and so mysterious that scientists still can’t fully explain what went down.

Did an asteroid dispatch the mega-beasts, similar to the one thought to have snuffed out the dinosaurs? Or was it widespread climatic change or a plague of new diseases? Did our penchant for hunting play a role?

It’s likely that a combination of factors led to a planet-wide demise of sizable mammals as the Ice Age came to a close. But a study published Thursday in the journal Science provides evidence that the major drivers were humans and other hominids.

“We looked at the entire fossil record for 65 million years, in million-year increments, and we asked the question, ‘Is it ever bad to be big?’ ” said lead author Felisa Smith, a paleoecologist at the University of New Mexico. For most of evolutionary history, the answer was no — larger body mass did not make an animal more likely to go extinct, she said. “For 65 million years, it didn’t matter what size you were.”

That is, until a new kind of predator arrived on the scene: Homo erectus. Around 1.8 million years ago, hominids that had long been dependent on plants became hominids that were “heavily and increasingly dependent on meat as a food source,” Smith said.

As these tool-wielding team hunters spread out from Africa, large-mammal extinctions followed. If you’re going to spend time and energy on a hunt, these early humans and their ancestors probably believed, it’s go big or go home.

“You hunt a rabbit, you have food for a small family for a day,” Smith said. “You hunt a mammoth, you feed the village.” [Continue reading…]

After a volcano’s ancient supereruption, humanity may have thrived

Shannon Hall writes:

The Toba supereruption [about 74,000 years ago] expelled roughly 10,000 times more rock and ash than the 1980 Mount St. Helens eruption. So much ejecta would have darkened skies worldwide, causing scientists to speculate that it might have plunged the Earth into a volcanic winter whose chill could be felt far from Indonesia. Climate models suggest that temperatures may have plummeted by as much as 30 degrees Fahrenheit. And in such a cold world, plants may have ceased growing, glaciers may have advanced, sea levels may have dropped and rainfall may have slowed.

Then in 1998, Stanley Ambrose, an anthropologist, linked the proposed disaster to genetic evidence that suggested a population bottleneck had occurred around the same time. He was certain that the Toba supereruption had caused the human population to decline to some 10,000 people — a close call for our ancestors.

“These were dramatic theories,” said Michael Petraglia, an archaeologist at the Max Planck Institute for the Science of Human History who was not involved in the study. “They were very popular — both in the scientific world, but also in the public imagination.”

The latest study, however, suggests that those theories are incorrect, Dr. Petraglia said. “We’re not seeing all the drama.” [Continue reading…]

Across human history, there’s little evidence large-scale social organization necessitates enduring inequality

David Graeber and David Wengrow write:

Stonehenge, it turns out, was only the latest in a very long sequence of ritual structures, erected in timber as well as stone, as people converged on the plain from remote corners of the British Isles, at significant times of year. Careful excavation has shown that many of these structures – now plausibly interpreted as monuments to the progenitors of powerful Neolithic dynasties – were dismantled just a few generations after their construction. Still more strikingly, this practice of erecting and dismantling grand monuments coincides with a period when the peoples of Britain, having adopted the Neolithic farming economy from continental Europe, appear to have turned their backs on at least one crucial aspect of it, abandoning cereal farming and reverting – around 3300 BC – to the collection of hazelnuts as a staple food source. Keeping their herds of cattle, on which they feasted seasonally at nearby Durrington Walls, the builders of Stonehenge seem likely to have been neither foragers nor farmers, but something in between. And if anything like a royal court did hold sway in the festive season, when they gathered in great numbers, then it could only have dissolved away for most of the year, when the same people scattered back out across the island.

Why are these seasonal variations important? Because they reveal that from the very beginning, human beings were self-consciously experimenting with different social possibilities. Anthropologists describe societies of this sort as possessing a ‘double morphology’. Marcel Mauss, writing in the early twentieth century, observed that the circumpolar Inuit, ‘and likewise many other societies . . . have two social structures, one in summer and one in winter, and that in parallel they have two systems of law and religion’. In the summer months, Inuit dispersed into small patriarchal bands in pursuit of freshwater fish, caribou, and reindeer, each under the authority of a single male elder. Property was possessively marked and patriarchs exercised coercive, sometimes even tyrannical power over their kin. But in the long winter months, when seals and walrus flocked to the Arctic shore, another social structure entirely took over as Inuit gathered together to build great meeting houses of wood, whale-rib, and stone. Within them, the virtues of equality, altruism, and collective life prevailed; wealth was shared; husbands and wives exchanged partners under the aegis of Sedna, the Goddess of the Seals.

Another example was the indigenous hunter-gatherers of Canada’s Northwest Coast, for whom winter – not summer – was the time when society crystallised into its most unequal form, and spectacularly so. Plank-built palaces sprang to life along the coastlines of British Columbia, with hereditary nobles holding court over commoners and slaves, and hosting the great banquets known as potlatch. Yet these aristocratic courts broke apart for the summer work of the fishing season, reverting to smaller clan formations, still ranked, but with an entirely different and less formal structure. In this case, people actually adopted different names in summer and winter, literally becoming someone else, depending on the time of year.

Perhaps most striking, in terms of political reversals, were the seasonal practices of 19th-century tribal confederacies on the American Great Plains – sometime, or one-time farmers who had adopted a nomadic hunting life. In the late summer, small and highly mobile bands of Cheyenne and Lakota would congregate in large settlements to make logistical preparations for the buffalo hunt. At this most sensitive time of year they appointed a police force that exercised full coercive powers, including the right to imprison, whip, or fine any offender who endangered the proceedings. Yet as the anthropologist Robert Lowie observed, this ‘unequivocal authoritarianism’ operated on a strictly seasonal and temporary basis, giving way to more ‘anarchic’ forms of organisation once the hunting season – and the collective rituals that followed – were complete.

Scholarship does not always advance. Sometimes it slides backwards. A hundred years ago, most anthropologists understood that those who live mainly from wild resources were not, normally, restricted to tiny ‘bands.’ That idea is really a product of the 1960s, when Kalahari Bushmen and Mbuti Pygmies became the preferred image of primordial humanity for TV audiences and researchers alike. As a result we’ve seen a return of evolutionary stages, really not all that different from the tradition of the Scottish Enlightenment: this is what Fukuyama, for instance, is drawing on, when he writes of society evolving steadily from ‘bands’ to ‘tribes’ to ‘chiefdoms,’ then finally, the kind of complex and stratified ‘states’ we live in today – usually defined by their monopoly of ‘the legitimate use of coercive force.’ By this logic, however, the Cheyenne or Lakota would have had to be ‘evolving’ from bands directly to states roughly every November, and then ‘devolving’ back again come spring. Most anthropologists now recognise that these categories are hopelessly inadequate, yet nobody has proposed an alternative way of thinking about world history in the broadest terms.

Quite independently, archaeological evidence suggests that in the highly seasonal environments of the last Ice Age, our remote ancestors were behaving in broadly similar ways: shifting back and forth between alternative social arrangements, permitting the rise of authoritarian structures during certain times of year, on the proviso that they could not last; on the understanding that no particular social order was ever fixed or immutable. Within the same population, one could live sometimes in what looks, from a distance, like a band, sometimes a tribe, and sometimes a society with many of the features we now identify with states. With such institutional flexibility comes the capacity to step outside the boundaries of any given social structure and reflect; to both make and unmake the political worlds we live in. If nothing else, this explains the ‘princes’ and ‘princesses’ of the last Ice Age, who appear to show up, in such magnificent isolation, like characters in some kind of fairy-tale or costume drama. Maybe they were almost literally so. If they reigned at all, then perhaps it was, like the kings and queens of Stonehenge, just for a season. [Continue reading…]

Britain left Stone Age 4,500 years ago as early Britons were replaced by metalworking migrants

BBC News reports:

The ancient population of Britain was almost completely replaced by newcomers about 4,500 years ago, a study shows.

The findings mean modern Britons trace just a small fraction of their ancestry to the people who built Stonehenge.

The astonishing result comes from analysis of DNA extracted from 400 ancient remains across Europe.

The mammoth study, published in Nature, suggests the newcomers, known as Beaker people, replaced 90% of the British gene pool in a few hundred years.

Lead author Prof David Reich, from Harvard Medical School in Cambridge, US, said: “The magnitude and suddenness of the population replacement is highly unexpected.”

The reasons remain unclear, but climate change, disease and ecological disaster could all have played a role.

People in Britain lived by hunting and gathering until agriculture was introduced from continental Europe about 6,000 years ago. These Neolithic farmers, who traced their origins to Anatolia (modern Turkey), built giant stone (or “megalithic”) structures such as Stonehenge in Wiltshire, huge earth mounds and sophisticated settlements such as Skara Brae in the Orkneys.

But towards the end of the Neolithic, about 4,450 years ago, a new way of life spread to Britain from Europe. People began burying their dead with stylised bell-shaped pots, copper daggers, arrowheads, stone wrist guards and distinctive perforated buttons. [Continue reading…]

Paleolithic parenting and animated GIFs

The creation of the moving image represents a technical advance in the arts comparable with the invention of the steam engine during the industrial revolution.

The transition from static to moving imagery was a watershed event in human history, through which people discovered a new way of capturing the visible world — or so it seemed.

It turns out, however, that long before the advent of civilization, our Paleolithic forebears figured out that the movement they saw in the living creatures around them could, by cunning means, be captured in crafted illusions.

Let’s run with the hypothesis that this 14,000-year-old artifact is indeed a toy. What does this tell us about its creator and the children for whom it was made?

Before people started congregating in large settlements and forming highly stratified societies, they weren’t enmeshed in a struggle for survival where daily life was all about finding the next meal without becoming one. On the contrary, the conditions of life were congenial to the pursuit of crafts that helped cultivate the imagination and delight the young.

While in our world, animation may most often be used for surrogate parenting — a way of giving kids a blind watcher that frees up parental hands for more pressing matters (like sending and receiving text messages) — I doubt that this was how our ancestors used toys.

For one thing, the challenges of time management are a stunningly recent development. We have accomplished an extraordinary feat: we have figured out how to live longer than ever while also feeling that we have less time than ever.

Did the Paleolithic dad say: Watch the running deer while I skin this rabbit and mom grinds those acorns?

I don’t think so. Much more likely was the age-old bonding experience of shared delight, as a child’s face lights up with pleasure at near-endless repetition. Paleolithic parents had plenty of time to play with their children.

These weren’t over-worked parents looking for ways to occupy neglected children. They were parents whose own lives were inseparable from those of their offspring. This was an epoch in which life was not partitioned into the discrete segments that define our own.

These were toys that came straight from the hands of the toy maker. They didn’t have to stand up to comparison with newer, better, more expensive toys; nor were they at risk of getting lost in mountains of discarded toys.

Again, a lesson in values: that those who have less generally have a capacity to appreciate more.

Do I belong to what Melvin Konner called:

… a long line of credulous people … who seem to believe that we have left something behind that is better in every way than what we have now and that the most apt way to solve our problems is to go backward as quickly as possible[?]

I don’t think so.

It’s easy for Konner and others to dismiss this kind of interest in human origins as being driven by a naive conception of an idyllic natural state, but who, if anyone, is actually proposing the impossible: a return to a mythical Eden?

The issue here isn’t whether we might by some means recreate or return to our Paleolithic past, but rather, how an understanding of that past might better inform the way we perceive the present.

We live under the spell of many beguiling technological false promises, none more pernicious than the notion that doing things faster frees up time. The promise of the future is always that it’s going to be better.

When it turns out that human beings cracked the code of animation 14,000 to 20,000 years ago, this should give us pause to consider not merely the significance of this event as a technological breakthrough. We can also reflect on the differences between then and now in terms of how this facility in representation is put to use in human culture.

Thanks to the invention and portability of the animated GIF, it’s now possible for humans en masse to catch a glimpse of a prehistoric precursor of the very same technology: still images conjured to create the illusion of movement.

What is not the same is the way human minds are typically engaging with the technology.

I would argue that the Paleolithic human mind, operating in its relatively uncluttered world, would, with delight and relatively undistracted attention, appreciate the full effect. The moving deer would not only be captivating but perhaps also magical.

While the motion might rely on a trick of the eye, the toy might thereby be infused with the spirit of the deer. While the child was entertained, he was perhaps also receiving an early initiation into the art of hunting.

For the audience of the animated GIF, however, the most common effect is at most a momentary muscle-flex — a retweet — perhaps accompanied by an audible reaction — “cool” — before, within a second or two, attention turns elsewhere.

Never has humanity been so well-fed while also experiencing so much growing hunger. Our restless attention forever longs for more when the present never seems to provide enough.

As we go forward, we also go backward, and not in a good way.

Britain First and the first Britons

The white supremacists who chant “blood and soil” (borrowing this phrase from the Nazis’ Blut und Boden) think white-skinned people have a special claim to the lands of Europe and North America.

This is an arrogant and ignorant belief to hold on this side of the Atlantic, where every white person has immigrant ancestry originating from Europe, but European whiteness in terms of origin (not superiority) is a less controversial notion. That is to say, even among those of us who support the development and protection of inclusive, racially diverse societies, it’s generally believed that prior to the modern era of mass migration, European societies were overwhelmingly white because, to put it crudely, Europe is where white people come from.

It turns out that European whiteness has surprisingly shallow roots, as new research findings based on a DNA analysis of “Cheddar Man” indicate. (Readers who might only be familiar with Cheddar as the name of a cheese should note that the cheese is named after the place.)

The Guardian reports:

The first modern Britons, who lived about 10,000 years ago, had “dark to black” skin, a groundbreaking DNA analysis of Britain’s oldest complete skeleton has revealed.

The fossil, known as Cheddar Man, was unearthed more than a century ago in Gough’s Cave in Somerset. Intense speculation has built up around Cheddar Man’s origins and appearance because he lived shortly after the first settlers crossed from continental Europe to Britain at the end of the last ice age. People of white British ancestry alive today are descendants of this population.

It was initially assumed that Cheddar Man had pale skin and fair hair, but his DNA paints a different picture, strongly suggesting he had blue eyes, a very dark brown to black complexion and dark curly hair.

The discovery shows that the genes for lighter skin became widespread in European populations far later than originally thought – and that skin colour was not always a proxy for geographic origin in the way it is often seen to be today.

Tom Booth, an archaeologist at the Natural History Museum who worked on the project, said: “It really shows up that these imaginary racial categories that we have are really very modern constructions, or very recent constructions, that really are not applicable to the past at all.”

Yoan Diekmann, a computational biologist at University College London and another member of the project’s team, agreed, saying the connection often drawn between Britishness and whiteness was “not an immutable truth. It has always changed and will change”.

Britain First, as a neo-fascist organization that claims to represent “indigenous British people,” should take note.

Whiteness, construed as a marker of geographic origin, identifies the diversity of those origins more than it ties them to any particular place.

Or to put it another way: those of us with fair skin should probably view ourselves as mongrels of the human race.