The people who invaded the Capitol have spent years showing us who they are online
Maybe you saw this coming nearly a decade ago, when #YourSlipIsShowing laid bare how racist Twitter users were impersonating Black women on the internet. Maybe, for you, it was during Gamergate, the online abuse campaign targeting women in the video game industry. Or maybe it was the mass shooting in Christchurch, New Zealand, when a gunman steeped in the culture of 8chan livestreamed himself murdering dozens of people.
Maybe it was when you, or your friend, or your community, became the target of an extremist online mob, and you saw online anger become real-world danger and harm.
Or maybe what happened on Wednesday, when a rabble of internet-fueled Trump supporters invaded the Capitol, came as a surprise.
For weeks they had been planning their action in plain sight on the internet—but they have been showing you who they are for years. The level of shock you feel right now about the power and danger of online extremism depends on whether you were paying attention.
The mob that tried to block Congress from confirming Joe Biden’s presidential victory showed how the stupidity and danger of the far-right internet can spill into the real world, and this time it struck at the center of the US government. Neo-Nazi streamers weren’t just inside the Capitol; they were putting on a show for audiences of tens of thousands of people who egged them on in the chats. The mob was having fun doing memes in the halls of American democracy while a woman—a Trump supporter whose social media history shows her devotion to QAnon—was killed trying to break into congressional offices.
The past year, especially since the pandemic began, has been one giant demonstration of the consequences of inaction: the consequences of ignoring the many, many people who have been begging social media companies to take seriously the meme-making extremists and conspiracy theorists who have thrived on their platforms.
Facebook and Twitter acted to slow the rise of QAnon over the summer, but only after the pro-Trump conspiracy theory was able to grow relatively unrestricted there for three years. Account bans and algorithm tweaks have long been too little, too late to deal with racists, extremists, and conspiracy theorists, and they have rarely addressed the fact that these powerful systems were working exactly as intended.
For a story in October, I spoke with a small handful of the people who could have told you this was coming. Researchers, technologists, and activists told me that major social media companies have, throughout their existence, chosen to do nothing, or to act only after their platforms have caused abuse and harm.
Ariel Waldman tried to get Twitter to meaningfully address abuse there in 2008. Researchers like Shafiqah Hudson, I’Nasah Crockett, and Shireen Mitchell have spent years tracking exactly how harassment works and finds an audience on these platforms. Whitney Phillips said she is haunted by laughter—not just other people’s, but also her own—from the earliest days of her research into online culture and trolling, when overwhelmingly white researchers and personalities treated the extremists among them as edgy curiosities.
Ellen Pao, who briefly served as CEO of Reddit in 2014 and stepped down after introducing the platform’s first anti-harassment policy, was astonished that Reddit had banned r/The_Donald only in June 2020, after years of accumulating evidence that the popular pro-Trump message board served as an organizing space for extremists and a channel for mob abuse. Of course, by the time it was banned, many of its users had already migrated away from Reddit to TheDonald.win, an independent forum created by the same people who had run the subreddit. Its pages were filled with dozens of calls for violence ahead of Wednesday’s rally turned attempted coup.
Facebook, Twitter, and YouTube didn’t create conspiracy thinking or extremist ideologies, of course. Nor did they invent the idea of dangerous personality cults. But these platforms have—by design—handed those groups the mechanisms to reach much larger audiences much faster, and to recruit and radicalize new converts, even at the expense of the people and communities those ideologies target for abuse. And crucially, even when it was clear what was happening, they chose the minimal amount of change—or decided not to intervene at all.