Facebook and the normalization of deviance

Sue Halpern writes:

When the sociologist Diane Vaughan came up with the term “the normalization of deviance,” she was referring to NASA administrators’ disregard of the flaw that caused the Challenger space shuttle to explode, in 1986. The idea was that people in an organization can become so accepting of a problem that they no longer consider it to be problematic. (In the case of the Challenger, NASA had been warned that the shuttle’s O-rings were likely to fail in cold temperatures.) Consider Facebook: for years, its leadership has known that the social network has abetted political polarization, social unrest, and even ethnic cleansing. More recently, it has been aware that its algorithms have promoted misinformation and disinformation campaigns about COVID-19 and vaccines. Over the past year, the company made piecemeal attempts to remove false information about the pandemic, issuing its most comprehensive ban in February. An analysis last month by the nonprofit group First Draft, however, found that at least thirty-two hundred posts making unfounded claims about COVID-19 vaccines had appeared after the February ban. Two weeks ago, the top post on Facebook about the vaccines was a video of Tucker Carlson, on Fox News, “explaining” that they don’t work.

Over the years, Mark Zuckerberg, Facebook’s C.E.O., has issued a cascade of apologies for the company’s privacy breaches, algorithmic biases, and promotion of hate speech, among other issues. Too often, the company seems to change course only after such issues become public; in many cases, it had been made aware of those failures long before, by Facebook employees, injured parties, or objective evidence. It took months for the firm to acknowledge that political ads on its platform were being used to manipulate voters, and then to create a way for users to find out who was paying for them. Last December, the company finally reconfigured its hate-speech algorithm, after years of criticism from Black groups that the algorithm disproportionately removed posts by Black users discussing racial discrimination. “I think it’s more useful to make things happen and then, like, apologize later,” Zuckerberg said early in his career. We’ve witnessed the consequences ever since. [Continue reading…]