Facebook saw evidence its platform promotes divisiveness and extremism, but Zuckerberg wasn’t interested
Facebook had evidence suggesting that the company’s algorithms encourage polarization and “exploit the human brain’s attraction to divisiveness,” but top executives including CEO Mark Zuckerberg killed or weakened proposed solutions, The Wall Street Journal reported Tuesday.
The effort to better understand Facebook’s impact on user behavior started in response to the Cambridge Analytica scandal, and its internal researchers determined that, contrary to the company’s mission of connecting the world, its products were having the opposite effect, according to the paper.
One 2016 report found “64% of all extremist group joins are due to our recommendation tools,” with most people joining at the suggestion of Facebook’s “Groups You Should Join” and “Discover” algorithms. Researchers noted that “our recommendation systems grow the problem,” according to the paper.
The Wall Street Journal reported that Facebook teams pitched multiple fixes, including: limiting the spread of information from groups’ most hyperactive and hyperpartisan users; suggesting a wider variety of groups than users might normally encounter; and creating subgroups for heated debates to prevent them from derailing entire groups.
However, Zuckerberg and policy chief Joel Kaplan often nixed these proposals entirely or significantly diluted them, according to the paper. Zuckerberg eventually lost interest in trying to address the polarization problem, the Journal reported, and was concerned that the proposed solutions could limit user growth.
In response to the pitch about limiting the virality of hyperpartisan users’ posts, Zuckerberg reportedly asked the team not to bring something like that to him again.