Has the tide turned against TikTok, Telegram and X?
Three major events have shaken up the social-media world in the past two weeks. First, French authorities detained Pavel Durov, the iconoclastic billionaire behind the online platform Telegram. Then, a judge suspended the microblogging service X in Brazil. Soon after, a federal appeals court in Pennsylvania ruled that the mother of a 10-year-old child who died copying a TikTok self-asphyxiation video can sue the service, circumventing a blanket legal immunity the company has long claimed.
While each of these events took place in a different country with its own laws, together they demonstrate a sudden shift in the balance of power between governments and technology companies. We are nearer to the end of impunity for tech titans who have evaded accountability for the offline harms and societal disruptions wrought by the platforms that built their fabulous wealth.
In France, Mr. Durov was charged with complicity in enabling fraud and the distribution of drugs and child sexual abuse material on Telegram, as well as with failure to cooperate with law enforcement. As research from Stanford showed last year, Telegram users are able to share and sell vile content, helped by a content moderation operation that is, to put it charitably, sparse.
Though Mr. Durov venerates freedom of expression in his public statements, framing himself, as other tech billionaires do, as a tribune of technology for the people, he built a business in part by providing a haven for criminals who profit from child abuse. And his company, compared with other tech giants, does relatively little to try to stop it. Telegram has almost completely refused to assist governments in their efforts to root it out, which is partly what led to Mr. Durov’s detention. (Mr. Durov is now pledging to “significantly improve” the platform, though he also recently defended Telegram’s current moderation approach.)
Elon Musk, who criticized Mr. Durov’s arrest, also described Brazil’s suspension of his own platform, X, in apocalyptic terms. But as in the Telegram case, the details here matter. A powerful judge has been ordering tech companies to remove content in an effort to clamp down on lies and misinformation online. When Mr. Musk would not comply and closed X’s offices in Brazil, the judge suspended the service.
The suspension is due in large part to Mr. Musk’s refusal to follow Brazilian law requiring foreign companies (including Google and Meta) to have a legal representative in the country. And while Mr. Musk is fond of portraying himself as a stalwart defender of transparency and “free speech,” his record shows the opposite to be the case, from censoring tweets critical of Turkey’s authoritarian president, Recep Tayyip Erdogan, to agreeing to censorship in India.
In the case of TikTok, a federal court in Pennsylvania ruled the company could potentially be held liable for harm because its algorithmically generated “For You Page” showed children “blackout challenge” videos depicting people choking themselves until they passed out. Several children died attempting the challenge. The ruling, critically, found that Section 230, a 1990s-era law that has been heavily relied on for years by the social-media titans to defend themselves from lawsuits, doesn’t apply in this case because TikTok’s own algorithm proactively promoted the damaging content to its users.
If the ruling is ultimately upheld, it will shift how our society looks at the responsibilities and obligations of tech companies, parallel to the expectations we have of food and drug corporations. Such a shift might hinder innovation. But it also might bolster public health and democratic integrity.