What’s (Really) at Stake with the Arrest of Pavel Durov
It is not every day that the CEO of a major technology company gets arrested. That is why there is so much talk about Pavel Durov, the head of Telegram, who was arrested in France on August 24. In the first hours, hypotheses about the reasons for the arrest multiplied. On X, the hashtag #FreePavel spread in support of a man many now consider a martyr for freedom of expression. Now that the French authorities have partially clarified matters, we can try to bring some order to a case that is less about freedom of speech than about what we want, as individuals and as a society, from web platforms.
Why Pavel Durov was arrested
Paris prosecutor Laure Beccuau said in a statement that the arrest was part of an investigation opened on July 8 “against persons unknown” for a range of possible charges, including complicity in the distribution of child pornography, drug dealing, money laundering and refusal to cooperate with law enforcement.
The investigation is being conducted by cybercrime and fraud specialists, and Durov has reportedly been detained for questioning. In a statement released hours after his arrest, Telegram said it “complies with EU laws, including the Digital Services Act,” and that “content moderation on the platform meets industry standards.”
In reality, the European Union seems to have little to do with Pavel Durov’s arrest, which appears to rest on French law. The issue, in short, is a bit more complex and does not really concern freedom of speech.
Telegram is different from other platforms
Let’s start from the beginning. Telegram is basically an instant messaging app, like WhatsApp, but with some peculiarities. Its popularity (it has about 900 million users worldwide) is partly due to the decision to allow huge chat groups, of up to 200,000 people, at a time when other platforms, WhatsApp included, were shrinking group sizes to combat disinformation. Other features, such as large file sharing, the absence of limits on sharing links, and bots that can interact with users inside channels, have helped make it a powerful tool for broadcasting to groups of almost any size, on every kind of topic: the war in Ukraine, the conflict in Palestine, even Amazon discount alerts.
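The bot mechanism is worth pausing on, because it is what turns channels into broadcast machines. Telegram exposes a public HTTP Bot API; the sketch below shows the general shape of a bot posting to a channel. The token and channel name are placeholders, not real values.

```python
# Minimal sketch of a Telegram bot posting to a channel via the public
# Bot API. Token and channel username below are hypothetical placeholders.
import requests

BOT_TOKEN = "123456:ABC-placeholder"   # hypothetical token from @BotFather
CHANNEL = "@example_deals_channel"     # hypothetical channel username

# sendMessage is a standard Bot API method; chat_id can be a channel
# username if the bot is an administrator of that channel.
resp = requests.post(
    f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage",
    json={"chat_id": CHANNEL, "text": "Today's discount: 30% off."},
)
resp.raise_for_status()
```

A few lines of code like these are enough to push a message to hundreds of thousands of subscribers at once, which is exactly what makes the feature attractive both to deal-hunting channels and to less benign operators.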
These capabilities, combined with the app’s minimal moderation, have made it a haven for the non-consensual sharing of intimate images, child pornography, and drug dealing. And all of this happens in an environment that is only apparently private: end-to-end encryption (the technology that makes a chat readable only by the sender and the recipient of a message) is not enabled by default, unlike on Meta’s platforms. You have to turn it on yourself, by opening a “secret chat.”
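To make the distinction concrete, here is a minimal sketch of the end-to-end idea using the PyNaCl library. This is an illustration of the general principle, not Telegram’s actual MTProto protocol: each party holds a private key that never leaves their device, and the server relaying the message only ever sees ciphertext.

```python
# Minimal end-to-end encryption sketch with PyNaCl (illustrative only;
# Telegram's secret chats use its own MTProto protocol, not this library).
from nacl.public import PrivateKey, Box

# Each party generates a keypair; private keys never leave their device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# The relaying server only ever sees `ciphertext`.
# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at noon"
```

In a regular Telegram chat, by contrast, the message is encrypted only between the user and Telegram’s servers, so the company itself could, in principle, read it.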
The real problem with Telegram
The point, however, is moderation. Contrary to what it claims, Telegram does not come close to the standards of content control expected of large technology companies. On Platformer, the journalist Casey Newton points out that, despite its stated ban on publishing illegal content or promoting violence, the platform’s own FAQ makes clear that it will not delete illegal content from groups, chats, and private channels, even when users report it.
And this is where the perception of safety comes from: not so much the encryption as the fact that content is, under no circumstances, shared with law enforcement. Again from the FAQ: “To protect data that is not covered by end-to-end encryption, Telegram uses a distributed infrastructure. Cloud chat data is distributed across multiple data centers around the globe, controlled by different legal entities that are in turn distributed under different jurisdictions. The related decryption keys are split into parts and are never kept together with the data they protect. As a result, it takes multiple court orders from different jurisdictions to force us to hand over any data”.
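What the FAQ describes is, in essence, a form of key splitting: a decryption key is divided into shares, each useless on its own, so that no single party, and no single jurisdiction, can reconstruct it alone. Here is a minimal XOR-based sketch of that general technique; it is not Telegram’s actual implementation, whose details are not public.

```python
# Illustrative XOR-based key splitting (not Telegram's actual scheme,
# which is not publicly documented). Every share is needed to rebuild
# the key; any subset short of all of them reveals nothing.
import secrets

def split_key(key: bytes, n_shares: int) -> list[bytes]:
    """Split `key` into n_shares random-looking parts via XOR."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n_shares - 1)]
    last = key
    for share in shares:
        last = bytes(a ^ b for a, b in zip(last, share))
    return shares + [last]

def combine_shares(shares: list[bytes]) -> bytes:
    """XOR all shares back together to recover the original key."""
    key = shares[0]
    for share in shares[1:]:
        key = bytes(a ^ b for a, b in zip(key, share))
    return key

# A 256-bit key split across three "jurisdictions": all three shares
# must be produced before the key can be reassembled.
key = secrets.token_bytes(32)
shares = split_key(key, 3)
assert combine_shares(shares) == key
```

If each share sits with a legal entity in a different country, a prosecutor needs a court order in every one of those countries before the data can be decrypted, which is precisely the barrier Telegram advertises.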
In short: no moderation and no willingness to cooperate with law enforcement. Regardless of Durov’s arrest (some aspects of which still need to be clarified), this is the crux of the story. In a 2022 article, the American tech publication The Verge addressed the issue of content moderation on social networks very effectively, starting from the takeover of Twitter (now X) by Elon Musk, another self-styled champion of free speech.
“The essential truth of any social network,” the piece reads, “is that the real ‘product’ is content moderation, and everyone hates whoever decides how that moderation works. Content moderation is what Twitter produces: it’s what defines the user experience. It’s the same for YouTube, Instagram, and TikTok. They all try to incentivize good content, discourage bad content, and remove the truly unacceptable content.”
Well, not all of them, apparently. Not Telegram, at least. Can we still accept this?