Pavel Durov, the CEO and founder of messaging app Telegram, was charged in France on Wednesday with a number of crimes, including complicity in drug trafficking and facilitating the spread of child sexual abuse material on the platform he created.
Durov was arrested in Paris on Saturday, but details about the arrest were limited until Wednesday. It is now clear that the charges against Durov are part of a larger French investigation. The Washington Post has reported that French police have suggested that “child sex crimes” are an area of particular focus for officials.
Following Wednesday’s indictment, Durov was required to post 5 million euros (about $5.5 million) for bail. As a condition of his release, he is forbidden from leaving France and must report to a French police station twice a week.
This isn’t the first time Telegram has been linked to illegal activity. It is a globally popular platform that offers both broadcast channels (in which users can send text and media to large groups of people) and user-to-user chats. It also offers what it calls “secret chat” conversations that are end-to-end encrypted — meaning that the messages sent are only decipherable to the conversation participants and that no one else, not even Telegram, can see the content.
That feature, along with other privacy protections like self-deleting messages, makes the app extremely useful for political dissidents and journalists trying to work under repressive regimes or protect sources. But the app has also, over the years, become a space where extremists can radicalize users and organize terror attacks.
That has led to pressure from governments for Telegram to cooperate more in sharing data with authorities. Despite this, however, Telegram had largely avoided dramatic legal encounters — until now. Paris prosecutors noted on Wednesday that Telegram had refused to assist them in the case.
Durov’s arrest is renewing scrutiny on the app and reigniting the hotly debated issues of free speech and the challenges of content moderation on social media.
Durov and his brother Nikolai founded Telegram to offer an app centered on user privacy following Russia’s “Snow Revolution” in 2011 and 2012, when blatant election fraud ignited months of protests, culminating in a harsh and ever-evolving government crackdown. Durov had previously quarreled with Russian authorities who wanted to suppress speech on VKontakte, the Facebook-like service he founded.
In the years since its founding, Telegram has allegedly enabled some truly shocking crimes. Perhaps most infamously, it was used to coordinate ISIS attacks in Paris and Berlin. Telegram cracked down on ISIS activity on the app after those attacks, but its content moderation policies have faced continued scrutiny.
As Vox has noted, those policies are laxer than those of other social media companies, and outlets such as the Washington Post have reported that Telegram has played host to a variety of criminal content, including child pornography. Keeping that sort of material off a platform is an arduous — but not impossible — task, Alessandro Accorsi, a researcher at the International Crisis Group, told Vox.
“The effectiveness of content moderation is largely dependent on the platform and the resources it allocates to safety,” Accorsi said. “Social media companies are generally reactive. They want to limit the financial resources dedicated to moderation, as well as possible legal, political, and ethical headaches. So what usually happens is that they will focus their efforts on a few groups or issues for which inaction on their part carries legal or reputational costs.”
For example, when ISIS uses a service for terror attacks, that service focuses on stopping ISIS from using its products.
For communications that aren’t end-to-end encrypted, tech companies use a combination of human moderators and algorithmic tools to sort through content. The end-to-end encryption used in Telegram’s “secret chats,” however, makes that type of moderation all but impossible.
Also complicating matters is the varied nature of internet law across the globe. In the US, publishers are generally legally shielded from liability over what users post. But that’s not universally the case; many countries have much stricter legal frameworks around intermediary liability. France’s SREN Act is extremely stringent and can levy fines against publishers for content violations.
“It’s a really hard thing to do, especially in comparative context, because what’s hateful or extreme or radical speech in some place like the US is going to be different from Myanmar or Bangladesh or other countries,” David Muchlinski, professor of international affairs at Georgia Tech, told Vox. That makes content moderation “a clumsy tool at best.”
Telegram has, in response to recent outside pressure, employed some content moderation, Accorsi told Vox. “It has banned channels associated with a handful of organizations (most recently Hamas and far-right groups in the UK), but thousands of problematic groups are still present.”
But the investigation and charges against Durov suggest Telegram may not be doing enough to keep bad actors from using the platform to commit crimes.
Update, August 28, 5:45 pm ET: This post was originally published on August 26 and has been updated to include the charges against Durov.