As most people know, the French arrested Pavel Durov, the co-founder/CEO of Telegram, when he jetted into France last weekend.
This arrest has caused considerable comment, a fair amount of it not based on the stated reasons for the arrest. Specifically, many people are cheering/worrying about the exposure of Russian military usage of Telegram and about the potential for censorship on such platforms.
What’s the beef?
According to the French authorities, they arrested him for a number of reasons:
This judicial investigation was opened against person unnamed, on charges of:
- Complicity – web-mastering an online platform in order to enable an illegal transaction in organized group,
- Refusal to communicate, at the request of competent authorities, information or documents necessary for carrying out and operating interceptions allowed by law,
- Complicity – possessing pornographic images of minors,
- Complicity - distributing, offering or making available pornographic images of minors, in organized group,
- Complicity - acquiring, transporting, possessing, offering or selling narcotic substances,
- Complicity - offering, selling or making available, without legitimate reason, equipment, tools, programs or data designed for or adapted to get access to and to damage the operation of an automated data processing system,
- Complicity – organized fraud,
- Criminal association with a view to committing a crime or an offense punishable by 5 or more years of imprisonment
- Laundering of the proceeds derived from organized group’s offences and crimes,
- Providing cryptology services aiming to ensure confidentiality without certified declaration,
- Providing a cryptology tool not solely ensuring authentication or integrity monitoring without prior declaration,
- Importing a cryptology tool ensuring authentication or integrity monitoring without prior declaration.
Some of these are mostly legitimate, some are not. None, you will note, mention either Freedom of Speech (except indirectly) or Russia.
The mostly legitimate ones are about the fact that Telegram is used by criminals to distribute child pr0n and hacking tools, set up drug deals, etc. There is, IMHO, no doubt that Telegram is used for all these things. But it is not alone. Instagram, Xitter, WhatsApp, Signal, Mastodon and every other online platform are used for the same things.
Instagram even grooms teens and pre-teens thanks to its algorithms, so if this is the reason for the arrest I recommend that Mark Zuckerberg not visit France in the near future.
The gripe here seems to be that Telegram was not willing to work with the authorities by giving them a back door or by moderating posted content to remove the illegal stuff. This probably also explains the cryptology charges. I am not a lawyer, but Preston Byrne, who is, thinks that a lot of this is the French authorities trying out their own laws on a target they don't like, laws which could absolutely apply to most other messaging/social media companies:
We’ll need to wait for the evidence to come out before reaching any firm conclusions on this point. If I had to guess, in a world where every platform hosts unlawful activity to some extent, this looks like selective enforcement. I would also guess that Durov was not “aiding and abetting” as the U.S. would understand it and that this French enforcement action is an overbroad application of French law to punish a perceived political enemy, with the French security state trying to use local doctrines in a novel way to try to police a foreign company with moderation policies it (and likely each of its security cooperation partners in the EU and across the channel in the UK) regards as too lax.
In the absence of a lot of evidence showing that Durov and Telegram specifically intended to commit these crimes or bring them about, there is no reason why similar charges could not be laid against any other provider of social media services in France whose moderation practices are anything less than perfect, in particular social media services which provide end-to-end encryption.
Summing up: for the time being, if you run a social media company, or if you provide encrypted messaging services, which are accessible in France, and you’re based in the United States, get out of Europe.
And stay out.
It may be worth wondering why the French dislike Pavel Durov and want to try applying some questionably broad complicity laws to him and not others. So who is he?
Who is Pavel Durov?
Durov is a Russian who first made his money developing the Russian equivalent of Facebook, VKontakte. At some point he refused to comply with certain demands from the Putinists about handing over user data and quit Russia. Subsequently he founded Telegram, an extremely leanly run organization (I believe it has something like 100 total employees, of which 15 are developers), to provide a messaging service that could work anywhere and which could include encrypted messages.
Sometime later Telegram was banned in Russia; a bit later that ban was rescinded and Durov visited Moscow, though he retains residency and citizenship in the UAE and (apparently) also French citizenship. There's a lot of (deliberate) murkiness here regarding the relationship between Durov and the Russian authorities, but it seems that they have been a lot closer than one might have expected a decade or so ago.
What is Telegram?
Telegram is a messaging app with a unique group messaging option that is widely used by all sorts of people to communicate. That communication may or may not be encrypted, and it is almost completely unmoderated. As 404 Media notes, it is an absolute sewer:
We at 404 Media have seen and reported on much of the illegal activity on Telegram with our own eyes. Telegram is widely and blatantly used in the open by drug dealers who advertise their products on Facebook and Instagram, hackers who sell credit cards in public groups, hacking crews that have begun to commit physical violence against each other, widespread fraud rings, and people who make and sell nonconsensual, AI-generated sexual content of celebrities, ordinary people, and minors.
Crucially, much of this content is not encrypted, because group chats on Telegram are not encrypted and because encryption is not enabled by default. It would be more accurate to call Telegram a messaging app on which a version of encryption can be enabled for certain chats if you want. It is not really an “encrypted messaging app.” Many of these devices and groups are advertised in the open, and many of these groups have thousands of users. In our experience, Telegram does very little to remove this sort of activity, and in many years of reporting on them, we can think of only one instance in which Telegram actually banned a group we sent to them.
Matthew Green, who is a well-known cryptographer, has a lot of questions about just how much encryption there is in normal Telegram usage (and that link notes some other issues with Telegram, including the fact that getting access to the Telegram servers would provide a lot of fascinating metadata whether or not the underlying data is encrypted or stored).
Other examples of just how much of a sewer Telegram is are all over Xitter. See this Threadreader thread as an example. A lot of people are upset that while Telegram is generally unmoderated, it does seem to take down stuff that offends dictatorial regimes – see these two Xeets:
Telegram has moderation, but it is selective, typically benefiting authoritarian states:
1. Shutdown of an Iranian opposition channel in 2017 after a government complaint.
2. The Russian government lifting its ban on Telegram in 2020 after it agreed to assist with extremism investigations.
3. Disappearance of Telegram channels covering protests in Bashkortostan in January 2024.
4. Telegram's involvement in Russian surveillance in occupied Ukrainian regions.
The platform's collection of metadata (phone numbers, contacts, IP addresses) can be used to map user relationships. Telegram is not a private, secure, or decentralized platform. Its encryption is server-side, allowing Telegram access to users' messages.
Durov visited Russia 50 times between 2015 and 2021. After a failed cryptocurrency project in the USA, Durov returned to Russia, and the Kremlin lifted its ban on Telegram shortly thereafter.
Misinformation Labeling: The "FAKE" label is present on certain Telegram channels (e.g., the channel of the wives of mobilized soldiers) but absent on others (e.g., a fake copy of the Free Russia Legion channel), indicating possible selective moderation.
It should be noted that there are persistent rumors that, despite Durov's previous public statements, he remains close to the Russian leadership and that the Russian intelligence services have significant hidden access to Telegram. Whether or not this is true, various Russian government officials made a variety of posts and statements indicating they were very nervous about other countries getting access to Telegram's servers/data. Russian military bloggers are equally nervous.
It is also undeniably true that a lot of Russian semi-official business uses Telegram, including coordination between different parts of the Russian military in their war with Ukraine. If Telegram were to shut down it would probably hurt Russia a lot, as all sorts of things that you would not expect to be dependent on the platform would turn out to be so.
Why Does The “Right Wing” Support Telegram?
With all this it might seem odd that the “Right Wing” seems to support Telegram and its founder. The answer mostly comes down to mistrust of the Signal messenger (which is in fact end-to-end encrypted) combined with a general belief in free speech. The Signal thing (well covered in the 404 Media article linked above) is silly IMHO because Signal is in fact auditable, making it hard for anyone to put in a backdoor, for any reason. Signal is way more secure than Telegram and the French (and others) absolutely hate it because it's impossible to see what is being discussed there and the protocol doesn't need to use Signal's own servers.
Some of this seems to be related to the bizarre factional alignment seen in the Russian invasion of Ukraine. For various reasons the nationalist right is more supportive of Russia while the internationalist left is more supportive of Ukraine. This is, IMHO, remarkably stupid on the part of the right because they really ought to be supporting the national ambition of Ukraine to determine its own future in the same way that they support it for Israel (where the left supports the antisemitic genocidalist Palestinians). Since Telegram is favored by Russia (and Durov is Russian), it seems that is another reason why people on the right support Telegram, in addition to their general desire to support free speech and platforms that allow it. It is also noteworthy that the Ukrainian-supporting internationalist left is also the pro-censorship side that gets all worked up about dis/misinformation and the like. The fact that these people like censorship and hate Telegram is another reason why free speechers tend to support it.
The opponents of censorship have a point, one made more strongly in recent days by Zuckerberg's mea culpa about censorship in 2020. However, as I noted above, Telegram is not a good poster child for free speech. I get that free speech means speech I disagree with, but the mass criminality on Telegram makes it an easy target. In fact, based on what we learned from reports of the actual charges (archive), it seems that the French authorities are mostly upset that Telegram has failed to let them wiretap suspects and failed to remove all kinds of criminal activity:
Durov is being interrogated as part of a case initiated by a cybercrime unit of the Paris prosecutor’s office. Investigative judges handling the case are looking into a wide range of allegations, which include refusing to help authorities run legal wiretaps on suspects, enabling the sale of child sexual abuse material and aiding and abetting drug trafficking.
Aside: I find it amusing that the people who are upset that Telegram allowed protestors (rioters) in the UK to coordinate seem to be highly correlated with the people who cheered on the color revolutions over a decade ago that were coordinated on Twitter. Weird how that happens.
What is interesting about Telegram is that, unlike Xitter or the Bork of Feces, it does not use any algorithm to promote or demote particular posts in groups. Instead, posts are simply shown in chronological order. That and the almost complete lack of moderation mean it is indeed a free speech paradise, but as I've said above, the downside seems to be that a significant fraction of the speech on it is criminal.
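As a toy illustration of the difference (hypothetical posts and field names, not Telegram's or anyone else's actual code), a chronological feed just sorts by timestamp, while an engagement-ranked feed reorders posts regardless of when they were made:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    timestamp: int   # seconds since epoch
    engagement: int  # likes/replies/shares; only the ranked feed looks at this

posts = [
    Post("quiet update", timestamp=100, engagement=2),
    Post("outrage bait", timestamp=50, engagement=900),
    Post("cat photo", timestamp=200, engagement=40),
]

# Telegram-style: newest first, nothing promoted or demoted.
chronological = sorted(posts, key=lambda p: p.timestamp, reverse=True)

# Algorithmic-style: whatever drives engagement floats to the top,
# however old it is.
ranked = sorted(posts, key=lambda p: p.engagement, reverse=True)

print([p.text for p in chronological])  # ['cat photo', 'quiet update', 'outrage bait']
print([p.text for p in ranked])         # ['outrage bait', 'cat photo', 'quiet update']
```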
Free Speech Limits
The Telegram situation does raise interesting moral questions about the limits of free speech. Should criminal speech be free? How about speech that incites crime? Should a platform be responsible for the posts of its users?
This is not exactly helped by different countries having different rules about what counts as criminal speech. In some countries blasphemy is a crime. Pornography laws differ too. Drug laws, including those covering alcohol, tobacco and prescription medicines, vary widely as well. And that ignores all the laws about things like fraud, copyright and so on.
Going down the slippery slope, there's glorification of violence, incitement, hate speech, cyber-bullying and the like. Some (most?) of this is clearly speech that should be free, but if a social media pile-on causes the victim to commit suicide or be SWATted, should that speech be free? How about speech that causes a young person to radicalize and then commit a terrorist act?
The free speech absolutist view - that a platform should allow anything up to and including, say, the sexual abuse, torture and murder of young children - has the benefit of being simple. It is, however, in my opinion, dangerously amoral if not immoral. It seems to me that a platform that can detect and block stuff like that has a responsibility to do so and to cooperate with the relevant authorities to bring the perpetrators to justice. Evil is a real thing and the social media platforms in general are absolutely complicit in spreading it. None of them do a good job of trying to stamp it out; indeed many seem to have recommendation algorithms that end up promoting it.
All the platforms, to one degree or another, try to evade responsibility for what is published on them and for the people who are negatively impacted by it. Recently a US appeals court ruled that TikTok could be held liable for a child's death because of the algorithmic recommendations it made. I have to say that, on the whole, I agree with this approach. If a platform is going to promote some things and demote others then it should take responsibility for harm caused by what it promotes. If this results in all the platforms being sued out of business in the USA, that will be fine with me, because it seems to me that the platforms are deliberately and knowingly addicting their users in order to get ad revenue. I consider this business model to be pretty much evil no matter whether the content it pushes is unicorns and rainbows or child pr0n and murder.
Ironically, Telegram may have the best defense against this sort of thing since, as far as I know, it doesn't do any algorithmic manipulation and just shows posts in chronological order.
Back to Durov
However the French case against Durov is not about the promotion or demotion of posts; it is more basic. The French claim appears to be that Durov is complicit in all the criminal activity that results from users posting on Telegram because it does not cooperate with law enforcement to take down content, block/ban users, or provide the metadata about users that law enforcement needs in order to arrest and prosecute criminals.
As I wrote above, it is trivial to find criminal activity on Telegram, so the question is to what extent Telegram is complicit in those crimes. Moreover, as Durov is probably the majority shareholder in Telegram and certainly the CEO, he absolutely should take responsibility for what he has created. Currently he, like all the other social media organizations, has been dodging responsibility by talking about free speech and claiming to be just a public space. And despite all that, all of them have in fact banned stuff at government request.
The big, big question is whether responsibility means he is legally liable for what people do on his platform and whether he should be required to police his platform to remove content that offends governments and law enforcement in different countries. Despite what I say about such platforms distributing evil, I'm not sure that they should necessarily be legally liable if they do not promote it.
I think that the better approach would be for the French government to simply ban Telegram in France¹ if it considers Telegram to be so bad, and the same for Xitter, the Bork of Feces and any other platform. And if other countries agree they should too. The problem here, of course, is that this makes clear that the French government are censorious bansturbators just like the West Taiwanese or the Saudis, which is not the kind of company the French government would like to be in. But if enough countries agree with France then the banned platform will go bust, and that sort of thing is going to concentrate the minds of the competition. Or, alternatively, if not enough countries agree then it just makes clear that France is a bansturbatory irrelevance. Something tells me that France is worried that the latter is more likely.
¹ Yes, I know banning something won't stop people accessing it. It would likely make it hard enough to access that most French citizens wouldn't, and it would make an easy additional charge to tack onto the prosecutions of criminals who still use Telegram. It is, however, dead easy to ban Telegram by simply requiring ISPs (in France) to null-route traffic to Telegram's ASes/IP subnets.
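For the technically curious, here is a minimal sketch of what that null-routing looks like on a single Linux router. The prefixes below are deliberately placeholders (RFC 5737 test ranges), not Telegram's real ones, and a nationwide ban would in practice be pushed to ISPs as BGP blackhole routes rather than per-box static routes, but the effect per prefix is the same:

```python
import subprocess

# Placeholder prefixes for illustration only; a real block list would be built
# from Telegram's announced ASes/prefixes (BGP/whois data), not hard-coded.
BLOCKED_PREFIXES = [
    "192.0.2.0/24",     # TEST-NET-1, not a real Telegram range
    "198.51.100.0/24",  # TEST-NET-2, not a real Telegram range
]

def null_route(prefix: str) -> None:
    """Install a Linux blackhole route so packets to this prefix are dropped."""
    # Requires root and the iproute2 'ip' tool.
    subprocess.run(["ip", "route", "replace", "blackhole", prefix], check=True)

if __name__ == "__main__":
    for prefix in BLOCKED_PREFIXES:
        null_route(prefix)
        print(f"null-routed {prefix}")
```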
I lean toward absolute free speech mostly, since the slope the other way is extremely slippery. I think one has the right to shout "FIRE!" in a theater just as surely as someone else has the right to shout, even louder, "THAT'S BULLSHIT!".
Having said that, actions, including speech, have consequences. One has the right to say anything but can and should be held liable or lauded for the results of said speech. If a fool shouts fire and, because of that, folks are injured or harmed, the fool should be reprimanded by the law and society.
Telegram evil? I think not, not any more than a monkey wrench or a .44 Mag. pistol is, although such can be utilized for good, evil or neutral purposes.
I grew up in an America where most folks truly felt it was far better for ten criminals to go free than one innocent to be falsely incarcerated. I admit such colors my feelings and beliefs today.
I also grew up in an America where most put a fair amount of trust in our government. Loss of that naivety of course also colors my feelings and beliefs.
Every country has some variation of the '230' rule, the idea that social media platforms are merely neutral channels, like telephone wires, and therefore have no responsibility for what is transmitted over them. In the U.S., this rule quickly became obsolete, as the Federal Government enforced censorship of anything the regime of the day didn't like. As Facebook, Twitter, and Google tried to comply with Government takedown orders, they discovered their highly touted algorithms couldn't do the job, and they were forced to manually ban people like Alex Berenson for no other reason than that the Federal Government ordered it. (Berenson went against their 'vaxing is safe' narrative.) Now the censorship regime in the United States is an 'all-of-Government' colossus, together with the fact-check industry. At the same time, the absolutist free-speech model has also proven seriously flawed, as the criminal activity on Telegram demonstrates. Some sort of content moderation is clearly essential, and it's clearly better for the platform itself to take this responsibility seriously than to do nothing about it. As you suggest, that is what French officials were trying to tell Durov, and he wasn't listening or cooperating. So they arrested him to get his attention. That's the most benign interpretation of their action. The clever French may also have sought to diminish Russian trust in Durov, as suggested by one of the writers quoted, since no one knows what Durov may have been forced to disclose in the first three days of his captivity. What we can learn from all this is that neither the absolutist free-speech position, nor the total Government-control position, is viable. Content moderation will have to wise up considerably, lest it go down the fact-check rat-hole. Algorithms, AI, etc. won't cut it. There is simply no substitute for human judgment.