
Durov’s anatomical theater: How the Telegram trial is reshaping relations between governments and tech giants

After the arrest of Pavel Durov, Telegram’s administrators began providing French authorities with data that helped identify suspects in cases involving crimes against children. Previously, the platform had largely ignored requests from French law enforcement. Telegram now faces additional criticism: that a lack of transparency in its moderation policies enables drug trafficking, fraud, and the spread of pornography. Information freedom experts see the case against Telegram’s founder as part of a larger problem — the lack of an established framework for cooperation between tech corporations and governments. One possible solution is the introduction of a supranational system to regulate internet services. 


Intermediary liability

“There is currently no unified international agreement on the liability of companies that act as intermediaries in information dissemination and content moderation,” says Sarkis Darbinyan, co-founder of the Roskomsvoboda project. “However, there are positions outlined by the UN and the Council of Europe.” According to Darbinyan, a landmark case in the European Court of Human Rights (ECHR), Delfi v Estonia, set an important legal precedent for the 46 member states of the Council of Europe. In that case, the Estonian news outlet Delfi published an article, and users posted threats in the comments against one of the subjects of the piece. That person’s lawyers demanded that the platform remove the comments and pay their client compensation for moral damages. Delfi removed the comments but refused to pay. The court ruled that Delfi was liable for the actions of its users: the platform could have foreseen the abusive comments, yet it failed to take sufficient measures to prevent them from appearing.

This precedent has shaped the interpretation of European laws regulating content moderation, says Darbinyan, particularly the E-Commerce Directive and the Digital Services Act. According to these regulations, platforms are not held responsible for user-generated content unless they are aware of its illegality. However, they are obligated to remove it once notified.

Would Durov have benefited from relocating Telegram’s headquarters from Dubai to France? No, because the company’s place of registration, as well as the citizenship of its directors and shareholders, is irrelevant. “Content regulation rules for platforms now generally have a cross-border nature,” Darbinyan explains. “All platforms must comply with both national laws and international standards. The only factors that matter are: whether the platform targets its services to citizens of these countries, whether it offers services in their languages, and whether the company generates revenue from users who are citizens of these countries.”

Mikhail Klimarev, director of the Internet Protection Society, believes that the complaints against Telegram are part of a larger issue.

“The question is broader: ‘how do we regulate large IT corporations?’ There are many countries, each with its own specifics, and no one knows how to bring them all to a common standard. The laws that have to be followed are sometimes detached from reality — like in Russia or Myanmar, for example.”

Cooperation with authorities

After reports surfaced about Durov’s collaboration with French law enforcement, Russian users began recalling instances in which criminal cases were opened against individuals based on their messages and channel subscriptions on the platform. However, user privacy and content moderation are different matters.

“As far as I recall, Telegram didn’t leak private messages. Mostly, devices were confiscated and read,” Klimarev notes. “Facebook and Instagram were blocked and declared extremist in Russia, even though they did nothing either to ‘deserve’ it or to defend freedom of speech. But WhatsApp [which belongs to the same company, Meta, as Facebook and Instagram] still operates. Does that mean it’s cooperating with the authorities?”

It’s also unclear how Durov was able to visit Russia so often after leaving the country due to government pressure, and why he ventured back in the first place. “He wasn’t under investigation or on a wanted list,” Klimarev points out. “But it’s an interesting question. First, it turns out he lied. Second, he was in Russia from June to October 2021, right when Telegram was blocking [Alexei Navalny’s] 'Smart Voting' bot. Maybe they told him, ‘Either the bot goes, or we arrest you.’ And after that, he never returned.”

Durov purportedly left Russia for good in 2014 after being forced out of his role as CEO of the social network VK. It was only after the tech entrepreneur’s arrest in Paris last month that news of his ongoing visits to Russia became known.

“Since 2014, Pavel Durov visited Russia to see his mother and brother. Plus, not all the developers left,” says Nikolai Kononov, journalist and author of The Durov Code. “What’s strange is that Durov was visiting freely in 2017-2018, even though the crackdown on Telegram had already begun. I attribute this to the fact that Alisher Usmanov and Roman Abramovich, with whom Durov has good relations, may have guaranteed his safety in some way. Durov’s last visit to Russia was in the fall of 2021. This either means he has an extraordinary sense of foresight, or someone tipped him off about the impending full-scale invasion, and he realized it was time to distance himself from the toxic asset that Russia had become. Another possibility is that media restrictions were tightening ahead of the invasion, and Telegram might have faced increased pressure.”

Darbinyan notes how little is known about the relationship between Durov and the Russian authorities: “We are not aware of any personal cooperation between Durov or Telegram and the Russian FSB, although it is reasonable to assume that some metadata and personal data may be shared. Following the removal of the ‘Smart Voting’ bot, many became convinced that there is a communication channel for special cases. Furthermore, Russian security forces have access to specialized software that enables them to collect substantial metadata from the platform and de-anonymize account owners.”

Moderation challenges

Tech sector analysts agree that Telegram has long ignored the issue of content moderation on the platform. They highlight several reasons for this stance, including both objective factors, such as the complexity and high cost of the procedure, and subjective ones, like the founder’s personal views. Kononov calls Durov a “cyber-libertarian”:

“He has always responded slowly and indifferently to requests from various prosecutors. Moreover, when moderation did occur, it was often unclear and poorly justified.”

Content requirements for online services are not always clear and vary greatly from country to country. “Telegram's Terms of Service state that the platform cannot be used to promote violence, sell drugs, or distribute pornography. This is officially outlined,” says Klimarev. “Yes, moderation is poor. First, the volume of content is enormous, making it difficult to filter and search for prohibited material. Second, moderation is expensive.”

According to Internet Protection Society director Klimarev, Telegram is not in a position to significantly improve the state of moderation.

“Telegram is no longer just a messaging app; it has become a social network. It may not be as large as Facebook or X (formerly Twitter), but it still is [a social network]. Moderators can't control how people plan terrorist acts in private chats. Meta, the tech giant [and owner of Facebook and WhatsApp], has struggled to create an effective moderation system. Their own standards are stricter than the laws in the countries where they operate, making communication nearly impossible. In contrast, X has almost no moderation at all, resulting in a very toxic environment.”

Journalist and author Kononov speaks to the difficulty of finding the right balance between freedom and moderation online. “What effectiveness means is unclear,” he says. “If the goal is to reduce crime, one could simply shut down social networks. Alternatively, we could revert to the earliest version of Facebook, when only Ivy League students were allowed in, and even they weren’t without issues.”

Roskomsvoboda co-founder Darbinyan disagrees with his colleagues. He believes effective content filtering is possible, saying that even on large platforms, it can be achieved through AI and dedicated moderation teams.

“Most content is flagged as illegal by neural networks as early as the upload stage — this includes child abuse material, nudity, and much more,” explains Darbinyan. “Moreover, companies like Google and Facebook have moderation teams around the world who are fluent in various languages and understand local contexts. They make decisions on content that has already reached the platform and received complaints.”
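To make Darbinyan’s point concrete, here is a deliberately simplified sketch of upload-stage filtering in Python. It shows only the general shape of the approach: a file is checked against a blocklist before it is ever published. The hash value is a placeholder, and real platforms use perceptual hashing (such as Microsoft’s PhotoDNA) and neural-network classifiers rather than exact digests; nothing here reflects Telegram’s or Google’s actual pipelines.

```python
import hashlib

# Hypothetical blocklist: SHA-256 digests of files previously confirmed
# as illegal by human reviewers. The value below is an invented placeholder.
KNOWN_BAD_HASHES = {"0" * 64}

def is_known_bad(file_bytes: bytes) -> bool:
    """Return True if an upload is byte-identical to a blocklisted file.

    Exact hashing only catches identical copies; production systems rely
    on perceptual hashes (which survive re-encoding and cropping) and
    machine-learning classifiers.
    """
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_HASHES

def handle_upload(file_bytes: bytes) -> str:
    # The check runs at the upload stage, before anything is published.
    if is_known_bad(file_bytes):
        return "rejected: matches known illegal material"
    return "accepted: subject to ordinary post-publication moderation"

print(handle_upload(b"an ordinary file"))  # -> accepted
```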

Extreme regulation or anarchy?

“Someone’s rights will always be violated to some extent,” says Klimarev. “But what rights are we talking about? Let’s say we’re not calling for anyone to be killed, but can we advocate for jailing Putin’s soldiers? Can we insult the French police?”

Darbinyan believes that content moderation without infringing on user rights is possible and necessary, especially for an open, centralized Web 2.0 service like Telegram:

“There was another path for Durov if he didn’t want to engage in moderation: he could have opened the source code and handed control of the platform over to the community. For instance, Eugen Rochko did this with Mastodon — free software for deploying decentralized social networks. There are no questions about him because he doesn’t control anything. But Durov chose not to take that route, and as a result, he now faces problems with the French gendarmerie and prosecutors.”

Klimarev notes some of the more egregious examples of Telegram being used to disseminate harmful content. “There are Z-channels that post pictures of severed heads on spikes, which clearly need to be removed, yet no one is addressing this. In Myanmar, for example, a channel with a million followers published a photo of a person, and later a reply to that photo showed the beaten corpse of that person. Such posts continued to appear for several years, and the channel was only removed when the issue gained public attention,” he explains.

Darbinyan attributes these developments to the human factor. “The policies of centralized services like Telegram often align with the views of their founders. Durov has always recognized only one type of prohibited content that needs to be addressed: terrorism. Managing what was from the outset a ‘pirate ship’ on the internet, Durov began to venture into the coastal waters and ports of states, all while ignoring the ‘Maritime Code.’ Human rights advocates raised alarms about security issues in Telegram two years ago, citing the lack of a clear and transparent service policy and functional support channels. They attempted to warn Durov and even offered their advice on what to do. There was no response, nor any changes in the company’s policy,” Darbinyan says.

Access to chats and encryption keys

As Klimarev explains, for Telegram users’ messages, “there are essentially no encryption keys. The keys are access points to the servers where messages are stored, and only the platform itself can read them. They are kept on cloud servers. Otherwise, chats wouldn’t synchronize, and users would only be able to send messages from one device.”
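A toy illustration of the architectural trade-off Klimarev describes, using ordinary symmetric encryption from Python’s third-party cryptography package (a sketch of the general principle only, not Telegram’s actual MTProto scheme): whoever holds the key can read the messages, so a key kept on the provider’s servers enables both multi-device sync and operator access.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# In a "cloud chat" design, a key like this lives on the provider's
# servers rather than only on the user's devices.
server_side_key = Fernet.generate_key()

ciphertext = Fernet(server_side_key).encrypt(b"hello from device A")

# Any device (or the server itself) that holds the key can decrypt,
# which is what makes seamless synchronization possible...
print(Fernet(server_side_key).decrypt(ciphertext))  # b'hello from device A'

# ...and is also why the operator is technically able to read cloud
# chats. End-to-end encryption instead keeps keys on user devices only.
```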

Darbinyan notes that “nobody can say for certain how everything works, because they don’t disclose the source code written by [Pavel’s brother] Nikolai Durov. We can only trust [Pavel] Durov, who claims that messages are fragmented and stored on different servers in different countries, which supposedly ensures security.”

However, according to the co-founder of Roskomsvoboda, malicious actors or security forces have several potential ways of accessing other people’s chats on Telegram (a short defensive sketch follows the list). These include:

  • Direct access to a device through physical coercion or due to inadequate security measures for device access (biometrics, simple passwords);
  • Hijacking the initial Telegram session via device access, allowing them to gain control over the entire user account, along with the ability to reset other sessions;
  • Intercepting SMS messages with the login confirmation code through a vulnerability in the SS7 protocol or by obtaining a duplicate SIM card from the carrier.
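Since the second vector relies on an attacker registering a session of their own on the victim’s account, a practical countermeasure is to audit the account’s active sessions. The sketch below uses the third-party Telethon library to list them; API_ID, API_HASH, and the session name are placeholders, and the same check is available in Telegram’s official apps under Settings → Devices.

```python
# pip install telethon
from telethon.sync import TelegramClient
from telethon.tl.functions.account import GetAuthorizationsRequest

API_ID = 12345          # placeholder; real values come from my.telegram.org
API_HASH = "0123abcd"   # placeholder

# Prompts for a phone number and login code on first run.
with TelegramClient("session_audit", API_ID, API_HASH) as client:
    for auth in client(GetAuthorizationsRequest()).authorizations:
        marker = "*" if auth.current else " "  # "*" = this very session
        print(f"{marker} {auth.device_model} from {auth.ip} ({auth.country}), "
              f"last active {auth.date_active}")

# Unfamiliar entries can be terminated in the official apps or via
# account.ResetAuthorizationRequest with the session's hash.
```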

Moderation, transparency, and the Council of Wise Men

Addressing Telegram’s moderation issues and its interactions with state authorities will require improving the transparency of the platform.

“The main task is to make moderation rules more transparent so that each clause can be referenced,” asserts Klimarev. “When something happens behind closed doors, it lacks credibility. As a result, some may exploit this opacity for their own purposes.”

Darbinyan, for his part, stresses the importance of opening up. “In my view, the issue is not just about removing or blocking content, but also about the complete lack of transparency in Telegram’s policies and transparency reports, which other major platforms publish annually. From those reports, one can at least learn how companies handle requests from government bodies. There’s an initiative called the Digital Rights Rating, which exists in the US, Kazakhstan, Russia, and other countries.”

One way to ensure transparency is by involving third parties in decision-making on the platform. “There are already examples of creating supranational and transcorporate institutions for specific cases. For instance, the international Oversight Board was established a few years ago,” Darbinyan explains. “If content creators have exhausted the appeal options available to them on Facebook, Instagram, or Threads, they can challenge the company’s decision by reaching out to this organization, which comprises notable lawyers, university professors, journalists, and politicians. Meta can also refer cases to this board for resolution. Recently, for example, the Board ruled in favor of Meta, agreeing that three different posts containing the phrase ‘from the river to the sea’ should not be removed from the platform.”

Kononov, too, sees the need for a wider dialogue. “There’s the concept of ownership control, where Durov owns and manages Telegram as its CEO,” he says. “Then there’s public control, which views Telegram as a public asset that affects the lives, content, and personal data of countless people. Neither charismatic individuals nor the state and its police should have access to our information or influence this digital landscape. Recently, Grigory Yudin, Artemy Magun, and other philosophers presented a draft constitution for a future Russia for public discussion; oversight of large platforms could be debated in a similar way.”