Telegram CEO’s Arrest: Wrongful or Responsible?
What the ordeal can tell us about tech and free speech.
French authorities arrested Telegram founder and CEO Pavel Durov at an airport outside Paris on August 25. The entrepreneur was suspected of complicity in illegal activity on Telegram, the cloud-based messaging platform, including the distribution of child sexual abuse material and drug trafficking.
Eight days after his indictment, Durov made a statement on his personal Telegram channel criticizing the French government’s “misguided approach” to prosecuting the criminal activity on his app, sparking debate about the international regulation of internet platforms.
“If a country is unhappy with an internet service, the established practice is to start a legal action against the service itself. Using laws from the pre-smartphone era to charge a CEO with crimes committed by third parties on the platform he manages is a misguided approach,” Durov stated.
Several prominent tech figures, including Elon Musk and David Packham, have come to the Telegram CEO’s defense online, characterizing the arrest as a threat to free speech.
Telegram is a messaging app created in 2013 by brothers Pavel and Nikolai Durov. The platform offers broadcasting, messaging, and calling services through Telegram servers distributed worldwide. Telegram’s distinguishing characteristic, however, is its strict privacy policy, which is foundational to the company’s mission.
On its website, Telegram boasts, “We can ensure that no single government or block [sic] of like-minded countries can intrude on people's privacy and freedom of expression.”
As of 2024, Telegram has garnered the trust of around 900 million users and ranks as the eighth most popular social media platform worldwide. With such a large global footprint, the company now faces questions about whether it has taken free speech too far.
Telegram allows public groups of up to 200,000 users. For reference, Facebook places restrictions on mass communication for groups with more than 5,000 members. Critics therefore point to Telegram as a potential entryway for conspiracy theories, misinformation and radicalization.
They highlight the early August attacks on immigration centers in the U.K., which were plotted by far-right groups in Telegram channels, as well as Telegram’s comparatively small moderation staff relative to similar media companies like Meta. As a result, Telegram now faces serious criticism about its relationship to organized violence.
Earlier this month, the New York Times concluded a four-month investigation of Telegram, describing the platform as a “playground for criminals, extremists and terrorists.” The investigation uncovered 1,500 channels run by white supremacists, 24 channels selling weapons and 22 channels with more than 70,000 followers advertising illegal substances such as cocaine and heroin. The newspaper also noted that terrorist groups such as ISIS have used Telegram to broadcast their agendas.
As the French investigation of Telegram progresses, CEO Pavel Durov faces more than a scathing rebuke of his company. Under the French Interior Ministry’s new cybercrime law, the Orientation and Programming Law (LOPMI), Durov faces a possible 10-year sentence and a €500,000 fine for his alleged crimes.
This French code, enacted in January 2023, uniquely includes the offense of “complicity in the administration of an online platform to allow an illicit transaction in an organized gang.” The LOPMI law has no legal equivalent in the Western world, according to former U.S. Deputy Assistant Attorney General Adam Hickey, who established the Justice Department’s national security cyber program.
Although the French government leads the movement for corporate responsibility in the digital space, other governments have begun to take similar steps. In 2023, Binance founder Changpeng Zhao pleaded guilty to failing to maintain an effective anti-money-laundering program, with prosecutors charging that the exchange allowed “money to flow to terrorists, cybercriminals, and child abusers through its platform.”
Additionally, in August 2024, Elon Musk shut down X’s operations in Brazil after resisting a court order to appoint a legal representative to handle content moderation requests. The Brazilian courts were concerned about misinformation related to former President Jair Bolsonaro spreading on the app.
Thus, the central question remains: Do tech companies bear legal responsibility for third-party crimes committed on their platforms? This debate over content moderation is gaining momentum in academic and business circles alike.
According to Reuters, former Deputy Assistant Attorney General Adam Hickey cited the Silk Road website as an example of a case in which hosting illegal online transactions justifiably resulted in formal prosecution.
The platform was an online black market that operated from 2011 to 2013. However, the precedent’s applicability to Durov’s case is dubious because of Telegram’s “mainly law-abiding user base,” according to former U.S. federal prosecutor Timothy Howard, who prosecuted the owner of Silk Road. In addition, Michel Séjean, a French professor of cyber law, stated that the law is not a “nuclear weapon,” but rather a tool to address a technology company’s lack of cooperation.
Elon Musk quickly came to Durov’s defense, writing “#FreePavel” on X following the Telegram founder’s arrest. Various executives in the cryptocurrency space have likewise criticized the comparative leniency with which banks are prosecuted relative to Telegram. Leonid Volkov, a top adviser to the late Russian opposition leader Aleksei A. Navalny, stated, “Durov is not an ‘accomplice’ to the crimes committed by Telegram users.”
At Georgetown University, Ross D. Cooper, a professor of business law at the McDonough School of Business, added:
“Because the French law focuses on the specific complicit conduct of corporate officials, it seems to be entirely consistent with US business law principles that hold corporate officials liable for their own illegal activity when fulfilling their corporate responsibilities. However, Telegram’s case highlights the distinction between merely hosting a forum where illegal activity may take place and inciting, promoting, or facilitating crimes, placing emphasis on the meaning of the word ‘complicit’ in the statute.”
He further added that this focus on defining complicity “provides an interesting precedent for future cases balancing the fine line between free speech and illegal activity on a social media platform.”
In the upcoming spring semester, Professor Cooper plans to host several panels touching on legal content regulation through the McDonough Business Law Society.
Karen Zacharia, former chief privacy officer at Verizon, is one of the speakers expected to discuss the subject. Meanwhile, students in Professor Cooper’s current business law class will begin examining the intersection of technology and business law in the insider trading and torts section of his MSB course.
The commentary from business and academic scholars on the new French law’s application to Telegram’s CEO has exposed fundamental gaps in the current regime of internet governance, hinting at the content moderation challenges future technology companies may face. On Georgetown’s campus, Professor Cooper’s class and the Business Law Society illustrate the MSB’s growing interest in this dialogue.
As technology continues to evolve, it is in our best interest as digital citizens to engage in discussions about the regulation and protection of free speech on platforms. The Telegram case provides a compelling starting point.
How much legal responsibility should tech companies hold for crimes committed by third parties on their platforms? How can the international community bolster the legal and organizational norms around this evolving issue?
These questions are crucial to the future of interactions between technology and business law, warranting greater discussion about digital governance, platform censorship and corporate responsibility.