Kate Ruane, director of the Center for Democracy and Technology's free expression project, says most major technology platforms now have policies prohibiting nonconsensual distribution of intimate images, with many of the biggest agreeing to principles to tackle deepfakes. "I would say that it's actually not clear whether nonconsensual intimate image creation or distribution is prohibited on the platform," Ruane says of Telegram's terms of service, which are less detailed than those of other major tech platforms.
Telegram's approach to removing harmful content has long been criticized by civil society groups, with the platform historically hosting scammers, extreme right-wing groups, and terrorism-related content. Since Telegram CEO and founder Pavel Durov was arrested and charged in France in August relating to a range of potential offenses, Telegram has started to make some changes to its terms of service and provide data to law enforcement agencies. The company did not respond to WIRED's questions about whether it specifically prohibits explicit deepfakes.
Execute the Harm
Ajder, the researcher who discovered deepfake Telegram bots four years ago, says the app is almost uniquely positioned for deepfake abuse. "Telegram provides you with the search functionality, so it allows you to identify communities, chats, and bots," Ajder says. "It provides the bot-hosting functionality, so it's somewhere that provides the tooling in effect. Then it's also the place where you can share it and actually execute the harm in terms of the end result."
In late September, several deepfake channels started posting that Telegram had removed their bots. It is unclear what prompted the removals. On September 30, a channel with 295,000 subscribers posted that Telegram had "banned" its bots, but it posted a new bot link for users to use. (The channel was removed after WIRED sent questions to Telegram.)
"One of the things that's really concerning about apps like Telegram is that it is so difficult to track and monitor, particularly from the perspective of survivors," says Elena Michael, the cofounder and director of #NotYourPorn, a campaign group working to protect people from image-based sexual abuse.
Michael says Telegram has been "notoriously difficult" to discuss safety issues with, but notes there has been some progress from the company in recent years. However, she says the company should be more proactive in moderating and filtering out content itself.
"Imagine if you were a survivor who's having to do that themselves, surely the burden shouldn't be on an individual," Michael says. "Surely the burden should be on the company to put something in place that's proactive rather than reactive."