Telegram has quietly updated its policy to let users report private chats to its moderators, following recent “crimes perpetrated by third parties” on the platform.
The messaging platform, which has approximately one billion monthly active users, has historically exercised relatively little oversight of user communications.
On Thursday evening, Telegram began rolling out updates to its moderation policy. “All Telegram apps feature ‘Report’ buttons, enabling users to swiftly report suspected unlawful content to our moderators with just a few taps,” the company notes on its updated FAQ page.
To streamline moderation, the platform now provides a dedicated email address for automated takedown requests, instructing users to include links to the content that requires moderator attention.
It remains unclear whether the change will affect how Telegram responds to requests from regulatory bodies. The company has previously complied with court orders to share user data.
TechCrunch has reached out to Telegram for comment.
The policy changes follow the arrest of Telegram CEO Pavel Durov by French authorities as part of an investigation into alleged crimes on the platform, including the distribution of child abuse imagery, drug trafficking, and fraud.
Following his arrest, Durov issued a statement on his Telegram channel criticizing the move: “Employing outdated legal frameworks to hold accountable a CEO for misdeeds committed by external parties on the platform he oversees is a misguided approach.”
He argued that the established practice for countries unhappy with an internet service is to take legal action against the service itself, rather than targeting its executives.
Durov also warned that holding entrepreneurs liable for the potential misuse of their products would stifle innovation, since no inventor would build new tools under such a threat.