posted by Fnord666 on Wednesday February 07 2018, @07:22AM   Printer-friendly
from the removing-the-messenger dept.

Telegram iOS app removed from App Store last week due to child pornography

The encrypted messaging app Telegram was mysteriously removed from Apple's App Store last week for a number of hours. At the time, little was known about the reason why, except that it had to do with "inappropriate content." According to a 9to5Mac report, Apple removed Telegram after the app was found serving up child pornography to users.

A verified email from Phil Schiller details that Apple was alerted to child pornography in the Telegram app, immediately verified the existence of the content, and removed the app from its online stores. Apple then notified Telegram and the authorities, including the National Center for Missing and Exploited Children. Telegram apps were only allowed to be restored to the App Store after Telegram removed the inappropriate content and reportedly banned the users who posted it.

[...] Since Telegram is a messaging app with end-to-end encryption, it's unlikely that the content in question originated from direct messages between users. It's possible that the child pornography came from a Telegram plugin, but neither Apple nor Telegram has revealed the source of the inappropriate content.

Telegram is an instant messaging service with at least 100 million monthly active users.

Also at The Verge and Apple Insider.

Related: Former Whatsapp Users Bring Telegram to its Knees
Hackers Compromised Telegram Accounts, Identified 15 Million Users' Phone Numbers
Open Source Remote Access Trojan Targets Telegram Users
Russia Targets Telegram App After St Petersburg Bombing


Original Submission

 
  • (Score: 0) by Anonymous Coward on Wednesday February 07 2018, @08:32AM (#634336)

    Do Apple's efforts here to prevent this content from being seen really help catch the people abusing the children?

    Preventing people from profiting from such imagery is helpful, but that's not what was done here (I believe). Preventing people's ability to share/pirate it probably actually helps the producers/sellers of the content, both sales-wise and in staying hidden from the public and law enforcement.

    I don't fault Apple for this; they are just following the law and existing social norms. I simply don't think the way we treat child pornography does much to curb child abuse. Such content should be denied copyright protection and be illegal to profit from, and when asked, someone in possession should be required to disclose the source under penalty of perjury. Making possession, viewing, and distribution illegal just makes it much more difficult to help out law enforcement, conduct investigations, figure out where the abuse is happening, etc. Using the law to force people to hide the problem and discourage people with evidence from providing it seems like a bad idea.