
SoylentNews is people

posted by Fnord666 on Wednesday February 07 2018, @07:22AM
from the removing-the-messenger dept.

Telegram iOS app removed from App Store last week due to child pornography

The encrypted messaging app Telegram was mysteriously removed from Apple's App Store last week for a number of hours. At the time, little was known about the reason why, except that it had to do with "inappropriate content." According to a 9to5Mac report, Apple removed Telegram after the app was found serving up child pornography to users.

A verified email from Phil Schiller details that Apple was alerted to child pornography in the Telegram app, immediately verified the existence of the content, and removed the app from its online stores. Apple then notified Telegram and the authorities, including the National Center for Missing and Exploited Children. Telegram apps were only allowed to be restored to the App Store after Telegram removed the inappropriate content and reportedly banned the users who posted it.

[...] Since Telegram is a messaging app with end-to-end encryption, it's unlikely that the content in question originated from direct messages between users. It's possible that the child pornography came from a Telegram plugin, but neither Apple nor Telegram has revealed the source of the inappropriate content.

Telegram is an instant messaging service with at least 100 million monthly active users.

Also at The Verge and Apple Insider.

Related: Former Whatsapp Users Bring Telegram to its Knees
Hackers Compromised Telegram Accounts, Identified 15 Million Users' Phone Numbers
Open Source Remote Access Trojan Targets Telegram Users
Russia Targets Telegram App After St Petersburg Bombing


Original Submission

 
  • (Score: 4, Interesting) by Immerman on Wednesday February 07 2018, @08:08AM (9 children)

    by Immerman (3985) on Wednesday February 07 2018, @08:08AM (#634333)

    I sure hope Apple doesn't discover that email, texting, and video chat are all also viable channels to share inappropriate content...

    Obviously any secure communication channel is going to be more appealing to those sharing criminal content - but practically every encryption specialist on the planet has agreed that that is the price to pay for secure communication - the same tools that protect the good guys also protect the bad guys.

    • (Score: 4, Insightful) by unauthorized on Wednesday February 07 2018, @11:52AM (7 children)

      by unauthorized (3776) on Wednesday February 07 2018, @11:52AM (#634375)

      This will be controversial, but I don't think there should be such a thing as "criminal content". Oh, certainly you can depict illegal activities in a video, but possessing a video depicting illegal activities cannot legitimately be considered a crime, because there is no victim. If you watch a video of ISIS beheadings, nobody is hurt by your actions; it's not as if they are going to behead the person again. I don't see the argument as to why child pornography should be treated differently; the only motivation seems to be "we hate pedos and want them to be miserable".

      To address the most common rebuttals: outlawing a certain type of media content isn't going to stop distribution (just ask the MAFIAA), nor would legalizing it encourage further production, because (a) the kind of person who would fuck a kid for money would either have fucked them anyway or done something equally terrible, such as stealing their organs, and (b) torrents will always offer a superior product at a better price than any would-be producer.

      We all know the drill: "protect the children" is the mantra used to justify the methods, and "shut down the undesirables" is the intended outcome, with all the legitimacy of your support. I bet a good chunk of people here would be a-okay with shutting down the alt-right for their "hateful messages". Any tools of oppression you legitimize today will inevitably be used against you tomorrow.

      • (Score: 4, Insightful) by TheRaven on Wednesday February 07 2018, @12:28PM (4 children)

        by TheRaven (270) on Wednesday February 07 2018, @12:28PM (#634379) Journal

        If you watch a video of ISIS beheadings, nobody is hurt by your actions; it's not as if they are going to behead the person again.

        If no one watches ISIS beheading videos, then they are not a useful propaganda tool and so there's little incentive to make them. You seem to be under the impression that demand for a product has no impact on the size of the supply.

        --
        sudo mod me up
        • (Score: 3, Interesting) by unauthorized on Wednesday February 07 2018, @02:22PM (3 children)

          by unauthorized (3776) on Wednesday February 07 2018, @02:22PM (#634397)

          If no one watches ISIS beheading videos, then they are not a useful propaganda tool and so there's little incentive to make them.

          Clearly you don't understand ISIS [mirror.co.uk] (primary source [clarionproject.org]). Even if they couldn't make videos, they would behead people anyway as more than 15 centuries of Islamist barbarism has demonstrated beyond any doubt.

          You seem to be under the impression that demand for a product has no impact on the size of the supply.

          And you seem to be under the mistaken impression that outlawing a product somehow reduces supply and demand. Those who fail to learn from the mistakes of the past [wikipedia.org] are doomed to repeat them.

          • (Score: 3, Interesting) by Immerman on Wednesday February 07 2018, @04:16PM (2 children)

            by Immerman (3985) on Wednesday February 07 2018, @04:16PM (#634421)

            There is a difference between reducing demand and eliminating it. Unless demand is completely inelastic, which is basically never the case, raising the price reduces the demand. And making people risk prison terms to consume content raises the price.

            Of course, it also drives the remaining demand to the black market, which typically increases the profit margins considerably, and may actually significantly increase the total profits available, and thus the incentive to produce. That's especially true with a digital product that has near-zero incremental cost per customer. And of course, without any legal oversight there are likely to be far more egregious abuses and crimes surrounding the industry.

            • (Score: 5, Insightful) by unauthorized on Wednesday February 07 2018, @06:18PM (1 child)

              by unauthorized (3776) on Wednesday February 07 2018, @06:18PM (#634476)

              I can understand the case that outlawing child pornography may have a non-zero impact on how many children are abused in its production; I'm not denying that. However, I do believe the impact is severely overstated.

              Firstly, most of the deterrent effect can be accomplished by outlawing production, sale, and purchase only, whereas outlawing possession primarily deters consumption, which does not directly increase the instances of abuse. Of course, legalizing possession would lead to a rise in demand, but it would also significantly increase the demand for unauthorized sharing, which would act as a natural negative feedback effect. If people could legally download any Hollywood movie from The Pirate Bay but would be arrested for buying it, it's a no-brainer that next to nobody would buy the movies; you've essentially created a legal incentive to sabotage any attempt to profit from producing movies by legally condoning piracy.

              Secondly, and this is the real big deal, the availability of pornography has a very strong and well-established link with a reduced occurrence of rape. When pedos don't have outlets for their urges, it leads to mounting sexual frustration, to the point where their desire to fulfill their psychological need starts outweighing their fear of the law, which leads to child abuse. This is the trolley dilemma in a nutshell: if we legalize the distribution of child pornography, that will likely lead to multiple instances where children who wouldn't otherwise have been abused are abused as a consequence of something we chose to do. However, if we don't, we know with absolute certainty that the total instances of child molestation will be higher.

              • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @01:38AM

                by Anonymous Coward on Thursday February 08 2018, @01:38AM (#634607)

                It may no longer be true, if it ever was, but some years back a claim circulated that the total number of child porn videos and images out there was so small that cops and border guards were equipped with lists of checksums.

                Run a checker on every media file on a computer or phone, and if anything matched, bring the owner in for "interrogation"...
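
                For anyone wondering what that kind of scan actually involves, here is a minimal sketch in Python. The hash-list file name, extensions, and scan path are made up for illustration, and real matching systems (e.g. Microsoft's PhotoDNA) use perceptual hashes rather than exact digests, since re-encoding a file changes its checksum:

                import hashlib
                from pathlib import Path

                # Hypothetical hash list: one hex-encoded SHA-256 digest per line.
                KNOWN = {line.strip().lower()
                         for line in open("known_hashes.txt")
                         if line.strip()}
                MEDIA_EXTENSIONS = {".jpg", ".jpeg", ".png", ".gif", ".mp4", ".avi"}

                def sha256_of(path):
                    # Hash the file in 1 MiB chunks so large videos need not fit in memory.
                    digest = hashlib.sha256()
                    with open(path, "rb") as f:
                        for chunk in iter(lambda: f.read(1 << 20), b""):
                            digest.update(chunk)
                    return digest.hexdigest()

                # Walk the storage and flag exact matches against the known list.
                for p in Path("/path/to/scan").rglob("*"):
                    if p.is_file() and p.suffix.lower() in MEDIA_EXTENSIONS:
                        if sha256_of(p) in KNOWN:
                            print("match:", p)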

      • (Score: 2) by pipedwho on Wednesday February 07 2018, @09:14PM (1 child)

        by pipedwho (2032) on Wednesday February 07 2018, @09:14PM (#634572)

        I agree with this. Content itself is not the problem; the problem is the potentially harmful acts that went into creating it, or possibly the initial act in which something was exposed. Trying to affect 'market demand' by making the thing itself illegal has failed spectacularly in the past.

        Bribery law tries to punish 'all players'; it would be better to make the legality asymmetric. For bribery, it would be more effective to make receiving a bribe completely legal but make offering one illegal. Giving one party legal protection opens up the possibility that someone will report it. It also leaves the person offering the bribe uncertain whether the recipient will take the money and then report the act, which is even more likely if you let the recipient keep the money, as they're then less likely to carefully obfuscate and hide the income.

        The other way round is more appropriate for 'illegal content', i.e. make it illegal to receive payment for distributing the content. As it stands with current laws, it's far too legally ambiguous when someone ends up 'possessing' the content, no matter how they got it (especially if it was accidental or unintentional). The act of creating it is already illegal because of the acts themselves. Criminalizing possession makes no sense, as it reduces the odds that someone will report the abuse (especially if they paid for the content).

        I think the legal reasoning may come down to likening the act of paying for something to the act of 'commissioning' said content to be created. I'd hope this ambiguity has already been handled by existing sales doctrine, where buying something that has already been produced doesn't mean the buyer initiated production of said item. Directly commissioning creation is a separate problem, as is selling content known to have been illegally commissioned.

        I'm not sure how best to deal with these things, but it's clear that the current system isn't ideal and can probably be shown to be sub-par, with too many 'unintended consequences'. Just like the Drug War, this is typical knee-jerk legislation.
         

        • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @01:44AM

          by Anonymous Coward on Thursday February 08 2018, @01:44AM (#634613)

          Thing is, what constitutes a "child" varies massively.

          The USA has some of the highest ages of consent out there (it varies by state, and Hawaii's was apparently as low as 14 well into the '80s), and thus any depiction of someone below that age is considered "child porn".

          But then there are other places with a much lower age, and if you bring any video you shot of your little get-together back home, it's hello, vice squad.

          Damn it, you can mail-order life-sized dolls from Asia these days. Complete with functioning genitalia, no less.

    • (Score: 2) by MichaelDavidCrawford on Thursday February 08 2018, @01:51AM

      by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Thursday February 08 2018, @01:51AM (#634620) Homepage Journal

      Most of the file sharing services have lots of cleartext kiddy porn.

      Often the file names aren't even obfuscated.

      --
      Yes I Have No Bananas. [gofundme.com]
  • (Score: 0) by Anonymous Coward on Wednesday February 07 2018, @08:32AM

    by Anonymous Coward on Wednesday February 07 2018, @08:32AM (#634336)

    Do Apple's efforts here to prevent this content from being seen really help catch the people abusing the children?

    Preventing people's ability to profit from such imagery is helpful, but that's not what was done here (I believe). Preventing people's ability to share/pirate it probably actually helps the producers/sellers of the content, both sales-wise and in staying hidden from the public and law enforcement.

    I don't fault Apple for this; they are just following the law and existing social norms. I simply don't think the way we treat child pornography does much to curb child abuse. Such content should be denied copyright protection and be illegal to profit from, and when asked, someone in possession should be required to disclose the source under penalty of perjury. Making possession, viewing, and distribution illegal just makes it much more difficult to help out law enforcement, conduct investigations, figure out where the abuse is happening, etc. Using the law to force people to hide the problem and to discourage people with evidence from providing it seems like a bad idea.

  • (Score: 2) by physicsmajor on Wednesday February 07 2018, @09:57AM (5 children)

    by physicsmajor (1471) on Wednesday February 07 2018, @09:57AM (#634349)

    So Telegram is supposed to be end-to-end encrypted. If it really is, there should be no way for Apple to verify whether any content is present.

    So either that's a lie, perhaps to cover a mistake at Apple, or Telegram isn't truly end-to-end encrypted and Apple can inspect content as a middleman.

    Neither is acceptable.

    • (Score: 3, Insightful) by Runaway1956 on Wednesday February 07 2018, @10:11AM

      by Runaway1956 (2926) Subscriber Badge on Wednesday February 07 2018, @10:11AM (#634353) Journal

      Someone shared a public key that enabled decryption? Or someone accidentally published a private key? It isn't terribly unusual for someone to misconfigure a client. I'm not defending either Telegram or Apple here; I'm just offering alternative scenarios in which the public might be decrypting an encrypted file.

      They also blame a plugin that can be installed in Telegram, without naming that plugin. It seems reasonable that the plugin was implemented in some stupid manner that defeated encryption. Or, that the content was intentionally served to all users who had installed the plugin.

      Without more details, we can only speculate about the security of Telegram and its plugins.

    • (Score: 3, Informative) by Anonymous Coward on Wednesday February 07 2018, @11:30AM

      by Anonymous Coward on Wednesday February 07 2018, @11:30AM (#634372)

      Telegram is not end-to-end encrypted by default. You have to start a "secret chat" to use end-to-end.

    • (Score: 3, Informative) by romlok on Wednesday February 07 2018, @02:06PM

      by romlok (1241) on Wednesday February 07 2018, @02:06PM (#634394)

      I believe Telegram also has public groups which anyone can join and chat. My guess is that one of these groups was being used to share child porn, and that group was being listed or suggested to new users of the app.

    • (Score: 0) by Anonymous Coward on Wednesday February 07 2018, @03:48PM

      by Anonymous Coward on Wednesday February 07 2018, @03:48PM (#634415)

      [...] Since Telegram is a messaging app with end-to-end encryption, it's unlikely that the content in question originated from direct messages between users. It's possible that the child pornography came from a Telegram plugin, but neither Apple nor Telegram has revealed the source of the inappropriate content.

      Good job reading the summary, Sparky.

    • (Score: 2) by legont on Thursday February 08 2018, @03:48AM

      by legont (4179) on Thursday February 08 2018, @03:48AM (#634661)

      In a nutshell, besides regular chats, Telegram has groups and channels. Groups are private, and their protection is up to the members; Telegram does not censor them, and if security is enabled, it cannot.

      Channels, on the other hand, can be read by anybody, and Telegram can and will remove content if it is objectionable. They do it using volunteer labor.

      Details can be found here https://telegram.org/faq#groups-supergroups-and-channels [telegram.org]

      People use channels for all kinds of commercial activities. Specialized bots are very popular.
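
      To give a concrete idea of what those bots are: Telegram exposes an HTTP Bot API (documented at core.telegram.org/bots/api), and a typical channel or group bot is just a program polling that API. Here is a minimal echo-bot sketch in Python using the requests library; the token below is a placeholder, a real one is issued by Telegram's @BotFather:

      import requests

      # Placeholder token for illustration; a real one comes from @BotFather.
      TOKEN = "123456:EXAMPLE"
      API = "https://api.telegram.org/bot" + TOKEN

      def get_updates(offset=None):
          # Long-poll the Bot API for new messages sent to the bot.
          params = {"timeout": 30}
          if offset is not None:
              params["offset"] = offset
          return requests.get(API + "/getUpdates", params=params, timeout=40).json()

      def send_message(chat_id, text):
          # Post a reply into any chat, group, or channel the bot can write to.
          return requests.post(API + "/sendMessage",
                               data={"chat_id": chat_id, "text": text}).json()

      offset = None
      while True:
          for update in get_updates(offset).get("result", []):
              offset = update["update_id"] + 1
              msg = update.get("message", {})
              if "text" in msg:
                  send_message(msg["chat"]["id"], "Echo: " + msg["text"])

      A commercial "shop" bot on a channel is essentially this loop plus a catalogue or payment backend behind it.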

      --
      "Wealth is the relentless enemy of understanding" - John Kenneth Galbraith.
  • (Score: 2) by drussell on Wednesday February 07 2018, @10:00AM

    by drussell (2678) on Wednesday February 07 2018, @10:00AM (#634351) Journal

    So.... This (potentially goin' rogue) app was pulled from the app store for a few hours.

    What about all the people who already have said app? Will it be causing mayhem?!

    So, if this had not been a circumstance where the problem was "corrected quickly" and the app truly went rogue, how do they alert their users? Do they just delete it from your device "for your protection?" That seems impolite, to say the least... Nice can of worms there, Apple... Welcome to the doldrums! :)

    Yeah, that really inspires my confidence in your Apple brand walled garden...

    :facepalm:

  • (Score: 2) by MichaelDavidCrawford on Thursday February 08 2018, @01:48AM

    by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Thursday February 08 2018, @01:48AM (#634619) Homepage Journal

    "We have our own orphanage!" - Russian Child Pornography Website.

    Microsoft claims that they remove child pornography from Bing's index when they are notified of its URL.

    I don't believe them.

    There are vast quantities of child pornography in Bing. It already supports image similarity search. It already knows all the keywords with which kiddie porn can be found, because several other CP keywords are suggested whenever a CP search is performed.

    I notified the FBI of this three years ago. Nothing has been done.

    --
    Yes I Have No Bananas. [gofundme.com]