
posted by Fnord666 on Wednesday February 07 2018, @07:22AM
from the removing-the-messenger dept.

Telegram iOS app removed from App Store last week due to child pornography

The encrypted messaging app Telegram was mysteriously removed from Apple's App Store last week for a number of hours. At the time, little was known about the reason why, except that it had to do with "inappropriate content." According to a 9to5Mac report, Apple removed Telegram after the app was found serving up child pornography to users.

A verified email from Phil Schiller details that Apple was alerted to child pornography in the Telegram app, immediately verified the existence of the content, and removed the app from its online stores. Apple then notified Telegram and the authorities, including the National Center for Missing and Exploited Children. Telegram apps were only allowed to be restored to the App Store after Telegram removed the inappropriate content and reportedly banned the users who posted it.

[...] Since Telegram is a messaging app with end-to-end encryption, it's unlikely that the content in question originated from direct messages between users. It's possible that the child pornography came from a Telegram plugin, but neither Apple nor Telegram has revealed the source of the inappropriate content.

Telegram is an instant messaging service with at least 100 million monthly active users.

Also at The Verge and Apple Insider.

Related: Former Whatsapp Users Bring Telegram to its Knees
Hackers Compromised Telegram Accounts, Identified 15 Million Users' Phone Numbers
Open Source Remote Access Trojan Targets Telegram Users
Russia Targets Telegram App After St Petersburg Bombing


Original Submission

Related Stories

Former Whatsapp Users Bring Telegram to its Knees 39 comments
girlwhowaspluggedout writes:

"A mere three days after Mark Zuckerberg announced Facebook's acquisition of Whatsapp, the popular smartphone messaging app suffered a major service outage that lasted three and a half hours. Left to their own devices, Whatsapp users worldwide went rushing to its rival apps, including secure chat provider Telegram. The surge in new users quickly turned into a tidal wave that brought Telegram's service to its knees:

The SMS gateways we use to send registration codes are overloaded and slow. 100 SMS per second is too much. Trying to find a solution.

On its official Twitter account, Telegram announced that more than 1.8 million new users had joined on Saturday, Feb 22. Four hours later, it reported an additional 800,000.

Telegram's messaging service, which uses 256-bit symmetric AES encryption, RSA 2048 encryption and Diffie-Hellman secure key exchange, began enjoying a spike in popularity after Whatsapp's acquisition. Although it has released the source code for its Java libraries and all its official clients, its server software is still closed source."
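For the curious, a minimal sketch of the Diffie-Hellman exchange named above, using toy textbook parameters; real deployments (Telegram's MTProto included) negotiate a 2048-bit group or use elliptic curves, so this only illustrates the arithmetic:

    import secrets

    # Textbook Diffie-Hellman with toy parameters (p=23, g=5). These tiny
    # values are trivially breakable and are shown only for the arithmetic.
    p, g = 23, 5

    a = secrets.randbelow(p - 2) + 1   # Alice's private exponent
    b = secrets.randbelow(p - 2) + 1   # Bob's private exponent

    A = pow(g, a, p)   # Alice sends A to Bob in the clear
    B = pow(g, b, p)   # Bob sends B to Alice in the clear

    # Each side combines its own private exponent with the other's public
    # value; both arrive at g^(ab) mod p without ever transmitting it.
    assert pow(B, a, p) == pow(A, b, p)
    print("shared secret:", pow(B, a, p))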

Hackers Compromised Telegram Accounts, Identified 15 Million Users' Phone Numbers 8 comments

Submitted via IRC for TheMightyBuzzard

The accounts with Telegram, a secure messaging service based in Germany, were compromised by exploiting the fact that Telegram sends would-be users an SMS with authorization codes so that they can activate their devices.

The researchers believe the attackers intercepted these text messages, which allowed them to add new devices to the targets' accounts and access everything in them.
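To see why an intercepted login SMS is enough, here is a minimal sketch of such an activation flow. Every name in it is invented for illustration; this is not Telegram's actual API:

    import secrets

    pending = {}  # phone number -> outstanding one-time code

    def send_login_code(phone: str) -> None:
        """Server texts a short code to the number being activated."""
        code = f"{secrets.randbelow(100000):05d}"
        pending[phone] = code
        print(f"SMS to {phone}: login code {code}")  # visible to the carrier

    def add_device(phone: str, code: str, device: str) -> bool:
        """The server checks only the code; it cannot tell whether the
        person typing it is the subscriber or whoever read the SMS."""
        if pending.pop(phone, None) == code:
            print(f"{device} is now attached to account {phone}")
            return True
        return False

    send_login_code("+98-555-0100")  # victim (or attacker) triggers the SMS
    # Whoever reads that SMS (subscriber or interceptor) can then run:
    # add_device("+98-555-0100", "<code from the SMS>", "attacker-laptop")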

This SMS interception has been performed either by compromising Iranian phone companies, or by colluding with them. The researchers believe that the latter theory is not far-fetched, as Rocket Kitten – the hacker group that they believe performed the attacks – is believed to be composed of Iranian hackers, possibly tied to the Iranian Revolutionary Guard Corps...

Rocket Kitten is known for targeting individuals, businesses and government organizations across the Middle East, but also researchers (Iranian and European), Iranian citizens/activists, and Islamic and anti-Islamic preachers and groups, political parties and government officials.

The same group apparently also managed to misuse Telegram's API to identify 15 million Iranian phone numbers and user IDs tied with Telegram accounts earlier this year. This information can come in handy for orchestrating future attacks and help with investigations.

Source: https://www.helpnetsecurity.com/2016/08/03/compromised-telegram-accounts/
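As for the 15-million-number harvest, a hedged sketch of how such contact-discovery enumeration works in general, with a simulated directory standing in for the real contact-import endpoint (all names invented):

    import secrets

    # Fake server-side directory: ~1,000 registered numbers under one prefix.
    REGISTERED = {f"+98912{secrets.randbelow(10**5):05d}": uid
                  for uid in range(1000)}

    def lookup_user(phone: str):
        """Stand-in for the contact-import lookup: phone number -> user ID."""
        return REGISTERED.get(phone)

    def enumerate_prefix(prefix: str, width: int = 5) -> dict:
        """Walk every number under a prefix, keeping those that resolve."""
        found = {}
        for n in range(10 ** width):
            phone = f"{prefix}{n:0{width}d}"
            uid = lookup_user(phone)
            if uid is not None:
                found[phone] = uid
        return found

    print(len(enumerate_prefix("+98912")), "numbers mapped to account IDs")

Rate limiting is the standard defence against this; harvesting 15 million numbers suggests the limits were weak or patiently worked around.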


Original Submission

Open Source Remote Access Trojan Targets Telegram Users 14 comments

Submitted via IRC for TheMightyBuzzard

Remote access Trojans are mainly used to steal consumer data, whether from consumers themselves or from the companies charged with keeping this information safe from prying eyes. However, it appears criminals are looking at a different approach for these tools right now. A new open source remote access Trojan can now be used to extract data from the Telegram communication platform.

It is never a good sign when end-to-end encrypted communication tools are vulnerable to remote access Trojans. Unfortunately for all Telegram users, they have now become an official target for cybercriminals who make use of the RATAttack toolkit. This new open-source hacking tool was unveiled by security researchers late last night, and it could have major consequences for all Telegram users.

Source: https://themerkle.com/open-source-remote-access-trojan-targets-telegram-users/


Original Submission

Russia Targets Telegram App After St Petersburg Bombing 16 comments

Submitted via IRC for FatPhil

Russia's FSB[1] security agency has said the Telegram mobile messaging app was used by a suicide bomber who killed 15 people in St Petersburg in April.

Authorities have already threatened to block the app, founded by Russian businessman Pavel Durov, for refusing to sign up to new data laws.

Mr Durov has refused to let regulators access encrypted messages on the app.

Telegram has some 100 million users and has been used by so-called Islamic State (IS) and its supporters.

IS used the app to declare its involvement in the jihadist attack on and around London Bridge in the UK last month.

Telegram has been used by jihadists in France and the Middle East too, although the app company has highlighted its efforts to close down pro-IS channels. Telegram allows groups of up to 5,000 people to send messages, documents, videos and pictures without charge and with complete encryption.

Now the FSB has said that as part of its investigation into the St Petersburg attack it "received reliable information about the use of Telegram by the suicide bomber, his accomplices and their mastermind abroad to conceal their criminal plots at all the stages of preparation for the terrorist attack".

A Russian identified as Akbarzhon Jalilov blew himself up between two underground stations on 3 April. The security agency said that Telegram was the messenger of choice for "international terrorist organisations in Russia" because they could chat secretly with high levels of encryption.

[1] According to Wikipedia:

The Federal Security Service of the Russian Federation (FSB; Russian: Федеральная служба безопасности Российской Федерации (ФСБ), tr. Federal'naya sluzhba bezopasnosti Rossiyskoy Federatsii; IPA: [fʲɪdʲɪˈralʲnəjə ˈsluʐbə bʲɪzɐˈpasnəstʲɪ rɐˈsʲijskəj fʲɪdʲɪˈratsɨjɪ]) is the principal security agency of Russia and the main successor agency to the USSR's Committee of State Security (KGB). Its main responsibilities are within the country and include counter-intelligence, internal and border security, counter-terrorism, and surveillance as well as investigating some other types of grave crimes and federal law violations.

Source: http://www.bbc.com/news/world-europe-40404842


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Interesting) by Immerman on Wednesday February 07 2018, @08:08AM (9 children)

    by Immerman (3985) on Wednesday February 07 2018, @08:08AM (#634333)

    I sure hope Apple doesn't discover that email, texting, and video chat are all also viable channels to share inappropriate content...

    Obviously any secure communication channel is going to be more appealing to those sharing criminal content - but practically every encryption specialist on the planet has agreed that that is the price to pay for secure communication - the same tools that protect the good guys, also protect the bad guys.

    • (Score: 4, Insightful) by unauthorized on Wednesday February 07 2018, @11:52AM (7 children)

      by unauthorized (3776) on Wednesday February 07 2018, @11:52AM (#634375)

      This will be controversial, but I don't think there should be such a thing as "criminal content". Oh, certainly you can depict illegal activities in a video, but possessing a video depicting illegal activities cannot legitimately be considered a crime, because there is no victim. If you watch a video of ISIS beheadings, nobody is hurt by your actions; it's not as if they are going to behead the person again. I don't see the argument as to why child pornography should be treated differently; the only motivation seems to be "we hate pedos and want them to be miserable".

      To address the most common rebuttals, outlawing a certain type of media content isn't going to stop distribution (just ask MAFIAA), nor is it going to encourage further production because (a) the kind of person who would fuck a kid for money would either have fucked them anyway or done something equally terrible such as stealing their organs and (b) torrents will always offer superior product at a better price than any would-be producer.

      We all know the drill: "protect the children" is the mantra used to justify the methods, and "shut down the undesirables" is the intended outcome, with all the legitimacy of your support. I bet a good chunk of people here will be a-okay with shutting down the alt-right for their "hateful messages". Any tools of oppression you legitimize today will inevitably be used against you tomorrow.

      • (Score: 4, Insightful) by TheRaven on Wednesday February 07 2018, @12:28PM (4 children)

        by TheRaven (270) on Wednesday February 07 2018, @12:28PM (#634379) Journal

        If you watch a video of ISIS beheadings nobody is hurt by your actions, it's not as if they are going to behead the person again.

        If no one watches ISIS beheading videos, then they are not a useful propaganda tool and so there's little incentive to make them. You seem to be under the impression that demand for a product has no impact on the size of the supply.

        --
        sudo mod me up
        • (Score: 3, Interesting) by unauthorized on Wednesday February 07 2018, @02:22PM (3 children)

          by unauthorized (3776) on Wednesday February 07 2018, @02:22PM (#634397)

          If no one watches ISIS beheading videos, then they are not a useful propaganda tool and so there's little incentive to make them.

          Clearly you don't understand ISIS [mirror.co.uk] (primary source [clarionproject.org]). Even if they couldn't make videos, they would behead people anyway as more than 15 centuries of Islamist barbarism has demonstrated beyond any doubt.

          You seem to be under the impression that demand for a product has no impact on the size of the supply.

          And you seem to have the wrong impression that outlawing a product somehow reduces supply and demand. Those who fail to learn from the mistakes of the past [wikipedia.org] are doomed to repeat them.

          • (Score: 3, Interesting) by Immerman on Wednesday February 07 2018, @04:16PM (2 children)

            by Immerman (3985) on Wednesday February 07 2018, @04:16PM (#634421)

            There is a difference between reducing demand and eliminating it. Unless demand is completely inelastic, which is basically never the case, raising the price reduces the demand. And making people risk prison terms to consume content raises the price.

            Of course, it also drives the remaining demand to the black market, which typically increases the profit margins considerably, and may actually significantly increase the total profits available, and thus the incentive to produce. Especially with a digital product that has near-zero incremental costs per customer. And of course without any legal oversight there's likely to be a lot more egregious abuses and crimes surrounding the industry.

            • (Score: 5, Insightful) by unauthorized on Wednesday February 07 2018, @06:18PM (1 child)

              by unauthorized (3776) on Wednesday February 07 2018, @06:18PM (#634476)

              I can understand the case that outlawing child pornography may have a non-zero impact on how many children are abused in its production; I'm not denying this is the case. However, I do believe the impact is severely overstated.

              Firstly, most of the deterrent effect can be accomplished by outlawing production, sale and purchase only, whereas outlawing possession is primarily a deterrent for consumption, which does not directly increase the instances of abuse. Of course it would lead to a rise in demand, but it would also significantly increase the demand for unauthorized sharing, which would act as a natural negative-feedback effect. If people could download any Hollywood movie from The Pirate Bay legally but would be arrested for buying it, it's a no-brainer that next to nobody would buy the movies; you've essentially created a legal incentive to sabotage any attempt to profit from producing movies by legally condoning piracy.

              Secondly, and this is the real big deal, availability of pornography has a very strong and well-established link with reducing the occurrence of rape. When pedos don't have outlets for their urges, sexual frustration builds to the point where the desire to fulfill their psychological need starts outweighing their fear of the law, which leads to child abuse. This is the Trolley dilemma in a nutshell: if we legalize the distribution of child pornography, this will likely lead to multiple instances where children who wouldn't otherwise have been abused are abused as a consequence of something we chose to do. However, if we don't, we know with absolute certainty that the total instances of child molestation will be higher.

              • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @01:38AM

                by Anonymous Coward on Thursday February 08 2018, @01:38AM (#634607)

                It may no longer be true, if it ever was, but some years back a claim circulated that the total number of child porn videos and images out there was so small that cops and border guards were equipped with lists of checksums.

                Run a checker on every media file in a computer or phone. And if anything came up matching, bring the owner in for "interrogation"...
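                A minimal sketch of that checksum approach: hash every media file and compare against a supplied list of known digests. The extensions and the (empty) hash list are placeholders, and real tooling reportedly favors perceptual hashes such as PhotoDNA, since a one-byte edit changes a cryptographic hash completely.

                    import hashlib
                    from pathlib import Path

                    KNOWN_HASHES: set[str] = set()  # loaded from the supplied list

                    def sha256_of(path: Path) -> str:
                        """Stream the file through SHA-256 without loading it whole."""
                        h = hashlib.sha256()
                        with path.open("rb") as f:
                            for chunk in iter(lambda: f.read(1 << 20), b""):
                                h.update(chunk)
                        return h.hexdigest()

                    def scan(root: str) -> list[Path]:
                        """Return every media file under root whose digest matches."""
                        exts = {".jpg", ".png", ".gif", ".mp4", ".avi"}
                        return [p for p in Path(root).rglob("*")
                                if p.suffix.lower() in exts
                                and sha256_of(p) in KNOWN_HASHES]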

      • (Score: 2) by pipedwho on Wednesday February 07 2018, @09:14PM (1 child)

        by pipedwho (2032) on Wednesday February 07 2018, @09:14PM (#634572)

        I agree with this. Content itself is not the problem; it's the potentially harmful acts that went into creating it, or possibly the initial act where something was exposed. Trying to affect 'market demand' by making the thing itself illegal has spectacularly failed in the past.

        Where bribery law currently tries to punish 'all players', it would be better to make the legality asymmetric. For bribery, it would be more effective to make receiving a bribe completely legal, but keep offering one illegal. By giving one party legal protection for the act, you open up the possibility that someone will report it. It also leaves the person offering the bribe uncertain whether the recipient will take the money and then report the act. Even more so if you let the recipient keep the money, as they're then less likely to carefully obfuscate and hide the income.

        The other way round is more appropriate for 'illegal content', i.e. receiving some sort of payment for distributing the content. As it stands with current laws, it's way too legally ambiguous when someone ends up 'possessing' the content, no matter how they got it (especially if it was accidental or unintentional). The act of creating it is already illegal due to the acts themselves. Criminalizing possession makes no sense, as it reduces the odds that someone will report the abuse (especially if they paid for the content).

        I think the legal reasoning may come down to likening the act of paying for something to 'commissioning' said content. I'd hope that ambiguity has already been handled by current sales doctrine, where buying something that has already been produced doesn't mean you initiated its production. Directly commissioning creation is a separate problem, as is selling material 'known to have been illegally commissioned'.

        Not sure how best to deal with these things, but it's clear that the current system isn't ideal and can probably be shown to be sub-par, with too many 'unintended consequences'. Just like the Drug War, this is typical knee-jerk legislation.
         

        • (Score: 0) by Anonymous Coward on Thursday February 08 2018, @01:44AM

          by Anonymous Coward on Thursday February 08 2018, @01:44AM (#634613)

          Thing is, what constitutes a "child" varies massively.

          The USA has some of the highest ages of consent out there (it varies by state, and Hawaii's apparently was as low as 14 well into the '80s), and thus anyone depicted below that age is considered "child porn".

          But then there are other places with a much lower age; bring home any video you shot of your little get-together there, and it's hello, vice squad.

          Damn it, you can mail-order life-sized dolls from Asia these days. Complete with functioning genitalia, no less.

    • (Score: 2) by MichaelDavidCrawford on Thursday February 08 2018, @01:51AM

      by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Thursday February 08 2018, @01:51AM (#634620) Homepage Journal

      Most of the file sharing services have lots of cleartext kiddy porn.

      Often the file names aren't even obfuscated.

      --
      Yes I Have No Bananas. [gofundme.com]
  • (Score: 0) by Anonymous Coward on Wednesday February 07 2018, @08:32AM

    by Anonymous Coward on Wednesday February 07 2018, @08:32AM (#634336)

    Do Apple's efforts here to prevent this content from being seen really help catch the people abusing the children?

    Preventing the ability of people to profit from such imagery is helpful, but that's not what was done here (I believe). Preventing people's ability to share/pirate it probably actually helps the producers/sellers of the content, both sales-wise and in staying hidden from the public and law enforcement.

    I don't fault Apple for this; they are just following the law and existing social norms. I simply don't think the way we treat child pornography does much to curb child abuse. Such content should be denied copyright protection and be illegal to profit from, and when asked, someone in possession should be required to disclose the source under penalty of perjury. Making possession, viewing and distribution illegal just makes it much more difficult to help out law enforcement, conduct investigations, figure out where the abuse is happening, etc. Using the law to force people to hide the problem, and to discourage people with evidence from providing it, seems like a bad idea.

  • (Score: 2) by physicsmajor on Wednesday February 07 2018, @09:57AM (5 children)

    by physicsmajor (1471) on Wednesday February 07 2018, @09:57AM (#634349)

    So Telegram is supposed to be end-to-end encrypted. If it really is, there should be no way for Apple to verify whether any content is present.

    So either that's a lie, perhaps to cover a mistake at Apple, or Telegram isn't truly end-to-end encrypted and Apple can inspect content as the middleman.

    Neither is acceptable.

    • (Score: 3, Insightful) by Runaway1956 on Wednesday February 07 2018, @10:11AM

      by Runaway1956 (2926) Subscriber Badge on Wednesday February 07 2018, @10:11AM (#634353) Journal

      Someone shared a public key that enabled decryption? Or someone accidentally published a private key? It isn't terribly unusual for someone to misconfigure a client. I'm not defending either Telegram or Apple here; I'm just offering alternative scenarios in which the public might be decrypting an encrypted file.

      They also blame a plugin that can be installed in Telegram, without naming that plugin. It seems reasonable that the plugin was implemented in some stupid manner that defeated encryption. Or, that the content was intentionally served to all users who had installed the plugin.

      Without more details, we can only speculate about the security of Telegram and its plugins.

    • (Score: 3, Informative) by Anonymous Coward on Wednesday February 07 2018, @11:30AM

      by Anonymous Coward on Wednesday February 07 2018, @11:30AM (#634372)

      Telegram is not end-to-end encrypted by default. You have to start a "secret chat" to use end-to-end.

    • (Score: 3, Informative) by romlok on Wednesday February 07 2018, @02:06PM

      by romlok (1241) on Wednesday February 07 2018, @02:06PM (#634394)

      I believe Telegram also has public groups which anyone can join and chat. My guess is that one of these groups was being used to share child porn, and that group was being listed or suggested to new users of the app.

    • (Score: 0) by Anonymous Coward on Wednesday February 07 2018, @03:48PM

      by Anonymous Coward on Wednesday February 07 2018, @03:48PM (#634415)

      [...] Since Telegram is a messaging app with end-to-end encryption, it's unlikely that the content in question originated from direct messages between users. It's possible that the child pornography came from a Telegram plugin, but neither Apple nor Telegram has revealed the source of the inappropriate content.

      Good job reading the summary, Sparky.

    • (Score: 2) by legont on Thursday February 08 2018, @03:48AM

      by legont (4179) on Thursday February 08 2018, @03:48AM (#634661)

      In a nutshell, besides regular chats, Telegram has groups and channels. Groups are private, and their protection is up to the members; Telegram does not censor them and, if security is set, cannot.

      Channels, on the other hand, can be read by anybody, and Telegram can and will remove content if it is objectionable. They do it using volunteer labor.

      Details can be found here https://telegram.org/faq#groups-supergroups-and-channels [telegram.org]

      People use channels for all kinds of commercial activities. Specialized bots are very popular.

      --
      "Wealth is the relentless enemy of understanding" - John Kenneth Galbraith.
  • (Score: 2) by drussell on Wednesday February 07 2018, @10:00AM

    by drussell (2678) on Wednesday February 07 2018, @10:00AM (#634351) Journal

    So.... This (potentially goin' rogue) app was pulled from the app store for a few hours.

    What about all the people who already have said app? Will it be causing mayhem?!

    So, if this had not been a circumstance where the problem was "corrected quickly" and the app truly went rogue, how do they alert their users? Do they just delete it from your device "for your protection?" That seems impolite, to say the least... Nice can of worms there, Apple... Welcome to the doldrums! :)

    Yeah, that really inspires my confidence in your Apple brand walled garden...

    :facepalm:

  • (Score: 2) by MichaelDavidCrawford on Thursday February 08 2018, @01:48AM

    by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Thursday February 08 2018, @01:48AM (#634619) Homepage Journal

    "We have our own orphanage!" - Russian Child Pornography Website.

    Microsoft claims that they remove child pornography from Bing's index when they are notified of its URL.

    I don't believe them.

    There are vast quantities of child pornography in Bing. It already supports image similarity search. It already knows all the keywords with which kiddie porn can be found, because several other CP keywords are suggested whenever a CP search is performed.

    I notified the FBI of this three years ago. Nothing has been done.

    --
    Yes I Have No Bananas. [gofundme.com]