
SoylentNews is people

posted by janrinok on Tuesday September 09 2014, @01:13PM   Printer-friendly
from the I'm-surprised-it's-only-968-million dept.

CNET reports that University of New Haven researchers found some of the most popular Android apps transmitting and storing unencrypted images, chats, screenshots and even passwords.

From images and videos to passwords and mapping data, apps like Instagram, OKCupid, TextPlus, and GroupMe are sending large amounts of user data unencrypted over the web. The researchers estimate the number of affected users at 968 million.
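The core failure described above, app traffic sent over plain HTTP, can be sketched in a few lines. The captured request below is invented for illustration (no real app's endpoint or field names); the point is that any on-path observer sees the full request body in the clear:

```python
# Hypothetical sketch: what an eavesdropper sees when an app sends a login
# over unencrypted HTTP. The host and field names are made up.
from urllib.parse import parse_qs

# A plaintext HTTP POST as it might appear in a packet capture:
captured_payload = (
    "POST /api/login HTTP/1.1\r\n"
    "Host: example-app.invalid\r\n"
    "Content-Type: application/x-www-form-urlencoded\r\n"
    "\r\n"
    "username=alice&password=hunter2"
)

# Splitting headers from body is all an observer needs to do:
headers, _, body = captured_payload.partition("\r\n\r\n")
fields = parse_qs(body)
print(fields["password"][0])  # the password, readable in transit
```

With HTTPS, the same observer would see only the destination and an encrypted byte stream, which is why sending credentials or images over port-80 HTTP is the headline problem here.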

The university's Cyber Forensics Research and Education Group is documenting its findings in a series of videos on its YouTube channel.

Related Stories

Grindr Exposes Location Data and More to Third-Party Apps 17 comments

A third-party app can use Grindr to expose your exact location

Back in March, a report revealed that Grindr suffered from flaws that could expose its users' personal information. The company issued a statement in response that said its location tracking feature is more akin to a square on an atlas and can't pinpoint users' exact location. According to a new investigation by Queer Europe, though, Grindr can still expose people's personal data through a third-party app called "Fuckr," which was released in 2015 and can locate up to 600 Grindr users within minutes. And by "locate," we mean it can tell where users are with an accuracy of 6 to 16 feet -- accurate enough to tell which establishment, house or even room they're in.

The free third-party app is built on top of Grindr's private API, giving it access to the gay dating app's database. It uses a technique called "trilateration" to find users, giving anyone with access to it a way to follow people around as they go about their day. All someone has to do to find nearby users is apply Fuckr's filters, which can narrow down people based on their ethnicity, relationships and other data. And because the app can tap into Grindr's database, it can reveal not only users' locations, but also their photo, body type, ethnicity, HIV status, last HIV test and even their sexual position preference.
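The trilateration technique mentioned above can be sketched in plane geometry: given the distances to a target from three known points, the target's position falls out of two linear equations. The coordinates and distances below are made up for illustration; Grindr's actual API and Fuckr's implementation are not shown.

```python
# Minimal trilateration sketch: subtracting pairs of circle equations
# (x - xi)^2 + (y - yi)^2 = ri^2 cancels the quadratic terms, leaving a
# 2x2 linear system in (x, y).
import math

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Return (x, y) at distance r1 from p1, r2 from p2, r3 from p3."""
    x1, y1 = p1
    x2, y2 = p2
    x3, y3 = p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# A target at (3, 4), "measured" from three vantage points:
target = (3.0, 4.0)
points = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [math.dist(target, p) for p in points]
x, y = trilaterate(points[0], dists[0], points[1], dists[1],
                   points[2], dists[2])
print(x, y)  # recovers the target position (3.0, 4.0)
```

In the Fuckr scenario the attacker plays all three vantage points itself by querying the API from spoofed locations and reading back the reported distances, which is why distance-to-user responses alone are enough to pinpoint someone.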

Previously: Researchers Find Data Leaks in Instagram, Grindr, Oovoo, And More
Grindr Shared Users' HIV Status With Third Parties

Related: Gay dating app Grindr plans to go public after Chinese parent gives go-ahead
How Grindr Is Reinventing Itself as More Than Just a Dating App
Misguided Appeal in Grindr Case Is Latest Threat to Online Free Speech


Original Submission

  • (Score: 4, Insightful) by VLM on Tuesday September 09 2014, @01:56PM

    by VLM (445) Subscriber Badge on Tuesday September 09 2014, @01:56PM (#91213)

    Much like we have silly "ipv6 days" we should have "block port 80 days".
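A "block port 80 day" of the sort suggested here could be approximated with a single firewall rule. This is a hypothetical sketch in Linux iptables syntax (requires root; adapt for nftables or other firewalls):

```shell
# Reject outbound plaintext HTTP so apps must either use HTTPS (port 443)
# or fail visibly, exposing which ones depend on unencrypted traffic.
iptables -A OUTPUT -p tcp --dport 80 -j REJECT --reject-with tcp-reset
```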

    One huge problem: draw a Venn diagram of who is capable of owning the sniffing and protocol-analysis hardware needed to watch my traffic. Then draw another circle of corrupt corporations and big-brother .gov and .mil organizations who simply tell the far endpoint to turn everything over; they don't need the data stream because they've got access to the entire dumped database of the service and freely share it with everyone else. And the overlap is what, 90% in the best case? Maybe 99% realistically? Is suggesting higher than 99.999% a bit paranoid? The percentage is so high that it doesn't really matter, does it?

    Or rephrased, the only people with access to "images and videos" are members of the 99% general public. Won't matter if it's encrypted or sent in the clear; they're not getting access to that data. And the 1% of people and orgs all have access to it, being the endpoints, or friends of the endpoints, or endpoints that no longer have any civil rights as part of the "save our freedumbs" laws. So they're always going to have free, open, total access whether it's encrypted or sent in the clear. So what's the point of bothering?

    Or rephrased again: if you live in a culture that no longer has the rule of law and rights, and your group doesn't have any power or isn't the biggest power anyway, then you're kind of an idiot to think putting something in an envelope makes it secret from people who we all know sneakily open and read all your mail. It's as quaint and antiquated as dueling pistols or something like that.

    • (Score: 3, Informative) by VLM on Tuesday September 09 2014, @01:57PM

      by VLM (445) Subscriber Badge on Tuesday September 09 2014, @01:57PM (#91214)

      Oh let me click the edit button.

      "Or rephrased, the only people without access to "images and videos" are members of the 99% general public"

  • (Score: 3, Insightful) by MrGuy on Tuesday September 09 2014, @02:30PM

    by MrGuy (1007) on Tuesday September 09 2014, @02:30PM (#91233)

    So, the actual article is more than a bit light on specifics, but appears at least a little alarmist.

    There's a huge difference between, on the one hand, an app taking and storing screenshots the user never took and doing something mysterious with them (which only one app on the list, TextPlus, appears to be doing), or non-obviously storing unencrypted, weakly protected user content on its servers (Tango and MessageMe), and on the other hand simply sending a photo I'm posting to a social media site via http instead of https (which MAY be all that Instagram is doing; I'm not watching a bunch of videos to figure out where this MIGHT be discussed), or storing app content like chat logs unencrypted on the device when they could be encrypted (numerous apps).

    Don't get me wrong - bad security is never a good thing, and being weak in one way is indicative you might be weak in others. But as I read it, there's a huge range of possible issues here from "not optimal" to "OMG really?"

    And it feels a bit like TFA (and FTG) have latched on to a lot of "big names (even though they have small issues)" so they could combine them with "big issues (even if they're from small names)" to create a headline that's technically true but as alarmist as possible. Which is always something that makes me doubt someone's authority as a "neutral uninterested third party who's just looking out for your interests."

    • (Score: 3, Insightful) by zafiro17 on Tuesday September 09 2014, @04:49PM

      by zafiro17 (234) on Tuesday September 09 2014, @04:49PM (#91306) Homepage

      "Maddeningly vague, somewhat alarmist": Isn't that the state of all tech journalism these days? That's how you get the clicks? By making people feel anxious about something without going into details (which you don't have) or providing a solution (which you don't have)?

      This just in: something you use regularly, under some use cases, could actually kill you! But more research is required and in fact your case may vary. All rights reserved, offer not valid in some states or where not permitted by law. But clicky on our adverts, puh-lease?!

      --
      Dad always thought laughter was the best medicine, which I guess is why several of us died of tuberculosis - Jack Handey
      • (Score: 3, Funny) by MrGuy on Tuesday September 09 2014, @05:25PM

        by MrGuy (1007) on Tuesday September 09 2014, @05:25PM (#91327)

        I stopped falling for clickbait using this one weird trick!

  • (Score: 0) by Anonymous Coward on Wednesday September 10 2014, @05:29AM

    by Anonymous Coward on Wednesday September 10 2014, @05:29AM (#91582)

    Because this is true of almost every android app I've ever analyzed... Maybe I should be some sort of expert releasing important statements?