SoylentNews is people

posted by hubie on Wednesday February 25, @02:26AM

Billions of files were left exposed:

Not every AI tool you stumble across in your phone's app marketplace is the same. In fact, many of them may be more of a privacy gamble than you might think.

A plethora of unlicensed or unsecured AI apps on the Google Play store for Android, including those marketed for identity verification and editing, have exposed billions of records and personal data, cybersecurity experts have confirmed.

A recent investigation by Cybernews found that one Android-available app in particular, "Video AI Art Generator & Maker," has leaked 1.5 million user images, over 385,000 videos, and millions of user AI-generated media files. The security flaw was spotted by researchers, who discovered a misconfiguration in a Google Cloud Storage bucket that left personal files vulnerable to outsiders. In total, the publication reported, over 12 terabytes of users' media files were accessible via the exposed bucket. The app had 500,000 downloads at the time.

Another app, called IDMerit, exposed know-your-customer data and personally identifiable information from users across 25 countries, predominantly in the U.S.

Information included full names and addresses, birthdates, IDs, and contact information constituting a full terabyte of data. Both of the apps' developers resolved the vulnerabilities after researchers notified them.

Still, cybersecurity experts warn that lax security trends among these types of AI apps pose a widespread risk to users. Many AI apps, which often store user-uploaded files alongside AI-generated content, also use a highly criticized practice known as "hardcoding secrets," embedding sensitive information such as API keys, passwords, or encryption keys directly into the app's source code. Cybernews found that 72 percent of the hundreds of Google Play apps researchers analyzed had similar security vulnerabilities.
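As a minimal sketch of the "hardcoding secrets" anti-pattern described above (the key value and environment-variable name here are hypothetical, not taken from any of the audited apps):

```python
import os

# Anti-pattern: the secret is baked into the source, so it ships inside
# every copy of the app and can be recovered by decompiling the package.
HARDCODED_API_KEY = "sk-live-0000-example"  # hypothetical key; never do this

# Safer: resolve the secret at runtime from the environment (or a
# secrets manager), so it never lands in the distributed binary.
def get_api_key() -> str:
    key = os.environ.get("MYAPP_API_KEY")  # hypothetical variable name
    if not key:
        raise RuntimeError("MYAPP_API_KEY is not set")
    return key
```

Anything compiled into a shipped app should be treated as public; looking the secret up at runtime at least keeps it out of the distributed package.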


Original Submission

This discussion was created by hubie (1068) for logged-in users only, but now has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 3, Interesting) by Anonymous Coward on Wednesday February 25, @02:38AM (3 children)

    by Anonymous Coward on Wednesday February 25, @02:38AM (#1434843)

    > In total, the publication reported, over 12 terabytes of users' media files were accessible via the exposed bucket.

    That's like $2000 per month, just in storage cost.

    How do these apps make that kind of money?? Surely it's not from in-app advertising. Even sales of user data seem doubtful. Large companies shelling out for 50 failure apps just to make it big once, maybe, but it seems like they wouldn't build something with seemingly so little profit potential.

    How much do apps like this make?! How do they make it??

    > The app had 500,000 downloads at the time.

    Does that bring in $2000/mo in advertising revenue??

    • (Score: 4, Informative) by Bentonite on Wednesday February 25, @11:04AM

      by Bentonite (56146) on Wednesday February 25, @11:04AM (#1434876)

      > How do these apps make that kind of money??

      As the article notes:

      > including those marketed for identity verification and editing

      Selling collected identity documents to criminals would net a fortune.

      > Does that bring in $2000/mo in advertising revenue??

      If on average there is $0.05/sucker of advertising revenue (very unlikely, but possible if 1/100 are ultra-suckers who click on an ad and buy something, and the agreed amount for a successful sale is $5), then revenue would be ~$25,000 a year, which would cover storage.

      But I figure realistically the advertising revenue would be $0.001/sucker, while sales of sensitive information at $0.05-$0.10/user would easily net ~$25,000-$50,000.
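Those back-of-envelope figures can be checked directly (every per-user rate below is this comment's guess, not a measured number):

```python
downloads = 500_000  # install count reported for the leaky app

ad_optimistic = downloads * 0.05    # $0.05/user in ads   -> ~$25,000
ad_realistic  = downloads * 0.001   # $0.001/user in ads  -> ~$500
data_sale_lo  = downloads * 0.05    # records at $0.05/user -> ~$25,000
data_sale_hi  = downloads * 0.10    # records at $0.10/user -> ~$50,000
```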

    • (Score: 3, Interesting) by VLM on Wednesday February 25, @03:06PM (1 child)

      by VLM (445) on Wednesday February 25, @03:06PM (#1434898)

      https://cloud.google.com/storage/pricing [google.com]

      It's more like 2 cents/month/GB, so 12,000 GB would be more like $240 per month. I think you missed a decimal point.

      Much like going to a hotel, there are extra fees stacked on top of extra fees.

      "Reading" is free (like intra-cloud, if your database accesses it inside the datacenter), but the network fee per GB out of the cloud into North America is 8 cents/GB in extreme bulk, so downloading the entire cache one time would cost the company about a thousand bucks in network fees, for a single dude/gang/intelligence service doing a bulk download. Realize they put the data online for app users to ... use the app, so this can add up quickly. Even if the average user only downloads a file about once per month, that's more like $1250 total: $250 for the long-term storage and $1000 of network costs.
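Putting those ballpark rates into numbers (the $0.02/GB-month and $0.08/GB figures are the rough list prices quoted in this comment; real GCS pricing varies by storage class, region, and volume):

```python
bucket_gb = 12_000                    # ~12 TB of exposed media

storage_per_month = bucket_gb * 0.02  # ~$240/month at rest
one_full_egress   = bucket_gb * 0.08  # ~$960 for one bulk download

# Roughly one corpus-worth of egress per month (users re-downloading
# files) lands in the ~$1,200/month ballpark described above.
monthly_total = storage_per_month + one_full_egress
```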

      As for the specific example, this class of apps doesn't make money off ad revenue. It's more like 1990s long distance resellers, where the advertisement is the scam ("Free for the first 30 seconds!") but then they screw the absolute F out of any fish who bites the hook. I don't know this specific app, but the way it works is: resell Sora, let the user get one video per month "free," and if they want more they can sign up for a subscription or tokens that cost, oh, say 25x what Sora charges if you deal directly with them. So even if 23 out of 25 users never sign up, the 2 that wildly, massively overpay for their resold service make up for it. It's the kind of business where people try to do billing chargebacks a lot.

      There is a class of people who, almost like moths to flame, have a near sexual desire to always have a middleman in the middle, and those people are cash cows for scammy companies like this. The meta-observation is that the only people making "real" money off AI right now are quite literally scammers, and soon the connotation will be that everything AI is a billing scam, because currently that's the only way to make money with AI LOL.

      As a specific example, I found an app on Google right now that front-ends and resells Sora's $0.25/minute service at an effective rate of $1.49 per minute if you use all your monthly tokens (much higher if you only use half, for example). While it's a hot fad, people, especially gullible people, will almost demand to be parted with their money. You could generate an AI cat video using Sora for $2.50 or pay the scammer middleman $14.90, and some folks don't care; they just want an AI cat video.
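The markup in that cat-video example works out as follows (both per-minute rates are this comment's figures for an unnamed reseller app, not a published price list):

```python
direct_rate = 0.25   # $/minute buying video generation directly
resold_rate = 1.49   # effective $/minute through the reseller

minutes = 10         # one hypothetical AI cat video

direct_cost = direct_rate * minutes      # ~$2.50 direct
resold_cost = resold_rate * minutes      # ~$14.90 via the middleman
markup      = resold_rate / direct_rate  # ~6x markup
```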

      • (Score: 3, Interesting) by VLM on Wednesday February 25, @03:14PM

        by VLM (445) on Wednesday February 25, @03:14PM (#1434900)

        > soon the connotation for AI will be everything AI is a billing scam

        As a meta-observation about AI inroads into the economy, it's telling that the pro-AI people claim it'll take everything over, but right now the only people making money are scammy middlemen wildly overcharging a small fraction of gullible people. Everyone else is losing money or at best breaking even.

        If it (AI) were a healthy growing gamechanger part of the economy, people other than scammers would be making money off it. However...

  • (Score: 5, Informative) by jb on Wednesday February 25, @07:20AM (5 children)

    by jb (338) on Wednesday February 25, @07:20AM (#1434860)

    ...then all your data are already compromised anyway.

    Does it really make any difference whether users' data have been compromised by one malicious actor (OS vendor) or by two (OS + app vendors)?

    • (Score: 3, Interesting) by Anonymous Coward on Wednesday February 25, @10:58AM (3 children)

      by Anonymous Coward on Wednesday February 25, @10:58AM (#1434875)

      I would not sleep in peace if I had a Google account.

      It would gnaw at me like knowing my front door lock doesn't work.

    With all those terms and conditions, they can do anything to me, and there isn't a damn thing I can do about it once I've agreed.

      Like virginity, once given, you ain't gettin' it back. Ever.

      • (Score: 3, Touché) by SomeGuy on Wednesday February 25, @12:27PM (1 child)

        by SomeGuy (5632) on Wednesday February 25, @12:27PM (#1434878)

        Just need to point out this is also true of a Microsoft Account.

        You have uploaded all of your sensitive files (unencrypted) to OneDrive, right? According to the advertising it is safe and secure. :P.

        • (Score: 1, Interesting) by Anonymous Coward on Wednesday February 25, @09:17PM

          by Anonymous Coward on Wednesday February 25, @09:17PM (#1434954)

          I don't have a Microsoft or Apple account either. For the same reason.

          If I have to accept all those terms, conditions, and hold-harmless clauses, I am in no mood at all to surrender financial credentials.

          The quickest way to lock me into a hard "NO!!!" is to present me with a big block of terms and conditions. One clause I especially look for is one claiming a right to change things after acceptance. That is the brightest of red flags and irreversibly destroys any trust, even though Sales Professionals often use this clause.

          https://www.breitbart.com/entertainment/2024/08/17/widower-slams-disneys-outrageous-attempt-to-toss-lawsuit-over-wifes-death-because-they-signed-up-for-disney/ [breitbart.com]

      • (Score: 4, Funny) by hendrikboom on Wednesday February 25, @02:43PM

        by hendrikboom (1125) on Wednesday February 25, @02:43PM (#1434895) Homepage Journal

        I have heard of an obscure sect in western Canada (it may have had four or five adherents) which claims that its priest(s) have the power to restore virginity.

    • (Score: 3, Touché) by VLM on Wednesday February 25, @03:32PM

      by VLM (445) on Wednesday February 25, @03:32PM (#1434906)

      > Another app, called IDMerit, exposed know-your-customer data and personally identifiable information from users across 25 countries, predominantly in the U.S.

      All your data is compromised anyway.

      Those are insurance companies insuring public data.

      I don't think privacy advocates or the general public understand how this stuff works. It's more like real estate, which people also don't understand.

      Real estate records are public. You can look at who owns my house and when I paid taxes, etc. It's a nice website, actually, pretty easy to use. Likewise, all this "private" data is not private, not to corporations. It's private only in the sense that your neighbor probably doesn't have free access to it (but would if they paid).

      When I was a kid, my mom did part-time piecework legal stuff for a title insurance company (good work if you can get it as an educated mom with kids in school), so she only got involved in weird legal title searches while I was in math class or whatever during the winter.

      They (the company plus my mom) take the public data for, say, home ownership records and tax payment records, then stamp (in the old days, literal photocopies of) the public record with "we have a bond worth $1M that this public info is correct and accurate as best we can tell at this time," so the bank approves your new mortgage or whatever. If there's a mixup (like stolen land, an illegal sale, or a boundary dispute, in my state at that time), then the title insurance company pays out. Which it approximately never does, but it's a fuzzy warm blanket for the bank to approve loans. They charge a modest, and I think reasonable, fee to provide a disinterested 3rd-party opinion about who owns a plot of land, and they put their money where their mouth is if they are later proven wrong. You could hire an attorney off the street to research this for you, but it'll cost a metric shitload, and my mom was very specialized at legal nitpicking.

      That's all these "ID verification" type places do. All that stuff is public to all companies, all governments, and all individuals willing to pay for a report. They just certify it looks believable, and for a modest fee they'll insure it's accurate.

      So it doesn't "really matter" if the already-public data that everyone has access to is made public. It's already public to anyone that matters, and it's only secret to small-time individual consumers who by and large DGAF anyway.
