
posted by hubie on Thursday February 29, @10:17AM   Printer-friendly
from the Where-have-you-been-recently? dept.

Sensitive location data could be sold off to the highest bidder:

In 2021, a company specializing in collecting and selling location data called Near bragged that it was "The World's Largest Dataset of People's Behavior in the Real-World," with data representing "1.6B people across 44 countries." Last year the company went public with a valuation of $1 billion (via a SPAC). Seven months later it filed for bankruptcy and has agreed to sell the company.

But for the "1.6B people" that Near said its data represents, the important question is: What happens to Near's mountain of location data? Any company could gain access to it through purchasing the company's assets.

The prospect of this data, including Near's collection of location data from sensitive locations such as abortion clinics, being sold off in bankruptcy has raised alarms in Congress. Last week, Sen. Ron Wyden (D-Ore.) wrote the Federal Trade Commission (FTC) urging the agency to "protect consumers and investors from the outrageous conduct" of Near, citing his office's investigation into the India-based company.

Wyden's letter also urged the FTC "to intervene in Near's bankruptcy proceedings to ensure that all location and device data held by Near about Americans is promptly destroyed and is not sold off, including to another data broker." The FTC took such an action in 2010 to block the use of 11 years' worth of subscriber personal data during the bankruptcy proceedings of XY Magazine, which was oriented to young gay men. The agency requested that the data be destroyed to prevent its misuse.

Wyden's investigation was spurred by a May 2023 Wall Street Journal report that Near had licensed location data to the anti-abortion group Veritas Society so it could target ads to visitors of Planned Parenthood clinics and attempt to dissuade women from seeking abortions. Wyden's investigation revealed that the group's geofencing campaign focused on 600 Planned Parenthood clinics in 48 states. The Journal also revealed that Near had been selling its location data to the Department of Defense and intelligence agencies.

[...] This week, a new bankruptcy court filing showed that Wyden's requests were granted. The order placed restrictions on the use, sale, licensing, or transfer of location data collected from sensitive locations in the US and requires any company that purchases the data to establish a "sensitive location data program" with detailed policies for such data and ensure ongoing monitoring and compliance, including the creation of a list of sensitive locations such as reproductive health care facilities, doctor's offices, houses of worship, mental health care providers, corrections facilities, and shelters, among others. The order demands that unless consumers have explicitly provided consent, the company must cease any collection, use, or transfer of location data.

[...] The bankruptcy order also provided a rare glimpse into how data brokers license data to one another. Near's list of contracts included agreements with several location brokers, ad platforms, universities, retailers, and city governments.

It is not clear from the filing if the agreements covered Near data being licensed, Near licensing the data from the companies, or both.

Original Submission

This discussion was created by hubie (1068) for logged-in users only, but now has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 5, Insightful) by Opportunist on Thursday February 29, @10:58AM (5 children)

    by Opportunist (5545) on Thursday February 29, @10:58AM (#1346778)

    Because even if a company promises "your data is sacred and we won't sell it", nobody says that whatever herd of locusts feasts on the cadaver after the company goes belly-up will honor it.

    • (Score: 5, Interesting) by Thexalon on Thursday February 29, @12:13PM (4 children)

      by Thexalon (636) on Thursday February 29, @12:13PM (#1346783)

      1. Promises that aren't in legal contracts aren't worth a dime, under pretty much any circumstances. At best, you'd have a false advertising claim.
      2. Promises of privacy lose out to subpoenas and other government mechanisms to retrieve that data.
      3. For the "no sale of your data" promise to be enforced, you have to know that your data was sold. That's extremely hard to discover: so many entities are selling your data that proving which one sold it to whoever ended up with it is difficult at best.

      The only thing that stops a bad guy with a compiler is a good guy with a compiler.
      • (Score: 3, Insightful) by Opportunist on Thursday February 29, @03:30PM (2 children)

        by Opportunist (5545) on Thursday February 29, @03:30PM (#1346810)

        1. Even if that promise is in a legal contract, it's between me and someone who doesn't exist anymore.
        2. You should only provide your personal data to entities outside a jurisdiction that matters to you anyway. Preferably a government so hostile to yours that it will never cooperate with it.
        3. See 2.

        • (Score: 2) by DeathMonkey on Thursday February 29, @07:24PM (1 child)

          by DeathMonkey (1380) on Thursday February 29, @07:24PM (#1346834) Journal

          And fourthly, that promise usually looks like this:

          your data is sacred and we won't sell it*

          *unless we change our minds

          • (Score: 1) by anubi on Thursday February 29, @09:57PM

            by anubi (2828) on Thursday February 29, @09:57PM (#1346855) Journal

            * only as permitted by law.

            ( And they already have loopholes lobbied into place - anyone can become a " business partner " with a shake of a hand. )

            "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
      • (Score: 3, Insightful) by ikanreed on Thursday February 29, @04:01PM

        by ikanreed (3164) Subscriber Badge on Thursday February 29, @04:01PM (#1346812) Journal

        Legal contracts are frequently dissolved in bankruptcy anyways.

  • (Score: 5, Interesting) by JoeMerchant on Thursday February 29, @12:43PM (2 children)

    by JoeMerchant (3937) on Thursday February 29, @12:43PM (#1346786)

    If privacy were taken seriously, protected information would only be stored in encrypted form.

    The keys necessary to access the protected information would be controlled by contractual obligations, enforced by independent escrow agents.

    When the contract is expired, the keys are erased and forgotten, and the protected information is lost forever.

    We have had the technology to do these things, practically, since 1980. What we lack is the seriousness and will to do so.
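The escrow scheme described above is essentially what is now called crypto-shredding: store only ciphertext, keep the key with an independent party, and destroy the key when the contract ends. Here is a minimal Python sketch of the idea. The cipher here is a toy SHA-256 counter-mode keystream built only from the standard library, purely for illustration; a real system would use a vetted AEAD cipher, and the names (`keystream_xor`, `escrowed_key`) are invented for this example.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data against a SHA-256 counter-mode keystream derived from key.
    Toy construction for illustration only, not production cryptography."""
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        # One 32-byte keystream block per 32 bytes of data.
        pad = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

# The "escrow agent" holds the only copy of the key.
escrowed_key = secrets.token_bytes(32)

record = b"subject visited clinic at 09:30"
ciphertext = keystream_xor(escrowed_key, record)  # what the custodian stores

# While the contract is active, the escrow agent can release the key:
assert keystream_xor(escrowed_key, ciphertext) == record

# When the contract expires, the escrow agent erases the key.
# Without it, the stored ciphertext is unrecoverable noise.
escrowed_key = None
```

The point is that "deletion" becomes a single, auditable act by the escrow agent (erasing a 32-byte key) rather than a promise that every copy of a large dataset was scrubbed.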

    • (Score: 3, Interesting) by Freeman on Thursday February 29, @02:37PM (1 child)

      by Freeman (732) on Thursday February 29, @02:37PM (#1346804) Journal

      I suggest that 1980 wouldn't have been a good time to do that. Probably not even in the 90s, but certainly by the 2000s and most especially by now. It should be technologically feasible. Politically feasible? The two sides can't even come to an agreement that gets themselves paid every so often; thus the warnings about government shutdowns, etc. Good luck getting something logical and useful pushed through Congress, let alone something that gives the populace even more protections than they should already have.

      Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
      • (Score: 5, Insightful) by JoeMerchant on Thursday February 29, @03:24PM

        by JoeMerchant (3937) on Thursday February 29, @03:24PM (#1346809)

        If we had implemented secure processes in 1980, the "compute power" issues would only have "highly secured" most of those secrets until maybe 2000. By 1990, practical upgrades in key lengths probably could have "highly secured" secrets until ~2030, and by 2000 conventional compute power made it practical to secure secrets for a hundred years or more, with an exception for quantum attacks... The computational complexity side will likely be a moving target for decades to come, but when you are talking about securing hundreds of millions of secrets and a "crack" of one of them requires millions of dollars and years of effort... that's getting into "secure enough" territory.

        Politically, 1980 was already too late: the 1976 Arms Export Control Act (AECA) made it illegal to distribute munitions to other countries without a license, including cryptography. There was strong political pressure not to encrypt communications and information storage, which shaped development for decades.

        Then there was, and still is, the "Wild West" digital business development landscape, which placed near-zero value on the security and privacy of user data (arguably negative value, once you factor in the political pressures; remember when PKZip was hobbled by export controls?). Juxtapose this with the reality that secure program/system development is significantly harder than open development, and we get to where we are today.

        If we were serious about privacy, the 1996-2003 rollout of HIPAA was when it should have really kicked in, but instead HIPAA was distorted into a legal club to randomly beat people with - with almost no forward progress on serious protection of personal information.

        Ten years ago, we developed a "next generation" product which handles a tiny amount of PHI (Protected Health Information) - our solution for protection of that information is that we don't store it. When the device is switched off (which it is at the end of every day), all stored PHI is destroyed. Now, we are activating network connectivity on the product, and starting to take security more seriously due to the addition of that network attack surface. I'm going to estimate that it would have slowed initial release of the product by 6 months (5 years becomes 5.5) if we had "baked in" security in the first place. Now, band-aiding security on later, there's a solid year of development involved in getting that rolled out. However, if we had tried to do this security push 5 years ago, we would have been "rolling our own" secure solutions which, inevitably, would then be different from other secure processes developed around the company in the past 5 years (and, I would attribute local development and control of those secure processes to about 3 months of the 6 months acceleration, but then we would be on the hook to maintain them going forward...)

        In my estimation, without the political back-pressure, we as a society could (and possibly should) have started this push for security 40 years earlier, but for individual developers within the society, within companies within the society, it would have been pointless to get out in front and implement a bunch of secure solutions when nobody else cared, or even actively resisted them. Secure system development inherently adds difficulty and cost, but not crazy amounts, and the benefits to systems operating with global connectivity are virtually existential. We're starting to discover that network accessible PLCs secured by Homer Simpson's password "d0nut", if they are secured at all, should not be actively controlling systems that provide critical infrastructure like electricity, clean water, etc. to large populations. It was always obvious, but it's cheaper to ignore the problem until the problems start to manifest.

        20 years ago (post-9/11) there was a thought problem in the Houston area: "How much damage could two guys in a pickup truck with a bed full of hand grenades do?" Typical answers were on the order of billions of dollars of both physical damage and temporary shutdown of profitable processes before the likely police response could stop them, and hundreds of billions if they could manage to put all the grenades on optimal damage targets. Variations of the thought problem include: do they have a grenade launcher, or are they just throwing them by hand? Even just throwing by hand, the low-end estimates are still in the billions.

  • (Score: 5, Insightful) by stratified cake on Thursday February 29, @01:54PM (1 child)

    by stratified cake (35052) on Thursday February 29, @01:54PM (#1346794)

    As far as companies and governments are concerned, it's our data the same way that milk belongs to the cow.

    • (Score: 5, Insightful) by Ox0000 on Thursday February 29, @02:34PM

      by Ox0000 (5111) on Thursday February 29, @02:34PM (#1346803)

      Spot on.
      Whenever I talk about these things in a setting where digital pillaging comes up or even remotely threatens to do so, I try to use words such as Data Subject, Data Owner, and Data Custodian.
      Especially when using that last one, I try to hammer on the fact that whoever holds the data is merely a custodian, not the owner. The Data Subject is, in all but a handful of exceptions, also the ultimate Data Owner. And as a Data Custodian, one has the duty to guard that data as a good custodian. I tend to also explain that being a custodian opens you up to liabilities, that it thus behooves you to limit what you collect in order to limit your liability, and that you must put strong protections in place to limit both the probability of exposure and the blast radius of the eventual exposure.

      Unfortunately, this reasoning tends to fall on deaf ears with the MBA crowd.
      Ain't nothing an MBA cannot fuck up...

  • (Score: 3, Informative) by Ox0000 on Thursday February 29, @02:29PM

    by Ox0000 (5111) on Thursday February 29, @02:29PM (#1346801)

    Your data, sensitive or not, is considered an asset. It's not so much that it _can_ or _could_ be sold, it's that it _will_ be liquidated to whoever gives them enough money for it. Your data is treated with as much respect as the dirty mugs that they find in the office kitchen and will be sold just like it when whoever holds it goes bankrupt.
    There's only one difference: while they can only sell those mugs once, your data can be sold multiple times to multiple entities at the same time, and it will be.

  • (Score: 4, Interesting) by deganee on Thursday February 29, @06:59PM (2 children)

    by deganee (3187) Subscriber Badge on Thursday February 29, @06:59PM (#1346832)

    FYI one of the big data sales that will be coming up is of 23andMe.

    (I don't actually know anything; this is just an assumption.)

    23andMe's stock price is down to $0.60, after listing at $10.00 during their IPO in 2021.
    80% of their users opted to allow their DNA to be used for research.

    To tell people about their family tree, 23andMe only needed to sequence a tiny % of their DNA. But for everyone who opted into research, the company has their original saliva sample in a freezer (biobank), and the consent document gives them the right to sequence their entire genome when it becomes economically feasible.

    "If you have elected to have your saliva sample stored by 23andMe, we may also use the results of further analysis of your sample in 23andMe Research. For example, we may conduct whole genome sequencing, which allows researchers to study genetic information more thoroughly. "

    "Unless you withdraw your biobanking consent or close your account, your permission to keep and analyze your Samples at our biobank does not expire, and 23andMe may continue to store and analyze your Samples. "

    So when 23andMe goes out of business because their stock is declining to zero, someone will buy the right to sequence millions of people's entire DNA and basically do whatever they want with it. Their right to do this **does not expire**, so they can sell your DNA now or in 100 years. The cost of DNA sequencing is rapidly dropping, so eventually it will make sense to do this.

    • (Score: 3, Interesting) by Ox0000 on Thursday February 29, @09:21PM (1 child)

      by Ox0000 (5111) on Thursday February 29, @09:21PM (#1346852)

      And not only that: this includes everyone related to those who fell for 23andMe's proposition ("we get your DNA and all deducible information from it; you get a letter containing some vagaries about the geographic spread of pieces of DNA that you share with the other schmucks who fell for our proposition"), even though those related individuals may never have given consent.

      Remember that by definition you share (a bunch of your) DNA with your relatives, that's what being related means. When you use something like 23andMe, you are giving away something that isn't yours, nor is it yours to give. To somewhat bastardize the words of Patek Philippe: "You never actually own your DNA. You merely look after it for the next generation."

      • (Score: 3, Interesting) by JoeMerchant on Thursday February 29, @09:44PM

        by JoeMerchant (3937) on Thursday February 29, @09:44PM (#1346854)

        There are many ways to "give away" your DNA besides signing it over to a company. Committing a felony and forcing law enforcement into an investigation by not confessing is one that comes to mind immediately.

        Sooner or later, collection of DNA from most people will become standard practice. The flimsy "privacy protections" that come along with the collection should be amusing, in a tragicomedic sense.

        If you think your civil liberties are better protected today, try making a credible terrorist threat from your cellphone while driving past traffic cameras and see what happens.

  • (Score: 2) by bzipitidoo on Friday March 01, @12:28AM

    by bzipitidoo (4388) on Friday March 01, @12:28AM (#1346880) Journal

    I think we can no longer rely on privacy to protect us from abuse by those who have all this info on us. The example of far right political groups using this info to bully women and abortion providers is particularly outrageous. These days, doxxing is extremely easy to do. How does one get "undoxxed"? Move away, that's about all I can think of, but that's an awful lot of trouble and expense. We need other measures.

    Top of my list is tolerance. After that, fight fire with fire. Some far right nuts want to harass people? People can harass them right back, harder. No bullying KKK scumbag should have a moment of peace when out in public.

    Harder to handle is more subtle discrimination. I can imagine health insurers fishing for excuses to raise rates. It'd be "pre-existing conditions" on oxycontin!