
posted by CoolHand on Friday May 15 2015, @01:02PM
from the eraser dept.

Google's Transparency Report reveals that since the Court of Justice of the European Union's May 13, 2014 ruling that established "the right to be forgotten" for Europeans, Google has received 255,143 requests to remove a total of 925,586 URLs. Google has removed 323,482 of those URLs, a 41.3% removal rate among the URLs evaluated so far.
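A quick sanity check of those figures: 323,482 removals out of 925,586 requested URLs is only about 34.9%, so the 41.3% figure presumably counts only URLs Google has finished evaluating. In rough Python (the evaluated total below is an inference, not a number from the report):

    requested_urls = 925_586   # total URLs requested for removal
    removed_urls = 323_482     # URLs Google removed

    # The naive rate over all requested URLs doesn't match the headline figure:
    print(f"{removed_urls / requested_urls:.1%}")   # 34.9%, not 41.3%

    # Working backwards, a 41.3% removal rate implies roughly this many
    # URLs evaluated so far (an inference, not a reported number):
    print(f"{removed_urls / 0.413:,.0f}")           # ~783,249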

However, that effort isn't enough for some:

Google is receiving a telling off from the UK's Information Commissioner's Office and may face legal action after failing to adequately respond to several so-called "right to be forgotten" requests. The ICO told The Register that "since the details of the ruling were first announced, we have handled over 183 complaints from those unhappy with Google's response to their takedown request". The ICO estimates that Google has mismanaged individuals' requests to remove their information in a quarter of cases.

The independent UK body set up to uphold information rights also says it will now be looking to resolve the 48 remaining cases "through discussion and negotiation with Google, though we have enforcement powers available to us if required".

In addition, 80 legal experts have written an open letter to Google demanding more data about how Google responds to removal requests:

What We Seek

Aggregate data about how Google is responding to the >250,000 requests to delist links thought to contravene data protection from name search results. We should know if the anecdotal evidence of Google's process is representative: What sort of information typically gets delisted (e.g., personal health) and what sort typically does not (e.g., about a public figure), in what proportions and in what countries?

Why It's Important

Google and other search engines have been enlisted to make decisions about the proper balance between personal privacy and access to information. The vast majority of these decisions face no public scrutiny, though they shape public discourse. What's more, the values at work in this process will/should inform information policy around the world. A fact-free debate about the RTBF is in no one's interest.

 
  • (Score: 3, Insightful) by Anonymous Coward on Friday May 15 2015, @02:29PM (#183343)

    > The entire concept is flawed: It only requires search engines to remove the links; it does not require the source material to be deleted.

    No, your understanding of the goals is flawed. Privacy never was and never will be black and white, private or public. There are all kinds of in-between states, gradations of grey, where information is mostly private but still somewhat available. For example:

    It used to be that if you wanted to research a person you had to make an effort - you had to haul your ass down to the library and manually search through microfiche copies of all the newspapers. Or haul your ass down to the courthouse to check conviction records. Or haul your ass down to the tax assessor's office to check property ownership records. Etc.

    The amount of effort required to do that research created a level of privacy for everyone. Unless digging it up was really important to someone, your personal information stayed effectively unknown.

    Search engines have moved a lot of what was mostly private to mostly public. The right to be forgotten is about moving some of that stuff back towards the mostly private side of the spectrum. The goal is not to delete the information, it is to increase the amount of effort required to find it so that some of that pre-search engine privacy is restored.

  • (Score: 3, Interesting) by The Archon V2.0 (3887) on Friday May 15 2015, @03:05PM (#183354)

    Another example is those sites with mugshots that will take them down if you pay them money. Those mugshots always existed, but they were useless for extortion until people started Googling their dates and potential employees. (And I use "Googling" quite deliberately - this racket only started working once a search engine was good enough to become a part of daily life.)

    • (Score: 2) by kaszz (4211) on Friday May 15 2015, @03:48PM (#183370) Journal

      Provided the cost is lower than that of taking down the mugshot site altogether :P

  • (Score: 3, Insightful) by CirclesInSand (2899) on Friday May 15 2015, @03:54PM (#183377)

    It doesn't matter what the goals of legislation are. What matters is the policy and the outcome. "Right to be forgotten" is typical reactionary legislation. People are upset that they can't hide the truth from the public, so they try to pass legislation to make it illegal. They focus on the goal rather than the obvious outcome (if you have to be told what happens when you can legally suppress the truth, you aren't paying attention).

    • (Score: 0) by Anonymous Coward on Friday May 15 2015, @04:30PM (#183391)

      People are upset that they can't hide the truth from the public

      Wrong, people are upset because mistakes from decades ago are still being held against them. Let's see how you'd like it if an arrest from right after you turned 18 was still used to deny you employment in your 30s, or if you still can't get a date because some bitter bitch falsely claimed you raped her 15 years ago. But then you'd just be upset because you couldn't hide the truth, right? It wouldn't have anything to do with being treated unfairly over things that happened half a lifetime ago, things that lost their relevance a long time ago.

      • (Score: 2) by Nerdfest (80) on Friday May 15 2015, @05:09PM (#183404)

        The 'wrong' part of what you're saying is not that the information can be found, it's that non-relevant information is being used against people. Rather than trying to hide information (it wants to be free, remember?) through legislation, use legislation to make the use of non-relevant information illegal. It may accomplish just as little, but at least it would be going after the right thing.

        • (Score: 0) by Anonymous Coward on Friday May 15 2015, @11:32PM (#183564)

          > Rather than trying to hide information (it wants to be free, remember?) through legislation, use legislation to make the use of non-relevant information illegal.

          Think about what you are proposing, how it would have to play out.

          Who decides what is "non-relevant information?" You want the government to specify a list of everything that people can (or can not) think about whenever they interact with someone? And if you violate that list you go to jail?

          All you guys are stuck in a black-and-white, binary-choices geek mindset when the real world is analog. Right to be forgotten is an analog solution to an analog problem. No solution will be perfect, but any counter-proposals have to at least be plausible.

          • (Score: 0) by Anonymous Coward on Monday May 18 2015, @03:00AM (#184288)

            All you guys are stuck in a black-and-white

            How is deciding what information is non-relevant on a case-by-case basis black-and-white? By that logic, this censorship scheme is itself black-and-white thinking. The law isn't always black-and-white, you know. Sometimes it's quite ambiguous, out of necessity. I'll take that over this censorship implementation.

      • (Score: 2) by RedGreen (888) on Friday May 15 2015, @05:19PM (#183409)

        "Wrong, people are upset because mistakes from decades ago are still being held against them. Lets see how you'd like it if an arrest from right after you turned 18 was still used to deny you employment in your 30s, or if you still can't get a date because some bitter bitch falsely claimed you raped her 15 years ago. But then you'd just be upset because you couldn't hide the truth right? It wouldn't have anything to do with being treated unfairly over things that happened half a lifetime ago, things that lost their relevance a long time ago."

        Already happens around here: if you want a job, most ads I see require a criminal record background check with the job application/resume. So if you want that wonderful job of sweeping floors or working in a recycling plant, or just about anything, you are not getting it unless you have had a pardon for that offense, if you had one.

        --
        "I modded down, down, down, and the flames went higher." -- Sven Olsen
        • (Score: 0) by Anonymous Coward on Friday May 15 2015, @11:36PM (#183567)

          > Already happens around here: if you want a job, most ads I see require a criminal record background check with the job application/resume.

          Actually, your example is a perfect case of a narrow application of an American version of something much like the right to be forgotten. It is called ban the box [wikipedia.org] and it is slowly gaining momentum.

      • (Score: 1) by khallow (3766) Subscriber Badge on Friday May 15 2015, @07:02PM (#183454) Journal

        Why would that be online in the first place? And why is it the search engine's job to police that information, especially when people can just use search engines which don't have to comply with that law?
  • (Score: 2) by VortexCortex (4067) on Friday May 15 2015, @06:02PM (#183429)

    Search engines have moved a lot of what was mostly private to mostly public. The right to be forgotten is about moving some of that stuff back towards the mostly private side of the spectrum. The goal is not to delete the information, it is to increase the amount of effort required to find it so that some of that pre-search engine privacy is restored.

    I have my own web spider software / recommendation engine, and I intend to use it to give back some privacy to people by allowing them to run search and recommendations in their own home, with data kept on servers they control, not via some 3rd party marketing agency. The search spider is a side effect of a digital assistant AI which also indexes documents and scans video / audio feeds (stuff you want kept in-house and not cached on some 3rd party server).

    I have no facility to pre-censor public data -- I can't tell the AI to be completely blind to certain bits of reality; pattern recognition must occur first. Besides, you wouldn't trust an AI that lied to you and said it couldn't find something that it had indeed found. I can aim the AI's spider at a specific group of sites and scan for things it thinks match my interest graph (which it will alert me to when it thinks I'm in the mood to see them). If I were interested in Spain and bankruptcy, that guy's data would probably be on my drives already.

    Things like this could be the future of search, in the best interest of end-user privacy. Data storage is cheap, you don't need to index/store everything (just what you find interesting), and even decentralized search is more "private" than centralized search engines. What happens when everyone has a search engine? It might be time to rethink your position.
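    Concretely, the self-hosted version of this needs surprisingly little: a crawler you point at sites you care about, and a local inverted index that never leaves your machine. A minimal Python sketch of that idea (the seed URLs, tokenizer, and AND-only search are illustrative placeholders, not the actual spider described above):

        # Toy personal crawler + local inverted index, stdlib only.
        import re
        import urllib.request
        from collections import defaultdict

        SEED_URLS = ["https://example.com/"]  # placeholder: sites you choose to follow

        def fetch(url: str) -> str:
            """Download one page; a real spider would honor robots.txt and rate limits."""
            with urllib.request.urlopen(url, timeout=10) as resp:
                return resp.read().decode("utf-8", errors="replace")

        def tokenize(html: str) -> list[str]:
            """Crude tokenizer: strip tags, lowercase, keep runs of letters."""
            text = re.sub(r"<[^>]+>", " ", html)
            return re.findall(r"[a-z]+", text.lower())

        def build_index(urls: list[str]) -> dict[str, set[str]]:
            """Inverted index mapping each term to the URLs containing it,
            kept on storage you control -- no 3rd party sees crawl or queries."""
            index: dict[str, set[str]] = defaultdict(set)
            for url in urls:
                for term in set(tokenize(fetch(url))):
                    index[term].add(url)
            return index

        def search(index: dict[str, set[str]], query: str) -> set[str]:
            """AND-search: return the URLs containing every query term."""
            hits = [index.get(term, set()) for term in query.lower().split()]
            return set.intersection(*hits) if hits else set()

        if __name__ == "__main__":
            idx = build_index(SEED_URLS)
            print(search(idx, "example domain"))

    Point it at a handful of sites and the "effort" the grandparent talks about comes back on its own: anything outside your chosen crawl simply isn't in your index.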

    It will only get easier to find data in the future. Just as computers were once owned only by large governments and companies, and we now have computers in damn near everything with a battery, search engines and recommendation AI will also become personalized (already happened) and then personally owned (the next phase; ironing out the bugs now). The best way to increase the burden of searching for the records will be to keep the records offline. It's not the data network's fault that the state is publishing data the state thinks shouldn't be available to the data network.

    The real problem is that this is just more growing pains of adapting to a world-wide information network. If you go do something outside in public and someone sees you, you don't have the right to reach into their head and erase that memory... yet. It should be like that online too. I would rather live in a world where we accept that public events (like filing for bankruptcy in a public court) can't be erased from my head. The alternative is that future AIs (and the hackers that love them) create Terminators to fight for their Right to Remember.