
posted by CoolHand on Friday May 15 2015, @01:02PM
from the eraser dept.

Google's Transparency Report reveals that since the Court of Justice of the European Union's ruling of May 13, 2014, which established "the right to be forgotten" for Europeans, Google has received 255,143 requests to remove a total of 925,586 URLs. Google has removed 323,482 of those URLs, a removal rate of 41.3% of the URLs it has evaluated so far.

However, that effort isn't enough for some:

Google is receiving a telling off from the UK's Information Commissioner's Office and may face legal action after failing to adequately respond to several so-called "right to be forgotten" requests. The ICO told The Register that "since the details of the ruling were first announced, we have handled over 183 complaints from those unhappy with Google's response to their takedown request". The ICO estimates that Google has mismanaged individuals' requests to remove their information in a quarter of cases.

The independent UK body set up to uphold information rights also says it will now be looking to resolve the 48 remaining cases "through discussion and negotiation with Google, though we have enforcement powers available to us if required".

In addition, 80 legal experts have written an open letter to Google demanding more data about how the company responds to removal requests:

What We Seek

Aggregate data about how Google is responding to the >250,000 requests to delist links thought to contravene data protection from name search results. We should know if the anecdotal evidence of Google's process is representative: What sort of information typically gets delisted (e.g., personal health) and what sort typically does not (e.g., about a public figure), in what proportions and in what countries?

Why It's Important

Google and other search engines have been enlisted to make decisions about the proper balance between personal privacy and access to information. The vast majority of these decisions face no public scrutiny, though they shape public discourse. What's more, the values at work in this process will/should inform information policy around the world. A fact-free debate about the RTBF is in no one's interest.

 
  • (Score: 0) by Anonymous Coward on Friday May 15 2015, @11:32PM (#183564)

    > Rather than trying to hide information (it wants to be free, remember?) through legislation, use legislation to make the use of non-relevant information illegal.

    Think about what you are proposing and how it would have to play out.

    Who decides what is "non-relevant information"? You want the government to specify a list of everything that people can (or cannot) think about whenever they interact with someone? And if you violate that list, you go to jail?

    All you guys are stuck in a black-and-white, binary-choices geek mindset when the real world is analog. The right to be forgotten is an analog solution to an analog problem. No solution will be perfect, but any counter-proposal has to at least be plausible.

  • (Score: 0) by Anonymous Coward on Monday May 18 2015, @03:00AM (#184288)

    > All you guys are stuck in a black-and-white, binary-choices geek mindset

    How is deciding what information is non-relevant on a case-by-case basis black-and-white? By that logic, this censorship scheme is black-and-white thinking too. The law isn't always black-and-white, you know; sometimes it's quite ambiguous, out of necessity. I'll take that ambiguity over this censorship implementation.