posted by janrinok on Thursday January 16 2020, @07:07PM   Printer-friendly
from the still-want-your-data dept.

What can we rid the world of, thinks Google... Poverty? Disease? Inequality? Yeah, but first: Third-party cookies – and classic user-agent strings:

On Tuesday, Google published an update on its Privacy Sandbox proposal, a plan thoroughly panned last summer as a desperate attempt to redefine privacy in a way that's compatible with the ad slinger's business.

In a blog post, Justin Schuh, director of Chrome engineering, asked the web community for help to increase the privacy of web browsing, something browser makers like Apple and Mozilla have already been doing on their own.

"After initial dialogue with the web community, we are confident that with continued iteration and feedback, privacy-preserving and open-standard mechanisms like the Privacy Sandbox can sustain a healthy, ad-supported web in a way that will render third-party cookies obsolete," wrote Schuh.

"Once these approaches have addressed the needs of users, publishers, and advertisers, and we have developed the tools to mitigate workarounds, we plan to phase out support for third-party cookies in Chrome."

That's a significant shift for a company that relies heavily on cookie data for its ad business. Google Display Network uses third-party cookies to serve behavior-based ads. And Google partners, like publishers that use Google Ad Manager to sell ads, will also be affected.

Over the past few years, Apple, Brave, and Mozilla have taken steps to block third-party cookies by default, and legislators have passed privacy legislation. Meanwhile, ad tech companies have tried to preserve their ability to track people online. Google has resisted third-party cookie blocking and last year began working on a way to preserve its data gathering while also accommodating certain privacy concerns.

Schuh said Google aims to drop third-party cookie support within two years, but added that Google "[needs] the ecosystem to engage on [its] proposals," a plea that makes it sound like the company's initial salvo of would-be web tech specs has been largely ignored.

In a phone interview with The Register, Electronic Frontier Foundation staff technologist Bennett Cyphers said there doesn't appear to have been much community interest in Google's proposals. "When they announced Privacy Sandbox last fall, they threw a bunch of code on GitHub. Those repos don't show much sign of engagement."

Cyphers said he couldn't speak to discussions at the W3C, but said people haven't shown much interest in Google's specs.

Lee Tien, senior staff attorney at the EFF, said in an email that Google is influential with standards bodies like the W3C but that doesn't mean the company will get what it wants by throwing its weight around.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 4, Insightful) by bradley13 on Thursday January 16 2020, @09:12PM (3 children)

    by bradley13 (3053) on Thursday January 16 2020, @09:12PM (#944236) Homepage Journal

    At the risk of asking a stupid question: why do we need a user-agent string at all? Browsers should implement web standards. Websites should not need to program for specific browsers. The user-agent string enables this nuttiness. And if your website is pushing the envelope so hard that it is running into browser bugs, then you need to stop pushing.

    It is possible to spoof the user-agent string. Mine is currently set to be empty, and the web works just fine. I have far more problems from blocking scripts: it's amazing how many sites have no fallback. User agent? Not important for most sites. I'm sure they can still fingerprint me, but there's no reason to make it easy.
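For the curious, here is roughly what sending a blank user-agent looks like in practice: a minimal sketch using Python's standard urllib (the URL is a placeholder, and nothing is sent until `urllib.request.urlopen(req)` is called):

```python
import urllib.request

# Build a request whose User-Agent header is explicitly blank, overriding
# urllib's default "Python-urllib/3.x" identifier. Placeholder URL; the
# request is only constructed here, not sent.
req = urllib.request.Request("https://example.com/")
req.add_header("User-Agent", "")
```

Browsers and extensions that let you edit request headers achieve the same effect; the server simply receives an empty (or absent) User-Agent value.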

    --
    Everyone is somebody else's weirdo.
  • (Score: 5, Interesting) by Anonymous Coward on Thursday January 16 2020, @09:41PM

    by Anonymous Coward on Thursday January 16 2020, @09:41PM (#944249)

    We don't. The standard does not mandate that. I hacked my browser to return (null) when queried through all mechanisms, and mostly nothing happens. Sometimes I get "Your browser is no longer supported" dialog boxes. Sometimes I get stack traces from a null pointer exception when the losers tried to process the user agent. Sometimes I get "Fuck off you filthy hacker" pages (especially via Cloudflare).
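Server-side, the failure modes listed here come from code that assumes the header is always present. A hypothetical Python sketch of both styles (function names and the sniffed token are illustrative, not from any real site):

```python
def naive_browser_check(user_agent):
    # Naive sniffing of the kind the commenter runs into: it assumes the
    # header always exists, so a missing (None) value raises a TypeError,
    # the Python analogue of the null pointer exceptions described above.
    if "Chrome" in user_agent:
        return "supported"
    return "unsupported"

def defensive_browser_check(user_agent):
    # Robust version: a missing header is treated as just another client.
    if user_agent and "Chrome" in user_agent:
        return "supported"
    return "unsupported"
```

The defensive variant costs one truthiness check; sites that skip it are the ones serving stack traces.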

  • (Score: 3, Informative) by fido_dogstoyevsky on Thursday January 16 2020, @10:24PM

    by fido_dogstoyevsky (131) <{axehandle} {at} {gmail.com}> on Thursday January 16 2020, @10:24PM (#944275)

    ...why do we need a user-agent string at all?...

    To poison the collected dataset.

    --
    It's NOT a conspiracy... it's a plot.
  • (Score: 1, Interesting) by Anonymous Coward on Thursday January 16 2020, @11:50PM

    by Anonymous Coward on Thursday January 16 2020, @11:50PM (#944308)

    The original HTTP 1.0 specification lists User-Agent as an optional header. The reasons given for it: "This is for statistical purposes, the tracing of protocol violations, and automated recognition of user agents for the sake of tailoring responses to avoid particular user agent limitations." Other than the tracking it allows, the browser wars made the header useless, as every browser now pretends to be several other browsers to appease servers that no longer exist.
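To see how far that pretending goes, take a real-format Chrome user-agent string (illustrative, Chrome 79 on Linux): naive substring sniffing matches four different browser names in the one string.

```python
# An illustrative Chrome 79 UA string. It name-drops Mozilla, AppleWebKit
# (KHTML/Gecko), Chrome, and Safari at once, each token a fossil of some
# past compatibility workaround.
ua = ("Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 "
      "(KHTML, like Gecko) Chrome/79.0.3945.88 Safari/537.36")

# Naive substring sniffing "detects" four different browsers in one header.
matched = [name for name in ("Mozilla", "AppleWebKit", "Chrome", "Safari")
           if name in ua]
```

Any server trying to tailor responses from this header is reading a document written to deceive it.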