(resub) Google Lets You Opt Out of AI Training

Accepted submission by looorg at 2023-10-02 22:55:37 from the no.free.lunch dept.
News

This is a resub of a previous story that got accidentally piped to /dev/null
------------------------------------------------------------------------------------------

https://blog.google/technology/ai/an-update-on-web-publisher-controls/ [blog.google]

Today we’re announcing Google-Extended, a new control that web publishers can use to manage whether their sites help improve Bard and Vertex AI generative APIs, including future generations of models that power those products. By using Google-Extended to control access to content on a site, a website administrator can choose whether to help these AI models become more accurate and capable over time.

Google-Extended
A standalone product token that web publishers can use to manage whether their sites help improve Bard and Vertex AI generative APIs, including future generations of models that power those products.

User agent token: Google-Extended
Full user agent string: Google-Extended doesn't have a separate HTTP request user agent string. Crawling is done with existing Google user agent strings; the robots.txt user-agent token is used in a control capacity.

User-agent: Google-Extended
Disallow: /
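
For illustration only (this example is mine, not Google's, and assumes standard robots.txt group matching): a publisher who still wants ordinary search indexing can keep a normal Googlebot group alongside the new token, for instance:

User-agent: Googlebot
Allow: /

User-agent: Google-Extended
Disallow: /

Each crawler token obeys only the group that matches it, so regular search crawling continues while the Google-Extended control withholds the content from Bard and Vertex AI training.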

Just as you could previously modify your robots.txt file to keep your site out of Google's web crawling, you are now supposed to be able to opt out of having it used as language-model fodder for their AI. Adding the two lines from Google's example should apparently do the trick. No word on whether other models will adhere to it.
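
As a quick sanity check (my own sketch, not something from the announcement), Python's standard urllib.robotparser uses ordinary robots.txt user-agent matching, so it can illustrate what those two lines do; example.com is just a placeholder:

import urllib.robotparser

# Google's suggested opt-out rules, as quoted above.
rules = """\
User-agent: Google-Extended
Disallow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Ordinary search crawling is not addressed by this group...
print(parser.can_fetch("Googlebot", "https://example.com/article"))        # True
# ...but anything presenting the Google-Extended token is refused site-wide.
print(parser.can_fetch("Google-Extended", "https://example.com/article"))  # False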

There is also no word on whether they will remove content already gathered (don't bet on it), or whether opting out will be punished somehow in the future. The current claim is that it will not affect ranking in the search engine, but that could always change.

I like how they phrase it as you "help these AI models", so by opting out you are being unhelpful and not nice. You do the helping, and they reap all the rewards of your work. Sounds like a good deal .. right!?


Original Submission