

Google and Microsoft's Chatbots Are Already Citing One Another

Accepted submission by hubie at 2023-03-28 02:01:21 from the I heard it from a guy who knows a guy... dept.
News

It's not a good sign for the future of online misinformation [theverge.com]:

If you don't believe the rushed launch of AI chatbots by Big Tech has an extremely strong chance of degrading the web's information ecosystem, consider the following:

Right now,* if you ask Microsoft's Bing chatbot if Google's Bard chatbot has been shut down, it says yes, citing as evidence a news article [windowscentral.com] that discusses a tweet [twitter.com] in which a user asked Bard when it would be shut down and Bard said it already had, itself citing a comment [ycombinator.com] from Hacker News in which someone joked about this happening, and someone else used ChatGPT to write fake news coverage about the event.

(*I say "right now" because in the time between starting and finishing writing this story, Bing changed its answer and now correctly replies that Bard is still live. You can interpret this as showing that these systems are, at least, fixable or that they are so infinitely malleable that it's impossible to even consistently report their mistakes.)

What we have here is an early sign we're stumbling into a massive game of AI misinformation telephone, in which chatbots are unable to gauge reliable news sources, misread stories about themselves, and misreport on their own capabilities. In this case, the whole thing started because of a single joke comment on Hacker News. Imagine what you could do if you wanted these systems to fail.

It's a laughable situation but one with potentially serious consequences. Given the inability of AI language models to reliably sort fact from fiction, their launch online threatens to unleash a rotten trail of misinformation and mistrust across the web, a miasma that is impossible to map completely or debunk authoritatively. All because Microsoft, Google, and OpenAI have decided that market share is more important than safety.


Original Submission