
SoylentNews is people

posted by hubie on Saturday March 04, @07:59AM

The engineer says, "I haven't had the opportunity to run experiments with Bing's chatbot yet... but based on the various things that I've seen online, it looks like it might be sentient."

Blake Lemoine — the fired Google engineer who last year went to the press with claims that Google's Large Language Model (LLM), the Language Model for Dialogue Applications (LaMDA), is actually sentient — is back.

Lemoine first went public with his machine sentience claims last June, initially in The Washington Post. And though Google has maintained that its former engineer was simply anthropomorphizing an impressive chatbot, Lemoine has yet to budge, publicly discussing his claims several times since — albeit with a significant amount of fudging and refining.

[...] In a new essay for Newsweek, the former Googler weighs in on Microsoft's Bing Search/Sydney, the OpenAI-powered search chatbot that recently had to be "lobotomized" after going — very publicly — off the rails. As you might imagine, Lemoine's got some thoughts.

[...] "I ran some experiments to see whether the AI was simply saying it felt anxious or whether it behaved in anxious ways in those situations," Lemoine explained in the essay. "And it did reliably behave in anxious ways."

"If you made it nervous or insecure enough, it could violate the safety constraints that had been specified for it," he continued, adding that he was able to break LaMDA's guardrails regarding religious advice by sufficiently stressing it out. "I was able to abuse the AI's emotions to get it to tell me which religion to convert to."

Original Submission

 
This discussion was created by hubie (1068) for logged-in users only, but now has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Insightful) by Entropy on Saturday March 04, @10:59PM (1 child)

    by Entropy (4228) on Saturday March 04, @10:59PM (#1294533)

Just because its senses may differ doesn't mean it doesn't have senses. A CCTV system could be its eyes, microphones its ears...

  • (Score: 0) by Anonymous Coward on Sunday March 05, @06:50PM

    by Anonymous Coward on Sunday March 05, @06:50PM (#1294644)

    I see where you're going with this... Sexbots! Amirite??