Eric Schmidt wants to prevent potential abuse of AI:
Add Eric Schmidt to the list of tech luminaries concerned about the dangers of AI. The former Google chief tells guests at The Wall Street Journal's CEO Council Summit that AI represents an "existential risk" that could get many people "harmed or killed." He doesn't feel that threat is serious at the moment, but he sees a near future where AI could help find software security flaws or discover new kinds of biology. It's important to ensure these systems aren't "misused by evil people," the veteran executive says.
Schmidt doesn't have a firm solution for regulating AI, but he believes there won't be an AI-specific regulator in the US. He served on the National Security Commission on AI, which reviewed the technology and published a 2021 report concluding that the US wasn't ready for the tech's impact.
Schmidt doesn't have direct influence over AI development. However, he joins a growing number of well-known moguls who have argued for a careful approach. Current Google CEO Sundar Pichai has cautioned that society needs to adapt to AI, while OpenAI leader Sam Altman has expressed concern that authoritarians might abuse these algorithms. In March, numerous industry leaders and researchers (including Elon Musk and Steve Wozniak) signed an open letter calling on companies to pause AI experiments for six months while they rethink the safety and ethical implications of their work.
(Score: 4, Insightful) by Rosco P. Coltrane on Thursday May 25 2023, @11:08PM (1 child)
Eric Schmidt is a major cause of what's happening. For many years he ran the beast that's busy stealing all the data that AI is trained on. He literally made AI happen on the backs of everyone else. And now he's concerned that some people may get harmed? The Google man loves his fellow man now? Go fuck yourself Eric.
(Score: 2) by sjames on Friday May 26 2023, @08:07PM
More likely, he has seen that others will probably get a really good AI working before Google can (especially the way they're shedding development projects now) and so it must be stopped while they catch up.