Geoffrey Hinton, a computer scientist who has been called "the godfather of artificial intelligence", says it is "not inconceivable" that AI may develop to the point where it poses a threat to humanity:
The computer scientist sat down with CBS News this week to discuss his predictions for the advancement of AI. He compared the invention of AI to that of electricity or the wheel.
Hinton, who works at Google and the University of Toronto, said that the development of general purpose AI is progressing faster than people may imagine. General purpose AI is artificial intelligence with several intended and unintended purposes, including speech recognition, answering questions and translation.
"Until quite recently, I thought it was going to be like 20 to 50 years before we have general purpose AI. And now I think it may be 20 years or less," Hinton predicted. Asked specifically about the chances of AI "wiping out humanity," Hinton said, "I think it's not inconceivable. That's all I'll say."
[...] Hinton said it was plausible for computers to eventually gain the ability to create ideas to improve themselves.
Also at CBS News. Originally spotted on The Eponymous Pickle.
Previously: OpenAI's New ChatGPT Bot: 10 "Dangerous" Things it's Capable of
(Score: 3, Interesting) by Beryllium Sphere (r) on Thursday March 30 2023, @05:45AM (7 children)
The shooter had attended that school, so I doubt it was random, but there are plenty of examples of pure hate out there.
There are lone nutbags who might get past the safeguards (but then, the Britannica has bomb-making instructions, IIRC). I could imagine large-scale actors doing damaging things, like creating a propaganda LLM that hooked people's attention with entertainment.
And if they work as well at designing DNA sequences as they do at writing code, what happens when a biowarfare lab gets one?
(Score: 1) by khallow on Thursday March 30 2023, @06:17AM (3 children)
It might even be worth what the large-scale actor sinks into the exercise. Massive ad campaigns exist, so they must have some beneficial effect. But it's easy for multiple large-scale actors to work at cross purposes.
Not much, unless they get significantly better at writing code.
(Score: 0) by Anonymous Coward on Thursday March 30 2023, @05:52PM (2 children)
Clippy exists too. Jeez, is there any logical fallacy you don't use in your arguments?
(Score: 1) by khallow on Thursday March 30 2023, @06:30PM
If just one clippy exists, then it's likely a mistake. If a thousand clippies exist and more are coming out all the time (like the situation with massive ad campaigns), then we have to consider the question: why would they keep making them?
(Score: 1) by khallow on Friday March 31 2023, @05:08PM
My take is that the Large Language Model (LLM) approach just isn't going to be damaging, because if it has any advantage at all, then there will be a lot of actors using it due to the low barrier to entry, not just one hypothetical bad guy. And they're competing with existing ads and propaganda, which aren't going to be much different in effect. It's a sea of noise.
The real power will be in isolating people. That's how cults work. They're not just misinformation, but systems for isolating their targets from rival sources and knowledge.
For example, the scheme of controlling search results would be a means to isolate. So would polluting public spaces and then luring people into walled gardens where the flow of information can be tightly controlled. But I doubt any of these schemes will be as effective as physical isolation.
(Score: 2) by EJ on Thursday March 30 2023, @06:27AM (1 child)
I don't mean that particular school was random. I mean it looks like the decision to attack the school was semi-random, picked from a list of other possible targets. (S)he didn't appear to have any specific reason for any of the targets chosen at the school.
It looks like they wanted to lash out and just chose the school as the way to do it.
(Score: 2) by tangomargarine on Thursday March 30 2023, @02:37PM
I would guess that an elementary school would be the target you'd choose for the biggest headlines in the news. Other than maybe a maternity ward?
Or maybe it was semi-subconscious, since we've been hearing about a school shooting every week or two for like the last 5 years.
"Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
(Score: 0) by Anonymous Coward on Thursday March 30 2023, @09:05AM
But yeah, customized pandemic viruses made by some cultist group or similar could cause big problems.