
posted by Fnord666 on Friday November 15 2019, @12:26PM   Printer-friendly
from the skynet-anyone? dept.

John Carmack Sets Out To Create General AI

John Carmack, programmer extraordinaire and developer of seminal titles like "Doom" and "Quake", has said "Hasta La Vista" to his colleagues at Oculus to set out for a new challenge. In a Facebook post (https://www.facebook.com/100006735798590/posts/2547632585471243/) he declares that he is going to work on artificial general intelligence.

What are the chances he can pull it off, and what could go wrong?
 

John Carmack Steps Down at Oculus to Pursue AI Passion Project `Before I get too old':

Legendary coder John Carmack is leaving Facebook's Oculus after six years to focus on a personal project — no less than the creation of Artificial General Intelligence, or "Strong AI." He'll remain attached to the company in a "Consulting CTO" position, but will be spending all his time working on, perhaps, the AI that finally surpasses and destroys humanity.

AGI or strong AI is the concept of an AI that learns much the way humans do, and as such is not as limited as the extremely narrow machine learning algorithms we refer to as AI today. AGI is the science fiction version of AI — HAL 9000, Replicants and, of course, the Terminator. There are some good ones out there, too — Data and R2-D2, for instance.

[...] Carmack announced the move on Facebook, where he explained that the uncertainty about such a fascinating and exciting topic is exactly what attracted him to it:

When I think back over everything I have done across games, aerospace, and VR, I have always felt that I had at least a vague "line of sight" to the solutions, even if they were unconventional or unproven. I have sometimes wondered how I would fare with a problem where the solution really isn't in sight. I decided that I should give it a try before I get too old.

Skynet? Singularity? With great power comes great responsibility. Can he do it? Should he?


Original Submission #1 | Original Submission #2

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Interesting) by VLM on Friday November 15 2019, @12:45PM (4 children)

    by VLM (445) on Friday November 15 2019, @12:45PM (#920668)

    Can he do it?

    Probably not.

    The rep I've heard of his coding style is that it's detail-oriented and wisely/highly optimized, his bug-fixing/optimization loop is fast, and he dogfoods really well, so his stuff is actually enjoyable, works, and users love it. I've not studied his code in detail, however.

    I mean, anyone could make something like "Doom" in 2019, using multiple-GHz cores, a huge graphics card, and a billion lines of (crappy) code from a team of hundreds of low-productivity people, like modern studios do every day; he shipped it successfully in '93.

    That's not really where AI is today, is it? Where something cool works in a large research lab, there's an obvious application, but it hasn't been productized and optimized to run on everyone's desks quite yet, and adequate hardware has JUST arrived that will JUST barely work?

    So if his secret-sauce special skill is of no use, he's just kind of an average member of management at the new place?

    I mean, if he were going to code on a project to make the world's most popular and addictive new game genre that runs on low-power, rarely charged wrist-mounted fitness trackers, I'd believe that and expect success from the guy. But the current plan is unlikely to succeed.

    Starting Score:    1  point
    Moderation   +2  
       Interesting=2, Total=2
    Extra 'Interesting' Modifier   0  
    Karma-Bonus Modifier   +1  

    Total Score:   4  
  • (Score: 1, Interesting) by Anonymous Coward on Friday November 15 2019, @12:59PM (2 children)

    by Anonymous Coward on Friday November 15 2019, @12:59PM (#920676)

    The more people and groups working on artificial general intelligence, the better.

    Artificial general intelligence has probably been created already. It's just languishing under military control.

    • (Score: 4, Interesting) by Anonymous Coward on Friday November 15 2019, @06:51PM (1 child)

      by Anonymous Coward on Friday November 15 2019, @06:51PM (#920759)

      I used to take a not dissimilar view. Then I ended up working with neural network based systems for a couple of years. Now I think anything vaguely resembling intelligence is probably impossible with current techniques, and we're most likely on our way to another AI winter once full self driving vehicles prove impossible.

      Why? Pretty simple. "AI" is driven by correlations. And it can pick up on some remarkable correlations that humans are not capable of. You can get from 0-90% super easily. And it looks like you're going to have created Data within a decade.

      Getting from 90-99% is a lot harder but still really quite doable. And at this point, your own results start to feel magical. As a silly example, I was able to give an arbitrary pattern to my network and it could generally pick the next digits. Of course it was just picking up on patterns and correlations, but it really felt genuinely intelligent, like we might feel the first time we beam 2 3 5 out into space and get back 7 11 13. As an aside, no, it could not do primes.

      In any case, now you're damned near positive you're going to be able to create Data. Then you start pushing for 99.9%. And things start getting really, really hard. And by the time you start pushing for 99.99%, it becomes increasingly clear that you're headed hard and fast for some asymptote.
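      The digit-pattern anecdote can be sketched with a toy stand-in (purely hypothetical; the commenter's actual network and data are unknown): a tiny least-squares autoregressor latches onto a simple linear pattern perfectly, yet misses badly on the primes, which follow no such recurrence.

```python
# Toy illustration (not the commenter's actual system): fit an order-2
# linear autoregressor x[n] ~ a*x[n-1] + b*x[n-2] by least squares,
# then use it to "predict" the next term of a sequence.

def fit_ar2(seq):
    """Solve the 2x2 normal equations for the least-squares fit."""
    rows = [(seq[n - 1], seq[n - 2], seq[n]) for n in range(2, len(seq))]
    s11 = sum(f1 * f1 for f1, _, _ in rows)
    s12 = sum(f1 * f2 for f1, f2, _ in rows)
    s22 = sum(f2 * f2 for _, f2, _ in rows)
    t1 = sum(f1 * y for f1, _, y in rows)
    t2 = sum(f2 * y for _, f2, y in rows)
    det = s11 * s22 - s12 * s12
    a = (t1 * s22 - t2 * s12) / det
    b = (s11 * t2 - s12 * t1) / det
    return a, b

def predict_next(seq):
    a, b = fit_ar2(seq)
    return a * seq[-1] + b * seq[-2]

odds = [1, 3, 5, 7, 9, 11]     # exact linear recurrence: x[n] = 2x[n-1] - x[n-2]
primes = [2, 3, 5, 7, 11, 13]  # no linear recurrence to latch onto

print(predict_next(odds))    # 13.0 (correct)
print(predict_next(primes))  # ~25.7, nowhere near 17
```

      The model looks "intelligent" exactly as long as a correlation of the form it can represent actually exists; on the primes the same machinery confidently extrapolates garbage.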

      You can still do some cool things below that asymptote. For instance, I worked on financial tech stuff, and it performed far better than humans. But if you want to apply this to a field like driving, let alone some generalized field? No, it's simply not going to work.

      If we achieve anything like self-driving, I imagine we're going to have extensive 'hand coded' LIDAR (and probably also RADAR) systems constantly sanity-testing everything. A fintech system can afford some head-scratching decisions so long as the general outcome outperforms humans. A car can't handle you occasionally deciding to t-bone a concrete wall, even if the other 99.99% of the time you drive like a super-human.

      I think the most probable outcome is us ditching automation altogether, since even 'driver assistance' is probably going to do more harm than good since the driver is going to zone out. But if we do keep self-driving vehicles, I expect to see these systems with extensive 'hand coded' logic driving only on white-listed paths which are further hand-tuned, just like Waymo seems to be doing. And the final result may look like AI, but I think the emphasis is very much going to be on the "A" part there.

  • (Score: 2, Interesting) by Anonymous Coward on Friday November 15 2019, @06:15PM

    by Anonymous Coward on Friday November 15 2019, @06:15PM (#920741)

    In my opinion, the secret sauce of individuals like Carmack is not any particular thing, but simply the overall skill set and brain that enable him to do what he does. For instance, if Carmack had chosen to pursue, say, cosmological research instead of software development, I would generally expect he would have managed to excel there as well.

    In many ways, I find that many completely different tasks really boil down to the same thing at some level. It's simply assimilating information, obtaining a sufficiently intuitive understanding of it in a logical and clear fashion, and then applying a good dose of creativity and cleverness to apply it in novel ways. I used to be much more a fan of tabula rasa, but I find real-life experience tends to leave less and less room to believe in such things as the years pass.

    So of course none of this means he'll succeed or have any impact whatsoever. But I do think it means he has a vastly higher chance of that happening than, e.g., your average grad student focused on AI.