

posted by Blackmoore on Friday January 23 2015, @03:45AM   Printer-friendly
from the dreaming-of-electric-sheep? dept.

Physicists, philosophers, professors, authors, cognitive scientists, and many others have weighed in on edge.org's 2015 annual question: What do you think about machines that think? See all 186 responses here.

Also, what do you think?

My 2¢: There's been a lot of focus on potential disasters that are almost certainly not going to happen, e.g. a robot uprising, or mass poverty through unemployment. Most manufacturers of artificial intelligence won't program their machines to seek self-preservation at the expense of their human masters; it wouldn't sell. Secondly, if robots can one day produce almost everything we need, including more robots, with almost no human labour required, then robot-powered factories will become like libraries: relatively cheap to maintain, plentiful, and with a public one set up in every town or suburb for anyone to use. If you think the big corporations wouldn't allow it, why do they allow public libraries?

 
  • (Score: 0) by Anonymous Coward on Friday January 23 2015, @01:37PM (#137229)

    Ramez Naam (guest blogging on Charlie Stross's blog) has some good thoughts on the topic in The Singularity Is Further Than It Appears [antipope.org] and the following few blog posts. That post makes a lot of the same points you do.

    Mainly because no one (among AI scientists) can even define what intelligence is, and they're far too pretentious to admit that they don't really know what they're looking for.

    I wouldn't be too harsh on AI scientists: work on Artificial General Intelligence (AGI/"strong" AI) is essentially taboo among AI researchers. There's no serious research on it.

  • (Score: 2) by mtrycz (60) on Friday January 23 2015, @10:25PM (#137453)

    Hey thanks, it looks interesting.

    About the strong AI issue: you're telling me that the people worshipping The Singularity aren't actually into AI? I hadn't checked that out yet.

    --
    In capitalist America, ads view YOU!
    • (Score: 0) by Anonymous Coward on Saturday January 24 2015, @01:29AM (#137507)

      (I'm the GP.)

      Oh, obviously there's plenty of people around worshipping The Singularity, but they seem to be almost entirely disjoint from the group of people doing research in academia and industry who call their research "AI". (Note: I'm a CS graduate student at a top US university; many of my colleagues would call themselves AI researchers, and my own research (program synthesis) is arguably AI but isn't called that, for historical reasons.)

      Modern AI research is primarily "machine learning", which is about automatically or semi-automatically finding patterns in large datasets that are too complicated for a human to write down (e.g. handwriting recognition is about identifying what it is that makes all of the As similar to one another). It's probably best thought of as a programming technique for when you don't really know how to write a program for what you want to do, but you do have a lot of examples of what it should do. Any attempt to deal with semantics or intelligence is considered a failed dead end, and techniques that just look for patterns without any concept of "understanding" them are greatly preferred.
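
      As a rough illustration of that "lots of examples instead of a program" idea, here is a minimal sketch; it assumes Python with scikit-learn and its bundled handwritten-digit dataset, and the classifier and its settings are only illustrative choices, not anything specific from this thread:

      from sklearn.datasets import load_digits
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC

      # ~1800 labelled 8x8 images of handwritten digits (0-9).
      digits = load_digits()
      X_train, X_test, y_train, y_test = train_test_split(
          digits.data, digits.target, test_size=0.25, random_state=0)

      # Nobody writes down rules for what makes a "7" look like a 7;
      # the classifier infers the pattern from the labelled examples.
      clf = SVC(gamma=0.001)
      clf.fit(X_train, y_train)

      # Check how well the learned pattern generalises to unseen examples.
      print("held-out accuracy:", clf.score(X_test, y_test))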

      Not that belief in the Singularity is entirely unheard of in academia (I heard it from a prospective student once, so an undergrad), but it is laughed at as absurd.

      • (Score: 2) by mtrycz (60) on Saturday January 24 2015, @09:55AM (#137590)

        Hey great!

        Yeah, I'm somewhat proficient in AI techniques (optimization, machine learning, and some natural language processing); I just thought/assumed that the Singularity worshippers were people who actually have an understanding of the topic and are actually into the research. I mean, when I hear Hawking or Musk rambling, I'd assume they know what they're talking about.

        Thanks for clarifying that, I feel much better now. Someone should point that out to the waitbutwhy guy, too.

        --
        In capitalist America, ads view YOU!
        • (Score: 2) by maxwell demon (1608) on Saturday January 24 2015, @10:48AM (#137597)

          If you hear Hawking ramble about physics, you can assume he knows what he's talking about. But AI is certainly not a physics subject, so there's no reason to assume he knows more about it than you or me. Similarly, I'd trust Musk to know something about business, but I see no reason to assume he has deeper knowledge about AI.

          --
          The Tao of math: The numbers you can count are not the real numbers.