
posted by martyb on Friday March 08 2019, @05:46AM   Printer-friendly
from the this-is-xin-xin-xin-hua-news dept.

China's state news outlet Xinhua has debuted "AI anchors"

The agency created these "AI anchors" from digital composites of human anchors and gave them synthesized voices to deliver the news. Beyond those details, the agency has kept the technology used to bring this anchorman to life under wraps.

Checking out the animation, the speech is surprisingly good, far better than the terrible robo-voices on some video channels. The British-accented, English-speaking video anchor proudly claims:

I will work tirelessly to keep you informed as texts will be typed into my system uninterrupted

And, this being China, there's the capitalism angle:

One of the best parts of having an AI anchor? "He" doesn't sleep, said the press agency.

"According to Xinhua, "he" has become a member of its reporting team and can work 24 hours a day on its official website and various social media platforms, reducing news production costs and improving efficiency," the Xinhua media team noted.

An interesting advantage in the land of 1.4 billion people.

We'll just have to see how these guys compare with good ol' Max.


Original Submission

 
  • (Score: 2) by HiThere on Friday March 08 2019, @05:17PM (3 children)

    by HiThere (866) Subscriber Badge on Friday March 08 2019, @05:17PM (#811601) Journal

    You *ought* to be frightened of "amorality+hyper-competence", because you're framing the problem wrong. The actual problem is a morality that is incompatible with human survival, plus faster-than-human reaction time. We can't even imagine what hyper-competence would be like, so that's something nobody has figured in. And that "morality" might well be something that those ordering the system think they want, like relentlessly following orders. People always build a large "you know what I mean" and "you'll be sensible about interpreting this" into the rules they make, and can't even envision what actually obeying those rules would result in.

    --
    Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
  • (Score: 2) by ikanreed on Friday March 08 2019, @05:28PM (2 children)

    by ikanreed (3164) Subscriber Badge on Friday March 08 2019, @05:28PM (#811614) Journal

    No really, being a computer does not lend a magic cognitive advantage that lets a machine circumvent the material realities it's working within.

    It's meaningless sci-fi narrative injected to make nerds feel smart.

    I'm not afraid of a machine carrying the 1, realizing that nuking Manhattan slightly optimizes its mineral extraction rates, and developing the entire cognitive skillset for hacking, warfare, deceiving gatekeepers, and dealing with countermeasures all inside its head, without ever actively learning or practicing those skills in a real-world environment. It's a terrible understanding of cognition and skillsets, whether natural or artificial.

    Besides which, this magic foreseeing of all outcomes of its possible choices is profoundly dumb too, on a fundamental philosophical level. No system, no matter how intelligent, is capable of that, because intelligence isn't some magic power where more = more power. Everything about the scenario is dumb, and I don't care how many Yud-loving morons I've got to call dumb-asses to say so.

    I am worried about some rich fuck buying a drone and bombing the house of a South American union organizer, purposefully and malevolently.

    • (Score: 2) by MichaelDavidCrawford on Saturday March 09 2019, @12:24PM

      by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Saturday March 09 2019, @12:24PM (#812000) Homepage Journal

      Recall the recent terrorist incident in Kashmir in which two or three dozen troops were murdered by some manner of fundamentalist cell.

      Kashmir is _profoundly_ disputed by India and Pakistan; they have fought three wars over it. China wants it too, though that never makes the news.

      Back in the day, for its first Test, Pakistan tested something like four or five Bombs in just one day.

      While it tested its "Heavy" - what the Manhattan Project at first called the Hydrogen Bomb - in 1975, India followed suit by testing something like four or five Bombs in just one day as well.

      What could this _possibly_ have to do with Artificial Intelligence, you quite reasonably protest?

      Neural Networks are not composed of Neurons; rather, they are weighted averagers. That only recently made sense to me: doubtless, more-complex Artificial Neurons would lead to training failures due to the development of instabilities, training itself into a blind alley, logical firestorms and the like.
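      For what it's worth, the "weighted averager" view of an Artificial Neuron can be sketched in a few lines. This is purely illustrative - the function name and the numbers are made up, not from any particular framework:

```python
import math

def neuron(inputs, weights, bias=0.0):
    # A "weighted averager": the weighted sum of the inputs plus a bias,
    # squashed through a sigmoid activation into the range (0, 1).
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Three inputs, three weights: the output is just a squashed weighted sum.
out = neuron([0.5, -1.0, 2.0], [0.4, 0.1, 0.25])
```

      Stack enough of these in layers and you have the "methodically connected" networks in common use today; nothing in the unit itself resembles a biological Neuron.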

      HOWEVER.

      Human Neurons employ more than ONE HUNDRED THIRTY NEUROTRANSMITTERS, not just the well-known Serotonin, Norepinephrine (also called Noradrenaline) and Dopamine. That Nitrous Oxide is a Neurotransmitter is what makes us laugh; in my own case, excess Norepinephrine makes _me_ laugh, due to Bipolar Mania.

      Recent research has led to some understanding of Neural Networks with - similarly - more than one hundred thirty distinctly different varieties of input and output signaling, but it hasn't made the press otherwise, so there's likely been little progress toward real-world application.

      What's more, the AIs we've got today are largely two-dimensional and methodically connected, whether that 2-D structure is purely virtual in code or physical, as with neurons-on-a-chip.

      Now let's take your 130-neurotransmitter board and make it a mother fucking cube.

      Now connect it _randomly_.
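      A randomly wired cube of units - as opposed to the layered, methodically connected arrangements in common use - might be sketched like so. Again purely illustrative; the cube size and fan-out are arbitrary choices:

```python
import random

random.seed(42)  # reproducible wiring for the sketch
SIZE = 4  # a 4x4x4 "cube" of units, addressed by (x, y, z) coordinates
units = [(x, y, z) for x in range(SIZE)
                   for y in range(SIZE)
                   for z in range(SIZE)]

# Wire each unit to five randomly chosen targets with random weights,
# rather than to a fixed next layer as in a conventional feed-forward net.
connections = {
    u: [(random.choice(units), random.uniform(-1.0, 1.0)) for _ in range(5)]
    for u in units
}
```
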

      IT GETS BETTER!

      _Real_ human neurons let their tendrils do the walking. That's a good chunk of how we learn: through a phenomenon known as Brain Plasticity, the very fine fibers that carry neurotransmitter emitters and receptors at their ends come and go, in vast quantities, in ways that are far beyond our present ability to understand experimentally.

      Not enough?

      Neurotransmitter receptors and emitters come and go _too_.

      Get back to me when you're not afraid of _me_ in an artificial intelligence lab, you Ignorant Mother Fucker [warplife.com]

      --
      Yes I Have No Bananas. [gofundme.com]
    • (Score: 2) by MichaelDavidCrawford on Saturday March 09 2019, @12:37PM

      by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Saturday March 09 2019, @12:37PM (#812002) Homepage Journal

      LOL: Robotic Guns.

      --
      Yes I Have No Bananas. [gofundme.com]