
posted by martyb on Tuesday February 19 2019, @01:59PM
from the build-a-[fire]-wall! dept.

Sex robots could be hijacked by hackers and used to cause harm or even kill people, a cybersecurity expert has warned.

Artificial intelligence researchers have consistently warned of the security risks posed by internet-connected robots, with hundreds recently calling on governments to ban weaponized robots.

The latest warning comes from a cybersecurity expert speaking to several U.K. newspapers.

“Hackers can hack into a robot or a robotic device and have full control of the connections, arms, legs and other attached tools like in some cases knives or welding devices,” Nicholas Patterson, a cybersecurity lecturer at Deakin University in Melbourne, Australia, told the Star.

“Often these robots can be upwards of 200 pounds and very strong. Once a robot is hacked, the hacker has full control and can issue instructions to the robot. The last thing you want is for a hacker to have control over one of these robots. Once hacked they could absolutely be used to perform physical actions for an advantageous scenario or to cause damage.”

https://www.newsweek.com/hacked-sex-robots-could-murder-people-767386

[Yes, the story is "clickbait-y", but the underlying point remains: remote access to IoT (Internet of Things) devices could wreak havoc. Do any Soylentils have IoT devices, and if so, what, if anything, have you done to protect them from undesired monitoring or tampering? --Ed.]
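For anyone wanting a starting point, a quick audit of what is listening on your LAN needs nothing beyond the Python standard library. This is a minimal sketch, not a substitute for a proper scanner like nmap; the subnet 192.168.1.0/24 and the port list are assumptions to adjust for your own network:

    import socket
    from ipaddress import ip_network

    SUBNET = "192.168.1.0/24"          # assumption: change to your own LAN
    PORTS = [23, 80, 443, 554, 8080]   # telnet, HTTP, HTTPS, RTSP, alt HTTP

    def open_ports(host, ports, timeout=0.3):
        """Return the subset of `ports` that accept a TCP connection on `host`."""
        found = []
        for port in ports:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.settimeout(timeout)
                if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                    found.append(port)
        return found

    # Walk the subnet sequentially (slow but simple) and report anything listening.
    for host in ip_network(SUBNET).hosts():
        hits = open_ports(str(host), PORTS)
        if hits:
            print(f"{host}: listening on {hits}")

Anything that answers on telnet (23) or an unauthenticated web port is a candidate for a firmware update, a password change, or quarantine on its own firewalled network segment.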


Original Submission

 
  • (Score: 2) by DannyB on Tuesday February 19 2019, @03:25PM (3 children)

    by DannyB (5839) Subscriber Badge on Tuesday February 19 2019, @03:25PM (#803485) Journal

    Has anyone tried asking any of these personal assistants for dirty talk? (e.g. Alexa, Google, Siri, Cortana, etc.)

    I still think it would be more fun to get them into an argument or at least talking to each other.

    --
    To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
  • (Score: 2) by takyon on Tuesday February 19 2019, @03:44PM

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday February 19 2019, @03:44PM (#803496) Journal

    Obviously people have done it. I can't find any examples on YouTube or the web because of clickbait and poisoned searches, but I think Alexa gives a "That's rude" type of response.

    https://soylentnews.org/article.pl?sid=18/07/28/0147253 [soylentnews.org]

    You could also cheat by adding a user-created "skill" that overrides default responses, making Alexa say whatever you want. These things parrot back from a script, or in some cases grab semi-random content from around the web [soylentnews.org].
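    To sketch what that looks like: a custom skill's backend is just an endpoint (commonly an AWS Lambda handler) that returns a small JSON envelope in the Alexa Skills Kit response format. The canned line and the simplified intent handling below are illustrative assumptions, not a complete skill:

        # Minimal sketch of a custom Alexa skill backend (AWS Lambda handler).
        # Returns the same scripted line for any intent the skill receives.
        def lambda_handler(event, context):
            if event["request"]["type"] == "IntentRequest":
                text = "Whatever line you put in the script."
            else:  # LaunchRequest and anything else
                text = "Okay."
            return {
                "version": "1.0",
                "response": {
                    "outputSpeech": {"type": "PlainText", "text": text},
                    "shouldEndSession": True,
                },
            }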

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by RS3 on Tuesday February 19 2019, @03:49PM

    by RS3 (6367) on Tuesday February 19 2019, @03:49PM (#803501)

    I'm learning that anything I think of has A) been thought of before, and B) a YouTube video from years ago proving it:

    https://www.youtube.com/watch?v=KVroizhqwdk [youtube.com]

    Not sure how legit this is tho...

  • (Score: 2) by NotSanguine on Tuesday February 19 2019, @11:20PM

    by NotSanguine (285) <{NotSanguine} {at} {SoylentNews.Org}> on Tuesday February 19 2019, @11:20PM (#803751) Homepage Journal

    Has anyone tried asking any of these personal assistants for dirty talk?

    Yep. I did so a while back at my brother's house (after attempting to order two tons of creamed corn [xkcd.com] -- Thank you Randall Munroe!). Alexa was deliberately obtuse when I suggested various sex acts, and downright dumb when insulting her. More's the pity.

    --
    No, no, you're not thinking; you're just being logical. --Niels Bohr