Sex robots could be hijacked by hackers and used to cause harm or even kill people, a cybersecurity expert has warned.
Artificial intelligence researchers have consistently warned of the security risks posed by internet-connected robots, with hundreds recently calling on governments to ban weaponized robots.
The latest warning comes from a cybersecurity expert speaking to several U.K. newspapers.
“Hackers can hack into a robot or a robotic device and have full control of the connections, arms, legs and other attached tools like in some cases knives or welding devices,” Nicholas Patterson, a cybersecurity lecturer at Deakin University in Melbourne, Australia, told the Star.
“Often these robots can be upwards of 200 pounds and very strong. Once a robot is hacked, the hacker has full control and can issue instructions to the robot. The last thing you want is for a hacker to have control over one of these robots. Once hacked they could absolutely be used to perform physical actions for an advantageous scenario or to cause damage.”
https://www.newsweek.com/hacked-sex-robots-could-murder-people-767386
[Yes, the story is "clickbait-y", but the underlying point still remains that remote access to IoT (Internet of Things) devices could wreak havoc. Do any Soylentils have IoT devices and what, if anything, have you done to provide protection from undesired monitoring or tampering? --Ed.]
(Score: 2) by DannyB on Tuesday February 19 2019, @03:25PM (3 children)
Has anyone tried asking any of these personal assistants for dirty talk? (eg Alexa, Google, Siri, Cortana, etc)
I still think it would be more fun to get them into an argument or at least talking to each other.
Fact: We get heavier as we age due to more information in our heads. When no more will fit it accumulates as fat.
(Score: 2) by takyon on Tuesday February 19 2019, @03:44PM
Obviously people have done it. I can't find any examples on YouTube or the web because clickbait has poisoned the searches, but I think Alexa responds with a "That's rude" type of response.
https://soylentnews.org/article.pl?sid=18/07/28/0147253 [soylentnews.org]
You could also cheat by adding a user-created "skill" that overrides default responses, making Alexa say whatever you want. These things parrot back from a script, or in some cases grab semi-random content from around the web [soylentnews.org].
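For the curious, a custom skill's handler really is just scripted request routing. Below is a minimal sketch of that idea, assuming the standard Alexa skill request/response JSON envelope; the intent name "EchoIntent" and slot name "phrase" are hypothetical examples, not a real published skill.

```python
def build_response(speech_text):
    """Wrap plain text in the Alexa skill response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": True,
        },
    }

def handle_request(event):
    """Route an incoming skill request to a canned reply."""
    request = event.get("request", {})
    if request.get("type") == "IntentRequest":
        intent = request.get("intent", {})
        if intent.get("name") == "EchoIntent":
            # Parrot back whatever slot value the user supplied,
            # overriding any default response.
            phrase = intent.get("slots", {}).get("phrase", {}).get("value")
            if phrase:
                return build_response(phrase)
    # Fallback: the canned brush-off mentioned above.
    return build_response("That's rude.")
```

The point being: the "personality" lives entirely in whichever handler the skill developer wired up, so making the assistant say anything is a matter of editing a lookup, not any real intelligence.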
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 2) by RS3 on Tuesday February 19 2019, @03:49PM
I'm learning that anything I think of has A) been thought of before, and B) been demonstrated in a YouTube video from years ago:
https://www.youtube.com/watch?v=KVroizhqwdk [youtube.com]
Not sure how legit this is tho...
(Score: 2) by NotSanguine on Tuesday February 19 2019, @11:20PM
Yep. I did so a while back at my brother's house (after attempting to order two tons of creamed corn [xkcd.com] -- Thank you Randall Munroe!). Alexa was deliberately obtuse when I suggested various sex acts, and downright dumb when insulting her. More's the pity.
No, no, you're not thinking; you're just being logical. --Niels Bohr