
Politics
posted by Fnord666 on Sunday September 03 2017, @10:09PM   Printer-friendly
from the judgment-day dept.

In a televised event, Russia's President Vladimir Putin spoke to a group of students about a number of topics, including AI and drones:

Russian president Vladimir Putin spoke about the potential power of artificial intelligence to students on Friday, saying "the one who becomes the leader in this sphere will be the ruler of the world," according to Associated Press. He then said "it would be strongly undesirable if someone wins a monopolist position," indicating that Russia would cooperate with other countries in the development of AI. While Russia is seen as skilled in technological propaganda, it has little presence in mainstream AI research.

Putin also envisioned a future for war where drones, ostensibly controlled by artificial intelligence, would fight proxy wars between countries. "When one party's drones are destroyed by drones of another, it will have no other choice but to surrender," he said.

Russian companies have been actively researching autonomous weapons, such as drones, robots and missiles, which would be able to pick targets and fire on their own. Documents from the US military show similar strategies, where swarms of drones would assist troops with real-time intelligence gathering and air support.

Putin puts on his Musk hat:

Putin touched on the topic of space technologies, hoping that space travel technology could one day be used in passenger travel, though not necessarily for journeys into outer space. He described the slashing of flight time from Russia's westernmost major city, Kaliningrad, to its easternmost, Vladivostok, as "a dream."

As far as space travel is concerned, Putin told students that there is hope for life on other planets in our Solar System.

"The flight to Mars would take no less than half a year, maybe even more," Putin said. "If you fly to Mars and buried yourself somewhere in there, then you could exist for some period of time. But you have to dig yourself in because cells simply die on the surface," he warned pupils.

Also at the New York Post and VOA.


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Insightful) by Anonymous Coward on Monday September 04 2017, @12:01AM (11 children)

    by Anonymous Coward on Monday September 04 2017, @12:01AM (#563246)

    But I take exception to this:

    Putin also envisioned a future for war where drones, ostensibly controlled by artificial intelligence, would fight proxy wars between countries. "When one party's drones are destroyed by drones of another, it will have no other choice but to surrender,"

    When one party's hardware is defeated, that side will send in its humans. In war, human lives are the only currency that really matters.

    • (Score: 5, Interesting) by looorg on Monday September 04 2017, @12:27AM (2 children)

      by looorg (578) on Monday September 04 2017, @12:27AM (#563249)

      I agree, I was completely on board with Putin's analysis up until the point when he decided to conclude that future wars would be some kind of robo-war (or drone war), where once you run out of hardware it's game over and you just roll over. It's like Russia didn't learn a thing from the 1979 invasion of Afghanistan, but then, considering the other superpower didn't seem to pick up any hints either, perhaps one shouldn't be too surprised.

      But in regards to the other parts, about AI, space travel and faster travel on Earth (after all, Russia is the largest country on Earth, so it would probably be of massive importance to them), he seemed fairly spot on. I guess this is part of the reason why he, Putin, is such a fascinating (and scary) leader: unlike many other world leaders he seems to actually be able to think and not be overly concerned with talking in soundbites for the 24/7 news cycle.

      Also if everyone from Putin to Musk to whomever is feeling that AI is so god damn dangerous why do they keep developing it? Shouldn't they wanna kill that so it doesn't become the next nuclear missile arms race? Sure we develop it for the same reason we do other dangerous things, we don't want someone else to get the upper hand and potentially wipe us out but beyond that.

      • (Score: 4, Insightful) by khallow on Monday September 04 2017, @10:42AM

        by khallow (3766) Subscriber Badge on Monday September 04 2017, @10:42AM (#563400) Journal

        Also if everyone from Putin to Musk to whomever is feeling that AI is so god damn dangerous why do they keep developing it? Shouldn't they wanna kill that so it doesn't become the next nuclear missile arms race? Sure we develop it for the same reason we do other dangerous things, we don't want someone else to get the upper hand and potentially wipe us out but beyond that.

        Because they don't have the power to prevent someone else from researching it. That's why the nuclear missile arms race happened in the first place. The US and USSR had separately the ability to halt their own research into the technology, but not that of their foe.
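        The dynamic described here is the textbook two-player arms-race dilemma: whatever the rival does, researching is the better reply, even though mutual restraint would leave both sides better off. A toy payoff matrix (the numbers are purely illustrative, not from any source) makes the point:

        ```python
        # Toy arms-race game: each side chooses to Research AI or Abstain.
        # Payoffs are illustrative (higher = better for that player).
        payoffs = {
            # (us, them): (our_payoff, their_payoff)
            ("research", "research"): (1, 1),   # costly race, rough parity
            ("research", "abstain"):  (3, 0),   # we gain the upper hand
            ("abstain",  "research"): (0, 3),   # they gain the upper hand
            ("abstain",  "abstain"):  (2, 2),   # best joint outcome, but unstable
        }

        def best_response(their_choice):
            """Our payoff-maximizing choice, given what the rival does."""
            return max(("research", "abstain"),
                       key=lambda ours: payoffs[(ours, their_choice)][0])

        # Researching dominates: it beats abstaining no matter what
        # the other side picks, so neither side can unilaterally stop.
        for theirs in ("research", "abstain"):
            print(theirs, "->", best_response(theirs))
        ```

        With these (assumed) payoffs, "research" is the best response in both cases, which is exactly why neither the US nor the USSR could halt the missile race alone.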

      • (Score: 2) by realDonaldTrump on Monday September 04 2017, @09:01PM

        by realDonaldTrump (6614) on Monday September 04 2017, @09:01PM (#563573) Homepage Journal

        This is how it was explained to me. People everywhere are trying to build the AI. Lots of very smart people. Sooner or later, someone will build it. The AI will take over, it will run everything. Ruling like this world has never seen. And it will have a friends list and an enemies list. Who doesn't? Let me tell you, no one wants to be on the enemies list. Do you understand? I hope you understand. If we build the AI we're definitely on the friends list. If we don't but the AI sees we were trying to build it, maybe, maybe we'll get on the friends list. If we don't try to build it, if we say, "oh no, I don't want the AI to run everything," we're definitely on the enemies list. That would be very bad. That's not what we want. So we're trying to build it, very sincerely trying. Harder than we tried with the Manhattan Project or Apollo. This is much, much more important. 🇺🇸

    • (Score: 4, Insightful) by takyon on Monday September 04 2017, @12:50AM (2 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday September 04 2017, @12:50AM (#563251) Journal

      Theoretically, drone aircraft can pull more Gs and have better capabilities than the equivalent human-piloted aircraft. Humans can't withstand certain accelerations, and drones don't need an oxygen supply and don't waste space storing a human. The big risk is getting hacked, and that's where AI can come in. Program in the mission at the base or aircraft carrier, then the drones fly to the target and deal damage as directed but with the ability to adjust to real time conditions, and without the risk of receiving spoofed commands.

      Some more drone stuff:

      http://www.bbc.com/news/technology-38569027 [bbc.com]
      https://www.newscientist.com/article/2118412-us-army-wants-to-fire-swarm-of-weaponised-drones-from-a-missile/ [newscientist.com]
      https://www.nextbigfuture.com/2016/09/autonomous-drones-swarms-of-10-40.html [nextbigfuture.com]
      https://www.nextbigfuture.com/2016/12/darpa-wants-to-control-hundreds-of-air.html [nextbigfuture.com]
      https://www.nextbigfuture.com/2017/04/military-academy-teams-compete-with-drone-swarms.html [nextbigfuture.com]

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 4, Interesting) by takyon on Monday September 04 2017, @12:59AM (1 child)

        by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday September 04 2017, @12:59AM (#563255) Journal

        Oh, more to the point, if the other side's drones enjoy capability and numbers advantages, sending humans to fight them will not be very helpful. You might as well surrender. If you do fight them, it will be costly for you to win since you can't easily replace the skilled human pilots.

        Drones can accelerate in ways that kill humans, and while it might take a while before the AI algorithms react to situations in a better way than humans can, drones could have the advantage of having much faster reaction times.

        http://www.dtic.mil/dtic/tr/fulltext/u2/a178485.pdf [dtic.mil]

         Visual reaction times range from 143 to 461 ms, with a mean of 223 ms. Computers could get that number down much lower, even after spending some cycles analyzing the incoming information.
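         A quick back-of-envelope comparison using the human figures quoted above; the machine-side numbers (a 100 FPS sensor plus a 10 ms processing budget) are assumptions for illustration, not measurements of any real drone:

         ```python
         # Back-of-envelope: human vs. machine reaction budgets.
         human_mean_s = 0.223          # mean visual reaction time (DTIC study)
         human_best_s = 0.143          # fastest observed human reaction

         frame_interval_s = 1 / 100    # assumed 100 FPS sensor: new frame every 10 ms
         processing_s     = 0.010      # assumed per-frame analysis budget
         machine_worst_s  = frame_interval_s + processing_s   # ~20 ms worst case

         print(f"machine worst case: {machine_worst_s * 1000:.0f} ms")
         print(f"advantage vs. mean human: {human_mean_s / machine_worst_s:.1f}x")
         print(f"advantage vs. best human: {human_best_s / machine_worst_s:.1f}x")
         ```

         Even with those generous assumptions against the machine, it reacts an order of magnitude faster than the mean human pilot.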

        Pilots also get worse the longer they go without sleep:

        http://www.cti-home.com/wp-content/uploads/2014/01/Reaction-Time-and-Fatigue-Study.pdf [cti-home.com]

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 2) by unauthorized on Monday September 04 2017, @06:32AM

          by unauthorized (3776) on Monday September 04 2017, @06:32AM (#563317)

          You might as well surrender.

          You make the extremely naive presumption that people will act rationally. Most people do not, and the people who sit at the top of the political chain and have the power to force young men into conscription are certainly among them.

    • (Score: 1, Insightful) by Anonymous Coward on Monday September 04 2017, @12:59AM (1 child)

      by Anonymous Coward on Monday September 04 2017, @12:59AM (#563256)

      Although humans may not last long against advanced AI controlled hardware, if the country is a nuclear power it will simply send up every warhead it has available prior to surrender.

    • (Score: 2) by arslan on Monday September 04 2017, @03:34AM

      by arslan (3462) on Monday September 04 2017, @03:34AM (#563282)

      If the tech is sufficiently advanced, it only takes the first example to demonstrate the futility; the others may well avoid the cost and take the option to surrender after their robo-army loses.

      For example, think Matrix Revolutions-like machine capabilities (sentinels et al.) but controlled by a nation state. If you let it loose on a losing nation to take over its infrastructure and flush out every single humanoid combatant and kill without prejudice (again assuming the AI is sufficiently advanced to distinguish combatants from non-combatants, or at least to match a human's judgment), well, you just sit back and wait. Nuclear arsenal, you say? Well again, it takes time to launch missiles and get them to target; if the tech is there, you can neutralize them en route.

      In fact you don't even have to occupy the annexed state with moist bodies, just use robotic ones and collect whatever "tax" from the occupation with expendable robots and rule by proxy. Any surviving resistance is just fighting against metal scraps. Of course you'd need to police the border, assuming the aggressor is nearby, but again with sufficient tech, it is all metal parts & labor.

      Will the tech ever get to that point? Beats me. Putin probably has no clue either... but if the tech is there, I don't see why not.

    • (Score: 1) by khallow on Monday September 04 2017, @10:38AM

      by khallow (3766) Subscriber Badge on Monday September 04 2017, @10:38AM (#563398) Journal

      When one party's hardware is defeated, that side will send in its humans. In war, human lives are the only currency that really matters.

      Depends on the situation, but I'd expect the human currency to get used up fast (a combination of technological superiority, control of the battlefield, better logistics and reconnaissance, etc.) in a large war. In a future where humans don't have a routine presence on a battlefield, you're just going to get a bunch of people killed if you try that after failing on the hardware side. We have too many past situations where human currency didn't have much value (particularly of the spear-chucker versus firearm variety).

    • (Score: 0) by Anonymous Coward on Monday September 04 2017, @03:22PM

      by Anonymous Coward on Monday September 04 2017, @03:22PM (#563470)

      Depends very much on the robots we're talking about.
      Even without fancy AI, if you have good enough image recognition algorithms with sufficient CPU power and correspondingly fast&exact robotics, you can build a soldier robot that walks around and analyses its surroundings with a 100 FPS camera in real time. If a human soldier so much as tries to peek around a corner, the robot could recognise, target and shoot him in the part of the face that is showing before even the pupil of the first eye is in sight. Which leaves the humans zero chance in a traditional combat scenario.

      We could loosen the tolerances and give enough time for the human to fully peek around the corner, wait for his slow meatbrain to process the image, and maybe even the quarter second it takes for the electrochemical signal to reach his fingers. Kinect claims recognition between 0.2 sec (worst case) and 5 ms (predictive best case), and our robotics guys can do marker retargeting (albeit with lightweight fine machinery and small angles) 20+ times per second.
      So even putting together today's technology could result in plenty of dead humans per robot in a close-combat scenario. You can come up with superiority scenarios for desert/aerial/naval/... warfare yourself ;)
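      Summing the stage timings quoted in this comment gives a rough detect-to-aim budget; the particular stage breakdown and the 250 ms human figure are assumptions for illustration, not measurements of any real system:

      ```python
      # Rough latency budget for the "peek around the corner" scenario.
      # Stage timings follow the figures cited in the comment (100 FPS camera,
      # Kinect-style recognition, 20+ Hz retargeting); all are assumed values.
      stages_ms = {
          "frame capture (100 FPS)":    10,   # one frame interval
          "recognition (Kinect best)":   5,   # predictive best case cited
          "retargeting (20 Hz servo)":  50,   # one actuation cycle
      }

      robot_total_ms = sum(stages_ms.values())
      human_reaction_ms = 250   # rough human visual reaction plus trigger pull

      print(f"robot detect-to-aim:  {robot_total_ms} ms")
      print(f"human react-and-fire: {human_reaction_ms} ms")
      print("robot wins" if robot_total_ms < human_reaction_ms else "human wins")
      ```

      Even charging the robot a full servo cycle per shot, the budget comes in well under the human's reaction window.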

      Sure, it has to be hacking-proof, supported by long-distance recognition & surveillance, etc., but the point is: drop an advanced killer robot (or, say, a thousand, but one might be enough) as described in my first paragraph into a city, and you'll soon agree that once one side's machines are taken out, surrender is the only reasonable option.

  • (Score: 0) by Anonymous Coward on Monday September 04 2017, @12:53AM (3 children)

    by Anonymous Coward on Monday September 04 2017, @12:53AM (#563252)

    I am hoping the realDonaldTrump or perhaps another in official capacity (as opposed to the officially incapacitated...) sits down with students and tells them what he thinks of Science.

    • (Score: 0) by Anonymous Coward on Monday September 04 2017, @12:59AM (1 child)

      by Anonymous Coward on Monday September 04 2017, @12:59AM (#563254)

      I am hoping the realDonaldTrump or perhaps another in official capacity (as opposed to the officially incapacitated...) sits down with students and tells them what he thinks of Science.

      From a person who is not able to think (only to react), "thinking of science" is asking a wee bit too much.

      • (Score: 0) by Anonymous Coward on Monday September 04 2017, @01:18AM

        by Anonymous Coward on Monday September 04 2017, @01:18AM (#563261)

        From a person that is not able to think (only to react)

        You aren't able to think? What a shocker.

    • (Score: 4, Funny) by realDonaldTrump on Monday September 04 2017, @01:56AM

      by realDonaldTrump (6614) on Monday September 04 2017, @01:56AM (#563267) Homepage Journal

      Let me tell you, there's a lot of bad science. Very harmful. Man-made climate change, I’m not a big believer in that. It's just a very, very expensive form of tax. From which a lot of people are making a lot of money. China is making a lot of money from it. From the completely FARCICAL Paris agreement. Which I canceled. The ozone layer, they don't want us to use hairspray. They say it's bad for the ozone. That if I use the hairspray in my apartment, somehow it gets to Antarctica. They tell us to use the pump. Bing, bing, bing! Out come the big globs and your hair is all fucked up. Right? The vaccines, I'll tell you. We have a generation of kids with big problems. Who can't talk. Who can barely talk. Should be able to talk good, they can't. They call it a spectrum. They call it autism. Because of the vaccines. You take this little beautiful baby and you pump in so many vaccines. We had so many instances, people that work for me, 2 years old, a beautiful child, went to have the vaccine and came back and a week later got a tremendous fever, got very, very sick, now is autistic. Sad! We've got "environment friendly" light bulbs now. Light bulbs that cause cancer. The idiots who came up with them don't care. Wind farms, which look disgusting. Killing birds by the millions. Terrible for our birds. But even worse they are bad for people's health, they call it infrasound. You start staring at walls. Your ears bleed. Your lips vibrate, your whole body vibrates. You lose control of your bowels. Then the heart attack comes, you die. They call it wind turbine syndrome. They do an autopsy, they can spoon out your insides. Like they went through a blender. Disgusting! We have the cell phones now, the cyber. There's whole continents where you can't take a cell phone, very risky. Because the cyber in them can be hacked. In certain places, it can be hacked. It's very bad in the Middle East and Europe. And the hydrogen bomb. I'm starting to worry about that one. 
President Jong-un is saying he has it. Saying he exploded one. My Generals say maybe, maybe not. That if I nuke him he might be able to nuke me back. Or he might not. They don't know! And they say he might be crazy. A little bit crazy. Big worries for me. When I was a boy we didn't have these problems. In the early 1950s. America was great. A lot, a lot less science. And life was a lot better. A lot easier. I do miss my old life. 🇺🇸

  • (Score: 0) by Anonymous Coward on Monday September 04 2017, @01:48AM (3 children)

    by Anonymous Coward on Monday September 04 2017, @01:48AM (#563266)

    The good thing about fighting with robots is the human cost goes way down, in theory. But with a reduced human cost, wars could become considerably more frequent.

    I'm not convinced the human cost would really be reduced. Humans make for higher-value targets on a number of metrics. Mainly, we would presumably remain in control of the war effort (i.e. eliminating a few humans could disrupt battles in the same way that is true today, but possibly to greater effect) and we suffer more internally for the loss of human life. Combined with displacement and other war by-products, and the possible increased frequency of war-for-conquest, we may not reduce the human cost at all.

    That said, it's likely inevitable. Good luck.

    • (Score: 4, Insightful) by takyon on Monday September 04 2017, @02:36AM (2 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday September 04 2017, @02:36AM (#563272) Journal

      War might not even be the right word for it. It could be virtual occupation. Drones bombing targets in Afghanistan, Iraq, Somalia, Yemen, Libya, Syria, and other countries. Multiple times every day of the year. No G.I. boots touching the ground. The cost of entry will be too high for most, so you can't call it a war. It will be a one-sided slaughter. Not far off from what we have today, but with even less human cost on one side. These drones aren't likely to fight Russian and Chinese drones just yet. They are likely to slaughter fighters and civilians in Asia and Africa.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 0) by Anonymous Coward on Monday September 04 2017, @04:31PM (1 child)

        by Anonymous Coward on Monday September 04 2017, @04:31PM (#563481)

        Russia has been doing the war-for-conquest thing lately. China seems like they might want to, but are holding back for the most part.

        But war like you describe (still legally war, even though the US never declares it) is even worse than the scenario I presented. It highly incentivizes war-for-conquest, creating massive displacements. In that scenario, the human cost goes _way_ up.

        • (Score: 0) by Anonymous Coward on Monday September 04 2017, @04:58PM

          by Anonymous Coward on Monday September 04 2017, @04:58PM (#563491)

          China's only holding back because their opponents are big and nuclear (US/Japan and India, and Russia isn't exactly a major historical ally). They're constantly pushing the boundaries to try and push the limits out as far as possible. Since they're not willing to stop doing this, there'll be a misjudgment at some point down the line, and then things will get really ugly.
