
posted by martyb on Friday August 11 2017, @06:22AM
from the Game-On! dept.

Tic-tac-toe, checkers, chess, Go, poker. Artificial intelligence rolled over each of these games like a relentless tide. Now Google's DeepMind is taking on the multiplayer space-war videogame StarCraft II. No one expects the robot to win anytime soon. But when it does, it will be a far greater achievement than DeepMind's conquest of Go—and not just because StarCraft is a professional e-sport watched by fans for millions of hours each month.

DeepMind and Blizzard Entertainment, the company behind StarCraft, just released the tools to let AI researchers create bots capable of competing in a galactic war against humans. The bots will see and do all the things human players can do, and nothing more. They will not enjoy an unfair advantage.

DeepMind and Blizzard are also opening a cache of data from 65,000 past StarCraft II games that will likely be vital to the development of these bots, and say the trove will grow by around half a million games each month. DeepMind applied machine-learning techniques to Go matchups to develop its champion-beating Go bot, AlphaGo. A new DeepMind paper includes early results from feeding StarCraft data to its learning software, and shows it is a long way from mastering the game. And Google is not the only big company getting more serious about StarCraft: late Monday, Facebook released its own collection of data from 65,000 human-versus-human games of the original StarCraft to help bot builders.

[...] Beating StarCraft will require numerous breakthroughs. And simply pointing current machine-learning algorithms at the new tranches of past games to copy humans won't be enough. Computers will need to develop styles of play tuned to their own strengths, for example in multi-tasking, says Martin Rooijackers, creator of leading automated StarCraft player LetaBot. "The way that a bot plays StarCraft is different from how a human plays it," he says. After all, the Wright brothers didn't get machines to fly by copying birds.

Churchill guesses it will be five years before a StarCraft bot can beat a human. He also notes that many experts predicted a similar timeframe for Go—right before AlphaGo burst onto the scene.

Have any Soylentils here experimented with Deep Learning algorithms in a game context? If so, how did it go, and how did it compare to more traditional opponent strategies?


Original Submission

  • (Score: 2) by FakeBeldin (3360) on Friday August 11 2017, @10:39AM (#552228) Journal

    DeepMind and Blizzard also are opening a cache of data from 65,000 past StarCraft II games [...] and say the trove will grow by around half a million games each month.

    Wait, they're getting data on 16k games per day and they decided to release it with ... ~4 days of data? That's a paltry amount of data.
    Why not give it a full week and release data of 100k games upon announcement?
    Even better: announce today that you'll be releasing this next month, and release it with data of half a million games.

    Either the data from the 65k released games is useful as-is, in which case the extra half-million games a month are complete overkill, or the half-million games a month are really needed, in which case the initial 65k release seems too little to produce any meaningful results.

  • (Score: 4, Informative) by takyon (881) Subscriber Badge on Friday August 11 2017, @11:07AM (#552231) Journal

    Look, if this effort is going to take as long as five years, clearly there is no problem with the amount of data they will get.

    Once they have examined a decent chunk of human games, they are likely to have the DeepMind AI play against itself on an ultrafast version of the game anyway.
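    To make the self-play idea concrete, here is a toy sketch of the loop, not anything DeepMind has described: tic-tac-toe instead of StarCraft, and a tabular value table updated from the terminal reward, shared by both sides so the agent literally plays against itself.

```python
# Toy self-play loop: tabular value learning on tic-tac-toe.
import random

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
         (0, 3, 6), (1, 4, 7), (2, 5, 8),
         (0, 4, 8), (2, 4, 6)]

def winner(board):
    """Return 'X' or 'O' if a line is complete, else None."""
    for a, b, c in LINES:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return None

def legal_moves(board):
    return [i for i, s in enumerate(board) if s == " "]

Q = {}  # (board, move) -> estimated value, shared by both players

def choose(board, eps):
    """Epsilon-greedy move selection over the shared table."""
    if random.random() < eps:
        return random.choice(legal_moves(board))
    return max(legal_moves(board), key=lambda m: Q.get((board, m), 0.0))

def self_play_episode(alpha=0.5, eps=0.2):
    """One game of the agent against itself; update values from the result."""
    board, history, player = " " * 9, [], "X"
    while True:
        m = choose(board, eps)
        history.append((board, m, player))
        board = board[:m] + player + board[m + 1:]
        w = winner(board)
        if w or not legal_moves(board):
            # Push the terminal reward back onto every move of the game.
            for b, mv, p in history:
                r = 0.0 if w is None else (1.0 if p == w else -1.0)
                q = Q.get((b, mv), 0.0)
                Q[(b, mv)] = q + alpha * (r - q)
            return w
        player = "O" if player == "X" else "X"

random.seed(0)
for _ in range(20000):
    self_play_episode()
```

    AlphaGo-style systems replace the table with a deep network and add tree search, but the loop has the same shape: play, score, update, repeat, at speeds no human sparring partner could match.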

    This has been in the works for a while anyway: Google DeepMind to Take on Starcraft II
