Tic-tac-toe, checkers, chess, Go, poker. Artificial intelligence rolled over each of these games like a relentless tide. Now Google's DeepMind is taking on the multiplayer space-war videogame StarCraft II. No one expects the robot to win anytime soon. But when it does, it will be a far greater achievement than DeepMind's conquest of Go—and not just because StarCraft is a professional e-sport watched by fans for millions of hours each month.
DeepMind and Blizzard Entertainment, the company behind StarCraft, just released the tools to let AI researchers create bots capable of competing in a galactic war against humans. The bots will see and do all the things human players can do, and nothing more. They will not enjoy an unfair advantage.
DeepMind and Blizzard also are opening a cache of data from 65,000 past StarCraft II games that will likely be vital to the development of these bots, and say the trove will grow by around half a million games each month. DeepMind applied machine-learning techniques to Go matchups to develop its champion-beating Go bot, AlphaGo. A new DeepMind paper includes early results from feeding StarCraft data to its learning software, and shows it is a long way from mastering the game. And Google is not the only big company getting more serious about StarCraft. Late Monday, Facebook released its own collection of data from 65,000 human-on-human games of the original StarCraft to help bot builders.
[...] Beating StarCraft will require numerous breakthroughs. And simply pointing current machine-learning algorithms at the new tranches of past games to copy humans won't be enough. Computers will need to develop styles of play tuned to their own strengths, for example in multi-tasking, says Martin Rooijackers, creator of leading automated StarCraft player LetaBot. "The way that a bot plays StarCraft is different from how a human plays it," he says. After all, the Wright brothers didn't get machines to fly by copying birds.
Churchill guesses it will be five years before a StarCraft bot can beat a human. He also notes that many experts predicted a similar timeframe for Go—right before AlphaGo burst onto the scene.
Have any Soylentils here experimented with deep learning algorithms in a game context? If so, how did it go, and how did it compare to more traditional opponent strategies?
Source: https://www.wired.com/story/googles-ai-declares-galactic-war-on-starcraft-/
(Score: -1, Offtopic) by Anonymous Coward on Friday August 11 2017, @06:29AM (1 child)
SoylentNews sucks. When a science/tech story is posted, there is barely any comment, because most of us are uneducated morons who neither know nor care about the fields. When some social/political shit gets posted, every moron and their uncles chime in with their moronic two bits.
(Score: 4, Funny) by takyon on Friday August 11 2017, @06:35AM
I agree. Commenting just 7 minutes after the story is posted is the best time to whine about the story having no comments.
Well, here comes that downmod you wanted.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 2) by takyon on Friday August 11 2017, @06:31AM (3 children)
Microsoft should do the same thing but with AOE2 [wikipedia.org] (the fourth expansion, Rise of the Rajas, came out in December 2016).
DeepMind [wikipedia.org] started out with Pong, Space Invaders, etc. Microsoft did Ms Pac-Man [bbc.com]. Now DeepMind is taking on Starcraft so Microsoft should synergize and go for AOE2.
(Score: 0) by Anonymous Coward on Friday August 11 2017, @06:45AM (1 child)
Get back to me when they've got an AI that can either gold farm in WoW or ISK farm in Eve Online without getting caught.
*THEN* we'll talk.
Also: screw Blizzard getting to call them SC and SC2. Anybody who was a gamer in the '90s should know SC was Star Control :) Thankfully the UQM project has superseded the need for those acronyms for half of the Star Control games anyway.
(Score: 2) by takyon on Friday August 11 2017, @06:46AM
Finally, a way for DeepMind to become profitable.
(Score: 2) by TheB on Saturday August 12 2017, @06:57AM
...while OpenAI is doing DOTA 2 [youtube.com].
It would be more interesting if they all played the same game.
Something like the Cyber Grand Challenge [youtube.com], where multiple teams compete against each other on the same hardware.
(Score: 0) by Anonymous Coward on Friday August 11 2017, @06:45AM
Modders created nearly unbeatable AIs back when the game was new. Fucking hipsters.
(Score: 4, Insightful) by deimios on Friday August 11 2017, @06:57AM (2 children)
Starcraft is a Real Time Strategy game with a heavy emphasis on the Real Time part: its mechanics are deliberately time-consuming, which tests the player's multitasking ability.
Since an AI (or any computer program, actually) has multitasking down to microseconds, if it has some number of pre-programmed strategies that take advantage of its near-zero execution lag and 100% accuracy, it should beat any human.
The need for a real AI arises when the human player invariably adapts to these preset strategies and learns to counter them, not by playing the game more efficiently but by exploiting the predictability of non-AI routines.
I will be watching some of the AI vs human games if they publish them, however I predict they will be very one-sided.
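The preset-strategy point above can be sketched in a few lines. This is a toy model with invented names, not any real StarCraft API: a bot that executes a fixed build order the instant resources allow. Its timings are perfectly deterministic, which is both its strength (zero execution lag) and its weakness (a human can memorize and counter the exact schedule):

```python
# Toy scripted bot (all names invented for illustration): execute a
# fixed build order the instant resources allow, checking every step.

BUILD_ORDER = ["drone", "drone", "spawning_pool", "zergling", "zergling"]

def scripted_bot(minerals_per_step, costs, build_order=BUILD_ORDER):
    """Return (step, item) pairs showing when each item was started.

    The bot checks every step with zero execution lag, so the schedule
    is perfectly deterministic, and therefore perfectly predictable.
    """
    minerals, step = 0, 0
    queue = list(build_order)
    schedule = []
    while queue:
        minerals += minerals_per_step
        if minerals >= costs[queue[0]]:
            item = queue.pop(0)
            minerals -= costs[item]
            schedule.append((step, item))
        step += 1
    return schedule

costs = {"drone": 50, "spawning_pool": 200, "zergling": 25}
schedule = scripted_bot(minerals_per_step=10, costs=costs)
print(schedule)
# -> [(4, 'drone'), (9, 'drone'), (29, 'spawning_pool'),
#     (32, 'zergling'), (34, 'zergling')]
```

Run it twice and you get the same schedule to the step, which is exactly the predictability a human learns to punish.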
(Score: 2) by takyon on Friday August 11 2017, @07:16AM
It has already been stated that actions/clicks per minute will be limited for the Starcraft-playing AI, and that it will have the same fog of war, etc. It will probably be required to scroll around using the same UI, use the minimap, and listen for audio cues [youtube.com] in order to respond to what is happening [youtube.com]. These constraints will ultimately make the AI better at playing the game. Compare that to how AlphaGo played with a handicap early on, but later beat human grandmasters without any handicap in the AI's favor. The playing field for this Starcraft-playing AI will be as level as it can be.
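One plausible way to enforce an APM cap (purely an assumption on my part; DeepMind hasn't published its mechanism) is a token bucket, the same scheme used for network rate limiting:

```python
# Hedged sketch: enforcing an APM cap with a token bucket. The cap,
# burst size, and interface are all assumptions for illustration,
# not DeepMind's actual mechanism.

class APMLimiter:
    def __init__(self, apm_cap, burst_seconds=5.0):
        self.rate = apm_cap / 60.0                 # actions earned per second
        self.capacity = self.rate * burst_seconds  # short bursts allowed
        self.tokens = self.capacity
        self.last_time = 0.0

    def try_act(self, now):
        """Return True if an action is allowed at time `now` (seconds)."""
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last_time) * self.rate)
        self.last_time = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

limiter = APMLimiter(apm_cap=180)  # near a strong human's sustained APM
# An agent that *tries* to act every 0.1 s for one minute (600 attempts)
# gets through roughly its 180-action budget plus the initial burst.
allowed = sum(limiter.try_act(t * 0.1) for t in range(600))
print(allowed)
```

The burst allowance matters: humans also spike well above their average APM during battles, so a hard per-second cap would be a harsher constraint than a per-minute one.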
(Score: 2) by darkfeline on Friday August 11 2017, @06:57PM
Not really. In the pro scene, APM is rarely a limiting factor; a large proportion of actions/clicks are wasted merely to keep the player's hands active. Nor is pinpoint accuracy important, as this isn't an FPS; misclicks are not common (although humans do make mistakes, which is presumably one of the advantages of an AI).
Starcraft 2 is also a lot less micromanagement-heavy than Starcraft 1. Overall, technical ability makes up a very small part of SC2 skill; generally speaking, if your technical skill is lacking, your strategic and tactical skills are most likely lacking too, so you won't lose merely on technique. Technical skill is rarely the deciding factor in pro games.
(I haven't followed SC2 after the first expansion pack, though; it's possible that its reliance on technical skill has increased, although I doubt it, since Blizzard very much emphasized reducing that reliance going from SC1 to SC2.)
Join the SDF Public Access UNIX System today!
(Score: 0) by Anonymous Coward on Friday August 11 2017, @07:58AM (1 child)
So when will they let the AI play Global Thermonuclear War?
(But please, let it play Tic-Tac-Toe first!)
(Score: 4, Funny) by takyon on Friday August 11 2017, @08:28AM
Let Trumpu finish his turn first. 😂💣💣💣💣🔥🔥🔥🔥🔥🔥💀
(Score: 0, Offtopic) by kaszz on Friday August 11 2017, @08:37AM (1 child)
Google goes full chicken:
Google cancels meeting on diversity, citing safety concerns for employees [wsj.com] 2017-08-10
More.. [soylentnews.org]
(Score: 2) by takyon on Friday August 11 2017, @08:42AM
I already submitted that. It runs at 14:10 UTC.
(Score: 3, Interesting) by dltaylor on Friday August 11 2017, @08:44AM
Back in the LAN party days, it was fun for 3-6 of us to disconnect the router from the Internet, drop all the firewalls, and have at Starcraft/Broodwar. Since I had a few "spare" computers, I really wanted to train real expert systems as opponents, with the same limitations of visibility as human players (no way to beat the twitch speed, I supposed), able to play as, and against, various play styles rather than the inevitable zergling rush. There was no problem designing the VGA, keyboard, and mouse interfaces, nor the basic game play, but I simply could not reproduce the "thought process" on the PCs of the time.
I'm not normally envious of other programmers' opportunities, but I feel a bit of that for the folks at Google who finally get to do it.
(Score: 3, Interesting) by TheLink on Friday August 11 2017, @09:08AM (2 children)
Would the actions per second be throttled? https://www.youtube.com/watch?v=IKVFZ28ybQs [youtube.com]
(Score: 3, Informative) by takyon on Friday August 11 2017, @10:05AM (1 child)
Yes [theguardian.com]
(Score: 0) by Anonymous Coward on Friday August 11 2017, @04:28PM
How about timing and mouse accuracy? Think about top-level human APMs but with superhuman timing and mouse precision: https://www.youtube.com/watch?v=YbpCLqryN-Q [youtube.com]
E.g., even with human-level APMs you may still be able to get an AI to micro units at superhuman levels. I'm assuming the timing against certain things will be predictable enough if you set things up right and the opponent isn't microing that bit.
Humans can do stuff like this:
https://www.youtube.com/watch?v=zFmq-Q_ObdQ [youtube.com]
https://www.youtube.com/watch?v=WJp0t9n8DWk [youtube.com]
And much of this:
https://www.youtube.com/watch?v=CdSKD3LRHV8 [youtube.com]
I'm sure Google can afford more than enough processing power for a Starcraft 2 bot. Or is there going to be a MIPS or watt limit too? If the AI is limited to the human brain's 20-25 W power budget, it may take a bit longer before the AIs win.
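The superhuman-micro point above can be made concrete with a little arithmetic: perfect stutter-stepping (moving between shots) costs one move plus one attack command per weapon cooldown. A back-of-the-envelope sketch, where the 0.61 s cooldown is an assumed, marine-like number rather than anything pulled from the game's data files:

```python
# Back-of-the-envelope only: the 0.61 s cooldown is an assumed,
# marine-like figure, not taken from the game's actual unit data.

def stutter_step_apm(cooldown_s, commands_per_cycle=2):
    """APM needed for perfect stutter-step micro of one control group:
    one move command plus one attack command per weapon cooldown."""
    return commands_per_cycle * 60.0 / cooldown_s

print(round(stutter_step_apm(0.61)))  # -> 197
```

Around 200 commands per minute is well within what pros sustain. The superhuman part is not the rate but the timing: landing each command at exactly the right instant, every cycle, while also doing everything else.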
(Score: 2) by FakeBeldin on Friday August 11 2017, @10:39AM (1 child)
Wait, they're getting data on 16k games per day and they decided to release it with ... ~4 days of data? That's a paltry amount of data.
Why not give it a full week and release data of 100k games upon announcement?
Even better: announce today that you'll be releasing this next month, and release it with data of half a million games.
Either the 65k games released are useful as-is, in which case the extra half a million games a month are complete overkill; or the extra half a million a month really are needed, in which case the initial 65k release seems too little to get any meaningful results.
(Score: 4, Informative) by takyon on Friday August 11 2017, @11:07AM
Look, if this effort is going to take as long as five years, clearly there is no problem with the amount of data they will get.
Once they have examined a decent chunk of human games, they are likely to have the DeepMind AI play itself on an ultrafast version of the game anyway.
This has been in the works for a while anyway: Google DeepMind to Take on Starcraft II [soylentnews.org]
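A self-play loop at its most minimal looks like the sketch below: one policy plays both sides of a trivial game (Nim here, since StarCraft obviously won't fit in a comment), churning out thousands of game records per second for a learner to train on. All names are invented for illustration; this is not DeepMind's pipeline:

```python
# Toy self-play loop (illustrative only): one policy plays both sides
# of Nim, generating thousands of game records per second.

import random

def self_play_nim(pile, policy, rng):
    """One game of Nim: players alternate taking 1-3 stones;
    whoever takes the last stone wins. Returns (winner, move_list)."""
    moves, player = [], 0
    while True:
        take = policy(pile, rng)
        moves.append((player, take))
        pile -= take
        if pile == 0:
            return player, moves
        player = 1 - player

def random_policy(pile, rng):
    # Self-play starts from a weak policy; learning would refine it.
    return rng.randint(1, min(3, pile))

rng = random.Random(42)
games = [self_play_nim(21, random_policy, rng) for _ in range(5000)]
wins_p0 = sum(1 for winner, _ in games if winner == 0)
print(len(games), "games; player 0 won", wins_p0)
```

The point is the throughput: a replay archive grows by thousands of games a day, while a fast simulator plus self-play grows by thousands of games a second.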
(Score: 0) by Anonymous Coward on Saturday August 12 2017, @08:10AM
https://xkcd.com/1875/ [xkcd.com]