At the end of last year, Germany switched on a new type of massive nuclear fusion reactor for the first time, and it was successfully able to contain a scorching hot blob of helium plasma.
But since then, there's been a big question: is the device working the way it's supposed to? That's pretty crucial when you're talking about a machine that could potentially maintain controlled nuclear fusion reactions one day, and thankfully, the answer is yes.
A team of researchers from the US and Germany has now confirmed that the Wendelstein 7-X (W 7-X) stellarator is producing the super-strong, twisty, 3D magnetic fields that its design predicted, with "unprecedented accuracy". The researchers found an error rate of less than one in 100,000.
(Score: 5, Insightful) by wonkey_monkey on Tuesday December 06 2016, @11:46PM
And zey said I vos mad!
The researchers found an error rate less than one in 100,000.
Kinda need to know the typical consequences of one "error" to put that in some perspective...
systemd is Roko's Basilisk
(Score: 0) by Anonymous Coward on Wednesday December 07 2016, @12:16PM
Well for every 100,000 seconds the thing runs, you can expect it to fail catastrophically. Such accuracy.
(Score: 2) by GreatAuntAnesthesia on Wednesday December 07 2016, @01:23PM
No, it means you can only ever build 99,999 of them. The next one you build will go "pop" in a spectacular fashion.
(Score: 3, Informative) by VLM on Wednesday December 07 2016, @02:40PM
I clicked a zillion times and read, well, skimmed, the papers, and the engineering design's predicted magnetic field matched what they measured on a meter across the whole volume to around 5 or so digits of precision.
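The comparison being described is just a relative deviation between the design prediction and the probe reading. A toy sketch of what "one part in 100,000" means in those terms; the field values here are invented for illustration, not taken from the W 7-X papers:

```python
# Toy illustration of a predicted-vs-measured field comparison.
# The numbers are hypothetical, not from the W 7-X measurements.

def relative_error(predicted, measured):
    """Fractional deviation of a measurement from its predicted value."""
    return abs(measured - predicted) / abs(predicted)

predicted_field = 2.500000   # tesla, hypothetical design value
measured_field  = 2.500025   # tesla, hypothetical probe reading

err = relative_error(predicted_field, measured_field)
print(f"relative error: {err:.1e}")   # prints: relative error: 1.0e-05
```

A relative error of 1e-5 is the "one in 100,000" figure from the article; getting that across the whole plasma volume is what makes the result impressive.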
I remember doing annoying differential calculus in quantitative chemical analysis class like a quarter century ago to do error analysis, so you could measure some electrochemistry thingy result a bunch of times and then turn the standard deviation of that measurement into error bars for the calculated result. You can't just say 5% error in is 5% error out when the transfer function is some wacky exponential or whatever, such that 5% high on the input means 50% high on the output, or who knows. Of course you can solve numerically, but you're supposed to "gain insight on error minimization" by analyzing the functions. Sometimes the math was awful and you couldn't solve it, or at least we couldn't, if it was for something weird like lab temperature or whatever. The point of this reminiscence is I bet the overall plant is so complicated that there's no way to turn magnetic uniformity into a delta of economic profit if it were a generating plant (which it isn't). Or if there is one, it's probably made entirely of hand waving.
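The "5% in, 50% out" situation can be sketched quickly. For a transfer function like f(x) = exp(k*x), first-order propagation gives df/f = k·dx, so the relative output error is k·x times the relative input error; with k·x = 10, a 5% input error becomes 50% out. The function and constants below are made up for illustration, and the Monte Carlo check is the "solve numerically" fallback mentioned above:

```python
import math
import random

K = 10.0          # steepness of the (made-up) exponential transfer function
X = 1.0           # nominal input
REL_IN = 0.05     # 5% relative uncertainty on the input

def f(x):
    return math.exp(K * x)

# First-order (calculus) propagation: df/f = K*dx = (K*X) * (dx/x),
# so relative output error = K*X times relative input error.
rel_out_analytic = K * X * REL_IN
print(f"analytic estimate: {rel_out_analytic:.0%}")   # prints: analytic estimate: 50%

# Numerical check by Monte Carlo: sample the input, look at the
# spread of the output relative to its mean.
random.seed(0)
samples = [f(random.gauss(X, REL_IN * X)) for _ in range(100_000)]
mean = sum(samples) / len(samples)
sd = (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5
# Comes out a bit above 50%, because the exponential is nonlinear and
# the first-order estimate undershoots for large relative errors.
print(f"monte carlo: {sd / mean:.0%}")
```

The gap between the analytic and Monte Carlo numbers is exactly why the class made you analyze the functions instead of trusting the linear rule everywhere.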
It is a pretty good measure of coolness. In EE stuff like these magnets and their power supplies, it's normal to operate across a zillion orders of magnitude for voltage or current, but using typical mfgr tolerances you're lucky to get 3 figures of useful reliable data, and they get like 5 or 6 depending on how you pencil whip the statistics, so this must have been one hell of a cuckoo clock to build that precisely and accurately.