Google has assembled thousands of Tensor Processing Units (TPUs) into giant programmable supercomputers and made them available on Google Cloud
[...] To be precise, Google has used a "two-dimensional toroidal" mesh network to enable multiple racks of TPUs to be programmable as one colossal AI supercomputer. The company says more than 1,000 TPU chips can be connected by the network.
Google claims each TPU v3 pod can deliver more than 100 petaFLOPS of computing power, which puts them amongst the world's top five supercomputers in terms of raw mathematical operations per second. Google added the caveat, however, that the pods operate at a lower numerical precision, making them more appropriate for superfast speech recognition or image classification – workloads that do not need high levels of precision.
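[Editor's note: the "two-dimensional toroidal" mesh described above means each chip sits on a 2D grid whose edges wrap around, so every node has the same four neighbours and no grid edge is a dead end. A minimal sketch of that wraparound neighbour rule, with illustrative grid dimensions that are not Google's actual pod layout:]

```python
# Hypothetical sketch of wraparound neighbours on a 2D torus mesh.
# The 32x32 grid size is illustrative, not Google's actual TPU pod layout.

def torus_neighbors(x, y, width, height):
    """Return the four mesh neighbours of node (x, y), wrapping at the edges."""
    return [
        ((x - 1) % width, y),   # west  (wraps to the far east column)
        ((x + 1) % width, y),   # east
        (x, (y - 1) % height),  # north (wraps to the bottom row)
        (x, (y + 1) % height),  # south
    ]

# A corner node links straight back to the opposite edges:
print(torus_neighbors(0, 0, 32, 32))  # [(31, 0), (1, 0), (0, 31), (0, 1)]
```

The wraparound halves the worst-case hop count compared with a plain mesh, which is why torus interconnects are popular in supercomputer fabrics.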
(Score: 2) by bob_super on Thursday May 09 2019, @12:12AM (3 children)
I know. But if it's 2D, then it's a Pac-Man topology.
Toroids are 3D objects.
(Score: 2) by c0lo on Thursday May 09 2019, @12:51AM (1 child)
Topology doesn't care, as long as you preserve the navigational (in this case, connectivity/networking) properties.
The RAM in your computer is 3D too, yet you are addressing it as 1D.
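[Editor's note: the RAM analogy above is the standard row-major flattening trick, which maps a multi-dimensional coordinate to a single linear address and back. A minimal sketch, with illustrative dimensions:]

```python
# Hypothetical sketch: the same trick that lets physically 3D RAM be
# addressed as 1D. Dimensions (8x8xN) are illustrative.

def flatten(x, y, z, dim_x, dim_y):
    """Row-major 3D coordinate -> 1D address."""
    return x + dim_x * (y + dim_y * z)

def unflatten(addr, dim_x, dim_y):
    """Inverse mapping: 1D address -> 3D coordinate."""
    x = addr % dim_x
    y = (addr // dim_x) % dim_y
    z = addr // (dim_x * dim_y)
    return (x, y, z)

addr = flatten(3, 2, 1, dim_x=8, dim_y=8)  # 3 + 8*(2 + 8*1) = 83
assert unflatten(addr, 8, 8) == (3, 2, 1)
```

The physical dimensionality of the hardware is invisible to the address space, which is the commenter's point: what matters is which nodes can reach which, not how they are arranged in space.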
(Score: 2) by bob_super on Thursday May 09 2019, @12:56AM
Hey! Work with me here!
I wanna see the faces of CEOs when they have to deliver the presentation to the press, and their schmancy multi-million dollar supermachines run a Pac-Man topology.
(Score: 0) by Anonymous Coward on Thursday May 09 2019, @08:10AM
The surface of a torus is 2D.