Google has assembled thousands of Tensor Processing Units (TPUs) into giant programmable supercomputers and made them available on Google Cloud.
To be precise, Google uses a “two-dimensional toroidal mesh” network so that multiple racks of TPUs can be programmed as one colossal AI supercomputer. The company says more than 1,000 TPU chips can be connected by the network.
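To illustrate the topology, here is a minimal sketch of neighbor addressing on a 2D torus, the shape Google describes for linking TPU racks. The grid dimensions and coordinate scheme below are illustrative assumptions, not Google's actual interconnect parameters.

```python
# Hypothetical sketch of a 2D torus: each chip sits at (row, col) and links
# to four neighbors, with edges wrapping around to the opposite side.
# Grid size is an assumption for illustration only.

def torus_neighbors(row, col, rows, cols):
    """Return the four wraparound neighbors of chip (row, col) on a rows x cols torus."""
    return [
        ((row - 1) % rows, col),  # up (wraps to the bottom row)
        ((row + 1) % rows, col),  # down (wraps to the top row)
        (row, (col - 1) % cols),  # left (wraps to the last column)
        (row, (col + 1) % cols),  # right (wraps to the first column)
    ]

# A corner chip still has four neighbors thanks to the wraparound links.
print(torus_neighbors(0, 0, 32, 32))  # → [(31, 0), (1, 0), (0, 31), (0, 1)]
```

The wraparound links are what distinguish a torus from a plain mesh: every chip has the same number of neighbors, which keeps communication latency uniform across the fabric.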
Google claims each TPU v3 pod can deliver more than 100 petaFLOPS of computing power, which would put it among the world’s top five supercomputers in terms of raw mathematical operations per second.
Source: https://techerati.com/news-hub/scalable-ai-supercomputers-now-available-as-a-service-on-google-cloud/