posted by janrinok on Saturday May 11 2024, @01:02AM   Printer-friendly
from the old-space-heater dept.

Someone purchased the eight-year-old Cheyenne supercomputer for $480k. Failing hardware. Leaking water system. What would it be good for? Selling it for parts would flood the market, and testing the parts would take forever. The buyer also has to pay for transport from its current location. Originally built by SGI.

https://gsaauctions.gov/auctions/preview/282996
https://www.popsci.com/technology/for-sale-government-supercomputer-heavily-used/
https://www.tomshardware.com/tech-industry/supercomputers/multi-million-dollar-cheyenne-supercomputer-auction-ends-with-480085-bid

Cheyenne Supercomputer - Water Cooling System

Components of the Cheyenne Supercomputer

Installed Configuration: SGI ICE™ XA.

E-Cells: 14 units weighing 1500 lbs. each.

E-Racks: 28 units, all water-cooled

Nodes: 4,032 dual socket units configured as quad-node blades

Processors: 8,064 units of E5-2697v4 (18-core, 2.3 GHz base frequency, Turbo up to 3.6GHz, 145W TDP)

Total Cores: 145,152

Memory: DDR4-2400 ECC single-rank, 64 GB per node, with 3 High Memory E-Cells having 128GB per node, totaling 313,344 GB

Topology: EDR Enhanced Hypercube

IB Switches: 224 units
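
The listed totals can be cross-checked against the per-unit figures above. This is a quick sketch; the nodes-per-cell figure is derived (4,032 nodes / 14 E-Cells = 288), not stated in the listing.

```python
# Cross-check the spec list's totals from its per-unit figures.
nodes = 4032
cores_per_cpu = 18
cpus = nodes * 2                      # dual-socket nodes -> 8,064 CPUs
total_cores = cpus * cores_per_cpu

e_cells = 14
high_mem_cells = 3                    # 128 GB/node; the rest are 64 GB/node
nodes_per_cell = nodes // e_cells     # 288 (derived, not in the listing)
standard_gb = (e_cells - high_mem_cells) * nodes_per_cell * 64
high_gb = high_mem_cells * nodes_per_cell * 128
total_gb = standard_gb + high_gb

print(total_cores)  # 145152 -- matches the listed Total Cores
print(total_gb)     # 313344 -- matches the listed Memory total
```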

Moving this system necessitates the engagement of a professional moving company. Please note that four (4) attached documents detailing the facility requirements and specifications will be provided. Due to their considerable weight, the racks require experienced movers equipped with proper Personal Protective Equipment (PPE) to ensure safe handling. The purchaser assumes responsibility for transferring the racks from the facility onto trucks using their own equipment.

Please note that fiber optic and CAT5/6 cabling are excluded from the resale package.

The internal DAC cables within each cell, although removed, will be meticulously labeled and packaged in boxes, facilitating potential future reinstallation.

Any ideas (serious or otherwise) of suitable uses for this hardware?


Original Submission

 
This discussion was created by janrinok (52) for logged-in users only, but now has been archived. No new comments can be posted.
  • (Score: 4, Informative) by DrkShadow on Saturday May 11 2024, @05:04AM

    by DrkShadow (1404) on Saturday May 11 2024, @05:04AM (#1356529)

Whoops, I didn't give the industrial discount on the energy. Let's take that at 50%.

    Then, energy for the old is $1.3mm/yr and new is $600k/yr for operating expense.

    So 6 years is break-even point on a new computer + energy vs using the old computer.
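
The commenter's break-even arithmetic can be sketched as follows. The energy figures are from the comment; the new-system cost is NOT stated anywhere above, so the $4.2M here is a hypothetical placeholder chosen only to reproduce the quoted ~6-year figure.

```python
# Hedged sketch of the break-even reasoning in the comment above.
OLD_ENERGY_PER_YR = 1_300_000   # $1.3M/yr running the old Cheyenne (from comment)
NEW_ENERGY_PER_YR = 600_000     # $600k/yr running a new system (from comment)
NEW_SYSTEM_COST = 4_200_000     # hypothetical capex, not stated in the comment

savings_per_yr = OLD_ENERGY_PER_YR - NEW_ENERGY_PER_YR   # $700k/yr
break_even_years = NEW_SYSTEM_COST / savings_per_yr
print(break_even_years)  # 6.0 under the assumed capex
```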

That's not entirely unreasonable. You still have to take into account the node memory errors, water cooling replacement, and cabling(!!?!?), but not entirely unreasonable. Especially if you don't want to have to think about it. I might see AWS doing it for their Tx instances, except for the water cooling problems. I'm unsure what kind of unreliability cloud vendors tolerate.

    --

Interestingly, 4900 memory DIMMs is fewer than the CPU count, and you need at *least* one each. So they almost certainly used 16GB DIMMs and put two per CPU. The new build would have to use 32GB DIMMs and have one per CPU. You could mix things up, get bigger CPUs and fewer of them, but that's out of scope.
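
The DIMMs-per-CPU logic above can be sketched for the 64 GB/node case. The helper function is illustrative only, not from any listing.

```python
# Illustrative sketch of the DIMM-sizing logic in the comment above.
def dimms_per_node(node_gb: int, dimm_gb: int) -> int:
    """How many DIMMs of a given size fill a node's memory capacity."""
    return node_gb // dimm_gb

# 64 GB/node with 16 GB DIMMs -> 4 DIMMs/node, i.e. 2 per CPU on a
# dual-socket node (the original build, per the comment).
print(dimms_per_node(64, 16))  # 4

# 64 GB/node with 32 GB DIMMs -> 2 DIMMs/node, i.e. 1 per CPU
# (the commenter's hypothetical new build).
print(dimms_per_node(64, 32))  # 2
```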
