I found an old memoir by someone who had worked with Richard Feynman back in the 1980s.
That era seems to have presaged a lot of what has since become commercially hot -- highly parallel computers and neural nets.
One day in the spring of 1983, when I was having lunch with Richard Feynman, I mentioned to him that I was planning to start a company to build a parallel computer with a million processors. (I was at the time a graduate student at the MIT Artificial Intelligence Lab). His reaction was unequivocal: "That is positively the dopiest idea I ever heard." For Richard a crazy idea was an opportunity to prove it wrong—or prove it right. Either way, he was interested. By the end of lunch he had agreed to spend the summer working at the company.
In his last years, Feynman helped build an innovative computer. He had great fun with computers. Half the fun was explaining things to anyone who would listen.
I was alive in those days; might I be as old as aristarchus?
-- hendrik
(Score: 2) by FatPhil on Tuesday December 18 2018, @08:59AM (2 children)
This is a tech that gets reinvented every decade or so, as the balance between computation and communication oscillates. It appears that skewed tori, and things that look more like optimal sorting networks and FFT butterflies, are the current topological fashion rather than hypercubes.
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
(Score: 2) by DannyB on Tuesday December 18 2018, @02:21PM (1 child)
It was way back in the day, and my memory could be failing, but I seem to recall that the dimension of the hypercube determined the number of processors, and the number of interconnects per node as a result. I would be happy to be corrected on that point if I misunderstood or misremember.
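That recollection matches how hypercube networks work in general: a d-dimensional hypercube has 2**d nodes, and each node links to the d nodes whose addresses differ in exactly one bit. A minimal sketch (the d = 12 figure for the CM-1's 4096 router nodes is my recollection, not from this thread):

```python
def hypercube_neighbors(node: int, d: int) -> list[int]:
    """Return the d neighbors of `node` in a d-dimensional hypercube.

    Neighbors are the addresses differing from `node` in exactly one bit,
    so each node has d links and the network has 2**d nodes total.
    """
    return [node ^ (1 << bit) for bit in range(d)]

# Node 0 in a 3-cube links to 1, 2, and 4 (one bit flipped each).
print(hypercube_neighbors(0, 3))   # [1, 2, 4]

# The CM-1 reportedly wired its router chips as a 12-cube:
d = 12
print(2 ** d)                      # 4096 nodes, 12 links per node
```

So the dimension fixes both counts at once, which is presumably what made it the knob to turn when sizing the machine.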
The lower I set my standards the more accomplishments I have.
(Score: 2) by FatPhil on Tuesday December 18 2018, @02:47PM
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves