I found an old memoir by someone who had worked with Richard Feynman back in the '80s.
That era presaged a lot of things that have become commercial hot topics today -- highly parallel computers and neural nets.
One day in the spring of 1983, when I was having lunch with Richard Feynman, I mentioned to him that I was planning to start a company to build a parallel computer with a million processors. (I was at the time a graduate student at the MIT Artificial Intelligence Lab). His reaction was unequivocal: "That is positively the dopiest idea I ever heard." For Richard a crazy idea was an opportunity to prove it wrong—or prove it right. Either way, he was interested. By the end of lunch he had agreed to spend the summer working at the company.
In his last years, Feynman helped build an innovative computer. He had great fun with computers. Half the fun was explaining things to anyone who would listen.
I was alive in those days; might I be as old as aristarchus?
-- hendrik
(Score: 3, Interesting) by fyngyrz on Monday December 17 2018, @04:29PM
Some problems are well suited to parallel approaches. I use them constantly in both image processing and RF signal processing, for instance. Some layered neural network architectures are also highly amenable to parallel processing on a per-layer basis.
It's like anything else, really. For problem type A, your best bet is one approach; for dissimilar problem B, your best bet is a different one. There's no universal magic bullet that is "best" for all problem types. But there are relatively optimal paths for specific problems, and some of those turn out to be parallel processing. Such things definitely aren't limited to "academia." You do need a good grasp of the kind of problem you're trying to solve if you are to use parallel techniques effectively.
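The per-tile independence that makes image processing a good fit for parallelism can be sketched in a few lines. This is a minimal illustration, not the commenter's actual code: the tile size, the brightness-scaling operation, and the function names are all hypothetical, chosen only to show how embarrassingly parallel work divides cleanly across workers.

```python
from multiprocessing import Pool

def adjust_tile(tile):
    # Hypothetical per-tile operation: scale brightness by 1.2x,
    # clamping to the 8-bit range. Each tile is independent, so
    # tiles can be processed on separate cores with no coordination.
    return [min(255, int(p * 1.2)) for p in tile]

def parallel_adjust(pixels, tile_size=4, workers=4):
    # Split the flat pixel list into independent tiles...
    tiles = [pixels[i:i + tile_size] for i in range(0, len(pixels), tile_size)]
    # ...farm each tile out to a worker process...
    with Pool(workers) as pool:
        processed = pool.map(adjust_tile, tiles)
    # ...and stitch the results back together in order.
    return [p for tile in processed for p in tile]

if __name__ == "__main__":
    image = [10, 100, 200, 250, 40, 80, 120, 160]
    print(parallel_adjust(image))  # [12, 120, 240, 255, 48, 96, 144, 192]
```

The same shape covers the per-layer neural-network case mentioned above: within one layer, each unit's output depends only on the previous layer, so the units of a layer are "tiles" that can be computed concurrently.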
--
Every once in a while declare peace. It confuses your enemies.