Electrical engineers at the Technische Universität München (TUM) have demonstrated a new kind of building block for digital integrated circuits. Their experiments show that future computer chips could be based on three-dimensional arrangements of nanometer-scale magnets instead of transistors. As the main enabling technology of the semiconductor industry — CMOS fabrication of silicon chips — approaches fundamental limits, the TUM researchers and collaborators at the University of Notre Dame are exploring "magnetic computing" as an alternative. They report their latest results in the journal Nanotechnology.
Technische Universität München's press release here.
Many have said that Moore's Law will eventually run up against fundamental laws of physics, since a working transistor needs some minimum number of atoms. Allowing chip layouts to extend into the third dimension would mitigate that, so we may continue to get more powerful computers for some time to come.
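To put a rough number on that atomic limit, here's a back-of-envelope sketch (my own illustration, not from the article): silicon's diamond-cubic lattice has a lattice constant of about 0.543 nm with 8 atoms per conventional unit cell, so you can estimate how many atoms fit in a cube at a given feature size.

```python
# Back-of-envelope: how many silicon atoms fit in a cube of a given
# side length? Diamond-cubic silicon: lattice constant ~0.543 nm,
# 8 atoms per conventional unit cell.
LATTICE_NM = 0.543
ATOMS_PER_CELL = 8

def atoms_in_cube(side_nm: float) -> int:
    """Approximate number of Si atoms in a cube of the given side length (nm)."""
    cells = (side_nm / LATTICE_NM) ** 3
    return round(cells * ATOMS_PER_CELL)

for side in (22, 14, 5):
    print(f"{side} nm cube: ~{atoms_in_cube(side):,} atoms")
```

At a few nanometers you're down to thousands of atoms, and quantum effects like tunneling start to dominate long before you hit one atom.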
(Score: 3, Interesting) by Dunbal on Friday October 03 2014, @02:30PM
The big problem with 3D is that you suddenly have heat problems you never would have had in 2D.
(Score: 2, Informative) by Anonymous Coward on Friday October 03 2014, @03:12PM
A 3D lattice with cooling channels is still better connected than a 2D surface with cooling planes.
The trick is moving heat through smaller channels... all part of the development.
A better trick would be superconductivity, which would eliminate the heat problem entirely.
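The surface-to-volume argument behind those comments can be sketched numerically: scale a die up by a factor n in each dimension and the heat it generates grows like n^3 (volume), while the outer surface available to remove it grows like n^2. The gap is what internal cooling channels have to carry. A minimal illustrative sketch, with arbitrary unit power density and flux limit (my assumptions, not a real thermal model):

```python
# Idealized scaling: power generated grows with volume (~n^3), while
# heat removable through the outer surface grows with area (~n^2).
# The shortfall is what internal cooling channels must carry.

def heat_budget(n: float, power_density: float = 1.0, flux_limit: float = 1.0):
    """Return (generated, surface_removable) for a unit cube scaled by n."""
    generated = power_density * n ** 3       # volumetric heat, ~n^3
    removable = flux_limit * 6 * n ** 2      # six faces, ~n^2
    return generated, removable

for n in (1, 10, 100):
    gen, rem = heat_budget(n)
    print(f"scale {n}: generated {gen:g}, surface-removable {rem:g}")
```

At small scales the surface wins; past the crossover, cooling has to come from inside the lattice, which is exactly the "cooling channels" point above.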
(Score: 3, Interesting) by MichaelDavidCrawford on Friday October 03 2014, @04:25PM
I own an early-2012 MacBook Pro (Model Identifier MacBookPro10,1) with a quad-core i7 and 8 GB of RAM. I use it for email, web browsing, listening to music, and editing source files for my iPhone app.
It's rather like using a B-52 for a fucking cropduster. :-/
I also own a quad-core Xeon workstation I built myself. The motherboard is socketed for two CPUs and 64 GB of FB-DIMM memory, but despite putting some heavy loads on it I was never able to make it page to disk.
Someone lifted my iPhone 4, so I don't have it anymore, but while I did, it was a far faster computer, with more memory and storage, than the Power Mac 8500 I used to support myself as a software consultant for six solid years. I wrote a lot of code on that 8500 and shipped a lot of products, yet my damn phone was far more capable in just about every respect.
So what did I do with my iPhone? I made voice calls, checked the transit schedules, listened to music, read my email and tested my iPhone app.
I don't doubt that we're going to make far more capable computers someday, but we're going to run out of the ability to sell them to the public.
Yes I Have No Bananas. [gofundme.com]
(Score: 2) by takyon on Friday October 03 2014, @06:34PM
We need faster computers and we need chips with more FLOPS/W. The benefits are immediately felt by high-performance [hpcwire.com] and cloud computing, which will attempt to use all available resources [hpcwire.com]. Inefficiencies will be tackled as millions of dollars are on the line (as opposed to giving consumer devices more performance than is ever needed).
Consumer devices will advance to handle 4K graphics and become more responsive, even in the smallest device sizes (already at 1440-1600p), while consuming less energy. Gaming can always find a use for the capabilities of better GPUs, while scaling to support older GPUs. VR could benefit from much better graphics cards to run at a high resolution (eventually 4K or more per eye) and a high framerate (120 Hz or more). While smartwatches probably won't need big advances in performance, Google Glass and other wearables that aspire to deliver augmented reality could use faster processors.
There will be a need for small, low power, and disposable computers in the form factor of nanobots and other medical devices.
Desktops can and should be replaced by laptops or tablets with keyboards if an increase in desktop computing power would go completely unused. That's probably not the case for desktop gamers, though most PC gamers can get by on older or less powerful cards, and Nvidia Maxwell could start a trend in lowering excessive power consumption. Dockable smartphones seem like a neat idea (to replace desktops). Too bad Ubuntu Edge didn't opt for flex funding.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 0) by Anonymous Coward on Saturday October 04 2014, @07:21PM
> It's rather like using a B-52 for a fucking cropduster. :-/
Didn't the USA ACTUALLY do this during the Vietnam War with B-52s spewing Agent Orange over the jungles there to defoliate them to uncover enemy forces?
That isn't much of a stretch, as they were also used for carpet bombing--essentially 'dusting crops' with HIGH EXPLOSIVE BOMBS!!!