Micron shared details of its 3rd generation of "10 nm-class" DRAM fabrication:
Micron's 3rd Generation 10 nm-class (1Z nm) manufacturing process for DRAM will allow the company to increase the bit density, enhance the performance, and lower the power consumption of its DRAM chips as compared to its 2nd Generation 10 nm-class (1Y nm) technology. In particular, the company says that its 16 Gb DDR4 device consumes 40% less power than two 8 Gb DDR4 DRAMs (presumably at the same clocks). Meanwhile, Micron's 16 Gb LPDDR4X ICs will bring up to a 10% power saving. Because of the higher bit density that the new 1Z nm technology provides, it will be cheaper for Micron to produce high-capacity (e.g., 16 Gb) memory chips for lower-cost, high-capacity memory sub-systems.
[...] As for mobile memory, Micron's 16 Gb LPDDR4X chips are rated for transfer rates up to 4266 MT/s. Furthermore, along with offering LPDDR4X DRAM packages with up to 16 GB (8x16Gb) of LPDDR4X for high-end smartphones, Micron will offer UFS-based multichip packages (uMCP4) that integrate NAND for storage and DRAM. The company's uMCP4 family of products aimed at mainstream handsets will include offerings ranging from 64GB+3GB to 256GB+8GB (NAND+DRAM).
Finally, a reasonable amount of RAM for smartphones. But I think we may need at least 24 GB, if not 32 GB.
Related: Xiaomi Announces Smartphones with 10 GB of RAM
Samsung Mass Producing LPDDR5 DRAM (12 Gb x 8 for 12 GB Packages)
Get Ready for Smartphones with 16 GB of RAM
Ok, I'll get my sledge hammer ready!
Not that a single consumertard would even *know* what to do with more RAM. Bigger numbers? Time to throw everything away again and buy all new things!
Yea, yea, yea, now they can run one more Java based app. :P So wasteful.
More RAM is just another excuse to continue to use garbage-collected languages (which invariably have poor garbage collection).
You know, if Java had a good garbage collector, it would have been imploded long ago...
More data means more space for spyware and more space for authorities to browse.
640 GB ought to be a good starting point for anybody.
640 GB of ram oughta be enough for everyone!
I seriously wonder now, at what point in history there was literally 640 GB of RAM for everybody, i.e., that much RAM on the entire planet. I'm thinking some time in the 1960s, but I really don't know...
I wasn't around back in the olden days of computing, but my guess would be mid-1970s. The standard form of RAM from the mid-50s through mid-70s was magnetic core RAM [wikipedia.org] (or "core"), which basically needed to be manufactured by hand. Thus, the cost even into the early 1960s was something like $1 per bit. When semiconductor RAM and DRAM were introduced by the early 1970s, their price point had come down to around 1 cent per bit, which allowed them to take over the market.
As noted in the linked article, by 1970, IBM was producing about 20 billion cores per year, or 20 billion BITS of RAM, so I don't think we were anywhere near 640 billion BYTES of RAM by that time. (Also, just thinking about the manufacturing cost -- even though core RAM was down to 1 cent per bit or less by 1970, 640 GB of RAM would have required a worldwide investment of more than $50 billion even at the low 1970 prices. Assuming that manufacturing was spread across the 1960s, it would have likely cost at least $500 billion on RAM alone. For comparison, the entire Apollo space program cost the U.S. about $25 billion. So I don't think it's feasible that the world collectively got anywhere near 640 GB of RAM in the 1960s.)
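A quick back-of-the-envelope check of the figures above (rough, illustrative numbers only, using the decimal GB and the ~1 cent/bit and ~20 billion cores/year estimates from the comment):

```python
# Rough sanity check of the 1970 core-RAM cost estimate above.
bits_in_640GB = 640 * 10**9 * 8        # 640 GB (decimal) expressed in bits
cost_per_bit = 0.01                    # ~1 cent/bit for core circa 1970
print(f"Cost: ${bits_in_640GB * cost_per_bit / 1e9:.0f} billion")   # ~$51 billion

# IBM's ~20 billion cores/year in 1970 is 20 Gbit/year of output:
print(f"Years of 1970-level IBM output: {bits_in_640GB / 20e9:.0f}")  # 256 years
```

Both numbers back up the comment: even at the cheapest 1970 prices the bill lands above $50 billion, and at 1970 production rates it would have taken centuries of output to reach 640 GB.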
By the mid-1970s, with semiconductor RAM becoming dominant, chip sizes rapidly shifted upwards from 1 KB initially to 32 KB, and of course then higher. Mass-produced personal computers also started entering the market by the mid-late 1970s. So, I'd guess it's probably more likely we passed the 640 GB threshold in the mid-1970s. Definitely before 1980.
I had no idea we relied on core for that long. Googling around, I found that semiconductor RAM existed in the mid-60s, but it looks to have been expensive, with not much capacity being produced. I think you're right.
I bought a "PC" in 1982 for $700. It came with 16K of RAM and I eventually expanded it to 48K for an additional $300. 640GB is 48K x ~13 million. I'd guess world capacity probably hit 640GB in the mid to late 1970s - personal computers didn't really take off until the 1980s, but there would have been millions of business computers in use by 1979.
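A quick check of that "48K x ~13 million" multiplier (the estimate above apparently mixes a decimal 640 GB with a binary 48 KB, which is where ~13 million comes from):

```python
# 640 GB (decimal) divided by 48 KB (binary), as the estimate above implies
ratio = 640e9 / (48 * 1024)
print(f"{ratio / 1e6:.1f} million")   # ~13.0 million
```

Using a binary 640 GiB instead would give roughly 14 million, so the comment's figure holds either way.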
I've tried to get enthused about making a little touch-screen cased Ras-Pi4, but... it's just not worth the effort when you can buy a complete smartphone with stupendous specs for ~$150.
At this rate, a phone can be a desktop replacement - Chromecast or WiDi or whatever your flavor to the monitor, Bluetooth to the HIDs, and the same mobile device when you're not sitting at a real screen with a real keyboard and mouse.
When I get some time, I'm thinking a good project would be a "LoJack" app for a smartphone that's wired into whatever vehicle you want to monitor - GoogleFi data SIM for real-time video/gps tracking, accelerometer/gyros to detect unexpected movement, bluetooth detection of your phone in your pocket to disarm. If you want to get clever with the finance side you can probably get a good enough new phone for under $80 to run the app on, with no monthly fee and real-time reassurance that your tracked asset (car, boat, bike) is safely not being hooned by crackheads.
I've got the FLIRC case on Pi4. Now it runs at below 52°C most of the time.
Pi4 is OK for now, but I predict that the 3DSoC [soylentnews.org] concept will make its way into ARM chips and Broadcom within the next 10 years. And then we could see early 2020s HEDT performance in an ARM chip with less power consumption than Pi4. Meaning that systems that are OK like Pi4 could get turbocharged into "faster than you know what to do with" territory. Not sure how GPUs will be affected, but the industry should eventually shoot for 1 petaflops [reddit.com] in the mobile SoC form factor for standalone VR headsets (16K res, high framerate, raytracing, etc.). The rumor mill has it that ARM will make a major GPU announcement next year, and Samsung is licensing Radeon GPU technology from AMD for mobile chips [soylentnews.org], so there's going to be more focus on ARM graphics going forward.
Comparing the Raspberry Pi to phones for desktop use, the 4 GB Pi4 could run you about $80 once you include a micro-HDMI cable, power supply, a 32 GB microSD card, and sales tax; $95 if you throw in a FLIRC case at the non-discount price. Plus shipping for the Pi or the case. From this article [gottabemobile.com], it seems like phones in that price range are packing 2-3 GB of RAM. Some of them have octa-core processors that may be faster if they are on a better node or a newer ARM core design. IMO, it's at least in the same ballpark spec-wise, but the Pi gives you 4 USB ports, Ethernet, and dual displays. These are going to make it damn useful as a desktop. You could add a dock to the phone, but that's extra cost if you're comparing. I'm thinking of getting an RTL-SDR [rtl-sdr.com] to occupy one of the Pi4's USB3 ports, but I hear it doesn't work properly with the Pi4 yet. The other three ports are good for storage, keyboard, and mouse.
I agree that docking is where we are going in the future. Smartphones that everyone carries constantly will pack a powerful 3DSoC and act as your desktop computer when docked, with cloud or local backups for your data in case you throw the thing in a ditch and lose your "desktop" computer. They will become so utterly powerful that phones will take over all current desktop roles, and more than 16 GB of RAM may be desirable (all of which may need to be integrated directly with the 3DSoC). For the remaining desktop enthusiasts, new applications will have to be created in order to exploit/waste all the performance a desktop 3DSoC would have.
What's Z as in 1Z nm? And how does it compare with 1Y nm? Both seem to be 10 nm.
And what is MT/s? Megatera per second?
The 1X/1Y/1Z thing is some dumb marketing thing that lets them conceal the actual node they are making the memory on.
As you go from 1X to 1Z, the node shrinks. So maybe it was "16nm" down to "10nm" or something.
MT/s = megatransfers per second, which is the technically correct way to refer to RAM speeds.
Does more phone or laptop RAM require much more power?
It will be powered by mitochondria, like in Star Wars.
But only if you charge the phone with a hand crank / bicycle generator
That would be midichlorians. Midi-chlorians were the microscopic lifeforms responsible for the Force; mitochondria are real-world organelles.
Related: https://starwars.fandom.com/wiki/Midi-chlorian [fandom.com]
In theory the larger the die size of the DRAM, the more power it consumes. The current Micron 16Gb chips are based on 1Ynm, and the 1Znm chips should be smaller and consume less power.
Wonder what the impact on battery performance will be. Maybe it’s still dominated by the display.
It is certainly dominated by display. And newer gens of LPDDR [wikipedia.org] tend to have lower voltage and power consumption. I am surprised they are talking about a 16 GB LPDDR4X package instead of LPDDR5 though.
It's also conceivable that smart use of the chunky RAM could prevent app reloads, browser tab reloads, unnecessary network transfers, etc. which could impact the battery.
1000, or more precisely 2^10 (1024), times as much as "The first commercial SDRAM chip was the Samsung KM48SL2000, which had a capacity of 16 Mb" from 1992 [wikipedia.org].
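For what it's worth, the multiplier works out exactly when both prefixes are binary:

```python
# 16 Gb (today's Micron chip) vs. 16 Mb (1992's Samsung KM48SL2000)
gain = (16 * 2**30) / (16 * 2**20)
print(gain)   # 1024.0, i.e. 2**10
```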
At the bottom of this press release [samsung.com] is a nice timeline of the development of Samsung's mobile DRAM, from 256 MB in 2009 to 12 GB in 2019 (looks like 16 GB next year from Micron and/or Samsung). Speed went from 400 Mbps MDDR to 5500-6400 Mbps LPDDR5. The original iPhone had 128 MB of DRAM.