JEDEC has announced that it expects to finalize the DDR5 standard by next year. It says that DDR5 will double bandwidth and density, and increase power efficiency, presumably by lowering the operating voltages again (perhaps to 1.1 V). Availability of DDR5 modules is expected by 2020:
You may have just upgraded your computer to use DDR4 recently or you may still be using DDR3, but in either case, nothing stays new forever. JEDEC, the organization in charge of defining new standards for computer memory, says that it will be demoing the next-generation DDR5 standard in June of this year and finalizing the standard sometime in 2018. DDR5 promises double the memory bandwidth and density of DDR4, and JEDEC says it will also be more power-efficient, though the organization didn't release any specific numbers or targets.
The DDR4 SDRAM specification was finalized in 2012, and DDR3 in 2007, so DDR5's arrival is to be expected (cue the Soylentils still using DDR2). One way to double the memory bandwidth of DDR5 is to double the DRAM prefetch to 16n, matching GDDR5X.
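The prefetch arithmetic is easy to sketch. A back-of-the-envelope calculation (illustrative numbers of my own, not from any JEDEC document) shows why doubling the prefetch depth doubles peak bandwidth if the DRAM core clock stays put:

```python
# Rough sketch of why doubling the prefetch doubles peak bandwidth, assuming
# the internal DRAM array clock stays the same. Numbers are illustrative,
# not taken from the JEDEC spec.

CORE_CLOCK_MHZ = 200   # typical internal DRAM array clock
BUS_WIDTH_BITS = 64    # standard DIMM data bus

def peak_bandwidth_gbs(prefetch_n, core_clock_mhz=CORE_CLOCK_MHZ):
    """Peak transfer rate in GB/s for a 64-bit DIMM at a given prefetch depth."""
    transfers_per_sec = core_clock_mhz * 1e6 * prefetch_n  # MT/s on the bus
    return transfers_per_sec * (BUS_WIDTH_BITS / 8) / 1e9

print(peak_bandwidth_gbs(8))   # 8n prefetch (DDR3/DDR4-style): 12.8 GB/s
print(peak_bandwidth_gbs(16))  # 16n prefetch (GDDR5X-style): 25.6 GB/s
```

With an 8n prefetch a 200 MHz core yields 1600 MT/s (DDR3-1600 territory); 16n gets you 3200 MT/s on the same core clock.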
Graphics cards are beginning to ship with GDDR5X. Some graphics cards and Knights Landing Xeon Phi chips include High Bandwidth Memory (HBM). A third generation of HBM will offer increased memory bandwidth, density, and more than 8 dies in a stack. Samsung has also talked about a cheaper version of HBM for consumers with a lower total bandwidth. SPARC64 XIfx chips include Hybrid Memory Cube. GDDR6 SDRAM could raise per-pin bandwidth to 14 Gbps, from the 10-14 Gbps of GDDR5X, while lowering power consumption.
I still use DDR2.
I have some PC-100 collecting warm dust. (But yes, using DDR2 to post this.)
The big thing to look for is whether RGBU (red-green-blue-ultraviolet) makes it into the new standard. Having to choose between RGB and ad hoc RGBU is no choice at all. If that situation persists in DDR5 it will mean a lost generation for computing.
The RGBU LEDs should be able to pulse fast enough to communicate across the room (with the CIA). Also, don't forget the Twitch integration.
Just as they do not care about the latest CPUs. The simple fact is that once we switched from the MHz wars to the core wars, PCs quickly became so powerful that most users aren't even using half of what they have, so they see no reason to replace it. Which is why guys like me moved into other jobs like HTPCs and home theater installs, as PC sales? Have slowed to practically a crawl.
I have to say it ended up affecting me as well. I love gaming, and during the MHz wars I built a new PC every other year (with a CPU or GPU upgrade in the off year). And now? My 4-year-old octocore with 16gb of RAM spends more time twiddling its electron thumbs than I can come up with work for it, and that is with me recording gameplay, editing videos, and even doing multitrack audio DSP renders. My previous PC is a Phenom II X4, and despite its age it still purrs like a kitten and plays the wife's World Of Warships at 60 fps.
And I'm what would be considered a "hardcore" user. For regular users that are just doing normal office tasks and running basic programs? Well, I have many customers that are still quite happy with their C2D and Athlon X2 laptops despite the age, and I can see why, as my 6-year-old AMD netbook is perfect for service calls, looking up parts, etc. Users simply cannot come up with enough work to make their DDR3 or even DDR2 systems obsolete, so I have a feeling DDR5 will take a while before the majority are using it.
This certainly matters since it will have effects on enterprise.
Personally, I think capacity/density matters more than speed for the home user. And the home user can see clear benefits from having lots of RAM. I'm using a 2 GB machine right now and wishing it had 4 or 8. If you get up to 32 GB or more, you can do more stuff, handle the crap Web 4.0 throws at you, and if you have way too much you could make a ramdrive. That's not apparent to the home user, but maybe operating systems could create ramdisks automatically out of spare RAM.
Home users will eventually get DDR5, just as they are starting to get DDR4 in their newly bought systems. That's unless another technology like HBM overtakes DRAM DIMMs. I don't think that will happen.
JEDEC has not given us enough info to let us know why density is doubling. In fact, it doesn't make much sense at all. Samsung, Micron, and the others should be handling the density, while JEDEC specifies the speeds/timing, right? Maybe JEDEC is going to add support for 2-layer 3D stacking to DDR5, like with 3D/vertical NAND and High Bandwidth Memory (another JEDEC standard, and each version specifies the maximum height of stacks). They could do this while keeping the DIMM memory module form factor intact.
Or maybe the DDR5 standard will just double the maximum capacity per module, and it is just bad reporting and ambiguous language in the JEDEC release.
I think "density" when it comes to memory modules refers to how much capacity is allowed in each module.
DDR - 1 GB
DDR2 - 2 GB
Not sure of the limits of the newer ones.
Seems to be 16 GB for DDR3 [wikipedia.org] and 64 GB for DDR4 [wikipedia.org]... but I think manufacturers have exceeded those limits for both DDR3 and DDR4 [thememoryguy.com]. I guess you could call that non-JEDEC standard.
in other words, the number of address lines present in the DIMM slot
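As a toy illustration of that point (my simplification; real DIMMs multiplex row/column addresses and add bank and rank bits, so this is only the principle, not actual DDR3/DDR4 addressing), the address-bit budget sets a capacity ceiling:

```python
# Toy model: module capacity ceiling implied by the number of address bits,
# assuming each address selects one 64-bit (8-byte) wide location.
# Real DDR addressing is multiplexed and banked; this is just a sketch.
def max_capacity_gib(address_bits, bytes_per_location=8):
    return (2 ** address_bits) * bytes_per_location // 2**30

print(max_capacity_gib(31))  # 2^31 locations x 8 B = 16 GiB (DDR3-like limit)
print(max_capacity_gib(33))  # 2^33 locations x 8 B = 64 GiB (DDR4-like limit)
```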
Except machines for the better part of a decade can hold 8Gb of RAM, and for most users? That is frankly overkill. Hell, the Q6600 I use for the main shop PC has 8Gb of RAM, and that unit was literally a throwaway from the local cable office because the GPU went out. 2Gb DDR 2 and 4gb DDR 3 sticks are dirt cheap, even 8gb DDR 3 chips are only $50 a stick, so maxxing out the RAM in an older system? Really not expensive.

Sure, Enterprise can use it, I never said there was NOBODY that would use it. You can sell the Enterprise 64-core chips that cost a couple of grand and $15k 4Tb SSDs and they'll snatch them up and ask for more, because all they care about is IOPS, and when they are handling millions of transactions a day? Throwing 20k at a box is really no big deal.
But that has nothing to do with the mainstream, and let's face it, it's the mainstream that companies want. The Enterprise market has been tightening its belt for years; you just don't see the mega corps throwing money away on IT like you did back in the early 00s. Now it's all about doing more with less, offshoring, and virtualization that lets them do the job of what would have been a dozen new units on a single box, so enterprise sales alone? Not gonna drive the industry. Why do you think GPUs have been taking huge leaps in design and CPUs have not? Or why the focus of the GPU industry is NOT on the top-of-the-line units but the crucial $100-$250 market? Because THAT is where the mainstream customers are, and while they haven't been replacing their CPUs, they have been swapping GPUs to play that hot new game!
So I stand by my statement: we will be looking at several years before DDR5 becomes the RAM on the majority of systems, and I bet DDR4 will simply go nowhere. Like GDDR4, many will end up skipping it completely, waiting until their DDR2 and DDR3 systems die and then going straight to DDR5. I see it out in the field all the time: systems being brought in to clean with 8gb of RAM and quad cores, and the users are in no hurry to get new hardware, especially after I show them how fast an SSD OS drive makes even a C2Q feel. They simply see no point in shelling out several hundred on a new system when the one they have does everything they ask of it.
Please use GB and Gb correctly. 8 Gb = 1 GB.
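For anyone keeping score, the convention is lowercase b for bits, uppercase B for bytes, 8 bits per byte:

```python
# Bits vs. bytes: lowercase b = bits, uppercase B = bytes, 8 bits per byte.
def gigabits_to_gigabytes(gigabits):
    return gigabits / 8

print(gigabits_to_gigabytes(8))  # 8 Gb = 1.0 GB
```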
I could definitely use more RAM, and as I said, there are options for when you have "too much". Even cheaper laptops are coming with 12+ GB of RAM (here's 12 GB at $330 [slickdeals.net], and this refurb has 16 GB and high specs for $700 [slickdeals.net]). Although the HDD-to-SSD transition is going to be more important for most users.
I will note that the cutting edge DDR4... just isn't expensive. The DRAM market has had oversupply for some time due to the decline in the PC market. Obviously, getting a new motherboard or processor is much more expensive, but if you happen to have done that, switching to DDR4 is not hard on the wallet. Some new desktops or laptops are in the $300 range and come with DDR4.
Some [anandtech.com] are predicting that memory modules will be replaced by HBM on package. Although HBM is currently more expensive, the smaller profile is well-suited for Ultrabooks or Chromebooks, even if it is not user-replaceable.
Moving up to those newer processors saddles you with a ton of DRM. Not a huge deal for regular users, but if you don't REALLY REALLY need the extra CPU horsepower or the power savings, why not stick with your older hardware, which you are PRETTY SURE can't be remotely accessed without OS-level exploits, rather than the new ones that might very well have those exploits baked into the firmware?
If they increase the minimum DRAM prefetch to 16n, then there will be a lot of transfer waits before you can reach a specific position. This will most likely wreck memory accesses that are truly random (like synthesis of configurable gate arrays). Higher speeds at lower voltage will also shrink the signal-to-noise margin.
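A back-of-the-envelope sketch (my own numbers, not JEDEC's) of that cost: each access drags in a full minimum burst, but a truly random reader may only want a single 8-byte word of it.

```python
# How much of each minimum burst is useful to a random reader wanting one
# 8-byte word, at different prefetch depths. Assumes a 64-bit DIMM bus;
# illustrative only, ignores banking and interleaving tricks.

BUS_WIDTH_BYTES = 8  # 64-bit DIMM

def useful_fraction(prefetch_n, bytes_wanted=8):
    burst_bytes = prefetch_n * BUS_WIDTH_BYTES  # minimum transfer per access
    return min(bytes_wanted / burst_bytes, 1.0)

for n in (4, 8, 16):  # DDR2-, DDR3/DDR4-, and proposed DDR5-style prefetch
    print(f"{n}n prefetch: burst = {n * BUS_WIDTH_BYTES} B, "
          f"useful fraction for one word = {useful_fraction(n):.1%}")
```

At 16n the minimum burst is 128 bytes, so a random 8-byte read wastes about 94% of the transfer.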
Modern "DRAM" is more like a post office where you send your requests and hopefully get a response back when you need them.