Get Ready for Smartphones with 16 GB of RAM

posted by martyb on Sunday August 18 2019, @12:54PM
from the my-first-hard-disk-held-40-MB dept.

Micron shared details of its 3rd generation of "10 nm-class" DRAM fabrication:

Micron's 3rd Generation 10 nm-class (1Z nm) manufacturing process for DRAM will allow the company to increase the bit density, enhance the performance, and lower the power consumption of its DRAM chips as compared to its 2nd Generation 10 nm-class (1Y nm) technology. In particular, the company says that its 16 Gb DDR4 device consumes 40% less power than two 8 Gb DDR4 DRAMs (presumably at the same clocks). Meanwhile, Micron's 16 Gb LPDDR4X ICs will bring power savings of up to 10%. Because of the higher bit density that the new 1Z nm technology provides, it will be cheaper for Micron to produce high-capacity (e.g., 16 Gb) memory chips for lower-cost, high-capacity memory sub-systems.

[...] As for mobile memory, Micron's 16 Gb LPDDR4X chips are rated for transfer rates up to 4266 MT/s. Furthermore, along with offering LPDDR4X DRAM packages with up to 16 GB (8x16Gb) of LPDDR4X for high-end smartphones, Micron will offer UFS-based multichip packages (uMCP4) that integrate NAND for storage and DRAM. The company's uMCP4 family of products aimed at mainstream handsets will include offerings ranging from 64GB+3GB to 256GB+8GB (NAND+DRAM).
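
As a quick sanity check of the capacity figures quoted above, here is a minimal sketch of the gigabit-to-gigabyte arithmetic; the only assumption beyond the quoted numbers is the 8-bits-per-byte conversion:

    # Sanity check: DRAM dies are rated in gigabits (Gb), packages in gigabytes (GB).
    GBITS_PER_GBYTE = 8

    def package_capacity_gb(die_density_gbit, dies):
        """Capacity in GB of a package built from `dies` dies of `die_density_gbit` Gb each."""
        return die_density_gbit * dies / GBITS_PER_GBYTE

    print(package_capacity_gb(16, 8))                               # 16.0 GB LPDDR4X package (8 x 16 Gb)
    print(package_capacity_gb(16, 1) == package_capacity_gb(8, 2))  # True: one 16 Gb die matches two 8 Gb dies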

Finally, a reasonable amount of RAM for smartphones. But I think we may need at least 24 GB, if not 32 GB.

Related: Xiaomi Announces Smartphones with 10 GB of RAM
Samsung Mass Producing LPDDR5 DRAM (12 Gb x 8 for 12 GB Packages)


Original Submission

 
  • (Score: 2) by SomeGuy on Sunday August 18 2019, @01:44PM (11 children)

    by SomeGuy (5632) on Sunday August 18 2019, @01:44PM (#881735)

    Get Ready for Smartphones with 16 GB of RAM

    Ok, I'll get my sledge hammer ready!

    Not that a single consumertard would even *know* what to do with more RAM. Bigger numbers? Time to throw everything away again and buy all new things!

    Yea, yea, yea, now they can run one more Java-based app. :P So wasteful.

  • (Score: 4, Touché) by idiot_king on Sunday August 18 2019, @02:03PM (1 child)

    by idiot_king (6587) on Sunday August 18 2019, @02:03PM (#881740)

    More RAM is just another excuse to continue to use garbage-collected languages (which invariably have poor garbage collection).

    • (Score: 2, Funny) by Anonymous Coward on Sunday August 18 2019, @04:14PM

      by Anonymous Coward on Sunday August 18 2019, @04:14PM (#881782)

      You know, if Java had a good garbage collector, it would have imploded long ago...

  • (Score: 4, Insightful) by hemocyanin on Sunday August 18 2019, @02:50PM

    by hemocyanin (186) on Sunday August 18 2019, @02:50PM (#881750) Journal

    More data means more space for spyware and more space for authorities to browse.

  • (Score: 3, Funny) by takyon on Sunday August 18 2019, @04:01PM (5 children)

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Sunday August 18 2019, @04:01PM (#881773) Journal

    640 GB ought to be a good starting point for anybody.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 0) by Anonymous Coward on Sunday August 18 2019, @04:51PM (4 children)

      by Anonymous Coward on Sunday August 18 2019, @04:51PM (#881800)

      640 GB of RAM oughta be enough for everyone!

      • (Score: 3, Interesting) by istartedi on Sunday August 18 2019, @05:44PM (3 children)

        by istartedi (123) on Sunday August 18 2019, @05:44PM (#881816) Journal

        I seriously wonder now at what point in history there was literally 640 GB of RAM for everybody, i.e., that much RAM on the entire planet. I'm thinking some time in the 1960s, but I really don't know...

        • (Score: 4, Informative) by AthanasiusKircher on Sunday August 18 2019, @06:53PM (2 children)

          by AthanasiusKircher (5291) on Sunday August 18 2019, @06:53PM (#881830) Journal

          I wasn't around back in the olden days of computing, but my guess would be mid-1970s. The standard form of RAM from the mid-50s through mid-70s was magnetic core RAM [wikipedia.org] (or "core"), which basically needed to be manufactured by hand. Thus, the cost even into the early 1960s was something like $1 per bit. When semiconductor RAM and DRAM were introduced by the early 1970s, their price point had come down to around 1 cent per bit, which allowed them to take over the market.

          As noted in the linked article, by 1970 IBM was producing about 20 billion cores per year, or 20 billion BITS of RAM, so I don't think we were anywhere near 640 billion BYTES of RAM by that time. (Also, just thinking about the manufacturing cost -- even though core RAM was down to 1 cent per bit or less by 1970, 640 GB of RAM would have required a worldwide investment of more than $50 billion even at those low 1970 prices. Assuming that capacity were manufactured across the 1960s instead, it would likely have cost at least $500 billion on RAM alone. For comparison, the entire Apollo space program cost the U.S. about $25 billion. So I don't think it's feasible that the globe collectively got anywhere near 640 GB of RAM worldwide in the 1960s.)
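
          Plugging the figures from that paragraph into a quick back-of-the-envelope sketch (1 cent per bit, IBM's roughly 20 billion cores per year; decimal gigabytes, which is close enough at this precision):

              TARGET_BITS = 640 * 10**9 * 8              # 640 GB expressed in bits (~5.1 trillion bits)
              COST_PER_BIT_1970 = 0.01                   # roughly 1 cent per bit for core memory by 1970
              IBM_CORES_PER_YEAR = 20 * 10**9            # roughly 20 billion cores (bits) per year

              print(TARGET_BITS * COST_PER_BIT_1970 / 10**9)  # ~51.2, i.e. "more than $50 billion"
              print(TARGET_BITS / IBM_CORES_PER_YEAR)         # ~256 years of 1970-level IBM core output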

          By the mid-1970s, with semiconductor RAM becoming dominant, chip capacities rapidly shifted upward from 1 Kbit initially through 4 Kbit and 16 Kbit, and of course then higher. Mass-produced personal computers also started entering the market by the mid-to-late 1970s. So, I'd guess it's more likely we passed the 640 GB threshold in the mid-1970s. Definitely before 1980.

          • (Score: 2) by istartedi on Sunday August 18 2019, @10:06PM

            by istartedi (123) on Sunday August 18 2019, @10:06PM (#881878) Journal

            I had no idea we relied on core for that long. Googling around, I found that semiconductor RAM existed in the mid-60s, but it looks to have been expensive, with not much capacity being produced. I think you're right.

          • (Score: 2) by JoeMerchant on Monday August 19 2019, @01:15AM

            by JoeMerchant (3937) on Monday August 19 2019, @01:15AM (#881916)

            I bought a "PC" in 1982 for $700. It came with 16K of RAM and I eventually expanded it to 48K for an additional $300. 640GB is 48K x ~13 million. I'd guess world capacity probably hit 640GB in the mid to late 1970s - personal computers didn't really take off until the 1980s, but there would have been millions of business computers in use by 1979.
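
            That ratio roughly checks out (binary units assumed, giving closer to 14 million):

                print(640 * 2**30 / (48 * 2**10) / 10**6)   # ~14.0 million 48K machines' worth of RAM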

            --
            🌻🌻 [google.com]
  • (Score: 2) by JoeMerchant on Monday August 19 2019, @01:11AM (1 child)

    by JoeMerchant (3937) on Monday August 19 2019, @01:11AM (#881914)

    I've tried to get enthused about building a little touch-screen-cased Raspberry Pi 4, but... it's just not worth the effort when you can buy a complete smartphone with stupendous specs for ~$150.

    At this rate, a phone can be a desktop replacement: Chromecast or WiDi or whatever your flavor for getting to the monitor, Bluetooth for the HIDs, and the same mobile device with you when you're not sitting at a real screen with a real keyboard and mouse.

    When I get some time, I'm thinking a good project would be a "LoJack" app for a smartphone that's wired into whatever vehicle you want to monitor: a Google Fi data SIM for real-time video/GPS tracking, accelerometer/gyros to detect unexpected movement, and Bluetooth detection of your phone in your pocket to disarm. If you want to get clever with the finance side, you can probably get a good-enough new phone for under $80 to run the app on, with no monthly fee and real-time reassurance that your tracked asset (car, boat, bike) is safely not being hooned by crackheads.
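
    A very rough sketch of that arm/disarm logic; every function and value here is a hypothetical placeholder for platform-specific sensor, Bluetooth, and camera/GPS calls, not a real API:

        import time

        MOTION_THRESHOLD_G = 0.3                    # assumed threshold for "unexpected movement"
        OWNER_PHONE_BT_ADDR = "AA:BB:CC:DD:EE:FF"   # placeholder address of the owner's phone

        def read_accel_magnitude_g():
            """Placeholder: current acceleration magnitude in g, gravity removed."""
            return 0.0

        def owner_phone_nearby(bt_addr):
            """Placeholder: True if the owner's phone is visible over Bluetooth."""
            return False

        def send_alert():
            """Placeholder: stream GPS position and camera video over the data SIM."""
            print("ALERT: unexpected movement, owner's phone not detected")

        def monitor_loop(poll_seconds=1.0):
            # Alert only when the vehicle moves while the owner's phone is out of range.
            while True:
                if read_accel_magnitude_g() > MOTION_THRESHOLD_G and not owner_phone_nearby(OWNER_PHONE_BT_ADDR):
                    send_alert()
                time.sleep(poll_seconds)

        # monitor_loop()  # would poll the sensors forever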

    --
    🌻🌻 [google.com]
    • (Score: 2) by takyon on Monday August 19 2019, @04:28AM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday August 19 2019, @04:28AM (#881972) Journal

      I've got the FLIRC case on my Pi4, and now it runs below 52°C most of the time.

      Pi4 is OK for now, but I predict that the 3DSoC [soylentnews.org] concept will make its way into ARM chips, Broadcom's included, within the next 10 years. Then we could see early-2020s HEDT performance in an ARM chip with lower power consumption than the Pi4, meaning that systems that are merely OK today, like the Pi4, could get turbocharged into "faster than you know what to do with" territory. Not sure how GPUs will be affected, but the industry should eventually shoot for 1 petaflops [reddit.com] in the mobile SoC form factor for standalone VR headsets (16K resolution, high framerate, raytracing, etc.). The rumor mill has it that ARM will make a major GPU announcement next year, and Samsung is licensing Radeon GPU technology from AMD for mobile chips [soylentnews.org], so there's going to be more focus on ARM graphics going forward.

      Comparing the Raspberry Pi to phones for desktop use: the 4 GB Pi4 could run you about $80 if you include a micro-HDMI cable, a power supply, a 32 GB microSD card, and sales tax, or $95 if you throw in a FLIRC case at the non-discount price, plus shipping for the Pi or the case. From this article [gottabemobile.com], it seems like phones in that price range are packing 2-3 GB of RAM. Some of them have octa-core processors that may be faster if they are on a better node or a newer ARM core design. IMO, it's at least in the same ballpark spec-wise, but the Pi gives you 4 USB ports, Ethernet, and dual display outputs, which are going to make it damn useful as a desktop. You could add a dock to the phone, but that's extra cost if you're comparing. I'm thinking of getting an RTL-SDR [rtl-sdr.com] to occupy one of the Pi4's USB 3 ports, though I hear it doesn't work properly with the Pi4 yet; the other three ports are good for storage, keyboard, and mouse.

      I agree that docking is where we are going in the future. The smartphone that everyone carries constantly will pack a powerful 3DSoC and act as your desktop computer when docked, with cloud or local backups for data so that throwing the thing in a ditch doesn't also cost you your "desktop" computer. Phones will become so utterly powerful that they will take over all current desktop roles, and more than 16 GB of RAM may be desirable (all of which may need to be integrated directly with the 3DSoC). For the remaining desktop enthusiasts, new applications will have to be created in order to exploit/waste all the performance a desktop 3DSoC would have.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]