
posted by chromas on Saturday September 29 2018, @03:05AM   Printer-friendly
from the 640GB-ought-to-be-enough-for-now dept.

OPPO Find X to get 10GB RAM version, spotted at TENAA

There have been rumors of a 10GB RAM smartphone in development for a while now. Vivo's as-yet-unreleased Xplay7 was rumored to come with 10GB of RAM, and the ASUS ROG Phone was also supposed to ship with 10GB. Judging by an updated TENAA listing of the Find X, it appears OPPO will be the first to launch a 10GB RAM phone.

The Find X originally shipped with 8GB of RAM and 128GB or 256GB of storage, but Chinese leaker @UniverseIce shared a photo of an updated listing showing that the Find X will get a new 10GB RAM + 256GB ROM model.

We were able to confirm that the leak is genuine as the full TENAA specs listing for the Find X (PAFM00 model) now has a 10GB RAM variant. The update to the listing was made yesterday. The rest of the specs will remain the same as the other variant.

TENAA is China's telecom equipment certification body.

Also at The Verge, Engadget, Fossbytes, and BGR.

Related: Samsung Announces 12Gb LPDDR4 DRAM, Could Enable Smartphones With 6 GB of RAM
Samsung Announces 8 GB DRAM Package for Mobile Devices


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by MichaelDavidCrawford on Saturday September 29 2018, @04:15AM (11 children)

    by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Saturday September 29 2018, @04:15AM (#741688) Homepage Journal

    I was the only full-time coder for a company with twelve employees whose peak year during my time there was three million dollars gross.

    The largest hard drive I had during that time was eighty megabytes; the most RAM, I think, was sixteen megabytes.

    If I could write three million dollars worth of code with sixteen megabytes in 1992, why can't we _all_ do so in 2018?

    There are some legitimate explanations, for example there is a great deal of value to the in-memory databases that 64-bit addressing affords us.

    What gets me down is that many of today's most popular applications are _dramatically_ slower than those of 1992. Shouldn't they be faster? I have a special hatred for {Open,Libre}Office. The time that elapses between clicking their icon in my Mac's Dock and their being ready to accept user input is simply inexcusable.

    But let's set that problem aside for now.

    Flappy Bird, I understand, made its author fifty grand PER DAY. That fellow had the right attitude. Surely more of us can come up with money-making yet resource-sparing products.

    --
    Yes I Have No Bananas. [gofundme.com]
    Starting Score:    1  point
    Karma-Bonus Modifier   +1  

    Total Score:   2  
  • (Score: 2, Informative) by Anonymous Coward on Saturday September 29 2018, @04:49AM (4 children)

    by Anonymous Coward on Saturday September 29 2018, @04:49AM (#741702)
    1. If I could write three million dollars worth of code with sixteen megabytes in 1992, why can't we _all_ do so in 2018?
    2. What gets me down is that many of today's most-popular applications are _dramatically_ slower than those of 1992.

    Modern code is slower because it is cheaper to write. Today's computers are good enough to absorb the overhead of the OS and standard frameworks. Those who think otherwise are always welcome to code in assembly language; it's supported on most, if not all, platforms. Of course, hundreds of bank forms are coded a bit differently from a pacemaker.

    • (Score: 0) by Anonymous Coward on Saturday September 29 2018, @05:13AM (2 children)

      by Anonymous Coward on Saturday September 29 2018, @05:13AM (#741711)

      Assembly language? No need since I entered the workforce 23 years ago.
      Any simple compiled language will do, like C. Oh, and lay off the dozen framework layers.

      • (Score: 2) by MichaelDavidCrawford on Saturday September 29 2018, @08:56AM (1 child)

        by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Saturday September 29 2018, @08:56AM (#741749) Homepage Journal

        Not to actually write any, nor to read disassemblies during debugging.

        Rather it's to more intuitively understand what their high-level, interpreted code is doing to their CPU and their RAM.

        Back in the day, some newbie posted to Usenet a question about how he could write a C program that would execute the chmod(1) command-line program. He was completely unaware that he could just use the chmod() system call.

        That kind of newbie lack of insight is today found even among coders with ten or more years' experience, because a great many write nothing but JavaScript for their entire careers.
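        To make the anecdote concrete, here is a minimal C sketch (the filename is hypothetical) of what that newbie was missing: rather than fork/exec'ing the chmod(1) command-line program, a C program can invoke the chmod() system call directly.

        ```c
        #include <stdio.h>
        #include <sys/stat.h>
        #include <fcntl.h>
        #include <unistd.h>

        int main(void) {
            /* Create a scratch file with restrictive permissions. */
            int fd = open("example.txt", O_CREAT | O_WRONLY, 0600);
            if (fd < 0) { perror("open"); return 1; }
            close(fd);

            /* Change its mode directly via the chmod() system call --
               no need to spawn the chmod(1) program in a subprocess. */
            if (chmod("example.txt", 0644) != 0) { perror("chmod"); return 1; }

            /* Verify the new mode with stat(). */
            struct stat st;
            if (stat("example.txt", &st) != 0) { perror("stat"); return 1; }
            printf("mode: %o\n", st.st_mode & 0777);

            remove("example.txt");
            return 0;
        }
        ```

        The system call is both faster (no process creation) and easier to check for errors than shelling out.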

        --
        Yes I Have No Bananas. [gofundme.com]
        • (Score: 0) by Anonymous Coward on Saturday September 29 2018, @05:22PM

          by Anonymous Coward on Saturday September 29 2018, @05:22PM (#741851)

          And every single one of those ignorant JS coders is a millionaire, because the techbro industry values young and stupid, just like you do, you old pedophile.

    • (Score: 2) by TheRaven on Saturday September 29 2018, @12:37PM

      by TheRaven (270) on Saturday September 29 2018, @12:37PM (#741773) Journal
      That argument sounds really convincing, until you remember that Smalltalk-80 ran on a machine with 512KB of RAM and presented a graphical environment written entirely in a high-level language (JavaScript is a similar abstraction level to Smalltalk). Now, admittedly, Smalltalk-80 used 1-bit colour and bitmap fonts, so there's a good reason for a couple of orders of magnitude more RAM and CPU / GPU power if you want to have composited true colour displays and hinted vector fonts.
      --
      sudo mod me up
  • (Score: 2, Interesting) by Anonymous Coward on Saturday September 29 2018, @04:58AM (4 children)

    by Anonymous Coward on Saturday September 29 2018, @04:58AM (#741705)

    If I could write three million dollars worth of code with sixteen megabytes in 1992, why can't we _all_ do so in 2018?

    Back in 1992, we were all writing code that was "close" to the computer. By that I mean, while most of us weren't writing machine code, there were few layers of abstraction between the application code and the actual CPU execution layer.

    Now there are so many layers upon layers of abstraction -- where abstraction equals "other people's code that we can't operate without yet we have no idea what it's really doing in that black box" -- that must be included/available for any application to work, that applications are relatively huge.

    It's also the reason why so few of the recently graduated "programmers" are any good at real troubleshooting: they really have no idea what the computer is actually doing. They don't program a computer; they send calls to API classes which do all the real work for them. That lets them worry about font size and eliminating skeuomorphism instead of logic or efficiency.

    But I'm just a bitter old guy so what do I know?

    • (Score: 1, Interesting) by Anonymous Coward on Saturday September 29 2018, @05:50AM (1 child)

      by Anonymous Coward on Saturday September 29 2018, @05:50AM (#741723)

      Now there are so many layers upon layers of abstraction -- where abstraction equals "other people's code that we can't operate without yet we have no idea what it's really doing in that black box" -- that must be included/available for any application to work, that applications are relatively huge.

      Indeed, I have looked into the Qt sources now and then to figure out why things didn't work. Same with the Atmel libraries. I once found a crashing bug in a library - without that fix the code would have been a dead end.

      They don't program a computer, they send calls to API classes which do all the real work for them. Lets them worry about font size and eliminating skeuomorphism instead of logic or efficiency.

      That's what abundance of CPU speed and memory does to people. They choose the easiest, cheapest solution that can be sold to the equally undemanding customers. But things like high end CAD software or FPS games are coded with utmost respect to available CPU time.

      • (Score: 2) by takyon on Saturday September 29 2018, @06:07AM

        by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Saturday September 29 2018, @06:07AM (#741729) Journal

        That's what abundance of CPU speed and memory does to people. They choose the easiest, cheapest solution that can be sold to the equally undemanding customers. But things like high end CAD software or FPS games are coded with utmost respect to available CPU time.

        The system works.

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 2) by MichaelDavidCrawford on Saturday September 29 2018, @08:49AM (1 child)

      by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Saturday September 29 2018, @08:49AM (#741747) Homepage Journal

      I must say I often find some things today that are a vast improvement over their decades-ago equivalents:

      Instead of artificial woodgrain veneer I usually find real wood. For example, growing up, my family's dining room table had woodgrain Formica, but the table I own myself, which I bought in 1998, is made of real wood - solid, even, no veneer.

      --
      Yes I Have No Bananas. [gofundme.com]
      • (Score: 0) by Anonymous Coward on Saturday September 29 2018, @11:42AM

        by Anonymous Coward on Saturday September 29 2018, @11:42AM (#741759)

          Real wood is too expensive now.

  • (Score: 0) by Anonymous Coward on Sunday September 30 2018, @10:11PM

    by Anonymous Coward on Sunday September 30 2018, @10:11PM (#742168)

    If I could write three million dollars worth of code with sixteen megabytes in 1992, why can't we _all_ do so in 2018?

    Over-reliance on abstraction and bloated libraries that rely on bloated libraries that rely on bloated libraries.

    In 1992, you cared about optimizing code and reducing footprint because you knew computers weren't that fast and didn't have much memory or storage. (I remember the time: 16MB was an extravagance for my then-high-end 486/66, adding $1200 to the cost of my computer. 8MB was a lot, and 4MB was more common.) Today, nobody thinks twice about optimization. Processors get faster every year; RAM and storage keep getting faster and cheaper. Nobody has to take pride in shaving minutes (or hours) off runtime, or kilobytes off their code, anymore, because the bloat will be absorbed by the next generation of hardware. It's quite sad, really.