

posted by takyon on Wednesday January 23 2019, @09:22PM
from the class-excavation dept.

Core blimey... When is an AMD CPU core not a CPU core? It's now up to a jury of 12 to decide

A class-action lawsuit against AMD claiming false advertising over its "eight core" FX processors has been given the go-ahead by a California judge.

US district judge Haywood Gilliam last week rejected [PDF] as "not persuasive" AMD's claim that "a significant majority" of people understood the term "core" the same way it did.

What tech buyers imagine a processor core to be would be a significant part of such a lawsuit, the judge noted, and so AMD's arguments were "premature."

The so-called "eight core" chips contain four Bulldozer modules, the lawsuit notes, and these "sub-processors" each contain a pair of instruction-executing CPU cores. So, four modules times two CPU cores equals, in AMD's mind, eight CPU cores.

And here's the sticking point: these two CPU cores, within a single Bulldozer module, share caches, frontend circuitry, and a single floating point unit (FPU). These shared resources cause bottlenecks that can slow the processor, it is claimed.

The plaintiffs, who sued back in 2015, argue that they bought a chip they thought would have eight independent processor cores – the advertising said it was the "first native 8-core desktop processor" – and paid a premium for that.


Original Submission

 
  • (Score: 3, Insightful) by AthanasiusKircher on Thursday January 24 2019, @02:12PM (1 child)

    by AthanasiusKircher (5291) on Thursday January 24 2019, @02:12PM (#791219) Journal

    Since I've been modded as "disagree," I'll just note that almost all of my post is just factual. The only revision I would make is that the first sentence should say "The hard disk manufacturers are correct NOW." It's true there were lawsuits and disagreement years ago, but the official standards organizations have since weighed in and clarified the correct usage.

    Note that usually I'm happy to accept changes in language when usage shifts decisively. But we are not dealing with "language" in the normal sense here. The entire point of the SI system is to establish standards. The computing industry's early practice of calling 1024 bytes a "kilobyte," because 1024 is close to 1000, was a deviation from the definition of the prefix. It took hold mostly when computing was still a niche industry.

    The SI system was put in place in part to avoid specifically these kinds of deviations in meaning regarding units. Before the metric system, you not only had regional differences in unit definitions (e.g., a "foot" could mean a somewhat different length in different countries, or even in different cities), but also different unit definitions by industry (e.g., an "ounce" of gold could be different from an "ounce" of water, and a "plumber's ounce" might be different from a "wine merchant's ounce").

    The computing industry appropriated the SI prefixes and attempted exactly this type of redefinition to fit its purposes. It's of course reasonable to use powers of 2 for units in computing. It is, however, against the very rationale of the metric system for a particular industry to deviate from the standard definitions. (Actually, not even a whole industry -- it's not as if "giga-" in "gigahertz" means 1024^3. The deviation applied basically only to units of storage capacity and sometimes data transfer rate.)

    I'll be the first one to agree that terms like "gibibyte" sound silly. But it's very reasonable to make the distinction between these units. And I believe Mac OS and Linux now make the GB/GiB distinction clearly, following the standards. Windows may be the only outlier that still reports hard disk space in powers of 2 while (inaccurately) labeling it KB/MB/GB/TB -- though I haven't used Windows regularly in a few years, so I don't know.

    I'm surprised when these disagreements arise on tech forums, because arguing in favor of the 1 GB = 1024^3 definition is like arguing in favor of some other parochial measurement unit -- like arguing that the U.S. should stick to pounds and feet. Such an attitude is usually condemned in places like this. Actually, it's a bit worse, since the SI prefixes are being misappropriated. It would be rather like the U.S. "adopting" the kilogram but simply defining it to be 2 pounds "because it's close enough, and it suits us better to use a unit that's a whole multiple of one we're already using." Such an error would be within 10%, which is approximately the error in the TB case (1024^4 and 1000^4 differ by about 10%).
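    To make that drift concrete, here's a quick illustrative sketch (Python, my own back-of-the-envelope, not from the original discussion) showing how far each binary prefix deviates from its SI counterpart -- about 2.4% at kilo, growing to roughly 10% at tera:

        # Drift between the binary (1024^n) and SI (1000^n) readings of each prefix.
        prefixes = ["kilo (KiB vs kB)", "mega (MiB vs MB)",
                    "giga (GiB vs GB)", "tera (TiB vs TB)"]

        for n, name in enumerate(prefixes, start=1):
            binary = 1024 ** n
            si = 1000 ** n
            drift_pct = (binary / si - 1) * 100
            print(f"{name}: {binary:,} vs {si:,} bytes (+{drift_pct:.1f}%)")

    Run as-is, this prints the familiar figures: +2.4% for kilo, +4.9% for mega, +7.4% for giga, and +10.0% for tera.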

    Anyhow, we can disagree about what should be the standard measurement unit (GB or GiB), but equating them or pretending that GB means GiB flies in the face of reasonable standards for measurement usage.

  • (Score: 2) by RS3 on Friday January 25 2019, @08:30AM

    by RS3 (6367) on Friday January 25 2019, @08:30AM (#791670)

    I surely did not downmod you, and I rarely downmod anyone (only the very obvious trolls). I could discuss my dislike of the mod systems, but that would be an offtopic tome.

    I hopefully have some intelligent thoughts, but it's extremely late where I am, and anything I write now will look scattered tomorrow to my then-rested brain.

    But I will say that my immediate reaction to your post was going to be to mention the "K" and "M" in both hard disk and RAM sizing. As far as I know, in RAM sizing, "G" really does equal 1024^3. I'll have to check my IT museum to see how they sized hard drives long ago, but I'm pretty sure I'll find that "M" is 1024^2.

    For the record, I have great disdain for heated discussion, esp. this type. But I will point out: you mentioned the adoption of "K" meaning 1024, but you didn't disprove it, and it (K equaling 1024) kind of undermines your GiB argument.

    My thoughts / feeling: when "K" was adopted to mean 1024 in the computing world, and later "M", I and so many others thought "G" would naturally follow as 1024^3, _especially_ since the prefix "giga" is generally only used in technical / scientific parlance.

    I'm writing too much and I'm too tired. I have much more to say when rested. Bottom lines are: I and most people have been okay with "K", "M", "G", "T", "P", etc., all being powers of 2 when used in reference to computer / data storage. We all (well, most of us) felt cheated by big-business tycoons when they demoted hard disk "G" to mean 1,000,000,000. It just seemed a bit too convenient and certainly disingenuous. But most of us have accepted it and moved on with life, not sweating the details. :)
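    As a rough illustration of that "demotion" (a quick Python sketch of my own, using a hypothetical drive advertised in decimal units):

        # A drive sold as "1 TB" (decimal, 10^12 bytes) expressed in binary units,
        # the way an OS that labels GiB/TiB as "GB"/"TB" would report it.
        advertised_bytes = 1 * 1000 ** 4       # manufacturer's 1 TB

        in_gib = advertised_bytes / 1024 ** 3  # ~931.3, often shown as "GB"
        in_tib = advertised_bytes / 1024 ** 4  # ~0.909, often shown as "TB"

        print(f"1 TB (decimal) = {in_gib:.1f} GiB = {in_tib:.3f} TiB")

    So the "missing" ~69 GB on a 1 TB drive is just the gap between the two prefix conventions, not lost capacity.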