posted by janrinok on Saturday November 07 2015, @03:12AM   Printer-friendly
from the 8=4 dept.

In 2011 AMD released the Bulldozer architecture, with a somewhat untraditional implementation of "multicore" technology. Now, 4 years later, the company is being sued for false advertising, fraud, and other "criminal activities". From TFA:

In claiming that its new Bulldozer CPU had "8-cores," which means it can perform eight calculations simultaneously, AMD allegedly tricked consumers into buying its Bulldozer processors by overstating the number of cores contained in the chips. Dickey alleges the Bulldozer chips functionally have only four cores—not eight, as advertised.


Original Submission

  • (Score: 5, Interesting) by Hairyfeet on Saturday November 07 2015, @04:31AM

    No, it means this dumbass doesn't know anything about CPU arches. I sell AMD exclusively in the shop and have read extensively about how the BD/PD arch works. What you have is a PAIR of integer cores, and the FPU can be ONE 256-bit FPU (for AVX) OR, and here is the key point, it can be used as TWO 128-bit FPUs.
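
    To make that concrete, here's a rough illustration of my own (compiler intrinsics, not anything out of AMD's documentation): the same eight-float add done as one 256-bit AVX operation, or as two independent 128-bit SSE operations, which are the two widths the module's shared FPU can serve:

        /* Minimal sketch: one 256-bit AVX add vs. the same work as two
         * 128-bit SSE adds -- the two configurations of a Bulldozer
         * module's shared FPU. Build with: gcc -mavx fpu_widths.c */
        #include <immintrin.h>
        #include <stdio.h>

        int main(void) {
            float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
            float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
            float out[8];

            /* One 256-bit AVX add: occupies the module's full FPU width. */
            __m256 va = _mm256_loadu_ps(a);
            __m256 vb = _mm256_loadu_ps(b);
            _mm256_storeu_ps(out, _mm256_add_ps(va, vb));

            /* The same work as two independent 128-bit SSE adds: each half
             * can be issued by one of the module's two integer cores. */
            __m128 lo = _mm_add_ps(_mm_loadu_ps(a),     _mm_loadu_ps(b));
            __m128 hi = _mm_add_ps(_mm_loadu_ps(a + 4), _mm_loadu_ps(b + 4));
            _mm_storeu_ps(out,     lo);
            _mm_storeu_ps(out + 4, hi);

            for (int i = 0; i < 8; i++) printf("%g ", out[i]);
            printf("\n");
            return 0;
        }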

    So each module is 2 integer cores with 2 128-bit FPUs that can be joined into a single 256-bit unit if you need AVX. If the jobs you are running are primarily multicore-aware you will get FASTER performance out of the AMD than you will from a single-core process (which Intel chips are better at), and just FYI, if it were a "fake core" the opposite would be true...that's it, that is all the evidence you need to see it's bullshit. Just look at these Linux benchmarks [phoronix.com] and you will see that the more cores it has, the better it runs, because these Linux programs can take advantage of more cores. If it were truly a "half core" then you simply wouldn't see those kinds of gains; a simple scaling test like the sketch below makes the same point. There are also plenty of benchmarks that show the FX-6350 is a good match for the Phenom II X6 1100T [cpuboss.com], which according to their logic is a three-core versus a six-core...yet they are evenly matched? Bullshit. So this is just another case of "lawsuit lotto" by a dumbass that doesn't understand even the most basic of hardware design.
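
    You can check the scaling claim yourself with something like this (my own quick sketch, an integer-only spin loop, so it exercises exactly the cores the lawsuit says are fake). On real integer cores the wall time stays roughly flat as the thread count climbs toward the core count; on "half cores" it would nearly double at each step:

        /* Minimal scaling sketch: run the same CPU-bound integer loop on
         * 1..8 threads and compare wall time. Build: gcc -O2 -pthread scale.c */
        #include <pthread.h>
        #include <stdio.h>
        #include <time.h>

        #define WORK 200000000UL

        static void *spin(void *arg) {
            (void)arg;
            volatile unsigned long acc = 0;           /* volatile keeps the loop */
            for (unsigned long i = 0; i < WORK; i++)  /* from being optimized away */
                acc += i;
            return NULL;
        }

        static double run(int nthreads) {
            pthread_t t[8];
            struct timespec s, e;
            clock_gettime(CLOCK_MONOTONIC, &s);
            for (int i = 0; i < nthreads; i++) pthread_create(&t[i], NULL, spin, NULL);
            for (int i = 0; i < nthreads; i++) pthread_join(t[i], NULL);
            clock_gettime(CLOCK_MONOTONIC, &e);
            return (e.tv_sec - s.tv_sec) + (e.tv_nsec - s.tv_nsec) / 1e9;
        }

        int main(void) {
            /* Each thread does the SAME amount of work, so flat timings as
             * the thread count doubles means the cores really run in parallel. */
            for (int n = 1; n <= 8; n *= 2)
                printf("%d thread(s): %.2fs\n", n, run(n));
            return 0;
        }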

    Oh and just FYI, anybody who wants to build a truly awesome PC on a budget? The FX-8320E and the FX-6300 are the sweet spots at the moment in pure CPU: very fast, great at multitasking, easily OCs if you desire, and dirt cheap. In the APU line the A10-7870K is pretty damned impressive, can play modern games like BF4, and can be had for $127 shipped. Finally, if you want an INSANELY cheap HTPC? Grab the Athlon 5350. Quad core at 2.05GHz, nice Radeon HD8400 baked in, uses a max of 25W (and according to my Kill A Watt they average less than 12W for most tasks), and if you use a Linux OS like OpenELEC you can build a DAMN nice HTPC for less than $150. THIS is why I have no trouble selling AMD exclusively, as they have really nice chips at truly crazy cheap prices.

    And I put my money where my mouth is: I'm typing this on my FX-8320E with 16GB of RAM, an R9 280 GPU, a 3TB HDD with a 240GB SSD, BD and DVD burners, Win 7 HP, and a Rosewill Thor gamer case...cost? A hair under $700 after MIR. No way in hell could you get that kind of performance for that cost going Intel, no way; my games look jaw-dropping purty and it just blows through transcodes and layering effects on my multitrack recordings like it was nothing. I had both a Phenom II X4 (currently used by my wife, who is slaughtering a carrier in World Of Warships with it as we speak) and a Phenom II X6 (currently being used by my oldest in War Thunder to grind the American tank tree), which according to this lawsuit are both "superior" CPUs due to the design of the FPU and...yeah, just no. While they both make for great gaming CPUs, the FX's extra cores and higher clocks cut my transcodes and renders damned near in half. Does that sound like the performance one would see from "half a core" to you?

    --
    ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.

  • (Score: 3, Interesting) by Snotnose on Saturday November 07 2015, @02:20PM

    So this is just another case of "lawsuit lotto" by a dumbass that doesn't understand even the most basic of hardware design.

    I think it's more like some dude is playing lawsuit lotto hoping he gets a dumbass judge and/or jury.

    Interesting post, Hairyfeet. I've got an A10 myself and am pretty happy with it.

    --
    When the dust settled America realized it was saved by a porn star.
  • (Score: 2) by bzipitidoo on Saturday November 07 2015, @03:04PM

    I keep a foot in every camp, at least in the PC world. (I don't use Macs, but I do have a few ARM computers, a Beaglebone and a smartphone, and keep thinking about getting a Raspberry Pi.) When one of these hardware vendors finally releases an open source graphics driver for Linux (or FreeBSD, I'm not picky) with decent 3D acceleration, I'll make that my next computer. After all these years, we're still not quite there. Nouveau still does not have 3D acceleration as good as the proprietary Nvidia driver. On the AMD side, I hear Catalyst is a mess of a proprietary driver, the Radeon/Mesa driver is only half the speed, and there are quite a few games that don't work on it at all.

    So I was interested when Intel stepped up their game with their integrated graphics. They used to be so bad that a PC with the previous generation of CPU at 1/3 the clock speed, but with Nvidia graphics, would outperform the Intel integrated garbage. Intel's HD series is no longer glacially slow, even at 2D, and can actually compete with old, low-end Radeons and Nvidias. I have another PC with an i5-3317u and Intel HD4000 graphics, and it's not bad, a bit better than my aging main PC with a Phenom II X4 945 and Radeon HD 5450.
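
    Whenever I compare these drivers, by the way, I first make sure which one is actually loaded; "glxinfo | grep -i renderer" tells you, and a minimal GLX query (a quick sketch of my own, nothing vendor-specific) gets the same answer:

        /* Minimal sketch: create a throwaway GLX context and print which
         * OpenGL driver is live (Mesa radeon/nouveau vs. proprietary).
         * Build with: gcc glcheck.c -lGL -lX11 */
        #include <GL/glx.h>
        #include <X11/Xlib.h>
        #include <stdio.h>

        int main(void) {
            Display *dpy = XOpenDisplay(NULL);
            if (!dpy) { fprintf(stderr, "no X display\n"); return 1; }

            int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
            XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
            if (!vi) { fprintf(stderr, "no usable visual\n"); return 1; }

            /* An unmapped 1x1 window is enough to make a context current. */
            XSetWindowAttributes swa;
            swa.colormap = XCreateColormap(dpy, RootWindow(dpy, vi->screen),
                                           vi->visual, AllocNone);
            swa.border_pixel = 0;
            Window win = XCreateWindow(dpy, RootWindow(dpy, vi->screen), 0, 0, 1, 1,
                                       0, vi->depth, InputOutput, vi->visual,
                                       CWColormap | CWBorderPixel, &swa);

            GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
            glXMakeCurrent(dpy, win, ctx);

            printf("Vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
            printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
            printf("Version:  %s\n", (const char *)glGetString(GL_VERSION));

            glXMakeCurrent(dpy, None, NULL);
            glXDestroyContext(dpy, ctx);
            XCloseDisplay(dpy);
            return 0;
        }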

    One of the things I like a lot is low power consumption. That Intel PC uses 30W max when something with intense 3D accelerated graphics is running, 20W when watching a video, and only 10W when text editing. What does AMD have these days that matches that? (My old AMD takes 116W max, 70W for text editing.) The APU stuff? That Athlon 5350 you mentioned? 25W, you say? Presumably easy to repurpose from home theater to desktop usage. And the Linux Radeon driver, when will it be able to do decently speedy 3D acceleration?

    • (Score: 2) by Hairyfeet on Sunday November 08 2015, @12:51AM

      Here is a Phoronix review using Ubuntu [phoronix.com], and they liked it, and that's a year old; the drivers are even better now. AMD has been paying for extra FOSS devs to support their APUs and most of them run quite nicely now, and you can score one of these chips just crazy cheap, we're talking $30 for the dual-core Sempron and $54 for the top-of-the-line Athlon 5350. If you want to build a really cheap HTPC that's Linux friendly? Well, here ya go. I've been using them at the shop with both Windows 8 (frankly the only place Windows 8 works is as a 10-foot UI) and OpenELEC (a Linux-based XBMC/Kodi OS) and they seem to play quite nicely with both. I've also had customers use them as ULP office boxes and they are VERY happy with the performance; for all your basic tasks like web surfing, office work, and watching vids? It's smooth and easy, and it's so low power the box is practically silent.

      --
      ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.