
SoylentNews is people

posted by martyb on Thursday April 23 2020, @12:33PM   Printer-friendly
from the Sorry-about-that-boss! dept.

Worst CPUs:

Today, we've decided to revisit some of the worst CPUs ever built. To make it on to this list, a CPU needed to be fundamentally broken, as opposed to simply being poorly positioned or slower than expected. The annals of history are already stuffed with mediocre products that didn't quite meet expectations but weren't truly bad.

Note: Plenty of people will bring up the Pentium FDIV bug here, but the reason we didn't include it is simple: despite being an enormous marketing failure and expense for Intel, the actual bug was tiny. It affected virtually no one outside scientific computing, and the scale and scope of the problem in technical terms were never estimated to amount to much. The incident is remembered today more for the disastrous way Intel handled it than for any overarching problem in the Pentium microarchitecture.

We also include a few dishonourable mentions. These chips may not be the worst of the worst, but they ran into serious problems or failed to address key market segments. With that, here's our list of the worst CPUs ever made.

  1. Intel Itanium
  2. Intel Pentium 4 (Prescott)
  3. AMD Bulldozer
  4. Cyrix 6×86
  5. Cyrix MediaGX
  6. Texas Instruments TMS9900

Which CPUs make up your list of Worst CPUs Ever Made?


Original Submission

 
  • (Score: 1, Informative) by Anonymous Coward on Thursday April 23 2020, @10:58PM (#986249) (1 child)

    One solution to preserving backwards compatibility while replacing an architecture with a new one is to include a VM with the OS.
    The VM runs the old code. Apple did this to successfully transition CPU architectures 3 times. They are now going to transition CPUs again...
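    The approach this comment describes boils down to a software dispatch loop that interprets each legacy instruction on the new host. The sketch below uses a tiny made-up instruction set (purely illustrative, not any real ISA) to show the shape of such an interpreter and why every guest instruction costs a full dispatch round-trip on the host:

    ```python
    # Toy sketch of VM-based backwards compatibility: the host CPU runs a
    # dispatch loop that interprets each "legacy" instruction in software.
    # The instruction set here is invented for illustration only.

    def run_legacy(program, regs=None):
        """Interpret a list of (opcode, *args) tuples for a made-up legacy ISA."""
        regs = regs or {"r0": 0, "r1": 0}
        pc = 0
        while pc < len(program):
            op, *args = program[pc]
            if op == "load":        # load an immediate value into a register
                regs[args[0]] = args[1]
            elif op == "add":       # regs[args[0]] += regs[args[1]]
                regs[args[0]] += regs[args[1]]
            elif op == "halt":
                break
            pc += 1                 # each guest instruction costs a whole
                                    # fetch/decode/dispatch trip on the host
        return regs

    # Guest "binary": r0 = 2; r1 = 3; r0 += r1
    prog = [("load", "r0", 2), ("load", "r1", 3), ("add", "r0", "r1"), ("halt",)]
    print(run_legacy(prog)["r0"])   # -> 5
    ```

    Real transition layers (Apple's Rosetta, for instance) avoid much of this per-instruction overhead by translating guest code to native code ahead of time or in blocks, rather than interpreting one instruction per loop iteration as above.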

  • (Score: 1, Insightful) by Anonymous Coward on Friday April 24 2020, @12:14AM (#986291)

    A virtual machine is in almost all circumstances worse than having actual hardware. Weird inconsistencies pop up, performance is uncertain, additional software invites additional crashing and security gaps, and altogether, it's a far less reliable experience. This is something that can be dealt with to a certain extent, if the underlying architecture is set up to support it, but an architecture shift almost certainly guarantees that this will not be the case. Running x86 VMs on an x86 server is going to be much more efficient than running x86 VMs on an ARM server. Same goes for other architectures. This will be no different, and the gap will be non-negligible. Recompilation may or may not work well enough for your particular use case; most people use software beyond Microsoft Office and Chrome.

    Home users may accept it (because they can't afford anything else, which is naturally what the companies are counting on), but business users are reluctant to embrace it when they have an alternative. That is why Itanium lost out to x86 hardware in business, especially once AMD64 came along and let them keep using their software as-is. AMD64 was probably AMD's finest hour; to this day it remains the largest expansion of the x86 instruction set since the move from IA16 to IA32. It even changed the naming: IA64 meant Itanium, while AMD64 became the name for the current generation of 64-bit x86 code.

    With Apple products, you aren't buying a computer, you're renting an experience. And the Apple experience has disdain for long-term backwards compatibility: you get to wait until they decide you should no longer use those programs, and then they go bye-bye. They did it before and they'll do it again. Microsoft, meanwhile, is in the process of disposing of its backwards compatibility with Windows 10 S (or at least, charging a premium for it), and I suspect that may well end their reign; but for now, you can still run programs from the '90s without modification.

    On 32-bit versions of Windows 10, despite how much I loathe its spyware, you can still run 16-bit Windows applications. THAT is worthwhile backwards compatibility, not Apple's limp-wristed nod that expects you to "get with the program" in a couple of years, and naturally buy or re-subscribe to services (preferably Apple's). It's essentially a grace period that expires when Apple sees fit.

    It is also, unfortunately, part of the overall trend among companies of making certain you never own or control a computer. Apple was doing it FORTY YEARS AGO (Apple II->III) and has done it on every machine since, so it is thoroughly entrenched in their mindset and strategy: half-assedly provide limited backwards compatibility just long enough to twist arms into switching, then kill old architectures without mercy, eventually tossing the compatibility layer into the trash. It might be one reason why Macs hold such a small share of the desktop market.