posted by hubie on Thursday October 24, @09:17PM

Arthur T Knackerbracket has processed the following story:

Intel had a solution ready to add 64-bit features to the "classic" 32-bit x86 ISA, but the company chose to push forward with the Itanium project instead. This snippet of technology history recently emerged from a year-old Quora discussion in which Intel's former "chief x86 architect," Bob Colwell, provided a fascinating tidbit of previously unknown information.

AMD engineer Phil Park was researching the history behind the x86-64 transition when he discovered the conversation. Colwell revealed that Intel had an inactive internal version of the x86-64 ISA embedded in Pentium 4 chips. The company's management had forced the engineering team to "fuse off" the features.

The functionality was there, but users could not access it. Intel decided to focus on the 64-bit native architecture developed for Itanium instead of x86-64. The company felt that a 64-bit Pentium 4 would have damaged Itanium's chances to win the PC market. Management allegedly told Colwell "not once, but twice" to stop going on about 64-bits on x86 if he wanted to keep his job.

The engineer decided to compromise, leaving the logic gates related to the x86-64 features "hidden" in the hardware design. Colwell bet that Intel would eventually need to chase AMD and quickly implement its own version of the x86-64 ISA, and he was right. Itanium CPUs had no native backward compatibility with 16-bit and 32-bit x86 software, and the architecture became one of the worst commercial (and technological) failures in Intel's history.

[...] Bob Colwell made significant contributions to Intel's history, managing the development of popular PC CPUs such as the Pentium Pro, Pentium II, Pentium III, and Pentium 4 before retiring in 2000. Meanwhile, today's x86 chips from Intel and AMD still retain hardware backward compatibility with nearly every program developed for the x86 architecture.


Original Submission

 
  • (Score: 5, Interesting) by ledow on Friday October 25, @07:27AM (4 children)

    by ledow (5567) on Friday October 25, @07:27AM (#1378592) Homepage

    Intel chose to "reinvent" the instruction set - presumably for consistency and to remove legacy baggage - rather than maintain backwards compatibility.

    That's what it boils down to. And it's not an inherently bad intention in itself, removing cruft and starting with an architecture designed for the purpose, but they completely mishandled it.

    Millions of customers worldwide would have to change processor, board, maybe even RAM, etc. in every single machine they owned, plus rewrite ALL their software.

    Or they could use AMD and just carry on as normal and transition on their own timescale.

    That's ultimately how it went down on a small business / individual level. Nobody in those places cared one jot about Itanium and never would. Even enterprises baulked at it, and they had the resources, funding and expertise to actually take it on.

    In the same way that the "IBM-compatible PC" originated, it had little to do with what Intel, or IBM, or Microsoft wanted. It had to do with what customers could reasonably obtain without having to throw away everything they already had. It was determined by the smaller end of the market once it hit mainstream, not by the enterprise.

    Intel could have released - at any time - a slightly incompatible 64-bit instruction set for x86 and, if they had wanted to, blown AMD out of the market, forcing them to abandon their own version and go back to the drawing board. All they had to do was ship it just as AMD were releasing their processors. In the event, it happened entirely the other way around: Intel were forced to throw Itanium away because a then-niche competitor had released what customers wanted, and - crucially - Intel had no response at the time. It was Intel who had to go back, become compatible with AMD's new instructions, and play catch-up.

    So it wasn't Intel being evil (which is a surprise), and it wasn't them trying to screw over customers (in a way, they were trying to help); it was probably a decision championed by technical people and marketing people alike inside Intel. But in terms of overall image and customer understanding, it was a failure.

    People weren't just going to throw out 40 years of architecture for no reason when a product existed that a) ran everything just fine without having to emulate it (also Transmeta's failing, in essence), b) was already on the market, and c) did fancy new powerful stuff too.

    It was a misstep by Intel which could have worked if AMD hadn't got there first.

    But it was driven almost entirely by customers NOT wanting to have to upgrade literally everything they had all over again for no technical reason that would affect them.

    A lesson that many technology companies should learn from.

  • (Score: 3, Interesting) by turgid on Friday October 25, @03:12PM

    by turgid (4318) Subscriber Badge on Friday October 25, @03:12PM (#1378619) Journal

    Itanium was over-engineered, overcomplicated rubbish. It was a triumph of Marketing and MBAism over Engineering. It was designed to be too complicated to clone, and its special secret IP was protected six ways to Sunday so that potential competitors couldn't afford to license it.

    It also relied on magical compilers which never really appeared (hint: a compiler can't predict the future at compile time).
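
    To make that point concrete, here is a minimal sketch (my illustration, not the poster's) of the kind of code that defeats static scheduling: in a pointer-chasing loop, each load's latency depends on run-time cache behaviour that an EPIC/VLIW compiler cannot know at compile time, whereas an out-of-order core simply reacts as the misses happen.

        /* Minimal illustration, assuming nothing beyond standard C: each
         * iteration's address is the result of the previous load, and
         * whether that load hits L1 (a few cycles) or main memory
         * (hundreds of cycles) is only known at run time. A static
         * VLIW schedule must guess a fixed latency; it cannot fill the
         * stall with useful work the way out-of-order hardware can. */
        #include <stddef.h>

        struct node {
            struct node *next;  /* where the next load address comes from */
            long payload;
        };

        long sum_list(const struct node *n)
        {
            long total = 0;
            while (n != NULL) {
                total += n->payload;
                n = n->next;    /* serialising load: latency unknowable at compile time */
            }
            return total;
        }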

    The suits bought into it, and it did most of its job - that is, it killed MIPS, PA-RISC, Alpha, Clipper, etc. It didn't kill UltraSPARC or POWER, and ARM survived because it excelled at low-power applications.

  • (Score: 3, Insightful) by turgid on Friday October 25, @03:23PM (2 children)

    by turgid (4318) Subscriber Badge on Friday October 25, @03:23PM (#1378621) Journal

    There was one other thing, and that was Intel's x86 marketing. They were very good at dissuading people from buying competitors' x86 chips (Cyrix and AMD) back in the day. I was taken in by it too. The computer press always reported Intel chips' performance in a good light, and there was always that nagging suspicion that a non-Intel x86 CPU wouldn't be quite compatible.

    When AMD brought out the 32-bit Athlon, they showed Intel up completely in terms of performance. Software ran flawlessly too. All doubt was gone. There was no reason AMD64 wouldn't be a success, and very little reason to go to Itanium.

    • (Score: 3, Informative) by VLM on Friday October 25, @04:29PM

      by VLM (445) on Friday October 25, @04:29PM (#1378629)

      Fitting in with the marketing theme:

      Itanium CPUs had no native backward compatibility with 16-bit and 32-bit x86 software

      IIRC some of the marketing had a theme that Itanium would be so incredibly fast that it would be able to emulate a 32-bit CPU either fast enough or faster than legacy hardware.

      The disaster of Itanium is that it was, indeed, considerably faster than legacy 32-bit processors; however, it was not fast enough to emulate legacy procs either adequately or faster than real time.

      Even poor emulation performance would have been "usable", because most software is not speed-critical. If edlin.com had 4 ms of latency instead of 0.25 ms, no human would notice or care, while something like a database core engine, rewritten and recompiled, might run maybe 4 times faster than on legacy 32-bit processors. In practice, though, it was dismally, unusably slow, IIRC. I only got to play on a couple of Itanium machines when they were new, and not for long.

      I seem to recall there was also a product tie-in with Y2K. Not exactly a Y2K scam, but close. In the early 90s they announced they'd be shipping in time for Y2K; the late-90s pitch was that everything was being replaced with Y2K-compliant hardware and all-new software versions anyway "because Y2K", so why not buy some Itanium stuff while you're at it? Unfortunately, shipping dates slipped until about the dotcom crash - too late for the Y2K feeding frenzy, but just in time for the dotcom recession in sales.

      The most amazing thing about Intel is that their history is just a long list of disasters that would normally destroy any other company, but like a nuclear-war-proof cockroach, Intel just never dies. I'm sure they'll keep right on doing that until they eventually do go out of business. Other companies at least occasionally ship a stellar, amazing product. Not Intel. It's like the company's "secret sauce" is surviving disasters. Other companies sometimes ship fast stuff, cheap stuff, cool stuff, stylish stuff, but not Intel; their superpower is that you can spray an entire can of Raid (bug spray) at them after nuking them, and the roach never dies. Not yet, anyway.

    • (Score: 3, Insightful) by stormwyrm on Friday October 25, @07:28PM

      by stormwyrm (717) on Friday October 25, @07:28PM (#1378669) Journal
      The end came when Microsoft gave its blessing to x86-64 (they still insist on calling it x64, though, lol). That was the fatal iceberg that sank the Itanic.

      Intel could have provided some compatibility layer that allowed 32-bit code to run in a virtualized mode inside the processor to give some kind of migration path, the way the 80386 had a Virtual 8086 mode that allowed 16-bit MS-DOS applications to run properly even in 32-bit protected mode. I think they attempted some kind of dynamic translation that could run 32-bit x86 code by converting it on the fly to Itanium instructions, but I don't think it ever performed anywhere near as well as they hoped, with the performance of at most a 100 MHz Pentium in an age when gigahertz clock speeds were typical. That the entire architecture was also so awfully expensive didn't help matters; you might as well have used one of the other incompatible RISC architectures available, as IA-64 offered no tangible advantages.

      Soon after, AMD offered their own solution to the 32-bit to 64-bit migration path that was cheaper, more performant, and had better compatibility than IA-64. Microsoft accepted it, and at that point Itanium could only ever hope to be at best a niche architecture, just like PA-RISC, Alpha, or SPARC, perhaps seeing its widest use in enterprise data centres or HPC clusters - and it even struggled in that market.
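
      For readers unfamiliar with the technique: dynamic binary translation generally works by translating a block of guest code the first time it is reached, caching the native result, and reusing it afterwards. The toy sketch below is my own illustration with a made-up one-instruction guest "ISA" (it is not Intel's IA-32 EL); it shows that dispatch shape, and why there is unavoidable per-block lookup and translation overhead on top of running the translated code itself.

          /* Toy dynamic-translation loop: guest opcodes are "translated"
           * to native routines on first sight and cached by guest PC.
           * A real translator emits machine code for whole basic blocks,
           * but the dispatch shape - look up, translate on miss, run -
           * is the same, and that overhead is paid on top of execution. */
          #include <stdint.h>
          #include <stdio.h>

          enum { OP_INC = 0, OP_HALT = 1 };            /* made-up guest ISA */

          typedef struct { uint32_t pc; long reg; int halted; } cpu_state;
          typedef void (*native_fn)(cpu_state *);

          static const uint8_t guest_code[] = { OP_INC, OP_INC, OP_INC, OP_HALT };

          /* "Translated" native versions of each guest instruction. */
          static void native_inc(cpu_state *s)  { s->reg++; s->pc++; }
          static void native_halt(cpu_state *s) { s->halted = 1; }

          static native_fn cache[sizeof guest_code];   /* translation cache */

          static native_fn translate(uint32_t pc)      /* done once per block */
          {
              return guest_code[pc] == OP_INC ? native_inc : native_halt;
          }

          int main(void)
          {
              cpu_state s = {0};
              while (!s.halted) {
                  native_fn fn = cache[s.pc];
                  if (fn == NULL)                      /* cold block: translate */
                      fn = cache[s.pc] = translate(s.pc);
                  fn(&s);                              /* run cached native code */
              }
              printf("reg = %ld\n", s.reg);            /* prints: reg = 3 */
              return 0;
          }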
      --
      Numquam ponenda est pluralitas sine necessitate.