A history of ARM, part 1: Building the first chip
It was 1983, and Acorn Computers was on top of the world. Unfortunately, trouble was just around the corner.
The small UK company was famous for winning a contract with the British Broadcasting Corporation to produce a computer for a national television show. Sales of its BBC Micro were skyrocketing and on pace to exceed 1.2 million units.
But the world of personal computers was changing. The market for cheap 8-bit micros that parents would buy to help kids with their homework was becoming saturated. And new machines from across the pond, like the IBM PC and the upcoming Apple Macintosh, promised significantly more power and ease of use. Acorn needed a way to compete, but it didn't have much money for research and development.
Sophie Wilson, one of the designers of the BBC Micro, had anticipated this problem. She had added a slot called the "Tube" that could connect to a more powerful central processing unit. A CPU plugged into this slot could take over the computer, leaving the original 6502 chip free for other tasks.
But what processor should she choose? Wilson and co-designer Steve Furber considered various 16-bit options, such as Intel's 80286, National Semiconductor's 32016, and Motorola's 68000. But none were completely satisfactory.
In a later interview with the Computer History Museum, Wilson explained, "We could see what all these processors did and what they didn't do. So the first thing they didn't do was they didn't make good use of the memory system. The second thing they didn't do was that they weren't fast; they weren't easy to use. We were used to programming the 6502 in the machine code, and we rather hoped that we could get to a power level such that if you wrote in a higher level language you could achieve the same types of results."
But what was the alternative? Was it even thinkable for tiny Acorn to make its own CPU from scratch? To find out, Wilson and Furber took a trip to National Semiconductor's factory in Israel. They saw hundreds of engineers and a massive amount of expensive equipment. This confirmed their suspicions that such a task might be beyond them.
Then they visited the Western Design Center in Mesa, Arizona. This company was making the beloved 6502 and designing a 16-bit successor, the 65C816. Wilson and Furber found little more than a "bungalow in a suburb" with a few engineers and some students making diagrams using old Apple II computers and bits of sticky tape.
Suddenly, making their own CPU seemed like it might be possible. Wilson and Furber's small team had built custom chips before, like the graphics and input/output chips for the BBC Micro. But those designs were simpler and had fewer components than a CPU.
Despite the challenges, upper management at Acorn supported their efforts. In fact, they went beyond mere support. Acorn co-founder Hermann Hauser, who had a Ph.D. in Physics, gave the team copies of IBM research papers describing a new and more powerful type of CPU. It was called RISC, which stood for "reduced instruction set computing."
[...]
(Score: 3, Interesting) by optotronic on Tuesday September 27 2022, @02:19AM (2 children)
This is a very interesting article. I was big into microcomputers at the time and read about RISC a long time ago, but I had no idea ARM chips existed in 1985. In the early nineties I had heard of some RISC CPUs, and some computers using them, but they always seemed second-tier in spite of my wanting them to succeed because of the relative simplicity of the architecture. We even owned a PowerPC Mac for years, but now I don't remember whether I knew it used a RISC CPU.
I'm very happy to see ARM chips in so many computers now. I just wasn't aware of their heritage.
(Score: 3, Funny) by driverless on Tuesday September 27 2022, @09:48AM (1 child)
Almost as good as designing supercomputers with a #3 pencil, the back of a quadrille pad, and a team of 34 people -- including the janitor.
(Score: 0) by Anonymous Coward on Thursday September 29 2022, @08:27AM
I'm wondering how backward compatible the modern ARM instruction set is with the original ARM's. Are they practically different instruction sets, or are they like modern AMD64, still able to support most 8088 stuff?
(Score: 5, Interesting) by bart on Tuesday September 27 2022, @10:36AM (1 child)
My buddy Axel Roest had an Acorn Archimedes, and we programmed a Mandelbrot generator in ARM assembly using fixed point math. Its performance was awesome compared to anything we tried on any of the other computers we had.
The funny thing is that you could hear from the pink noise on the speakers attached to it where the calculation was in the Mandelbrot set. Mesmerizing. And an indication of poor audio design :-)
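For readers who haven't seen the trick: fixed-point Mandelbrot replaces floating point with scaled integers, so every multiply becomes an integer multiply plus a shift. Below is a minimal C sketch of the idea (not the original ARM assembly from the comment above; the 10.22 format, 64-iteration cap, and ASCII output are illustrative assumptions):

    #include <stdint.h>
    #include <stdio.h>

    /* 10.22 fixed point: 10 integer bits (signed), 22 fractional bits.
       Format and limits are illustrative, not the original code's. */
    #define FRAC_BITS 22
    #define TO_FIX(x) ((int32_t)((x) * (1 << FRAC_BITS)))

    /* Fixed-point multiply: widen to 64 bits so the product
       doesn't overflow before the corrective shift. */
    static int32_t fmul(int32_t a, int32_t b)
    {
        return (int32_t)(((int64_t)a * b) >> FRAC_BITS);
    }

    /* Iterate z = z^2 + c, counting steps until |z|^2 > 4. */
    static int mandel(int32_t cr, int32_t ci, int max_iter)
    {
        int32_t zr = 0, zi = 0;
        for (int i = 0; i < max_iter; i++) {
            int32_t zr2 = fmul(zr, zr);
            int32_t zi2 = fmul(zi, zi);
            if (zr2 + zi2 > TO_FIX(4))   /* escaped the set */
                return i;
            zi = fmul(2 * zr, zi) + ci;  /* uses the old zr */
            zr = zr2 - zi2 + cr;
        }
        return max_iter;
    }

    int main(void)
    {
        /* Crude ASCII rendering: real axis -2..1, imaginary -1..1. */
        for (int y = 0; y < 24; y++) {
            for (int x = 0; x < 78; x++) {
                int32_t cr = TO_FIX(-2.0) + TO_FIX(3.0) / 78 * x;
                int32_t ci = TO_FIX(-1.0) + TO_FIX(2.0) / 24 * y;
                putchar(mandel(cr, ci, 64) == 64 ? '*' : ' ');
            }
            putchar('\n');
        }
        return 0;
    }

On a machine with no floating-point hardware, like the early Archimedes, keeping the whole inner loop in integer registers is what makes the speedup possible.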
(Score: 0) by Anonymous Coward on Thursday September 29 2022, @09:13PM
Was he in a pop group called Guns and Roses by any chance?
(Score: 4, Insightful) by Rich on Tuesday September 27 2022, @12:55PM (2 children)
As they say, you get farther if you don't know what's considered impossible. I'm amused that they took the decision after seeing how WDC worked and came up with ARM. WDC's Bill Mensch was part of the original 6502 design team, and the add-ons he did with WDC to turn it into the 65816 are a sorry kluge on top of the original at best, completely outdated by other contemporary developments (*), and nothing like designing a super fast new CPU from scratch. What the ARM guys achieved is incredibly impressive in comparison (and history still proves them right).
(*) Yet I still sold my Amiga 1000 for an Apple IIgs...
(Score: 2) by turgid on Tuesday September 27 2022, @04:49PM (1 child)
You sold your Amiga 1000 for an Apple IIgs? Why?
I refuse to engage in a battle of wits with an unarmed opponent [wikipedia.org].
(Score: 3, Interesting) by Rich on Tuesday September 27 2022, @08:13PM
Mostly because I was an Apple II guy, and, as we'd say today "heavily invested in the ecosystem". Also, having interest in electronic music, the Ensoniq DOC 5503 sound chip won me over. Had they shipped an AY3-8910 as they originally considered, I wouldn't have switched back down.
I even knew people who had ties into Commodore Germany, but somehow the local Amiga scene seemed not to be a fertile ground for new things. Whereas the Apple side (near zero manufacturer contact AND very uncompetitive pricing over here) was a neat playground, even with very few IIgs sold. And still, paid commissions were coming in between the hobby stuff.
I guess the main driver in the '80s was the Apple II's slots, which enabled all kinds of stuff and built up a momentum that lasted long beyond the time when the Apple II was leading edge (and eventually was absorbed by the IBM PC compatibles with their slots).