https://mark.engineer/2022/04/calculating-pi-digits-on-first-intel-microprocessor-intel-4004/
One day I thought about the performance gap between the first Intel processor and modern machines. Of course, we can make some estimates empirically – we know the clock rate, how the pipeline is organized, and what features the Intel 4004 has (though it would not be a standard FLOPS figure, because there was no hardware support for floating-point numbers yet). But a few details get in the way: the architecture's bit width (only 4 bits, compared with the modern 64!), a very limited instruction set (it's missing even basic logical operators like AND or XOR), and peripheral limitations (ROM/RAM access).
So I decided to research the subject in practice. After some thought, I chose calculating the digits of π as the benchmark. After all, even ENIAC did that (in 1949) and set a new record for the number of digits computed.
Usually we choose hardware based on our goals. But in this case we need to choose an algorithm based on the restrictions that come with the Intel 4004. So what do we have?
The CPU is very basic, and its instruction set has very few ALU operations: addition/subtraction of 4-bit operands, inversion (NOT), and rotation left/right. And... that's all, folks. No multiplication, no division, and no other logical operators.
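To give a feel for what that means in practice, every multi-digit operation has to be built out of 4-bit pieces with an explicit carry. Here is a minimal Python sketch of the idea (illustrative only, not 4004 code – on the real chip it is roughly a loop of read/add-with-carry/write instructions over RAM characters):
from itertools import zip_longest

def add_nibbles(a, b):
    # a, b: numbers stored as lists of 4-bit digits, least significant first
    out, carry = [], 0
    for x, y in zip_longest(a, b, fillvalue=0):
        s = x + y + carry
        out.append(s & 0xF)   # keep only the low 4 bits
        carry = s >> 4        # carry into the next digit
    if carry:
        out.append(carry)
    return out

print(add_nibbles([0xB, 0x7], [0x8, 0xC, 0x1]))  # 123 + 456 -> [0x3, 0x4, 0x2], i.e. 0x243 = 579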
(Score: 2, Interesting) by Anonymous Coward on Monday April 18 2022, @04:59AM (2 children)
It is now so easy:
echo "scale=1000; 4*a(1)" | bc -l
Newton would shake his head in disbelief.
(Score: 3, Interesting) by FatPhil on Monday April 18 2022, @08:35AM
Here's a somewhat depressing recent example of this problem in action: http://youtu.be/dtiLxLrzjOQ
Shoulda used a streaming spigot algorithm instead...
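For reference, Gibbons' unbounded spigot streams decimal digits one at a time, with no digit count chosen up front (its state is just a handful of ever-growing big integers); a Python transcription:
def pi_digits():
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            yield n
            q, r, t, k, n, l = 10 * q, 10 * (r - n * t), t, k, (10 * (3 * q + r)) // t - 10 * n, l
        else:
            q, r, t, k, n, l = q * k, (2 * q + r) * l, t * l, k + 1, (q * (7 * k + 2) + r * l) // (t * l), l + 2

gen = pi_digits()
print([next(gen) for _ in range(10)])  # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]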
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
(Score: 0) by Anonymous Coward on Monday April 18 2022, @12:51PM
Or in a modern shell, with one less process: bc -l <<< "scale=1000;4*a(1)"
(Score: 0, Interesting) by Anonymous Coward on Monday April 18 2022, @05:05AM (6 children)
I have to say, as someone with a much longer historical perspective, that all this nostalgia for computing only a few decades ago really puzzles me. Is it that the basics are no longer taught? So that any sufficiently advanced tech not only appears to be magic, but to the current lot, is? Or, do all the aging nerds on SoylentNews long for the days when they could program a breadboard, or get a cursor to move on a monitor, albeit, a printer monitor? I would seriously like to know who pushes this area of interest. I know Ncommander has done some of this. Anyone else on the staff who is seriously retro-enamoured?
(Score: -1, Offtopic) by Anonymous Coward on Monday April 18 2022, @05:08AM
Oh, forgot to say, this is me.
(Score: 4, Interesting) by turgid on Monday April 18 2022, @12:42PM
For me, as modern computers and software become larger and more complex, I'm further away from the very basics, the elementary stuff that makes it all work. I was lucky in that I started on 8-bit micros with a few kB of RAM and BASIC built in. I had to learn machine code. I had to understand how multiplication, division and trig functions were implemented using adds, subtracts and bit shifts.
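The classic trick for multiplication is shift-and-add; a rough Python sketch of what a micro without a MUL instruction has to do in software:
def mul_shift_add(a, b):
    # multiply non-negative integers using only shifts, adds and a bit test
    result = 0
    while b:
        if b & 1:         # low bit of b set?
            result += a   # add the current shifted copy of a
        a <<= 1           # a *= 2
        b >>= 1           # move on to the next bit of b
    return result

print(mul_shift_add(123, 456))  # 56088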
I found all this stuff intrinsically interesting and personally rewarding to do myself. In simple words, it's fun. It's also a very good grounding in maths, which sets you up for life.
I hate to sound like an old geezer, but there are people in their teens and twenties today who don't know any of this stuff, and may have thought about it but have never had the opportunity to experience it for themselves. I'm not sure writing trig functions using just basic arithmetic in Python is high on anyone's list of fun things to try.
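For the record, the whole exercise fits in a dozen lines; a minimal sketch of a sine built from nothing but +, -, * and /:
def sin_taylor(x, terms=12):
    # Taylor series: x - x^3/3! + x^5/5! - ...
    term, total = x, x
    for n in range(1, terms):
        term *= -x * x / ((2 * n) * (2 * n + 1))  # each term from the previous one
        total += term
    return total

print(sin_taylor(1.0))  # ~0.8414709848, i.e. sin(1)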
I remember the great sense of wonder I experienced when I learned logarithms for the first time, and then calculus. It's that sort of thing.
I refuse to engage in a battle of wits with an unarmed opponent [wikipedia.org].
(Score: 5, Interesting) by ElizabethGreene on Monday April 18 2022, @03:47PM (2 children)
I don't have a strong formal computer science education. I'm reasonably clever at solving problems, but I'm the first to admit there is a lot I don't know.
Watching Ben Eater's series on creating a basic computer on breadboards filled in a lot of gaps for me, e.g. I now understand what the "microcode updates" published by Intel actually do. I understood before that some instructions took multiple clock cycles to complete; now I understand why. I didn't understand how plugging in a 387 math coprocessor would "just work" without code changes; now I do.
Another set of courses in that vein was "Nand to Tetris" on Coursera. Those courses took me conceptually from logic gates and flip-flops to multiplexers, and from there to a functional computer (video memory, I/O, stack, etc.; I wish they'd covered interrupts more). The second half of the course went further, building an assembly-to-machine-language translator, a high-level-language compiler, a basic operating system, and virtualization.
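The "everything from NAND" part really is as literal as it sounds; a quick Python sketch of the first few steps, up to a 2-to-1 multiplexer:
def nand(a, b):          # the one primitive
    return 1 - (a & b)

def not_(a):             return nand(a, a)
def and_(a, b):          return not_(nand(a, b))
def or_(a, b):           return nand(not_(a), not_(b))
def mux(a, b, sel):      # picks a when sel == 0, b when sel == 1
    return or_(and_(a, not_(sel)), and_(b, sel))

print([mux(a, b, s) for a, b, s in [(0, 1, 0), (0, 1, 1), (1, 0, 0), (1, 0, 1)]])  # [0, 1, 1, 0]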
Part of me is nostalgic. I remember wanting a computer, and now I know I could have built one. The other part is foundational knowledge that I haven't picked up elsewhere. Also, "I built a computer from scratch on a PCB I laid out and etched myself" is pretty cool nerd cred. :)
(Score: 2) by krishnoid on Monday April 18 2022, @04:43PM (1 child)
I've held that if you can understand how CPU instructions map to assembly, and how those build up into the compiler output for a given language's expressions, then you can understand how *any* high-level language has to generate code against that same foundation of CPU instructions, (hopefully) flat memory addressing, stack and heap, etc. Learning a new language then doesn't have to be an unintelligible nightmare [youtu.be].
(Score: 2) by ElizabethGreene on Tuesday April 19 2022, @04:05AM
I get what you're saying. Most languages do feel like nifty wrappers around the same primitive ideas.
Then you've got the oddballs over in the corner handing out free candy. Yes, LISP, I'm looking at you. :)
(Score: 0) by Anonymous Coward on Monday April 18 2022, @06:43PM
I think just about anything is subject to nostalgia. But I don't think this is the same thing. There weren't any 4004-based PCs. Instead, as I think is apparent from the article, this is driven by the usual hacker desire to make hardware do things it isn't supposed to be capable of. The more unusual the hardware, the more interesting.
This would be pretty easy on a 6502, with proper logical operators and a lot more memory that can be filled with lookup tables. That might entertain whoever decided to do it, but it wouldn't be an interesting article. And it's only a little bit newer than the 4004. The 4004 is in that weird space where it is capable of the task but unsuited to it. That makes it interesting to read about.
(Score: 2) by Kell on Monday April 18 2022, @05:33AM (3 children)
This is a really interesting project. I'm trying to do something similar, but with the intention of building a relay computer that produces decimal digits of pi and outputs on a punched tape. The strict memory limitations of working with physical relays will make algorithm design very important. I'm currently looking at spigot algorithms as a potential class, but as yet there is no algorithm I know of with memory demands that scale 1:1 with desired output size. If any of the smart math wonks here have insights, I'd love suggestions!
Scientists ask questions. Engineers solve problems.
(Score: 3, Informative) by FatPhil on Monday April 18 2022, @08:12AM (1 child)
There are time/space trade-offs. I only recently came across an oldish paper (early 2000s?) on this matter, but alas it's not one of the ones linked from the Wikipedia spigot-algorithm page, and a web search doesn't find it either. However, I'm not sure whether it let space grow larger than a plain spigot, or time. Gourdon and Bellard have done a lot on this, Percival and Plouffe too; I'm sure if you do a thorough enough literature search, you'll come across something.
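In the meantime, the digit-extraction branch of that literature (the BBP formula Plouffe's name is attached to) is easy to play with; a rough Python sketch of the textbook trick, not of the paper I'm thinking of, which pulls out a single hex digit of pi with only a few floats of state:
def pi_hex_digit(n):
    # n-th hexadecimal digit of pi after the point (n >= 1), via the BBP formula
    def series(j):
        s = 0.0
        for k in range(n):                    # head: modular exponentiation keeps numbers small
            s = (s + pow(16, n - 1 - k, 8 * k + j) / (8 * k + j)) % 1.0
        k, t = n, 0.0                         # tail: ordinary small terms
        while 16.0 ** (n - 1 - k) / (8 * k + j) > 1e-17:
            t += 16.0 ** (n - 1 - k) / (8 * k + j)
            k += 1
        return s + t
    x = (4 * series(1) - 2 * series(4) - series(5) - series(6)) % 1.0
    return "0123456789abcdef"[int(16 * x)]

print("".join(pi_hex_digit(i) for i in range(1, 9)))  # 243f6a88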
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
(Score: 3, Interesting) by Kell on Monday April 18 2022, @01:46PM
Good resources there, thanks! I've read the Percival and Plouffe stuff, and a few others too. My back-of-the-mind goal was to produce a digit of pi every second until the end of the universe... but the amount of memory required was infeasible. If your point about the roughly 1:1 memory growth is correct, then making the system work until the death of the solar system is much more feasible, requiring a paltry 58 bits*. That would suggest a 64-bit implementation would be appropriate. Up next: figuring out how many registers will be required. I don't really want to make a full ALU that reuses general-purpose registers: I'd like to make each computational stage its own register so that I can more easily debug it and verify that everything is operating correctly. Fun times!
*Yes, I am intensely aware that the relays will wear out long before this. That's not the point.
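(A quick sanity check of the 58, assuming it's the width of a counter ticking one digit per second for the Sun's remaining ~5 billion years:)
import math
seconds_left = 5e9 * 365.25 * 24 * 3600                          # ~5 billion years, in seconds
print(f"{seconds_left:.2e} digits at one per second")            # ~1.58e+17
print(math.ceil(math.log2(seconds_left)), "bits to count them")  # 58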
Scientists ask questions. Engineers solve problems.
(Score: 2, Interesting) by Anonymous Coward on Monday April 18 2022, @03:09PM
> the intention of building a relay computer that produces decimal digits of pi and outputs on a punched tape
Well, historically, Babbage was supposed to be doing logarithms with his Difference Engines, so I'd consider that the goal of an edwardianpunk computer.
My approach was an OISC/Subleq design implemented in Verilog, which has an HLL compiler (Oleg Mazonka's HSQ)... I got as far as generating factorials before I got bored...
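For anyone who hasn't met it: the entire execution model is one instruction. A minimal Python sketch of a SUBLEQ machine (the halt-on-negative-jump convention here is just one common choice, not necessarily HSQ's):
def subleq(mem, ip=0):
    # SUBLEQ: mem[b] -= mem[a]; jump to c if the result is <= 0, else fall through
    while 0 <= ip < len(mem):
        a, b, c = mem[ip], mem[ip + 1], mem[ip + 2]
        mem[b] -= mem[a]
        ip = c if mem[b] <= 0 else ip + 3
    return mem

# one instruction that negates mem[3] into mem[4], then halts via the -1 jump target
print(subleq([3, 4, -1, 7, 0]))  # [3, 4, -1, 7, -7]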