posted by hubie on Wednesday May 08 2024, @12:24AM

http://www.righto.com/2024/04/intel-8088-bus-state-machine.html

The 8088 processor communicates over the bus with memory and I/O devices through a highly-structured sequence of steps called "T-states." A typical 8088 bus cycle consists of four T-states, with one T-state per clock cycle. Although a four-step bus cycle may sound straightforward, its implementation uses a complicated state machine, making it one of the most difficult parts of the 8088 to explain. First, the 8088 has many special cases that complicate the bus cycle. Moreover, the bus cycle is really six steps, with two undocumented "extra" steps to make bus operations more efficient. Finally, the complexity of the bus cycle is largely arbitrary, a consequence of Intel's attempts to make the 8088's bus backward-compatible with the earlier 8080 and 8085 processors. However, investigating the bus cycle circuitry in detail provides insight into the timing of the processor's instructions. In addition, this circuitry illustrates the tradeoffs and implementation decisions that are necessary in a production processor. In this blog post, I look in detail at the circuitry that implements this state machine.
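
Not from TFA, but to make the T-state sequence concrete: below is a toy Python sketch of the documented four-state cycle with READY-controlled wait states. The T1-T4 and Tw names come from Intel's documentation; the two undocumented "extra" states the article discusses are omitted, and the READY sampling is simplified.

def bus_cycle(ready_samples):
    """Yield the T-state for each clock of one simplified 8088-style bus cycle.

    ready_samples: booleans sampled while in T3/Tw; each False inserts
    one wait state (Tw) before T4. Purely illustrative, not cycle-accurate.
    """
    yield "T1"   # address driven onto the multiplexed address/data bus
    yield "T2"   # bus turned around, status/control signals active
    yield "T3"   # data transfer, READY sampled
    ready = iter(ready_samples)
    while not next(ready, True):    # device not ready: stretch the cycle with Tw
        yield "Tw"
    yield "T4"   # transfer completes, cycle ends

# Example: a slow device that needs two extra clocks before it is ready.
print(list(bus_cycle([False, False, True])))
# ['T1', 'T2', 'T3', 'Tw', 'Tw', 'T4']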


Original Submission

  • (Score: 4, Insightful) by Tork on Wednesday May 08 2024, @12:28AM (13 children)

    by Tork (3914) Subscriber Badge on Wednesday May 08 2024, @12:28AM (#1356132) Journal
    I love articles like these, but I don't really get them. I spend half the time going "how did they ever build a computer without a computer to help design it?" heh
    --
    🏳️‍🌈 Proud Ally 🏳️‍🌈
    • (Score: 5, Funny) by Rosco P. Coltrane on Wednesday May 08 2024, @03:01AM (12 children)

      by Rosco P. Coltrane (4757) on Wednesday May 08 2024, @03:01AM (#1356148)

      "how did they ever build a computer without a computer to help design it?"

      Dude, this is 2024. The question you should ask yourself is how they built a computer without AI.

      • (Score: 2, Funny) by Anonymous Coward on Wednesday May 08 2024, @03:15AM (5 children)

        by Anonymous Coward on Wednesday May 08 2024, @03:15AM (#1356152)

        How did they build AI without AI?

        • (Score: 5, Insightful) by Anonymous Coward on Wednesday May 08 2024, @04:13AM (4 children)

          by Anonymous Coward on Wednesday May 08 2024, @04:13AM (#1356168)

          The architectural achievements of the ancients never cease to amaze me.

          I can't help but consider our own technology and wonder if it will join the construction techniques of the ancients in the ranks of obscurity.

          With all the encryption we are so fond of that enforces intellectual property ownership, the number of people who control the "keys" dwindles, and even they cannot build such a thing themselves.

          People thrashing through the remains of our civilization won't have a clue how to start it back up. For now, our legacy systems are still usable, but given hundreds of years, even they will corrode to dust.

          • (Score: 5, Insightful) by Rich on Wednesday May 08 2024, @09:28AM (3 children)

            by Rich (945) on Wednesday May 08 2024, @09:28AM (#1356185) Journal

            Two days ago, I got nerd-triggered into a "Duty Calls" operation (cf. https://xkcd.com/386/ [xkcd.com]) by a "Tagesschau" (the most important German news outlet) article about Zeiss and ASML. The article claimed that a CO2 laser generates the EUV (the tin droplets hit by its IR beam do), that the "Apollo 11 chip" had 1000 transistors (it had 6, and was also deeply analyzed by Ken Shirriff from TFA), and that ASML makes microprocessors (their customers do). The article was then corrected in these three places (without a changelog, and it still says that "around 1970, chips had 1000 transistors"; the NE555 certainly had fewer).

            Later I told someone what I did, and had to explain that there is such a thing as "digital archaeology". It is mostly done out of curiosity by random nerds, but it probably should be the most important field in archaeology, because no one will be able to reconstruct the path from Shockley's point contact experiment to an ASML Twinscan - and how progress in the end products went along with it. Fortunately, we have the early microprocessors on OpenCores, but the fact that this (https://opencores.org/projects/a-z80 [opencores.org]) models a Spectrum, with a big joystick in front in the picture, demonstrates that this isn't some official project done by a nation state; preserving civilization is done by nerds for fun.

            What a single nerd can do ends not far beyond the Z80's complexity (but, on topic, there IS an 8088 project on OC: https://opencores.org/projects/rtf8088 [opencores.org]).

            • (Score: 5, Informative) by EEMac on Wednesday May 08 2024, @02:36PM

              by EEMac (6423) on Wednesday May 08 2024, @02:36PM (#1356206)

              > preserving civilization is done by nerds for fun.

              Case in point: the TRS-80 Model II archive [github.com] is a ridiculously complete set of documentation and software for an obscure-but-capable computer, maintained by one guy just because he was interested.

            • (Score: 2, Informative) by Anonymous Coward on Wednesday May 08 2024, @06:10PM (1 child)

              by Anonymous Coward on Wednesday May 08 2024, @06:10PM (#1356233)

              Same instruction-level architecture, RADICALLY different physical and toolchain implementation.

              Multi-phase NMOS logic doesn't look or work the same as CMOS LUT-based designs.

              Ken is doing God's work documenting how the actual transistor-level logic works.

              • (Score: 4, Insightful) by Rich on Wednesday May 08 2024, @07:12PM

                by Rich (945) on Wednesday May 08 2024, @07:12PM (#1356244) Journal

                I'd assume the logic descriptions are somewhat close, probably most of all for the exactly understood 6502, and the LUT layout is done by the RTL compiler, so the logic part isn't that much of a worry (also given that by 1982 all the old chips appeared in CMOS variants). For the exact NMOS layouts, you're of course right, and Ken's work is indeed incredibly valuable. Paper copies of old "semiconductor design" books might still exist, but they will not cover the secret vendor tricks that Ken uncovers.

      • (Score: 4, Funny) by krishnoid on Wednesday May 08 2024, @04:13AM (5 children)

        by krishnoid (1156) on Wednesday May 08 2024, @04:13AM (#1356169)

        Since you asked [reddit.com], according to Google's generative AI:

        Regular computers are programmed with predetermined instructions, and will only produce a result if they are explicitly told how to achieve it. AI systems, on the other hand, are designed to mimic human intelligence and can learn, adapt, and make decisions based on data without explicit programming.

        • (Score: 2) by krishnoid on Wednesday May 08 2024, @04:15AM (4 children)

          by krishnoid (1156) on Wednesday May 08 2024, @04:15AM (#1356170)

          No wait, there's more:

          The first mechanical computer, Charles Babbage's Difference Engine, was designed in the 1820s. It was powered by steam and used a hand crank to calculate a series of values and print the results in a table.

          From the invention of computer programming languages up to the mid-1970s, most computer programmers created, edited, and stored their programs line by line on punch cards.

          In the 1980s, there was a boom in AI research, due to both breakthroughs in research and additional government funding. Deep Learning techniques and the use of Expert Systems became more popular. These techniques allowed computers to learn from their mistakes and make independent decisions.

          Today, AI is used in a wide variety of applications, including self-driving cars, facial recognition, and medical diagnosis.

          • (Score: 2) by Freeman on Wednesday May 08 2024, @01:38PM (2 children)

            by Freeman (732) on Wednesday May 08 2024, @01:38PM (#1356200) Journal

            What happened to the 90s and 2000s? I'm guessing "AI" research was pretty much a joke until the 2000s. Sure, there may have been some interesting things thought and said about the potential for "AI", but actual "AI"-like systems were pretty crude back then. Nowadays, they've gotten a lot better at faking it.

            --
            Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
            • (Score: 4, Insightful) by DannyB on Wednesday May 08 2024, @04:39PM (1 child)

              by DannyB (5839) Subscriber Badge on Wednesday May 08 2024, @04:39PM (#1356224) Journal

              There were exciting things in the 1990s. I was into Common Lisp back then. I subscribed to AI magazine. (I threw them all out a couple years ago.)

              Symbolic Reasoning was the thing. Expert Systems were all the rage. Term rewriting, unification pattern matchers. Computer Algebra Systems (in your pocket with the TI-89). Languages like Prolog, OPS/5 [dtic.mil], and others.

              Some great successes, just as today, fueled a huge hype cycle. However, while the symbolic reasoning approach yields many useful tools, it now seems fundamentally wrong compared to "modern" AI, which is neural networks, Bayes networks, machine learning, and statistical predictive approaches based on training.

              What happened next was called the "AI Winter". Funding dried up: the big investments were no longer producing great successes at the pace the early wins had promised.

              As a Common Lisp fan, I remember realizing that "Worse is Better" [wikipedia.org] was going to win the day. (translated: everyone was going to use C instead of Common Lisp)

              Nonetheless, I had tons of fun learning about Minimax with Alpha-Beta Pruning and building game playing programs (TicTacToe, Connect4, Reversi, Checkers). Learning efficient ways to write Breadth First Search, and A* search for solving puzzle problems that have big combinatorial explosions. In 2010 I wrote a fun program to solve the puzzle known as "Unblock Me" or "Traffic Jam". I've been meaning to write my own Sudoku solver -- yes I know that's a solved problem -- but it is the challenge of doing it myself. I know the right toolbox of tricks. It's the fun of it. These problems are why I prefer garbage collected languages. Memory management and the size of a word on the machine hardware are the last things I want to be thinking about. I want to be using high order languages.
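
              For reference, a minimal sketch of the minimax-with-alpha-beta-pruning idea mentioned above; illustrative only, written against an assumed game-state interface (terminal(), score(), moves(), and play() are placeholder names, not taken from any of those programs):

import math

def alphabeta(state, depth, alpha, beta, maximizing):
    # Evaluate leaf positions (or the depth limit) with the heuristic score.
    if depth == 0 or state.terminal():
        return state.score()          # value from the maximizing player's view
    if maximizing:
        best = -math.inf
        for move in state.moves():
            best = max(best, alphabeta(state.play(move), depth - 1,
                                       alpha, beta, False))
            alpha = max(alpha, best)
            if alpha >= beta:         # beta cutoff: opponent will avoid this line
                break
        return best
    else:
        best = math.inf
        for move in state.moves():
            best = min(best, alphabeta(state.play(move), depth - 1,
                                       alpha, beta, True))
            beta = min(beta, best)
            if beta <= alpha:         # alpha cutoff: maximizer already has better
                break
        return best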

              --
              Did you hear of the police having the tires stolen from their cars? They are working tirelessly to catch the culprits.
          • (Score: 3, Funny) by Tork on Wednesday May 08 2024, @04:20PM

            by Tork (3914) Subscriber Badge on Wednesday May 08 2024, @04:20PM (#1356220) Journal

            In the 1980s, there was a boom in AI research, due to both breakthroughs in research and additional government funding. Deep Learning techniques and the use of Expert Systems became more popular. These techniques allowed computers to learn from their mistakes and make independent decisions.

            Ten bucks says that's a verbatim quote from an obscure straight-to-rental 80's movie. Like... The Particularizer, or Dead Time, or something like that.

            --
            🏳️‍🌈 Proud Ally 🏳️‍🌈