Q&A With Co-Creator of the 6502 Processor

posted by martyb on Monday September 20 2021, @07:46AM

https://spectrum.ieee.org/q-a-with-co-creator-of-the-6502-processor

Few people have seen their handiwork influence the world more than Bill Mensch. He helped create the legendary 8-bit 6502 microprocessor, launched in 1975, which was the heart of groundbreaking systems including the Atari 2600, Apple II, and Commodore 64. Mensch also created the VIA 65C22 input/output chip—noted for its rich features, and crucial to the 6502's overall popularity—and the second-generation 65C816, a 16-bit processor that powered machines such as the Apple IIGS and the Super Nintendo console.

Many of the 65x series of chips are still in production. The processors and their variants are used as microcontrollers in commercial products, and they remain popular among hobbyists who build home-brewed computers. The surge of interest in retrocomputing has led to folks once again swapping tips on how to write polished games using 6502 assembly code, with new titles being released for the Atari, BBC Micro, and other machines.

Mensch, an IEEE senior life member, splits his time between Arizona and Colorado, but folks in the Northeast of the United States will have the opportunity to see him as a keynote speaker at the Vintage Computer Festival in Wall, N.J., on the weekend of 8 October. In advance of Mensch's appearance, The Institute caught up with him via Zoom to talk about his career.


Original Submission

  • (Score: 4, Funny) by Rosco P. Coltrane on Monday September 20 2021, @07:51AM (2 children)

    by Rosco P. Coltrane (4757) on Monday September 20 2021, @07:51AM (#1179615)

    Mensch, an IEEE senior life member, splits his time between Arizona and Colorado

    This is called the white hair trail. It extends all the way to Florida. It is the preferred route of migration for pensioners in autumn and spring.

    • (Score: 2) by aristarchus on Monday September 20 2021, @08:12AM

      by aristarchus (2645) on Monday September 20 2021, @08:12AM (#1179618) Journal

      Snowbirds of an earlier generation? Or is it different for the grey-neckbearded nerds of old?

    • (Score: 0) by Anonymous Coward on Monday September 20 2021, @12:31PM

      by Anonymous Coward on Monday September 20 2021, @12:31PM (#1179634)

      Why are you picking on the little Rabbits, Sheep? Keep to your Soy latte flock and leave the little silicone hole searching Rabbits alone.
      ;)

  • (Score: 4, Informative) by driverless on Monday September 20 2021, @01:30PM

    by driverless (4770) on Monday September 20 2021, @01:30PM (#1179646)

    ... you can't beat "On the Edge: The Spectacular Rise and Fall of Commodore", which despite its title is actually the best history of the 6502 anyone's ever written.

  • (Score: 5, Informative) by DannyB on Monday September 20 2021, @02:22PM (3 children)

    by DannyB (5839) Subscriber Badge on Monday September 20 2021, @02:22PM (#1179658) Journal

    A few years ago I re-read the old BYTE magazines from the beginning up until shortly after the 1984 introduction of the first Macintosh.

    My reaction was: wow, I had forgotten how shockingly primitive the technology was. I suddenly realized how important standardization was when the "holy trinity" (TRS-80, Apple II, and Commodore PET) was introduced in 1977. Suddenly there could be off-the-shelf commercial software. Before this, something as simple as a keyboard didn't have any kind of standard interface to the computer. How could you possibly write standard commercial software? The closest thing to a standard was a serial port to a terminal.

    Today's microcontrollers are way more powerful than early systems were. And those early systems cost a LOT of money, in 1970s dollars that were actually worth something.

    The 6502 was pretty clever. Woz's "Sweet 16" extended instruction set was also clever. It's amazing the things people did back then.

    Download old BYTE magazines from here: https://www.americanradiohistory.com/Byte_Magazine.htm [americanradiohistory.com]

    Or higher quality scans here: https://archive.org/details/byte-magazine [archive.org]

    Or Popular Electronics: https://www.americanradiohistory.com/Popular-Electronics-Guide.htm [americanradiohistory.com]

    Creative Computing: https://archive.org/details/creativecomputing [archive.org]

    --
    To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
    • (Score: 5, Interesting) by DannyB on Monday September 20 2021, @02:23PM (2 children)

      by DannyB (5839) Subscriber Badge on Monday September 20 2021, @02:23PM (#1179659) Journal

      BYTE magazine, April 1980, page 115.

      NEW HIGH-SPEED COMMUNICATIONS BUS: Xerox Corporation recently made a public announcement of a new concept of processor-to-processor communications intended for an office environment. This novel concept is called "Ethernet", and is a result of some of the work being done in their research labs. In this concept, a single coaxial cable is used as a high-speed communications bus between all processors; communication protocol is handled through software or software supplemented by special-purpose hardware. Rumor has it that an Ethernet processor is now being developed by some form of joint arrangement between Xerox and Intel.

      --
      To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
      • (Score: 5, Funny) by vux984 on Monday September 20 2021, @03:49PM (1 child)

        by vux984 (5045) on Monday September 20 2021, @03:49PM (#1179696)

        You always hear about these new technologies at the research stage, and then we never hear about them again. Call me when this Ethernet is an actual product I can buy.

        • (Score: 3, Funny) by DannyB on Monday September 20 2021, @04:24PM

          by DannyB (5839) Subscriber Badge on Monday September 20 2021, @04:24PM (#1179713) Journal

          This is only one example of the amusement, sometimes astonishment, of looking back at the old BYTE magazines.

          BYTE 1978-July, pg 42
          Conversation overheard in local computer store:
          Customer: What's the difference between static and dynamic memory?
          Salesman: Static memory works, and dynamic memory doesn't.

          --
          To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
  • (Score: 2, Interesting) by Anonymous Coward on Monday September 20 2021, @04:55PM

    by Anonymous Coward on Monday September 20 2021, @04:55PM (#1179737)

    What a nerdfest article! Good find. Of course VIA was born from a real needs discussion over lunch. If only we had more such meetings!

    from my work with I/O chips, I knew [how computers were used] in the real world. People want to work with the 65x chips because they are accessible.

    More HW engineers who dogfood, *please*.

  • (Score: 2) by Rich on Monday September 20 2021, @07:51PM (2 children)

    by Rich (945) on Monday September 20 2021, @07:51PM (#1179814) Journal

    Semi-OT, but close enough to TFA and WDM's work to be post-worthy:

    I wasted a metric shitload of time on home computer archaeology in the last few days, probably triggered by the news of the passing of Sir Clive. Actually, just before I saw this article, I skimmed over the undeveloped 65C4 design requirements. Interesting that they assumed at MOS in 1982 that a memory-only architecture was the way forward, while the ARM people mentioned in TFA actually were the first to get it right from a modern standpoint. OK, the 68K might have been a contender if it had survived until micro/macro-decoding became what it is now. (In my binge, I also read the transcript of the 68K design group interview.)

    Anyway, the question driving me was how an early 80s home computer might have looked if anyone had assumed that the quest for arcade-grade gaming (and none of this "filing cooking recipes") was the main driver for computing at home. Woz did realize this in '76 and had Break-Out as a benchmark, and Tramiel must have had such an idea as well, when the Atari ST was offered with full RGB and hi-res paperwhite displays (for the now accepted uses of replacing typewriters and calculators).

    I came to the conclusion that the peculiarities of the NTSC signal would have mandated a 160x200/4-bit colo(u*)r mode and a 640x200/1-bit text mode. This would require video accesses at 1.79 MHz, which would neatly fit into one phase of a 68B09E bus cycle and also into the 270ns cycle of then-available memory. But would this be the right CPU? The 6502 is hopelessly underpowered to do gaming in a 16K framebuffer, and I don't think a similarly synchronized 65816 could outperform a 6809 either. The Z80 is cheaper than the 6809, and might be tempting especially in its Z80H variant at 7.16MHz - but it becomes non-deterministic with the memory interleave. As does the 8088, which is fugly, but somehow hits a few sweet spots with the 64K limit. Finally, a sort-of high-end choice could be the 68008 - but would this be throttled too much by both its 16-bit instructions and the video-limited 8-bit memory? The Z80 and 8088 could run CP/M and MS-DOS and therefore have Turbo Pascal for developers; the 6809 could run OS-9, but that lacks a bit in end-user offerings.
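    A quick back-of-the-envelope check of those numbers, sketched in Python (standard NTSC constants; the variable names and the half-cycle interleave reading are mine):

    NTSC_BURST_HZ = 3_579_545                  # NTSC colour subcarrier
    video_rate_hz = NTSC_BURST_HZ / 2          # ~1.79 MHz video access rate
    slot_ns = 1e9 / NTSC_BURST_HZ              # one subcarrier period, ~279 ns

    fb_4bpp = 160 * 200 * 4 // 8               # 16,000 bytes (~16K) for 160x200/4-bit
    fb_1bpp = 640 * 200 * 1 // 8               # 16,000 bytes again for 640x200/1-bit

    print(f"video access period: {1e9 / video_rate_hz:.0f} ns")        # ~559 ns
    print(f"per-phase slot (CPU/video interleave): {slot_ns:.0f} ns")  # ~279 ns > 270 ns RAM cycle
    print(f"framebuffers: {fb_4bpp} and {fb_1bpp} bytes")

    Both modes land on the same 16K buffer, and the 270 ns memory cycle squeaks in under the ~279 ns half-cycle, which is presumably the "neatly fits" above.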

    What would be your design win?

    * I tend to use British spelling, but as this wouldn't work in PAL countries, I have to resort to American. Much as the Zed-Ex 80 computer has a Zee 80 CPU ;)

    • (Score: 3, Informative) by shortscreen on Tuesday September 21 2021, @03:51AM (1 child)

      by shortscreen (2252) on Tuesday September 21 2021, @03:51AM (#1179919) Journal

      Anyway, the question driving me was how an early 80s home computer might have looked if anyone had assumed that the quest for arcade-grade gaming (and none of this "filing cooking recipes") was the main driver for computing at home. Woz did realize this in '76 and had Break-Out as a benchmark, and Tramiel must have had such an idea as well

      I'd say there were many others. I mean the C64 didn't have hardware sprites and the SID to help you balance your checkbook. The Atari 800 launched in '79 with a video chipset that was a substantial evolution over the VCS game console (said chipset later being recycled for the Atari 5200), and the POKEY sound + I/O chip which went on to be used in Atari's own arcade systems. The 800 had good ports of arcade titles like Pacman, Centipede, Space Invaders, and all that. There was plenty of cross pollination going on between home computers, arcade games, and game consoles right from the beginning.

      When the story about Sinclair ran on SN, there was a thread about the TI-99, another home computer which was a big deal. The TI-99 itself fell by the wayside but its 9918 video chip was more-or-less copied by every Japanese game system throughout the '80s. Sega started out using the actual TMS9918 until they developed their own which remained backwards compatible. The MSX standard and Yamaha also ran with it. The Hudson Soft chipset used by NEC and Nintendo's PPU are similar designs. They all have a character-mapped background with 8x8 characters, sprite multiplexing, separate memory bus which the CPU has to access through address/data registers, and similar memory access patterns where character data gets loaded during the active display and sprite data is loaded during H-blank.
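      To make the name-table/pattern-table idea concrete, here is a toy Python model of a character-mapped background of that general shape (the layout and numbers are generic, not any specific chip's registers):

      # Name table: one byte per 8x8 cell; pattern table: 8 bitmap bytes per tile.
      pattern_table = {0x41: [0x18, 0x24, 0x42, 0x7E, 0x42, 0x42, 0x42, 0x00]}  # a crude 'A'
      name_table = [[0x41] * 32 for _ in range(24)]      # 32x24 grid of cell indices

      def pixel(x, y):
          """Resolve screen pixel (x, y): name table -> pattern table -> bit."""
          cell = name_table[y // 8][x // 8]              # which 8x8 cell?
          row = pattern_table[cell][y % 8]               # that tile's bitmap row
          return (row >> (7 - (x % 8))) & 1              # bit for this column

      print(pixel(3, 3))   # samples one pixel inside the 'A' tile -> 1

      Animating then means rewriting a few bytes of the name table rather than repainting pixels, which is the whole trick.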

      I came to the conclusion that the peculiarities of the NTSC signal would have mandated a 160x200/4-bit colo(u*)r mode and a 640x200/1-bit text mode. This would require video accesses at 1.79 MHz, which would neatly fit into one phase of a 68B09E bus cycle and also into the 270ns cycle of then-available memory.

      The Atari 800, 5200, and 7800 all use a 1.79MHz CPU clock sync'd with the pixel clock. But they don't interleave memory accesses for CPU and video; instead, the CPU is halted while video data is being accessed. So that usually means you get something like 160x200 with 2-bit color, which doesn't steal all the bandwidth. I'm sure that fast enough memory had become available that they could have done interleaved access to increase their bandwidth, but they never did. In the grand scheme of things, having your CPU and video share the same memory is bad for performance, and that is just as true now as it was then.

      BTW, using a pixel clock of 1.5x the color burst turned out to be pretty popular. 5.37MHz = ~256 pixels.
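      A rough Python check of that figure (approximate NTSC line timings, not exact to any one machine):

      burst_hz = 3_579_545
      pixel_clock_hz = 1.5 * burst_hz               # ~5.37 MHz
      px256_us = 256 / pixel_clock_hz * 1e6         # ~47.7 us for 256 pixels
      active_us = 52.6                              # roughly the visible part of a ~63.6 us line
      print(f"{pixel_clock_hz / 1e6:.2f} MHz, 256 px = {px256_us:.1f} us of ~{active_us} us active")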

      The 6502 is hopelessly underpowered to do gaming in a 16K framebuffer

      Everything was underpowered to do gaming with a framebuffer, which is why it generally wasn't attempted except on systems that had nothing else. Even with a fancy blitter in addition to a framebuffer, you'd still struggle to replicate 60 Hz smooth scrolling with multiple animated objects on the screen like the arcades had, because they (often) were using multiple memory buses and character-mapped graphics combined with sprites, like the TMS9918 but on steroids. A 6502 has little trouble updating a 1KB map that defines a 32x32 grid of 8x8 tiles, followed by a 64-byte sprite attribute table.
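      The byte counts make the point; a rough Python comparison (the cycles-per-byte figure is a ballpark for a simple indexed copy loop, not from any particular game):

      tile_map = 32 * 32                        # 1,024 bytes, one per 8x8 tile
      sprites = 64                              # sprite attribute table
      framebuffer = 160 * 200 * 4 // 8          # 16,000 bytes at 4 bpp

      cpu_hz = 1_789_773                        # ~1.79 MHz 6502
      frame_cycles = cpu_hz // 60               # ~29,800 cycles per 60 Hz frame
      cost = 9                                  # ~cycles per byte for an indexed copy

      print(f"tiles+sprites: {(tile_map + sprites) * cost} of {frame_cycles} cycles")
      print(f"framebuffer:   {framebuffer * cost} cycles -- several whole frames")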

      • (Score: 2) by Rich on Tuesday September 21 2021, @10:21AM

        by Rich (945) on Tuesday September 21 2021, @10:21AM (#1180019) Journal

        Thanks for the elaborate reply. Straight A in nerd history class. :)

        You're probably right in that the Ataris and the C64 were designed to be gaming machines, but they fell short of the colour and smoothness that arcade hardware of the time offered. Woz perfectly got his Breakout (from about 1975) done, even in 16 colors; that's what Lo-Res is for. But it turns out the blocky "Pong" class of games was soon obsolete, and from 1980 on, we had pixel-accurate RGB in the arcades (with Galaxian or Scramble as benchmarks). In that respect, it was very unlucky for Atari that they sank the immense cost into the 8-bit series chipset that early, because video-wise it just wasn't there.

        The TMS9918 is pretty much amazing, but even that falls just slightly short. It probably lives off the assumption that memory is extremely limited and animation can be updated in VBL by adjusting background tiles and sprite positions from single storage, whereas it became apparent that page flipping is a far superior general solution (IF you have the memory for that). It also lacks a higher-resolution monochrome mode that would have helped to break into pro markets that the remaining hardware could have served.

        The SID story is pretty amazing, though, and it is interesting to read the interviews with Bob Yannes (who went on to create the 5503). Here, the C64 was ahead of everything else until the arcades got more and more PCM (or maybe a Yamaha FM chip in between). I associate arcade sound with the AY-3-8910 series, and the 8-bit Ataris already got into that class.

        What I think was odd was that no one (except Woz) who aimed for TV accepted that clean NTSC graphics have to sit right on the color burst. If we look at how the signal is made, it is apparent that some sort of ugly fringes must appear, depending on what the TV thinks is actual color information and what is luminance. (Or did the 1.5 x burst pixel systems you mentioned assume that they could have a steep edge with a higher frequency than the color carrier, at least on a direct composite connection?) Around 1982 one should have been able to get 4164-120 RAMs, which have a cycle of 270ns, which - just barely - fits into two NTSC color carrier cycles, which gives the resolutions I mentioned.

        Anyway, enough of those idle musings, back to work :)

  • (Score: 5, Interesting) by stormreaver on Monday September 20 2021, @08:33PM (2 children)

    by stormreaver (5101) on Monday September 20 2021, @08:33PM (#1179831)

    People want to work with the 65x chips because they are accessible. You can trust the technology.

    I never programmed the 6502, but I did a LOT of 6809 programming back in the 80's and early 90's. From what I've seen, the 6502 and 6809 have a lot of similarities. What I loved about the 6809 (I always imagined that the 6502 was the same) was the great orthogonality/symmetry of the instruction set, and the fantastic memory addressing modes. I especially loved the indirect addressing modes. One of my favorite constructs took the form:

    LDD [,X++], which loads the 16-bit D register with the contents of memory pointed to by the X register, then increments the X register by two bytes. All that in a single instruction. Intel needs several instructions to perform the same operation.

    Another was LDY [D,X++]. This does the same as above (but loads the Y register instead of the D register), but calculates the memory address by using D as an offset from X.
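    A tiny Python model of the post-increment load described above, treating memory as a flat byte array (the register names and helper are just illustrative; note that on a real 6809 the square brackets add one further level of indirection on top of this):

    def ldd_post_inc(mem: bytes, x: int):
        """LDD ,X++ style: load 16-bit D from the address in X, then advance X by 2."""
        d = (mem[x] << 8) | mem[x + 1]    # 6809 is big-endian: high byte first
        return d, x + 2                   # new D, new X -- one step, one "instruction"

    mem = bytes([0x12, 0x34, 0xAB, 0xCD])
    d, x = ldd_post_inc(mem, 0)
    print(hex(d), x)                      # 0x1234 2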

    When it was clear that the 6809's future was bleak, I switched to IBM PC computing. The assembly architecture was a wretched mess compared to the beauty of the 6809. I initially thought that the assembly books I bought were pranking me, and that something worthy of industry dominance would be presented after the jokes were all told. Nope, Intel architecture really was that awful.

    • (Score: 3, Interesting) by dltaylor on Tuesday September 21 2021, @12:11AM (1 child)

      by dltaylor (4693) on Tuesday September 21 2021, @12:11AM (#1179886)

      Intel's processors were junk back then. Their memory devices were pretty good, except for the EPROMs that used radioactive clay to make the carriers.

      When IBM management fumed about all of the Apple IIs in their headquarters, they demanded an IBM-branded device. Engineering designed in a Z8000, which was an amazing device for the time (predating the best-of-the-bunch 68000 by a bit): orthogonal registers and instructions for 8 16-bit registers. However, at the time Zilog was owned by a little company called Exxon, while Intel was going out of business. IBM could buy 8088s and their co-dependent chips cheaper than the parts from Zilog, so the materials people demanded the Intel crap. It was a great deal for Intel, obviously, but it seriously impeded creating the software. On top of that, it wasn't until very late that Intel even made better parts than NEC or, later, AMD. The V10 and V20 were faster, clock-for-clock, than the 8088 and 8086, and the PR100s were faster than the same-clock 75 MHz Pentiums.

      • (Score: 2) by Rich on Tuesday September 21 2021, @09:20AM

        by Rich (945) on Tuesday September 21 2021, @09:20AM (#1180013) Journal

        I wonder how much corruption was involved in such decisions. We deal with rising empires here whose owners were aggressively manipulative and had zero conscience. Somehow it seems unlikely that all the design wins come down to the more successful week of all-nighters of the winning design team...

  • (Score: 3, Interesting) by dltaylor on Monday September 20 2021, @10:14PM

    by dltaylor (4693) on Monday September 20 2021, @10:14PM (#1179861)

    One of my favorites, made by, maybe, Rockwell, was a 6502 variant that had an embedded FORTH interpreter. That went into one of those scrolling, flashing... LED signs. The nice part for me was that the FORTH experience gave me enough background to, later, write Sun-compatible boot drivers for SCSI boards. At first, it seemed too complicated, but I realized I was trying to write C code in FORTH, so I made myself stop that and think in FORTH terms. Much easier after that.

    Personally, I built a time domain reflectometer using a 6502 for the compute portion back around 1980. The LCD dot-matrix display that was selected took its data as horizontal rows, while, of course, the A/D converter provided amplitude samples that needed to be displayed vertically. Converting between the two meant some very careful zero-page programming to keep up with the desired repetition rate.
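    For anyone wondering what that row/column mismatch looks like in the abstract, here is a small Python sketch (display size, orientation, and names are invented; the real thing was hand-tuned zero-page 6502 code, not this):

    WIDTH, HEIGHT = 64, 32                     # hypothetical dot-matrix size

    def columns_to_rows(amplitudes):
        """A/D gives one amplitude per column; the LCD wants whole horizontal rows."""
        rows = [[0] * WIDTH for _ in range(HEIGHT)]
        for x, a in enumerate(amplitudes[:WIDTH]):
            y = (HEIGHT - 1) - a               # row 0 at the top, amplitude grows upward
            rows[y][x] = 1                     # plot the trace point for this column
        return rows                            # ship to the display row by row

    samples = [(x * 5) % HEIGHT for x in range(WIDTH)]   # fake sawtooth trace
    rows = columns_to_rows(samples)
    print(sum(map(sum, rows)), "pixels set across", HEIGHT, "rows")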
