In 1979 the Macintosh personal computer existed only as the pet idea of Jef Raskin, a veteran of the Apple II team, who had proposed that Apple Computer Inc. make a low-cost “appliance”-type computer that would be as easy to use as a toaster. Mr. Raskin believed the computer he envisioned, which he called Macintosh, could sell for US $1000 if it was manufactured in high volume and used a powerful microprocessor executing tightly written software.
Mr. Raskin’s proposal did not impress anyone at Apple Computer enough to bring much money from the board of directors or much respect from Apple engineers. The company had more pressing concerns at the time: the major Lisa workstation project was getting under way, and there were problems with the reliability of the Apple III, the revamped version of the highly successful Apple II.
An interesting look at the design of Apple Mac ...
(Score: 5, Interesting) by DannyB on Monday July 03, @03:16PM (3 children)
128 K bytes, that ought to be enough for anybody!
At the time, in early 1984, it was obvious to everybody except Steve Jobs that 128 K was not enough. All developers understood this. Steve stuck to his guns that the basic machine should come in one simple, non-expandable configuration.
After internal squabbling at Apple, we eventually got the "Fat Mac", which had 512 K. Enough to actually build useful programs.
In 1987, I was the co-author of a Mac program called Timbuktu.
What it was: a screen-sharing program. You could watch another Mac's screen over the network, or take control of it with your own keyboard and mouse.
How it worked, roughly: it captured the drawing happening on one Mac's screen and reproduced it on the other.
This got our company acquired. We kept adding features to the program and were planning other software titles based on the same technology. (Screen Recorder, which would record your screen to a file that could be played back, and Timbuktu/Remote, which would work over high-speed modems.)
I remember a meeting where we developers, including new ones we had hired, were clear that these products should only run on Macs with 512 K or more of memory. A few people insisted that we make it all work on every Mac, including the original 128 K Mac. We pointed out the infeasibility of this, and our reasoned arguments won the day.
There are other stories I could tell about artists who knew nothing about user interface design. Their sketches looked fantastic -- like consumer electronics equipment that looks beautiful but is a disaster to actually operate.
Those days were fun days. Not the only fun days, I've had plenty of others, but those few years were fun days.
I think that Steve Jobs sticking to his 128 K Mac, and other attempts to limit upgradability, is what got him stripped of his power at Apple.
If you eat an entire cake without cutting it, you technically only had one piece.
(Score: 2) by Tork on Monday July 03, @08:57PM (2 children)
This era was well before my time, but I have often wondered if the vector displays in use back then played a role in this. It seems like the moment somebody spells out the memory required just to buffer a full-screen raster image, the light bulbs start to go on. But a vector display can operate on practically nothing. Maybe there was so much momentum...?
Or maybe not. I'm not even sure I'm right about vector displays requiring less RAM. (I latched onto that idea while doing a little retro-gaming.) I wish I had a good way to get my head into what that era of computing was like. I'm in a retro group for '90s DOS games, and a lot of the memes that get posted are about how we just lived with the quirks of the era, like a game crashing and leaving the desktop gamma turned up high enough to give you a sunburn. I feel like there's a whole bunch of nuance you had to be there for.
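Just to sanity-check myself, here's the back-of-the-envelope version (a rough sketch; the 1,000-segment display list is an assumed workload, not any real machine's spec):

    #include <stdio.h>

    int main(void) {
        /* Raster: the original Mac's 512 x 342 screen at 1 bit per pixel. */
        int raster_bytes = 512 * 342 / 8;         /* 21,888 bytes */

        /* Vector: no framebuffer at all, just a display list the beam
           replays. Assume 1,000 line segments, each with two 16-bit
           (x, y) endpoints -- 8 bytes per segment. */
        int segments = 1000;                      /* assumed workload */
        int vector_bytes = segments * 2 * 2 * 2;  /* 8,000 bytes */

        printf("raster framebuffer:  %d bytes\n", raster_bytes);
        printf("vector display list: %d bytes\n", vector_bytes);
        return 0;
    }

So a sparse screen really can be cheaper on a vector display, but the cost grows with what you draw, while the raster framebuffer is a fixed price.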
🏳️🌈 Proud Ally 🏳️🌈
(Score: 1, Interesting) by Anonymous Coward on Monday July 03, @11:40PM
First shooter game? https://www.youtube.com/watch?v=UP2OaKHaDxM [youtube.com] MIT's Spacewar! on the PDP-1 used a vector display. Friends found what they believed to be the original display tube in a Cambridge, MA surplus store and grabbed it. I think it may have eventually wound up in the Computer Museum, back when it was in Boston.
There were also hardware character displays, say 40 characters x 25 lines. Each character was one byte, so 1000 bytes was all the memory needed to cover the whole screen. Of course that meant only 256 choices for each character cell -- usually half ASCII, the other half not standardized.
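To make that concrete, here's a sketch of what the scan-out hardware did (the font contents are left blank here; a real character ROM held the glyph pixels):

    #include <stdint.h>

    /* Sketch of a hardware character display: the frame memory holds one
       byte per character cell (40 x 25 = 1000 bytes); the video circuit
       generates pixels on the fly from a font ROM, so no bitmap is ever
       stored in RAM. */
    #define COLS 40
    #define ROWS 25

    static uint8_t screen[ROWS][COLS];   /* all the "video RAM" needed */
    static uint8_t font_rom[256][8];     /* 8x8 glyphs, one per code   */

    /* What the hardware does for one scanline y (0..199): fetch each
       character code, index the ROM, and shift out that row of bits. */
    void scanline(int y, uint8_t out[COLS])
    {
        int row = y / 8, line = y % 8;
        for (int col = 0; col < COLS; col++)
            out[col] = font_rom[screen[row][col]][line];
    }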
(Score: 2) by DannyB on Wednesday July 05, @02:06PM
I am trying to imagine attempting to read SN on a vector graphics display with green or blue phosphor.
Raster displays were the way to go for modern GUIs, especially once you understand how QuickDraw and its Region data structure work. This gives you fast graphics: over 7000 characters per second, in 1984, on a machine with an 8 MHz 68000 processor and no hardware text mode. Overlapping windows were easy, because region clipping lets you draw into a partially exposed window even when the windows on top obscure parts of it in arbitrarily complex ways. And you can take advantage of complex region clipping in your own programs. Did anyone mention that QuickDraw was Amazingly Fast? Insanely fast. The fundamental abstraction is the blitter with regions.
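The core clipping idea, sketched with a region reduced to a plain rectangle list (the real Region structure is a much more compact scanline encoding, so treat this as a simplification, not the actual QuickDraw code):

    /* A stand-in for a QuickDraw-style region: a list of
       non-overlapping rectangles describing a pixel set. */
    typedef struct { int left, top, right, bottom; } Rect;
    typedef struct { int count; Rect rects[32]; } Region;

    static int imax(int a, int b) { return a > b ? a : b; }
    static int imin(int a, int b) { return a < b ? a : b; }

    /* Fill a rectangle, but touch only pixels inside the clip region.
       fb is a 1-bit-per-pixel framebuffer, rowBytes bytes per row. */
    void fill_clipped(unsigned char *fb, int rowBytes,
                      Rect r, const Region *clip)
    {
        for (int i = 0; i < clip->count; i++) {
            /* Intersect the requested rect with one clip rect;
               empty intersections simply skip the loops below. */
            int l = imax(r.left,   clip->rects[i].left);
            int t = imax(r.top,    clip->rects[i].top);
            int rt = imin(r.right,  clip->rects[i].right);
            int b = imin(r.bottom, clip->rects[i].bottom);
            for (int y = t; y < b; y++)
                for (int x = l; x < rt; x++)
                    fb[y * rowBytes + x / 8] |= 0x80 >> (x % 8);
        }
    }

The application just asks to fill the rectangle; the region silently decides which pixels that is allowed to touch.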
If you eat an entire cake without cutting it, you technically only had one piece.
(Score: 3, Touché) by Anonymous Coward on Monday July 03, @07:26PM (1 child)
I read through the linked article: no references to the Xerox Alto https://en.wikipedia.org/wiki/Xerox_Alto [wikipedia.org] and only a couple of tiny mentions of the Xerox Star. But the whole idea of a GUI for the Lisa and Mac came from a tour that PARC gave to Apple employees -- at least that's the version of the story I remember.
(Score: 4, Informative) by DannyB on Wednesday July 05, @02:16PM
Apple did steal a lot from Xerox, if you can call it that. Xerox had no idea what they had. Management ordered their top people to show Apple everything -- after all, how much could this technology really benefit Xerox? Xerox was a copy machine company. Eventually Apple employed people from Xerox, such as Larry Tesler.
Apple did contribute some of its own inventions, such as the menu bar at the top of the screen with pull-down menus. Microsoft, in its typical fashion, screwed this up by putting menu bars on individual windows. The utility of having the menu bar at the top of the screen, swapping its contents depending on which window is in front, is that you can SLAM the mouse pointer to the top of the screen and hit the menu bar: the pointer stops at the screen edge, so the menu bar is effectively an infinitely tall target. On MS Windows, you need fine motor precision to land the pointer on a menu and then pull it down.
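The mechanical reason the slam works: the cursor is clamped at the screen edge, so any amount of overshoot still lands in the menu bar. A toy sketch of the idea (made-up names, not the Toolbox API):

    /* Toy cursor update, clamped to a 512 x 342 screen. Any upward
       mouse motion, however violent, parks the cursor at y == 0 --
       inside the menu bar. */
    typedef struct { int x, y; } Point;

    Point move_cursor(Point cur, int dx, int dy)
    {
        cur.x += dx;  cur.y += dy;
        if (cur.x < 0)   cur.x = 0;
        if (cur.x > 511) cur.x = 511;
        if (cur.y < 0)   cur.y = 0;
        if (cur.y > 341) cur.y = 341;
        return cur;
    }
    /* A menu bar on a window, by contrast, is a ~20-pixel strip you
       must stop inside of: overshoot and you miss. */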
Apple also invented certain features of dialog boxes.
The biggest thing, which puzzled the Xerox people, was that on the Mac and Lisa, if you moved or closed a window, everything underneath the newly exposed area was automatically redrawn, very quickly, and without using off-screen buffers. Update events were sent to the affected windows, and each one simply redrew itself; QuickDraw's region clipping limited which actual pixels could be touched on screen, so that application drawing -- half of a spreadsheet cell, half of the letter Z -- was cleanly clipped without the application having any knowledge of it. The genius behind that was Bill Atkinson. He didn't know that Xerox's system could not redraw damaged areas of windows; he simply assumed it had to be done and came up with a brilliant way of doing it.
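Roughly how that damage-and-redraw cycle fits together, again using a rectangle-list stand-in for regions (the region_* helpers here are assumed, not real Toolbox calls):

    typedef struct { int left, top, right, bottom; } Rect;
    typedef struct { int count; Rect rects[32]; } Region;

    /* Assumed helpers: set union, intersection, and reset of regions. */
    void region_union(Region *dst, const Region *src);
    void region_intersect(Region *out, const Region *a, const Region *b);
    void region_clear(Region *r);

    typedef struct Window {
        Region visible;   /* parts of this window not covered by others */
        Region update;    /* accumulated damage waiting to be redrawn   */
        void (*draw)(struct Window *w, const Region *clip);
        struct Window *behind;   /* next window back in stacking order  */
    } Window;

    /* When a front window moves or closes, add the newly exposed area
       to the update region of every window behind it. */
    void invalidate_exposed(Window *under, const Region *exposed)
    {
        for (Window *w = under; w != NULL; w = w->behind)
            region_union(&w->update, exposed);
    }

    /* Later, the event loop delivers an update event. The application
       redraws as if the whole window were visible; clipping to
       (update intersect visible) limits the pixels actually touched. */
    void handle_update(Window *w)
    {
        Region clip;
        region_intersect(&clip, &w->update, &w->visible);
        w->draw(w, &clip);
        region_clear(&w->update);
    }

The application never reasons about what covers it; it just draws, and the clip region throws away everything it isn't allowed to touch.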
If you eat an entire cake without cutting it, you technically only had one piece.