An interesting piece of computing history which may be familiar to many here ... Here is the inside story of personal computing at the legendary research lab
This article was first published as "Inside the PARC: the 'information architects'." It appeared in the October 1985 issue of IEEE Spectrum. A PDF version is available on IEEE Xplore. The diagrams and photographs appeared in the original print version.
In late 1969, C. Peter McColough, chairman of Xerox Corp., told the New York Society of Security Analysts that Xerox was determined to develop "the architecture of information" to solve the problems that had been created by the "knowledge explosion." Legend has it that McColough then turned to Jack E. Goldman, senior vice president of research and development, and said, "All right, go start a lab that will find out what I just meant."
Goldman tells it differently. In 1969 Xerox had just bought Scientific Data Systems (SDS), a mainframe computer manufacturer. "When Xerox bought SDS," he recalled, "I walked promptly into the office of Peter McColough and said, 'Look, now that we're in this digital computer business, we better damned well have a research laboratory!' "
In any case, the result was the Xerox Palo Alto Research Center (PARC) in California, one of the most unusual corporate research organizations of our time. [...] PARC, now in its fifteenth year, originated or nurtured technologies that led to these developments, among others:
- The Macintosh computer, with its mouse and overlapping windows.
- Colorful weather maps on TV news programs.
- Laser printers.
- Structured VLSI design, now taught in more than 100 universities.
- Networks that link personal computers in offices.
- Semiconductor lasers that read and write optical disks.
- Structured programming languages like Modula-2 and Ada.
(Score: 5, Interesting) by DannyB on Monday June 06 2022, @10:25PM (1 child)
The manglers, er, managers at Xerox simply had no idea what they had. "How could this technology help us sell copiers, which is the business we are in?"
How often should I have my memory checked? I used to know but...
(Score: 2, Funny) by Anonymous Coward on Tuesday June 07 2022, @12:47AM
We imported a few manglers into our engineering group.
Mangler : Engineering group == Bowling ball : Washing machine
(Score: 1, Interesting) by Anonymous Coward on Monday June 06 2022, @10:58PM
Xerox's xerography patents were about to run out, so they needed new technology to keep up their shareholder value.
It didn't really work out, the way their business was structured.
(Score: 4, Interesting) by Anonymous Coward on Monday June 06 2022, @11:12PM (4 children)
http://worrydream.com/refs/Strassmann%20-%20The%20Computers%20Nobody%20Wanted.pdf [worrydream.com]
(Score: 0) by Anonymous Coward on Monday June 06 2022, @11:37PM
Thank you for this. I look forward to reading it.
(Score: 1, Informative) by Anonymous Coward on Tuesday June 07 2022, @02:44AM
A seconder for Thank You. I was not going to give up my email, date of birth, credit card, dog's maiden name, address, PIN, etc to the IEEE for one PDF...
(Score: 2) by janrinok on Tuesday June 07 2022, @05:41AM (1 child)
The latest version of Firefox tags this PDF as a potential security risk. I haven't looked closely at it. Caveat emptor.
(Score: 4, Interesting) by RamiK on Tuesday June 07 2022, @12:23PM
It's probably warning you about the http (vs. https).
Strassmann's perspective isn't really Xerox-corporate's own: he was hired as Chief Computer Executive to organize and replace some $150 million worth of IBM and Univac computing hardware with the new in-house Sigma and XDS (Xerox Data Systems) machines, when Xerox decided to compete against IBM in data processing and needed someone with experience replacing IBM hardware and software who wasn't influenced by inter-departmental and inter-personal ties. That is, he was hired precisely because he was an outsider to the Xerox corporate perspective. And his day-to-day job was IT: sourcing peripherals, and overseeing and contracting third parties to write compilers and port software. So, while technically an executive with ~100 engineers under him who handled some acquisitions, he didn't deal with customers, marketing, or product design, so you can't suit him up as an MBA.
Anyhow, to give a couple of examples of Strassmann's views and criticisms:
1. (pre-PARC days) Peter McColough's mislabeling of the early Sigma machines as IBM replacements, when they lacked the software and were only appropriate for smaller-scale operations.
2. (PARC days) Goldman's academic and DoD ties and interests not falling in line with Xerox's business interests.
So he holds opinions that go against both the Xerox corporate perspective and the popular modern perception of what PARC was and how and why Xerox failed.
p.s. I've made mention of the book back in 2018 here: https://soylentnews.org/comments.pl?cid=757359&sid=28402 [soylentnews.org]
compiling...
(Score: -1, Offtopic) by Anonymous Coward on Tuesday June 07 2022, @12:03AM (1 child)
Your momma so big that when she goes swimming in the lake,
she causes the dam to fail.
(Score: 3, Touché) by DannyB on Tuesday June 07 2022, @04:58PM
An eclectic eel has a wider variety of interests. And better jokes.
How often should I have my memory checked? I used to know but...
(Score: 2, Insightful) by Anonymous Coward on Tuesday June 07 2022, @12:08AM (15 children)
The programming language that comes to mind when Xerox PARC is mentioned is Smalltalk. Structured programming was already mainstream with Pascal, ALGOL, C, etc., and Modula was a successor language to Pascal, I think.
(Score: 2, Interesting) by Anonymous Coward on Tuesday June 07 2022, @01:09AM (7 children)
Modula-2 was a different direction from Smalltalk.
Pascal was procedural. Smalltalk was object-oriented. Modula-2 was modular, but not object-oriented. It was also designed to be a systems language.
It's a pity that Modula-2 didn't win the subsequent wars. It is quite a nice language. It has a very tight, clean syntactic structure, and the strict typing with fixed-length arrays catches a lot of bugs at compile time.
Now we're left slowly and painfully discovering the cost of object orientation's popularity: the proponents of abstraction extended it from data abstraction and modular design through to combined code-and-data abstractions that end up either subverted for convenience or an opaque straitjacket. Oh well.
(Score: 4, Insightful) by hendrikboom on Tuesday June 07 2022, @01:48AM (2 children)
Its successor, Modula 3, was *not* designed by Wirth.
It was an efficient systems language that supported object-orientation without insisting on it.
Its basic structuring mechanism was to build a program out of modules with explicitly defined interfaces.
(Score: 4, Insightful) by DannyB on Tuesday June 07 2022, @01:44PM (1 child)
Back in the 80s, my coworkers and I were excited about Modula 3. It was like a dream too good to be true.
Too bad no cross-platform commercial compilers seemed to materialize. I kept my eye out, but nothing.
But then, in late 1986, I discovered Lisp and became obsessed with that for the next six years.
How often should I have my memory checked? I used to know but...
(Score: 3, Interesting) by hendrikboom on Tuesday June 07 2022, @06:53PM
I discovered Lisp and Algol 60 at about the same time, reading issues of Communications of the ACM in the university library. That would have been 1963? 1964? 1965? Not sure.
But I had implementations of neither. The university computer was an IBM 1620, with 20K digits of memory (a decimal architecture). I realised that if I was to use any of these languages (instead of Fortran or assembler) I'd have to implement them myself. I spent a long time fantasizing about algorithms for implementing Algol 60, and finally went and implemented Lisp. It looked more feasible.
I've done several Lispish implementations since. And now I'm home in Racket.
Though I'm still tied down with old software in C/C++, Modula 3, and OCaml. I like the static type checking of Modula 3 and OCaml; Typed Racket is awkward for many of the overloaded operations.
(Score: 0) by Anonymous Coward on Tuesday June 07 2022, @02:21AM
That was not a battle between OO and structural/modular paradigms. It was C++ vs. the others. C++ gradually added OO features to the already successful, established C while maintaining backward compatibility, and then later piled on any and every fashionable feature.
For the same reason, the x86 ISA came to dominate despite the superior 68k ISA: it maintained backward compatibility with old 8080 software, particularly the CP/M library, while adding fashionable new features like RISC underpinnings, vector operations, etc.
(Score: 1, Interesting) by Anonymous Coward on Tuesday June 07 2022, @04:32AM
"It's a pity that Modula-2 didn't win the subsequent wars. It is quite a nice language. It has a very tight, clean syntactic structure, and the strict typing with fixed-length arrays catches a lot of bugs at compile time."
STRONGLY disagree. Making arrays statically sized, with that size part of the type, means you have to make all arrays that interact with each other the same size. That size has to be large enough for the biggest array if they are to interoperate, which means wasted memory. It's also hard to come up with one size that fits all.
For a painful example of this: in Modula-2's predecessor, Pascal, strings were declared as fixed-length arrays of char. Imagine a string backed by an array of at most 128 characters being a different, incompatible type from a string backed by an array of 255 characters. This was so awful that it was universal for compilers to implement an extension: a string type whose length was not part of the type.
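A minimal sketch of the problem in standard Pascal (the type and variable names here are invented for illustration):

    program StringPain;
    type
      Name128 = packed array [1..128] of char;
      Name255 = packed array [1..255] of char;
    var
      short: Name128;
      long:  Name255;
    begin
      { Name128 and Name255 are distinct, incompatible types, even }
      { though both are "just strings". Uncommenting the next line }
      { is a compile-time type error:                              }
      { long := short; }
      writeln('the array sizes are part of the types')
    end.

The common workaround was a length-prefixed string extension (e.g. Turbo Pascal's string[255]), which moved the length out of the type and the checking to run time.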
(Score: 3, Interesting) by janrinok on Tuesday June 07 2022, @05:51AM (1 child)
I enjoyed about 18 months of working with Modula-2 a long time ago. But it didn't have the same community support that languages seem to have today, and it necessitated rewriting lots of standard routines and libraries, all of which then required testing etc., just to achieve what was already doable in other languages without the extra work.
It didn't make economic sense for the project, although the language remained popular in academia. Surprisingly, Modula-2 libraries can now be found on various sites such as GitHub, but unfortunately decades too late.
(Score: 2) by hendrikboom on Tuesday June 07 2022, @06:57PM
Which is why I never went to Modula 2, but did take up Modula 3, which not only has objects with inheritance, but also array types that needn't have sizes built into them.
Modula 3 was not designed by Wirth.
(Score: 5, Funny) by Rosco P. Coltrane on Tuesday June 07 2022, @02:13AM (6 children)
And BASIC! It had line numbers, remember?
(Score: 1, Funny) by Anonymous Coward on Tuesday June 07 2022, @02:31AM (5 children)
BASIC was not developed with the structured programming paradigm in mind. It was developed as a simplified, interpreted version of Fortran for programming beginners.
(Score: 2) by janrinok on Tuesday June 07 2022, @05:53AM
I think that was a "Whoosh"...
(Score: 2) by DannyB on Tuesday June 07 2022, @01:47PM (2 children)
Your sarcasm detector needs to be recalibrated at an authorized service center.
How often should I have my memory checked? I used to know but...
(Score: 2) by Freeman on Tuesday June 07 2022, @02:59PM (1 child)
For some reason I read that as decalibrated.
Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
(Score: 2) by DannyB on Tuesday June 07 2022, @03:13PM
Maybe they mean the same thing.
How often should I have my memory checked? I used to know but...
(Score: 3, Interesting) by hendrikboom on Tuesday June 07 2022, @07:01PM
Despite the replies suggesting sarcasm, that is substantially true.
BASIC was designed to run efficiently on a time-shared system, able to compile every line of code as it was typed in, independently of the rest of the program.
To accomplish this very experimental goal, one of the design principles was that it be well behind the state of the art in language design.
A lot of the restrictions on the language resulted from this constraint.
(Score: 2, Informative) by Anonymous Coward on Tuesday June 07 2022, @01:15AM (1 child)
The language they are thinking of is Mesa.
(Score: 2) by hendrikboom on Tuesday June 07 2022, @01:48AM
Yes.
(Score: 4, Interesting) by Anonymous Coward on Tuesday June 07 2022, @01:56AM (3 children)
I once heard Xerox execs referred to as "copier heads". I vaguely recall seeing their application of the advanced technology advertised in a magazine at one point, and I think I later saw it in a documentary on the subject: it was a huge copier, with a stand-up workstation and a CRT running all that amazing GUI software. Yes, they basically built a graphical PC into the copier, just for the purpose of maybe doing a few compositional tweaks on documents you wanted to copy.
It sounded like a classic case of having that cash cow and not being able to do anything other than milk it. It's up there with Kodak inventing the digital camera and something like Facebook, but not being able to capitalize on them.
Digital pictures, you see, would cut in to film sales. Electronic documents, you see, would cut in to copier sales.
Classic. New technology is going to cut in to your sales no matter what. It might as well be you doing the cutting, but it usually isn't. You've got a good thing going. Why rock the boat?
It kind of makes sense if you step back and realize that corporations really aren't people. Real people would rather work another 10 years and make sure they retire with the fixed-income pension from Kodak than spend those last 10 years risking it all on digital. They don't care about the company, and why should they? Thus, it's quite possible for a collection of real, rational economic actors to destroy something that economists mistakenly regard as a rational actor. See also: golden parachutes. See also the younger, more ambitious version of this: the "traitorous eight" who left Shockley to found Fairchild Semiconductor, and the "Fairchildren" who later took Fairchild's technology, found lucrative ways to exploit it, and started the whole Silicon Valley VC scene.
(Score: 3, Interesting) by theluggage on Tuesday June 07 2022, @02:35PM (2 children)
I think Kodak had the better argument there - most of Kodak's consumer photography business was on the King Camp Gillette business model: sell cheap cameras and clean up on the film and processing (and you still got the latter if people bought someone else's camera). Digital photography obliterated that model and knocked film sales & processing back into a tiny niche (cf. the days when everybody and their dog shot half-a-dozen rolls on holiday, came back, and had them all printed). Kodak had to totally change what they did and compete with other big-name camera and lens makers - which probably wasn't their strength. I think Kodak would have been toast whatever they did - and they did make an effort to get in on the digital printing business (the whole PhotoCD business was part of a wider effort to get their film processing network digital-ready and sell reprints, enlargements, drinks coasters, etc. to digital camera users).
In Xerox's case, electronic documents and DTP initially created a massive demand for laser printing and physical document production/management, which was a direct evolution of Xerox's copier business. I'd say it's only in the last 10 years or so (and the last 2 years especially) that the adage "the paperless office is about as practical as the paperless lavatory" really stopped applying. Last I looked, laser printers and supplies were still selling...
Imagine what would have happened if, instead of selling Manhattan to Steve Jobs for a string of beads, Xerox - possibly the one company that was big enough and ugly enough to go toe to toe with IBM - had really thrown their weight (and corporate customer base) into PCs and DTP.
Heck, they didn't even keep the string of beads, which would have turned out to be diamonds 30 years later...
(Score: 1, Interesting) by Anonymous Coward on Tuesday June 07 2022, @04:59PM
In 2008, after the crash, customers understandably cut back on buying things like copiers. A year or two later, Ursula Burns announced to all the Xeroids that the bad news was that the customers weren't coming back in such numbers. They'd learned to live with fewer paper documents.
Another great misstep for Xerox was the ColorQube, an A3 end-of-corridor multifunction copier using solid ink technology. Xerox had acquired Tektronix's color printing business for the solid ink. The tag line was "Color for the cost of black and white." There was also far less waste than with laser copiers. The problem with the solid ink was that the image quality wasn't so great and the ink flaked off easily. It was basically melted wax crayons.
Eventually, they offloaded thousands of engineers to HCL, cut the former Tektronix people's pay back to minimum wage, and did all sorts of nasty things like that.
Xerox had completely run out of ideas. 3D printing had come along a few years before, and one PHB sent out an email asking, "Is that something we should get into?" The solid ink heads could do 3D printing, but the PHBs hadn't realised the implications, nor the usefulness of 3D printing.
(Score: 3, Interesting) by DannyB on Tuesday June 07 2022, @05:22PM
IBM is the wrong company. They were all about mainframes. They thought there would be a market for 2 million PCs, tops. IBM saw that Tandy, Apple, and Commodore had come in and created a minor competitive market. IBM thought they would go in with the PC and take that market away from them. IBM still considered the hardware to be what was actually profitable, not the software.
Given their thinking of 2 million PCs max, with all the profit in the hardware, no wonder IBM was happy to allow Microsoft to sell its own version of PC-DOS, known as MS-DOS. Like Xerox, Kodak, and the buggy-whip makers, IBM simply could not understand what was happening: too focused on the existing money-making business. Automobiles, like PCs, took over the market in ten years.
Sidetrack . . .
Microsoft is the company that might have done something with desktop publishing, but GUIs just didn't seem to be their interest until the late 80s, when desktop publishing really took off and it was by then too late. Then came the early 90s with multimedia. It took a long time, but Windows 95 was the first worthy competitor that could tempt Mac users to the dark side.
In the 1990s I heard the following in a documentary, I think it was called "The Machine That Changed the World." The expert explained that management never sees a paradigm shift coming, so they hire people who will recognize it and tell them when it is coming. Then, when the experts tell them a paradigm shift is coming, management never believes them.
How often should I have my memory checked? I used to know but...
(Score: 4, Interesting) by DannyB on Tuesday June 07 2022, @01:57PM (7 children)
Key Apple people were given a tour and shown all of Xerox's secrets. The engineers at Xerox objected, but somehow, I suppose, Jobs must have seduced them with his reality distortion field.
Apple people came away inspired, and Apple added significant improvements over what they had seen. The menu bar with pull-down menus is one example. Another underlying enabling technology was QuickDraw, with its Region concept and efficient code for working with Regions. Think of a Region as a "set of pixels": an arbitrarily shaped set, with arbitrary cutouts and boundaries. You could think of it as a 2D array of flags, but it was vastly more efficient than that. Bill Atkinson was the genius here. You could do Boolean operations on two Regions (union, intersection, subtraction), plus inversion, offset, and expand/contract.
And QuickDraw was FAST. It could draw over 7000 characters per second. And this wasn't some font optimized to fit in byte widths or other nonsense; this was an arbitrary font, style, and size drawn on the screen, considering a clip region (probably wide open), going through all of the graphics machinery. I'm sure the 7000 cps was an optimal case: choosing the right font, having all the right GrafPort settings, etc.
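For anyone who never wrote classic Toolbox code, here is roughly what working with Regions looked like in Pascal, reconstructed from memory (Inside Macintosh has the authoritative signatures; treat this as a sketch):

    var
      circle, hole, donut: RgnHandle;
      r: Rect;
    begin
      circle := NewRgn;
      hole := NewRgn;
      donut := NewRgn;

      SetRect(r, 0, 0, 100, 100);
      OpenRgn;              { record subsequent drawing into a region }
      FrameOval(r);
      CloseRgn(circle);     { circle is now the oval's set of pixels }

      InsetRect(r, 25, 25);
      OpenRgn;
      FrameOval(r);
      CloseRgn(hole);

      DiffRgn(circle, hole, donut);  { set subtraction: a donut shape }
      OffsetRgn(donut, 50, 0);       { slide the whole pixel set right }
      PaintRgn(donut);               { draw the arbitrary shape }

      DisposeRgn(circle);
      DisposeRgn(hole);
      DisposeRgn(donut)
    end;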
QuickDraw enabled new things. One was that when you moved a window, the underlying window parts that had now become exposed could be rapidly redrawn by their respective applications. Xerox's system could not do this. The user had to give a command to 'repair' the parts of the screen where windows had been uncovered by dragging another window.
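From the application side on the Mac, exposing part of a window generated an update event, and QuickDraw clipped the redraw to just the damaged Region. Again a from-memory sketch; DrawMyContent is a hypothetical stand-in for the app's own drawing code:

    var
      ev: EventRecord;
      w: WindowPtr;
    begin
      if GetNextEvent(everyEvent, ev) then
        if ev.what = updateEvt then
        begin
          w := WindowPtr(ev.message);
          SetPort(w);
          BeginUpdate(w);    { clips all drawing to the update region }
          DrawMyContent(w);  { hypothetical: redraw the whole window, }
          EndUpdate(w)       { cheaply, since QuickDraw only touches  }
        end                  { the newly exposed pixels               }
    end;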
I was a long time Mac developer for the classic Mac back in the day. I have many fond memories of all this.
How often should I have my memory checked? I used to know but...
(Score: 2) by DannyB on Tuesday June 07 2022, @01:58PM (4 children)
I wrote the low level parts of Timbuktu and Timbuktu/Remote from Farallon.
How often should I have my memory checked? I used to know but...
(Score: 0) by Anonymous Coward on Tuesday June 07 2022, @06:28PM (3 children)
Don't tell me you had anything to do with Dark Castle, did you? I love that game. I'd love to play it again with those damn bats and throwing rocks and the guy whipping the prisoner (loved throwing rocks at him) and all.
(Score: 0) by Anonymous Coward on Tuesday June 07 2022, @06:58PM (2 children)
Ah, never mind. I thought Timbuktu was a Mac game (memories from back then are fuzzy these days). I don't think I ever used it, though I'm sure I've heard of it. We had stand-alone Macs in our physics building and I used those a lot for writing up lab reports and papers for other classes. They also had Dark Castle, and I probably spent much more time on that, to the detriment of my lab writeups!
(Score: 3, Insightful) by DannyB on Wednesday June 08 2022, @02:20PM (1 child)
Timbuktu [wikipedia.org] was a screen sharing program, from back when we weren't sure screen sharing was even possible on an AppleTalk (230.4 kbps) network.
Timbuktu/Remote was a follow-on version that worked over 9600 bps modems. I had to implement my own packet frame format and sliding window protocol. I also got a huge education about RS-232 and bought a book on the subject. It is way more complex than meets the eye.
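For readers who haven't met one: a sliding window protocol keeps several frames in flight before the first acknowledgement comes back, so a slow link stays busy despite the round-trip delay. A toy Pascal sketch of the sender-side bookkeeping (the window size, sequence space, and SendFrame routine are all invented for illustration; this is not Timbuktu's actual code):

    const
      SeqCount = 8;       { sequence numbers 0..7, wrapping }
      WindowSize = 4;     { frames allowed in flight at once }

    var
      base, nextSeq: integer;  { oldest unacked frame; next frame to send }

    function InFlight: integer;
    begin
      InFlight := (nextSeq - base + SeqCount) mod SeqCount
    end;

    procedure TrySend;
    begin
      while InFlight < WindowSize do
      begin
        SendFrame(nextSeq);            { hypothetical transmit routine }
        nextSeq := (nextSeq + 1) mod SeqCount
      end
    end;

    procedure OnAck(ackSeq: integer);
    begin
      base := (ackSeq + 1) mod SeqCount;  { cumulative ack slides the window }
      TrySend                             { freed space, so send more }
    end;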
At one point I had two stacks of very expensive "high speed" 9600 bps modems on my desk. The two stacks were identical: at the bottom of both stacks was a pair of Vendor-A modems, next on both stacks were Vendor-B's modems, etc.
What I found remarkable was that all of these modems were highly optimized for throughput, with no consideration given to latency. Latency was very important for Timbuktu/Remote: if you moved your mouse, that action needed to get to the other computer, and then the pixel changes had to come back to your screen. (At 9600 bits per second, even a kilobyte of queued data means roughly a second of added lag.)
The Hayes V-Series modem (despite its deceptive name) used a proprietary protocol but had excellent latency and throughput characteristics. It was the modem we used at demos in our booth at MacWorld. I remember giving countless demos to impressed onlookers.
Now everyone takes GUI screen sharing for granted.
To get screen sharing to work, we had to solve numerous technical problems. Over many months we worked it all out on the backs of napkins collected from various restaurants in town at lunchtime. There came a day (August 1987) when we realized we had solved every problem and only needed a "go ahead" to start building it. Four months later, version 1.0 was shipping: December 1, 1987.
I built the low-level background code that hooked QuickDraw, implemented the networking, etc. The other programmer built the desk accessory user interface, in his first adventure in Macintosh programming.
The 68000 microprocessor family was a pleasure to program in.
This software was originally conceived for one purpose: so our support department could do remote support of the specialized financial accounting software we sold. However, Timbuktu ended up getting our small company acquired.
How often should I have my memory checked? I used to know but...
(Score: 0) by Anonymous Coward on Wednesday June 08 2022, @03:04PM
Thank you for relating this. It brings back memories. That would explain why I didn't use Timbuktu if it came out in '88. I was at the tail end of college at that point and going forward I spent the next ten or so years on a VAX cluster and some unix terminals. We had a VAX in college, but our physics department used a lot of Macs. They were doing some pioneering work in hands-on physics instruction. They were developing and using sensors that worked with the Mac for position sensing, timing, photocells, etc. It was so that you could do things like add them to your frictionless tracks and actually measure the initial and final momenta of the carts before/after their collisions and stuff like that (a cart would slide by, break the photodiode circuit and start a timer, then break the next photodiode circuit and the timer would stop, etc.). The data would show up in a real time plot on the screen and stuff like that. Not too long ago in the alumni magazine they had a bit on the interesting things they have in the college museum, and one of them was one of those sensors; let me tell you, nothing like realizing how old you are when you see something you used and worked with being shown off as one of the interesting relics in a museum. :O
(Score: 3, Insightful) by Rich on Wednesday June 08 2022, @01:17PM (1 child)
While you quote all that, don't forget mouse acceleration, which made the mouse a viable input device for the masses. Before the Mac, a mouse had to be operated by moving the whole arm. Ever since, the palm rests and the mouse merely gets flicked between thumb and ring finger.
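The mechanism is simple: scale each per-tick mouse delta by how fast the mouse is moving. A toy Pascal sketch (the real driver used a tuned lookup table; the thresholds and multipliers here are invented):

    { Slow motion maps 1:1 for precise positioning; fast motion  }
    { gets multiplied so a small wrist flick crosses the screen. }
    function Accelerate(delta: integer): integer;
    var
      speed: integer;
    begin
      speed := Abs(delta);
      if speed <= 2 then
        Accelerate := delta
      else if speed <= 6 then
        Accelerate := delta * 2
      else
        Accelerate := delta * 4
    end;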
(Score: 3, Interesting) by DannyB on Wednesday June 08 2022, @02:22PM
It is amazing what little details we take for granted. See my post a little bit above, where I describe more about Timbuktu. We did have to solve quite a few technical problems to make it work. At the time, the only thing I know of that came even close to remote screen sharing was X Windows.
How often should I have my memory checked? I used to know but...