
posted by Fnord666 on Wednesday October 10 2018, @08:41AM   Printer-friendly
from the RIP dept.

Arthur T Knackerbracket has found the following story:

Ken Bowles, a UC San Diego software engineer who helped popularize personal computers in the 1970s and '80s through advances that were exploited by such entrepreneurs as Apple's Steve Jobs, died on Aug. 15 in Solana Beach. He was 89.

His passing was announced by the university, which said that Bowles, an emeritus professor of computer science, had died peacefully.

Bowles was not well-known to the general public. But he was famous in computer science for helping researchers make the leap from huge, expensive mainframe computers to small "microcomputers," the forerunner of PCs.

He was driven by the desire to make it faster and easier for researchers and programmers to work on their own, and to develop software that could be used on many types of computers.

By 1968, Bowles found himself in the perfect spot to push his vision. He was appointed director of the university's computer center, just three years after joining the faculty.

University historians say Bowles taught his students to write and rewrite code on the world's first microprocessors, the chips that revolutionized the computer industry in the 1970s. They were soon writing programs expressly for microcomputers, bypassing mainframes.

Bowles and his team also adopted and modified Pascal, an early programming language that was opening up computer science. The modified version became known as UCSD Pascal and was widely used to teach people how to program.

[...] "The development of UCSD Pascal was a transformative event not just for UCSD but for all of computer science," according to a statement by Dean Tullsen, chair of the department of computer science and engineering at UC San Diego.

"It was arguably the first high-level programming system that both worked on small systems that schools, most businesses, and eventually individuals could afford, and was portable across many systems."

-- submitted from IRC



  • (Score: 2) by Bot on Wednesday October 10 2018, @09:31AM (1 child)

    by Bot (3902) on Wednesday October 10 2018, @09:31AM (#746884) Journal

    That's all, folks!

    --
    Account abandoned.
    • (Score: 0) by Anonymous Coward on Wednesday October 10 2018, @09:34AM

      by Anonymous Coward on Wednesday October 10 2018, @09:34AM (#746885)

      Now we know what's under your dress. A mountain of Pascal code.

  • (Score: 5, Interesting) by martyb on Wednesday October 10 2018, @09:34AM (11 children)

    by martyb (76) Subscriber Badge on Wednesday October 10 2018, @09:34AM (#746886) Journal

    One of the reasons for the popularity of the C programming language is that it was relatively easy to port the language to a different hardware platform. Once that was done, all manner of attendant tools and utilities could be recompiled and targeted for that platform... voilà! You could take it with you!

    The UCSD p-System accomplished a similar result by providing a portable environment for code to run in... conceptually a "virtual" machine, or what they termed a pseudo-machine system, hence "p-System". It facilitated bringing languages from expensive mainframes to the then-proliferating minicomputers, and from there to microcomputers and PCs.
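
    To make that concrete, here is a minimal sketch of the kind of dispatch loop at the heart of any such interpreter. The opcodes and program layout are invented for illustration; they are not the real UCSD p-code instruction set.

        program PCodeSketch;
        { Minimal bytecode dispatch loop: the core mechanism of a p-code
          interpreter. Opcodes here are hypothetical, not actual UCSD p-code. }
        const
          opPush = 1; opAdd = 2; opPrint = 3; opHalt = 4;
        var
          code: array[0..6] of Integer;
          stack: array[0..15] of Integer;
          pc, sp: Integer;
        begin
          { hand-assembled "program": push 2, push 3, add, print, halt }
          code[0] := opPush; code[1] := 2;
          code[2] := opPush; code[3] := 3;
          code[4] := opAdd;
          code[5] := opPrint;
          code[6] := opHalt;
          pc := 0;
          sp := -1;
          while code[pc] <> opHalt do
            case code[pc] of
              opPush:
                begin sp := sp + 1; stack[sp] := code[pc + 1]; pc := pc + 2 end;
              opAdd:
                begin stack[sp - 1] := stack[sp - 1] + stack[sp]; sp := sp - 1; pc := pc + 1 end;
              opPrint:
                begin writeln(stack[sp]); sp := sp - 1; pc := pc + 1 end
            end
        end.

    Only this loop (about 2K of machine code in the real system) had to be rewritten per machine; everything above it traveled unchanged.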

    The entry on Wikipedia for UCSD Pascal [wikipedia.org] is quite informative, especially the history section [wikipedia.org] which notes Bowles' contribution:

    UCSD p-System began around 1974 as the idea of UCSD's Kenneth Bowles,[6] who believed that the number of new computing platforms coming out at the time would make it difficult for new programming languages to gain acceptance. He based UCSD Pascal on the Pascal-P2 release of the portable compiler from Zurich. He was particularly interested in Pascal as a language to teach programming. UCSD introduced two features that were important improvements on the original Pascal: variable length strings, and "units" of independently compiled code (an idea included into the then-evolving Ada programming language). Niklaus Wirth credits the p-System, and UCSD Pascal in particular, with popularizing Pascal. It was not until the release of Turbo Pascal that UCSD's version started to slip from first place among Pascal users.

    The Pascal dialect of UCSD Pascal came from the subset of Pascal implemented in Pascal-P2, which was not designed to be a full implementation of the language, but rather "the minimum subset that would self-compile", to fit its function as a bootstrap kit for Pascal compilers. UCSD added strings from BASIC, and several other implementation dependent features. Although UCSD Pascal later obtained many of the other features of the full Pascal language, the Pascal-P2 subset persisted in other dialects, notably Borland Pascal, which copied much of the UCSD dialect.
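
    To illustrate the two UCSD additions the quote credits (variable-length strings and units), here is roughly what they look like in use. The syntax below is the later Borland form of units, which per the quote copied the UCSD dialect; the names are invented, and the two pieces would live in separate files.

        { greeter.pas -- a separately compiled unit (hypothetical name) }
        unit Greeter;

        interface
          procedure Greet(who: string);

        implementation
          procedure Greet(who: string);
          begin
            writeln('Hello, ', who, '!')
          end;

        end.

        { usegreet.pas -- a program pulling in the unit }
        program UseGreet;
        uses Greeter;
        var
          s: string;                  { variable-length string }
        begin
          s := concat('wor', 'ld');   { concat() joins strings, UCSD-style }
          Greet(s)
        end.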

    I was exposed to the p-System some time in the very late 70's / early 80's. At the time, it was a "Big Thing" and garnered much press for its flexibility and portability. The p-System ran p-code, which could be generated from languages besides Pascal; FORTRAN is one language I know was supported. Back then (the IBM PC was just being introduced), many articles touted the p-System's promise.

    Because the p-code was interpreted, there was a performance penalty to be paid for the portability. Further, early products failed to gain as much traction as they could have due to the pricing structure. Ultimately, native-code compilation provided better performance and could be had at a lower price, relegating the p-System to being a stepping-stone in the history of computers. Its principles and concepts live on, however, in such things as the Java VM.

    Kenneth Bowles, RIP.

    --
    Wit is intellect, dancing.
    • (Score: 0) by Anonymous Coward on Wednesday October 10 2018, @09:59AM

      by Anonymous Coward on Wednesday October 10 2018, @09:59AM (#746888)

      We had an Apple ][+ around 1980 and wanted to use it for engineering simulation work. When we looked at the choice of language to write in, it looked like the only system with usable/fast floating-point support was UCSD Pascal, so we used that for a few years. It was a little big for that computer; it came on several floppies, and IIRC there was quite a bit of floppy shuffling required to use it. But it did let us get our work done.

    • (Score: 4, Informative) by Rich on Wednesday October 10 2018, @10:50AM (3 children)

      by Rich (945) on Wednesday October 10 2018, @10:50AM (#746897) Journal

      Because the p-code was interpreted, there was a performance penalty to be paid for the portability.

      The 6502's instruction set makes it impractical to generate native code for it, so a compact intermediate format is the only way to go at all. The stack is at a fixed address, only 256 bytes long, and there are no addressing modes to access data on it. The zero-page modes ("(zz),Y" and the less useful "(zz,X)") do not fit the stack-based variable model that compilers need at all. However, they are useful for implementing a virtual machine with an (even indexable) register set in the zero page. Woz recognized that early on and wrote the "SWEET16" machine for it. Also, the need to go through the zero page for any universal address-range indirection creates seriously bloated code. After Turbo Pascal made its great impression on the Z80, the 6502 eventually got a native compiler with "Kyan Pascal". I once saw its output, and it is awful, bordering on useless - but there's nothing they could really have done better without resorting to intermediate code.

      • (Score: 2) by turgid on Wednesday October 10 2018, @02:09PM

        by turgid (4318) Subscriber Badge on Wednesday October 10 2018, @02:09PM (#746951) Journal

        FORTH was another VM language popular at the time and made its way into Open Firmware.

      • (Score: 1) by optotronic on Thursday October 11 2018, @02:17AM (1 child)

        by optotronic (4285) on Thursday October 11 2018, @02:17AM (#747253)

        The 6502's instruction set makes it impractical to generate native code for it, so a compact intermediate format is the only way to go at all.

        What about Deep Blue C, available on the Atari 800 series? Our company, Innovision, made at least one product (Plexus, a BBS) using it, and I'm pretty sure it compiled to native code -- I remember debugging a printf function in machine code.

        • (Score: 2) by Rich on Thursday October 11 2018, @11:48AM

          by Rich (945) on Thursday October 11 2018, @11:48AM (#747390) Journal

          What about Deep Blue C, available on the Atari 800 series? Our company, Innovision, made at least one product (Plexus, a BBS) using it

          I said "impractical", not impossible. Generally the code was twice as big and half as fast compared to compiled Z80. Kyan claim "twice the speed of [Z80 Turbo Pascal]", but the small print says "At the same clock rate". Now about every Z80 ran on 4 MHz while about every 6502 ran on 1 MHz. I do have a zipchipped //c at 8 MHz, but I'd say that doesn't really count, or only against super fast late model CMOS Z80s. Of course you could constrain yourself to use mostly 8-bit variables and declare everything static, and end up with a running program, but that wasn't remotely close to the efficiency the Apple II offered once a CP/M card was plugged in and ran Turbo Pascal.

    • (Score: 2) by VLM on Wednesday October 10 2018, @11:40AM (2 children)

      by VLM (445) on Wednesday October 10 2018, @11:40AM (#746904)

      Its principles and concepts live on, however in such things as the Java VM.

      Wouldn't disagree with any of that post, except to extend it by analogy: the p-System was to Pascal as the JVM is to Java/C++. Also, the reason Borland's "Turbo" compiler products swept the field was that their compiler was fast.

      • (Score: 1) by optotronic on Thursday October 11 2018, @02:20AM (1 child)

        by optotronic (4285) on Thursday October 11 2018, @02:20AM (#747256)

        Also the reason Borland "Turbo" compiler products swept the field was their compiler was fast.

        It also had a great IDE for the time, greatly simplifying program development.

        • (Score: 2) by VLM on Thursday October 11 2018, @02:40PM

          by VLM (445) on Thursday October 11 2018, @02:40PM (#747439)

          Nostalgia time... I also remember great (for the era) graphics libraries and stuff like that.

          Always a trade-off: be super portable and run anywhere, or run super well but only on a PC-XT with a CGA card or whatever.

    • (Score: 3, Interesting) by DannyB on Wednesday October 10 2018, @01:38PM

      by DannyB (5839) Subscriber Badge on Wednesday October 10 2018, @01:38PM (#746943) Journal

      In the early 80's I was using the UCSD p-System. There were two vendors that I know of: SofTech, of course, and a Canadian company, DataLex.

      Others have already mentioned that the p-System had a roughly 2K bytecode interpreter. This is what had to be ported to each new system; once it was there, the entire p-System world of binaries could run on it (like Java). It had a performance penalty, but not that bad: it was interpreting bytecode, not source code. The p-System environment had features unlike other languages and put BASIC to shame. You could segment your code easily. Segments could be dynamically loaded and unloaded -- without the programmer's assistance or knowledge. Code segments containing functions that had active stack frames could even be unloaded! It was like a poor man's virtual memory. When execution returned to a function in an unloaded code segment, that segment would be dynamically re-loaded, perhaps at a different memory location than before.
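
      UCSD Pascal exposed that segmentation to the programmer through the segment keyword. From memory, the shape was roughly the following; this is UCSD-specific syntax that won't build on modern compilers, and the names are invented:

          program SegDemo;

          { A segment procedure was compiled into its own code segment,
            which the p-System loaded from the code file on call and could
            evict again when memory ran short. }
          segment procedure YearEndReport;
          begin
            writeln('big, rarely used code lives out on disk')
          end;

          begin
            YearEndReport   { segment gets (re)loaded from disk as needed }
          end.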

      One problem with the p-System on the IBM PC and clones was that it was an OS, on the same level as MS-DOS. So to install it you had to partition the system. Even if you wanted only the p-System on a PC, installation was a pain: reformat/partition, then install. Partitions could not be dynamically resized in those days.

      DataLex offered a solution to this by the mid 80's. The DataLex Bubble. The p-System interpreter was a DOS EXE. The p-System volumes (normally partitions) were DOS files. This enabled p-System applications to be packaged as MS-DOS applications.

      In order to make operations easy, I used Turbo Pascal 7 to write a set of MS-DOS utility commands that could manipulate p-System volumes (i.e., DOS files). You could copy files between DOS and a p-System volume using a syntax that treated the DOS file as a sort of subdirectory. Other utilities handled expanding, shrinking, krunching, and other operations on p-System volumes. This enabled shipping compact, completely full volumes (zero free space) as part of a product on floppy disks. During installation, volumes copied to the hard drive could then be expanded to have free space available within them.
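
      Utilities like these come down to block-level I/O on the DOS file that backs a volume. As a hedged sketch only: the file name below is invented, the real volume directory format is not parsed, and the assumption that the directory began at block 2 (after two boot blocks) reflects the common UCSD layout rather than anything in the comment above.

          program VolPeek;
          { Sketch: pull one 512-byte block out of the DOS file backing a
            p-System volume. Real utilities would parse the volume directory,
            whose format is not reproduced here. }
          const
            BlockSize = 512;
          var
            vol: file;                        { untyped file for block I/O }
            buf: array[0..BlockSize - 1] of Byte;
            got: Word;
          begin
            assign(vol, 'WORK.VOL');          { hypothetical volume-backing file }
            reset(vol, 1);                    { open with 1-byte records }
            seek(vol, 2 * BlockSize);         { skip past the two boot blocks }
            blockread(vol, buf, BlockSize, got);
            writeln(got, ' bytes read from block 2');
            close(vol)
          end.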

      I fondly remember those days, but by the mid 80's I was already a Mac developer using MPW Pascal which was a natural transition from the p-System.

      --
      The lower I set my standards the more accomplishments I have.
    • (Score: 0) by Anonymous Coward on Wednesday October 10 2018, @01:56PM

      by Anonymous Coward on Wednesday October 10 2018, @01:56PM (#746948)

      Does writing that language make you a pee hacker?

      (obref for the uninitiated) [wikipedia.org]

    • (Score: 2) by ledow on Wednesday October 10 2018, @02:37PM

      by ledow (5567) on Wednesday October 10 2018, @02:37PM (#746961) Homepage

      The problem with such systems is fairly obvious, though:

      You're paying the code-conversion cost the whole time the program is executing.

      If you port the source code, and then compile once, the initial compile is "slow" but everything else operates at native speed for the rest of its existence.

      Compile-times are rarely cared about by anyone but developers. Run-times are cared about by everyone.

      The only way I can see to combat that - and some bright spark will tell me that Some System X already does it - is to put the source into the executable itself (as just one dormant ELF section, for instance) and have the system "recompile" if it doesn't find a binary ELF section relevant to the architecture it's currently running on (or if it detects corruption or changes, etc.). Then, once compiled, it can add that architecture's compiler output to the executable as a new ELF section, should it ever run on that kind of system again.

      Add a bit of "you can always prune all ELF sections, and the necessary code will just regenerate if things get too large" to a "strip"-like tool, plus a standard for bundling the source in a portable fashion, and you could easily enjoy the advantages of p-code like that (you could even keep an ELF section for it), while also providing compatibility across all systems, no run-time hit (except on the first run), and a "native" compiled bundle that generates itself for any platform the first time you run on it. The only cost is executable size, but most of that would NOT be ELF sections relevant to you anyway, so you wouldn't need them in memory.

      I think Apple tried this a bit with their Universal Binary, but there was no automatic process and (obviously) no source code in those instances.

      But there's nothing stopping you from using "p-code" as the base source format, compiling optimised native versions from it (which are the things that actually execute), and bundling those back into the executable.
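
      As a sketch of just the control flow being proposed (every helper below is a stub with an invented name; real code would need ELF-section tooling and a bundled compiler):

          program FatRunner;
          { Sketch of the idea above: an executable carrying its own source
            plus per-architecture native sections. All helpers are stubs. }

          const
            ThisArch = 'x86_64';        { would be detected at run time }

          function HaveNativeSection(arch: string): Boolean;
          begin
            HaveNativeSection := False  { stub: would scan the section table }
          end;

          procedure CompileEmbeddedSource(arch: string);
          begin
            writeln('compiling embedded source for ', arch)  { stub }
          end;

          procedure RunNativeSection(arch: string);
          begin
            writeln('running native section for ', arch)     { stub }
          end;

          begin
            if not HaveNativeSection(ThisArch) then
              CompileEmbeddedSource(ThisArch);  { first run on a new platform }
            RunNativeSection(ThisArch)
          end.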
