posted by janrinok on Sunday July 09 2023, @01:14AM

PL/I stands for Programming Language 1, and its aim was to be the Highlander of programming languages:

...there would be no need for 2, 3, or 4 if everything went to plan. While it is clear today that goal was never reached, what might not be evident is that what PL/I was trying to achieve was a pretty reasonable idea, or at least not entirely crazy. What also wasn't evident at the time was how enormously difficult that reasonable idea turned out to be.

PL/I was designed by IBM with the goal of bringing together the power of 3 different programming languages: FORTRAN (1954), ALGOL (1958), and COBOL (1959).

On paper, this makes a lot of sense. Computer programming can be difficult, so why should there be multiple programming languages? And because computer programming of the era required a lot of punched cards, having One Good Programming Language would also have on-paper (or cardboard) benefits in simplifying the development process. Work on the PL/I specification started in 1964, and work on the first compiler in 1966.

[...] But PL/I wasn't just a development effort, it was also in effect a system conversion. There was an explicit goal for developers to start using PL/I, but there were also implicit goals for developers not just to stop using FORTRAN, COBOL, and ALGOL directly, but also to convert their existing solutions and codebases to PL/I. As if that wasn't hard enough, compounding the problem was that FORTRAN, COBOL, and ALGOL were all evolving in real time. As I described in my BLOG@CACM post "The Art of Speedy Systems Conversions," a system conversion is one of the most difficult things to do in software engineering. The existing system typically has a massive head start, and the replacing system needs to start up development, accelerate, reach feature parity, and then both systems need to be stable long enough to make the switch.

The author presents development timelines of COBOL, FORTRAN, and ALGOL, showing that development on these languages remained active for years after PL/I development had started. The historical verdict?



When compared to the thousands of other programming languages that have been created in the past 60+ years, PL/I was a success. PL/I reportedly was used in the development of the Multics operating system and the S/360 version of the Sabre airline reservation system, among others. PL/I was taught at the college level. PL/I has been around for decades. Most programming languages would be envious to do half as well.

But PL/I didn't achieve its strategic goal of consolidating scientific and business computing with the best new programming paradigms that research could provide, and it wasn't for a lack of trying. That goal, although well-intentioned, became impossible as both FORTRAN and COBOL kept accelerating. In terms of adoption, COBOL became the most widely used programming language in the world by 1970, and ripping out an existing COBOL system and replacing it with PL/I was going to be a hard sell to customers. The same could surely be said of existing FORTRAN systems. COBOL and FORTRAN also kept accelerating in terms of language definition during the 1960s, making PL/I's feature parity with them not just a challenge, but also ambiguous as it took both COBOL and FORTRAN years to stabilize their own respective standards.

Previously: Why Are There So Many Programming Languages?

Original Submission

Related Stories

Why Are There So Many Programming Languages? 68 comments

Over at ACM.org, Doug Meil posits that programming languages are often designed with certain tasks or workloads in mind, and in that sense most languages differ less in what they make possible, and more in terms of what they make easy:

I had the opportunity to visit the Computer History Museum in Mountain View, CA, a few years ago. It's a terrific museum, and among the many exhibits is a wall-size graph of the evolution of programming languages. This graph is so big that anyone who has ever written "Hello World" in anything has the urge to stick their nose against the wall and search section by section to try to find their favorite languages. I certainly did. The next instinct is to trace the "influenced" edges of the graph with their index finger backwards in time. Or forwards, depending on how old the languages happen to be.

[...] There is so much that can be taken for granted in computing today. Back in the early days everything was expensive and limited: storage, memory, and processing power. People had to walk uphill and against the wind, both ways, just to get to the computer lab, and then stay up all night to get computer time. One thing that was easier during that time was that the programming language namespace was greenfield, and initial ones from the 1950's and 1960's had the luxury of being named precisely for the thing they did: FORTRAN (Formula Translator), COBOL (Common Business Oriented Language), BASIC (Beginner's All-purpose Symbolic Instruction Code), ALGOL (Algorithmic Language), LISP (List Processor). Most people probably haven't heard of SNOBOL (String Oriented and Symbolic Language, 1962), but one doesn't need many guesses to determine what it was trying to do. Had object-oriented programming concepts been more fully understood during that time, it's possible we would be coding in something like "OBJOL", an unambiguously named object-oriented language, at least by naming patterns of the era.

It's worth noting and admiring the audacity of PL/I (1964), which was aiming to be that "one good programming language." The name says it all: Programming Language 1. There should be no need for 2, 3, or 4. Though PL/I's plans of becoming the Highlander of computer programming didn't play out like the designers intended, they were still pulling on a key thread in software: why so many languages? That question was already being asked as far back as the early 1960's.

The author goes on to reason that new languages are mostly created for control and fortune, citing Microsoft's C# as an example: their answer to Java, giving them a middleware language they could control.

Related:
Non-Programmers are Building More of the World's Software
Twist: MIT's New Programming Language for Quantum Computing
10 Most(ly dead) Influential Programming Languages


Original Submission

  • (Score: 1, Interesting) by Anonymous Coward on Sunday July 09 2023, @01:28AM (2 children)

    by Anonymous Coward on Sunday July 09 2023, @01:28AM (#1315177)

    What? It was one of those rare cases where a teenage boy turned out to be wise. Several of my fellow nerds suggested we go take a short course on it. I said "no" because technology moves fast, and by the time we graduate everything you learn in such a language-specific seminar could be worthless.

    Funny, I'd never heard of it so I thought it was new. This was the 80s, and when I wiki'd it, it said it started in 1964! If you'd told me it was that old I might have wanted to know who was using it; but I probably still wouldn't have wanted to learn it; there was no compiler for the C-64 AFAIK, so it would have just been a short course on... something I couldn't use.

    I may not have been thinking this at the time; but I stand by the attitude. People have called me "willfully ignorant" over the years; but I prefer to think of it as triage. If you don't make some decisions about what to cut, you get information overload, spend time on blind alleys, etc. The people who accuse you of willful ignorance are usually upset you don't care about their pet project, or quite often it's a conspiracy theory where they want you to read a stack of books, or Marxism. Marxists love to give you a reading list. I say, "I don't need to be a farmer to know what a cow pie is".

    • (Score: 2) by RS3 on Sunday July 09 2023, @01:55AM

      by RS3 (6367) on Sunday July 09 2023, @01:55AM (#1315181)

      Similar story here, and I agree on focusing / specializing versus spreading yourself too thin. It's pretty hard to know what's going to be good, career-wise, to learn and dig into. It seems worse than ever now.

      I was (much?) more into hardware going into uni in the later 80s, but certainly had programming exercises and courses. We did Fortran, Pascal (?), and PL/C, which is (evidently) Cornell's version of PL/1 (no, I didn't go to Cornell but kind of maybe wish I had). I didn't care for Pascal and didn't know why we were learning it. Fortran does what it does, seemed boring. I don't remember PL/1 / PL/C but I remember that I liked it a lot. It felt like I could get a lot done with it.

      I also did some IBM Rexx and remember really liking that. I'd already been into BASIC for years.

      I have one of these [ibm-1401.info]. They hadn't learned to do structured programming at that point.

    • (Score: 2) by driverless on Monday July 10 2023, @09:40AM

      by driverless (4770) on Monday July 10 2023, @09:40AM (#1315365)

      For those interested in reading more, here's the fixed version of the second link [acm.org].

  • (Score: 2) by Mojibake Tengu on Sunday July 09 2023, @02:43AM (5 children)

    by Mojibake Tengu (8598) on Sunday July 09 2023, @02:43AM (#1315186) Journal

    Rust is technically inferior to PL/I and you ask for a historical verdict?

    Maybe such a verdict may come after 200 years pass... of course with FORTRAN still in active use by engineers.

    Do you really think ever changing your funny software every week is a path to security and stability? I call it a hell road... And it will hit the End some day.

    Yes, I have a PL/I handbook in my library...

    --
    Respect Authorities. Know your social status. Woke responsibly.
    • (Score: 2, Insightful) by janrinok on Sunday July 09 2023, @05:26AM (3 children)

      by janrinok (52) Subscriber Badge on Sunday July 09 2023, @05:26AM (#1315198) Journal

      Rust is technically inferior to PL/I

      Based on what metrics? I could say the world is a cube, but it does not make it true.

      • (Score: 2) by Mojibake Tengu on Sunday July 09 2023, @02:09PM (1 child)

        by Mojibake Tengu (8598) on Sunday July 09 2023, @02:09PM (#1315237) Journal

        Write an Ackermann function in Rust, implemented so that it can actually compute some results.
        If you use recursion, per the definition, you will find your limits very soon.

        In PL/I it can be written as a procedure with intertwined loops over an array. Even in C it can be written... with intertwined gotos.

        I could say the world is a cube, but it does not make it true.

        Well, http://sauerbraten.org/ [sauerbraten.org]

        --
        Respect Authorities. Know your social status. Woke responsibly.
        • (Score: 2) by janrinok on Sunday July 09 2023, @03:29PM

          by janrinok (52) Subscriber Badge on Sunday July 09 2023, @03:29PM (#1315242) Journal

          But there are alternative ways of writing it using iteration [baeldung.com]. Recursion has the same limits in many languages. That doesn't mean that only PL/1 and C can be used to write an Ackermann function.

          There is a reason that not many people are using PL/1 nowadays, and it isn't because it can use intertwined cycles on an array.
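
          For illustration, here is a minimal sketch in C of that iterative approach, replacing call-stack recursion with an explicit heap-allocated stack (STACK_MAX is an arbitrary bound chosen for the sketch, not anything from PL/1 or the linked article):

          #include <stdio.h>
          #include <stdlib.h>

          /* Iterative Ackermann: pending m-values live on an explicit
           * heap-allocated stack instead of the call stack, so the depth
           * limit is heap size rather than thread stack size. */
          #define STACK_MAX (1u << 20)  /* arbitrary bound for the sketch */

          unsigned long ackermann(unsigned long m, unsigned long n)
          {
              unsigned long *stack = malloc(STACK_MAX * sizeof *stack);
              size_t top = 0;

              if (!stack) { perror("malloc"); exit(1); }
              stack[top++] = m;
              while (top > 0) {
                  if (top + 2 > STACK_MAX) {   /* guard before any push */
                      fprintf(stderr, "explicit stack exhausted\n");
                      exit(1);
                  }
                  m = stack[--top];
                  if (m == 0) {
                      n = n + 1;               /* A(0, n) = n + 1 */
                  } else if (n == 0) {
                      stack[top++] = m - 1;    /* A(m, 0) = A(m-1, 1) */
                      n = 1;
                  } else {
                      stack[top++] = m - 1;    /* A(m, n) = A(m-1, A(m, n-1)) */
                      stack[top++] = m;
                      n = n - 1;
                  }
              }
              free(stack);
              return n;
          }

          int main(void)
          {
              printf("A(2, 3) = %lu\n", ackermann(2, 3));  /* expect 9  */
              printf("A(3, 3) = %lu\n", ackermann(3, 3));  /* expect 61 */
              return 0;
          }

          The same trick works in Rust with a Vec; the recursion limit is a property of leaning on the call stack, not of any particular language.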

      • (Score: 2) by driverless on Monday July 10 2023, @09:46AM

        by driverless (4770) on Monday July 10 2023, @09:46AM (#1315366)

        Based on the fact that I can write it now in PL/I and in a year's time, five years, ten years, twenty years, I can compile it and it'll still work. With Rust I'll have had to rewrite it fifteen times and rework my build environment and toolset twenty times, and even then I can't be sure it won't be futzed around with a few days later and require another rework.

        In production, stability and predictability are far more important than ooh, shiny! And Rust has very little of the former even if it does have endless amounts of the latter.

    • (Score: 0) by Anonymous Coward on Sunday July 09 2023, @05:58PM

      by Anonymous Coward on Sunday July 09 2023, @05:58PM (#1315259)

      "I can't imagine a programming language better suited to a domain I'm not familiar with. There haven't been new domains to apply computing to since 1952."

      Today, I resist the urge to insult your small-mindedness. I'll just point it out.

  • (Score: 4, Interesting) by gznork26 on Sunday July 09 2023, @03:48AM (3 children)

    by gznork26 (1159) on Sunday July 09 2023, @03:48AM (#1315189) Homepage Journal

    The idea that PL/1 could be the Programming Language to Rule Them All seemed to have come to its figurative head when I was on a project for the US Air Force at McDonnell-Douglas in 1981. The company was involved in a competition against Boeing, and I'd been hired on because I had worked in Fortran, COBOL and Assembler. The COBOL was essential because it was the only language that was supported for IBM's magnetic-belt mass storage system, for which I spent time making bit-level node maps of data stored on the thing.

    Anyway, another part of the overall project ended up needing parts coded in COBOL, Fortran, Assembler and PL/1 to solve various tricky problems. We'd addressed all of the individual problems in the language most suited to it, and were running into trouble integrating the parts, which needed to pass data between portions written in various languages. By a lucky fluke, one of the people discovered that PL/1 routines would only work properly if the main entry point for the entire thing was written in PL/1. So we ended up with a very short MAIN, which basically called the functional MAIN, and then exited. No clue why, but that was the secret sauce.

    --
    Khipu were Turing complete.
    • (Score: 2) by istartedi on Sunday July 09 2023, @04:17AM

      by istartedi (123) on Sunday July 09 2023, @04:17AM (#1315190) Journal

      Sort of wonder how this happens. Was there any documentation on the calling convention used by PL/1? That's where I'd start looking, but I have no idea where to go from there if that wasn't the problem. It sounds like the kind of project where you might have gotten disgusted to the point of not caring about voodoo code as long as it worked.

      --
      Appended to the end of comments you post. Max: 120 chars.
    • (Score: 2) by coolgopher on Sunday July 09 2023, @04:45AM

      by coolgopher (1157) on Sunday July 09 2023, @04:45AM (#1315194)

      Sounds like a runtime initialisation issue, just like the .bss zeroing you have as part of the C runtime startup, without which you end up with weird and wonderful effects (been there, done that, didn't care for it).

      Of course, having not actually touched PL/I, this is only an educated guess. YMMV, caveat emptor, etc.
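
      A minimal C sketch of that kind of startup dependency (the variable names are hypothetical; the guarantee itself is standard C):

      #include <stdio.h>

      /* Static-duration objects are guaranteed to start zeroed, but only
       * because the startup code (crt0) clears the .bss segment before
       * main() runs. On bare metal, a reset handler that skips that step
       * leaves `hits` holding whatever happened to be in RAM. */
      static unsigned hits;        /* uninitialized: placed in .bss  */
      static unsigned base = 100;  /* initialized:   placed in .data */

      int main(void)
      {
          printf("hits=%u base=%u\n", hits, base);  /* hits=0 base=100 */
          return 0;
      }

      If the PL/1 runtime only did its equivalent setup when the entry point was PL/1, that would line up with the mystery above.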

    • (Score: 2) by HiThere on Sunday July 09 2023, @01:46PM

      by HiThere (866) Subscriber Badge on Sunday July 09 2023, @01:46PM (#1315235) Journal

      I really liked PL/1, but even in the early days there were a lot of dialects, and I never ran across a compiler that implemented the entire thing. I did my first red-black tree in PL/1. Doing it in FORTRAN just seemed like something crazy to even attempt, and I didn't have access to an Algol compiler. C may have been around, but I didn't encounter it for another decade.

      --
      Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
  • (Score: 3, Insightful) by Beryllium Sphere (r) on Sunday July 09 2023, @07:59AM (5 children)

    by Beryllium Sphere (r) (5062) on Sunday July 09 2023, @07:59AM (#1315202)

    Fortran was so neurotic about data types that you couldn't add an int to a float in the early versions.

    PL/I overreacted in the other direction, converting data types automatically with what one writer called "wild abandon".

    A colleague called me in to help with a bug. It was one of those "It can't possibly be doing that!" bugs.

    As best I remember, he assigned 1 to something but somehow got it into a string. You might expect the string would have been "1", but it was padded with leading zeroes.

    Somehow, the next thing he did truncated it, so all that was left was one of the leading zeroes.

    I probably have some details wrong.

    It was also a resource hog. A relative who worked for IBM back in the day said the problem was a failure to ship subsets that could run practically.

    More readable than Fortran, though, and full of luxury features like a "PUT DATA" statement that printed the compile-time name of a variable along with its value.
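
    For anyone who never saw it, a rough C analog of that PUT DATA convenience, using the preprocessor's stringizing operator (PL/I did this natively; the macro and variable names here are made up for illustration):

    #include <stdio.h>

    /* Print a variable's name alongside its value, roughly what
     * PL/I's PUT DATA did for you without any macro tricks. */
    #define PUT_DATA(var) printf(#var " = %d;\n", (int)(var))

    int main(void)
    {
        int widgets = 42;
        PUT_DATA(widgets);  /* prints: widgets = 42; */
        return 0;
    }

    As written it only handles integer values; PUT DATA handled any variable, which was the luxury part.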

    • (Score: 2) by sgleysti on Sunday July 09 2023, @03:38PM (3 children)

      by sgleysti (56) Subscriber Badge on Sunday July 09 2023, @03:38PM (#1315244)

      Fortran was so neurotic about data types that you couldn't add an int to a float in the early versions.

      Well, if you're using FORmula TRANslation to write your program, surely all of the variables for scalars, vectors, and matrices in the numerical code should be REAL, COMPLEX, DOUBLE PRECISION, or DOUBLE COMPLEX. INTEGER variables are used to index elements of vectors and matrices. I'm not sure why you would want to convert between them ;)

      We even made it easy on you! "The default implicit typing rule is that if the first letter of the name is I, J, K, L, M, or N, then the data type is integer, otherwise it is real." See? I, J, K, L, M, and N for indices, just like the paper or textbook describing the numerical algorithm you're trying to code, INTEGER by default, everything else is REAL. Give the people what they want.

      • (Score: 3, Informative) by hubie on Sunday July 09 2023, @06:06PM (1 child)

        by hubie (1068) Subscriber Badge on Sunday July 09 2023, @06:06PM (#1315260) Journal

        In FORTRAN, GOD is REAL (unless declared INTEGER)

      • (Score: 3, Funny) by driverless on Monday July 10 2023, @09:57AM

        by driverless (4770) on Monday July 10 2023, @09:57AM (#1315367)

        "The default implicit typing rule is that if the first letter of the name is I, J, K, L, M, or N, then the data type is integer, otherwise it is real."

        Which is still with us today in pretty much every programming language and teaching text: The universal primary loop induction variable is i, followed by j, followed by ..., because they were sacred to the ancient Egyptians or some other reason lost in the mists of time. The Ancients mutter to themselves about some demonic being called FORTRAN or ZALGOL or something but that's just a myth.

    • (Score: 0) by Anonymous Coward on Sunday July 09 2023, @06:52PM

      by Anonymous Coward on Sunday July 09 2023, @06:52PM (#1315265)

      you couldn't add an int to a float in the early versions.

      No problem. Just create an array of all the floats and index into that whenever you want to convert. Then to get back to integer, just use a bunch of if-else statements. Easy-peasy, and the first algorithm is O(1), so very fast. /sarcasm.

      In all seriousness though, some combination of division and modulo (assuming they're supported) could be used to pull this off. In any event, working around the restriction isn't that bad; but annoying enough that I can see why they didn't keep it like that too long.

  • (Score: 1, Insightful) by Anonymous Coward on Sunday July 09 2023, @10:13AM

    by Anonymous Coward on Sunday July 09 2023, @10:13AM (#1315213)

    Speaking as someone who has delved into the intricacies of PL/I, I am sure that only Real Men could have written such a machine-hogging, cycle-grabbing, all-encompassing monster. Allocate an array and free the middle third? Sure! Why not? Multiply a character string times a bit string and assign the result to a float decimal? Go ahead! Free a controlled variable procedure parameter and reallocate it before passing it back? Overlay three different types of variable on the same memory location? Anything you say! Write a recursive macro? Well, no, but Real Men use rescan. How could a language so obviously designed and written by Real Men not be intended for Real Man use?

    Source: https://motd.ambians.com/quotes.php/name/freebsd_fortunes/toc_id/1-0-2/s/2618 [ambians.com]

  • (Score: 3, Interesting) by VLM on Sunday July 09 2023, @11:13PM (2 children)

    by VLM (445) on Sunday July 09 2023, @11:13PM (#1315298)

    The entire point of PL/I is that the first compiler was written, IIRC, in System/360 basic assembly language, because it was supposed to be a package deal.

    Up until the '360, IBM always sold two product lines: a business processor, more or less unit record equipment on steroids, really good at branching, sorting, counting, and thruput but not so much floating point; and a separate hardware product line for scientific processing which offered all kinds of floating and decimal ops.

    Turing being Turing, anything that can "compute" can simulate something else that computes, but it'll take extra memory and be really slow. So the IBM business line could do an incredibly shitty job of trigonometry if it was needed for some kind of billing or something, or the IBM science line could bravely count punch cards and do taxes, although pretty slow compared to the business hardware line.

    And this separation went up to programming languages. Business hardware mostly ran compiled COBOL and Science hardware mostly ran FORTRAN.

    The suits sold everyone on combined hardware, the System 360, which like a full circle could do it all, business and science. This came along with a universal language to rule them all, PL/I. It never went anywhere because the sysprogs and software devs all continued to use FORTRAN or COBOL as appropriate, and PL/I was a total WTF, why would anyone want that? All our existing science code is written in FORTRAN and all our general ledger stuff is written in COBOL, so WTF would I write in PL/I? The answer turned out to be "nothing".

    A side problem is IBM being IBM, this was the era before GNU, when software was expensive. You want this new PL/I compiler? Sure, it'll be $10K and your employee productivity will be zeroed for a couple weeks/months as they get used to PL/I. Uh, no thanks, I'll stick with COBOL/FORTRAN.

    This is all somewhat before my time but not before my dad's time, and I heard stories growing up... (Insert Star Wars quote manipulated to fit, along the lines of "this was your father's lightsabre, a civilized weapon from a more civilized time..." etc.) Also I have an Aunt who was a sysprog for a paper company (which might almost dox her; how many female sysprogs were there in the 70s at multinational megacorporations doing paper mfgr? She's prob the only one ever...) and she and her bro (my dad) would swap stories when they got together about IBM bullshit. Then I got a job at a mega finance services company in the 90s so that meant I got to meet all the old timers from IBM and hear stories. It kind of runs in my blood, like midi-chlorians I guess.

    • (Score: 2) by gznork26 on Monday July 10 2023, @03:28AM (1 child)

      by gznork26 (1159) on Monday July 10 2023, @03:28AM (#1315327) Homepage Journal

      MIDI-chlorians? Doesn't that mean Jedi ought to be natural musicians?

      --
      Khipu were Turing complete.
      • (Score: 2) by VLM on Monday July 10 2023, @10:23PM

        by VLM (445) on Monday July 10 2023, @10:23PM (#1315476)

        Don't want to know where the DIN plug goes. But yeah, that's both why lightsabers make whooshy sounds, and why Jedi move so fast, what with a 31.25K baud rate. I myself could barely keep up with reading BBS screens back in the 1200 baud days and 2400 baud was often too fast, so your average Jedi runs about "twenty times" my bandwidth.

  • (Score: 1, Informative) by Anonymous Coward on Monday July 10 2023, @06:30AM

    by Anonymous Coward on Monday July 10 2023, @06:30AM (#1315346)

    you guessed it.
