Freelance software developer Tim Coates has written a short post about how Pascal remains a valuable and usable language.
Special Note: When I mention Pascal, I'm including versions like Delphi, Free Pascal, Lazarus, and others that have developed over the years. And regardless of which version you use, these variations of Pascal each bring something unique to the table. Together, they keep Pascal relevant and versatile, offering a range of tools for both new and experienced developers.
Pascal was created by Niklaus Wirth in 1970.
Previously:
(2024) RIP: Niklaus Wirth 15.2.1934 - 1.1.2024
(2023) Thinking Back on 'Turbo Pascal' as It Turns 40
(2023) Pioneering Apple Lisa Goes "Open Source" Thanks to Computer History Museum
(2020) ALGOL 60 at 60: The Greatest Computer Language You've (Probably) Never Used
(2018) UCSD Pascal Pioneer Ken Bowles Has Passed Away
(2018) Original Version of Photoshop Was Written in Pascal; Source Released
(2016) The Developer Died 14 Years Ago, Here's a Print Out of His Source Code
Related Stories
This week, reader “Earl” tells us that just this year he responded to “a Craigslist ad for a Novell NetWare Admin to figure out why .nlm files would not be loaded and fix the issue.”
[...] The return call came “almost instantly” and Earl “gave them my expensive price and advised them that I was not the first choice for a NetWare admin, but I had extensive system troubleshooting experience.”
Those caveats didn't matter: the person who placed the ad said he'd run it for months and months and never had a reply from anyone in the USA. Earl was just 90 minutes away by train and got the gig.
When Earl visited the site, he was told that an electrical storm had taken out the NetWare server and the Windows 95/98 clients. Said server was a Dell PowerEdge 1300 with 64MB of RAM and a 10GB IDE hard drive. Earl reckons it was built in 1997 or 1998, so he was a bit taken aback when told this was “the new server”.
[...] Next came a request to boot up the Compaqs, which had power supply and fan failures. A request to swap the disks from the dead Compaqs was not something Earl could do, as they had tossed out the necessary SCSI cables a few years back.
Earl was asked to do all of these things so the company could run its bespoke accounting program, which was written for it in 1993.
The developer, it turned out, had died in 2001. But the source code was in the company safe … on about 2000 pages of dot matrix printer paper. And there were backups of the old data … on 20 years worth of floppy disks and a pair of CD-ROMs.
[...] Earl told the company that they'd need a working server, running NetWare, before he could even begin to contemplate the task of typing in the source code so he could see if the backups could be restored. Then he'd have to hope that a Pascal compiler could cross-compile for NetWare to have even a chance of setting things to rights.
To the company's credit, it tried hard to meet his requests. Two weeks later Earl says he returned to the company, where a working PowerEdge 1300 with a PCI network card awaited.
[...] But he didn't have his own monitor.
[...] He somehow got to work. DOS 6.22 and all the device drivers “installed like a charm”. NetWare 4.1 installed. It was seen by both Windows 95 and 98 on the frail network. Now it came time to restore the application.
But it turned out that the stack of disks contained only data, not the application. Even the 10MB disk from the “old” server was uselessly corrupted.
Earl tried to explain this problem, but the client was having none of it and showed him the door.
Earl tells The Register the client owes him about US$5,000.00 for his time and is showing no signs of paying up. At least he didn't have to re-type all that source code: perhaps there weren't enough keyboards in the office!
Thomas Knoll, a PhD student in computer vision at the University of Michigan, had written a program in 1987 to display and modify digital images. His brother John, working at the movie visual effects company Industrial Light & Magic, found it useful for editing photos, but it wasn’t intended to be a product. Thomas said, “We developed it originally for our own personal use…it was a lot of fun to do.”
Gradually the program, called “Display”, became more sophisticated. In the summer of 1988 they realized that it indeed could be a credible commercial product. They renamed it “Photoshop” and began to search for a company to distribute it. About 200 copies of version 0.87 were bundled by slide scanner manufacturer Barneyscan as “Barneyscan XP”.
The fate of Photoshop was sealed when Adobe, encouraged by its art director Russell Brown, decided to buy a license to distribute an enhanced version of Photoshop. The deal was finalized in April 1989, and version 1.0 started shipping early in 1990.
Arthur T Knackerbracket has found the following story:
Ken Bowles, a UC San Diego software engineer who helped popularize personal computers in the 1970s and '80s through advances that were exploited by such entrepreneurs as Apple's Steve Jobs, died on Aug. 15 in Solana Beach. He was 89.
His passing was announced by the university, which said that Bowles, an emeritus professor of computer science, had died peacefully.
Bowles was not well-known to the general public. But he was famous in computer science for helping researchers make the leap from huge, expensive mainframe computers to small "microcomputers," the forerunner of PCs.
He was driven by the desire to make it faster and easier for researchers and programmers to work on their own, and to develop software that could be used on many types of computers.
ALGOL 60 at 60: The Greatest Computer Language You've (Probably) Never Used:
2020 marks 60 years since ALGOL 60 laid the groundwork for a multitude of computer languages.
The Register spoke to The National Museum of Computing's Peter Onion and Andrew Herbert to learn a bit more about the good old days of punch tapes.
ALGOL 60 was the successor to ALGOL 58, which debuted in 1958. ALGOL 58 had introduced the concept of code blocks (replete with begin and end delimiting pairs), but ALGOL 60 took these starting points of structured programming and ran with them, giving rise to familiar faces such as Pascal and C, as well as the likes of B and Simula.
"In the 1950s most code was originally written in machine code or assembly code," said Herbert, former director of Microsoft Research in Cambridge, with every computer having its own particular twist on things.
[..] "Fortran," said Herbert, "emerged as the first real programming language for scientific and numeric work. That convinced people that having higher-level languages (as they called them then – they were pretty primitive by modern standards) made programmers more productive."
[...] "And a bunch of people thought you could do better."
[...] One group started on the design of what was then called an "Algorithmic Language": a language for writing algorithms. The output, in 1958, described the language "ALGOL 58". However, as engineers began to create compilers for the new system, they found "all kinds of things hadn't really been thought about or worked through properly," recalled Herbert.
Lisa OS 3.1's 1984 source Pascal code now available under a non-commercial license:
As part of the Apple Lisa's 40th birthday celebrations, the Computer History Museum has released the source code for Lisa OS version 3.1 under an Apple Academic License Agreement. With Apple's blessing, the Pascal source code is available for download from the CHM website after filling out a form.
Lisa Office System 3.1 dates back to April 1984, during the early Mac era, and it was the equivalent of operating systems like macOS and Windows today.
The entire source package is about 26MB and consists of over 1,300 commented source files, divided nicely into subfolders that denote code for the main Lisa OS, various included apps, and the Lisa Toolkit development system.
First released on January 19, 1983, the Apple Lisa remains an influential and important machine in Apple's history, pioneering the mouse-based graphical user interface (GUI) that made its way to the Macintosh a year later. Despite its innovations, the Lisa's high price ($9,995 retail, or about $30,300 today) and lack of application support held it back as a platform. A year after its release, the similarly capable Macintosh undercut it dramatically in price. Apple launched a major revision of the Lisa hardware in 1984, then discontinued the platform in 1985.
The Lisa was not the first commercial computer to ship with a GUI, as some have claimed in the past—that honor goes to the Xerox Star—but Lisa OS defined important conventions that we still use in windowing OSes today, such as drag-and-drop icons, movable windows, the waste basket, the menu bar, pull-down menus, copy and paste shortcuts, control panels, overlapping windows, and even one-touch automatic system shutdown.
Several sites are reporting on the 40th anniversary of Turbo Pascal.
At the vintage computing web blog, Byte Cellar:
November marked the 40th anniversary of Turbo Pascal, the first Integrated Development Environment (or IDE), which allowed a user to quickly and easily write a program in the Pascal programming language and see it compiled and linked, all in one go, with an executable dropped to disk at the end. This was a much simpler process than the traditional model of programming in a text editor, using a compiler to convert the source into object code (often over several passes), and running a linker to integrate any required libraries. Turbo Pascal was friendly, fast, and cheap. Created by Anders Hejlsberg, the development package was released by Borland in November 1983 at a price of $49.95 for both CP/M and DOS-based systems.
Created by Niklaus Wirth in 1970, Pascal is a small and efficient procedural programming language that is easy to use and, thanks to its structured programming nature, was often employed as a language for learning programming concepts at a level higher than traditional, early BASIC. It is in this capacity that I had my first hands-on experiences with the language in an A.P. Computer Science class I took in high school during the late ’80s. Here, at its 40th anniversary, I thought I would share some memories I have with Turbo Pascal.
And over at The Register:
https://lists.inf.ethz.ch/pipermail/oberon/2024/016856.html
I am deeply saddened to have received the news of Niklaus Wirth's passing and extend my heartfelt condolences to his family and all those who were dear to him. I wanted to take a moment to reflect on the profound and positive impact that Niklaus had on my life and career, and to express my gratitude for all that he meant to me.
(Score: 5, Informative) by driverless on Wednesday November 06, @12:58AM (3 children)
Once you got to a non-crippled variant like Turbo Pascal and its descendants it was actually a pretty decent language, like a cleaner, safer version of C.
(Score: 5, Interesting) by stormreaver on Wednesday November 06, @01:49AM (2 children)
When I was the librarian for my local CoCo computer group back in the 90's, we had a copy of a C compiler and a Pascal compiler in the library. I hadn't settled on which language I was going to learn next (I had learned BASIC and assembler), so I grabbed the Pascal disk and installed it. Nothing worked, so I tried the C compiler. It worked, so I decided to learn C. That bit of serendipity started me down a path that would not have happened had the Pascal compiler worked. That always amuses me.
(Score: 2) by VLM on Wednesday November 06, @01:56PM (1 child)
The OS-9 ones? I remember the big problem I had with the OS-9 C compiler was that it was K&R C; later on the world moved to ANSI C and I found the change just large enough to be quite annoying.
I never experimented with the Pascal one.
Back in the OS-9 Level 1 days, with 35-track single-sided disks, we were ALWAYS out of both memory and disk space, usually at the same time; I would suspect that was the problem?
(Score: 2) by stormreaver on Wednesday November 06, @02:31PM
I'm pretty sure they were the OS-9 ones. The C compiler was most definitely K&R. Since it was my introduction to C, I didn't have the culture shock of ANSI changes.
I never figured out why the Pascal compiler didn't work, as (if I remember correctly) there were no messages of any kind. The compiler started and stopped with no diagnostics, and there was no binary generated.
(Score: 4, Informative) by looorg on Wednesday November 06, @01:00AM (1 child)
So did anyone bother to watch his YouTube channel? Because as far as I can tell the article does not mention why Pascal deserves a second look.
I have been looking into Turbo Rascal; it seems somewhat interesting. It's Pascal for 8-bit machines (C64 etc.). For me I guess this is as close as I have gotten to Pascal since being taught it about 30 years ago.
https://hackaday.com/2021/11/30/turbo-rascal-is-the-retro-pascal-compiler-we-always-wanted/ [hackaday.com]
(Score: 2) by HiThere on Wednesday November 06, @09:37PM
Pascal was quite a good language. It was good enough that at one point Apple picked Pascal rather than C as its professional programming language.
However, it suffers from poor documentation, and it's not friendly to Unicode. Neither of those is an inherent problem, but both are present. And it doesn't have any decisive advantages over C.
I've often thought that it was a pity that Modula-2 wasn't taken up more widely, but these days C++ is a far better choice. It's also too bad that Algol languished, but it did, and currently I don't think there would be any advantage in reviving it. (I never really got a good look at Algol 68 though. Perhaps it had some features that would still be desirable.)
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
(Score: 5, Insightful) by Snotnose on Wednesday November 06, @01:10AM (14 children)
I was a self taught programmer in the late 70's early 80's. I learned assembly and got hired to write C. Without a degree.
Figured out real quick that if I didn't get a college degree my job prospects were on the Jeffrey Epstein running a daycare level. Went to college, took a required Pascal class. On every fricken assignment I could think of 3 different ways to solve the problem in C, and 0 in Pascal. I think I got a C in that class. I loved Turbo Pascal, that was eye-opening. I hated Pascal.
IMHO, a much better teaching language is Java. Pascal was good in its day, but Java is its modern successor.
Of course I'm against DEI. Donald, Eric, and Ivanka.
(Score: 1, Insightful) by Anonymous Coward on Wednesday November 06, @04:01AM (6 children)
UGH. I have to work with idiots that come from schools that teach them Java as a primary language. Their skills are shit. I had one even throw a hissy fit because he has to actually write code ("what do you mean there isn't a library for it?!"). Java and its ilk are toys that create crappy developers.
(Score: 5, Insightful) by owl on Wednesday November 06, @04:22AM (1 child)
Java's meant to appeal to the enterprise market, where a company (or govt. agency) hires a bunch of H1B hires to clump together pre-written libraries into a Rube Goldberg machine to barely do bespoke task X.
The H1B hires could not write any of the library code if they tried, they simply can't write any algorithm more complicated than moving data from function call X and stuffing it into function call Y.
(Score: 0) by Anonymous Coward on Wednesday November 06, @05:54PM
This resonates.
I worked in a university science lab and in the good ol' days it was nerds figuring out how to do things themselves. Now it's managers asking why you can't just download someone else's widget and use it. Just get a result - anything! - as quick as possible, who cares if you understand what it does, just use it and move on.
(Score: 4, Insightful) by theluggage on Wednesday November 06, @11:05AM (2 children)
Don’t blame the language - the problem is schools passing off narrow courses in Java Forms as computer science, to meet demands for cheap “programmers” to literally turn out CRUD.
You could teach serious, fundamental computer science and algorithm design using, I dunno, PHP (that’s kinda a major point behind CS) or just as easily teach shallow string-together-libraries/CRUD programming in Delphi or Python.
(Score: 3, Touché) by crm114 on Wednesday November 06, @03:46PM (1 child)
CRUD - Create, Read, Update, Delete
Sounds like a DBMS major to me. You sure the schools know the difference?
(Score: 3, Interesting) by VLM on Wednesday November 06, @06:25PM
For laughs I compared my degree requirements in 2005 to 2024 and they STILL require "Database Basics", more or less Codd normal forms and SQL, probably including REST APIs now.
In other news, the old compiler class looks simpler and uses newer tools. They still have computer architecture as an elective; I think you'll be a pretty poor programmer without learning that stuff. They've added a ton of electives that you would expect. Game design, AI, data science, cybersecurity, mobile app development, if it was "cool" in the last decade or two, they have an elective for it.
I vaguely remember using GNU Flex and Bison to write a text adventure that used very weird functionality, more or less. It was definitely manipulating the concept of a text adventure to create a Flex/Bison exercise rather than using the tools to implement a "normal" text adventure. I think nowadays they would use ANTLR. All I remember of BNF today, is I did not like it.
(Score: 3, Insightful) by HiThere on Wednesday November 06, @09:44PM
Your complaint has value, but it's really wrong. If you have to keep reinventing the wheel your code will take a lot longer to write, and likely won't be any better. (Actually, it will likely be worse, because presumably the libraries have had lots of debugging.)
FWIW, I started out with assembler, and moved from there to FORTRAN IV, so I've spent lots of time working without decent libraries. If there's a decent library, I'd prefer to use it rather than write my own version. These days I usually use C++, but BELIEVE, I depend on libraries. I'm picky about them, but I depend on them. When is the last time you implemented a radio button from scratch?
(Score: 4, Insightful) by ledow on Wednesday November 06, @01:48PM (6 children)
25+ years ago I graduated with honours.
One of the courses I was made to attend for three years was Introduction to Programming, which was entirely Java.
It was pointless and out-of-date even back then, and I literally handed in the assignments (by FTP, no less!) without ever attending a lecture (and passed through it with high marks).
I'd been programming in everything from VB to C to assembly (Z80 and x86) throughout my childhood, and Java was an awful language for teaching, and I would contend it's still an awful language for the things it's used for (even with Oracle etc. licensing aside). It was *supposed* to be portable, platform independent, a secure isolated "VM", etc. etc. and it never delivered any of that in any practical sense (I always loved hearing my lecturer moan about things in email because it was a dual-boot campus, NT and Linux, and almost all the assignments he got handed back were so reliant on Windows-only features that he started to just mark them zero if they didn't work on Linux too... I never had that problem).
Java is a completely different beast to something like C or Pascal, and old Java knowledge is basically useless nowadays.
I don't understand why any particular language is regarded as "good" for this - it's like saying that Chinese is the best language to grow up with because more people speak it, or Indonesian because it has fewer vowels (or whatever). It's a dumb way to teach communication.
And I don't get why we exclude legacy languages that were literally DESIGNED for education just because they're a bit outdated. We all started reading See Spot Run and then progressed, it's fine, we don't have to Lord of the Rings people on their first day of programming. And likely by the time ANY of those kids/students/adults get into the workplace, the actual language they'll need to know will be entirely different and probably didn't exist (e.g. technically Python existed while I was at university, and even when I was just a kid... I never heard anything of it until nearly 10 years later... nobody could predict that it would be a major language. And Rust is only 9 years old).
There is no "teaching language". What there are are "teaching methods" and it's a perfectly viable teaching method to start with a simple language, even a made-up one, that works to progress students rather than trying to lump them into a current-day commercially-viable complex language from day one and then wonder why it's all changed by the time they get to the workplace.
(Score: 3, Interesting) by hendrikboom on Wednesday November 06, @03:51PM (3 children)
That's exactly the philosophy behind Racket's teaching languages. They are (more or less) subsets of the full Racket language, designed for use at different stages of programming knowledge. They avoid letting the beginner jump in at the deep end and get lost. But as you progress through the learning process, you get more and more of the features of the full Racket language in the teaching language you've reached.
And if you're doing self-study, and want to jump ahead, you do indeed have the complete Racket language available. The program that implements the teaching languages also implements the complete language (and several other programming languages as well, for that matter. Even Algol 60, about as far removed from Lisp's syntactic conventions as you can get.). The choice of language is made on the first line of the program, which declares which language you're going to use.
(Score: 3, Funny) by DannyB on Wednesday November 06, @09:13PM (2 children)
Despite my love of Lisp in its multitudinous dialects, I find the following quote (not mine) amusing . . .
Lisp has all the visual appeal of oatmeal with fingernail clippings mixed in.
People who can't distinguish between etymology and entomology bug me in ways I cannot put into words.
(Score: 3, Interesting) by hendrikboom on Thursday November 07, @07:26PM (1 child)
The parentheses are probably the fingernail clippings, right?
One of Racket's many sublanguages is Rhombus [github.com]. It has different syntax [youtube.com] from (but compatible semantics to) Racket. It needs far fewer parentheses.
(Score: 2) by DannyB on Friday November 08, @03:27PM
At least, like Java and Clojure, it allows Unicode characters in identifiers.
I can't overemphasize how important it is for code readability to use emoji characters in programming language identifiers.
(Score: 2) by HiThere on Wednesday November 06, @09:52PM (1 child)
Yes, but I'm not convinced that Scratch is really a decent way to learn programming, even if that's what it was designed for.
And Pascal is a lot better than just a toy language. But I'm not sure it's better than C for learning. In either case you start with a subset of features and build from there.
Someone could make a good argument for Lisp, Erlang or Forth, because they're really different languages than C, but Pascal is just about the same language, only differing in a few syntax rules. (The only significant one I can think of is NOT in Pascal's favor. It has excessive visible scope for variables.)
(Score: 2) by ledow on Thursday November 07, @08:37AM
Scratch isn't a programming language. It's a flowcharting "control" tool. It's like Logo, but without the interactivity and the written language of Logo.
I work in UK schools and have to explain this all the time.
One time, a new teacher joined the school and was desperate to show how "good" they were and insisted that the Year 7 / Year 8 (12/13 year olds) must learn Scratch.
He was telling me this while I set up his accounts for the first day for him to use them. I advised him against it, and it was taken rather badly because I'm "not a teacher".
"Sure, I understand that. I do have a degree in CS though. And I think our kids might hate it and I'd advise you to talk to our head of IT teaching."
"Why?"
"Well, the kids did Scratch in Years 3 and 4, and progressed past it and they're all programming in Python out there."
The look on his face was a picture.
He never taught any class for IT after that.
(Score: 4, Interesting) by sjames on Wednesday November 06, @04:37AM (1 child)
I tried out the UCSD p-system and Pascal on the Apple][. It seemed like a good language, but it was SO adamant about its types that it seemed like there was an impenetrable barrier between it, the real world, and even the hardware itself. I know that CAN be a good thing, but not for general purpose use unless a well integrated escape hatch exists. In C, you just set a pointer and dereference it. BASIC had PEEK and POKE. Casting a void binary blob to anything was right out. That could have been OK had there been a module for it, just name it HereBeDragons and consider the programmer warned.
I honestly think it could have given C a run for its money if it could have just been a little less uptight.
Of course, when Java came out in the '90s and the tech media talked breathlessly about Java being the first write once run anywhere because of the JVM, I thought of the p-system from '69 that I tried in the early '80s and laughed my ass off.
(Score: 3, Informative) by DannyB on Wednesday November 06, @02:59PM
In my first job out of college, I was using the UCSD p-System Pascal along with Apple Pascal. These were extremely similar because the p-System was version IV while Apple Pascal was built from p-System version II.
If you wanted to do C-like peek/poke stuff there were a few tricks we used.
I'm going from 42 years of memory here, so please forgive any mis steaks, but approximately something like this . . .
First trick:
TYPE
  BYTE = -128..127;
  BYTES = PACKED ARRAY [0..32767] OF BYTE;
  PBYTES = ^BYTES;
  ADDR = RECORD
    CASE Integer OF
      0: (addr: Integer);    (* the address, viewed as an integer *)
      1: (bytes: PBYTES);    (* the same bits, viewed as a pointer to bytes *)
  END;
VAR
  Memory : ADDR;
BEGIN
  Memory.addr := 0;
  (* peek and poke *)
  Memory.bytes^[16384] := 22;       (* poke *)
  WRITELN( Memory.bytes^[16385] );  (* peek *)
END.
Second trick:
We also wrote an "Addr()" function and declared it as external. I don't remember the syntax. But it took a pointer of any type, and returned an ADDR (which had both the address accessible in integer form and as pointer to bytes form. We then wrote that function with about two lines of assembly code (separately on both IBM PC and Apple). All it did was return its argument. But to the Pascal compiler, it was an external function that took any type of pointer at all, a pointer to any structure in memory, and returned an address of if so you could then do byte fiddling on that structure. Such as a record. Or structure of some low bytes in memory, or anything. It suddenly becomes about as easy as C to do a lot of "evil" things that "you can't do in Pascal".
That sounds like a lot of work. But you only have to put it in one unit. Then simply USE (equivalent of "import" or "include") that unit to have all the magic available to you.
Danger Will Robinson!
Now any rare bits of hardware or other fiddling we would do would always be carefully confined to a select few UNITS (eg, modules linked at runtime).
On the Apple II and Apple /// we could manipulate screen memory directly and at very good speed.
On IBM PC, not so much. I had to write 8086 assembly (using p-System assembler, with a unit in Pascal declaring the few assembly functions). Those functions did direct screen manipulation on the PC so we could scroll arbitrary rectangular areas of text in any direction at high speed, blank them, capture and restore them (eg, we could create all sorts of pop up text windows and dialog boxes before there were GUIs).
Apple liked it so much I still have (somewhere) a copy of Apple's user interface guidelines (about 1983-ish) autographed by Bruce Tognazzini.
All of this magical user interface trickery wasn't used in any mass market software. It was a specialized vertical market accounting system. So you probably never heard of it.
Those were fun daze. But the things I did later on the classic Macintosh [soylentnews.org] greatly eclipsed that.
(Score: 2) by YeaWhatevs on Wednesday November 06, @06:02AM (1 child)
It was alright, but I discovered it was rather limited as I learned C. Maybe modern variants bring what was missing, but I doubt it.
(Score: 2) by DannyB on Wednesday November 06, @03:12PM
My first language out of college was Pascal. See the abominations I did with Pascal, just above your post. :-)
Niklaus Wirth would have had an early demise if he saw it.
(Score: 4, Informative) by bzipitidoo on Wednesday November 06, @09:28AM
I learned Applesoft BASIC, then 6502 assembler, and then Pascal (Turbo Pascal 3.0 on the PC). The Apple had a version of Pascal, but it was a pain to use. Next was PL/1, VAX assembler, then C. Throw in some Prolog, and LISP (Scheme). I found those 2 languages quite mind-bending after having grown accustomed to imperative programming. Turbo Pascal 4.0 was nice when it came out. The binaries it created were smaller and faster than those created by 3.0. And IIRC, 4 was the start of syntax highlighting. 3 did not have that. I was sold. I very briefly dabbled with what I believe were intended to be Pascal's successors, Modula-2 and Modula-3.
Next was C++ and the OOP paradigm. Since Borland Turbo Pascal was so good, I did not hesitate in trying Borland's Turbo C++ 2.0, then Borland C++ 4.5, and 5.x. Sadly, Borland's C++ compilers had some very bad bugs. Switched to gcc/g++ on Linux. Learned FORTRAN 77, and a bit of Java. Learned Perl and SQL on the job. Learning Perl was pretty easy as I was already familiar with shell scripting (bash), and C/C++ was the language I used the most. SQL is a little trickier. One thing I learned to do in SQL is keep the queries simple, and make copious use of temporary tables. Really weird how an SQL query with 2 joins might take hours to complete, but that same query takes less than a second if split into 2 parts with the results of the 1st join dumped into a temporary table upon which the 2nd join is performed. Possibly that was due to the engine used (Informix) and was not necessarily an issue with SQL itself.
Anymore, it doesn't matter whether I know a particular language; I can pick it up whenever I need. My experience with Java is very dated now. Played around with JavaScript, and Python to see if they were worth a deeper dive. In JavaScript, I worked with a lot of SVG. I did JavaScript in a "pure" way, that is, no JQuery or other library. No TypeScript either. But I dunno, maybe you can't be said to really know JavaScript unless you also know JQuery.
So now, to have someone talk up Pascal ... there are reasons why I don't use Pascal any more. Good reasons. The syntax is more verbose, and it doesn't have the libraries. I'd much rather use curly braces than type out BEGIN and END over and over. I do not know if Lazarus, Delphi, and so forth have the really nice conveniences of built in hashing (associative arrays), dynamic sized arrays, regexes, OOPy stuff, and such like that have become pretty standard, but I know Turbo Pascal sure didn't have any of that. Yes, Pascal is a good teaching language. After all, that was its original purpose. But no, today I wouldn't pick that language for anything really, not even teaching.
(Score: 3, Insightful) by theluggage on Wednesday November 06, @11:27AM (7 children)
Which is probably why Pascal “lost” - bog-standard K&R C - warts and all - was perfectly usable (in its time) to write useful applications and operating systems. Implementations for personal computers were joined at the hip with a useful subset of the standard-ish UNIX libraries, and this was formalised by ANSI C.
Likewise, Java - love it or hate it - was a complete package with everything you needed to develop *cross-platform* GUI applications.
*Standard* Pascal was about as useful as a mermaid in a chorus line. E.g., ISTR it couldn't even open a data file by name. Now, VAX Pascal, Turbo Pascal, etc. *were* perfectly good for real development, but they all relied on non-standard, proprietary extensions.
Languages succeed when they solve developers’ immediate problems, not because they meet some academic concept of elegance or rigour. That’s how abominations like PHP, Perl or JavaScript get on the “top languages” list - the “better alternatives” at the time didn’t actually solve the developer’s primary need to get something running on the target platform without re-inventing the wheel.
(Score: 2) by ledow on Wednesday November 06, @01:54PM (1 child)
I still program in C - C99 specifically, very occasionally using some gcc extensions - because of this.
I can throw my 20-year-old code at a machine, and it might "warn" about certain newer problems but it will just work.
Similarly I can lob it through Emscripten or similar things and that same code just works in a browser (even including OpenGL, audio and networking nowadays).
I "port" my code, which usually consists of "get a compiler set up, then just build it" to all kinds of platforms - ARM handhelds, x86 Linux machines, x86-64 Windows machines, etc. - and it pretty much "just works".
Having that set "base" protocol was an important thing in C, even if they keep trying to reinvent new ones and move you on. My C99 code still works, still compiles against libraries, is still readable, and platform-independent parts are kept to a minimum.
I've read the latest C standard and to be honest there is nothing in C11, C17 or C23 that interests me enough to forgo the compatibility of C99.
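For what it's worth, the kind of C99 the post describes really is the part that keeps compiling unchanged under C11/C17/C23 compilers. A minimal sketch (the type and function names here are made up for illustration) using fixed-width `<stdint.h>` types and a loop-scoped index, both C99 features that every later standard kept:

```c
#include <stdint.h>

/* Illustrative only: a C99-era struct with fixed-width types. */
typedef struct {
    uint8_t r, g, b;
} rgb;

/* Pack a colour into one 32-bit value; pure C99, still valid C23. */
static uint32_t rgb_pack(rgb c)
{
    return ((uint32_t)c.r << 16) | ((uint32_t)c.g << 8) | (uint32_t)c.b;
}

static uint32_t sum_packed(const rgb *px, int n)
{
    uint32_t total = 0;
    for (int i = 0; i < n; i++)  /* loop-scoped declaration: C99 */
        total += rgb_pack(px[i]);
    return total;
}
```

Designated initializers like `rgb red = { .r = 255 };` are also C99 and have compiled unchanged ever since.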
(Score: 2) by VLM on Wednesday November 06, @06:01PM
Life is much harsher for microcontroller peeps. Things have been getting better. Mostly. It's a lot of fun trying to keep track of which MCU focuses on C first vs C++ first and which versions of each they support. A lot of people write ghetto C++ which is just C code cut and pasted into a minimal (if any) wrapper and compiled as C++. Other than the filename extensions being .cpp, you wouldn't be able to tell they're supposedly C++ programmers. People like that get VERY MAD if someone joins up and starts putting actual C++ OO stuff into the code. WTF bro, we don't do objects here, etc.
Espressif (the ESP32 ESP8266 people, also RISCV) ships a nice C23 compiler. Very nice, IIRC.
AVR AFAIK is or was shipping C++17 only.
Arduino "very recently" used to ship C++11 but C++17 could be enabled and often even worked.
"In the old days" there was something weird I don't remember about platform.io and C11 vs C17. Like they enabled C17 as the default before it actually worked, LOL, or maybe I misremember. Platform.io is like "let's put all the dev systems in Docker but don't call it Docker, and hide all the dockerness from the user because it's supposed to work, even though that makes it impossible to fix when it doesn't work." It works most of the time now, but you'll spend more time making it work than you'll spend appreciating the packaging; it's "high touch", not install-and-go. The good part is once you learn one system/platform, all the other platform.io systems/platforms work the same way, which admittedly is extremely cool. It's the kind of project that would change the world if they invested about 10x as much effort into polishing the rough edges.
Teensy IIRC defaulted to compile with C++14 or you had to force it to C++17.
Anyway, the hardware stuff and OS-ish stuff will have to be rewritten, but something like float c_degrees_to_f_degrees(float c) { ... return f; } will copy-paste and compile unchanged from C99 in most anything today, which is nice.
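Sketching out what that might look like (the function name comes from the comment above; the body is a guess at the obvious implementation):

```c
/* Plain C99 with no platform dependencies, so it copy-pastes anywhere:
   Celsius to Fahrenheit, F = C * 9/5 + 32. */
static float c_degrees_to_f_degrees(float c)
{
    float f = c * 9.0f / 5.0f + 32.0f;
    return f;
}
```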
Funny trivia: all the copyright dates for C17 are in 2018 because everything ships late, so no small number of non-C programmers call it C18. Ha ha, very funny, we like standards so much we created some new ones that are technically identical, LOL. I think IAR had a magazine advertisement claiming C17 and C18 support that got a complete WTF response at a former client. Well, C18 must be better than C17 because it's one newer, so why didn't you use C18? Uh...
Another funny bit of trivia: programmers know C++17 and C17 are separate things, but for several release cycles they've been released in the same year, so you'll get people to this day still insisting that if C++03 exists (which it does) then C03 "must" exist. Not so, LOL.
I'll give you one cool thing: UTF-8 support in C11. That's kind of cool and worth using if your application has any sort of I18N requirements.
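For reference, the C11 additions in question are the u8"" string-literal prefix and the char16_t/char32_t types from `<uchar.h>`; a small sketch (function names are mine, for illustration):

```c
#include <uchar.h>   /* char16_t, char32_t (C11) */
#include <string.h>

/* C11: u8"" guarantees UTF-8 encoding regardless of the execution
   character set. "\u00e9" is e-acute, which is two bytes in UTF-8. */
static size_t utf8_len_of_e_acute(void)
{
    static const char s[] = u8"\u00e9";  /* bytes: 0xC3 0xA9 0x00 */
    return strlen(s);
}

/* A char32_t holds one whole code point (UTF-32). */
static char32_t e_acute_codepoint(void)
{
    return U'\u00e9';
}
```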
As for stuff that's generally not cool but is at least interesting, and I'm probably forgetting stuff:
You can create anonymous structs and unions in C11, which means you can embed an unnamed struct or union inside another one and access its members directly. This causes programmer brain damage and makes my head hurt just thinking about wrestling with the syntax no matter how fancy your IDE is, but it does technically work. The concept seems simple and obvious, yet the syntax looks like old-fashioned line noise. How am I supposed to look at that and not get a headache?
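To make the complaint concrete, here is the C11 feature in question: an unnamed union inside a struct whose members you address as if they belonged to the struct itself (the type name is made up for illustration):

```c
/* C11 anonymous union: 'i' and 'd' are reached directly through the
   struct, with no intermediate member name. */
struct value {
    int is_real;   /* 0 = holds an int, 1 = holds a double */
    union {
        int    i;
        double d;
    };             /* note: no member name here */
};

static int get_int(const struct value *v)
{
    return v->i;   /* not v->u.i -- that's the whole feature */
}
```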
C23 binary literal constants, aka 0b11110000 meaning 0xF0. This is hilarious because the C standards committee refused to add it, so the GCC peeps said F you and added it to GCC anyway "a long time ago". So if you use gcc you probably think 0b10101111 or whatever is in the C language standard, but "only" GCC supported it for "a long time", and the standards committee was like F you right back and refused to add it to the standard despite people using it, due to the usual small-group political BS as I understand it. Now in C23 "every" C compiler is supposed to implement gcc-style 0b binary literals. It's HILARIOUS if you're trying to cut and paste old code from a GCC-using project to a lame compiler that is not C23 and doesn't do the 0b literal format, ha ha very funny. You can always turn a bunch of 0s and 1s into hex digits in your head if you're any good at this stuff, but it is annoying.
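The feature itself, for anyone who hasn't hit it, is just a 0b prefix; the hex-vs-binary equivalence reads like this (mask names are illustrative):

```c
/* 0b literals: standard in C23, a GCC extension long before that. */
enum {
    MASK_HI = 0b11110000,  /* == 0xF0 */
    MASK_LO = 0b00001111   /* == 0x0F */
};
```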
The only cool-ish thing I know about in C23 is that even IEEE 754 gets new revisions, so supposedly the trig functions are using a cooler, more modern IEEE 754.
Totally unimpressive category:
C17 is just C11 2.0, now shipping with bug fixes; I don't really see anything cool or new in it. I suppose there's no reason to use C11, which is functionally C11 1.0, if you have access to C17, which is C11 2.0.
C23 steals nullptr as a concept from C++11. WTF squared; this should be an OSHA violation for everyone involved: the standards designers, the peeps using it, just everyone.
(Score: 2) by VLM on Wednesday November 06, @05:18PM
In the "old days" interop was very important so even if there was a cost to running everywhere, it was seen as a necessary cost.
However, Pascal straddled the end of that era, when only IBM PC / MSDOS compatibility mattered. For example, see the "famous"? Kingdom of Kroz puzzle game series, which was written in Turbo Pascal and AFAIK never released for anything except IBM PC / MSDOS / ANSI.SYS. It was a fun old game when I played it in the 80s and it's still fun; it's kind of a mashup between an action game and a puzzle game in 2D text mode using, I think, EVERY feature of ANSI.SYS, almost like a demo of what text mode could do on the PC. Could the series have been recompiled for literally anything that runs Pascal? Well, yeah, technically, but the concept of write-once-run-everywhere was already dead in the market.
Pretty much every issue Java had a generation (or two) later was first experienced on Pascal P-code systems. "Well, in theory a perfect programmer with perfect libraries and runtime support 'could' run anywhere, but in practice you'll have to custom compile for everything..." Also, EXACTLY like Java, the educational-industrial complex had a HUGE attraction to it, for whatever reason. I hit higher ed JUST as higher ed was abandoning Pascal, which I guess dates me as Gen X, obviously.
It's absolutely ASTOUNDING how fast Pascal disappeared once it was abandoned, way faster than Perl, for example. I would hazard a guess that if you took a computer science class in the 80s, either HS or Uni, you probably had to do a data-structures-like class using Pascal, and it disappeared overnight once academia switched to, IIRC, C/C++ for a while before hitting Java, and now I guess Python?
(Score: 2) by HiThere on Wednesday November 06, @10:03PM (3 children)
There were lots of versions of C, too. I used to use a version of C ("Lifeboat C" IIRC) that allowed assembler code to be written in-line with C code. Definitely not part of the standard.
I think the reason C won was that Unix was written in C, and so were its utilities and the open-source tools and implementations. But it may be that it was easier to write a compiler for C, so many people did. And compilers used to be expensive. The cost of a compiler is the reason I never really got into LISP, though I did get into FORTH. If Neon Forth hadn't died in the conversion to an IBM PC compatible version, I might well have continued with FORTH, though that was a bit late in the game, and Forth was never going to win except in an environment highly constrained in memory.
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
(Score: 2) by theluggage on Thursday November 07, @12:56PM (2 children)
True - but the difference from Pascal lies in the "core" language that nearly all versions built on.
Wirth described an educational language for teaching algorithms and data structures which lacked many features essential for writing real-world code, leaving them to be defined by the implementation.
Kernighan and Ritchie described a completely usable language and a subset of the UNIX standard library functions that could be used - and had been used - to write practical applications and operating-system tools. Even where "enhancements" were added, these were often pulled from Stroustrup's C++ (e.g. function parameter typing) or from well-established Unix libraries.
Obviously, there were exceptions - your inline assembler, for example - but they weren't things you had to use for basics like passing command-line arguments, associating file handles with named files, aligning data structures with OS/hardware-defined parameter blocks, non-trivial string manipulation (OK, the C/UNIX string functions are full of landmines, but at least they existed), etc. Plus, the C preprocessor neatly lets you patch around otherwise compiler-breaking differences without having to maintain multiple source versions.
Portable code never wrote itself - but even today I can write useful C code that will compile and run on both a Mac and a Raspberry Pi (barring a couple of #ifdefs to deal with minor BSD vs. GNU issues).
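The sort of #ifdef in question might look like this (a sketch: __APPLE__ and __linux__ are the usual compiler-predefined macros, while the PLATFORM name and function are made up for illustration):

```c
#include <string.h>

/* Patch around a platform difference with the preprocessor instead of
   maintaining two source trees. */
#if defined(__APPLE__)
#  define PLATFORM "darwin"   /* BSD-flavoured libc */
#elif defined(__linux__)
#  define PLATFORM "linux"    /* GNU libc */
#else
#  define PLATFORM "unknown"
#endif

static const char *platform_name(void)
{
    return PLATFORM;
}
```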
(Score: 2) by HiThere on Thursday November 07, @07:59PM (1 child)
Ummm... C also has its I/O functions in libraries that aren't part of the language. And C was around long before C++. So I don't think that argument works. It was Fortran (and BASIC) that had I/O functions built into the language. Perhaps LISP did too.
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
(Score: 2) by theluggage on Friday November 08, @09:13PM
A substantial set of I/O and utility functions (basically a subset of the Unix standard libraries - so fairly well-defined) were described in K&R edition 1 which, initially, was the de-facto definition of the language... and they were later incorporated into the ANSI standard for C. The practical upshot was that C implementations on microcomputers pretty reliably came with those functions.
But C++ - née "C with Classes" - was already in development before C "went huge" on microcomputers in the early/mid 80s and was (I may be wrong) the origin of changes such as function parameter typing that went into ANSI C. If I'm wrong, then I guess the new features came from ANSI C - similar timeframe, and C compilers were implementing draft ANSI standards long before they were ratified. Anyway, microcomputer C compilers had a clear "standard" to draw on for extensions rather than each going its own way.
(Score: 2, Insightful) by skaplon on Wednesday November 06, @12:21PM
Lazarus is an IDE, not a language, and it *uses* Free Pascal. Together they aim to be an open-source equivalent of Delphi. And while Delphi has Linux support in its most recent incarnations, it does so only in the most expensive packages, and the UI is FireMonkey-only, which is like Qt's QML, while most component suites only support the VCL, which is Windows-only and much like Qt Widgets.
(Score: 3, Interesting) by DannyB on Wednesday November 06, @03:20PM
I loved Pascal. However the major problem which led me to C, then C++ and then Java, was . . .
There was no standard, cross-platform dialect of Pascal. UCSD p-System Pascal, Turbo Pascal, MPW Pascal, Lisa Pascal: all were slightly different dialects, each with incompatible but highly necessary extensions. Without the extensions Pascal was too crippled, yet all of those crutches were incompatible across compilers and platforms.
Eventually, in Java, I could build anything, and it would be cross-platform. Desktop software, web server software, even programs that ran on the original, highly limited 2014 Raspberry Pi -- which had Java out of the box. Example: a desktop GUI program to explore the Mandelbrot set, which I wrote in 2004 (ten years before the Ras Pi), would run unmodified on the Pi. Not even a recompile. Just move the compiled binary to the Pi, and it ran on the desktop GUI perfectly, slowly.
While Java has its warts, and some people hate it (just as is true of Pascal), it obviously did something right. It was constantly in the top two programming languages for popularity for over twenty years, and it still sits in the top three (I haven't checked recently). A language doesn't do that unless it is doing something right for some group of everyday working programmers, regardless of whether you like it or not. I similarly respect other languages I don't like if they are widely used, because . . . they must be doing SOMETHING right.
People who can't distinguish between etymology and entomology bug me in ways I cannot put into words.