
SoylentNews is people

The Fine print: The following are owned by whoever posted them. We are not responsible for them in any way.

Journal by turgid

Memory safe languages are coming back into fashion nowadays.

Back into fashion? They're not new?

That raises the question of why the world's software is built on memory unsafe languages such as C and C++.

In a word, efficiency. If we look back in history a bit to where high level languages were becoming common, we can see why.

My experience with programming begins in the 1980s, so I have no direct experience of how things were prior to that, but I did start out on 8-bit microcomputers with BASIC interpreters.

To give you some context, these machines had CISC CPUs that ran at a few MHz (1 to 4 usually) and took several clock cycles to complete a single machine instruction. The primitive BASIC interpreter lived in a few kB of ROM and was written in machine code.

Those BASIC interpreters were slow, but they were fairly user-friendly. When you declared or even just used a new variable (integer, floating-point or string) for the first time it would be initialised. When you concatenated strings, it happened as if by magic. You could insert and delete substrings. You could take input from the user in strings of arbitrary length that would adjust dynamically. There was no such thing as overflow. Array bounds were checked. The program would stop with an error code if you tried to read or write out of bounds.

In previous decades, compiled languages were developed to run on mainframes and minicomputers. A typical 1970s vintage minicomputer was about as big (RAM) and powerful (speed) as a 1980s home microcomputer. Compilers were complex, arcane and notoriously unreliable: usually buggy and not to be trusted, apart from some very specialised and expensive cases.

Along came languages like C and Pascal. Pascal was nice but considered a "toy"; it needed extensions to be useful. C, arriving in the 1970s, was simple and powerful. A C compiler could run on a relatively small machine (a few tens of kB of RAM) and produce usefully efficient machine code, fast enough that writing in plain machine code or assembly language wasn't really worth it any more. There was also the advantage of portability. The compiler could target different architectures so one piece of source code could compile and run on very different hardware.

This simplicity and efficiency came at a price. The language and the compiler did not support such things as array bounds checking, variables were not automatically initialised and strings were effectively static, not dynamic. Library routines were provided to supply some of that functionality.

The run-time checking (and safety) was eliminated in favour of the programmer being all knowing and all seeing and infallible.

With hindsight, better choices might have been made but remember that the machines of the day were so small and slow that it was the only way to get efficient enough software out the door in a reasonable timeframe.

Most computers were not networked. The Internet wasn't even a thing. Security wasn't so important. A lot of the shortcomings in the languages and their libraries weren't that much of an issue.

Other things did come along but the "good enough" solutions had become entrenched. Ada was one such alternative. It didn't catch on because compilers were expensive (to call a compiler Ada it had to go through an expensive validation process) and there was a bit of a reaction against "strict" languages (for all the wrong reasons).

Programming languages are there to express mathematical concepts. They need to be precise and unambiguous. The human brain is neither. Experience tells us that we should choose languages and compilers that help us reduce ambiguity and express with greater precision that which we wish to achieve.

There is no perfect programming language. It always used to be that you would use your programming language to build up a framework of "subroutines" to help you solve your problem. FORTH was great fun for doing this. LISP looks even better, and is indeed the essence of programming it seems.

Back to my original point: memory safety got sidelined for efficiency. Efficiency was important because forty years ago hardware was so slow and primitive. I haven't been on a slow computer in over ten years. I have computers from over ten years ago that aren't slow.

Efficiency, in compilers or in the runtime, is not a problem any more. Why are we not programming with memory safe languages as a matter of course? What wheels is Rust reinventing?

  • (Score: 2) by Mojibake Tengu on Wednesday February 22, @10:39AM (10 children)

    by Mojibake Tengu (8598) on Wednesday February 22, @10:39AM (#1292983) Journal

    Rust is reinvented Basic.

    Basic was suppressed only politically, as a cultural taboo, but it was an adequate tool for memory safety situations. I consider Edsger Dijkstra an ideologist, fundamentalist and criminal who mentally destroyed several generations of programmers by his fanaticism.

    The problem with languages like Lisp, which heavily use dynamic memory allocation (or any other dynamic resource), is runtime nondeterminism, which is not acceptable (especially in the time domain) for some critical systems. Embedded people who do weapons know the price: the final cost is life. Java is still forbidden in the aerospace industry, I hope.

    Forth is actively used as the FreeBSD bootloader, not dead yet. C++ was safe before templates were invented. Shame on mutilators! For one example, C++ on Arduino (where STL does not fit into the platform because of memory constraints) is still perfectly safe.

    In the near future, expect a comeback, in the form of new 64-bit Forth-like languages on both AMD64 and RISCV-64 platforms, complemented with pure assembly, without the memory compromises typical of current bloatware compilers. We are not alone now, fresh blood is joining us...

    --
    The edge of 太玄 cannot be defined, for it is beyond every aspect of design
    • (Score: 4, Insightful) by hendrikboom on Wednesday February 22, @05:16PM (9 children)

      by hendrikboom (1125) on Wednesday February 22, @05:16PM (#1293048) Homepage Journal

      C++ was safe before templates were invented.

      C++ has never been safe. It has always had the ability to do arithmetic on pointers.

      • (Score: 2) by Mojibake Tengu on Thursday February 23, @05:28AM (8 children)

        by Mojibake Tengu (8598) on Thursday February 23, @05:28AM (#1293105) Journal

        Doing arithmetic on pointers is a perfectly valid concept on any von Neumann architecture and perfectly safe if done correctly. One good strict methodology is restriction to validated accessors only.

        Did you ever use a kitchen knife?

        Would you deny using knives to your offspring? To all future generations?

        Do you think the Rust compiler does not use pointers in its guts? Or the LLVM backend itself, which the Rust runtime is built upon?

        --
        The edge of 太玄 cannot be defined, for it is beyond every aspect of design
        • (Score: 3, Insightful) by RS3 on Thursday February 23, @07:16AM (3 children)

          by RS3 (6367) on Thursday February 23, @07:16AM (#1293115)

          I'm confused- if you do arithmetic on a pointer such that it now points to a memory address which is outside of your program's allotted memory, isn't that the definition of unsafe?

          I know you qualified with "if done correctly", but that can be said of anything. If I'm understanding everything here, the idea is to reduce the many ways that a program can get itself into problems. I.e., it's like anything with safety limits built in.

          • (Score: 0) by Anonymous Coward on Thursday February 23, @06:24PM

            by Anonymous Coward on Thursday February 23, @06:24PM (#1293152)

            If you do arithmetic on ANYTHING it is "the definition of unsafe", because of overflows and, with floats, precision loss too.
            The whole REASON for your existence as a programmer is to do things in such a way that those dangers do not do any harm in the specific conditions you are writing the code for, and to make sure those conditions hold.
            Otherwise, a simple Perl script would have replaced you decades ago.

            Safe handling of pointers is, if anything, much easier than safe math in general, and your mistakes get you a nice visible crash instead of an insidious bullshit result.
            A memory safe language, by saving inept code monkeys from reaping their deserved crop of crashes, merely invites them to go forth and produce other types of bugs in job lots.

          • (Score: 0) by Anonymous Coward on Friday February 24, @06:10AM (1 child)

            by Anonymous Coward on Friday February 24, @06:10AM (#1293214)

            That is one way that memory can be unsafe. But that isn't the point of the GP comment. It is designed to save face and still look smart. That is why they brought up "von Neumann architectures" despite being completely irrelevant to the concept of memory safety or pointer arithmetic (you can do it on other architectures, including the Harvard machine you are running). It is also why they added the "if done correctly" caveat, to puff up that they are the ones who can program safely. Both of which distract from the fact that their "I'm better than you" argument was completely derailed and rendered impotent by someone pointing out it was factually wrong in at least one area.

            • (Score: 0) by Anonymous Coward on Friday February 24, @12:26PM

              by Anonymous Coward on Friday February 24, @12:26PM (#1293230)

              This, dude, is the ChatGPT level of tired bullshit. Yes, there ARE people (and chatbots, apparently) smarter than you. No, they are NOT obligated to protect your fragile feelz.

        • (Score: 1, Insightful) by Anonymous Coward on Thursday February 23, @06:18PM (2 children)

          by Anonymous Coward on Thursday February 23, @06:18PM (#1293149)

          Did you ever use a kitchen knife?

          Time for a car analogy. If you can't drive a car with swing-arms, pointy hood ornaments and steel dashboards, you shouldn't drive a Corvair. Cars are unsafe, roads are unsafe, energy sources are unsafe. Don't let boobs near them.

          If you can't handle writing correct code every day, all day, you are delusional about being a programmer, and should go back to GWBasic without PEEK/POKE.

          Forget about malice, a boob can show you how unsafe any language can be.

          • (Score: 2) by PiMuNu on Monday February 27, @02:09PM (1 child)

            by PiMuNu (3823) on Monday February 27, @02:09PM (#1293528)

            > If you can't handle writing correct code every day

            Just saying, about 50% of the lines of code I write have an error in them. I am saved by the compiler...

            • (Score: 0) by Anonymous Coward on Tuesday February 28, @04:57AM

              by Anonymous Coward on Tuesday February 28, @04:57AM (#1293644)

              As long as it doesn't urp on the first typo, like in the old days. It's amazing how bad a good writer's rough draft can look. Even da Vinci had... never mind, bad example, his sketches would sail through the strictest syntax checker.

        • (Score: 3, Insightful) by hendrikboom on Friday February 24, @04:49PM

          by hendrikboom (1125) on Friday February 24, @04:49PM (#1293261) Homepage Journal

          The point of restricting arithmetic on pointers is that you won't do it by accident, that you will only do it when you actually want and need it and are capable of saying so.

          Of course programs do arithmetic on pointers all the time -- for example, every time you select an element of an array. A "safe" language will do bounds checks, at compile time or run time as appropriate.

  • (Score: 4, Informative) by DannyB on Wednesday February 22, @03:35PM (8 children)

    by DannyB (5839) Subscriber Badge on Wednesday February 22, @03:35PM (#1293020) Journal

    Like you, I began programming about the same time. About 1977 in BASIC. Started using "real" computers in college in 1979.

    There was a debate going on about the pros and cons of using high level languages (eg, FORTRAN, COBOL, etc) vs Assembly. Having taken classes in all of these, the pros and cons were obvious to me. Over time, I noticed that the trend was towards ever higher level languages, to make human programmers more productive. Computers were only getting faster, bigger and cheaper. I had the sense to realize that this trend would continue. My college roommate and I realize now, looking back, that we vastly underestimated how MUCH bigger, faster and cheaper computers would get. We could not have imagined the computers, tablets, smart phones, microcontroller boards, etc of today. And we considered ourselves aware and informed about computers.

    Looking back:

    I've learned that, despite wanting there to be one, there is not, and probably never will be, any one perfect programming language suitable for all purposes. People should stop trying to pursue this. Different tools are good for different things. Yes, you can use a wrench to pound a nail into wood. If there were one perfect programming language, we would all be using it already.

    Economics. Back in the day, computers were very expensive. A good sized minicomputer with 192 K words of memory and 40 MB disk, printer, tape drive could cost close to half a million dollars. The biggest systems could have a megabyte of memory! Even mainframes topped out at a few megabytes of memory.

    Back on point: computers were expensive, humans were cheap. Today it is the opposite. Humans are very expensive, computers are cheap. Back in the day it made sense to optimize for every single cpu cycle and every byte. Today it is the opposite. You should be optimizing for development time. Or rather, business people would say, you should be optimizing for dollars. That makes sense. If I can use a high level language and beat my low level language competitor to market by six months to a year, my manager and I will laugh all the way to the bank while our competitor brags about how efficient his solution is. We can just throw an extra 64 GB of memory into the server and a few extra cpu cores, for less than the cost of a couple months of a human developer plus benefits, and call it a day.

    Opinions about Pascal:

    Starting in the early 1980s and continuing past the year 2000, we made a lot of money developing in the "toy" Pascal language. We used the UCSD p-System. It ran on lots of hardware. It compiled down to p-Code. The p-Code binary was portable to any system running the p-System. (sort of like Java today) We could use compile time switches to turn on or off things like array bounds checking, etc.

    There were some simple well known tricks in Pascal to get a "typeless" pointer and convert between that and an integer. So we could do low level things to the PC hardware in selected places. We could even declare a packed byte array and name it "memory", put zero into that pointer, and now we had a global variable with read write access to any byte in memory, but better and more natural than BASIC's PEEK and POKE.

    We could link our Pascal with Assembly code (that had to be rewritten for different processors). Pascal became Apple's official development tool for quite a few years, so we realized we were on the right track. Turbo Pascal took the PC market by storm.

    In short, we were professional developers and could make Pascal do anything we needed to the low level hardware, yet use a high level type safe language. Yes, we could make the machine crash. We had to know what we were doing. But that is no less the case when writing Assembly. (aside: I was disgusted with writing 8086 code for the PC to get high performance arbitrary rectangular screen scrolling operations because the BIOS was so slow. I found it a pleasure to write 68000 Assembly on Macintosh.)

    In the mid 80's we couldn't buy a "database" system, so we had to build our own. And we did. We all had computer science degrees. Our product could run on Apple II and Apple ///, and Macintosh and IBM PC, and access the same database files on disk. Using the same Pascal code. (But on Mac slightly modified to Apple's native Pascal compiler dialect. To be doing that on a network in the 80's was a big deal at the time.)

    With the rise of Object Pascal, we were also seeing the rise of C and C++. I learned both and used them. We were generally horrified at their lack of type safety. In the mid 1990s I discovered that C++ was so big and difficult to implement that every compiler implemented non-overlapping subsets of the language. Which was bad if you wanted to write cross platform code. Not too long after this, I discovered Java.

    More observations looking back:

    In the early 1990s I was telling friends and predicting that "in ten years all new modern languages would have GC". That turned out to be true. Looking back, almost all new modern languages of the last twenty years have GC, with a few, like Rust, being the exception that proves the rule.

    People underestimate GC in Java in particular. Because Java was an open platform to experiment on, a couple decades of advanced research happened on the platform. Today we have advanced JIT compilers in Java. And GC's that can handle multi-Terabyte memory heaps with 1 millisecond max pause times. These GCs run concurrently on some cpu cores in parallel with the primary work load.

    Observation about GC: GC can LOWER the latency of the primary workload. Time is something you simply cannot buy back. The primary workload can proceed without spending any cpu cycles on memory management: no ref counting, no collection work, just malloc away, and it's cheap. The GC runs on other cpu cores and cleans up in parallel with the primary money-making workload, without impacting it. The overall cost of this is higher, but it actually speeds up the primary workload, and that workload earns more than enough money to pay the freight for the GC. This is easy for business people to measure and analyze. (clue: hardware is cheap and getting cheaper) But it won't stop the C++ guy from complaining about GC.

    There is no one perfect language. Quit looking for it. Quit thinking that YOUR favorite language is it. It's not. I wouldn't use Java to write a device driver or boot loader. But I wouldn't use C++ to build an application either.

    If a language has been among the top languages for a couple decades or more, and is still in the top four or five, then that language is probably doing something right for some group of people. It doesn't make other languages "wrong".

    People seem to get excessively attached to their favorite language even though it is just a tool. For making money. That is, frankly, what computers have ever been about. Despite that, I've had endless pleasure building personal pet projects and toys over a lifetime of programming.

    --
    How often should I have my memory checked? I used to know but...
    • (Score: 3, Informative) by Freeman on Wednesday February 22, @04:48PM (2 children)

      by Freeman (732) Subscriber Badge on Wednesday February 22, @04:48PM (#1293042) Journal

      I played with BASIC as a kid. It was really just copying code from a print book and playing the program on my computer. Still, it was quite interesting and fun.

      The first computer class I took (we'll skip the typing class) was a programming class in high school that taught Java. I believe the introductory course in our University used Java at the time as well, which I also took when I started attending.

      Now, I use Python to do some data manipulation and custom searching with pymarc. I've learned to like the convenience of a decent IDE as well (I use PyCharm).

      I didn't get a job as a programmer or even a job that really utilizes my Computer Information Systems degree. Though, my degree did get me looked at due to the Library needing someone that could speak Information Technology jargon. Knowing how to craft simple code and not be daunted by scripts has been a very good thing.

      --
      Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
      • (Score: 2) by DannyB on Wednesday February 22, @09:45PM (1 child)

        by DannyB (5839) Subscriber Badge on Wednesday February 22, @09:45PM (#1293073) Journal

        Before BASIC, I first was exposed to programming by a friend in my church who had an HP programmable calculator. Sorry, can't remember which one, but this would have been in about 1976. I borrowed the calculator manual from him for a week. Wrote a program, sorry can't remember what. Tried it out and it worked. This was where I suddenly got my first understanding of what these new microcomputers were all about. I was seeing them in Popular Electronics and BYTE magazines. At this point in time a machine with 16K was quite expensive. The "holy trinity" (TRS-80, Commodore PET, Apple II) would not be introduced until 1977.

        My high school got a nice BASIC computer for about $8000. I went crazy learning everything about it.

        I stopped dabbling with electronics, TTL, etc, and never picked up a soldering iron again. Now I don't even remember which end of a soldering iron to pick up.

        When I went to college, the big computers were tiny by comparison to a modern laptop.

        --
        How often should I have my memory checked? I used to know but...
        • (Score: 3, Interesting) by RS3 on Thursday February 23, @07:43AM

          by RS3 (6367) on Thursday February 23, @07:43AM (#1293118)

          In pre-school (I'm kidding- it was jr. high) a friend had a login to a nearby large city's school system computer. It was maybe 1976 or so. We used a dumb terminal and acoustic coupler. We were in nerd heaven when it would connect at 300 baud (rather than 110, ugh). It was a CDC 7600. Not sure if we were really logged in to the CDC, or some kind of smaller system that would feed jobs to the 7600. Anyway, you were basically (pun intended) in a BASIC shell, so we wrote simple stuff, ran someone's "Star Trek" game, copied and edited it, etc.

          Being a hardware engineer I've always liked the hardware, esp. analog stuff. But I also like machines doing useful work, so early on I found "peek" and "poke" and THAT interested me big time. Now I could make the computer do hardware stuff. In fact, over the years I've found the PC parallel port very very useful for controlling things.

          I remember going with my parents to someone's home where they were "multi-level-marketing" selling the "Timex Sinclair". It was a beautiful large old farmhouse, and there were a dozen or so people listening to the spiel. I was the only person to notice a mouse (furry rodent kind) dart across the floor and under a couch. My parents didn't buy into the Sinclair.

          In college, mid-late 80s, we had IBM system 390, and some other kind of system that you logged in to, edited jobs, submitted JCL to the 390, got results, and/or printed them on the insanely fast line printers. I remember I really liked XEDIT [wikipedia.org], but haven't even looked into it since then.

          I've known several people who had programmable HP calculators. Somewhere I have one someone was tossing, but I never programmed one. IIRC it has little magnetic strips you could save your program onto.

    • (Score: 2, Insightful) by Anonymous Coward on Wednesday February 22, @11:51PM (4 children)

      by Anonymous Coward on Wednesday February 22, @11:51PM (#1293084)

      Minor nitpick: Rust is garbage collected. It is true that it does not have a tracing garbage collector, but it does do compile-time escape analysis combined with other GC tricks and limiting the use of objects to handle garbage for you.

      Another thing worth mentioning is that you rightly point out that some garbage collectors do pause (but some don't), but modern pauses are limited (ours are measured in microseconds, predictable, and once warm we don't pause at all thanks to proper tuning) and occur on separate threads. However, C and C++ can have pauses too when implicit/explicit destruction of objects causes an avalanche of frees and deletes which all block your thread from doing anything useful and can be quite expensive depending on your target environment. People like to imagine C's and C++'s memory management as being low-cost and GC's being high-cost when it can be exactly the opposite depending on the situation and definition of "cost."

      • (Score: 3, Insightful) by DannyB on Thursday February 23, @04:16PM (3 children)

        by DannyB (5839) Subscriber Badge on Thursday February 23, @04:16PM (#1293141) Journal

        We may simply have a terminology problem here. I think of garbage collection (GC) as one kind of memory management. Rust's escape analysis and borrow checking is another kind of memory management. C and Pascal style new/dispose operations done manually is another kind of memory management. C++ style smart pointers with reference counting is another kind of memory management.

        Management of dynamic memory is the problem. There are several solutions:

        • automatic runtime garbage collection
        • manual new/dispose memory management -- more bookkeeping for the programmer
        • reference counting combined with manual new/dispose
        • Rust style escape analysis and borrow checking at compile time
        --
        How often should I have my memory checked? I used to know but...
        • (Score: 1, Insightful) by Anonymous Coward on Friday February 24, @12:23AM (2 children)

          by Anonymous Coward on Friday February 24, @12:23AM (#1293195)

          That is what most people think nowadays: conflating garbage collection with tracing garbage collection or other "exotic" types. One of my colleagues claims the reason is that there has been a stigma against garbage collection for so long that people want to claim they don't do it, and thus redefined the terms. There are two major divisions: manual memory management, where the programmer is responsible for handling that part of memory, and automatic memory management, where either the target or the implementation handles that part of memory. As already stated, automatic memory management itself has two divisions: that handled by the target and that handled by the implementation. The automatic memory management done by the implementation is garbage collection. All methods of doing so are included under the banner of garbage collection.

          Because handling memory in a completely manual fashion, especially in a multitasking environment, can be a PITA and error prone, an increasing amount of automatic memory management was desired. A number of techniques were invented including region-based, tracing, segments, stacks, heaps, counting, etc. Some of those became the purview of the target, others were built into the implementation, some require cooperation between the two, and in some the division of responsibility depends on how much the target supports it (or does not support it).

          One side effect of that proliferation of automatic memory management and the further separation between the abstract machine and the physical machine is that the garbage collection done by the implementation disappeared from most people's mental model. Even something as "manual" as assembly or C on modern machines can do a surprising amount of garbage collection, especially on targets that don't support automatic memory management in the way modern environments do. You can always pretend that automatic variables are allocated in the stack, the abstract machine in C certainly does, but the real story of how the memory is laid out could be very different.

          So coming down to it, most language implementations do not have a tracing garbage collector or other "exotic" form of GC. So while they don't have GC in the increasingly colloquial sense, they are garbage collected in the technical sense.

  • (Score: 4, Informative) by hendrikboom on Wednesday February 22, @05:11PM (15 children)

    by hendrikboom (1125) on Wednesday February 22, @05:11PM (#1293046) Homepage Journal

    There are two niches for unsafe languages:

    (1) when even a microsecond's delay for garbage collection (or synchronising with a garbage collector on another core) is too long for real-time response.

    (2) for implementing type-safe languages.

    Some safe languages have designated (and explicitly marked) unsafe corners for such things. This makes them compatible subsets of unsafe languages.

    • (Score: 2) by DannyB on Wednesday February 22, @09:47PM (1 child)

      by DannyB (5839) Subscriber Badge on Wednesday February 22, @09:47PM (#1293074) Journal

      Unsafe languages are perfect when you are working close to the hardware but would prefer to be just a bit above Assembly language. Using C, you can "see" every single cpu cycle. There is a character in C that represents every cpu cycle; even a pointer access is visible in the code.

      --
      How often should I have my memory checked? I used to know but...
      • (Score: 2) by hendrikboom on Friday February 24, @04:33PM

        by hendrikboom (1125) on Friday February 24, @04:33PM (#1293254) Homepage Journal

        True. The times one has to work close to the hardware are rare nowadays, but they still exist and are important.
        The majority of the work done in C or C++ does not need this low-level work.

        -- hendrik

    • (Score: 1, Insightful) by Anonymous Coward on Thursday February 23, @12:10AM (12 children)

      by Anonymous Coward on Thursday February 23, @12:10AM (#1293085)

      Those are not necessarily true. Many of our real-time systems use Erlang and BEAM. There are also hard real-time Java implementations and hard real-time Ada implementations with GC. There are plenty of situations where garbage collection can work with real-time systems, even hard ones, and the memory management in C/C++ can also cause missed deadlines. But the solution in both cases is proper engineering, not knee-jerk avoidance of entire categories just because some instances are bad.

      And there are plenty of type-safe and memory-safe languages implemented in type-safe and memory-safe languages or self-hosted.

      • (Score: 2) by turgid on Thursday February 23, @07:57AM (11 children)

        by turgid (4318) Subscriber Badge on Thursday February 23, @07:57AM (#1293119) Journal

        I work on systems, in C, where all the memory management is done up-front by the compiler, at compile time. There's no malloc() or free() and no newfangled features like arrays whose sizes are decided at runtime.

        • (Score: 3, Insightful) by DannyB on Thursday February 23, @04:34PM (10 children)

          by DannyB (5839) Subscriber Badge on Thursday February 23, @04:34PM (#1293143) Journal

          This is how early computers worked. No dynamic memory management. Languages didn't have any concept of it.

          The reason dynamic memory management appeared was to be able to solve more sophisticated problems.

          Personal example:

          I started with BASIC. Proud and filled with Dunning-Kruger confidence, I thought I could do anything! I started looking at some AI books and more interesting problems.

          How would I make a BASIC program that could play Tic-Tac-Toe against a human? One book explained it very simply. You start with the current board, look at all possible moves, and at the new game boards generated by those moves. Then you repeat (recursively): for each of those boards, consider all the possible moves the opponent could make in response, and the new game boards that would result.

          This approach is called Minimax. (and there is an Alpha-Beta pruning optimization I won't go into here.)

          In BASIC I realized I didn't have any simple way of doing any of this.

          When I learned Pascal with dynamic memory management and recursion, I suddenly realized that I could easily write code to solve this problem. The language provided me with the necessary tools.

          If I were building a cash register, or a computer controlled drill press, or an ATM, I would definitely not need any dynamic memory management or recursion. But for some real world problems those things are a must have.

          If you think you don't need them, then you end up reinventing them. I recognized that in BASIC I could "invent" dynamic memory management by having a large preallocated set of parallel arrays for the dynamic data structures of the game boards. I could turn recursion inside out into iteration and manage all of the 'stack frames' manually in these arrays.

          All this is doing is falling victim to Greenspun's Tenth Rule. [wikipedia.org]

          Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp.

          This is what I imagined doing in BASIC without even knowing what Pascal or Lisp was yet. I recognized it as a big software development problem just to play games. Something that becomes a simple problem when using a higher level programming language.

          Less than a decade after learning Pascal, I learned Lisp and suddenly realized that it was now easy to solve even more sophisticated problems that would have required huge effort in Pascal.

          Wisdom: use the right tool for the right job.

          It is possible to pound a nail into wood using an adjustable wrench.

          --
          How often should I have my memory checked? I used to know but...
          • (Score: 1, Insightful) by Anonymous Coward on Friday February 24, @07:55AM (4 children)

            by Anonymous Coward on Friday February 24, @07:55AM (#1293220)

            As I read more of your comments, I become more convinced that you should write a book.

            • (Score: 2) by Tork on Friday February 24, @08:25PM (3 children)

              by Tork (3914) on Friday February 24, @08:25PM (#1293275)
              This thread, including your comment, is such a beautiful example of why I love the users on this site. I'm just sad I don't have the programming chops to contribute to the conversation, but man I am enjoying reading through it.
              --
              Slashdolt Logic: "25 year old jokes about sharks and lasers are +5, Funny." 💩
              • (Score: 1, Insightful) by Anonymous Coward on Saturday February 25, @01:52AM (2 children)

                by Anonymous Coward on Saturday February 25, @01:52AM (#1293309)

                Everyone has something to contribute; everybody has a unique point of view. The real difference is whether you have the confidence to state it and the panache to deliver it.

                • (Score: 1, Funny) by Anonymous Coward on Sunday February 26, @07:06AM (1 child)

                  by Anonymous Coward on Sunday February 26, @07:06AM (#1293417)

                  Everyone has something to contribute; everybody has a unique point of view.

                  I don't.

                  • (Score: 0) by Anonymous Coward on Sunday February 26, @08:29PM

                    by Anonymous Coward on Sunday February 26, @08:29PM (#1293464)

                    Say goodnight, Ari!

          • (Score: 2) by turgid on Saturday February 25, @01:16PM (3 children)

            by turgid (4318) Subscriber Badge on Saturday February 25, @01:16PM (#1293361) Journal

            My greatest disappointment in life was at university being told to write some code in FORTRAN-77 for a project (pure, no extensions for portability) having learned (to Dunning-Kruger level) BASIC, FORTH, C, assembly and MODULA-2 at home myself. I needed to implement linked lists and trees in FORTRAN arrays, allocated at compile time. "Don't worry, we have a machine with 128MB and we will probably eventually run your code on the Cray." How I wailed and gnashed my teeth. FORTRAN-77 is like BASIC only worse. Many a painful evening was spent in front of a DEC VT320. I even used emacs in those days.

            • (Score: 2) by DannyB on Sunday February 26, @10:18PM (2 children)

              by DannyB (5839) Subscriber Badge on Sunday February 26, @10:18PM (#1293473) Journal

              One "ah ha" moment I had in FORTRAN was when I realized that all of the variables in a FORTRAN function were LOCAL in scope to that function! Being Dunning-Krugered with BASIC, I was accustomed to all variables being global, with only the one single global scope. I had no concept of separately declared functions with locally scoped parameter and variable names, completely independent of the scope of the code which called the function.

              Of course, Pascal had this same feature.

              --
              How often should I have my memory checked? I used to know but...
              • (Score: 2) by turgid on Sunday February 26, @10:28PM (1 child)

                by turgid (4318) Subscriber Badge on Sunday February 26, @10:28PM (#1293476) Journal

                Yes, my dad told me all about structured programming, stacks and local variables. I had played about with C, BASIC compilers (seriously), Modula-2 and all sorts of things, but it was a real shock to see how backward FORTRAN-77 was. I mean, Algol, Simula and Pascal and all sorts of other languages had come along in the meantime. FORTRAN-77 was wedded to punched cards: the source was 80 columns, the first few columns were reserved for labels (c.f. BASIC line numbers and assembly language labels) and so on. It felt like programming on someone's home-made "thing." It was atrocious.

                • (Score: 3, Interesting) by DannyB on Monday February 27, @06:01PM

                  by DannyB (5839) Subscriber Badge on Monday February 27, @06:01PM (#1293549) Journal

                  Yep.

                  I even did FORTRAN IV on an IBM 1130 for a while. A brief while. But I'm glad I did. I used an 029 and an 026 keypunch long enough to get good at it.

                  A bigger minicomputer I used had FORTRAN 77. I learned that weird machine's assembly language and even wrote a few programs in it for my own amusement.

                  To anyone who can recognize the message "Vulcan Quiescent" I could elaborate more meaningfully.

                  --
                  How often should I have my memory checked? I used to know but...
          • (Score: 2) by hendrikboom on Saturday March 04, @12:38AM

            by hendrikboom (1125) on Saturday March 04, @12:38AM (#1294394) Homepage Journal

            I once implemented garbage collection in Fortran, doing memory allocation in a large array.
            Did Greenspun's Tenth Rule apply?
            Only trivially.
            The program I was implementing *was* a Lisp interpreter. And this was years before Common Lisp was invented.

  • (Score: 3, Interesting) by hendrikboom on Wednesday February 22, @05:13PM (7 children)

    by hendrikboom (1125) on Wednesday February 22, @05:13PM (#1293047) Homepage Journal

    It is possible to implement a safe version of Pascal by implementing checks for all the restrictions in Wirth's language definition. But to my knowledge, no one has.

    • (Score: 3, Informative) by DannyB on Wednesday February 22, @09:55PM (6 children)

      by DannyB (5839) Subscriber Badge on Wednesday February 22, @09:55PM (#1293075) Journal

      In UCSD p-System Pascal, in the early and mid 80s, you could turn compile-time flags on and off. Thus you could compile your code without array-bounds checking, string-length checking and other things. Then you could go right ahead and do unsafe things, and not necessarily get a runtime error, if you fully understood what you were doing to the machine.

      If you knew the structure of a file control block (part of the I/O system, and it was documented) you could declare a Pascal RECORD with exactly that layout. Create a pointer type to that record and a variable of that pointer type, and, with some trickery, initialize that pointer from any chosen integer address. In short, you now had a pointer to a type-safe record exposing all of the internal elements of a file I/O control block, and could subvert how the I/O system worked. The kind of thing you expect to do only in C. However, we knew what we were doing and did this sort of thing in Pascal as necessary. It didn't have any runtime cost. That trickery of assigning an integer into a pointer can take various forms in source code, but it all comes down to a single machine instruction to initialize the pointer to some arbitrary place in memory.

      We could have certain Pascal UNITs (eg, a "library") compiled unsafe, and do all our unsafe operations inside there, but only provide nice "safe" public interfaces to those debugged and tested routines.

      --
      How often should I have my memory checked? I used to know but...
      • (Score: 3, Insightful) by hendrikboom on Friday February 24, @04:42PM (4 children)

        by hendrikboom (1125) on Friday February 24, @04:42PM (#1293258) Homepage Journal

        Yes. I remember that system. I once taught a class of beginners using that implementation on a PDP-11. It was fun.

        The tricks you mention have since been explicitly supported and formalized in subsequent so-called "systems" languages, such as Modula-3. Specific modules are marked UNSAFE, and inside them you can do such tricks when essential. These modules usually constitute only a small part of the program.

        In Pascal you accomplished the conversion of an integer to a pointer using variant records (Pascal's equivalent of unions). If you read Wirth's Pascal language definition carefully, though, you'll discover that this is not allowed. But no compiler I know of actually checks that the record variant you pull the pointer out of is the same as the variant you put the integer into.

        In Modula 3 and similar systems languages you do get such checks except where you specifically disable them.

        • (Score: 2) by DannyB on Sunday February 26, @10:11PM

          by DannyB (5839) Subscriber Badge on Sunday February 26, @10:11PM (#1293470) Journal

          Yes, UNION was one of the tricks to have a pointer and integer overlay one another in a RECORD.

          I am going from ancient memory here. We could declare an external assembly language function that accepted an untyped VAR argument and returned an integer. So you could pass ANYTHING as the argument, which arrived as an address (hence the VAR), and the one- or two-instruction assembly routine simply returned that address as an integer. Now we had our own magic address function. Call it Addr(VAR something). To get the address of anything, simply assign it to an integer:

          fileCtrlBlkAddr: INTEGER;

          fileCtrlBlkAddr := Addr( ...something... );

          On the various few systems we ran on we had to recreate that tiny assembly language function. However that enabled a lot of low level trickery that was safely hidden behind nice safe interfaces of tested units.

          --
          How often should I have my memory checked? I used to know but...
        • (Score: 2) by DannyB on Sunday February 26, @10:12PM (2 children)

          by DannyB (5839) Subscriber Badge on Sunday February 26, @10:12PM (#1293471) Journal

          We dearly wished for Modula 3. We never really found a compiler for it.

          --
          How often should I have my memory checked? I used to know but...
          • (Score: 3, Informative) by hendrikboom on Saturday March 04, @12:42AM (1 child)

            by hendrikboom (1125) on Saturday March 04, @12:42AM (#1294398) Homepage Journal

            See links in the Wikipedia article. [wikipedia.org]

            • (Score: 2) by DannyB on Sunday March 05, @04:45PM

              by DannyB (5839) Subscriber Badge on Sunday March 05, @04:45PM (#1294625) Journal

              Thanks.

              At the time, those were very ancient times. Before the Web. Before America Online. Before we had access to the internet (before being acquired by a big company). Back in the days of dial-up CompuServe.

              What we needed was compilers for our target systems: IBM PC, Macintosh, and Apple ///. (We had dropped the Apple ][ due to memory expansion limitations.)

              If we had found our dream Modula-3 compilers, there would still have been the question of how much software to rewrite.

              As I matured, I realized that you never rewrite a substantial code base unless there is a really good reason to.

              --
              How often should I have my memory checked? I used to know but...
      • (Score: 2) by turgid on Friday February 24, @10:57PM

        by turgid (4318) Subscriber Badge on Friday February 24, @10:57PM (#1293291) Journal

        This was one of the things that impressed me about Pascal compilers back in the day: that you could compile in things like bounds checking. I never saw a C compiler that could do that. I think I have waxed lyrical before about Modula-2 (Pascal's successor), which really made me understand encapsulation and strong typing, but anyway...

  • (Score: 3, Interesting) by Beryllium Sphere (r) on Sunday February 26, @06:35PM

    by Beryllium Sphere (r) (5062) on Sunday February 26, @06:35PM (#1293453)

    The inspiration for Rust was when its inventor had to climb more than 20 flights of stairs because the microcontroller on his building's elevator had crashed.

    That's an example of a niche in which computing resources are limited. Others include spacecraft, where radiation-hard hardware is generations behind state of the art and power budgets are cruelly tight.

    Auto makers use embedded computers that passed qualification years ago.

    There are also situations where you need to wring the last bit of performance out of well-spec'ed hardware. Correct me if I'm wrong, but I don't think there are major gaming engines written in Python.

    I've only started learning Rust but it looks like its virtues are not limited to memory safety. I suspect they're exaggerating when they talk about "fearless concurrency" but anecdotally Mozilla got something multithreaded running in Rust after two failures in C++.

    What makes me curious is that there are web services implemented with Rust. There you'd expect time to market to be the most powerful design force with adding hardware being an easy option.

  • (Score: 3, Interesting) by RamiK on Sunday February 26, @10:25PM

    by RamiK (1813) on Sunday February 26, @10:25PM (#1293475)

    As noted in this recent reddit post [reddit.com], Go is basically Oberon-2 with C tokens: https://www.youtube.com/watch?v=0ReKdcpNyQg&t=1070s [youtube.com]

    --
    compiling...