posted by hubie on Tuesday July 01, @03:57AM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

The US Cybersecurity and Infrastructure Security Agency (CISA) and the National Security Agency (NSA) this week published guidance urging software developers to adopt memory-safe programming languages.

"The importance of memory safety cannot be overstated," the inter-agency report [PDF] says.

Memory safety refers to the extent to which programming languages provide ways to avoid vulnerabilities arising from the mishandling of computer memory. Languages like Rust, Go, C#, Java, Swift, Python, and JavaScript support automated memory management (garbage collection) or implement compile-time checks on memory ownership to prevent memory-based errors.

C and C++, two of the most widely used programming languages, are not memory-safe by default. And while developers can make them safer through diligent adherence to best practices and the application of static analysis tools, not everyone deploys code with that much care.

To further complicate matters, code written in nominally safe languages may still import unsafe C/C++ libraries using a Foreign Function Interface, potentially breaking memory safety guarantees.

[...] Google and Microsoft have attributed the majority of vulnerabilities in large software projects to memory safety errors. In Google's Android operating system, for example, 90 percent of high-severity vulnerabilities in 2018 came via memory safety bugs. In 2021, the Chocolate Factory noted that more than 70 percent of serious security issues in Chromium came from memory safety flaws.

The infamous Heartbleed flaw in the OpenSSL cryptographic library was the result of a memory safety error (an out-of-bounds read) in C code. And there are many other examples, including the mid-June Google Cloud outage, which Google's incident report attributes to a lack of proper error handling for a null pointer.

Within a few years, the tech industry began answering the call for memory-safe languages. In 2022, Microsoft executives began calling for new applications to be written in memory-safe languages like Rust. By 2023, Consumer Reports – a mainstream product review publication – published a report on memory safety and government officials like Jen Easterly, CISA's director at the time, cited the need to transition to memory-safe languages during public appearances.

The memory safety push created some turmoil in the Linux kernel community over the past year, as efforts to integrate Rust-based drivers met resistance from kernel maintainers. And it has alarmed the C/C++ communities, where developers have been busily trying to come up with ways to match the memory safety promises of Rust through projects like TrapC, FilC, Mini-C, and Safe C++.

The CISA/NSA report revisits the rationale for greater memory safety and the government's calls to adopt memory-safe languages (MSLs) while also acknowledging the reality that not every agency can change horses mid-stream.

[...] A recent effort along these lines, dubbed Omniglot, has been proposed by researchers at Princeton, UC Berkeley, and UC San Diego. It provides a safe way for unsafe libraries to communicate with Rust code through a Foreign Function Interface.

This is exactly the sort of project that CISA and the NSA would like to see from the private sector, particularly given pending budget cuts that could depopulate CISA by a third.

While the path toward greater memory safety is complicated by the need to maintain legacy systems and the fact that MSLs may not be the best option for every scenario, the government's message is clear.

"Memory vulnerabilities pose serious risks to national security and critical infrastructure," the report concludes. "MSLs offer the most comprehensive mitigation against this pervasive and dangerous class of vulnerability."


Original Submission

This discussion was created by hubie (1068) for logged-in users only.
  • (Score: 3, Funny) by c0lo on Tuesday July 01, @04:41AM (6 children)

    by c0lo (156) Subscriber Badge on Tuesday July 01, @04:41AM (#1408960) Journal

    I mean, if your program and your data are the same, you can't mishandle them.

    'sides, LISP programmers and blackhat hackers should have zero overlap on a Venn diagram; I doubt today's youngsters have the attention span to decode CADR((((...)))) expressions

    --
    https://www.youtube.com/@ProfSteveKeen https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 5, Funny) by DannyB on Tuesday July 01, @03:24PM (5 children)

      by DannyB (5839) Subscriber Badge on Tuesday July 01, @03:24PM (#1409011) Journal

      The problem with Lisp is this. Lisp was created back in a time (1959) when there was a great overabundance of parentheses, when they were very inexpensive.

      Today, like most other things, parentheses are in short supply. The national strategic stockpile of parentheses is at historically, hysterically low levels due to poor management and a lack of conservation efforts. It is so bad that some languages need an airlift and airdrop shipment of parentheses. Some languages *cough* also need a major shipment of curly braces to denote scope.

      If you have extra boxes of parentheses at your desk, then you are lucky enough to be able to program in Lisp.

      --
      The server will be down for replacement of vacuum tubes, belts, worn parts and lubrication of gears and bearings.
      • (Score: 3, Funny) by c0lo on Tuesday July 01, @07:06PM

        by c0lo (156) Subscriber Badge on Tuesday July 01, @07:06PM (#1409034) Journal

        Blame techbros for increased demand on parentheses.
        Other than that, they are easy to make, even at hobbyist level; boring as a hobby, but I can sacrifice half a day over a weekend to make enough for a quarter (the curlies are a bit more complicated to make, but one can find enough of them second-hand)

        --
        https://www.youtube.com/@ProfSteveKeen https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 2) by sonamchauhan on Wednesday July 02, @05:33AM (2 children)

        by sonamchauhan (6546) on Wednesday July 02, @05:33AM (#1409080)

        Whose fault was it not switching to ((parenthesis)v6) ?

        • (Score: 2) by Tork on Wednesday July 02, @03:51PM (1 child)

          by Tork (3914) Subscriber Badge on Wednesday July 02, @03:51PM (#1409126) Journal

          Whose fault was it not switching to ((parenthesis)v6) ?

          To be fair, nobody expected mass adoption of home colligators; we may never need v6.

          --
          🏳️‍🌈 Proud Ally 🏳️‍🌈
          • (Score: 2) by sonamchauhan on Friday July 04, @09:02AM

            by sonamchauhan (6546) on Friday July 04, @09:02AM (#1409292)

            Hahah -- and Unicode joined the rescue party as well...

            ⁽ ⁾₍ ₎⦅ ⦆❨ ❩⟮ ⟯()

      • (Score: 2, Interesting) by khallow on Wednesday July 02, @10:32PM

        by khallow (3766) Subscriber Badge on Wednesday July 02, @10:32PM (#1409173) Journal
        Modern oligopolies control the supply of parentheses and suppress knowledge of viable and often superior alternatives. For example, how did the Soviets manage to keep up with the Western world for so long despite their grossly inefficient production of parentheses? Answer: they used the powerful soul technique of reverse Polish notation (it was originally Polish notation, but was reversed in the Peoples' Congress of 1935 when Poland rebuffed Soviet overtures.)

        As computers became more vital to Soviet prosperity, their need for alternatives to parentheses grew. Fortunately, RPN was well positioned to deliver, and the prescient meeting organizers were honored posthumously in 1958 (having been executed in some 1938 purges). One can only imagine what could have happened to the world markets for Western letters and punctuation, if these guys had continued to research alternatives to the Capitalist typographic quagmire rather than exhibit regrettable treasonous sympathies for enemies of the People.
  • (Score: 3, Funny) by corey on Tuesday July 01, @05:13AM

    by corey (2202) on Tuesday July 01, @05:13AM (#1408963)

    In the same voice as the bikie at the start of Terminator 2: “You fergut ta say Ada.”

  • (Score: 5, Insightful) by bzipitidoo on Tuesday July 01, @05:14AM (11 children)

    by bzipitidoo (4388) on Tuesday July 01, @05:14AM (#1408964) Journal

    You want good code? Then don't hire bad programmers. Don't push your programmers, good or bad, to code as fast as possible. Don't use the "lines of code" metric to measure programmer performance. The stuff that is produced when the team is deathmarching in great haste will be horrible.

    Now, who manages programming badly and tries to make up for the bad planning by pushing the programmers to the breaking point? Could it be, oh, commercial vendors such as MS?

    • (Score: 5, Insightful) by jb on Tuesday July 01, @08:16AM (2 children)

      by jb (338) on Tuesday July 01, @08:16AM (#1408974)

      You want good code? Then don't hire bad programmers.

      Couldn't agree more. As the old saying goes, If you pay peanuts you get monkeys. Those who take the art and science of software engineering seriously don't need "memory safety".

      Don't use the "lines of code" metric to measure programmer performance.

      Why ever not? Just remember to set the sign bit when doing so. The number of lines of code a maintenance programmer removes from the code base (with zero loss of reliability, security or required functionality) is a fairly accurate metric for his real world performance. Green fields projects are a different kettle of fish (perhaps requirements met over LOC added times a language-specific constant might be a better proxy there), but those are much rarer today than they used to be.

      • (Score: 2) by pkrasimirov on Tuesday July 01, @10:01AM (1 child)

        by pkrasimirov (3358) Subscriber Badge on Tuesday July 01, @10:01AM (#1408980)

        Include in this count the everlasting config creep, endless xml/json/yaml/ini, make them auto-generated for bonus horror points. Oh wait, let's not even look at auto-generated code. The only time I ever saw a Java interface file of 300000 bytes was an auto-generated abomination of an interface with an inner class with boiler-plate methods. Ooh the horror! All that in what was called Application Server bloatware.

        Delete.

        • (Score: 2) by jb on Wednesday July 02, @07:46AM

          by jb (338) on Wednesday July 02, @07:46AM (#1409092)

          Include in this count the everlasting config creep, endless xml/json/yaml/ini, make them auto-generated for bonus horror points.

          Indeed. And for extra bonus points the two characters that every postmaster used to fear most: m4

    • (Score: 3, Informative) by JoeMerchant on Tuesday July 01, @01:59PM (1 child)

      by JoeMerchant (3937) on Tuesday July 01, @01:59PM (#1409005)

      Nah, it's all about the training wheels.

      Put 'em in Rust and force them to avoid those proven unsafe, vulnerability-prone design patterns.

      (of course, once Rust has 30 years of development history, it will accrue a new stable of vulnerability prone design patterns so large and complex that a return to C++ where the vulnerabilities were more easily identified may be hailed as the only hope...)

      Meanwhile "data driven" "procedural enforcement" of stepping over the known potholes is the way that clueless leadership feels like they are in control.

      It reminds me of tales of "certified secure" military projects where the toolset was restricted to a small set of "certified" tools, use of anything else ist strengstens verboten. So, the developers spend 10x the effort re-inventing wheels inside their clean room, taking that effort away from the real things the project said it was supposed to be investigating. Now, the real desired results can vary from: budget inflation, providing metrics about how expensive it is to do secure research and development, project torpedoing: showing that after $20M of R&D effort we made zero progress toward the goal, so the $200M main budget can be looted for another project, etc. etc.

      --
      🌻🌻🌻 [google.com]
      • (Score: 3, Interesting) by bzipitidoo on Tuesday July 01, @03:39PM

        by bzipitidoo (4388) on Tuesday July 01, @03:39PM (#1409013) Journal

        I was a defense contractor for 2 years, I've seen their ridiculous restrictions. It's as you say. They cling to old versions of Windows because it takes them far too long to certify new versions. They were sticking with Windows 2000 when Windows XP was over a year old. Don't know what they're doing now, but I'd guess they're all on Windows 10 (or older) and refusing to update to Windows 11. The worst case was the use of telnet despite ssh having been around for 10 years at that point. They used a phone call to securely tell me the password they had assigned me for the telnet sessions, LOL.

        When asked why they won't even look at Linux, they claim that Windows is more trustworthy because it is Made In America. That's just an excuse. The real reason is that they feel more comfortable with Windows.

    • (Score: 5, Insightful) by DannyB on Tuesday July 01, @03:40PM (5 children)

      by DannyB (5839) Subscriber Badge on Tuesday July 01, @03:40PM (#1409014) Journal

      You want good code? Then don't hire bad programmers.

      It's not bad programmers. Even good programmers introduce these bugs. They are inevitable because the language allows it.

      The solution is to use a memory safe language that makes it impossible to express these vulnerabilities. It shouldn't be possible to have a buffer overflow. This is actually a solved problem.

      There are other problems too, like format string errors: using printf with only one parameter, like printf("hello whirrled"), or even very much more worser printf("hello there "+yourName );

      It's like nobody wanting to face up to the real problem with school shootings: the fact that kids can get [access to] guns easier than they can get candy.

      Now I'm not saying we should banish C. I think it has a place, like microcontroller firmware, boot loaders, device drivers, and possibly OS kernels. However the higher up the chain we get, the less appropriate it seems to be.

      Don't push your programmers, good or bad, to code as fast as possible. Don't use the "lines of code" metric to measure programmer performance.

      Those are also excellent points. But they don't address the real problem nobody wants to talk about. C is not a memory safe language and it is probably impossible to make it into one.

      --
      The server will be down for replacement of vacuum tubes, belts, worn parts and lubrication of gears and bearings.
      • (Score: 3, Interesting) by https on Tuesday July 01, @07:19PM (1 child)

        by https (5248) on Tuesday July 01, @07:19PM (#1409037) Journal

        Because something is allowed does not make it inevitable.

        How C is taught has a huge impact. `free' isn't mentioned in K&R until almost the last page of the last chapter, and I managed to pass my first C course without understanding it at all. Memory use is not (yet) part of basic algorithms.

        --
        Offended and laughing about it.
      • (Score: 5, Insightful) by turgid on Tuesday July 01, @08:08PM

        by turgid (4318) Subscriber Badge on Tuesday July 01, @08:08PM (#1409044) Journal

        C is not a memory safe language and it is probably impossible to make it into one.

        There's no "probably" about it. It's not memory safe and it is not possible to make it into one. It would no longer be C. C is being used for things way beyond its original purpose, and C++ was a mistake. It made the bad things worse and deferred the good things asymptotically to the future.

      • (Score: 4, Insightful) by jb on Wednesday July 02, @08:14AM (1 child)

        by jb (338) on Wednesday July 02, @08:14AM (#1409093)

        There are other problems too, like format string errors. Using printf with only one parameter, like printf("hello whirrled"),

        Oh, come on. Anyone who's been around more than 5 minutes would use fputs(3), or depending on context perhaps even write(2), instead.

        or even very much more worser printf("hello there "+yourName );

        If your compiler is any good at all, that should error out: pointer arithmetic and string literals don't mix.

        It's like nobody wanting to face up to the real problem with school shootings: the fact that kids can get [access to] guns easier than they can get candy.

        C programmers don't kill people; script kiddies pretending to be C programmers kill people.

        Now I'm not saying we should banish C. I think it has a place, like microcontroller firmware, boot loaders, device drivers, and possibly OS kernels. However the higher up the chain we get, the less appropriate it seems to be.

        Larry Wall famously wrote "real programmers can write assembly code in any language." [Camel book, p. 543 in the 3rd edition]

        However, the reverse is also true, in fact somewhat truer: real programmers can write high level code in any language ... and once you're used to it writing well structured high level code in a low level language tends to be more productive than fighting tooth-and-nail with a high level language to get usable access to low level constructs.

        There's also great commercial benefit in being able to tell clients "the buck stops here" and really mean it: bugs in direct-to-client releases of my code are my responsibility to fix, at my cost. They're rare enough to have no appreciable impact on the bottom line. But you just can't do that with a huge stack of third party dependencies, as is the case with most of these "modern" languages.

        • (Score: 2) by DannyB on Wednesday July 02, @01:28PM

          by DannyB (5839) Subscriber Badge on Wednesday July 02, @01:28PM (#1409108) Journal

          real programmers can write high level code in any language ... and once you're used to it writing well structured high level code in a low level language tends to be more productive than fighting tooth-and-nail with a high level language to get usable access to low level constructs.

          We've been trying "just program good" for decades and failing. Humans are unfortunately just human and make Miss. Stakes. Imagine if a language simply made it impossible to express some of the worst and most common problems. And you don't usually ever need low level constructs in high level application programs. And when you do, there ARE usually already good solutions, including ways to have low level access.

          --
          The server will be down for replacement of vacuum tubes, belts, worn parts and lubrication of gears and bearings.
  • (Score: 2) by PiMuNu on Tuesday July 01, @05:31AM (2 children)

    by PiMuNu (3823) on Tuesday July 01, @05:31AM (#1408966)

    IIRC it's more memory safe than C, but you can still make memory errors?

    • (Score: 2, Informative) by shrewdsheep on Tuesday July 01, @07:26AM (1 child)

      by shrewdsheep (5215) Subscriber Badge on Tuesday July 01, @07:26AM (#1408970)

      There is unsafe code in Rust, which must be explicitly declared (hardware access, low-level optimizations, interactions with foreign code). For the rest, memory lifetime and accessibility are strictly guaranteed. To nitpick, it doesn't prevent bit-flips, so technically, yes, Rust is not memory safe.

      • (Score: 3, Interesting) by JoeMerchant on Tuesday July 01, @02:25PM

        by JoeMerchant (3937) on Tuesday July 01, @02:25PM (#1409008)

        What I find in my early Rust learning is: lots of guardrails. You want to access that there? Ho, ho - that requires a different declaration. You want to call this function? Its parameters are a lot more fussy about what you pass in than something like C or Fortran.

        One ugly construct that became common in Qt/C++ in the last 20 years is: fn(const QString &s). Back in Qt 4.3 days, you'd just call fn(QString s) and be done with it, but... most functions don't need to make a deep copy of s, so you can get speedier code (nanoseconds per instance, in practice) by passing a constant reference to the object instead... all the static analysis tools got hard-ons for recommending the speedup - and it usually doesn't cause any problems to start with (const Object &x) and revise to simple (Object x) when you _do_ want to modify inside the function, instead of making an explicit copy... I just don't like all the extra boilerplate in the function declarations.

        Rust? It all looks like ugly extra un-necessarily complex and confusing boilerplate to me. I'm sure I could find a happy subset of Rust to code in that isn't so messy to my eyes, but the problem is: even doing simple things I seem to be pulling in 200+ crates, and all those interfaces need to be met on their own terms, and so far it looks to me like the developers of the various common crates out there are all trying out every niche and corner of the Rust syntax, resulting in a highly diverse set of interface syntax. Which brings me back to something I like about Qt: it covers a LOT of ground with a single design philosophy. Sure, I despise QtQuick, so I still don't use it, but within the C++ library they at least have maintained a reasonably consistent design subset of the broader C++ lexicon.

        --
        🌻🌻🌻 [google.com]
  • (Score: 0) by Anonymous Coward on Tuesday July 01, @08:02AM (12 children)

    by Anonymous Coward on Tuesday July 01, @08:02AM (#1408971)

    there are many other examples, including the mid-June Google Cloud outage, which Google's incident report attributes to a lack of proper error handling for a null pointer.

    This will not be fixed by Rust et cetera. This was (likely) not a memory-corruption bug, and was likely the result of Go language code -- memory-safe code.

    Even Javascript, a perfectly memory-safe language, has null pointers. Any time you set a variable = null and try to treat it as an object, you've committed a null-pointer dereference -- and you get an exception for it. Same as Go.

    I wish people who don't know what they're talking about wouldn't be part of the discussion. Sigh.. so many times, people who don't know learn-as-they-go, causing damage all the way.

    • (Score: 1, Insightful) by Anonymous Coward on Tuesday July 01, @08:05AM (3 children)

      by Anonymous Coward on Tuesday July 01, @08:05AM (#1408972)

      It's the first call for null-free languages!

      All Herald BASIC!

      • (Score: 3, Touché) by theluggage on Tuesday July 01, @08:21AM (1 child)

        by theluggage (1797) on Tuesday July 01, @08:21AM (#1408975)

        But POKE is probably not memory safe… :-)

        • (Score: 2) by DannyB on Tuesday July 01, @03:59PM

          by DannyB (5839) Subscriber Badge on Tuesday July 01, @03:59PM (#1409019) Journal

          Before you go POKE ing something, you should do some PEEK ing Peking first.

          --
          The server will be down for replacement of vacuum tubes, belts, worn parts and lubrication of gears and bearings.
      • (Score: 1, Interesting) by Anonymous Coward on Tuesday July 01, @11:55AM

        by Anonymous Coward on Tuesday July 01, @11:55AM (#1408986)

        You joke, but in my IT travels (though, more properly, that should be travails) over the decades I've come across two unrelated organisations where all their bespoke 'critical' system code was written in BASIC, code which had migrated from mainframes, through minis to the eventual desktop PCs, where I had the dubious pleasure of finally meeting with it in all its antique (but still serviceable) glory.

        Then there was that place 26 years ago with the bespoke inventory system written in some database software's BASIC dialect running on a Sun server - funny story, the system was mostly Y2K compliant...mostly, that is, apart from the copy protection/licensing code that the muppet (friend of one of the PHBs) they'd contracted (no paperwork) to write (no source code, as there was no real contract) the damn thing had introduced on the fly into the thing (and, once I finally tracked him down, for which he wanted £16,000 to fix the problem, he was told, to quote The Big Yin, 'Gettifuyabassa').

        There's probably a lot of 'legacy' BASIC code still lurking out there quietly doing its thing, and I don't want to even begin to think about the amount of 'Visual' stuff...

    • (Score: 2) by RamiK on Tuesday July 01, @08:44AM (2 children)

      by RamiK (1813) on Tuesday July 01, @08:44AM (#1408976)

      Rust's compiler forbids the use of uninitialized variable bindings: https://doc.rust-lang.org/std/ptr/index.html [rust-lang.org]

      e.g. Even null function pointers are initialized as zero: https://doc.rust-lang.org/std/ptr/fn.null.html [rust-lang.org]

      --
      compiling...
    • (Score: 2) by janrinok on Tuesday July 01, @09:02AM (2 children)

      by janrinok (52) Subscriber Badge on Tuesday July 01, @09:02AM (#1408977) Journal

      likely the result of Go language code -- memory safe code.

      If it was Go, then I am willing to bet that it was a programmer who didn't handle an error code. Almost 'everything' returns an error value, which should be checked, and which is usually nil to show that no error occurred. My IDE flags up any errors that are not tested and handled. But if you intentionally write software that does not handle the error, then that is on the programmer, not the language.

      --
      [nostyle RIP 06 May 2025]
      • (Score: 2) by janrinok on Tuesday July 01, @09:03AM

        by janrinok (52) Subscriber Badge on Tuesday July 01, @09:03AM (#1408978) Journal

        *does not handle

        --
        [nostyle RIP 06 May 2025]
      • (Score: 2) by pkrasimirov on Tuesday July 01, @12:12PM

        by pkrasimirov (3358) Subscriber Badge on Tuesday July 01, @12:12PM (#1408988)

        And it could be made much simpler with exceptions, their whole existence is exactly to avoid the need to check every op error code.

    • (Score: 2) by DannyB on Tuesday July 01, @03:42PM

      by DannyB (5839) Subscriber Badge on Tuesday July 01, @03:42PM (#1409015) Journal

      The language should have pointer types which can be null, and pointer types which cannot be null. Now this can be checked by the compiler at both the point where a pointer is assigned and returned from a function, and at the point where a pointer is used with or without checking for null first.

      --
      The server will be down for replacement of vacuum tubes, belts, worn parts and lubrication of gears and bearings.
    • (Score: 2) by JoeMerchant on Tuesday July 01, @07:50PM

      by JoeMerchant (3937) on Tuesday July 01, @07:50PM (#1409040)

      >lack of proper error handling for a null pointer.

      See, that's what I like about pointers in C - you've got a function that accepts a pointer from doG knows where? If there's not a clean null check on that pointer, or especially a smart pointer, or whatever - flag that. Assume somebody in the future _will_ be calling that function with a null pointer and deal with it, at least by not crashing.

      So, yeah, many of my functions start off with a set of: if (spx == nullptr) { LOG_ERROR("spx is NULL"); return; } if (spy == nullptr) { LOG_ERROR("spy is NULL"); return; } etc. but how easy is that to understand what it's doing? How long do those checks take? How certain are we that a nullptr is not going to cause a problem here? I developed that habit because a lot of my code runs fine in steady state, but during startup or shutdown sometimes "all the things" aren't ready or available when certain things get triggered. And, when they're not - there's no problem not doing whatever the function might do if they were ready, so just skip it, there will be a "everything is ready" signal coming along soon enough to trigger all the things to do their startup dance. But not if something triggers this handler before all of its components are ready. Should I spend hours and hours attempting to stop all the early triggers? I don't think so, particularly when some of the triggers are coming from actors outside of my control. Instead of trying to filter all the triggers (and doing the null checks in there...) just have a simple function with simple null checks and be done.

      --
      🌻🌻🌻 [google.com]
  • (Score: 1, Insightful) by Anonymous Coward on Tuesday July 01, @12:16PM (15 children)

    by Anonymous Coward on Tuesday July 01, @12:16PM (#1408989)

    Automatic memory management is not the saving grace for "safe" memory. Hire better programmers. Stop using garbage like Agile. No more cowboy coding.

    • (Score: 4, Insightful) by DannyB on Tuesday July 01, @03:55PM (12 children)

      by DannyB (5839) Subscriber Badge on Tuesday July 01, @03:55PM (#1409017) Journal

      The only real solution is to fix it in the language. Don't you think we've been trying to fix this with "good programming" for decades already, and still failing?

      Simply make it IMPOSSIBLE to express these errors in the language.

      Make it impossible to have a buffer overflow (or have it throw an exception instead).

      This is not going to go over well here, but . . .

      USE GARBAGE COLLECTION. (This almost sounds like an earlier post to use Lisp.)

      GC solves the 3 most common bugs that have cost BILLIONS and BILLIONS of dollars!

      1. Not deallocating something when you no longer have a reference to it.
      2. Deallocating something TWICE (or more)
      3. Deallocating something, but having other copies of that pointer, which later get used to make modifications to what is now deallocated and now possibly re-allocated memory.

      These three, especially the third, are VERY DIFFICULT bugs to find and fix. GC solves them very neatly.

      I would also point out the efficiency of GC. The money-making threads on a busy server have to pay the freight for GC -- and they do. But those threads don't see a single CPU cycle of memory management (other than malloc, which is very efficient as long as there is memory available, which the GC works to ensure). The GC threads run on other CPU cores that are not running the primary workload. That cost must be paid, and it is. However, the money-making threads run faster because they spend no memory management CPU cycles.

      I know there will be a chorus of voices saying "smart pointers". These do not handle cycles. Also, when something goes out of scope and frees a single pointer, an avalanche of deallocation can actually take place -- all inline with the primary workload code!

      I know people get set in their ways. But this IS a solved problem. I also understand that GC is not appropriate for some uses where C shines, boot loaders, firmware, microcontrollers, device drivers, etc.

      ----------

      Now there are other problems that need to be solved, like format strings being used as if they are just plain strings. This is an insidious problem when some user input can be sent as a format string, or the well-known problem of inserting it into, say, a SQL query in raw form. Some of these, I don't know a good language-level solution for.

      --
      The server will be down for replacement of vacuum tubes, belts, worn parts and lubrication of gears and bearings.
      • (Score: 3, Insightful) by hopdevil on Tuesday July 01, @04:11PM (4 children)

        by hopdevil (3356) on Tuesday July 01, @04:11PM (#1409023) Journal

        Honestly have to disagree with "where C shines". There are a few.. exceedingly few.. cases where you need to get to the level of managing memory yourself. It should be thought of as an exception to the rule and there better be a damn good reason not to use something like Rust instead.

        • (Score: 2) by DannyB on Tuesday July 01, @04:22PM (3 children)

          by DannyB (5839) Subscriber Badge on Tuesday July 01, @04:22PM (#1409025) Journal

          I can't disagree with that.

          However I do recognize that any popular language, whether I like it or not, is popular for a reason. It must be doing something right.

          If there were a perfect language for all possible uses, we would already be using it.

          And to the question of why we don't just create one perfect programming language (I'll get right on that after lunch!): the problem is that the design constraints on languages pull in different, often conflicting directions. Some languages want integers to be unlimited in size, with this invisible to the programmer. Want to take ten thousand factorial? No problem! You get back a simple (but very big) integer value. Other languages want an integer to be a machine word. The conflicting constraints go on and on. GC or not? Weak or strong typing? (I type at 70 wpm.) Homoiconicity or not? Etc.

          So if a language is popular, it probably is so for some good reason(s) and I try not to complain too loudly about it. Popularity means it solves a particular set of problems, or has a particular fitness for certain problems. Whether I like some languages or not is irrelevant. Of course not everyone is as open minded.

          --
          The server will be down for replacement of vacuum tubes, belts, worn parts and lubrication of gears and bearings.
          • (Score: 3, Interesting) by hopdevil on Tuesday July 01, @04:37PM

            by hopdevil (3356) on Tuesday July 01, @04:37PM (#1409029) Journal

            It is a bit handwavy, but I can't fully disagree. The issue is that popularity has a momentum of its own: people continue to do a thing because everyone else is doing it (and has done so for decades), not because it is necessarily a good thing.

            C has done its job well, but programming languages have evolved over time. The new C standards have tried to keep up, but they are held back by 1) adoption and 2) the simple fact that C is forced to fill too many roles it is ill-suited to fill. I advocate that people use the right tool/language for the job and, importantly, one that ensures the project will be maintained in the long run. Again, C isn't the right language (with very strict exceptions).

            I'm not any kind of language nazi, I'm not even a programmer. But I see *bad* code on a daily basis.. and bad C code is catastrophic when it results in a security vulnerability; bad Rust won't compile.

          • (Score: 2) by JoeMerchant on Tuesday July 01, @07:55PM (1 child)

            by JoeMerchant (3937) on Tuesday July 01, @07:55PM (#1409043)

            My measure of C and C++'s success?

            Most of the compilers/interpreters for the fancy languages du jour are written in C or C++.

            --
            🌻🌻🌻 [google.com]
            • (Score: 3, Insightful) by DannyB on Tuesday July 01, @09:15PM

              by DannyB (5839) Subscriber Badge on Tuesday July 01, @09:15PM (#1409055) Journal

              What can I say.

              C is, and has long been, the portable "assembly language" for all computing platforms.

              So a new compiler might best be initially written in C, and generate C. Although nowadays we have LLVM.

              --
              The server will be down for replacement of vacuum tubes, belts, worn parts and lubrication of gears and bearings.
      • (Score: 2) by JoeMerchant on Tuesday July 01, @07:52PM (6 children)

        by JoeMerchant (3937) on Tuesday July 01, @07:52PM (#1409041)

        It is impossible to fix all errors with language. Example: I am lying.

        No matter how smart you think your language design is, a more clever idiot will come along and do something screwed up that you didn't anticipate.

        If you can anticipate these things well enough to design a language preventing them, you can check for them with static analysis.

        --
        🌻🌻🌻 [google.com]
        • (Score: 2) by DannyB on Tuesday July 01, @09:24PM (5 children)

          by DannyB (5839) Subscriber Badge on Tuesday July 01, @09:24PM (#1409056) Journal

          Bad programming can be done in any language.

          However a language design can try to eliminate problems. The sharp edges of C are, IMO, pointer manipulation, no array bounds enforcement, null terminated strings, old library functions with no length checking.

          And consider this. The fact that the following wonderful optimizations are even possible should send one running and screaming into the night.

          Sneak the following optimizations into your friend's header files.

          // Speed up loops!
          #define while if
          // Use less memory!
          #define struct union

          Those "optimizations" shouldn't even be possible. But I dare you to hide them somewhere in a friend's header files.

          --
          The server will be down for replacement of vacuum tubes, belts, worn parts and lubrication of gears and bearings.
          • (Score: 2) by JoeMerchant on Tuesday July 01, @10:14PM (4 children)

            by JoeMerchant (3937) on Tuesday July 01, @10:14PM (#1409061)

            >The sharp edges of C are, IMO, pointer manipulation, no array bounds enforcement, null terminated strings, old library functions with no length checking.

            And my answer to all of those are: newer library functions.

            Use smart pointers where appropriate. Do strings through the library, and along those same lines use library containers instead of raw arrays.

            Anyone who feels the need to use raw C arrays / pointer indexing / etc. has volunteered for the BIG code review party.

            We had this discussion 10 years back when attempting to coerce some C# adherents to move to Qt - and it's not about the language, it's about the libraries you use in it.

            --
            🌻🌻🌻 [google.com]
            • (Score: 2) by DannyB on Tuesday July 01, @10:45PM (3 children)

              by DannyB (5839) Subscriber Badge on Tuesday July 01, @10:45PM (#1409063) Journal

              I don't disagree with you. (Except that pointer manipulation is baked into the language, and there is existing code that uses it.)

              I have to say, the things you describe building, sound to me like you're turning C++ into Java -- just by using greatly improved libraries.

              The only thing you're missing is Garbage Collection (GC), which is of huge benefit. A few years back I wrote some things here on SN about GC (in my journal, even) mentioning the then-upcoming ZGC, which could handle 16 TB heaps (yes, you read that right) with 1 ms max pause times. Here we are today, and ZGC is a standard production option in Java. You can still select other GCs, such as the Parallel GC, which is great for throughput but lousy for latency, if that better describes your workload. The 16 TB max heap limitation was also removed a few years back, 2017 I believe. Now the only practical limits are the user address space on Linux (128 TB, from googling a few years back; I don't know the Windows limits) and physical memory: the biggest machine I know of, one of IBM's Z-series mainframes, maxes out at 40 TB.

              Also, Java came with the original Raspberry Pi (2014) that had 512 MB of memory and one core, and I wrote programs that ran on that Java on an original Pi. So it does scale down to small systems -- just not to what most people call "microcontrollers".

              --
              The server will be down for replacement of vacuum tubes, belts, worn parts and lubrication of gears and bearings.
              • (Score: 3, Interesting) by JoeMerchant on Tuesday July 01, @11:15PM (2 children)

                by JoeMerchant (3937) on Tuesday July 01, @11:15PM (#1409065)

                My primary experience with GC is in the most inappropriate place imaginable: on Raspberry Pi Picos in python. They are extremely resource constrained and GC is NOT a good idea there, but... it's python... and the python dev environment on Picos is 10x more developed and user friendly than the C/C++ environment that's also available, but just not fun to work with.

                If I were ever doing more than one-off projects on the Pico, I'd likely bite the bullet and write them in C/C++. But if the python is good enough for my one-offs (which it usually is, when GC isn't screwing me up), I don't see the need for more development pain just to have my processor feel less abused.

                In the multi-TB world, yeah, I can see GC making sense if you're putting together systems that use those kinds of resources - micromanaging all of it doesn't make sense. Most of what I do professionally falls into the 500MB-1GB realm, and most of the "big" data users in there are pretty orderly / pre-determined / not making me feel the absence of GC...

                So many use cases, no one solution will ever be optimal for all.

                --
                🌻🌻🌻 [google.com]
                • (Score: 2) by DannyB on Wednesday July 02, @01:18PM (1 child)

                  by DannyB (5839) Subscriber Badge on Wednesday July 02, @01:18PM (#1409107) Journal

                  What I do involves many server instances that are typically in the 24 GB range with plenty of cores. Not TB. Just one of these servers easily handles hundreds of active users, using different databases, but a single Tomcat instance on java, without breaking a sweat. But it's nice to know that Java scales both up and down.

                  It's nice but also not nice that Java does require the resources of an OS, which makes it unsuitable for microcontrollers, sadly. But does work on the Pi quite well.

                  --
                  The server will be down for replacement of vacuum tubes, belts, worn parts and lubrication of gears and bearings.
                  • (Score: 2) by JoeMerchant on Wednesday July 02, @02:22PM

                    by JoeMerchant (3937) on Wednesday July 02, @02:22PM (#1409117)

                    That's another point about the software we write: single user, on a single terminal doing a highly focused task...

                    --
                    🌻🌻🌻 [google.com]
    • (Score: 2) by turgid on Tuesday July 01, @08:20PM (1 child)

      by turgid (4318) Subscriber Badge on Tuesday July 01, @08:20PM (#1409051) Journal

      Agile's not garbage. Many implementations of it are, however.

      • (Score: 2) by DannyB on Tuesday July 01, @09:25PM

        by DannyB (5839) Subscriber Badge on Tuesday July 01, @09:25PM (#1409057) Journal

        In some cases All qualifies as Many.

        --
        The server will be down for replacement of vacuum tubes, belts, worn parts and lubrication of gears and bearings.
  • (Score: 3, Interesting) by Username on Tuesday July 01, @01:21PM

    by Username (4557) on Tuesday July 01, @01:21PM (#1408996)

    If the nsa wants it, it probably compiles some kind of backdoor into your programs.

  • (Score: 2) by turgid on Tuesday July 01, @08:11PM (1 child)

    by turgid (4318) Subscriber Badge on Tuesday July 01, @08:11PM (#1409047) Journal

    Why don't they stop whinging and actually make one of these "memory safe" languages instead of assuming someone else will do it or the C++ people will put it into C++29 or something?

    • (Score: 3, Insightful) by DannyB on Tuesday July 01, @09:27PM

      by DannyB (5839) Subscriber Badge on Tuesday July 01, @09:27PM (#1409059) Journal

      I saw some YouTube videos (don't remember now) about this a month or so ago.

      There are people working hard on these problems.

      The general conclusion seemed to be that once you make C or C++ memory-safe, the result is no longer compatible and thus is really a different language.

      --
      The server will be down for replacement of vacuum tubes, belts, worn parts and lubrication of gears and bearings.
  • (Score: 2) by turgid on Tuesday July 01, @08:19PM (3 children)

    by turgid (4318) Subscriber Badge on Tuesday July 01, @08:19PM (#1409050) Journal

    Two or three years ago, the last time I was interviewing people for C programming jobs, I was absolutely horrified by the level of cluelessness that apparently degree-educated software engineers with years of embedded experience exhibited.

    The first mistake my employer made was to make an exclusive contract with an external recruiter. This recruiter had unlimited access to an enormous pool of underpaid staff at overseas consulting companies. These people ostensibly had an education and several years of professional experience doing embedded C development on such things as anti-lock braking systems for several very famous car manufacturers.

    It became apparent that very few of them understood the difference between static and automatic variables, local variables, arrays, strings, the stack, the heap and issues such as endianness.

    One guy was adamant that you could declare an array, sized automatically on the stack inside a function, just by declaring a pointer like int *foo;. But then he waffled about having to be careful not to use up too much memory in the array, because your program would crash.

    • (Score: 3, Insightful) by DannyB on Tuesday July 01, @09:32PM (2 children)

      by DannyB (5839) Subscriber Badge on Tuesday July 01, @09:32PM (#1409060) Journal

      It is frightening that old guys like me understood this way back in the 1980s.

      Oh, and in the mid 1990s, I did write in C++ using MetroWerks Code Warrior:
      1. A Mandelbrot explorer program (on classic Macintosh, with GUI, using their "PowerPlant" class library).
      2. A classic and minimal BASIC interpreter that was portable to at least Mac and Windows.

      I'm not completely ignorant of C and C++, but I simply have had no use for it in my work ever, and my personal amusement use of it was decades ago.

      --
      The server will be down for replacement of vacuum tubes, belts, worn parts and lubrication of gears and bearings.
      • (Score: 2) by turgid on Wednesday July 02, @07:15AM (1 child)

        by turgid (4318) Subscriber Badge on Wednesday July 02, @07:15AM (#1409088) Journal

        When I started to learn C in the 1980s it was exciting because it was a powerful compiled language that produced very efficient machine code even on the very small machines of the day. It was also what the mighty Unix was written in. It was cool.

        There were other languages about and there were some promising directions in programming language research and development. There were "safe" languages like Ada, but their compilers were very expensive and they "didn't let you do stuff." It was apparent that computers would get faster and so higher-level languages would become more suitable since the run-time overheads would become far less significant. In fact, there came a point in the late 1990s where it became feasible to write code in LISP. See the essays of Paul Graham.

        C++ came along and became very fashionable simply because it was originally quite compatible with C and added on classes so you could do OOP. I'll leave it there because I will go off on a rant.

        • (Score: 2) by DannyB on Wednesday July 02, @01:36PM

          by DannyB (5839) Subscriber Badge on Wednesday July 02, @01:36PM (#1409109) Journal

          That pretty much describes my experience. Coming from Pascal, where I COULD do C-like things with pointers simply by declaring a few tricky things that don't actually generate any code but make pointer manipulation trivially possible, I got interested in C and C++ for the reasons you describe. I also was a huge fan of Common Lisp and played with it for about six years.

          --
          The server will be down for replacement of vacuum tubes, belts, worn parts and lubrication of gears and bearings.