
SoylentNews is people

posted by Fnord666 on Thursday February 28 2019, @02:55PM   Printer-friendly
from the hello-entropy dept.

The National Vulnerability Database (NVD) is a US government-funded resource that does exactly what the name implies: it acts as a database of vulnerabilities in software. It operates as a superset of the Common Vulnerabilities and Exposures (CVE) system, which is run by the non-profit Mitre Corporation with additional government funding. For years it has been good enough; while any organization or process has room to be made more efficient, curating a database of software vulnerabilities reported through crowdsourcing is a challenging undertaking.

Risk Based Security, the private operator of the competing database VulnDB, aired its grievances with the public CVE/NVD system in its 2018 Vulnerability Trends report, released Wednesday. Its charged conclusions include "there is fertile grounds for attorneys and regulators to argue negligence if CVE/NVD is the only source of vulnerability intelligence being used by your organization," and "organizations are getting late and at times unreliable vulnerability information from these two sources, along with significant gaps in coverage." This criticism is neither imaginative nor unexpected from a privately owned competitor attempting to justify its product.

In fairness to Risk Based Security, there is a known time delay in CVSS scoring, though the report overstates the severity of the problem: an empirical research report finds that "there is no reason to suspect that information for severe vulnerabilities would tend to arrive later (or earlier) than information for mundane vulnerabilities."

https://www.techrepublic.com/article/software-vulnerabilities-are-becoming-more-numerous-less-understood/


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by Freeman on Thursday February 28 2019, @03:13PM (3 children)

    by Freeman (732) on Thursday February 28 2019, @03:13PM (#808140) Journal

    With the increased usage of software, there are bound to be even more vulnerabilities, so that's not terribly surprising. "Less understood" implies that the general public had a clue in the first place. I see no reason why a vulnerability would be harder to understand for those who need to fix it or protect against it. With all of the recent security breaches, it should also be a lot easier for their boss, and their boss's boss, to understand the need for security. Though perhaps I'm missing something.

    --
    Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
    • (Score: 4, Insightful) by JoeMerchant on Thursday February 28 2019, @03:43PM

      by JoeMerchant (3937) on Thursday February 28 2019, @03:43PM (#808158)

      I see no reason why a vulnerability would be harder to understand for those that need to fix it or protect against it.

      As compared to the days when, say, the STONED floppy boot sector virus was the major threat, I'd say that modern systems are more complex by a couple of orders of magnitude and that does make them harder to understand, fix and protect.

      --
      Ukraine is still not part of Russia. https://www.newsweek.com/russian-state-tv-ukraine-war-dirty-bomb-putin-1754428
    • (Score: 0) by Anonymous Coward on Thursday February 28 2019, @03:55PM

      by Anonymous Coward on Thursday February 28 2019, @03:55PM (#808161)

      Many programming languages/runtimes have package managers that have grown up around them. These package managers can pull in mountains of code to any given project. That certainly makes security more problematic and opaque even to developers.

    • (Score: 5, Insightful) by RamiK on Thursday February 28 2019, @04:44PM

      by RamiK (1813) on Thursday February 28 2019, @04:44PM (#808181)

      Some new exploits are targeting hardware details that aren't openly documented or even available to the driver developers. For Intel ME you at least had a few outside developers, working for motherboard OEMs and NIC makers, who read Intel's docs and even saw some of their sources. And there were open implementations of the specs around, so people had a pretty good idea what was going on in general. Then you had those ARM cores found on x86 server boards providing the management features that admins were messing with on a daily basis... Those were running Linux and serving HTML, so some poking around got you quite far. But nowadays you have the microcontrollers on everything from your hard disk to your motherboard's SATA and USB switches being targeted, since they've gradually come to run on increasingly generic compute cores as process nodes kept shrinking. How many people even know the instruction sets for those, let alone the software? And how many of those who do work for antivirus companies?

      --
      compiling...
  • (Score: 3, Insightful) by DannyB on Thursday February 28 2019, @03:38PM (23 children)

    by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @03:38PM (#808153) Journal

    Languages are not the ONLY area to focus on for increased security.

    Yes, I'm talking about C, and some of its conventions like null terminated strings, and functions like strcpy that don't accept a parameter for the absolute maximum target length.

    While I wholeheartedly agree that C is good for certain uses (portable assembler, microcontrollers, bootloaders, and maybe OS kernels), it is NOT a good choice for writing applications. I would even go so far as to include command line applications and utilities.

    I say that despite the vast bulk of code that has been written in it. I say it in light of decades of hindsight.

    No language / runtime system is perfect. But we should at least be able to have security considered in the design. It should not be possible to have string overflows, buffer overflows, or stack overwriting. The length of a string is an inherent property of the string, not something determined by a sentinel. Yes, I know the sentinel seemed clever back when every byte and cpu cycle mattered. Welcome to the 21st century.
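    To make the strcpy complaint concrete, here is a minimal C sketch (function names are illustrative) contrasting an unbounded copy with one that takes the destination size as a parameter:

```c
#include <stdio.h>
#include <string.h>

/* The classic C string hazard: with strcpy() the destination size is not
 * a parameter, so the length of the *source* silently decides how many
 * bytes get written past dst. */
void unsafe_fill(char *dst, const char *src) {
    strcpy(dst, src);   /* writes strlen(src)+1 bytes, whatever dst's size */
}

/* A bounded copy: the caller states the absolute maximum target length.
 * snprintf() never writes more than n bytes and always NUL-terminates. */
void safe_fill(char *dst, size_t n, const char *src) {
    snprintf(dst, n, "%s", src);
}
```

    With the bounded version, the worst case is silent truncation rather than overwritten memory.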

    Just as type-safe languages prevent or detect at compile time problems that would only show up at runtime, building some security safety into languages and libraries would make it harder to commit blunders. When doing a code review of a library, one should not have to treat each function as a puzzle of ways it could be exploited. The language could naturally enforce constraints that prevent overwriting memory.

    Even if there were a minor runtime cost, such as bounds checking, isn't it worth it? (Modern compiler tech is pretty advanced, and bounds checks often get promoted to the beginning of a loop instead of within the loop, just as an example.) Wouldn't you pay a bit to have more peace of mind?

    The whole principle of trying to make systems more secure surely includes re-examining our programming languages.

    --
    I get constant rejection even though the compiler is supposed to accept constants.
    • (Score: 4, Insightful) by JoeMerchant on Thursday February 28 2019, @03:48PM (9 children)

      by JoeMerchant (3937) on Thursday February 28 2019, @03:48PM (#808160)

      Counterpoint: after 30 years of commercial use the vulnerabilities in C are well documented, well understood, and relatively easy to check for.

      The latest garbage (collecting) language may claim to automatically take care of all your problems, but such languages inevitably create new problems in the process, and simply due to their immaturity those problems are less well identified and understood.

      I'm not saying that C is the right language for everyone. I am saying that a proficient C programmer can produce safer code in C than a less proficient programmer using a "safer" language, including that same C programmer being tossed into a new language.

      --
      Ukraine is still not part of Russia. https://www.newsweek.com/russian-state-tv-ukraine-war-dirty-bomb-putin-1754428
      • (Score: 3, Insightful) by DannyB on Thursday February 28 2019, @05:49PM (7 children)

        by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @05:49PM (#808218) Journal

        I'm not sure why you bring up GC. I don't know of vulnerabilities caused by GC alone. GC systems are not exactly immature. Research on Java GC goes back 20+ years. Research on GC in general goes back to the 80s and probably earlier.

        Now as for problems versus vulnerabilities, yes. Higher level languages bring new problems. And taking a similar approach to what I suggested for C, those should be avoided.

        Example: Java and other languages get this temptation to introduce what I'll call 'format strings'. What I mean by that is like the formatting of printf() using %s and similar. But not just for printing, but even for processing strings. Now there's nothing wrong with that. But then the formatting languages go too far and it starts to become possible to execute code in some sense and affect the operation of the program. Sort of like SQL injection, if an attacker knows to introduce the right attack into the right field that goes through some format processor, you've got a problem. Now I think the REAL problem here is that programmers should treat raw user input as untrusted and not process it through functions that might have undesirable results. But maybe there are ways to help enforce that.

        Suppose a language had separate types such as String, SqlString, HtmlString, etc. These would NOT be assignment compatible. The only way to assign a String to an HtmlString would be through a function that expands any html entities.

        HtmlString hs = strToHtml( s );

        An HtmlString is presumably sent to a browser to be rendered inline as text.

        Now if the string 's' contained <script>insert evil js code here</script>, it would expand and appear on the html page as a visible script tag, but could not be executed by the browser.

        Similarly you could not assign a plain string to an SqlString without going through a function that scrubs it for certain things.

        This is but an example. The general idea is to try to remove opportunities to accidentally create vulnerabilities. The fact that they are easy to unintentionally create is the very reason we have so many.
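        A minimal sketch of the idea in C, with hypothetical names: two distinct struct types are not assignment compatible, so the compiler forces all plain text through the escaping function before it can be treated as HTML.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical types: a PlainString cannot be assigned to an HtmlString;
 * the only route between them is str_to_html(), which expands entities. */
typedef struct { const char *s; } PlainString;
typedef struct { char *s; } HtmlString;

/* Expand the characters that would let user text be parsed as markup. */
HtmlString str_to_html(PlainString in) {
    size_t n = strlen(in.s);
    char *out = malloc(n * 6 + 1);   /* worst case: each char becomes "&quot;" */
    char *p = out;
    for (size_t i = 0; i < n; i++) {
        switch (in.s[i]) {
        case '<':  p += sprintf(p, "&lt;");   break;
        case '>':  p += sprintf(p, "&gt;");   break;
        case '&':  p += sprintf(p, "&amp;");  break;
        case '"':  p += sprintf(p, "&quot;"); break;
        default:   *p++ = in.s[i];            break;
        }
    }
    *p = '\0';
    return (HtmlString){ out };
}
```

        A browser rendering the escaped result shows a visible script tag instead of executing it, which is exactly the property described above.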

        I am not saying we should get rid of C entirely. Or immediately. And I understand people are attached to it. And sometimes people, including myself, don't like change. But progress brings change. If things about our systems could be made safer this seems like progress.

        We could all go back to writing in assembler. Or even hex code. Or toggle switches on the front panel to hand enter instructions and watch the blinking lights.

        --
        I get constant rejection even though the compiler is supposed to accept constants.
        • (Score: 2) by DannyB on Thursday February 28 2019, @05:53PM

          by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @05:53PM (#808219) Journal

          As an ACTUAL EXAMPLE of what I meant by the <script> tag, see my post above. Notice that your browser included the script with the evil javascript code here, but didn't execute it. That's because I typed it in, pre-expanding any html entities.

          &lt;script&gt;

          which appears on the browser as:

          <script>

          --
          I get constant rejection even though the compiler is supposed to accept constants.
        • (Score: 1) by redneckmother on Thursday February 28 2019, @06:26PM (1 child)

          by redneckmother (3597) on Thursday February 28 2019, @06:26PM (#808241)

          We could all go back to writing in assembler. Or even hex code. Or toggle switches on the front panel to hand enter instructions and watch the blinking lights.

          Let's not, and say we did. Been there, done that, got the t-shirt, got it autographed, and gave it away.

          I DO miss the blinkenlights (S/360 and others), though.

          --
          Mas cerveza por favor.
          • (Score: 2) by DannyB on Thursday February 28 2019, @07:09PM

            by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @07:09PM (#808277) Journal

            I never used an S/360 or 370. But I did get to briefly use an IBM 1130 with punch cards and an 029, and sometimes 026 keypunch. Just for one semester. Then interactive CRTs on a new (but obscure) minicomputer. I first learned to write significant assembler code on that beast, as well as learn several high level languages, including at the end, Pascal.

            I also wrote 8086 code in the early 1980s to do high-speed scrolling, writing, filling, etc. of characters into rectangular "windows" of the IBM PC character display. Going through the BIOS int 10h, let alone DOS, was too slow.

            I say that because people think that my affinity for garbage collection and high level languages means I don't know how to do anything low level or manage memory.

            I also (ahem) did binary machine code patches to some software to bypass a nag screen that would come up. Not anything to do with licensing. But this one program (*cough* Microsoft Works *cough*) on Macintosh would start up going to an open / new dialog, when it could go to a different mode at startup that was accessed by just clicking Cancel. So I traced the execution, and came up with a tiny patch. Everyone in the office loved it. It had nothing to do with piracy.

            Let's not, and say we did.

            So really, moving to an OS and a language like C was a true advancement, at that time.

            Yet people then resist further advancements. But back in the 1970's I remember people arguing that we should never use high level languages like FORTRAN or heaven forbid Pascal.

            --
            I get constant rejection even though the compiler is supposed to accept constants.
        • (Score: 2) by hendrikboom on Friday March 01 2019, @10:34AM (3 children)

          by hendrikboom (1125) Subscriber Badge on Friday March 01 2019, @10:34AM (#808627) Homepage Journal

          Research on GC in general goes back to the 80s and probably earlier.

          It goes back to at least the 60's.

          • (Score: 2) by DannyB on Friday March 01 2019, @03:34PM (2 children)

            by DannyB (5839) Subscriber Badge on Friday March 01 2019, @03:34PM (#808716) Journal

            The earliest GC research I personally read about was in the 80s. I'm sure it goes further back. But once you move back to the 70's, memory is so precious, and so limited, that it is difficult to imagine what good you can do to make things better.

            I have a decent sized hardcover textbook on GC that I bought in the 90's. It covered amazing developments in GC that I found quite informative. From what I see in the various Java GCs, that textbook is clearly obsolete now.

            --
            I get constant rejection even though the compiler is supposed to accept constants.
            • (Score: 2) by hendrikboom on Saturday March 02 2019, @12:00AM (1 child)

              by hendrikboom (1125) Subscriber Badge on Saturday March 02 2019, @12:00AM (#809005) Homepage Journal

              The original garbage collector was part of the original Lisp implementation.

              The advance I remembered from the 60's was a technique for garbage-collecting Lisp memory without needing extra storage for a stack. It reversed pointers so as to have a trail back, and then reversed them again while backing out, using the cells themselves to contain the stack. It used two marking bits in every cell so as to mark what had been visited, and also to remember which pointers had been reversed.

              This was invented precisely because memory was so precious and limited!
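              The technique described is the Deutsch-Schorr-Waite marking algorithm. A C sketch for two-pointer Lisp cells, assuming one mark bit and one flag bit per cell as described:

```c
#include <stddef.h>

/* Stackless marking by pointer reversal (Deutsch-Schorr-Waite).
 * 'mark' records visited cells; 'flag' records whether we descended
 * through car (and have reversed cdr) so the trail can be undone. */
typedef struct Cell {
    struct Cell *car, *cdr;
    unsigned mark : 1;   /* cell has been visited */
    unsigned flag : 1;   /* 1 = car subtree finished, cdr reversed */
    unsigned atom : 1;   /* leaf: car/cdr are not followed */
} Cell;

void sw_mark(Cell *root) {
    Cell *p = NULL;      /* trail of reversed pointers back to the root */
    Cell *q = root;      /* cell currently being examined */
    Cell *t;
    while (q != NULL || p != NULL) {
        if (q != NULL && !q->mark && !q->atom) {
            q->mark = 1;                       /* advance into car */
            t = q->car; q->car = p; p = q; q = t;
        } else {
            if (q != NULL && !q->mark)
                q->mark = 1;                   /* mark a leaf */
            if (p == NULL)
                return;                        /* backed out past the root */
            if (!p->flag) {                    /* switch: car done, do cdr */
                p->flag = 1;
                t = p->cdr; p->cdr = p->car; p->car = q; q = t;
            } else {                           /* retreat: cdr done too */
                p->flag = 0;
                t = p->cdr; p->cdr = q; q = p; p = t;
            }
        }
    }
}
```

              After marking completes, every reversed pointer has been restored, so the heap is exactly as it was, just with the live cells marked.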

              • (Score: 2) by DannyB on Monday March 04 2019, @02:56PM

                by DannyB (5839) Subscriber Badge on Monday March 04 2019, @02:56PM (#809779) Journal

                I realize GC goes back to original Lisp. I'm sure early implementations were simple mark/sweep.

                In an excellent GC hardcover textbook I purchased in the early 90s, it did cover the technique you describe of reversing the pointers and then back again. Another technique it introduced me to was for when a system had virtual memory. Simply have two separate address spaces within virtual memory. Old and New. To GC you copy objects from Old space to New space. As you visit objects that are live, they are copied to New space, and the object in old space is replaced with a tagged forwarding address to new space. As you see pointers to objects in old space, if the object has a forwarding address, simply update the pointer as the object has already been moved. Once you've visited all of the live objects, everything live is now in New space, and Old space can be discarded or paged out. eg, none of its pages need to be swapped into real memory pages in RAM. Plus, New space is compacted, all blocks consecutively allocated.
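                That two-space scheme is essentially Cheney's copying algorithm. A compressed C sketch under simplifying assumptions (fixed-size objects, a dedicated forwarding field rather than an overwritten header, illustrative names):

```c
#include <stddef.h>

/* Objects are fixed-size pairs; a moved object records its new-space
 * address in 'forward' so later references are just redirected. */
typedef struct Obj {
    struct Obj *fields[2];   /* child pointers (may be NULL) */
    struct Obj *forward;     /* NULL, or address of the copy in new space */
    int value;
} Obj;

#define HEAP_OBJS 1024
static Obj from_space[HEAP_OBJS], to_space[HEAP_OBJS];
static size_t alloc_top;     /* next free slot in to_space during GC */

/* Copy one object to new space, or just follow its forwarding address. */
static Obj *evacuate(Obj *o) {
    if (o == NULL) return NULL;
    if (o->forward) return o->forward;     /* already moved */
    Obj *copy = &to_space[alloc_top++];
    *copy = *o;
    copy->forward = NULL;
    o->forward = copy;                     /* leave forwarding address */
    return copy;
}

/* Evacuate the roots, then scan new space breadth-first, evacuating
 * children until the scan pointer catches the allocation pointer. */
size_t gc_collect(Obj **roots, size_t nroots) {
    alloc_top = 0;
    for (size_t i = 0; i < nroots; i++)
        roots[i] = evacuate(roots[i]);
    for (size_t scan = 0; scan < alloc_top; scan++)
        for (int f = 0; f < 2; f++)
            to_space[scan].fields[f] = evacuate(to_space[scan].fields[f]);
    return alloc_top;                      /* number of live objects copied */
}
```

                Everything live ends up compacted in new space; old space is never touched again, so none of its pages need to come back into RAM, just as described above.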

                Then there were treadmill approaches and coloring approaches and discussion of how to make a GC run concurrently with the program (eg 'mutator'). As I have read about all of the various GCs available in Java, I am impressed by the amount of research that has been done in the last couple decades. The cost of GC is now astonishingly low. Stop-The-World pauses are very short -- typically just to mark the root set and then let the program continue while GC proceeds to collect. GC can now run on multiple cpu threads. Things are tuned so well with workloads that Full-GC operations never happen -- because they would cause long pauses. The design of modern GC seems to be to perpetually avoid ever doing a Full GC. It just keeps up with the mutator.

                --
                I get constant rejection even though the compiler is supposed to accept constants.
      • (Score: 2) by DannyB on Thursday February 28 2019, @07:23PM

        by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @07:23PM (#808290) Journal

        a proficient C programmer can produce safer code in C than a less proficient programmer using a "safer" language

        What you seem to be saying reinforces my original point. C is unsafe at any speed. If a programmer can easily screw up a pointer, or double free memory, or use after free, or fail to free, then the language is too low level.

        Quote:
        "A programming language is low level when its programs require attention to the irrelevant."
        - Alan J. Perlis.

        That's item 46 on Top 50 Programming Quotes of All Time [junauza.com].

        Managing memory and pointers, and buffer overflows, and string overflows is a descent into the irrelevant. It truly is irrelevant to almost any problem being solved by almost any programmer today. It should simply be impossible to mess up pointers, and get these types of overflows without going to a deliberate effort to do so. There are ways I can create these types of errors in Java, but not by accident. I can read and write files, network sockets and never worry about a buffer overflow or string overflow. It just can't happen.

        Now don't get this wrong. I'm NOT saying we should get rid of C. Just that we should confine its use to what it is appropriate for. And writing application software is not what it is appropriate for.

        If there were one single perfect programming language for everything, we would all be using it already.

        Complain about Java or any other GC language, but the economics of why they are useful and so widely used will prove you wrong. It's not the 1960's anymore. Computers don't cost many millions of dollars with programmer time being cheap. It's the other way around. A team of programmers costs millions of dollars a year, and one month of a developer salary (before benefits) can easily buy a very sweet computer.

        If you're optimizing for bytes and cpu cycles, then you're almost certainly doing it wrong. If I need another 64 GB of (expensive) memory on the production server but can beat my C competitor to market by a year, my boss won't blink an eye and we'll laugh all the way to the bank. I'm optimizing for dollars not bytes and cpu cycles. These things are just an economic reality.

        including that same C programmer being tossed into a new language.

        I would agree that being tossed into a new language can be a jarring experience.

        --
        I get constant rejection even though the compiler is supposed to accept constants.
    • (Score: 2) by tangomargarine on Thursday February 28 2019, @05:09PM (8 children)

      by tangomargarine (667) on Thursday February 28 2019, @05:09PM (#808191)

      I was waiting for you to conclude we should all start using Rust but you never got there.

      --
      "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
      • (Score: 2) by DannyB on Thursday February 28 2019, @05:35PM

        by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @05:35PM (#808212) Journal

        I wasn't intending to go there. Rust is not for everything any more than any other language.

        --
        I get constant rejection even though the compiler is supposed to accept constants.
      • (Score: 3, Insightful) by DannyB on Thursday February 28 2019, @05:54PM (6 children)

        by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @05:54PM (#808221) Journal

        If there were one perfect programming language for everything, we would all already be using it.

        --
        I get constant rejection even though the compiler is supposed to accept constants.
        • (Score: 2) by c0lo on Friday March 01 2019, @01:51AM (5 children)

          by c0lo (156) on Friday March 01 2019, @01:51AM (#808510) Journal

          We are very close to having one: Malbolge.

          --
          https://www.youtube.com/watch?v=aoFiw2jMy-0
          • (Score: 2) by DannyB on Friday March 01 2019, @03:22PM (4 children)

            by DannyB (5839) Subscriber Badge on Friday March 01 2019, @03:22PM (#808710) Journal

            We already had it: BASIC in 4K ROM

            All possible programming problems, from computer algebra systems to theorem proving to advanced compilers to natural language processing: these all can be solved with BASIC.

            --
            I get constant rejection even though the compiler is supposed to accept constants.
            • (Score: 2) by c0lo on Saturday March 02 2019, @02:44AM (3 children)

              by c0lo (156) on Saturday March 02 2019, @02:44AM (#809044) Journal

              Is it so?
              Then write a BASIC program that demonstrates that P = NP

              --
              https://www.youtube.com/watch?v=aoFiw2jMy-0
              • (Score: 2) by DannyB on Monday March 04 2019, @03:00PM (2 children)

                by DannyB (5839) Subscriber Badge on Monday March 04 2019, @03:00PM (#809782) Journal

                What I'm suggesting is that any program that does what you suggest can most certainly be implemented in BASIC. It's a facetious argument against using languages which are too low level for the purpose. Any existing theorem proving programs can be implemented in BASIC. You would first need to implement dynamic memory management with a heap (in an array) and then GC in order to build something like a theorem prover. You would probably even need to implement a lisp and then a Prolog or Haskell on top of that.

                --
                I get constant rejection even though the compiler is supposed to accept constants.
                • (Score: 2) by c0lo on Monday March 04 2019, @10:47PM (1 child)

                  by c0lo (156) on Monday March 04 2019, @10:47PM (#810017) Journal

                  In 4k ROM, yeah.

                  Look, CS-wise, I know Basic is Turing-complete, I'm being facetious when it comes to engineering aspects of it.

                  --
                  https://www.youtube.com/watch?v=aoFiw2jMy-0
                  • (Score: 2) by DannyB on Tuesday March 05 2019, @12:04AM

                    by DannyB (5839) Subscriber Badge on Tuesday March 05 2019, @12:04AM (#810058) Journal

                    Yeah, 4K ROM. As long as it is Turing complete, you can implement whatever you want on top of it as long as you have enough memory available. As I said you could build a malloc / free type allocator with an array as the heap. You could build some kind of bytecode machine. You could build GC, etc.

                    The ROM doesn't need to be big, just the RAM needs to be adequate for your multi-terabyte Java heap hello world application.

                    --
                    I get constant rejection even though the compiler is supposed to accept constants.
    • (Score: 0) by Anonymous Coward on Friday March 01 2019, @02:28AM (1 child)

      by Anonymous Coward on Friday March 01 2019, @02:28AM (#808523)

      Jesus tap dancing Christ, is someone STILL complaining about C strings being a problem?
      You do realize that pretty much all string operations in C are just function calls and that you can WRITE YOUR OWN string handling libraries to replace the ancient 1970s ones?
      It's such a standard approach that you can download such a library ready to go or write your own in half a day if you are willing to omit Unicode fancy stuff.

      Everyone has pretty much decided on a string representation as a struct with one field holding the string data in UTF-8, null terminated (UTF-8 was specifically designed to allow for C string backward compatibility); another field holding the length in bytes of the string.

      We had in house written counted-length strings and string functions (in C) at my first job almost 25 years ago!

      Sure "one standard" for C strings using counted length would be best, as would arrays that had a field for array size, but it is possible to add these via a library. The problem is solvable/solved.
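      A sketch of that representation in C (illustrative names, not any particular in-house library): the bytes stay NUL-terminated so the data still works with plain C APIs, while the stored length avoids rescanning for the terminator.

```c
#include <stdlib.h>
#include <string.h>

/* Counted-length string: UTF-8 bytes, NUL-terminated for backward
 * compatibility, plus an explicit byte length. */
typedef struct {
    size_t len;    /* length in bytes, excluding the terminator */
    char  *data;   /* UTF-8, always NUL-terminated */
} CountedStr;

CountedStr cs_new(const char *utf8) {
    CountedStr s;
    s.len = strlen(utf8);
    s.data = malloc(s.len + 1);
    memcpy(s.data, utf8, s.len + 1);
    return s;
}

/* Concatenation never scans for terminators: both lengths are known. */
CountedStr cs_concat(CountedStr a, CountedStr b) {
    CountedStr s;
    s.len = a.len + b.len;
    s.data = malloc(s.len + 1);
    memcpy(s.data, a.data, a.len);
    memcpy(s.data + a.len, b.data, b.len + 1);  /* +1 copies b's NUL */
    return s;
}
```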

      • (Score: 2) by DannyB on Friday March 01 2019, @03:31PM

        by DannyB (5839) Subscriber Badge on Friday March 01 2019, @03:31PM (#808715) Journal

        You do realize . . .

        I very much realize it. I was able to write some C++ classes in the 1990s that implemented Pascal-like strings. Nothing in C prevents that.

        Problem: anybody's non-standard strings are not the real strings that everyone uses in C.

        As a bonus, my string classes did lazy copy-on-write. That is, if you assigned or copied a string to another string, they shared the common character buffer. As soon as one string was about to be modified, and if it was shared, it would do a copy on write: cloning the character buffer and making the modifications there. Obviously this was reference-counted garbage collection, but there could be no data structure cycles.
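        The scheme described can be sketched in C (illustrative names, not the original 1990s classes): strings share a reference-counted buffer, and a write clones the buffer only when it is shared.

```c
#include <stdlib.h>
#include <string.h>

/* Shared, reference-counted character buffer. */
typedef struct {
    int  refs;
    char data[];        /* flexible array member: bytes + NUL */
} CowBuf;

typedef struct { CowBuf *buf; } CowStr;

CowStr cow_new(const char *src) {
    size_t n = strlen(src);
    CowBuf *b = malloc(sizeof *b + n + 1);
    b->refs = 1;
    memcpy(b->data, src, n + 1);
    return (CowStr){ b };
}

/* "Assignment" just bumps the reference count: no bytes are copied. */
CowStr cow_share(CowStr s) { s.buf->refs++; return s; }

/* Before any write, clone the buffer if (and only if) it is shared. */
void cow_set_char(CowStr *s, size_t i, char c) {
    if (s->buf->refs > 1) {
        size_t n = strlen(s->buf->data);
        CowBuf *clone = malloc(sizeof *clone + n + 1);
        clone->refs = 1;
        memcpy(clone->data, s->buf->data, n + 1);
        s->buf->refs--;
        s->buf = clone;
    }
    s->buf->data[i] = c;
}
```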

        Of course, this is before I knew about the advantages of immutable data structures and hardware wasn't as cheap as it is today.

        Then I realized I needed to build a unicode variant of these. And I had other library classes I built up, File IO, etc. And everything used my own strings.

        Problem: my strings weren't the STANDARD strings. So I had to have adapters to work with other libraries.

        You do realize that maybe even in the 90's I knew more than you give me credit for.

        By the late 1990s, every good C++ compiler was still a (different) subset of the proposed standard. I finally gave up on C++, and started looking at Java. Cross platform was also one of my major goals. I soon realized that despite the platform runtime costs, it solved ALL of my checklist problems and requirements. But is not perfect for everyone nor for all uses.

        --
        I get constant rejection even though the compiler is supposed to accept constants.
    • (Score: 2) by jb on Friday March 01 2019, @05:23AM (1 child)

      by jb (338) on Friday March 01 2019, @05:23AM (#808580)

      Yes, I'm talking about C, and some of its conventions like null terminated strings, and functions like strcpy that don't accept a parameter for the absolute maximum target length.

      Oh, come on. Your straw man doesn't even come close to holding up.

      strncpy(3) has been around for at least as long as C has been standardised (C89, 30 years ago). Granted it doesn't appear in my 1978 copy of K&R, but no new programmer should be learning C from anything that old today (and if that's really the latest thing you've read about C, you should at the very least get up to speed with C89 before writing anything in C for production use).

      Better still, strlcpy(3) has been around since OpenBSD 2.4 (Dec 1998, 20 years ago), and if your libc of choice doesn't have it, you can implement it yourself trivially in less than 10 minutes, or just copy OpenBSD's (it's only about 25 lines long, permissively licensed and easy to find here [openbsd.org]) assuming you're willing to give credit where credit's due.
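      For reference, a minimal sketch matching strlcpy(3)'s documented contract (this is not OpenBSD's code): copy at most size-1 bytes, always NUL-terminate when size > 0, and return the full source length so callers can detect truncation.

```c
#include <string.h>

/* Bounded copy with strlcpy semantics. A return value >= size means
 * the copy was truncated. */
size_t my_strlcpy(char *dst, const char *src, size_t size) {
    size_t srclen = strlen(src);
    if (size > 0) {
        size_t n = srclen < size - 1 ? srclen : size - 1;
        memcpy(dst, src, n);
        dst[n] = '\0';      /* always terminated when size > 0 */
    }
    return srclen;
}
```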

      If the symptom you're seeing is inappropriate use of strcpy(3), the problem isn't the C language (nor even the C library), but rather an inexperienced (or 30+ year out of touch) programmer...

      Those exist for any given language.

      • (Score: 2) by DannyB on Friday March 01 2019, @03:38PM

        by DannyB (5839) Subscriber Badge on Friday March 01 2019, @03:38PM (#808722) Journal

        Yes, my opinion of C was formed quite a long time ago. But other criticisms I would have of it for high level applications are still valid. I'm not saying nobody should use C. I'm not saying C should go away. I am saying that C is not the best choice for applications at or above the command line level.

        My point is that there have been and are now ever more languages that enforce much greater safety right in the language.

        --
        I get constant rejection even though the compiler is supposed to accept constants.
  • (Score: 4, Insightful) by TheFool on Thursday February 28 2019, @04:06PM (6 children)

    by TheFool (7105) on Thursday February 28 2019, @04:06PM (#808170)

    Software itself is becoming more bloated (more code to have bugs) and less understood. The kinds of apps that get glued together (it's hard to call something that is 90% external library code "written") today are massive compared to what we wrote 25 years ago, and even the system itself is becoming more complicated because everything needs to be talking to a web server somewhere. And it all needs to be done in half the time because somewhere along the line sales realized they never had to ship "finished" software, they could just ship updates to it until people stopped caring.

    We need to teach non-software people to understand that quality software takes far more time than the junk food equivalent they are used to consuming or this is what you'll get. Can you eat potato chips for every meal? Yes. It's pretty fast to grab a bag and chow down, and it'll fill you up. But it's unhealthy, and sooner or later that's going to catch up with you. We're just witnessing that in software.

    • (Score: 2) by DannyB on Thursday February 28 2019, @05:59PM (5 children)

      by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @05:59PM (#808225) Journal

      Is it bloat? Or is it complex features?

      Which is more bloated? Notepad or Word?

      Which has more features? Notepad or Word?

      These features creep in and people like and accept them. Before long your super simple text editor highlights misspelled words.

      I remember in 1984 when Apple introduced the Macintosh. Many PC magazines were horrified at the memory and cpu power required to run a GUI. I hadn't seen so much whining and complaining since Nixon resigned. Now here we are today all using GUIs. (Unless you browse SN using a text mode browser.)

      --
      I get constant rejection even though the compiler is supposed to accept constants.
      • (Score: 0) by Anonymous Coward on Thursday February 28 2019, @07:58PM (2 children)

        by Anonymous Coward on Thursday February 28 2019, @07:58PM (#808323)

        "Hello World" in Rust clocks in at 2MB. A huge attack surface. In my dictionary if Hello World goes over a few hundred bytes as a compiled executable, it is bloatware.
        Today much of the programming is glueing together large pre-made components or painting on top of a framework. The devs know the API, but have little to no information on the (often proprietary) base system - and that is where all the complexity and potential for bugs lives. Often these "higher" "safe" languages compile down to C or their runtimes were written in C, making them only as "safe" as the skills of the implementer.

        • (Score: 0) by Anonymous Coward on Thursday February 28 2019, @08:00PM

          by Anonymous Coward on Thursday February 28 2019, @08:00PM (#808325)

          But Rust is the safest, most perfect, and most loved programming language.

        • (Score: 2) by DannyB on Thursday February 28 2019, @09:29PM

          by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @09:29PM (#808377) Journal

          I had no idea. I don't use Rust.

          Any idea why? Does it statically link a lot of unneeded library into the executable?

          I don't know what a "Hello World" in Java clocks in at, disk space wise. But I know that the new ZGC or Red Hat's new Shenandoah GC can handle Terabytes of heap, yes really, with 10 ms pause times -- so the Hello World should run efficiently!

          --
          I get constant rejection even though the compiler is supposed to accept constants.
      • (Score: 1, Insightful) by Anonymous Coward on Thursday February 28 2019, @08:02PM (1 child)

        by Anonymous Coward on Thursday February 28 2019, @08:02PM (#808327)

        Is it bloat? Or is it complex features?

        It is bloat.

        Which is more bloated? Notepad or Word?

        Which is tastier, an apple or an orange?

        Your comparison is poor. The examples serve significantly different needs. Try something like "which is more bloated, Word 95 or Word 2019?" or "which is more bloated, Windows 7 or Windows 10?" or "which is more bloated, OS X 10.6 or OS X 10.14?"

        • (Score: 2) by DannyB on Thursday February 28 2019, @09:33PM

          by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @09:33PM (#808378) Journal

          My comparison is meant to show that when people complain about bloat, they may want the "Notepad" solution and assume that nobody else would be served by the significantly different "Word" solution, because they think what they need is what everyone else needs.

          Or to put it in concrete terms, if Java didn't serve a real need, it wouldn't be the number one language for years in a row on multiple programming language indexes. Somebody out there must be finding it useful and economical.

          --
          I get constant rejection even though the compiler is supposed to accept constants.
  • (Score: 5, Interesting) by ilsa on Thursday February 28 2019, @04:29PM (16 children)

    by ilsa (6082) Subscriber Badge on Thursday February 28 2019, @04:29PM (#808175)

    I have noticed that over the last, oh, decade or so, software quality has utterly collapsed.

    Writing quality software has never ever been easy. It never will be easy. And it requires skilled developers to write that code. The problem is that companies are trying so hard to lower the barrier to entry for software development (presumably so that they can flood the market and force down wages...) that we have developed some very bad cultural issues as an industry.

    Developers think WAY too highly of themselves. Everything now revolves around developer velocity, and literally everything else is now secondary, including code quality and user experience.

    Javascript and MongoDB are the technologies du jour because they are so easy to get started with. Nevermind that Javascript is a steaming shitpile with new layers being added on an almost daily basis in an effort to overcome its shortcomings. MongoDB is used because people find SQL and relational databases too difficult to use. Which is really depressing, because they're bloody easy.

    So now we have a huge glut of second-rate programmers who think they are god's gift to the universe but in actuality have next to no idea what they're doing. These people are reinventing the wheel over and over again. They're making the same mistakes, over and over again. Increasing vulnerabilities is just a symptom of a much larger problem.

    Oh, and this "less understood" bit is horse shit. All the issues we're running into are perfectly well understood because they have been made repeatedly in the past and there are troves of documentation regarding avoidance and mitigation. The problem is that no one bothers to learn because they immediately discard anything old as inferior and irrelevant.

    • (Score: 2) by opinionated_science on Thursday February 28 2019, @05:41PM (1 child)

      by opinionated_science (4031) on Thursday February 28 2019, @05:41PM (#808213)

      I agree with most of what you wrote, however the statement

      "MongoDB is used because people find SQL and relational databases too difficult to use"

      I don't believe is supported by evidence.

      Sure, SQL is a pain. But in analytics with high-dimensional data, the noSQL technologies are orders of magnitude faster than any SQL.

      It is a testament to SQL's venerable status that there are wrappers, like SparkSQL [apache.org], that allow a somewhat straightforward comparison.

      The graphDB's also dovetail nicely with our ML (machine learning) technologies.

      • (Score: 2) by ilsa on Friday March 01 2019, @03:43PM

        by ilsa (6082) Subscriber Badge on Friday March 01 2019, @03:43PM (#808730)

        My (admittedly anecdotal) experience has shown that there are a lot of developers who just don't understand SQL. They do not understand relational algebra.

        I agree that there are plenty of scenarios where a NoSQL database makes much more sense. My problem is that I have seen plenty of situations where the decision to go NoSQL had nothing to do with performance or architecture, but because NoSQL was more convenient for the developer. And that's the part that bothers me greatly. They don't even understand that they are painting themselves into a corner, that they are making it more difficult to interact with that data in the future, etc.

    • (Score: 3, Interesting) by DannyB on Thursday February 28 2019, @06:01PM (2 children)

      by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @06:01PM (#808226) Journal

      The developer velocity thing is a management problem. A problem of misaligned priorities, of deferring small costs that will turn into gigantic ones.

      Developers are happy to think about security issues and other problems that managers are happy to sweep under the rug until someday.

      --
      I get constant rejection even though the compiler is supposed to accept constants.
      • (Score: 2) by bob_super on Thursday February 28 2019, @07:38PM (1 child)

        by bob_super (1357) on Thursday February 28 2019, @07:38PM (#808308)

        Who cares if it has bugs, as long as the customers' private data flows into the database ?
        Let's focus on what's valuable, shall we ?

        • (Score: 2) by DannyB on Thursday February 28 2019, @09:24PM

          by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @09:24PM (#808372) Journal

          No worries if the customer's data flows OUT of the database into someone else's hands.

          Developer Velocity is what matters.

          --
          I get constant rejection even though the compiler is supposed to accept constants.
    • (Score: 2) by exaeta on Thursday February 28 2019, @06:54PM (5 children)

      by exaeta (6957) on Thursday February 28 2019, @06:54PM (#808264) Homepage Journal
      I agree. I want to see software improve a bit, security wise, but it seems there are few coders left who care.
      --
      The Government is a Bird
      • (Score: 2) by DannyB on Friday March 01 2019, @03:41PM (4 children)

        by DannyB (5839) Subscriber Badge on Friday March 01 2019, @03:41PM (#808726) Journal

        Throughout this discussion, the resistance to any suggestion of changes that would improve security at a fundamental level is quite apparent.

        The arguments come down to: software developers should just be good enough to avoid errors.

        If the objection is to higher-level languages, then why don't we all just stick to assembly language?

        --
        I get constant rejection even though the compiler is supposed to accept constants.
        • (Score: 2) by exaeta on Tuesday March 05 2019, @04:07AM (3 children)

          by exaeta (6957) on Tuesday March 05 2019, @04:07AM (#810131) Homepage Journal
          The problem is that teaching software developers IS the only solution. We can't all start to code in languages like Java and Ada because those introduce a loss in efficiency, which is not ideal for many types of applications. Sure, there are areas where it doesn't matter so much, but in many cases the efficiency DOES matter quite a lot. The only way for us to maintain code that is both efficient and secure is to step up our game on the individual front, not some magical tech solution.
          --
          The Government is a Bird
          • (Score: 2) by DannyB on Tuesday March 05 2019, @02:40PM (2 children)

            by DannyB (5839) Subscriber Badge on Tuesday March 05 2019, @02:40PM (#810254) Journal

            > teaching software developers IS the only solution

            It is not an either-or approach. It is important to do both to the maximum extent possible.
            1. Teach software developers
            2. Make systems and languages more safe

            > We can't all start to code in languages like Java and Ada because those introduce a loss in efficiency

            I think I've been clear in this thread that there is no perfect language. I also was clear that C / C++ should not be going away -- just that they are not appropriate for many things they are being used for, such as writing applications at or above the command line level.

            > because those introduce a loss in efficiency

            There are new modern languages (without GC) that compile direct to machine code but are safer at the source code level.

            I think I have also been clear in this entire S/N topic that there are places where C / C++ type languages are perfect. Just to re-re-repeat: bootloaders, firmware, microcontrollers, kernels, device drivers, etc.

            Although, IMO, new languages will take over this space eventually.

            > The only way for us to maintain code that is both efficient and secure is to step up our game on the individual front, not some magical tech solution.

            I don't think it is an either / or thing. Developers should step up their game. But we should make languages / systems more safe at the same time. There is a reason we have C. Because it is higher level than assembler. Easier to use. Easier to reason about. Portable. Etc. Yet you don't (seem to) complain about C and say we must all stick to assembler and up our game.

            Just to point out one of my past predictions about languages. In about 1990, when this idea seemed controversial, I was telling friends that in ten years new languages would all have GC. (This is not an argument about GC, just that I have been right before.) By 2000 most new languages had GC and all newer languages with any significant penetration had GC.

            In short:
            AGREE: developers should improve their game
            DISAGREE: while there is no magical tech solution, we CAN and MUST do better at making our systems safer without significant loss of efficiency

            --
            I get constant rejection even though the compiler is supposed to accept constants.
            • (Score: 2) by exaeta on Tuesday March 05 2019, @07:47PM (1 child)

              by exaeta (6957) on Tuesday March 05 2019, @07:47PM (#810392) Homepage Journal
              It would be nice to see languages that deliver the same performance as C++. Whilst I have seen a few that come close to C, none have managed to dethrone C++.
              I have seen language after language make false promises about performing as well as C++, and none of them have come close in reality. Some languages come closer, but there are a lot of important aspects of C++ that another language would have to meet in order to replace it:
              • Equally powerful template system for compile time metaprogramming that generates equally efficient code
              • A mechanism to take advantage of architecture differences when optimizing code. In C++ this latitude comes from UB; a replacement language would need some other mechanism if it eliminates UB.
              • A standardized language not controlled by corporations. The reason I avoid using C#, Rust, etc. is mainly because they are controlled by organizations without objective standards development.
              • A long term commitment to zero overhead. If greater safety can be had while maintaining zero overhead, great. If you cannot find a zero overhead method of maintaining safety, then your language is not as efficient as C++.

              Other languages have failed on these fronts. The most straightforward way to solve these problems is to improve C++ (e.g. C++11, C++14, C++17); developing a new language from scratch is less efficient.
              Do you really think that a new language will be able to do better than improvements to C++ as new versions are standardized? I don't think so. No language comes close in efficiency, and as the RAII paradigm is refined, the number of bugs also decreases. RAII is the most straightforward path to a code base that is both efficient and secure.
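              A minimal, hedged sketch of the zero-overhead RAII idea discussed above (a generic illustration under assumed names like `File` and `raii_demo.txt`, not code from any particular project): a scope-bound resource wrapper whose cleanup the compiler inserts deterministically, costing nothing beyond the destructor call itself.

              ```cpp
              #include <cstdio>

              // RAII: tie a resource's lifetime to an object's scope. The destructor
              // runs deterministically on scope exit, so cleanup cannot be forgotten,
              // and no garbage collector or other runtime machinery is involved.
              struct File {
                  explicit File(const char* path) : f(std::fopen(path, "w")) {}
                  ~File() { if (f) std::fclose(f); }      // released exactly once
                  File(const File&) = delete;             // copying would risk a double close
                  File& operator=(const File&) = delete;
                  std::FILE* f;
              };

              int main() {
                  {
                      File log("raii_demo.txt");
                      if (log.f) std::fputs("hello\n", log.f);
                  }   // log's destructor closes the file here, even on early return or throw
                  std::puts("file closed deterministically");
                  return 0;
              }
              ```

              Compiled with optimizations, such a wrapper typically costs the same as a hand-written fopen/fclose pair, which is the zero-overhead claim being made above.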

              --
              The Government is a Bird
              • (Score: 2) by DannyB on Tuesday March 05 2019, @09:40PM

                by DannyB (5839) Subscriber Badge on Tuesday March 05 2019, @09:40PM (#810429) Journal

                I very much agree those are desirable goals for a language. And for any language that would replace C++.

                The last time I used C++ was in about 1998. So my knowledge is a bit dated. But, I was looking at it very hard as my next "go to" language for most things.

                I really cannot say whether another language would replace C++.

                I would hope so.

                I remember weighing a move from Pascal to C++, and thought that C++ brought a lot of very nice features along with efficiency. I would hope that some new language could truly live up to the same admiration by:
                * being even more safe
                * uncompromising efficiency of compiled code (don't pay at runtime for features you don't use, etc)
                * being even more high level and powerful

                If a language could do that, then it would deserve to replace C++, just as C++ has replaced languages before it.

                Now future ideal languages aside, we come back to the practical reality for today. There is no perfect programming language for all uses. There probably never will be, because uses of languages are for so many vastly different purposes.

                It would be amazing if a language could have efficiency and not pay for features not used. Yet still offer runtime type information and GC. Introspection at runtime. Unlimited precision integers. Yet not make you pay for any of those features if you don't use them. That is a big challenge. Maybe no single language will ever pull it off.

                In my replacement language search in the late 90's, I ended up on Java because it checked all the right boxes for me. I also strongly believed it would continue to improve, and it has. Back then bytecode was interpreted. Now JVM bytecode gets compiled twice: first by C1, which quickly generates obvious code, then shortly later by C2, which spends a lot of time generating highly optimized code.

                Let me bring up a point about efficiency for a moment. This is a Java to Java comparison. There is now an ahead of time Java compiler. Still experimental. It generates relatively small binaries with very quick startup times. It performs well. But not as well as the full dynamic JVM once it has "warmed up". (That means once all classes have been compiled to native code by the C2 optimizing compiler mentioned earlier.)

                Why? Because C2 can see the entire global scope of the entire program. Something that no ahead of time compiler can ever do. The JVM C2 compiler can make optimizations at run time because it sees the entire scope of all code. It can aggressively inline. It can know exactly where all the callers are to a particular function and make calling optimizations accordingly. It doesn't have to be locked into one calling convention. It also can compile directly to the actual instruction set of the actual processor you are using -- including that processor's x86 extensions. Again something an ahead of time compiler cannot do and be able to run on all processors.

                But all this compilation of bytecode causes slow startup time. A program starts up with its bytecode being interpreted and the most cpu intensive hotspots getting compiled by C1 and then later by C2. When I learned just a day or so ago that the ahead of time compiler cannot match the performance of a fully "warmed up" JVM, I was initially surprised, but then, not surprised.

                --
                I get constant rejection even though the compiler is supposed to accept constants.
    • (Score: 2) by Thexalon on Thursday February 28 2019, @06:59PM (2 children)

      by Thexalon (636) on Thursday February 28 2019, @06:59PM (#808268)

      I heard the same basic rant 20 years ago when the whole .com craze was causing wild speculation on Wall Street and any fool who could figure out HTML was suddenly a valuable "programmer".

      My dad, who started coding for a living around 1978, got the same reaction from the old hands who'd been working on OS/360 and such. Those darn kids with their MVS and smart terminals, makes the job too easy! I'm sure previous generations complained about it being too easy to use those fancy assemblers - why aren't people writing machine code like they used to?

      Some things never change: young people get excited about new technology that they think means they don't have to think as carefully as their predecessors (they're partially right, partially wrong). Their predecessors complain about how the younguns don't know what they're talking about (also partially right, and partially wrong).

      --
      The only thing that stops a bad guy with a compiler is a good guy with a compiler.
      • (Score: 2) by DannyB on Friday March 01 2019, @03:42PM

        by DannyB (5839) Subscriber Badge on Friday March 01 2019, @03:42PM (#808727) Journal

        Funny thing. I'm not young. Yet I EMBRACE change and advancements rather than get stuck in a rut arguing that things should be as they always have been.

        --
        I get constant rejection even though the compiler is supposed to accept constants.
      • (Score: 2) by ilsa on Friday March 01 2019, @03:53PM

        by ilsa (6082) Subscriber Badge on Friday March 01 2019, @03:53PM (#808741)

        And to an extent, they weren't wrong. Look at the breathtaking level of crapware that came out when Visual Basic was released. I have yet to see a single instance of Visual Basic code that isn't absolute garbage.

        The lower you set the bar for new programmers, the worse the resulting code is going to be. It's as simple as that.

        Throw something like Javascript into the mix, or any other language that was never designed for complex programs, and you amplify the problem that much more.

        The same technologies that help good programmers work more efficiently, also tend to enable unskilled people to write code. There is a balance to be had, and what I see says that the pendulum has swung way too far to the 'easy' side.

    • (Score: 2) by DannyB on Thursday February 28 2019, @07:32PM

      by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @07:32PM (#808299) Journal

      over the last, oh, decade or so, software quality has utterly collapsed.

      Developer quality has utterly collapsed. That is why we see famous interview questions from Google and others. Most companies have made the mistake of hiring mediocre programmers or posers, and HR people don't know the difference. Developers' time is expensive, scarce, and not well spent on interviews. Plus we now have a high-tech "gold rush": Oh, I can get rich fast by becoming a developer! And books of the genre Learn X in Only 24 Hours! (And yes, these books are deliberately intended to deceive people into thinking they can learn the equivalent of brain surgery or rocket science in far less than ten years.) How about: Learn C in Only Ten Years! Become a SQL Expert in Only Ten Years!

      we have a huge glut of second-rate programmers who think they are god's gift to the universe

      Nailed it in one, Mr. Garibaldi.

      Nevermind that Javascript is a steaming shitpile with new layers being added on an almost daily basis in an effort to overcome it's shortcomings.

      Absolutely agree. Yet JavaScript is a reality. And modern JS is far better than it was ten or especially fifteen years ago. And standardization seems, at long long last, to finally be taking hold. Sort of like terraforming.

      All the issues we're running into are perfectly well understood because they have been made repeatedly in the past and there are troves of documentation regarding avoidance and mitigation. The problem is that no one bothers to learn because they immediately discard anything old as inferior and irrelevant.

      Yes. But also another thing. Efforts to systematically eliminate these problems, such as I suggest elsewhere in this SN topic, are met with stiff resistance. It's not just troves of documentation to mitigate the problems. Things at fundamental levels like languages and compilers can make entire classes of bugs just vanish -- yet people will put up a fight over it.

      --
      I get constant rejection even though the compiler is supposed to accept constants.
    • (Score: 2) by hendrikboom on Friday March 01 2019, @11:08AM

      by hendrikboom (1125) Subscriber Badge on Friday March 01 2019, @11:08AM (#808631) Homepage Journal

      Nevermind that Javascript is a steaming shitpile with new layers being added on an almost daily basis in an effort to overcome it's shortcomings.

      Last I heard, the authors of Javascript originally wanted to adapt Scheme (or maybe another Lisp dialect) to run in the browser. But their manager wanted something that looked like C. Javascript was the resulting miscegenation.

  • (Score: -1, Flamebait) by Anonymous Coward on Thursday February 28 2019, @05:19PM (6 children)

    by Anonymous Coward on Thursday February 28 2019, @05:19PM (#808199)

    If everyone coding in C/C++/D/what have you switched to Rust, software would be much safer and better.

    • (Score: 0) by Anonymous Coward on Thursday February 28 2019, @05:31PM (2 children)

      by Anonymous Coward on Thursday February 28 2019, @05:31PM (#808207)

      Ada? Designed for military applications--life critical systems...

      • (Score: 1) by redneckmother on Thursday February 28 2019, @06:34PM (1 child)

        by redneckmother (3597) on Thursday February 28 2019, @06:34PM (#808248)

        I was surprised that Ada didn't take off, since DOD was pushing it hard. It is (was?) a decent language. Loved the strong typing, exception handling, and syntax.

        --
        Mas cerveza por favor.
        • (Score: 2) by DannyB on Thursday February 28 2019, @07:34PM

          by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @07:34PM (#808301) Journal

          I first ran into Ada in college. I was told by the grown-ups that I would do well to learn Ada because it would soon be the ONLY language, thanks to the DOD.

          I simply could not believe that. I saw a lot of usefulness in other languages.

          I also considered that an Ada compiler must be a beast of sophistication and complexity. And at that time microcomputers were taking off with maybe 48 K of RAM, or 64 K, later 128 K, etc.

          --
          I get constant rejection even though the compiler is supposed to accept constants.
    • (Score: 2) by DannyB on Thursday February 28 2019, @06:08PM

      by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @06:08PM (#808230) Journal

      Rust is one way to solve memory management problems.

      GC is another way.

      Rust costs more developer time but is much more efficient at runtime. It may not be suitable to build a large complex application. It may be great for microcontrollers, bootloaders, kernels, etc.

      GC makes developers lives easy (which is a major cost center), but has some runtime costs. It is suitable to build large complex systems but wholly inappropriate for microcontrollers, bootloaders, kernels, etc.

      Both Rust and GC approaches solve a problem: leaving attention to memory management in the hands of developers who may not be paying attention.

      The vast, vast majority of historical bugs have been three simple memory management problems: not freeing something (memory leak), double free, and use after free. Use a language that makes these impossible and you've just removed the vast majority of historical problems. Strong typing, with type problems detected at compile time, also eliminates a huge category of bugs instead of leaving them undetected until runtime.
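      As a hedged illustration of the three bug classes just named (a generic sketch, not taken from any particular codebase): C++ unique ownership makes the leak and the double free inexpressible, and turns the silent use-after-free into an explicit, checkable condition.

      ```cpp
      #include <cstdio>
      #include <memory>
      #include <utility>

      int main() {
          // 1. Memory leak: a unique_ptr frees its object when it leaves scope,
          //    so "forgot to free" is no longer expressible.
          auto a = std::make_unique<int>(42);

          // 2. Double free: ownership is unique. std::move transfers it and
          //    nulls the source, so the int can only ever be freed once.
          auto b = std::move(a);

          // 3. Use after free: the moved-from pointer is observably null, so a
          //    stale access becomes an explicit check instead of silent
          //    undefined behavior. (Rust goes further and rejects the stale
          //    access at compile time.)
          if (!a) std::puts("a no longer owns the object");
          std::printf("b still owns: %d\n", *b);
          return 0;
      }   // b's destructor frees the int exactly once
      ```

      This narrows rather than fully eliminates the class in C++ (a raw pointer escaping via get() can still dangle), which is why the comparison with borrow checking and GC above matters.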

      --
      I get constant rejection even though the compiler is supposed to accept constants.
    • (Score: 0) by Anonymous Coward on Thursday February 28 2019, @08:07PM (1 child)

      by Anonymous Coward on Thursday February 28 2019, @08:07PM (#808331)

      Rust is part of the problem. One - it is more googleware. Two - compare the compiled executables from C, C++, D and, say, Rust. Oh yes, the eternal "disk space is cheap and so is memory" argument. I disagree. Multiply the bloat and overhead by every application and process in sight, and soon you can bring a new 12-core Ryzen down to speeds we used to enjoy back when the 8088 was king.

      • (Score: 4, Informative) by Anonymous Coward on Thursday February 28 2019, @09:46PM

        by Anonymous Coward on Thursday February 28 2019, @09:46PM (#808388)

        Rust is Mozilla's baby. You are thinking of Go.

  • (Score: 2) by exaeta on Thursday February 28 2019, @06:51PM

    by exaeta (6957) on Thursday February 28 2019, @06:51PM (#808260) Homepage Journal

    Shameless plug for a new discord to discuss software security.
    https://discord.gg/USe7KnN [discord.gg]
    I'd love to do something about our software systems, frankly, sucking, when it comes to security. Join if you can code and are experienced.

    --
    The Government is a Bird
  • (Score: 0) by Anonymous Coward on Thursday February 28 2019, @08:40PM

    by Anonymous Coward on Thursday February 28 2019, @08:40PM (#808348)

    Yesterday the Pentium division bug, today speculative execution, tomorrow?

    Considering that the current speculative execution bugs are already too big for the IT industry, which has done nothing to fix them, what will the next level be like?

    Maybe we need to KISS.

  • (Score: 3, Interesting) by NotSanguine on Thursday February 28 2019, @09:25PM

    by NotSanguine (285) <reversethis-{grO ... a} {eniugnaStoN}> on Thursday February 28 2019, @09:25PM (#808375) Homepage Journal

    All the discussion about the quality of software, the appropriateness of one language/dev platform over another and the lack of quality developers are all excellent points, and thanks to all who have contributed to that discussion.

    But doesn't it strike anyone as pretty self-serving that TFS (No TFA for me, thanks! Semper Fidelis!) is mostly about Risk Based Security (RBS) badmouthing the processes of the NVD/CVE databases?

    Especially given that RBS is in "competition" with NVD/CVE. It seems to me that the goal for such databases should be to publicize and document vulnerabilities, patches, mitigations and exploits, rather than trashing folks who are providing a valuable public service.

    What value is added by this?

    --
    No, no, you're not thinking; you're just being logical. --Niels Bohr
  • (Score: 1, Insightful) by Anonymous Coward on Thursday February 28 2019, @09:59PM

    by Anonymous Coward on Thursday February 28 2019, @09:59PM (#808396)

    you naturally end up with a crazy construct acting up in every which non-understandable way.
    When an unexpected state produced by one bug then gets mishandled by a chain of other bugs and ad-hoc workarounds to yet other bugs, the set of possible misbehaviors grows exponentially.

    Fix your bugs, people. Nothing stable can be built upon a broken foundation.
