
posted by Fnord666 on Thursday February 28 2019, @02:55PM   Printer-friendly
from the hello-entropy dept.

The National Vulnerability Database (NVD) is a US government-funded resource that does exactly what the name implies: it acts as a database of vulnerabilities in software. It operates as a superset of the Common Vulnerabilities and Exposures (CVE) system, which is run by the non-profit Mitre Corporation with additional government funding. For years it has been good enough: while any organization or process has room to become more efficient, curating a database of software vulnerabilities reported through crowdsourcing is a challenging undertaking.

Risk Based Security, the private operator of the competing database VulnDB, aired its grievances with the public CVE/NVD system in its 2018 Vulnerability Trends report, released Wednesday, with charged conclusions including "there is fertile grounds for attorneys and regulators to argue negligence if CVE/NVD is the only source of vulnerability intelligence being used by your organization," and "organizations are getting late and at times unreliable vulnerability information from these two sources, along with significant gaps in coverage." This criticism is neither imaginative nor unexpected from a privately-owned competitor attempting to justify its product.

In fairness to Risk Based Security, there is a known time delay in CVSS scoring, though they overstate the severity of the problem, as an (empirical) research report finds that "there is no reason to suspect that information for severe vulnerabilities would tend to arrive later (or earlier) than information for mundane vulnerabilities."

https://www.techrepublic.com/article/software-vulnerabilities-are-becoming-more-numerous-less-understood/


Original Submission

 
  • (Score: 2) by DannyB on Thursday February 28 2019, @07:23PM

    by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @07:23PM (#808290) Journal

    Quote:
    "a proficient C programmer can produce safer code in C than a less proficient programmer using a 'safer' language"

    What you seem to be saying reinforces my original point. C is unsafe at any speed. If a programmer can easily screw up a pointer, or double free memory, or use after free, or fail to free, then the language is too low level.
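    To make those bug classes concrete, here is a minimal C sketch (an illustration, not from the comment) in which each unsafe operation is left as a comment so the program itself remains well-defined; uncommenting any of them produces exactly the kind of error a garbage-collected language rules out:

    ```c
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        char *buf = malloc(16);          /* 16-byte heap buffer */
        if (buf == NULL)
            return 1;

        /* Buffer overflow: strcpy(buf, "far more than sixteen bytes of text")
           would write past the allocation, and nothing stops you at compile
           time or run time. */
        strncpy(buf, "short", 15);
        buf[15] = '\0';

        free(buf);
        /* Use after free:  printf("%s\n", buf);  -- undefined behavior. */
        /* Double free:     free(buf);            -- undefined behavior. */
        buf = NULL;  /* defensive habit: a NULL pointer fails loudly, not silently */

        printf("ok\n");
        return 0;
    }
    ```

    None of the commented-out lines would be rejected by the compiler; that is the sense in which the language offers no guardrails.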

    Quote:
    "A programming language is low level when its programs require attention to the irrelevant."
    - Alan J. Perlis.

    That's item 46 on Top 50 Programming Quotes of All Time [junauza.com].

    Managing memory, pointers, buffer overflows, and string overflows is a descent into the irrelevant. It truly is irrelevant to almost any problem being solved by almost any programmer today. It should simply be impossible to mess up pointers or cause these kinds of overflows without making a deliberate effort to do so. There are ways I can create these types of errors in Java, but not by accident. I can read and write files and network sockets and never worry about a buffer overflow or string overflow. It just can't happen.

    Now don't get this wrong. I'm NOT saying we should get rid of C. Just that we should confine its use to what it is appropriate for. And writing application software is not what it is appropriate for.

    If there were one single perfect programming language for everything, we would all be using it already.

    Complain about Java or any other GC language, but the economics of why they are useful and so widely used will prove you wrong. It's not the 1960s anymore. Computers don't cost many millions of dollars while programmer time is cheap; it's the other way around. A team of programmers costs millions of dollars a year, and one month of a developer's salary (before benefits) can easily buy a very sweet computer.

    If you're optimizing for bytes and CPU cycles, then you're almost certainly doing it wrong. If I need another 64 GB of (expensive) memory on the production server but can beat my C competitor to market by a year, my boss won't blink an eye and we'll laugh all the way to the bank. I'm optimizing for dollars, not bytes and CPU cycles. These things are just an economic reality.

    Quote:
    "including that same C programmer being tossed into a new language."

    I would agree that being tossed into a new language can be a jarring experience.

    --
    To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.