
posted by Fnord666 on Thursday February 28 2019, @02:55PM
from the hello-entropy dept.

The National Vulnerability Database (NVD) is a US government-funded resource that does exactly what the name implies: it acts as a database of vulnerabilities in software. It operates as a superset of the Common Vulnerabilities and Exposures (CVE) system, which is run by the non-profit Mitre Corporation with additional government funding. For years, it has been good enough: while any organization or process has room to become more efficient, curating a database of software vulnerabilities reported through crowdsourcing is a challenging undertaking.

Risk Based Security, the private operator of the competing database VulnDB, aired its grievances with the public CVE/NVD system in its 2018 Vulnerability Trends report, released Wednesday, with charged conclusions including "there is fertile grounds for attorneys and regulators to argue negligence if CVE/NVD is the only source of vulnerability intelligence being used by your organization," and "organizations are getting late and at times unreliable vulnerability information from these two sources, along with significant gaps in coverage." This criticism is neither imaginative nor unexpected from a privately owned competitor attempting to justify its product.

In fairness to Risk Based Security, there is a known time delay in CVSS scoring, though the report overstates the severity of the problem: an empirical research report finds that "there is no reason to suspect that information for severe vulnerabilities would tend to arrive later (or earlier) than information for mundane vulnerabilities."

https://www.techrepublic.com/article/software-vulnerabilities-are-becoming-more-numerous-less-understood/


Original Submission

 
  • (Score: 2) by exaeta (6957) on Tuesday March 05 2019, @07:47PM (#810392) Homepage Journal (1 child)
    It would be nice to see languages that deliver the same performance as C++. Whilst I have seen a few that come close to C, none have managed to dethrone C++.
    I have seen language after language make false promises about performing as well as C++, and none of them have come close in reality. Some languages come closer than others, but there are several important aspects of C++ that another language would have to match in order to replace it:
    • An equally powerful template system for compile-time metaprogramming that generates equally efficient code (a sketch of what this means follows the list)
    • A mechanism to take advantage of architecture differences when optimizing code. In C++ this role is filled by undefined behavior (UB); a replacement language would need some other mechanism before it could eliminate UB.
    • A standardized language not controlled by corporations. The reason I avoid C#, Rust, etc. is mainly that they are controlled by organizations without an objective standards-development process.
    • A long-term commitment to zero overhead. If greater safety can be had while maintaining zero overhead, great. If you cannot find a zero-overhead way of providing safety, then your language is not as efficient as C++.
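
    To make the first bullet concrete, here is a minimal sketch of compile-time metaprogramming in standard C++. The factorial example is illustrative rather than taken from the comment, but the mechanism (template recursion folded away entirely during compilation) is what the parent is describing:

        #include <cstddef>

        // Classic template metaprogram: the factorial is computed entirely
        // at compile time, so no runtime code is generated for it.
        template <std::size_t N>
        struct Factorial {
            static constexpr std::size_t value = N * Factorial<N - 1>::value;
        };

        template <>
        struct Factorial<0> {
            static constexpr std::size_t value = 1;
        };

        // Checked during compilation; a wrong value fails the build.
        static_assert(Factorial<10>::value == 3628800, "evaluated at compile time");

        int main() { return 0; }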

    Other languages have failed on these fronts. The most straightforward way to solve these problems is to keep improving C++ (e.g. C++11, C++14, C++17); developing a new language from scratch is less efficient.
    Do you really think that a new language will be able to do better than improvements to C++ as new versions are standardized? I don't think so. No language comes close in efficiency, and as the RAII paradigm is refined, the number of bugs decreases as well. RAII is the most straightforward path to a code base that is both efficient and secure.
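
    As a small illustration of the RAII point (the file name below is hypothetical): the destructor releases the resource on every exit path, so the safety costs nothing beyond the cleanup code you would have written by hand.

        #include <cstdio>

        // Minimal RAII wrapper: the destructor closes the file exactly once,
        // on every exit path (normal return or exception).
        class File {
            std::FILE* f_;
        public:
            explicit File(const char* path) : f_(std::fopen(path, "r")) {}
            ~File() { if (f_) std::fclose(f_); }
            File(const File&) = delete;             // forbid accidental double-close
            File& operator=(const File&) = delete;
            std::FILE* get() const { return f_; }
        };

        int main() {
            File f("example.txt");   // hypothetical input file
            if (f.get()) {
                // ... use the handle; no explicit cleanup needed ...
            }
        }   // destructor runs here, closing the file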

    --
    The Government is a Bird

  • (Score: 2) by DannyB (5839) Subscriber Badge on Tuesday March 05 2019, @09:40PM (#810429) Journal

    I very much agree those are desirable goals for a language, and for any language that would replace C++.

    The last time I used C++ was in about 1998, so my knowledge is a bit dated. But at the time I was looking at it very hard as my next "go to" language for most things.

    I really cannot say whether another language would replace C++.

    I would hope so.

    I remember moving from Pascal, with C++ as a possible destination, and thinking that C++ brought a lot of very nice features along with efficiency. I would hope that some new language could truly earn the same admiration by:
    * being even more safe
    * uncompromising efficiency of compiled code (don't pay at runtime for features you don't use; see the sketch after this list)
    * being even more high level and powerful
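
    A minimal sketch of the second point, "don't pay for what you don't use": the standard-library abstraction below typically compiles down to the same loop as the hand-written version, so the higher-level form carries no runtime cost. The example is mine, added only to make the zero-overhead idea concrete.

        #include <array>
        #include <cstdio>
        #include <numeric>

        int main() {
            std::array<int, 5> xs{1, 2, 3, 4, 5};

            // Hand-written loop.
            int sum1 = 0;
            for (int x : xs) sum1 += x;

            // Same computation through the library abstraction; optimizing
            // compilers typically emit identical code for both versions.
            int sum2 = std::accumulate(xs.begin(), xs.end(), 0);

            std::printf("%d %d\n", sum1, sum2);
        }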

    If a language could do that, then it would deserve to replace C++, just as C++ has replaced languages before it.

    Now, future ideal languages aside, we come back to the practical reality of today. There is no perfect programming language for all uses. There probably never will be, because languages are put to so many vastly different purposes.

    It would be amazing if a language could be efficient while still offering runtime type information, GC, runtime introspection, and unlimited-precision integers, yet not make you pay for any of those features if you don't use them. That is a big challenge. Maybe no single language will ever pull it off.

    In my replacement-language search in the late 90's, I ended up on Java because it checked all the right boxes for me. I also strongly believed it would continue to improve, and it has. Back then bytecode was interpreted. Now JVM bytecode gets compiled twice: first by C1, which quickly generates straightforward code, then shortly afterward by C2, which spends much more time generating highly optimized code.

    Let me bring up a point about efficiency for a moment. This is a Java-to-Java comparison. There is now an ahead-of-time Java compiler, still experimental. It generates relatively small binaries with very quick startup times. It performs well, but not as well as the full dynamic JVM once it has "warmed up" (that is, once the hot code paths have been compiled to native code by the C2 optimizing compiler mentioned earlier).

    Why? Because C2 can see the entire global scope of the running program, something no ahead-of-time compiler can ever do. The JVM's C2 compiler can make optimizations at run time because it sees the entire scope of all loaded code. It can aggressively inline. It can know exactly where all the callers of a particular function are and optimize those calls accordingly; it doesn't have to be locked into one calling convention. It can also compile directly to the actual instruction set of the processor you are running on, including that processor's x86 extensions. Again, an ahead-of-time compiler cannot do that and still run on all processors.

    But all this compilation of bytecode causes slow startup. A program starts with its bytecode being interpreted, and the most CPU-intensive hotspots get compiled by C1 and then later by C2. When I learned just a day or so ago that the ahead-of-time compiler cannot match the performance of a fully "warmed up" JVM, I was initially surprised, but then, not surprised.

    --
    The lower I set my standards the more accomplishments I have.