
posted by Fnord666 on Thursday February 28 2019, @02:55PM   Printer-friendly
from the hello-entropy dept.

The National Vulnerability Database (NVD) is a US government-funded resource that does exactly what the name implies: it acts as a database of vulnerabilities in software. It operates as a superset of the Common Vulnerabilities and Exposures (CVE) system, which is run by the non-profit Mitre Corporation with additional government funding. For years it has been good enough: while any organization or process has room to be made more efficient, curating a database of software vulnerabilities reported through crowdsourcing is a challenging undertaking.

Risk Based Security, the private operator of the competing database VulnDB, aired its grievances with the public CVE/NVD system in its 2018 Vulnerability Trends report, released Wednesday, with charged conclusions including "there is fertile grounds for attorneys and regulators to argue negligence if CVE/NVD is the only source of vulnerability intelligence being used by your organization," and "organizations are getting late and at times unreliable vulnerability information from these two sources, along with significant gaps in coverage." This criticism is neither imaginative nor unexpected coming from a privately owned competitor attempting to justify its product.

In fairness to Risk Based Security, there is a known time delay in CVSS scoring, though they overstate the severity of the problem: an empirical research report finds that "there is no reason to suspect that information for severe vulnerabilities would tend to arrive later (or earlier) than information for mundane vulnerabilities."

https://www.techrepublic.com/article/software-vulnerabilities-are-becoming-more-numerous-less-understood/


Original Submission

 
  • (Score: 3, Insightful) by DannyB on Thursday February 28 2019, @05:49PM (7 children)

    by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @05:49PM (#808218) Journal

    I'm not sure why you bring up GC. I don't know of vulnerabilities caused by GC alone. GC systems are not exactly immature. Research on Java GC goes back 20+ years. Research on GC in general goes back to the 80s and probably earlier.

    Now as for problems versus vulnerabilities, yes. Higher level languages bring new problems. And taking a similar approach to what I suggested for C, those should be avoided.

    Example: Java and other languages keep getting the temptation to introduce what I'll call 'format strings'. By that I mean something like printf() formatting with %s and the like, but not just for printing: for processing strings in general. Now there's nothing wrong with that in itself. But then the formatting languages go too far, and it starts to become possible to execute code, in some sense, and affect the operation of the program. It's a bit like SQL injection: if an attacker knows to introduce the right attack into the right field that passes through some format processor, you've got a problem. Now I think the REAL problem here is that programmers should treat raw user input as untrusted and not process it through functions that might have undesirable results. But maybe there are ways to help enforce that.

    Suppose a language had separate types such as String, SqlString, HtmlString, etc. These would NOT be assignment compatible. The only way to assign a String to an HtmlString would be through a function that escapes any HTML-significant characters into entities.

    HtmlString hs = strToHtml( s );

    An HtmlString is presumably sent to a browser to be rendered inline as text.

    Now if the string 's' contained <script>insert evil js code here</script>, it would be escaped and appear on the HTML page as a visible script tag, but could not be executed by the browser.

    Similarly, you could not assign a plain string to an SqlString without going through a function that escapes or otherwise scrubs it.
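
    To make that concrete, here is a minimal sketch of how such a type could be approximated in today's Java. The names (HtmlString, fromString) are invented for illustration; fromString plays the role of the strToHtml() above.

        // A distinct HTML-string type that cannot be built from a plain
        // String except by escaping. Hypothetical sketch; Java has no
        // such built-in type.
        public final class HtmlString {
            private final String escaped;

            private HtmlString(String escaped) { this.escaped = escaped; }

            // The only public way in: escape HTML-significant characters.
            public static HtmlString fromString(String raw) {
                StringBuilder sb = new StringBuilder(raw.length());
                for (char c : raw.toCharArray()) {
                    switch (c) {
                        case '&': sb.append("&amp;"); break;
                        case '<': sb.append("&lt;"); break;
                        case '>': sb.append("&gt;"); break;
                        case '"': sb.append("&quot;"); break;
                        default:  sb.append(c);
                    }
                }
                return new HtmlString(sb.toString());
            }

            @Override public String toString() { return escaped; }
        }

        // Usage:
        //   HtmlString hs = HtmlString.fromString(s);  // ok
        //   HtmlString hs = s;   // compile error: not assignment compatible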

    This is but an example. The general idea is to try to remove opportunities to accidentally create vulnerabilities. The fact that they are easy to unintentionally create is the very reason we have so many.

    I am not saying we should get rid of C entirely, or immediately. And I understand people are attached to it. And sometimes people, including myself, don't like change. But progress brings change. If things about our systems could be made safer, this seems like progress.

    We could all go back to writing in assembler. Or even hex code. Or toggle switches on the front panel to hand enter instructions and watch the blinking lights.

    --
    People today are educated enough to repeat what they are taught but not to question what they are taught.
  • (Score: 2) by DannyB on Thursday February 28 2019, @05:53PM

    by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @05:53PM (#808219) Journal

    As an ACTUAL EXAMPLE of what I meant by the <script> tag, see my post above. Notice that your browser included the script with the evil javascript code, but didn't execute it. That's because I typed the HTML entities in myself, already escaped:

    &lt;script&gt;

    which appears on the browser as:

    <script>

    --
    People today are educated enough to repeat what they are taught but not to question what they are taught.
  • (Score: 1) by redneckmother on Thursday February 28 2019, @06:26PM (1 child)

    by redneckmother (3597) on Thursday February 28 2019, @06:26PM (#808241)

    We could all go back to writing in assembler. Or even hex code. Or toggle switches on the front panel to hand enter instructions and watch the blinking lights.

    Let's not, and say we did. Been there, done that, got the t-shirt, got it autographed, and gave it away.

    I DO miss the blinkenlights (S/360 and others), though.

    --
    Mas cerveza por favor.
    • (Score: 2) by DannyB on Thursday February 28 2019, @07:09PM

      by DannyB (5839) Subscriber Badge on Thursday February 28 2019, @07:09PM (#808277) Journal

      I never used an S/360 or 370. But I did get to briefly use an IBM 1130 with punch cards and an 029 (and sometimes an 026) keypunch, just for one semester. Then interactive CRTs on a new (but obscure) minicomputer. I first learned to write significant assembler code on that beast, as well as learning several high-level languages, ending with Pascal.

      I also wrote 8086 code in the early 1980s to do high-speed scrolling, writing, filling, etc. of characters into rectangular "windows" of the IBM PC character display. Going through the BIOS (int 10h video services), let alone DOS, was too slow.

      I say that because people think that my affinity for garbage collection and high level languages means I don't know how to do anything low level or manage memory.

      I also (ahem) did binary machine code patches to some software to bypass a nag screen, nothing to do with licensing. This one program (*cough* Microsoft Works *cough*) on Macintosh would start up into an Open/New dialog, when it could instead start in a different mode reached by just clicking Cancel. So I traced the execution and came up with a tiny patch. Everyone in the office loved it. It had nothing to do with piracy.

      Let's not, and say we did.

      So really, moving to an OS and a language like C was a true advancement at that time.

      Yet people then resist further advancements. But back in the 1970s I remember people arguing that we should never use high-level languages like FORTRAN or, heaven forbid, Pascal.

      --
      People today are educated enough to repeat what they are taught but not to question what they are taught.
  • (Score: 2) by hendrikboom on Friday March 01 2019, @10:34AM (3 children)

    by hendrikboom (1125) Subscriber Badge on Friday March 01 2019, @10:34AM (#808627) Homepage Journal

    Research on GC in general goes back to the 80s and probably earlier.

    It goes back to at least the 60's.

    • (Score: 2) by DannyB on Friday March 01 2019, @03:34PM (2 children)

      by DannyB (5839) Subscriber Badge on Friday March 01 2019, @03:34PM (#808716) Journal

      The earliest GC research I personally read about was from the 80s. I'm sure it goes further back. But once you move back to the 70s, memory was so precious and so limited that it's difficult to imagine what you could have done to make things better.

      I have a decent sized hardcover textbook on GC that I bought in the 90's. It covered amazing developments in GC that I found quite informative. From what I see in the various Java GCs, that textbook is clearly obsolete now.

      --
      People today are educated enough to repeat what they are taught but not to question what they are taught.
      • (Score: 2) by hendrikboom on Saturday March 02 2019, @12:00AM (1 child)

        by hendrikboom (1125) Subscriber Badge on Saturday March 02 2019, @12:00AM (#809005) Homepage Journal

        The original garbage collector was part of the original Lisp implementation.

        The advance I remembered from the 60's was a technique for garbage-collecting Lisp memory without needing extra storage for a stack. It reversed pointers so as to leave a trail back, then reversed them again while backing out, using the cells themselves to contain the stack. It used two marking bits in every cell, to mark what had been visited and to remember which pointers had been reversed.

        This was invented precisely because memory was so precious and limited!
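
        That technique is usually credited as the Deutsch-Schorr-Waite algorithm. A minimal sketch in Java, with a two-pointer Cell standing in for a Lisp cons (all names invented for illustration):

            // Two mark bits per cell, exactly as described: 'marked' records
            // a visit, 'flag' records which pointer is currently reversed.
            final class Cell {
                Cell left, right;   // the two pointers of a cons cell
                boolean marked;
                boolean flag;
            }

            static void schorrWaiteMark(Cell root) {
                Cell t = root;   // current cell
                Cell p = null;   // top of the trail of reversed pointers
                while (true) {
                    // Descend along left pointers, reversing each one to
                    // leave a trail back, and marking as we go.
                    while (t != null && !t.marked) {
                        t.marked = true;
                        Cell next = t.left;
                        t.left = p;        // left pointer points up the trail
                        p = t;
                        t = next;
                    }
                    // Back out of cells whose right side is finished,
                    // un-reversing the pointers as we retreat.
                    while (p != null && p.flag) {
                        Cell back = p.right;
                        p.right = t;       // restore the right pointer
                        t = p;
                        p = back;
                    }
                    if (p == null) return; // trail empty: marking complete
                    // Left side of p is done: swing the trail to the right.
                    p.flag = true;
                    Cell back = p.left;
                    p.left = t;            // restore the left pointer
                    t = p.right;           // descend into the right side
                    p.right = back;        // right pointer now holds the trail
                }
            }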

        • (Score: 2) by DannyB on Monday March 04 2019, @02:56PM

          by DannyB (5839) Subscriber Badge on Monday March 04 2019, @02:56PM (#809779) Journal

          I realize GC goes back to original Lisp. I'm sure early implementations were simple mark/sweep.

          An excellent GC hardcover textbook I purchased in the early 90s covered the technique you describe, reversing the pointers and then reversing them back. Another technique it introduced me to was for systems with virtual memory: simply keep two separate address spaces within virtual memory, Old and New. To GC, you copy objects from Old space to New space. As you visit live objects, each is copied to New space, and the object in Old space is replaced with a tagged forwarding address into New space. When you see a pointer to an object in Old space that already has a forwarding address, you simply update the pointer, since the object has already been moved. Once you've visited all of the live objects, everything live is now in New space, and Old space can be discarded or paged out; none of its pages ever need to be swapped back into RAM. Plus, New space is compacted, with all blocks consecutively allocated.
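
          What you get is a two-space copying collector; the classic non-recursive formulation is Cheney's algorithm, which uses a scan pointer chasing the allocation pointer instead of a stack. A toy sketch over a simulated integer heap, with the layout, names, and single-root simplification all invented for illustration:

              // Each object occupies 3 slots: [header, field0, field1].
              // A header >= 0 is a forwarding address into New space;
              // -1 means "not copied yet". Fields hold heap indices of
              // other objects, or -1 for null.
              final class Semispace {
                  static final int OBJ_SIZE = 3;

                  int[] oldSpace;       // from-space
                  int[] newSpace;       // to-space
                  int free = 0;         // allocation pointer in New space

                  Semispace(int[] fromSpace) {
                      oldSpace = fromSpace;
                      newSpace = new int[fromSpace.length];
                  }

                  // Copy one object unless it was already moved, in which
                  // case just follow its forwarding address.
                  int copy(int ref) {
                      if (ref < 0) return -1;                   // null
                      if (oldSpace[ref] >= 0) return oldSpace[ref];
                      int addr = free;
                      free += OBJ_SIZE;
                      newSpace[addr] = -1;                      // fresh header
                      newSpace[addr + 1] = oldSpace[ref + 1];   // fields still
                      newSpace[addr + 2] = oldSpace[ref + 2];   // point at Old
                      oldSpace[ref] = addr;     // leave forwarding address
                      return addr;
                  }

                  // Copy the root, then scan New space, copying whatever
                  // the copied objects point at and updating their fields.
                  int collect(int root) {
                      int newRoot = copy(root);
                      int scan = 0;
                      while (scan < free) {     // scan chases free
                          newSpace[scan + 1] = copy(newSpace[scan + 1]);
                          newSpace[scan + 2] = copy(newSpace[scan + 2]);
                          scan += OBJ_SIZE;
                      }
                      return newRoot;  // Old space can now be discarded
                  }
              }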

          Then there were treadmill approaches and coloring approaches, and discussion of how to make a GC run concurrently with the program (the 'mutator'). Having read about all of the various GCs available in Java, I am impressed by the amount of research that has been done in the last couple of decades. The cost of GC is now astonishingly low. Stop-the-world pauses are very short: typically just long enough to mark the root set, after which the program continues while the GC collects concurrently. GC can now run on multiple CPU threads. Things are tuned so well that full-GC operations never happen, because they would cause long pauses; the design of a modern GC seems to be to perpetually avoid ever doing a full GC. It just keeps up with the mutator.
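
          You can see how little time the collectors actually spend by asking the running JVM through the standard java.lang.management API. A minimal sketch (the class name GcInfo is just an invented example):

              import java.lang.management.GarbageCollectorMXBean;
              import java.lang.management.ManagementFactory;

              public class GcInfo {
                  public static void main(String[] args) {
                      // List the collectors this JVM is using and the
                      // cumulative time each has spent collecting.
                      for (GarbageCollectorMXBean gc :
                              ManagementFactory.getGarbageCollectorMXBeans()) {
                          System.out.println(gc.getName()
                              + ": " + gc.getCollectionCount() + " collections, "
                              + gc.getCollectionTime() + " ms total");
                      }
                  }
              }

          Run it under different collectors (e.g. -XX:+UseG1GC, or -XX:+UseZGC on JDKs that include ZGC) to compare.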

          --
          People today are educated enough to repeat what they are taught but not to question what they are taught.