O'Reilly and Software Improvement Group conducted a survey about secure coding: https://www.oreilly.com/ideas/the-alarming-state-of-secure-coding-neglect
Much of it is as expected but I stumbled upon this tidbit:
"[Static analysis] was reported as being used by 25% of respondents. One-third of those who didn't use it said it was too expensive. The rest of the non-users were fairly evenly divided among other explanations: tools were not available for their technology, were too hard to use, had too many false positives, or were not usable in Agile development."
When developing I have almost always used compiler warnings (gcc/acc/icc/cxx/clang) and dedicated tools (cppcheck/flexelint/coverity-scan/pvs-studio/clang-analyze), so the above snippet depressed me, because catching errors sooner rather than later makes them much cheaper to fix. Static analysis tools can require a lot of configuration, can be expensive and time-consuming, and I suppose that for some languages such tools don't even exist. But the part about static analysis tools not fitting a development process struck me as downright odd.
What is your take on this? Why aren't you using static analysis (and if you do: which one and for what?)
(Score: 0) by Anonymous Coward on Wednesday May 10 2017, @04:38AM (11 children)
I compile my code with -Wall. Does that count?
By the way, I went to the linked article, and "static analysis" is never defined there.
(Score: 2) by The Mighty Buzzard on Wednesday May 10 2017, @05:09AM
Yup, rehash (the fork formerly known as slashcode) here runs with use strict and use warnings except in a very, very few places where we do something funky but legitimate. That's about all you can ask for with this codebase, though. I think a proper static analysis tool would straight up shit out its nostrils when presented with some of the stuff we inherited.
As for Rust, I haven't really found a static analysis tool yet but I leave all the warnings turned on except for the ones about snake case because my Rust skills aren't shiny enough to make me think I know what I'm doing better than the compiler yet.
My rights don't end where your fear begins.
(Score: 0) by Anonymous Coward on Wednesday May 10 2017, @05:21AM (8 children)
It's a good start. Look into -Wextra next; it may be too much for you, since it produces a lot of false positives for me. You can always pick and choose individual types of warnings from the -Wextra set.
(Score: 2) by coolgopher on Wednesday May 10 2017, @06:02AM (7 children)
Whenever I set up a new C/C++ project, I tend to go through the latest info page for gcc/clang (or whichever other compiler I'm unfortunate enough to have to use) and enable every warning unless I explicitly know I don't want it. In most cases I still want a given warning enabled globally, and only disable it temporarily within a compilation unit on an as-needed basis. The compiler is your best friend, listen to it! :)
(Score: 2) by TheRaven on Wednesday May 10 2017, @09:18AM (4 children)
sudo mod me up
(Score: 3, Informative) by mth on Wednesday May 10 2017, @10:22AM (2 children)
-Wextra does not enable every warning. The GCC man page reads:
Note that some warning flags are not implied by -Wall. Some of them warn about constructions that users generally do not consider questionable, but which occasionally you might wish to check for; others warn about constructions that are necessary or hard to avoid in some cases, and there is no simple way to modify the code to suppress the warning. Some of them are enabled by -Wextra but many of them must be enabled individually.
It does enable a useful set of warnings, in my opinion, so "-Wall -Wextra" is a good starting point for most projects.
(Score: 2) by TheRaven on Wednesday May 10 2017, @11:28AM (1 child)
sudo mod me up
(Score: 0) by Anonymous Coward on Wednesday May 10 2017, @05:05PM
That is clang-only, and it enables all warnings and more. In fact, it enables so much that the original idea was that it would only really be useful in the development of clang itself, i.e. when they run clang against their own test suite.
(Score: 2) by coolgopher on Wednesday May 10 2017, @12:23PM
Nope, it does not. It enables *some* extra warnings, but not all. See https://gcc.gnu.org/onlinedocs/gcc/Warning-Options.html [gnu.org] for details. Things like -Wundef, -Wshadow, -Wfloat-equal and -Wpointer-arith need to be explicitly enabled.
(Score: 2) by Wootery on Wednesday May 10 2017, @12:17PM (1 child)
only disable it temporarily within a compilation unit on an as-needed basis
Or, if you don't mind a more intrusive approach, stop the false positives by putting #pragmas around the code where you really do want to do something funky. [stackoverflow.com]
(Score: 2) by coolgopher on Thursday May 11 2017, @04:02AM
That is precisely what I mean when I say "within a compilation unit" :)
(Otherwise I would have said "for a particular compilation unit")
GCC sure took its sweet time getting push/pop support for warnings, but these days it's fully functional, thankfully.
(Score: 2) by mth on Wednesday May 10 2017, @10:36AM
I don't know if this is the "official" definition, but static analysis is anything that examines code and reports about it without running it. It could be a tool that flags coding style violations, copy-pasted code, use of error-prone API calls, known dangerous language constructs etc. And indeed compiler warnings are also a form of static analysis.
(Score: 0) by Anonymous Coward on Wednesday May 10 2017, @05:03AM (1 child)
Math Professor: No, I don't grade on a curve... It's too complicated. [sciencenet.cn]
(Score: 0) by Anonymous Coward on Wednesday May 10 2017, @12:21PM
Well, I have to agree with the professor. In a curve, the street needs your full attention; doing grading there is a bad idea. Restrict grading to the straight parts of your driving. ;-)
(Score: 0) by Anonymous Coward on Wednesday May 10 2017, @05:25AM (5 children)
Does anyone have any good free open source static analysis tools for C? I've used static analysis at work for embedded software, but the packages we used were thousands of dollars, and very proprietary.
(Score: 2) by TheRaven on Wednesday May 10 2017, @09:23AM
The clang static analyser is pretty good and is actively developed by a number of companies. The one big problem with it is the lack of a way of consistently tracking reports as the code evolves (Coverity is much better for that, and is free to use for open source projects).
The real problem with any static analysis tool is that it's very hard to adopt late in the development process, because you'll likely have hundreds or thousands of warnings to go through and classify. This even applies to new analyses added to existing analysers.
I am not surprised by these results at all. I was talking to someone from Google's Android security team a year or so ago and they were very proud of adding support for _FORTIFY_SOURCE to Android. I was surprised, because when I'd looked at adding that to FreeBSD, it had seemed not worth the effort because I hadn't been able to come up with a bug that _FORTIFY_SOURCE would catch at run time and the clang static analyser wouldn't catch at compile time, so I asked if they had some examples. His reply was that they didn't use static analysis at all in their development process. For something security critical, that's completely unacceptable (in contrast, Apple runs it on everything and writes new clang analysers whenever they find a new category of bug in their code).
sudo mod me up
(Score: 1) by isj on Wednesday May 10 2017, @05:07PM
I know of:
Open source: clang-analyzer, Cppcheck
Free: the two above, plus Coverity Scan (if your source code is open source and hosted on GitHub etc.)
Good: YMMV. It depends on how much time you are willing to invest, what shape the code is currently in, and which types of errors you are looking for.
(Score: 2) by bzipitidoo on Wednesday May 10 2017, @05:30PM
I made a simple C/C++ parser that checks for things such as balanced parentheses and gives some statistics about the code. It doesn't handle #ifdef and relatives, so it is easily fooled by abuses such as 'for (;;i++) { #ifdef DEBUG printf("%d ",i) } #else } #endif', which has one opening brace and two closing braces. It found an error in the Firefox source code that the compiler missed, because the compiler doesn't even look at the code those preprocessing directives tell it to skip over.
If anyone is interested, I plan to release it in a month. Yes, it'll be free.
(Score: 1) by isj on Wednesday May 10 2017, @10:19PM (1 child)
If cost is the issue, then consider PC-lint from Gimpel. It is relatively cheap and can be configured in many ways. But it does take time to configure for your environment, and the latest release doesn't support C++11, though the upcoming version (which I'm beta-testing) does.
(Score: 2) by hendrikboom on Wednesday May 10 2017, @11:34PM
I used that on the Amiga long ago. It was useful. It has been around for a while.
(Score: 2) by bziman on Wednesday May 10 2017, @05:39AM (4 children)
I've been a professional software engineer for twenty years, all of it with Java. For the past few years I've had access to static analysis tools, and they have definitely made my code better; they have probably made me a better engineer, too. Why not use the available tools? Even my colleagues who prefer vi to a modern IDE still benefit from static analysis, and our code is better for it.
(Score: 4, Insightful) by Anonymous Coward on Wednesday May 10 2017, @05:45AM
> I've been a professional software engineer for twenty years, all of it with Java,
Crap. You just made me feel really old.
(Score: 0) by Anonymous Coward on Wednesday May 10 2017, @07:00AM
Professional means you get paid to do it, right?
(Score: 2) by Nerdfest on Wednesday May 10 2017, @10:04AM
Likewise. There are some good tools for JavaScript as well. Tools like Sonar, which run a set of static analysis tools as part of the build process and track the results, work very well too: you can see the improvement (or degradation) of your code base over time. Sonar works with a series of plugins that let you analyze potential bugs, style, test coverage, complexity, etc. I saw one clever plugin that presents all this as technical debt in monetary terms. Not likely very accurate, but a good way to give "management" an idea of the cost of poor code.
(Score: 1) by isj on Wednesday May 10 2017, @01:09PM
That has also been my experience. My code became better and clearer after I started using a dedicated tool.
I think it is a combination of learning which code constructs are dubious, and knowing that the tool will point them out to me (so I might as well not write them in the first place).
(Score: 2) by mth on Wednesday May 10 2017, @11:06AM
Back when I did a lot of Java work, PMD [github.io] was a tool I used a lot. It's open source, found about as many issues as commercial tools did, didn't flag too many false positives, and was easy to include in automated builds. And you can even specify your own patterns, if you have project-specific issues you want to scan for.
Bundled with PMD, but a separate tool, is CPD, the copy-paste detector. It doesn't only find literal copy-pasted code, but also code that is mostly the same with small changes; you can configure how fuzzy the matching should be. This tool is particularly useful if you've inherited a code base and are trying to improve it. It supports many other languages besides Java.
For Python, I use Pylint [pylint.org]. Since Python is a very dynamic language (no static typing or static anything, really), static analysis is difficult, but Pylint does a reasonable job at finding issues beyond style violations. It does require configuration and annotations (special formatted comments) to make it useful. Besides a full project check, you can also run it on a single module at a time, which is useful if you want to do a quick sanity check of new code before you start a test run.
For C/C++, enabling more than the default compiler warnings and compiling with multiple compilers will catch a lot of issues. I've also used clang-analyze and cppcheck, and while they are useful additions, they didn't catch as many issues as I hoped they would.
(Score: 2, Informative) by Andrey_Karpov on Wednesday May 10 2017, @02:45PM (5 children)
I am one of the developers of the PVS-Studio analyzer. I am not going to argue here, but I would like to note one thing.
We now have 11,000 errors in our "bug base", found in open source projects. I mean real errors, not just warnings issued by the analyzer. You may have a look at all of them: https://www.viva64.com/en/examples/ [viva64.com]
These errors were found as a byproduct of writing our articles. We have never had the goal of finding as many errors as possible; still, we found these 11,000 errors, which is quite a result.
Here is my point: if the compilers were so great, and the analyzers so awful, we wouldn't have been able to fill the base with such a number of errors. So there is definitely some use in static analysis. :)
(Score: 1, Interesting) by Anonymous Coward on Wednesday May 10 2017, @05:21PM (3 children)
I think most people don't realize these are different jobs. Static analyzers are specifically designed to find errors or maybe-errors; compilers are designed to turn one language into another. Why should my compiler complain about the construct 'if (A || B || A)', or when the then and else clauses do the same thing? Both are examples of perfectly valid code. While it is true that both examples stink to high heaven, they aren't necessarily bad. However, errors like "comparison between signed and unsigned integer expressions" or "'InsertNameHere' undeclared (first use in this function)" can, and should, be caught by the compiler. In fact, I firmly believe that some errors shouldn't be detected by static analysis at all and should only be spit out by the compiler.
(Score: 2) by hendrikboom on Wednesday May 10 2017, @11:39PM (1 child)
Said code that stinks to high heaven could very well be generated by some kind of automated code synthesis.
(Score: 2) by Pino P on Thursday May 11 2017, @11:58AM
Then the source code is the input to the code synthesis, not its output.
(Score: 0) by Anonymous Coward on Thursday May 11 2017, @06:32AM
I firmly believe that some errors shouldn't be detected by static analysis at all and only be spit out by the compiler.
I firmly believe there are also interpreted languages!
/snipe
(Score: 0) by Anonymous Coward on Thursday May 11 2017, @02:38AM
I like the combination of static and dynamic analyzers. Dynamic ones are pretty good at finding memory/thread/functional errors; static ones are pretty good at finding poor patterns. Both have their place, and BOTH should be used.
I can count on one hand the number of false positives I have actually found over the years with these sorts of tools. Most of the time they are right; you just have to dig into it. They usually only go sideways when you obscure the creation/destruction of memory in some way, which is in itself an anti-pattern, and many of the errors on your site are exactly that.
The tool is telling you something. You just have to listen, and be willing to trust that you are not the hotshot you think you are.