
posted by cmn32480 on Saturday November 07 2015, @11:02AM   Printer-friendly
from the but-it-makes-me-look-cool dept.

The Atlantic is running an article on the friction between the computing world and Professional Engineer societies. This discussion has been going on for a long time, and is meaningful to me personally - I quit a 10-year career as a server administrator with 'engineer' in my job title when I graduated with a Mechanical Engineering degree, and have since earned my Professional Engineer license. In a world where most software comes with a disclaimer of liability for defects, where would an ethical, civic-minded programmer even practice Professional Engineering? Angry Birds probably doesn't have any responsibility to the public safety, so there's little need there; on the other hand, Google's self-driving car program is a good candidate.

I'd love to welcome the programming profession into the circle of licensed Engineers, provided that the industry manages to agree on standards of quality and accountability. I don't see the methods (such as Agile) used by programmers as a significant obstacle, either; the programming motto of "move fast and break things" (which the article wrongly decries) is echoed in the motto "fail early, fail often" that is held by many Mech Eng R&D shops. I just fear that the halting problem will be solved before any such standards become widely accepted and implemented in the industry.


Original Submission

 
  • (Score: 3, Insightful) by TheLink on Saturday November 07 2015, @06:00PM

    by TheLink (332) on Saturday November 07 2015, @06:00PM (#260038) Journal

    It would be like building a bridge, having someone drive over, and seeing if it crashes. No person who did that should call himself an engineer. The "hope and pray" method isn't good enough.

    It probably would be done that way if building and testing a bridge were as cheap as building and testing software. In fact, nowadays many things actually are done that way using simulation and models... Some even use "evolutionary design": https://en.wikipedia.org/wiki/Evolved_antenna [wikipedia.org]
    Which is like crashing stuff a LOT and picking what worked best and doing it over and over again. So how different is that really?
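    That "crash a lot, keep what worked best, repeat" loop can be sketched in a few lines of Python. This is a toy hill-climber of my own, not the actual evolved-antenna algorithm; the 10-bit TARGET is a stand-in for whatever design quality you can measure:

```python
import random

# Toy "evolutionary design" loop: generate many mutated variants
# ("crash stuff a LOT"), keep whichever scores best, and repeat.
TARGET = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]  # stand-in for an ideal design

def fitness(candidate):
    # Count positions that match the target; higher is better.
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate, rate=0.2):
    # Flip each bit with some probability -- a random design tweak.
    return [bit ^ 1 if random.random() < rate else bit for bit in candidate]

def evolve(generations=200, population=20):
    random.seed(0)  # deterministic, for the sake of the example
    best = [random.randint(0, 1) for _ in TARGET]
    for _ in range(generations):
        variants = [mutate(best) for _ in range(population)] + [best]
        best = max(variants, key=fitness)  # keep what "crashed" least
    return best

print(fitness(evolve()), "of", len(TARGET), "bits correct")
```

    Because the current best always survives into the next round, the score never goes down, so after a couple of hundred generations this reliably converges - nobody "designed" the answer, it was crash-tested into existence.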

    Which comes down to this: lots of people don't understand the real differences between Civil Engineering and Software "engineering", and thus why things are the way they are and why stuff fails. Many try to manage software projects the way Civil Engineering projects are managed, using assumptions that are only valid for Civil Engineering, or they completely misunderstand what is what.

    Civil Engineering:
    Design Phase costs about 10% of Build Phase.
    Build Phase involves tons of construction workers and heavy machinery and a fair bit of time. Increasing the speed of the Build Phase involves adding more workers and machinery, which can sometimes cut the build time by weeks or months.
    The blueprints and plastic models are way cheaper to make than the Real Thing.
    It's easier to convince Management to spend a bit extra to get the design better (not saying they won't be unhappy or that nobody will be sacked), because the budget only allows for one big Build.

    Making Software:
    Design Phase costs more than 1000 times the Build Phase.
    Build Phase involves the programmer typing "make all" (or selecting "rebuild solution") and going to read Soylent News or fetch a coffee or do some sword fighting ( https://xkcd.com/303/ [xkcd.com] ). Increasing the speed of the Build Phase involves adding more GHz, cores, SSDs or computers, which can sometimes cut the build time by seconds, minutes or hours.
    The blueprints and "plastic models" cost as much to make as the Real Thing.
    Management often sells the blueprints/plastic models as v1.0 because they compile and "kinda run" and the budget only allows for one big Design... and the customers often buy it :).
    There's no Gravity: you can do lots of weird stuff and it won't obviously fall over. And just because it looks complex doesn't always mean it is the wrong way of doing things. Sometimes a large multinational company's business processes really are that weird and complex - salary, leave, approvals, taxes and laws (of different states and countries), Unicode, RTL and East Asian language handling, time zones (including DST), etc. It's just that they and others don't realize it's that weird and complex till experienced people start giving them edge scenarios and asking them what should happen in those scenarios. And even if you're experienced, you may not capture all the edge cases, or you might wrongly assume the customer would "obviously" prefer an edge scenario to be handled some way and thus not ask them (you have to assume, otherwise there would be too many questions, many of which they would answer wrongly anyway ;) )...
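    To make one of those edge scenarios concrete, here's a minimal Python (3.9+, for zoneinfo; on some platforms the tzdata package is also needed) sketch of a classic DST trap. On 2015-03-08, US clocks sprang forward, so 02:30 simply never existed in New York - a detail no customer spec ever mentions up front:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

tz = ZoneInfo("America/New_York")
before = datetime(2015, 3, 8, 1, 30, tzinfo=tz)  # 01:30 EST, a real time
after = before + timedelta(hours=1)  # naive arithmetic says 02:30...

# ...but 02:30 local time never happened that night. Round-tripping
# through UTC shows the wall clock actually lands on 03:30 EDT.
roundtrip = after.astimezone(ZoneInfo("UTC")).astimezone(tz)
print(after.time(), "->", roundtrip.time())  # 02:30:00 -> 03:30:00
```

    So "add one hour to the appointment time" - a one-line requirement - already needs a decision nobody thought to make.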

    So it should be no surprise that those plastic models regularly fail. Complaining that the plastic models fail and saying that's because "software engineering" is not engineering isn't going to improve things. Too many people stupidly/ignorantly think the Software Design Phase is like the Civil Engineering Build Phase and assume that it is just as easy to speed things up a lot by simply adding more people and resources. Yes, you might be able to speed things up by adding people, BUT it's not as easy, and you need to add the right people (and perhaps remove some people...).

    As for "proof of correctness" and formal verification: in theory these are great. In practice, in most real-world cases they wouldn't help that much, because there's still the huge problem of capturing and describing the requirements accurately. You could have something that works 100% as per the specifications and design, but if the specifications were wrong in the first place, there's no formal verification for that.

    And from what I see, in many cases that's where the big problems are - the customer doesn't actually know what they want till you build something the way they insisted (despite your suggestions/recommendations), and then they go, "Uh, the end users don't like it. How about we try 'changing things a bit'?" and you go, "But that means you want something that's more like a bridge instead of a tower." Or they go, "That's exactly what we asked for, but after some tests we're going to stick to using Excel instead." Whereas the code deviating from the spec is often not too difficult to fix if your coders aren't complete idiots (e.g. they did understand the spec and wrote to the spec, they just made the equivalent of "typos").

    Secondly, say you really used "formal verification" stuff. Your formal verification "coder" could still make the equivalent of typos, or bigger bugs, while converting the human specification into the "formal verification" specification. So you will still have bugs despite it saying "100% correct". After all, in most cases the "compiled" code works 100% exactly as the coder wrote it in the source code. Except it's wrong :). So how much would you really gain from formal verification in practice?
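    A contrived Python sketch of that failure mode (plain assertions standing in for a real proof tool; the scenario and the "spec typo" are invented for illustration): the human requirement was the three *highest* scores, but whoever formalized the spec wrote "lowest", so the code verifies perfectly and is still wrong:

```python
# The human requirement: "report the three HIGHEST scores".
# The formalized spec below contains the equivalent of a typo:
# it demands the three LOWEST scores instead.

def spec_holds(scores, result):
    # The (wrong) formal spec: result is the three smallest scores, sorted.
    return result == sorted(scores)[:3]

def top_three(scores):
    # Implementation written faithfully to the formal spec.
    return sorted(scores)[:3]

scores = [70, 95, 88, 60, 91]
result = top_three(scores)
assert spec_holds(scores, result)  # "100% verified" against the spec...
print(result)  # [60, 70, 88] -- but the customer wanted [95, 91, 88]
```

    Every assertion passes; the defect lives entirely in the spec, and no verifier can check a spec against what's in the customer's head.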

    Can you give me "formal proof" or statistics showing that the average real-world coder will make fewer mistakes when converting a complex real-world spec into a formal verification specification? Or is the improvement in quality actually due to "selection bias", where only the higher-quality talent does that formal verification sort of thing, and that higher-end bunch would make about as few mistakes anyway even if they wrote it in Python instead of using formal verification methods?

    If you really want to prove that formal verification works, try outsourcing the conversion of a few complex human specs into formal verification specifications and the resulting programs to the cheapest bidder and an average-cost bidder. Then compare the results with outsourcing the conversion of those same human specs into programs written in Java or C# or some other higher-level programming language. Compare the time and $$$ taken to produce 95% and 100% conformance to the human specs as per your interpretation. If you get significantly better results with the formal verification methods, then I will believe formal verification really works. Otherwise it could be as I said - it just appears to work because you are using higher-skilled people :).

    But be aware I have no related certifications in such stuff, nor much real world experience in such stuff either. I think I made some good and valid points though ;).

  • (Score: 2) by TheLink on Saturday November 07 2015, @06:25PM

    by TheLink (332) on Saturday November 07 2015, @06:25PM (#260041) Journal

    On a less serious note- I bet in too many cases the "requirements gathering" ends up a bit like this: https://www.youtube.com/watch?v=UeMhjhGe4VA [youtube.com]

    :p