
posted by CoolHand on Friday May 08 2015, @11:26PM
from the coding-for-dollars dept.

Andy Hunt - one of the originators of the Agile Manifesto, and NOT your humble submitter - has concluded that Agile has lost its way:

in the 14 years since then, we've lost our way. The word “agile” has become sloganized; meaningless at best, jingoist at worst. We have large swaths of people doing “flaccid agile,” a half-hearted attempt at following a few select software development practices, poorly. We have scads of vocal agile zealots—as per the definition that a zealot is one who redoubles their effort after they've forgotten their aim.

And worst of all, agile methods themselves have not been agile. Now there's an irony for you.

How did we get into this mess?

The basis of an agile approach is to embrace change; to be aware of changes to the product under development, the needs and wishes of the users, the environment, the competition, the market, the technology; all of these can be volatile fountains of change. To embrace the flood of changes, agile methods advise us to “inspect and adapt.” That is, to figure out what's changed and adapt to it by changing our methods, refactoring our code, collaborating with our customers, and so on. But most agile adopters simply can't do that, for a very good reason. When you are first learning a new skill—a new programming language, or a new technique, or a new development method—you do not yet have the experience, mental models, or ability to handle an abstract concept such as “inspect and adapt.” Those abilities are only present in practitioners with much more experience, at higher skill levels.

Andy also has some thoughts on how to correct this - starting with the idea that Agile methodologies must be applied to Agile methodologies, to allow them to adapt to changing needs.

 
  • (Score: 3, Touché) by anubi (2828) on Saturday May 09 2015, @03:01AM (#180610) Journal

    Back in the DOS days, if one wanted to program and just have it work, he used the standard BIOS calls.

    Everything had standards. Keyboard, mouse, display, hard disk drive, floppy, serial, and parallel ports. Even the modems had a special code that everyone used.
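    For flavor, here is a minimal sketch of what that looked like, in the Turbo C++ dialect of the day: character output through the standard video BIOS service INT 10h, with no vendor driver anywhere. (int86() and union REGS come from the Borland runtime's dos.h; the function numbers are the standard IBM BIOS interface.)

        /* Print a string through the standard video BIOS (INT 10h). */
        #include <dos.h>             /* int86(), union REGS (Borland/Turbo runtime) */

        void bios_putc(char c)
        {
            union REGS r;
            r.h.ah = 0x0E;           /* function 0Eh: teletype output */
            r.h.al = c;              /* character to print */
            r.h.bh = 0;              /* display page 0 */
            int86(0x10, &r, &r);     /* invoke the video BIOS directly */
        }

        int main(void)
        {
            const char *msg = "Hello from the BIOS\r\n";
            while (*msg)
                bios_putc(*msg++);
            return 0;
        }

    Because every reasonably PC-compatible BIOS honored that same interrupt interface, the binary ran unchanged from clone to clone.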

    Then the idea of "drivers" and "proprietary technologies" hit the scene, and a lot of people started writing code that required a driver to interface to the hardware.

    This started at the end of DOS, beginning of Windows.

    All this complexity to try to keep the customer from simply copying code from one machine to another.

    Robustness and resilience were sacrificed on the altar of "It won't run until I give you permission."

    So we have a lot of stuff these days that people no longer know how to get permission to run.

    The permission-giver is apt to change his format, which will then mandate you doing whatever he tells you to do if you ever want to see your data again.

    It's not like the old days, which were more like programming an Arduino.

    It's not too late to ditch all this stuff, but it means we have to go open source, where no one can claim to be a permission-giver. It also means we have to re-institute a set of standards for hardware so that no special driver is needed. This will have to be embraced at the manufacturing level, and manufacturers will encounter substantial resistance from those who now "have it made" with a business model of granting permission.

    Of course, for some business applications where a system does not need to last very long, proprietary techniques could still have a place; but then only the businesses that chose that route would have their systems regularly yanked out from under them and be absolutely dependent on the permission-giver.

    ----------------

    I will share my latest runaround with my machine, which had a software failure due to lack of robustness. I was trying to load some Gabotronics oscilloscope software. It would not run. Error 0xc000007b. I had no idea what that was. YouTube seemed to offer a variety of cures, from running chkdsk to sprinkling holy water on it. (OK, I jest, but you get the idea.) I pulled up a dependency walker and had it step through the program. It found I had two files, msvcp100.dll and msvcr100.dll, that were 64-bit (my machine is 64-bit) instead of the 32-bit versions of the same name that the Gabotronics software was looking for.

    They are Microsoft Visual C++ runtime files.
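
    For anyone who wants to reproduce that check without a dependency-walking tool, here is a minimal sketch in portable C++ (nothing specific to the Gabotronics software) that reads the Machine field from a file's PE header. The offsets are standard PE layout; the code assumes a little-endian host, which any x86 box is.

        // Report whether a Windows DLL/EXE is 32-bit or 64-bit by reading
        // the Machine field of its PE (COFF) header.
        #include <cstdint>
        #include <cstdio>

        int main(int argc, char **argv)
        {
            if (argc < 2) {
                std::fprintf(stderr, "usage: %s file.dll\n", argv[0]);
                return 1;
            }
            std::FILE *f = std::fopen(argv[1], "rb");
            if (!f) { std::perror("fopen"); return 1; }

            std::uint32_t pe_off = 0;                 // offset of the "PE\0\0" signature,
            std::fseek(f, 0x3C, SEEK_SET);            // stored at 0x3C in the MZ header
            std::fread(&pe_off, sizeof pe_off, 1, f);

            std::uint16_t machine = 0;                // COFF Machine field follows the
            std::fseek(f, pe_off + 4, SEEK_SET);      // 4-byte PE signature
            std::fread(&machine, sizeof machine, 1, f);
            std::fclose(f);

            if (machine == 0x014C)      std::puts("32-bit (x86)");
            else if (machine == 0x8664) std::puts("64-bit (x64)");
            else std::printf("other machine type 0x%04X\n", machine);
            return 0;
        }

    Run it against the msvcp100.dll and msvcr100.dll next to the executable and the mismatch shows up immediately.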

    I know about "function overloading". All one has to do when releasing a more powerful version of a library function is define new overloaded functions that take the wider arguments. So why in the hell did I have to do all this research? The 64-bit library should have honored the 32-bit call and handled it just like the 32-bit library did. Now, I can see where running a more advanced library on an antique machine will cause problems; there is no hardware to support the software. But the other way around should have been a piece of cake.
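
    To make the overloading idea concrete, here is a minimal sketch: a newer build of a library adds a wider overload next to the old entry point, and old call sites keep resolving to the old one. (Worth noting, though: this only works within one architecture. Windows will not load a 64-bit DLL into a 32-bit process at all, and error 0xc000007b, STATUS_INVALID_IMAGE_FORMAT, typically signals exactly that mismatch; no amount of overloading can bridge it.)

        // Sketch: a wider overload added beside the original entry point.
        #include <cstdint>
        #include <iostream>

        std::int32_t scale(std::int32_t x) { return x * 2; }  // original 32-bit API
        std::int64_t scale(std::int64_t x) { return x * 2; }  // newer, wider overload

        int main()
        {
            std::int32_t old_arg = 21;
            std::int64_t new_arg = 3000000000LL;  // would not fit in 32 bits

            std::cout << scale(old_arg) << "\n";  // old callers still hit the old overload
            std::cout << scale(new_arg) << "\n";  // new callers get the wide one
            return 0;
        }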

    In order to fix this problem, I searched my own machine for other copies of those two DLLs, looking in applications I knew were 32-bit. I found a couple, copied them into the Gabotronics directory, and now everything runs. The point is that I should not have had to do that.

    Why in all blue blazes would software care if it hit a 64-bit library, if the library was still accepting and returning data in the deprecated format?

    This is the kind of programming monkey business I expect from beginner-level coders, not an experienced corporation. (Gabotronics, you are off the hook on this one; I am glaring straight at the Big M for making confusing files.)

    --
    "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
  • (Score: 2) by Geotti (1146) on Saturday May 09 2015, @03:10AM (#180612) Journal

    Gabotronics, you are off the hook with this one

    Shouldn't they be distributing those DLLs alongside the executables these days, though?

    • (Score: 1) by anubi (2828) on Saturday May 09 2015, @03:43AM (#180623) Journal

      I would sure think so. I have heard this business of pairing a specific library to a specific program called "DLL HELL".

      In all fairness, they did require me to install the Microsoft .NET redistributables, and I probably chose the wrong one during installation. I have a 64-bit machine, and the installer asked me which one I wanted loaded. My guess is that all my problems arose from choosing what looked like a 64-bit program to run on a 64-bit machine.

      Ten years from now it may be damned near impossible to get all the pairings correct for a program to run. I am pretty sure the old DOS machine will run until its power supply capacitors give out.

      --
      "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
  • (Score: 0) by Anonymous Coward on Saturday May 09 2015, @04:14AM (#180632)

    Back in the DOS days, if one wanted to program and just have it work, he used the standard BIOS calls.

    What a load of rubbish. Sorry to be mean, but I was there. Each company had its own set of BIOS calls and special interrupt lines (so a Compaq might not act exactly like an Acer). Even from version to version of DOS those calls could change: DR DOS != MS-DOS 3.3 != MS-DOS 6.1. Then you could have TSRs load up and change the way something worked. Some OEMs went as far as to actually change the way DOS worked. Then you had different flavors of OVL files, and at least 10 different ways to control memory above 1 meg. Six different types of sound card, each with its own special command set. Video cards that worked in VGA mode up to 640x480, where everything above that was 'special'.

    It was much the same on pretty much all the platforms: Amiga, Atari, etc. Things were 'mostly standard' but would change in subtle ways from one version of the OS to the next, or from one board to the next.

    Then Windows 3.1/OS/2/NT 3.5 came along. They changed everything, like you said, but it was merely an extension of what went before: proprietary goop. Some of the pain went away, but instead of hardware at least pretending to be somewhat compatible (it usually was almost compatible), you just programmed to the API. In some ways better; in many ways, as you noticed, worse.

    You are seeing many of the same things in the ARM area, where everyone is trying to be 'the standard'. The thing is, if you actually pull off being the standard, you may no longer be relevant. No one buys a Hayes modem anymore, but everyone still uses their command set. If you use a cell phone, you probably use it every day and don't even know it.
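
    For anyone who never saw it, the Hayes dialogue is just text over a serial line: commands in, result codes back. A minimal sketch, with a hypothetical POSIX device path and skipping the termios setup a real program would need:

        // Speak a little Hayes AT over a serial port.
        #include <cstdio>

        int main()
        {
            // Hypothetical port; a cellular modem exposes the same command set.
            std::FILE *modem = std::fopen("/dev/ttyUSB0", "r+");
            if (!modem) { std::perror("fopen"); return 1; }

            std::fputs("AT\r", modem);           // attention: anyone there? expect "OK"
            std::fputs("ATDT5551234\r", modem);  // dial 555-1234 with tone dialing
            std::fflush(modem);

            char reply[64];
            if (std::fgets(reply, sizeof reply, modem))
                std::printf("modem said: %s", reply);

            std::fclose(modem);
            return 0;
        }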

    • (Score: 1) by anubi (2828) on Sunday May 10 2015, @03:51AM (#180966) Journal

      I agree wholeheartedly with the examples you posted. Things were just starting up, and several things hit the market. A lot of that needed pruning in the worst way. We should have learned back then that having variants is not a good idea. Can you imagine the confusion in the plumbing industry if everyone went around designing pipes and appliances with no standardized sizes and threading?

      I was there in those days too, and I considered the non-standard computers "rubbish" and refused to design them into anything. You even mentioned my nemesis of the day: Compaq. I considered Compaq more a toy and fashion statement than an industrial machine. Some executives liked them. I did not, because nothing inside was standard. Compaq was a big enough company that they would have parts made specially for them. The product marketers would say "product differentiation," but as far as I was concerned, a Compaq was a "special needs" machine.

      As far as I was concerned, IBM/Microsoft had set the standard with the IO.SYS, MSDOS.SYS, COMMAND.COM protocol, which I fully expected them to fold into the BIOS (I believe there were two main BIOSes of the day, Award and Phoenix). Just as we have one common language spoken in this country and one currency in use, I fully expected things to settle down to one common baseline architecture that all machines would use, with machines speaking nonstandard protocols about as acceptable as someone trying to tender Jamaican currency in an American store.

      That is why I was much more likely to design in a "no-name" machine than a name-brand one. The no-names were usually built of standards-compliant components, whereas the name-brand machines often had special non-compliant hardware in them (especially the BIOS, floppy drives, and power supplies): parts which would become damn near impossible to find in, say, ten years. I remember seeing all these new machines come out with "arty" cases (which looked like someone had left them out in the sun too long) and wondering just how I would replace the CD-ROM in the thing. There wasn't much in the line of reusable parts even if the store gave them to me; possibly I could re-use the fan in the power supply. As far as I was concerned, they were junk before they were even unboxed.

      What I do remember is IBM setting the AT standard with their "AT Technical Reference" binder, where they detailed every aspect of the electronic design and the software. Yes, they even printed the SOURCE CODE!!! Things were off to a really good start. They went downhill from there, accelerated not only by the greed of special interests trying to corner the market, but also by a Congress that would back them up.

      Here's hoping that the Open Source community pulls together and builds a good foundation. But I also fear the laws already passed by Congress will just be used as legal bludgeons to keep any meaningful free enterprise competition at bay. And good luck trying to pry people away from eye candy.

      --
      "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]