
posted by CoolHand on Friday May 08 2015, @11:26PM   Printer-friendly
from the coding-for-dollars dept.

Andy Hunt - one of the originators of the Agile Manifesto, and NOT your humble submitter - has concluded that Agile has lost its way:

in the 14 years since then, we've lost our way. The word "agile" has become sloganized; meaningless at best, jingoist at worst. We have large swaths of people doing "flaccid agile," a half-hearted attempt at following a few select software development practices, poorly. We have scads of vocal agile zealots—as per the definition that a zealot is one who redoubles their effort after they've forgotten their aim.

And worst of all, agile methods themselves have not been agile. Now there's an irony for you.

How did we get into this mess?

The basis of an agile approach is to embrace change; to be aware of changes to the product under development, the needs and wishes of the users, the environment, the competition, the market, the technology; all of these can be volatile fountains of change. To embrace the flood of changes, agile methods advise us to "inspect and adapt." That is, to figure out what's changed and adapt to it by changing our methods, refactoring our code, collaborating with our customers, and so on. But most agile adopters simply can't do that, for a very good reason. When you are first learning a new skill—a new programming language, or a new technique, or a new development method—you do not yet have the experience, mental models, or ability to handle an abstract concept such as "inspect and adapt." Those abilities are only present in practitioners with much more experience, at higher skill levels.

Andy also has some thoughts on how to correct this - starting with the idea that Agile methodologies must be applied to Agile methodologies, to allow them to adapt to changing needs.

  • (Score: 1) by anubi (2828) on Sunday May 10 2015, @03:51AM (#180966) Journal

    I agree wholeheartedly with the example you posted. Things were just starting up, and all sorts of machines hit the market. A lot of that needed pruning in the worst way. We should have learned back then that having variants is not a good idea. Can you imagine the confusion in the plumbing industry if everyone went around designing pipes and appliances with no standardized sizes or threadings?

    I was around in those days too, and had considered the non-standard computers "rubbish" and refused to design them into anything. You even mentioned my nemesis of the day... Compaq. I considered Compaq more a toy and fashion statement than an industrial machine. Some executives liked them. I did not like them because nothing inside was standard. Compaq was a big enough company that they could have parts made specially for them. The product marketers would call it "product differentiation", but as far as I was concerned, a Compaq was a "special needs" machine.

    As far as I was concerned, IBM/Microsoft had set the standard with the IO.SYS, MSDOS.SYS, COMMAND.COM protocol, which I was fully expecting them to combine into the BIOS ( I believe there were two main BIOS vendors of the day, Award and Phoenix ). Just as we have one common language spoken in this country, and one currency in use, I fully expected things to settle down to one common baseline architecture all machines would use, and machines speaking nonstandard protocols would be about as acceptable as someone trying to tender Jamaican currency in an American store.

    That is why I would much more likely design in a "no-name" machine than a name-brand machine. The no-names were usually built of standards-compliant components, whereas the name-brand machine often had special non-compliant hardware ( especially BIOS, floppy drives, and power supplies ) in it... parts which would become damn near impossible to find in, say, ten years. I remember seeing all these new machines come out with all these "arty" cases ( which looked like someone left them out in the sun too long ) and wondering just how I would replace the CDROM in the thing. There wasn't much in the line of reusable parts even if the store gave one to me. Possibly I could re-use the fan in the power supply.... As far as I was concerned, they were junk before even being unboxed.

    What I do remember is IBM setting the AT standards with their "AT Technical Reference" binder, where they detailed every aspect of the electronic design and software. Yes, they even printed the SOURCE CODE!!! Things were off to a really good start. Things went downhill from there, accelerated not only by the greed of special interests trying to corner the market, but also by a Congress that would back them up.

    Here's hoping that the Open Source community pulls together and builds a good foundation. But I also fear the laws already passed by Congress will just be used as legal bludgeons to keep any meaningful free enterprise competition at bay. And good luck trying to pry people away from eye candy.

    --
    "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]