
posted by martyb on Monday July 01 2019, @07:32PM
from the You-Thought-Your-Builders-Were-Bad dept.

Stories about seriously mangled public infrastructure projects keep coming up, and even the allegedly super-efficient Germans are not exempt. But what would you expect when you change and expand a project while it is being built, and when you hand the work to smaller contractors with no track record on projects of this scale? The BBC has the story:

As a structure, it looks impressive enough.

Until you pause, look around you, and absorb the silence. This is Berlin Brandenburg or BER, the new, state-of-the-art international airport built to mark reunified Germany's re-emergence as a global destination.

It is a bold new structure, costing billions, and was supposed to be completed in 2012.

But it has never opened.

BER has become for Germany not a new source of pride but a symbol of engineering catastrophe. It's what top global infrastructure expert Bent Flyvbjerg calls a "national trauma" and an ideal way "to learn how not to do things".

[...] Martin Delius, a former Berlin city politician who later headed an extensive inquiry into what went wrong, says those in charge decided "to give 30 to 40 contracts to smaller companies which they thought they could pressurise into giving them lower prices".

"They built a very complex controlling system which didn't work," he says.

Most disruptive of all were decisions to change the size and content of the new airport - while it was being built.

[...] New construction boss Hartmut Mehdorn made a list of all the faults and failures, Mr Delius tells me.

"Small ones like the wrong light bulbs to big ones like all the cables are wrong," he says.

The final total was 550,000 - more than half a million problems to fix.

Maybe that builder who left a big hole in your dining room wall for a couple of weeks wasn't so bad after all? At least it wasn't still there seven years later, was it?


Original Submission

 
  • (Score: 2) by goodie (1877) on Tuesday July 02 2019, @06:47PM (#862484) Journal

    That's the thing that is so fascinating to me. Software is now really the product of a series of uncoordinated moves by heterogeneous groups of actors, each working on their own thing and using one another's artifacts to build something of their own. And the process repeats itself over and over...

    Where it gets ugly is when you realize that we have little to no visibility into what we depend on. When the Heartbleed vulnerability was discovered, everybody freaked out because they suddenly realized that one of the most critical pieces of software they depend on was maintained by two (I think?) overworked people. Yet so many products depend on it.
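
    To the visibility point: even just enumerating what is installed and what each package declares as a requirement is a useful start. A minimal sketch, assuming a Python environment and using only the standard library (nothing here is specific to any particular project):

        # List every installed distribution and the requirements it declares.
        # Uses only importlib.metadata from the standard library (Python 3.8+).
        from importlib.metadata import distributions

        for dist in sorted(distributions(), key=lambda d: (d.metadata["Name"] or "").lower()):
            print(f"{dist.metadata['Name']} {dist.version}")
            for req in dist.requires or []:  # requires can be None
                print(f"    -> {req}")

    Even that flat listing tends to surprise people; the full transitive picture is worse.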

    Same with refactoring. If we use measures like structural complexity, we can reduce them through refactoring. But if the refactoring means pulling in other frameworks and dependencies, our own code might end up with low structural complexity while the code we depend on is crap, and we won't see that, because 99% of the time we only ever look at the packaged version.

    Same principle with microservices: each one has low complexity and limited scope, but pieced together they can create a huge mess of interdependence. Why do you have a dependency on an XML parser? Because the devs of a dependency four levels down never removed it when they moved to JSON. Or something like that (see the toy sketch below).
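
    Here is that last point as a toy sketch with made-up package names: nothing at the top touches XML anymore, yet an XML parser still lands in the install set because one package four levels down never dropped it.

        # Toy dependency graph (all names are invented) plus a breadth-first
        # walk that lists everything reachable from the top-level app, with depth.
        from collections import deque

        DEPS = {
            "our-app":           ["web-framework", "http-client"],
            "web-framework":     ["template-engine", "router"],
            "http-client":       ["connection-pool"],
            "template-engine":   ["markup-utils"],
            "router":            [],
            "connection-pool":   [],
            "markup-utils":      ["legacy-xml-parser"],  # kept around after a move to JSON
            "legacy-xml-parser": [],
        }

        def transitive_deps(root):
            """Return every package reachable from root, mapped to its depth."""
            seen = {root: 0}
            queue = deque([root])
            while queue:
                pkg = queue.popleft()
                for dep in DEPS.get(pkg, []):
                    if dep not in seen:
                        seen[dep] = seen[pkg] + 1
                        queue.append(dep)
            return seen

        for pkg, depth in sorted(transitive_deps("our-app").items(), key=lambda kv: kv[1]):
            print("  " * depth + pkg)

    Running it prints legacy-xml-parser at depth 4, which is exactly the kind of thing nobody notices until it shows up in a security advisory.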
