BBC reports that the co-pilot of the Germanwings flight that crashed in the Alps locked the pilot out of the cockpit and intentionally initiated the aircraft's descent into the ground:
The co-pilot of the Germanwings flight that crashed in the French Alps, named as Andreas Lubitz, appeared to want to "destroy the plane", officials said.
Marseille prosecutor Brice Robin, citing information from the "black box" voice recorder, said the co-pilot was alone in the cockpit.
He intentionally started a descent while the pilot was locked out.
Mr Robin said there was "absolute silence in the cockpit" as the pilot fought to re-enter it.
Air traffic controllers made repeated attempts to contact the aircraft, but to no avail, he said.
The story seems SN-worthy because it is an object lesson in the consequences of putting complex machines and systems in the hands of others. In this case it was a trained co-pilot who killed a plane full of people who were powerless to stop him. Another example could be engineers who sabotage a dam and wipe out entire communities downstream. We mostly don't think about things like this because there is an invisible web of trust, sometimes called a "social contract," that leads people to get on a plane, go to work, or take their kids to school without giving it a second thought. But when that social contract unravels, all bets are off...
(Score: 1) by tizan on Thursday March 26 2015, @05:39PM
Humans are too fallible.
We do have the technology to fly planes from airport to airport automatically; we need humans only for major issues: mechanical failures, a messy situation at the landing site, and the like.
(Score: 0) by Anonymous Coward on Thursday March 26 2015, @05:47PM
But without the practice that comes from actually piloting, how will humans stay "current" enough to handle the emergencies that the autopilot can't? Simulators are OK, but everyone who gets into one knows it is not real. Further, the emergency scenarios rehearsed in flight simulators are ones that other humans invented, and they can't possibly cover every failure mode. It is a hard problem.
(Score: 0) by Anonymous Coward on Thursday March 26 2015, @05:47PM
Because the autopilot couldn't possibly be sabotaged by the human emergency co-pilots. Right?
(Score: 2) by hemocyanin on Thursday March 26 2015, @06:02PM
Or by some random person interested in the ultimate swatting, so to speak.
(Score: 1) by tizan on Thursday March 26 2015, @09:30PM
Yes, but an automatic system won't suffer from sudden depression and try to crash the plane it is flying because, say, a girlfriend dumped it, or it owed the mafia money, or something like that.
(Score: 1) by MostCynical on Thursday March 26 2015, @09:56PM
Human: hits mountain deliberately or by accident.
"Automated pilot": hits mountain because of poor programming, a hardware fault, or some other error.
While removing one point of "failure", automation adds one per line of code. We need both (especially in cars: http://www.huffingtonpost.com/2015/03/26/teen-distracted-driving-study-aaa_n_6944804.html [huffingtonpost.com]).
"I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
(Score: 1, Funny) by Anonymous Coward on Friday March 27 2015, @12:18AM
Like systemd, eh?
(grin)
(Score: 2) by GungnirSniper on Friday March 27 2015, @01:28AM
That's what Scotty used in Star Trek III, and look how that turned out!