Secret code is everywhere—in elevators, airplanes, medical devices. By refusing to publish the source code for software, companies make it impossible for third parties to inspect, even when that code has enormous effects on society and policy. Secret code risks security flaws that leave us vulnerable to hacks and data leaks. It can threaten privacy by gathering information about us without our knowledge. It may interfere with equal treatment under law if the government relies on it to determine our eligibility for benefits or whether to put us on a no-fly list. And secret code enables cheaters and hides mistakes, as with Volkswagen: The company admitted recently that it used covert software to cheat emissions tests for 11 million diesel cars spewing smog at 40 times the legal limit.
But as shocking as Volkswagen's fraud may be, it only heralds more of its kind. It's time to address one of the most urgent if overlooked tech transparency issues—secret code in the criminal justice system. Today, closed, proprietary software can put you in prison or even on death row. And in most U.S. jurisdictions you still wouldn't have the right to inspect it. In short, prosecutors have a Volkswagen problem.
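To make "covert software" concrete, here is a minimal, entirely hypothetical sketch of how a defeat device can work: the engine controller guesses that it is running an emissions test (steady speed, no steering input) and only then switches to the clean calibration. Every name and threshold below is invented for illustration; this is not Volkswagen's actual code.

/* Hypothetical sketch of a "defeat device": an engine controller that
 * guesses it is on an emissions test stand and only then runs clean.
 * All names and thresholds are invented for illustration. */
#include <stdbool.h>
#include <stdio.h>

struct sensors {
    double speed_kph;       /* road speed */
    double steering_deg;    /* steering wheel angle */
    double ambient_c;       /* ambient temperature */
};

/* Test cycles are driven on a dynamometer: the wheels turn but the
 * steering wheel essentially never moves. */
static bool looks_like_test_cycle(const struct sensors *s)
{
    return s->speed_kph > 0.0 && s->steering_deg < 1.0;
}

static void set_emissions_mode(const struct sensors *s)
{
    if (looks_like_test_cycle(s))
        puts("mode: full exhaust treatment (passes the test)");
    else
        puts("mode: reduced treatment (better mileage, far more NOx)");
}

int main(void)
{
    struct sensors on_dyno = { 50.0, 0.2, 22.0 };
    struct sensors on_road = { 50.0, 15.0, 22.0 };
    set_emissions_mode(&on_dyno);
    set_emissions_mode(&on_road);
    return 0;
}

The point is not the specifics but the scale: a few dozen lines of unpublished logic are enough to change what regulators, courts, or defendants ever get to see.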
Interesting article with implications for Open Source.
(Score: 3, Insightful) by anubi on Sunday October 18 2015, @12:56AM
Ever since the dawn of computerization, "the computer did it" has been a socially acceptable plausible-deniability mechanism for avoiding responsibility - much in the same vein that a child can get away with things an adult would be nailed for.
I feel it's time the parents start taking responsibility for the acts of their children.
In the case of proprietary software, all liability for the acts of said software should be the responsibility of the rightsholder. If he is going to profit from others' deliberately planned and enforced ignorance of what his thing really does, then he should also bear the burden of what it did.
In the case of public open source software, the user is responsible. It was open. He should have known what it does, or had other trusted people vet the software for him. This is akin to trying to hold someone accountable when a swimmer drowns after jumping into a lake. He knew the risks.
Now if he jumped into a big black hole after being assured by "the rightsholder" that someone was there to catch him at the bottom, and there wasn't, then I feel the "rightsholder" is solely responsible for all the misfortune the paying customer suffers for trusting the "rightsholder" to deliver.
While we are going so gung-ho over "rights", I feel it's past high time we also consider assigning responsibilities along with those rights.
"Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
(Score: 2) by Whoever on Sunday October 18 2015, @01:04AM
Indeed. [youtube.com]
(Score: 2) by frojack on Sunday October 18 2015, @04:04AM
You have to ask yourself just how often this actually occurs: how often is there real stonewalling on allowing code inspection?
I submit that it probably doesn't happen all that often with application-specific software, such as court records systems. The medical industry? Probably a bigger likelihood there, but probably not for medical devices.
Personally, having developed software under contract for a state government, I've been asked for source code for an audit. Delivered it without a single quibble. It was work for hire, after all. I even helped the plaintiff's expert find the exact portion of the code that handled the feature they were contesting. We worked through the code together, proved it was correct, and ran dozens of actual cases through both the code and manual calculations.
I've also sent proprietary source code to a customer for audit under a simple non-disclosure agreement. They were involved in a court case, and when they needed to introduce the code into evidence, I was asked to release them from the NDA. I did. No questions asked. I even wrote a description of that program for them. (That software was part of a sales system; it computed bulk discounts.)
Most software companies don't want to get dragged into a court case if they can possibly help it. Most are not going to refuse a judge.
I suspect the case is overblown here.
There are probably some companies that have something to hide, a bad bug that killed someone, and they don't want to disclose it. But I've never heard of such a case. The amount of code in the world that is actually in a position to hurt someone is pretty small, perhaps mostly located in vehicles.
No, you are mistaken. I've always had this sig.
(Score: 2, Insightful) by anubi on Sunday October 18 2015, @04:51AM
Some code can hurt someone when it malfunctions. That's where business-grade software prone to botnet-launched viral scripts does not make the cut. One would use something like Micrium's µC/OS or the like for life-critical applications, where someone is apt to actually get sued over a malfunction.
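(For readers who haven't met that class of software: a bare-bones µC/OS-II-style task looks roughly like the sketch below. The stack size, priority number, and task body are made-up illustrations, not a certified configuration.)

/* Minimal uC/OS-II-style task, sketched for illustration only; the
 * stack size, priority, and task body are assumptions, not a real
 * safety-certified design. */
#include "ucos_ii.h"

#define WATCHDOG_TASK_PRIO   5u
#define WATCHDOG_STK_SIZE  128u

static OS_STK watchdog_stack[WATCHDOG_STK_SIZE];

static void watchdog_task(void *p_arg)
{
    (void)p_arg;
    for (;;) {
        /* kick the hardware watchdog, sanity-check sensors, etc. */
        OSTimeDly(OS_TICKS_PER_SEC / 10);   /* run every 100 ms */
    }
}

int main(void)
{
    OSInit();                                /* initialize the kernel */
    OSTaskCreate(watchdog_task,
                 (void *)0,
                 &watchdog_stack[WATCHDOG_STK_SIZE - 1u],
                 WATCHDOG_TASK_PRIO);
    OSStart();                               /* start multitasking; never returns */
    return 0;
}

The point of that class of kernel is determinism: fixed priorities, bounded latencies, and small enough code that it can actually be audited and certified.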
Volkswagen's fraud hardly even shows on my personal radar. It was mickeymouse coding to cater to mickeymouse laws. No one's safety was in jeopardy. From all I could tell, they just did what they had to do to pass spec. No, it's not "right", but what about regulations that may not be practically attainable? They did what they were forced to do. I cannot hold much of a grudge against them for that. The most they did was deception... something Americans also know as "salesmanship". Used-car dealers are generally far better at pulling fast ones on their customers.
The biggest beef I have is software that won't keep a secret. (The second beef I have is annoy-ware, where authors use the computer and DRM "rights" to enforce annoyances and unwanted stuff on me.)
It's like having someone over for lunch; he goes through my house and lists the contents to several unsavory characters who paid him on the side to do this. Then I get targeted break-ins. And it's all because I let the wrong guy take a peek inside my house.
In the digital world, lack of privacy is only fodder for spearphishing fraud. You get an email from someone you are doing business with, or at least that is what you are led to believe by the carefully crafted headers on the thing. Maybe it's your retirement account. Maybe it's your healthcare provider. Or maybe your bank. And they attach a document in the typical way businesses do these days.
Now, a lot of us who have had experience clearing our machines of nasty malware view an attached document a lot like picking up a soiled condom. These things have only "business-grade" trustworthiness and should be opened in a virtual, sandboxed environment. With detailed information on your personal contacts, the phisher is far more likely to hand you one of these documents that you take seriously enough to open in the raw, exploiting the trust you used to have in the business you think sent it to you.
By destroying trust in your business relationships, the "bad guys" have succeeded.
What you did by going over the code with your customer is proper. Now, instead of just you being responsible, it's others who have seen and understood the code as well. If it were certified by an agency, they too are in on the clusterfuck should the software malfunction. If you had ducked behind "mine! mine! mine! I will not reveal! I claim my rights!", then I feel you also get full, unmitigated responsibility for all malfunctions. Especially with DRM locks. As far as I am concerned, the very same party that has the power to sue if that lock is broken is the same party that should be held accountable for the software's behaviour. Not necessarily the coder - we all know how that game is played. The coder has to do what he is told to do.
If we allow terrorists to destroy trust between ourselves, the terrorists have won.
And it's all because our own machines can't be trusted to keep a secret.
"Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]