

posted by martyb on Sunday October 18 2015, @12:09AM   Printer-friendly
from the and-then-we'll-welcome-you-to-obfuscated-code-contests dept.

Secret code is everywhere—in elevators, airplanes, medical devices. By refusing to publish the source code for software, companies make it impossible for third parties to inspect, even when that code has enormous effects on society and policy. Secret code risks security flaws that leave us vulnerable to hacks and data leaks. It can threaten privacy by gathering information about us without our knowledge. It may interfere with equal treatment under law if the government relies on it to determine our eligibility for benefits or whether to put us on a no-fly list. And secret code enables cheaters and hides mistakes, as with Volkswagen: The company admitted recently that it used covert software to cheat emissions tests for 11 million diesel cars spewing smog at 40 times the legal limit.

But as shocking as Volkswagen's fraud may be, it only heralds more of its kind. It's time to address one of the most urgent if overlooked tech transparency issues—secret code in the criminal justice system. Today, closed, proprietary software can put you in prison or even on death row. And in most U.S. jurisdictions you still wouldn't have the right to inspect it. In short, prosecutors have a Volkswagen problem.

Interesting article with implications for Open Source.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2, Insightful) by anubi on Sunday October 18 2015, @04:51AM

    by anubi (2828) on Sunday October 18 2015, @04:51AM (#251358) Journal

    Malfunction of some code may hurt someone. That's when business-grade software prone to botnet-launched viral scripts does not make the cut. One would use something like Micrium's µC/OS or the like for life-critical applications where someone is apt to actually get sued for a malfunction.

    Volkswagen's fraud hardly even shows on my personal radar. It was mickey-mouse coding to cater to mickey-mouse laws. No one's safety was in jeopardy. From all I could tell, they just did what they had to do to pass spec. No, it's not "right", but how about regulations that may not be practically attainable? They did what they were forced to do. I cannot hold much of a grudge against them for that. The most they did was deception... something Americans also know as "salesmanship". Used-car dealers are generally far better at pulling fast ones on their customers.

    The biggest beef I have is software that won't keep a secret. (The second beef I have is annoy-ware, where authors use the computer and DRM "rights" to enforce annoyances and unwanted stuff on me.)

    It's like having someone over for lunch, and he goes through my house and lists the contents to several unsavory characters who paid him on the side to do this. Then I get targeted break-ins. And it's all because I let the wrong guy take a peek inside my house.

    In the digital world, lack of privacy is only fodder for spearphishing fraud. You get an email from someone you are doing business with... or at least that is what you are led to believe by the carefully crafted headers on the thing. Maybe it's your retirement account. Maybe it's your healthcare provider. Or maybe your bank. And they attach a document in the typical way businesses do these days.

    Now, a lot of us who have had experience clearing our machines of nasty malware view an attached document a lot like picking up a soiled condom. These things have only "business-grade" trustworthiness and should be opened in a virtual, sandboxed environment. By having detailed information on your personal contacts, the phisher is far more likely to offer you one of these documents you take seriously enough to open in the raw, using the trust you used to have in the business you think sent it to you.

    By destroying trust in your business relationships, the "bad guys" have succeeded.

    What you did by going over the code with your customer is proper. Now, instead of just you being responsible, it's others who have seen and understood the code as well. If it were certified by an agency, they too are in on the clusterfuck should the software malfunction. If you had ducked behind "mine! mine! mine! I will not reveal! I claim my rights!", then I feel you also get full, unmitigated responsibility for all malfunctions. Especially DRM locks. As far as I am concerned, the very same party that has the power to sue if that lock is broken is the same party that should be held accountable for the software's behaviour. Not necessarily the coder - we all know how that game is played. The coder has to do what he is told to do.

    If we allow terrorists to destroy trust between ourselves, the terrorists have won.

    And it's all because our own machines can't be trusted to keep a secret.

    --
    "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]