Submitted via IRC for SoyCow3941
We think of our job as controlling the user's experience. But the reality is, we control far less than we imagine.
Last week, two events reminded us, yet again, of how right Douglas Crockford was when he declared the web "the most hostile software engineering environment imaginable." Both were serious enough to take down an entire site—actually hundreds of entire sites, as it turned out. And both were avoidable.
[...] The first of these incidents involved the launch of Chrome 66. With that release, Google implemented a security patch with serious implications for folks who weren't paying attention. You might recall that quite a few questionable SSL certificates issued by Symantec Corporation's PKI began to surface early last year. Apparently, Symantec had subcontracted the creation of certificates without providing a whole lot of oversight. Long story short, the Chrome team decided the best course of action with respect to these potentially bogus (and security-threatening) SSL certificates was to set an "end of life" for accepting them as secure. They set Chrome 66 as the cutoff.
So, when Chrome 66 rolled out (an automatic, transparent update for pretty much everyone), suddenly any site running HTTPS on one of these certificates would no longer be considered secure. That's a major problem if the certificate in question is for our primary domain, but it's also a problem if it's for a CDN we're using. You see, my server may be running on a valid SSL certificate, but if I have my assets—images, CSS, JavaScript—hosted on a CDN that is not secure, browsers will block those resources. It's like CSS Naked Day all over again.
To be completely honest, I wasn't really paying attention to this until Michael Spellacy looped me in on Twitter. Two hundred of his employer's sites were instantly reduced to plain old semantic HTML. No CSS. No images. No JavaScript.
The second incident was actually quite similar in that it also involved SSL, and specifically the expiration of an SSL certificate being used by jQuery's CDN. If a site relied on that CDN to serve an HTTPS-hosted version of jQuery, their users wouldn't have received it. And if that site was dependent on jQuery to be usable ... well, ouch!
It can be easy to shrug off news like this. Surely we'd make smarter implementation decisions if we were in charge. We'd certainly have included a local copy of jQuery like the good Boilerplate tells us to. The thing is, even with that extra bit of protection in place, we're falling for one of the most attractive fallacies when it comes to building for the web: that we have control.
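The "local copy" protection the Boilerplate suggests can be sketched as a small fallback check: after the CDN `<script>` tag has had its chance to run, test whether the library's global actually exists and, if not, inject a bundled copy. This is a minimal sketch in the spirit of that pattern, not the Boilerplate's exact code; the path and version below are illustrative.

```javascript
// Local-fallback sketch: if the CDN copy of jQuery never arrived
// (blocked certificate, expired certificate, outage), load a copy
// bundled with the site instead. Path/version are illustrative.
function jqueryFallback(win, doc) {
  if (!win.jQuery) {
    // In a real page this line sits in an inline <script> right after
    // the CDN <script> tag, so it runs before the rest of the page's JS.
    doc.write('<script src="/js/vendor/jquery-3.6.4.min.js"><\/script>');
    return true;   // fell back to the local copy
  }
  return false;    // CDN copy loaded fine
}
```

Note that this only guards the script itself; CSS and images served from a broken CDN still need their own fallbacks, which is part of the author's point about control.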
Source: http://alistapart.com/article/the-illusion-of-control-in-web-design
(Score: 4, Interesting) by c0lo on Wednesday May 09 2018, @12:41AM
Huh! If only it were that simple.
You reckon control over assets is enough? To paraphrase:
Yes, I know you never said that, but the context is 'illusion of control' - your answer suggests a better way to deal with it, but if one imagines that's foolproof, the illusion of control still persists.
And I'll tell you a more reliable way to make sure access to your assets doesn't depend as much on others. It involves relinquishing most of the control over your assets: replicate them in a P2P network and let many independent third parties certify their authenticity (using a DHT).
Of course, this is barely possible on today's mainstream internet; having control drives profits and power for those able to exercise it.
https://www.youtube.com/watch?v=aoFiw2jMy-0