
posted by n1 on Monday June 19 2017, @12:44AM
from the you-never-go-full-rapid-release dept.

In news seemingly designed to give me an ulcer, Microsoft is moving Windows Server to Rapid Release.

From the Windows Server blog: "Starting this fall, we plan to deliver two feature updates per year, each spring and fall, aligning to the Windows and Office Semi-annual Channel release cycle."

From this systems administrator's perspective, I do not believe Microsoft has shown it can deliver the update QA that a rapid release cadence demands. I have been testing Windows 10 since build 1507, and I have not seen its updates trend toward consistent stability.

I thought Microsoft would be able to get it together but, anecdotally, Windows 10 updates have been consistently problematic since release. Between Office 2016 blocking the installation of cumulative updates on 1607 LTSB, broken and inconsistent removal of AppX packages even when following Microsoft's own recommendations, and installation behaviors documented only after the fact (such as the wholesale reinstallation of AppX packages following build updates), every month's updates bring me trepidation.

The removal of the 'programmatic testers' from Microsoft's QA department seems to be the culprit. From the infamous August 2015 update debacle to problems like Windows 10 1703 erroring out on every MDT deployment, this wild inconsistency in monthly update quality has leaked into other Windows branches as well. I have therefore been applying monthly security-only updates, with plans to apply an annual cumulative update to my templates.
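As an illustrative sketch of that servicing policy (the function name and the choice of a January baseline month are my assumptions, not a documented practice), the monthly decision reduces to:

```python
def update_type(month: int, baseline_month: int = 1) -> str:
    """Choose the servicing payload for a given month (1-12).

    Illustrative policy only: apply the full cumulative update once a
    year (the baseline month) and security-only updates otherwise.
    """
    if not 1 <= month <= 12:
        raise ValueError("month must be between 1 and 12")
    return "cumulative" if month == baseline_month else "security-only"
```

In practice the baseline month would be picked once a given cumulative update has proven stable, rather than fixed in advance.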

What are other SAs' plans for rapid release?


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 1) by jb (338) on Tuesday June 20 2017, @05:23AM (#528343)

    That depends on the OS and how painful or disruptive its upgrade process is.

    For example, OpenBSD also has a six-month release cycle (and provides errata patches for release n only until release n+2 ships), but it has a very well-thought-out, predictable, and well-documented upgrade process for each release. As a result, upgrading our production systems that run OpenBSD twice a year isn't particularly painful or disruptive.

    On the other hand, upgrading our production systems that run GNU (regardless of which distro) every few years tends to be a major exercise, with huge amounts of planning required to get anywhere near the same level of uptime during the transition...

    As for the various closed source Unix flavours, upgrade pain / disruption / uncertainty is usually such that it's easier just to deploy a new system in parallel, then migrate the data. I imagine the same thing is probably true of Windows (although fortunately we don't run it here). The difference is that most of the closed source Unix vendors still offer a 10+ year upgrade cycle -- and after 10 years you're ready to buy new hardware anyway (so their upgrade processes don't *need* to be as robust) -- not so much after 6 months or less!
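The two-release errata window jb describes for OpenBSD can be made concrete with a small helper; the function and its version-arithmetic assumptions (versions step by 0.1 per six-month release, e.g. 5.9 to 6.0 to 6.1) are mine, purely for illustration:

```python
def supported_releases(current: str, window: int = 2) -> list[str]:
    """List the OpenBSD releases still receiving errata patches.

    Illustrative sketch: assumes version numbers step by 0.1 per
    six-month release and a two-release support window, as described
    in the comment above.
    """
    major, minor = map(int, current.split("."))
    releases = []
    for _ in range(window):
        releases.append(f"{major}.{minor}")
        if minor == 0:
            major, minor = major - 1, 9
        else:
            minor -= 1
    return releases
```

So when 6.1 is the current release, only 6.1 and 6.0 still receive errata patches; anything older must be upgraded to stay supported.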