
posted by janrinok on Saturday August 06 2022, @02:51PM
from the I'd-like-an-ice-cream-machine-please dept.

An Anonymous Coward writes the following story:

I've long believed companies should offer workers a choice in the technology they use in the office and when working remotely. Doing so lets employees use what they feel are the best devices for their work; it can help attract and retain staff; it lessens the likelihood that workers will go rogue and source their own technology (aka shadow IT); and it establishes a positive relationship between IT and the rest of an organization.

Companies like IBM and SAP have documented their experiences in moving to an employee-choice model and have declared it a success. But does that mean it would work for every company? And how do you decide which way to go?

The most important question in developing (or expanding) an employee-choice model is determining how much choice to allow. Offer too little and you risk undermining the effort's benefits. Offer too much and you risk a level of tech anarchy that can be as problematic as unfettered shadow IT. There isn't a one-size-fits-all approach. Every organization has its own culture, requirements and expectations, and management capabilities. An approach that works for a marketing firm would differ from one for a healthcare provider, and a government agency would need a different approach than a startup.

Options also vary depending on the devices employees use — desktop computing and mobile often require differing approaches, particularly for companies that employ a BYOD program for smartphones.

Most employee-choice programs focus on desktops and laptops. The default choice is typically basic: do you want a Windows PC or a Mac? Most often, the choice only extends to the platform, not specific models (or in the case of PCs, a specific manufacturer). Keeping the focus on just two platforms eases administrative overhead and technical support requirements. It also allows companies to leverage volume purchases from one partner in order to receive bulk discounts.

Have you been allowed to choose your own technology and equipment at work? What were the choices offered to you and what restrictions were placed upon them?


  • (Score: 2, Interesting) by Anonymous Coward on Saturday August 06 2022, @03:40PM (7 children)

    by Anonymous Coward on Saturday August 06 2022, @03:40PM (#1265270)

    I've never been given a choice in hardware/software from an employer. I have, often, done at least part of my work on my personal systems, just so that I have the tools that work best for me. This has sometimes been a violation of company rules that would have gotten me fired if the bosses knew about it.

    One interesting recent observation: I'm working on a project that includes developers from what is probably the largest commercial Linux company, and their developers are all running Windows. It does raise the question of "eating your own dog food".

  • (Score: 0) by Anonymous Coward on Saturday August 06 2022, @03:43PM

    by Anonymous Coward on Saturday August 06 2022, @03:43PM (#1265273)

    I'm working on a project that includes developers from what is probably the largest commercial Linux company, and their developers all are running Windows.

    RedHat? What is the largest commercial Linux company these days?

  • (Score: 2, Insightful) by RamiK on Saturday August 06 2022, @04:32PM (4 children)

    by RamiK (1813) on Saturday August 06 2022, @04:32PM (#1265290)

    I'm working on a project that includes developers from what is probably the largest commercial Linux company, and their developers all are running Windows. It does raise the question of "eating your own dog food".

    I don't see how not using desktop linux is a problem if your job is to write servers and embedded software targeting linux. And seeing how puny commercial desktop linux is compared to the overall linux market, I suspect the vast majority of linux devs are running Windows and MacOS.

    --
    compiling...
    • (Score: 1, Informative) by Anonymous Coward on Saturday August 06 2022, @05:40PM (3 children)

      by Anonymous Coward on Saturday August 06 2022, @05:40PM (#1265312)

      It depends on what tools you need and what tools are available.

      I’ve had to work in too many places where the Unix/Linux server has a bare minimum install (no GUI, no debugger), and everyone was expected to do their development work on a Windows box (with no C/C++ development tools), using MS’s old version control system, and then manually copy changes to the server.

      All that with no ability to install any “unapproved” software on either end (“unapproved” meaning the bosses don’t use it).

      • (Score: 3, Interesting) by Ox0000 on Saturday August 06 2022, @05:57PM (1 child)

        by Ox0000 (5111) on Saturday August 06 2022, @05:57PM (#1265317)

        no GUI, no debugger

        Installing a GUI or a debugger on a production system probably would/should have incurred similar derision ("they made us install UIs and debuggers on production systems, the MADNESS!"). Those things typically have no place on production systems.

        Being able to do _some_ level of development and debugging locally (with all the caveats w.r.t. "works on my machine" vs. "the production box is not the same as your box") rather than having to jump onto another environment has value: it increases velocity and helps you understand both your own machine and your target system(s).
        Ask Android devs how much they love remote debugging...

        • (Score: 0) by Anonymous Coward on Saturday August 06 2022, @06:17PM

          by Anonymous Coward on Saturday August 06 2022, @06:17PM (#1265320)

          Installing a GUI or a debugger on production system probably would/should have incurred a similar derision

          In my experience, they either didn't have a development machine, or they refused to install "extra" software on it.

      • (Score: 2) by RamiK on Saturday August 06 2022, @07:18PM

        by RamiK (1813) on Saturday August 06 2022, @07:18PM (#1265334)

        I’ve had to work in too many places where the Unix/Linux server has a bare minimum install (no GUI, no debugger), and everyone was expected to do their development work on a Windows box (with no C/C++ development tools), using MS’s old version control system, and then manually copy changes to the server.

        Would having a linux desktop without gcc, valgrind, gdb, etc. available make a difference? Cause otherwise, you're not missing desktop linux; you're missing development tools. So I fail to see your point.

        --
        compiling...
  • (Score: 1, Informative) by Anonymous Coward on Saturday August 06 2022, @06:05PM

    by Anonymous Coward on Saturday August 06 2022, @06:05PM (#1265319)

    I was blessed. I always had company hardware, but software was up to us. As long as you didn't do something stupid on the network, you were golden. Towards the end of my career, "IT image" was a threat but it never got imposed on me. I think it would have been if I'd stayed, though. I think it would have been at least a six-month productivity killer for me on the ramp-up, but maybe that's just a hate for change.

    At one point, I was the only one without a Mac as my primary machine. I used to joke with them, "Why can't I think different like everybody else?".