
posted by janrinok on Tuesday May 26 2015, @12:30PM   Printer-friendly
from the life-is-easier-with-FOSS dept.

The European Union's interoperability page reports:

Using open source in school greatly reduces the time needed to troubleshoot PCs, [as indicated by] the case of the Colegio Agustinos de León (Augustinian College of León, Spain). In 2013, the school switched to using Ubuntu Linux for its desktop PCs in [classrooms] and offices. For teachers and staff, the amount of technical issues decreased by 63 per cent and in the school's computer labs by 90 per cent, says Fernando Lanero, computer science teacher and head of the school's IT department.

[...] "One year after we changed PC operating system, I have objective data on Ubuntu Linux", Lanero tells Muy Linux [English Translation], a Spanish Linux news site. By switching to Linux, incidents such as computer viruses, system degradation, and many diverse technical issues disappeared instantly.

The change also helps the school save money, he adds. Not having to purchase [licenses] for proprietary operating systems, office suites, and anti-virus tools has already saved about €35,000 in the 2014-2015 school year, Lanero says. "Obviously it is much more interesting to invest that money in education."
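For scale, here is a back-of-the-envelope sketch of how per-seat licensing adds up. All of the per-seat prices and the seat count below are hypothetical assumptions for illustration, not figures reported by the school:

```python
# Rough, illustrative estimate of annual per-seat licensing costs.
# All prices and the seat count are assumptions for this sketch,
# not figures from the article.
SEATS = 250  # assumed number of desktop PCs

per_seat_costs = {
    "operating system": 50.0,  # assumed EUR per seat per year
    "office suite": 60.0,      # assumed EUR per seat per year
    "anti-virus": 30.0,        # assumed EUR per seat per year
}

total = SEATS * sum(per_seat_costs.values())
print(f"Estimated annual licensing cost: EUR {total:,.0f}")
# → Estimated annual licensing cost: EUR 35,000
```

With assumed prices in that range, a deployment of a few hundred PCs lands in the neighbourhood of the €35,000 the school reports saving.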

[...] The biggest hurdle for the IT department was the use of electronic whiteboards. The school uses 30 of such whiteboards, and their manufacturer [Hitachi] does not support the use of Linux. Lanero got the Spanish Linux community involved, and "after their hard work, Ubuntu Linux now includes support for the whiteboards, so now everything is working as it should".

[...] Issues [with proprietary document formats] were temporarily resolved by using a cloud-based proprietary office solution, says Lanero, giving the IT department time to complete the switch to open standards-based office solutions. The school now mostly uses the LibreOffice suite of office tools.

[...] "Across the country, schools have contacted me to hear about the performance and learn how to undertake similar migrations."

As I always say, simply avoid manufacturers with lousy support and FOSS is just the ticket.


[Editor's Comment: Original Submission]

  • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @12:49PM (#187986)

    Oh wait

  • (Score: 5, Informative) by Anonymous Coward on Tuesday May 26 2015, @12:56PM (#187990)

    Just a little correction: that is not a university but a school (primary and secondary education) going up to 12th grade, which is called "2º de Bachillerato" in Spain.

    • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @01:37PM (#188009)

      > 2º de Bachillerato

      That sounds so dirty.

  • (Score: 5, Insightful) by bradley13 (3053) on Tuesday May 26 2015, @12:57PM (#187991) Homepage Journal

    I'm a huge FOSS fan. I manage to stick 99% to Linux/LibreOffice, even though my employer runs a Windows network. However, honesty forces me to admit that FOSS software has three problems:

    - FOSS is generally less capable than proprietary software. Photoshop vs. Gimp. LibreOffice-Impress vs. PowerPoint.

    - FOSS documentation sucks.

    - Interoperability. While you can put this on whichever side you care to (proprietary lock-in is the big lever for established players), it is a problem. I can read/write .doc and .xls with no problem, but as soon as someone comes with OOXML formats, interoperability goes down the toilet.

    The problems are inevitable: people get paid for writing and documenting proprietary software. There's a lot less money for FOSS, and people gotta eat, it's that simple.

    So, while it's great that this works for these Spanish schools, it only works because a few core people have worked their buns off to make it so. Like driving the Linux support for the whiteboards, and probably a hundred other, smaller problems. All of us who install and support FOSS systems know the kinds of weird problems that crop up, and usually there is no support organization you can turn to for help.

    I'm rambling here, but that last point ought to be different. People think FOSS means "free as in beer". If more people were willing to pay for support, we could have a much healthier infrastructure. However, that's just not the mentality, so FOSS muddles along on the fringe of relevance, just as it has done for decades now.

    --
    Everyone is somebody else's weirdo.
    • (Score: 1, Insightful) by Anonymous Coward on Tuesday May 26 2015, @01:40PM (#188011)

      FOSS has better documentation than any MS software. If you can't find what you need using "man", just do a Google search.

      • (Score: 2) by LoRdTAW (3755) on Tuesday May 26 2015, @02:04PM (#188024) Journal

        "man" ... Bwahahahahahahah. Oh boy, thanks for that. I needed a good laugh.

        • (Score: 1, Informative) by Anonymous Coward on Tuesday May 26 2015, @02:19PM (#188029)

          Well, it does require an IQ above 10 to understand it.

          • (Score: 2) by LoRdTAW (3755) on Tuesday May 26 2015, @03:03PM (#188046) Journal

            *Fail Horn*
            Nice try. Next time a newbie tries to learn how to use FOSS and rage quits because some jackass tells them to use man, I'll remember your little jab and have a snicker. A newbie will not be able to properly glean anything from man unless that man page is very verbose, which they often aren't (ntfsclone is one of the more straightforward man pages). This is why a search engine is the newbie's friend: there is a wealth of verbose information with the steps and commands laid out in plain English (or whatever your language of choice is). man is akin to a service manual for the experienced technician, not a getting-started guide for the customer.

            • (Score: 2) by sjames (2882) on Tuesday May 26 2015, @04:01PM (#188088) Journal

              Actually, way back in the before time, I did learn to use Unix largely from reading man pages.

              • (Score: 2) by LoRdTAW (3755) on Tuesday May 26 2015, @04:12PM (#188099) Journal

                Remember, you did. Everyone learns differently.

                • (Score: 2) by sjames (2882) on Tuesday May 26 2015, @04:29PM (#188115) Journal

                  Yes, I did, therefore it can be done. I'm not claiming it's the best way for everyone, just refuting your claim that it can't happen.

              • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @07:36PM (#188218)

                Secretaries used to use vi-like editors and were comfortable with LaTeX. Now, somehow we pander to the lowest common denominator. People like him will tell you that it's too hard for people like secretaries to run. I don't think people have gotten dumber. I think Windows treats you like you are a dumbass and people like him accept it as fact.

              • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @10:25PM (#188326)

                I have a severe non-verbal learning disability. To me, man pages are the worst shit I've ever come across - little more than wall of text notes to people who already know how something works, as a reminder of how it works.

            • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @08:35PM (#188259)

              Wow, what an idiot.

        • (Score: 2, Interesting) by SDRefugee (4477) on Tuesday May 26 2015, @03:46PM (#188081)

          Agree... I'm a retired tech, supported Windows since WFWG, been a big fan of Linux since 1994 or so, and now that I'm retired, I've pretty much given up on Windows and run Linux on all my home machines. (Disclaimer: Am trying out the Windows 10 preview as I'm *sure* I'm gonna be asked about it once it goes live...) My BIG gripe about Linux man pages is that the vast majority of them are simply a list of the command line switches the program responds to... Which is just peachy if you use the program all the time and you've just forgotten one or more obscure, not-frequently-used switches.. But.. say you've NEVER used the program before, you'd REALLY like a few choice *examples* of the invocation/use of the app... Not very many man pages have these, so your only course of action is to google for a tutorial on said app..... That being said, as far as I'm concerned, Linux trumps Windows hands-down....

          --
          America should be proud of Edward Snowden, the hero, whether they know it or not..
          • (Score: 2) by Grishnakh (2831) on Tuesday May 26 2015, @05:55PM (#188167)

            Not very many man pages have these, so your only course of action is to google for a tutorial on said app.

            Not all man pages are wonderful, and I'm sure most would welcome any contributions in editing. However, man pages really are there mostly to be reference tomes, to exhaustively list every command-line option available. For command-line programs, they're extremely useful that way. However, now that we have the internet, there's no shortage of tutorials for various things online, and in fact for any GUI program you certainly would not want a man-page tutorial, you'd want a webpage tutorial so you can see screenshots.
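            As a small, hedged illustration of that reference-vs-tutorial gap, here is a sketch that checks whether a man page's text contains an EXAMPLES heading (section headings are conventionally uppercase at the start of a line). The sample page texts are invented for the sketch; on a real system you would feed it the output of man <command>:

```python
import re

def has_examples_section(manpage_text: str) -> bool:
    """Return True if the man page text contains an EXAMPLES heading."""
    # Man page section headings are conventionally uppercase and
    # start at the beginning of a line.
    return re.search(r"^EXAMPLES?\b", manpage_text, re.MULTILINE) is not None

# Invented sample pages, for demonstration purposes only.
terse_page = "NAME\n  foo - do things\n\nOPTIONS\n  -a  all\n  -q  quiet\n"
helpful_page = terse_page + "\nEXAMPLES\n  foo -a /tmp\n"

print(has_examples_section(terse_page))    # → False
print(has_examples_section(helpful_page))  # → True
```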

          • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @07:39PM (#188221)

            In the same comment, the insightful AC mentioned going online to get help.

            I visit the forums of several distros fairly regularly.
            It's rare that I see a problem that the assembled masses can't solve or haven't solved.
            Indeed, I am often irritated that the poster of the question didn't follow directions and search to find 1 of the dozens of times his question has already been asked and answered.

            In addition, a bootable ISO makes it possible to get online--even if you have completely torched your Linux install.

            Contrary to any claims that answers to Linux problems are difficult to find, I say that they are no harder to find, and often easier to solve, than on the payware OS that is usually held up as "the standard".

            -- gewg_

            • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @08:15PM (#188243)

              I missed where you had re-inserted the brand up in the text.

              -- gewg_

        • (Score: 2) by turgid (4318) Subscriber Badge on Tuesday May 26 2015, @08:01PM (#188233) Journal
          • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @10:30PM (#188329)

            The GNU info pages I've come across remind me of two things. The first is the man page, because all the ones I've come across are a cut-and-paste of the man pages, and the second is this XKCD comic [xkcd.com].

            • (Score: 0) by Anonymous Coward on Wednesday May 27 2015, @03:56PM (#188664)

              I've not seen a single piece of GNU info documentation that was just a cut-and-paste of the man page (actually I've seen it the other way round: the man page being generated from a section(!) of the info file).

          • (Score: 2) by hendrikboom (1125) Subscriber Badge on Wednesday May 27 2015, @03:50PM (#188662) Homepage Journal

            I find GNU info pages completely useless. The navigation mechanisms are too unintuitive, and too different from everything I use regularly, even though I'm an emacs user.

            Perhaps automatic mechanisms to view GNU info pages in a mouse-based point-and-click browser could work. Anyone know of such a thing? Maybe a Firefox or Chrome extension?

    • (Score: 2) by bart9h (767) on Tuesday May 26 2015, @01:41PM (#188012)

      it only works because a few core people have worked their buns off to make it so. Like driving the Linux support for the whiteboards, and probably a hundred other, smaller problems.

      And how is that a problem?

      If more people were willing to pay for support, we could have a much healthier infrastructure.

      Yep, just imagine people donated 10% of what they would pay for proprietary software to the free software projects.

    • (Score: 5, Informative) by Marand (1081) on Tuesday May 26 2015, @01:42PM (#188013) Journal

      - FOSS is generally less capable than proprietary software. Photoshop vs. Gimp. LibreOffice-Impress vs. PowerPoint.

      I can't speak for Impress vs. Powerpoint, and I'm not interested in discussing the rest of your comment, but I'm going to rant a bit about the Photoshop vs. Gimp comment, because it's so outdated it's not even funny. Everyone still acts like Gimp is the only open source graphics program, and that hasn't been true for a long time. In fact, it hasn't been the best FOSS graphics tool for years, because it's long since been surpassed for nearly every task by other FOSS options.

      The open-source graphics powerhouse has been Krita [krita.org] for a few years now, with no sign of it changing any time soon. Functionality-wise, it's somewhere between Photoshop and a more painting-oriented program like Corel Painter. Depending on what you use Photoshop for, it's your best option for an FOSS-replacement, and for some workflows it even surpasses it.

      It handles high bit depth images and non-RGB colour spaces, has non-destructive layer filters and transforms, a GPU-accelerated canvas, a bunch of insanely configurable brush engines, and so much more that it would take far too long to write it all out, with more being added constantly. Just about any "Gimp doesn't have this, it sucks" complaint you can think of will be covered in Krita. Also, it's Qt rather than being stuck on gtk2 like Gimp, so the UI is less flaky, more configurable, and also works better in Windows.

      The OS X support is lacklustre, but that's about the only failing I can think of vs gimp. Krita's easily on par with paid graphics software. In fact, they put a special variant -- which has a switchable tablet+desktop UI -- on Steam with a price tag, and people actually do pay for it. Not only that, but they did a Kickstarter last year that got enough support to fund a developer for a year, and they're doing another one [kickstarter.com] that met its goal before the first week was over. (Which still has about a week left to make stretch goals, if anyone's interested in supporting)

      Of course, there are plenty more tools beyond those, depending on need. Blender is insanely polished and powerful; Inkscape is a great tool for vector work; Synfig is available for animation; and MyPaint is a nice painting/sketching oriented app that has an infinite canvas and a decent brush engine.

      Seriously, try something other than Gimp, it hasn't been worthy of being considered the FOSS flagship graphics tool in a very long time. It hasn't had enough developer interest in a while, and it's been crippled by its reliance on gtk2 for almost as long. I hope that if I and others rant about this enough, people will finally start to notice and try something good instead of acting like gimp is the entirety of FOSS graphics software.

      • (Score: 2) by mtrycz (60) on Tuesday May 26 2015, @02:55PM (#188042)

        People that do painting, probably already know about the FOSS options (and mostly don't care, as it's as free-as-in-beer as a pirated photoshop).

        People that do photography, where gimp is the major competitor with photoshop, are actually stuck with photoshop. Source: i do some photography.

        --
        In capitalist America, ads view YOU!
        • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @10:33PM (#188330)

          Source: i do some photography.

          You misspelled "anecdotal" there.

          • (Score: 2) by mtrycz (60) on Wednesday May 27 2015, @08:17AM (#188518)

            Yup. As in: I know from experience, but that's just my experience. Isn't that a sufficient disclaimer?

            --
            In capitalist America, ads view YOU!
        • (Score: 2) by Marand (1081) on Wednesday May 27 2015, @04:00AM (#188444) Journal

          People that do painting, probably already know about the FOSS options (and mostly don't care, as it's as free-as-in-beer as a pirated photoshop).

          That's bullshit right there. That's sort of like arguing that jobs are a waste of time because "people that need money probably know about employment options, and mostly don't care because robbing a bank is more useful for them."

          Plus there are better proprietary painting apps than Photoshop, so if you're going to pirate, at least pirate the right tool for the right job.

          People that do photography, where gimp is the major competitor with photoshop, are actually stuck with photoshop. Source: i do some photography.

          For most photography uses I've found digikam* to be more than good enough, with the added bonus of having tagging and other organisational features on top. Between that and opening the file in krita I haven't needed gimp for photos in a while. Digikam for sorting, tagging, and basic edits that don't need the weight of Krita (basically anything but touch-up stuff), and then Krita (which has RAW support) if it needs something more involved.

          I'm not saying that will work for everybody, of course, but that wasn't my original point. My original point is that nobody even tries, they just use "gimp vs photoshop" as a quick way to dismiss FOSS graphics apps as a whole. Sometimes out of ignorance, sometimes trolling, but it needs to be fought either way.

          * Actually, I should have mentioned digikam in my original post, because it's another really nice program that deserves more attention. Good FOSS photo management tools are really hard to find, and it's probably the best one I've found.

          • (Score: 2) by mtrycz (60) on Wednesday May 27 2015, @08:23AM (#188520)

            Digikam is for organizing, much like Lightroom from Adobe, not for editing. Last time I tried Krita, it was no good for photoediting, I might as well give it a try sometime.

            The only problem I have with Gimp for photoediting is non destructive editing / adjustment layers. There's just no game. Anyway, now that I finished studies, might as well contribute to some of the software I use, instead of complaining, eh?

            --
            In capitalist America, ads view YOU!
            • (Score: 3, Insightful) by Marand (1081) on Wednesday May 27 2015, @09:34AM (#188537) Journal

              Digikam is for organizing, much like Lightroom from Adobe, not for editing

              I'm aware, but it also has a built-in editor, called ShowFoto, that covers a lot of the basics (like levels, curves, some various transformations) without needing to pull out something heavier. A lot of the time, that really is all one needs, and it acts as a good starting point for firing up something else when you need more.

              Unrelated, but I wouldn't be surprised if a really casual user could manage without ever needing a proper editor at all. KDE's viewer, gwenview, has the most common stuff like cropping, resizing, and rotation. Some people might never need more than that. Kind of interesting to consider.

              Last time I tried Krita, it was no good for photoediting, I might as well give it a try sometime.
              The only problem I have with Gimp for photoediting is non destructive editing / adjustment layers. There's just no game.

              Krita's actually been doing a lot with non destructive editing the past few versions. It's recently added transform masks (non-destructive transforms) and has had filter masks (same for filters) for a while. Transforms are pretty obvious, and some of the filters are obvious because they're standard things like blur and sharpen, but it also has "filters" that let you non-destructively change colour, saturation, contrast, brightness, and one (that I love) will convert a colour to alpha.

              It's still painting first and foremost, as a design decision, but it has enough feature overlap with Gimp (plus extra things that Gimp can't do) that I can usually use Krita instead, even for general image manipulation. Which is good, because Gimp gets really annoying to use without a working mousewheel (a bug I mentioned in another comment).

              I still like gimp well enough, and in fact I prefer its multi-window mode to the more common MDI or tabbed designs. It's just been stagnant while others have continued to improve, so it's not quite the shining star it used to be. At this rate, the painting program (Krita) will gain photo editing feature parity with Gimp before Gimp manages to get proper support for something other than 8bit RGB.

              Anyway, now that I finished studies, might as well contribute to some of the software I use, instead of complaining, eh?

              Hey, don't knock it. Sometimes complaining to the right people is contributing. At least, that's what I tell myself when I put in feature requests. :)

      • (Score: 3, Informative) by danomac (979) on Tuesday May 26 2015, @04:00PM (#188086)

        It hasn't had enough developer interest in a while, and it's been crippled by its reliance on gtk2 for almost as long.

        I just thought I'd point out they aren't stuck on gtk2... gtk is an acronym for the GIMP Toolkit, it was written specifically for GIMP.

        • (Score: 3, Informative) by Marand (1081) on Wednesday May 27 2015, @03:36AM (#188436) Journal

          I just thought I'd point out they aren't stuck on gtk2... gtk is an acronym for the GIMP Toolkit, it was written specifically for GIMP.

          Yes, I'm aware of gtk's history, possibly better than you are -- gtk+ got the "KFC" treatment and hasn't been "the gimp toolkit" in many, many years[1] -- but you apparently didn't notice that I said gtk2 specifically. I also didn't say they were stuck on gtk, I said that the reliance on gtk2 is crippling it. First release of gtk3 was early 2011, four years ago, and there's still no support, much like the long-promised full GEGL support. At this rate, by the time gtk3 support comes everyone else will be using gtk4.

          While in theory there's nothing wrong with using a project that's in maintenance mode like that, it's a problem with gtk2, because it has a lot of odd problems that will likely never be fixed because testing and development has long since moved on to gtk3. It has problems with non-wacom input devices, for example, because it's a market that exploded around gtk3's release.

          Another example: the odd way gtk2 handles input devices also causes its share of problems that don't occur in gtk3 because of design improvements in 3 vs 2. With the last two mouses I've owned, I haven't been able to use the mousewheel at all in gimp as a side effect of this. It may be related to having a non-wacom pen display, but it's not a problem with gtk3. It's annoying enough that I've completely stopped using gimp if there's any way to do something in one of the other programs available, at least until the gtk3 port finally surfaces.

          Due to the heavy investment in gtk, it makes sense to stick with it, but sticking with gtk2 really hurts it right now. Not just because long-standing design problems will never improve, but also because the vast majority of gtk devs use and know gtk3 now, so the developer pool is constantly dwindling.

          Meanwhile, Krita's managing to pump out new features while still working on the Qt5 port on the side. If Gimp had enough developer interest to do the same, most of my comment probably wouldn't even have been necessary. It's unfortunate, but projects can stagnate, and Gimp has long been showing signs of it.

          ---

          [1] Brief history lesson: the last gtk to be the "gimp toolkit" was the gtk1 series. By gtk2 (in 2002) it had already been rewritten and renamed to "gtk+", with development focused on more than just the needs of gimp. In fact, I looked it up, and the rewrite and name change happened in 1999, when gtk+ 1.2 was released. So, it hasn't been the "gimp toolkit" for 16 years; I think it's time to get over it and just go with "gtk+" like the rest of the world.

          Too bad we don't have some kind of 0-score "Outdated" mod (like Disagree is) for when people post ancient information. You don't deserve a downmod but there should be some way to flag old info.

      • (Score: 1) by stormreaver (5101) on Tuesday May 26 2015, @09:29PM (#188282)

        The open-source graphics powerhouse has been Krita [krita.org] for a few years now, with no sign of it changing any time soon.

        The last time I tried Krita, a few years ago, it had one major show-stopping problem: it was unbearably slow. I had several large images I wanted to touch-up. I started Krita, selected the first picture to load, and waited. And waited. And waited.

        After getting tired of waiting on that first image, I loaded the first picture in the GIMP, did the touch-ups, and saved it. I checked on Krita, and it was still loading. I loaded the second picture into the GIMP, did the touch-ups, then saved. Krita was still loading. I repeated that for several more pictures in the GIMP, then Krita finally finished loading the first picture.

        Krita has great potential, but was so slow that it was unusable. It's been a couple years, though, and I had all but forgotten it still exists.

        • (Score: 2) by Marand (1081) on Wednesday May 27 2015, @04:39AM (#188463) Journal

          The last time I tried Krita, a few years ago, it had one major show-stopping problem: it was unbearably slow. I had several large images I wanted to touch-up. I started Krita, selected the first picture to load, and waited. And waited. And waited.

          Not surprising. I've been following it for a long time, and the Qt4 port started out really rocky. The 2.4 release (in early 2012) was when it really started to shine, and anything before that was basically unusable. They've been doing somewhat regular updates since, and speed-ups have been a major focus.

          I don't know how large you're talking, but currently even 600dpi images with dozens of layers (using 80+ MB of disk space) open pretty quickly -- well, for not being stored on an SSD, at least. Canvas interaction used to be slow at large sizes too, but it's had OpenGL acceleration for the canvas for a long time, which helped tremendously. That's around when I really started using it instead of just checking it out occasionally to see progress.

          On my (crappy dual core) system it can still be rough using certain brushes or layer operations on 600dpi images -- though it's better than it was at that, too -- but everything else is pretty fast. Performance boosts for large brushes on large canvases seems to be a major theme with the current Kickstarter, so I'm looking forward to seeing that improve more.

          That's what I meant before: Gimp's moving at a snail's pace while Krita's been evolving rapidly since 2.4 and other projects have likewise grown quickly. The landscape's completely different but nobody seems to have noticed yet. It's a good time to grab some of the alternatives and poke at them a bit to see what works for you.

    • (Score: 3, Funny) by Anonymous Coward on Tuesday May 26 2015, @02:48PM (#188037)

      I'm sorry, but that's cherry picking. There is plenty of FLOSS software that is either on equal footing with, or better than, the proprietary alternatives. Examples are Linux, nginx, PostgreSQL, Blender, Firefox, VLC, Eclipse, Octave, Deluge, etc, etc...

      Oh and special mention goes to Emacs which completely crushes Windows, OSX, GNU/*, and *BSD in all regards except the text editing business.

      • (Score: 1) by archshade (3664) on Tuesday May 26 2015, @03:31PM (#188064)

        Oh and special mention goes to Emacs which completely crushes Windows, OSX, GNU/*, and *BSD in all regards except the text editing business.

        -- Emphasis mine

        Err, just to nitpick, but Emacs was originally written by Richard Stallman. The same Richard Stallman who started GNU, the FSF, and wrote the GPL. RMS even changed the name of his EMACS to GNU/EMACS to show it was part of the GNU project. Clearly Emacs is an alternative GNU userland. Now GNU just needs an editor and it will be sorted (and maybe a finnished kernel).

        • (Score: 2, Funny) by archshade (3664) on Tuesday May 26 2015, @03:36PM (#188072)
          Replying to myself is bad form, but I noticed a typo (guess I should have previewed).

          GNU needs a finished kernel (looks at GNU Hurd). There already exists a usable "Finnished" kernel and I hear it is quite popular.

      • (Score: 2) by hendrikboom (1125) Subscriber Badge on Wednesday May 27 2015, @04:06PM (#188670) Homepage Journal

        Emacs is a perfect tool for the environment it originally flourished on -- a text-only user-interface for those old text-only full-duplex terminals. That's why it has so very many tools -- it's not a text editor, it's a user-friendly operating system for limited-functionality terminal equipment. Of course, it can edit text, that's one of the things computer users need to do now and then.

    • (Score: 4, Informative) by Thexalon (636) on Tuesday May 26 2015, @02:54PM (#188041)

      Interoperability. While you can put this on whichever side you care to (proprietary lock-in is the big lever for established players), it is a problem. I can read/write .doc and .xls with no problem, but as soon as someone comes with OOXML formats, interoperability goes down the toilet.

      LibreOffice has been able to handle reading OOXML formats fairly well for quite some time, and the only problems I've had writing to those formats had to do with using weird stuff that isn't actually part of any documented standard. The main reason LibreOffice has a hard time with them is precisely because Microsoft does everything it can to try to prevent competitors from remaining interoperable.

      --
      The only thing that stops a bad guy with a compiler is a good guy with a compiler.
    • (Score: 2) by FatPhil on Tuesday May 26 2015, @03:30PM

      > I can read/write .doc and .xls with no problem, but as soon as someone comes with OOXML formats, interoperability goes down the toilet.

      Which is why everything must be done to dissuade people from using them. (And anyone involved in ratifying the ill-defined "standard" at ISO (crap like "this block is a binary stream of data which should be interpreted the same way as it would be by one of our proprietary ActiveX controls in Word 2000") should have their heads on a chopping block.)

      Oh, people are willing to pay for support. My wages for the last decade have come from companies willing to pay for open source software.
      --
      Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
    • (Score: 2) by sjames on Tuesday May 26 2015, @04:27PM

      by sjames (2882) on Tuesday May 26 2015, @04:27PM (#188111) Journal

      Really, FOSS tends to have different capabilities rather than less.

      Funny thing about proprietary formats: I have on many occasions rescued a corrupted proprietary file on Windows by loading it into Free software and then writing it back out again. In other words, the FOSS was more compatible with the proprietary format than the MS program that created it.

    • (Score: 3, Informative) by q.kontinuum on Tuesday May 26 2015, @04:36PM

      by q.kontinuum (532) on Tuesday May 26 2015, @04:36PM (#188121) Journal

      This is wrong on so many levels,...

      FOSS is generally less capable than proprietary software. Photoshop vs. Gimp. LibreOffice-Impress vs. PowerPoint.

      I work in the area of test-automation. If you ever tried to get that started with Windows, and then were allowed to switch to Linux (or e.g. BSD), you wouldn't talk about proprietary software being more capable. Windows is *such* a pain in the ass in that area. It starts with the path-length limitations. If you work with Jenkins, especially with Matrix-jobs and descriptive labels/job names you will easily exceed the maximum allowed path lengths. Can you show me a decent proprietary source-control-system? Feature- and performance-wise comparable to git, especially in combination with gerrit? Or a CI-System feature-wise comparable to Jenkins? Don't tell me Bamboo or Go. They don't come close.
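      The path-length pain is easy to demonstrate. A toy sketch (all names invented; the 260-character MAX_PATH figure is the well-known legacy Win32 limit) of why descriptive matrix-job labels blow past it:

      ```python
      import os

      MAX_PATH = 260  # legacy Win32 limit, counting drive letter through the trailing NUL

      def overlong_workspaces(root, job, axis_values):
          """Sketch (job/axis names are made up): predict which Jenkins-style
          matrix workspace paths would break on Windows. Matrix builds nest
          one directory per axis value, so descriptive labels add up fast."""
          flagged = []
          for value in axis_values:
              path = os.path.join(root, "workspace", job, value, "build", "output.log")
              if len(path) >= MAX_PATH:
                  flagged.append(path)
          return flagged
      ```

      With a descriptive job name and a couple of descriptive axis labels, the total clears 260 characters without even trying.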

      - FOSS documentation sucks.

      Compared to what? Windows error messages? "ERROR 9fewui99wq908eqww occurred"? Just sticking with Jenkins as an example, I get full documentation on the implementation and the API/extension points, I can use the API via Groovy scripts, etc. Good luck finding something similarly documented in the proprietary section. Ever tried to write a parser for Excel or Word documents? Without using proprietary plugins? Not even MS themselves managed to keep their Office versions backwards-compatible. (I didn't try the latest versions. A couple of years ago, colleagues came to me because they knew I had OpenOffice and could load new documents, store them in the old format, and vice versa. Storing in the old format from a newer MS Office version frequently failed.)
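      For what it's worth, the modern formats are at least zip-plus-XML, so a crude reader is doable without proprietary plugins. A minimal sketch (Python, stdlib only; it ignores all formatting and just pulls the text runs out of a .docx):

      ```python
      import zipfile
      import xml.etree.ElementTree as ET

      # WordprocessingML namespace used inside word/document.xml
      W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

      def docx_text(path):
          """Crude .docx reader: a .docx is just a zip archive whose main
          content lives in word/document.xml; text sits in <w:t> elements."""
          with zipfile.ZipFile(path) as z:
              root = ET.fromstring(z.read("word/document.xml"))
          return "".join(t.text or "" for t in root.iter(W + "t"))
      ```

      That gets you the raw text; round-tripping styles, revision marks, and embedded objects is where the pain described above actually starts.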

      The problems are inevitable: people get paid for writing and documenting proprietary software. There's a lot less money for FOSS, and people gotta eat, it's that simple.

      Oh. In that case I should really talk to CloudBees and ask them why we pay so much for our support contract for Jenkins. And while I'm at it, I might ask JFrog why they take money for their Artifactory support. I probably missed something there...

      So, while it's great that this works for these Spanish schools, it only works because a few core people have worked their buns off to make it so.

      Yes, true in this case. E.g. for Android it's different, because it already has some market share. In the case of Jenkins, we also implement plugins and submit fixes to available plugins. We are part of those working their buns off, and still we get more benefit than cost, because copying software simply is free. It does not cost more to provide the same software to 10^8 people instead of only 10^3. But the bigger the user community is, the higher the number of contributors, even if they remain a tiny fraction overall.

      However, that's just not the mentality, so FOSS muddles along on the fringe of relevance, just as is has done for decades now.

      A lot of critical infrastructure nowadays is open-source driven: starting with billions of mobile phones and wireless routers, through databases, up to the very top 100 of supercomputing, and down again to wrist-watches.

      Seriously, your reasons sound surprisingly alike the typical Microsoft-BS.

      --
      Registered IRC nick on chat.soylentnews.org: qkontinuum
      • (Score: 4, Funny) by Grishnakh on Tuesday May 26 2015, @05:59PM

        by Grishnakh (2831) on Tuesday May 26 2015, @05:59PM (#188168)

        Can you show me a decent proprietary source-control-system?

        That's easy: IBM Rational ClearCase.

        Just kidding; that thing is an abomination, and it's mind-boggling that so many big companies pay huge amounts of money for licensing and support costs for it. In fact, ClearCase is basically the poster child for how awful proprietary software can be.

        • (Score: 2) by q.kontinuum on Tuesday May 26 2015, @07:01PM

          by q.kontinuum (532) on Tuesday May 26 2015, @07:01PM (#188194) Journal

          For a moment you had me there. I had to deal with ClearCase once, some time ago...

          --
          Registered IRC nick on chat.soylentnews.org: qkontinuum
          • (Score: 2) by Grishnakh on Tuesday May 26 2015, @08:16PM

            by Grishnakh (2831) on Tuesday May 26 2015, @08:16PM (#188245)

            I had one job where I had to use it daily. It really wasted a lot of my time. AFAICT, it might have made a little sense back in the 80s when it was first developed, but it stopped making any sense after about 2000, when everything else passed it up.

            What ClearCase really is, is a poster child for inertia. It made sense for big companies to use it for big projects at a time when there weren't any really good alternatives (CVS existed back then, but for really big projects, especially multi-site projects, it had a lot of issues). So these big companies bought into it, made it their "corporate standard", and now, decades later, they're stuck with it and refuse to change because everything is locked up in it. I believe there are actually some programs to migrate CC data to Subversion or Git, but again inertia rears its ugly head: no one wants to be the one to risk their management reputation on doing it, so it never gets done.

        • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @10:44PM

          by Anonymous Coward on Tuesday May 26 2015, @10:44PM (#188336)

          ClearCase was killed off by CVS and Subversion, and Git's progress is now killing off SVN. Thank fucking god. At least it isn't CA Harvest *shudder*

      • (Score: 1) by Dr Spin on Tuesday May 26 2015, @07:00PM

        by Dr Spin (5239) on Tuesday May 26 2015, @07:00PM (#188192)

        Use proprietary software for infrastructure?

        Would you stake your entire business on a horse when its legs are hidden under a blanket?

        Are you nuts?

        There may be justification for closed source apps, but the only justification for closed source infrastructure is that there is no open source alternative.

        Specify Oracle once, regret forever!

        --
        Warning: Opening your mouth may invalidate your brain!
      • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @07:45PM

        by Anonymous Coward on Tuesday May 26 2015, @07:45PM (#188224)

        it only works because a few core people have worked their buns off

        I was afraid that the editor would put my Google translation up beside the original Spanish-language page link (and he did).
        In the process, my note at the end of the summary mentioning the brand of the whiteboard with the LOUSY MANUFACTURER SUPPORT got truncated.
        That manufacturer is Hitachi.

        copying software simply is free.

        I would have used the word "redistributing"--but, yeah.

        It does not cost more to provide the same software to 10^8 people instead of only 10^3.

        When talking about fixing a shortcoming, my way of saying that is "You only have to solve the problem ONCE".
        Send your solution upstream to the kernel|distro|app guys; park it on a server and let everyone know they can make copies for themselves and/or send copies to anyone and/or link to your page.

        Folks trying to make FOSS sound difficult are being disingenuous.

        This stuff goes directly to M$'s recent attempts to claim that they have released stuff as "open".
        Heh. Just try to -alter- that stuff and -redistribute- it.
        You'll find out in a big hurry how "open" it is when an M$ lawyer sends you a nastygram after you have tried to -share- your improvements to M$'s code.

        "We will allow you to read this" is NOT "open".

        -- gewg_

        • (Score: 2) by q.kontinuum on Tuesday May 26 2015, @08:04PM

          by q.kontinuum (532) on Tuesday May 26 2015, @08:04PM (#188235) Journal

          This stuff goes directly to M$'s recent attempts to claim that they have released stuff as "open".

          To honour the facts: AFAIK, MS is nowadays one of the bigger contributors to the Linux kernel. This [theinquirer.net] link is already three years old, but from what I read, they are still contributing (maybe a little less than in 2012).

          I could well imagine them funding Poettering and systemd ;-)

          --
          Registered IRC nick on chat.soylentnews.org: qkontinuum
          • (Score: 2, Informative) by Anonymous Coward on Tuesday May 26 2015, @08:42PM

            by Anonymous Coward on Tuesday May 26 2015, @08:42PM (#188265)

            As I have mentioned previously, [soylentnews.org] most of the kLOCs that MICROS~1 has "contributed" are crap and have to be removed.
            As also mentioned there, the only reason M$ "contributed" was they got caught violating GPL--otherwise their code would have remained closed and proprietary.

            ...and it's interesting that you mention 2012.
            That was the year that OpenStack dropped Hyper-V because M$ wouldn't continue to support it.
            (Also mentioned in my previous post.)
            I have also recently seen an item that says M$ is trying to get back in through a side door via an OpenStack vendor with low morals concerning with whom they will associate.

            Note the Mod'ing of my comment there as well.
            M$ fanboys have serious problems with the truth about Redmond.

            -- gewg_

        • (Score: 2) by Grishnakh on Tuesday May 26 2015, @08:21PM

          by Grishnakh (2831) on Tuesday May 26 2015, @08:21PM (#188247)

          Send your solution upstream to the kernel|distro|app guys; park it on a server and let everyone know they can make copies for themselves and/or they can send copies to anyone and/or link to your page.

          Folks trying to make FOSS sound difficult are being disingenuous.

          It's not completely disingenuous. The problem is, if a FOSS solution simply doesn't exist, creating one takes time and energy and expertise. How much is a function of the difficulty of the project. So if a proprietary solution already exists, it's easier to just use that. However, you're right, if enough people (or the right people) get annoyed and someone makes a FOSS solution, suddenly this problem is no longer a problem for anyone; anyone can download it for free and use it. But someone's gotta do the work first. For some projects, this isn't as hard, because interested people exist who want to tackle the project (look at, say, Inkscape). For other projects, no one wants to bother (look at, say, tax preparation software).

          Someone else made a good comment: imagine how things would be if everyone, instead of spending $$$ on proprietary solutions, donated 10% of that to FOSS development.

          • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @09:05PM

            by Anonymous Coward on Tuesday May 26 2015, @09:05PM (#188277)

            Another thing that I haven't seen mentioned yet is bounties on specific bugs|desired features.

            The advent of online crowdsourcing makes collective action by folks with the same gripe easier than ever.

            Anyone ever see anything like that happen with proprietary software?

            -- gewg_

    • (Score: 3, Interesting) by Nerdfest on Tuesday May 26 2015, @04:38PM

      by Nerdfest (80) on Tuesday May 26 2015, @04:38PM (#188123)

      I did a rebuild about a year ago on a laptop for a friend who was taking a nursing program. After several crashes, malware infections, and the loss of some data, I rebuilt the machine for her with a new install of Windows 7 and a partition for Ubuntu Linux. I showed her how to select the OS at boot time.

      She ended up finishing the report she'd lost data on in Linux, and was pretty pleased. This weekend she came by to see if I could fix her machine: Windows would no longer boot. I assumed it was a malware problem or something again, but apparently she'd only ever booted it accidentally and installed updates. She needed to get in this time as there was a piece of work software that needed Windows.

      Turns out she actually preferred running Linux ... and this was the Unity desktop, which I'm not even that thrilled with myself. I set Windows up in a VM this time (snapshots are so handy when running Windows). She'd set up wallpaper, installed software like DarkRoom (great photo editing suite), and never had any problems or had to ask for help (except for how to get music on her Samsung phone ... they do something weird and don't support MTP).

      I think people are a little out of date on just how usable FOSS software is.

      On a related topic, it took six frikkin' reboots and many hours before all the updates were installed. I sure don't miss that.

      • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @10:56PM

        by Anonymous Coward on Tuesday May 26 2015, @10:56PM (#188341)

        Years ago, I worked in a Windows beige box shop. The owner promised someone a new computer by a particular time of day, without running it by me first, and didn't bother telling me until 2 hours before it was due to arrive. No real problem, hardware takes about 30-40 minutes to throw into a case, Windows took about the same to install. Then came the patches.

        Now, keeping in mind that this was a top-of-the-line machine for the day, it took two hours just to load the service pack. That rebooted, and then I had to install the next batch of patches (the customers were farmers who didn't have broadband), which took another hour or so. They were really angry when they left - I was angry too, as the boss had blamed me for it after actively keeping me from starting on the machine until late in the day.

        Aside: That actually wasn't the first time he'd done that. The first time he did that, a customer brought her computer in, and every time I put it on the bench to work on it in the knowledge that he'd promised a one day turn-around, he took it off and shouted at me. Eventually, the woman arrived for her computer and I hadn't started on it, so he blamed me for that. She left, angry, I put it on the bench, and got shouted at for doing so.

        She came back an hour later, and I'd just put it on the bench again and this time she was shouting at him because it wasn't ready. Eventually, when I was part way through the job, she just started shouting to give him the bloody thing, which she took away to the competition a few doors down the road, and they had it up and running in about 90 minutes. She even bagged him on a few review sites for it, along with a few dozen other customers where he'd sent me out without telling me what was going on, and they were disgusted by it.

        There was one particular church I went to, where I had problems with one network install because of a package he knew and I didn't. He went up to work on it, fucked the whole lot up, put two of the five computers and the modem on 192.168.1.*, one on 10.1.1.*, another on 10.0.0.*, and another on 192.168.0.*, came back, told me I was going to have to go back and just work it out, oh, and sell them a new laptop because the network access problems were clearly caused by the one guy who had a MacBook. I went back, rebuilt the network again, fixed the package (took about a minute when I realised what had happened), and was done.

        I was shouted at for not selling them a new laptop because the MacBook was clearly the problem and when they rang up in a few days with the same problem, I was going back up there in my own spare time to fix it. We didn't hear from them again.

        Unbelievably, it's 7 years later and the guy is still in business.

    • (Score: 2) by meisterister on Tuesday May 26 2015, @07:13PM

      by meisterister (949) on Tuesday May 26 2015, @07:13PM (#188198) Journal

      - Interoperability. While you can put this on whichever side you care to (proprietary lock-in is the big lever for established players), it is a problem. I can read/write .doc and .xls with no problem, but as soon as someone comes with OOXML formats, interoperability goes down the toilet.

      I would like to point out that this is pretty much false for Windows versions of Office. Libre/Open Office has been able to read Microsoft's file formats for quite a while now; the main problem arises when MS products need to read the free documents. On Windows, Office seems to be able to read OpenDocument Text files and such fairly well, but on the Mac all hell breaks loose as soon as you try.

      So basically the problem is that Microsoft is incompatible with Free Software more than Free Software is incompatible with Microsoft.

      --
      (May or may not have been) Posted from my K6-2, Athlon XP, or Pentium I/II/III.
    • (Score: 2) by darkfeline on Wednesday May 27 2015, @10:47PM

      by darkfeline (1030) on Wednesday May 27 2015, @10:47PM (#188821) Homepage

      >FOSS is generally less capable than proprietary software. Photoshop vs. Gimp. LibreOffice-Impress vs. PowerPoint.

      This is true. LibreOffice Writer still doesn't support outlining, for god's sake.

      >FOSS documentation sucks.

      All documentation sucks, as a rule, but I much prefer FOSS documentation to proprietary documentation, which as a rule doesn't exist. With FOSS, I at least have a real hope of figuring out how something works: read the official documentation -> use Google, public discussion forums, and the mailing list -> read the source. With proprietary software: read the official documentation (utterly useless or non-existent) -> use Google -> give up.

      Here's one example: I was looking for a MS Word feature recently. I couldn't find it in the built-in help. I found it on MS's website using Google, except it applied to a different version of Word and the feature has been moved somewhere else. More Googling, no luck. Playing around with Word I managed to find the feature and enable it somewhere in the GUI maze. Now I had a greyed-out button on my GUI toolbar. Great.

      After more experimentation, I did manage to get it to work, but the documentation was not very helpful.

      >Interoperability. While you can put this on whichever side you care to (proprietary lock-in is the big lever for established players), it is a problem. I can read/write .doc and .xls with no problem, but as soon as someone comes with OOXML formats, interoperability goes down the toilet.

      The last time I had an ODF document corrupted was never. The last time I had a Word document corrupted was a few years ago. Your results may vary. In particular, interoperability between different distributions of fools varies.

      Note that the universal open source format is plain text, which works very well everywhere.

      --
      Join the SDF Public Access UNIX System today!
  • (Score: 4, Interesting) by VLM on Tuesday May 26 2015, @12:57PM

    by VLM (445) on Tuesday May 26 2015, @12:57PM (#187992)

    The real story is switching to the web

    cloud-based proprietary office solution

    Locally, ten years ago the elementary schools were almost purely native-app based: expensive proprietary educational software installed on Windows desktops.

    Currently, with the iPad rollout etc., I don't think the elementary schools are using a single native desktop binary application... all web based, so they can use a legacy desktop or their iPads.

    Once you do that, switching desktops is pretty easy, the only software compatibility you need is "does it run firefox/chrome? OK we're done."

    It applies to Linux/BSD/FOSS also. I moved a bunch of machines from Debian to FreeBSD a while ago. Same old xmonad, same old emacs, same old urxvt; can't tell I'm on a FreeBSD desktop vs a Linux one. Once you dig beneath the surface, ZFS is cool, no systemd is cool, etc.

    • (Score: 4, Informative) by q.kontinuum on Tuesday May 26 2015, @05:38PM

      by q.kontinuum (532) on Tuesday May 26 2015, @05:38PM (#188158) Journal

      The real story is switching to the web

      cloud-based proprietary office solution

      Not really...

      [...] Issues [with proprietary document formats] were temporarily resolved by using a cloud-based proprietary office solution, says Lanero, giving the IT department time to complete the switch to open standards-based office solutions. The school now mostly uses the LibreOffice suite of office tools.

      Personally, I use my laptop a lot on the train and wouldn't dream of having my work interrupted whenever my mobile connection weakens. And I'd rather go naked in the streets than type my personal mails or manage my personal photos on a cloud service. (Note I didn't write "e-mail"; unencrypted email has, privacy-wise, the same status as postcards. Paper mails to my tax counsellor are an entirely different matter.)

      --
      Registered IRC nick on chat.soylentnews.org: qkontinuum
    • (Score: 2) by Anal Pumpernickel on Wednesday May 27 2015, @03:46AM

      by Anal Pumpernickel (776) on Wednesday May 27 2015, @03:46AM (#188439)

      That sounds great. Store all your data on servers controlled by someone else. Privacy? Who needs that?

      And to make it worse, students are forced to go there. This "cloud" nonsense is unacceptable.

  • (Score: 3, Interesting) by Marand on Tuesday May 26 2015, @01:09PM

    by Marand (1081) on Tuesday May 26 2015, @01:09PM (#187994) Journal

    Nice to see more adoption of open solutions instead of funneling education funds into licensing fees. It probably helps that the proprietary OS vendors are US companies, given the combination of growing mistrust (thanks, NSA) and the knowledge that any money that goes to Microsoft or Apple basically leaves the country, never to be seen again. Even if (hypothetically) there are no cost savings in going from licensing Windows to paying for knowledgeable Linux admins, spending more on locals who can handle administration helps their economy more than handing all that cash to a US company, where it'll disappear into pockets and tax loopholes.

    I think that's likely a major factor in limiting US adoption, actually. If Windows and OS X were, say, Russian products, you can practically guarantee there would be a much stronger push in the US to adopt something else. As it is, they're "home turf" products, so it's easier to rationalise it away as supporting US businesses.

    A shame, really, because I think schools, especially, should be using open source, regardless of what's used elsewhere. An open system is better for a learning environment than a black box you aren't allowed to even look at too closely. Using an open source OS means you not only have an open system in the schools, but you can also provide liveCDs to the students and say, "Here, you can use it at home and learn how it works! Tear it apart, do whatever you like!" You know, the sort of thing you can't do with proprietary software unless you want the BSA at your door demanding money.

    Most students won't care, but even if only a handful benefit from it, it's a win. For the rest, at worst it won't hurt, because they'll still be able to pick up Windows or OS X easily enough later when needed, assuming they aren't already familiar.

    • (Score: 4, Insightful) by VLM on Tuesday May 26 2015, @01:27PM

      by VLM (445) on Tuesday May 26 2015, @01:27PM (#188003)

      they'll still be able to pick up Windows ... easily enough later when needed

      Something to think about is the death of the operating system and desktop environment paradigm.

      My workplace used to use native apps, but they got rid of the last one for general use a couple of months ago. Everything happens in a web browser now except for legacy MS Office, and they're trying to run away from that too. One special-purpose app I still use is the VMware vSphere client, because there's a web interface but the admins can't set it up correctly or whatever, so I'm stuck on the slow PITA client when I need to mess with cloud images.

      Figuring out how to use Windows as a chrome bootloader isn't much harder than figuring out how to use Linux as a chrome bootloader...

      Non-casual games are still native, legacy MS Office is still native, weird hardware is still native (think FPGA development environments). That's about it for native apps.

      One workplace advantage is that once everything moves to a web browser, the execs with their Apple laptops and iPads don't need any special handling. They get their desire of being expensive special people without increasing anyone else's workload, which is cool.

      Even just ten years ago, native windows apps used to be business critical. They're just gone now, replaced by web interfaces.

      In the tech field there's a weird "moth to flame" attraction of some subcultures for ever more elaborate desktop environments, meanwhile the actual users are becoming less interested in native apps and environments every day. Eventually this divergence is going to be even more comical than it already is.

      • (Score: 2) by opinionated_science on Tuesday May 26 2015, @01:32PM

        by opinionated_science (4031) on Tuesday May 26 2015, @01:32PM (#188006)

        I modded you funny for "their desire of being expensive special people".

        Priceless!!!!

      • (Score: 2) by Marand on Tuesday May 26 2015, @01:53PM

        by Marand (1081) on Tuesday May 26 2015, @01:53PM (#188017) Journal

        Something to think about is the death of the operating system and desktop environment paradigm. [...] Figuring out how to use Windows as a chrome bootloader isn't much harder than figuring out how to use Linux as a chrome bootloader...

        That's an interesting point, though I'd say it's pretty hard to beat Linux as a Chrome bootloader [wikipedia.org], or even a Firefox bootloader [mozilla.org] if that's more your style.

        You still get a better gaming bootloader out of a Wintendo [catb.org], though Valve is trying [wikipedia.org].

        • (Score: 2) by wantkitteh on Tuesday May 26 2015, @06:42PM

          by wantkitteh (3362) on Tuesday May 26 2015, @06:42PM (#188183) Homepage Journal

          This. I have successfully expunged all Windows machines from my life except for a couple of highly proprietary use cases at work, and video games. I'll likely be taking my best shot at shifting my gaming to Linux by the end of the week, depending on deliveries. SteamOS gets first shot, though I'm fairly sure I'll end up on Mint Cinnamon afterwards when I find non-SteamOS things I want to do. [dolphin-emu.org] Twitch streaming will likely happen along the way too, although OBS seems happier on Windows right now, with fewer glitches in the encoded video stream (YMMV) and better plugin support, but I expect that'll change in time.

      • (Score: 3, Informative) by LoRdTAW on Tuesday May 26 2015, @02:15PM

        by LoRdTAW (3755) on Tuesday May 26 2015, @02:15PM (#188028) Journal
        • (Score: 2) by VLM on Tuesday May 26 2015, @03:50PM

          by VLM (445) on Tuesday May 26 2015, @03:50PM (#188083)

          You have to install something like the Xilinx software on a machine connected to the FPGA board.

          You could make an FPGA "appliance" that you'd connect to via a web interface or VNC/rdesktop, using a Raspberry Pi maybe, somehow? It would be slow.

          A truly amazingly advanced dev board could host a webserver that allows you to upload to the FPGA from a web interface, I suppose.
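
          The upload-a-bitstream web interface part is actually only a few lines on the server side; everything behind it is the hard bit. A sketch (Python stdlib only; the actual programming step is a made-up placeholder, since each vendor's tool differs):

          ```python
          import http.server
          import os
          import tempfile

          # Where the received bitstream is parked; a real appliance would hand
          # this file to its vendor-specific programming tool (omitted here).
          UPLOAD_PATH = os.path.join(tempfile.gettempdir(), "uploaded.bit")

          class BitstreamHandler(http.server.BaseHTTPRequestHandler):
              """Accept a raw bitstream via HTTP POST and store it."""
              def do_POST(self):
                  length = int(self.headers.get("Content-Length", 0))
                  data = self.rfile.read(length)
                  with open(UPLOAD_PATH, "wb") as f:
                      f.write(data)
                  # A real board would now invoke its programmer, e.g.:
                  # subprocess.run(["some_vendor_tool", UPLOAD_PATH])  # hypothetical
                  self.send_response(200)
                  self.end_headers()
                  self.wfile.write(b"received %d bytes\n" % len(data))

          def run_appliance(port=8080):
              http.server.HTTPServer(("", port), BitstreamHandler).serve_forever()
          ```

          The synthesis/place-and-route side is of course a different beast entirely; this only covers the "push a finished .bit file from any browser-capable device" half of the idea.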

          • (Score: 1) by tftp on Tuesday May 26 2015, @04:06PM

            by tftp (806) on Tuesday May 26 2015, @04:06PM (#188091) Homepage

            Huh? If I understood you correctly: Xilinx FPGAs can be programmed by any OS that can talk to a USB serial port (FT2232H). Lattice CPLDs are programmed over JTAG, I2C, or SPI. None of that is an OS-specific process. Xilinx has supported Linux for ages. Also, Xilinx has partial reconfiguration - which does allow you to run a web server on the device itself. But it's more practical to use a boot Flash with multiple images and trigger full reconfiguration when done.

            • (Score: 2) by VLM on Tuesday May 26 2015, @04:22PM

              by VLM (445) on Tuesday May 26 2015, @04:22PM (#188108)

              None of that is an OS-specific process.

              All of that is an OS-specific process and can only be done from a local native application of some sort, even if it's a Java app that runs on anything with a JVM, hardwired to the physical board via USB cable. (And don't get me started on companies that fill their USB ports with silicone to prevent IP theft.)

              None of those can be done, say, from my phone, over its wifi connection, using nothing more than the phone's web browser connecting to a site. Unless you stretch the definition to a Windows box with the application installed on it and a VNC server, where any VNC client on the LAN can connect and run the native app remotely.

              In corporate IT land or academia land I need signed forms and permissions and evaluations for every individual box or individual user account that has an app installed. With a web interface I need one guy to get permission to plug it in, assign an address and DNS name, and then anyone in the company can type a url into their browser without first filing an IT permission slip.

              I know Xilinx supports Linux natively as a native app; that's the only way I've ever programmed an FPGA or CPLD. I guess it works on Windows and OSX too, although I've never tried it. AFAIK there is no dev board out there that plugs in an ethernet cable, DHCPs and Bonjours itself, and then you connect any ole web browser to http://something:80 [something] and there's a complete development environment without installing anything on the machine that runs the web browser.

              It's like the difference between running the Outlook Express native client, which is installed on the box, and Outlook webmail.

              • (Score: 1) by tftp on Tuesday May 26 2015, @05:03PM

                by tftp (806) on Tuesday May 26 2015, @05:03PM (#188140) Homepage

                I know Xilinx supports Linux natively as a native app; that's the only way I've ever programmed an FPGA or CPLD. I guess it works on Windows and OSX too, although I've never tried it. AFAIK there is no dev board out there that plugs in an Ethernet cable, DHCPs and Bonjours itself, and then you connect any ole web browser to http://something:80 [something] and there's a complete development environment without installing anything on the machine that runs the web browser.

                That is because that "board" would have to be a complete PC with a good amount of RAM and a fast CPU. This is needed to do the "complete development environment" per your request. Place and route is not exactly instantaneous, especially on large parts and complex designs.

                There is also that little issue with licensing. There are free versions of the Xilinx SDK, but there are also large editions that include professional features. Those are not free. I guess one could buy a PC that is loaded with all the software... but why bother, if you can buy the software separately and run it on any PC of your choice? Or maybe not one PC but a cluster? (Xilinx has supported that for years now.)

                There is only a limited need for a "board" that just lets you upload a .bit or a Flash image file. That role is currently filled by any old PC that runs a ChipScope server: you connect to that PC over TCP/IP and do your thing. I don't believe that anyone manufactures a standalone Ethernet-enabled JTAG pod. Is there a market for that? I don't know, but if there is, it's a very small niche market. Perhaps it is large enough for a one-man company. But the price of the board would have to be lower than the price of a Digilent JTAG module plus the R-Pi.

                • (Score: 2) by VLM on Tuesday May 26 2015, @05:33PM

                  by VLM (445) on Tuesday May 26 2015, @05:33PM (#188156)

                  Yes, although it's important to remember that WRT screwing around, I'm not doing anything more technologically impressive than the state of the art in 2005, so a single-chip computer in 2015 with the specs of a decent 2005 desktop, something "pi-like", would be more than capable enough for advanced hobbyists and uni students. People were doing "real work" in '95 with '95-class hardware, so screwing around in '05 wasn't very challenging, and now that '05-class hardware fits on a single $5 chip in '15...

                  I mean, seriously, my first "hello world" project, just to prove the toolchain was up and running correctly, was a single-bit full adder: three switch inputs and two LED outputs. I think that was on the first-generation Digilent CPLD board. My first discovery was that the LEDs were active low, LOL. I would imagine a stereotypical uni class is equally technologically unambitious. It's not like semesters are longer or kids are smarter than in the very recent past. And the SoC of 2020 that has the specs of a 2010 desktop will be even more ridiculously overpowered when doing things that would have been challenging in '95. So in the long run it seems inevitable?

                  Another interesting sideline with students and hobbyists is that time is not money. So if it takes 5 minutes to compile a Z80 core, like it used to in 2005 or whatever it took exactly, well, it's not like uni students are highly paid...
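                  The single-bit full adder described above is simple enough to check in plain Python, a rough stand-in for what the HDL version computes (the three switches map to a, b, and carry-in; the two active-low LEDs map to sum and carry-out):

```python
# A single-bit full adder, as in the "hello world" project above:
# the sum bit is the XOR of the three inputs, and the carry-out is
# their majority function.
def full_adder(a, b, cin):
    s = a ^ b ^ cin                          # sum bit
    cout = (a & b) | (a & cin) | (b & cin)   # carry-out (majority of inputs)
    return s, cout

# Exhaustive check against integer addition over all 8 input combinations.
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s, cout = full_adder(a, b, cin)
            assert s + 2 * cout == a + b + cin
```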

          • (Score: 2) by LoRdTAW on Tuesday May 26 2015, @04:25PM

            by LoRdTAW (3755) on Tuesday May 26 2015, @04:25PM (#188109) Journal

            You can interface with an FPGA over SPI or even I2C to upload a bitfile. An Arduino with the proper library could upload bitfiles from an SD card, or even over BT or wifi. Uploading bitfiles on the fly from an RPi is no big deal either. But you can't compile the bitfiles on an RPi, simply because there is no ARM port of the dev tools, yet.

            The only instance when you might need Windows is if the dev board manufacturer did not release a Linux driver for their board's integrated serial/JTAG adapter. That is just laziness, and those boards should be avoided if you need Linux support. Sometimes they also have Windows-only tools for peripheral configuration, like writing a binary image to onboard flash. Worst case, you can use a VM running Windows for those tools.

            I have developed for both the Digilent Nexys 2 and the Terasic DE0-nano on Linux.
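            A minimal sketch of the Pi-side upload described above, assuming the Linux spidev interface (the third-party py-spidev package) is wired to the FPGA's slave-serial/SPI configuration port. The pin handling (PROG_B, DONE) and bitfile preprocessing are deliberately omitted, so this is an illustration, not a complete programmer:

```python
# Sketch: streaming an FPGA bitfile over SPI from a Raspberry Pi.
# Assumes the bitfile is already in the raw format the FPGA expects;
# configuration-pin handshaking is not shown.

def chunk_bitfile(data, chunk_size=4096):
    """Split a bitfile into chunks small enough for one SPI transfer."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def upload_bitfile(path, bus=0, device=0, speed_hz=10_000_000):
    """Stream the bitfile over SPI. Requires py-spidev on an actual Pi."""
    import spidev  # imported here so the pure helper above runs anywhere
    spi = spidev.SpiDev()
    spi.open(bus, device)
    spi.max_speed_hz = speed_hz
    try:
        with open(path, "rb") as f:
            for chunk in chunk_bitfile(f.read()):
                spi.xfer2(list(chunk))  # clock the configuration data out
    finally:
        spi.close()
```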

      • (Score: 2) by kaszz on Tuesday May 26 2015, @02:19PM

        by kaszz (4211) on Tuesday May 26 2015, @02:19PM (#188030) Journal

        Even just ten years ago, native Windows apps used to be business critical. They're just gone now, replaced by web interfaces.

        Too bad if anything were to happen to the internet connection, or if any of your documentation is secret.

        • (Score: 2) by VLM on Tuesday May 26 2015, @03:59PM

          by VLM (445) on Tuesday May 26 2015, @03:59PM (#188085)

          Web doesn't necessarily imply public access. It pretty much does for consumer stuff like "internet refrigerator" or "internet thermostat", admittedly.

          Plenty of engineering tools at work live on RFC1918 addresses with no external NAT access and provide a tasty easy to use web interface.

          In "the really old days" we used to have things like remote test gear connected by honest to god serial ports and modems for "remote" and in the 90s I got involved in projects to put them on terminal servers so we could just telnet from any desktop machine on our LAN. Since the turn of the century all that is gone and everyone ships web clients.

          Even our hourly-employee time clock is now a website. All our reporting systems. All our UPSes. Transfer switches and their battery chargers. 20+ years ago, logging was a bunch of RS232 alert lines feeding a dot matrix line printer; that's all online via the web for monitoring subsystems now. I don't have access to it, but I'm told the HVAC "front panel" is entirely virtual and online now.

          All that stuff, 10+ years ago, would have been a native desktop app. I remember having to support them and having to fight IT to install our engineering applications. It's a lot easier to get permission to connect to the engineering/production network and just access a certain URL in a web browser.

          • (Score: 2) by kaszz on Tuesday May 26 2015, @04:10PM

            by kaszz (4211) on Tuesday May 26 2015, @04:10PM (#188095) Journal

            The problem with website-style access is that it's a poor interface for other software to interact with. So software-to-software communication becomes a huge and lengthy hurdle.

            • (Score: 2) by kaszz on Tuesday May 26 2015, @04:13PM

              by kaszz (4211) on Tuesday May 26 2015, @04:13PM (#188101) Journal

              s/lengthy/unreliable/

            • (Score: 2) by VLM on Tuesday May 26 2015, @04:28PM

              by VLM (445) on Tuesday May 26 2015, @04:28PM (#188113)

              The Unix philosophy of small cooperative tools is actively opposed. Thus the giant monolith.

              In the old days of telnet servers and RS-232 connections I was the guy stuck writing "EXPECT" scripts to automate work.

              The modern solution is presenting some kind of REST-ish, standard-ish API. Being a standard, there are of course something like 15 incompatible standards. But machine-to-machine automation is hardly impossible over the web. I've been stuck doing all sorts of SOAP-y, WSDL-y foolishness over the years.
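              A toy illustration of that kind of machine-to-machine access over HTTP, using nothing but the standard library: a minimal endpoint (a stand-in for some instrument's embedded web interface) and a client that queries it. The `/api/status` path and the JSON fields are made up for the example:

```python
# Minimal "REST-ish" endpoint plus a machine-to-machine client.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/status":
            body = json.dumps({"ok": True, "load": 0.42}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

def fetch_status(port):
    """Query the endpoint and decode the JSON, no native client needed."""
    url = f"http://127.0.0.1:{port}/api/status"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Port 0 asks the OS for any free port; serve in a background thread.
    server = HTTPServer(("127.0.0.1", 0), StatusHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    status = fetch_status(server.server_address[1])
    print(status["ok"], status["load"])
    server.shutdown()
```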

              • (Score: 2) by kaszz on Tuesday May 26 2015, @04:41PM

                by kaszz (4211) on Tuesday May 26 2015, @04:41PM (#188127) Journal

                Not impossible. Just very cumbersome.

                It's almost like.. oh, this web interface is messy. Fix it? Nah. Slam OpenWrt or the like onto it and be done with it, using some script on the device.

          • (Score: 2) by q.kontinuum on Wednesday May 27 2015, @08:29AM

            by q.kontinuum (532) on Wednesday May 27 2015, @08:29AM (#188523) Journal

            Web doesn't necessarily imply public access.

            It usually means giving some cloud-service provider access to the documents. Big companies can run their own servers, etc., but smaller businesses won't. If I found out my tax adviser, physician, priest, therapist, or other business partner was managing my personal data on Google Docs or another cloud service, I'd look for another one.

            It pretty much does for consumer stuff like "internet refrigerator" or "internet thermostat",

            For consumers it definitely means entrusting their data to some other companies, usually US-based. This is a no-go for me. The US has made it pretty clear that, while they might eventually consider obeying privacy laws regarding their own citizens, foreigners don't have any such privileges.

            --
            Registered IRC nick on chat.soylentnews.org: qkontinuum
      • (Score: 3, Informative) by Phoenix666 on Tuesday May 26 2015, @02:25PM

        by Phoenix666 (552) on Tuesday May 26 2015, @02:25PM (#188033) Journal

        For me the problem with doing away with native apps in favor of cloud versions is that connectivity in the United States is crap. I have a business connection in NYC and it's still chancy. The incredible security and privacy nightmare of the NSA and its brethren criminal organizations aside, productivity would take a hit on a regular basis if I had to rely on the cloud for anything. Maybe it's a different story in parts of the world that have 1st World broadband, like South Korea, but the United States does not seem to be cloud-ready.

        --
        Washington DC delenda est.
        • (Score: 2) by kaszz on Tuesday May 26 2015, @04:46PM

          by kaszz (4211) on Tuesday May 26 2015, @04:46PM (#188130) Journal

          So USA is still a 1st world country? ;-)

        • (Score: 2) by VLM on Tuesday May 26 2015, @04:56PM

          by VLM (445) on Tuesday May 26 2015, @04:56PM (#188137)

          None of the browser driven apps at my employer are commercial public cloud, at least not that I can think of.

          A $250K machine with an embedded web interface (think of a stereotypical network printer, but running an engineering tool or production machine instead of a boring printer), OK.

          A box provided by the manufacturer that plugs into our network and is supposed to be treated as an appliance, although it's really just a Windows or Linux install with Apache and some support code. An example of this architecture (one we don't actually use) would be GitHub Enterprise. OK.

          A virtual image on the private vmware cluster and the private NAS farm. OK. I have like 20 of them doing various things. None of them have any public access. I don't even know where they're located today although I think they're in the midwest somewhere. They could be at the coastal centers again for all I know. It really doesn't matter.

      • (Score: 2) by novak on Tuesday May 26 2015, @11:33PM

        by novak (4683) on Tuesday May 26 2015, @11:33PM (#188349) Homepage

        Not really disagreeing with you, but adding to the list of areas where native applications are required: engineering. I worked in a place that did a fair bit of CFD simulation, as well as some mechanical and thermal FEA. For data processing on that level you really need as much memory as you can get; that's actually what determined how big a model you could load. Everyone had 12 GB of RAM on their desktop, but there were a few community boxes with 96 GB to 128 GB of RAM for more serious processing. You would never, ever want to add the overhead of a browser on top of that.

        The simulation clusters themselves also needed a lot of RAM, typically several GB per core, with more memory on the head node for partitioning. And the clusters ran Linux (Red Hat, unfortunately, but Red Hat 5 at least). Over the years I worked there we had only one Windows cluster, and it was a total nightmare: if the machines were not patched exactly in lockstep, the simulations would crash. Various Windows updates affected simulation performance significantly, often fatally, and the cluster generally failed to run above about half of what our Linux clusters did.

        So basically: yes, the desktop is dying, and where it lives on linux is much more fit to survive (except, of course, the legacy MS Office VBA plugins, may they perish in fire).

        --
        novak
  • (Score: 5, Interesting) by bzipitidoo on Tuesday May 26 2015, @01:14PM

    by bzipitidoo (4388) on Tuesday May 26 2015, @01:14PM (#187998) Journal

    It's not just the purchase of licenses. Tracking a bunch of software licenses is a pain in the rear. Site licensing helps, but there are still many packages and versions, all with lengthy EULAs and different terms, most of which won't hold up in court but aren't worth the trouble to fight out. It's not even close to the convenience of not having to track licenses at all, and of not having to worry that the Business Software Alliance might come calling and demand access so they can audit every PC on site.

    • (Score: 5, Interesting) by Phoenix666 on Tuesday May 26 2015, @02:29PM

      by Phoenix666 (552) on Tuesday May 26 2015, @02:29PM (#188035) Journal

      My favorite moment at an old workplace was when they had to buy more proprietary software to track the software licenses on the other proprietary software. We called it a recursive clusterfuck.

      After that I never allowed proprietary software in my departments again--I used only FOSS and hired only people who knew how to work with it. Worked brilliantly.

      --
      Washington DC delenda est.
    • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @08:11PM

      by Anonymous Coward on Tuesday May 26 2015, @08:11PM (#188238)

      Linux advocate Robert Pogson always remembers to mention that your M$ EULA only allows a limited number of client boxes to be connected to your M$ server.
      After you reach that limit, you need -another- server.
      ...and each of those subordinate boxes requires a Client Access License (CAL).

      Proprietary software is only useful as an example of how NOT to do things (nickel-and-dime folks).

      As Pogson likes to remind us, FOSS is the right way to do IT.

      -- gewg_

  • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @02:27PM

    by Anonymous Coward on Tuesday May 26 2015, @02:27PM (#188034)

    Honestly, if they have their stuff set up for the web, the OS is really unimportant.

    Actually, why isn't everything set up for the web?

    Here is one simple example that would be worth millions: if you put the math textbook onto a web server, all schools in this country and globally would have access to a single math textbook, and there would be no need for everyone to print their own personal copies.

    The savings only increase the further you carry the idea, but it's nothing to do with the operating system at all.

    • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @02:41PM

      by Anonymous Coward on Tuesday May 26 2015, @02:41PM (#188036)

      The MS Exchange browser plugin, having that work properly and with minimum support is a big deal for many institutions though.

      • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @04:02PM

        by Anonymous Coward on Tuesday May 26 2015, @04:02PM (#188089)

        I think that's because the online idea needs to be extended further. I use Node.js, which has a module called mailin. I store emails as JSON objects in a database; JSON is the format used at the front end, and with Node.js, JavaScript is the server language as well. Building an email interface took about a day (multi-user would only take another day), and a chunk of that was just waiting for my DNS to update with the new MX entry. I think part of the problem is that with proprietary software, a lot of the necessary "how the hotdog is made" operation has been hidden in compiled code, and open source (most especially script languages) is showing that not only is it not magic, it's ridiculously simple; it's trying like a fiend to be as simple as possible. However, finding advice like that about the web server and site is difficult, because people don't understand that universality and simplicity breed better operation: change is more possible when it's all understandable and standard.

        If they had their emails coming into the same web server handling the texts (which, as above, is not a difficult or impossible task), it would really show how irrelevant the OS is.
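        The "emails as JSON objects" idea above can be sketched in a few lines. The commenter used Node.js and mailin; this is just an analogous illustration in Python's standard library, and the field names and sample message are made up:

```python
# Sketch: parse a raw email and flatten it into a JSON document,
# the shape you might store in a database behind a webmail front end.
import json
from email import message_from_string

RAW = """\
From: teacher@example.org
To: lab@example.org
Subject: Whiteboard drivers

The Ubuntu packages are installed.
"""

def email_to_json(raw):
    """Parse an RFC 822 message and return it as a JSON string."""
    msg = message_from_string(raw)
    doc = {
        "from": msg["From"],
        "to": msg["To"],
        "subject": msg["Subject"],
        "body": msg.get_payload(),
    }
    return json.dumps(doc)

if __name__ == "__main__":
    print(email_to_json(RAW))
```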

    • (Score: 2) by tibman on Tuesday May 26 2015, @04:09PM

      by tibman (134) Subscriber Badge on Tuesday May 26 2015, @04:09PM (#188092)

      The harder part is trying to get all schools to agree on the same book(s).

      --
      SN won't survive on lurkers alone. Write comments.
      • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @04:46PM

        by Anonymous Coward on Tuesday May 26 2015, @04:46PM (#188129)

        The schools are an extension of will: if any authority decreed "we're going to use this text as an online universal source to save millions", everyone would expect that to be the case, and the pressure and ease of it would eventually wear down even the hardened opponents over the span of 5-10 years.

        • (Score: 2) by tibman on Tuesday May 26 2015, @04:55PM

          by tibman (134) Subscriber Badge on Tuesday May 26 2015, @04:55PM (#188136)

          I was thinking more about the book content. Different regions have widely different views of history and how subjects should be taught.

          --
          SN won't survive on lurkers alone. Write comments.
          • (Score: 2) by hendrikboom on Wednesday May 27 2015, @04:37PM

            by hendrikboom (1125) Subscriber Badge on Wednesday May 27 2015, @04:37PM (#188680) Homepage Journal

            Oh! The Horror! Suppose schoolchildren had access to other countries' textbooks!

            • (Score: 2) by tibman on Wednesday May 27 2015, @04:51PM

              by tibman (134) Subscriber Badge on Wednesday May 27 2015, @04:51PM (#188687)

              Hah : ) That is still not what i was trying to say though. Those other countries/regions will submit corrections to all the "mistakes" in your books. Though i suppose there is nothing wrong with having 20 different geometry books.

              --
              SN won't survive on lurkers alone. Write comments.
    • (Score: 2) by kaszz on Tuesday May 26 2015, @04:34PM

      by kaszz (4211) on Tuesday May 26 2015, @04:34PM (#188119) Journal

      Some cognitive researchers have found that handwriting is needed to make text stay in memory. So while good in theory, it may not work out.

      • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @04:39PM

        by Anonymous Coward on Tuesday May 26 2015, @04:39PM (#188126)

        I'm not opposed to limits on it. Education and children are best served by change that comes slowly and in layers. However, for the math text and certain others, the work is not performed in the book; it's just there as a reference, and students use notebooks for the work. It makes sense to universalize references, especially math, because they are meant to be universal to begin with. It would also eliminate much wasted, duplicated effort. A universal text could be overhauled on a regular basis with a decent budget, so that it's not just a text but a fantastic, and ever more fantastic, reference and tool. A book will eventually rot, cannot hold video/audio, and cannot be interacted with.

      • (Score: 0) by Anonymous Coward on Friday May 29 2015, @07:59PM

        by Anonymous Coward on Friday May 29 2015, @07:59PM (#189815)

        There's lots of bad science out there, so a citation would be nice so I could have a look and see if it is a study worth paying attention to. My gut feeling is that it isn't.

  • (Score: 0) by Anonymous Coward on Tuesday May 26 2015, @07:51PM

    by Anonymous Coward on Tuesday May 26 2015, @07:51PM (#188228)

    I was just trying Kubuntu 14.04 LTS, and the network config dialog is too big for the default resolution of 640x480 (when the graphics hardware isn't supported), and it doesn't really need to be that big. WTF! I can't remember whether this is a regression (wouldn't be surprised, given the other stupid changes in the UI) or whether Kubuntu has always had this problem. I used KDE for a few years and it had a fair bit of UI stupidity, but that was many years ago. Then I gave up on Desktop Linux, because it seemed more like the developers were sabotaging it, doing stupid crap like wobbly windows, or screwing up the sound system than actually making things better (if I'm going to put up with shit I might as well pick mainstream shit like Windows). Getting "WorksForMe" and other nonsense when I reported broken stuff didn't help.

    The other recent problem was that I couldn't find an easy way to find the fastest mirror (no, the defaults aren't the fastest; I had to look for a faster mirror, which turned out to be in a different country but 10x faster, and edit a text file). Yeah, I know the lack of $$$$ = no CDN, but there used to be a utility to find fast mirrors and then select them.

    I do use Linux regularly for server stuff though. I may change the mirrors for the servers (I've been putting up with the slow defaults - it's for work - good excuse to read soylent etc ;)).

    To me it's no surprise Desktop Linux hasn't caught on much: it was crap, it coincidentally became even crappier when Windows became crappier (that would have been a good opportunity to take market share, but as I said, it's practically like sabotage), and it stayed crap. OS X's share proves it's not mainly Microsoft's fault. And Android's share too.

    If you don't think Desktop Linux is crap that's fine with me. Maybe it works fine for your use case like for this Spanish school.

    But in the real world, many people like an OS where some old driver still works many years later, through many service packs and updates, even after the manufacturer in Taiwan or wherever is no longer in existence. The driver might be buggy; it might have security issues. But it might not, or nobody might ever exploit it anyway. The point is that, more often than not, it works as long as the hardware does. No need to recompile from unavailable source code just because of "kernel updates". No need for a defunct manufacturer to sponsor some OSS developer to keep maintaining it.

    • (Score: 0) by Anonymous Coward on Wednesday May 27 2015, @07:05AM

      by Anonymous Coward on Wednesday May 27 2015, @07:05AM (#188494)

      Hi, Hairyfeet! How's it going? Still haven't found those drivers? Did anyone ever tell you about ndiswrapper?