
posted by martyb on Monday December 17 2018, @11:17PM   Printer-friendly
from the I-don't-see-what-you-did-there dept.

Plex pushed out changes this last week to its Roku users (I am one) that made using Plex nearly impossible for some people: a light-and-dark-gray color palette, with white text on a light gray background, to the point that the quarter-screen-height Plex logo and the spinning "working" throbber were lost against the background.
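Low contrast like this is measurable: WCAG 2.x defines a contrast ratio computed from relative luminance, and white text on light gray falls far short of the 4.5:1 minimum for normal text (AA level). A quick sketch in Python, using a guessed hex value for the offending background since Plex's actual palette values aren't given here:

```python
def srgb_to_linear(c):
    # WCAG 2.x: convert an 8-bit sRGB channel to linear light
    c = c / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (srgb_to_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    # Ratio of lighter to darker luminance, each offset by 0.05
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

white = (255, 255, 255)
light_gray = (204, 204, 204)   # illustrative guess at the background color
print(round(contrast_ratio(white, light_gray), 2))  # → 1.61, far below the 4.5:1 AA minimum
```

Black on white scores the maximum 21:1, so a redesign that fails even casual inspection would also fail any automated accessibility check run before release.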

So war ensues... See the Plex.tv support forums if you must.

My question to you all: "What is TECH's responsibility to the handicapped?"

Should good TECH also have a backdoor method allowing those with usability issues to still use the product when TECH changes direction? What about lifetime pre-paid services that are now unusable? Should there be an immediate return of funds, so we can buy the second-best solution (now the best choice for us)? Should any change be signed off by a third-party auditor to ensure continued usability?

So again, asked differently, what is TECH's moral responsibility?


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by ilsa (6082) Subscriber Badge on Tuesday December 18 2018, @08:15PM (#776019) (5 children)

    It's not just handicapped people, although they are overwhelmingly affected.

    The problem is: why has the tech industry at large decided that idiotic design patterns trump good sense? We had this all figured out a couple of decades ago. Since then, what do we have? PlayStation button icons have replaced basic navigation symbols. Menus have been replaced by that idiotic three-lined hamburger whatever. Low-contrast design.

    IMO this is just another symptom of "modern" development practices: make things more convenient for the developer, not the user. Heck, the exploding JavaScript "ecosystem" has proven that developers have effectively pointed their middle finger at quality, efficient code. Let the sucker using the product just throw more hardware at the problem, because the developer either has no idea, or doesn't care, that their code runs like a quadriplegic elephant in a tar pit.

    At least, that's the only explanation I can think of as to why product quality has collapsed across the board in recent years.

  • (Score: 0) by Anonymous Coward on Wednesday December 19 2018, @06:24PM (#776413)

    It's the use of A/B testing. This is just a form of NHST (null hypothesis significance testing: testing a strawman null hypothesis that two different things affect the outcome in exactly the same way, then concluding whatever your favorite thing is), which destroys every field it touches.
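For the unfamiliar, here is a minimal sketch of the kind of test being described: a two-proportion z-test comparing click-through on two UI variants, where the null hypothesis is that both variants perform identically. The numbers are made up for illustration:

```python
from math import erf, sqrt

def two_proportion_z_test(clicks_a, n_a, clicks_b, n_b):
    """Two-sided z-test of H0: both variants have the same click rate."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Made-up numbers: variant B "wins" with p < 0.05, so the redesign ships,
# regardless of whether the extra clicks came from confused users.
z, p = two_proportion_z_test(clicks_a=200, n_a=10000, clicks_b=260, n_b=10000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

The parent's complaint, put concretely: the test only says the two variants differ; it says nothing about *why* they differ, so any favored explanation can be bolted onto a significant result.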

  • (Score: 2) by urza9814 (3954) on Wednesday December 19 2018, @07:13PM (#776441) Journal (3 children)

    It's...complicated, probably.

    As I posted above, I think a large part of it is the disconnect between developers and users. Devs don't know who is using the software or why; they just build something that makes sense to them and dump it on the world and see what happens. Hell, most of the time the devs aren't even using their own software. I don't REALLY have a clue what half of the code I work on is actually doing -- I know I need to move data that matches certain conditions from point A to point B. I don't know what that data is actually doing, I don't know how it is used by the business application, I just know they told me to find a way to get it somewhere. I write software for pharmacies and I haven't visited a pharmacy even once in the past five years, and I got ZERO training on what these people do all day, so I don't have a clue.

    At the same time, that disconnect goes both ways. Users don't want to learn how the software works. They just want magic. People complain about their "piece of shit calculator" because it gave the wrong answer -- after they typed in the wrong numbers. But hey, if Word can detect typos in text, why can't the calculator detect typos in numbers? They don't even understand why that very question is so absurd.

    And as a developer you can try to do right and still find that it doesn't matter at all, which is rather frustrating and may be another factor which leads to these ultra simplistic designs. You spend hours to produce pages and pages of documentation clearly showing every feature the user needs...and they stuff it to the bottom of a filing cabinet and call you every damn day asking questions that are already documented. So you convert it to an online copy and send them that...and they never even open it. So you integrate the documentation into the product itself, and they blindly click through the instructions without reading them then call you and ask what to do next. Every time. And that's probably exactly what we deserve for trying to put a computer in every single person's hands. Now half your users are essentially illiterate. Good fuckin' luck....

    • (Score: 2) by ilsa (6082) Subscriber Badge on Wednesday December 19 2018, @09:00PM (#776527) (2 children)

      What you are saying is true, however ultimately it is still the developer's responsibility (or the project manager, consulting company, wherever the buck stops...) to figure out what will most benefit the user. Per your pharmacy example, if nobody on your team has visited a pharmacy and watched what they do, that is a MASSIVE problem. Your product is almost guaranteed to be difficult to use. I was involved in a project once where software was designed by people who would never actually use it. The software was complete crap, cost a great deal of money, and was ultimately abandoned because the people using it were *this* close to open revolt over it.

      Eating your own dog food is critical to a successful product, regardless of what that product is.

      The thing is, software development is not a new field at this point. There is almost 50 years of experience now, and none of these problems are new. There is more than enough documentation around that describes good UI principles and good software design. There are plenty of real world examples, good and bad. But developers are just as bad about reading documentation as users are, so all this documentation may as well not have been written.

      We already know full well that users rarely read the documentation. So as you said, that needs to be accommodated, to as reasonable an extent as possible. Unless it's a very specialized piece of software, your software should be easily discoverable. The UI should be obvious to use and guide users to correct usage. But the current UI fad is to make interfaces look as hipster as possible. Standardized controls are for losers, apparently. I remember when I first tried the Snapchat app on my phone. That was the single stupidest UI I had ever seen. Absolutely nothing was obvious. It was totally undiscoverable, unless you counted randomly tapping on literally everything just to see what happened. While I did eventually figure out how to use it (I treated it as a curious puzzle rather than a chat app), I was so affronted by how shockingly bad the UI was that I never used it again.

      And again, I bring up that idiotic hamburger menu button. I see it all over the place now, and I hate it every single time. If you need a menu interface, make a f__king menu interface. If having a menu is too complicated, then redesign your UI because you've clearly done it wrong.

      And as other people have mentioned, don't redo your UI every version just for the sake of "freshening it up". It has zero benefit, unless you count additional retraining costs for existing users as a benefit. Why is iPhone so popular? Because it (generally) works, and it stays working. You pick up an iPhone with iOS1 or iOS12, you will be able to use either one. Is the UI boring and uninspired? You betcha. And that's the point. If I wanted surprises, I'd buy a Kinder Egg.

      As an aside... One nice feature that a lot of applications used to have, is that you could click a ? button, then click the thing you wanted to know about, and help would pop up. Nobody does that anymore. Why? I dunno. Because it's "outdated" and "old"?
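The "? then click" pattern described above (Windows called it "What's This?" help) takes very little machinery: a help mode that intercepts the next click and shows the target's help text instead of firing its action. A toolkit-agnostic sketch, with all names being illustrative rather than any real framework's API:

```python
class HelpMode:
    """Context-sensitive '?' help: while armed, a click on a control
    shows its help text instead of triggering its normal action."""

    def __init__(self):
        self.armed = False
        self.help_text = {}   # control name -> help string

    def register(self, name, text):
        self.help_text[name] = text

    def click(self, name, action):
        if self.armed:
            # Consume this click for help, then disarm
            self.armed = False
            return self.help_text.get(name, "No help available.")
        return action()

ui = HelpMode()
ui.register("save", "Writes the current document to disk.")

ui.armed = True                            # user pressed the '?' button
print(ui.click("save", lambda: "saved"))   # shows help, does not save
print(ui.click("save", lambda: "saved"))   # help mode is off again; saves
```

Real toolkits still ship this (Qt's QWhatsThis, for example); it's the applications that stopped wiring it up.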

      I realize I'm going off on tangents but there's just so much area to cover. The thing is, it's not just UI. It's all the possible ways in which users interact with software. Not just day to day usage but installation and maintenance, etc. The thing that really pisses me off is the sheer arrogance that I've seen developers exhibit. They literally expect the world to just revolve around them because they wrote some code. Ansible is a great example of that. They will happily break fundamental grammars in a point release, and expect everyone to just drop what they're doing and rewrite all their playbooks to accommodate it. This has resulted in people having to design entire continuous integration pipelines JUST for their playbooks. Ansible is supposed to *facilitate* things like infrastructure automation and continuous integration, but instead all they've done is add another layer of complexity to an already complex mix. Ansible could have been a fantastic product, but instead it's only barely passable because Ansible developers don't give a flying fig about how much of a burden they are placing on already overwhelmed sysadmins.

      I could go on and on and on about the stuff I've seen but I've already written a long enough essay already. But ultimately, while yes users can be raving asshats (it's unfortunately unavoidable and you're right, there's only so much you can do about them...), that doesn't obviate the developer's responsibility of making a product that is fit for purpose for at least the targeted user base. This means developers need to spend more time thinking about how their product is going to behave, well before they've so much as written their first line.

      • (Score: 2) by urza9814 (3954) on Thursday December 20 2018, @06:36PM (#776897) Journal (1 child)

        > What you are saying is true, however ultimately it is still the developer's responsibility (or the project manager, consulting company, wherever the buck stops...) to figure out what will most benefit the user. Per your pharmacy example, if nobody on your team has visited a pharmacy and watched what they do, that is a MASSIVE problem. Your product is almost guaranteed to be difficult to use. I was involved in a project once where software was designed by people who would never actually use it. The software was complete crap, cost a great deal of money, and was ultimately abandoned because the people using it were *this* close to open revolt over it.

        Yup...lack of foresight, lack of investment, standard MBA crap. We've got so many people coming in on six month contracts that a proper, complete training process would eat up the majority of that time and nothing would get done. Nobody wants to invest in employees; they want "more bodies on this project" which they can just throw away when they're done. And then they brag about this lack of investment as "efficiency" and "innovation".

        > The thing is, software development is not a new field at this point. There is almost 50 years of experience now, and none of these problems are new. There is more than enough documentation around that describes good UI principles and good software design. There are plenty of real world examples, good and bad. But developers are just as bad about reading documentation as users are, so all this documentation may as well not have been written.

        There's also fifty years of "Cheap, fast, good: Pick two" with "good" almost always being the one that gets thrown out. The devs suck because nobody ever wants to hire a good dev until there's a crisis that nobody else can figure out. Until then they just want cheap code monkeys. Management is more concerned with how many releases they can push out in a year than in how many defects are found in each one...defects are internal, while the number of releases gets bragged about in shareholder documents. Ignoring quality is the standard formula for short-term stock market success (and then cashing out before everything crashes down around you.)

        > We already know full well that users rarely read the documentation. So as you said, that needs to be accommodated, to as reasonable an extent as possible. Unless it's a very specialized piece of software, your software should be easily discoverable. The UI should be obvious to use and guide users to correct usage. But the current UI fad is to make interfaces look as hipster as possible. Standardized controls are for losers, apparently. I remember when I first tried the Snapchat app on my phone. That was the single stupidest UI I had ever seen. Absolutely nothing was obvious. It was totally undiscoverable, unless you counted randomly tapping on literally everything just to see what happened. While I did eventually figure out how to use it (I treated it as a curious puzzle rather than a chat app), I was so affronted by how shockingly bad the UI was that I never used it again.

        Well, I tend to avoid mobile apps and social media these days so I can't say too much about those...except that the UI changes are likely focused on emphasizing advertisements as others have mentioned elsewhere in these comments. If you're clicking around randomly on the screen, you're more likely to click an ad. I recall a particularly egregious case of this from that Angry Birds game a few years ago -- it was something along the lines of tap once to aim and tap again to fire...and as soon as you start to aim a full-screen ad would pop up. So if you went to click to fire too quickly you'd hit the ad instead. This isn't stupid developers, this isn't people failing to read and understand the science of UI design...it's a bad UI because management explicitly paid for a bad UI.

        But we also have the "cheap/fast/good" issue again -- nobody wants to pay for a custom designed UI, which is how we end up with a bunch of "universal" frameworks (ie, Drupal/Wordpress/Joomla CMSes and such replacing traditional web development) -- but you can't always keep it simple if it's designed to do everything for everyone. It's generally better to start from the ground up and build only what you need rather than starting with everything including the kitchen sink and trying to strip it all out again. But it costs more to do it that way, and nobody's going to pay for it.

        > And again, I bring up that idiotic hamburger menu button. I see it all over the place now, and I hate it every single time. If you need a menu interface, make a f__king menu interface. If having a menu is too complicated, then redesign your UI because you've clearly done it wrong.

        Agreed. I think that might be an attempt to avoid localization costs? You don't have to translate if there's no text. And "English" can still be a localization issue for US-based corporations...I was just in a meeting where the head of the department had to explain to some project leads what the word "pending" meant. We've got code files where the file names have words like "Perscirption" -- the people writing pharmacy software can't spell the word "prescription" and can't be bothered to look it up and nobody in management seems to think that's a problem because *something* still manages to hit the production servers "on time", even if it's crap and missing half the requested features...

        > And as other people have mentioned, don't redo your UI every version just for the sake of "freshening it up". It has zero benefit, unless you count additional retraining costs for existing users as a benefit. Why is iPhone so popular? Because it (generally) works, and it stays working. You pick up an iPhone with iOS1 or iOS12, you will be able to use either one. Is the UI boring and uninspired? You betcha. And that's the point. If I wanted surprises, I'd buy a Kinder Egg.

        Yup, said that myself in another comment. That stuff is infuriating. But you've gotta keep moving the ads around in order to get people to click them by mistake...if they don't click the ads, you don't get profit and you can't pay the devs, because most users won't pay for the software just because it's marginally easier to learn. I stopped using Facebook about a year ago and I *still* hear the complaints every few months when they change their interface. But you suggest that maybe those people should stop using Facebook and they call you a crazy tinfoil hat conspiracy theorist. People whine about it, but as far as I can tell they seem to prefer this to the alternatives...

        Although your comments about iOS bring up a somewhat different point. I see no real difference between the iOS interface strategy and the Facebook interface strategy. Either way I have zero control over it. Either way they tell me I can take it or leave it. With Android, the interface is more or less what I want it to be. With Linux on my laptop my interface is some twisted mess of windowing, tiling, and straight CLI...and that's exactly how I want it. Settings should not change automatically by themselves, but it still needs to be possible to change them if the user wants to. "One size fits all" doesn't work any better for computer UIs than it does for clothing. Only good for cheap crap and small accessories.

        > I could go on and on and on about the stuff I've seen but I've already written a long enough essay already. But ultimately, while yes users can be raving asshats (it's unfortunately unavoidable and you're right, there's only so much you can do about them...), that doesn't obviate the developer's responsibility of making a product that is fit for purpose for at least the targeted user base. This means developers need to spend more time thinking about how their product is going to behave, well before they've so much as written their first line.

        Right...but keep in mind that most of these developers aren't doing this as a hobby (and when they do, the results are often much better)...generally they're writing what they're told to write. Developers will stop doing this crap when users decide that it's worth an extra five bucks to get a product that's not a complete steaming pile. You demand everything for free (or as close to it as possible) and you get what you pay for.

        • (Score: 2) by ilsa (6082) Subscriber Badge on Thursday December 20 2018, @08:50PM (#776963)

          I can't disagree with anything you've written. I think the issue for me is that, because OSS developers are supposed to be free from the whole "stupid profit-motivated management decisions" problems, they have the freedom to do things "right". But when all is said and done it feels like zero progress is being made. Dealing with OSS software invariably is far more difficult than it needs to be, which results in negative experiences.

          A fantastic example is a jailbroken iPhone. In theory, it's great: having low level control of your phone, etc etc. But I eventually gave up jailbreaking my phone, not because Apple made it too difficult, but because there was so little payoff for doing so. The majority of software available on Cydia was problematic for one reason or another. The few widgets and doodads that were worth getting were paid ones, and even those became pointless the moment Apple added a feature that did 90% of what the widget did.

          I don't like my iPhone or iOS, and Apple's attitude pisses me off to high heaven, but when push comes to shove, it does what it says on the tin with minimal fuss. (More or less. Apple removed access to network sniffing so I can't use it to do wifi channel scanning anymore... grumble grumble...)

          Ilsa