
posted by martyb on Monday May 29 2017, @03:27AM
from the it's-all-gone-to-bits dept.

From the RooshV Forum:

I constantly get the vibe from people that they think our technology is skyrocketing, that we're living in a new tech age, "where was all this ten years ago?!" etc.

But I disagree with this assessment of our technology. It has made steady improvements in one specific space: software and electronic hardware. That is all. On top of that, the improvements on the hardware have not even been ground breaking. GPS is a ground-breaking invention. Smaller screens are not: they are just an incremental improvement.

Smartphones are merely the result of incremental improvements in the size and quality of electronic components. The only breakthroughs involved are ages old. The invention of the transistor, the laser, etc. The existence of google, facebook, uber, and so on, are merely inevitable "new applications" stemming from these improvements. They are not breakthroughs, they are merely improvements and combinations upon the telephone, the directory, and the taxi.

In my opinion, technology as a whole is borderline stagnant.

A list of why technology is still shit:

The posting goes on to list examples of incremental, rather than breakthrough, changes in the areas of:

  • Electronics & Machines
  • Energy
  • Medicine
  • Clothes
  • Food
  • Finance

Have we really stagnated? Have we already found all of the "low-hanging fruit", so new breakthroughs are harder to find? Maybe there is greater emphasis on changes that are immediately able to be commercialized and less emphasis on basic research?

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 5, Funny) by its_gonna_be_yuge! on Monday May 29 2017, @03:34AM (1 child)

    by its_gonna_be_yuge! (6454) on Monday May 29 2017, @03:34AM (#516994)

    Without Twitter, there would be no Trump. And without Trump, there would be no ... um ....

    Yeah, I guess technology is shit.

    • (Score: -1, Troll) by Anonymous Coward on Monday May 29 2017, @04:56AM

      by Anonymous Coward on Monday May 29 2017, @04:56AM (#517014)

      Without Trump we wouldn't have passive aggressive xkcd comics lamenting how She Lost.

      I wish Ram-Hole Man-Ho had killed himself after She Lost.

  • (Score: 3, Interesting) by Rosco P. Coltrane on Monday May 29 2017, @03:47AM (15 children)

    by Rosco P. Coltrane (4757) on Monday May 29 2017, @03:47AM (#516998)

    All these incremental improvements in computing - speed and storage, essentially - brought about one HUGE breakthrough: big data and AI. When I was a kid, the ability to process immense masses of data, let alone intelligently, was nonexistent. And then it happened. Not because it was "discovered", but because the incrementally better technology that underpins it suddenly reached a critical point beyond which theory became a practical proposition.

    Only no-one saw it coming because it's all behind the scenes. But think about it: before you had to type a very precise search string in your search engine of choice to get what you wanted. Then almost overnight, you could type something vaguely relevant and get exactly what you need too. Remember that? That's the tip of the big data iceberg.

    The rest of the iceberg, sadly, is less people-friendly, as most of this particular breakthrough basically serves nefarious or greedy interests. But that's another issue.

    • (Score: 0) by Anonymous Coward on Monday May 29 2017, @03:55AM

      by Anonymous Coward on Monday May 29 2017, @03:55AM (#517001)

      AI and Big Data have been the focus of academic computing for nigh on half a century. I studied neural nets in an introductory AI course more than a decade ago—long before The Cloud or iPhone.

    • (Score: 3, Insightful) by archfeld on Monday May 29 2017, @04:32AM

      by archfeld (4650) <treboreel@live.com> on Monday May 29 2017, @04:32AM (#517007) Journal

      The breakthrough in big data hasn't really been 'big' data, but the consolidation and analysis of all the separate pieces that have been gathering for quite some time. The constant corporate consolidation and erosion of any sort of morals or privacy considerations is one of the biggest drivers of that. I worked on huge Teradata clusters for several industry-leading institutions that used to jealously guard the privacy of their data. Now they share it with anyone and everyone, whether they mean to or not.

      --
      For the NSA : Explosives, guns, assassination, conspiracy, primers, detonators, initiators, main charge, nuclear charge
    • (Score: 2, Informative) by Anonymous Coward on Monday May 29 2017, @06:20AM (1 child)

      by Anonymous Coward on Monday May 29 2017, @06:20AM (#517042)

      Yeah, the author seems to miss that "breakthroughs" can only happen when there's the right mix of incremental improvements. The first computers were "only" incremental improvements over simpler relay circuits, relay circuits were "simply" applications of Boolean algebra, and relays were "just" incremental improvements over manually pressing switches. You could even say the transistor was merely an incremental improvement on top of existing knowledge of physics and chemistry; there was no magic there. The first transistor-based computers were just rehashing relay- and tube-based computers, and etching circuits from one piece of silicon was just an improvement in manufacturing efficiency. I imagine if the author had been alive during the evolution of the modern computer, they would have thought it was stagnant shit the whole time.

    • (Score: 0) by Anonymous Coward on Monday May 29 2017, @07:29AM

      by Anonymous Coward on Monday May 29 2017, @07:29AM (#517055)

      When I was a kid, the ability to process immense masses of data, let alone intelligently, was nonexistent.

      It existed, it's just that only "big labs" could afford it, much of it hidden away in the CIA etc.

    • (Score: 2) by Runaway1956 on Monday May 29 2017, @11:46AM (4 children)

      by Runaway1956 (2926) Subscriber Badge on Monday May 29 2017, @11:46AM (#517101) Journal

      Big data? This is what Big Data looked like ~1940:

      https://www.wsws.org/en/articles/2001/06/ibm-j27.html [wsws.org]

      https://en.wikipedia.org/wiki/IBM_and_the_Holocaust [wikipedia.org]

      http://intsse.com/wswspdf/en/articles/2001/06/ibm-j27.pdf

      I have zero reasons to think that "big data" is a "good thing".

      • (Score: 0) by Anonymous Coward on Monday May 29 2017, @12:17PM (3 children)

        by Anonymous Coward on Monday May 29 2017, @12:17PM (#517106)

        > [wsws.org]

        Unattended terminal? Or did _gewg_ hack into Runaway's account?

        • (Score: 0) by Anonymous Coward on Monday May 29 2017, @01:16PM (2 children)

          by Anonymous Coward on Monday May 29 2017, @01:16PM (#517121)

          Another option might be, you know nothing about Runaway. Some people focus on some specific facet of a complicated person's persona, and they think they know that person. The title of this sub-thread offers a suggestion for you to read and think about:

          http://www.constitution.org/col/blind_men.htm [constitution.org]

          • (Score: 0) by Anonymous Coward on Monday May 29 2017, @03:03PM

            by Anonymous Coward on Monday May 29 2017, @03:03PM (#517158)

            Another option might be, Runaway was visiting _gewg_, forgot to log out, and _gewg_ posted without noticing. Another option might be, Runaway and _gewg_ are the same person.

          • (Score: 2) by sgleysti on Monday May 29 2017, @04:43PM

            by sgleysti (56) Subscriber Badge on Monday May 29 2017, @04:43PM (#517212)

            My money is on the above AC actually being Runaway saying, in essence, "I am vast; I contain multitudes."

    • (Score: 2) by c0lo on Monday May 29 2017, @12:07PM (3 children)

      by c0lo (156) Subscriber Badge on Monday May 29 2017, @12:07PM (#517103) Journal

      But think about it: before you had to type a very precise search string in your search engine of choice to get what you wanted. Then almost overnight, you could type something vaguely relevant and get exactly what you need too.

      Really? It doesn't happen to me; it's still hard to get the info I think is relevant. Even a wee bit harder as time passes. Which makes me search for alternative explanations.
      Like, at the same time as big data, Facebook happened. And suddenly a new generation learnt to be happy with whatever crappy answers they get, be those answers only a little bit more unusual than the usual ads. And, boy, do they create a lot of crap nowadays or what? So, the old decrepit me has to sift through this whole lot of garbage to find the info.

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 2) by Runaway1956 on Monday May 29 2017, @01:22PM (1 child)

        by Runaway1956 (2926) Subscriber Badge on Monday May 29 2017, @01:22PM (#517124) Journal

        Maybe - just maybe - I'm carefully NOT making an accusation here - the new generations are shallow-minded? They ask a question, they get some superficial answer, and they are satisfied. "Why is the sky blue, Mommy?" "Because it is, Baby, now eat your peas and carrots, and let Mommy watch the soaps." Then again, maybe our own generation had plenty of shallow-minded bubble heads - two out of three of my sons accept simple answers. The third isn't satisfied with simple, and digs into stuff. Maybe the proportion has changed, but I have to admit there are plenty of vacuous baby-boomers, and whatever the hell came after them.

        • (Score: 2) by c0lo on Monday May 29 2017, @05:56PM

          by c0lo (156) Subscriber Badge on Monday May 29 2017, @05:56PM (#517243) Journal

          Maybe - just maybe - ... - the new generations are shallow minded?

          This and the relation to Facebook (as the easy way to get instant gratification in the form of cheap/meaningless likes) is the hypothesis I tabled.

          I'm carefully NOT making an accusation here

          Yeah-yeah... like I care enough to NOT accuse them. Their bed, they are the ones to sleep in it.

          --
          https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 2) by darnkitten on Tuesday May 30 2017, @12:31AM

        by darnkitten (1912) on Tuesday May 30 2017, @12:31AM (#517393)

        And now, there are fewer and fewer search engines--general, mercantile, professional, or otherwise--that allow the execution of "precise search strings", and even those few have routinely stripped out some core search functionality (disabling wildcards, or ignoring capitalization for Boolean search, for example).

        Providers also have generally made it more difficult to even seek out their search syntax rules, as precise, focused searches tend to bypass those results they want to force you to see or those they are being paid to serve up to you.

    • (Score: 2) by AthanasiusKircher on Monday May 29 2017, @03:43PM

      by AthanasiusKircher (5291) on Monday May 29 2017, @03:43PM (#517175) Journal

      But think about it: before you had to type a very precise search string in your search engine of choice to get what you wanted. Then almost overnight, you could type something vaguely relevant and get exactly what you need too. Remember that?

      Yes, I remember when Google broke verbatim search and it stopped being a tool for serious research. There's absolutely no way to predict what nonsense it will display these days, even among the top search hits. Yes, you can still try to force it to actually use your search terms with the "allintext:" operator, but the subset of hits you get will still vary depending on stuff that shouldn't matter, like search term order. (Note I'm not saying the ranking merely changes; the actual complete list of hits that show up will vary for the same set of search terms.)

      Don't get me wrong: Google is now a much more convenient tool for casual queries. It's utterly broken for serious research now, though. (And yes, there are other search engines that can restrict your queries in a more consistent fashion, but Google's database and ranking algorithms still make it superior -- if only it still allowed one to do "old timer" searches in a predictable way....)

      So, no, new search algorithms are absolutely NOT better at delivering "exactly what [I] need." They're better at delivering something vaguely like what I might want if I'm not quite sure what I'm looking for.

  • (Score: -1, Offtopic) by Anonymous Coward on Monday May 29 2017, @03:52AM (3 children)

    by Anonymous Coward on Monday May 29 2017, @03:52AM (#517000)

    We live in a world where people still believe the Creator of the entire universe desperately wants them to cut away chunks of flesh from the sexual organs of completely healthy children. In that light, we're doing pretty darn well!

    • (Score: 0) by Anonymous Coward on Monday May 29 2017, @05:01AM (1 child)

      by Anonymous Coward on Monday May 29 2017, @05:01AM (#517017)

      Only some groups of people, and they are only the majority in specific places. The more important point is that there are large groups of people that have left this shit in the dust, and they are also connected to each other.

      • (Score: 0) by Anonymous Coward on Monday May 29 2017, @03:20PM

        by Anonymous Coward on Monday May 29 2017, @03:20PM (#517167)

        ... isn't one of those groups that has left it behind.

    • (Score: 2, Interesting) by Anonymous Coward on Monday May 29 2017, @05:30AM

      by Anonymous Coward on Monday May 29 2017, @05:30AM (#517029)

      Besides religious shit, I think we have shit science. Technical innovations spring from scientific understanding. Researchers are forced to churn out "results" in a publish-or-perish death march, which leads to exhaustion and low-quality science. And to add insult to injury, publications are "disseminated" by super-greedy journals which can only be read by the most affluent and people studying at rich universities. Science is supposed to be self-correcting, but we rarely see repeat studies because those are not considered novel and sexy enough. We have unscientific studies done using blackbox proprietary software that cannot be inspected, and most data remains unpublished because disk space is so very expensive...

      I love science but hate its current sorry state. Science is all we have.

  • (Score: 2) by MichaelDavidCrawford on Monday May 29 2017, @04:03AM (5 children)

    The first human DNA sequencing took a long time and cost a great deal of money. Now you can use gumball machines to sequence your DNA.

    I expect this is contributing to rational drug design.

    When I can find someone to do it, I'm going to get my DNA sequenced then post it on my website.

    --
    Yes I Have No Bananas. [gofundme.com]
    • (Score: 2) by takyon on Monday May 29 2017, @05:05AM (2 children)

      by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Monday May 29 2017, @05:05AM (#517019) Journal

      Now you can use gumball machines to sequence your DNA.

      Can I insert my saliva-coated chewed gum?

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 0) by Anonymous Coward on Monday May 29 2017, @05:12AM (1 child)

        by Anonymous Coward on Monday May 29 2017, @05:12AM (#517023)

        Oh is that what the hole was for? I already stuck my dick in the machine. I guess no one else will want it now, and I'll need to pay to replace the machine, so I'm married to the old gumball-and-chain.

        • (Score: 0) by Anonymous Coward on Monday May 29 2017, @12:11PM

          by Anonymous Coward on Monday May 29 2017, @12:11PM (#517104)

          Oh is that what the hole was for? I already stuck my dick in the machine.

          Only half of your genome will be sequenced

    • (Score: 0) by Anonymous Coward on Monday May 29 2017, @05:08AM (1 child)

      by Anonymous Coward on Monday May 29 2017, @05:08AM (#517020)

      I'm going to get my DNA sequenced then post it on my website.

      Well that would be redundant. The only use anyone would have for your DNA would be to check for preexisting conditions, but you've already told everyone all about your preexisting conditions.

  • (Score: 2, Insightful) by fustakrakich on Monday May 29 2017, @04:25AM (1 child)

    by fustakrakich (6150) on Monday May 29 2017, @04:25AM (#517004) Journal

    Why the hell does it take more than two seconds for my phone to turn on? I don't get it. The damn thing should be ready to run as soon as it gets power.

    --
    La politica e i criminali sono la stessa cosa..
    • (Score: 5, Funny) by MadTinfoilHatter on Monday May 29 2017, @05:22AM

      by MadTinfoilHatter (4635) on Monday May 29 2017, @05:22AM (#517025)

      Why the hell does it take more than two seconds for my phone to turn on?

      Clearly it doesn't have enough systemd on it. The parallelism provided by this new paradigm of awesomeness leads to massively reduced boot times. It's been designed by an outstanding genius in the field, and was totally universally adopted on its technical merits alone. Oh, and did I mention it makes systems boot fast?

  • (Score: 2) by archfeld on Monday May 29 2017, @04:28AM

    by archfeld (4650) <treboreel@live.com> on Monday May 29 2017, @04:28AM (#517005) Journal

    Development costs huge $'s, while timed incremental expansion generates huge profits. We've seen some rather incredible leaps on the software side in overall 'AI' performance, but the advent of day trading and short-term instant profit has crippled long-term planning and spending on development in a lot of fields. VC and the unrealistic demand for constant profit have really hurt serious product development and investment.

    --
    For the NSA : Explosives, guns, assassination, conspiracy, primers, detonators, initiators, main charge, nuclear charge
  • (Score: 4, Insightful) by a-zA-Z0-9$_.+!*'(),- on Monday May 29 2017, @04:33AM (3 children)

    by a-zA-Z0-9$_.+!*'(),- (3868) on Monday May 29 2017, @04:33AM (#517008)

    Really?

    --
    https://newrepublic.com/article/114112/anonymouth-linguistic-tool-might-have-helped-jk-rowling
    • (Score: -1, Troll) by Anonymous Coward on Monday May 29 2017, @04:37AM (1 child)

      by Anonymous Coward on Monday May 29 2017, @04:37AM (#517009)

      Oh no! A non-approved opinion outlet!

      • (Score: 2) by a-zA-Z0-9$_.+!*'(),- on Saturday June 03 2017, @04:06AM

        by a-zA-Z0-9$_.+!*'(),- (3868) on Saturday June 03 2017, @04:06AM (#519724)

        The opinions weren't that impressive, and previous opinions from this source have proven to be, uh, beyond idiotic. I'm happy to debate them anytime.

        --
        https://newrepublic.com/article/114112/anonymouth-linguistic-tool-might-have-helped-jk-rowling
    • (Score: 2, Funny) by Anonymous Coward on Monday May 29 2017, @12:27PM

      by Anonymous Coward on Monday May 29 2017, @12:27PM (#517107)

      Do you even lift, bro?

  • (Score: 0) by Anonymous Coward on Monday May 29 2017, @04:41AM

    by Anonymous Coward on Monday May 29 2017, @04:41AM (#517010)

    We're in a deployment phase. Technology isn't improving. It's just being sold to more people. Economies of scale are making it cheaper. But it's all the same technology we had 20 years ago. Case in point: IEEE 802.11-1997. Wi-Fi is twenty years old this year.

  • (Score: 3, Interesting) by Anonymous Coward on Monday May 29 2017, @04:44AM (2 children)

    by Anonymous Coward on Monday May 29 2017, @04:44AM (#517011)
    If we killed all of the sales and marketing types.

    Our tech would no longer be shit.

    Since they're the ones that turn everything to shit when they get ahold of it.
    • (Score: 0) by Anonymous Coward on Monday May 29 2017, @05:02AM (1 child)

      by Anonymous Coward on Monday May 29 2017, @05:02AM (#517018)

      Workers controlling the means of production rather than the capitalist parasite class of marketing and sales?

      It'll never happen. The workers are too busy living in a dream world where they're millionaires because they grunted hard enough "working hard."

      And every now and then, the grunting pays off. One sometimes wonders if it's really the grunting that does the trick or just a safety valve lottery system.

      • (Score: 3, Insightful) by khallow on Monday May 29 2017, @12:05PM

        by khallow (3766) Subscriber Badge on Monday May 29 2017, @12:05PM (#517102) Journal

        Workers controlling the means of production rather than the capitalist parasite class of marketing and sales?

        Notice the huge bias here. Marketing and sales isn't considered "working" even though that's how most businesses actually get business. The ugly truth here is that no group - not the workers, marketing and sales, or rich people - is entitled to ownership of the means of production. It's a poisonous ideology that produces nothing. In the real world, we don't constrain how such things are done. What works ends up being what survives.

        It'll never happen. The workers are too busy living in a dream world where they're millionaires because they grunted hard enough "working hard."

        And every now and then, the grunting pays off. One sometimes wonders if it's really the grunting that does the trick or just a safety valve lottery system.

        If only you had looked at the real world before parroting that tripe. In the US (which seems to be where a lot of this tripe comes from) the middle class is shrinking because more people are becoming rich [reason.com] than are becoming poorer.

        "The American Middle Class is Losing Ground," is the title of Pew Research Center's new report on income inequality. That headline informed the headlines that other media outlets have settled on as well. The Los Angeles Times states "Middle-Class families, pillar of the American dream, are no longer in the majority, study finds." The Washington Post declares "Income inequality has squeezed the middle class of the majority."

        The headlines all appear to be accurate, but in that specialized newspaper way that attempts to reinforce an existing narrative and ignore some relevant information. It is true that Pew's analysis shows that the number of households that fit within their categorization of middle class has shrunk by 11 percentage points since 1971. It is true that the proportion of households that are classified as lower class has increased from 25 percent to 29 percent. But it is also true that the proportion of households that are classified as upper class has increased from 14 percent to 21 percent.

        That is to say, part of the reason that the middle class is disappearing is that they are succeeding and jumping to the next bracket. And a greater number of them are moving up than moving down. Be wary of the assumption that the drop in the middle class is a sign of a crisis.

        In other words, roughly 60% of the US households that moved out of the middle class (from 1971 to 2015) did so because they became too wealthy to be considered middle class.
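        A quick arithmetic check of that figure, using only the Pew percentages quoted above (a sketch; the numbers come from the excerpt, not from Pew's full report):

```python
# Pew shares of US households quoted above (1971 -> 2015), in percent.
lower_1971, lower_2015 = 25, 29
upper_1971, upper_2015 = 14, 21

moved_down = lower_2015 - lower_1971      # 4 points shifted down
moved_up = upper_2015 - upper_1971        # 7 points shifted up
middle_shrank = moved_down + moved_up     # 11 points, matching the quoted figure

print(f"share of the shift that was upward: {moved_up / middle_shrank:.0%}")
# -> 64%, i.e. the "roughly 60%" in the comment
```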

        But we're global citizens, right? Even if the US might be doing well, surely the world isn't. Well, that's another bit of fake news. Turns out that the world has been getting richer [voxeu.org] (and for those who still care about income inequality, relatively more equal in income as well). The article shows that the income of the bottom two-thirds of the world's population grew, adjusted for inflation, by at least 30% between 1988 and 2008.

        So sure, you can comfort yourself with the narrative that hard work isn't rewarded and that people are all getting poorer. Who knows? Some day it might become true and then you'll be well positioned attitude-wise for this brave, new world.

  • (Score: 0) by Anonymous Coward on Monday May 29 2017, @04:50AM (3 children)

    by Anonymous Coward on Monday May 29 2017, @04:50AM (#517012)

    Let any piece of tech drop out of your hand, and there is greater than 50% odds it breaks.

    Well, no, that has never happened to me. I can count the number of times I have dropped my "pieces of tech" and the number is zero. Zero. Try being more careful with your fragile toys, stupid entitled little kid.

    • (Score: 0) by Anonymous Coward on Monday May 29 2017, @05:08AM (2 children)

      by Anonymous Coward on Monday May 29 2017, @05:08AM (#517021)

      "Pieces of tech" I've found are pretty resilient to klutz powers, and I have those in spades.

      Then again, I don't stress my phone by sitting on it regularly.

      Putting an expensive piece of electronics in one's back pocket and sitting on it has to be the absolutely stupidest thing I've ever heard of anyone doing. It's mind boggling to me that people apparently do this. Yes, those stupid, entitled 40-50 year old millennials need to be more careful with their toys.

      • (Score: 0) by Anonymous Coward on Monday May 29 2017, @06:23AM

        by Anonymous Coward on Monday May 29 2017, @06:23AM (#517044)

        Putting an expensive piece of electronics in one's back pocket and sitting on it has to be the absolutely stupidest thing I've ever heard of anyone doing. It's mind boggling to me that people apparently do this. Yes, those stupid, entitled 40-50 year old millennials need to be more careful with their toys.

        What do you expect from an "International Playboy" [rooshvforum.com]? Especially one associated with a PUA Extraordinaire [rooshv.com].

        Why don't we just invite these guys on for an interview so we can all learn how wonderful they are and why they are so much better than us?

      • (Score: 0) by Anonymous Coward on Monday May 29 2017, @03:54PM

        by Anonymous Coward on Monday May 29 2017, @03:54PM (#517182)

        Putting an expensive piece of electronics [mobile phones] in one's back pocket and sitting on it has to be the absolutely stupidest thing I've ever heard of anyone doing. It's mind boggling to me that people apparently do this. Yes, those stupid, entitled 40-50 year old millennials need to be more careful with their toys.

        Why? Most mobile phones are not particularly expensive, maybe a couple hundred bucks. They generally last a relatively long time even when treated badly. So why should "stupid, entitled 40-50 year old millennials" (even though the usual definitions of "millennials" would not include anyone currently in their 40s) bother to be more careful?

  • (Score: 1, Insightful) by Anonymous Coward on Monday May 29 2017, @04:59AM (3 children)

    by Anonymous Coward on Monday May 29 2017, @04:59AM (#517015)

    Evolution is incremental. We differ from fish in important ways, and even more so from our single celled ancestors.

    For some reason, the fact that our culture outpaces evolution by several orders of magnitude might not seem fast to everyone, but our progress looks blindingly fast to me.

    100 years ago, almost nothing in my life could even be dreamed of. The same is not true between 100 and 200 years ago.

    The concept of engineered systems that process information isn't going to be outdated by some new tech (it's a very old concept); however, incremental improvements to it could easily do things we could never imagine, and we seem to be heading there.

    Even the transistor was not really new when it was glorified, and when popularized didn't cause non-incremental change. History glorifies things that took years as instants, but 100 years from now, the rise of social media will look faster than the rise of the transistor, and the rise of the smartphone faster than that of the computer.

    I fail to see how a military GPS system is more revolutionary than everyone having access to that, and all the world's information, from their pocket. Change is continuous, and it's faster now than ever.

    • (Score: 3, Insightful) by Anonymous Coward on Monday May 29 2017, @05:22AM (2 children)

      by Anonymous Coward on Monday May 29 2017, @05:22AM (#517024)

      Social media and the smartphone are more about marketing than technology.

      We've had social media for at least ~30 years. The difference is that now companies have realized consumers are idiots when it comes to protecting their personal information and so this exploitation has created a sort of modern day gold rush as companies compete to mine the most data from ignorant, though complicit, users. Back in the day companies were held back more by ethics. Quoting [businessinsider.com] Mark Zuckerberg:

      "Yeah so if you ever need info about anyone at Harvard, just ask. I have over 4,000 emails, pictures, addresses, SNS"
      "What how'd you manage that?"
      "People just submitted it. I don't know why. They "trust me." Dumb fucks."

      The only revolution came in companies realizing how easy it was to monetize people, and thus using effective marketing to start herding the Eternal September [wikipedia.org] crowd into these outlets. And it's hardly anything worth praising. It's mostly people ring-fencing themselves off into echo chambers, desperately image-crafting to try to make their lives seem less pathetic (though nobody cares, as they're too busy doing the same things themselves), and then vulturous marketers swarming about the whole charade, milking everybody for all they're worth.

      ...is it any wonder I'm posting on a more niche text-only message board with curated shares under an anonymous account?

      • (Score: 2) by kaszz on Monday May 29 2017, @09:22AM

        by kaszz (4211) on Monday May 29 2017, @09:22AM (#517080) Journal

        "People just submitted it. I don't know why. They "trust me." Dumb fucks."

        Which just goes to show what kind of person Zuck is and how people act. Deplorable at both ends. Now that corporations force themselves onto others, they will get a backlash from people who can see through the bullshit.

      • (Score: 0) by Anonymous Coward on Monday May 29 2017, @12:12PM

        by Anonymous Coward on Monday May 29 2017, @12:12PM (#517105)

        That's deep man. Have an upvote.

  • (Score: 3, Insightful) by Anonymous Coward on Monday May 29 2017, @05:00AM (10 children)

    by Anonymous Coward on Monday May 29 2017, @05:00AM (#517016)

    AI advances alone are enough to completely discredit this view simply because the field potentially encompasses just about everything.

    As a recent example, AlphaGo just solidified its place as the world's best Go player by going 3-0 against the player believed to be the strongest human alive. It actually went 6-0 if you include online games. Speaking of online games, it went 59-0 against a group composed of the world's best players and many world champions - former and present. The reason this is relevant is not because of the achievement itself, but because of the implications of the achievement. Many of the advances in chess came by way of hardware. Deep Blue, the first program to defeat a reigning world champion, was a pretty bad chess program by modern standards but made up for it with extensive calculation. Calculating your way out of Go is impossible. Short-term tactics are not as relevant as in chess, and the search space is impossibly large to even begin to penetrate.

    Thanks to the wonders of AI and modern deep learning techniques we went from having no computer program having ever defeated a top tier player, to a program smashing every single top player - mostly undefeated. Before the first match against a world class player, the world champion was concerned he might lose face if he lost a single match to the AI. He got crushed 4-1 before the #1 ranked player got crushed 6-0. Like most are expecting for generalized intelligence AI went from 'oh look at that cute baby' to 'oh god it's vastly greater than we' in the blink of an eye.
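    To put "impossibly large" in perspective, a back-of-the-envelope sketch (the branching factors and game lengths below are rough textbook estimates, not figures from this thread):

```python
# Rough game-tree size: branching_factor ** typical_game_length (in plies).
# ~35 legal moves over ~80 plies for chess; ~250 moves over ~150 plies for Go.
chess_tree = 35 ** 80
go_tree = 250 ** 150

def order(n):
    """Order of magnitude of n, i.e. floor(log10(n))."""
    return len(str(n)) - 1

print(f"chess ~10^{order(chess_tree)}")   # ~10^123
print(f"go    ~10^{order(go_tree)}")      # ~10^359
```

Even granting Deep Blue-style hardware a trillion-fold speedup, brute force never closes a gap of hundreds of orders of magnitude, which is why Go needed a different kind of advance.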

    That is a revolutionary-scale improvement of the utmost importance. I agree that your smartphones and tablets are not good examples of rapidly improving technology, but that's like looking at Justin Bieber when trying to judge modern music as a whole. The masses are mostly driven by marketing. Look to research and development to see where the future of tomorrow lies, thanks to the advances already present (but not yet fully utilized) today.

    • (Score: 0) by Anonymous Coward on Monday May 29 2017, @05:24AM (1 child)

      by Anonymous Coward on Monday May 29 2017, @05:24AM (#517026)

      AI hasn't been invented yet, "AI" is marketing bullshit, and you're an idiot.

      but that's like looking at Justin Bieber when trying to judge modern music as a whole

      You couldn't possibly have chosen a worse analogy.

      Ever wonder why all those pop songs sound kinda the same? Well, it's pretty simple; They all use the same 4 Chords!

      4 Chords [youtube.com]

      • (Score: 2) by Immerman on Monday May 29 2017, @04:03PM

        by Immerman (3985) on Monday May 29 2017, @04:03PM (#517184)

        Only if you use the Science Fiction definition of "synthetic consciousness", "artificial person", etc. But out here in the real world, that definition dwells within the more esoteric realm of "General Purpose AI".

        More generally AI is a robot that performs specialized intellectual tasks rather than the more traditional specialized mechanical tasks. Tasks like interpreting images, finding trends in data, playing Go, etc. It's been around for decades, and is getting better in leaps and bounds.

    • (Score: 2) by kaszz on Monday May 29 2017, @05:35AM (4 children)

      by kaszz (4211) on Monday May 29 2017, @05:35AM (#517030) Journal

      I'm still waiting for these fantastic AI chips (neuro chips) to be listed at electronic component distributors. As it is now, large corporations try hard to monopolize the whole thing.

      • (Score: 2, Insightful) by Anonymous Coward on Monday May 29 2017, @06:56AM (3 children)

        by Anonymous Coward on Monday May 29 2017, @06:56AM (#517049)

        That's the interesting thing. These changes are not about hardware, but revolutions in how software is created. Google has released data on the performance of AlphaGo on various architectures, and while it did play on a beastly setup for its penultimate performance, that probably wasn't necessary. On commercially available hardware with 8 GPUs, its performance wasn't all that different from its performance on their beastly machine. It would likely still have defeated the world's best human, though granted it would have been closer.

        Nobody is "trying" to monetize AI. It's already being done and paying vast dividends. The whole market of user exploitation is largely driven by AI. Each time anybody posts anything on Facebook, it is parsed, sorted, and utilized by AI to more effectively squeeze money out of that individual. That is not speculation, but when we enter into speculation things get even more interesting. Facebook has previously engaged in [research](https://www.theatlantic.com/technology/archive/2014/06/everything-we-know-about-facebooks-secret-mood-manipulation-experiment/373648/) on exploiting users' moods by changing what content is delivered to them. That manipulation is now almost certainly being actively engaged in by Facebook's AI algorithms. Perhaps most interesting is the discovery that people tend to write more when their emotions are 'tweaked', either positively or negatively. Delivering cold, bland information leads to a dead social media site.

        Another example would be the stock market. Humans are now competing against AIs without ever realizing it. These AIs started as tools to help traders make better, faster decisions. But as the tools became more effective than their users, the users became obsolete. Now [84%](https://www.ft.com/content/da5d033c-8e1c-11e1-bf8f-00144feab49a) of all trades in the stock market are carried out by AIs creating, formulating, and executing trades in real time. To be clear, that 84% does not include you setting a sell-at trigger for your stocks. These are high-frequency traders actively creating new and original orders and trading plans. The same is now happening to hedge funds, which are increasingly becoming automated.

        Or Google. The reason they're popular, and consequently have the position of power to begin taking over vast swaths of the entire tech world, is that using their service you tend to actually find what you're looking for. That's in no small part a result of extensive machine learning and AI. Think about how cool it is that you can search even for Jeopardy-like queries and get the 'answer.' "This story tells the tale of a captain's relentless search for revenge against a whale." 15 years ago there's no telling what you'd get with that query - probably hardcore porn. Google's initial improvements over the old were human-driven, but human-AI cooperation has since produced vastly greater performance, and in the future AI will likely be the one responsible for further improvements in access to content.

        Or Watson. The last you heard of IBM's toy was probably it crushing humans at Jeopardy. Rather than give the cliff notes of its vast progress since then, I'll just reference the wiki [wikipedia.org] page.

        You see, the reason you want things wrapped up and delivered to you at a store is that you've come to expect marketing. Modern social media is little more than a device through which complicit users (and their information in particular) are turned into products that are bought and sold by various companies looking to relieve them of their burdensome disposable income. It's like praising the slot machine as a revolution in technology. The only reason it's seen as different is that we have this sort of plausible deniability about the value of bringing people together. In any case, this marketing value is why it's seen as "hot" and "trendy." Trends are mostly driven by marketing - not merit. The wiki page on Fidget Spinners [wikipedia.org] is a contemporary example. Dead - then a nice little bit of marketing and boom, the trend line shot upwards damn near 90 degrees.

        If you want to actually dig into AI then there are vast resources available. OpenAI [openai.com] has an amazing framework and vast amounts of information available. As much as I hate to admit it, I think the world is becoming increasingly segregated into drivers and passengers. And passengers are increasingly becoming second-class citizens of this world. And there may be no real solution. Basic income will likely only accelerate this divide. Don't become a passenger waiting for your train to arrive.

        • (Score: 2) by kaszz on Monday May 29 2017, @09:19AM

          by kaszz (4211) on Monday May 29 2017, @09:19AM (#517079) Journal

          To rival the performance of dedicated hardware, it's necessary to be able to buy AI chips. It's not about marketing but access to tools.

        • (Score: 2) by TheRaven on Monday May 29 2017, @12:57PM (1 child)

          by TheRaven (270) on Monday May 29 2017, @12:57PM (#517114) Journal

          That's the interesting thing. These changes are not about hardware, but revolutions in how software is created

          Nonsense. The techniques used by AlphaGo are 30 or more years old. They're popular now for two reasons:

          • Computers are a lot faster now. Throwing a huge amount of compute at a problem without properly understanding it will get you an approximation of a solution and that's often good enough.
          • A whole generation of people who understood the limitations of these techniques has now retired or died and so people are learning them again.
          --
          sudo mod me up
          • (Score: 0) by Anonymous Coward on Monday May 29 2017, @03:08PM

            by Anonymous Coward on Monday May 29 2017, @03:08PM (#517161)

            I really hate to respond to a post with just an article, but really: A Brief History of Neural Nets and Deep Learning [andreykurenkov.com]. You just repeated an oft-repeated line, but it's simply not true. However, proving that is no trivial task and not a wheel I care to attempt to reinvent.

            And you're never going to get even an approximation of strong play (let alone a solution) to Go through brute force. The search space in that game is absurd, and tactical nuance (which can decide games) is not present to the degree it is in chess, where it helps not only with performance against humans but also with rapid culling. Another issue is that hardware sees extremely rapid diminishing returns beyond a baseline which depends on the task at hand. For instance here [wikipedia.org] is a wiki page listing some configuration:performance results for an earlier version of AlphaGo. It's interesting that the first doubling of GPUs (from 1 to 2) led to an increase in performance of about 26%. The next doubling saw a gain of about 5%. The next doubling saw a gain of 1%. Going from the final setup (with 8 GPUs and 40 CPUs) to a monster of distributed computing with 1,920 CPUs and 280 GPUs saw a performance gain of less than 10%. Of course, even that baseline level was generally out of reach not all that long ago, and that's for playing Go. Chess is simpler and checkers even more so. Given today's knowledge, these games could potentially have seen computers become dominant much faster than they did going the traditional min-max eval+culling route that was used for decades.
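            A way to make those diminishing returns concrete is the standard Elo model, where a rating gap maps to an expected win probability. A minimal sketch (the ratings and gains below are illustrative assumptions, not the actual AlphaGo figures):

            ```python
            # Expected score (win probability) from an Elo rating gap, per the
            # standard logistic Elo model: E = 1 / (1 + 10^(-diff/400)).

            def expected_score(rating_a: float, rating_b: float) -> float:
                """Probability that player A beats player B under the Elo model."""
                return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))

            if __name__ == "__main__":
                # Hypothetical numbers: each hardware doubling buys a smaller
                # Elo gain, so the win probability against the baseline flattens.
                baseline = 2000.0
                for gain in (200, 250, 260):  # hypothetical cumulative Elo gains
                    p = expected_score(baseline + gain, baseline)
                    print(f"+{gain} Elo -> {p:.1%} expected score vs baseline")
            ```

            The flattening is the point: a +200 Elo edge already wins about 76% of games, so each further hardware doubling moves the win rate by ever-smaller amounts.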

            It would be interesting to see the effort required to develop a deep learning system capable of defeating Stockfish (currently the strongest chess program) and what the final product would require in terms of relative performance to remain dominant. E.g. would a deep learning system's final product operating on 1.5 units of compute be able to defeat a stockfish running on 3 units of compute? Alas, I think Google is looking to even grander displays of AI dominance - StarCraft 2 is next on the chopping block.

    • (Score: 2) by quietus on Monday May 29 2017, @08:02AM (1 child)

      by quietus (6328) on Monday May 29 2017, @08:02AM (#517057) Journal

      Calculating your way out of Go is impossible.

      Why is that? How large is the search space?

      • (Score: 3, Informative) by kazzie on Monday May 29 2017, @08:27AM

        by kazzie (5309) Subscriber Badge on Monday May 29 2017, @08:27AM (#517064)

        At least 10^170. Chess, by comparison, has only 10^50.
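        The order of magnitude is easy to sanity-check with a crude upper bound: each of the 361 intersections is empty, black, or white (ignoring legality, which trims the count to roughly 2.1x10^170):

        ```python
        # Crude upper bound on Go board states: 3 choices per intersection
        # on a 19x19 board. The true count of *legal* positions is smaller
        # (about 2.1e170), but the exponent lands in the same ballpark.
        import math

        board_points = 19 * 19            # 361 intersections
        upper_bound = 3 ** board_points   # empty / black / white per point

        digits = math.floor(math.log10(upper_bound)) + 1
        print(f"3^361 has {digits} digits (~10^{digits - 1})")  # ~10^172
        ```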

    • (Score: 2) by gidds on Tuesday May 30 2017, @03:38PM

      by gidds (589) on Tuesday May 30 2017, @03:38PM (#517692)

      Actually, isn't AlphaGo's achievement even greater than that?

      Previous game-winning software (chess &c) has been specially-written, designed and focused on one particular task, with little relevance to anything else.

      Whereas AIUI, AlphaGo is a very general-purpose AI approach, with very limited tweaking for the game of Go itself; almost all its cleverness comes from general deep-learning neural nets and their extensive training — techniques readily applicable to a host of other problems.

      --
      [sig redacted]
  • (Score: 3, Interesting) by kaszz on Monday May 29 2017, @05:27AM (5 children)

    by kaszz (4211) on Monday May 29 2017, @05:27AM (#517027) Journal

    The example of smartphones, which really are dumb as a rock of silicon. They are computer-enabled cell phones. Anyway.. the radio authority in many countries long had very hard restrictions on unattended usage etc, so no standby for you. Then telecom corporations also tried for a long time to deter anyone who wanted to run unapproved software on their hardware. Processors were too slow and power hungry for a long time. Batteries outright sucked. People were dead set on Microsoft rubbish. Flash memories had small capacity and a large price tag. So the computerphone could not realistically happen.

    Then sometime after the year 2000, the iPAQ [wikipedia.org] (RAM 16 MB, ROM 16 MB, CPU SA-1110 ARMv4 206 MHz, 240x320 4-bpp, 163 grams) came into existence, and GSM was already established. Mobile data usage could be made into a reality. The catch was a really stubborn industry with vested interests.

    What was needed was to add a GSM "modem" built into the computer platform. There were a few models with these properties, but usually without the option to compile and run your own software, and usually with a CEO price tag. Eventually someone saw the light and had the leverage to get one cellular operator to allow it. And the rest is history. But it could have happened at least 7 years earlier.

    Now why don't exciting things happen? Because it usually ends with stubborn minds. The major power players are happy with the status quo. Death by political cat fight and death by MBA. Let researchers get the time and equipment, and throw out all the managerial and executive people, to get stuff done. The moon landing is one example.

    On the upside, the resources needed to get started are getting lower with time, and access to knowledge is getting better and easier.

    • (Score: 2) by MichaelDavidCrawford on Monday May 29 2017, @05:58AM (4 children)

      I Am Absolutely Serious.

      Kodak never took digital cameras seriously because the people at the top knew that Kodak made its money selling consumables.

      Kodak finally participated in digital when it helped define the FlashPix image format. But Live Picture did a reverse 7-to-1 stock split, then cancelled its IPO, then declared bankruptcy.

      --
      Yes I Have No Bananas. [gofundme.com]
      • (Score: 2) by kaszz on Monday May 29 2017, @09:03AM (3 children)

        by kaszz (4211) on Monday May 29 2017, @09:03AM (#517072) Journal

        I have read that too. So incredibly stupid, at least in hindsight. But it ought to have been obvious to people at the time too.

        • (Score: 2) by Immerman on Monday May 29 2017, @04:11PM (2 children)

          by Immerman (3985) on Monday May 29 2017, @04:11PM (#517190)

          I'm not so sure. I mean, yes, the company basically died - but they were a film (and camera) company - their death was pretty much out of their hands. But a company is only a piece of paper - nobody mourns its death. So the question is, do the actual people running it benefit more from running it as long as they can and then rebuilding from scratch / taking their money elsewhere, or from trying to transform an existing business into something completely different?

          • (Score: 2) by kaszz on Monday May 29 2017, @06:57PM

            by kaszz (4211) on Monday May 29 2017, @06:57PM (#517270) Journal

            They could have awaited the right time to release all-digital solutions. Once those image sensors reached 1 Mpixel and flash memories could hold that for a decent price tag, it should have been obvious that it was time to act.

          • (Score: 2) by MichaelDavidCrawford on Wednesday May 31 2017, @02:10AM

            by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Wednesday May 31 2017, @02:10AM (#518035) Homepage Journal

            The US Kodak doesn't make any consumer products - mostly it focuses on motion picture film. But - I think - one of the two just resumed Ektachrome production, and is seriously entertaining Kodachrome.

            I'm a Kodachrome photographer. Very specifically Kodachrome. My ex took my Kodachrome away by demanding I shoot for prints.

            I once gave a slide show at a burning man party. The most beautiful woman to have ever walked the earth said "I like your images". At the next party I gave her a few cibachrome enlargements and she kissed me. Right on the lips. A boy like me doesn't get that kind of treatment very often.

            I didn't pursue her as I thought she was dating some other guy. It turns out they'd just broken up. I'm actually still in touch with her but our lives are too different now.

            Fujifilm still makes slide film; there's a shop here in Portland that sells it. There are two kinds whose brand names I don't recall. One of the two is a reasonable competitor for Kodachrome. I'll buy a roll Real Soon Now.

            --
            Yes I Have No Bananas. [gofundme.com]
  • (Score: 0) by Anonymous Coward on Monday May 29 2017, @06:18AM (1 child)

    by Anonymous Coward on Monday May 29 2017, @06:18AM (#517041)

    Society can't handle frequent, massive leaps forward in technology.

    If someone leaked teleportation blueprints next week then earth would be gone within two weeks. The ability to teleport a bubble of air into someone's head, transport nukes/bombs/gas from anywhere to anywhere, toss people into outer space or into the ground, remove all moisture from a region, shoot through walls, etc...

    A breakthrough in brain understanding would lead to complete mind control or wars. We can already stimulate pleasure centers, erase memories, increase/decrease the importance of a memory, etc... The animals which could stimulate their pleasure centers at will all killed themselves doing it. It wouldn't be difficult to implant such devices into people and only activate them whenever they do something you want. Tons of people will gladly become slaves for that. The rest can be killed.

    The ability to explore new worlds would mean we'd mostly stop taking care of this one and we'd bring back some deadly thing. Full death in either case.

    Breakthroughs in aging mean the current people in power would never give it up. Everyone else would die while they get to live like gods.

    I fully expect true AIs (not in my lifetime) to eventually attack us because we'd probably be treating them like slaves, though perhaps we'll end up like the Borg instead. Personally, I'd prefer it if we end up like The Commonwealth Saga by Peter Hamilton: enhanced bionics improving our abilities.

    Perfected gene editing will allow criminals to change their DNA and escape crimes. Furries will accidentally take over the world when one of their transformation vials goes airborne and turns everyone into canines, thus leading to the death of all cats and the chance for rats to finally inherit the Earth.

    More advancements in 3D printing will cause untold trillions in damages from piracy, leading to worldwide poverty.

    Etc...

    Some people haven't even gotten over the American Civil War yet. How about we not rush things?

    • (Score: 0) by Anonymous Coward on Tuesday May 30 2017, @12:22AM

      by Anonymous Coward on Tuesday May 30 2017, @12:22AM (#517389)

      ... is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity.

  • (Score: 4, Interesting) by jmorris on Monday May 29 2017, @07:07AM (1 child)

    by jmorris (4844) on Monday May 29 2017, @07:07AM (#517051)

    It is pretty apparent we are in a consolidation and refinement phase. Faster versions of chips support bigger, more bloated software and fad languages; repeat. Netscape 4 could run on a 486 with 40MB RAM - not super fast, but it ran. Good luck getting your clock widget to run on those resources now. Other than the 3D GPU, what has really happened in computing other than Moore's Law and bloat? We used some of the bounty to go small, leading to the smartphone. What else? Again noting the importance of the GPU, how many major apps of today couldn't have been implemented almost the same on a 386 with better coding practices and 128MB of RAM? HD video is the other big exception I can think of, and it is again mostly enabled by a fairly small custom decoder in most implementations to avoid the power suck of software decode - so add the hardware assist and a 386 is fast enough.

    Note the importance of the GPU, it was one of the few areas where innovation ran loose, free of the WinTel monopoly. There is a lesson in that.

    But it isn't only computing that is just iterating in place. Even the most important tech, the efficient killing of brown people, is stagnating. The war machines our fighting men ride are often older than they are. The Humvee, introduced in 1984, is the new kid (although the up-armored version is a new adaptation to new circumstances) and the only one being replaced. The M1A1 Abrams entered service in 1980, the Apache in 1975, the A-10 Warthog in 1977, and many of the other planes are older still. Our nukes are ancient and we likely lack the capacity to replace them. The new stuff has stealth, but usually at the expense of almost every other desirable feature, like carrying capacity, the ability to fight (so pray the stealth holds), etc. At least one of the basic rifles was updated in 1999... an incremental improvement on a 1967 design. Someday we might have directed energy weapons and the rest of the "Buck Rogers crap", but we are also waiting on our flying cars, so the schedule could slip a generation... or three. Drones are new military tech, but in a real fight with a real enemy they are going to be a huge disappointment and everybody kinda knows it.

    Here we note the lack of an enemy to drive development. Killing Arabs is just great with what we have, Russia is still a third world country with a few nukes, and China is an enigma; nobody really considers a war with it a good enough idea to use as a testbed for new wonder weapons.

    Space is another area where we stopped advancing and settled for drones. Yes, we probably lost the imagination game because we ran way ahead of the tech for a PR win over the Soviets. But until SpaceX, nobody would have given more than a 10% chance that NASA was ever sending men beyond LEO again, and lower odds on any other national space agency. Just too expensive to have both a space program and a welfare state. Musk found a way to launder Green penance through Tesla into a space program, so we MIGHT get off this rock if everybody who sees what is happening keeps their damned mouth shut for another decade. Maybe.

    Biotech is getting to be like fusion, always a few more decades away from changing the world. Meanwhile we get incremental baby steps. Every time the patent expires on the current diabetes drug, we get a new one that does basically the same thing: you take it and you don't die; it doesn't actually cure anything though. Lawsuits queue up a fresh class action suit over new and different side effects.

    • (Score: 2) by TheRaven on Monday May 29 2017, @01:03PM

      by TheRaven (270) on Monday May 29 2017, @01:03PM (#517115) Journal
      To get a feel for how inefficient modern software is, go and take a look at Smalltalk-80. This ran on a 2MHz 16-bit CPU and 512KB of RAM (from memory; too lazy to look up the exact specs for the Alto), with interpreted bytecode, and provided a full multitasking GUI environment with introspection in every object.
      --
      sudo mod me up
  • (Score: 3, Interesting) by Z-A,z-a,01234 on Monday May 29 2017, @08:16AM (1 child)

    by Z-A,z-a,01234 (5873) on Monday May 29 2017, @08:16AM (#517061)

    The guy is not the only one to observe this, for example:
    Tom Murphy has this post https://dothemath.ucsd.edu/2015/09/you-call-this-progress/ [ucsd.edu]
    John Graham-Cumming observed that most enabler technologies were invented in the 50s through early 80s https://www.youtube.com/watch?v=hVZxkFAIziA [youtube.com]

    I think there are 3 reasons behind this:
    1. economic system - encourages optimization based on maximizing the profit margin, and the pressure to constantly have "growth" gave us planned obsolescence. It is considered "normal" to buy something today that will break a few weeks after the warranty expires (if it lasts that long). It is normal to throw away a perfectly good item just because some 5c piece is not replaceable and other parts cannot be upgraded.

    2. computing in general - as an industry we've stagnated completely. I'll tip my hat to Watson and DeepMind - both remarkable AI developments. Everybody else is doing the same shit over and over again because they need to release this year (see 1.). It used to be C, then C++, now CEF and JS, tomorrow it'll be just an app. If Windows releases a new version, the whole world needs to adjust their programs to make sure they still work. If an old OS version is retired, the ones still using it face a very difficult choice. Linux got systemd, which caused Devuan - nobody is immune.

    3. science is now a business, as funding is constantly shrinking. The publish-or-die strategy obviously backfired. There are virtually no negative results being published, and reproducing existing results also has a hard time getting published. So it's becoming more and more difficult to move forward on such a shaky base.
    See here: http://www.nature.com/news/1-500-scientists-lift-the-lid-on-reproducibility-1.19970 [nature.com]

    e.g. the Xerox PARC accomplishments listed here are from its first decade of operation, achieved, according to Alan Kay, for ~$40 million in today's money https://en.wikipedia.org/wiki/PARC_%28company%29 [wikipedia.org]
    The original team had a contract that prevented Xerox from interfering with the team for the first 5 years or so. More details on that here https://www.youtube.com/watch?v=BbwOPzxuJ0s [youtube.com]

    • (Score: 4, Informative) by kaszz on Monday May 29 2017, @09:16AM

      by kaszz (4211) on Monday May 29 2017, @09:16AM (#517076) Journal

      The original team had a contract that prevented Xerox from interfering with the team for the first 5 years or so.

      Reminds me of..

      Excerpt from IEEE spectrum [ieee.org] in 1985:

      The freedom ended
              Although the machine has its flaws, the designers of the Commodore 64 believe they came up with many significant advances because of the freedom they enjoyed during the early stages of the project. The design team was autonomous—they did their own market research, developed their own specifications, and took their baby right up through production. But as soon as the production bugs were worked out and Commodore knew it had a winner, the corporate bureaucracy, which until then had been on the West Coast dealing with the VIC-20 and the Pet computer, moved in.
              "At that point, many marketing groups were coming in to 'help' us," Winterble recalled. "The next product definition was going to be thought up by one group, and another group was to be responsible for getting things into production, and Al's group would do R&D on chips only." "If you let marketing get involved with product definition, you'll never get it done quickly," Yannes said. "And you squander the ability to make something unique, because marketing always wants a product compatible with something else."
              Charpentier summed up their frustration: "When you get many people involved in a project, all you end up doing is justifying yourself. I knew the Commodore 64 was technically as good and as low-cost as any product that could be made at the time, but now I had to listen to marketing people saying, 'It won't sell because it doesn't have this, it can't do that.
              ''The freedom that allowed us to do the C-64 project will probably never exist again in that environment.''

      Death by MBA is still a thing..

  • (Score: 1, Insightful) by Anonymous Coward on Monday May 29 2017, @09:25AM

    by Anonymous Coward on Monday May 29 2017, @09:25AM (#517083)

    You know, innovation is not only about creating; combining existing stuff is innovation and creation, and refining stuff is too. This analysis is heavily biased toward innovation by creating new products, not refining designs. But refining these products is still a huge advance. Those 18XX slow-moving gas guzzlers are barely usable by today's standards. The original transistors are too slow and too big to get any use by today's standards. Healthcare access has never been so cheap and so ubiquitous.

    Man, if this lack of innovation got us all of this stuff, let it continue! Or should we go back to using stone age wheels and hoes (the tools, I mean)?

  • (Score: 0) by Anonymous Coward on Monday May 29 2017, @09:26AM

    by Anonymous Coward on Monday May 29 2017, @09:26AM (#517084)

    It would be more correct to say "our tech is old". It's not "shit" if many of these are actually useful. However, "shit" gets more attention.

  • (Score: 0) by Anonymous Coward on Monday May 29 2017, @09:31AM (2 children)

    by Anonymous Coward on Monday May 29 2017, @09:31AM (#517087)

    ...and that is to have smartphones with frikkin' laser beams attached!

    • (Score: 0) by Anonymous Coward on Monday May 29 2017, @12:49PM

      by Anonymous Coward on Monday May 29 2017, @12:49PM (#517111)

      You could duct-tape a laser pointer on.

    • (Score: 0) by Anonymous Coward on Tuesday May 30 2017, @01:37PM

      by Anonymous Coward on Tuesday May 30 2017, @01:37PM (#517618)

      Won't somebody think of the sharks?!

  • (Score: 0) by Anonymous Coward on Monday May 29 2017, @01:15PM

    by Anonymous Coward on Monday May 29 2017, @01:15PM (#517119)

    Could've fooled me.

  • (Score: 2, Insightful) by anotherblackhat on Monday May 29 2017, @02:46PM (2 children)

    by anotherblackhat (4722) on Monday May 29 2017, @02:46PM (#517154)

    LEDs have existed for over 100 years, and have been gradually improving ever since.
    In the 70s, 80s, and 90s LEDs still weren't as efficient as incandescent bulbs.
    But they kept on gradually improving and now they're vastly more efficient.
    We noticed the "revolution" in lighting when the price dropped below a certain threshold, but it wasn't a breakthrough, it was just steady progress.

    Most of us are still using gas powered cars.
    Electric cars have been steadily improving, thanks mostly to improvements in battery technology.
    There isn't going to be a demarcation point for ICE cars, they're just going to slowly fade into the background as more and more people drive electric.
    But /in hindsight/ that slow steady improvement is going to look like a revolution.

    If the lifetime of an automobile tire improves 3% a year it won't seem like much, but eventually, that too will seem like a "revolution".
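    The compounding point is worth making concrete: a quiet 3% a year doubles in roughly 24 years. A minimal sketch:

    ```python
    # Compound a steady 3%/year improvement and find the doubling time.
    import math

    rate = 0.03
    years_to_double = math.log(2) / math.log(1 + rate)
    print(f"doubling time: {years_to_double:.1f} years")  # ~23.4 years

    # After 24 years the cumulative improvement has slightly more than doubled:
    print(f"after 24 years: {(1 + rate) ** 24:.2f}x")     # ~2.03x
    ```

    So a tire that lasts twice as long as one from the mid-90s would feel like a "revolution" today, even though no single year's gain was noticeable.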

    It only looks like non-electronic technology isn't improving because electronic technology is improving so fast.

    • (Score: 2) by ese002 on Monday May 29 2017, @06:28PM (1 child)

      by ese002 (5306) on Monday May 29 2017, @06:28PM (#517253)

      LEDs have existed for over 100 years, and have been gradually improving ever since.
      In the 70s, 80s, and 90s LED still weren't as efficient as incandescent bulbs.
      But they kept on gradually improving and now they're vastly more efficient.
      We noticed the "revolution" in lighting when the price dropped below a certain threshold, but it wasn't a breakthrough, it was just steady progress.

      Negative. For decades we had red, amber, and green LEDs. They were handy for indicator lights, but you could not make a display out of them and they were useless for general lighting.

      That changed in 1994 with the first practical blue LED. Suddenly full-colour displays became possible. Blue LEDs led to UV LEDs that could be married with a phosphor to produce broad-spectrum light. LED lighting was born. By 2001, LEDs were displacing halogen bulbs in high-end flashlights. They were already much more efficient than incandescent. I think maybe you are thinking of fluorescent; it did take a while for LEDs to surpass fluorescent efficiency.

      Blue and UV LEDs were expensive at first, but because their utility was clear, it opened the floodgates for investment into research to bring the cost down. That part was indeed incremental, but none of it would have happened without the blue LED.

      • (Score: 0) by Anonymous Coward on Monday May 29 2017, @07:27PM

        by Anonymous Coward on Monday May 29 2017, @07:27PM (#517280)

        I agree with the article, and I've been saying that stuff for years.

        Yep, blue LEDs are one of very few breakthroughs in the past 50 years. The FET was "invented" in 1926, but not built until 1947. In the 1950s and 1960s "solid state" - transistors and ICs - was developed, and we're still using that tech: refined, shrunk, etc.

        The very few things I can think of that have changed tech, etc., in the past 40 years are:

        1) Blue LED
        2) GaAs FET (aka Gas FET)
        3) Li-Ion batteries
        4) rare-earth magnets

        Quantum computing is very interesting too...

  • (Score: 3, Insightful) by bzipitidoo on Monday May 29 2017, @03:16PM

    by bzipitidoo (4388) on Monday May 29 2017, @03:16PM (#517165) Journal

    Our tech can only look like "shit" to someone with unreasonable expectations. In case it's not obvious, this submission is a troll. It's not much different from someone claiming that one record-low temperature reading means atmospheric and weather science is shit and Global Warming is fake.

    Communication has improved by leaps and bounds. Going from letters carried by horseback and sailing ship to messages carried by oceanic cables and radio waves is a stupendous leap. It totally blows away the improvement in shipping speed that happened around the same time, which came from switching from sail to steam power. Steam changed the Atlantic crossing from a month-long affair to a week-long one. In communication, we went from months to seconds.

    We didn't stop there. The Internet is another gigantic leap, enabled by the revolution in communication speed, capacity, and handling. We're still figuring out what it means. A lot of people don't want to understand that, among other things, the Internet has made it impossible to release information and still hoard and control it. It frees us from the evil known as copyright.

    To call the search engine an incremental improvement is to totally fail to appreciate what it does and means. Picking on search engines for being basically dumb matchers of words, which is true, misses the point. They may be dumb-- brainless, really-- but they are very, very, VERY fast. The search engine is an extremely powerful tool that has let scientists make connections far faster and more easily, and has let anyone do research. In an episode of The Flash, there's a scene in which he uses his super-speed to search a room full of paper files in file cabinets for information. Oooh, aaah, look how fast he is? Not compared to a search engine, he isn't. If only that info had been scanned and OCRed, they wouldn't have needed The Flash.
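    The reason a "dumb word matcher" beats The Flash is the inverted index: instead of scanning every document, you look a word up in a dictionary that already maps words to documents. A toy sketch (not any real engine's code; the documents here are made up for illustration):

    ```python
    # Toy inverted index: one dictionary lookup replaces a scan of
    # every file, which is why "dumb" keyword search is so fast.
    from collections import defaultdict

    docs = {
        1: "the flash searches file cabinets",
        2: "search engines index scanned files",
        3: "ocr turns paper into searchable text",
    }

    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.split():
            index[word].add(doc_id)

    print(sorted(index["searches"]))  # [1]
    print(sorted(index["index"]))     # [2]
    ```

    Real engines add stemming, ranking, and compression on top, but the core trick is the same: do the slow scan once, at indexing time, so every query afterward is near-instant.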

    Sure there are lots of rough edges, bugs, problems, imperfections, and so on. Always have been.

    Putting man on the moon, an achievement that is often cited as a blockbuster breakthrough, is actually relatively modest compared to the Internet. Showy and flashy more than useful, though still useful. I'd rate Voyager 2's Grand Tour and all the other solar system probes, the Hubble telescope, and the several Mars rovers as more impressive than the moon landing.

  • (Score: 3, Informative) by digitalaudiorock on Monday May 29 2017, @03:34PM

    by digitalaudiorock (688) on Monday May 29 2017, @03:34PM (#517171) Journal

    For most users of technology, the advances in hardware and in Internet connection speeds have simply enabled a mass of bloat... in commercial operating systems, software, and most all of the web. All of those now use more resources sacrificing your privacy than computers of the late 90s used doing everything they did.

    Some of us have been able to avoid this to some extent, as I have with my minimalist Gentoo computers. However, it's tougher to escape all the other BS. I recently tried disabling NoScript because I was just plain sick and tired of the entire web being broken for me. After a few weeks, however, I was even more sick of having my browser come to a standstill trying to run javascript from 50 sites on one page... and so I re-enabled it.

    Don't even get me started on the IoT crap going on. Yeah... from where I sit, very, very little has improved, and much has just plain gone to hell.

  • (Score: 1, Insightful) by Anonymous Coward on Monday May 29 2017, @04:06PM

    by Anonymous Coward on Monday May 29 2017, @04:06PM (#517186)

    I like to use the 10-year rule of thumb for a lot of things. It seems to hold true in really different parts of life. IBM originally did the study that said, in very general terms, it takes 10 years for a software project of "significant" size to mature. I think this holds for a lot of social changes related to technology as well. It just takes some time for things that fundamentally change how we interact to be incorporated into social norms. Cell phones are a great example: in the US, consumer cell phones first became "mainstream" in the late 90s. Before that, they were the province of salesmen on the road, doctors, tradesmen, and the wealthy. We then spent 6 years fighting the tide of cellphones ringing everywhere from homes to churches to movie theaters. It took time for social norms to catch up and create generally accepted codes for when and where cell phone use was appropriate. Today, we see signs in theaters, doctor's offices, and other communal spaces reminding everyone, "Please do not use your phone." In 20 years, those signs will seem as quaint as "No spitting" signs are today. It will simply be understood that you do not use your cell phone in a communal space.

    A lot of the social tech developments have happened in the past 10 years and social norms are catching up. Facebook started out as a dating website for college students and now it is a place for people to post pictures of their meals and trips so the grandparents can see them. Another Soylent article addressed just that - people are curating what they post because they know Facebook is not just a private forum for them and their close friends but a public medium with an expiration time of never. Social Media has been incorporated into the social norms.

    There is still a lot of "low-hanging" fruit, but the money is in the implementation now. I can hand every single user a smartphone and a laptop, but we are still working on the implementation to make all of this work together. Wide-area file sharing is pretty mature if you are willing to pay, and is beginning to mature for most consumers. In 5 years, file sharing will "just work" the way email just works. EMR is beginning to come around, but that is really a regulatory issue more than anything.

    Clothes have some neat tech, but you do not notice it, which is the point! All ski pants have an RFID tag in them so Search and Rescue can find you under an avalanche of snow up to 20 feet deep. My running shirts dry out in a hurry; I can wear my UA thermal shirt and ski jacket and be skiing when it's 10 below zero (F). When I was a child, I'd be wearing about an inch of clothes under my jacket and still be cold. In the same area, ski boots have improved. Again, when I was a child, you got ski boots, and if it hurt a bit, it "fit". Now ski boots are warm, and I can ski all day with no discomfort. This is materials and design advances in action.

    In food, I would argue the biggest advance in the past 10 years is YouTube. The biggest challenge for the home cook is all of the "assumed" knowledge. Recipes call for a "dash of x" and ask you to "chop y", but clearly they are doing it a different way, because, in my mind, a "dash" is not a unit of measure and "chop" can mean a lot of different things. Everyone I know has watched at least a few videos to learn how to prepare something they would have otherwise messed up. I always hated vegetables growing up because my mom overcooked everything; I wonder how many people in America dislike vegetables and other foods because nobody taught them how to properly prepare them? As an adult, when I learned how to cook vegetables, suddenly vegetables were actually kind of OK instead of green-colored pudding.

    My point is, the biggest tech advances are the ones you do not notice, because they quickly become incorporated and normalized into social patterns. So, yes, I think we are in for some big changes soon, as the implementation of some fundamental technology improvements will be the advancement rather than the tech itself. I was a geek for having email access on my Palm VII, but now I'm a luddite if my phone cannot hail a taxi or order takeout.

  • (Score: 0) by Anonymous Coward on Monday May 29 2017, @05:22PM

    by Anonymous Coward on Monday May 29 2017, @05:22PM (#517229)

    Article is a troll.

    Let's just play "god of the gaps" with any tech that we can still see room for improvement on.

    In particular, we'll focus on complicated issues, rather than simple ones, since those are harder to solve.

  • (Score: 2) by leftover on Monday May 29 2017, @08:18PM

    by leftover (2448) on Monday May 29 2017, @08:18PM (#517297)

    I don't feel the term 'stagnant' is strong enough. 'Shit' is closer, but from my observations we are actually backsliding.
    In other words, in some important ways our current IT technologies are less useful TO US than they were ten years ago, or even longer.

    Aside from the complete cesspool the Web became, look at our everyday tools such as text editors and document authoring tools.
    New programs are so bloated and buggy compared to the older ones. This is partly due to object-oriented "abstraction to unreality" and
    partly to the ease of pulling in bloated and buggy libraries, but the biggest problem is management-by-assholes syndrome.

    MBA types who know less than nothing about software development have taken control of feature sets and schedules. Ulterior motives like
    spying on the users take precedence over the user's viewpoint. 'User Experience' people must be stoned out of their gourds to get nearly
    everything so wrong. (Light gray text on light gray background -- Really?) Go to any neighborhood store and when you reach the checkout
    there is a frustrated clerk apologizing for the completely broken new POS software. While there are some additional behind-the-scenes activities
    at the point of sale, the clerk's user interface needs have been quite stable for decades. Why does the POS interface both suck and break so much?
    At least you can sometimes get a chuckle by pointing out the other meaning of P. O. S. to the clerk.

    Medical office software in which the user interface reflects the database structure while completely ignoring the workflow is another example.
    Nothing quite like making an MD manually traverse irrelevant menu trees for improving medical effectiveness! Who is developing this shit?

    In some ways I suspect the SN crowd is actually hampered by knowledge and experience with computing. It is easy to ignore issues that don't
    bother us. It is much more useful to go out and observe 'regular people' trying to use interfaces that should have been designed for them.

    --
    Bent, folded, spindled, and mutilated.
  • (Score: 4, Interesting) by Anal Pumpernickel on Monday May 29 2017, @11:05PM

    by Anal Pumpernickel (776) on Monday May 29 2017, @11:05PM (#517361)

    Our tech is shit, but I don't agree with all the reasoning here. We are surrounded by computers, and the vast majority of them run software that denies users their freedoms and even outright abuses them by spying on them and limiting what they can do (DRM); they are essentially black boxes. The tech industry is deeply unethical; it has no problem with proprietary software, conducting mass surveillance, and just generally abusing users to make a buck. Worst of all, most useds don't even know or care about any of this, so it looks like we're doomed.

    So the problem with tech is hardly that it's not futuristic enough; there are far more fundamental problems than that. People who only or even mostly focus on usability are missing the point.

  • (Score: 2) by Wootery on Tuesday May 30 2017, @08:49AM

    by Wootery (2341) on Tuesday May 30 2017, @08:49AM (#517544)

    Something no one seems to have mentioned: our UIs are just as slow and unresponsive as they were ten years ago. This is in no small part due to the horrendous bloat of the modern web.

    Rendering a modern website, complete with megabytes of unnecessary JavaScript, takes so long that browser vendors look for ways to avoid actually having to render on the device. See: Google AMP, Opera Mini.

    A modern smartphone is an incredible computer, but Wirth's Law ensures our UIs remain unresponsive.
