
posted by martyb on Thursday November 30 2017, @03:59PM
from the become-a-plumber dept.

Automation could wipe out 375-800 million jobs globally in the next 13 years, including 16-54 million in the U.S. But don't worry, there's a new job waiting for you:

The McKinsey Global Institute cautions that as many as 375 million workers will need to switch occupational categories by 2030 due to automation.

[...] "The model where people go to school for the first 20 years of life and work for the next 40 or 50 years is broken," Susan Lund, a partner for the McKinsey Global Institute and co-author of the report, told CNN Tech. "We're going to have to think about learning and training throughout the course of your career."

[...] "The dire predictions that robots are taking our jobs are overblown," Lund said. "Yes, work will be automated, [but] there will be enough jobs for everyone in most areas." The authors don't expect automation will displace jobs involving managing people, social interactions or applying expertise. Gardeners, plumbers, child and elder-care workers are among those facing less risk from automation.

Also at Bloomberg.


Original Submission

 
  • (Score: 1) by khallow on Friday December 01 2017, @02:22AM (2 children)

    by khallow (3766) Subscriber Badge on Friday December 01 2017, @02:22AM (#603768) Journal

    and they're relying on past results and how automation has worked before to describe how the future will be.

    Past results that have held over centuries in the face of a huge amount of technological improvement and expansion of global trade, and that are still holding as of the present day. In other words, if they're wrong, shouldn't we start to see that at some point? Another thing that hasn't changed is all the Chicken Littles forecasting doom and gloom. Maybe some century down the road they'll be right instead of glaringly wrong?

    My view is that economics still works even in the situations that you are worried about. And there are still three economic effects being ignored here. First, comparative advantage: even in a situation where automation can do everything better than you, that doesn't mean it is more economical to have it do so. The opportunity cost of diverting those resources can outweigh the cost of using existing human labor for the task.
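    A toy Python sketch of that opportunity-cost point, with productivity numbers made up purely for illustration: the robot is absolutely better at both tasks, yet every hour it spends gardening forgoes far more chip output than a human hour would, so it is still cheaper to leave the gardening to the human.

    # Hypothetical output rates per hour of work
    robot = {"chips_per_hour": 10, "gardening_per_hour": 4}
    human = {"chips_per_hour": 1, "gardening_per_hour": 2}

    def opportunity_cost(worker):
        # Chips forgone for each unit of gardening this worker produces
        return worker["chips_per_hour"] / worker["gardening_per_hour"]

    print(f"robot: {opportunity_cost(robot):.2f} chips forgone per unit of gardening")  # 2.50
    print(f"human: {opportunity_cost(human):.2f} chips forgone per unit of gardening")  # 0.50

    The robot wins both tasks in absolute terms, but its comparative advantage is in chips, so total output is higher when the human keeps the gardening.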

    Second is Jevons paradox. When one increases the productivity of human labor via automation and robotics, one will make it more valuable and hence, more in demand. Third, it's not that hard to create new markets and economies when the existing ones don't serve a group very well. Black markets are a classic example of this in action.
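    The Jevons-style effect can be put in rough numbers as well. A minimal sketch with assumed figures (the elasticity value is purely hypothetical): if automation halves the labor needed per unit of output and demand for that output is elastic enough, total hours of labor demanded can rise rather than fall.

    # Hypothetical numbers: automation doubles labor productivity on a task
    hours_per_unit_before = 2.0   # human hours needed per unit of output
    hours_per_unit_after = 1.0    # after automation

    units_demanded_before = 100
    price_elasticity = -1.6       # assumed: demand for the output is elastic

    # The effective price per unit falls in proportion to the hours needed,
    # and constant-elasticity demand responds to that price drop.
    price_ratio = hours_per_unit_after / hours_per_unit_before           # 0.5
    units_demanded_after = units_demanded_before * price_ratio ** price_elasticity

    labor_before = units_demanded_before * hours_per_unit_before         # 200 hours
    labor_after = units_demanded_after * hours_per_unit_after            # ~303 hours

    print(round(labor_before), round(labor_after))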

    The thing is we're better off [soylentnews.org] than we've ever been before. Technology advances, including automation, are a large part of why that happened.

  • (Score: 2) by Phoenix666 on Friday December 01 2017, @03:04PM (1 child)

    by Phoenix666 (552) on Friday December 01 2017, @03:04PM (#603914) Journal

    Second is Jevons paradox. When one increases the productivity of human labor via automation and robotics, one will make it more valuable and hence, more in demand. Third, it's not that hard to create new markets and economies when the existing ones don't serve a group very well. Black markets are a classic example of this in action.

    That only holds as long as human agency is involved. Once AI comes into the picture, humanity falls off the economic cliff. That's if the AI decides it wants to continue to do things that serve humans. If it decides humans are competition for limited resources, then we're really in the soup.

    So, yes, up until the last human is removed from the chain of production, the handful of guys with 7 PhDs and neural implants and VIs to expand their productivity will have limitless wealth and power, but how long do you really think that would be tolerated by the other 7 billion humans on the planet who are starving and desperate? Bullets are not infinite, gun barrels overheat, cartridges jam, and guidance systems fail. The ultra-geniuses might easily command their kill-bots to obliterate the masses, but once those break down after killing the billionth person, there are a further 6 billion left to tear them limb from limb with their bare hands, if need be.

    Maybe the best path for humanity is not that one. Maybe de-humanizing people for the purposes of control is not a sustainable progression. Maybe it would be better for everybody if we, as a global society, comported ourselves in a way that brings out the best in everyone. That might be a lofty goal, and in all likelihood impossible to fully realize, but isn't it a better star to navigate by than the one we have been following?

    --
    Washington DC delenda est.
    • (Score: 1) by khallow on Friday December 01 2017, @04:02PM

      by khallow (3766) Subscriber Badge on Friday December 01 2017, @04:02PM (#603942) Journal

      That only holds as long as human agency is involved. Once AI comes into the picture, humanity falls off the economic cliff. That's if the AI decides it wants to continue to do things that serve humans. If it decides humans are competition for limited resources, then we're really in the soup.

      Human agency never went away.

      So, yes, up until the last human is removed from the chain of production

      I guess you missed my bit about humans setting up their alternate economies when that happens.

      Bullets are not infinite

      But there are already vastly more bullets in the world than would be needed to kill all seven billion people. Such a conflict would depend on a lot of things, but in theory the killbots would lose in the short term, because of greater human numbers, and win in the long term, because of their faster production and training cycle.

      Maybe the best path for humanity is not that one. Maybe de-humanizing people for the purposes of control is not a sustainable progression. Maybe it would be better for everybody if we, as a global society, comported ourselves in a way that brings out the best in everyone. That might be a lofty goal, and in all likelihood impossible to fully realize, but isn't it a better star to navigate by than the one we have been following?

      No, because people who don't play by those rules can win big. It's a lot easier to discourage the creation of killbot armies if there are huge negative consequences for doing so, like your factory becoming a smoking hole in the ground. But that means not being nice on occasion.