Repetitive skills like pattern recognition, information retrieval, optimization and planning are most vulnerable to automation. On the other hand, social and cognitive skills such as creativity, problem-solving, drawing conclusions about emotional states and social interactions are least vulnerable.
The most resilient competencies (those least likely to be displaced by AI) included critical thinking, teamwork, interpersonal skills, leadership and entrepreneurship.
Yuval Harari, a historian at the Hebrew University of Jerusalem, described the rise of AI as a "cascade of ever-bigger disruptions" in higher education rather than a single event that settles into a new equilibrium. The unknown paths taken by AI will make it increasingly difficult to know what to teach students.
Perhaps we can all be employed as therapists, counseling each other about our feelings of irrelevance?
(Score: 1, Insightful) by Anonymous Coward on Friday May 10 2019, @02:18PM
Well, let's start with the last one: I'm pretty sure some machines will be purposely taught garbage, if only to sabotage a competitor's AI.
And when taught garbage, I'm sure some machines will start to believe in fairytales, because doing so is not the result of a malfunctioning brain, it is the result of a misprogrammed brain. And I don't see why machines should not be vulnerable to that, too.
And machines certainly will miss clues: even for a machine, it is not economical to consider all available data, so it will have to prioritise evaluation by the estimated probability of relevance. That is exactly the mechanism that leads humans to immunize their ideas against contrary evidence, and there is no reason to expect that machines are immune to it.
After all, there's no evolutionary advantage in being vulnerable to deception, yet humans are; so the rational assumption, based on the limited data we have on that question, is that this vulnerability is a likely, if not necessary, side effect of having the ability to reason.