Submitted via IRC for TheMightyBuzzard
Microsoft's man in charge of predicting the future has forecast the slow death of the Qwerty keyboard — with facial tracking, voice and gesture recognition taking over. Dave Coplin, the technology giant's chief envisioning officer, said it was bizarre that 21st-century workers still relied on typing technology invented in the 19th century. He added that while there have been huge leaps in technology, often the workplace had not caught up.
"We have these amazing computers that we essentially use like we're still Victorians. The Qwerty keyboard is a great example of an old design being brought forward to modern day. We've not really evolved. We still use this sub-optimal design.
"We're looking at technologies now like voice and gesture recognition, and facial tracking that may make the keyboard redundant," he added.
"We think that computers in the not-too-distant future will be able to understand all of those things and infer on my behalf my intent, meaning and objective that I'm trying to do."
(Score: 0) by Anonymous Coward on Saturday October 08 2016, @03:34PM
They often pushed more buttons, and the panels always had a lot of information.
(Score: 0) by Anonymous Coward on Saturday October 08 2016, @03:46PM
Yes they did, but if you think about the complexity of the tasks they were often doing (for example, improvising some new technology on the spot by pushing only a handful of buttons), it seems to me there weren't enough button presses to account for all the necessary data points, and that some connection with the mind would have to be involved.
(Score: 2) by mhajicek on Saturday October 08 2016, @04:24PM
Wouldn't work for Data.
The spacelike surfaces of time foliations can have a cusp at the surface of discontinuity. - P. Hajicek
(Score: 1, Touché) by Anonymous Coward on Saturday October 08 2016, @04:27PM
A good point, but really at some point you have to take a step back and realize it isn't a documentary...
(Score: 4, Funny) by mhajicek on Saturday October 08 2016, @04:37PM
Galaxy Quest.
The spacelike surfaces of time foliations can have a cusp at the surface of discontinuity. - P. Hajicek
(Score: 2) by driverless on Sunday October 09 2016, @02:58AM
Traumschiff Surprise.
(Score: 0) by Anonymous Coward on Sunday October 09 2016, @07:13PM
Yes, and the point at which you should do that is *before* you analyse the number of button presses against the complexity of the task they were supposed to be doing.
(Score: 2) by edIII on Saturday October 08 2016, @09:11PM
Oh yes it would, and it would work much, much, much faster. Data can connect directly to LCARS, which is akin to us interfacing our brains directly with the ship's computer core. As Data actually understands the programming at a lower level, he can "think" with LCARS in a more sophisticated fashion, and faster, than any humanoid (except a few).
Barring that, Data can provide physical inputs to LCARS fast enough that you can't see his hands moving. The environment is customizable, and Data's user profile probably includes buttons no other human can decipher, allowing him to work more effectively with fewer inputs.
I would not bet against that android.
Technically, lunchtime is at any moment. It's just a wave function.
(Score: 0) by Anonymous Coward on Saturday October 08 2016, @04:35PM
Then how did they work with Data? But in all seriousness, what they did isn't too hard if it's context-based. For example, the one to adjust the shields would be the engineer or the tactical station, which probably already has buttons for the shields. It's like placing a call to someone. How can you do that without hitting half a million buttons? Click one button to go home, one for the phone, one for contacts, and one to select the person. Yes, it's true that context-based GUIs mean some tasks take longer, but others need only a few interactions, or even one.