Arthur T Knackerbracket has processed the following story:
In the 1968 Star Trek episode "The Ultimate Computer," Captain Kirk's ship was used to test the M5, a new computer. A copilot, if you will, for the Starship Enterprise.
Designed to perform the jobs of the human crew more efficiently, the M5 indeed did those jobs very well, yet with such a terrifying lack of understanding that it had to be disabled. But not before exacting a terrible price.
Last week, Microsoft 365 Copilot, a copilot, if you will, for the technology enterprise, sold as performing human tasks with more efficiency, increased its prices by 5 percent, the first of many finely judged increments in the old style. Unlike the M5, it isn't in the business of physically destroying the enemy, instead delivering commercial victory with the photon torpedo of productivity and the phaser bolts of revitalized workflow.
[...] Some time back, this columnist noted the disparity between the hype of the metaverse in business and the stark, soulless hyper-corporate experience it actually delivered. Line-of-business virtual reality has two saving graces over corporate AI. It can't just appear on the desktop overnight and poke its fingers into everything involved in the daily IT experience, and so it can't generate millions in licensing at the tick of a box. VR is losing its backers huge amounts of money that can't be disguised or avoided; corporate AI is far more insidious.
As is the dystopia it is creating. Look at the key features by which Microsoft 365 Copilot is being sold.
Pop up its sidebar in Loop or Teams, and it can auto-summarize what has been said. It can suggest questions, auto-populate meeting agendas. Or you can give it key points in a prompt and it will auto-generate documents, presentations, and other content. It can create clip art to spruce up those documents, PowerPoints, and content.
How is this sold? As something that will make you look more intelligent: ask Copilot to suggest a really good question during an online presentation or a Teams meeting. What's also implied but unsaid: if you're the human on the receiving end of this AI-smart question and want to look smart enough to answer it, who are you gonna call? Copilot.
The drive is always to abdicate the dull business of gathering data, thinking about it, and communicating the results. All of it can be fed as prompts to the machine, and the results presented as your own.
And so begins a science-fiction horror show of a feedback loop. Recipients of AI-generated key points will ask the AI to expand them into a document, which will itself be AI key-pointed and fed back into the human-cyborg machine a team has become. Auto-scheduled meetings will be auto-assigned, and will multiply like brainworms in the cerebellum. The number of reports, meetings, presentations, and emails will grow inexorably as they become less and less human. Is the machine working for us, or we for the machine?
Generative AI output feeding back into itself can go only one way, but Copilot in the enterprise is seemingly designed to amplify that very process. And you have to use it if you want to keep up with the perceived smartness and improved productivity of your fellow workers, and the AI-educated expectations of the corporate structure.
[...] It is taboo to say how far your heart sinks when you have to create or consume the daily diet offered up in company email, Teams, meeting agendas, and regular reports. You won't be able to say how much further it will sink when all the noise is amplified and the signal suppressed by corporate AI. Fair warning: Buy the bathysphere now.
There is an escape hatch. Refuse. Encourage refusal. When you see it going wrong, say so. A sunken heart is no platform for anything good personally, as a team, or as an organization. Listen to your humanity and use it. Oh, and seek out "The Ultimate Computer" – it's clichéd, kitsch, and cuts to the bone. The perfect antidote for vendor AI hype.
(Score: 5, Informative) by Anonymous Coward on Sunday November 24, @01:42PM (7 children)
Microsoft has given you the honor of offering up your documents and other things you type, by default, for their AI training. Should you somehow NOT wish to have such honor bestowed upon you, you can easily opt out of this service through a simple eight-click process:
File -> Options -> Trust Center -> Trust Center Settings -> Privacy Options -> Privacy Settings -> Optional Connected Experiences -> Uncheck box: "Turn on optional connected experiences"
For Teams you need to go to Privacy -> Privacy in Teams to find it.
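For admins who would rather script this than click through the Trust Center on every machine, the same toggle can, in principle, be enforced through the Office privacy policy settings in the per-user registry. The sketch below is a minimal, hypothetical example in Python: it assumes the documented policy value controllerconnectedservicesenabled under HKCU\Software\Policies\Microsoft\office\16.0\common\privacy (where 2 means disabled) still applies to your Office build, so verify it against Microsoft's privacy-controls documentation before relying on it.

```python
# Hypothetical sketch only: flips the Office "optional connected experiences"
# privacy policy for the current user via the registry, instead of the
# eight-click Trust Center route above. The key path, value name, and the
# meaning of the value (2 = disabled) are assumptions drawn from Microsoft's
# documented privacy policy settings -- verify for your Office build.
# Windows only; uses the standard-library winreg module.
import winreg

POLICY_SUBKEY = r"Software\Policies\Microsoft\office\16.0\common\privacy"
VALUE_NAME = "controllerconnectedservicesenabled"  # "optional connected experiences"
DISABLED = 2  # 1 = enabled, 2 = disabled

def disable_optional_connected_experiences() -> None:
    """Write the 'disabled' policy value under HKEY_CURRENT_USER."""
    with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, POLICY_SUBKEY, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, VALUE_NAME, 0, winreg.REG_DWORD, DISABLED)

if __name__ == "__main__":
    disable_optional_connected_experiences()
    print("Optional connected experiences policy set to 'disabled' for this user.")
```

Policy values set this way generally take precedence over the in-app checkbox, so the Trust Center option may appear greyed out afterward.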
I would hope this is not the default setting for enterprise installations as well in places governed by the GDPR.
(Score: 4, Insightful) by acid andy on Sunday November 24, @01:53PM (5 children)
Madness! What's the difference between an Option and a Setting? It looks like many Settings make one Option. But why under the File menu? Is it because kids these days don't even know what a file is? And how Orwellian does the Trust Center sound? *Shudders*
Welcome to Edgeways. Words should apply in advance as spaces are highly limite—
(Score: 4, Insightful) by Gaaark on Sunday November 24, @03:16PM (4 children)
I can just see Microsoft giving 'Bosses' the ability to see their underlings' use of Co-pilot: "Ah, George(tte) used Co-pilot to come up with that inciteful question/what have you. Gotta keep an eye on them: they don't really know what they're doing, they're just using AI to appear to know.", etc.
I'd be afraid of using it on a Corporate given computer/lappy.
Yeah: Trust Center. And who do you trust?
--- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
(Score: 3, Insightful) by acid andy on Sunday November 24, @07:03PM
Good point. Simple surveillance of employees, like key loggers and recording screen time, is already very popular, so managers will love an AI that constantly watches and (pretends to) evaluate employee productivity. I predict the worst mental health crisis in the history of humanity, coming soon!
Welcome to Edgeways. Words should apply in advance as spaces are highly limite—
(Score: 1, Interesting) by Anonymous Coward on Monday November 25, @01:38AM (2 children)
Incite (v. int.) [merriam-webster.com]:
Insight (noun) [merriam-webster.com]:
(Score: 2) by Gaaark on Monday November 25, @02:23AM
Ah yes... caught by Professor Pedant again! And I would have gotten away with it too, if it hadn't been for those buoys and grrrls and their Shaggy dog. :)
--- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
(Score: 2) by acid andy on Monday November 25, @02:15PM
I thought he meant the employee's question was inciting discontent or something.
Welcome to Edgeways. Words should apply in advance as spaces are highly limite—
(Score: 3, Interesting) by Anonymous Coward on Sunday November 24, @03:01PM
Hmmm
Microsoft AI: User XXYYZZ has toggled the "optional" connected experiences off, better tell the NSA AI...
NSA AI: (connects via backdoor, has a look...) Boring... maybe the CIA AI might find some of it of interest...
CIA AI: (reads the NSA AI exfiltrated material) Nothing here of interest to me, but I'll bet the GCHQ AI will find some of this stuff amusing, and the IRS AI might want to have a good look at a couple of the files..
You're not so much 'opting out' as indicating that you've something to hide.
Welcome to the ePanopticon, where we'll all be under the basilisk gaze of (in the words of Mr Donald Fagen)
'A just machine to make big decisions,
Programmed by fellows with compassion and vision'
Aye, right...