Microsoft is having another whack at its controversial Recall feature for Copilot+ Windows PCs, after the original version crashed and burned amid scrutiny from security researchers and testers over the summer. That version of Recall recorded screenshots and OCR'd text of all user activity and stored them unencrypted on disk, where they could easily be accessed by another user on the PC or an attacker with remote access.
The feature was announced in late May without going through any of the public Windows Insider testing that most new Windows features get, and was scheduled to ship on new PCs by June 18. By June 13, the company had delayed it indefinitely to rearchitect it, saying it would be tested through the normal channels before being rolled out to the public.
Today, in a post by Microsoft VP of Enterprise and OS Security David Weston, Microsoft shared more extensive details on exactly how Recall's security has been re-architected.
Previously on SoylentNews:
Microsoft Will Try the Data-Scraping Windows Recall Feature Again in October - 20240822
"Recall" Will Now Be Opt-In: Microsoft Changes New Windows AI Feature After Backlash - 20240610
Total Recall: Microsoft Dealing With Trust Issues - 20240609
Windows Co-Pilot "Recall" Feature Privacy Nightmare - 20240524
(Score: 5, Informative) by stormwyrm on Thursday October 03, @03:13AM (5 children)
Though I don't think any end user really wants this. While I have occasionally wished that I could remember something ephemeral that I'd accessed recently, the privacy cost is extremely high relative to the benefit, no matter what security assurances Microsoft makes that it's totally locked down. In any case there can be no way to keep this kind of data safe from Microsoft itself, and they are going to be tempted to access it sooner or later.

The modern-day successors of Cardinal Richelieu in the FBI and the world's other police forces will not pass up the chance to obtain six billion lines written by the hands of even the most innocent, by which they can hang anyone. They will certainly apply pressure to Microsoft even if the company somehow grows principles and actually takes serious measures to keep Recall's data as private as they can make it, using the old, tired pretext of CSAM.

Microsoft's track record for security has always been absolutely dismal, so any assurances they make about how secure it will be ought to be taken with the greatest scepticism. The only hope is if there is a way to provably and permanently disable this misguided anti-feature in such a way that it can never, ever be turned on. I'm not holding my breath.
Numquam ponenda est pluralitas sine necessitate.
(Score: 3, Interesting) by aafcac on Thursday October 03, @05:15AM
I don't think they'd have this much trouble if it were the equivalent of the script command, designed to just record a few inputs, like the way you can record inputs for Power Automate Desktop to use later. Or a manually recorded clip subjected to OCR for later use.
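Something along the lines of this rough sketch is what I mean (my own illustration, not anything Microsoft ships): a capture that only runs when the user asks for it, OCR'd once and kept locally for later searching. It assumes Pillow's ImageGrab and pytesseract, plus a local tesseract binary.

```python
# Minimal sketch of a user-initiated "remember this" capture, not continuous recording.
# Assumed dependencies: pillow, pytesseract, and the tesseract OCR binary.
from datetime import datetime
from pathlib import Path

from PIL import ImageGrab      # pip install pillow
import pytesseract             # pip install pytesseract

def snap_and_index(out_dir: str = "~/manual-recall") -> Path:
    """Take one screenshot on demand, OCR it, and store the image and text together."""
    out = Path(out_dir).expanduser()
    out.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")

    image = ImageGrab.grab()                   # single, explicitly user-initiated capture
    image.save(out / f"{stamp}.png")

    text = pytesseract.image_to_string(image)  # OCR the clip so it can be grepped later
    (out / f"{stamp}.txt").write_text(text, encoding="utf-8")
    return out / f"{stamp}.txt"

if __name__ == "__main__":
    print(f"OCR text saved to {snap_and_index()}")
```

Bind that to a hotkey and you get the "remember this for later" use case without anything running in the background.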
(Score: 4, Funny) by PinkyGigglebrain on Thursday October 03, @07:19AM (3 children)
Makes me wonder WHY Microsoft is pushing Recall so hard.
How do they profit/gain from this?
"Beware those who would deny you Knowledge, For in their hearts they dream themselves your Master."
(Score: 5, Touché) by mhajicek on Thursday October 03, @07:22AM (1 child)
By collecting and selling data. And you'll "agree" to it in the EULA you never read.
The spacelike surfaces of time foliations can have a cusp at the surface of discontinuity. - P. Hajicek
(Score: 3, Touché) by PiMuNu on Thursday October 03, @07:39AM
Note that folks pay $100 for a license for this malware (+ office + etc).
(Score: 5, Insightful) by stormwyrm on Thursday October 03, @07:44AM
Numquam ponenda est pluralitas sine necessitate.