Microsoft is having another whack at its controversial Recall feature for Copilot+ Windows PCs, after the original version crashed and burned amid scrutiny from security researchers and testers over the summer. That version of Recall recorded screenshots and OCR'd text of all user activity and stored it unencrypted on disk, where it could easily be accessed by another user on the PC or by an attacker with remote access.
The feature was announced in late May, without having gone through any of the public Windows Insider testing that most new Windows features get, and was scheduled to ship on new PCs by June 18; by June 13, the company had delayed it indefinitely to rearchitect it and said that it would be tested through the normal channels before it was rolled out to the public.
Today, Microsoft shared more extensive details on exactly how the security of Recall has been re-architected in a post by Microsoft VP of Enterprise and OS Security David Weston.
Previously on SoylentNews:
Microsoft Will Try the Data-Scraping Windows Recall Feature Again in October - 20240822
"Recall" Will Now Be Opt-In: Microsoft Changes New Windows AI Feature After Backlash - 20240610
Total Recall: Microsoft Dealing With Trust Issues - 20240609
Windows Co-Pilot "Recall" Feature Privacy Nightmare - 20240524
Related Stories
As reported by https://www.msn.com/en-us/news/technology/windows-recall-sounds-like-a-privacy-nightmare-heres-why-im-worried/ar-BB1mNGFI , Microsoft is introducing a new "feature" in Windows 11:
If you haven't read about it yet, Recall is an AI feature coming to Windows 11 Copilot+ PCs. It's designed to let you go back in time on your computer by "taking images of your active screen every few seconds" and analyzing them with AI, according to Microsoft's Recall FAQs. If anyone other than you gets access to that Recall data, it could be disastrous.
...
On the surface, this sounds like a cool feature, but that paranoid privacy purist in the back of my mind is burying his face in a pillow and screaming. Imagine if almost everything you had done for the past three months was recorded for anyone with access to your computer to see. Well, if you use Recall, you won't have to imagine.
That might seem like an overreaction, but let me explain: Recall takes screenshots every few seconds and stores them on your device. Even with encryption in the mix, that's an enormous amount of visual data showing almost everything you've been doing on your computer during that period.
...
But that's just the tip of the iceberg. Microsoft openly admits that Recall will be taking screenshots of your passwords and private data:
"Note that Recall does not perform content moderation. It will not hide information such as passwords or financial account numbers. That data may be in snapshots that are stored on your device, especially when sites do not follow standard internet protocols like cloaking password entry."
...
Arguably, the worst part about this is that it will be on by default once you activate your device. Microsoft states it plainly: "On by default."
A user going by the name of "Alex von Kitchen" summarised the issues quite well: https://aus.social/@Dangerous_beans/112477798730314983
Windows Recall Demands an Extraordinary Level of Trust That Microsoft Hasn't Earned
Op-ed: The risks to Recall are way too high for security to be secondary:
Microsoft's Windows 11 Copilot+ PCs come with quite a few new AI and machine learning-driven features, but the tentpole is Recall. Described by Microsoft as a comprehensive record of everything you do on your PC, the feature is pitched as a way to help users remember where they've been and to provide Windows extra contextual information that can help it better understand requests from and meet the needs of individual users.
This, as many users in infosec communities on social media immediately pointed out, sounds like a potential security nightmare. That's doubly true because Microsoft says that by default, Recall's screenshots take no pains to redact sensitive information, from usernames and passwords to health care information to NSFW site visits. By default, on a PC with 256GB of storage, Recall can store a couple dozen gigabytes of data across three months of PC usage, a huge amount of personal data.
The line between "potential security nightmare" and "actual security nightmare" is at least partly about the implementation, and Microsoft has been saying things that are at least superficially reassuring. Copilot+ PCs are required to have a fast neural processing unit (NPU) so that processing can be performed locally rather than sending data to the cloud; local snapshots are protected at rest by Windows' disk encryption technologies, which are generally on by default if you've signed into a Microsoft account; neither Microsoft nor other users on the PC are supposed to be able to access any particular user's Recall snapshots; and users can choose apps or (in most browsers) individual websites to exclude from Recall's snapshots.
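To make the exclusion mechanism concrete, here is a minimal sketch of how a capture service might consult a user-maintained exclusion list before taking a snapshot. The list contents, function names, and URL handling are assumptions for illustration only, not Recall's actual implementation.

```python
# Hypothetical sketch: skip snapshots when the foreground app or site
# is on a user-maintained exclusion list. Names and formats are
# illustrative assumptions, not Recall's real configuration.

EXCLUDED_APPS = {"keepass.exe", "signal.exe"}
EXCLUDED_SITES = {"mybank.example", "healthportal.example"}

def should_capture(foreground_app: str, active_url: str | None) -> bool:
    """Return False if the current window matches an exclusion rule."""
    if foreground_app.lower() in EXCLUDED_APPS:
        return False
    if active_url is not None:
        host = active_url.split("//")[-1].split("/")[0].lower()
        if any(host == site or host.endswith("." + site) for site in EXCLUDED_SITES):
            return False
    return True

# Example: a browser tab on an excluded site is never snapshotted.
assert should_capture("chrome.exe", "https://mybank.example/login") is False
assert should_capture("notepad.exe", None) is True
```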
This all sounds good in theory, but some users are beginning to use Recall now that the Windows 11 24H2 update is available in preview form, and the actual implementation has serious problems.
[...] The short version is this: In its current form, Recall takes screenshots and uses OCR to grab the information on your screen; it then writes the contents of windows plus records of different user interactions in a locally stored SQLite database to track your activity. Data is stored on a per-app basis, presumably to make it easier for Microsoft's app-exclusion feature to work. Beaumont says "several days" of data amounted to a database around 90KB in size. In our usage, screenshots taken by Recall on a PC with a 2560×1440 screen come in at 500KB or 600KB apiece (Recall saves screenshots at your PC's native resolution, minus the taskbar area).
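For a rough picture of what a per-app, OCR-backed activity store looks like, here is a small sketch using Python's built-in sqlite3 module. The schema is invented for this example and is not Recall's actual table layout, but it shows why anyone who can open the file can search months of activity with a single query.

```python
import sqlite3

# Illustrative schema only: table and column names are invented for this
# sketch and are not Recall's actual on-disk layout.
db = sqlite3.connect("activity_demo.db")
db.executescript("""
    CREATE TABLE IF NOT EXISTS capture (
        id INTEGER PRIMARY KEY,
        app_name TEXT,          -- per-app bucketing, as described above
        taken_at TEXT,          -- timestamp of the snapshot
        ocr_text TEXT,          -- text pulled from the screenshot via OCR
        screenshot_path TEXT    -- the saved image itself
    );
""")
db.execute(
    "INSERT INTO capture (app_name, taken_at, ocr_text, screenshot_path) VALUES (?, ?, ?, ?)",
    ("browser", "2024-10-03T02:55:00", "order confirmation #12345, shipping address ...", "shots/0001.png"),
)
db.commit()

# Anyone who can open the file can run a free-text search over months of activity.
for app_name, taken_at, text in db.execute(
    "SELECT app_name, taken_at, ocr_text FROM capture WHERE ocr_text LIKE ?", ("%order%",)
):
    print(app_name, taken_at, text)
```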
Arthur T Knackerbracket has processed the following story:
After weeks of being excoriated by cybersecurity experts, Microsoft is making moves to address concerns over its new AI-powered computer history-saving feature: Copilot+ Recall.
Most notably, Microsoft is switching Recall from a default feature to one that requires a user to opt-in first. The company is making the change before Recall officially rolls out on June 18.
"We are updating the set-up experience of Copilot+ PCs to give people a clearer choice to opt-in to saving snapshots using Recall," wrote Microsoft Windows VP Pavan Davuluri in an official company update on the feature. "If you don’t proactively choose to turn it on, it will be off by default."
Last month, Microsoft announced a series of new AI-powered features coming to Windows. One central feature that the company announced was Recall.
Recall takes constant screenshots in the background while a user uses a device. Microsoft's AI then scans the screenshots and makes a searchable archive of all the activity history that a user performed. Which websites were visited, what a user typed into forms – nearly everything is saved.
Cybersecurity experts were immediately concerned. A prominent former Microsoft threat analyst who had hands-on experience using Recall called the feature a "disaster."
It turns out, Recall really does save pretty much everything, including plaintext passwords, sensitive financial information, private Google Chrome browser history, and more. And Recall saves it all inside a database that can be easily accessed by a bad actor who gains remote control of a user's device.
Making things even worse, Recall was going to be a feature turned on by default, meaning users might not have even been aware of what was going on in the background of their device.
Thankfully, users will now have to opt-in to the feature, fully aware of what they are turning on and what Recall does.
Microsoft will begin sending a revised version of its controversial Recall feature to Windows Insider PCs beginning in October, according to an update published today to the company's original blog post about the Recall controversy. The company didn't elaborate further on specific changes it's making to Recall beyond what it already announced in June.
For those unfamiliar, Recall is a Windows service that runs in the background on compatible PCs, continuously taking screenshots of user activity, scanning those screenshots with optical character recognition (OCR), and saving the OCR text and the screenshots to a giant searchable database on your PC. The goal, according to Microsoft, is to help users retrace their steps and dig up information about things they had used their PCs to find or do in the past.
The problem was that other users on the same PC, or attackers with physical or remote access to your PC, could easily access, view, and export those screenshots and the OCR database since none of the information was encrypted at rest or protected in any substantive way.
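To illustrate the pipeline being described, the sketch below grabs the screen, runs OCR on it, and appends the result to a plain SQLite file every few seconds. It leans on the third-party mss and pytesseract packages as stand-ins for Microsoft's capture and OCR components, and it stores everything unencrypted, which is exactly the property the original Recall was criticized for.

```python
import sqlite3
import time

import mss          # third-party: pip install mss
import mss.tools
import pytesseract  # third-party: pip install pytesseract (needs the tesseract binary)

DB_PATH = "recall_demo.db"   # hypothetical path, chosen for this sketch

db = sqlite3.connect(DB_PATH)
db.execute("CREATE TABLE IF NOT EXISTS capture (taken_at TEXT, ocr_text TEXT, screenshot_path TEXT)")

with mss.mss() as screen:
    for i in range(3):                       # a real service would loop indefinitely
        shot = screen.grab(screen.monitors[1])
        path = f"shot_{i:04d}.png"
        mss.tools.to_png(shot.rgb, shot.size, output=path)

        text = pytesseract.image_to_string(path)   # OCR the screenshot

        # Everything lands in a plaintext database: no encryption, no redaction.
        db.execute(
            "INSERT INTO capture VALUES (?, ?, ?)",
            (time.strftime("%Y-%m-%dT%H:%M:%S"), text, path),
        )
        db.commit()
        time.sleep(5)                        # "every few seconds"
```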
Microsoft had planned to launch Recall as one of the flagship features of its Copilot+ PC launch in June, along with the new Qualcomm Snapdragon-powered Surface devices, but its rollout was bumped back and then paused entirely so that Recall could be reworked and sent out to Windows Insiders for testing, as most other Windows features are.
Among the changes Microsoft has said it will make: The database will be encrypted at rest and will require authentication (and periodic reauthentication) with Windows Hello before users will be allowed to access it. The feature will also be off by default, whereas the original plan was to turn it on by default and make users go into Settings to turn it off.
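As a loose analogy for "encrypted at rest and unlocked only after authentication," here is a sketch built on the third-party cryptography package. The Windows Hello check is a placeholder, and nothing here reflects Microsoft's actual key management; it just shows the gating pattern being described.

```python
from cryptography.fernet import Fernet  # third-party: pip install cryptography

def windows_hello_verified() -> bool:
    """Placeholder for a real biometric/PIN prompt; always denies in this demo."""
    return False

# The snapshot database is kept encrypted on disk with a per-user key...
key = Fernet.generate_key()   # in reality the key would live in hardware-backed storage
vault = Fernet(key)
ciphertext = vault.encrypt(b"ocr text and screenshot index ...")

# ...and is only decrypted after a fresh authentication check.
def read_database(blob: bytes) -> bytes:
    if not windows_hello_verified():
        raise PermissionError("re-authentication required before decrypting snapshots")
    return vault.decrypt(blob)

try:
    read_database(ciphertext)
except PermissionError as err:
    print(err)   # access denied without a successful Windows Hello prompt
```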
"Security continues to be our top priority and when Recall is available for Windows Insiders in October we will publish a blog with more details," reads today's update to Microsoft Windows and Devices Corporate Vice President Pavan Davuluri's blog post.
When the preview is released, Windows Insiders who want to test the Recall preview will need to do it on a PC that meets Microsoft's Copilot+ system requirements. Those include a processor with a neural processing unit (NPU) capable of at least 40 trillion operations per second (TOPS), 16GB of RAM, and 256GB of storage. The x86 builds of Windows for Intel and AMD processors don't currently support any Copilot+ features regardless of whether the PC meets those requirements, but that should change later this year.
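For reference, those minimums are easy to express as a simple check. The thresholds below come straight from the article; the example specs and function name are made up for illustration.

```python
from dataclasses import dataclass

# Copilot+ minimums quoted above; the PC specs below are invented examples.
MIN_NPU_TOPS = 40
MIN_RAM_GB = 16
MIN_STORAGE_GB = 256

@dataclass
class PcSpec:
    npu_tops: float
    ram_gb: int
    storage_gb: int

def meets_copilot_plus_minimums(pc: PcSpec) -> bool:
    return (pc.npu_tops >= MIN_NPU_TOPS
            and pc.ram_gb >= MIN_RAM_GB
            and pc.storage_gb >= MIN_STORAGE_GB)

print(meets_copilot_plus_minimums(PcSpec(npu_tops=45, ram_gb=16, storage_gb=512)))  # True
print(meets_copilot_plus_minimums(PcSpec(npu_tops=11, ram_gb=16, storage_gb=256)))  # False: NPU too slow
```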
(Score: 5, Informative) by Runaway1956 on Thursday October 03, @02:55AM
The concept is flawed to start with. Historically, software that records your activity and/or takes screenshots was called "malware". The fact that Windows will just automatically record your activities, and cache all that data, only presents a huge vulnerability for hackers/crackers/law enforcement to exploit. Recall may be justified on Enterprise machines, since the corporation owns the machines. It most certainly is not justified on home or private machines. The individual who consents to this spyware is a fool who doesn't understand the issues. And, yes, that applies to parents consenting to Microsoft spying on their children.
“I have become friends with many school shooters” - Tampon Tim Walz
(Score: 5, Informative) by stormwyrm on Thursday October 03, @03:13AM (5 children)
I don't think any end user really wants this, though. While I have occasionally wished I could remember something ephemeral that I'd accessed recently, the privacy risk is extremely high for the benefit, no matter what assurances Microsoft makes that it's totally locked down. In any case there can be no way to keep this kind of data safe from Microsoft itself, and they are going to be tempted to access it sooner or later. The modern-day successors of Cardinal Richelieu in the FBI and the world's other police forces will not pass up the chance to obtain six billion lines written by the hands of even the most innocent, by which they can hang anyone. They will certainly apply pressure to Microsoft, using the old, tired pretext of CSAM, even if Microsoft somehow grows principles and actually takes serious measures to keep Recall's data as private as it can. Microsoft's track record for security has always been absolutely dismal, so any assurances it makes about how secure Recall will be ought to be taken with the greatest scepticism. The only hope is if there is a way to provably and permanently disable this misguided anti-feature in such a way that it can never, ever be turned on. I'm not holding my breath.
Numquam ponenda est pluralitas sine necessitate.
(Score: 3, Interesting) by aafcac on Thursday October 03, @05:15AM
I don't think they'd have this much trouble if it were the equivalent of the script command, designed to record just a few inputs for later use the way Power Automate Desktop does, or a manually recorded clip subjected to OCR for later use.
(Score: 4, Funny) by PinkyGigglebrain on Thursday October 03, @07:19AM (3 children)
Makes me wonder WHY Microsoft is pushing Recall so hard.
How do they profit/gain from this?
"Beware those who would deny you Knowledge, For in their hearts they dream themselves your Master."
(Score: 5, Touché) by mhajicek on Thursday October 03, @07:22AM (1 child)
By collecting and selling data. And you'll "agree" to it in the EULA you never read.
The spacelike surfaces of time foliations can have a cusp at the surface of discontinuity. - P. Hajicek
(Score: 3, Touché) by PiMuNu on Thursday October 03, @07:39AM
Note that folks pay $100 for a license for this malware (+ office + etc).
(Score: 5, Insightful) by stormwyrm on Thursday October 03, @07:44AM
Numquam ponenda est pluralitas sine necessitate.
(Score: 4, Insightful) by Tokolosh on Thursday October 03, @02:06PM (1 child)
It used to be that Windows was easy to install and set up. Linux desktop was difficult and complicated.
No more.
(Score: 2, Insightful) by Anonymous Coward on Friday October 04, @12:10AM