Title | Total Recall: Microsoft Dealing With Trust Issues
Date | Tuesday June 11 2024, @06:45AM
Author | hubie
from the get-your-ass-to-Redmond dept. |
Op-ed: The risks to Recall are way too high for security to be secondary:
Microsoft's Windows 11 Copilot+ PCs come with quite a few new AI and machine learning-driven features, but the tentpole is Recall. Described by Microsoft as a comprehensive record of everything you do on your PC, the feature is pitched as a way to help users remember where they've been and to provide Windows extra contextual information that can help it better understand requests from and meet the needs of individual users.
This, as many users in infosec communities on social media immediately pointed out, sounds like a potential security nightmare. That's doubly true because Microsoft says that by default, Recall's screenshots take no pains to redact sensitive information, from usernames and passwords to health care information to NSFW site visits. By default, on a PC with 256GB of storage, Recall can store a couple dozen gigabytes of data across three months of PC usage, a huge amount of personal data.
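For a rough sense of scale, a back-of-the-envelope calculation (treating "a couple dozen gigabytes" as roughly 25GB, and assuming the ~500-600KB-per-screenshot figure reported later in this article — both assumptions, not Microsoft's official numbers) looks like:

```python
# Back-of-the-envelope: how many snapshots fit in Recall's allocation?
# Assumed figures (not official): 25 GB allocation ("a couple dozen
# gigabytes" on a 256 GB drive) and ~550 KB per screenshot, the
# midpoint of the 500-600 KB range observed in this article.

allocation_kb = 25 * 1024 * 1024   # 25 GB expressed in KB
snapshot_kb = 550                  # assumed average screenshot size

snapshots = allocation_kb // snapshot_kb
print(f"~{snapshots:,} snapshots")  # on the order of tens of thousands
```

Tens of thousands of OCR'd screenshots of everything on screen is a lot of sensitive surface area for an attacker or a subpoena to work with.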
The line between "potential security nightmare" and "actual security nightmare" is at least partly about the implementation, and Microsoft has been saying things that are at least superficially reassuring. Copilot+ PCs are required to have a fast neural processing unit (NPU) so that processing can be performed locally rather than sending data to the cloud; local snapshots are protected at rest by Windows' disk encryption technologies, which are generally on by default if you've signed into a Microsoft account; neither Microsoft nor other users on the PC are supposed to be able to access any particular user's Recall snapshots; and users can choose to exclude apps or (in most browsers) individual websites from Recall's snapshots.
This all sounds good in theory, but some users are beginning to use Recall now that the Windows 11 24H2 update is available in preview form, and the actual implementation has serious problems.
[...] The short version is this: In its current form, Recall takes screenshots and uses OCR to grab the information on your screen; it then writes the contents of windows plus records of different user interactions in a locally stored SQLite database to track your activity. Data is stored on a per-app basis, presumably to make it easier for Microsoft's app-exclusion feature to work. Beaumont says "several days" of data amounted to a database around 90KB in size. In our usage, screenshots taken by Recall on a PC with a 2560×1440 screen come in at 500KB or 600KB apiece (Recall saves screenshots at your PC's native resolution, minus the taskbar area).
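The pipeline described above can be sketched in miniature. Everything here — the table layout, column names, and the stand-in OCR text — is a hypothetical illustration of the flow (screenshot → OCR → per-app rows in a local SQLite database), not Recall's actual schema:

```python
import sqlite3

# Minimal sketch of a Recall-style activity store: OCR'd window text
# and interaction records land in a local SQLite database, keyed by
# app so that per-app exclusion is a simple filter. All table and
# column names here are invented for illustration.

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE window_capture (
        id INTEGER PRIMARY KEY,
        app TEXT NOT NULL,          -- per-app bucketing
        captured_at TEXT NOT NULL,  -- ISO-8601 timestamp
        ocr_text TEXT               -- text OCR'd from the screenshot
    )
""")

def record_snapshot(app, captured_at, ocr_text, excluded_apps=()):
    """Store one OCR'd snapshot, honoring an app-exclusion list."""
    if app in excluded_apps:
        return  # excluded apps are never written to the database
    conn.execute(
        "INSERT INTO window_capture (app, captured_at, ocr_text) VALUES (?, ?, ?)",
        (app, captured_at, ocr_text),
    )

record_snapshot("browser", "2024-06-11T06:45:00", "login page: username field")
record_snapshot("bank_app", "2024-06-11T06:46:00", "balance: ...",
                excluded_apps={"bank_app"})

# Recall-style search: a text lookup over everything ever captured
rows = conn.execute(
    "SELECT app, ocr_text FROM window_capture WHERE ocr_text LIKE ?",
    ("%login%",),
).fetchall()
print(rows)
```

The security concern falls out of the design: once all on-screen text is flattened into one searchable local database, anything that can read that file — malware, another local user, or forensic tooling — can search it too.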
See also:
Microsoft is revamping how Recall works amid its PR nightmare
By Rich Woods

Key Takeaways
-Microsoft promised groundbreaking features with Copilot+, including Cocreator and Live Captions, but Recall has become a PR nightmare.
-Concerns about Recall being a security risk have led to backlash and panic among users due to data access vulnerabilities.
-Microsoft has been silent on Recall issues but is finally taking action to address the security concerns and ensure user control.
One of the key complaints about Recall is that it was opt-out. In the setup experience, Windows just tells you that it's on, and lets you check a box to open settings after setup is complete. Now, you'll have to choose to turn it on during the out-of-box experience, so it's totally opt-in.
Secondly, you'll have to use Windows Hello in order to turn on Recall. The idea is that in order to access it, Windows will have to know it's you.
Finally, Windows is going to use just-in-time decryption, meaning everything will be encrypted until you've been authenticated. Microsoft also confirmed that it has encrypted the search index database, which was one of the key call-outs in the report from earlier this week.

Microsoft also noted that all Copilot+ PCs are Secured-core, so they're designed to be secure. They have Microsoft Pluton security chips, so there's hardware-level protection going on there.
https://www.xda-developers.com/microsoft-recall-pr-nightmare/
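The "just-in-time decryption" change can be sketched as a gating pattern: snapshots sit on disk only as ciphertext, and plaintext is produced only after the user authenticates. The XOR keystream below is a deliberately toy stand-in for the TPM/Windows Hello-backed crypto a real implementation would use, and every name here is invented:

```python
import hashlib
from itertools import count

def keystream(key: bytes, n: int) -> bytes:
    """SHA-256 in counter mode as a toy keystream -- illustration only."""
    out = bytearray()
    for block in count():
        out += hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        if len(out) >= n:
            return bytes(out[:n])

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR with the keystream; applying it twice round-trips the data."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

class SnapshotStore:
    """Holds ciphertext only; decrypts just-in-time after authentication."""

    def __init__(self, key: bytes):
        self._key = key
        self._blobs = []  # ciphertext only; plaintext is never at rest

    def add(self, plaintext: bytes):
        self._blobs.append(xor_cipher(self._key, plaintext))

    def read_all(self, authenticated: bool):
        if not authenticated:  # the Windows Hello gate, in spirit
            raise PermissionError("authenticate first")
        return [xor_cipher(self._key, b) for b in self._blobs]

store = SnapshotStore(key=b"device-bound-secret")
store.add(b"ocr text: username field visible")
try:
    store.read_all(authenticated=False)
except PermissionError:
    print("locked until authenticated")
print(store.read_all(authenticated=True)[0])
```

The value of the change depends entirely on where the key lives: if it is derived from hardware-bound secrets and released only on successful authentication, the gate means something; if the key is recoverable from the same disk, it is theater.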
And, all of this makes sense because we know that security chips can't be hacked, because they are secure chips, right? /sarcasm https://www.tomsguide.com/news/billions-of-pcs-and-other-devices-vulnerable-to-newly-discovered-tpm-20-flaws
Oh yeah, the data never leaves your PC. Unless, of course, you do a backup to the cloud, right? In which case your data may be in Sri Lanka, Timbuktu, Israel, or, maybe even Ireland. And, police forces in third world banana republics never get warrants for whatever might be on the server.
The best thing Microsoft can do with Copilot is to deep-six it. Better yet, deep-six all of their "telemetry" along with Copilot.
Original Submission #1 Original Submission #2
printed from SoylentNews, Total Recall: Microsoft Dealing With Trust Issues on 2025-05-21 17:52:35