
posted by hubie on Tuesday June 11, @06:45AM
from the get-your-ass-to-Redmond dept.

Windows Recall Demands an Extraordinary Level of Trust That Microsoft Hasn't Earned

Op-ed: The risks to Recall are way too high for security to be secondary:

Microsoft's Windows 11 Copilot+ PCs come with quite a few new AI and machine learning-driven features, but the tentpole is Recall. Described by Microsoft as a comprehensive record of everything you do on your PC, the feature is pitched as a way to help users remember where they've been and to provide Windows extra contextual information that can help it better understand requests from and meet the needs of individual users.

This, as many users in infosec communities on social media immediately pointed out, sounds like a potential security nightmare. That's doubly true because Microsoft says that by default, Recall's screenshots take no pains to redact sensitive information, from usernames and passwords to health care information to NSFW site visits. By default, on a PC with 256GB of storage, Recall can store a couple dozen gigabytes of data across three months of PC usage, a huge amount of personal data.
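
For a rough sense of scale, here is a back-of-envelope sketch of what "a couple dozen gigabytes" of snapshots means in practice. The ~25GB cap and per-snapshot size are assumptions drawn from the figures in this article (the 500-600KB screenshot sizes are reported further down); the per-day rate is purely illustrative.

```python
# Back-of-envelope estimate, not official Microsoft figures.
# Assumes a ~25 GB Recall allocation on a 256 GB drive and ~550 KB per
# snapshot (midpoint of the 500-600 KB sizes reported later in the article).
CAP_GB = 25
SNAPSHOT_KB = 550
DAYS = 90  # "three months of PC usage"

snapshots_total = CAP_GB * 1024 * 1024 / SNAPSHOT_KB
per_day = snapshots_total / DAYS

print(f"~{snapshots_total:,.0f} snapshots fit in {CAP_GB} GB")
print(f"~{per_day:,.0f} snapshots per day over {DAYS} days")
# Roughly 48,000 snapshots, or about 530 per day -- on the order of one
# per minute of an 8-9 hour working day.
```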

The line between "potential security nightmare" and "actual security nightmare" is at least partly about the implementation, and Microsoft has been saying things that are at least superficially reassuring. Copilot+ PCs are required to have a fast neural processing unit (NPU) so that processing can be performed locally rather than sending data to the cloud; local snapshots are protected at rest by Windows' disk encryption technologies, which are generally on by default if you've signed into a Microsoft account; neither Microsoft nor other users on the PC are supposed to be able to access any particular user's Recall snapshots; and users can exclude specific apps or (in most browsers) individual websites from Recall's snapshots.

This all sounds good in theory, but some users are beginning to use Recall now that the Windows 11 24H2 update is available in preview form, and the actual implementation has serious problems.

[...] The short version is this: In its current form, Recall takes screenshots and uses OCR to grab the information on your screen; it then writes the contents of windows, plus records of different user interactions, to a locally stored SQLite database to track your activity. Data is stored on a per-app basis, presumably to make it easier for Microsoft's app-exclusion feature to work. Beaumont says "several days" of data amounted to a database around 90KB in size. In our usage, screenshots taken by Recall on a PC with a 2560×1440 screen come in at 500KB or 600KB apiece (Recall saves screenshots at your PC's native resolution, minus the taskbar area).
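
For readers wondering what "a locally stored SQLite database" of screen activity looks like in practice, here is a minimal inspection sketch. The path and schema here are purely hypothetical placeholders, not Microsoft's documented layout, and in the revised Recall the database is encrypted and access-controlled; the point is simply that OCR text sitting in a plain SQLite file is trivially queryable by anything that can read the file.

```python
# Hypothetical sketch: poke at a Recall-style SQLite activity database.
# The path and any table/column names discovered are illustrative only.
import sqlite3
from pathlib import Path

DB_PATH = Path.home() / "example_recall.db"  # placeholder, not the real location

con = sqlite3.connect(DB_PATH)
con.row_factory = sqlite3.Row

# List every table, then dump a few rows from any column that looks like
# captured text (OCR contents, window titles, and so on).
tables = [r["name"] for r in con.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
for table in tables:
    cols = [r["name"] for r in con.execute(f"PRAGMA table_info({table})")]
    text_cols = [c for c in cols if "text" in c.lower() or "title" in c.lower()]
    if not text_cols:
        continue
    print(f"-- {table}: {', '.join(text_cols)}")
    for row in con.execute(f"SELECT {', '.join(text_cols)} FROM {table} LIMIT 5"):
        print(dict(row))
con.close()
```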

See also:

Microsoft is revamping how Recall works amid its PR nightmare
By Rich Woods

Key Takeaways

- Microsoft promised groundbreaking features with Copilot+, including Cocreator and Live Captions, but Recall has become a PR nightmare.
- Concerns about Recall being a security risk have led to backlash and panic among users due to data access vulnerabilities.
- Microsoft has been silent on Recall issues but is finally taking action to address the security concerns and ensure user control.

One of the key complaints about Recall is that it was opt-out. In the setup experience, Windows just tells you that it's on, and lets you check a box to open settings after setup is complete. Now, you'll have to choose to turn it on during the out-of-box experience, so it's totally opt-in.

Secondly, you'll have to use Windows Hello in order to turn on Recall. The idea is that in order to access it, Windows will have to know it's you.

Finally, Windows is going to use just-in-time decryption, meaning everything will be encrypted until you've been authenticated. Microsoft also confirmed that it has encrypted the search index database, which was one of the key call-outs in the report from earlier this week.

Microsoft also noted that all Copilot+ PCs are Secured-core, so they're designed to be secure. They have Microsoft Pluton security chips, so there's hardware-level protection going on there.
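
As a rough illustration of the "just-in-time decryption" idea described above, here is a minimal sketch: the search index stays encrypted at rest, and a key is only produced at the moment an authenticated user asks for the data. This is a conceptual analogy only, not Microsoft's implementation; in Recall the key release is tied to Windows Hello and TPM/Pluton-backed hardware protection rather than a passphrase, and a real design would use a proper key-derivation function.

```python
# Conceptual sketch of just-in-time decryption: data is encrypted at rest and
# only decrypted after an authentication step. Illustrative analogy, not how
# Windows Hello or Recall actually manage keys.
import base64
import hashlib
from cryptography.fernet import Fernet

def derive_key(passphrase: str) -> bytes:
    # Stand-in for "the OS releases the key once Windows Hello succeeds".
    # A real system would use a hardware-protected key, not a hashed passphrase.
    digest = hashlib.sha256(passphrase.encode()).digest()
    return base64.urlsafe_b64encode(digest)

def store_index(index_text: str, passphrase: str) -> bytes:
    """Encrypt the search index before it ever touches disk."""
    return Fernet(derive_key(passphrase)).encrypt(index_text.encode())

def open_after_auth(blob: bytes, passphrase: str) -> str:
    """Decrypt only at the moment an authenticated user requests it."""
    return Fernet(derive_key(passphrase)).decrypt(blob).decode()

blob = store_index("visited bank site; typed account number", "correct horse")
print(open_after_auth(blob, "correct horse"))  # succeeds after "authentication"
# open_after_auth(blob, "wrong pass")          # raises InvalidToken
```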
 

https://www.xda-developers.com/microsoft-recall-pr-nightmare/

And all of this makes sense, because we know that security chips can't be hacked; they are secure chips, right? /sarcasm https://www.tomsguide.com/news/billions-of-pcs-and-other-devices-vulnerable-to-newly-discovered-tpm-20-flaws

Oh yeah, the data never leaves your PC. Unless, of course, you do a backup to the cloud, right? In which case your data may be in Sri Lanka, Timbuktu, Israel, or maybe even Ireland. And police forces in third-world banana republics never get warrants for whatever might be on the server.

The best thing Microsoft can do with CoPilot is to deep-six it. Better yet, deep-six all of their "telemetry" along with CoPilot.


Original Submission #1 | Original Submission #2

Related Stories

Microsoft Details Security/Privacy Overhaul for Windows Recall Ahead of Relaunch 9 comments

https://arstechnica.com/gadgets/2024/09/microsoft-details-security-privacy-overhaul-for-windows-recall-ahead-of-relaunch/

Microsoft is having another whack at its controversial Recall feature for Copilot+ Windows PCs, after the original version crashed and burned amid scrutiny from security researchers and testers over the summer. The former version of Recall recorded screenshots and OCR text of all user activity, and stored it unencrypted on disk where it could easily be accessed by another user on the PC or an attacker with remote access.

The feature was announced in late May, without having gone through any of the public Windows Insider testing that most new Windows features get, and was scheduled to ship on new PCs by June 18; by June 13, the company had delayed it indefinitely to rearchitect it and said that it would be tested through the normal channels before it was rolled out to the public.

Today, Microsoft shared more extensive details on exactly how the security of Recall has been re-architected in a post by Microsoft VP of Enterprise and OS Security David Weston.

Previously on SoylentNews:
Microsoft Will Try the Data-Scraping Windows Recall Feature Again in October - 20240822
"Recall" Will Now Be Opt-In: Microsoft Changes New Windows AI Feature After Backlash - 20240610
Total Recall: Microsoft Dealing With Trust Issues - 20240609
Windows Co-Pilot "Recall" Feature Privacy Nightmare - 20240524


Original Submission

  • (Score: 1, Interesting) by Anonymous Coward on Tuesday June 11, @07:15AM (1 child)

    by Anonymous Coward on Tuesday June 11, @07:15AM (#1360123)
    What's the increase in power consumption for a typical PC?

    If, say, an extra 5W is nothing to some people, then maybe they won't mind running 5W worth of cryptomining. I'm sure the cryptomining stuff can be throttled.
    • (Score: 4, Insightful) by Rosco P. Coltrane on Tuesday June 11, @03:54PM

      by Rosco P. Coltrane (4757) on Tuesday June 11, @03:54PM (#1360186)

      Really, my friend, power consumption is the least of this thing's concerns.

      I don't want my PC to take screenshots of what I do continuously, and I MOST CERTAINLY don't want any Microsoft AI looking at them, even if you paid me thousands of dollars per screenshot.

  • (Score: 5, Insightful) by sigterm on Tuesday June 11, @09:03AM (10 children)

    by sigterm (849) on Tuesday June 11, @09:03AM (#1360127)

    Windows Recall Demands an Extraordinary Level of Trust That Microsoft Hasn't Earned

    Because indeed trust has to be earned, and not only have Microsoft failed to earn our trust, they have repeatedly demonstrated that they should not be trusted with any kind of sensitive information, or even basic Personally Identifiable Information (PII), as they will either:

    - bury some nefarious clause in their ToS allowing them to sell your data to any third party, or
    - just sell it anyway in blatant violation of their own ToS and applicable laws, and hope you don't notice.

    That is, if their consistently appalling security record doesn't cause the data to be leaked wholesale first.

    The best thing Microsoft can do with CoPilot is to deep-six it. Better yet, deep-six all of their "telemetry" along with CoPilot.

    But they will do neither; instead, they will gamble on the short memories of the public, and on the fact that the younger generation doesn't know much about their practices in years past.

    And governments and regulators will consistently let them get away with it, for reasons that are a complete mystery and are in no way related to money changing hands, or decision-makers and industry leaders all being part of the same big boys club.

    • (Score: 4, Interesting) by PiMuNu on Tuesday June 11, @10:40AM (9 children)

      by PiMuNu (3823) on Tuesday June 11, @10:40AM (#1360136)

      Nb: I don't understand the legal situation of M$ selling a Windows license to the organisation I work for. If (when) M$ scrapes my data, when I haven't signed any license agreement, does that make them vulnerable to e.g. GDPR sanctions? In many organisations the use of M$ products is essentially compulsory. What is the legal situation here?

      Anyone know?

      • (Score: 1) by khallow on Tuesday June 11, @11:26AM (1 child)

        by khallow (3766) Subscriber Badge on Tuesday June 11, @11:26AM (#1360141) Journal
        They have this covered by clickwrap. You probably had a popup window signifying your agreement with some Microsoft license as part of getting your MS software to work. Or maybe they just don't care.
        • (Score: 3, Informative) by mhajicek on Tuesday June 11, @05:48PM

          by mhajicek (51) on Tuesday June 11, @05:48PM (#1360195)

          No, in most companies, the IT department clicked through those. The employee gets the system in a functioning state.

          --
          The spacelike surfaces of time foliations can have a cusp at the surface of discontinuity. - P. Hajicek
      • (Score: 3, Insightful) by c0lo on Tuesday June 11, @11:26AM (6 children)

        by c0lo (156) Subscriber Badge on Tuesday June 11, @11:26AM (#1360142) Journal

        I don't understand the legal situation of M$ selling a windows license to my organisation for whom I work. If (when) M$ scrapes my data, when I haven't signed any license agreement, does that make them vulnerable to e.g. GDPR sanctions? In many organisations use of M$ products is essentially compulsory. What is the legal situation here?

        IANAL, but the following sounds obvious (common-sensical?) to me.

        Since it's your employer's computer and license, it is expected that everything you do on that computer is in the name and interests of your employer.
        This is one of the legit cases of "if you don't want anything about you personally to be tracked when using the computer, don't do anything personal on your employer's computer".

        Otherwise, whatever Microsoft scrapes (or doesn't) from the employer's computer is between Microsoft and your employer; you have no standing. If your employer doesn't like something, it's up to the IT/legal departments to take care of it.

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
        • (Score: 5, Insightful) by PiMuNu on Tuesday June 11, @12:32PM (4 children)

          by PiMuNu (3823) on Tuesday June 11, @12:32PM (#1360153)

          > This is one of the legit cases of...

          Except that as a requirement for my job, I have to deal with tax details and salary status (to interface with payroll, for example). I have to manage details of my education status, marital status and background (to interface with certain HR functions). I have to manage attendance including sickness and carer duties. This is all required to go through my PC. I can't imagine anything more personal and private than this information, which must go through my work PC as a requirement of my job - and I would guess the same is true of most organisations that are large enough to have a specific HR/finance department.

          • (Score: 5, Touché) by Gaaark on Tuesday June 11, @01:01PM (1 child)

            by Gaaark (41) on Tuesday June 11, @01:01PM (#1360160) Journal

            Don't worry: once your workplace MS servers and computers get hacked (if they haven't already), it will all be public anyways.

            When the MS EULA comes up, just click 'Agree', put your head down and cry.

            --
            --- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
            • (Score: 2) by PiMuNu on Tuesday June 11, @01:06PM

              by PiMuNu (3823) on Tuesday June 11, @01:06PM (#1360162)

              Don't worry, most of it is on Oracle DB...

          • (Score: 2) by c0lo on Tuesday June 11, @02:04PM (1 child)

            by c0lo (156) Subscriber Badge on Tuesday June 11, @02:04PM (#1360173) Journal

            I have to manage details of my education status, marital status and background (to interface with certain HR functions). I have to manage attendance including sickness and carer duties. This is all required to go through my PC. I can't imagine anything more personal and private than this information, which must go through my work PC as a requirement of my job

             I'll take a wild guess here (I already IANAL-ed): you signed a contract with your employer. If your employer is the one answerable "to e.g. GDPR sanctions" and it fails to take the necessary precautions, then your beef is w/ your employer, not w/ Microsoft.

            and I would guess the same is true of most organisations that are large enough to have a specific HR/finance department.

            Well, your/my mileage may vary.
            I'm working for a large US corp, with subsidiaries in the US/Canada/most of the EU/Australia/NZ/Japan. I'm only dealing with the Australian HR and, after zillions of interview rounds and reference checks, I was NOT asked for anything but my DoB, Tax File Number, the bank account to pay the salary into, and the superannuation fund acc# - all on dead-tree support, as was the employment contract.

            Yes, there are a good number of optional fields on the HR site, from "emergency contact number" to something that resembles the "skills and prev experience" section of a recruitment site. I never bothered and they didn't insist.

            have to manage attendance including sickness and carer duties.

             Me too, in the sense that I need to fill in the "leave request" form, in which there's no info leaking sensitive details - even the medical certificate (when it was required and I needed to attach a scanned image of it) shows nothing but the standard "certifies that c0lo was unable to work between ... and ... due to a medical condition" - I was surprised that it's exactly the same for a bout of flu as for a heart attack (having had both while working for the same employer).

            --
            https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
            • (Score: 2) by PiMuNu on Tuesday June 11, @02:59PM

              by PiMuNu (3823) on Tuesday June 11, @02:59PM (#1360179)

              Fair point, maybe I over-emphasise.

               OTOH if, for example, your favourite medical insurance/mortgage/finance firm finds out that you have been taking anything much more than a regular amount of sick leave, I can imagine insurance premiums etc. getting much more expensive. This sort of data is probably worth a few extra pennies per Microsoft license == gazillions of $ for Microsoft.

        • (Score: 2) by corey on Thursday June 20, @12:03AM

          by corey (2202) on Thursday June 20, @12:03AM (#1361105)

           In addition to what PiMuNu said, the HR people will also be running Windows with Copilot, so it'll scrape their screens while they access your personal information (and everyone else's). And when you talk to someone from your medical insurance company, or govt health cover scheme, or bank, or your lawyer in a sensitive case (hopefully not ever), as examples, they will have their screens scraped as well, since we're all running Win 11 with Copilot. We're buggered.

  • (Score: 4, Touché) by Rosco P. Coltrane on Tuesday June 11, @03:51PM (3 children)

    by Rosco P. Coltrane (4757) on Tuesday June 11, @03:51PM (#1360185)

    Since when did Microsoft have any trusting userbase?

    Microsoft has never once been a trustworthy company at any point in its history. At best, charitable users would see them as significantly incompetent. But most people know Microsoft as "that aggressive company with those psychopathic CEOs".

    • (Score: 4, Insightful) by owl on Tuesday June 11, @04:43PM (2 children)

      by owl (15206) on Tuesday June 11, @04:43PM (#1360193)

      Since when did Microsoft have any trusting userbase?

      Those of about 25 years old and below have a different view of MS than those of us who watched their iron-handedness play out throughout the '90s.

      The under-25s don't see MS as an evil corp the way we do, because they've never seen its true colors on full display. They've only seen the carefully crafted fiction that MS has fed them, and so to them MS is not one of the bad guys (even though in reality they are just as bad now as then; they just do a better job of covering it up now).

      • (Score: 3, Insightful) by r_a_trip on Wednesday June 12, @09:35AM (1 child)

        by r_a_trip (5276) on Wednesday June 12, @09:35AM (#1360248)

        That is only until they get older and wise up. Sooner rather than later they will feel Microsoft's unrelenting knife in their own backs. MS can't help themselves; it is woven into the very fabric of that company.

        File format shenanigans, driver model changes, and obsoleting perfectly functioning hardware (even fairly current equipment). Ever-changing interface shuffles. Telemetry, advertising, bundling, subscriptions. Introducing hardware and/or services and abandoning them within 2 years. And that is just the current stuff.

        The MS of old was far worse. Not that the current MS is anywhere near being acceptable as a software vendor.

        • (Score: 2) by owl on Tuesday June 18, @03:37PM

          by owl (15206) on Tuesday June 18, @03:37PM (#1360920)

          The MS of old was far worse.

          Indeed.

          Not that the current MS is anywhere near being acceptable as a software vendor.

          Agreed. But the new MS seems to be attempting the boil-the-frog ploy: trying to find out just how much the new crew will tolerate before they jump ship, instead of its prior extinguish-everything methodology.

          Whether the new ploy succeeds in the long run remains to be seen, because yes, eventually the MS knife will make its 1,000th cut and a given user will jump ship.

  • (Score: 3, Insightful) by SomeGuy on Tuesday June 11, @05:59PM (2 children)

    by SomeGuy (5632) on Tuesday June 11, @05:59PM (#1360196)

    If Microsoft were to write an application called "Trust", you could be certain it is somehow the absolute worst thing for you.

    Where to even start? DRM/TPM, Secure Boot locking everything down, advertisements slipping in everywhere, Microsoft Tttttttteeeeeaaaa[waiting... waiting... waiting]mmmmmmssssssss, updates that break things, subscriptions just to write a damn letter, making your computer more "secure" by uploading all your private stuff to OneDrive, updates that magically make your computer obsolete without notice, changing things around so you have to re-learn everything... yeah, what is there to trust anyway? Trust that they will be the most evil, demonic, corrupt corporation possible?

    • (Score: 2) by tangomargarine on Tuesday June 11, @07:04PM (1 child)

      by tangomargarine (667) on Tuesday June 11, @07:04PM (#1360203)

      You forgot to mention that time they made that wonderful "do you want to upgrade to Windows 10" popup where your options were "yes, now" and "yes but do it later when I'm not looking".

      We can argue until the cows come home about ethics in programming, but clearly it's unethical to lie to your users about what the software is doing.

      --
      "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
      • (Score: 3, Funny) by SomeGuy on Thursday June 13, @12:59AM

        by SomeGuy (5632) on Thursday June 13, @12:59AM (#1360323)

        I also forgot to mention that ball-and-chain Microsoft account crap.

        I forgot because I was too busy opening Microsoft Teams.

  • (Score: 1, Touché) by Anonymous Coward on Tuesday June 11, @07:37PM

    by Anonymous Coward on Tuesday June 11, @07:37PM (#1360205)

    Act like a slave; get treated like a slave. All whining does is make your master despise you more. Have some self respect and use a grown ups' OS.

  • (Score: 2) by cmdrklarg on Tuesday June 11, @09:50PM

    by cmdrklarg (5048) Subscriber Badge on Tuesday June 11, @09:50PM (#1360215)

    I'm forced to use Windows 11 at work, but the fact of the matter is that if I can't keep this Copilot/Recall/Bullshit off my Win 10 PC then it may just be the excuse to change my home PC to Linux. I've already dipped my toes in and found the water to be OK; I'm still on Win10 mainly due to laziness/inertia.

    Or access the Recall DB and replace all the screenshots with the old goatse pic; that would be entertaining.

    --
    The world is full of kings and queens who blind your eyes and steal your dreams.