Is Windows Recall bad for businesses?

Microsoft Corporate Vice President Pavan Davuluri speaks about Recall
(Image credit: Getty Images)

AI tools thrive on data – the more data you feed into an AI model, the more accurately it can respond to the inputs it's given. Until now, AI models like ChatGPT have mainly relied on publicly accessible information scraped from the internet.

This isn't ideal from a data rights perspective, of course, but it's a foundational method for training models to understand and respond to human speech. These models ingest vast amounts of text, images, and other data types to learn patterns, enabling them to respond contextually and generate coherent outputs.

However, when it comes to building highly personalized AI models, you are the best source of data – and that's the idea behind Microsoft's new Recall feature.

Recall is an AI-powered search function that allows you to look back through all of the tasks you've carried out on your computer. There's one problem, however – for Recall to work, Windows screenshots your desktop every few seconds. This has prompted pretty much every privacy advocate to point out the obvious problems. Microsoft has since made a series of changes to address the issues, but I'm going to explain why, in spite of these updates, Recall remains a fundamental privacy issue for businesses – and what you can do to mitigate it.

Windows Recall's new setup page

(Image credit: Microsoft)

Why is Recall bad for businesses? 

Simply put, Recall is a privacy nightmare. It's one of a handful of new features Microsoft is shipping with its CoPilot+ PC series, which runs on ARM architecture instead of the traditional x86 architecture. Or it was, at least.

The outrage over Recall's initial implementation led Microsoft to narrow the feature's release from all CoPilot+ devices to the select few who opt into the Windows Insider program, with a general rollout delayed until some point in the future.

To make sense of Recall, I need to dig into what changes with CoPilot+. The swap to an ARM processor offers potentially increased processing capabilities with drastically reduced power usage. That's nice, but there are two interesting issues here.

The first is that all CoPilot+ PCs will ship with a dedicated neural processing unit designed specifically to run large AI models – a sign that Microsoft is staking the future of CoPilot+ on features like Recall proving attractive to customers.

The second is the inclusion of Pluton, Microsoft's built-in security processor that can act as a Trusted Platform Module and forms the basis of its chip-to-cloud security strategy. Think of the combination of security and DRM you'd see on the Xbox: a cryptographically guaranteed, tamper-resistant secure enclave whose management keys and firmware are inaccessible from user space, and which Microsoft can update at will.

If you're into free and open source software then I'm sure alarm bells are already ringing. For now, though, keep in mind that Microsoft seems to be chasing a scheme where updates can be pushed at will with no way to downgrade.

Okay, moving on to Recall itself. It's a specific feature in the CoPilot+ ecosystem that allows an AI assistant to carry out complex search tasks on behalf of the user by capturing screenshots of the user's device. If more alarm bells are ringing – good, they should be. Recall meticulously combs through every second of computer usage, classifying patterns in your behavior, identifying which programs you've been using, and creating a search index for you to pull data from. Hence, "Recall".

It provides a natural language search function, upgrading Microsoft's traditional file-based search to include semantic search. So, you can ask Recall questions like "What program was I using 10 minutes ago?" or "Bring up the Word documents I was drafting for my lawyer." In principle, this is actually quite cool. There's a certain futuristic appeal to directly querying an AI model for information about how you're using your computer.

However, it comes at a cost. As I mentioned earlier, Recall tracks everything you're doing – it's how the AI model is built. If Recall were a completely open-source program made by a company that was entirely transparent about privacy and user control, I might be tempted to give it a pass. However, it's Microsoft.

Plenty of people have moved away from Microsoft in the past due to the invasive telemetry in Windows 10 and 11. It can't be turned off on Home editions – only reduced. Your telemetry setting is "Full" by default on Windows 10, which means that full memory crash dumps are sent to Microsoft in a diagnostic report when a program you use crashes. This also gives Microsoft permission to grab files from your computer if it can't replicate the crash.

Then there's the disastrous first look at Recall, where it quickly became apparent that there were no real safety or privacy protections in place. Hackers managed to get their hands on preview versions of Recall and dump the entire database of images, plus the search index, as long as they were logged into the same user session as the Recall user – with no encryption to stop them. This means that, so long as a hacker can gain basic access to a user session, the spyware is already waiting for them. There's no need to build a complicated chain of exploits to install monitoring software as an admin.
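To make that concrete, here's a minimal sketch of what "no encryption" meant in practice. It assumes the preview-era storage layout that early researchers reported – a plaintext SQLite index named ukg.db under %LOCALAPPDATA%\CoreAIPlatform.00\UKP\ – which is an assumption on my part and may well have changed in later builds. The point is simply that anything running in your user session could read the index with ordinary file access:

```python
# Sketch only: enumerate the tables in Recall's preview-era snapshot index.
# The path and file name below follow early third-party reports and are
# assumptions; current builds may store (and encrypt) this data differently.
import glob
import os
import sqlite3

RECALL_GLOB = os.path.expandvars(
    r"%LOCALAPPDATA%\CoreAIPlatform.00\UKP\*\ukg.db"
)

for db_path in glob.glob(RECALL_GLOB):
    # No decryption step required – ordinary user-level file access is enough
    # to open the database and see everything Recall has indexed.
    with sqlite3.connect(db_path) as db:
        tables = [row[0] for row in
                  db.execute("SELECT name FROM sqlite_master WHERE type='table'")]
        print(db_path, tables)
```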

In response to the furor, Microsoft dialed back the scope of the rollout almost immediately and unveiled new privacy tools that let you dictate which apps Recall can capture. Microsoft also said Recall would require Windows Hello, its biometrics-based login, before you could search your Recall timeline or decrypt the screenshot database. It stated that Recall will be off by default, that it can be disabled by group policy, and that admins can't view your Recall database. Recall won't capture private browsing windows, and Microsoft won't look at your database either. Promise.

Recall in action

Let's imagine a situation where you're working for a healthcare provider in America. You've been given a CoPilot+ device by your organization that's enrolled in Active Directory so you can work from home. You log into your work VPN, which gives you access to an intranet holding patient data. Needless to say, handling this data comes with hardcore security requirements.

Now, you've been trained to handle this data properly and to comply with HIPAA. You don't store anything on your device: you simply log in, comply with all the endpoint security requirements, and manage data as necessary. Once you're done, you wipe all the credentials you've used to access your internal network.

However, before you started work, you used Recall to find a picture you saved earlier – and Recall has been capturing snapshots the entire time you've been browsing confidential patient data.

With Recall, you could feasibly, and unknowingly, share protected information

You might rely on Microsoft's argument that the data is encrypted – but it doesn't matter. For a start, you can now retrieve that data when you're not supposed to be able to. More importantly, you've just shared protected health information with a business associate you don't have a business associate agreement with. So, your organization preemptively signs a BAA with Microsoft just to avoid violating HIPAA, even if it doesn't use Microsoft's cloud services. That's a timeline of several months, plus whatever it costs to negotiate the contract.

Pivoting to another scenario – a bad actor at Microsoft. Imagine you’re the target of an unwarranted law enforcement campaign. Before you say this is outlandish, remember that Microsoft has outright stated that it informs the NSA of zero-days before it patches them. Perhaps your company is even the target of industrial espionage. Microsoft could, in theory, update your device settings remotely using the Pluton chip and then begin exfiltrating screenshot data from that device without you ever being aware it was occurring.

I'm not claiming that Microsoft will suddenly begin spying on everyone on the planet, but the capability should give you pause. At the end of the day, all you have is the company's word – and the knowledge that it now controls your machine from the silicon upwards.

What can you do about Recall?

So, what can be done? Thanks to internet-wide outcry, Microsoft has provided the ability to temporarily or permanently stop Recall from operating on your computer. You can also edit the registry or deploy a group policy update to disable the Recall feature.
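If you'd rather not rely on the built-in toggle, here's a minimal sketch of the registry route, assuming the DisableAIDataAnalysis policy value Microsoft has documented for turning off Recall snapshots. In a managed fleet you'd normally push the equivalent Group Policy (Windows Components > Windows AI) or an MDM policy rather than run a script on each machine:

```python
# Sketch only: set the documented DisableAIDataAnalysis policy value so Windows
# stops saving Recall snapshots for the current user. Group Policy or MDM is
# the more appropriate tool for business deployments.
import winreg

POLICY_KEY = r"Software\Policies\Microsoft\Windows\WindowsAI"

def disable_recall_snapshots() -> None:
    # Create (or open) the policy key under HKEY_CURRENT_USER and set
    # DisableAIDataAnalysis = 1, which disables snapshot saving.
    with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, POLICY_KEY, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "DisableAIDataAnalysis", 0, winreg.REG_DWORD, 1)

if __name__ == "__main__":
    disable_recall_snapshots()
    print("Recall snapshot saving disabled for the current user.")
```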

However, it's unclear how tightly integrated this feature will be in Windows 12 and future updates. Recall requires a significant level of trust from Microsoft users – that the feature won’t become more invasive in the future – a trust that Microsoft has not necessarily earned.

What are the alternatives? You can opt for a version of Windows that Microsoft doesn't ship Recall with. However, support for those versions won't last forever, and eventually you may have to choose one that includes Recall by default. Taking back control of your privacy, whether you're an individual or a large company, means moving towards free and open-source software and adopting a privacy-centric approach.

Linux gives you complete control over your system

One immediate (but drastic) solution is to simply ditch Windows. Linux gives you complete control over your system – including what it does and doesn't do. Of course, this only applies to the machines you own. When you start considering how much trust you place in cloud-based solutions such as Gmail, you'll quickly realize how much information you're giving away to advertising companies and data brokers. If you absolutely have to use a cloud-based service to host vital parts of your business, such as email and file sharing, I'd suggest switching to a privacy-conscious provider such as Proton.

The Proton Unlimited package includes several privacy-focused tools to ensure you're not being spied on. For instance, its email solution is end-to-end encrypted, meaning only you and your intended recipients can read your emails. Proton takes this policy seriously: without a recovery method set up, resetting a forgotten password won't restore access to your existing encrypted mail. Forget your password and it's game over – much like being locked out of an encrypted hard drive.


Proton's cloud storage solution enforces the same strict security requirements, too, allowing you to host 500 GB of data for just $12.99 a month. If you want to move away from apps that harvest your metadata, Proton Calendar offers an all-in-one solution for arranging and sharing meetings, events, and dates securely.

And of course, Proton VPN helps prevent spying by ISPs, governments, hackers, and marketing agencies. Encrypting all of your internet traffic should be the first step you take towards a privacy-conscious approach. Essentially, Proton VPN is the linchpin that ensures all the other tools in Proton's ecosystem remain truly private.

The bottom line

It's easy to feel disheartened about the future of privacy, but the point I’m trying to make here is that, for the moment, your privacy remains in your hands.

With the advent of Microsoft's Pluton chip, it's clear that the company feels emboldened enough by its "my way or the highway" approach to push features that are deeply invasive and wouldn't have been palatable even a few years ago. There's a growing concern that we may be sleepwalking into a future where computing devices are owned entirely by the corporations that design them, and we're expected to be content leasing them.

Well, the only way to take a stand against that future is to consistently choose solutions that respect user privacy (and business privacy, too). If we end up in a future where Recall is always on by default and Microsoft (or whoever else) can dip into our cloud-based services at will for a sneak peek at what we're up to, it won't be because there was never any other choice out there. It'll be because we never took the other choice.

Sam Dawson
VPN and cybersecurity expert

Sam Dawson is a cybersecurity expert who has over four years of experience reviewing security-related software products. He focuses his writing on VPNs and security, previously writing for ProPrivacy before freelancing for Future PLC's brands, including TechRadar. Between running a penetration testing company and finishing a PhD focusing on speculative execution attacks at the University of Kent, he still somehow finds the time to keep an eye on how technology is impacting current affairs.