When Microsoft named its new Windows feature Recall, the company intended the word to refer to a kind of perfect, AI-enabled memory for your device. Today, the other, unintended definition of “recall”—a company's admission that a product is too dangerous or defective to be left on the market in its current form—seems more appropriate.
On Friday, Microsoft announced dramatic changes to the rollout of its Recall feature: it will now be opt-in on the Copilot+ compatible versions of Windows where it had previously been turned on by default, and new security measures are designed to keep Recall's stored data encrypted and to require authentication before anyone can access it.
“We are updating the set-up experience of Copilot+ PCs to give people a clearer choice to opt-in to saving snapshots using Recall,” reads a blog post from Pavan Davuluri, Microsoft's corporate vice president for Windows and devices. “If you don’t proactively choose to turn it on, it will be off by default.”
The changes come amid a mounting barrage of criticism from the security and privacy community, which has described Recall—which silently stores a screenshot of the user's activity every five seconds as fodder for AI analysis—as a gift to hackers: essentially unrequested, preinstalled spyware built into new Windows computers.
In the preview versions of Recall, that screenshot data, complete with the user's every bank login, password, and porn site visit, would have been collected indefinitely on the user's machine by default. And though that highly sensitive data is stored locally rather than uploaded to the cloud, cybersecurity experts have warned that it all remains accessible to any hacker who so much as gains a temporary foothold on a Recall-enabled device, giving them a long-term panopticon view of the victim's digital life.
"It makes your security very fragile,” as Dave Aitel, a former NSA hacker and founder of security firm Immunity, described it—more charitably than some others—to WIRED earlier this week. “Anyone who penetrates your computer for even a second can get your whole history. Which is not something people want.”
In addition to making Recall an opt-in feature, Microsoft’s Davuluri also writes that the company will make changes to better safeguard the data Recall collects and more closely police who can turn it on, requiring users to prove their identity via Windows Hello authentication, which can mean a PIN or a biometric check of the user’s face or fingerprint, any time they either enable Recall or access its data. Davuluri says Recall’s data will remain encrypted in storage until the user authenticates.
All of that is a “great improvement,” says Jake Williams, another former NSA hacker who now serves as VP of R&D at the cybersecurity consultancy Hunter Strategy, where he says he's been asked by some of the firm's clients to test Recall's security before they add Microsoft devices that use it to their networks. But Williams still sees serious risks in Recall, even in its latest form.
Many users will turn on Recall, he points out, partly due to Microsoft’s high-profile marketing of the feature. And when they do, they’ll still face plenty of unresolved privacy problems, from domestic abusers who often demand that partners give up their PINs, to subpoenas or lawsuits that could compel users to turn over their historical data. “Satya Nadella has been out there talking about how this is a game changer and the solution to all problems,” Williams says, referring to Microsoft's CEO. “If customers turn it on, there’s still a huge threat of legal discovery. I can’t imagine a corporate legal team that’s ready to accept the risk of all of a user’s actions being turned over in discovery.”
For Microsoft, the Recall rollback comes in the midst of an embarrassing string of cybersecurity incidents and breaches—including a leak of terabytes of its customers' data and a shocking penetration of government email accounts enabled by a cascading series of Microsoft security slipups—that have grown problematic enough to become a point of friction in its uniquely close relationship with the US government.
Those scandals have escalated to the degree that Microsoft's Nadella issued a memo just last month declaring that Microsoft would make security its first priority in any business decision. “If you’re faced with the trade-off between security and another priority, your answer is clear: Do security,” Nadella's memo read (emphasis his). “In some cases, this will mean prioritizing security above other things we do, such as releasing new features or providing ongoing support for legacy systems.”
By all appearances, Microsoft's rollout of Recall—even after today's announcement—displays the opposite approach, and one that seems more in line with business as usual in Redmond: Announce a feature, get pummeled for its glaring security failures, then belatedly scramble to control the damage.