
Privacy, Security Concerns Mount Over Microsoft Recall Feature

Privacy experts are criticizing a new Microsoft feature on its recently announced Copilot Plus PCs that takes continuous screenshots of users’ activity, which could include passwords or financial account numbers, and stores those screenshots locally on their devices.

The purpose of the feature, called Recall, is to serve as a search tool to help consumers better locate content they have previously viewed on their device. But the scope of the data potentially collected through this feature is vast and could include sensitive information, from credentials on websites that don’t hide password entry to confidential workplace emails.

That level of streamlined data collection is stirring up privacy and security concerns. The Information Commissioner’s Office (ICO), an executive non-departmental public body in the UK, announced on Tuesday it was “speaking with Microsoft to understand the safeguards in place to protect user privacy.”

For Eva Galperin, director of cybersecurity with the Electronic Frontier Foundation, one concern is that the Copilot Plus PCs’ constant screenshot feature will “be a gift for domestic abusers.”

“People in abusive relationships frequently live with their abuser and have their devices in the same home as the abuser, and frequently the abuser has physical access and can log into those devices,” said Galperin. “Making it easier for the abuser to track every aspect of what someone is doing online is just not a good idea… There’s a big difference between creating a situation where someone has to go find a remote access tool and install it, or buy a keylogger and install it, and simply logging into Windows and turning on an option.”

Recall was announced at Microsoft’s Build conference as part of its new line of Windows PCs with support for AI features, which will be rolled out in June. In a description of Recall on its website, Microsoft said that users have control over what type of screenshots the feature collects and stores on their devices. It also said that Recall does not take screenshots of “certain kinds of content” like Microsoft Edge InPrivate browsing sessions.

“Recall snapshots are kept on Copilot+ PCs themselves, on the local hard disk, and are protected using data encryption on your device and (if you have Windows 11 Pro or an enterprise Windows 11 SKU) BitLocker,” according to Microsoft on its website. “Recall screenshots are only linked to a specific user profile and Recall does not share them with other users, make them available for Microsoft to view, or use them for targeting advertisements.”

Still, Microsoft acknowledged that Recall “does not perform content moderation,” meaning it won’t hide data like passwords, which could then be captured in screenshots and stored on the device. Security professionals worry that Recall could provide another vector for threat actors to steal sensitive data. For instance, attackers who have already compromised a machine with infostealer malware could gain access to a far larger trove of data by targeting Recall’s recorded screenshots.

“This is also a big concern for information security people, because even though the screenshots don’t get sent anywhere by design right now, there is the possibility that this can change,” said Galperin. “This [also] creates a tempting target for hackers and for people who make malware. If you start logging very detailed and revealing data about what a user is doing at all times, bad actors will attempt to get their hands on it. Having this available in the first place, and having it very easy to get to, is not a good idea.”

The feature was announced as Microsoft faces pressure from both government entities and customers to better build security into its products, particularly after a number of intrusions last year led to Microsoft customer data being compromised. A Cyber Safety Review Board (CSRB) report last month cited numerous failures in Microsoft’s internal security practices and cloud platform controls that allowed attackers to access a cryptographic signing key that eventually led to the compromise of more than 20 Microsoft customers.
