This is correct - it was all on-device, but the security guarantees were quickly shown not to hold. Microsoft withdrew Recall, then brought it back with a newer, supposedly more secure implementation that was also demonstrated to be insecure.

Microsoft also claimed Recall wouldn't record sensitive information, but it did, to the point where some apps, like Signal, used existing Windows APIs to set DRM-style flags on their windows so that Windows wouldn't capture those windows at all.
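
The mechanism involved is the Win32 SetWindowDisplayAffinity call. A minimal sketch (assuming a Windows 10 2004+ target, which is what WDA_EXCLUDEFROMCAPTURE requires; this is illustrative, not Signal's actual code):

    #include <windows.h>

    /* Older SDK headers may not define this; the documented value is 0x11. */
    #ifndef WDA_EXCLUDEFROMCAPTURE
    #define WDA_EXCLUDEFROMCAPTURE 0x00000011
    #endif

    /* Ask Windows to leave this window out of screenshots, screen
       recording, and most capture APIs. Requires Windows 10 2004+;
       earlier builds only support WDA_MONITOR, which blacks the
       window out in captures instead of excluding it. */
    BOOL exclude_window_from_capture(HWND hwnd)
    {
        return SetWindowDisplayAffinity(hwnd, WDA_EXCLUDEFROMCAPTURE);
    }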

What Microsoft could have offered is an easy-to-implement API that application developers opt into (and users can opt out of), plus a blanket Recall-style toggle that users can apply to applications without explicit support. Applications like Firefox or Chrome could hook in to provide page content, along with far richer metadata than a screenshot can carry, while withholding that data whenever sensitive fields are on the page. The HTML itself could even define a 'secure' area that should never be indexed or captured, which would be useful in plenty of other circumstances.
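
To make that concrete, here is a purely hypothetical sketch of what such an opt-in surface could look like. None of these names or types exist in Windows today; they only illustrate the idea of an app pushing structured content and declaring do-not-capture regions:

    /* Hypothetical API, for illustration only -- nothing like this ships in Windows. */
    #include <windows.h>

    typedef struct APP_CONTENT {
        const wchar_t *title;    /* e.g. page title or document name        */
        const wchar_t *uri;      /* e.g. the page URL, when there is one    */
        const wchar_t *text;     /* indexable text the app chooses to share */
        FILETIME       captured; /* when the content was current            */
    } APP_CONTENT;

    /* The app opts in and pushes structured content, instead of the OS
       screenshotting and OCRing whatever happens to be on screen. */
    HRESULT ProvideIndexableContent(HWND window, const APP_CONTENT *content);

    /* The app marks a region (password field, private message pane, a
       'secure' area declared in the HTML, ...) that must never be
       captured or indexed, even under the blanket per-app toggle. */
    HRESULT ExcludeRegionFromIndex(HWND window, const RECT *region);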

But, as with everything AI, they don't want users to want it; they want users to use it whether they want it or not. It's the same reason they forced Copilot into everyone's Office 365 plans and then raised the price, only surfacing the cheaper Copilot-free plan if you went to cancel: they have to justify the billions they're spending, and forcing the numbers up is the only way to do that.
