Google is expanding tools to remove personal data from search results
Recent upgrades target more forms of personal data and non-consensual AI-generated images
by Skye Jacobs · TechSpot
What just happened? For years, Google's approach to personal privacy has relied largely on user requests and manual takedowns. The company is now expanding that process with new automation. An update to its privacy tools makes it easier for people to identify and remove their personal information – or synthetic images of themselves – from Google Search.
The tools use Google's large-scale indexing systems to spot sensitive data, but they involve a key trade-off: users must share partial personal information so Google can locate full matches online. Once that data is entered, Google's scanners run periodic searches and alert users when results containing their details appear.
The biggest technical advancement lies in the "Results About You" tool. Google's update now allows it to detect ID numbers – including driver's licenses, passports, and Social Security numbers – wherever they appear on indexed web pages.
Users can access the tool through their Google account settings and opt in by manually entering their ID details. Google requests a full driver's license number but only the last four digits of passport and Social Security numbers – enough for its detection algorithms to identify exposed data across the web.
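Google has not published how its detection works, but the idea of matching a registered partial identifier against indexed text can be illustrated with a minimal sketch. Everything below is hypothetical: the function name, the SSN-style pattern, and the matching rule are illustrative assumptions, not Google's actual algorithm.

```python
import re

# Hypothetical sketch only: flag page text that appears to contain a full
# 9-digit, SSN-style number ending in the user's registered last four digits.
def flag_exposed_ssn(page_text: str, last_four: str) -> bool:
    """Return True if page_text contains an SSN-like pattern whose
    final four digits match the registered last_four."""
    # Match common layouts: 123-45-6789, 123 45 6789, or 123456789
    pattern = re.compile(r"\b(\d{3})[- ]?(\d{2})[- ]?(\d{4})\b")
    for match in pattern.finditer(page_text):
        if match.group(3) == last_four:
            return True
    return False

print(flag_exposed_ssn("Record: SSN 123-45-6789 on file", "6789"))  # True
print(flag_exposed_ssn("Order #6789 shipped today", "6789"))        # False
```

The second call returns False because a bare four-digit number never forms a full nine-digit pattern, which is why collecting only the last four digits can still anchor a match without requiring users to hand over the complete number.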
While the service cannot remove content from the websites hosting it, it can delist those links from Google Search once a removal request is approved. Because search visibility often determines what gets discovered, delisting provides a practical safeguard for most users concerned about identity theft or doxxing.
Alongside that, Google has reworked its process for handling non-consensual explicit imagery. As AI-generated deepfakes continue to proliferate, this tool is designed to cover both real and synthetic sexual content. It can now process requests faster and supports submitting multiple images in a single batch.
Access has also become simpler. From the three-dot menu next to any image result, users can select "Remove result" and indicate that the image shows sexual content of themselves.
Google then asks whether the image is genuine or AI-generated. The company says the update should shorten reporting times and help users manage mass video or image harassment campaigns more efficiently.
Google's system now supports proactive monitoring as well. After someone initiates a removal request, they can choose to have Google automatically scan for similar content in future search crawls. This applies not only to explicit imagery but also to identifying data. The system can send automated alerts and block new results matching registered personal details before users even file another request.
Despite these improvements, both the "Results About You" and explicit-imagery removal tools have inherent limits. They only affect Google Search visibility; the original material remains online unless the hosting site removes it. Still, for most users, disappearing from the world's most-used search engine is nearly equivalent to vanishing from the public web.
The ID-number protection update is already live. Faster image reporting and proactive scanning features will roll out in most countries over the coming days, with broader coverage planned later this year.