This scary AI breakthrough means you can run but not hide – how AI can guess your location from a single image

A computer screen showing two images of a road being analyzed by AI
(Image credit: Stanford)

There’s no question that artificial intelligence (AI) is in the process of upending society, with ChatGPT and its rivals already changing the way we live our lives. But a new AI project has just emerged that can pinpoint where almost any photo was taken – and it has the potential to become a privacy nightmare.

The project, dubbed Predicting Image Geolocations (or PIGEON for short), was created by three students at Stanford University and was designed to identify where images from Google Street View were taken. But when fed personal photos it had never seen before, it was often able to pinpoint their locations with a high degree of accuracy.

Jay Stanley of the American Civil Liberties Union says the technology has serious privacy implications, including government surveillance, corporate tracking and stalking, according to NPR. For instance, a government could use PIGEON to find dissidents, or to see whether you have visited places it disapproves of. Or a stalker could employ it to work out where a potential victim lives. In the wrong hands, this kind of tech could wreak havoc.

Motivated by those concerns, the student creators have decided against releasing the tech to the wider world. But as Stanley points out, that might not be the end of the matter: “The fact that this was done as a student project makes you wonder what could be done by, for example, Google.”

A double-edged sword

Google Maps

(Image credit: Google)

Before we start getting the pitchforks ready, it’s worth remembering that this technology might also have a range of positive uses, if deployed responsibly. For instance, it could be used to identify places in need of roadworks or other maintenance. Or it could help you plan a holiday: where in the world could you go to see landscapes like those in your photos? There are other uses, too, from education to monitoring biodiversity.

Like many recent advances in AI, it’s a double-edged sword. Generative AI can be used to help a programmer debug code to great effect, but could also be used by a hacker to refine their malware. It could help you drum up ideas for a novel, but might assist someone who wants to cheat on their college coursework.

But anything that helps identify a person’s location in this way could be extremely problematic in terms of personal privacy – and have big ramifications for social media. As Stanley argued, it’s long been possible to remove geolocation data from photos before you upload them. Now, that might not matter anymore.
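Stripping that geolocation data is still worth doing, even if it's no longer a complete defense. As a minimal sketch (assuming the third-party Pillow imaging library is installed; `strip_metadata` is an illustrative helper, not a tool from the PIGEON project), re-saving only an image's pixel data discards its EXIF tags, including any embedded GPS coordinates:

```python
from PIL import Image  # Pillow: pip install Pillow

def strip_metadata(src_path: str, dst_path: str) -> None:
    """Re-save an image with pixel data only, dropping EXIF (including GPS) tags."""
    with Image.open(src_path) as img:
        # Copy pixels into a fresh image that carries no metadata
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))
        clean.save(dst_path)
```

A tool like PIGEON, of course, needs none of that metadata – it works from the visual content alone, which is exactly why removing EXIF data may no longer be enough.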

What’s clear is that some sort of regulation is desperately needed to prevent wider abuses, while the companies making AI tech must work to prevent damage caused by their products. Until that happens, it’s likely we’ll continue to see concerns raised over AI and its abilities.

Alex Blake
Freelance Contributor

Alex Blake has been fooling around with computers since the early 1990s, and since that time he's learned a thing or two about tech. No more than two things, though. That's all his brain can hold. As well as TechRadar, Alex writes for iMore, Digital Trends and Creative Bloq, among others. He was previously commissioning editor at MacFormat magazine. That means he mostly covers the world of Apple and its latest products, but also Windows, computer peripherals, mobile apps, and much more beyond. When not writing, you can find him hiking the English countryside and gaming on his PC.
