I finally tried the Meta AI in my Ray-Ban smart glasses thanks to an accidental UK launch – and it's by far the best AI wearable

Ray-Ban Meta smart glasses
(Image credit: Meta)

Officially, the Meta AI beta for the Ray-Ban Meta smart glasses is only available in the US and Canada, but today I spotted that it had rolled out to my Meta View app here in the UK, so I’ve been giving the feature a test run in and around London Paddington, near where our office is based.

Using the glasses’ built-in camera and an internet connection, I can ask the Meta AI questions just as I would with any other generative AI – such as ChatGPT – with the added benefit of providing an image for context by beginning a prompt with “Hey Meta, look and …” 

Long story short, the AI is – when it works – fairly handy. It wasn’t perfect, struggling at times with its camera limitations and an overload of information, but I was pleasantly surprised by its capabilities. 

Here’s a play-by-play of how my experiment went.

Hamish wearing a black pair of Wayfarer smart glasses from Ray-Ban and Meta. He's also wearing a hat and a bag in a large modern living room.

These specs are stylish, and kinda useful too (Image credit: Future)

I'm on an AI adventure

Stepping outside our office, I started my Meta AI-powered stroll through the car park and immediately asked my smart specs to do two jobs: identify the trees lining the street and then summarize a long, info-packed sign discussing the area’s parking restrictions. 

On the tree task it straight-up failed – playing a few searching beeps before returning to silence. Great start. But with the sign, the Meta AI was actually super helpful, succinctly (and accurately) explaining that to park here I’d need a permit or risk a very hefty fine, saving me a good five minutes I’d otherwise have spent deciphering it.

Following this mixed success, I continued walking towards Paddington Station. To pass the time, I asked the specs questions about London as if I were a bona fide tourist. They provided some interesting facts about Big Ben – reminding me the name refers to the bell, not the iconic clock tower – but admitted they couldn’t tell me whether King Charles III currently resides in Buckingham Palace or if I’d be able to meet him. 

Admittedly, this is a tough one to check even as a human. As far as I can tell he’s living in Clarence House, which is near Buckingham Palace, but I can’t find a definitive answer. So I’ll mark this test as void and appreciate that the AI at least told me it didn’t know instead of hallucinating (the technical term for when an AI makes things up or lies).

I also tried my initial tree test again with a different plant. This time the glasses said they believed it was a deciduous tree, though couldn’t tell me precisely what species I was gawking at.

The Meta AI's responses to various questions about trees, Big Ben and trains

Some of the questions I asked, and the responses in the Meta View app (Image credit: Future)

When I arrived at the station I gave the specs a few more Look and Ask tests. They correctly identified the station as Paddington, and in two out of three tests the Meta AI correctly used the departure board to tell me what time the next train to various destinations was leaving. In the third test it was way off: it missed both of the trains going to Penzance and told me a later time for a completely different journey that was going to Bristol.

Before heading back to the office, I popped into the station’s shop to use the feature I’ve been most desperate to try – asking the Meta AI to recommend a dinner based on the ingredients before me. Unfortunately, it seems the abundance of groceries confused the AI and it wasn’t able to provide any suggestions. I’ll have to see if it fares better with my less busy fridge at home.

When it's right, it's scary good

On my return journey, I gave the smart glasses one final test. I asked the Meta AI to help me navigate the complex Tube map outside the entrance to the London Underground, and this time it gave me the most impressive answer of the bunch.

I fired off a few questions asking the glasses to help me locate various Tube stations amongst the sprawling collection, and the AI was able to point me to the correct general area every time. After a handful of requests I finished with “Hey Meta, look and tell me where Wimbledon is on this map.”

The glasses responded by saying they couldn’t see Wimbledon (perhaps because I was standing too close for the camera to take in the whole map), but suggested it should be somewhere in the southwest area, which it was. It might not seem like a standout answer, but this level of comprehension – being able to accurately piece together an answer from incomplete data – was impressively human-like. It was as if I were talking to a local.

The complex London Underground map on a wall

The Tube is a maze of stations that's tough for tourists (Image credit: Future)

If you have a pair of Ray-Ban Meta smart glasses I’d recommend seeing if you can access the Meta AI. Those of you in the US and Canada certainly can, and those of you elsewhere might be lucky like me and find the beta is already available. The easiest way to check is to simply say “Hey Meta, look and …” and see what response you get. You can also check the Meta AI settings in the Meta View app.

A glimpse of the AI wearable future

There are a lot of terrifying realities to our impending AI future – just read this fantastic teardown of Nvidia’s latest press conference by our very own John Loeffler – but my test today highlighted how useful AI wearables can be, especially after the recent disasters suffered by some other gadgets in the space.

For me, the biggest advantage of the Ray-Ban Meta smart glasses over something like the Humane AI Pin or Rabbit R1 – two wearable AI devices that received scathing reviews from every tech critic – is that they aren’t just AI companions. They’re open-ear speakers, a wearable camera, and, at the very least, a stylish pair of shades.

I’ll be the first to tell you, though, that in all but the design department the Ray-Ban Meta specs need work. 

The open-ear audio performance can’t compare to my JBL SoundGear Sense or the Shokz OpenFit Air headphones, and the camera isn’t as crisp or easy to use as my smartphone’s. But the combination of all these features makes the Ray-Bans at least a little nifty.

At this early stage, I’m still unconvinced the Ray-Ban Meta smart glasses are something everyone should own. But if you’re desperate to get into AI wearables at this early adopter stage they’re far and away the best example I’ve seen.

Hamish Hector
Senior Staff Writer, News