Your iPhone will soon get Apple's answer to Google Lens

(Image credit: Apple)

Google Lens has slowly become one of the most useful augmented reality apps around, so Apple has decided to build its own rival into iOS 15.

At WWDC 2021, Apple announced that 'Live Text' and 'Visual Look Up' will be coming to the iPhone's camera and Photos app as part of iOS 15. And both are direct rivals to Google Lens, which has become an increasingly powerful way to search the real world through your smartphone camera on both Android and iOS.

While we've already seen something similar on Android phones, 'Live Text' looks like it'll be a handy way for iPhone users to turn handwritten or printed text from the real world into digital text. Apple says it's based on 'deep neural networks' that use on-device processing, rather than a cloud-based approach.

The example Apple showed was notes on an office whiteboard – you'll be able to tap a new icon in the bottom-right corner of the Camera app's viewfinder, then just use Apple's usual text selection gestures (dragging your finger over the text) to copy the handwritten text into an email or the Notes app.
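If you're curious what that kind of on-device text recognition involves under the hood, here's a minimal Swift sketch using Apple's existing Vision framework (VNRecognizeTextRequest, available since iOS 13). To be clear, this isn't the Live Text API itself, which Apple hasn't detailed for developers; it simply illustrates the sort of on-device OCR Apple's frameworks can already do.

```swift
import Vision
import UIKit

// Recognize text in a UIImage entirely on-device using Apple's Vision framework.
// Note: this is the existing Vision OCR API, shown only to illustrate on-device
// text recognition for developers; it is not the Live Text feature itself.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Take the highest-confidence candidate for each detected line of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate      // favor accuracy over speed
    request.usesLanguageCorrection = true     // apply on-device language correction

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

Live Text's handwriting recognition will presumably go well beyond what this existing API can manage, but the fully on-device, no-cloud approach is the same one Apple describes.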

You'll also be able to use 'Live Text' on existing photos in your library, although these use cases look slightly less useful. Apple's example was copying the name and phone number of a restaurant from the background of an old photo, but perhaps some more interesting uses for the tech will materialize when it's out in the real world.

Apple's 'Live Text' is naturally a lot more limited than Google Lens, given the latter has been out since 2017. Right now, Apple says it only understands seven languages (English, Chinese, French, Italian, German, Spanish and Portuguese), which pales in comparison to Google Lens' ability to translate words into over 100 languages.

This means one of Google Lens' more useful tricks – live translating restaurant menus or signs when you're traveling – won't quite be matched by Apple's 'Live Text' right now. But it's a useful new trick for Apple's Camera app all the same and works across all types of photos, including screenshots and photos on the web.


(Image credit: Apple)

Searching the real world

In a similar vein, Apple's new 'Visual Look Up' is another direct challenge to some of the main features of Google Lens.

While it wasn't shown in great depth during WWDC 2021, the feature will apparently let you automatically look up information in your photos, like the breed of a dog or the type of flower you snapped.

According to Apple, this will work for books, art, nature, pets and landmarks, although exactly how exhaustive its knowledge is remains to be seen. It'll certainly be tricky for it to compete with Google on this front, given the mountains of data the search giant is able to glean from its other services.
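For a sense of what this kind of identification looks like to developers today, here's a short Swift sketch using the Vision framework's existing on-device image classifier (VNClassifyImageRequest, available since iOS 13). Again, this isn't 'Visual Look Up' itself, which Apple hasn't detailed for developers; it's just an illustration of the on-device image identification Apple already offers.

```swift
import Vision
import UIKit

// Classify the contents of a photo on-device with Apple's Vision framework.
// This uses the existing VNClassifyImageRequest API, shown only to illustrate
// on-device image identification; it is not the 'Visual Look Up' feature itself.
func classify(image: UIImage, completion: @escaping ([(label: String, confidence: Float)]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNClassifyImageRequest { request, _ in
        let observations = (request.results as? [VNClassificationObservation]) ?? []
        // Keep only reasonably confident labels (e.g. animals, plants, objects).
        let labels = observations
            .filter { $0.confidence > 0.3 }
            .map { (label: $0.identifier, confidence: $0.confidence) }
        completion(labels)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```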

But while this feature will probably be a little limited initially, it seems likely that this move is related to augmented reality and, perhaps ultimately, the rumored Apple Glasses. Automatically identifying visual information, like landmarks, is likely to be an important component of any smart glasses, so 'Visual Look Up' could be considered another important step towards some Apple face furniture.

'Visual Look Up' will apparently also work across iPhone, iPad and Mac, so it'll be baked into whatever Apple device you're using, as long as it's updated to the latest software. You can expect the full release of iOS 15 to arrive in around mid-September.

Mark Wilson
Senior news editor
