6 ways Android phones could use the iPhone 12 Pro’s LiDAR scanner tech
How could Android manufacturers use the tech?
The iPhone 12 Pro has lots of exciting features. We’ve all heard of 5G connectivity, even if it isn’t available everywhere yet, and Apple fans will be familiar with MagSafe too. But what is a LiDAR scanner?
Short for ‘Light Detection and Ranging’, this sensor bounces pulses of infrared light off surfaces and times how long they take to return, building 3D depth maps from those measurements. Like so much futuristic tech, it was developed by the US military in the 1960s. Now Apple is pitching it as the Next Big Thing for mobile by including it in the iPhone 12 Pro and iPhone 12 Pro Max, as well as some iPad Pro models.
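The underlying arithmetic is surprisingly simple. As a minimal sketch (real sensors deal in picosecond timing and phase measurement, which this glosses over), the distance to a surface is just the speed of light multiplied by the pulse’s round-trip time, halved:

```kotlin
// Toy time-of-flight ranging: convert a pulse's round-trip time to distance.
const val SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

fun distanceMeters(roundTripSeconds: Double): Double =
    SPEED_OF_LIGHT_M_PER_S * roundTripSeconds / 2.0 // halved: out and back

fun main() {
    // A surface about 3m away returns a pulse in roughly 20 nanoseconds.
    println("%.2f m".format(distanceMeters(20.0e-9))) // prints ≈ 3.00 m
}
```

Fire thousands of these pulses in a grid and you get a depth map of the whole scene, which is exactly what AR and photography apps want.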
This ignores the fact that Samsung, LG and Huawei already use similar sensors. Admittedly, their time-of-flight (ToF) sensors shine a single beam, while Apple’s LiDAR scanner fires a grid of pulses to build up a more detailed picture. But Google has been experimenting with this exact tech for over a decade.
LiDAR has helped the tech giant’s self-driving cars navigate since 2009, while its 2014 Project Tango was an early attempt to bring augmented reality (AR) to phones. Tango led to two Android phones launching with depth-sensing systems: the Lenovo Phab 2 Pro and the Asus ZenFone AR.
Google ultimately abandoned Tango in favor of ARCore’s computer vision, though, as software alone could do the same job without specialist hardware.
But if Apple generates as much hype around LiDAR as it hopes to, Android manufacturers won’t want to be left behind. They may pressure Google to develop greater Android support for the sensor. And maybe they should, because Google could do even more with it. Here are six ways Android manufacturers could use this technology.
1. Out of this world AR
Apple says the iPhone 12 Pro will support powerful new augmented reality experiences thanks to the LiDAR scanner. Apps will be able to build a map of a room in more detail and in less time, so they can embed virtual objects more convincingly, such as hiding them behind real ones (a trick known as occlusion).
Tellingly, Google added the same occlusion feature to its ARCore platform back in June with just a software update. It has years more experience in this area than Apple, which has only recently started investing heavily in AR.
So if Android were to embrace LiDAR, it would combine the best of both worlds: feeding more accurate depth data into already advanced algorithms. This could lead to more immersive mobile AR and VR gaming, plus all kinds of new apps.
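To give a flavor of the software side, here’s a hedged sketch of how an Android app enables ARCore’s Depth API today; on hardware with a dedicated depth sensor, ARCore can fold its readings into the same depth map (the per-pixel occlusion comparison itself is elided):

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session

// Sketch: enable ARCore depth so virtual objects can be occluded by real ones.
fun enableDepth(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}

// Per frame, sample the depth image to test whether real-world geometry
// sits in front of a virtual object at a given pixel.
fun sampleDepth(frame: Frame) {
    val depthImage = frame.acquireDepthImage16Bits() // DEPTH16, values in mm
    try {
        // ...compare per-pixel depth against each virtual object's distance...
    } finally {
        depthImage.close()
    }
}
```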
2. Take snappier selfies
Apple’s other big selling point for the LiDAR scanner is that it helps the iPhone autofocus up to six times faster in low light. Depth sensing also improves its Night mode portraits for after-dark selfies.
Perhaps it could do the same for Android. Just as with AR, Google currently relies on machine learning to improve its pictures. The Pixel’s Night Sight mode works incredibly well in near-total darkness, but even ambient light can add noise to these algorithmically enhanced selfies.
And the bokeh simulator can do some strange things if there’s more than one face in the frame. It’s no wonder that many other Android manufacturers have added ToF sensors to improve their shots; full-on LiDAR could go even further.
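A measured depth map would sidestep that guesswork. As a purely illustrative sketch (the function and falloff curve are invented for this example), blur strength could be driven directly by each pixel’s distance from the focal plane rather than by an ML-estimated subject mask:

```kotlin
import kotlin.math.abs

// Hypothetical depth-driven bokeh: pixels near the focal distance stay sharp,
// and blur radius grows with defocus. A real pipeline would model the lens.
fun blurRadiusPx(depthMeters: Float, focusMeters: Float, maxRadiusPx: Float = 12f): Float {
    val defocus = abs(depthMeters - focusMeters)
    return (defocus * 6f).coerceAtMost(maxRadiusPx) // simple linear falloff
}
```

A second face a meter behind the subject would then get a predictable, gentle blur instead of the mask-edge artifacts that trip up today’s bokeh simulators.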
3. Mapping the world inside and out
The original use of LiDAR was mapping: Apollo 15 astronauts used it to survey the surface of the Moon in 1971. Just as Android users filled the gaps in Street View by uploading Photo Spheres, their depth scans could crowdsource improvements to Google Earth’s 3D terrain.
Better still, they could make Google Earth VR (yes, this really exists) more immersive. Project Tango, meanwhile, focused on mapping indoor spaces, opening up the possibility that Google Maps could let you view the majesty of the Sistine Chapel in 3D as if you were really there, get turn-by-turn directions to find your airport gate faster, or check whether your stadium seats are behind a pillar before you buy tickets.
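All of these mapping ideas rest on the same primitive: back-projecting each depth pixel into a 3D point using the camera’s intrinsics. Here’s a minimal sketch of the standard pinhole-camera math (the parameter names follow the usual conventions, not any particular API):

```kotlin
data class Point3(val x: Float, val y: Float, val z: Float)

// Pinhole back-projection: pixel (u, v) with measured depth z becomes a 3D
// point given focal lengths (fx, fy) and the principal point (cx, cy).
fun unproject(u: Int, v: Int, depthMeters: Float,
              fx: Float, fy: Float, cx: Float, cy: Float): Point3 =
    Point3(
        x = (u - cx) * depthMeters / fx,
        y = (v - cy) * depthMeters / fy,
        z = depthMeters
    )
```

Aggregate millions of these points across frames and users, and you have the raw material for crowdsourced 3D terrain or indoor maps.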
4. Turbo-charging touch-free controls
The Pixel 4’s Motion Sense was an innovative idea. With built-in radar, you could control the phone – from silencing calls to skipping songs – with just a wave of your hand. It could also sense your presence to unlock the device faster than Apple’s Face ID. But sadly the tech never caught on and was dropped for the Pixel 5.
But just as LiDAR helps self-driving cars sense obstacles with great accuracy, it could help phones detect hand gestures too. And because LiDAR sensors are cheaper than miniature radar, Motion Sense-style controls could be rolled out to Android devices well beyond Google’s flagships. With a larger user base, app developers might be more inclined to embrace the feature and invent new contactless commands.
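As a toy illustration of the idea (every threshold here is invented), a phone could track the centroid of near-field pixels across successive depth frames and read a fast horizontal shift as a swipe:

```kotlin
import kotlin.math.abs

// Toy depth-based gesture detection: find the average x-position of pixels
// within hand range, then flag a large frame-to-frame jump as a swipe.
fun nearCentroidX(depth: Array<FloatArray>, handRangeMeters: Float = 0.4f): Float? {
    var sumX = 0.0
    var count = 0
    for (row in depth) {
        for (x in row.indices) {
            if (row[x] in 0.05f..handRangeMeters) { sumX += x; count++ }
        }
    }
    return if (count > 0) (sumX / count).toFloat() else null
}

fun isSwipe(prevX: Float?, currX: Float?, frameWidth: Int): Boolean =
    prevX != null && currX != null && abs(currX - prevX) > frameWidth * 0.25f
```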
5. Smarter home security
Amazon and Google have been locked in a war to dominate the smart home for years now. This goes beyond smart speakers like the Echo and Home: Amazon also owns Ring, and Google has Nest. Just as with photography, LiDAR could hugely improve the night vision of Nest’s security cameras.
Nest could also use the sensor tech to launch an autonomous alternative to Ring’s new home security drone. After all, Tango was helping quadcopters fly on their own as far back as 2014. This in turn could open the door to a whole host of Android-powered consumer robots.
6. Bring back Google Glass
Google promised the world a head-mounted display back in 2012. But despite partnerships with brands like Ray-Ban and Oakley, technical hurdles and privacy concerns meant a consumer-friendly Glass model never materialized.
With fresh competition from Facebook’s Project Aria and Apple’s long-rumored smart specs, though, Glass could be due a comeback. After all, Google Lens’ visual search and Assistant’s voice controls have essentially delivered the features Glass was supposed to have. LiDAR could give a revived Glass the competitive edge by improving the performance of its AR display.