Hands-on with Google's Project Astra, the AI that knows where you left your keys
Astra wanted to tell me a long story
I tried Google's Project Astra three times at Google I/O 2024, and before my first session, a Google rep asked me not to be adversarial. I’d been asking questions about Astra’s last training date and how Google might respond to a search warrant for Astra’s location data (more on that later, promise). But you can’t tell me not to be adversarial with an AI. As a writer and editor, a creator, AI is the adversary … maybe. We don’t know yet, because this is all so very new. In the end, Project Astra wasn’t scary; it isn’t hitting the market any time soon, and I just wanted to play with it and have fun.
Project Astra is a multimodal AI project from Google. That means it combines different types of input to produce a response that seems more contextual than what you’d get from an AI that handles only one input at a time. It sees through a camera and listens to your voice; you can even draw a picture, and it will try to interpret what you drew. What it gives you in return is speech. Simply show and tell Project Astra whatever you like, and it talks back to you.
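Project Astra doesn’t have a public API, but to give a rough sense of what a multimodal request looks like in practice, here’s a minimal sketch using Google’s publicly available Gemini Python SDK (the google-generativeai package), which accepts an image and a text prompt in one request. To be clear, this illustrates the general technique, not Astra itself; the model name, file name, and prompt below are assumptions for the example.

```python
# A minimal sketch of a multimodal (image + text) request, assuming
# Google's public google-generativeai SDK and the Pillow imaging library
# (pip install google-generativeai pillow). This illustrates the general
# idea of multimodal input; it is NOT Project Astra's own interface.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")  # placeholder credential

# A Gemini model that accepts mixed inputs (assumed model name).
model = genai.GenerativeModel("gemini-1.5-flash")

photo = Image.open("desk.jpg")  # hypothetical snapshot of toys on a desk

# One request combines vision (the photo) and language (the prompt).
response = model.generate_content(
    [photo, "Describe the objects on this desk, then tell a short story about them."]
)
print(response.text)
```

The difference with Astra is that it runs this kind of loop continuously over live video and speech, rather than on a single snapshot.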
In the demo at Google I/O, Project Astra consisted of a large camera pointing down at a desk. Google offered a number of toys to use with our prompts. The demo was ‘limited’ to four options, but the fourth was really a free-for-all, so there was no limit.
Project Astra played Pictionary. I drew, and it guessed what I was drawing. It even talked me through its reasoning, and I offered hints. It guessed I was drawing a sun, but when I told Astra the center was supposed to be black, it correctly guessed a solar eclipse.
Astra told me a story, using the toys and my input as a guide. I showed Astra a crab and asked for a story in the style of Hemingway. Then I introduced a second crab toy, followed by a Kaiju lizard, which I said was the villain. Project Astra adapted to each new twist without trouble, and it was clearly trying to tell a long, complicated tale, much to the chagrin of my Google timekeepers.
Project Astra created an alliterative sentence based on whatever I presented: a beautifully baked and browned baguette, for instance. The sentence didn’t always start with the same letter as the object’s name, but the responses were solid alliteration.
I showed Project Astra a donut and asked for an alliterative sentence. Then I asked for sci-fi-themed alliteration, and it complied. I asked for silly words, and it understood what I wanted. Even its follow-up questions after each response were alliterative. It seemed quite clever.
Where Project Astra is (and isn't) going
There were limits, but I felt like we were seeing only the tip of the iceberg. Google gave us just four minutes with Project Astra, which is why I kept going back, and the list of things Astra could do was short. The room was also very noisy, so we had to wear microphones to make sure Astra heard us clearly over the background noise.
Google insisted we only use the props provided for input, which included plastic crabs, a hunk of an amethyst geode, and some fake food items, among other choices. But in the end, it wasn’t the objects that held Project Astra back; it was my imagination about what to ask.
This is why I returned to see Project Astra three times. It got more fun the more open and expressive I was with the software. By my third pass, I wasn’t waiting for the introduction; I just started talking to Astra immediately. There was no time to lose, and Project Astra has a lot to say. I wish I’d had time to hear it tell a whole story, but I kept interrupting for the sake of expediency.
Project Astra isn’t coming to smartphones any time soon; it’s just a research project, and the team seems small. Google has no plans to put Project Astra on the next Google Glass (if such a thing exists), at least not in this form. Google reps were clear that Project Astra is a prototype, and it doesn’t look portable in its current form.
Still, Project Astra’s concept would be perfect on smart glasses. When (if?) Google finally launches AR glasses, I’m sure Project Astra’s fingerprints will be on them.
Is Project Astra's 'memory' going to be a problem?
With that in mind, Project Astra has some questionable talents. In the Google I/O keynote, Google reps took Project Astra on a walk around the office. Then, a Googler asked Astra where he left his glasses.
Astra said it saw his glasses next to a red apple. It remembered. Project Astra has memory. The AI got that right.
This immediately raised privacy concerns. What happens when the FBI comes around? “Oh, your shady friend was here? We have a warrant to see everything he touched and moved while he was in your house.” It should all be available on camera, thanks to Project Astra.
Except that’s not how Project Astra works. It can remember, but only things it has seen during a single session. Google reps weren’t clear about what counts as a session, but it seems limited to a span of one to four minutes. After that, Project Astra forgets everything and moves on to the next subject.
The problem is what happens during those minutes. Project Astra can’t process its information locally; Astra’s ‘memory’ is uploaded to Google’s servers. At the prototype stage, that doesn’t mean much, but if this becomes a commercial product, we’ll need to know where our data is going and who has access to it.
To fit on smart glasses, Project Astra will need to change
That said, Project Astra shows a lot of promise, and I’m excited to see it evolve. Unlike the current visual AI recognition from Meta, now available on Ray-Ban Meta smart glasses, Google’s version considers motion and action. It looks at context, and its results seem much more advanced, even at this early stage.
Of course, Project Astra is a research demonstration that takes up a whole room, while Meta is shipping its AI on a device powered by a 1-watt processor. There’s a long road from prototype to production.
We’ll be keeping a close eye on Project Astra and all of Google’s AI projects. I strongly believe the next evolution of wearables and mobile technology will converge on smart glasses, so the more we learn about what’s coming, the more we can prepare for, and influence, what we get.