Project Astra, Google's vision for a universal AI assistant, is pulling into focus

Last year at Google I/O, one of the most interesting demos was Project Astra, an early version of a multimodal AI that could recognize your surroundings in real time and answer questions about them conversationally. While the demo offered a glimpse into Google's plans for more powerful AI assistants, the company was careful to note that what we saw was a "research preview."

One year later, though, Google is laying out its vision for Project Astra to one day power a version of Gemini that can act as a "universal AI assistant." And Project Astra has gotten some important upgrades to help the company accomplish this. Google has been upgrading Astra's memory (the version we saw last year could only "remember" for 30 seconds at a time) and has added computer control so Astra can now take on more complex tasks.

In its latest video showcasing Astra, Google shows the assistant browsing the web and pulling out the specific pieces of information needed to complete a task (in this example, fixing a mountain bike). Astra is also able to look through past emails to find the specs of the bike in question and call a local bike shop to inquire about a replacement part.

Eventually, according to DeepMind's Demis Hassabis, Astra's advancements will show up in Gemini. "Our ultimate vision is to transform the Gemini app into a universal AI assistant that will perform everyday tasks for us, take care of our mundane admin, surface delightful new recommendations, making us more productive and enriching our lives," Hassabis writes in a blog post. "This starts with the capabilities we first explored last year in our research prototype Project Astra, such as video understanding, screen sharing and memory."

Some of that work is already evident in Gemini Live, which recently got some multimodal capabilities of its own. But, as I noted last year, Project Astra gets even more interesting in the context of smart glasses — an idea Google briefly teased in its I/O video last year. That vision appears to be inching closer to reality, with Hassabis noting that Google is working on bringing Project Astra abilities to "new form factors, like glasses." There's no clear timeframe for when any of this will be available, but given Google's updates on Android XR elsewhere at I/O, we know the company has big plans for AI-powered smart glasses later this year.
