AI is, slowly but surely, getting some genuinely useful upgrades. At Google I/O 2024, Google announced a new initiative called Project Astra.
The project can continuously scan a camera feed to build a contextual understanding of your surroundings. Google shared a demo video in which faster Gemini models provide context for whatever you are doing or asking about.
Here is the demo.
According to Google, Project Astra is the future of AI assistants: a general-purpose agent that can help with everyday tasks. The main goal of the project is to make the assistant faster and more conversational, with minimal lag.
Several companies are developing similar projects. OpenAI's GPT-4o, for example, and recent AI gadgets such as the Rabbit R1, the Humane AI Pin, and Meta's Ray-Ban smart glasses can do something comparable.
However, the response time in the Astra demo stands out. In the video, the assistant appears to scan the room or environment and answer almost instantly, which is not the case with products from other companies.
In the demo video, Google even showed Astra running on a pair of glasses, suggesting that more wearable AI tech is on the way.
Astra is multimodal by design: you can type, talk, draw, photograph, and use video to interact with it.
Demis Hassabis, the head of Google DeepMind, says that going forward, the story of AI will be less about the models themselves and more about what they can do for you.
Check out the demo from Google Project Astra on YouTube and see how it interacts with humans.