Google has presented interim results for its Android XR smart glasses platform and a headset with the Gemini AI assistant integrated, and also demonstrated a working prototype. This was reported by the portal VC.
During a presentation by Google's Neta Bhatia, Gemini demonstrated the ability to remember visual information seen through the glasses. After she stepped away from a bookshelf, the AI correctly named the book lying on it. Gemini then helped locate a hotel key card.
The functions of the in-lens display were also shown. Gemini interpreted a diagram in a book, recognized and started playing a song named on a record, and plotted a walking route while displaying a three-dimensional map with directions.
Particular attention was drawn to the real-time translation feature, which impressed a journalist from The Verge. Subtitles in the selected language are displayed on the glasses' lenses, making conversation more convenient. Google Vice President Shahram Izadi, for his part, suggested using on-screen text prompts for public speaking.
After canceling the Iris smart glasses project in 2023, Google focused on improving the Android XR platform. The main objective is to integrate Gemini into headsets from other manufacturers. Despite the successful demonstration, The Verge notes that the capabilities shown may not fully reflect the actual state of the system under development.
It was previously reported that Apple will release a cheaper and lighter version of its Vision Pro headset.
Source: Gazeta

Jackson Ruhl is a tech and sci-fi expert, who writes for “Social Bites”. He brings his readers the latest news and developments from the world of technology and science fiction.