Meta’s $799 Ray-Ban Display is the company’s first big step from VR to AR

Zuckerberg also showed how the neural interface can be used to compose messages (on WhatsApp, Messenger, Instagram, or via email apps on a connected phone) by tracking your mimed “writing” on a flat surface. Although this feature won’t be available at launch, Zuckerberg said he had reached “around 30 words per minute” with this silent input mode.
The most impressive launch-ready part of Zuckerberg’s on-stage demo was probably a “Live Captions” feature that automatically transcribes what your conversation partner says in real time. The feature reportedly filters out background noise to focus captioning only on the person you’re looking at.
A Meta video demos how live captioning works on the Ray-Ban Display (though the field of view of the real glasses is likely much more limited).
Credit: Meta
Beyond these kinds of “gee whiz” features, the Meta Ray-Ban Display can mostly mirror a small subset of your smartphone apps on its floating display. Being able to get turn-by-turn directions or see recipe steps on the glasses without having to glance at a phone seems like a genuinely useful new mode of interaction. Using the glasses’ display as a viewfinder to line up a photo or video (via the integrated 12-megapixel camera with 3x zoom) also seems like an improvement over previous display-free smart glasses.
But accessing basic apps such as weather, reminders, calendar, and email on your small glasses display strikes us as probably less convenient than just looking at your phone. And hosting video calls through the glasses necessarily forces your conversation partner to see what you see via the outward-facing camera, rather than seeing your actual face.
Meta also showed a pie-in-the-sky video of how a future “agentic AI” integration might be able to automatically make suggestions and note follow-up tasks based on what you see and hear while wearing the glasses. For now, though, the device represents what Zuckerberg called “the next chapter in the fascinating history of the future of computing,” a framing that shifts attention away from the VR-based metaverse that was the company’s last “future of computing.”