Meta Rolls Out ‘Look and Ask With Meta AI’ Feature on Ray-Ban Smart Glasses, Announces Early Access Programme

Meta is now allowing select customers to try out new AI-powered experiences on its Ray-Ban smart glasses as part of an early access programme. The Facebook parent has announced initial user tests for the smart glasses and intends to gather feedback on new features ahead of a wider release. Meta is also rolling out updates to improve the Ray-Ban smart glasses experience, which is powered by the Meta AI assistant, bringing smarter and more helpful responses.

Earlier this month, Meta announced a host of new features for its AI services across platforms. In an update to the same blog post on Tuesday, the company introduced a few new features for the Meta Ray-Ban smart glasses. Those who sign up for early access can try out multimodal AI-powered capabilities, which allow the smart glasses to process what the wearer is looking at and answer related queries.

According to Meta, its AI assistant on the glasses can take a picture of what the wearer is seeing, either via a voice command or the dedicated capture button, and can even come up with a witty caption for the photo. Users could pick up an object while wearing the Meta Ray-Ban smart glasses and ask for information about it, or look at a sign in a different language and ask the AI-powered glasses to translate it into English. The company, however, has warned users that its multimodal AI might make mistakes and will be improved over time with the help of feedback.
