Meta is adding AI to its Ray-Ban smart glasses next month

Users can activate the glasses’ smart assistant by saying “Hey Meta,” followed by a prompt or question. The assistant then responds through the speakers built into the frames. The NYT took the glasses for a spin in a grocery store, while driving, at museums, and even at the zoo, offering a glimpse at how well Meta’s AI works in practice.

Although Meta’s AI was able to correctly identify pets and artwork, it didn’t get things right every time. The NYT found that the glasses struggled to identify zoo animals that were far away and behind cages, and they failed to properly identify an exotic fruit called a cherimoya after multiple tries. As for AI translations, the NYT found that the glasses support English, Spanish, Italian, French, and German.

Meta will likely continue refining these features as time goes on. Right now, the AI features in the Ray-Ban Meta Smart Glasses are only available through an early access waitlist for users in the US.
