The Ray-Ban Meta Smart Glasses have multimodal AI now

Overall, pulling out your phone is still faster, but it is handy for identifying things when you’re out and about. | Photo by Amelia Holowaty Krales / The Verge

When the Ray-Ban Meta Smart Glasses launched last fall, they were a pretty neat content capture tool and a surprisingly solid pair of headphones. But they were missing a key feature: multimodal AI — basically, the ability for an AI assistant to process multiple types of input, like photos, audio, and text. A few weeks after launch, Meta rolled out an early access program; now, for everyone else, the wait is over. Multimodal AI is rolling out to all users.

The timing is uncanny. The Humane AI Pin just launched and bellyflopped with reviewers after a universally poor user experience, which has hung over AI gadgets like a bad omen. But having futzed around a bit with the early access AI beta on the Ray-Ban Meta Smart Glasses for the...
