They make videos, take photos, livestream and act as an adequate replacement for headphones, all while looking like a normal pair of sunglasses. However, everyone's been waiting for the glasses' promised multimodal AI features to arrive.
What is multimodal AI? Simply put, it's a toolset that allows an AI assistant to process multiple types of information, including photos, videos, text and audio. It's an AI that can view and understand the world around you in real time. This is the underlying concept behind the update, announced in a social post:

"Multimodal Meta AI is rolling out widely on Ray-Ban Meta starting today! It's a huge advancement for wearables & makes using AI more interactive & intuitive. Excited to share more on our multimodal work w/ Meta AI, stay tuned for more updates coming soon."

Here's how it works. The glasses have a camera and five microphones, acting as the AI's eyes and ears. With these, you can ask the glasses to describe anything you're looking at. Want to know a dog's breed before you go up and give it a good pet? Just ask the glasses. Meta says the AI can also read signs in different languages, which is great for traveling.
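To make the concept concrete, here is a minimal sketch of what pairing the camera's view with a spoken question looks like at the interface level. This is purely illustrative: the `MultimodalQuery` type and `answer` function are hypothetical stand-ins, not Meta's actual API, and a real system would forward both modalities to a vision-language model.

```python
from dataclasses import dataclass

@dataclass
class MultimodalQuery:
    """One request pairing what the camera sees with what the user asks."""
    image_bytes: bytes   # frame captured by the glasses' camera
    question: str        # user's question, transcribed from the microphones

def answer(query: MultimodalQuery) -> str:
    """Toy stand-in for the assistant (hypothetical, not Meta's API).

    A production system would send the image and the text together to a
    multimodal model; here we only demonstrate the shape of the interface.
    """
    if not query.image_bytes:
        return "I can't see anything right now."
    return f"Analyzing the scene to answer: {query.question!r}"

# Example: the "what breed is that dog?" flow described above
q = MultimodalQuery(image_bytes=b"<camera frame>",
                    question="What breed is that dog?")
print(answer(q))
```

The key point the sketch captures is that the question and the image travel as one request, which is what lets the assistant ground its answer in what you are currently looking at.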
There are other potential use cases, like staring at loose ingredients on a kitchen counter and asking the AI to whip up a relevant recipe. However, we need a few weeks of real people running the tech through its paces to gauge what it's actually good at. Real-time translation is going to be something of a killer app, particularly for tourists, but here's hoping it keeps the hallucinations to a minimum.
Source: Gaming Daily Report (gamingdailyreport.net)