BUSINESS

In April, Meta will integrate AI into its smart glasses

Facebook’s parent company, Meta, plans a major upgrade to the smart glasses it makes in partnership with Ray-Ban, launching in April. Meta is adding the Meta AI function, which lets users communicate with their glasses through voice commands, to the wide range of capabilities already available in the Ray-Ban Meta glasses, such as photo and video capture, live streaming, and music playback. The glasses start at $300.

With Meta AI, users can ask for suggestions and real-time responses based on their surroundings. For example, users may ask the smart glasses to translate between languages, identify objects, or make recommendations based on photos they have taken. Users activate the glasses by saying “Hey, Meta,” and the glasses respond with computer-generated speech played through the built-in speakers.

According to Meta, speech functions will be available only in English, Italian, and French during the initial U.S. rollout of the AI features. Acknowledging that the AI features are new, Meta warns that occasional inaccuracies are to be expected and pledges to continuously improve the capabilities based on user feedback.

By incorporating AI capabilities into its smart eyewear, Meta is trying to stand out in the crowded AI market. Mark Zuckerberg, Meta’s CEO, is reportedly seeking to hire people from Google’s DeepMind division to support the company’s AI projects, having already invested heavily in Nvidia’s AI processors. To attract top personnel, Meta has also modified its recruiting practices, extending offers without conventional interviews and revising its compensation guidelines.

Notably, Meta’s technology roadmap for the coming years includes an AI model aimed at improving users’ Feeds and video recommendations, underscoring the company’s commitment to deploying AI across its platforms.
