Sunday, October 6, 2024

Meta can use your visual data to train Ray-Ban Meta AI; the only way to opt out is to stop using its AI features


The Meta Ray-Ban smart glasses can use any image shared with Meta AI to train the model. Whatever you capture with the glasses, or any image you ask the assistant to analyze, can be used to teach Meta's AI tools.


Last week at Meta Connect 2024, the company unveiled its latest AR glasses prototype, Orion. The glasses pack advanced AI capabilities, including hand tracking, an AI voice assistant, eye tracking, and more. But that was not all. During the event, Meta also revealed that its existing Ray-Ban smart glasses have received several AI updates. The Ray-Ban Meta smart glasses are now deeply integrated with Meta AI and feature an HD camera, a capture button, open-ear speakers, and a touchpad. Interesting, isn't it? But media reports claim that Meta may be using visual data from the glasses to train its AI systems.


According to TechCrunch, Meta has confirmed that images from the Ray-Ban smart glasses can be used to train its AI models. The report notes that the company initially stayed silent on the question, then confirmed a few days later that this can happen under its privacy policies. Photos and videos captured with the Ray-Ban Meta remain private until they are submitted to Meta AI for analysis; once analyzed, they fall under a different set of policies. Meta says it does not use captured content for training unless the user initiates AI analysis. This distinction matters for understanding the data-usage and privacy implications of using the Ray-Ban Meta with Meta AI features.

This means users may unknowingly be handing over personal information, embedded in their images, to the Meta AI model. In short, the company is leveraging its first consumer AI device to collect large amounts of data that can be used to develop increasingly advanced AI models. The only way out is to avoid Meta's multimodal AI features altogether.

This is even more worrying now that Meta has started rolling out updates with advanced AI features. The new features let Ray-Ban Meta users interact more naturally with Meta AI, making it easier to send fresh data that can be used for training. Additionally, at last week's 2024 Connect conference, the company introduced a live video analysis feature for the Ray-Ban Meta that continuously streams images into Meta's multimodal AI model. In a promotional video, Meta showed how the feature could help users scan their wardrobe and have the AI suggest an outfit.

In fact, Meta's smart glasses privacy policy states plainly: "Text transcripts and related data are stored by default to help improve Meta's products." And it is not just text; both voice transcripts and images appear to be used to train Meta AI. None of this is surprising.

Since its inception, the AI world has advanced by learning from the data we provide. Whether it is OpenAI, Microsoft, or Meta, AI models live on user-provided data. The unsettling part is that Meta pitched these smart glasses for everyday use: wear them through every part of your day, and they can record everything.
