Kenya opens Ray-Ban Meta privacy probe after AI data concerns

Kenya’s data regulator has launched a suo moto probe into Ray-Ban Meta smart glasses over privacy, surveillance and AI training concerns, after complaints about covert recordings and sensitive footage processed in Nairobi.

The Office of the Data Protection Commissioner on Tuesday launched an investigation into Ray-Ban Meta smart glasses, citing concerns over privacy, surveillance and AI data use.

The probe follows reports of sensitive footage processed in Nairobi and allegations of covert recordings involving Kenyan women, placing the country at the centre of a growing global debate over wearable technology and regulation.

In a formal response on March 31, 2026, the regulator said, “The Office of the Data Protection Commissioner confirms that it has already commenced suo moto investigations into the privacy concerns raised in relation to the Ray-Ban Meta glasses and the processing of personally identifiable information for the training of Meta AI. The outcome and further developments will be communicated once the investigations are concluded.”


The inquiry follows a complaint by The Oversight Lab, which on March 6 called for urgent scrutiny of the glasses’ “mass surveillance capabilities” and alleged use in “non-consensual recording of intimate images and videos”.


The glasses are developed by Meta Platforms in partnership with EssilorLuxottica, the owner of the Ray-Ban brand.


First released in 2023, the devices resemble ordinary sunglasses but are fitted with dual cameras, microphones and speakers.


Users can capture photos and videos, livestream content and interact with Meta’s artificial intelligence through voice commands.


Recordings may be uploaded to cloud systems, where they can be processed and, in some cases, reviewed by human moderators to train AI models.


Industry estimates suggest millions of units have already been sold globally, as Meta positions the glasses as a key entry point into everyday AI use.


The Oversight Lab’s complaint was reinforced by investigative reporting from Scandinavian media, which found that footage collected worldwide was sent to Samasource Kenya EPZ Limited for review.


According to the report, Kenyan workers were exposed to highly sensitive material, including “bathroom visits, intimate moments, bank card details, crime, violence, pornography and many more”.


These revelations have raised questions about cross-border data transfers and whether individuals recorded by the devices were aware their images could be used in AI training processes.


Concerns have also been heightened by a case cited by The Oversight Lab involving a foreign national described as a pick-up artist, widely reported to be of Russian origin, who allegedly used the glasses to secretly record Kenyan women in private settings.


The organisation warned that such incidents highlight the risks of discreet wearable cameras being used to capture intimate content without consent, potentially enabling harassment and exploitation.


Mercy Mutemi, Executive Director of The Oversight Lab, welcomed the investigation but urged openness.


“It is notable that the ODPC is taking this issue seriously and has decided to investigate it. We ask that the investigation be done openly, consultatively and in full transparency,” she said.


More than 150 organisations and individuals have signed a letter supporting the probe, calling for a transparent process and stronger protections for digital rights.


Kenya’s move mirrors actions in other jurisdictions. In the United Kingdom, the Information Commissioner’s Office has begun examining similar concerns, while Meta faces legal challenges in the United States over alleged privacy violations linked to the device.


Regulators are increasingly focused on whether existing laws are sufficient to address emerging technologies that blur the line between public and private spaces.


Although the glasses include a small LED indicator to signal recording, critics argue this may not be enough to ensure informed consent.


Questions also remain over how data is stored, who can access it and how long it is retained.


The investigation by Kenya’s data regulator is expected to examine whether the use of such devices complies with national data protection laws, particularly regarding consent, transparency and lawful processing.


As smart wearable technology becomes more widespread, the outcome could set an important precedent for how countries across Africa approach the regulation of AI-driven consumer devices.


For now, the case highlights a central tension of the digital age: balancing rapid technological innovation with the protection of individual privacy.
