Ring AI Facial Recognition: New Feature Raises Privacy Concerns

Ring Doorbell's New Facial Recognition Feature
Amazon’s Ring doorbells have been updated with a new, AI-driven facial recognition capability, as announced by the company on Tuesday. This feature, known as “Familiar Faces,” was initially revealed in September and is currently being deployed to Ring users throughout the United States.
How Familiar Faces Works
The system lets users identify frequent visitors by building a database of up to 50 faces. This catalog can include relatives, friends, neighbors, delivery personnel, and other regular contacts. Once a user labels a face in the Ring app, the device automatically recognizes that person as they approach.
Instead of a generic “person detected” alert, users will receive specific notifications, such as “Mom at Front Door,” as detailed in the official announcement.
Concerns and Criticism
The introduction of this feature has already sparked opposition from privacy advocacy groups, including the EFF, and has drawn the attention of at least one U.S. Senator.
User Control and Privacy Features
Ring owners can use the feature to customize their alerts, suppressing notifications for people they don't need to be told about, such as their own comings and goings. Alert settings can be adjusted individually for each identified face.
The feature is not activated automatically; users must explicitly enable it within the app’s settings.
Managing Familiar Faces
Faces can be identified directly from the Event History section of the app or through the newly created Familiar Faces library. Once a face is labeled, that name will appear in all related notifications, the app timeline, and the Event History.
These labels are editable at any time, and tools are provided to merge duplicate entries or remove faces from the database.
Data Security and Retention
Amazon asserts that all facial data is encrypted and is not shared with third parties. Furthermore, unidentified faces are automatically deleted after a period of 30 days.
Concerns Arise Regarding AI Facial Recognition in Ring Systems
Despite Amazon's stated privacy safeguards, the introduction of this functionality is prompting increased scrutiny.
Historically, the company has established collaborative relationships with law enforcement agencies. Previously, Amazon permitted police and fire departments to directly request doorbell footage from users of the Ring Neighbors app.
Recent partnerships include a collaboration with Flock, a company specializing in AI-driven surveillance cameras used by police forces, federal agencies, and an ICE division, as reported by 404 Media. (Flock Safety subsequently clarified via email that ICE is not currently using Flock systems.)
Past Security Lapses with Ring
Ring’s own security measures have previously demonstrated vulnerabilities.
In 2023, the U.S. Federal Trade Commission fined the company $5.8 million after finding that Ring employees and contractors had enjoyed prolonged, unrestricted access to customer video data.
Furthermore, the Neighbors app exposed sensitive user information, including home addresses and precise locations. User credentials for Ring accounts have also been identified circulating on the dark web for an extended period.
Considering Amazon's history of cooperation with law enforcement and surveillance technology providers, alongside its documented security shortcomings, Ring users should exercise caution before identifying individuals by name. A safer alternative is to leave the feature disabled and simply review activity as before; not every function requires AI enhancement.
Regulatory and Organizational Pushback
The privacy implications have already triggered calls for Amazon’s Ring to discontinue the feature from U.S. Senator Ed Markey (D-Mass.). Consumer advocacy groups, such as the Electronic Frontier Foundation (EFF), are also voicing opposition.
Existing privacy regulations are currently preventing the rollout of this feature in Illinois, Texas, and Portland, Oregon, as highlighted by the EFF.
Amazon has stated that user biometric data will be processed in the cloud and will not be used to train AI models. The company also asserts that tracking an individual's detected locations, even at law enforcement's request, is technically infeasible.
However, the validity of this claim is questionable, given the functional similarities to the “Search Party” feature, which leverages a network of Ring cameras to locate lost pets.
EFF Response and Concerns
F. Mario Trujillo, Staff Attorney at the EFF, commented, “Simply approaching a door, or even passing by it, shouldn't necessitate a compromise of personal privacy.” He further emphasized the need for state privacy regulators to investigate, safeguard privacy rights, and rigorously evaluate the effectiveness of biometric privacy legislation.
This article was updated following receipt of comments from the EFF.