iPhones can now tell blind users where and how far away people are

Apple has added a notable new accessibility feature to the latest iOS beta: a system that detects people in the iPhone’s camera view and reports how far away they are, enabling effective social distancing for users with visual impairments, among many other applications.
The feature grew out of Apple’s ARKit framework, for which the company originally developed “people occlusion” technology, which recognizes the shapes of people so that virtual objects can appear in front of or behind them. The accessibility team realized that pairing this with the precise distance measurements from the lidar sensors on the iPhone 12 Pro and Pro Max could be an extremely useful aid for people with visual impairments.
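To make that combination concrete, here is a rough Swift sketch of how a developer might pair ARKit’s person segmentation with the lidar-backed scene depth map to estimate the distance to the nearest person. This is an illustration built on public ARKit APIs, not Apple’s actual implementation; the class name is invented, and it assumes the two frame semantics can be enabled together on a lidar-equipped device.

```swift
import ARKit

// Illustrative sketch only -- not Apple's implementation. Combines ARKit's
// person segmentation mask with the lidar scene depth map to estimate the
// distance to the nearest person in view.
final class NearestPersonEstimator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // sceneDepth requires a lidar scanner (iPhone 12 Pro / Pro Max);
        // this assumes both semantics can be enabled together.
        let semantics: ARConfiguration.FrameSemantics = [.personSegmentation, .sceneDepth]
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(semantics) else { return }

        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = semantics
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let mask = frame.segmentationBuffer,                    // per-pixel person mask
              let depth = frame.sceneDepth?.depthMap else { return }  // lidar depth, in meters

        CVPixelBufferLockBaseAddress(mask, .readOnly)
        CVPixelBufferLockBaseAddress(depth, .readOnly)
        defer {
            CVPixelBufferUnlockBaseAddress(mask, .readOnly)
            CVPixelBufferUnlockBaseAddress(depth, .readOnly)
        }

        let maskW = CVPixelBufferGetWidth(mask), maskH = CVPixelBufferGetHeight(mask)
        let depthW = CVPixelBufferGetWidth(depth), depthH = CVPixelBufferGetHeight(depth)
        let maskRow = CVPixelBufferGetBytesPerRow(mask)
        let depthRow = CVPixelBufferGetBytesPerRow(depth) / MemoryLayout<Float32>.stride
        let maskPtr = CVPixelBufferGetBaseAddress(mask)!.assumingMemoryBound(to: UInt8.self)
        let depthPtr = CVPixelBufferGetBaseAddress(depth)!.assumingMemoryBound(to: Float32.self)

        // Scan the person mask and keep the smallest depth value that lands on a person.
        var nearest = Float.greatestFiniteMagnitude
        for y in 0..<maskH {
            for x in 0..<maskW where maskPtr[y * maskRow + x] == ARFrame.SegmentationClass.person.rawValue {
                // Map the mask pixel onto the (possibly different-resolution) depth map.
                let dx = x * depthW / maskW
                let dy = y * depthH / maskH
                nearest = min(nearest, depthPtr[dy * depthRow + dx])
            }
        }

        if nearest < .greatestFiniteMagnitude {
            print(String(format: "Nearest person is about %.1f meters away", nearest))
        }
    }
}
```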
Naturally, the current pandemic brings to mind the importance of maintaining a safe distance of six feet from others. However, understanding the location and proximity of people is a fundamental visual skill we utilize constantly for everyday tasks, such as planning our movements, choosing which queue to join, and deciding when to cross a street.
The new feature, integrated into the Magnifier app, leverages the lidar and wide-angle camera found on the Pro and Pro Max iPhones, providing users with feedback through various methods.
This infrared video demonstrates the lidar sensor in the iPhone 12 Pro at work, with each point representing the precise distance of the surface it reflects from.

First, the system tells the user whether any people are present within the camera’s field of view. If someone is detected, it then reports the distance to the nearest person, in feet or meters, and continuously updates that figure as they move closer or farther away. An accompanying stereo sound indicates the direction of the person relative to the camera’s view.
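As a rough illustration of how a spoken readout and a stereo direction cue could be produced with standard iOS audio APIs, the sketch below speaks the latest distance with AVSpeechSynthesizer and pans a short beep toward the person’s side of the frame with AVAudioPlayerNode. The class name and the specific behavior are assumptions for the example, not Apple’s code; a real implementation would also throttle announcements rather than speak on every frame.

```swift
import AVFoundation

// Illustrative sketch only -- not Apple's implementation. Speaks the distance
// to the nearest person and pans a short beep toward their side of the frame.
final class ProximityAudioFeedback {
    private let speech = AVSpeechSynthesizer()
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!

    init() throws {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
        try engine.start()
        player.play()
    }

    /// `distanceMeters`: distance to the nearest detected person.
    /// `horizontalPosition`: where they sit in the frame, -1 (far left) to +1 (far right).
    func update(distanceMeters: Float, horizontalPosition: Float) {
        // Spoken readout, e.g. "1.5 meters". A real app would throttle this.
        speech.speak(AVSpeechUtterance(string: String(format: "%.1f meters", distanceMeters)))

        // Stereo cue: pan a short beep toward the person's side of the frame.
        player.pan = max(-1, min(1, horizontalPosition))
        player.scheduleBuffer(makeBeep(duration: 0.1, frequency: 880), completionHandler: nil)
    }

    /// Build a short stereo sine-wave beep.
    private func makeBeep(duration: Double, frequency: Double) -> AVAudioPCMBuffer {
        let frames = AVAudioFrameCount(duration * format.sampleRate)
        let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames)!
        buffer.frameLength = frames
        for channel in 0..<Int(format.channelCount) {
            let samples = buffer.floatChannelData![channel]
            for i in 0..<Int(frames) {
                samples[i] = Float(sin(2 * .pi * frequency * Double(i) / format.sampleRate)) * 0.3
            }
        }
        return buffer
    }
}
```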
Furthermore, the feature allows users to configure specific tones for designated distances. For instance, a user could set a distance of six feet, triggering one tone when someone is beyond that range and a different tone when they are within it. This caters to those who may not require a constant stream of precise distance readings, but simply want to maintain a certain level of separation.
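The logic behind that kind of threshold alert is simple enough to sketch on its own. In the hypothetical example below, the threshold and the two tone callbacks are placeholders supplied by the caller, not anything taken from Apple’s Magnifier; the 1.8-meter default is simply a rough equivalent of six feet.

```swift
// Illustrative sketch only: fire one tone when the nearest person moves beyond
// a chosen distance and another when they come within it.
struct DistanceThresholdAlert {
    var threshold: Float = 1.8            // meters, user configurable (~6 feet)
    var playOutsideTone: () -> Void       // person is farther than the threshold
    var playInsideTone: () -> Void        // person has come within the threshold
    var wasInside: Bool? = nil            // last known state, so tones fire only on change

    /// Feed each new distance reading; a tone plays only when the state flips.
    mutating func update(distanceMeters: Float) {
        let isInside = distanceMeters <= threshold
        guard isInside != wasInside else { return }
        wasInside = isInside
        if isInside { playInsideTone() } else { playOutsideTone() }
    }
}

// Example usage with placeholder tones:
var alert = DistanceThresholdAlert(
    playOutsideTone: { print("low tone: nobody within six feet") },
    playInsideTone: { print("high tone: someone is within six feet") }
)
alert.update(distanceMeters: 3.0)   // low tone
alert.update(distanceMeters: 1.2)   // high tone
alert.update(distanceMeters: 1.0)   // no change, silent
```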
A third option, perhaps especially helpful for people with both visual and hearing impairments, is a haptic pulse that comes faster as a person gets closer.
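A sketch of how such a pulse might be driven with the standard iOS haptics API is below; the distance-to-interval mapping is a guess for illustration, not Apple’s tuning.

```swift
import UIKit

// Illustrative sketch only -- not Apple's implementation. Produces a repeating
// haptic tap whose rate increases as the nearest person gets closer.
final class ProximityHaptics {
    private let generator = UIImpactFeedbackGenerator(style: .medium)
    private var timer: Timer?

    /// Call with each new lidar reading; closer people produce faster pulses.
    func update(distanceMeters: Float) {
        // Map distance to a pulse interval: ~0.1 s at arm's length, ~1 s at 5 m.
        let clamped = min(max(distanceMeters, 0.5), 5.0)
        let interval = TimeInterval(clamped / 5.0)

        timer?.invalidate()
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            self?.generator.impactOccurred()
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```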
Finally, there is a visual aid for those who can use a little help interpreting their surroundings: an arrow on the screen that points toward the detected person. Blindness is a spectrum, after all, and any number of vision impairments could make this kind of assistance useful.
The system requires adequate lighting conditions for the wide-angle camera to function effectively, and therefore will not operate in complete darkness. While limiting the feature to the higher-end iPhone models restricts its availability, the increasing usefulness of these devices as assistive tools for vision likely justifies the hardware investment for those who need it.
Here’s a demonstration of how the system currently operates:
https://twitter.com/panzer/status/1322362411949518848?s=20
Although similar tools exist on other phones and dedicated devices for identifying objects and people, it is uncommon for this functionality to be included as a standard, built-in feature.
People detection is available on the iPhone 12 Pro and Pro Max running the iOS 14.2 release candidate, which was just issued. Further details are expected to appear soon on Apple’s iPhone accessibility site.