
Snapchat First to Use iPhone 12 Pro LiDAR Scanner for AR

October 13, 2020

Apple unveiled its newest premium iPhone devices, the iPhone 12 Pro and 12 Pro Max, during its product presentation on Tuesday. A key feature of these phones is the inclusion of a new LiDAR Scanner, engineered to enhance augmented reality (AR) applications. Snapchat has now confirmed it will be one of the first platforms to integrate this advanced technology into its iOS application through a specialized Lens.

As detailed by Apple during the presentation, the LiDAR (Light Detection And Ranging) Scanner functions by calculating the time it takes for light to travel to an object and return.
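The time-of-flight principle behind this can be sketched in a few lines. This is an illustrative calculation only, not Apple's implementation: the sensor measures the round-trip time of a light pulse, and the distance to the reflecting surface is half that time multiplied by the speed of light.

```python
# Illustrative time-of-flight math: distance is half the round-trip
# time of a light pulse multiplied by the speed of light.

SPEED_OF_LIGHT_M_PER_S = 299_792_458


def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface, in meters."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2


# A pulse returning after about 33 nanoseconds corresponds to
# roughly 5 meters -- typical indoor-room scale.
print(round(distance_from_round_trip(33.3e-9), 2))
```

The tiny intervals involved (nanoseconds per meter) are why lidar depth sensing requires dedicated hardware rather than the ordinary camera pipeline.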

When combined with the iPhone’s machine learning capabilities and development tools, lidar assists the iPhone in perceiving its surroundings.

Apple incorporated this technology into the iPhone 12 Pro models, where it contributes to improved photographic performance in low-light conditions, due to its capacity to function effectively even in darkness.

This technology also empowers application developers to construct a detailed depth map of any given environment, accelerating AR processing for a more responsive experience and unlocking new possibilities for AR-based applications.

For developers, this translates to the ability to utilize lidar for functionalities such as object and room scanning – potentially leading to advancements in AR shopping applications, interior design tools, or augmented reality games, for instance.

It also facilitates the creation of sophisticated photo and video effects and enables more accurate positioning of AR elements, as the iPhone can actively perceive the depth of the surrounding space.

This capability paves the way for innovative AR experiences, such as the one Snapchat is preparing to release. Already recognized for its high-quality AR filters, the company announced it will soon introduce a lidar-powered Lens designed specifically for the iPhone 12 Pro models.

Apple briefly showcased Snapchat’s lidar-powered feature during the segment of the iPhone event dedicated to the LiDAR Scanner.

The demonstration featured an AR Lens within the Snapchat app, displaying flowers and grasses covering surfaces and birds flying towards the user. The vegetation appeared to recede into the distance, and plants were even shown growing around kitchen cabinets, demonstrating the device’s ability to recognize the location of physical objects.

The birds within the Snapchat Lens disappeared as they moved behind the person, and accurately landed on the user’s hand.
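The occlusion effect described above follows directly from having a per-pixel depth map. As a simplified sketch (this is not Snapchat's actual rendering pipeline), a virtual object is drawn at a pixel only when it is closer to the camera than the real surface the sensor detected at that pixel:

```python
# Simplified depth-based occlusion: draw the virtual object at a pixel
# only where it is nearer to the camera than the sensed real surface.

def visibility_mask(virtual_depth, real_depth_map):
    """Return a mask: True where the virtual object should be visible."""
    return [[virtual_depth < real for real in row] for row in real_depth_map]


# Hypothetical sensed depths in meters: a person at ~1.5 m occupies the
# middle column, with a wall at 4 m behind them.
depth_map = [
    [4.0, 1.5, 4.0],
    [4.0, 1.5, 4.0],
]

# A virtual bird at 2.5 m is hidden where the person is closer,
# and visible against the wall.
print(visibility_mask(2.5, depth_map))
# → [[True, False, True], [True, False, True]]
```

Without real depth data, AR apps have to approximate occlusion from camera imagery alone, which is why virtual objects in older AR experiences often float in front of people and furniture instead of passing behind them.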

We have confirmed that this is the precise Lens Snapchat is currently developing.

A video of the Snapchat filter in action can be viewed at 59:41 in the official Apple iPhone Event video.

Snap announced on Wednesday (following initial publication) that it will also release an update to Lens Studio, the company’s free AR creation platform that allows creators and developers to design and publish their own Lenses directly within Snapchat. This new version, Lens Studio 3.2, will enable AR creators and developers to build LiDAR-powered Lenses for the new iPhone 12 Pro models, ensuring compatibility when the devices become available to customers. The update is available starting today.

“The integration of the LiDAR Scanner into iPhone 12 Pro models unlocks a new dimension of creativity for augmented reality,” stated Eitan Pilipski, Snap’s SVP of Camera Platform, in an official statement. “We are pleased to collaborate with Apple to extend this advanced technology to our Lens Creator community.”

Updated, 10/13/20, 4:47 PM ET to confirm the Lens shown during the event will be launched; 10/14/20, 12 PM ET Updated with details regarding Lens Studio. 
