
iOS 16 integrates U1 chip with ARKit amid rumors of Apple’s mixed reality headset

iOS 16 introduces multiple new APIs that let developers expand the capabilities of their apps. For instance, there are new APIs for Lock Screen widgets, push-to-talk communication, interactive maps, weather data, and more. Interestingly, Apple has also updated the Nearby Interaction API to integrate the U1 chip with ARKit amid rumors of a new mixed reality headset.

Nearby Interaction API

The Nearby Interaction API was introduced with iOS 14, and it lets developers take advantage of the ultra-wideband U1 chip available in iPhone 11 and later. The U1 chip, which enables precise location and spatial awareness, can be used for things like measuring the distance between one iPhone and another.
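In practice (a simplified Swift sketch, not Apple's sample code), an app creates an NISession, exchanges discovery tokens with the other device over a side channel such as MultipeerConnectivity, and then receives distance updates through a delegate:

```swift
import NearbyInteraction

// Minimal sketch of iPhone-to-iPhone ranging with Nearby Interaction.
// Assumes the peer's discovery token was already exchanged out of band
// (for example, over MultipeerConnectivity) and both devices have a U1 chip.
final class PeerRanging: NSObject, NISessionDelegate {
    let session = NISession()

    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        let configuration = NINearbyPeerConfiguration(peerToken: peerToken)
        session.run(configuration)
    }

    // Called repeatedly as the U1 chip produces new measurements.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let peer = nearbyObjects.first, let distance = peer.distance else { return }
        print("Peer iPhone is \(distance) meters away")
    }
}
```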

With iOS 15 and watchOS 8, Apple expanded these features to the Apple Watch, since Apple Watch Series 6 and later also include the U1 chip. This year, iOS 16 brings an interesting new option for developers working with the Nearby Interaction API: the ability to integrate the U1 chip with augmented reality via ARKit.

As detailed by the company in a WWDC 2022 session, iOS already uses the U1 chip combined with ARKit to locate AirTags with the Precision Finding feature. With the help of the data provided by the U1 chip combined with the iPhone’s camera, the Find My app is able to precisely guide the user towards their AirTag.

Now developers will also be able to create similar experiences in their own apps using the U1 chip and ARKit, which makes information about distance and direction even more consistent and accurate. Apple says the best use cases for this API are experiences that guide users to a specific nearby object, such as a misplaced item, an object of interest, or an object the user wants to interact with.

An app can, for example, tell users whether the object they’re looking for is in front of them or behind them.
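Based on the WWDC session, this comes down to enabling camera assistance on the session configuration and, optionally, asking the NISession where a nearby object sits in ARKit's world coordinate space. The sketch below shows roughly how an app could tell whether the object is in front of or behind the user; it's a simplified example rather than Apple's sample code, and it assumes a peer token has already been exchanged and the app runs its own ARSession:

```swift
import ARKit
import NearbyInteraction
import simd

// Sketch of iOS 16's camera-assisted Nearby Interaction.
final class GuidedFinding: NSObject, NISessionDelegate {
    let niSession = NISession()
    let arSession = ARSession()

    func start(with peerToken: NIDiscoveryToken) {
        niSession.delegate = self

        let configuration = NINearbyPeerConfiguration(peerToken: peerToken)
        // New in iOS 16: let ARKit's camera data refine the U1 measurements.
        configuration.isCameraAssistanceEnabled = true

        // Share the app's own ARSession with Nearby Interaction.
        niSession.setARSession(arSession)
        niSession.run(configuration)
    }

    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard
            let object = nearbyObjects.first,
            let objectTransform = session.worldTransform(for: object),
            let camera = arSession.currentFrame?.camera
        else { return }

        // Vector from the camera to the object, in ARKit world coordinates.
        let toObject = simd_make_float3(objectTransform.columns.3)
                     - simd_make_float3(camera.transform.columns.3)
        // The camera looks down its negative Z axis.
        let forward = -simd_make_float3(camera.transform.columns.2)
        let isInFront = simd_dot(simd_normalize(toObject), forward) > 0
        print(isInFront ? "The object is in front of you" : "The object is behind you")
    }
}
```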

U1, ARKit, and Apple’s AR/VR headset

Apple's mixed reality headset is expected to feature the U1 chip and ARKit-based technologies.

Multiple recent rumors have pointed to the release of a new mixed reality headset from Apple sometime in late 2022 or early 2023. Although the product wasn't announced at WWDC 2022 and the company didn't say a word about augmented or mixed reality during the opening keynote, there's plenty of AR and VR content in this year's WWDC sessions.

Since the device is expected to have multiple cameras and advanced sensors, including an ultra-wideband chip, it seems clear that it will offer precise spatial awareness. And even though there's no SDK for the new headset, because it hasn't been officially announced, Apple seems to really want developers to prepare their apps for this kind of interaction ahead of the headset's announcement.

When the company first announced the U1 chip with iPhone 11, it mentioned that experiences like faster AirDrop would be just the beginning. U1 is now used for things like car keys in the Wallet app and finding AirTags, but the chip will definitely play a major role in Apple’s mixed reality headset.

On a related note, ARKit has also been updated in iOS 16 to include support for 4K HDR video and advanced room interior scanning – another important step toward an AR/VR device.
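For developers who want to try the higher-resolution capture, opting in is a small change in ARKit 6; here's a quick sketch (the room scanning capability, meanwhile, is exposed through the new RoomPlan framework built on top of ARKit):

```swift
import ARKit

// Sketch: opting an ARKit world-tracking session into 4K video capture (iOS 16+).
let configuration = ARWorldTrackingConfiguration()
if let fourKFormat = ARWorldTrackingConfiguration.recommendedVideoFormatFor4KResolution {
    // Only available on supported hardware; otherwise ARKit keeps the default format.
    configuration.videoFormat = fourKFormat
}

let session = ARSession()
session.run(configuration)
```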

