Apple's AR headset will leave a lot of the hard work to the iPhone – Ars Technica

Apple offices in northern California.

Apple’s long-rumored mixed reality headset will require an iPhone within wireless range to function for at least some apps and experiences, according to a new report in The Information.

The Information’s sources say that Apple completed work on the system-on-a-chip (SoC) for the headset “last year” and that the physical designs for that and two other chips intended for the device have been completed. Apple has also finished designing the device’s display driver and image sensor.

The SoC will be based on TSMC’s five-nanometer manufacturing process, which is current today but may no longer be by the time the headset launches in 2022 or later.

(Note that the headset we’re talking about is the pricey, high-resolution, likely developer-focused mixed reality headset Apple is expected to launch in the relatively near future—not the sleeker, mass-market consumer AR glasses that are planned to come later.)

Critically, the headset will not feature the Neural Engine, the machine-learning processor found in iPhones, iPads, and post-Intel Macs. The Neural Engine is already used to supplement Apple’s existing AR technologies, and it will be essential for future AR apps, too—the headset will just have to rely on a nearby device with that chip to handle those things since it won’t have a Neural Engine of its own.

That said, the headset’s SoC has both a CPU and GPU, suggesting that it will be able to do some things without having to communicate with the phone. The hardware in the headset, however, is said to be less powerful than that found in Apple’s phones or tablets.

But…

On the other hand (and this is just us speculating), it’s much more likely that the onboard CPU and GPU exist to handle tasks that would be inefficient to offload over a wireless link, rather than being an effort to make the device fully functional with no phone present at all.

The SoC for the headset was designed to excel at certain tasks that chips in other products aren’t optimized for. Examples given by The Information’s source included power management to maximize battery life, “compressing and decompressing video,” and “transmitting wireless data between the headset and the host.”

These details give us many insights into exactly how Apple is approaching the underlying technologies for the headset. But the revelations here may not actually come as a surprise to many who’ve been following Apple’s efforts, and work on AR headsets in general, lately.

Other AR devices like the Magic Leap already rely on external processing units, and the heavy batteries that would be required to power a headset that did all its processing locally would be a barrier to user comfort and adoption.

Apple has taken this approach with one key prior device: the Apple Watch. The first several iterations of the device required an iPhone nearby to function, but Apple ultimately switched to making a version of the wearable that could operate fully independently.

Apple has spent the past few years working with AR developers to create tools and APIs that facilitate the development of augmented reality apps, such as ARKit and RealityKit. These have been used to make AR apps that run on smartphone screens, not AR glasses, but much of this work would ultimately be applicable for mixed reality glasses.

Apple has done significantly less work publicly on VR, which is also said to be supported by the new headset.