This is not the first time tech pundits have anticipated Apple entering the augmented reality segment with an AR headset. Now, ahead of the annual Apple event, new information has come to light suggesting that the AR/VR headset will require an active iPhone connection and may offload more processor-heavy tasks to a connected iPhone or Mac.
If the rumors are to be believed, Apple’s AR Glasses will work similarly to an Apple Watch, syncing with a user’s iPhone and displaying texts, emails, maps, and games across the user’s field of view.
According to the source, the Cupertino giant completed work on the headset's 5 nm SoC (System-on-Chip) last year and is awaiting its integration into the device. Apple has also finished two additional chips that will be incorporated into the headset.
These chips lack a neural engine for AI and machine learning, making them less powerful than those found in Apple’s Macs and iOS devices. Instead, they are optimized for wireless data transfer, video compression and decompression, and power efficiency to ensure long battery life. This makes sense if the headset’s primary function is to stream data from another device rather than do the heavy processing itself.
According to The Information, the device will also include an “unusually huge” image sensor, roughly the size of one of the headset’s lenses. Though the sensor has not appeared in prior leaks, the report says it will capture high-resolution image data of the user’s surroundings for augmented reality. Because VR requires completely covering the user’s eyes, while AR requires the user to see the outside world, the image sensor may be used to provide a pass-through view of the surroundings from within the headset.
The chips for the headset are being manufactured by TSMC, and mass production is expected to take at least a year. The first AR/VR headset might arrive as early as 2022, although it could be delayed if development is not completed on time. Meanwhile, TSMC has struggled to produce the chip defect-free and has seen low yields during pilot production. With the image sensor already proving a major pain point, delegating the heavy lifting behind the user experience to an iPhone may well be the most practical option.