Apple’s AR Glasses Are Hiding in Plain Sight
The company’s iOS 13.1, due out today, contains new glimpses of smart glasses currently in progress…
All of which matches up with the notion that Apple is planning a small, lightweight product—one that lives up to the term “wearable” by being more like smart glasses than an unwieldy Microsoft HoloLens. “Fifty-eight degrees doesn’t sound like much compared to an Oculus Rift, but compared to an nreal Light, which is 52 degrees, it’s already pretty competitive,” says JC Kuang, an analyst with AR/VR market intelligence firm VRS. “That’s the exact class of product we need to be looking at when we talk about what the architecture might look like.”
Mike Boland, chief analyst at ARtillery Intelligence, which tracks the augmented-reality market, calls such a product a “notification layer,” and posits it as an introductory device of sorts—one that acts as a bridge between the mobile AR of today and a more powerful headset that could ultimately replace the smartphone. “I’ve always been skeptical of 2020,” he says. “If you look across the industry at the underlying tech, it’s just not ready to build something sleek and light.” However, an intermediary device like the one iOS 13 seems to point to could strike a balance, giving developers the chance to get used to building stereo experiences and develop best practices before needing to fully integrate with the “mirror world.”
A recent patent seems to support the idea as well. “Display System Having Sensors,” which Apple filed in March and was published in July, describes a companion system: a head-mounted device with inward- and outward-facing sensors feeds its inputs to a “controller,” which then “render[s] frames for display by the HMD.” A patent isn’t the same as a plan, obviously, but it’s a hell of a data point.
From Here to ARternity
How Apple gets from phone-tethered smart glasses to a fully realized spatial-computing platform—or how long it takes to do so—remains unclear, but elements of the road map are hidden in plain sight. “A lot of the tech they’ve already built and fully deployed is critical to their goal of building a discreet AR HMD platform,” Kuang says. As an example, he points to last week’s announcement that the iPhone 11 models could take photos of pets in Portrait Mode: “That’s a good example of them working in little tweaks that don’t appear to have relevance to AR, but are super-meaningful if you’re a developer. The ability to recognize nonhuman faces significantly expands your ability to build tools and experiences.”
Two acquisitions Apple has made in recent years also suggest how the company might get there. Kuang traces the current StarBoard testing mode to the 2017 acquisition of a company called Vrvana. At the time, Vrvana’s chief product was a mixed-reality headset that, rather than relying on a transparent “waveguide” display like those in the HoloLens or Magic Leap One, used front-facing cameras to deliver passthrough video to the user. (This is also how a company like Varjo delivers mixed reality using a VR headset.)
“It ruffled some feathers because nobody was really down with a discreet headset using pass-through,” Kuang adds of Vrvana. “But the StarBoard stuff presents exactly that: a Google Cardboard sort of functionality for iPhones. It’s obviously for testing purposes, but it maybe gives us a little more insight into how Apple has been testing AR without having to resort to building a couple of hundred waveguide-enabled devices for testing purposes.”
Apple’s other strategic move, the 2018 purchase of Colorado company Akonia Holographics, looks to have had two possible motivations: not just the waveguide displays Akonia was working on, but also the “holographic storage” that was the company’s original goal. The term refers to storing and accessing data throughout the volume of a material in three dimensions, rather than on its surface as in conventional optical storage; the technique has long eluded commercialization, but could prove pivotal to the long-term vision of AR. “The utopian vision of the end user device is super-lightweight and does functionally no computing compared to where we currently are,” Kuang says. “Everything happens on the cloud. The kind of speed and transfer that comes with holographic storage could be a key part of that.”