Apple's secret weapon in AR is right in front of us

Sometime in the not-too-distant future, Apple will reportedly unveil an augmented- or mixed-reality headset. Apple hasn’t discussed any headgear yet. But augmented reality is alive and well on the iPhone — and it’s getting better fast. 

Apple began its AR journey in 2017, making a splash with virtual Ikea furniture and realistic-looking outdoor Pokemon Go battles. This year, I’ve been standing on street corners scanning fire hydrants with Apple’s new iPhone 12 Pro. I’ve mapped my house’s interior. I’ve navigated lava rivers on my floors.

In many ways, Apple’s depth-sensing lidar sensor on the latest iPhones and iPads, with its advanced 3D-scanning possibilities, feels like the backbone of the Apple headsets of the future.

Facebook, Microsoft and Magic Leap are already exploring goggles and glasses that aim to blend the virtual and real, with more headsets coming in the future using Qualcomm chips. But Apple’s AR mission right now, according to Mike Rockwell, Apple’s head of AR, and Allessandra McGinnis, its senior product manager for AR, is to make everything work better on the device you already have in your pocket. Layering AR onto real-world locations, popping up experiences automatically, building creative tools and developing assistive tech on top of AR’s capabilities could, in the long run, become the biggest killer apps.

“AR has enormous potential to be helpful to folks in their lives across devices that exist today, and devices that may exist tomorrow, but we’ve got to make sure that it is successful,” Rockwell says. “For us, the best way to do that is to enable our device ecosystem, so that it is a healthy and profitable place for people to invest their time and effort.”

Rockwell and McGinnis also talked with me about what’s different now compared to three years ago, and why phones matter so much for what comes next.

Apple’s killer AR app: The phone you already have

Virtual reality headsets like the Oculus Quest 2, while continually improving in quality, are still not used by many people compared with phones. “Nobody is really talking about numbers” of VR headsets sold except Sony, which has sold 5 million PlayStation VR headsets so far, says Anshel Sag, senior consumer chip analyst at Moor Insights, although “there’s a high probability that [the Quest 2] could hit 5-6 million headsets sold in the first year.” But even then, those VR headsets run apps that usually feel removed from the phones and computers we use every day. AR headsets still don’t exist in any significant numbers, even years after Magic Leap and Microsoft’s HoloLens promised an imminent future of mixed reality.

“It’s been a pretty hard road for developers that are VR-only, or are trying to do AR-only experiences,” Rockwell notes. “There just aren’t that many [devices] out there.” Meanwhile, AR-enabled iPhones and iPads, dating back to 2017, number in the hundreds of millions. “Even if you only appeal to a relatively small percentage, that’s still a gigantic number.”

Apple says there are already 10,000 AR-enabled iOS apps from 7,000 developers, with many focused on shopping or home improvement as a way to practically use AR at home. Practicality is exactly what Apple seems most intent on at the moment. “We wanted to provide a platform and ecosystem for developers where they could make a living,” Rockwell says.

While the COVID-19 pandemic has shut down physical businesses and slowed travel for most people, home shopping using AR tools is a major part of Apple’s focus right now. Much as Google and Microsoft are pursuing ways to let you see things you might want to buy in 3D on your phone at home, Apple’s hook-ins to its Safari browser, which enable pop-up AR shopping, look to be stand-ins for going to stores.

“Home Depot’s found that people are two to three times more likely to convert when they view a product in AR than others that don’t,” McGinnis points out, citing numbers from Shopify and Build.com that show a greater likelihood to buy (94%) and a 22% lower rate of returns.
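
Those “view in AR” buttons usually hand the shopper a USDZ model of the product via Apple’s AR Quick Look viewer. As a rough sketch of the technique, and not any particular retailer’s code, here’s how an app could present a placeholder chair.usdz model bundled with it:

```swift
import QuickLook
import UIKit

// Sketch: present a bundled USDZ product model in AR Quick Look.
// "chair.usdz" is a placeholder asset, not a real product file.
final class ProductARPreview: NSObject, QLPreviewControllerDataSource {
    private let modelURL = Bundle.main.url(forResource: "chair", withExtension: "usdz")!

    func present(from viewController: UIViewController) {
        let preview = QLPreviewController()
        preview.dataSource = self
        viewController.present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // USDZ files open in the system AR viewer, letting shoppers drop
        // the product into their own room at real-world scale.
        modelURL as NSURL
    }
}
```

On the web, the same USDZ models can pop straight out of Safari when a page links to them, which is the hook-in the paragraph above describes.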

App developers including Adobe, which makes the AR creative app Aero for Apple’s iOS, seem to see phone-based AR the same way. “Headsets are on our roadmap; none of them has reached the critical mass that makes sense for us to deploy,” Adobe’s head of AR, Stefano Corrazza, says of why the company hasn’t explored headset creative tools beyond acquiring Medium from Oculus: “Until we have an Apple or Google putting something out there in broad scale, it doesn’t make a lot of sense for us to push it out.”

In the meantime, smartphones like the new $999 iPhone 12 Pro can be primary creative tools, building up to headsets down the road. “Even with a headset, the phone will be where all the computation happens,” Corrazza says. “And you can stream to the glasses potentially.”

That’s the same model Qualcomm is already building on for future AR/VR devices, too, but it could take years. In the meantime, there are phones. “It’s going to be the primary device for a while for consuming,” Corrazza says of the iPhone 12 Pro, “but also for scanning and 3D content, it’s a very powerful machine.” Adobe doesn’t use 3D-scanning tools in Aero yet, but may explore ways to incorporate those features down the road.

Lidar as a step towards AR as a creative tool

Apple’s first steps in AR, alongside the iPhone 8, just recognized floors using the phone’s motion sensors, gyros and built-in camera. Then it recognized walls and people. Lidar-enabled iPhones and iPads, which invisibly spray an array of infrared lasers from a small black circle near the rear cameras, go a significant step further by quickly meshing (mapping in 3D) a room’s full dimensions. That also includes 3D objects and people in the space. It’s an evolution of the type of tech that Google explored years ago with a line of depth-sensing Tango phones, but on a more advanced and widespread scale. Many early lidar-enabled apps like Polycam, 3D Scanner and Record 3D are very creative and 3D capture-focused, a big shift from the dinosaur-conjuring, game-playing AR apps back in 2017.
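
That meshing is exposed to developers through ARKit’s scene reconstruction API. Here’s a minimal sketch of how an app asks a lidar iPhone or iPad for the room’s geometry; the wireframe debug overlay is just an easy way to watch the mesh build up:

```swift
import UIKit
import ARKit
import RealityKit

// Sketch: turn on lidar scene reconstruction and receive the room's
// geometry as ARMeshAnchors while you move the device around.
final class MeshViewController: UIViewController, ARSessionDelegate {
    private let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
            return // meshing requires a lidar-equipped iPhone or iPad
        }
        let config = ARWorldTrackingConfiguration()
        config.sceneReconstruction = .mesh
        arView.session.delegate = self
        arView.session.run(config)

        // Draw the reconstructed mesh as a wireframe over the camera feed.
        arView.debugOptions.insert(.showSceneUnderstanding)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for mesh in anchors.compactMap({ $0 as? ARMeshAnchor }) {
            // Each anchor is a chunk of the room: vertices, faces and
            // classifications like wall, floor, table or seat.
            print("New mesh chunk with \(mesh.geometry.faces.count) faces")
        }
    }
}
```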

“That’s part of the reason why we put this scanner on the device. We felt like it was a key technology that could open up an explosion of 3D assets that can be used for all kinds of things,” Rockwell says. “It also opens the possibility of being able to start to scan environments in a way, and be able to make it easier to create 3D objects.”

One of the largest repositories of 3D objects on the internet, Sketchfab, is already seeing an uptick, after years of earlier exploration in 3D scanning. Sketchfab just hit 4 million subscribers and had its first profitable month since the service began in 2012.

But Sketchfab CEO Alban Denoyel says he’s lived through earlier moments when he expected a boom in 3D objects. When VR headsets debuted in 2016, along with a couple of Google’s 3D-scanning Tango phones, there was a lot of hype. Market adoption didn’t happen, though, leading to what Denoyel calls a “VR winter.” It might finally be picking up now.

Snapchat is already exploring using lidar for AR effects that can put virtual things into the real world, and even has larger-scale experiments scanning whole city blocks. “We look at depth as very foundational,” Snapchat’s VP of the Camera Platform Eitan Pilipski says. 

Even with these possibilities, though, I find that learning to use these new tools can be daunting. Apple’s own AR-creation tool, Reality Composer, and Adobe’s 3D AR creative toolkit, Aero, are not apps you’ll instantly download, and I still find myself avoiding them. The 3D-scanning apps I’ve tried so far are fascinating, but also experimental, and not always intuitive. Apple has largely put the world of 3D-scanning apps into developers’ hands, while its everyday core iOS tools barely incorporate these features at all.

Apple’s iOS-wide support for 3D objects does suggest a way that 3D things could eventually be shared like PDFs or photos. But in some ways, the creative tools for this future don’t fully exist yet.

The possibilities for photography could also be amazing, and Apple’s own Camera app uses the iPhone 12 Pro’s lidar to improve focus for night photos and portraits. But Apple doesn’t incorporate AR into its Camera app or allow any 3D scanning yet. Those ideas are left to developers to explore. Some apps, like DSLR Camera, already use the iPhone’s lidar to create custom layers of 3D information on top of photo data, layering text in 3D into photos.

“The app is able to calculate the segmentation between the person and the background object,” says Fulvio Scichilone, the creator of DSLR Camera. “The future plan for the AR portrait … is to move, with the gyroscope or your finger, the frame of the picture.”
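
DSLR Camera’s code isn’t public, but the general technique is available to any developer through Apple’s RealityKit. A hypothetical sketch: raycast from a tapped point on screen onto real-world geometry, then pin extruded 3D text where it lands:

```swift
import ARKit
import RealityKit

// Sketch of layering 3D text into a scene (not DSLR Camera's actual code):
// raycast from a screen point to a real surface and anchor text there.
func placeText(_ string: String, at point: CGPoint, in arView: ARView) {
    // Find real-world geometry under the tapped screen point.
    guard let hit = arView.raycast(from: point,
                                   allowing: .estimatedPlane,
                                   alignment: .any).first else { return }

    // Generate extruded text; RealityKit sizes are in meters, so keep it small.
    let mesh = MeshResource.generateText(string,
                                         extrusionDepth: 0.005,
                                         font: .systemFont(ofSize: 0.08))
    let text = ModelEntity(mesh: mesh,
                           materials: [SimpleMaterial(color: .white, isMetallic: false)])

    // Anchoring at the hit transform keeps the text fixed in the world as
    // the camera moves, which is what makes it feel layered in 3D.
    let anchor = AnchorEntity(world: hit.worldTransform)
    anchor.addChild(text)
    arView.scene.addAnchor(anchor)
}
```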

People Detection recognizes people and measures distance, using AR tech.

Augmented reality as extended senses, and an accessibility tool

Apple sees AR's killer app as discoverability, but there’s another huge opportunity arriving in accessibility. AR can literally extend one’s senses. In the audio realm, Apple’s AirPods can already act as hearing aids, and Facebook is exploring spatial audio for assistive hearing as well.

The same could come for assisting sight. Future vision-assistive products like Mojo Lens’ promised augmented contact lenses aim to be helpful tools for the vision-impaired. Apple could be taking a similar path with how AR on the iPhone, and on future devices, works as an assistive tool. Already, the new People Detection feature in iOS 14.2 uses Apple’s AR and lidar to measure how far away people are, and uses that for vision assistance on new iPhones.
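
Apple hasn’t said how People Detection works under the hood, but ARKit exposes the raw ingredients to developers. Here’s a rough sketch of the technique, assuming a lidar device that supports both scene depth and person segmentation, which estimates the distance to the nearest person in view; it’s an illustration, not Apple’s implementation:

```swift
import ARKit

// Sketch: combine ARKit's person-segmentation buffer with the lidar depth
// map to estimate how far away the nearest person is.
final class PersonDistance: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let semantics: ARConfiguration.FrameSemantics = [.sceneDepth, .personSegmentation]
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(semantics) else {
            return // needs an A12-or-later device with a lidar scanner
        }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = semantics
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let seg = frame.segmentationBuffer,
              let depthMap = frame.sceneDepth?.depthMap else { return }

        CVPixelBufferLockBaseAddress(seg, .readOnly)
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer {
            CVPixelBufferUnlockBaseAddress(seg, .readOnly)
            CVPixelBufferUnlockBaseAddress(depthMap, .readOnly)
        }

        let segW = CVPixelBufferGetWidth(seg), segH = CVPixelBufferGetHeight(seg)
        let depW = CVPixelBufferGetWidth(depthMap), depH = CVPixelBufferGetHeight(depthMap)
        let segRow = CVPixelBufferGetBytesPerRow(seg)
        let depRow = CVPixelBufferGetBytesPerRow(depthMap)
        let segBase = CVPixelBufferGetBaseAddress(seg)!
        let depBase = CVPixelBufferGetBaseAddress(depthMap)!

        // Scan every pixel classified as "person" and remember the smallest
        // depth. A real app would throttle or downsample this per-frame work.
        var nearest = Float.greatestFiniteMagnitude
        for y in 0..<segH {
            for x in 0..<segW where segBase.load(fromByteOffset: y * segRow + x, as: UInt8.self)
                == ARSegmentationClass.person.rawValue {
                // Map segmentation coordinates onto the (possibly differently
                // sized) depth map, whose pixels are Float32 meters.
                let dx = x * depW / segW, dy = y * depH / segH
                let meters = depBase.load(fromByteOffset: dy * depRow + dx * 4, as: Float32.self)
                if meters > 0 { nearest = min(nearest, meters) }
            }
        }
        if nearest < Float.greatestFiniteMagnitude {
            print("Nearest person: roughly \(nearest) meters away")
        }
    }
}
```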

That could just be the beginning. “There’s a lot more we can do, especially related to our understanding of the environment that is around us,” Rockwell says. “We can recognize people, but if you think about what a human being can understand about an environment, there’s no reason that in the fullness of time a device can’t have that level of understanding, too, and provide that to developers.”

“We’ll be working together with the blind and partially sighted communities to improve specifically on the people-detection side,” adds McGinnis.

A location-based AR art exhibition by Alicja Kwade in Acute Art.

AR’s future killer app: Being instant

Even though I cover AR all the time, I admit I forget to look for new AR apps when I use an iPhone or iPad in my daily life. Discovering what’s new in the virtual world while I’m busy in the real one isn’t a seamless process.

Rockwell sees the future of iPhone AR not as apps, but as quick-glance moments. “Something that you’re dipping in and out of three, four, five, six times a day to do various things, and they’re lightweight experiences,” he explains. “The killer app is really that it’s going to be used on a kind of regular basis all the time in these little ways that help you to do the things that you do today, that make them easier and faster.”

The road to that involves App Clips, Apple’s new micro-apps in iOS 14 that pop up on an iPhone without needing a download. App Clips can be triggered by NFC tags or scannable codes placed in the real world. I could scan or tap, and suddenly bring up AR related to the place I’m in, such as a virtual menu or a museum exhibit brought to life.
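
On the developer side, an App Clip launches with a web URL describing where it was invoked, and that URL can route straight into the right AR experience. A hypothetical sketch, with the URL scheme and ContentView made up for illustration:

```swift
import SwiftUI

// Hypothetical App Clip entry point: an NFC tag or App Clip Code at a
// restaurant table invokes the clip with a URL identifying the experience.
@main
struct MenuClip: App {
    @State private var experienceID: String?

    var body: some Scene {
        WindowGroup {
            ContentView(experienceID: experienceID)
                // App Clips receive a browsing-web activity whose webpageURL
                // encodes the physical spot the user tapped or scanned.
                .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                    guard let url = activity.webpageURL else { return }
                    // e.g. https://example.com/ar?experience=menu-table-12
                    experienceID = URLComponents(url: url, resolvingAgainstBaseURL: true)?
                        .queryItems?.first { $0.name == "experience" }?.value
                }
        }
    }
}

struct ContentView: View {
    let experienceID: String?
    var body: some View {
        // A real clip would launch the AR scene for this ID instead.
        Text(experienceID.map { "Loading AR experience: \($0)" }
             ?? "Tap or scan a code to begin")
    }
}
```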

It also involves Apple’s mapping efforts. Apple’s new Location Anchors mean virtual AR objects can exist in real-life locations — imagine seeing a virtual piece of art in Times Square — shared by multiple people at the same time. 

“If it’s in one of the areas that we have high-res mapping, which is quite a lot in the US … if it’s within one meter, you can place an experience,” Rockwell says of Location Anchors, promising a better-than-GPS level of location-specific accuracy. Meanwhile, App Clips, which are triggered by particular QR codes or anchors in the real world, “can be down to centimeters of accuracy.”
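
In ARKit terms, a Location Anchor is an ARGeoAnchor pinned to latitude and longitude. Here’s a minimal sketch, using Times Square’s coordinates purely as an example; the availability check matters because geo tracking only works where Apple has that high-res map data:

```swift
import ARKit
import CoreLocation

// Sketch: pin AR content to a real-world coordinate with a Location Anchor.
// The Times Square coordinates are illustrative only.
func placeGeoAnchor(in session: ARSession) {
    ARGeoTrackingConfiguration.checkAvailability { available, _ in
        guard available else { return } // no high-res map coverage here
        DispatchQueue.main.async {
            session.run(ARGeoTrackingConfiguration())

            // Anything attached to this anchor appears at that street corner,
            // and anyone whose session resolves the same spot sees it there.
            let timesSquare = CLLocationCoordinate2D(latitude: 40.758, longitude: -73.9855)
            session.add(anchor: ARGeoAnchor(coordinate: timesSquare))
        }
    }
}
```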

Both of these are still a work in progress for Apple’s AR efforts: In a year of pandemic-induced isolation, people have been less likely to visit the public places, stores and museums where this type of location-based AR tech could emerge. But Apple sees them as crucial for people using AR on a daily basis.

“We knew we had to solve those problems in order for AR to become a mainstream experience — I think we really are quite on the cusp of that for folks to have AR become something that is more a part of their everyday life,” Rockwell says.

“My perception is that App Clips and Anchors will make a massive difference,” Acute Art CEO Jacob De Geer says. Acute Art is an app that already hosts AR exhibits in real-world locations, but one of the current challenges to people finding this art is knowing it’s there. “The main issue, not just in AR but everything in tech now is, ‘Hey, how do you get people to download your app?'”

Another challenge for AR: It's really not any one thing. Is it 3D art? Is it a series of tools to spatially scan the world and sense everything better? In that way, maybe AR is invisible. Maybe it's a philosophy similar to how Google sees AR as a world-scanning tool.

“Often we hear people are using AR [apps] and don’t know what they are,” McGinnis says, referring to popular iPhone tools like Warby Parker’s instant AR-enabled glasses try-on. “As it becomes more and more mainstream, it doesn’t matter if you know it’s AR or not. It matters that you have an amazing experience in your device.”

The future groundwork is being laid now

Combine Apple’s lidar-based 3D scanning, its increasingly capable AR tools for realistic visuals and the AirPods Pro’s introduction of spatial audio, which can make things you’re listening to sound like they’re moving in 3D space, and it isn’t hard to imagine a future Apple AR headset.

Apple won’t comment on that. But in the meantime, the company is working on encouraging a groundswell of developers to make AR-ready apps. And whether or not a headset arrives anytime soon, more spatially aware iPhones and iPads will turn the phones into world-scanning devices with possibilities of their own, maybe even for robotics, or as computer-vision-enabled cameras in unexpected places.

“These things are, kind of in the beginning, a delicate thing, and you have to have all of the elements there, all these ingredients, for them to be successful,” Rockwell says. 

“A few years from now, it’ll be one of those things where you kind of can’t remember living without it, just like the internet,” he adds. “You’re going to feel like, wow, I’m using this on a regular basis … it will become just integrated into our lives.”