Apple’s software is generally very good. Even as the company has spread its focus across more platforms than ever — macOS and iOS and iPadOS and tvOS and watchOS and whatever software Apple is building for its perhaps-one-day car and almost-certainly-soon AR/VR headset — those platforms have remained consistently excellent. It’s been a while since we had an Apple Maps-esque fiasco; the biggest mistakes Apple is making right now are much more at the level of putting the Safari URL bar on the wrong part of the screen.
What all that success and maturity produces, however, is the sense that Apple’s software is… finished — or at least very close. Over the past few years, the company’s software announcements at WWDC have been almost exclusively iterative and additive, with few major shake-ups. Last year’s big iOS announcements, for example, included some quality-of-life improvements to FaceTime and some new types of IDs that work in Apple Wallet. Otherwise, Apple has mostly just rolled out new settings menus: new notification controls, Focus mode settings, privacy tools — that sort of thing.
This is not a bad thing! Nor is the fact that Apple is the best fast follower in the software business, remarkably quick to adapt and polish up others’ new ideas about software. Apple’s devices are as feature-packed, long-lasting, stable, and usable as anything else you can find. Too many companies try to reinvent everything for no reason and end up creating problems where they didn’t exist. Apple is nothing if not a relentlessly efficient machine, and that machine is hard at work sharpening every pixel its devices render.
But we are at a turning point in technology that will demand more from Apple. It’s pretty clear now that AR and VR are Apple’s next big thing, the next supposedly earth-shattering industry after the smartphone. Apple probably won’t show off a headset at WWDC, but as augmented and virtual reality become an increasingly large part of our lives, everything about how we experience and interact with technology will have to change.
Apple, of course, has been flaunting AR for years. But mostly in demos — things you can see or do on the other side of the camera. We’ve seen very little from the company about how it thinks AR devices are going to work and how we’re going to use them. The company that loves to rave about its input devices needs a few new ones, and a new software paradigm to go with them. That’s what we’re going to see at WWDC this year.
Remember last year when Apple showed you could take a photo of a piece of paper with your iPhone and it would automatically scan and recognize any text on the page? Live Text is an AR feature through and through: it’s a way to use your phone’s camera and AI to understand and catalog information in the real world. The whole tech industry thinks this is the future — that’s what Google is doing with Maps and Lens and what Snapchat is doing with its lenses and filters. Apple needs a lot more where Live Text came from.
From a pure UI perspective, one thing AR needs is a much more efficient system for getting information and performing tasks. No one is going to wear AR glasses that send them Apple Music ads and news alerts every six minutes, right? And full-screen apps that demand your undivided attention are becoming a thing of the past.
We might get some hints of what that will look like: it sounds like “use your phone without getting lost in your phone” will be a theme at this year’s WWDC. According to Bloomberg’s Mark Gurman, we may see a new iOS lock screen that shows useful information without your having to unlock your phone. A more glanceable iPhone seems like an excellent idea and a good way to prevent people from opening their phones to check the weather and finding themselves deep in a TikTok hole three and a half hours later. The same goes for the rumored “interactive widgets” that let you perform basic tasks without opening an app. And if Focus mode gets some rumored upgrades — and especially if Apple can make Focus mode easier to set up and use — it could be a really handy tool on your phone and an absolutely essential one on your AR glasses.
I would also expect Apple to bring its devices much closer together, both in what they do and how they do it, in an effort to make the whole ecosystem more usable. With a nearly full lineup of Macs and iPads running on Apple’s M-series chips — and perhaps a full lineup after WWDC, when the highly anticipated Mac Pro finally arrives — there’s no reason for the devices not to share more DNA. Universal Control, which was probably the most exciting iOS 15 announcement even if it only shipped in February, is a great example of what it looks like for Apple to treat its many displays as parts of one ecosystem. If iOS 16 brings true freeform multitasking to the iPad (and boy, I hope it does), then an iPad in a keyboard dock is basically a Mac. Apple used to avoid that kind of proximity; now it seems to embrace it. And if it ultimately sees all these devices as companions and accessories for AR glasses, it needs them all to work well together.
The last time Apple — hell, the last time anybody — had a really new idea about how we use gadgets was in 2007, when the iPhone launched. Since then, the industry has gone down a yes-and path, improving and adapting without ever really breaking the foundations of multitouch. But AR is going to break all that. There is no other way. That’s why companies are working on neural interfaces, trying to perfect gesture control, and trying to figure out how to display everything from translated text to maps and games on a small screen in front of your face. Meta is already shipping and selling its best ideas; Google’s have come out in the form of Lens features and sizzle videos. Now Apple needs to show the world how it thinks an AR future works. Headset or no headset, that will be the story of WWDC 2022.