I can’t be the only person who’s noticed, right? For the last four to six Apple Events, AR has been sneaking around the virtual stages in full view of us viewers, disguised as CGI, but it’s just augmented reality. Long gone are the days of “AR games” being a 10-minute demo we all sit through; AR is everywhere now, and this year Apple stepped up what started last WWDC and baked AR into nearly every product demonstration, not just the cutscenes.
You can include me in the group of people who were skeptical of augmented reality, because my entire understanding of AR was based on Google Glass, and because I assumed AR was just Apple delivering what’s possible today instead of spending the next few years working on virtual reality. Tim Cook has publicly said AR is where it’s at, not VR, but I figured that’s an easy statement when VR is still a ways away. Oculus is fine, but it’s simple. We’re still closer to Second Life VR than real-life VR (by real life I mean VR that looks like the world around us). AR, though, is just overlaying context, adding spatial and contextual awareness to a real-world event or object.
The recent Apple Events have sold me on AR and I’m really excited! Here are a few AR features coming that aren’t just “here’s a chair in a room” or “here’s a board game on a flat table that’s 20 feet wide”:
- Look at someone and, using the accelerometer and gyroscope in AirPods Pro, the audio beam-forms toward the person speaking so you can hear them better. It mutes out the surroundings to amplify whatever you’re looking at…AMAZING!
- Apple Maps in iOS 15 knows which way you’re facing by looking at the objects in your iPhone camera, tracing them on screen, comparing that data to the 2D polygons from the Apple Maps vans, and overlaying the street you should walk down on the iPhone’s screen…HUGE for those who don’t have a natural sense of direction or are in a new city.
- Spatial audio everywhere. AirPods Pro / Max customers can wear their headphones full time, enjoy spatial and 3D audio from movies, TV shows, music, and the people and things around them, and have full control over the experience. There are already iOS settings that boost car horns or sirens for the hearing impaired, so they’re aware of an event happening around them even with noise cancelling turned on.
- “What is this a picture of?” in the iOS Camera app. It uses ML to tell you live, as you’re taking a photo, what’s in frame. WOW
- Additional ML processing of objects in the Photos app
- AR Apple Events. I loved all of the virtual developers in the audience, but it’s not crazy to think that two years from now we’ll be able to experience an AR Apple Event no matter where we are
- Live translation using photos. Imagine looking at a French menu in Paris next fall with your Apple glasses on and being able to read and understand the menu without asking for the English version
- Apple Maps in iOS 15 looks to be getting a lot of upgrades. The 2D view showing every bicycle and pedestrian path, pop-up restaurants, lane guidance, merge and zipper lanes, and other restrictions isn’t rocket science, but baked into a car with a heads-up display it’s just early AR (depending on the data source powering it), and it’s going to be game-changing for distracted driving. We’re all still looking down at navigation devices; HUDs and glasses on our faces can present this simple information far better than an audible “keep left,” by showing the actual lane
There are a lot of small, nuanced AR clues if you go back and watch the presentation again. Apple is sort of teleporting people to white rooms where the only thing filling the room is the user’s imagination. That kind of creative white space is made possible today by the creative apps inside our screens, but soon, like the sci-fi movies of 20 years ago, we’ll have AR all around us, and developers are going to do some amazing things. Apple’s Events are like mini AR expos if you treat the CGI as AR and remember that in two to five years, these graphics will just be on our faces, out in the world we’re viewing through our frames. Pretty remarkable. I’m stoked.