Spatial Handoff
I haven’t been able to try Apple Vision Pro yet, but the Mac screen-sharing feature reminded me of the “Design for Spatial Interactions” session from WWDC 2021. It’s fascinating that Apple has been using this “spatial” term for a few years now.
Like all Apple Design videos, it’s excellent. They talk about the precision locating UI for AirTag and the interactions behind handoff for HomePod mini. Putting your iPhone near your HomePod will move audio from iPhone to HomePod. It’s pretty magical. In both cases they’re talking about how physical — or spatial — actions can convey user intent, something traditionally conveyed with taps or buttons.
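(For the curious: the proximity signal behind that HomePod mini handoff comes from the U1 ultra-wideband chip, and Apple exposes the same kind of ranging to third-party developers through the NearbyInteraction framework on recent iPhones. Here’s a rough sketch of using distance, rather than a tap, as the trigger; the peer-token exchange and the handoff step itself are hypothetical placeholders.)

```swift
import NearbyInteraction

/// Sketch: treat physical proximity as intent, the way HomePod mini handoff does.
/// Assumes `peerToken` was exchanged out of band (e.g. over a local network
/// connection) and that `beginHandoff()` is your own transfer logic.
final class ProximityHandoff: NSObject, NISessionDelegate {
    private let session = NISession()
    private let handoffThreshold: Float = 0.15 // metres

    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    // Called repeatedly with updated distance (and direction) to the peer device.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let distance = nearbyObjects.first?.distance else { return }
        if distance < handoffThreshold {
            beginHandoff()
        }
    }

    private func beginHandoff() {
        // Hypothetical: move audio playback over to the nearby device.
    }
}
```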
This is relevant to Vision Pro’s Mac screen-sharing feature, which is triggered simply by looking at your Mac. There are other clever touches, like the autocomplete bar that floats above your physical Bluetooth keyboard, but this “just look at it” gesture seems under-utilized in visionOS 1.0 and could be extended much further.
For example: you’re watching a movie in Vision Pro and your spouse arrives home. What if you could look at your AirPlay-enabled TV and have playback continue on that screen, letting you share that experience in reality?
Or say you’re listening to music on Vision Pro. Look at a HomePod to hand off the audio. (There are dozens of HomePod owners who could benefit from something like this.) And what if the audio then seamlessly followed you as you moved from room to room, without blasting on every speaker in the house?
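There’s no public API today for programmatically choosing an AirPlay destination, so the “look at it” trigger is pure speculation on my part, but the playback plumbing already exists in AVFoundation and AVKit. A minimal sketch of the app side on iOS, where `gazeDidLandOnSpeaker()` stands in for the imagined gaze event:

```swift
import AVFoundation
import AVKit
import UIKit

final class HandoffPlayerViewController: UIViewController {
    private let player = AVPlayer(url: URL(string: "https://example.com/stream.m3u8")!)
    private let routePicker = AVRoutePickerView()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Allow this player to send its audio/video to an AirPlay device.
        player.allowsExternalPlayback = true
        view.addSubview(routePicker)
        player.play()
    }

    /// Hypothetical: called when a gaze event identifies an AirPlay target.
    /// Apps can only surface the system route picker; actually routing to the
    /// device you looked at would need OS-level support.
    func gazeDidLandOnSpeaker() {
        routePicker.isHidden = false
    }
}
```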
Maybe looking at a HomePod could cause a flurry of your favorite albums to flutter above it. With a glance you could start playing music, eliminating the need to recall and verbalize a lengthy album title and artist, and bringing back that familiar feeling of the family record player.
Or would it be possible to use your iPhone’s keyboard for Vision Pro just by looking at your phone? (This could work like the Apple TV keyboard input handoff.) The typing experience on visionOS without a physical keyboard is severely lacking, so this could be a game-changer.
(As an extension of this, what if you could AirPlay your iPhone into Vision Pro? That could be a sweet solution for watching offline videos while traveling, if native apps from big players like YouTube and Netflix never show up.)
Or what if someone wants to show you a photo on their phone while you’re wearing the headset like a sociopath? Maybe they could cast it into your headset so you could enjoy a beautiful panorama from their recent trip.
These are just a few examples that come to mind. I think there is a huge opportunity to rethink how we interact with digital things in a physical way. By all accounts Apple Vision Pro and visionOS are very 1.0 and I’m excited to see where this takes us over the next decade.