So we’re just going to gloss over the fact that Apple said we can use the TrueDepth camera to scan our heads and ears to optimize our sound experience with the AirPods Pro 2? Am I the only one who thinks this perk, officially labeled “personalized spatial audio,” is the most interesting update announced during Wednesday’s ‘Far Out’ Apple event?
The social media sphere made a big to-do about the iPhone 14 Pro’s Dynamic Island, the new 48MP wide-angle camera, the Apple Watch Series 8’s crash detection, and satellite connectivity, but custom sound optimized for you and you alone? That blows all of the other features out of the water.
Personalized spatial audio
The AirPods Pro 2 were unveiled with a new H2 chip, improved active noise cancellation, enhanced transparency mode and improved acoustics. And to the delight of those who often misplace their possessions, Apple added a speaker to the AirPods Pro 2 case that plays a loud sound when Find My is activated, making it easier for users to recover their lost earbuds.
While those updates are all fine and dandy, none of them caught my eye like Apple’s new personalized spatial audio perk. Apple described spatial audio as sound that makes it seem like you’re on stage with the musician belting their heart out, but personalized spatial audio (PSA) takes that to a whole new level.
“With iOS 16, you can use the TrueDepth camera on iPhone to create a personal profile for spatial audio because the way we all perceive sound is unique based on the size and shape of our head and ears,” said Senior Engineer of AirPods Firmware Mary-Ann Rau. In other words, based on your ear shape, the AirPods Pro 2 will ensure that sound experiences are perfectly tuned for your unique physical features.
Users who have the iOS 16 beta installed can experiment with the PSA UI. “Use iPhone to capture a high-resolution scan of your ear geometry for improved spatial audio on all your Apple devices,” the PSA page says. This tells us the feature isn’t an AirPods Pro 2 exclusive, but it’s certainly a highlight people will enjoy when iOS 16 officially rolls out in the coming days.
A Twitter user took screenshots of the PSA process, and here are the steps:
#iOS16 Personalized Spatial Audio 👀 pic.twitter.com/cilKIpbJKo (June 6, 2022)
1. First, position your face in the camera frame.
2. Move your head in a circle to show all the angles of your face.
3. Bring the phone up to the right side of your head, about 1 to 2 feet away, to capture your right ear.
4. Turn your head slightly left and then right to capture front, side and back angles.
5. You will hear a confirmation sound when all angles are captured.
Once the camera finishes scanning your head and ear profiles, you’ll get a message that says, “You have completed your Personalized Spatial Audio setup.”
Although getting my ears scanned for top-notch audio quality is awesome, it’d blow my mind if, one day, Apple used this tactic to send us customized ear tips for its popular earbuds line. I don’t know if it’s just me, but I’m tired of my AirPods Pro falling out of my ears, so beyond personalized spatial audio, I want to see Apple earbuds uniquely designed for my ear shape.
Stay tuned for our AirPods Pro 2 review! We’ll let you know whether the earbuds are, indeed, a significant step above their predecessors.