On supported devices, Android 13 may offer spatial audio with head tracking.
Spatial audio combined with head tracking reimagines how we listen to music and video. For those unfamiliar with the technology, it creates a three-dimensional audio experience that responds to the listener's movement. It requires a compatible device, a supported headset or speaker, and compatible audio content. Once those conditions are met, the sound adjusts to your head movement.
This results in a highly lifelike output that easily outperforms conventional stereo audio. Certain iPhones and AirPods already offer the capability through supported applications. Google, for its part, has been working on it since Android 12L, which shipped with limited support. Android 13 may finally enable spatial audio with head tracking in full, provided your hardware meets the feature's requirements.
Head tracking enhances the realism of spatial audio. The underlying technology uses the accelerometer and gyroscope built into many modern headsets to track head movement and adjust the audio output accordingly. According to Esper's Mishaal Rahman, Android 13's latest pre-release code fully supports spatial audio with head tracking on supported devices. He writes:
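On Android, the platform entry point for this is the `android.media.Spatializer` class, obtained via `AudioManager.getSpatializer()`. As a purely conceptual, self-contained sketch of the idea (not Android's actual audio pipeline, which uses HRTF rendering rather than simple panning), head tracking can be modeled as counter-rotating the virtual source against head yaw so the sound stays anchored in the room; all names below are illustrative:

```kotlin
import kotlin.math.PI
import kotlin.math.cos
import kotlin.math.sin

// Illustrative only: a constant-power stereo pan that counter-rotates the
// virtual source when the head turns, so the source stays "fixed" in space.
data class StereoGains(val left: Double, val right: Double)

fun headTrackedPan(sourceAzimuthDeg: Double, headYawDeg: Double): StereoGains {
    // Direction of the source relative to where the head is now facing.
    val relativeDeg = (sourceAzimuthDeg - headYawDeg).coerceIn(-90.0, 90.0)
    // Map [-90°, +90°] onto a constant-power pan angle in [0, π/2].
    val panAngle = (relativeDeg + 90.0) / 180.0 * (PI / 2)
    return StereoGains(left = cos(panAngle), right = sin(panAngle))
}

fun main() {
    // Head facing forward, source dead ahead: equal gains in both ears.
    println(headTrackedPan(0.0, 0.0))
    // Head turned 90° to the left: the same source now sits at the right ear.
    println(headTrackedPan(0.0, -90.0))
}
```

The point of the counter-rotation is exactly what the article describes: as the gyroscope reports a head turn, the renderer shifts the output so the source appears stationary rather than glued to the listener's head.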
Audio HAL v7.1 introduces APIs for controlling a variable latency output stream mode. If the device intends to provide spatial audio with head tracking over a Bluetooth A2DP connection, latency mode control is necessary. There are two latency modes: FREE (no specific constraint on the latency) and LOW (a relatively low latency compatible with head tracking operations, typically less than 100 ms).
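The two modes in that quote boil down to a simple selection rule: request LOW only when head tracking is active and the link can actually meet the latency budget. A minimal sketch of that logic, with an assumed enum and function that do not mirror the real HAL interface:

```kotlin
// Illustrative model of the two latency modes described for Audio HAL v7.1;
// the enum and selection function are assumptions, not the actual HAL API.
enum class LatencyMode { FREE, LOW }

// LOW is described as "typically less than 100 ms".
const val LOW_LATENCY_BUDGET_MS = 100

// Request LOW only when head tracking is active and the A2DP link can
// deliver audio within the low-latency budget; otherwise stay in FREE.
fun selectLatencyMode(headTrackingActive: Boolean, linkLatencyMs: Int): LatencyMode =
    if (headTrackingActive && linkLatencyMs < LOW_LATENCY_BUDGET_MS) LatencyMode.LOW
    else LatencyMode.FREE

fun main() {
    println(selectLatencyMode(headTrackingActive = true, linkLatencyMs = 80))   // LOW
    println(selectLatencyMode(headTrackingActive = false, linkLatencyMs = 80))  // FREE
}
```

The 100 ms budget matters because head-tracked audio that lags the head by more than roughly that amount stops feeling anchored and breaks the spatial illusion.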
Android 13 is currently in beta testing. As a result, features, APIs, and other details may change or disappear by the time the stable public release arrives. We can only hope that spatial audio with head-tracking support makes it into the final release and that app developers take advantage of it.