Bluetooth Low Energy Audio (LEA) ensures that users can receive high fidelity audio without sacrificing battery life, and lets them seamlessly switch between different use cases. Android 13 (API level 33) includes built-in support for LEA.
Most LEA headsets will be dual mode until the LEA source device market share grows. Users should be able to pair and set up both transports on their dual mode headsets.
You may want to integrate LEA for the following use cases:
Sharing audio: Users can simultaneously share multiple audio streams to one or more audio sink devices. Audio is synchronized between the source device and connected devices.
Broadcast Audio: Users can broadcast audio to friends and family, while also connecting to public broadcasts for information, entertainment, or accessibility.
LC3 audio codec support: This is the default audio codec for LEA and replaces the SBC codec used for A2DP (media) and mSBC used in HFP (voice). LC3 is more efficient, reconfigurable, and delivers higher audio quality.
Audio sampling improvements: Headsets can maintain high output audio quality when using microphones. Bluetooth classic lowers audio quality when using Bluetooth microphones. With BLE Audio, input and output sampling can reach 32 kHz.
Stereo microphone: Hearables can record audio with stereo microphones for spatial audio enhancements.
Hearing Aid Profile (HAP) support: HAP offers users greater accessibility and usability than the earlier ASHA protocol. Users can use their hearing aids for phone calls and VoIP applications.
Enhanced Attribute protocol (EATT) support: EATT allows developers to send multiple commands at once to paired hearables.
There are four main categories of use cases:
Conversational: Dialer and VoIP applications that require low-latency communication routing benefit from high-quality audio and lower battery usage.
Gaming: Concurrent microphone and high fidelity playback allows for games to stream high quality audio to hearables. A gaming app can access BLE audio input when a game arms the Bluetooth microphone as ready to use. Then, when a player starts a live conversation with a peer player, the game app can use the microphone data without delay.
Media: Media applications are allowed to set the audio manager's preferred device. The user can override this by changing their preferred device from within the system's settings.
Accessibility: Hearing aids that support BLE Audio can now use the microphone, allowing users to continually use their hearing aids for a call.
BLE Audio APIs and methods
The following APIs and methods are required to support BLE Audio hearables:
`setCommunicationDevice()` selects the audio device to use for communication use cases, such as voice or video calls. Voice or video chat applications can use this method to select an audio device other than the one the platform selects by default. This method replaces deprecated APIs such as `startBluetoothSco()`, `stopBluetoothSco()`, and `setSpeakerphoneOn()`.
`clearCommunicationDevice()` is called after your app finishes a call or session to help ensure the user has a great experience when moving between different applications.
`BluetoothLeAudio` controls the Bluetooth LE Audio service via a proxy object.
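As a sketch of how the `AudioManager` methods above fit together, the following assumes API level 33+; the function names `selectLeAudioForCall` and `endCall` are illustrative, not platform APIs:

```kotlin
import android.content.Context
import android.media.AudioDeviceInfo
import android.media.AudioManager

// Sketch: route call audio to an LE Audio headset during a voice/video call.
fun selectLeAudioForCall(context: Context): Boolean {
    val audioManager = context.getSystemService(AudioManager::class.java)
    // Look for an available LE Audio headset among the communication devices.
    val leaDevice = audioManager.availableCommunicationDevices
        .firstOrNull { it.type == AudioDeviceInfo.TYPE_BLE_HEADSET }
        ?: return false
    return audioManager.setCommunicationDevice(leaDevice)
}

// Sketch: clear the selection when the call or session ends so other
// apps start from the platform's default routing.
fun endCall(context: Context) {
    val audioManager = context.getSystemService(AudioManager::class.java)
    audioManager.clearCommunicationDevice()
}
```

Calling `clearCommunicationDevice()` at the end of every call keeps routing predictable when the user moves between apps.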
Telecom InCallService
`setAudioRoute()` sets the audio route to the current active device.
`CallAudioState.ROUTE_BLUETOOTH` directs the audio stream through Bluetooth.
`requestBluetoothAudio()` requests audio routing to a specific Bluetooth device.
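A minimal sketch of these calls inside an `InCallService` subclass (a managed calling app, declared in the manifest so Telecom can bind it); the class and method names here are illustrative:

```kotlin
import android.bluetooth.BluetoothDevice
import android.telecom.CallAudioState
import android.telecom.InCallService

// Sketch: audio routing from a Telecom-managed in-call UI service.
class MyInCallService : InCallService() {

    // Route audio to whichever Bluetooth device is currently active.
    fun routeToBluetooth() {
        setAudioRoute(CallAudioState.ROUTE_BLUETOOTH)
    }

    // Route audio to one specific Bluetooth device (API level 28+).
    fun routeTo(device: BluetoothDevice) {
        requestBluetoothAudio(device)
    }
}
```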
Audio Device Info
`AudioDeviceInfo.TYPE_BLE_HEADSET` describes the audio device type as an LEA device. Use it to identify whether a hearable is an LEA device.
`setPreferredDevice()` sets the preferred device for audio routing. The user can override this in the system settings.
`isLeAudioSupported()` returns whether the platform's hardware supports LEA.
`isLeAudioBroadcastSourceSupported()` returns whether the platform's hardware supports acting as an LEA broadcast source.
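The checks above can be combined as in the following sketch, which prefers an LEA headset for media playback; it assumes API level 33+ and that `BluetoothAdapter.isLeAudioSupported()` returns a `BluetoothStatusCodes` value (it is not a boolean). The function name `preferLeaHeadset` is illustrative:

```kotlin
import android.bluetooth.BluetoothManager
import android.bluetooth.BluetoothStatusCodes
import android.content.Context
import android.media.AudioDeviceInfo
import android.media.AudioManager
import android.media.MediaPlayer

// Sketch: if the platform and a connected hearable support LEA,
// set the hearable as the preferred playback device.
fun preferLeaHeadset(context: Context, player: MediaPlayer): Boolean {
    val adapter = context.getSystemService(BluetoothManager::class.java).adapter
    if (adapter.isLeAudioSupported() != BluetoothStatusCodes.FEATURE_SUPPORTED) {
        return false
    }
    val audioManager = context.getSystemService(AudioManager::class.java)
    val leaSink = audioManager.getDevices(AudioManager.GET_DEVICES_OUTPUTS)
        .firstOrNull { it.type == AudioDeviceInfo.TYPE_BLE_HEADSET }
        ?: return false
    // The user can still override this choice in system settings.
    return player.setPreferredDevice(leaSink)
}
```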
Guides based on use case
Below are guidelines for implementing LEA based on specific use cases.
Voice communication applications
Voice communication applications can either self-manage audio routing and device state, or use the Telecom API, which handles the audio routing and state logic for you.
Self-managed: For applications that currently use `setSpeakerphoneOn()` or want to self-manage the audio routing state, follow the Audio Manager self-managed call guide.
Managed: Use the Telecom API to create an audio or video calling application. This API lets you quickly and easily control audio routing and switch between Bluetooth devices. For more information, see the Telecom managed calls guide.
Audio recording applications
- MediaRecorder: When recording audio using `MediaRecorder`, you can now record in stereo if the Bluetooth hearable supports LEA. Check out the Audio recording guide.
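A sketch of stereo capture with `MediaRecorder`, assuming API level 31+ (for the `Context`-taking constructor) and a hearable that supports stereo microphones; the output file path and function name are illustrative:

```kotlin
import android.content.Context
import android.media.MediaRecorder

// Sketch: record stereo audio from the active Bluetooth microphone.
fun startStereoRecording(context: Context): MediaRecorder =
    MediaRecorder(context).apply {
        setAudioSource(MediaRecorder.AudioSource.MIC)
        setOutputFormat(MediaRecorder.OutputFormat.MPEG_4)
        setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
        setAudioChannels(2)          // stereo capture
        setAudioSamplingRate(32_000) // LEA input can reach 32 kHz
        setOutputFile("${context.filesDir}/recording.m4a")
        prepare()
        start()
    }
```

Call `stop()` and `release()` on the returned recorder when the recording session ends.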
LE Audio (LEA) headset recommendations
As more LEA headsets are released, we have discovered issues in real-world testing that degrade the user experience. The specification does not cover all of these issues. The following table provides a list of recommendations that LEA headset manufacturers should follow to improve the end-to-end experience for Android users.
| Recommendation | Rationale |
| --- | --- |
| Support Cross Transport Key Derivation (CTKD) for dual mode headsets. | Most new LEA headsets will be dual-mode until the LEA source device market share grows. It's important that users are able to pair their dual-mode headsets seamlessly and to set up both transports. This is also important for Google Fast Pair. |
| Support Targeted Announcements (TAs) if you want your LEA headsets to reliably reconnect to source devices.<br><br>LE Audio earbuds should use TAs to request an incoming connection from central devices. This behavior will be added to an upcoming BT SIG specification. | Unlike BR/EDR's paging model, where a connection can be initiated by either the phone or the headset, a connection in LEA must be initiated by the central device. Currently, many headsets do not use TAs, which means the central device might not be able to reconnect to the peripheral without adding it to an allowlist. However, an allowlist workaround might prevent the headset from connecting to a different central device. Therefore, it's important for LEA headsets to support TAs properly so that the central device can reliably reconnect without workarounds that might break multipoint connections. |
| Optimize discoverability for dual mode earbuds. | This prevents dual-mode LEA earbuds from appearing as duplicate entries in Bluetooth settings, which might confuse users and compromise the LEA pairing experience.<br><br>Dynamic leader election is especially important for dual-mode devices that are paired incrementally. For example, if only one earbud is available at initial pairing, it should present itself as a dual-mode device. When the user pairs the second earbud later, they only need to pair the LE component, and CSIP ensures the earbuds are grouped together on Android.<br><br>Using the identity address during pairing is recommended because the BR/EDR component already exposes the device's public address to nearby devices. |
| Support Enhanced Attribute Protocol (EATT). | Reduces pairing and connection latency. |
| Support robust GATT caching. | Reduces connection latency, especially for TWS earbuds. |
| Support connection subrating. | Allows for more flexible packet scheduling and potential battery savings. |
| Ensure that during pre- and post-processing for both playback and capture, the signal processing pipeline can operate at 16, 24, 32, and 48 kHz, as well as higher frequencies. | Takes advantage of the higher sampling rates supported for LEA call or VoIP capture paths and media playback. |
| Support LE Power Control. | Enables better power management. |

Context type support

| Recommendation | Rationale |
| --- | --- |
| Use all of the context types specified in Assigned Numbers 6.12.3 unless the headset explicitly does not support a given context type. | For example, if the "Game" context type is not supported, then Android won't send game sounds. In particular, note that the "Unspecified" context type doesn't mean "any context type", and it doesn't cover unsupported context types. |
| When the central device interacts with the peripheral device's ASCS, the peripheral must connect to the central device's MCS and TBS. | The central device might not always use LE Audio as the streaming route, because it might fall back to A2DP or HFP. The peripheral device can use ASCS interaction, such as a read, write, or register-for-notification, as an indication of whether the central device will use LE Audio for streaming. |