Auracast to the experts: AudioCollab 2025 looks at innovative future of hearing health
Auracast is one of a number of innovations that will shape the future of the audio industry.
The new Bluetooth LE Audio technology was one of the topics explored during a panel discussion at AudioCollab 2025. The one-day conference, held at The Soho Hotel in central London, looked to the future.
Across the day, sessions explored wireless audio devices, the future of immersion, car audio, and audio streaming, alongside sessions dedicated to retail and consumer issues. The event was organised by Futuresource Consulting.
One session, The Innovative Future of Hearing Health, was a panel discussion that explored how technology is making massive leaps in helping people to hear safely and well.
Chaired by Futuresource’s Rasika D’Souza, it featured Qualcomm Technologies product manager Blair Werp, who has a special responsibility for health and wellness; the RNID’s technology lead Alastair Moore; and Elfed Howells, the Co-founder and Commercial Director at Elevear, which provides audio companies with algorithms for their hearables, including firmware for noise cancellation and hearing health.
Rasika opened the discussion by reminding the audience that the World Health Organisation estimates that by 2050, half a billion people will have some form of hearing loss, and one in 10 will have a disabling hearing loss. She also said that while historically hearing loss has been related to ageing, the event had reminded people that there are more and more younger consumers who listen for longer periods of time, at home, in public spaces or in their workplaces.
As such, hearing health was becoming increasingly front and centre in the audio industry.
Elfed echoed this, saying that some headphones have powerful noise-cancellation software that can attenuate outside sound by 30 decibels: “even your normal hearables that you buy in supermarkets can do that,” he said. “It’s capable of making as much noise in your ear as a hammer drill. You have a very powerful tool in your ear, and if it’s not configured properly, doesn’t work properly, or you’re not using it correctly, there is potential there for damage.”
Companies are looking at smart devices that can help people who wear them all day, Alastair said. These can also help to get the message about hearing health across and raise awareness at an earlier stage, so that people wear devices sensibly, or use amplification to address hearing loss.
Hearing can be a forgotten sense, Blair said, adding that it is being brought up more and more with the younger generation, but there is a need to ensure that they receive reliable information.
“At younger ages, people are exposed to sound but not necessarily to the consequences of them. Understanding that is becoming more and more important in audio.”
The way we consume audio, and how we judge when loud is too loud, has changed over the years. Elfed said that with traditional hi-fi systems, if the volume was too much, neighbours would bang on the ceiling or knock on the door.
“The problem now is that you can get the same level of sound in your ears and nobody but you can hear it,” he said. “Unless you are self-policing, unless you are looking after your own health, your mum, your dad – they’re not going to tap you on the shoulder and say that is too loud.
“Other people are not giving you cues because your enjoyment is very much an insular thing; it’s just you by yourself.”
As an example, he cited a fellow passenger on a railway platform who stood too close to the edge as a train arrived. She couldn’t hear the shouted warnings because she had headphones on. It was only when someone waved at her that she became aware there was an issue.
“She was completely oblivious,” Elfed said. “The issue there is you are socially isolated, even in a social situation … from a hearing health point of view, isolation is probably not so good.”
The mental effort of hearing in a noisy environment can also be an issue for people, as Alastair explained: “It can lead to people not just withdrawing from the situation but just not engaging in social situations at all. It’s a slippery slope … your life becomes more insular.”
Blair said that it can take an average of 10 years for people to realise their hearing has degraded to the point where they need to do something about it.
Later in the discussion, Alastair talked about the need to get people living with hearing loss involved with design processes right from the start to ensure the user has a good experience with the devices or the app.
“It’s not trivial,” he said, citing an example of how one company had worked with the RNID on speech enhancement features on its next-generation soundbar. The partnership included workshops and experiments to get the best results.
“It made clear to the engineering team that you can push this stuff a long, long way. The audiophile customer might say that artefacts are creeping in, but for the people testing, they wanted to keep it going and going.”
The result was a product whose speech enhancement was pushed further than usual, with the feature available from the main menu rather than being tucked away in an accessibility menu, which can put some people off.
AI has a role to play in the future of hearing health, and this was touched on by the panel. We have already covered the deep neural networks used in GN’s hearing aids in this article.
Blair said that Qualcomm has a voice enhancement and speech intelligibility programme, which helps with the way people interact with audio, though it isn’t necessarily aimed at those with hearing impairments. AI can, she added, help you hear people as you look at them.
Alastair agreed, saying that useful data can be collected, such as the sounds people are reacting to and whether those reactions mean help is needed with hearing health. AI could also be used to let people describe what is not quite right about the sound they are experiencing in an environment.
Elfed said that AI is a broad subject, and also not new: he was working on automatic number plate recognition back in 1996. “We make algorithms that have AI in them, and we make them for chips that have limited resources. They’re amazing … but you don’t have a data centre under the desert in Nevada, you have a small device running on batteries in your ear.”
The AI in hearing health devices is, he added, very different to the AI that people hear about every day.
The session on Wednesday, November 12, closed with Alastair talking about Auracast, reminding people about the low latency of the Bluetooth LE Audio standard and the advantages this will have for assistive listening.
“It means we can do really interesting speech communication stuff, where you can send audio from a microphone … to the entire room. So if you are wearing hearing aids that are compatible, or earbuds or headphones, then you can get really clear audio directly to your ear.
“We (at the RNID) have been promoting and advocating this technology for a while, and we have started doing tests with companies and at venues like Bristol Temple Meads station, which had a trial installation in the summer. Passengers can hear train announcements for the first time.
“People have been welling up when they have a sound experience that is genuinely revelatory. This is an experience that is like that.”
