Loewe Announce Mimi Defined, the World’s First Personalised TV Audio Experience
Can’t quite hear? Just tell the TV your age.
AI was very much in evidence at the recent CES, and its use for improving the audio experience was one of the benefits being trumpeted. Loewe, however, has partnered with leading hearing authority Mimi Hearing Technologies to create Mimi Defined, a slightly different take on what the other TV manufacturers are doing.

After a preview showing at IFA in 2018, German luxury TV manufacturer Loewe and partner Mimi Hearing Technologies have now revealed plans to incorporate their personalised sound optimisation software into Loewe's new TVs, as well as to provide an upgrade for many existing models as part of their long-term partnership.
The software, previously available only through headphones but now working with the TV's own speakers, uses algorithms to create a hearing profile for the user based on age. Hearing is individual and changes over the course of one's life: the perception of sound generally peaks around the age of 20 and gradually declines thereafter.
Rather than just relying on the gran-favoured 'turn up the volume' approach, Mimi Defined draws on data from over a million anonymised Mimi Hearing Test results. Mimi argues that, to achieve an optimal sound experience, the volume of individual frequencies within the sound must be calculated and adapted to the listener's hearing capabilities. The Mimi Defined technology therefore analyses audio content in real time and adjusts it to individual hearing sensitivity.
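To picture what per-frequency adjustment means in practice, here is a toy sketch. The gain curve, coefficients, and function names below are entirely hypothetical illustrations of the general idea (boosting the frequency bands where age-related sensitivity typically falls), not Mimi's actual model or API:

```python
import math

def age_gain_db(freq_hz: float, age: int) -> float:
    """Hypothetical gain curve: boost higher frequencies more as age increases.

    Age-related hearing loss mainly affects high frequencies, so the boost
    grows with both age past 20 and log-frequency above 1 kHz. The 0.3
    coefficient is an arbitrary illustrative value.
    """
    years_past_20 = max(age - 20, 0)
    return 0.3 * years_past_20 * math.log10(max(freq_hz, 1000.0) / 1000.0)

def personalise(band_levels_db: dict, age: int) -> dict:
    """Apply the per-band gain to a set of audio band levels (in dB)."""
    return {f: level + age_gain_db(f, age) for f, level in band_levels_db.items()}

# Example: flat-ish content levels across four bands, adjusted for a 65-year-old.
bands = {250: -6.0, 1000: -3.0, 4000: -3.0, 8000: -3.0}
print(personalise(bands, age=65))
```

A 20-year-old's profile leaves the signal untouched, while older profiles receive progressively more high-frequency lift instead of a blanket volume increase.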
Of course, you would only be able to watch in this way on your own, but the mode can be adjusted or turned off within the Loewe TV settings.
While the technology comes free on new Loewe TVs, owners of existing models will have to pay £99 for the Mimi Defined upgrade, which is available for Bild 3 models and higher.
One wonders whether this individually tailored approach could also be applied to the image processing in the latest TVs, given their growing reliance on AI and databases to check, scene by scene, that the picture parameters in use are still the best fit.
Colour perception also changes with age, so imagine a TV that could adjust its colour output to recreate the original colours for an older movie enthusiast. It might also help those with other visual impairments. Isn't this what technology is supposed to do? Be inclusive? Bring individuals together by allowing the same experience to be shared by everyone, despite physical or health differences? An approach like this could start to build a bridge between entertainment and assisted-living technologies.
After all, why shouldn’t your gran be able to see and hear Avengers: Infinity War as you do?
Are there any other areas where AVForums members think this sort of AI-based machine learning might usefully be applied? Let us know in the thread.
To comment on what you've read here, click the Discussion tab and post a reply.