One of the new Android Q features Google presented at Google I/O 2019 should please many users. Called Live Caption, it provides real-time captions for any video or audio playing on your device. The option is obviously useful for people with impaired hearing, but it can also serve many other users in certain situations.
This captioning option works with all the apps you use (YouTube, Pocket Casts, Skype, Instagram, and so on), as well as with videos and audio you record yourself on your device. It must first be enabled in the accessibility settings; after that, you can toggle it with one of your phone's volume buttons. The captions then appear in a small black box overlaid on the content, and you can move the box around the screen to suit your preferences.
To achieve this, Google runs machine learning directly on the device. The Mountain View company explains that Live Caption does not require an Internet connection to work, and that the captions are not stored: they disappear automatically.
The demonstrations given during the conference were simply stunning, with excellent transcription quality. It remains to be seen whether Live Caption performs as well in other languages as it does in English.
The introduction of this feature is part of Google's commitment to making its services accessible to as many users as possible. Google explains that "for 466 million deaf and hard of hearing people around the world, subtitles are more than just a convenience: they make content more accessible."
What do you think of this new feature?