Live Relay And Captioning Makes Android Devices More Accessible To People With Disabilities
One of the focal points for this year’s Google I/O event is enhancing accessibility options and controls for people with disabilities. Live relay and live captioning for Android use speech recognition and synthesis to make the content of your device more accessible.
These new Android features could ease communication for people with hearing impairments and other conditions, without relying on an internet connection.
The features were announced during the first day of Google’s I/O developers’ conference. They rely on improved speech-to-text and text-to-speech algorithms that run on-device rather than sending audio off the phone to be decoded.
Live Relay And Caption To Help People With Speech Issues And Hearing-Related Disabilities
The live relay feature is designed to help people with speech issues and hearing impairments, for whom phone calls can be a daunting task. In practice, it works much like Google Duplex.
The feature enables a live-transcribed phone call: the caller can type custom messages or use Smart Reply to send pre-composed quick replies, all based on the ongoing conversation. The system also works the other way around, transcribing the caller’s spoken replies into text for the other person to read.
Another feature built on improved speech recognition is live captioning. It does much the same thing as live relay, but for videos instead of calls.
Imagine watching a YouTube video or even recording one yourself: live captioning adds closed captions to make the content more accessible. It can even be used with voice messages and video calls, letting users see and read what the other person is saying in real time.
This feature can be incredibly useful not only for people with hearing disabilities but also for people who do not speak the same language, bridging the gap between all types of users.
It helps in everyday situations, too: if you want to watch a video in bed without waking anyone up, you can simply mute it and read the live captions instead.
Live Relay Is Still In The Works…
Google CEO Sundar Pichai said the live relay feature is still “under development,” although it was demoed during the I/O event to show how it will work.
Live captioning should be available when Android Q releases, with some device restrictions. Meanwhile, there is still no definite date for when live relay will arrive, although the demo of what looks like a nearly complete feature suggests the wait won’t be too long.
There is no doubt that these features will help people with hearing-related conditions and speech issues communicate more easily. We can’t wait to use them as soon as they are available on our devices.