08 May 2019
At the Google I/O 2019 keynote, Google announced Live Relay, a new accessibility feature aimed at making phone calls possible for people who cannot speak or hear for any reason. While anyone can use it, it is primarily intended for people with disabilities who struggle with phone calls or cannot take them at all.
Live Relay combines speech recognition and text-to-speech, switching between the two depending on what is needed. A user who cannot speak can type their side of the conversation, and Live Relay reads the text aloud to the other person on the call. Likewise, for users who are deaf or hard of hearing, it transcribes the caller's spoken words into text they can read.
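To make the two-way flow concrete, here is a minimal sketch (not Google's implementation) of how a Live Relay-style bridge routes a call between a typing user and a speaking caller. The `speech_to_text` and `text_to_speech` functions are hypothetical stand-ins for on-device recognition and synthesis engines; audio is represented as plain strings purely for illustration.

```python
def speech_to_text(audio: str) -> str:
    # Stand-in for an on-device speech recognizer. In this sketch,
    # "audio" is already a string, so recognition is the identity.
    return audio

def text_to_speech(text: str) -> str:
    # Stand-in for an on-device speech synthesizer; the <spoken> tag
    # just marks that the text would be played as audio.
    return f"<spoken>{text}</spoken>"

def relay_incoming(caller_audio: str) -> str:
    """Caller speaks -> user reads a live transcript on screen."""
    return speech_to_text(caller_audio)

def relay_outgoing(typed_text: str) -> str:
    """User types -> caller hears synthesized speech."""
    return text_to_speech(typed_text)

# One exchange in each direction:
print(relay_incoming("Hi, is this a good time to talk?"))
print(relay_outgoing("Yes, but I can only type right now."))
```

The key point the sketch captures is that the same feature runs both pipelines at once, picking the right direction for each participant.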
The best part is that Live Relay performs both tasks on-device, without connecting to any servers. That means users don't need an active internet connection to use it, and conversations stay private because no data is sent to the cloud.
Even though Live Relay is currently aimed primarily at people with disabilities, Google's ultimate goal is to make it useful for everyone, for example in situations where your surroundings make it impractical to speak aloud. The team is also working on real-time translation support, so users could rely on it even when the conversation is in a language they don't know or aren't fluent in. The feature is still in the research phase, and it will be some time before it becomes publicly available.