Google has recently announced two new tools that let disabled people control Android phones with face gestures. According to Google, people with speech or physical disabilities can now operate their Android-powered smartphones hands-free by raising an eyebrow or smiling.
The newly released tools use machine learning and the front-facing camera to detect face and eye movements. Users can scan their phone screen and select a task by raising their eyebrows, smiling, opening their mouth, or looking to the left, right, or up.
According to the Centers for Disease Control and Prevention, 61 million adults in the United States are disabled. As a result, Google and competitors Apple and Microsoft are determined to create products and services that are simple for them to use.
Here is How Disabled People Can Use Android Phones with Face Gestures
“Every day, people use voice commands, like ‘Hey Google’, or their hands to navigate their phones. However, that’s not always possible for people with severe motor and speech disabilities,” Google said in a blog post.
“To make Android more accessible for everyone, we’re launching new tools. They make it easier to control your phone and communicate using facial gestures,” Google said.
One of the tools is “Camera Switches,” which allows people to interact with smartphones using their faces rather than swipes and taps.
The other is Project Activate, a new Android app that lets people use gestures to trigger actions such as having a phone play a recorded phrase, send a text, or make a call.
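To make the idea concrete, the gesture-to-action behavior described above can be sketched as a simple lookup table. This is a minimal, hypothetical illustration only — the gesture names, action names, and `dispatch` function are assumptions for the sketch, not Google's actual Project Activate API.

```python
# Hypothetical sketch: mapping recognized facial gestures to phone actions,
# in the spirit of Project Activate. Names here are illustrative assumptions.

GESTURE_ACTIONS = {
    "smile": "speak_recorded_phrase",
    "raise_eyebrows": "send_text",
    "open_mouth": "make_call",
    "look_left": "scan_previous_item",
    "look_right": "scan_next_item",
}

def dispatch(gesture: str) -> str:
    """Return the action assigned to a recognized gesture.

    Unrecognized gestures are ignored rather than raising an error,
    since a camera-based recognizer will inevitably produce noise.
    """
    return GESTURE_ACTIONS.get(gesture, "ignore")

# Example: a detected smile triggers playback of a recorded phrase.
print(dispatch("smile"))   # speak_recorded_phrase
print(dispatch("frown"))   # ignore
```

In the real apps, users can customize which gesture triggers which action to match their own range of movement, which is why a configurable mapping rather than hard-coded behavior is the natural design.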
“Now it’s possible for anyone to use eye movements and facial gestures that are customized to their range of movement to navigate their phone — sans hands and voice,” Google said.
The free Activate app is available on the Google Play store in Canada, the United Kingdom, Australia, and the United States, and Google says it will reach other countries soon.
Apple, Google, and Microsoft have all released innovations that make internet technology more accessible to people with disabilities. Voice-controlled digital assistants are built into speakers and smartphones, for example. These allow people who have vision or movement issues to instruct computers on what to do.
Similarly, software exists that recognises text on web pages or in images and then reads it aloud. Captions that display what is said in videos are also generated automatically.
In addition, Apple has built an “AssistiveTouch” feature into the Apple Watch that lets users control the display through movements like finger pinches or hand clenches.
“This feature also works with VoiceOver so you can navigate Apple Watch with one hand while using a cane or leading a service animal,” Apple said in a post.
Similarly, Microsoft describes accessibility as essential to empowering everyone with technology tools.
“To enable transformative change, accessibility needs to be a priority. We aim to build it into what we design for every team, organization, classroom, and home,” Microsoft said.