Google is rolling out a few new accessibility features for Android users, including the ability to control your phone and communicate using facial gestures, the company said Thursday. It also rolled out an update to Lookout, an app that uses the phone's camera to identify objects and text in the physical world.
The first update, called Camera Switches, detects facial gestures using your phone's camera. Users can choose from six gestures — look right, look left, look up, smile, raise eyebrows or open their mouth — to navigate their phone. They can also assign gestures to carry out tasks like opening notifications, going back to the home screen or pausing gesture detection.
Camera Switches is an update to Switch Access, an Android feature launched in 2015 that lets people with limited dexterity more easily navigate their devices using adaptive buttons called physical switches. Now, with Camera Switches, users can scan and select items on their phone without using their hands or voice. The new feature can be used alongside physical switches.
Camera Switches also allows users or their caregivers to choose how long a gesture must be held and how pronounced it needs to be for the phone to detect it. To use the feature, open your phone's settings, select Accessibility, and then tap Switch Access (under Interaction Controls). Turn it on and grant permissions. You can also download the app from the Play Store. (Here's a video on how to set up Camera Switches.)
Additionally, a new Android app called Project Activate lets people use those same facial gestures from Camera Switches to trigger customized actions with a single gesture, such as saying a preset phrase, sending a text or making a phone call.
For instance, someone could use Project Activate to answer yes or no to a question, ask for a moment to type something into a speech-generating device or send a text asking someone to come to them.
The app is customizable, from the actions you want to trigger to the facial gestures you want to use. Project Activate is available in English in the US, UK, Canada and Australia. You can download it from the Google Play Store.
Lastly, Google rolled out an update to Lookout, which launched in 2019 and helps people who are blind or have low vision identify food labels, pinpoint objects in a room and scan US currency. Last year, the search giant updated the app to capture text on a page. Now, that feature can also read handwritten text. It currently supports handwriting in Latin-based languages, with more language compatibility coming soon, Google said. Additionally, Lookout's currency mode now recognizes euros and Indian rupees, in addition to US dollars. More currencies will be added, the company said.
Google has been working to increase its accessibility offerings in recent years as more Silicon Valley giants recognize the importance of inclusive design. Last year, Google rolled out a series of updates to Maps, Live Transcribe and Sound Amplifier aimed at improving user accessibility. Facebook has worked to improve photo descriptions for blind and visually impaired users, while also rolling out automatic captions on Instagram's IGTV. And Apple launched a People Detection feature last year, which lets blind and low-vision iPhone and iPad users know how close someone is to them.