Two new tools put machine learning and phone cameras to work detecting facial and eye movements. People with disabilities can scan their phone screens by smiling, raising their eyebrows, opening their mouths, or looking left, right, or up.

The Centers for Disease Control and Prevention (CDC) estimates that 61 million adults in the United States live with a disability, which has prompted Google and rivals Apple and Microsoft to make their products and services more accessible. The result is two new features. The first, Camera Switches, lets people use facial gestures to operate their phones. The second is Project Activate, a new Android app that lets people use the same gestures to trigger actions, such as having the phone speak a recorded phrase, send a text, or make a call. The free Activate app can be downloaded from the Google Play Store in Australia, the United Kingdom, Canada, and the United States.

Apple, Google, and Microsoft regularly launch innovations that make internet technology more accessible to people with disabilities. Voice-controlled digital assistants built into speakers and smartphones let people with vision or movement impairments tell their devices what to do. There is also software that identifies text on a web page or in an image and reads it aloud, as well as automatic caption generation that displays what is said in a video.
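The gesture-to-action mapping behind such features is conceptually simple: an on-device face model estimates signals like a smiling probability or head angles, and thresholds on those signals fire switch actions. Below is a minimal Kotlin sketch of that idea using Google's publicly documented ML Kit Face Detection API. The thresholds, gesture names, and onGesture callback are illustrative assumptions; Camera Switches' actual pipeline has not been published.

```kotlin
// Sketch: mapping per-frame face signals to switch-style gestures.
// Assumes the ML Kit face-detection dependency (com.google.mlkit:face-detection)
// and a camera pipeline that supplies InputImage frames.
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.Face
import com.google.mlkit.vision.face.FaceDetection
import com.google.mlkit.vision.face.FaceDetectorOptions

class GestureSwitch(private val onGesture: (String) -> Unit) {

    private val detector = FaceDetection.getClient(
        FaceDetectorOptions.Builder()
            // Classification mode adds smiling/eye-open probabilities.
            .setClassificationMode(FaceDetectorOptions.CLASSIFICATION_MODE_ALL)
            .setPerformanceMode(FaceDetectorOptions.PERFORMANCE_MODE_FAST)
            .build()
    )

    /** Feed one camera frame; invokes onGesture when a gesture is detected. */
    fun processFrame(image: InputImage) {
        detector.process(image)
            .addOnSuccessListener { faces -> faces.firstOrNull()?.let(::classify) }
            .addOnFailureListener { /* drop this frame; the camera keeps streaming */ }
    }

    private fun classify(face: Face) {
        val smile = face.smilingProbability ?: 0f
        // Head angles in degrees. NOTE: sign conventions depend on camera
        // facing and mirroring; calibrate on-device rather than trusting these.
        val yaw = face.headEulerAngleY
        val pitch = face.headEulerAngleX
        when {
            smile > 0.8f -> onGesture("SMILE")      // e.g. "select"
            yaw > 20f    -> onGesture("LOOK_LEFT")  // e.g. "previous item"
            yaw < -20f   -> onGesture("LOOK_RIGHT") // e.g. "next item"
            pitch > 15f  -> onGesture("LOOK_UP")    // e.g. "scroll"
        }
    }
}
```

A production switch interface would debounce these signals across several frames and let users tune sensitivity, since single-frame thresholds misfire easily on natural facial movement.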