Co-creating a world where people with disabilities can thrive
Making accessible technology with and for people with disabilities.
Transforming how people connect with Pixel and Android accessibility features
We’re committed to making sure people with disabilities can use Android and Pixel phones as easily and effectively as possible. Working together with the disability community, we’ve added accessibility features like:
- Live Caption, which uses AI to caption audio content across a person’s phone and the Chrome browser
- Guided Frame, a Pixel accessibility feature that helps people with blindness, low vision, and dexterity-related disabilities take selfies and group photos with more ease
- Pixel Call Assist, which helps users avoid long hold times, navigate phone tree menus, ignore the calls they don't want, and get better sound quality on the calls they do want
- Voice recognition technology that makes voice typing on a Pixel device in English nearly three times faster than tapping
- Live Transcribe, an Android feature that provides automatic captions in more than 70 languages to help people who are deaf or hard of hearing participate in conversations
- Lookout, an Android feature that helps people with visual impairments get descriptions of the physical world using their phone's camera and ask follow-up questions by voice or keyboard
- Reading mode on Android, a more accessible reading experience designed for and with people with low vision, blindness, and dyslexia
- Flash notifications on Android, which provide a visual notification so people who are deaf or hard of hearing don’t have to rely on vibration and can keep their phones at a distance
Harnessing gesture-led communication to help people express themselves and reach their goals
We’re developing tools that tap into the power of gesture-led communication to help people communicate, use technology, and reach their goals. We’re early on in this journey, but we’re excited to see what possibilities this new technology can unlock.
For example, Project Gameface, designed in collaboration with Lance Carr, a quadriplegic video game streamer, is an open-source, hands-free gaming mouse that lets people control a computer’s cursor using head movements and facial gestures. Another example is PopSign, a game developed by students from the Georgia Institute of Technology and the National Technical Institute for the Deaf at the Rochester Institute of Technology, along with Google engineers and the Kaggle community. PopSign uses AI to help everyone – including Deaf children and their caregivers – become more confident at using sign language.
Helping people make their voices heard with personalized speech recognition for Android
More than 250 million people around the world experience difficulty making their words understood due to non-standard speech. That’s why we designed Project Relate, an Android app offering personalized speech recognition to help people with non-standard speech communicate more easily in face-to-face conversations. Developed through years of research and based on speech samples contributed by more than 1,000 individuals, the app is now available in beta in Australia, Canada, India, New Zealand, and the United States.
With Google Maps, our goal is to build a better, more helpful map for everyone, including the tens of millions of people worldwide who use wheelchairs. Since 2020, the Accessible Places feature in Google Maps has made it easier to identify whether a place has a wheelchair-accessible entrance, indicated by the wheelchair icon. More information – like wheelchair-accessible seating, parking, and restrooms – is also available, helping people plan visits with confidence.
As of 2023, we’re able to provide wheelchair accessibility information for more than 40 million businesses around the world. This work is made possible by more than 120 million Local Guides who’ve responded to our call to share accessibility information, as well as decades of advocacy by individuals and groups fighting for equal access for people with disabilities. Were it not for them, there would be far fewer accessible places for Google Maps to show.
Empowering every student to learn how they learn best
With built-in accessibility features, Google for Education helps students and educators customize their learning tools and create inclusive learning environments so they can learn, teach, and collaborate with confidence. Products like Chromebooks, Classroom, and Assignments are designed to support the individual learning styles of everyone who uses them, including students with disabilities. Explore our accessibility features and learning resources to get started.
Creating new AI-powered tools in partnership with the blind and low-vision community
We’re excited about the power that AI has to help people with disabilities more easily communicate, access information, and achieve their unique goals. Here are some of our latest AI-driven innovations designed in collaboration with the blind and low-vision community:
- Lookout uses AI to help people explore the world and accomplish everyday tasks, like sorting mail and putting away groceries. Now this Android feature is even more powerful with the addition of Image Q&A, which generates image descriptions and answers any follow-up questions you ask using your voice or typing. The app recently became available in Japanese, Korean, and Chinese, bringing the total number of supported languages to 34.
- Based on our internal analysis, there are more than 360 million PDFs online that are inaccessible to people who rely on screen readers. With the help of AI, we're creating a built-in Chrome browser feature that extracts text from PDFs, enabling blind and low-vision readers to access these documents using screen readers.
- We recently added new features and functionality to our TalkBack screen reader for Android and Wear OS, which people in the blind and low-vision community can use to read, write, navigate apps, and more.
With your help, we’ll get closer to building a truly accessible world.
Get in touch with feedback
Have feedback on a product or service? Reach out and support our work.
Participate in UX research
Get involved in the product creation process by sharing your perspective.
Become an accessibility trusted tester
Want to test our products to ensure they’re fully accessible? Join our Trusted Tester Program, available in the U.S., Canada, and the U.K.
Representing the authentic experiences of people with disabilities
Improving representation of people with disabilities in AI datasets
We’re working to improve representation of people with disabilities by including more members of the disability community in datasets and reference sources. For AI, that means specifically training our image technology on thousands of photos of people from every background, including individuals with chronic conditions and disabilities, as well as people across the gender spectrum and models with darker skin tones. More representative datasets mean better experiences for everyone on products like Google Photos, Google Pixel, and Pixel cameras.
Investing in accessibility and disability innovation, from London to Cannes
In 2022 we launched the Accessibility Discovery Centre in London. It’s a space where our engineers, researchers, product teams, and partners can explore and build new kinds of accessible technologies. Built in consultation with Google’s Disability Alliance employee resource group and local partners like the U.K.’s Royal National Institute of Blind People and Royal National Institute for Deaf People, the Centre builds on our years of investment and innovation in helpful and accessible technology. Since then, we’ve expanded to Dublin, Zurich, and Dubai.
We’re taking what we learn and applying it to our events and partnerships, with the goal of inspiring change across the creative industry. As the first official accessibility partner of Cannes Lions, we’re working with Cannes to make sure the festival is built with people with disabilities in mind, including by providing sign language interpreters and using Live Transcribe for real-time captioning. We also ensure that the Google Beach, our home base at Cannes, is wheelchair-accessible and offers CART captioning, Live Transcribe, and ASL interpreters.

By making a product more accessible for people with disabilities, we’re enabling people to learn more, to accomplish more. I feel so proud to be able to help make our products better for people so that they can do anything that they want to do in their lives.