Accessible design allows users of all abilities to navigate, understand, and use your UI successfully.
A well-designed product is accessible to users of all abilities, including those with low vision, blindness, hearing impairments, cognitive impairments, or motor impairments. Improving your product’s accessibility enhances usability for all users. It’s also the right thing to do.
Material design’s built-in accessibility considerations will help you accommodate all of your users. This section primarily applies to mobile UI design. For more information on designing and developing fully accessible products, visit the Google accessibility site.
Color and contrast
Use color and contrast to help users see and interpret your app’s content, interact with the right elements, and understand actions.
Accessible color palette
Choose primary, secondary, and accent colors for your app that support usability. Ensure sufficient color contrast between elements so that users with low vision can see and use your app.
According to the World Wide Web Consortium (W3C), the contrast ratio between a color and its background ranges from 1 to 21, based on luminance (the intensity of light emitted).
Contrast ratios represent how different one color is from another, and are commonly written as 1:1 or 21:1. The higher the ratio, the greater the difference in relative luminance between the two colors.
The W3C recommends the following contrast ratios for body text and image text:
Small text should have a contrast ratio of at least 4.5:1 against its background.
Large text (at 14 pt bold/18 pt regular and up) should have a contrast ratio of at least 3:1 against its background.
Icons or other critical elements should also use the above recommended contrast ratios.
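These ratios can be computed directly. The sketch below follows the W3C relative-luminance formula (linearize each sRGB channel, weight the channels, then compare the lighter and darker luminances); the class and method names are illustrative:

```java
public class ContrastRatio {
    // Linearize an 8-bit sRGB channel per the W3C relative-luminance formula.
    static double linearize(int channel) {
        double c = channel / 255.0;
        return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
    }

    // Relative luminance of a 0xRRGGBB color (0 = black, 1 = white).
    static double luminance(int rgb) {
        double r = linearize((rgb >> 16) & 0xFF);
        double g = linearize((rgb >> 8) & 0xFF);
        double b = linearize(rgb & 0xFF);
        return 0.2126 * r + 0.7152 * g + 0.0722 * b;
    }

    // Contrast ratio between two colors, in the range 1 to 21.
    static double contrast(int a, int b) {
        double la = luminance(a), lb = luminance(b);
        double lighter = Math.max(la, lb), darker = Math.min(la, lb);
        return (lighter + 0.05) / (darker + 0.05);
    }

    public static void main(String[] args) {
        // Black on white is the maximum possible contrast: 21:1.
        System.out.printf("white on black: %.2f:1%n", contrast(0xFFFFFF, 0x000000));
        // #767676 on white is just above the 4.5:1 threshold for small text.
        System.out.printf("#767676 on white: %.2f:1%n", contrast(0x767676, 0xFFFFFF));
    }
}
```

A check like `contrast(text, background) >= 4.5` can flag small text that fails the recommendation before it ships.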
Logos and decorative elements
While decorative elements (such as logos or illustrations) don’t have to meet contrast ratio requirements, any element that carries important functionality should remain distinguishable.
Other visual cues
For users who are colorblind or otherwise cannot perceive differences in color, include design elements in addition to color so that they receive the same information.
Colorblindness takes different forms (including red-green, blue-yellow, and monochromatic). Use multiple visual cues to communicate important states. Use elements such as strokes, indicators, patterns, texture, or text to describe actions and content.
Sound and motion
Give visual alternatives to sound, and vice versa. Provide closed captions, a transcript, or another visual alternative to critical audio elements and sound alerts.
Allow users to navigate your app using sound by adding descriptive labels to UI elements. When using a screen reader such as TalkBack and navigating by touch exploration, labels are spoken aloud when users touch UI elements with their fingertips.
The following sounds should be avoided:
Unnecessary sounds that play over a screen reader, such as background music that autoplays when entering a web page. If there is background sound, ensure users can safely pause or stop it.
Extra sounds added to native elements (screen readers will be able to interpret native elements correctly).
Controls in an app may be set to disappear after a certain amount of time. For example, five seconds after starting a video, playback controls may fade from the screen.
High-priority controls
Avoid using timers on controls that perform high-priority functions, as users may not notice these controls if they fade away too quickly. For example, TalkBack reads controls aloud when they receive focus; placing them on timers may prevent users from completing their task.
For controls that enable other important functions, make sure that the user can turn on the controls again or perform the same function in other ways. Learn more in hierarchy and focus.
Touch targets
Material design’s touch target guidelines enable users who aren’t able to see the screen, or who have motor-dexterity problems, to tap elements in your app.
Touch targets are the parts of the screen that respond to user input. They extend beyond the visual bounds of an element. For example, an icon may appear to be 24 x 24 dp, but the padding surrounding it comprises the full 48 x 48 dp touch target.
Touch targets should be at least 48 x 48 dp. A touch target of this size results in a physical size of about 9mm, regardless of screen size. The recommended target size for touchscreen elements is 7-10mm. It may be appropriate to use larger touch targets to accommodate a larger spectrum of users, such as children with developing motor skills.
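Because 1 dp is defined as 1/160 of an inch, dp values can be converted to physical pixels from the screen density. A minimal sketch of such a check (the class and method names are illustrative; on Android the density would come from DisplayMetrics.densityDpi):

```java
public class TouchTarget {
    static final int MIN_TOUCH_TARGET_DP = 48; // Material minimum

    // Convert dp to physical pixels: 1 dp = 1/160 inch, so px = dp * (dpi / 160).
    static int dpToPx(int dp, int densityDpi) {
        return Math.round(dp * densityDpi / 160f);
    }

    // Check whether a touch target (in pixels) meets the 48 x 48 dp minimum.
    static boolean meetsMinimum(int widthPx, int heightPx, int densityDpi) {
        int min = dpToPx(MIN_TOUCH_TARGET_DP, densityDpi);
        return widthPx >= min && heightPx >= min;
    }

    public static void main(String[] args) {
        // On an xxhdpi (480 dpi) screen, 48dp corresponds to 144px.
        System.out.println(dpToPx(48, 480));              // 144
        System.out.println(meetsMinimum(144, 144, 480));  // true
        System.out.println(meetsMinimum(72, 72, 480));    // false (only 24dp)
    }
}
```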
Touch target spacing
In most cases, touch targets should be separated by 8dp of space or more to ensure balanced information density and usability.
Keeping related items in proximity to one another is helpful for those who have low vision or may have trouble focusing on the screen.
The slider value is in close proximity to the slider control.
The slider value is placed too far away from the control. A user of screen magnification may not be able to view both the slider and the value without panning back and forth.
To improve readability, users may increase font size. Mobile devices and browsers include features that allow users to adjust font size system-wide. To support the system font size in an Android app, specify text sizes (and their associated containers) in scalable pixels (sp).
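The difference between sp and dp is that sp is additionally multiplied by the user's font-scale preference. A sketch of the two conversions (on Android, densityDpi and fontScale correspond to DisplayMetrics.densityDpi and Configuration.fontScale; the class name is illustrative):

```java
public class ScalablePixels {
    // sp honors the user's system font-size preference; dp does not.
    static float spToPx(float sp, int densityDpi, float fontScale) {
        return sp * (densityDpi / 160f) * fontScale;
    }

    static float dpToPx(float dp, int densityDpi) {
        return dp * (densityDpi / 160f);
    }

    public static void main(String[] args) {
        // 16sp body text at mdpi (160 dpi) with fonts set to "Large" (1.3x)
        // renders larger, while the same size in dp ignores the preference.
        System.out.println(spToPx(16f, 160, 1.3f)); // 20.8
        System.out.println(dpToPx(16f, 160));       // 16.0
    }
}
```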
Make sure to allot enough space for large and foreign language fonts. See Line Height for information on the recommended sizes of foreign language fonts.
Hierarchy and focus
Apps should give users feedback and a sense of where they are in the app. Navigation controls should be easy to locate and clearly written. Visual feedback (such as labels, colors, and icons) and touch feedback show users what is available in the UI.
Navigation should have clear task flows with minimal steps. Focus control, or the ability to control keyboard and reading focus, should be implemented for frequently used tasks.
Screen readers give users multiple ways to navigate a screen, including:
Touch interface screen readers allow users to run a finger over the screen to hear what is directly underneath it. This gives the user a quick sense of the entire interface, and lets them move quickly to a UI element from muscle memory. In TalkBack, this feature is called “explore by touch.” To select an item, the user double-taps.
Users may also move focus by swiping backwards or forwards on screen to read pages linearly, from top to bottom. This allows users to home in on certain elements. In TalkBack, this is called linear navigation.
Users may switch between both “explore by touch” and “linear navigation” modes. Some assistive technologies allow users to navigate between page landmarks, such as headings, when these landmarks use the appropriate semantic markup.
Hardware or software directional controllers (such as a D-pad, trackball, or keyboard) allow users to jump from selection to selection in a linear fashion.
Place items on the screen according to their relative level of importance.
Important actions: Place important actions at the top or bottom of the screen (reachable with shortcuts).
Related items: Place related items of a similar hierarchy next to each other.
Input focus should follow the order of the visual layout, from the top to the bottom of the screen. It should traverse from the most important to the least important item. Determine the following focus points and movements:
The order in which elements receive focus
The way in which elements are grouped
Where focus moves when the element in focus disappears
Clarify where the focus exists through a combination of visual indicators and accessibility text.
Group similar items under headings that communicate what the groupings are. These groups organize content spatially.
Focus traversal between screens and tasks should be as continuous as possible.
If a task is interrupted and then resumed, place focus on the element that was previously focused.
By using standard platform controls, your app will automatically contain the markup and code needed to work well with a platform’s assistive technology. Adapt your app to meet each platform's accessibility standards and assistive technology (including shortcuts and structure) to give users an efficient experience.
Test your design with the platform accessibility settings turned on (both during and after implementation).
Other design considerations:
Use scalable text and a spacious layout to accommodate users who may have large text, color correction, magnification, or other assistive settings turned on.
Keyboard/mouse interfaces should make every task, and all hover information, accessible by keyboard alone.
Touch interfaces should allow screen readers and other assistive technology devices to read all parts of your interface. The text read aloud should be both meaningful and helpful.
Label visual UI elements
Screen-reader users need to know which UI elements are tappable on-screen. To enable screen readers to read the names of components out loud, add the contentDescription attribute to components such as buttons, icons, and tabs containing icons that have no visible text.
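For example, an icon-only button can be labeled directly in the layout XML, while purely decorative images can be hidden from screen readers. The IDs, drawables, and string resource names below are illustrative:

```xml
<!-- Icon-only button: TalkBack reads the contentDescription aloud. -->
<ImageButton
    android:id="@+id/add_to_wishlist"
    android:layout_width="48dp"
    android:layout_height="48dp"
    android:src="@drawable/ic_star"
    android:contentDescription="@string/add_to_wishlist" />

<!-- Purely decorative images should be skipped by screen readers. -->
<ImageView
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/decorative_divider"
    android:importantForAccessibility="no" />
```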
Any features with special accessibility considerations should be included in help documentation. Make help documentation relevant, accessible, and discoverable. As an example, review this guide on how to use a screen reader with Google Drive.
Testing and research
Following these accessibility guidelines will help improve the accessibility of your app, but does not guarantee a fully accessible experience. It is recommended that you also:
Test your app for full task completion, beginning to end, with various assistive technologies turned on. For example, turn on Explore by Touch in TalkBack and change the speed at which text is spoken out loud.
Have users with impairments test your app.
Consider how individual elements should be accessible while also fitting together in a coherent user flow.
Make sure the major tasks you want your users to complete are possible for everyone.
Talk to your users, particularly those who use assistive technology, to learn about their needs, what they want out of your app, which tools they use, and how they use them. Become familiar with these tools so you can give them the best experience.
Clear and helpful accessibility text is one of the primary ways to make UIs more accessible. Users with limited or no eyesight benefit from explicit verbal descriptions. Accessibility text refers to text that is used by screen reader accessibility software, such as TalkBack on Android, VoiceOver on iOS, and JAWS on desktop. Screen readers read onscreen text aloud, including both visible text and nonvisible alternative text.
Accessibility text includes both visible text (including labels for UI elements, text on buttons, links, and forms) and nonvisible descriptions that don’t appear onscreen (such as alternative text for buttons without text labels). Sometimes, an onscreen label may be overridden with accessibility text to provide more information for the user.
Both visible and nonvisible text should be helpfully descriptive and independently meaningful, as some users navigate by using all the headings or links on a page. Test your app with a screen reader to identify areas that are missing or need better accessibility text.
Keep content and accessibility text short and to the point. Screen reader users hear every UI element read aloud. The shorter the text, the faster the screen reader users can navigate it.
Switch to email@example.com
Write clear and short accessibility text.
Account switcher. Switch to account email@example.com
Don’t write long accessibility text.
Avoid including control type or state in text
Screen readers may automatically announce a control’s type or state through a sound or by speaking the control name before or after the accessibility text.
Use short descriptions.
Don’t write the control type.
Developer note: If the control type or state is not being read correctly, the control’s accessibility role may be improperly set or be a custom control. Every element should have an associated accessibility role on a website or be coded to be announced properly. This means a button should be set as a button, and a checkbox as a checkbox, so that the control’s type or state is communicated correctly to the user. If you extend or inherit from a native UI element, you will get the correct role. If not, you can override this information for accessibility on each platform (ARIA for web, AccessibilityNodeInfo for Android).
On Android, set the class name field of the control’s AccessibilityNodeInfo to "android.widget.Button".
Use action verbs to indicate what an element or link does, not what an element looks like, so a visually impaired person can understand. Link text should:
Specify the task that tapping the link will perform
Avoid vague descriptions, such as “click here”
Ensure an element has the same description everywhere it’s used.
Elements with state changes
For icons that toggle between values or states, announce the icon according to how it is presented to the user.
If the icon is a property of an item, make it a checkbox so that screen readers verbalize the current state, such as “on” or “off.”
If the icon is an action, write the text label to specify the action that occurs if the icon is selected, such as “Add to wishlist.”
How elements should be used affects how they are displayed. For example, if a star icon represents the action of adding something to a wishlist, the app should verbalize “Add to wishlist” or “Remove from wishlist.”
Don’t mention the exact gesture or interaction
Don’t tell users how to physically interact with a control, as they may be navigating with a keyboard or other device, not with their fingers or a mouse. Accessibility software will describe the correct interaction for the user.
Use dialogs, toasts, or snackbars (Android) to confirm or acknowledge user actions that are destructive (like “Delete” or “Remove”) or difficult to undo.
For actions that are confirmed through visual means, such as a grid rearranging itself when an item is deleted, a toast is not necessary. In these cases, add accessibility text to provide acknowledgement.
Provide hint speech
Hint speech provides extra information for actions that aren't clear. For example, Android's “double-tap to select” feature prompts the user to tap twice when landing on an item without taking action. Android TalkBack will also announce any custom actions associated with an element. Use hint speech sparingly and only for complex UI.