
We Design AI Interfaces with Accessibility in Mind – Here’s How

An illustration of an AI assistant chat interface, a stock image of a keyboard and the Salesforce character Astro.
Our team relied on alternate navigation, chat notifications, and button labels.

Discover 3 best practices that help keyboard users interact with Einstein Copilot and Prompt Builder.

When we design AI interfaces, we always ask: Can I do it with a keyboard alone? There are many ways to navigate beyond a mouse. Some people use a head wand, a switch control, or another assistive device to operate their computer. For these tools to work, the interface needs keyboard access. Knowing this, we all have a responsibility to address accessibility barriers and design inclusive products.

Our team asked this keyboard question when we designed our conversational AI assistant, Einstein Copilot.

Throughout the process, we prioritized accessible design standards. The Product Accessibility and Inclusive Design (PAID) team focused on best practices for alternate navigation, chat notifications, and button labels. Today, we’re sharing how the application of these concepts led to a more usable product. 

What is an accessibility barrier?

An accessibility barrier prevents people from accessing services and information. In the physical world, a building entrance that has stairs but no ramp is an example of an accessibility barrier: it limits entry for many people. Ramps help people who use wheelchairs, push strollers or delivery handcarts, and people with low vision or limited mobility who prefer an alternative to the elevation changes of narrow stairs.

Good design benefits everyone. Similarly, in the digital world, AI interfaces without key accommodations can exclude users. 

We want to avoid barriers as we design the newest user interfaces: chat layouts for AI assistants.

Chats often include an open text field and a vertical stream of responses in a side panel. So what are the interaction points in this design? We think about users who navigate with a mouse, voice dictation or assistive technology such as a screen reader. It’s critical that the AI interface is free of obstacles for keyboard users. Otherwise, it’s a mismatched interaction.


“Mismatches create a huge barrier to many users with disabilities,” said Stephanie Louraine, Salesforce Lead Digital Accessibility Engineer. “For example, those who have motor impairments can have limited use of a mouse or pointer device. Additionally, blind or low-vision users who use a screen reader interact with interfaces using a keyboard.”

As part of the PAID team, Louraine provides integrative accessibility support to our AI design and development teams. The result is ethical and inclusive AI experiences.

Accessible design is essential. We can’t leave best practices behind in the age of AI.


Best Practice 1: Design AI interfaces with alternate navigation paths 

Interfaces shouldn’t require an interaction that a user can’t perform. To be operable, there need to be multiple ways to complete a task. Our team considered this in the selection-menu design for Prompt Builder.

First, let’s talk about where this menu lives.

A screenshot of Prompt Builder with an orange square highlighting the Resource selection area.
Select Resources in the top field, labeled “2” above.

Say you’re using Einstein Copilot to get more done in the flow of work. Our customers ask their AI assistant for instructions, custom close plans, and answers to common questions. They might start their day with a frequently entered prompt like, “summarize this account and show me the highlights.” 

Prompt Builder enables anyone to customize and create prompts to use in Einstein Copilot. Grounding the prompt, or adding context to it, takes a few steps, including selecting resources such as customer relationship management (CRM) data.

Designing alternate navigation paths matters at every interaction point. It’s even important for the selection menu.

In Prompt Builder, keyboard users can select CRM data to ground their prompts.

If users need a mouse to select their CRM data, then keyboard users can’t complete that task. Today, they can complete this step thanks to custom scripting, labels, and tab navigation.
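The arrow-key behavior behind a keyboard-operable menu comes down to a small piece of focus logic. The sketch below is a generic TypeScript illustration of that pattern; the function name and key set are assumptions, not the actual Prompt Builder scripting.

```typescript
// A minimal sketch of roving-focus logic for a selection menu. This is
// an illustrative assumption, not Salesforce's implementation.

type MenuKey = "ArrowDown" | "ArrowUp" | "Home" | "End";

// Given the currently focused item index and a key press, return the
// index that should receive focus next. Arrow keys wrap around so the
// menu never traps keyboard focus at either end.
function nextFocusIndex(current: number, key: MenuKey, itemCount: number): number {
  switch (key) {
    case "ArrowDown":
      return (current + 1) % itemCount;
    case "ArrowUp":
      return (current - 1 + itemCount) % itemCount;
    case "Home":
      return 0;
    case "End":
      return itemCount - 1;
  }
}
```

In a real menu, a keydown handler would call logic like this and move focus (and `tabindex="0"`) to the returned item, so keyboard users can reach every option that mouse users can click.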

Including different ways to complete a task increases user success. This also applies to our design for Copilot’s chat interface.


Best Practice 2: Announce when chat responses appear

In all user experiences — especially ones involving fast-moving AI — communications need to be timely and detailed. Typically, there are visual cues for generative AI chat responses. However, sight-based cues alone aren’t enough. All designs need to be perceivable to screen reader users, too.

Screen reader users rely on their software to announce status changes. When Einstein Copilot responds to a prompt, it notifies the screen reader. The AI assistant also communicates if it needs time to process, takes an action, or gets an error. 

Someone typing in an Einstein Copilot chat on their laptop
Einstein Copilot notifies screen readers when it displays a response.

To achieve this, our designers and developers marked the chat as a live region. The screen reader looks for new text and announces new content.
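Marking a chat transcript as a live region comes down to a couple of ARIA attributes on the container. The TypeScript sketch below is an illustrative assumption (the helper name and markup are not Einstein Copilot’s actual DOM); the attribute values themselves come from the ARIA specification.

```typescript
// A minimal sketch of a chat transcript rendered as an ARIA live region.
// The renderChatLog helper and its markup are illustrative assumptions;
// only the ARIA attributes reflect the live-region technique itself.
function renderChatLog(messages: string[]): string {
  const items = messages.map((m) => `<p>${m}</p>`).join("");
  // role="log": a region where new information is added in meaningful order.
  // aria-live="polite": announce additions at the next pause, without
  // interrupting the user or moving their focus.
  // aria-atomic="false": read only the newly appended message on each
  // update, not the entire transcript.
  return (
    `<div role="log" aria-live="polite" aria-atomic="false">` +
    items +
    `</div>`
  );
}
```

With markup like this in place, appending a new message element to the container is enough: the screen reader notices the change and announces the new text.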

When communications are clear and frequent, keyboard users can take action. 


Best Practice 3: Clearly label additional information

It’s frustrating when a screen reader announces the word “button” without more information. This creates confusion at a time when users are still learning how to use generative AI products. 

Einstein Copilot announces each button’s function to screen readers. For example, our labels differentiate a “more information” button from a “close” button.

A screenshot of the Einstein Copilot panel with the logo, the word "Einstein", an information button, pin icon and close button.
The i button in the circle above is announced as “more information button.”
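Giving an icon-only button an accessible name is typically a one-attribute fix. The TypeScript helper below is an illustrative sketch, not a Salesforce Lightning Design System API: `aria-label` supplies the name a screen reader announces, and `aria-hidden` keeps the decorative icon out of that announcement.

```typescript
// A minimal sketch of labeling icon-only buttons. The helper and icon
// names are illustrative assumptions, not Lightning component APIs.
function iconButton(icon: string, label: string): string {
  // aria-label gives the button an accessible name; without it, a screen
  // reader would announce only "button".
  // aria-hidden="true" hides the decorative SVG from assistive technology.
  return (
    `<button type="button" aria-label="${label}">` +
    `<svg aria-hidden="true" data-icon="${icon}"></svg>` +
    `</button>`
  );
}

// The label is what distinguishes otherwise identical-looking icon buttons:
const infoButton = iconButton("info", "More information");
const closeButton = iconButton("close", "Close");
```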

It helped that the team used Lightning components from the Salesforce Lightning Design System. Each component has accessibility guidelines built in. 

“Lightning components provide a consistent and accessible experience,” said Karen Herr, Salesforce Accessibility Foundations Director. Herr relied on these tools to answer the question: Can I do it with a keyboard alone?


Design AI Interfaces for All

Our commitment to accessible design ensures that keyboard users can complete critical tasks. Everyone should be able to ground prompts in CRM data, recognize Einstein Copilot responses, and find more information. It’s part of why users trust our AI assistant.

It’s all possible because our team relies on accessible design best practices. Alternate navigation, chat notifications, and button labels are necessary for AI interfaces. This helps ensure that the benefits of the AI revolution are accessible to everyone.

Get to know Einstein Copilot today.

