Flutter Accessibility: Getting Started

Learn how to improve the accessibility of your Flutter app by providing more semantic details for screen readers and following other items from Flutter’s accessibility checklist. By Alejandro Ulate Fallas.

Testing With a Screen Reader

The next step is to test your app with a screen reader. To get an idea of how your app might feel to someone with a vision impairment, you need to enable your phone's accessibility features. Once enabled, you'll get spoken feedback about the screen's contents and interact with the UI via gestures.

Flutter takes care of the heavy lifting, since it enables the screen reader to understand most of the widgets on the screen. But certain widgets need extra context so the screen reader can interpret them accurately.

Introducing The Semantics Widget

The Semantics widget provides context for widgets by describing its child widget tree. This lets you attach descriptions to widgets so that Flutter's accessibility tools can convey your app's meaning.

The framework already implements Semantics in the material and cupertino libraries. It also exposes properties you can use to provide custom semantics for a widget or a widget subtree.

But there are times when you'll need to add your own semantics to provide the correct context for screen readers. For example, when you want to merge or exclude semantics in a widget subtree, or when the framework's implementation isn't enough.
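
Flutter ships dedicated widgets for the first two cases: MergeSemantics and ExcludeSemantics. Here's a minimal sketch of both (the labels and asset path are made up for illustration):

```dart
// MergeSemantics collapses its subtree into a single semantics node,
// so a screen reader announces the label and the switch as one item.
MergeSemantics(
  child: Row(
    children: [
      const Text('Wi-Fi'),
      Switch(value: true, onChanged: (_) {}),
    ],
  ),
)

// ExcludeSemantics hides its subtree from assistive technologies,
// which is handy for purely decorative widgets.
ExcludeSemantics(
  child: Image.asset('assets/decorative_divider.png'), // hypothetical asset
)
```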

Enabling the Screen Reader

To enable your device’s screen reader, go to your phone’s settings and navigate to Accessibility. Then, enable TalkBack if you’re using an Android device or VoiceOver if you’re using an iOS device.

Give the screen reader permission to take over the device’s screen.

By enabling TalkBack/VoiceOver, your navigation and interaction with the mobile phone will change. Here’s a quick rundown of how to use the screen reader:

  • Tap once to select an item.
  • Double-tap to activate an item.
  • Drag with one finger to move between items.
  • Drag with two fingers to scroll (use three fingers if you’re using VoiceOver).

Hot reload the app. Try using the app by opening a random meal and saving a couple of meals for later. Close your eyes if you want to experience the app the way a user with total vision loss would. Here’s a preview:

Here’s what you may have experienced:

  • When in the Saved Meals For Later screen, it’s not clear what Random Meal does.
  • When in the Meal Detail screen, the screen reader refers to both the Save Meal for Later and Remove From Saved List icons simply as button, which is confusing.
  • When in the Meal Detail screen, after tapping the Save Meal for Later or Remove From Saved List icon button, VoiceOver (iOS) doesn’t read the snackbar.

Adding Support for Screen Readers

OK, it’s time to add some Semantics. Open lib/presentation/widgets/random_meal_button.dart and, in build, wrap FloatingActionButton in Semantics like below:

return Semantics(
  // 1
  button: true,
  enabled: true,
  // 2
  label: 'Random Meal',
  // 3
  onTapHint: 'View a random meal.',
  onTap: () => _openRandomMealDetail(context),
  // 4
  excludeSemantics: true,
  child: FloatingActionButton.extended(
    onPressed: () => _openRandomMealDetail(context),
    icon: const Icon(Icons.shuffle),
    label: const Text(
      'Random Meal',
    ),
  ),
);

Here’s what’s happening in the code above:

  1. This tells screen readers that the child is a button and is enabled.
  2. label is what screen readers read.
  3. onTapHint and onTap allow screen readers to convey what happens when you tap Random Meal.
  4. excludeSemantics excludes all semantics provided by the child widget.

Note: You have to provide onTap when implementing onTapHint; otherwise, the framework will ignore it.

If you’re using VoiceOver (iOS), you’ll notice there’s no change. That’s because iOS doesn’t provide a way to override those values, so Flutter ignores them on iOS devices. Also, onTap supersedes onPressed from FloatingActionButton, so you don’t have to worry about _openRandomMealDetail executing twice.

Restart the app. Then, use the screen reader to focus on Random Meal and notice how the screen reader interprets the app:

You need to do something similar with MealCard. See if you can implement Semantics by yourself this time. You can find MealCard in lib/presentation/widgets/meal_card.dart.

Need help? Open the spoiler below to find out how.

[spoiler title="Solution"]

return Semantics(
  button: true,
  label: meal.name,
  onTapHint: 'View recipe.',
  onTap: onTap,
  excludeSemantics: true,
  child: Material(...),
);

[/spoiler]

Using Semantics With Custom Widgets

Sometimes, you use Semantics to provide information about what role a widget plays, allowing screen readers to understand and behave accordingly.

You’ll use that for the meal heading. So, open lib/presentation/widgets/meal_header.dart, wrap Column with Semantics and set header to true like so:

return Semantics(
  header: true,
  child: Column(...),
);

This tells screen readers that the contents of the Column form a header.

Hot reload. Navigate to MealDetailPage using the screen reader. Confirm that the screen reader identifies it as a header. Here’s how the app is coming together:

Note: While you’re developing, it’s helpful to activate Flutter’s Semantics Debugger. Set showSemanticsDebugger to true in the app’s top-level MaterialApp. Semantics Debugger shows the screen reader’s interpretation of your app.
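
For reference, enabling it might look like this (the home widget name is hypothetical):

```dart
MaterialApp(
  // Overlays the UI with what the screen reader "sees";
  // remember to set this back to false before release.
  showSemanticsDebugger: true,
  home: const MealListPage(), // hypothetical home page
)
```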

Using SemanticsService to Fill the Gaps

Now, to finish adding screen reader support, open lib/presentation/widgets/meal_appbar.dart. Replace // TODO: Add Tooltip for Remove Meal with this line of code:

tooltip: 'Remove from Saved Meals',

Do the same with // TODO: Add Tooltip for Save Meal for Later — replace it with this:

tooltip: 'Save for Later',

As you might know, tooltips in IconButtons serve as semantic labels for screen readers. They also pop up on long-press or hover to give a visual description of the icon button, significantly improving your app’s accessibility and user experience.
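
Put together, a bare-bones sketch of such a button looks like this (the icon and callback are hypothetical):

```dart
IconButton(
  icon: const Icon(Icons.bookmark_add_outlined), // hypothetical icon
  // Doubles as the semantic label for screen readers and as a
  // visual hint on long-press or hover.
  tooltip: 'Save for Later',
  onPressed: _onSaveMealForLater, // hypothetical callback
)
```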

Hot reload. Navigate to a meal and select the Save Meal for Later icon. Here’s what you’ll experience:

Now the screen reader knows appropriate labels for those buttons.

Making SnackBars Accessible on iOS

There’s still one issue you need to address, and it’s a platform-specific problem. For VoiceOver users, the snackbars aren’t read when they appear on the screen.

There’s an issue about this on Flutter’s GitHub explaining the reasons for this behavior on iOS devices. For now, it’s safe to say you’ll need an alternative: SemanticsService.

SemanticsService belongs to Flutter’s semantics library, and you’ll use it to access the platform’s accessibility services. You shouldn’t reach for this service all the time, because Semantics is preferable, but for this specific case, it’s OK.
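
In its simplest form, an announcement looks like the sketch below; the message is made up, and in the app you’ll build the label and direction from context instead:

```dart
import 'package:flutter/material.dart';
import 'package:flutter/semantics.dart';

// Asks the platform's accessibility service (TalkBack/VoiceOver)
// to speak the message right away.
void announceSaved() {
  SemanticsService.announce(
    'Saved Chicken Curry for later.', // hypothetical message
    TextDirection.ltr,
  );
}
```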

First, replace // TODO add directionality and semanticsLabel fields in _onRemoveMeal with the following:

final textDirectionality = Directionality.of(context);
final semanticsLabel = 'Removed $mealName from Saved Meals list.';

Second, replace // TODO add directionality and semanticsLabel fields in _onSaveMealForLater with:

final textDirectionality = Directionality.of(context);
final semanticsLabel = 'Saved $mealName for later.';

Don’t forget to add the corresponding imports at the top of the file:

import 'package:flutter/foundation.dart';
import 'package:flutter/semantics.dart';

Flutter’s accessibility bridge needs to know the device’s text directionality, which is why you obtained it from the current context.

Next, change the Text widgets in the _onRemoveMeal and _onSaveMealForLater snackbars to the following, respectively:

Text(
  'Removed $mealName from Saved Meals list.',
  semanticsLabel: semanticsLabel,
),

Text(
  'Saved $mealName for later.',
  semanticsLabel: semanticsLabel,
),

Then, replace // TODO: Add Semantics for iOS. at the bottom of _onRemoveMeal with the code below:

if (defaultTargetPlatform == TargetPlatform.iOS) {
  SemanticsService.announce(semanticsLabel, textDirectionality);
}

Lastly, do the same with _onSaveMealForLater:

if (defaultTargetPlatform == TargetPlatform.iOS) {
  SemanticsService.announce(semanticsLabel, textDirectionality);
}

SemanticsService.announce uses Flutter’s platform-specific accessibility bridge to read out semanticsLabel. You also pass the device’s textDirectionality, since the bridge needs it. This ensures the screen reader announces the snackbar message on iOS in both cases.

Hot reload the app. If you’re using VoiceOver, you’ll now hear both snackbars when saving or removing a meal:

If you’re using TalkBack, you won’t notice any differences.

Great job! Android and iOS screen readers can now interpret Mealize. It was a challenge, but you knocked it out of the park. Congratulations!