Introduction to Accessibility


Accessibility matters. You already know this, or you wouldn’t be reading this lesson. :] 15–20% of people live with some form of disability, and another 5% experience short-term disability. They’re all potential customers of your app: They want to use it, you want to make sure they can, and SwiftUI is here to help.

Most app UIs are very visual experiences, so most accessibility work focuses on VoiceOver — a screen reader that lets people with low to no vision use Apple devices without needing to see their screens. VoiceOver reads information aloud to users about your app’s UI elements — it’s up to you to make sure this information helps users interact efficiently with your app.
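For example, standard SwiftUI controls are accessible out of the box, and you can refine what VoiceOver says with accessibility modifiers. Here’s a minimal sketch (the view and its text are hypothetical, not part of this course’s sample project):

```swift
import SwiftUI

struct OrderView: View {
  @State private var quantity = 1

  var body: some View {
    // A Button is accessible by default: VoiceOver reads its
    // visible label ("Add") and announces that it's a button.
    Button("Add") {
      quantity += 1
    }
    // You can replace the spoken label with something more
    // descriptive than the visual text...
    .accessibilityLabel("Add one item")
    // ...and add a hint describing the result of the action.
    .accessibilityHint("Increases the order quantity")
  }
}
```

Notice that you didn’t have to do anything to make the button reachable by VoiceOver — the modifiers only improve the information it speaks.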

In this lesson, you’ll learn how VoiceOver works on an iOS device and discover some of the accessibility tools provided by Xcode and SwiftUI.

Here’s what you’ll learn:

  • How to use basic VoiceOver gestures on an iOS device.
  • That SwiftUI elements are accessible by default, and how this helps you.
  • How to use Xcode’s Accessibility Inspector with an iOS simulator.

Note: This course is about visionOS, but this lesson uses an iOS device and iOS previews to introduce (or review) VoiceOver and Xcode’s Accessibility Inspector. If you’re already familiar with these topics, skip to lesson 2.

Apple invests a lot of effort in helping you improve the accessibility of your apps. With SwiftUI, it’s easier than ever before, and you can help make it happen!

Note: This lesson assumes you’re comfortable using Xcode to develop iOS apps. You’ll need an iOS device to practice using VoiceOver gestures.
