Video Streaming Tutorial for iOS: Getting Started

Learn how to build a video streaming app using AVKit and AVFoundation frameworks. By Saeed Taheri.


Update note: Saeed Taheri updated this tutorial for iOS 14 and SwiftUI. Luke Parham wrote the original.

You’ve been working on iOS apps for a while now and you think you’re pretty slick. Think you’ve done it all, eh?

Yeah, you can probably do some basic networking. Maybe even pull in some JSON and put together a decent table view with cells that have text and images.

That’s an impressive list of accomplishments to be sure, but…

Can you do this?

App with streaming video

That’s right, it’s time to take your app to the next level and learn how to add video streaming! :]

You’ll build a new app for all those travel vloggers out there. Some people want to make artsy films about their travels, and some people want to enjoy these experiences from the comfort of their own beds.

You’re here to make both of those dreams come true. In the process, you’ll learn the basics of the AVKit and AVFoundation frameworks.

In this tutorial, you’ll learn how to:

  • Add local video.
  • Add streaming video.
  • Enable playback controls.
  • Implement looping.
  • Implement picture-in-picture.

Getting Started

Download the starter project by clicking Download Materials at the top or bottom of this tutorial. Then, open TravelVlogs.xcodeproj from the starter folder and go to VideoFeedView.swift.

Note: It’s possible that videos won’t play in the simulator. Running the app on a real device will mitigate the issue.

The starter project is a vlogger app that you’ll add functionality and features to with AVKit and AVFoundation. Users can select a video, then control its playback options.

Understanding AVKit

A useful bit of development wisdom: Always favor the highest level of abstraction available to you, then drop down to lower levels when your needs change. In line with this advice, you’ll start your journey with the highest-level video framework.

AVKit sits on top of AVFoundation and provides all necessary UI for interacting with a video.

Chart of relationship between AVKit and AVFoundation

Build and run the project, and you’ll see an app that’s already set up with a table full of potential videos for your viewing pleasure.

Starter project in simulator

Your goal is to show a video player whenever a user taps one of the cells.

Adding Local Playback

There are two types of videos you can play. The first one you’ll look at is the type that’s currently sitting in the phone’s storage. Later, you’ll learn how to play videos streaming from a server.

To get started, navigate to VideoFeedView.swift and add the following import right below the SwiftUI import:

import AVKit

Look below this, and you’ll see that you already have a list and an array of Videos. This is how the app fills the existing list with data. The videos themselves are coming from a JSON file embedded in the app bundle. You can look in Video.swift to see how they’re fetched if you’re curious.

To account for the user’s selection, add a state property to VideoFeedView.swift:

@State private var selectedVideo: Video?

Next, find the button inside List and add the following code under the Open Video Player comment:

selectedVideo = video

Then, add the fullScreenCover(item:onDismiss:content:) view modifier to NavigationView:

.fullScreenCover(item: $selectedVideo) {
  // On Dismiss Closure
} content: { item in
  makeFullScreenVideoPlayer(for: item)
}

This binds the selectedVideo property you defined earlier to the full-screen cover. Whenever you set it to a non-nil value, the content of the full-screen cover shows up.

Swift is looking for the new makeFullScreenVideoPlayer(for:), so add the following to set everything straight:

@ViewBuilder
private func makeFullScreenVideoPlayer(for video: Video) -> some View {
  // 1
  if let url = video.videoURL {
    // 2
    let avPlayer = AVPlayer(url: url)
    // 3
    VideoPlayer(player: avPlayer)
      // 4
      .edgesIgnoringSafeArea(.all)
      .onAppear {
        // 5
        avPlayer.play()
      }
  } else {
    ErrorView()
  }
}

  1. All Video objects have a videoURL property representing the path to the video file.
  2. Here, you take url and create an AVPlayer object.

    AVPlayer is the heart of playing videos on iOS.

    A player object can start and stop your videos, change their playback rate and even turn the volume up and down. Think of a player as a controller object that’s able to manage playback of one media asset at a time.

  3. VideoPlayer is a handy SwiftUI view that needs a player object to be useful. You can use it to play videos.
  4. By default, SwiftUI views respect the device’s safe areas. Since it looks better to present a video player extended beyond the status bar and home indicator, you add this modifier.
  5. Once the video player appears on screen, you call play() to start the video.

And that’s all there is to it! Build and run to see how it looks.

Full screen video player

You can see that the video player shows a set of basic controls. This includes a play button, a mute button and 15-second skip buttons to go forward and backward.
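If you ever need finer control than these buttons offer, AVPlayer exposes the same operations programmatically. Here’s a quick sketch of its control surface — note the URL is a placeholder, not part of the project:

```swift
import AVFoundation

// A hypothetical URL, just to have something to play
let player = AVPlayer(url: URL(string: "https://example.com/video.mp4")!)

player.play()            // start playback
player.volume = 0.5      // half volume
player.isMuted = true    // mute without changing the volume
player.rate = 2.0        // double speed; setting a nonzero rate also resumes playback
player.seek(to: CMTime(seconds: 30, preferredTimescale: 600))  // jump to 0:30
player.pause()
```

You’ll use several of these — volume, rate and play() — later in this tutorial when you build the custom looping player.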

Adding Remote Playback

That was pretty easy, right? How about adding video playback from a remote URL? That must be a lot harder!

Go to VideoFeedView.swift and find where videos is set. Instead of loading local videos, load all the videos by replacing that line with the following:

private let videos = Video.fetchLocalVideos() + Video.fetchRemoteVideos()

And… that’s it! Go to Video.swift. Here you can see that fetchRemoteVideos() is simply loading another JSON file. If you look at the videoURL computed property that you used earlier, you’ll see that it first looks for remoteVideoURL. If it doesn’t find any, you’ll get localVideoURL.

Build and run, then scroll to the bottom of the feed to find the キツネ村 (kitsune-mura) or Fox Village video.

Remote video playback in simulator

This is the beauty of VideoPlayer; all you need is a URL and you’re good to go!

In fact, go to RemoteVideos.json and find this line:

"remote_video_url": ""

Then, replace it with this one:

"remote_video_url": ""

Build and run, and you’ll see that the Fox Village video still works.

Fox Village video playing in simulator

The only difference is that the second URL represents an HTTP Live Streaming (HLS) stream. HLS works by splitting a video into roughly 10-second chunks, which are then served to the client one chunk at a time. If you have a slow internet connection, you’ll see that the video starts playing much more quickly than it did with the MP4 version.
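To make the chunking concrete, here’s roughly what a bare-bones HLS media playlist looks like. The segment names and durations below are illustrative, not taken from the actual Fox Village stream:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
#EXT-X-ENDLIST
```

The player fetches the playlist first, then downloads segments one at a time, which is why playback can begin long before the whole video has arrived.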

Adding a Looping Video Preview

You may have noticed that black box at the top of the list. Your next task is turning that black box into a custom video player. Its purpose is to play a revolving set of clips to get users excited about all these videos.

Looping video player at the top of the list

Then, you need to add a few custom gestures, like tapping to turn on sound and double-tapping to change it to 2x speed. When you want to have very specific control over how things work, it’s better to write your own video view.

It’s your job to get things going.

Understanding AVFoundation

While AVFoundation can feel a bit intimidating, most of the objects you deal with are still pretty high-level.

The main classes you’ll need to get familiar with are:

  1. AVPlayerLayer: This special CALayer subclass can display the playback of a given AVPlayer object.
  2. AVAsset: These are static representations of a media asset. An asset object contains information such as duration and creation date.
  3. AVPlayerItem: The dynamic counterpart to an AVAsset. This object represents the current state of a playable video. This is what you need to provide to AVPlayer to get things going.

AVFoundation is a huge framework that goes well beyond these few classes. Fortunately, this is all you’ll need to create your looping video player.

You’ll come back to each of these in turn, so don’t worry about memorizing them.
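To preview how these classes relate before diving into the project code, here’s a minimal sketch of the chain from asset to layer. The URL is a placeholder:

```swift
import AVFoundation

// A hypothetical media URL
let url = URL(string: "https://example.com/clip.mp4")!

let asset = AVURLAsset(url: url)           // static description of the media
let item = AVPlayerItem(asset: asset)      // dynamic playback state for that asset
let player = AVPlayer(playerItem: item)    // controls playback of one item at a time
let layer = AVPlayerLayer(player: player)  // CALayer that renders the player's frames
```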

Writing a Custom Video View With AVPlayerLayer

The first class you need to cozy up to is AVPlayerLayer. This CALayer subclass is like any other layer: It displays whatever is in its contents property.

This layer just happens to fill its contents with frames from a video you’ve given it via its player property.

The problem is that you can’t use this layer directly in SwiftUI. After all, SwiftUI doesn’t have the concept of CALayers. For that, you need to go back to the old and gold UIKit.

Go to LoopingPlayerView.swift, where you’ll find an empty view you’ll use to show videos. It needs an array of video URLs to play.

The first thing you need to do is add the proper import statement, this time for AVFoundation:

import AVFoundation

Good start! Now you can get AVPlayerLayer into the mix.

A UIView is simply a wrapper around a CALayer. It provides touch handling and accessibility features but isn’t a CALayer subclass. Instead, it owns and manages an underlying layer property. One nifty trick is that you can actually specify what type of layer you would like your view subclass to own.

Add the following property override to tell LoopingPlayerView.swift that it should use an AVPlayerLayer instead of a plain CALayer:

override class var layerClass: AnyClass {
  return AVPlayerLayer.self
}

Since you’re wrapping the player layer in a view, you need to expose a player property.

To do so, add the following computed property so you don’t need to cast your layer subclass all the time:

var playerLayer: AVPlayerLayer {
  return layer as! AVPlayerLayer
}

To be able to use this view in SwiftUI, you need to create a wrapper using UIViewRepresentable.

Add these lines of code in the same file, outside the LoopingPlayerUIView definition:

struct LoopingPlayerView: UIViewRepresentable {
  let videoURLs: [URL]
}

UIViewRepresentable is a protocol. You need to implement its methods to complete the bridge between UIKit and SwiftUI.

Add these inside LoopingPlayerView:

// 1
func makeUIView(context: Context) -> LoopingPlayerUIView {
  // 2
  let view = LoopingPlayerUIView(urls: videoURLs)
  return view
}

// 3
func updateUIView(_ uiView: LoopingPlayerUIView, context: Context) { }

  1. SwiftUI calls makeUIView(context:) when it needs a new instance of your UIView.
  2. You create a new instance of LoopingPlayerUIView using the initializer and return it.
  3. SwiftUI calls this method when it needs to update the underlying UIView. For now, leave it empty.

Now, get back to VideoFeedView.swift and add the following property to get URLs for video clips:

private let videoClips = VideoClip.urls

Inside makeEmbeddedVideoPlayer(), replace Rectangle() with the following code, but keep the view modifiers:

LoopingPlayerView(videoURLs: videoClips)

Build and run to see… nothing new! You just passed the video clip URLs to the view but you didn’t do anything with them yet.

Writing the Looping Video View

Next, go over to LoopingPlayerView.swift and get ready to add a player. After all, you now know you need a player to make video playing work.

To get started, add the following player property to LoopingPlayerUIView:

private var player: AVQueuePlayer?

The discerning eye will see that this is no plain AVPlayer instance. That’s right, this is a special subclass called AVQueuePlayer. As you can probably guess by the name, this class allows you to provide a queue of items to play.

Replace init(urls:) with the following to initialize the player:

init(urls: [URL]) {
  allURLs = urls
  player = AVQueuePlayer()

  super.init(frame: .zero)

  playerLayer.player = player
}

Here, you first create the player object and then connect it to the underlying AVPlayerLayer.

Now, it’s time to add your list of video clips to the player so it can start playing them.

Add the following method to do so:

private func addAllVideosToPlayer() {
  for url in allURLs {
    // 1
    let asset = AVURLAsset(url: url)

    // 2
    let item = AVPlayerItem(asset: asset)

    // 3
    player?.insert(item, after: player?.items().last)
  }
}

Here, you’re looping through all the clips. For each one, you:

  1. Create an AVURLAsset from the URL of each video clip object.
  2. Then, you create an AVPlayerItem with the asset that the player can use to control playback.
  3. Finally, you use insert(_:after:) to add each item to the queue.

Now, go back to init(urls:) and call the method after super.init(frame:) and before assigning the player to playerLayer:

addAllVideosToPlayer()

Now that you have your player set, it’s time to do some configuration.

To do this, add the following two lines in init(urls:) after addAllVideosToPlayer():

player?.volume = 0.0
player?.play()

This sets your looping clip show to auto-play and audio off by default.

Build and run to see your fully working clip show!

Black loop clip without repeat

Unfortunately, when the last clip has finished playing, the video player fades to black.

Implementing the Actual Looping

Apple wrote a nifty class called AVPlayerLooper. It takes a single player item and handles all the logic of playing that item on a loop. Unfortunately, that doesn’t help you here!

Annoyed emoji-like face

What you want is to play all of these videos on a loop. Looks like you’ll have to do things the manual way. All you need to do is keep track of your player and the currently playing item. When it gets to the last video, you’ll add all the clips to the queue again.

When it comes to “keeping track” of a player’s information, the only route you have is to use key-value observing (KVO).

Frustrated emoji-like face

Yeah, it’s one of the wonkier APIs Apple has come up with, but if you’re careful, it’s a powerful way to observe and respond to state changes in real time. If you’re completely unfamiliar with KVO, here’s the quick explanation: You register to be notified any time the value of a particular property changes. In this case, you want to know whenever the player’s currentItem changes. Each time you’re notified, you’ll know the player has advanced to the next video.
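If KVO is new to you, here’s the pattern in isolation, using a plain NSObject subclass rather than the player. The Download class and its progress property are made up purely for this example:

```swift
import Foundation

class Download: NSObject {
  // `@objc dynamic` is what makes a property observable via KVO
  @objc dynamic var progress: Double = 0
}

let download = Download()

// The closure runs every time `progress` changes
let token = download.observe(\.progress, options: [.new]) { _, change in
  print("Progress: \(change.newValue ?? 0)")
}

download.progress = 0.5  // triggers the observation closure
token.invalidate()       // stop observing when you no longer care
```

The player observation you’re about to write follows exactly this shape: a key path, a closure and a token you hold on to.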

To use KVO in Swift — much nicer than in Objective-C — you need to retain a reference to the observer. Add the following property to the existing properties in LoopingPlayerUIView:

private var token: NSKeyValueObservation?

To start observing the property, add the following to the end of init(urls:):

token = player?.observe(\.currentItem) { [weak self] player, _ in
  if player.items().count == 1 {
    self?.addAllVideosToPlayer()
  }
}

Here, you’re registering a block to run each time the player’s currentItem property changes. When the current video changes, you want to check to see if the player has moved to the final video. If it has, then it’s time to add all the video clips back to the queue.

That’s all there is to it! Build and run to see your clips looping indefinitely.

Properly looping clip in simulator

Playing with Player Controls

Next, it’s time to add some controls. Your tasks are to:

  1. Unmute the video when a single-tap occurs.
  2. Toggle between 1x and 2x speed when a double-tap occurs.

You’ll start with the actual methods you need to accomplish these things. First, you need to expose some methods in LoopingPlayerUIView where you have access to player directly. Second, you need to create a way to call those methods from LoopingPlayerView.

Add these methods to LoopingPlayerUIView:

func setVolume(_ value: Float) {
  player?.volume = value
}

func setRate(_ value: Float) {
  player?.rate = value
}

As the names imply, you can use these methods to control video volume and playback rate. You can also pass 0.0 to setRate(_:) to pause the video.

The way you can connect these methods to SwiftUI is by using a Binding.

Add these properties to LoopingPlayerView right under let videoURLs: [URL]:

@Binding var rate: Float
@Binding var volume: Float

Make sure to pass the binding values to the underlying UIView using the methods you already implemented:

func makeUIView(context: Context) -> LoopingPlayerUIView {
  let view = LoopingPlayerUIView(urls: videoURLs)
  view.setVolume(volume)
  view.setRate(rate)
  return view
}

func updateUIView(_ uiView: LoopingPlayerUIView, context: Context) {
  uiView.setVolume(volume)
  uiView.setRate(rate)
}

This time, you also add some lines to updateUIView(_:context:) to account for changes in volume and rate while the view is on screen.

Since you’ll control the playback from outside this struct, you can remove these two lines from the initializer of LoopingPlayerUIView:

player?.volume = 0.0
player?.play()

Now, head back to VideoFeedView.swift and add these state properties, which you’ll use to change and observe the embedded video’s volume and playback rate:

@State private var embeddedVideoRate: Float = 0.0
@State private var embeddedVideoVolume: Float = 0.0

Then, pass the following state properties to LoopingPlayerView in makeEmbeddedVideoPlayer():

LoopingPlayerView(
  videoURLs: videoClips,
  rate: $embeddedVideoRate,
  volume: $embeddedVideoVolume)

Finally, add the following view modifiers to LoopingPlayerView in makeEmbeddedVideoPlayer():

// 1
.onAppear {
  embeddedVideoRate = 1
}

// 2
.onTapGesture(count: 2) {
  embeddedVideoRate = embeddedVideoRate == 1.0 ? 2.0 : 1.0
}

// 3
.onTapGesture {
  embeddedVideoVolume = embeddedVideoVolume == 1.0 ? 0.0 : 1.0
}

Taking it comment by comment:

  1. By setting the rate to 1.0, you make the video play, just like before.
  2. You add a listener for when someone double-taps the player view. This toggles the playback rate between 2x and 1x.
  3. You add a listener for when someone single-taps the player view. This toggles the mute status of the video.

Note: Make sure to add the double-tap listener first, then the single-tap. If you do it in reverse, the double-tap listener will never get called.

Build and run again, and you’ll be able to tap and double-tap to play around with the speed and volume of the clips. This shows how easy it is to add custom controls for interfacing with a custom video view.

Custom video view

Now, you can pump up the volume and throw things into overdrive at the tap of a finger. Pretty neat!

Playing Video Efficiently

One thing to note before moving on is that playing video is a resource-intensive task. As things are, your app will continue to play these clips, even when you start watching a full-screen video.

To fix this issue, go to VideoFeedView.swift and find the onAppear block of VideoPlayer in makeFullScreenVideoPlayer(for:). Stop the video clip playback by setting the rate to 0.0:

embeddedVideoRate = 0.0

To resume the playback when the full-screen video closes, find the fullScreenCover view modifier in the body of VideoFeedView and add the following after the On Dismiss Closure comment:

embeddedVideoRate = 1.0

You can also stop playing videos and remove all items from the player object when it’s no longer needed by the system. To do so, return to LoopingPlayerView.swift and add this method to LoopingPlayerUIView:

func cleanup() {
  player?.pause()
  player?.removeAllItems()
  player = nil
}

Fortunately, SwiftUI provides a way to call this cleanup method. Add the following to LoopingPlayerView:

static func dismantleUIView(_ uiView: LoopingPlayerUIView, coordinator: ()) {
  uiView.cleanup()
}

This makes your wrapper a very good citizen in the SwiftUI world!

Build and run, and go to a full-screen video. The preview will resume where it left off when you return to the feed.

Video clip that pauses when full screen video plays

Trying Not to Steal the Show

If you’re going to make an app that has videos, it’s important to think about how your app will affect your users.

Yeah, that sounds blindingly obvious. But how many times have you been using an app that starts a silent video but turns off your music?

If you’ve never experienced this first-world travesty, plug in your headphones… Oh, sorry, present-day version: Bluetooth-connect your headphones. Turn on some music and then run the app. When you do, you’ll notice that your music is off even though the video looper isn’t making any noise!

As a considerate app developer, you should allow your user to turn off their own music instead of boldly assuming that your app should trump all others. Lucky for you, this isn’t very hard to fix by tweaking AVAudioSession‘s settings.

Head to AppMain.swift and add the following import to the top of the file:

import AVFoundation

Next, implement the default initializer with the following line:

init() {
  setMixWithOthersPlaybackCategory()
}

Don’t forget to implement the method you just used:

private func setMixWithOthersPlaybackCategory() {
  try? AVAudioSession.sharedInstance().setCategory(
    AVAudioSession.Category.ambient,
    mode: AVAudioSession.Mode.moviePlayback,
    options: [.mixWithOthers])
}

Here, you’re telling the shared AVAudioSession that you would like your audio to be in the ambient category. The default is AVAudioSession.Category.soloAmbient, which explains why audio from other apps was shut off.

You’re also specifying that your app is using audio for “movie playback” and that you’re fine with the sound mixing with sound from other sources.

Build and run, start your music back up and launch the app one more time.

Sample app running

You now have a video app that gives you the freedom to be the captain of your own ship. :]

Bonus: Adding Picture-in-Picture

What if you could continue watching the videos while doing other things on your device?

You’ll add the picture-in-picture (PiP) feature to the app.

First, you need to declare this compatibility for the app. In the Signing & Capabilities section for the app target, add Audio, AirPlay, and Picture in Picture background mode.

Adding background modes to the app

Next, you need to change the audio session category. PiP video doesn’t play in ambient mode. Open AppMain.swift and add this method:

private func setVideoPlaybackCategory() {
  try? AVAudioSession.sharedInstance().setCategory(.playback)
}

In the initializer, make sure to call this method instead of the old one:

init() {
  setVideoPlaybackCategory()
}

Build and run, then tap one of the list items to open the full-screen player. You’ll see the PiP button in the top left corner… or you won’t!

Full-screen player, hoping to see the PiP button

The downside is that, as of this writing (with iOS 14.5 being the latest available version), SwiftUI’s VideoPlayer view doesn’t show the PiP button. If you want to use PiP, you need to use AVPlayerViewController, which belongs to UIKit. The upside is that you already know how to create a bridge between SwiftUI and UIKit.

Create a file named VideoPlayerView.swift and replace its content with the following:

import SwiftUI
// 1
import AVKit

// 2
struct VideoPlayerView: UIViewControllerRepresentable {
  // 3
  let player: AVPlayer?

  func makeUIViewController(context: Context) -> AVPlayerViewController {
    // 4
    let controller = AVPlayerViewController()
    controller.player = player
    return controller
  }

  func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {}
}

  1. You import AVKit since AVPlayerViewController is inside this module.
  2. You define a struct conforming to UIViewControllerRepresentable to be able to use AVPlayerViewController in SwiftUI.
  3. As with all ways of playing videos you’ve seen up to now, AVPlayerViewController needs a player as well.
  4. You create an instance of AVPlayerViewController, set its player and return the instance.

That’s it for the bridge. Go back to VideoFeedView.swift and replace VideoPlayer(player: avPlayer) in makeFullScreenVideoPlayer(for:) with this:

VideoPlayerView(player: avPlayer)

Build and run, open a full-screen video and watch the PiP button appear in the top-left corner.

Note: Picture-in-picture may not work on the simulator. Try running on a device.

Picture in Picture video playback

You can also add PiP support to LoopingPlayerView. For brevity, the code to do that is included in the final project.

Where to Go From Here?

You can download the final project using the Download Materials button at the top or bottom of this video streaming tutorial for iOS.

You’ve successfully put together an app that can play both local and remote videos. It also efficiently shows off a highlight reel of all the coolest videos on the platform.

If you’re looking to learn more about video playback, this is only the tip of the iceberg. AVFoundation is a vast framework that can handle things such as:

  • Capturing video with the built-in camera.
  • Transcoding between video formats.
  • Applying filters to a video stream in real time.

As always, looking at the WWDC video archive when trying to learn more about a particular subject is a no-brainer.

One thing in particular not covered in this tutorial is reacting to AVPlayerItem‘s status property. Observing the status of remote videos will tell you about network conditions and the playback quality of streaming video.
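Observing status uses the same observe(_:options:changeHandler:) style you used for the looping player. A sketch, with a placeholder stream URL:

```swift
import AVFoundation

// A hypothetical HLS stream URL
let item = AVPlayerItem(url: URL(string: "https://example.com/stream.m3u8")!)
let player = AVPlayer(playerItem: item)

// Hold on to the token for as long as you want the observation to live
let statusToken = item.observe(\.status) { item, _ in
  switch item.status {
  case .readyToPlay:
    player.play()  // safe to start playback now
  case .failed:
    print("Playback failed: \(item.error?.localizedDescription ?? "unknown error")")
  default:
    break
  }
}
```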

Also, there’s a lot more to learn about livestreaming with HLS. If it’s something that interests you, check out Apple’s documentation. This page contains a nice list of links to other resources you can use to learn more.

You can also take a look at Forums and ask your questions there.

As always, thanks for reading, and let me know if you have any questions in the comments!