ShazamKit Tutorial for iOS: Getting Started
Learn how to use ShazamKit to find information about specific audio recordings by matching a segment of that audio against a reference catalog of audio signatures. By Saleh Albuga.
Contents
- Getting Started
- Understanding Shazam’s Matching Mechanism
- Shazam Signatures
- Shazam Catalogs
- Matching Music Against Shazam’s Catalog
- Exploring ShazamKit Sessions
- Displaying the Matched Song
- Testing The App
- Working With Custom Catalogs
- Shazam Signature Files
- Creating a Custom Catalog
- Matching Audio Against a Custom Catalog
- Synchronizing App Content With Audio
- Implementing the Annotations
- Displaying the Synchronized Annotations
- Testing the App
- Where to Go From Here?
You’ve probably heard a song you liked in a restaurant and wanted to know its name and artist. In this situation, the first thing that comes to mind is Shazam.
You simply open Shazam, tap recognize and voilà! The song info is right on your phone.
Apple acquired Shazam in 2018. With the release of Xcode 13 and iOS 15, Apple introduced ShazamKit, a framework you can use to add audio recognition experiences to your apps. Whether you want to show users what song is playing or match a track or a video you created, ShazamKit has got you more than covered.
In this tutorial, you’ll:
- Understand Shazam’s recognition mechanism.
- Create DevCompanion, a simple Shazam clone that matches popular, published music and songs.
- Match custom audio from a video.
- Change the app content depending on the video playing position.
For this tutorial, you should be familiar with the Shazam app or matching music with Siri. Don’t worry if you’re not. Just play a song on your laptop and ask Siri, “What’s this song?” or download the Shazam app.
Getting Started
Download the starter project by clicking Download Materials at the top or bottom of the tutorial. Open the project, then build and run.
DevCompanion has two views:
- What’s playing?: Where users can match popular music, just like Shazam.
- Video Content: Where users can see annotations and additional content while watching a SwiftUI video course here on raywenderlich.com.
Open MatchingHelper.swift and take a look at the code. It’s an empty helper class where you’ll write ShazamKit recognition code.
Don’t worry about the rest of the files for now. You’ll see them later in the tutorial when you create a custom audio experience. For now, you’ll learn more about how Shazam recognizes and matches audio.
You’ll also need an Apple Developer account in order to configure an App ID with the ShazamKit App Service.
Understanding Shazam’s Matching Mechanism
Before writing code and using the ShazamKit API, it’s essential to understand how Shazam works behind the scenes. This technology is exciting!
When you use Shazam, you tap the big recognition button, Tap to Shazam, while a song is playing. The app listens for a couple of seconds and then displays the song information if it finds a match. You can match any part of a song.
Here's what happens under the hood:
- The app starts using the microphone to record a stream with a predefined buffer size.
- The Shazam library, now called ShazamKit, generates a signature from the audio buffer the app just recorded.
- Then, ShazamKit sends a query request with this audio signature to the Shazam API. The Shazam service matches the signature against reference signatures of popular music in the Shazam Catalog.
- If there’s a match, the API returns the metadata of the track to ShazamKit.
- ShazamKit calls the appropriate delegate method, passing the metadata.
- Beyond this point, it’s up to the app logic to display the result with the track information.
Next, you’ll learn more about Shazam signatures and catalogs.
Shazam Signatures
Signatures are a fundamental part of the identification process. A signature is a lossy, simplified version of the song that's easier to process and analyze. Shazam starts creating a signature by generating the spectrogram of the recorded snippet, then extracting and identifying the peaks, the loudest parts.
A signature can't be reversed into the original audio, which protects the privacy of the recording.
During the identification process, Shazam matches query signatures sent by apps against reference signatures. A reference signature is a signature generated from the whole song or track.
Instead of comparing the recorded audio as is, there are many benefits to using signatures in identification. For example, Shazam signatures prevent most background noises from affecting the matching process, ensuring matching even in noisy conditions.
Signatures are also easier to share, store and index as they have a much smaller footprint than the original audio.
You can learn more about Shazam’s algorithm in this research paper by the founder of Shazam, Avery Wang.
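To make the idea concrete, here's a toy sketch of peak-pair fingerprinting in the spirit of Wang's paper. It's an illustration only, not Shazam's actual algorithm or code; the spectrogram input, constants and packing scheme are made up for the example:

// Toy fingerprint: keep only the loudest spectrogram peaks, then hash pairs
// of nearby peaks. Noise adds extra peaks but rarely removes the loudest
// ones, which is why this kind of signature survives noisy rooms.
struct Peak {
    let time: Int       // spectrogram column (time slice)
    let frequency: Int  // spectrogram row (frequency bin)
}

// magnitudes[t][f] is the spectrogram magnitude at time slice t, frequency bin f.
func fingerprint(magnitudes: [[Float]], peaksPerSlice: Int = 3) -> Set<Int> {
    // 1. Constellation map: the strongest bins of each time slice.
    let peaks: [Peak] = magnitudes.enumerated().flatMap { (t, slice) in
        slice.enumerated()
            .sorted { $0.element > $1.element }
            .prefix(peaksPerSlice)
            .map { Peak(time: t, frequency: $0.offset) }
    }
    // 2. Hash pairs of peaks that are close in time. The triple
    //    (f1, f2, Δt) is compact, indexable and robust to noise.
    //    Bit-packing assumes fewer than 4,096 frequency bins.
    var hashes: Set<Int> = []
    for a in peaks {
        for b in peaks where b.time > a.time && b.time - a.time <= 5 {
            hashes.insert((a.frequency << 20) | (b.frequency << 8) | (b.time - a.time))
        }
    }
    return hashes
}

Matching then reduces to counting how many of these hashes a short query shares with each reference fingerprint.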
Next, you’ll explore Shazam catalogs.
Shazam Catalogs
As mentioned earlier, Shazam matches signatures against reference signatures. It stores reference signatures and their metadata in catalogs. A signature’s metadata has information about the song, like its name, artist and artwork.
The Shazam Catalog has almost all popular songs’ reference signatures and metadata. You can also create a custom catalog locally in an app and store reference signatures and metadata for your audio tracks. You’ll create custom catalogs later in this tutorial.
Enough theory for now. Next, you’ll learn how to make the app identify popular music.
Matching Music Against Shazam’s Catalog
Time to implement the app’s first feature, a simplified Shazam clone. Open MatchingHelper.swift and look at the code:
import AVFAudio
import Foundation
import ShazamKit
class MatchingHelper: NSObject {
private var session: SHSession?
private let audioEngine = AVAudioEngine()
private var matchHandler: ((SHMatchedMediaItem?, Error?) -> Void)?
init(matchHandler handler: ((SHMatchedMediaItem?, Error?) -> Void)?) {
matchHandler = handler
}
}
It's a helper class that controls the microphone and uses ShazamKit to identify audio. At the top, you can see the code imports ShazamKit along with AVFAudio. You'll need AVFAudio to use the microphone and capture audio.
MatchingHelper also subclasses NSObject since that's required by any class that conforms to SHSessionDelegate.
Take a look at MatchingHelper's properties:
- session: The ShazamKit session you'll use to communicate with the Shazam service.
- audioEngine: An AVAudioEngine instance you'll use to capture audio from the microphone.
- matchHandler: A handler block the app views will implement. It's called when the identification process finishes.
The initializer makes sure matchHandler is set when you create an instance of the class.
Add the following method below the initializer:
func match(catalog: SHCustomCatalog? = nil) throws {
// 1. Instantiate SHSession
if let catalog = catalog {
session = SHSession(catalog: catalog)
} else {
session = SHSession()
}
// 2. Set SHSession delegate
session?.delegate = self
// 3. Prepare to capture audio
let audioFormat = AVAudioFormat(
standardFormatWithSampleRate:
audioEngine.inputNode.outputFormat(forBus: 0).sampleRate,
channels: 1)
audioEngine.inputNode.installTap(
onBus: 0,
bufferSize: 2048,
format: audioFormat
) { [weak session] buffer, audioTime in
// callback with the captured audio buffer
session?.matchStreamingBuffer(buffer, at: audioTime)
}
// 4. Start capturing audio using AVAudioEngine
try AVAudioSession.sharedInstance().setCategory(.record)
AVAudioSession.sharedInstance()
.requestRecordPermission { [weak self] success in
guard
success,
let self = self
else { return }
try? self.audioEngine.start()
}
}
match(catalog:) is the method the rest of the app's code will use to identify audio with ShazamKit. It takes one optional parameter of type SHCustomCatalog in case you want to match against a custom catalog.
Take a look at each step:
- First, you create an SHSession and pass a catalog to it if you use a custom catalog. SHSession defaults to the Shazam Catalog if you don't provide one, which is what you want for the first part of the app.
- You set the SHSession delegate, which you'll implement in a moment.
- You call AVAudioEngine's AVAudioNode.installTap(onBus:bufferSize:format:block:), a method that prepares the audio input node. In the callback, which receives the captured audio buffer, you call SHSession.matchStreamingBuffer(_:at:). This converts the audio in the buffer to a Shazam signature and matches it against the reference signatures in the selected catalog.
- You set the AVAudioSession category, or mode, to .record. Then, you request microphone recording permission by calling AVAudioSession's requestRecordPermission(_:), which asks the user for microphone access the first time the app runs. Finally, you start recording by calling AVAudioEngine.start().
matchStreamingBuffer(_:at:) takes the captured audio and passes it to ShazamKit. Alternatively, you can use SHSignatureGenerator to generate a signature object and pass it to SHSession's match(_:). However, matchStreamingBuffer(_:at:) is better suited to contiguous audio and therefore fits your use case.
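For comparison, here's roughly what the SHSignatureGenerator route looks like. This is a sketch, not part of DevCompanion; it assumes you already have AVAudioPCMBuffers, for example read from a file:

import AVFAudio
import ShazamKit

// Sketch: build a signature explicitly, then hand it to the session.
func matchPrerecorded(buffers: [AVAudioPCMBuffer], using session: SHSession) throws {
    let generator = SHSignatureGenerator()
    for buffer in buffers {
        // Append each buffer. Pass nil when you don't track audio timestamps.
        try generator.append(buffer, at: nil)
    }
    // Build the query signature and match it. The session reports the
    // result through the same delegate methods as streaming does.
    session.match(generator.signature())
}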
Next, you’ll implement the Shazam Session delegate.
Exploring ShazamKit Sessions
There are two steps left before you wire up the UI. First, you need to implement SHSessionDelegate to handle matching successes and failures.
Add the following class extension at the end of MatchingHelper.swift:
extension MatchingHelper: SHSessionDelegate {
func session(_ session: SHSession, didFind match: SHMatch) {
DispatchQueue.main.async { [weak self] in
guard let self = self else {
return
}
if let handler = self.matchHandler {
handler(match.mediaItems.first, nil)
// stop capturing audio
}
}
}
}
In this extension, you implement SHSessionDelegate.
SHSession calls session(_:didFind:) when the recorded signature matches a song in the catalog. It has two parameters: the SHSession it was called from and an SHMatch object that contains the results.
Here, you check whether matchHandler is set and call it, passing the following parameters:
- The first SHMatchedMediaItem of the returned mediaItems in SHMatch: ShazamKit might return multiple matches if the query signature matches multiple songs in the catalog. The matches are ordered by the quality of the match, the first having the highest quality.
- An error type: Since this is a success, you pass nil.
You’ll implement this handler block in SwiftUI in the next section.
Right after session(_:didFind:), add:
func session(
_ session: SHSession,
didNotFindMatchFor signature: SHSignature,
error: Error?
) {
DispatchQueue.main.async { [weak self] in
guard let self = self else {
return
}
if let handler = self.matchHandler {
handler(nil, error)
// stop capturing audio
}
}
}
session(_:didNotFindMatchFor:error:) is the delegate method SHSession calls when no song in the catalog matches the query signature, or when an error prevents matching. The third parameter contains the error that occurred, or nil if there was simply no match in the Shazam catalog for the query signature. Similar to what you did in session(_:didFind:), you call the same handler block, passing in the error.
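If you ever need to tell the two cases apart, you can inspect the error. A small sketch, assuming the failure arrives as ShazamKit's SHError type:

// Inside session(_:didNotFindMatchFor:error:):
if let shError = error as? SHError {
    // A real failure occurred, such as .matchAttemptFailed.
    print("Matching failed: \(shError.localizedDescription)")
} else {
    // A nil error simply means nothing in the catalog matched.
    print("No match for this signature.")
}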
Finally, to adhere to Apple’s microphone use guidelines and protect user privacy, you need to stop capturing audio when any of the two delegate methods are called.
Add the following method right after match(catalog:) in the main body of MatchingHelper:
func stopListening() {
audioEngine.stop()
audioEngine.inputNode.removeTap(onBus: 0)
}
Then, call stopListening() in both delegate methods above. Replace the following comment:
// stop capturing audio
with:
self.stopListening()
Next, you’ll display the matching result.
Displaying the Matched Song
The final part of your Shazam clone is the UI. Open SongMatchView.swift and check the preview in the canvas:
The view consists of two parts. The top part with the rounded green square is where you’ll show the song info. The bottom part has the Match button that starts the matching process.
First, you need a MatchingHelper object. At the top of SongMatchView, add:
@State var matcher: MatchingHelper?
Then, at the end of the view struct, right after body, add:
func songMatched(item: SHMatchedMediaItem?, error: Error?) {
isListening = false
if error != nil {
status = "Cannot match the audio :("
print(String(describing: error.debugDescription))
} else {
status = "Song matched!"
print("Found song!")
title = item?.title
subtitle = item?.subtitle
artist = item?.artist
coverUrl = item?.artworkURL
}
}
songMatched(item:error:) is the method MatchingHelper calls when it finishes matching. It:
- Sets isListening to false. As a result, the UI updates to show the user the app is no longer recording and hides the activity indicator.
- Checks the error parameter. If it isn't nil, there was an error, so it updates the status the user sees and logs the error to the console.
- If there was no error, it tells the user it found a match and updates the other properties with the song metadata.
Note: SHMatchedMediaItem is a subclass of SHMediaItem. It inherits a media item's metadata properties for the matched items, like the song's title, artist, genre, artwork URL and video URL. It also has properties specific to matched items, like frequencySkew, the difference in frequency between the matched audio and the query audio.
Next, at the end of NavigationView, add:
.onAppear {
if matcher == nil {
matcher = MatchingHelper(matchHandler: songMatched)
}
}
.onDisappear {
isListening = false
matcher?.stopListening()
status = ""
}
Here, you instantiate MatchingHelper when the view appears, passing the handler you just added. When the view disappears, for example when you switch to another tab, you stop the identification process by calling stopListening().
Finally, locate the Match button code shown below:
Button("Match") {
}
.font(.title)
In the button action block, add:
status = "Listening..."
isListening = true
do {
try matcher?.match()
} catch {
status = "Error matching the song"
}
This is where the magic starts. You change status to tell the user the app is listening, then call match() to start the matching process. When SHSession returns a result to MatchingHelper, it calls songMatched(item:error:).
Next, you’ll test the app.
Testing The App
First, make sure you've configured an App ID with the ShazamKit App Service, as mentioned in Getting Started, and set the bundle identifier and signing settings accordingly.
To try matching a song, build and run the app on an iPhone.
Open the following YouTube link to play the song. Can you guess the song?
Tap Match and bring your iPhone closer to the speakers. A few seconds later, you’ll see the match:
Hooray! The app successfully matched the song.
Working With Custom Catalogs
You learned how to use ShazamKit to match audio against the Shazam catalog. What if you wanted to match your music compositions or video content? ShazamKit’s got you covered.
Now you’ll implement the remaining part of DevCompanion, the Video Content tab. You’ll start by matching audio against a custom catalog you’ll create.
The app will identify the intro video of Your First iOS and SwiftUI App: An App From Scratch, an amazing free video course for learning the fundamentals of SwiftUI.
Before you do that, you need to learn more about Shazam Signature files.
Shazam Signature Files
As you saw earlier, Shazam catalogs are a collection of signatures and their metadata. But what does a Shazam Signature look like?
Shazam Signatures are stored in files with the .shazamsignature extension. They are opaque files that you can safely share or download from a remote server.
In the Project navigator, expand Signatures. You’ll find DevCompanion.shazamsignature, the SwiftUI course intro video’s Shazam signature file.
This signature file will be the reference signature in your custom catalog. ShazamKit will compare the query signature to this signature file to decide if you’re playing the intro video or something else.
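Because a signature file is just opaque data, you aren't limited to bundled resources. For instance, you could fetch one from your own server at runtime. A minimal sketch, with a hypothetical URL:

import Foundation
import ShazamKit

// Hypothetical endpoint serving a .shazamsignature file.
let signatureURL = URL(string: "https://example.com/signatures/intro.shazamsignature")!

func loadRemoteSignature() async throws -> SHSignature {
    let (data, _) = try await URLSession.shared.data(from: signatureURL)
    // SHSignature validates the opaque data and throws if it's corrupt.
    return try SHSignature(dataRepresentation: data)
}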
Next, you’ll create a custom catalog.
Creating a Custom Catalog
Create a Swift file in Data. In the Project navigator, right-click the Data folder and choose New File….
Then, select Swift File and click Next.
Next, name it DevVideosCatalog.swift and click Create. Finally, open the file and add:
import ShazamKit
enum DevVideosCatalog {
static func catalog() throws -> SHCustomCatalog? {
// 1. Make sure the signature file exists
guard let signaturePath = Bundle.main.url(
forResource: "DevCompanion",
withExtension: "shazamsignature") else {
return nil
}
// 2. Read the signature file and instantiate an SHSignature
let signatureData = try Data(contentsOf: signaturePath)
let refSignature = try SHSignature(dataRepresentation: signatureData)
// 3. Create an SHMediaItem with the metadata of this signature
let videoMetadata = SHMediaItem(
properties: [
.title: "Your First iOS and SwiftUI App: An App From Scratch",
.subtitle: "Introduction",
.artist: "Ray Wenderlich"
])
// 4. Create the custom catalog.
let customCatalog = SHCustomCatalog()
try customCatalog.addReferenceSignature(
refSignature,
representing: [videoMetadata])
return customCatalog
}
}
catalog() returns an object of type SHCustomCatalog, the type ShazamKit provides for custom catalogs. It's a static method that initializes your custom catalog and returns it. Here, it:
- Checks the app's bundle for the Shazam signature file DevCompanion.shazamsignature. If the file doesn't exist, it returns nil.
- Reads the Data contents of the signature file and initializes refSignature, an SHSignature, the container type ShazamKit uses to store signature data.
- Defines videoMetadata, the metadata of the SwiftUI course intro video. This is an SHMediaItem with some pre-defined properties.
- Initializes the catalog, then calls SHCustomCatalog.addReferenceSignature(_:representing:) to add the reference signature and its metadata to the catalog.
Next, you’ll match audio against this new custom catalog.
Matching Audio Against a Custom Catalog
Open VideoMatchView.swift and take a look at the preview in the Canvas.
The view looks similar to SongMatchView.
Delete the entire current code of VideoMatchView and VideoMatchView_Previews. Then, uncomment the code at the end of the file to replace them.
This implementation of VideoMatchView is the same as SongMatchView, except the labels have different names because you're matching a development video, not a song.
For example, take a look at VideoMatchView.videoMatched(result:error:):
func videoMatched(result: SHMatchedMediaItem?, error: Error?) {
isListening = false
if error != nil {
status = "Cannot match the audio :("
print(String(describing: error.debugDescription))
} else {
course = result?.title ?? course
episode = result?.subtitle ?? episode
author = result?.artist ?? author
}
}
Here you set the course text to the SHMatchedMediaItem's title, the episode text to its subtitle and the author text to its artist. After all, aren't developers and content creators artists?
Next, find the Start Episode button and take a look at its action code:
do {
try matcher?.match()
} catch {
status = "Error matching the song"
}
As you saw earlier in the Matching Music Against Shazam's Catalog section, MatchingHelper.match(catalog:) takes an optional parameter of type SHCustomCatalog and passes it to SHSession. If no custom catalog is passed, SHSession defaults to the Shazam catalog. You need to change that.
Replace this line:
try matcher?.match()
with:
try matcher?.match(catalog: DevVideosCatalog.catalog())
Here, you pass your custom catalog to MatchingHelper and, in turn, SHSession uses it for the next match. Now, you're ready to test.
Open the SwiftUI course intro and play the video. Build and run. Switch to the Video Content tab and hold your phone near your speakers so it can hear the video's soundtrack.
Now, tap Start Episode. After a few seconds, you'll see the video info at the top:
The app matched audio from your custom catalog!
You can also save a custom catalog as an opaque file, just like Shazam signatures, using SHCustomCatalog.write(to:). This file has the .shazamcatalog extension. To learn more, check out the Apple documentation.
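For example, you could build the catalog once, export it, and later restore it from the file instead of recreating it in code. A sketch using write(to:) and its counterpart add(from:):

// Export the custom catalog to a .shazamcatalog file.
let catalogURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("DevVideos.shazamcatalog")
try DevVideosCatalog.catalog()?.write(to: catalogURL)

// Later, rebuild a catalog from that file and use it in a session.
let restored = SHCustomCatalog()
try restored.add(from: catalogURL)
let session = SHSession(catalog: restored)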
Note: How do you create a Shazam signature file in the first place? To generate one for your own audio (the sketch after this list condenses the steps):
- Start capturing audio with AVAudioEngine, the same way you did earlier.
- In the callback of AVAudioNode.installTap(onBus:bufferSize:format:block:), call SHSignatureGenerator's append(_:at:), passing the buffer and audioTime from the callback parameters. This generates a signature from the captured audio.
- Stop recording when the track ends.
- Write SHSignatureGenerator.signature().dataRepresentation to a file.
Check out the ShazamSignatureGenerator project in the tutorial materials folder. It’s a sample app that lets you create Shazam signatures and export them to .shazamsignature files.
Next, you’ll create a custom audio experience.
Synchronizing App Content With Audio
The last feature you’ll add to DevCompanion shows users additional content while they watch the SwiftUI course intro. You’ll show them annotations describing what part of the course Ray is talking about while introducing it. Take a look at the illustration below:
The app will synchronize content with the video playing position. For example:
- At 00:05, VideoMatchView will show Welcome! and an illustration.
- At 00:14, when Ray describes the app you'll build in the course, the view shows Your first SwiftUI App! and a screenshot of the app.
- At 00:47, when Ray talks about the first part of the course structure, the view shows Course Overview: SwiftUI vs UIKit, the title of that part, along with an illustration.
Isn’t that cool? Next, you’ll implement these annotations.
Implementing the Annotations
You'll create a simple struct with the caption, the image to display and the time to show them.
In the Project navigator, expand Data and click VideoAnnotation.swift to open it. Add the following at the beginning of the file, before the commented extension:
struct VideoAnnotation: Comparable, Equatable {
let content: String
let imageName: String?
let offset: TimeInterval
init(content: String, offset: TimeInterval, imageName: String? = nil) {
self.content = content
self.offset = offset
self.imageName = imageName
}
static func < (lhs: VideoAnnotation, rhs: VideoAnnotation) -> Bool {
return lhs.offset < rhs.offset
}
static func == (lhs: VideoAnnotation, rhs: VideoAnnotation) -> Bool {
return lhs.content == rhs.content && lhs.offset == rhs.offset
}
}
VideoAnnotation has three properties:
- content: The string caption the user sees.
- imageName: The optional name of the annotation image.
- offset: The TimeInterval, in seconds, at which the annotation should display.
VideoAnnotation conforms to Comparable and Equatable because you need to compare annotations to determine which one to show, as you'll see later.
You implement the < operator from Comparable to compare annotations by their offset. Additionally, you implement the == operator from Equatable, specifying that two annotations are equal when their content and offset match.
Uncomment the VideoAnnotation extension below the struct and take a look at sampleAnnotations, the pre-defined array of annotations you'll use.
Each definition is similar to this:
VideoAnnotation(content: "Welcome!", offset: 5, imageName: "an-1")
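With these conformances in place, picking the annotation for a given playback position is a short query. A sketch (the playback position here is made up for illustration):

// Hypothetical playback position, 20 seconds into the video.
let position: TimeInterval = 20
// The annotation to show is the latest one whose offset has already passed.
let current = VideoAnnotation.sampleAnnotations
    .sorted()                       // Comparable orders annotations by offset
    .last { $0.offset <= position }
print(current?.content ?? "No annotation yet")

This is essentially the lookup you'll implement in the session delegate shortly.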
Next, you'll update VideoMatchView
to show the annotations.
Displaying the Synchronized Annotations
Start by returning the right VideoAnnotation to VideoMatchView.
Open MatchingHelper.swift and add the following properties to the class:
typealias MatchWithContentHandler =
((SHMatchedMediaItem?, VideoAnnotation?, Error?) -> Void)
private var matchWithContentHandler: MatchWithContentHandler?
private var lastMatch: SHMatchedMediaItem?
private var lastAnnotationMatch: VideoAnnotation?
matchWithContentHandler is a handler block similar to matchHandler, but it takes an extra parameter for the VideoAnnotation. lastMatch stores the last matched audio metadata, and lastAnnotationMatch stores the last matched annotation.
Then, below the class initializer, add:
init(matchWithContentHandler handler: MatchWithContentHandler?) {
matchWithContentHandler = handler
}
This is another initializer that sets matchWithContentHandler.
Note: By keeping matchHandler and matchWithContentHandler private and creating a separate initializer for each, you make sure only one of them is set and called by the delegate methods.
Next, you need to update SHSessionDelegate to call matchWithContentHandler.
Append the following to the end of session(_:didFind:), inside the DispatchQueue block:
if let handler = self.matchWithContentHandler {
let matchedAnnotation = VideoAnnotation
.sampleAnnotations.last { annotation in
(match.mediaItems.first?.predictedCurrentMatchOffset ?? 0) >
annotation.offset
}
if match.mediaItems.first != self.lastMatch
|| matchedAnnotation != self.lastAnnotationMatch {
handler(match.mediaItems.first, matchedAnnotation, nil)
self.lastMatch = match.mediaItems.first
self.lastAnnotationMatch = matchedAnnotation
}
}
Whenever SHSession calls session(_:didFind:), you:
- Find the right annotation by comparing each annotation's offset to SHMatchedMediaItem's predictedCurrentMatchOffset, the predicted current playing position.
- Whenever either the matched audio or the matched annotation changes, call matchWithContentHandler and update lastMatch and lastAnnotationMatch to the most recent matches.
According to the ShazamKit WWDC session on custom matching, ShazamKit can call session(_:didFind:) multiple times with the same match. Therefore, you only want to call your handler when you receive a new match.
Note: When you used MatchingHelper to match music in SongMatchView, you called stopListening() after calling matchHandler, like this:
if let handler = self.matchHandler {
handler(match.mediaItems.first, nil)
self.stopListening()
}
That's because you only needed the song's metadata. Here, you don't call stopListening() because you want ShazamKit to keep listening and matching, so it can keep reporting predictedCurrentMatchOffset as the track plays.
Next, append the following to session(_:didNotFindMatchFor:error:), again inside the DispatchQueue block:
if let handler = self.matchWithContentHandler {
handler(nil, nil, error)
self.stopListening()
}
When there isn't a match, or any other error occurs, you call matchWithContentHandler, passing the error. Then you call stopListening() to stop the matching process.
Finally, you'll update VideoMatchView to show the annotations.
In VideoMatchView.swift, replace the contents of onAppear(perform:) with:
if matcher == nil {
matcher = MatchingHelper(matchWithContentHandler: videoMatched)
}
Here, you call MatchingHelper's new initializer to set matchWithContentHandler.
Now, replace VideoMatchView.videoMatched(result:error:) with:
func videoMatched(
result: SHMatchedMediaItem?,
annotation: VideoAnnotation?,
error: Error?
) {
if error != nil {
status = "Cannot match the audio :("
print(String(describing: error.debugDescription))
} else {
course = result?.title ?? course
episode = result?.subtitle ?? episode
author = result?.artist ?? author
annotationImageName = annotation?.imageName ?? annotationImageName
annotationContent = annotation?.content ?? annotationContent
print("Match updated: \(String(describing: annotationContent))")
}
}
Here, you add the annotation parameter. You also set annotationImageName and annotationContent to the annotation's image name and caption.
It's time to test the app.
Testing the App
You're finally ready to test the new feature. Build and run. Then switch to Video Content.
Play the SwiftUI course intro video and tap Start Episode.
First, the app will identify the video and show the first annotation:
Then, at 00:14 the app will show:
Next, at 00:47 the app will show:
Watch the whole video. No cheating! When you get to the end, scrub back to the middle and notice how the app displays the right annotation.
Where to Go From Here?
You can download the final project by clicking Download Materials at the top or bottom of this tutorial.
In this tutorial, you learned about ShazamKit and Shazam's audio matching process. Along the way, you also learned how to:
- Identify popular music using the Shazam catalog.
- Create custom catalogs and match your own audio.
- Synchronize the app content with the played audio.
To learn more, check out Apple's ShazamKit documentation.
I hope you've enjoyed this tutorial. If you have any questions or comments, please join the forum discussion below.