Augmented Reality and ARKit Tutorial

Learn how to work with augmented reality in this SpriteKit and ARKit tutorial! By Caroline Begbie.


Augmented Reality. The phrase alone conjures up images of bionic limbs and holodecks. Wouldn’t it be cool if you could turn your iOS device into a mini-holodeck? Well, thanks to Apple’s exciting new framework named ARKit, now you can!

Augmented reality (AR), as its name indicates, adds computer-generated objects to reality. In short, you view the world through a camera and interact with virtual 2D or 3D objects inside that view.

With 2D AR, you can add overlays or signposts, which respond to geographic location or visual features in real-time. Or with 3D AR, you can visualize how a piece of furniture might look inside your living room without ever leaving your home.

AR has been around for a while. For example, Snapchat’s 3D filters recognize facial features and wrap 3D objects around your head. But in order for that to work, you needed some hard-core math. You also needed hardware that could track in real-world space. With ARKit, Apple has provided both.

To demonstrate the differences between 2D and 3D ARKit, here is an example of Apple’s ARKit/SceneKit 3D template — a spaceship landing in my garden:

Now take a look at the 2D ARKit/SpriteKit template. With this template, you’re able to tap the screen and anchor a Space Invader 0.2 meters directly in front of your device’s camera.

In this tutorial, you’re going to step up the Pest Control game from 2D Apple Games by Tutorials and convert it into an immersive augmented reality first-person shooter named ARniegeddon, bringing bugs and firebugs into your very own home with the power of ARKit.

Getting Started

ARniegeddon is a first-person shooter. You’ll add bugs to the view and shoot them from a distance. Firebugs, just as they were in Pest Control, are a bit tougher to destroy. You’ll have to locate bug spray and pick it up by scooping your phone through the bug spray canister. You’ll then be able to aim at a firebug and take it down.

You’ll start from a SpriteKit template, and you’ll see exactly what goes into making an AR game using ARKit.

This is what the game will look like when you’ve completed this tutorial:

Download the starter project for this tutorial, and run the game on your device. The starter project is simply the Hello World code created from a new SpriteKit template. I’ve added the image and sound assets to it. There’s also a Types.swift file that has an enum for the sounds you’ll be using.

Note: To run the game on your device, you must select a development team under General\Signing of the ARniegeddon target.


There are a few requirements for using ARKit. So before you get too far into this tutorial, make sure you have all of these:

  • An A9 or later processor. Only the iPhone 6s and up, the 2017 iPads and the iPad Pros can run ARKit.
  • Space. You’ll need plenty of space. One of the great things about AR is that you can develop games like Pokémon Go and encourage people to leave their homes — and play games in the park. To play ARniegeddon, you’ll need a clear space so you can capture bugs without falling over your furniture.
  • Contrast. If you’re in a dark room, you wouldn’t expect to see much, would you? With augmented reality, contrast is the key. So, if your room has white reflective tiles, white furniture and white walls, or is too dark, things won’t work too well; the camera needs contrast in order to distinguish surfaces and distances of objects.

How AR Works

Using a process called Visual Inertial Odometry (VIO), ARKit uses your device’s motion sensors, combined with visual information from the camera, to track the real world. Some clever math takes this tracking information and maps features in the real 3D world to your 2D screen.

When your game first starts, the device sets its initial position and orientation. The camera determines features of possible objects in its frame. When you move the device, there’s a new position and orientation. Because you’ve moved, the camera sees the objects slightly differently. The device now has enough information to triangulate on a feature and can work out the distance of an object. As you move, the device constantly refines the information and can gradually work out where there are horizontal surfaces.

Note: At the time of this writing, vertical surfaces are not calculated.

In addition to all this tracking, the sensors can examine the amount of available light and apply the same lighting to the AR objects within the scene.

Rendering the View

The first thing you’ll change is the view itself. When using SpriteKit with ARKit, your main view will be an ARSKView. This is a subclass of SKView that renders live video captured by the camera in the view.

In Main.storyboard, in the view hierarchy, select View, which is listed under Game View Controller. In the Identity Inspector, change the class to ARSKView.

In GameViewController.swift, replace all the imports at the top of the file with this:

import ARKit

Next, add a property for the view to GameViewController:

var sceneView: ARSKView!

Then, change this:

if let view = self.view as! SKView? {

…to this:

if let view = self.view as? ARSKView {
  sceneView = view

Here you get the loaded view as an ARSKView — matching your change in Main.storyboard — and you set sceneView to this view. Now the main view is set to display the camera video feed.

World Tracking with Sessions

You set up tracking by starting a session on an ARSKView. The configuration class ARWorldTrackingConfiguration tracks orientation and position of the device. In the configuration, you can specify whether or not to detect horizontal surfaces and also turn off lighting estimations.

Add these two methods to GameViewController:

override func viewWillAppear(_ animated: Bool) {
  super.viewWillAppear(animated)
  let configuration = ARWorldTrackingConfiguration()
  sceneView.session.run(configuration)
}

override func viewWillDisappear(_ animated: Bool) {
  super.viewWillDisappear(animated)
  sceneView.session.pause()
}

Here you start the session when the view appears and pause the session when the view disappears.

If you run the game at this time, you’ll get a crash. That’s because the app is trying to access the camera without permission.

Open Info.plist, control-click Information Property List, and choose Add Row.

Add the key NSCameraUsageDescription.

Double-click in the value field, and add the following description, which explains to the user how the game will use the camera:

For an immersive augmented reality experience, ARniegeddon requires access to the camera

Note: If you want to restrict your game to devices that support ARKit, find the UIRequiredDeviceCapabilities section in Info.plist (that’s the Required device capabilities section), and add arkit as a string entry in the array.

Build and run the game. You’ll get a dialog requesting access to the camera, displaying the text that explains why access is required. Tap OK.

The template’s GameScene code still runs, but with the camera view in the background. It’s rendering to the ARSKView that you set up earlier.

Note: When developing for augmented reality, you’re often moving around and generally don’t want the device to be tethered to the computer. Wireless development, new in Xcode 9, assists greatly with this. You can set this up under Window\Devices and Simulators.

Respond to Session Events

ARSKView’s session has delegate methods for certain events. For example, you’ll want to know if the session failed. Perhaps the user denied access to the camera, or she could be running the game on a device that doesn’t support AR. You need to address these issues.

In GameViewController.swift, add an extension for the delegate methods with placeholder error messages:

extension GameViewController: ARSKViewDelegate {
  func session(_ session: ARSession,
               didFailWithError error: Error) {
    print("Session Failed - probably due to lack of camera access")
  }

  func sessionWasInterrupted(_ session: ARSession) {
    print("Session interrupted")
  }

  func sessionInterruptionEnded(_ session: ARSession) {
    print("Session resumed")
    sceneView.session.run(session.configuration!,
                          options: [.resetTracking,
                                    .removeExistingAnchors])
  }
}

Taking each method in turn:

  • session(_:didFailWithError:): will execute when the view can’t create a session. This generally means that to be able to use the game, the user will have to allow access to the camera through the Settings app. This is a good spot to display an appropriate dialog.
  • sessionWasInterrupted(_:): means that the app is now in the background. The user may have pressed the home button or received a phone call.
  • sessionInterruptionEnded(_:): means that play is back on again. The camera won’t be in exactly the same orientation or position so you reset tracking and anchors. In the challenge at the end of the tutorial, you’ll restart the game.

Next, replace viewDidLoad() with this code:

override func viewDidLoad() {
  super.viewDidLoad()
  if let view = self.view as? ARSKView {
    sceneView = view
    sceneView!.delegate = self
    let scene = GameScene(size: view.bounds.size)
    scene.scaleMode = .resizeFill
    scene.anchorPoint = CGPoint(x: 0.5, y: 0.5)
    view.presentScene(scene)
    view.showsFPS = true
    view.showsNodeCount = true
  }
}

Here you set the view’s delegate and initialize GameScene directly, instead of through the scene file.

The Current Frame, Camera and Anchors

You’ve now set up your ARSKView with a session so the camera information will render into the view.

For every frame, the session captures the image and tracking information in an ARFrame object, available through the session’s currentFrame property. This ARFrame object has a camera, which holds positional information about the frame, along with a list of anchors.

These anchors are stationary tracked positions within the scene. Whenever you add an anchor, the scene view will execute a delegate method view(_:nodeFor:) and attach an SKNode to the anchor. When you add the game’s bug nodes to the scene, you’ll attach the bug nodes to these anchors.

Adding Bugs to the Scene

Now you’re ready to add bugs to the game scene.

First, you’ll remove all the template code. Delete GameScene.sks and Actions.sks since you won’t need them anymore.

In GameScene.swift, remove all the code in GameScene, leaving you with an empty class.

class GameScene: SKScene {
}

Replace the imports at the top with:

import ARKit

To get acquainted with ARKit, and possibly help with your entomophobia, you’re going to place a bug just in front of you.

Create a convenience property to return the scene’s view as an ARSKView:

var sceneView: ARSKView {
  return view as! ARSKView
}

Before you add the bug to your AR world, you need to make sure the AR session is ready. An AR session takes time to set up and configure itself.

First, create a property so you can check whether you added your AR nodes to the game world:

var isWorldSetUp = false

You’ll load the bug once — only if isWorldSetUp is false.

Add the following method:

private func setUpWorld() {
  guard let currentFrame = sceneView.session.currentFrame
    else { return }

  isWorldSetUp = true
}

Here you check whether the session has an initialized currentFrame. If the session doesn’t have a currentFrame, then you’ll have to try again later.

update(_:) is called every frame, so you can attempt to call the method inside there.

Override update(_:):

override func update(_ currentTime: TimeInterval) {
  if !isWorldSetUp {
    setUpWorld()
  }
}

Doing it this way, you only run the set up code once, and only when the session is ready.

Next you’ll create an anchor 0.3 meters in front of the camera, and you’ll attach a bug to this anchor in the view’s delegate.

But first there’s some math-y stuff to explain. Don’t worry — I’ll be gentle!

A Brief Introduction to 3D Math

Even though your game is a 2D SpriteKit game, you’re looking through the camera into a 3D world. Instead of setting position and rotation properties of an object, you set values in a 3D transformation matrix.

You may be a Neo-phyte to 3D (no more Matrix jokes, I promise!), so I’ll briefly explain.

A matrix is made up of rows and columns. You’ll be using 4×4 matrices, which have four rows and four columns.

Both ARCamera and ARAnchor have a transform property, which is a 4×4 matrix holding rotation, scaling and translation information. The current frame calculates ARCamera’s transform property, but you’ll adjust the translation element of the ARAnchor matrix directly.

The magic of matrices — and why they are used everywhere in 3D — is that you can create a transform matrix with rotation, scaling and translation information and multiply it by another matrix with different information. The result is then a new position in 3D space relative to an origin position.

Add this after the guard statement in setUpWorld():

var translation = matrix_identity_float4x4

Here you create a 4×4 identity matrix. When you multiply any matrix by an identity matrix, the result is the same matrix. For example, when you multiply any number by 1, the result is the same number; 1 is effectively the 1×1 identity matrix. The origin’s transform matrix is an identity matrix, so you always set a matrix to identity before adding positional information to it.

This is what the identity matrix looks like:

The last column consists of (x, y, z, 1) and is where you set translation values.

Add this to setUpWorld(), right after the previous line:

translation.columns.3.z = -0.3

This is what the translation matrix looks like now:

Rotation and scaling use the first three columns and are more complex. They’re beyond the scope of this tutorial, and in ARniegeddon you won’t need them.

Continue adding to your code in setUpWorld():

let transform = currentFrame.camera.transform * translation

Here you multiply the transform matrix of the current frame’s camera by your translation matrix. This results in a new transform matrix. When you create an anchor using this new matrix, ARKit will place the anchor at the correct position in 3D space relative to the camera.
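To make the multiplication concrete, here is a plain-Swift sketch of what camera.transform * translation computes. It substitutes nested arrays for ARKit’s simd_float4x4 (using the same column-major layout), so treat it as an illustration of the math, not ARKit’s actual implementation:

```swift
// Plain-Swift sketch of the 4×4, column-major matrix math that
// simd_float4x4 performs. Illustration only — not ARKit's actual types.
typealias Matrix4 = [[Float]]  // four columns, each with four rows

let identity: Matrix4 = [
  [1, 0, 0, 0],
  [0, 1, 0, 0],
  [0, 0, 1, 0],
  [0, 0, 0, 1]
]

// Column-major multiply: result(row, col) = sum over k of a(row, k) * b(k, col)
func multiply(_ a: Matrix4, _ b: Matrix4) -> Matrix4 {
  var result: Matrix4 = Array(repeating: Array(repeating: 0, count: 4),
                              count: 4)
  for col in 0..<4 {
    for row in 0..<4 {
      for k in 0..<4 {
        result[col][row] += a[k][row] * b[col][k]
      }
    }
  }
  return result
}

// As in the tutorial: start from identity and move 0.3 meters forward.
var translation = identity
translation[3][2] = -0.3  // the equivalent of translation.columns.3.z = -0.3

// With an identity "camera" transform, the anchor lands 0.3 m in front:
let transform = multiply(identity, translation)
print(transform[3])  // [0.0, 0.0, -0.3, 1.0]
```

If the camera matrix also contained rotation, the same multiply would carry that rotation into the anchor’s position, which is why matrix products compose transforms so neatly.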

Note: To learn more about matrices and linear algebra, take a look at 3Blue1Brown’s video series Essence of linear algebra.

Now, add the anchor to the scene using the new transformation matrix. Continue adding this code to setUpWorld():

let anchor = ARAnchor(transform: transform)
sceneView.session.add(anchor: anchor)

Here you add an anchor to the session. The anchor is now a permanent feature in your 3D world (until you remove it). Each frame tracks this anchor and recalculates the transformation matrices of the anchors and the camera using the device’s new position and orientation.

When you add an anchor, the session calls sceneView’s delegate method view(_:nodeFor:) to find out what sort of SKNode you want to attach to this anchor.

Next, you’re going to attach a bug.

In GameViewController.swift, add this delegate method to GameViewController’s ARSKViewDelegate extension:

func view(_ view: ARSKView,
          nodeFor anchor: ARAnchor) -> SKNode? {
  let bug = SKSpriteNode(imageNamed: "bug")
  bug.name = "bug"
  return bug
}

Hold up your phone and build and run. And I mean… RUN! Your space has been invaded!

Move your phone around, and the bug will stay where it’s anchored. The tracking becomes more effective the more information you give the phone, so move your phone around a bit to give it some position and orientation updates.

Notice that whichever way you turn the camera, the bug faces you. This is called a billboard, which is a technique used in many 3D games as a cheap way of adding elements such as trees and grass to a scene. Simply add a 2D object to a 3D scene and make sure that it’s always facing the viewer.

If you want to be able to walk around the bug and see what it looks like from behind, you’ll have to model the bug in a 3D app such as Blender, and use ARKit with either SceneKit or Metal.

Note: You can find out more about ARKit and SceneKit in our book iOS 11 by Tutorials.

Light Estimation

If you’re in a darkened room, then your bug will be lit up like a firefly. Bring your hand to gradually cover the camera and see the bug shine. Luckily, ARKit has light estimation so that your bug can lurk creepily in dark corners.

In GameScene.swift, add this to the end of update(_:):

// 1
guard let currentFrame = sceneView.session.currentFrame,
  let lightEstimate = currentFrame.lightEstimate else {
    return
}

// 2
let neutralIntensity: CGFloat = 1000
let ambientIntensity = min(lightEstimate.ambientIntensity,
                           neutralIntensity)
let blendFactor = 1 - ambientIntensity / neutralIntensity

// 3
for node in children {
  if let bug = node as? SKSpriteNode {
    bug.color = .black
    bug.colorBlendFactor = blendFactor
  }
}

Here’s the breakdown of this code:

  1. You retrieve the light estimate from the session’s current frame.
  2. The measure of light is lumens, and 1000 lumens is a fairly bright light. Using the light estimate’s intensity of ambient light in the scene, you calculate a blend factor between 0 and 1, where 0 will be the brightest.
  3. Using this blend factor, you calculate how much black should tint the bugs.

As you pan about the room, the device will calculate available light. When there’s not much light, the bug will be shaded. Test this by holding your hand in front of the camera at different distances.
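You can check the blend-factor arithmetic with a few sample values in plain Swift. The intensities below are made-up test numbers; at runtime the value comes from the frame’s light estimate:

```swift
// Plain-Swift check of the light-estimation blend math.
// The intensities here are sample values, not real ARKit readings.
func blendFactor(forAmbientIntensity intensity: Double) -> Double {
  let neutralIntensity = 1000.0                   // ~1000 lumens reads as neutral
  let clamped = min(intensity, neutralIntensity)  // cap so the factor never goes negative
  return 1 - clamped / neutralIntensity           // 0 = full brightness, 1 = darkest
}

print(blendFactor(forAmbientIntensity: 1000))  // 0.0 — bright room, no tint
print(blendFactor(forAmbientIntensity: 250))   // 0.75 — dark room, heavy black tint
```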

Shooting Bugs

Unless you’re an acrobat, these bugs are a little difficult to stomp on. You’re going to set up your game as a first-person shooter, very similar to the original Doom.

In GameScene.swift, add a new property to GameScene:

var sight: SKSpriteNode!

Override didMove(to:):

override func didMove(to view: SKView) {
  sight = SKSpriteNode(imageNamed: "sight")
  addChild(sight)
}

This adds a sight to the center of the screen so you can aim at the bugs.

You’ll fire by touching the screen. Still in GameScene, override touchesBegan(_:with:) as follows:

override func touchesBegan(_ touches: Set<UITouch>,
                           with event: UIEvent?) {
  let location = sight.position
  let hitNodes = nodes(at: location)
}

Here you retrieve an array of all the nodes that intersect the same xy location as the sight. Although ARAnchors are in 3D, SKNodes are still in 2D. ARKit very cleverly calculates a 2D position and scale for the SKNode from the 3D information.
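The idea behind that 3D-to-2D mapping is a pinhole projection: points farther from the camera project closer to the screen’s center and at a smaller scale. Here is a simplified sketch with a made-up focal length, illustrating the principle rather than ARKit’s internals:

```swift
// Simplified pinhole projection: a 3D point in camera space maps to a 2D
// screen offset. Focal length and positions are illustrative values only.
func project(x: Float, y: Float, z: Float, focalLength: Float) -> (x: Float, y: Float) {
  // z is negative in front of the camera, matching ARKit's convention
  return (x: focalLength * x / -z, y: focalLength * y / -z)
}

let near = project(x: 0.25, y: 0, z: -0.5, focalLength: 800)
let far  = project(x: 0.25, y: 0, z: -1.0, focalLength: 800)
print(near.x)  // 400.0 — the same sideways offset projects...
print(far.x)   // 200.0 — ...half as far from center at twice the distance
```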

You’ll now find out if any of these nodes are a bug and, if so, retrieve the first one. Add this to the end of touchesBegan(_:with:):

var hitBug: SKNode?
for node in hitNodes {
  if node.name == "bug" {
    hitBug = node
    break
  }
}

Here you cycle through the hitNodes array and find out if any of the nodes in the array are bugs. hitBug now contains the first bug hit, if any.

You’ll need a couple of sounds to make the experience more realistic. The sounds are defined in Sounds in Types.swift; they are all ready for you to use.

Continue by adding this code to the end of the same method:

if let hitBug = hitBug,
  let anchor = sceneView.anchor(for: hitBug) {
  let action = {
    self.sceneView.session.remove(anchor: anchor)
  }
  let group =[Sounds.hit, action])
  let sequence = [SKAction.wait(forDuration: 0.3), group]
  hitBug.run(SKAction.sequence(sequence))
}
run(Sounds.fire)

You play a sound to indicate you’ve fired your weapon. If you do hit a bug, then play the hit sound after a short delay to indicate the bug is some distance away. Then remove the anchor for the node, which will also remove the bug node itself.

Build and run, and kill your first bug!

Level Design

Of course, a game with one bug in it isn’t much of a game. In Pest Control, you edited tile maps in the scene editor to specify the positions of your bugs. You’ll be doing something similar here, but you’ll directly add SKSpriteNodes to a scene in the scene editor.

Although ARniegeddon is fully immersive, you’ll design the level as top down. You’ll lay out the bugs in the scene as if you are a god-like being in the sky, looking down at your earthly body. When you come to play the game, you’ll be at the center of the world, and the bugs will be all around you.

Create a new SpriteKit Scene named Level1.sks. Change the scene size to Width: 400, Height: 400. The size of the scene is largely irrelevant, as you will calculate the real world position of nodes in the scene using a gameSize property which defines the size of the physical space around you. You’ll be at the center of this “scene”.

Place three Color Sprites in the scene and set their properties as follows:

  1. Name: bug, Texture: bug, Position: (-140, 50)
  2. Name: bug, Texture: bug, Position: (0, 150)
  3. Name: firebug, Texture: firebug, Position: (160, 120)

Imagine you’re at the center of the scene. You’ll have one bug on your left, one straight in front of you and the firebug on your right.

2D Design to 3D World

You’ve laid out the bugs in a 2D scene, but you need to position them in a 3D perspective. From a top view, looking down on the 2D scene, the 2D x-axis maps to the 3D x-axis. However, the 2D y-axis maps to the 3D z-axis — this determines how far away the bugs are from you. There is no mapping for the 3D y-axis — you’ll simply randomize this value.

In GameScene.swift, set up a game size constant to determine the real world area that you’ll play the game in. This will be a 2-meter by 2-meter space with you in the middle. In this example, you’ll be setting the game size to be a small area so you can test the game indoors. If you play outside, you’ll be able to set the game size larger:

let gameSize = CGSize(width: 2, height: 2) 

Replace setUpWorld() with the following code:

private func setUpWorld() {
  guard let currentFrame = sceneView.session.currentFrame,
    // 1
    let scene = SKScene(fileNamed: "Level1")
    else { return }

  for node in scene.children {
    if let node = node as? SKSpriteNode {
      var translation = matrix_identity_float4x4
      // 2
      let positionX = node.position.x / scene.size.width
      let positionY = node.position.y / scene.size.height
      translation.columns.3.x =
        Float(positionX * gameSize.width)
      translation.columns.3.z =
        -Float(positionY * gameSize.height)
      let transform =
        currentFrame.camera.transform * translation
      let anchor = ARAnchor(transform: transform)
      sceneView.session.add(anchor: anchor)
    }
  }
  isWorldSetUp = true
}

Taking each numbered comment in turn:

  1. Here you load the scene, complete with bugs from Level1.sks.
  2. You calculate the position of the node relative to the size of the scene. ARKit translations are measured in meters. Turning 2D into 3D, you use the y-coordinate of the 2D scene as the z-coordinate in 3D space. Using these values, you create the anchor and the view’s delegate will add the SKSpriteNode bug for each anchor as before.

The 3D y value — that’s the up and down axis — will be zero. That means the node will be added at the same vertical position as the camera. Later you’ll randomize this value.

Build and run and see the bugs laid out around you.

The firebug on your right still has the orange bug texture instead of the red firebug texture. You’re creating it in ARSKViewDelegate’s view(_:nodeFor:), which currently doesn’t distinguish between different types of bug. You’ll adjust this later.

First, you’ll randomize the y-position of the bugs.

Still in setUpWorld(), before:

let transform = currentFrame.camera.transform * translation

add this:

translation.columns.3.y = Float(drand48() - 0.5)

drand48() returns a random value between 0 and 1. Here you shift it down to get a random value between -0.5 and 0.5. Assigning this to the translation matrix means each bug will appear at a random height, anywhere from half a meter above the device to half a meter below it. For the game to work properly, this assumes the user is holding the device at least half a meter off the ground.
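You can verify the shifted range with a quick standalone snippet; the seed value here is arbitrary and only makes the sketch repeatable:

```swift
import Foundation

// drand48() returns a Double in [0, 1); subtracting 0.5 shifts that range
// to [-0.5, 0.5) — up to half a meter above or below the device.
srand48(42)  // arbitrary fixed seed, so this sketch is repeatable
for _ in 0..<100 {
  let height = drand48() - 0.5
  assert(height >= -0.5 && height < 0.5)
}
print("all random heights stay within half a meter of the device")
```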

To vary the random numbers, you’ll seed drand48(); otherwise, the random sequence will be the same every time you run the game. Identical sequences can be handy while you’re testing, but not so good in a real game.

Add this to the end of didMove(to:) to seed the random number generator:

srand48(Int(Date.timeIntervalSinceReferenceDate))
Build and run, and your bugs should show up in the same locations as before, but at random heights.

Note: This was a lucky shot — you probably won’t see the scary alien ray.


Currently the game only has one type of bug in it. Now you’re going to create different nodes for bugs, firebugs and bug spray.

In Types.swift, add this enum:

enum NodeType: String {
  case bug = "bug"
  case firebug = "firebug"
  case bugspray = "bugspray"
}

New nodes will be attached to an anchor, so the anchor needs to have a type property to track the type of node you want to create.

Create a new file with the iOS/Source/Cocoa Touch Class template. Name the class Anchor and make it a subclass of ARAnchor. At the top of the file, replace the import statement with:

import ARKit

Add this new property to hold the type of node associated with the anchor:

var type: NodeType?

In GameScene.swift, in setUpWorld(), in the for loop, replace this:

let anchor = ARAnchor(transform: transform)
sceneView.session.add(anchor: anchor)

with this:

let anchor = Anchor(transform: transform)
if let name = node.name,
  let type = NodeType(rawValue: name) {
  anchor.type = type
  sceneView.session.add(anchor: anchor)
}

Here you get the type of the bug from the SKSpriteNode name you specified in Level1.sks. You create the new anchor, specifying the type.

In GameViewController.swift, in the ARSKViewDelegate extension, replace view(_:nodeFor:) with this:

func view(_ view: ARSKView,
          nodeFor anchor: ARAnchor) -> SKNode? {
  var node: SKNode?
  if let anchor = anchor as? Anchor {
    if let type = anchor.type {
      node = SKSpriteNode(imageNamed: type.rawValue)
      node?.name = type.rawValue
    }
  }
  return node
}

You check to see whether the anchor being added is of the subclass Anchor. If it is, then you create the appropriate SKSpriteNode using the anchor’s type.

Build and run, and this time the firebug on your right should have the correct red texture. Oh… also, you can’t kill it.

Anchor Collision

Remember how in Pest Control you needed bug spray to kill a firebug? Your current game will also place bug spray randomly around the scene, but the player will pick up bug spray by moving the phone over the bug spray canister. The player will then be able to kill one firebug with that bug spray.

The steps you’ll take are as follows:

  1. Add bug spray at a random position when you add a firebug.
  2. Check the distance of your device and the bug spray nodes each frame.
  3. If a collision occurs, “pick up” bug spray by removing the anchor and bug spray node and providing a visual cue.

In GameScene.swift, add a new method to GameScene:

private func addBugSpray(to currentFrame: ARFrame) {
  var translation = matrix_identity_float4x4
  translation.columns.3.x = Float(drand48()*2 - 1)
  translation.columns.3.z = -Float(drand48()*2 - 1)
  translation.columns.3.y = Float(drand48() - 0.5)
  let transform = currentFrame.camera.transform * translation
  let anchor = Anchor(transform: transform)
  anchor.type = .bugspray
  sceneView.session.add(anchor: anchor)
}

In this method, you add a new anchor of type bugspray with a random position. You randomize the x (side) and z (forward/back) values between -1 and 1 and the y (up/down) value between -0.5 and 0.5.

In setUpWorld(), call this method when you add a firebug. After this:

sceneView.session.add(anchor: anchor)

add this:

if anchor.type == .firebug {
  addBugSpray(to: currentFrame)
}

Build and run to confirm that the game spawns one bug spray canister for each firebug.

Hint: If you can’t see the bug spray, walk away from the game area and point the device back. You should be able to see all of the nodes in the game area. You can even see through walls! And if you still can’t see it, try commenting out the lighting code in update(_:).

Now for the collision. This is a simplified collision, as a real physics engine would have more efficient algorithms.

update(_:) runs every frame, so you’ll be able to check the current distance of the bug spray from the device by using the camera’s transformation matrix and the bug spray’s anchor’s transformation matrix.

You’ll also need a method to remove the bug spray and its anchor when the collision is successful.

Add this method to GameScene to remove the bug spray and make a “success” sound:

private func remove(bugspray anchor: ARAnchor) {
  run(Sounds.bugspray)
  sceneView.session.remove(anchor: anchor)
}

Here you set up an SKAction to make a sound and then remove the anchor. This also removes the SKNode attached to the anchor.

At the end of update(_:), add:

// 1
for anchor in currentFrame.anchors {
  // 2
  guard let node = sceneView.node(for: anchor), == NodeType.bugspray.rawValue
    else { continue }
  // 3
  let distance = simd_distance(anchor.transform.columns.3,
  // 4
  if distance < 0.1 {
    remove(bugspray: anchor)
    break
  }
}

Going through this code point-by-point:

  1. You process all of the anchors attached to the current frame.
  2. You check whether the node for the anchor is of type bugspray. At the time of writing, there is an Xcode bug whereby subclasses of ARAnchor lose their properties, so you can’t check the anchor type directly.
  3. ARKit includes the framework simd, which provides a distance function. You use this to calculate the distance between the anchor and the camera.
  4. If the distance is less than 10 centimeters, you remove the anchor from the session. This will remove the bug spray node as well.
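The distance check boils down to the Euclidean distance between two positions, which you can sketch in plain Swift. The positions below are made-up values standing in for the two transform columns:

```swift
// What simd_distance computes for the two matrix columns: the straight-line
// distance between two positions. Plain-Swift sketch with made-up positions;
// in the game, the positions come from transform.columns.3.
func distance(_ a: (x: Float, y: Float, z: Float),
              _ b: (x: Float, y: Float, z: Float)) -> Float {
  let dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z
  return (dx * dx + dy * dy + dz * dz).squareRoot()
}

let camera: (x: Float, y: Float, z: Float) = (0, 0, 0)
let bugspray: (x: Float, y: Float, z: Float) = (0.3, 0, 0)  // 30 cm to the right
print(distance(camera, bugspray) < 0.1)  // false — still out of reach
```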

You should give the player a visual cue when she manages to collide the device with the bug spray and pick it up. You’ll set up a property that swaps the sight image whenever its value changes.

Add this property to GameScene:

var hasBugspray = false {
  didSet {
    let sightImageName = hasBugspray ? "bugspraySight" : "sight"
    sight.texture = SKTexture(imageNamed: sightImageName)
  }
}

When you set hasBugspray to true, you change the sight to a different texture, indicating that you’re carrying the ultimate firebug exterminator.

At the end of remove(bugspray:), add this:

hasBugspray = true

Build and run and see if you can pick up a bug spray canister. Notice that while you’re holding a bug spray canister, the sight texture changes to a green one.

Firebug Destruction

In touchesBegan(_:with:), locate the for loop where you set up hitBug with the hit SKSpriteNode.

Replace this:

if node.name == "bug" {

with this:

if node.name == NodeType.bug.rawValue ||
  (node.name == NodeType.firebug.rawValue && hasBugspray) {

As well as checking to see if the node is a “bug”, you can now check to see if it’s a firebug. If it is a firebug and you have bug spray, then you’ve scored a hit.

At the end of touchesBegan(_:with:), add this:

hasBugspray = false

You only get one shot with the bug spray. If you miss, beware! You can no longer kill the firebug.

Where to Go From Here?

You can download the final project for this tutorial here.

Congratulations! At this point you have an almost playable game, and you’ve experimented with 2D ARKit. I hope I’ve whetted your appetite for the potential of augmented reality, so you’ll move into the more exciting third dimension. Our book 3D iOS Games by Tutorials will teach you how to work with SceneKit so you’ll be comfortable using 3D objects in 3D space.

Using ARKit and SceneKit, you’ll be able to move 3D models around the scene instead of attaching them to a stationary anchor. You’ll be able to walk around them and examine the model from behind. You’ll also be able to gather information about flat surfaces, place your models on a surface and measure distances accurately.

If you enjoyed what you learned in this tutorial, why not check out the complete 2D Apple Games by Tutorials book, available on our store?

Here’s a taste of what’s in the book:

  • Section I: Getting Started: This section covers the basics of making 2D games with SpriteKit. These are the most important techniques, the ones you’ll use in almost every game you make. By the time you reach the end of this section, you’ll be ready to make your own simple game. Throughout this section, you’ll create an action game named Zombie Conga, where you take the role of a happy-go-lucky zombie who just wants to party!
  • Section II: Physics and Nodes: In this section, you’ll learn how to use the built-in 2D physics engine included with SpriteKit. You’ll also learn how to use special types of nodes that allow you to play videos and create shapes in your game. In the process, you’ll create a physics puzzle game named Cat Nap, where you take the role of a cat who has had a long day and just wants to go to bed.
  • Section III: Tile Maps: In this section, you’ll learn about tile maps in SpriteKit and how to save and load game data. In the process, you’ll create a game named Pest Control, where you take control of a vigorous, impossibly ripped he-man named Arnie. Your job is to lead Arnie to bug-fighting victory by squishing all those pesky bugs.
  • Section IV: Juice: In this section, you’ll learn how to take a good game and make it great by adding a ton of special effects and excitement — also known as “juice.” In the process, you’ll create a game named Drop Charge, where you’re a space hero with a mission to blow up an alien space ship — and escape with your life before it explodes. To do this, you must jump from platform to platform, collecting special boosts along the way. Just be careful not to fall into the red hot lava!
  • Section V: Other Platforms: In this section, you’ll learn how to leverage your iOS knowledge to build games for the other Apple Platforms: macOS, tvOS and watchOS. In the process, you’ll create a game named Zombie Piranhas. In this game, your goal is to catch as many fish as possible without hooking a zombie — because we all know what happens when zombies are around.
  • Section VI: Advanced Topics: In this section, you’ll learn some APIs other than SpriteKit that are good to know when making games for the Apple platforms. In particular, you’ll learn how to add Game Center leaderboards and achievements into your game. You’ll also learn how to use the ReplayKit API. In the process, you’ll integrate these APIs into a top-down racing game named Circuit Racer, where you take the role of an elite race car driver out to set a world record — which wouldn’t be a problem if all this debris wasn’t on the track!
  • Section VII: Bonus Chapters: On top of the above, we included two bonus chapters. You can learn about the new ARKit framework by reworking the Pest Control game and turning it into an Augmented Reality game. If you liked the art in these mini-games and want to learn how to either hire an artist or make some art of your own, there’s a chapter to guide you through drawing a cute cat in the style of this book with Illustrator.

By the end of this book, you’ll have some great hands-on experience with how to build exciting, good-looking games using Swift and SpriteKit!

And to help sweeten the deal, the digital edition of the book is on sale for $49.99! But don’t wait — this sale price is only available for a limited time.

Speaking of sweet deals, be sure to check out the great prizes we’re giving away this year with the iOS 11 Launch Party, including over $9,000 in giveaways!

To enter, simply retweet this post using the #ios11launchparty hashtag.

We hope you enjoy this update to one of our most-loved books. Stay tuned for more book releases and updates coming soon!