Augmented Reality and ARKit Tutorial

Learn how to work with augmented reality in this SpriteKit and ARKit tutorial! By Caroline Begbie.

Augmented Reality. The phrase alone conjures up images of bionic limbs and holodecks. Wouldn’t it be cool if you could turn your iOS device into a mini-holodeck? Well, thanks to Apple’s exciting new framework named ARKit, now you can!

Augmented reality (AR), as its name indicates, adds computer-generated objects to reality. In short, you view the world through a camera and interact with virtual 2D or 3D objects inside that view.

With 2D AR, you can add overlays or signposts that respond to geographic location or visual features in real time. With 3D AR, you can visualize how a piece of furniture might look inside your living room without ever leaving your home.

AR has been around for a while. For example, Snapchat’s 3D filters recognize facial features and wrap 3D objects around your head. But to pull that off, you used to need some hard-core math, as well as hardware capable of tracking the real world in real time. With ARKit, Apple has provided both.

To demonstrate the differences between 2D and 3D ARKit, here is an example of Apple’s ARKit/SceneKit 3D template — a spaceship landing in my garden:

Now take a look at the 2D ARKit/SpriteKit template. With this template, you’re able to tap the screen and anchor a Space Invader 0.2 meters directly in front of your device’s camera.
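For reference, the template places that anchor with just a few lines. Here’s a sketch along the lines of Apple’s template code, assuming an ARSKView property named sceneView like the one you’ll add later in this tutorial:

if let currentFrame = sceneView.session.currentFrame {
  // Start from the camera's transform and translate it
  // 0.2 meters along its negative z-axis, i.e. straight ahead.
  var translation = matrix_identity_float4x4
  translation.columns.3.z = -0.2
  let transform = simd_mul(currentFrame.camera.transform, translation)

  // Adding the anchor prompts the ARSKView to ask its delegate
  // for a SpriteKit node to render at that position.
  let anchor = ARAnchor(transform: transform)
  sceneView.session.add(anchor: anchor)
}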

In this tutorial, you’re going to step up the Pest Control game from 2D Apple Games by Tutorials and convert it into an immersive augmented reality first-person shooter named ARniegeddon, bringing bugs and firebugs into your very own home with the power of ARKit.

Getting Started

ARniegeddon is a first-person shooter. You’ll add bugs to the view and shoot them from a distance. Firebugs, just as they were in Pest Control, are a bit tougher to destroy. You’ll have to locate bug spray and pick it up by scooping your phone through the bug spray canister. You’ll then be able to aim at a firebug and take it down.

You’ll start from a SpriteKit template, and you’ll see exactly what goes into making an AR game using ARKit.

This is what the game will look like when you’ve completed this tutorial:

Download the starter project for this tutorial, and run the game on your device. The starter project is simply the Hello World code created from a new SpriteKit template. I’ve added the image and sound assets to it. There’s also a Types.swift file that has an enum for the sounds you’ll be using.
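If you’re curious, here’s a hypothetical sketch of what such an enum might look like; the cases and file names in the starter project’s Types.swift will likely differ, so treat this as illustration only:

import SpriteKit

enum Sounds: String {
  case hit = "hit.wav"
  case bugSpray = "bugspray.wav"

  // Wrap each sound file in a ready-to-run SKAction.
  var action: SKAction {
    return SKAction.playSoundFileNamed(rawValue, waitForCompletion: false)
  }
}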

Note: To run the game on your device, you must select a development team under General\Signing of the ARniegeddon target.

Requirements

There are a few requirements for using ARKit. So before you get too far into this tutorial, make sure you have all of these:

  • A device with an A9 or later processor. Only the iPhone 6s and up, the 2017 iPads, and the iPad Pros can run ARKit.
  • Space. You’ll need plenty of space. One of the great things about AR is that you can develop games like Pokémon Go and encourage people to leave their homes — and play games in the park. To play ARniegeddon, you’ll need a clear space so you can capture bugs without falling over your furniture.
  • Contrast. If you’re in a dark room, you wouldn’t expect to see much, would you? With augmented reality, contrast is the key. So, if your room has white reflective tiles, white furniture and white walls, or is too dark, things won’t work too well; the camera needs contrast in order to distinguish surfaces and distances of objects.

How AR Works

ARKit tracks the real world using a process called Visual Inertial Odometry (VIO), which combines data from your device’s motion sensors with visual information from the camera. Some clever math takes this tracking information and maps features in the real 3D world to your 2D screen.

When your game first starts, the device records its initial position and orientation. The camera determines features of possible objects in its frame. When you move the device, there’s a new position and orientation. Because you’ve moved, the camera sees those objects slightly differently, and the device now has enough information to triangulate on a feature and work out the distance to an object. As you move, the device constantly refines this information and can gradually work out where the horizontal surfaces are.

Note: At the time of this writing, vertical surfaces are not calculated.

In addition to all this tracking, the sensors can examine the amount of available light and apply the same lighting to the AR objects within the scene.
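Both kinds of data surface directly in the API: every ARFrame carries the camera’s transform along with a light estimate. Here’s a minimal sketch, assuming a running session on an ARSKView named sceneView:

import ARKit

func inspectCurrentFrame(of sceneView: ARSKView) {
  // The session's current frame bundles tracking and lighting data.
  guard let frame = sceneView.session.currentFrame else { return }

  // The camera's position and orientation in world space, as a 4x4 matrix.
  let cameraTransform = frame.camera.transform
  print("Camera position: \(cameraTransform.columns.3)")

  // ARKit's estimate of the scene lighting; around 1000 is neutral.
  if let lightEstimate = frame.lightEstimate {
    print("Ambient intensity: \(lightEstimate.ambientIntensity)")
  }
}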

Rendering the View

The first thing you’ll change is the view itself. When using SpriteKit with ARKit, your main view will be an ARSKView. This is a subclass of SKView that renders live video captured by the camera in the view.

In Main.storyboard, in the view hierarchy, select View, which is listed under Game View Controller. In the Identity Inspector, change the class to ARSKView.
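As an aside, if you were building an AR game without a storyboard, you could create the view in code instead. A minimal sketch, not the approach this tutorial takes:

override func loadView() {
  // Create the ARSKView directly rather than re-classing the storyboard view.
  view = ARSKView(frame: UIScreen.main.bounds)
}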

In GameViewController.swift, replace all the imports at the top of the file with this:

import ARKit

Next, add a property for the view to GameViewController:

var sceneView: ARSKView!

Then, change this:

if let view = self.view as! SKView? {

…to this:

if let view = self.view as? ARSKView {
  sceneView = view

Here you get the loaded view as an ARSKView — matching your change in Main.storyboard — and you set sceneView to this view. Now the main view is set to display the camera video feed.
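For context, the relevant part of viewDidLoad now looks roughly like this; the template’s scene-loading code stays as it was, though details vary slightly between Xcode template versions:

override func viewDidLoad() {
  super.viewDidLoad()
  if let view = self.view as? ARSKView {
    sceneView = view

    // The template's remaining setup is unchanged: the SpriteKit
    // scene is presented on top of the live camera feed.
    if let scene = SKScene(fileNamed: "GameScene") {
      view.presentScene(scene)
    }
    view.showsFPS = true
    view.showsNodeCount = true
  }
}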

World Tracking with Sessions

You set up tracking by starting a session on an ARSKView. The configuration class ARWorldTrackingConfiguration tracks orientation and position of the device. In the configuration, you can specify whether or not to detect horizontal surfaces and also turn off lighting estimations.
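For example, turning on horizontal plane detection and turning off light estimation would look like this sketch; ARniegeddon sticks with the defaults for now:

let configuration = ARWorldTrackingConfiguration()
// Ask ARKit to detect horizontal surfaces such as floors and tabletops.
configuration.planeDetection = .horizontal
// Light estimation is on by default; disable it if you don't need it.
configuration.isLightEstimationEnabled = false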

Add these two methods to GameViewController:

override func viewWillAppear(_ animated: Bool) {
  super.viewWillAppear(animated)
  let configuration = ARWorldTrackingConfiguration()
  sceneView.session.run(configuration)
}
  
override func viewWillDisappear(_ animated: Bool) {
  super.viewWillDisappear(animated)
  sceneView.session.pause()
}

Here you start the session when the view appears and pause the session when the view disappears.
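In a shipping game, you’d also want to react when the session runs into trouble. ARSKViewDelegate inherits from ARSessionObserver, so once you set sceneView.delegate, you could add something like this sketch; the tutorial doesn’t require it:

extension GameViewController: ARSKViewDelegate {
  // Called when the session fails, for example due to an unsupported configuration.
  func session(_ session: ARSession, didFailWithError error: Error) {
    print("Session failed: \(error.localizedDescription)")
  }

  // Called when tracking is interrupted, such as when the app is backgrounded.
  func sessionWasInterrupted(_ session: ARSession) {
    print("Session interrupted")
  }

  // Called when the interruption ends; restart tracking from scratch.
  func sessionInterruptionEnded(_ session: ARSession) {
    if let configuration = session.configuration {
      session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    }
  }
}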

If you run the game at this time, you’ll get a crash. That’s because the app is trying to access the camera without permission.

Open Info.plist, control-click Information Property List, and choose Add Row.

Add the key NSCameraUsageDescription.

Double-click in the value field, and add the following description, which explains to the user how the game will use the camera:

For an immersive augmented reality experience, ARniegeddon requires access to the camera

Note: If you want to restrict your game to devices that support ARKit, find the UIRequiredDeviceCapabilities section in Info.plist (that’s the Required device capabilities section), and add arkit as a string entry in the array.
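If you’d rather degrade gracefully than exclude older devices outright, you can also check for support at runtime before starting the session. A sketch; there’s no need to change your code for this tutorial:

override func viewWillAppear(_ animated: Bool) {
  super.viewWillAppear(animated)
  // ARWorldTrackingConfiguration.isSupported is false on devices
  // with a processor older than the A9.
  guard ARWorldTrackingConfiguration.isSupported else {
    print("ARKit world tracking is not supported on this device")
    return
  }
  sceneView.session.run(ARWorldTrackingConfiguration())
}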

Build and run the game. You’ll see an alert requesting access to the camera, along with the description you just entered explaining why access is required. Tap OK.

The template’s GameScene code still runs, but with the camera view in the background. It’s rendering to the ARSKView that you set up earlier.

Note: When developing for augmented reality, you’re often moving around and generally don’t want the device to be tethered to the computer. Wireless development, new in Xcode 9, assists greatly with this. You can set this up under Window\Devices and Simulators.