
15. ARKit & SceneKit
Written by Chris Language


Now that you’ve seen ARKit in action alongside Apple’s general-purpose 2D graphics framework, SpriteKit, it’s time to unlock the third dimension with SceneKit. In this chapter, you’ll continue learning about ARKit, this time with SceneKit as the rendering technology.

You’ll start by creating a new SceneKit-based AR project with Xcode. The project will grow into an interactive 3D augmented reality experience: a small AR Airport with basic animations.

The AR experience borrows a few enterprise concepts, like the Internet of Things (IoT) and the Digital Twin.

Don’t worry, the data component for this project is non-existent. Your main focus is to create an awesome AR experience that will serve as a fun little frontend.

What is SceneKit?

SceneKit is a high-performance rendering engine and 3D graphics framework. It’s built on top of Metal, which delivers the highest performance possible. It leverages the power of Swift to deliver a simple, yet extremely powerful and descriptive, 3D graphics framework. With it, you can easily import, manipulate and render 3D content. With its built-in physics simulation and animation capabilities, creating rich 3D experiences has never been easier.

Best of all, Apple’s platforms all support SceneKit and it integrates extremely well with other frameworks, like GameplayKit and SpriteKit.
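To see how compact SceneKit code can be, here’s a minimal sketch that builds a scene containing a single red cube. The box size, color and position are illustrative only; they’re not part of this chapter’s project:

import SceneKit
import UIKit

// Create an empty scene, then add a red 10 cm cube to its root node.
let scene = SCNScene()
let box = SCNBox(width: 0.1, height: 0.1,
                 length: 0.1, chamferRadius: 0)
box.firstMaterial?.diffuse.contents = UIColor.red
let boxNode = SCNNode(geometry: box)
boxNode.position = SCNVector3(0, 0, -0.5) // half a meter into the scene
scene.rootNode.addChildNode(boxNode)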

Creating a SceneKit AR project

Start Xcode — it’s time to create a SceneKit-based AR project.

Exploring the project

In Xcode, with the project open, explore the important components that Xcode generated for you based on the SceneKit Augmented Reality Template project. Although the generated project is similar to a SpriteKit project, there are a few small differences:

AppDelegate.swift

This is the standard starting point of your app.

LaunchScreen.storyboard

The launch screen is a standard part of every app. It’s the first thing the user sees when the app launches. Here, you’ll represent your app with a beautiful splash image.

Main.storyboard

The main storyboard is the view component of your AR app, containing the app’s UI. This is a good place to put buttons and heads-up displays, for example.

ViewController.swift

The view controller contains the code behind the entire AR experience, specifically the main storyboard.

art.scnassets

art.scnassets is a standard folder that was simply renamed by adding the .scnassets extension to it. This type of folder is known as a SceneKit Asset Catalog. Its purpose is to help you manage your game assets separately from the code.

ship.scn

Within the SceneKit Asset Catalog, you’ll find a ship.scn file. This defines the SceneKit Scene containing a model of the ship that you see when the AR experience starts.
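Loading that scene in code takes a single call. Here’s a quick sketch, assuming the default template’s folder layout:

// Load the ship scene from the SceneKit asset catalog.
// SCNScene(named:) returns nil if the file isn't found.
let shipScene = SCNScene(named: "art.scnassets/ship.scn")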

Assets.xcassets

Here, you find your standard app assets like your app icon, for example.

Info.plist

When your app runs for the first time, it needs to ask for permission to access the camera. ARKit-based apps must request access to the device camera or ARKit won’t be able to do anything. You enable this request by setting the value of Privacy - Camera Usage Description in Info.plist.
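In the raw Info.plist source, that entry corresponds to the NSCameraUsageDescription key. The description string below is only an example; use wording that fits your app:

<key>NSCameraUsageDescription</key>
<string>This app uses the camera to display the augmented reality experience.</string>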

Loading & exploring the Starter project

Now that you know how to create a standard SceneKit-based AR project with Xcode… you won’t use the project you just created. Instead, close it and open this chapter’s starter project.

ViewController.swift

You’ll find that the view controller has been reduced to its bare bones. Don’t worry, you’ll fill in all the missing code later.

Main.storyboard

Next, you’ll look at Main.storyboard.

App state management

With all the logistics out of the way, you’ll start with some basic app management. The app requires a state machine to manage its lifecycle through various states. During each individual state, the app will focus on doing one particular thing. Add the following to ViewController.swift:

enum AppState: Int16 {
  case DetectSurface
  case PointAtSurface
  case TapToStart
  case Started
}

var trackingStatus: String = ""
var statusMessage: String = ""
var appState: AppState = .DetectSurface

// 1 - Puts the app into its initial state, scanning for surfaces.
func startApp() {
  DispatchQueue.main.async {
    self.appState = .DetectSurface
  }
}

// 2 - Resets the app back to its initial state. You'll uncomment
// the resetARSession() call once that function exists.
func resetApp() {
  DispatchQueue.main.async {
    //self.resetARSession()
    self.appState = .DetectSurface
  }
}

Basic scene management

Now, move on to the SceneKit component of your app. The first thing you need to do is to make the view controller comply with a special protocol.

class ViewController: UIViewController, ARSCNViewDelegate {

Initializing a new SceneKit scene

Now, create the new SceneKit scene. Add the following function to the Scene Management section:

func initScene() {
  // 1 - Create an empty scene and assign it to the scene view.
  let scene = SCNScene()
  sceneView.scene = scene
  // 2 - Make the view controller the scene view's delegate.
  sceneView.delegate = self
}

Providing feedback

Feedback helps the user know what the app is doing and what steps they need to take next. To start providing feedback, add the following helper function to the Scene Management section:

func updateStatus() {
  // 1 - Pick a status message based on the current app state.
  switch appState {
  case .DetectSurface:
    statusMessage = "Scan available flat surfaces..."
  case .PointAtSurface:
    statusMessage = "Point at designated surface first!"
  case .TapToStart:
    statusMessage = "Tap to start."
  case .Started:
    statusMessage = "Tap objects for more info."
  }
  // 2 - Show the tracking status if there is one; otherwise, show the app status message.
  self.statusLabel.text = trackingStatus != "" ?
      "\(trackingStatus)" : "\(statusMessage)"
}

SceneKit calls the following delegate method once for every rendered frame. Use it to keep the status label up to date, remembering that UI updates must happen on the main queue:

func renderer(_ renderer: SCNSceneRenderer, 
  updateAtTime time: TimeInterval) {
  DispatchQueue.main.async {
    self.updateStatus()
  }
}
Finally, make sure the view gets set up when the app launches by calling the starter’s initView() helper from viewDidLoad():

self.initView()

AR session management

Now that you’ve created the scene and ensured that the user will be kept informed of what the app’s doing, move on to the AR component.

AR configuration

Before starting an AR session, you have to create an AR session configuration. You use this configuration to establish the connection between the real world, where your device is, and the virtual 3D world, where your virtual content is.

Starting the AR session

With ViewController.swift still open, add the following function under the AR Session Management section:

func initARSession() {
  // 1 - Bail out if the device doesn't support world tracking.
  guard ARWorldTrackingConfiguration.isSupported else {
    print("*** ARConfig: AR World Tracking Not Supported")
    return
  }
  // 2 - Create a world tracking configuration.
  let config = ARWorldTrackingConfiguration()
  // 3 - Align the world to gravity, skip audio capture, detect
  //     horizontal planes, and enable light estimation and
  //     automatic environment texturing.
  config.worldAlignment = .gravity
  config.providesAudioData = false
  config.planeDetection = .horizontal
  config.isLightEstimationEnabled = true
  config.environmentTexturing = .automatic
  // 4 - Start the AR session with this configuration.
  sceneView.session.run(config)
}

Resetting the AR session

At times, you might want to reset the AR session. This comes in handy when you want to restart the AR experience, for example.

func resetARSession() {
  // 1 - Grab the session's current configuration.
  let config = sceneView.session.configuration as!
    ARWorldTrackingConfiguration
  // 2 - Make sure horizontal plane detection is still enabled.
  config.planeDetection = .horizontal
  // 3 - Rerun the session, resetting tracking and removing all existing anchors.
  sceneView.session.run(config,
    options: [.resetTracking, .removeExistingAnchors])
}

Handling AR session state changes

Now that you can start and reset the AR session, you need to keep the user informed any time the AR session state changes. You have everything in place already; you just need to keep trackingStatus up to date with the latest information.

func session(_ session: ARSession, 
  cameraDidChangeTrackingState camera: ARCamera) {
  switch camera.trackingState {
  case .notAvailable: self.trackingStatus = 
    "Tracking:  Not available!"
  case .normal: self.trackingStatus = ""
  case .limited(let reason):
    switch reason {
    case .excessiveMotion: self.trackingStatus = 
      "Tracking: Limited due to excessive motion!"
    case .insufficientFeatures: self.trackingStatus =
      "Tracking: Limited due to insufficient features!"
    case .relocalizing: self.trackingStatus = 
      "Tracking: Relocalizing..."
    case .initializing: self.trackingStatus = 
      "Tracking: Initializing..."
    @unknown default: self.trackingStatus = 
      "Tracking: Unknown..."
    }
  }
}

Handling AR session issues

Finally, you need to keep the user informed when any issues occur. Again, you’ll use trackingStatus for this purpose.

func session(_ session: ARSession, 
  didFailWithError error: Error) {
  self.trackingStatus = "AR Session Failure: \(error)"
}
  
func sessionWasInterrupted(_ session: ARSession) {
  self.trackingStatus = "AR Session Was Interrupted!"
}
  
func sessionInterruptionEnded(_ session: ARSession) {
  self.trackingStatus = "AR Session Interruption Ended"
}
To tie things together, call initARSession() from viewDidLoad():

self.initARSession()

Then, back in resetApp(), uncomment the call you stubbed out earlier:

self.resetARSession()
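At this point, viewDidLoad() should roughly match the sketch below. This is only a reference that assumes the starter’s initView() helper; your starter code may differ slightly:

override func viewDidLoad() {
  super.viewDidLoad()
  self.initView()       // starter helper: basic UI setup
  self.initScene()      // create and attach the SceneKit scene
  self.initARSession()  // configure and start world tracking
}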

AR Coaching Overlay

Currently, the app uses the status bar at the top to provide step-by-step instructions to help onboard the user into the AR experience. However, your approach to this onboarding process might differ entirely from another developer’s. That inconsistency fragments AR onboarding: users have to learn a new process every time they switch from one experience to another.

What is an AR Coaching Overlay view?

Apple now provides a special overlay view known as the ARCoachingOverlayView. You can easily integrate it into your existing AR experiences to provide the user with a standardized AR onboarding process.

Adding AR Coaching Overlay

The first thing to do is to ensure your view controller conforms to the new protocol.

extension ViewController : ARCoachingOverlayViewDelegate {
}

Handling AR Coaching Overlay events

Next, you need to provide some functions to handle the overlay events.

// 1 - Called when the overlay activates. The AR experience is
//     on hold while coaching is in progress.
func coachingOverlayViewWillActivate(_ 
  coachingOverlayView: ARCoachingOverlayView) {
}

// 2 - Called when coaching completes and the overlay
//     deactivates. Time to start the app.
func coachingOverlayViewDidDeactivate(_ 
  coachingOverlayView: ARCoachingOverlayView) {
  self.startApp()
}

// 3 - Called when the user requests a session reset from the
//     overlay. Reset the entire app.
func coachingOverlayViewDidRequestSessionReset(_
  coachingOverlayView: ARCoachingOverlayView) {
  self.resetApp()
}

Initializing the AR Coaching Overlay

Now that you’re handling everything the AR Coaching Overlay will throw at your app, you need to initialize it.

func initCoachingOverlayView() {
  // 1 - Create a new coaching overlay view instance.
  let coachingOverlay = ARCoachingOverlayView()
  // 2 - Attach it to the AR session so it can monitor tracking.
  coachingOverlay.session = self.sceneView.session
  // 3 - Make the view controller its delegate.
  coachingOverlay.delegate = self
  // 4 - Let the overlay activate automatically whenever tracking degrades.
  coachingOverlay.activatesAutomatically = true
  // 5 - Coach the user until a horizontal plane is found.
  coachingOverlay.goal = .horizontalPlane
  // 6 - Add the overlay as a subview of the AR scene view.
  self.sceneView.addSubview(coachingOverlay)
}

Adding constraints to the AR Coaching Overlay

With the overlay initialized, your next step is to give it proper constraints. Still inside initCoachingOverlayView(), add the following after the addSubview(_:) call:

// 1 - Opt out of autoresizing-mask constraints so you can define your own.
coachingOverlay.translatesAutoresizingMaskIntoConstraints =
  false
// 2 - Pin the overlay's top, bottom, leading and trailing edges
//     to the view so it covers the whole screen.
NSLayoutConstraint.activate([
  NSLayoutConstraint(item: coachingOverlay,
    attribute: .top, relatedBy: .equal,
    toItem: self.view, attribute: .top,
    multiplier: 1, constant: 0),
  NSLayoutConstraint(item: coachingOverlay, 
    attribute: .bottom, relatedBy: .equal,
    toItem: self.view, attribute: .bottom, 
    multiplier: 1, constant: 0),
  NSLayoutConstraint(item: coachingOverlay,
    attribute: .leading, relatedBy: .equal,
    toItem: self.view, attribute: .leading,
    multiplier: 1, constant: 0),
  NSLayoutConstraint(item: coachingOverlay,
    attribute: .trailing, relatedBy: .equal,
    toItem: self.view, attribute: .trailing,
    multiplier: 1, constant: 0)])
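As an aside, the verbose NSLayoutConstraint(item:attribute:...) initializer above could also be written with the more modern anchor-based Auto Layout API. This is an equivalent alternative, not what the chapter uses:

NSLayoutConstraint.activate([
  coachingOverlay.topAnchor.constraint(equalTo: self.view.topAnchor),
  coachingOverlay.bottomAnchor.constraint(equalTo: self.view.bottomAnchor),
  coachingOverlay.leadingAnchor.constraint(equalTo: self.view.leadingAnchor),
  coachingOverlay.trailingAnchor.constraint(equalTo: self.view.trailingAnchor)
])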
Finally, call the new function from viewDidLoad():

self.initCoachingOverlayView()

Key points

Congratulations, you’ve reached the end of this chapter — and your app is shaping up nicely.
