
11. Facial Blend Shapes
Written by Chris Language


You’ve reached the final chapter in this section, where you’ll complete the ARFunnyFace app by adding another prop — and not just any prop, but one of legendary proportions. Prepare to come face-to-face with the mighty Green Robot Head!

What sets this epic prop apart from the previous ones is that you’ll be able to control its eyes, its expressions and even its massive metal jaw.

Like a true puppeteer, you’ll be able to animate this robot head using your own facial movements and expressions, thanks to facial blend shapes. Awesome!

What are facial blend shapes?

ARFaceAnchor tracks many key facial features, including blinking eyes, an opening mouth and moving eyebrows. These key tracking points are known as blend shapes.

You can easily use these blend shapes to animate 2D or 3D characters to mimic the user’s facial expressions.

Here’s an example of a 2D smiley character that animates by tracking the user’s eyes and mouth.

Each key tracking feature is represented by a floating-point number that indicates the current position of the corresponding facial feature.

These blend shape values range from 0.0, indicating a neutral position, to 1.0, indicating the maximum position. The floating-point values essentially represent a percentage ranging from 0% to 100%.

As the user blinks both eyes, the blend shape values start at 100% open, then gradually reduce to 0% open.

The mouth works the same way, starting at 100% open, then reducing to 0% open.

You use the percentage values to animate the corresponding facial features from a 100% open position to a 0% open position — aka, closed.

You can even prerecord the blend shape data, which you can play back at a later time to animate your game characters, for example. Sweet!
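
To make the idea concrete, here’s a minimal sketch — separate from the app you’re building in this chapter — of reading a single blend shape value from an ARFaceAnchor and converting it to a percentage. The helper function name is just for illustration.

import ARKit

// A minimal sketch: read one blend shape value from a tracked face.
func blinkPercentage(for faceAnchor: ARFaceAnchor) -> Float {
  // Blend shape values are NSNumbers between 0.0 (neutral, eye open)
  // and 1.0 (maximum, eye fully closed).
  let blink = faceAnchor.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0.0
  // Convert the value to a percentage, ready to drive an animation.
  return blink * 100
}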

Building the robot

Next, it’s time to build the mighty Green Robot Head: a 3D character that you’ll animate with your own facial expressions.

Adding the new robot scene

Now that you’ve built the Robot scene, you need to update the app so it knows about the additional prop.

// Keeps a reference to the loaded Robot scene so you can reach its
// entities later.
var robot: Experience.Robot!
// Reset the reference; it gets reassigned when the Robot prop loads.
robot = nil
// Clamp the prop selector at the new maximum prop ID of 3.
self.propId = self.propId >= 3 ? 3 : self.propId + 1
case 3: // Robot
  // 1. Load the Robot scene from the Reality Composer project.
  let arAnchor = try! Experience.loadRobot()
  // 2. Add the scene's anchor to the view's scene.
  uiView.scene.anchors.append(arAnchor)
  // 3. Store a reference to the loaded scene for later updates.
  robot = arAnchor
  break

Using the ARSessionDelegate protocol

To animate the robot’s eyelids and jaw, you need to update their positions and rotations as ARFaceAnchor tracks the user’s facial expressions in real time. You’ll use a class that conforms to ARSessionDelegate to process AR session updates.
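
If you’re wondering how the session, the face-tracking configuration and the delegate connect, here’s a rough sketch of the wiring. This setup is assumed for illustration only — it isn’t this chapter’s code; the actual delegate assignment appears below.

import ARKit
import RealityKit

// An assumed sketch, for orientation only: run a face-tracking
// configuration on the ARView's session and attach a delegate that
// receives per-frame anchor updates.
func configureFaceTracking(for arView: ARView,
                           delegate: ARSessionDelegate) {
  // Face tracking needs a TrueDepth camera or an A12+ device.
  guard ARFaceTrackingConfiguration.isSupported else { return }
  arView.session.delegate = delegate
  arView.session.run(ARFaceTrackingConfiguration())
}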

Adding ARDelegateHandler

For your next step, you’ll create a new class that conforms to this protocol so you can track changes to the facial blend shapes.

// 1. The handler inherits from NSObject and adopts ARSessionDelegate.
class ARDelegateHandler: NSObject, ARSessionDelegate {
  // 2. Keep a reference to the ARViewContainer that owns this handler.
  var arViewContainer: ARViewContainer
  // 3. Store the container, then complete NSObject's initialization.
  init(_ control: ARViewContainer) {
    arViewContainer = control
    super.init()
  }
}

// SwiftUI calls makeCoordinator() to create the session delegate.
func makeCoordinator() -> ARDelegateHandler {
  ARDelegateHandler(self)
}

// Register the coordinator as the AR session's delegate.
arView.session.delegate = context.coordinator

Handling ARSession updates

With the delegate class in place, you can now start tracking updates to any of the facial blend shapes.

// 1
// 1. ARKit calls this delegate method every time the anchors update.
func session(_ session: ARSession,
  didUpdate anchors: [ARAnchor]) {
  // 2. Only continue when the Robot scene is currently loaded.
  guard robot != nil else { return }
  // 3. Find the face anchor among the updated anchors.
  var faceAnchor: ARFaceAnchor?
  for anchor in anchors {
    if let a = anchor as? ARFaceAnchor {
      faceAnchor = a
    }
  }
}

Tracking blinking eyes

Now that the update handling function is in place, you can inspect the actual blend shape values and use them to update the scene elements so the robot blinks its eyes when the user blinks theirs.

// Grab the blend shapes dictionary, then pull out the blink value for
// each eye: 0.0 means the eye is open, 1.0 means it is fully closed.
let blendShapes = faceAnchor?.blendShapes
let eyeBlinkLeft = blendShapes?[.eyeBlinkLeft]?.floatValue
let eyeBlinkRight = blendShapes?[.eyeBlinkRight]?.floatValue

Tracking eyebrows

To make the eyes more expressive, you’ll use the user’s eyebrows to tilt the eyelids inward or outward around the z-axis. This makes the robot look angry or sad, depending on the user’s expression.

// browInnerUp rises as the inner brows lift; browDownLeft and
// browDownRight rise as each brow pulls down into a frown.
let browInnerUp = blendShapes?[.browInnerUp]?.floatValue
let browLeft = blendShapes?[.browDownLeft]?.floatValue
let browRight = blendShapes?[.browDownRight]?.floatValue

Tracking the jaw

Now, you’ll track the user’s jaw and use it to update the robot’s jaw orientation. You’ll use the jawOpen blend shape to track the user’s jaw movement.

// jawOpen ranges from 0.0 (mouth closed) to 1.0 (mouth wide open).
let jawOpen = blendShapes?[.jawOpen]?.floatValue

Positioning with quaternions

In the next section, you’ll update the orientations of the eyelids and jaw based on the blend shape values you’re capturing. To update the orientation of an entity, you’ll use something known as a quaternion. A quaternion describes a rotation of a certain angle, measured in radians, around an axis, so start with a small helper function that converts degrees to radians:

func Deg2Rad(_ value: Float) -> Float {
  // Converts an angle in degrees to radians.
  return value * .pi / 180
}
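
If quaternions are new to you, here’s a short illustration — not part of the app’s code — of the two calls you’re about to use: simd_quatf describes a rotation of some angle around an axis, and simd_mul combines two rotations into one.

import simd

// Illustration only: a 45° tilt around the x-axis...
let tiltX = simd_quatf(angle: Deg2Rad(45), axis: [1, 0, 0])
// ...and a 30° twist around the z-axis.
let twistZ = simd_quatf(angle: Deg2Rad(30), axis: [0, 0, 1])
// simd_mul combines both into a single rotation, which you can assign
// to an entity's orientation, exactly as you'll do for the eyelids next.
let combined = simd_mul(tiltX, twistZ)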

Updating the eyelids

Now that you’ve collected all the blend shape data, you need to update the eyelid orientation.

// 1. The left eyelid's orientation combines two rotations.
robot.eyeLidL?.orientation = simd_mul(
  // 2. Rotate around the x-axis to blink: -120° with the eye open,
  // approaching -30° as eyeBlinkLeft approaches 1.0 (closed).
  simd_quatf(
    angle: Deg2Rad(-120 + (90 * eyeBlinkLeft!)),
    axis: [1, 0, 0]),
  // 3. Rotate around the z-axis to tilt the eyelid with the brows:
  // a frown (browLeft) tilts it one way, raised inner brows
  // (browInnerUp) tilt it back the other way.
  simd_quatf(
    angle: Deg2Rad((90 * browLeft!) - (30 * browInnerUp!)),
    axis: [0, 0, 1]))
// 4. The right eyelid mirrors the same two rotations.
robot.eyeLidR?.orientation = simd_mul(
  simd_quatf(
    angle: Deg2Rad(-120 + (90 * eyeBlinkRight!)),
    axis: [1, 0, 0]),
  simd_quatf(
    angle: Deg2Rad((-90 * browRight!) - (-30 * browInnerUp!)),
    axis: [0, 0, 1]))

Updating the jaw

The eyelids are done, but you still need to update the jaw orientation with the captured blend shape information:

// Rotate the jaw around the x-axis: -100° with the mouth closed,
// opening by up to 60° as jawOpen approaches 1.0.
robot.jaw?.orientation = simd_quatf(
  angle: Deg2Rad(-100 + (60 * jawOpen!)),
  axis: [1, 0, 0])

Adding lasers

The robot is mostly done, but there’s always room for improvement. Wouldn’t it be cool if it could shoot lasers from its eyes when it gets extra angry?

Sending & receiving notifications

Your next goal is to really bring those lasers to life. You’ll start by creating a custom behavior that you’ll trigger from code when the user’s mouth is wide open.

Coding the notifications

Now, you’ll add the code that fires the lasers when the user’s jaw opens wide and prevents them from triggering again while they’re still firing. The behavior you set up in Reality Composer is exposed in the generated code as robot.notifications.showLasers, and its Notify action as robot.actions.lasersDone.

// Tracks whether the laser animation has finished playing.
var isLasersDone = true
// 1. Only fire when the previous blast is done and the user's jaw is
// more than 90% open.
if (self.isLasersDone == true && jawOpen! > 0.9) {
  // 2. Block re-triggering while the lasers are firing.
  self.isLasersDone = false
  // 3. Post the showLasers notification to start the behavior.
  robot.notifications.showLasers.post()
  // 4. When the lasersDone action fires, allow the lasers to trigger again.
  robot.actions.lasersDone.onAction = { _ in
    self.isLasersDone = true
  }
}

Key points

Congratulations, you’ve reached the end of this chapter and section. Before grabbing a delicious cup of coffee, quickly take a look at some key points you’ve learned in this chapter:

- Facial blend shapes are floating-point values from 0.0 to 1.0 that track key facial features such as blinking eyes, moving eyebrows and an opening jaw.
- A class that conforms to ARSessionDelegate receives session(_:didUpdate:) calls, where you can read the latest blend shape values from the ARFaceAnchor.
- Quaternions (simd_quatf) describe rotations of an angle around an axis, and simd_mul combines two rotations; you used them to orient the robot’s eyelids and jaw.
- Reality Composer notifications let you trigger behaviors from code with post(), and action handlers like onAction tell you when those behaviors finish.
