Apple Pencil Tutorial: Getting Started

In this Apple Pencil tutorial, you’ll learn about force, touch coalescing, altitude, and azimuth, to add realistic lines and shading to a drawing app. By Caroline Begbie.


Using Altitude to Adjust Width

One more thing to do: When you shade with a real pencil, the line gets wider as you tilt the pencil further. But if you try that with your Apple Pencil, the line width stays the same.

In addition to the azimuth angle, you also need to take the Pencil's altitude into account when calculating the width of the line.

Add this constant to the top of the CanvasView class, just below the others:

private let minLineWidth: CGFloat = 5

This will be the narrowest that a shading line can be -- you can change it to suit your own personal shading tastes. :]

At the bottom of lineWidthForShading(_:touch:), just before the return statement, add the following:

// 1    
let minAltitudeAngle: CGFloat = 0.25
let maxAltitudeAngle = tiltThreshold
// 2
let altitudeAngle = touch.altitudeAngle < minAltitudeAngle ? minAltitudeAngle : touch.altitudeAngle
// 3
let normalizedAltitude = 1 - ((altitudeAngle - minAltitudeAngle) / (maxAltitudeAngle - minAltitudeAngle))

// 4
lineWidth = lineWidth * normalizedAltitude + minLineWidth

Note: Make sure you add this code to lineWidthForShading(_:touch:), and not lineWidthForDrawing(_:touch:) by mistake.

There's a lot to digest here, so let's take this bit by bit.

  1. Theoretically, the minimum altitude of the Pencil is 0 radians (altitudeAngle is measured in radians, not degrees), meaning it's lying flat on the iPad; at that angle the tip isn't touching the screen, so no altitude can be recorded. The actual minimum reported altitude is somewhere around 0.2 radians, but here you'll use 0.25 as the minimum.
  2. If the altitude is less than the minimum, you use the minimum instead.
  3. Just like you did earlier, you normalize this altitude value to be between 0 and 1.
  4. Finally, you multiply the line width you calculated with the azimuth by this normalized value, and add that to the minimum line width.
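The four steps above boil down to a clamp followed by an inverted linear remap. Here's a standalone sketch of just the math in plain Swift, assuming the tiltThreshold of π/6 radians set earlier in the tutorial (adjust to match your own constant):

```swift
import Foundation

// Standalone sketch of the altitude-to-width math above.
// Assumes tiltThreshold = π/6 (30°), as set earlier in the tutorial.
let tiltThreshold = CGFloat(M_PI) / 6
let minLineWidth: CGFloat = 5

func shadingWidth(baseWidth: CGFloat, altitudeAngle: CGFloat) -> CGFloat {
  let minAltitudeAngle: CGFloat = 0.25
  let maxAltitudeAngle = tiltThreshold
  // 1 & 2: clamp the altitude so it never drops below the usable minimum.
  let altitude = max(altitudeAngle, minAltitudeAngle)
  // 3: remap to 0...1, inverted — a flatter Pencil gives a value nearer 1.
  let normalizedAltitude =
    1 - ((altitude - minAltitudeAngle) / (maxAltitudeAngle - minAltitudeAngle))
  // 4: scale the azimuth-based width and add the minimum.
  return baseWidth * normalizedAltitude + minLineWidth
}

// A nearly flat Pencil (altitude ≤ 0.25) yields the full base width plus the minimum.
shadingWidth(baseWidth: 40, altitudeAngle: 0.1)            // 45
// At the tilt threshold the azimuth-based width vanishes, leaving only minLineWidth.
shadingWidth(baseWidth: 40, altitudeAngle: tiltThreshold)  // 5
```

The function name and parameters here are just illustrative; in the app itself this math lives inside lineWidthForShading(_:touch:).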

Build and run. As you shade, change the Pencil's altitude and see how the strokes get wider and narrower. Increasing the Pencil's altitude gradually should let you segue smoothly into the drawing line:


Playing with Opacity

The last task in this section is to make the shading look a bit more realistic by turning down the texture's opacity, which you'll calculate with force.

Just before the return statement in lineWidthForShading(_:touch:), add the following:

let minForce: CGFloat = 0.0
let maxForce: CGFloat = 5

let normalizedAlpha = (touch.force - minForce) / (maxForce - minForce)
CGContextSetAlpha(context, normalizedAlpha)

After working through the previous blocks of code, this one should be self-explanatory. You're simply taking the force and normalizing it to a value between 0 and 1, and then setting the alpha used by the drawing context to that value.
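It's the same linear remap you used for altitude. One thing worth knowing: the Pencil can report forces above 5, which would push the computed alpha past 1 (Core Graphics clamps alpha for you, so the code above still works). A defensive standalone sketch of the math, with an explicit clamp:

```swift
import Foundation

// Standalone sketch of the force-to-alpha math above.
// maxForce = 5 is the tutorial's chosen ceiling, not a system constant.
let minForce: CGFloat = 0.0
let maxForce: CGFloat = 5

func shadingAlpha(force: CGFloat) -> CGFloat {
  // Remap the reported force onto 0...1, clamping in case the
  // hardware reports a force above maxForce.
  return min(1, (force - minForce) / (maxForce - minForce))
}

shadingAlpha(force: 0)     // 0 — no pressure, fully transparent
shadingAlpha(force: 2.5)   // 0.5 — medium pressure, half opacity
shadingAlpha(force: 10)    // 1 — out-of-range force stays fully opaque
```

If you'd rather normalize against the device's true ceiling, UITouch also exposes a maximumPossibleForce property you could use in place of a hard-coded maxForce.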

Build and run. Try shading with varying pressure:


Finger vs. Pencil

If you're anything like me, you've probably made a few sketching errors here and there and wish you could erase those errant lines.

In this section, you're going to look at how you can distinguish between using the Apple Pencil and your finger. More specifically, you'll configure the app so that your finger can play the role of a faithful eraser.

It turns out that checking whether a finger or the Apple Pencil is being used is pretty easy -- you just use the type property on UITouch.

At the top of CanvasView, add a property for the eraser color. You're going to paint in the background color of the canvas view, and it will give the illusion of acting as an eraser. Clever, eh? :]

private var eraserColor: UIColor {
  return backgroundColor ?? UIColor.whiteColor()
}
Here you set eraserColor to the view's background color, unless it's nil, in which case you just set it to white.

Next, find the following code in drawStroke(_:touch:):

if touch.altitudeAngle < tiltThreshold {
  lineWidth = lineWidthForShading(context, touch: touch)
} else {
  lineWidth = lineWidthForDrawing(context, touch: touch)
}

And replace it with the following:

if touch.type == .Stylus {
  if touch.altitudeAngle < tiltThreshold {
    lineWidth = lineWidthForShading(context, touch: touch)
  } else {
    lineWidth = lineWidthForDrawing(context, touch: touch)
  }
} else {
  eraserColor.setStroke()
  lineWidth = 20
}
Here you've added a check to see whether it's Pencil or a finger, and if it's the latter you change the line width and use the eraser color for drawing.

Build and run. Now you can clean up any untidy edges or erase everything with your finger!


Faking Force For a Finger

Just as an aside, did you know that since iOS 8 you've been able to fake force with your finger? There's a property declared on UITouch called majorRadius, which, as its name implies, holds the size of the touch.

Find this line that you just added in the previous code block:

lineWidth = 20

And replace it with this one:

lineWidth = touch.majorRadius / 2

Build and run. Shade a dark area, and then erase with both the tip of your finger and the flat of your finger to see the varying thicknesses:


Finger painting feels really clumsy, and the drawings look painful, after you've played around with the elegant Apple Pencil. :]

Reducing Latency

You might think that your Pencil zooms over the surface of the iPad with the drawn line following right behind. Not quite -- that's an illusion, because there is latency between a touch and the moment its line renders. Apple has a trick up its sleeve to deal with it: touch prediction.

Incredible as it may seem, all-seeing Apple knows where your Pencil, or finger, is about to draw. Those predictions are saved into an array on UIEvent so that you can draw that predicted touch ahead of time. How cool is that!? :]

Before you can begin working with predicted touches, there's one small technical obstacle to overcome. At the moment, you're drawing strokes in the graphics context, which are then displayed immediately in the canvas view.

You'll need to draw the predicted touches onto the canvas but discard them when the actual touches catch up with the predicted ones.

For example, when you draw an S-shape it predicts the curves, but when you change direction, those predictions will be wrong and need to be discarded. This picture illustrates the problem. The "S" is drawn in red and the predicted touches show in blue.


Here's what your code will need to do to avoid this problem:

  1. You'll create a new UIImage property named drawingImage to capture the true -- not predicted -- touches from the graphics context.
  2. On each touch move event, you'll draw drawingImage into the graphics context.
  3. The real touches will be drawn into the graphics context, and you'll save it to the new drawingImage instead of using the image property on the canvas view.
  4. The predicted touches will be drawn into the graphics context.
  5. The graphics context, complete with predicted touches, will be pushed into canvasView.image, which is what the user will see.

In this way, no predicted touches will draw into drawingImage and each time a touch move event occurs, the predictions will be deleted.
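Before wiring this into UIKit, the five steps can be sketched as a toy model in plain Swift. Here, an array of Ints stands in for the touch points: committed points play the role of drawingImage, and the displayed list plays the role of canvasView.image. Because the displayed list is rebuilt from the committed points on every event, stale predictions never survive to the next frame (the type and method names are illustrative only):

```swift
// Toy model of the five-step plan above.
struct PredictiveCanvas {
  private var committed: [Int] = []       // stands in for drawingImage
  private(set) var displayed: [Int] = []  // stands in for canvasView.image

  mutating func handleMoveEvent(real: [Int], predicted: [Int]) {
    committed += real                  // real touches persist (steps 2-3)
    displayed = committed + predicted  // predictions drawn on top (steps 4-5),
                                       // then rebuilt from scratch next event
  }
}

var canvas = PredictiveCanvas()
canvas.handleMoveEvent(real: [1, 2], predicted: [3, 4])
canvas.displayed  // [1, 2, 3, 4] — predictions visible
canvas.handleMoveEvent(real: [5], predicted: [6])
canvas.displayed  // [1, 2, 5, 6] — the old predictions 3 and 4 are gone
```

The real implementation replaces the Int arrays with drawing into a graphics context, but the discard-and-redraw cycle is exactly this.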