Core Image Tutorial: Getting Started

Learn the basics of cool image filtering effects with Core Image and Swift. By Nick Lockwood.

What About Image Metadata?

Let’s talk about image metadata for a moment. Images captured on mobile phones have a variety of data associated with them, such as GPS coordinates, image format, and orientation.

Orientation in particular is something that you’ll need to preserve. The process of loading a UIImage into a CIImage, rendering to a CGImage, and converting back to a UIImage strips the metadata from the image. In order to preserve orientation, you’ll need to record it and then pass it back into the UIImage constructor.

Start by adding a new property inside the ViewController class definition:

var orientation: UIImageOrientation = .Up

Next, add the following line to imagePickerController(picker:didFinishPickingMediaWithInfo:) just before setting beginImage:

orientation = gotImage.imageOrientation

This will save the original image orientation to the property.

Finally, alter the line in amountSliderValueChanged that creates the UIImage that you set to the imageView object:

let newImage = UIImage(CGImage: cgimg, scale:1, orientation:orientation)

Now, if you pick a picture taken in something other than the default orientation, its orientation will be preserved.
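
Putting the pieces together, the orientation handling looks roughly like this. Treat it as a condensed sketch rather than a drop-in snippet: gotImage, beginImage, context and filter are the names used elsewhere in this tutorial, and the exact line that sets beginImage may differ in your project.

// Sketch only: condensed from the two methods discussed above.
// In imagePickerController(picker:didFinishPickingMediaWithInfo:):
orientation = gotImage.imageOrientation   // remember the orientation before it's lost
beginImage = CIImage(image: gotImage)     // the CIImage does not carry the orientation along

// Later, when rendering (for example in amountSliderValueChanged):
let cgimg = context.createCGImage(filter.outputImage, fromRect: filter.outputImage.extent())
let newImage = UIImage(CGImage: cgimg, scale: 1, orientation: orientation)  // restore it here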

What Other Filters are Available?

The CIFilter API has more than 160 filters on Mac OS, 126 of which are available on iOS 8. As of iOS 8, it is also possible to create your own custom filters.

In order to find out what filters are available on a given device, you can use the CIFilter method filterNamesInCategory(kCICategoryBuiltIn). This method will return an array of filter names.

In addition, each filter has an attributes() method that will return a dictionary containing information about that filter. This information includes the filter’s name and category, the kinds of inputs the filter takes, and the default and acceptable values for those inputs.

Let’s put together a method for your class that logs the information for all the currently available filters. Add this method inside the ViewController class definition:

func logAllFilters() {
  // Get the names of all built-in filters available on this device.
  let properties = CIFilter.filterNamesInCategory(kCICategoryBuiltIn)
  println(properties)

  // Instantiate each filter and log its attributes dictionary.
  for filterName: AnyObject in properties {
    let fltr = CIFilter(name:filterName as String)
    println(fltr.attributes())
  }
}

This method simply gets the array of filters from filterNamesInCategory(). It prints the list of names first. Then, for each name in the list, it instantiates the filter and logs its attributes dictionary.

Call this method at the end of viewDidLoad():

self.logAllFilters()

You will see the array of filter names printed first, followed by the attributes of each filter, like the following:

[CIAttributeFilterDisplayName: Color Monochrome, inputColor: {
    CIAttributeClass = CIColor;
    CIAttributeDefault = "(0.6 0.45 0.3 1)";
    CIAttributeType = CIAttributeTypeColor;
}, inputImage: {
    CIAttributeClass = CIImage;
    CIAttributeType = CIAttributeTypeImage;
}, CIAttributeFilterCategories: (
    CICategoryColorEffect,
    CICategoryVideo,
    CICategoryInterlaced,
    CICategoryNonSquarePixels,
    CICategoryStillImage,
    CICategoryBuiltIn
), inputIntensity: {
    CIAttributeClass = NSNumber;
    CIAttributeDefault = 1;
    CIAttributeIdentity = 0;
    CIAttributeSliderMax = 1;
    CIAttributeSliderMin = 0;
    CIAttributeType = CIAttributeTypeScalar;
}, CIAttributeFilterName: CIColorMonochrome]

Wow, that’s a lot of filters! That should give you some ideas for other filters to try out in your own app!
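
The attributes dictionary is useful for more than logging. As a rough sketch, you could use it to configure your slider's range so it matches a filter's inputIntensity parameter. Here, filter is assumed to be the sepia CIFilter created earlier in the tutorial, amountSlider is an assumed UISlider outlet name, and the kCIAttribute keys are the ones visible in the output above.

// Sketch only: configure the slider from the filter's attributes dictionary.
// `amountSlider` is an assumed outlet name; adjust it to match your project.
let attributes = filter.attributes()
if let intensityInfo = attributes["inputIntensity"] as? [NSObject: AnyObject] {
  if let minValue = intensityInfo[kCIAttributeSliderMin] as? Float {
    amountSlider.minimumValue = minValue
  }
  if let maxValue = intensityInfo[kCIAttributeSliderMax] as? Float {
    amountSlider.maximumValue = maxValue
  }
  if let defaultValue = intensityInfo[kCIAttributeDefault] as? Float {
    amountSlider.value = defaultValue
  }
}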

More Intricate Filter Chains

Now that we’ve looked at all the filters that are available on the iOS platform, it’s time to create a more intricate filter chain. In order to do this, you’ll create a dedicated method to process the CIImage. It will take a CIImage, filter it to look like an old aged photo, and return a modified CIImage.

Add the following method to ViewController:

func oldPhoto(img: CIImage, withAmount intensity: Float) -> CIImage {
  // 1
  let sepia = CIFilter(name:"CISepiaTone")
  sepia.setValue(img, forKey:kCIInputImageKey)
  sepia.setValue(intensity, forKey:"inputIntensity")

  // 2
  let random = CIFilter(name:"CIRandomGenerator")

  // 3
  let lighten = CIFilter(name:"CIColorControls")
  lighten.setValue(random.outputImage, forKey:kCIInputImageKey)
  lighten.setValue(1 - intensity, forKey:"inputBrightness")
  lighten.setValue(0, forKey:"inputSaturation")

  // 4
  let croppedImage = lighten.outputImage.imageByCroppingToRect(beginImage.extent())

  // 5
  let composite = CIFilter(name:"CIHardLightBlendMode")
  composite.setValue(sepia.outputImage, forKey:kCIInputImageKey)
  composite.setValue(croppedImage, forKey:kCIInputBackgroundImageKey)

  // 6
  let vignette = CIFilter(name:"CIVignette")
  vignette.setValue(composite.outputImage, forKey:kCIInputImageKey)
  vignette.setValue(intensity * 2, forKey:"inputIntensity")
  vignette.setValue(intensity * 30, forKey:"inputRadius")

  // 7
  return vignette.outputImage
}

Here’s what’s going on, section by section:

  1. Set up the sepia filter the same way you did in the simpler scenario. You’re passing in a Float value in the method to set the intensity of the sepia effect. This value will be provided by the slider.
  2. Set up a filter that creates a random noise pattern. It doesn’t take any parameters. You’ll use this noise pattern to add texture to your final “old photo” look.
  3. Alter the output of the random noise generator. You want to change it to grayscale, and lighten it up a little bit so the effect is less dramatic. You’ll notice that the input image key is set to the outputImage property of the random filter. This is a convenient way to pass the output of one filter as the input of the next.
  4. imageByCroppingToRect() takes an output CIImage and crops it to the provided rect. In this case, you need to crop the output of the CIRandomGenerator filter because it tiles infinitely; if you don’t crop it at some point, you’ll get an error saying that the filters have ‘an infinite extent’. CIImages don’t actually contain image data; they describe a ‘recipe’ for creating it, and it’s not until you call a method on the CIContext that the data is actually processed. There’s a short sketch of this idea just after this list.
  5. Combine the output of the sepia filter with the output of the CIRandomGenerator filter. This filter performs the exact same operation as the ‘Hard Light’ setting does on a Photoshop layer. Most (if not all) of the filter options available in Photoshop are achievable using Core Image.
  6. Run a vignette filter on this composited output that darkens the edges of the photo. You’re using the value from the slider to set the radius and intensity of this effect.
  7. Finally, return the output of the last filter.
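
To make step 4 concrete, here’s a minimal sketch, assuming the context and beginImage properties set up earlier in the tutorial:

// Sketch only: illustrates cropping the infinite noise pattern and the
// point at which Core Image actually does the work.
let noise = CIFilter(name: "CIRandomGenerator")

// The generator tiles infinitely, so crop its output to the source image's extent.
let croppedNoise = noise.outputImage.imageByCroppingToRect(beginImage.extent())

// Nothing has been rendered yet; croppedNoise is only a 'recipe'.
// This CIContext call is where the pixels are actually computed.
let noiseCGImage = context.createCGImage(croppedNoise, fromRect: croppedNoise.extent())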

That’s all for this filter chain. You should now have an idea of how complex these filter chains may become. By combining Core Image filters into these kinds of chains, you can achieve an endless variety of effects.

The next thing to do is to put this method to use in amountSliderValueChanged(). Change these two lines:

filter.setValue(sliderValue, forKey: "inputIntensity")
let outputImage = filter.outputImage

To this one line:

let outputImage = self.oldPhoto(beginImage, withAmount: sliderValue)

This just replaces the previous sepia effect with your new, more complex filter method. You pass in the slider value for the intensity, and you use beginImage, which you set in viewDidLoad, as the input CIImage. Build and run now, and you should get a more refined old photo effect, complete with sepia, a little noise, and some vignetting.
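
For reference, the finished action might look roughly like the following. This is a sketch rather than a drop-in replacement: the context, beginImage, orientation and imageView properties are the ones used earlier in this tutorial, and your existing method may already contain most of these lines.

@IBAction func amountSliderValueChanged(sender: UISlider) {
  let sliderValue = sender.value

  // Build the "old photo" recipe from the original CIImage.
  let outputImage = self.oldPhoto(beginImage, withAmount: sliderValue)

  // Render the recipe into a CGImage, then wrap it in a UIImage,
  // restoring the orientation saved when the photo was picked.
  let cgimg = context.createCGImage(outputImage, fromRect: outputImage.extent())
  let newImage = UIImage(CGImage: cgimg, scale: 1, orientation: orientation)
  self.imageView.image = newImage
}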


That noise could probably be more subtle, but I’ll leave that experiment to you, dear reader. Now you have the full power of Core Image at your disposal. Go crazy!
