Beginning Core Image in iOS 6



Learn how to apply cool effects to images with Core Image in iOS 6!

Note from Ray: This is the eighth iOS 6 tutorial in the iOS 6 Feast! In this tutorial, you’re updating one of our older tutorials to iOS 6 so it’s fully up-to-date with the latest features like the new Core Image filters in iOS 6.

Parts of this tutorial come from Jake Gundersen‘s three Core Image chapters in iOS 5 by Tutorials and iOS 6 by Tutorials. Enjoy!

Update note: Want to read this tutorial in Swift? Check out the version of this tutorial updated for Swift and iOS 8!

Core Image is a powerful framework that lets you easily apply filters to images, such as modifying the vibrance, hue, or exposure. It uses the GPU (or the CPU – your choice) to process the image data, and it is very fast – fast enough to do real-time processing of video frames!

Core Image filters can be stacked together to apply multiple effects to an image or video frame at once. Stacked filters are efficient because Core Image combines them into a single modified filter that is applied to the image once, instead of processing the image through each filter one at a time.
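Stacking works by feeding one filter's output image into the next filter's input. Here's a minimal sketch using two real built-in filters – the parameter values are just illustrative, and `beginImage` is assumed to be a `CIImage` you've already created:

```objc
// Chain two filters: sepia tone first, then a hue adjustment.
CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
[sepia setValue:beginImage forKey:kCIInputImageKey];
[sepia setValue:@0.8 forKey:@"inputIntensity"];

CIFilter *hue = [CIFilter filterWithName:@"CIHueAdjust"];
[hue setValue:sepia.outputImage forKey:kCIInputImageKey]; // chaining happens here
[hue setValue:@1.0 forKey:@"inputAngle"];

// No pixels are actually processed until this image is rendered.
CIImage *result = hue.outputImage;
```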

Each filter has its own parameters and can be queried in code for information about the filter, its purpose, and its input parameters. The system can also be queried to find out what filters are available. At this time, only a subset of the Core Image filters available on the Mac are available on iOS. However, as more become available, the same API can be used to discover their attributes.
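For example, you can list the available filters and inspect one filter's attributes like this:

```objc
// List every built-in filter available on this device.
NSArray *filterNames = [CIFilter filterNamesInCategory:kCICategoryBuiltIn];
NSLog(@"%lu filters available", (unsigned long)filterNames.count);

// Inspect one filter's attribute dictionary: display name, input
// parameters, default values, allowed ranges, and so on.
CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
NSLog(@"%@", [sepia attributes]);
```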

In this tutorial, you will get hands-on experience playing around with Core Image. You’ll apply a few different filters, and you’ll see how easy it is to apply cool effects to images in real time!

Core Image Overview

Before you get started, let’s discuss some of the most important classes in the Core Image framework:

  • CIContext. All of the processing of a Core Image filter is done in a CIContext. This is somewhat similar to a Core Graphics or OpenGL context.
  • CIImage. This class holds the image data. It can be created from a UIImage, from an image file, or from pixel data.
  • CIFilter. The filter class has a dictionary that defines the attributes of the particular filter that it represents. Examples of filters are vibrance filters, color inversion filters, cropping filters, and many more.

You’ll be using each of these classes as you create your project.

Getting Started

Open up Xcode and create a new project with the iOS\Application\Single View Application template. Enter CoreImageFun for the Product Name, select iPhone for the device family, and make sure that Use Storyboards and Use Automatic Reference Counting are checked (but leave the other checkboxes unchecked).

First things first, let’s add the Core Image framework. On the Mac this is part of the QuartzCore framework, but on iOS it’s a standalone framework. Go to the project container in the file view on the left-hand side. Choose the Build Phases tab, expand the Link Binary With Libraries group, and press the +. Navigate to the CoreImage framework and double-click it.

Second, download the resources for this tutorial and add the included image.png to your project. Done with setup!

Next open MainStoryboard.storyboard, drag an image view into the view controller, and set its mode to Aspect Fit. The position and dimensions should roughly match the following image:

Placing an image view into the view controller

Also, open the Assistant Editor, make sure it’s displaying ViewController.h, and control-drag from the UIImageView to below the @interface. Set the Connection to Outlet, name it imageView, and click Connect.

Compile and run just to make sure everything is good so far – you should just see an empty screen. The initial setup is complete – now onto Core Image!

Basic Image Filtering

You’re going to get started by simply running your image through a CIFilter and displaying it on the screen.

Every time you want to apply a CIFilter to an image you need to do four things:

  1. Create a CIImage object. CIImage has several initialization methods, including imageWithContentsOfURL:, imageWithData:, imageWithCVPixelBuffer:, and imageWithBitmapData:bytesPerRow:size:format:colorSpace:. You’ll most likely be working with imageWithContentsOfURL: most of the time.
  2. Create a CIContext. A CIContext can be CPU or GPU based. A CIContext can be reused, so you needn’t create it over and over, but you will always need one when outputting the CIImage object.
  3. Create a CIFilter. When you create the filter, you configure a number of properties on it that depend on the filter you’re using.
  4. Get the filter output. The filter gives you an output image as a CIImage – you can convert this to a UIImage using the CIContext, as you’ll see below.
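The CPU-vs-GPU choice in step 2 is made when you create the context. A minimal sketch:

```objc
// A default CIContext uses the GPU where available.
CIContext *context = [CIContext contextWithOptions:nil];

// To force CPU-based rendering instead (useful, for example, when
// processing in the background), pass the software-renderer option:
CIContext *cpuContext = [CIContext contextWithOptions:
    @{kCIContextUseSoftwareRenderer: @YES}];
```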

Let’s see how this works. Add the following code to ViewController.m inside viewDidLoad:

// 1
NSString *filePath =
  [[NSBundle mainBundle] pathForResource:@"image" ofType:@"png"];
NSURL *fileNameAndPath = [NSURL fileURLWithPath:filePath];

// 2
CIImage *beginImage =
  [CIImage imageWithContentsOfURL:fileNameAndPath];

// 3
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                              keysAndValues:kCIInputImageKey, beginImage,
                                            @"inputIntensity", @0.8, nil];
CIImage *outputImage = [filter outputImage];

// 4
UIImage *newImage = [UIImage imageWithCIImage:outputImage];
self.imageView.image = newImage;

Let’s go over this section by section:

  1. The first two lines create an NSURL object that holds the path to your image file.
  2. Next you create your CIImage with the imageWithContentsOfURL: method.
  3. Next you create your CIFilter object. A CIFilter constructor takes the name of the filter, and a dictionary that specifies the keys and values for that filter. Each filter has its own unique keys and set of valid values.

    The CISepiaTone filter takes only two values: the kCIInputImageKey (a CIImage) and @"inputIntensity", a float value between 0 and 1 wrapped in an NSNumber (using the new literal syntax). Here you give that value 0.8. Most filters have default values that will be used if no value is supplied. One exception is the input image – that must be provided, as there is no default.

    Getting a CIImage back out of a filter is easy: you just use the outputImage property.

  4. Once you have an output CIImage, you need to convert it into a UIImage. The UIImage method +imageWithCIImage: creates a UIImage from a CIImage. Once you’ve converted it to a UIImage, you just display it in the image view you added earlier.
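Note that +imageWithCIImage: handles the rendering for you behind the scenes. An alternative – and this is where the CIContext from step 2 comes in explicitly – is to render the CIImage into a CGImage yourself. A sketch, assuming `outputImage` is the filter output from above:

```objc
// Render the filter output explicitly through a CIContext.
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:outputImage
                                   fromRect:[outputImage extent]];
UIImage *newImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage); // createCGImage returns a +1 reference
self.imageView.image = newImage;
```

The explicit-context approach lets you reuse one CIContext across many renders instead of creating the machinery from scratch each time.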

Compile and run the project, and you’ll see your image filtered by the sepia tone filter. Congratulations, you have successfully used CIImage and CIFilter!

Hello, Core Image!
