Core Image Tutorial for iOS: Custom Filters
Learn to create your own Core Image filters using the Metal Shading Language to build kernels that provide pixel-level image processing. By Vidhur Voora.
Contents
Core Image Tutorial for iOS: Custom Filters
25 mins
- Getting Started
- Introducing Core Image Classes
- Fetching the List of Built-In Filters
- Using Built-In Filters
- Displaying a Built-In Filter’s Output
- Meet CIKernel
- Creating Build Rules
- Adding the Metal Source
- Loading the Kernel Code
- Applying the Color Kernel Filter
- Creating a Warp Kernel
- Loading the Warp Kernel
- Applying the Warp Kernel Filter
- Challenge: Implementing a Blend Kernel
- Debugging Core Image Issues
- Using Core Image Quick Look
- Using CI_PRINT_TREE
- Where to Go From Here?
Core Image is a powerful and efficient image processing framework. You can create beautiful effects using the built-in filters the framework provides, as well as create custom filters and image processors. You can adjust an image's color and geometry, and perform complex convolutions.
Creating beautiful filters is an art, and one of the greatest artists was Leonardo da Vinci. In this tutorial, you’ll add some interesting touches to da Vinci’s famous paintings.
In the process, you’ll:
- Get an overview of Core Image’s classes and built-in filters.
- Create a filter using built-in filters.
- Transform an image’s color using a custom color kernel.
- Transform the geometry of an image using a custom warp kernel.
- Learn to debug Core Image issues.
Get your paintbrushes, oops, I mean your Xcode ready. It’s time to dive into the amazing world of Core Image!
Getting Started
Download the project by clicking Download Materials at the top or bottom of this page. Open the RayVinci project in the starter folder. Build and run.
You’ll see four of Leonardo da Vinci’s most famous works. Tapping a painting opens a sheet, but the image’s output is empty.
In this tutorial, you’ll create filters for these images and then see the result of applying a filter in the output.
Swipe down to dismiss the sheet. Next, tap Filter List on the top right.
That button should show a list of available built-in filters. But wait, it’s currently empty. You’ll fix that next. :]
Introducing Core Image Classes
Before you populate the list of filters, you need to understand the Core Image framework’s basic classes.
- CIImage: Represents an image that is either ready for processing or produced by the Core Image filters. A CIImage object has all the image's data within it but isn't actually an image. It's like a recipe that contains all the ingredients to make a dish but isn't the dish itself. You'll see how to render the image for display later in this tutorial.
- CIFilter: Takes one or more images, processes each image by applying transformations and produces a CIImage as its output. You can chain multiple filters to create interesting effects. CIFilter objects are mutable and not thread-safe.
- CIContext: Renders the processed results from the filter. For example, CIContext helps create a Quartz 2D image from a CIImage object.
To learn more about these classes, refer to the Core Image Tutorial: Getting Started.
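Here's a minimal sketch of how the three classes fit together, assuming a bundled image asset named "MonaLisa" and the built-in sepia filter. Both names are placeholders for illustration only:
import CoreImage
import UIKit

// A minimal sketch of the three classes working together.
// The "MonaLisa" asset name is only a placeholder for this example.
func sepiaToned(assetNamed name: String) -> UIImage? {
  guard
    let uiImage = UIImage(named: name),
    // CIImage: the recipe. It describes the image but isn't rendered yet.
    let ciImage = CIImage(image: uiImage)
  else { return nil }

  // CIFilter: describes the processing to apply.
  let sepia = CIFilter(name: "CISepiaTone")
  sepia?.setValue(ciImage, forKey: kCIInputImageKey)
  sepia?.setValue(0.8, forKey: kCIInputIntensityKey)

  // CIContext: performs the actual rendering into pixels you can display.
  let context = CIContext()
  guard
    let outputImage = sepia?.outputImage,
    let cgImage = context.createCGImage(outputImage, from: outputImage.extent)
  else { return nil }
  return UIImage(cgImage: cgImage)
}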
Now that you’re familiar with the Core Image classes, it’s time to populate the list of filters.
Fetching the List of Built-In Filters
Open RayVinci and select FilterListView.swift. Replace filterList in FilterListView with:
let filterList = CIFilter.filterNames(inCategory: nil)
Here, you fetch the list of all available built-in filters provided by Core Image by using filterNames(inCategory:) and passing nil as the category. You can view the list of available categories in CIFilter's developer documentation.
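If you're curious, you can pass one of those category constants instead of nil to narrow the results. A quick sketch, not needed for the project:
import CoreImage

// Passing a category constant instead of nil returns only filters in that category.
let blurFilters = CIFilter.filterNames(inCategory: kCICategoryBlur)
print(blurFilters) // e.g. ["CIBokehBlur", "CIBoxBlur", "CIDiscBlur", ...]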
Open FilterDetailView.swift. Replace Text("Filter Details") in body with:
// 1
if let ciFilter = CIFilter(name: filter) {
// 2
ScrollView {
Text(ciFilter.attributes.description)
}
} else {
// 3
Text("Unknown filter!")
}
Here, you:
- Initialize a filter, ciFilter, using the filter name. Since the name is a string and can be misspelled, the initializer returns an optional. For this reason, you'll need to check for the existence of a filter.
- Inspect the filter's various attributes using attributes. Here, you create a ScrollView and populate the description of the attributes in a Text view if the filter exists.
- If the filter doesn't exist or isn't known, show a Text view explaining the situation.
Build and run. Tap Filter List. Whoa, that’s a lot of filters!
Tap any filter to see its attributes.
Remarkable, isn’t it? You’re just getting started! In the next section, you’ll use one of these built-in filters to make the sun shine on the “Mona Lisa”. :]
Using Built-In Filters
Now that you’ve seen the list of available filters, you’ll use one of these to create an interesting effect.
Open ImageProcessor.swift. At the top, before the class declaration, add:
enum ProcessEffect {
case builtIn
case colorKernel
case warpKernel
case blendKernel
}
Here, you declare ProcessEffect as an enum. It has all the filter cases you'll work on in this tutorial.
Add the following to ImageProcessor:
// 1
private func applyBuiltInEffect(input: CIImage) {
// 2
let noir = CIFilter(
name: "CIPhotoEffectNoir",
parameters: ["inputImage": input]
)?.outputImage
// 3
let sunGenerate = CIFilter(
name: "CISunbeamsGenerator",
parameters: [
"inputStriationStrength": 1,
"inputSunRadius": 300,
"inputCenter": CIVector(
x: input.extent.width - input.extent.width / 5,
y: input.extent.height - input.extent.height / 10)
])?
.outputImage
// 4
let compositeImage = input.applyingFilter(
"CIBlendWithMask",
parameters: [
kCIInputBackgroundImageKey: noir as Any,
kCIInputMaskImageKey: sunGenerate as Any
])
}
Here, you:
- Declare a private method that takes a CIImage as input and applies a built-in filter.
- Start by creating a darkened, moody noir effect using CIPhotoEffectNoir. CIFilter takes a string as the name and parameters in the form of a dictionary. You fetch the resulting filtered image from outputImage.
- Next, create a generator filter using CISunbeamsGenerator. This creates a sunbeams mask. In the parameters, you set:
  - inputStriationStrength: The intensity of the sunbeams.
  - inputSunRadius: The radius of the sun.
  - inputCenter: The x and y position of the center of the sunbeam. In this case, you set the position to the top right of the image.
- Finally, create a stylized effect using CIBlendWithMask. You apply the filter on the input by setting the result of CIPhotoEffectNoir as the background image and sunGenerate as the mask image. The result of this composition is a CIImage.
ImageProcessor has output, a published property that is a UIImage. You'll need to convert the result of the composition to a UIImage to display it.
In ImageProcessor, add the following below @Published var output = UIImage():
let context = CIContext()
Here, you create an instance of CIContext that all the filters will use.
Add the following to ImageProcessor:
private func renderAsUIImage(_ image: CIImage) -> UIImage? {
if let cgImage = context.createCGImage(image, from: image.extent) {
return UIImage(cgImage: cgImage)
}
return nil
}
Here, you use context to create an instance of CGImage from the CIImage. Using cgImage, you then create a UIImage. The user will see this image.
Displaying a Built-In Filter’s Output
Add the following to the end of applyBuiltInEffect(input:):
if let outputImage = renderAsUIImage(compositeImage) {
output = outputImage
}
This converts compositeImage, which is a CIImage, to a UIImage using renderAsUIImage(_:). You then save the result to output.
Add the following new method to ImageProcessor:
// 1
func process(painting: Painting, effect: ProcessEffect) {
// 2
guard
let paintImage = UIImage(named: painting.image),
let input = CIImage(image: paintImage)
else {
print("Invalid input image")
return
}
switch effect {
// 3
case .builtIn:
applyBuiltInEffect(input: input)
default:
print("Unsupported effect")
}
}
Here, you:
- Create a method that acts as an entry point to ImageProcessor. It takes an instance of Painting and an effect to apply.
- Check for a valid image.
- If the effect is of type .builtIn, call applyBuiltInEffect(input:) to apply the filter.
Open PaintingWall.swift. Below selectedPainting = paintings[index] in the action closure of Button, add:
var effect = ProcessEffect.builtIn
if let painting = selectedPainting {
switch index {
case 0:
effect = .builtIn
default:
effect = .builtIn
}
ImageProcessor.shared.process(painting: painting, effect: effect)
}
Here, you set the effect to .builtIn for the first painting. You also set it as the default effect. Then you apply the filter by calling process(painting:effect:) on ImageProcessor.
Build and run. Tap the “Mona Lisa”. You’ll see a built-in filter applied in the output!
Great job making the sun shine on Mona Lisa. No wonder she’s smiling! Now it’s time to create your filter using CIKernel.
Meet CIKernel
With CIKernel, you can write custom code, called a kernel, that manipulates an image pixel by pixel. The GPU processes these pixels. You write kernels in the Metal Shading Language, which offers the following advantages over the older Core Image Kernel Language, deprecated since iOS 12:
- Supports all the great features of Core Image kernels like concatenation and tiling.
- Comes pre-compiled at build-time with error diagnostics. This way, you don’t need to wait for runtime for errors to appear.
- Offers syntax highlighting and syntax checking.
There are different types of kernels:
- CIColorKernel: Changes the color of a pixel but doesn’t know the pixel’s position.
- CIWarpKernel: Changes the position of a pixel but doesn’t know the pixel’s color.
- CIBlendKernel: Blends two images in an optimized way.
To create and apply a kernel, you:
- First, add custom build rules to the project.
- Then, add the Metal source file.
- Load the kernel.
- Finally, initialize and apply the kernel.
You’ll implement each of these steps next. Get ready for a fun ride!
Creating Build Rules
You need to compile the Core Image Metal code and link it with special flags.
Select the RayVinci target in the Project navigator. Then, select the Build Rules tab. Add a new build rule by clicking +.
Then, set up the first new build rule:
- Set Process to Source files with name matching:. Then set *.ci.metal as the value.
- Uncheck Run once per architecture.
- Add the following script:
xcrun metal -c -fcikernel "${INPUT_FILE_PATH}" \
-o "${SCRIPT_OUTPUT_FILE_0}"
This calls the Metal compiler with the required -fcikernel flag.
- Add the following in Output Files:
$(DERIVED_FILE_DIR)/${INPUT_FILE_BASE}.air
This produces an output binary that ends in .ci.air.
Next, add another new build rule by clicking + again.
Follow these steps for the second new build rule:
- Set Process to Source files with name matching:. Then set *.ci.air as the value.
- Uncheck Run once per architecture.
- Add the following script:
xcrun metallib -cikernel "${INPUT_FILE_PATH}" -o "${SCRIPT_OUTPUT_FILE_0}"
This calls the Metal linker with the required -cikernel flag.
- Add the following in Output Files:
$(METAL_LIBRARY_OUTPUT_DIR)/$(INPUT_FILE_BASE).metallib
This produces a file ending with .ci.metallib in the app bundle.
Next, it’s time to add the Metal source.
Adding the Metal Source
First, you’ll create a source file for a color kernel. In the Project navigator, highlight RayVinci right under the RayVinci project.
Right-click and choose New Group. Name this new group Filters. Then, highlight the group and add a new Metal file named ColorFilterKernel.ci.metal.
Open the file and add:
// 1
#include <CoreImage/CoreImage.h>
// 2
extern "C" {
namespace coreimage {
// 3
float4 colorFilterKernel(sample_t s) {
// 4
float4 swappedColor;
swappedColor.r = s.g;
swappedColor.g = s.b;
swappedColor.b = s.r;
swappedColor.a = s.a;
return swappedColor;
}
}
}
Here's a code breakdown:
- Including the Core Image header lets you access the classes the framework provides. This automatically includes the Core Image Metal Kernel Library, CIKernelMetalLib.h.
- The kernel needs to be inside an extern "C" enclosure so it's accessible by name at runtime. Next, you specify the coreimage namespace. You declare all the extensions in the coreimage namespace to avoid conflicts with Metal.
- Here, you declare colorFilterKernel, which takes an input of type sample_t. sample_t represents a single color sample from the input image. colorFilterKernel returns a float4 that represents the RGBA value of the pixel.
- Then, you declare a new float4, swappedColor, and swap the RGBA values from the input sample. You then return the sample with the swapped values.
Next, you’ll write the code to load and apply the kernel.
Loading the Kernel Code
To load and apply a kernel, start by creating a subclass of CIFilter.
Create a new Swift file in the Filters group. Name it ColorFilter.swift and add:
// 1
import CoreImage
class ColorFilter: CIFilter {
// 2
var inputImage: CIImage?
// 3
static var kernel: CIKernel = { () -> CIColorKernel in
guard let url = Bundle.main.url(
forResource: "ColorFilterKernel.ci",
withExtension: "metallib"),
let data = try? Data(contentsOf: url) else {
fatalError("Unable to load metallib")
}
guard let kernel = try? CIColorKernel(
functionName: "colorFilterKernel",
fromMetalLibraryData: data) else {
fatalError("Unable to create color kernel")
}
return kernel
}()
// 4
override var outputImage: CIImage? {
guard let inputImage = inputImage else { return nil }
return ColorFilter.kernel.apply(
extent: inputImage.extent,
roiCallback: { _, rect in
return rect
},
arguments: [inputImage])
}
}
Here, you:
- Start by importing the Core Image framework.
- Subclass CIFilter. This involves two main steps: specifying the input parameters (here, inputImage) and overriding outputImage.
- Then, declare a static property, kernel, that loads the contents of ColorFilterKernel.ci.metallib. This way, the library loads only once. You then create an instance of CIColorKernel with the contents of ColorFilterKernel.ci.metallib.
- Next, override outputImage. Here, you apply the kernel using apply(extent:roiCallback:arguments:). The extent determines how much of the input image gets passed to the kernel; you pass the entire image, so the filter applies to the whole image. roiCallback determines the rect of the input image needed to render a given rect of outputImage. Here, the rect of inputImage and outputImage doesn't change, so you return the same rect. Finally, you pass inputImage in the arguments array to the kernel. You'll see a sketch of a roiCallback that does more work right after this list.
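For ColorFilter, returning the rect unchanged is enough because the kernel reads only the pixel it writes. As a hedged illustration of when roiCallback matters, here's a hypothetical helper, not used in RayVinci, for a kernel that samples neighboring pixels, such as a blur:
import CoreGraphics

// Hypothetical: a kernel that reads pixels up to `radius` points away from the
// one it writes needs a larger slice of the input than the rect being rendered.
func roiForNeighborhoodKernel(radius: CGFloat) -> (Int32, CGRect) -> CGRect {
  return { _, rect in
    // Grow the requested rect by the sampling radius in every direction.
    rect.insetBy(dx: -radius, dy: -radius)
  }
}

// ColorFilter samples only the current pixel, so { _, rect in rect } suffices.
// A hypothetical 5-point blur kernel would instead pass roiForNeighborhoodKernel(radius: 5).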
Now that you’ve created the color kernel filter, you’ll apply it to an image.
Applying the Color Kernel Filter
Open ImageProcessor.swift. Add the following method to ImageProcessor:
private func applyColorKernel(input: CIImage) {
let filter = ColorFilter()
filter.inputImage = input
if let outputImage = filter.outputImage,
let renderImage = renderAsUIImage(outputImage) {
output = renderImage
}
}
Here, you declare applyColorKernel(input:). This takes a CIImage as input. You create the custom filter by creating an instance of ColorFilter.
The filter's outputImage has the color kernel applied. You then create an instance of UIImage using renderAsUIImage(_:) and set this as the output.
Next, handle .colorKernel in process(painting:effect:) as shown below. Add this new case above default:
case .colorKernel:
applyColorKernel(input: input)
Here, you call applyColorKernel(input:) to apply your custom color kernel filter.
Finally, open PaintingWall.swift. Add the following in the switch statement right below case 0 in the Button's action closure:
case 1:
effect = .colorKernel
This sets the effect to .colorKernel for the second painting.
Build and run. Now tap the second painting, “The Last Supper”. You’ll see the color kernel filter applied and the RGBA values swapped in the image.
Great job! Next, you’ll create a cool warp effect on da Vinci’s mysterious “Salvator Mundi.”
Creating a Warp Kernel
Similar to the color kernel, you’ll start by adding a Metal source file. Create a new Metal file in the Filters group named WarpFilterKernel.ci.metal. Open the file and add:
#include <CoreImage/CoreImage.h>
//1
extern "C" {
namespace coreimage {
//2
float2 warpFilter(destination dest) {
float y = dest.coord().y + tan(dest.coord().y / 10) * 20;
float x = dest.coord().x + tan(dest.coord().x/ 10) * 20;
return float2(x,y);
}
}
}
Here's what you added:
- Like in the color kernel Metal source, you include the Core Image header and enclose the method in an extern "C" enclosure. Then you specify the coreimage namespace.
- Next, you declare warpFilter(_:) with an input parameter of type destination, which gives you access to the position of the pixel you're currently computing. The function returns the position in the input image coordinates to use as the source. You access the x and y coordinates of the destination pixel using coord(). Then, you apply simple math to transform the coordinates and return them as source pixel coordinates to create an interesting tile effect.
Note: Try replacing tan with sin in warpFilter(_:) and you'll get an interesting distortion effect! :]
Loading the Warp Kernel
Similar to the filter you created for the color kernel, you’ll create a custom filter to load and initialize the warp kernel.
Create a new Swift file in the Filters group. Name it WarpFilter.swift and add:
import CoreImage
// 1
class WarpFilter: CIFilter {
var inputImage: CIImage?
// 2
static var kernel: CIWarpKernel = { () -> CIWarpKernel in
guard let url = Bundle.main.url(
forResource: "WarpFilterKernel.ci",
withExtension: "metallib"),
let data = try? Data(contentsOf: url) else {
fatalError("Unable to load metallib")
}
guard let kernel = try? CIWarpKernel(
functionName: "warpFilter",
fromMetalLibraryData: data) else {
fatalError("Unable to create warp kernel")
}
return kernel
}()
// 3
override var outputImage: CIImage? {
guard let inputImage = inputImage else { return .none }
return WarpFilter.kernel.apply(
extent: inputImage.extent,
roiCallback: { _, rect in
return rect
},
image: inputImage,
arguments: [])
}
}
Here, you:
- Create WarpFilter as a subclass of CIFilter with inputImage as the input parameter.
- Next, declare the static property kernel to load the contents of WarpFilterKernel.ci.metallib. You then create an instance of CIWarpKernel using the contents of the .metallib.
- Finally, provide the output by overriding outputImage. Within the override, you apply the kernel to inputImage using apply(extent:roiCallback:image:arguments:) and return the result.
Applying the Warp Kernel Filter
Open ImageProcessor.swift. Add the following to ImageProcessor:
private func applyWarpKernel(input: CIImage) {
let filter = WarpFilter()
filter.inputImage = input
if let outputImage = filter.outputImage,
let renderImage = renderAsUIImage(outputImage) {
output = renderImage
}
}
Here, you declare applyWarpKernel(input:), which takes a CIImage as input. You then create an instance of WarpFilter and set its inputImage.
The filter's outputImage has the warp kernel applied. You then create an instance of UIImage using renderAsUIImage(_:) and save it to output.
Next, add the following case to process(painting:effect:), below case .colorKernel:
case .warpKernel:
applyWarpKernel(input: input)
Here, you handle the case for .warpKernel and call applyWarpKernel(input:) to apply the warp kernel filter.
Finally, open PaintingWall.swift. Add the following case in the switch statement right below case 1 in action:
case 2:
effect = .warpKernel
This sets the effect to .warpKernel for the third painting.
Build and run. Tap the painting of Salvator Mundi. You’ll see an interesting warp-based tile effect applied.
Congrats! You applied your own touch to a masterpiece! ;]
Challenge: Implementing a Blend Kernel
The CIBlendKernel is optimized for blending two images. As a fun challenge, implement a custom filter for CIBlendKernel. Some hints:
- Create a subclass of CIFilter that takes in two images: an input image and a background image.
- Use the available built-in CIBlendKernel kernels. For this challenge, use the built-in multiply blend kernel.
- Create a method in ImageProcessor that applies the blend kernel filter to the image and sets the result as the output. You can use the multi_color image provided in the project assets as the background image for the filter. In addition, handle the case for .blendKernel.
- Apply this filter to the fourth image in PaintingWall.swift.
You'll find the solution implemented in the final project available in the downloaded materials. Good luck!
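If you'd like a nudge before opening the final project, here's a minimal sketch of one possible filter subclass, assuming the built-in multiply blend kernel. The names and details in the final project may differ:
import CoreImage

// One possible shape for the challenge filter: blend the input with a
// background image using the built-in multiply blend kernel.
class BlendFilter: CIFilter {
  var inputImage: CIImage?
  var backgroundImage: CIImage?

  override var outputImage: CIImage? {
    guard
      let inputImage = inputImage,
      let backgroundImage = backgroundImage
    else { return nil }
    // CIBlendKernel.multiply is one of Core Image's built-in blend kernels.
    return CIBlendKernel.multiply.apply(
      foreground: inputImage,
      background: backgroundImage)
  }
}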
Debugging Core Image Issues
Knowing how Core Image renders an image can help you debug when the image doesn't appear the way you expected. The easiest way to see how a render is built is to use Core Image Quick Look while debugging.
Using Core Image Quick Look
Open ImageProcessor.swift. Put a breakpoint on the line where you set output in applyColorKernel(input:). Build and run. Tap “The Last Supper”.
When you hit the breakpoint, hover over outputImage. You'll see a small popover that shows the address.
Click the eye symbol. A window will appear that shows the graph that makes the image. Pretty cool, huh?
Using CI_PRINT_TREE
CI_PRINT_TREE is a debugging feature based on the same infrastructure as Core Image Quick Look. It has several modes and operations.
Select the RayVinci scheme and choose Edit Scheme. Under the Run action, add CI_PRINT_TREE as a new environment variable with a value of 7 pdf.
The value of CI_PRINT_TREE takes the form graph_type output_type options.
graph_type denotes the stages of the Core Image render. Here are the values you can specify:
- 1: The initial graph showing the color spaces.
- 2: An optimized graph showing exactly how Core Image optimizes.
- 4: A concatenated graph showing how much memory you need.
- 7: Verbose logging. This prints all the above graphs.
For output_type, you can specify either PDF or PNG. Core Image saves the documents to a temporary directory.
Build and run. Select “The Last Supper” in the simulator. Now, open the temporary directory on your Mac by navigating to /tmp using the terminal.
You’ll see all the graphs as PDF files. Open one of the files with _initial_graph.pdf as the suffix.
The input is at the bottom, and the output is at the top. The red nodes represent the color kernels, while the green nodes represent the warp kernels. You’ll also see each step’s ROI and extent.
To learn more about the various options you can set for CI_PRINT_TREE, check out this WWDC session: Discover Core Image debugging techniques.
Where to Go From Here?
You can download the completed version of the project using the Download Materials button at the top or bottom of this tutorial.
In this tutorial, you learned to create and apply custom filters using Metal-based Core Image kernels. To learn more, check out these WWDC videos:
- Advances in Core Image: Filters, Metal, Vision and More
- Build Metal-based Core Image kernels with Xcode
You can also refer to Apple’s guide, Metal Shading Language for Core Image Kernels.
I hope you enjoyed this tutorial. If you have any questions or comments, please join the forum discussion below.