iOS 7 Blur Effects with GPUImage

iOS 7 embodies deference, clarity and depth. Using GPUImage, this tutorial looks at one of the most appealing aspects of depth: the iOS 7 blur effect. By Mike Jaoudi.


Resetting Blur Filters

This issue appears the second time you calculate the blur; the proper way to solve it is to remove all of the targets from the blur filter, which resets it. If you don't, the filter outputs nothing at all.

Update updateBlur as shown below:

    if(_blurFilter == nil){
        _blurFilter = [[GPUImageiOSBlurFilter alloc] init];
        _blurFilter.blurRadiusInPixels = 1.0f;
    }
    
    UIImage *image = [self.view.superview convertViewToImage];
    GPUImagePicture *picture = [[GPUImagePicture alloc] initWithImage:image];
    [picture addTarget:_blurFilter];
    [_blurFilter addTarget:_blurView];
    
    [picture processImageWithCompletionHandler:^{
        // The image is fully processed, so it's now safe to reset the filter.
        [_blurFilter removeAllTargets];
    }];

Here you’ve replaced processImage with processImageWithCompletionHandler:. This new method has a completion block that runs once the image processing is complete. Once the image is completely processed, you can safely remove all of the targets.

Build and run your app; tap the Menu and see if the black box issue is gone for good:


Open and close the menu a few more times to make sure you’ve squashed that bug for good.

Look closely at the blur effect as the menu opens — something doesn’t look quite right. To get a closer look, slow down the animation to see what’s happening in slow motion.

Update the duration of the animation block in show to 10.0f.
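If your show method animates the menu with a standard UIView animation block, the change looks something like this (the block contents are placeholders; only the duration matters here):

    // 10.0f is forty times slower than the original 0.25 — slow
    // enough to watch the blur's behavior frame by frame.
    [UIView animateWithDuration:10.0f animations:^{
        // ...existing menu animation code...
    }];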

Build and run your app, tap Menu, and watch the menu appear in slow motion:


Ah, now you can see what’s wrong. The blurred image is sliding in from the top. You really want it to appear as if the blur effect itself is sliding down the screen.

Aligning the Background Image

This is where you need to play some tricks with the static blur. When the menu comes down, the blur needs to stay aligned with the backdrop. So instead of shifting the image view down, you expand it, starting at zero height and growing to full size. This way, the image stays in place as the menu opens.

You already have the menu open to the full size in show — you just need to set the height of contentsRect to zero when the image view is first created and when it is hidden.

Add the following code to viewDidLoad in DropDownMenuController.m, just below where you initialize _blurView and set its initial properties:

_blurView.layer.contentsRect = CGRectMake(0.0f, 0.0f, 1.0f, 0.0f);

Still in the same file, add the following line to the bottom of the animation block in the method that hides the menu:

_blurView.layer.contentsRect = CGRectMake(0.0f, 0.0f, 1.0f, 0.0f);

The contentsRect property is animatable as well, so the original and updated rects are interpolated automatically during the animation.
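As a sketch of how that interpolation plays out, assuming show restores the full rect inside a standard UIView animation block:

    [UIView animateWithDuration:10.0f animations:^{
        // Core Animation interpolates the height from 0.0 to 1.0, so
        // the blur appears to unroll in place rather than slide in.
        _blurView.layer.contentsRect = CGRectMake(0.0f, 0.0f, 1.0f, 1.0f);
    }];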

Build and run your app. Fortunately you still have the animation running slowly, so it’s easy to check if you’ve fixed the issue:


That looks much more natural. You now have a slide-in menu with a blurred background.

Before moving on, re-adjust the little bits of code you changed for testing purposes.

Go to the show method and change the duration of the animation block to 0.25.

Next, in updateBlur, change the value of _blurFilter.blurRadiusInPixels to 4.0f.

Build and run your app; pop open the menu a few times to see what it looks like now:


Live Blurring

Live blurring is a technically difficult issue to solve. In order to do live blurring effectively you need to capture the screen, blur it and display it, all whilst maintaining 60 frames per second. Using GPUImage, blurring images and displaying them at 60 frames per second is no problem.

The real tricky bit? How to capture the live screen images, believe it or not.

Since you are working with capturing the main user interface, you must use the main thread of the CPU to capture the screen and convert it into an image.
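For reference, the convertViewToImage helper used earlier presumably wraps the iOS 7 snapshot API in something like the following, all of which must run on the main thread:

    UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0f);
    // drawViewHierarchyInRect:afterScreenUpdates: is the fast iOS 7
    // snapshot API, but it still blocks the main thread while it runs.
    [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:NO];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();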

Note: The human eye generally can't perceive changes faster than about 46 frames per second. This helps us out as developers, because modern processors can get a lot of work done between each of those frames!

A Brief Branch on Threading

When you run a program, you are executing a list of instructions. Each list of instructions runs inside its own thread, and you can run multiple lists of instructions concurrently in separate threads. An app starts on the main thread and new threads are created and executed in the background as necessary. If you haven’t had to manage multiple threads before, you’ve likely always written your apps to run exclusively on the main thread.

The main thread handles interactions and updates to the user interface; it’s critical to make sure that it stays responsive. If you overload the main thread with tasks, you will see the user interface start to stutter or freeze up completely.

If you’ve ever scrolled through the Twitter or Facebook app on your phone, then you’ve seen background threads in action. Not all profile pictures show up immediately as you scroll; the app launches background threads to fetch the image, and once the image has been retrieved it then shows up on-screen.

Without background threads, the scrolling table view would freeze while it tried to retrieve each and every profile image; since retrieving an image can take several seconds, it’s best passed off to a background thread to keep the user interface smooth and responsive.
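The pattern is easy to sketch with Grand Central Dispatch (the fetchProfileImage method and profileImageView property here are hypothetical):

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Slow work happens on a background queue...
        UIImage *image = [self fetchProfileImage]; // hypothetical network call
        dispatch_async(dispatch_get_main_queue(), ^{
            // ...and UIKit is only touched back on the main thread.
            self.profileImageView.image = image;
        });
    });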

So how does this affect your app? The UIView snapshot APIs covered earlier must be run on the main thread. This means that each time they run, the entire interface will freeze for a moment before it continues on.

For static blurs, this action happens so fast that you don’t notice it. You only need to capture the screen once. However live blur effects need to capture the screen 60 times a second. If you performed live captures like this on the main thread, animations and transitions would become choppy and stuttered.

Even worse, as the complexity of your user interface increased, the time it would take to capture the interface would also increase, and your app would stutter even more!

What to do, dear reader?

Potential Live Blur Solutions

One solution, used by many open source live blur libraries, is to slow down the capture frame rate. Instead of capturing the interface 60 times a second, you capture it maybe 20, 30 or 40 times a second. Even though that doesn't seem like much of a difference, your eye picks up the delay. The blur falls out of sync with the rest of the app, which can look worse than doing no blurring at all.
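One way such libraries implement that throttling (a sketch only, wired to this tutorial's updateBlur method) is a CADisplayLink with a reduced frame interval:

    // Fires on every second display frame: 60 / 2 = 30 captures per second.
    CADisplayLink *displayLink =
        [CADisplayLink displayLinkWithTarget:self
                                    selector:@selector(updateBlur)];
    displayLink.frameInterval = 2;
    [displayLink addToRunLoop:[NSRunLoop mainRunLoop]
                      forMode:NSRunLoopCommonModes];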

Apple handles live blur in some of their apps without issue — but unfortunately they haven’t made this API public yet. The UIView snapshot APIs of iOS 7 are a huge improvement over the old ways of doing things, but they aren’t quite fast enough for live blurring.

Some developers have seized on the blurring features of UIToolbar to do their dirty work. Yes, it works, but it's strongly advised that you do NOT use it in your production apps. Sure, it isn't a private API, but it is an unsupported feature, and Apple may reject your app should you use it. There are zero guarantees or promises that it will continue to work this way under future versions of iOS.

Apple could modify the UIToolbar at any point and break your app in ugly ways. In the iOS 7.0.3 update, Apple modified the effect in the UIToolbar and UINavigationBar and some developers reported that the effect stopped working altogether. Don’t fall into this trap!

Mike Jaoudi

