How To Create an App Like Instagram With a Web Service Backend – Part 1/2
There’s no doubt photography apps have huge momentum on the App Store. With the iPhone 4’s awesome camera and fast processor, taking photos and applying various effects is a blast.
You guys requested a tutorial on how to create a photo application that pairs with a web service as a backend, and your wish is our command! :]
In this tutorial, you'll make a simple photo sharing app, like an extremely simple version of Instagram. Starting from a blank startup project that has all the UI already set up, you'll learn how to:
- Connect to a JSON-based web API from Objective-C
- Create a simple JSON API in PHP
- Implement user authorization for the API
- Take photos, apply effects, and send them over to the JSON service
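To give you a rough idea of the kind of data the app and the PHP backend will exchange, a response to a "photo stream" request might look something like this. This is purely illustrative; the actual commands and field names are defined when you build the API:

```json
{
  "error": null,
  "result": [
    { "id": 1, "title": "Sunset over the bay", "user": "marin" },
    { "id": 2, "title": "Morning coffee", "user": "jane" }
  ]
}
```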
Quite a lot of cool stuff, eh? ;]
You will also need access to a web server with MySQL running. If that sounds intimidating, check out this project, which makes it pretty easy to run a local test server on your Mac.
If you're not yet comfortable setting up a web server on your Mac, have a look at this great tutorial, which covers all the basics of setting up your development environment.
Without further ado, strike a pose for the camera – and let’s get started!
Download the startup project and extract the contents of the ZIP file to a convenient location on your hard drive. It includes quite a few things, so I’ll briefly go over what’s inside.
The project you’re going to develop in this tutorial is called iReporter. It’s an app that gives you the ability to see a stream of all users’ photos, and if you’d like to, register to upload your own photos to the stream.
iReporter has four view controllers in total. Have a look at Storyboard.storyboard in the project file list to get a feel for the workflow:
The initial view controller is a navigation controller that shows a screen called StreamScreen after the app launches. This is where you are going to show the stream of all user-uploaded photos.
If you follow the segue going to the screen below, you get to a controller called StreamPhotoScreen. This screen will show up when the user taps a photo thumbnail, and will show a larger-size photo and the photo title.
From StreamScreen there’s a second segue going to the right to a screen called PhotoScreen. On PhotoScreen, you’ll show an action sheet and allow the user to take a photo, apply effects to it and finally post it to the web API.
To do all of the above, the user will have to authenticate with the API. That means when PhotoScreen appears, if the user hasn’t logged in yet, you’ll present the LoginScreen view controller modally, asking the user to either log in or register.
Lucky for you, the startup project already includes classes for all of these screens, including connected IBOutlets and IBActions (with empty method bodies that you will implement during the tutorial).
There are a few more files in the startup project. Inside “Categories,” you’ll find the UIImage categories from Trevor Harmon. You’ll need these to easily resize and crop images. As a side benefit, Trevor’s categories fix the faulty image orientation that sometimes comes from the iPhone’s camera. So thanks again, Trevor!
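For example, once a photo comes in from the camera, you could crop it to a square and scale down a thumbnail with just two calls. This is only a sketch, assuming the method names from Trevor Harmon's UIImage+Resize category; `photo` stands in for a UIImage you'd get from the camera:

```objc
#import "UIImage+Resize.h"

// A sketch using Trevor Harmon's UIImage+Resize category from the
// startup project. `photo` would come from e.g. UIImagePickerController.
UIImage *cropped = [photo croppedImage:CGRectMake(0, 0, 640, 640)];
UIImage *thumb   = [cropped resizedImage:CGSizeMake(96, 96)
                    interpolationQuality:kCGInterpolationHigh];
```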
I also included a small category that makes it easy to show a UIAlertView with a simple message.
As you can see, you’re off to a good start :] Next, you’ll add a third-party library to the project that will handle all the networking for you.
Outsourcing Your App’s Social Life
You can use the good old NSURLConnection class provided by iOS to handle the communication with the web API. But it’s 2012… let’s do something sexier :]
Of the up-and-coming networking libraries for Objective-C, AFNetworking seems to have the most momentum right now, so that’s the one you’re going to use. Head over to the AFNetworking GitHub repository and download the current version. (Brief documentation is included on the page, too.)
After downloading, inside the AFNetworking folder you’ll find a subfolder called “AFNetworking.” Just drag and drop it into your project file list. You should get a prompt similar to the following:
Make sure that “Copy items into destination group’s folder (if needed)” is checked so that the AFNetworking files will be copied into your project folder. Click Finish, and you should see the AFNetworking files included in your files list.
AFNetworking doesn’t use Automatic Reference Counting (ARC), but your skeleton project is set up to use it. So you’ll have to mark the AFNetworking classes as non-ARC so that the compiler knows how to handle them.
Select the Xcode project root (as in the picture above), and then switch to the “Build Phases” tab in the right-hand pane. Find and open the “Compile Sources” strip and you’ll see the classes to be compiled within your project.
Scroll down to the bottom and you’ll see all the AFNetworking files (from AFHTTPClient.m to UIImageView+AFNetworking.m). Select them all (as in the image below), press Enter, type “-fno-objc-arc” in the popup, and click Done. All the AFNetworking files should now be marked as not supporting ARC.
Press Cmd+B now to build the project. If everything is set up correctly, the project should compile successfully. (Except, that is, for a few warnings in the UIImage category classes from Trevor. You can disregard these.)
Note: When you include AFNetworking in a project, you have to add “#import <SystemConfiguration/SystemConfiguration.h>” to your project’s precompiled header (.pch) file; otherwise, AFNetworking won’t compile. This is already done for you in the startup project, but it’s good to know for the future.
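To give you a taste of what's ahead, here's roughly what fetching JSON with AFNetworking looks like. This is only a sketch: the URL is a placeholder, and the real API calls are built step by step later in the tutorial.

```objc
#import "AFJSONRequestOperation.h"

// A sketch of a JSON GET request with AFNetworking.
// The URL below is a placeholder, not the tutorial's real endpoint.
NSURL *url = [NSURL URLWithString:@"http://example.com/index.php?command=stream"];
NSURLRequest *request = [NSURLRequest requestWithURL:url];

AFJSONRequestOperation *operation =
  [AFJSONRequestOperation JSONRequestOperationWithRequest:request
    success:^(NSURLRequest *request, NSHTTPURLResponse *response, id JSON) {
      // The parsed JSON arrives ready to use as NSDictionary/NSArray objects
      NSLog(@"Stream: %@", JSON);
    }
    failure:^(NSURLRequest *request, NSHTTPURLResponse *response,
              NSError *error, id JSON) {
      NSLog(@"Error: %@", [error localizedDescription]);
    }];
[operation start];
```

Notice how the success and failure blocks keep the response handling right next to the request; that's a big readability win over NSURLConnection delegate callbacks.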