Your Own Image Picker With Flutter Channels

In this tutorial you will learn how to use Flutter Channels to communicate with platform code and create an image picker for both Android and iOS. By JB Lorenzo.


Flutter? So you’ve decided to go down the Dart side, eh? Great! Imagine you want to make something more complex than UI and logic. Something that will help you develop that buy-and-sell app that you are making. Wouldn’t it be great if you could allow the user to select photos from their own gallery in your app? Is this possible? Without the use of plugins? (Spoiler alert: Yes!)

In this tutorial, you will:

  • Learn to use Flutter’s Platform Channels.
  • Learn how to access photos from Android and iOS to use them within your Flutter components.
  • Build an image picker in Flutter using Platform Channels.

Don’t change the channel, this article will be right back after these important messages!

Note: This tutorial assumes that you’re already familiar with the basics of Flutter development. If you are new to Flutter, read through our Getting Started With Flutter tutorial. Other prerequisites include knowledge of using Android Studio with Flutter. Using Xcode is optional, but you must have it installed.

Getting Started

You can download the project files by clicking on the Download Materials button at the top or bottom of the tutorial. Then, open the project up in Android Studio 3.4 or greater. You should be using a recent version of Flutter, 1.5 or above. Be sure to get Flutter dependencies for the project if prompted to do so by Android Studio. You’ll start working with the project after a bit of theory.

Notably, you’ll find the grid user interface for the image picker already provided for you in the starter project.

Flutter and Platform-Specific Code

Flutter provides a lot of functionality on its own. You can create a very pleasing interface together with cool animations. Yet, some functionality might not be available directly in Flutter because it’s platform specific. For example, you might want to use some sensors from your device or access a specific feature on Android or iOS.

In that case, you might first try to find a Flutter plugin/package provided by Google or some other third party. If you find something, you’re in luck. But you might want to understand how they created the plugin. If you don’t find anything, you will have to develop your own solution. You can make your own solutions using Flutter’s Platform Channels.

Platform Channels

Platform Channels allow communication between the Flutter layer of your app and the platform layer, that is, the iOS or Android layer. You use channels by sending messages to and from each end. For example, you can send messages to the channel from Flutter; then, the platform layer reacts to the messages by passing back a result. See the image below for reference.

Method Channels

As shown above, the arrows are two-way. Thus, you can pass messages from the platform layer, too. Similarly, the Flutter layer will react and send back a result. Also, you access the channel asynchronously. Hence, you can perform long operations with channels without slowing down your UI.

The MethodChannel below is the component used to enable this communication from the Flutter side:

const MethodChannel(
  String name, [
  MethodCodec codec = const StandardMethodCodec()
])
To start, you provide a name to the method channel. You can provide an optional codec that will be used to encode and decode the message. The StandardMethodCodec is used as a default, but there are other codecs to choose from. The specialized codecs are BinaryCodec, StringCodec, and JSONMessageCodec. You can also make your own codec.
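For example, constructing a method channel on the Flutter side is a one-liner. This is a minimal sketch; the channel name com.example/gallery is an arbitrary placeholder for illustration, and both sides of the channel must use the same name:

```dart
import 'package:flutter/services.dart';

// Uses the default StandardMethodCodec implicitly.
const channel = MethodChannel('com.example/gallery');

// Equivalent, with the codec spelled out explicitly.
const explicitChannel =
    MethodChannel('com.example/gallery', StandardMethodCodec());
```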

Then, you will choose how to invoke the method. You can use one of the three invocation options below:

Future<T> invokeMethod<T>(
  String method, [
  dynamic arguments
])

Future<List<T>> invokeListMethod<T>(
  String method, [
  dynamic arguments
])

Future<Map<K, V>> invokeMapMethod<K, V>(
  String method, [
  dynamic arguments
])

For example, you can use invokeMethod to pass a message through the channel with a method name and optional arguments.

You also call setMethodCallHandler to provide a way to handle the incoming messages. Usually, you call this method on the receiving side of the channel. But since you can do two-way communication, you can in fact call this on either side.
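The two directions can be sketched together in Dart. The channel and method names here (com.example/gallery, getItemCount, galleryChanged) are hypothetical, chosen only to illustrate the calls:

```dart
import 'package:flutter/services.dart';

const channel = MethodChannel('com.example/gallery');

Future<int> fetchItemCount() async {
  // Asks the platform side to run whatever handler it registered
  // for the 'getItemCount' method name.
  final count = await channel.invokeMethod<int>('getItemCount');
  return count ?? 0;
}

void listenForPlatformCalls() {
  // Because channels are two-way, the Flutter side can also
  // receive and answer calls initiated by the platform layer.
  channel.setMethodCallHandler((call) async {
    switch (call.method) {
      case 'galleryChanged':
        return 'acknowledged';
      default:
        throw MissingPluginException('Unknown method ${call.method}');
    }
  });
}
```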


There is another platform channel provided by Flutter, namely, the BasicMessageChannel, and it looks like this:

const BasicMessageChannel<T>(
  String name,
  MessageCodec<T> codec
)
As the name implies, this is a more basic option for asynchronously passing messages using a custom codec.

In contrast to method channels in which you pass a method and parameters, with the BasicMessageChannel, you simply call send with your desired message to pass to the platform. Note that you need to specify the codec for this channel; there is no default. Thus, the message can be anything as long as the channel can decode it:

Future<T> send(T message) async {
  return codec.decodeMessage(
      await BinaryMessages.send(name, codec.encodeMessage(message)));
}
This is how sending a message through a BasicMessageChannel works internally. Specifically, the channel’s codec encodes the message. Since the codec is customizable, you can pass almost anything here.
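A minimal sketch of using a BasicMessageChannel from Dart follows. The channel name com.example/echo and the string payloads are assumptions made for illustration; StringCodec is one of the built-in codecs mentioned earlier:

```dart
import 'package:flutter/services.dart';

// With StringCodec, the whole message is a single string.
const messages =
    BasicMessageChannel<String>('com.example/echo', StringCodec());

Future<void> sayHello() async {
  // send() encodes the message, hands it to the platform layer,
  // then decodes whatever the platform sends back.
  final reply = await messages.send('hello');
  print('platform replied: $reply');
}

void receiveMessages() {
  // As with method channels, the Flutter side can handle
  // incoming messages too.
  messages.setMessageHandler((message) async => 'got: $message');
}
```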

Writing Your Platform-Specific Code

After all that theory, you are now ready to build something! In the rest of the tutorial, you will:

  • Learn how to read photos from Android.
  • Replicate the above for iOS.
  • Open a channel from Dart.
  • Prepare the data that you will be passing through the channel.
  • On the receiving side, bring back that data into something you can use.

At that point, you will be able to see the image picker working.

For a challenge, you will get to know the concept of Flutter plugins. Plugins give you the ability to make this image picker into sharable code. Now, you should start by taking some photos, then go on to the next section.

Getting Images From the Gallery

The concepts of this platform-specific section are simple. First, you need to request access to photos. Then, you need to make a few methods: one gets the total number of images; another returns the image data, including metadata, for the image at a given index. Time to start!
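Seen from the Flutter side, the two platform methods just described can be sketched as channel calls like the following. The channel and method names (com.example/gallery, getItemCount, getItem) and the map-shaped reply are placeholders for illustration; the real wiring comes later in the tutorial:

```dart
import 'package:flutter/services.dart';

const channel = MethodChannel('com.example/gallery');

// Method one: the total number of images in the gallery.
Future<int> galleryImageCount() async =>
    await channel.invokeMethod<int>('getItemCount') ?? 0;

// Method two: the data for one image at a given index. Here the
// platform side is assumed to reply with the image bytes plus
// metadata (id, creation date, location) in a map.
Future<Map<dynamic, dynamic>?> galleryItem(int index) {
  return channel
      .invokeMapMethod<dynamic, dynamic>('getItem', {'index': index});
}
```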

Accessing the Gallery in Android

First, open the starter project in Android Studio. Build and run the app, making sure an Android device or emulator is selected. You should see something similar to the screenshot below.

Getting the Total Image Count on Android

Open the Kotlin file MainActivity.kt, which you can find within the android/app/src/main/kotlin path. In the onCreate method, notice that there is already a request for permissions. This is to read images located in your external storage. Next, put the following inside getGalleryImageCount(), replacing the existing content:

val uri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI
// 1
val cursor = contentResolver.query(uri, columns, null, null, null)
// 2
return cursor?.count ?: 0

This returns the total number of images that are in your external storage. Going over each line:

  1. Here, you open a cursor using the columns provided. The cursor is pointed at the media content in your external storage.
  2. You return the total number of items in that cursor. You are also handling the case of the cursor being null. In that case, the count will be zero.

So now you return the total number of images in a method. Next, you should fill in the details for getting image data.

Getting Image Data on Android

Next, update dataForGalleryItem() to the following, being sure to add the imports shown:


import android.graphics.Bitmap
import android.provider.MediaStore
import java.io.ByteArrayOutputStream

private fun dataForGalleryItem(index: Int, completion: (ByteArray, String, Int, String)
  -> Unit) {
  val uri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI
  val orderBy = MediaStore.Images.Media.DATE_TAKEN
  // 1
  val cursor = contentResolver.query(uri, columns, null, null, "$orderBy DESC")
  cursor?.apply {
    // 2
    moveToPosition(index)
    // 3
    val idIndex = getColumnIndexOrThrow(MediaStore.Images.Media._ID)
    val createdIndex = getColumnIndexOrThrow(MediaStore.Images.Media.DATE_ADDED)
    val latitudeIndex = getColumnIndexOrThrow(MediaStore.Images.Media.LATITUDE)
    val longitudeIndex = getColumnIndexOrThrow(MediaStore.Images.Media.LONGITUDE)
    val id = getString(idIndex)
    // 4
    val bmp = MediaStore.Images.Thumbnails.getThumbnail(contentResolver, id.toLong(),
        MediaStore.Images.Thumbnails.MINI_KIND, null)
    val stream = ByteArrayOutputStream()
    bmp.compress(Bitmap.CompressFormat.JPEG, 90, stream)
    val data = stream.toByteArray()
    // 5
    val created = getInt(createdIndex)
    val latitude = getDouble(latitudeIndex)
    val longitude = getDouble(longitudeIndex)
    // 6
    completion(data, id, created, "$latitude, $longitude")
  }
}

Going over each in turn:

  1. Here, you open a cursor again with the columns from earlier. You also request sorted items descending with the date taken.
  2. This moves the cursor to that index.
  3. Then, you get each column item’s column index and the image id.
  4. This block of code gets a thumbnail of the image and converts the bitmap to a byte array.
  5. Here, you get the corresponding data for each column item. Depending on the data type of that column, use an appropriate method.
  6. Finally, hand all the data that you got to a completion function.

Build and run the project, but make sure you have photos in your gallery; otherwise, you will get an error. Then, check the Run panel, where you should see something similar to the following:

2019-04-28 20:30:09.695 9033-9033/com.raywenderlich.imagepickerflutter I/System.out: number of items 2
2019-04-28 20:30:10.669 9033-9033/com.raywenderlich.imagepickerflutter I/System.out: first item [B@86487e3 12 1556476198 0.0, 0.0

This proves that you have the total count of your images. This also shows that you were able to get the first image, along with the timestamp and coordinates.

If you get an error saying something like the following, add a photo to your gallery, e.g., take a selfie:

android.database.CursorIndexOutOfBoundsException: Index 0 requested, with a size of 0

You can also modify the code so that it doesn’t try to fetch an image when there are none.

The Android Kotlin code is set up, so now you are ready to move on to the iOS-specific code.

Getting the List of Images on iOS

You can still use Android Studio here to edit the project Swift code, but make sure that you are targeting an iOS simulator or device. See the following screenshot to confirm that you are setting it properly:

Build and run the project while targeting an iOS simulator or device. You should see something like the image below:

Similar to the previous section, you will need to make methods to return the total count and image data.

Getting the Total Image Count on iOS

Still in Android Studio, or in Xcode if you prefer, open AppDelegate.swift, which you can find in ios/Runner. Then add the following code, replacing the placeholder for getGalleryImageCount():

import Photos

func getGalleryImageCount() -> Int {
  // 1
  let fetchOptions = PHFetchOptions()
  fetchOptions.includeHiddenAssets = true

  // 2
  let collection: PHFetchResult = PHAsset.fetchAssets(with: fetchOptions)
  // 3
  return collection.count
}

Here, you are doing the following:

  1. Declare fetchOptions so that the fetch includes hidden assets. This option is useful for showing the images on the simulator.
  2. Query fetch assets using the options from the previous step.
  3. Return the total count of assets.

Next, you need to get the image data for a given index.

Getting Image Data on iOS

Like for Android, replace the placeholder for dataForGalleryItem(), this time using the following Swift code:

func dataForGalleryItem(index: Int, completion: @escaping (Data?, String, Int, String) -> Void) {
  // 1
  let fetchOptions = PHFetchOptions()
  fetchOptions.includeHiddenAssets = true
  let collection: PHFetchResult = PHAsset.fetchAssets(with: fetchOptions)
  if (index >= collection.count) {
    completion(nil, "", 0, "")
    return
  }

  // 2
  let asset = collection.object(at: index)

  // 3
  let options = PHImageRequestOptions()
  options.deliveryMode = .fastFormat
  options.isSynchronous = true
  let imageSize = CGSize(width: 250,
                         height: 250)

  // 4
  let imageManager = PHCachingImageManager()
  imageManager.requestImage(for: asset, targetSize: imageSize, contentMode: .aspectFit, options: options) { (image, info) in
    // 5
    if let image = image {
      // 6
      let data = UIImageJPEGRepresentation(image, 0.9)
      completion(data,
                 "\(asset.localIdentifier)",
                 Int(asset.creationDate?.timeIntervalSince1970 ?? 0),
                 "\(asset.location ?? CLLocation())")
    } else {
      completion(nil, "", 0, "")
    }
  }
}

Going over each one by one:

  1. You open a query for the assets similar to the previous method. You also stop getting data if the index is beyond the number of images.
  2. You get the asset at the given index.
  3. You declare the request options to get the image synchronously, using fast delivery (lower quality) and a 250x250 target size.
  4. You request the image.
  5. If the image exists, process it, otherwise return nil.
  6. Convert the UIImage into data and pass it to the completion closure. Also, include the image identifier, creation date and location.

Once you’ve made these changes, build and run your project. It’s possible you may need to restart your app a second time after accepting the prompt to allow access to photos. You should then see the same interface but have a similar output to the below in your logs. Make sure you are looking at the Run panel of Android Studio or the console in Xcode:

image count: 9
first data: 41540 bytes 106E99A1-4F6A-45A2-B320-B0AD4A8E8473/L0/001 1299975445 <+38.03744450,-122.80317833> +/- 0.00m (speed 0.00 mps / course 0.00) @ 1/1/01, 1:00:00 AM Central European Standard Time

This shows you have successfully read the image count and image data. Hooray!

Now you can finally move to setting up the platform channels so that you can pass all this image information to Flutter.