AR Foundation in Unity: Getting Started

In this AR Foundation tutorial, you’ll learn how to build a cross-platform AR application in Unity. By Ken Lee.

Augmented reality (AR) is one of the fastest growing technologies in today’s software industry. Many applications use AR technology to simulate makeup, camera filters and stage effects.

Developing an AR application from scratch isn’t an easy task. You need to know advanced algorithms for image processing, motion tracking, spatial analysis and even machine learning. Luckily, Apple and Google have developed their own AR software development kits (SDKs), which combine the necessary algorithms into tidy packages to make the task easier.

Unfortunately, if you want to build an AR application for both iOS and Android devices, you need to use both SDKs, which doubles your development efforts. To solve this, Unity has a handy library called AR Foundation. This library can help you build your AR application for both iOS and Android with a single codebase!

In this tutorial, you’ll learn how to build an AR doodling application. After you’ve completed this tutorial, you’ll be familiar with:

  • How AR Foundation works
  • Installing and setting up AR Foundation
  • Detecting surfaces in the AR environment
  • Interacting with the AR environment with raycasting
  • Improving visuals with the post-processing module

The materials for this tutorial were built in Unity version 2020.2. You can get this version from the Unity website or install it via Unity Hub.

Note: This tutorial requires basic knowledge of the Unity Editor and C# programming. If you’re new to Unity development, check out our tutorial on getting started with Unity.

Getting Started

Download the starter project by clicking the Download Materials button at the top or bottom of the tutorial.

Unzip its contents and open the ARDoodle-Starter project in Unity.

After the project loads, you’ll see the RW folder in the Project window and the folder structure broken down as follows:

  • Materials: Materials for the scene
  • Prefabs: Pre-built components composed of scripts and models
  • Scenes: The AR scene and the doodle testing scene
  • Scripts: Scripts of Doodle components and the UI
  • Settings: The settings file for the rendering
  • Sprites: The graphics used by the UI

Open the DoodleTest demo scene from the Scenes folder and click Play to try it for yourself!

Doodle Demo

Your task is to bring this doodling application into the realm of AR! (Bonus: If you have kids, this will let them doodle freely without upsetting mom and dad. :])

AR Doodling on Wall

But first you’ll learn more about AR Foundation, what it is and what it does.

Understanding AR Foundation

Take a look at the most popular AR devices and their SDKs:

  • iPhone, iPad (iOS): ARKit
  • Android phone (Android): ARCore
  • Magic Leap (Lumin OS): Lumin SDK
  • HoloLens (Windows): Mixed Reality Toolkit SDK

Before AR Foundation, developers had to write different sets of code using device-specific SDKs to communicate with the AR devices. If you wanted to support all devices in a single application, you had to write a lot of branching logic to switch between platforms. This made development more time-consuming and the codebase more complicated.
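To see why this was painful, here’s a rough sketch of what that branching looked like. The wrapper classes and method names below are hypothetical, purely to illustrate the pattern:

```csharp
public class LegacyPlaneDetector
{
    // Every AR feature needed a per-platform branch like this one.
    public void StartDetection()
    {
#if UNITY_IOS
        ARKitPlaneApi.Start();    // hypothetical ARKit wrapper
#elif UNITY_ANDROID
        ARCorePlaneApi.Start();   // hypothetical ARCore wrapper
#endif
    }
}
```

Multiply this by every feature your app supports, and the maintenance cost grows quickly.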

With AR Foundation, you can use a unified interface to control device-specific SDKs!

To understand how this works, you’ll dive a little deeper into the architecture of AR Foundation.

AR Foundation Architecture

AR Foundation architecture

There are two core parts of the AR Foundation architecture:

  • AR Subsystem
  • AR Components

The AR subsystem is the bridge between Unity and the native AR SDK modules. Every subsystem provides a different feature: for example, the image tracking subsystem detects images and the raycast subsystem casts rays.

AR Subsystems

The subsystems only define method interfaces; they don’t have any implementation details. The actual implementation details are written in different device-specific XR plugins, such as the ARKit XR Plugin for iOS and the ARCore XR Plugin for Android. These plugins implement the methods defined in the subsystem using the native AR SDK functions.

Although you can use the subsystems to control the native AR SDK, they’re not always easy to use. AR components are much more developer friendly.

Here are some components you’ll encounter:

  • ARSession: manages the lifecycle of the AR session
  • ARPlaneManager: detects surfaces
  • ARRaycastManager: detects user touches in the AR environment
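As a small taste of the component API, the sketch below subscribes to ARSession’s state event and logs each change. It isn’t part of the starter project; it only shows how the components surface subsystem data:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class SessionStateLogger : MonoBehaviour
{
    void OnEnable()
    {
        // ARSession raises this whenever the session state changes,
        // e.g. while initializing or once tracking begins.
        ARSession.stateChanged += OnStateChanged;
    }

    void OnDisable()
    {
        ARSession.stateChanged -= OnStateChanged;
    }

    void OnStateChanged(ARSessionStateChangedEventArgs args)
    {
        Debug.Log($"AR session state: {args.state}");
    }
}
```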

Now that you have an overview of the architecture, you’ll move on to the next step of the tutorial.

Building Your Application to the Device

AR Foundation requires either an ARKit- or ARCore-powered phone to run. So first things first, you need to have your device ready and connected to your computer.

If you haven’t set up iOS or Android projects before, please read the instructions:

  • For iOS, please click here.
  • For Android, please click here.

Now, try to build and run the starter project on your device.

Open the Build Settings window by selecting File ▸ Build Settings from the menu. Add RW/Scenes/ARScene as the first scene in Scenes In Build, and choose iOS or Android in the platform panel.

Next, choose Switch Platform. It may take a few minutes to process, so grab some coffee or tea. :]

When the process is done, click Build and Run. When the file dialog appears, select a build folder, enter ArDoodle in the Save As field, and Save.

Unity will build and deploy the application to your device.

Build to device

Tip: It’s good practice to build your application to the device once before installing the AR Foundation plugin. This way, if something goes wrong, it helps determine whether the problem came from the platform setting or the plugin.

Installing AR Foundation Packages

Next, you’ll install the AR Foundation packages you need for this tutorial:

  • AR Foundation: The core module for all AR Foundation applications
  • ARCore XR Plugin: The plugin for Google AR SDK — ARCore
  • ARKit XR Plugin: The plugin for iOS AR SDK — ARKit

Here’s how you do it:

  1. Open the package manager from Window ▸ Package Manager.
  2. Select Unity Registry at the top of the menu.
  3. Find AR Foundation, ARCore XR Plugin and ARKit XR Plugin in the package list and, for each one, select version 4.0.10 and click Install.

Installing ARFoundation Package

Here’s what your Package Manager should look like with the AR Foundation packages installed:

Package Manager Windows

To verify the installation, check that the following items exist:

  1. XR option in GameObject Menu
  2. XR option in Assets Menu
  3. XR Plug-in Management setting in Project Settings

New XR Menus

Setting Up the AR Environment

Before you start developing, you need to modify some settings to get AR Foundation up and running.

First, configure the Project Settings:

  1. Select Edit ▸ Project Settings to open the Project Settings window.
  2. Select the XR Plug-in Management ▸ iOS tab.
  3. Check ARKit under Plug-in Providers.
  4. Open the XR Plug-in Management ▸ Android tab.
  5. Check ARCore under Plug-in Providers.

A new folder named XR is automatically created in the Assets folder. This folder contains scripts and settings needed by AR Foundation.

Since this project uses the Universal Render Pipeline (URP), you also need to configure the URP renderer setting. Open Assets/RW/Settings/ForwardRenderer, click Add Renderer Features in the Inspector, and select AR Background Renderer Feature.

One of the awesome things about URP is its post-processing feature, which enhances the visuals of the application. You’ll learn more about this later.

Next, you’ll modify the AR scene. Every AR Foundation scene needs two core components: ARSession and ARSessionOrigin. So you’ll add them to the scene next.

  • Open Assets/RW/Scenes/ARScene, right-click an empty space in the Hierarchy, and select XR ▸ AR Session to create the ARSession object.
  • Right-click an empty space in the Hierarchy again and select XR ▸ AR Session Origin to create the ARSessionOrigin object.

The ARSession and ARSessionOrigin objects are ready now, but the scene doesn’t use any AR features yet, so there’s no way to verify they’re working.

Verifying AR Capabilities

Luckily, you can use a basic feature called Feature Point Tracking to verify the AR capabilities. This feature finds the feature points in the AR environment.

  • Select ARSessionOrigin in the Hierarchy, click Add Component in the Inspector, search for AR Point Cloud Manager and select the item.
  • Then, right-click in the Hierarchy and select XR ▸ AR Default Point Cloud to create an AR Point Cloud object.
  • When the object is created, assign it to AR Point Cloud Manager’s Point Cloud Prefab in the Inspector.
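If you’d like to see the tracking data in code as well, you could subscribe to the manager’s change event. This snippet is a hypothetical debugging aid, not something the tutorial requires:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PointCloudLogger : MonoBehaviour
{
    [SerializeField] ARPointCloudManager pointCloudManager;

    void OnEnable() => pointCloudManager.pointCloudsChanged += OnChanged;
    void OnDisable() => pointCloudManager.pointCloudsChanged -= OnChanged;

    void OnChanged(ARPointCloudChangedEventArgs args)
    {
        // Each entry is an ARPointCloud trackable managed by the subsystem.
        Debug.Log($"point clouds added: {args.added.Count}, updated: {args.updated.Count}");
    }
}
```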

Add Point Cloud Manager

Now, build and run to check that the Point Cloud is working.

AR Point Cloud demo

If you find lots of yellow spots, it means AR Foundation is working.

After testing, disable the ARPointCloudManager component to prevent feature points from showing up in the future.

If you’re building on Android, you may run into a problem related to the build settings.

Android Build Error

You can fix it by changing the following items in the Android Player Settings:

  • Remove Vulkan from the Graphics APIs.
  • Set Minimum API Level to API Level 24.

Fixed Building Setting

About ARSession and ARSessionOrigin

In the last section, you used ARSession and ARSessionOrigin to make a simple AR application. Both of them are key components of AR Foundation programs, so it’s important to understand them.


ARSession controls the lifecycle of an AR session. It keeps getting updates from the subsystem to check if the session is still alive.


ARSessionOrigin keeps your virtual objects in the correct position in AR environments.

It keeps getting updates from the subsystem and modifies the position of the virtual objects to stay aligned with the real environment. Notice that you need to place your AR objects under the ARSessionOrigin in the Hierarchy. If you don’t, the positions of the objects won’t update.
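For example, if you spawn virtual objects from code, a sketch like the one below keeps them under the session origin. The class and field names here are hypothetical:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class DoodleSpawner : MonoBehaviour
{
    [SerializeField] GameObject doodlePrefab; // placeholder prefab reference
    [SerializeField] ARSessionOrigin sessionOrigin;

    public GameObject Spawn(Vector3 worldPosition)
    {
        // Parenting under trackablesParent keeps the object aligned
        // as ARSessionOrigin adjusts for the real environment.
        var obj = Instantiate(doodlePrefab, sessionOrigin.trackablesParent);
        obj.transform.position = worldPosition;
        return obj;
    }
}
```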

Exploring the Doodle Logic

Before making the AR version of the Doodling App, take a look at the original doodling logic.

The doodle logic consists of DoodlePen and DoodleLine.

DoodleLine handles all the line-drawing logic. When users touch the screen, it casts a ray to find the hit position on the surface and feeds that position to the Line Renderer to show the doodle line. Notice that the raycasting logic isn’t implemented in the DoodleLine class. The concrete implementation is defined outside, so you can toggle between the standard raycasting logic and the AR raycasting logic later on.

DoodlePen is a class for managing the doodle lines. When users touch the screen, a new DoodleLine object is instantiated with the selected line size and color. Currently, DoodlePen only supports non-AR mode. Your job is to add AR environment support with AR raycasting logic.
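The delegate-based design described above might look something like the sketch below. The exact names in the starter project may differ; this only illustrates the pattern:

```csharp
using UnityEngine;

// A delegate that tries to find a surface position for the current touch.
public delegate bool RaycastDelegate(out Vector3 hitPosition);

public class DoodleLineSketch : MonoBehaviour
{
    // DoodlePen assigns either the standard or the AR raycast logic here.
    public RaycastDelegate raycastDelegate;

    void Update()
    {
        if (raycastDelegate != null && raycastDelegate(out Vector3 hit))
        {
            // Feed hit into the Line Renderer...
        }
    }
}
```

Because the line class only depends on the delegate, swapping in AR raycasting later requires no changes to the drawing code.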

Consider developing the non-AR logic first, because you can’t preview the AR scene in the Unity Editor. If you have to build to your device to check every change, you’ll waste a lot of time on the build process.

Bringing the Doodle Logic to AR

To evolve the Doodle app to AR, you need to do two things:

  • Detect an AR Surface for doodling.
  • Add the AR raycasting logic to DoodlePen.

Detecting the AR Surface

First, detect the AR surface:

  1. Open the Assets/RW/Scenes/ARScene scene.
  2. Select ARSessionOrigin in the Hierarchy, click Add Component in the Inspector and find and select AR Plane Manager.
  3. Right-click an empty space in the Hierarchy and select XR ▸ AR Default Plane to create an AR Plane object. AR Plane Manager spawns a copy of this object whenever it finds a surface in AR.
  4. Select ARSessionOrigin in the Hierarchy and assign the AR Plane object to the AR Plane Manager’s Plane Prefab field in the Inspector.

Build and run. When you’re ready, scan a wall to find the AR surface.
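You can also inspect the detected planes from code. The snippet below is a hypothetical debugging loop over the manager’s trackables collection, not part of the tutorial steps:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlaneLogger : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;

    void Update()
    {
        // trackables holds every ARPlane the manager currently knows about.
        foreach (ARPlane plane in planeManager.trackables)
        {
            Debug.Log($"plane {plane.trackableId}, size: {plane.size}");
        }
    }
}
```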

Plane Tracking Demo

You can now add the DoodlePen’s raycasting logic by modifying the DoodlePen code and component.

Adding the AR Raycasting Logic

In Assets/RW/Scripts, double-click DoodlePen.cs to launch the file in the code editor.

First, add two using statements above the class definition:

using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

Then, add the following variables inside the class definition:

public ARRaycastManager raycastManager;
public bool arMode = false;

raycastManager is used to reference ARRaycastManager in the scene, and arMode is the flag to toggle between AR and non-AR mode.

Add the following method:

bool GetArRaycastLogic(out Vector3 hitPosition)
{
    // 1
    var hits = new List<ARRaycastHit>();

    // 2
    bool hasHit = raycastManager.Raycast(Input.mousePosition, hits, TrackableType.PlaneWithinInfinity);

    // 3
    if (hasHit == false || hits.Count == 0)
    {
        hitPosition = Vector3.zero;
        return false;
    }

    hitPosition = hits[0].pose.position;
    return true;
}

Here’s how the code breaks down:

  1. Define the output variable for storing the hit information.
  2. Use the ARRaycastManager to find out where the AR surface is touched. Input.mousePosition is the position where users touched the screen, and hits contains the position where the ray is hitting the AR surface.
  3. Return true along with the hit position if the ray successfully hit a surface; otherwise, set a default position and return false to the caller.
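On a device, you may prefer the touch API over Input.mousePosition. A hypothetical variant of the same raycast, not required by the tutorial, might look like this:

```csharp
// Hypothetical variant of GetArRaycastLogic using the touch API.
bool GetArRaycastLogicTouch(out Vector3 hitPosition)
{
    hitPosition = Vector3.zero;
    if (Input.touchCount == 0)
        return false;

    var hits = new List<ARRaycastHit>();
    Vector2 touchPosition = Input.GetTouch(0).position;

    if (!raycastManager.Raycast(touchPosition, hits, TrackableType.PlaneWithinInfinity) || hits.Count == 0)
        return false;

    hitPosition = hits[0].pose.position;
    return true;
}
```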

Lastly, modify the content of SetupRaycastLogic():

if (arMode)
{
    doodleLine.raycastDelegate = GetArRaycastLogic;
}
else
{
    doodleLine.raycastDelegate = GetNonArRaycastLogic;
}

This lets the DoodleLine toggle between AR and non-AR raycasting logic depending on the arMode variable.

Save the code and return to Unity.

Select ARSessionOrigin in the Hierarchy and add the AR Raycast Manager component from the Inspector.

Then, go to Assets/RW/Prefabs and drag DoodlePen into the ARSessionOrigin object in the Hierarchy. Select DoodlePen in the Hierarchy and set the following properties in the Inspector:

  • Main Camera: AR Session Origin ▸ AR Camera
  • Raycast Manager: AR Session Origin
  • Ar Mode: check to enable

Build and run to test the AR doodling.

AR Doodle Demo

Building the User Interface for AR

Now you need to create the UI to control the color and size of the lines, along with a clear button to delete the drawn doodles.

You’ll also build a coaching dialog telling users to find a surface in the AR environment.

First, double-click DoodleUI.cs in Assets/RW/Scripts and add the following code above the class definition:

using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

Then, add the following variables inside the class definition:

public ARPlaneManager planeManager;
public bool arMode;

Replace the code after SetPenDrawingBound() in Start() with the code below:

if (arMode)
{
    // 1
    coachingUI.SetActive(true);
    // 2
    planeManager.planesChanged += PlanesChanged;
}
else
{
    // 3
    doodleUI.SetActive(true);
}

Here’s how the code breaks down:

  1. If using AR mode, show the coaching dialog.
  2. Add the listener for the planesChanged event.
  3. Show the Doodle UI if not using AR mode.

Now, add the PlanesChanged method:

private void PlanesChanged(ARPlanesChangedEventArgs planeEvent)
{
    if (planeEvent.added.Count > 0 || planeEvent.updated.Count > 0)
    {
        coachingUI.SetActive(false);
        doodleUI.SetActive(true);
        planeManager.planesChanged -= PlanesChanged;
    }
}

planeEvent.added.Count and planeEvent.updated.Count are examined to determine whether a surface has been found. If either count is greater than zero, a surface has been found and the app can hide the coaching dialog and present the DoodleUI.

Save and return to the editor.

You still need to add and configure the DoodleUI.

Drag DoodleUI from Assets/RW/Prefabs to Canvas in the Hierarchy. Then, configure the properties of the DoodleUI script in the Inspector:

  • Pen: DoodlePen
  • Coaching UI: ScanSurfacePanel
  • Doodle UI: DoodlePanel
  • Plane Manager: AR Session Origin
  • Ar Mode: check

Build and run.

AR GUI demo

Now you have more control over the style of the doodle lines.

Improving Visuals With Post Processing

Although the AR doodling is ready, the visuals aren’t super appealing. To improve this, you’ll apply some post-processing effects:

  • Vignette: This effect darkens the edges of the screen.
  • Bloom: This effect makes doodle lines glow.

To set these up, do the following:

  • Right-click an empty area in the Hierarchy and select Volume ▸ Global Volume, which handles the global post-processing effects.
  • Select Global Volume and click New at Profile in the Inspector.
  • Click Add Override in the Volume component and select Post Processing ▸ Bloom. Set the Bloom’s Threshold value to 0.4, the Intensity to 2.5, and the Scatter to 0.7.
  • Click Add Override again in the Volume component and select Post Processing ▸ Vignette this time. Set the Intensity value to 0.5, the Smoothness to 0.5, and other values as the default.
  • Select AR Camera in AR Session Origin, and check Rendering ▸ Post Processing to enable in the Inspector.
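If you’d rather configure the same overrides from code, the sketch below builds an equivalent profile at runtime. It assumes the URP post-processing types are available and isn’t required by the tutorial:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class PostProcessingSetup : MonoBehaviour
{
    void Start()
    {
        var volume = gameObject.AddComponent<Volume>();
        volume.isGlobal = true;

        var profile = ScriptableObject.CreateInstance<VolumeProfile>();

        // Same values as the Inspector setup described in the steps above.
        var bloom = profile.Add<Bloom>();
        bloom.threshold.Override(0.4f);
        bloom.intensity.Override(2.5f);
        bloom.scatter.Override(0.7f);

        var vignette = profile.Add<Vignette>();
        vignette.intensity.Override(0.5f);
        vignette.smoothness.Override(0.5f);

        volume.profile = profile;
    }
}
```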

Build and run.

Post Processing demo

Where to Go From Here?

You can download the final project by clicking the Download Materials button at the top or bottom of the tutorial.

You’ve learned how to build an AR experience with the tools provided by AR Foundation.

But the journey doesn’t stop here! There are tons of awesome AR Foundation features for you to explore — features like face tracking, image tracking and even object tracking. You’ll find more details in the official documentation.

We hope you enjoyed this tutorial. If you have any questions or comments, please join the forum discussion below!