
Using ARKit with Swift to build AR applications in iOS

Augmented Reality (AR) has become an exciting technology that allows developers to create immersive experiences by overlaying virtual objects onto the real world. ARKit, Apple's framework for building AR applications, provides powerful tools and features to integrate AR into iOS apps using the Swift programming language.

In this blog post, we will explore how to use ARKit with Swift to create an AR application step by step.


Prerequisites

Before we dive into coding, make sure you have the following prerequisites:

  • A Mac running macOS 10.13.2 or later.

  • Xcode 9.0 or later.

  • An iOS device with an A9 or later processor, running iOS 11.0 or later.

  • Basic knowledge of the Swift programming language and iOS app development.

Setting Up ARKit

To get started, let's create a new iOS project in Xcode and configure it for ARKit. Follow these steps:

  1. Open Xcode and click on "Create a new Xcode project."

  2. Choose the "Augmented Reality App" template under the "App" category.

  3. Enter the product name, organization identifier, and select Swift as the language.

  4. Choose a location to save your project and click "Create."

Exploring the Project Structure

Once the project is created, let's take a quick look at the project structure:

  • AppDelegate.swift: The entry point of the application.

  • ViewController.swift: The default view controller for the ARKit app.

  • Main.storyboard: The user interface layout for the app.

  • Assets.xcassets: The asset catalog where you can add images and other resources.

  • Info.plist: The property list file that contains the configuration settings for the app.

Understanding the View Controller

The ViewController.swift file is the main view controller for our ARKit app. Open the file and let's explore its structure:

import UIKit
import ARKit

class ViewController: UIViewController {

    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Set the view's delegate
        sceneView.delegate = self
        // Create a new scene
        let scene = SCNScene()
        // Set the scene to the view
        sceneView.scene = scene
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Create a session configuration
        let configuration = ARWorldTrackingConfiguration()
        // Detect horizontal surfaces such as tabletops and floors
        configuration.planeDetection = .horizontal
        // Run the view's session
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause the view's session
        sceneView.session.pause()
    }
}

extension ViewController: ARSCNViewDelegate {
}

The ViewController class inherits from UIViewController and conforms to the ARSCNViewDelegate protocol. It contains an ARSCNView object named sceneView, which is responsible for rendering the AR scene.

In the viewDidLoad() method, we set the sceneView delegate to self and create a new SCNScene object. We then assign the created scene to the sceneView.scene property.

In the viewWillAppear() method, we create an ARWorldTrackingConfiguration object, the primary configuration for world-tracking AR experiences, and enable horizontal plane detection so ARKit can find flat surfaces. We then run the AR session by calling sceneView.session.run(_:) with the created configuration.

Finally, in the viewWillDisappear() method, we pause the AR session by calling sceneView.session.pause().
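While testing, it helps to see what ARKit is actually tracking. As an optional addition (not part of the template), you can turn on SceneKit's debug overlays in viewDidLoad():

```swift
// Optional: visualize detected feature points and the world origin while debugging.
// Add these lines to viewDidLoad() after the scene is assigned.
sceneView.debugOptions = [ARSCNDebugOptions.showFeaturePoints,
                          ARSCNDebugOptions.showWorldOrigin]

// Display rendering statistics (frame rate, node count) at the bottom of the view.
sceneView.showsStatistics = true
```

Remember to remove these overlays before shipping; they are purely diagnostic.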

Adding 3D Objects to the Scene

To add 3D objects to the AR scene, we need to implement the ARSCNViewDelegate methods. Modify the extension block in ViewController.swift as follows:

extension ViewController: ARSCNViewDelegate {

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        // Check if the added anchor is an ARPlaneAnchor
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        // Create a new plane node with the anchor's dimensions
        let planeNode = createPlaneNode(with: planeAnchor)
        // Add the plane node to the scene
        node.addChildNode(planeNode)
    }

    private func createPlaneNode(with anchor: ARPlaneAnchor) -> SCNNode {
        // Create a plane geometry with the anchor's dimensions
        let planeGeometry = SCNPlane(width: CGFloat(anchor.extent.x), height: CGFloat(anchor.extent.z))
        // Set the plane's color to semi-transparent blue
        planeGeometry.materials.first?.diffuse.contents = UIColor.blue.withAlphaComponent(0.5)
        // Create a plane node with the geometry
        let planeNode = SCNNode(geometry: planeGeometry)
        // Position the plane node at the anchor's center
        planeNode.position = SCNVector3(anchor.center.x, 0, anchor.center.z)
        // Rotate the plane node to lie flat on the detected surface
        planeNode.eulerAngles.x = -.pi / 2
        return planeNode
    }
}

In the renderer(_:didAdd:for:) method, we check if the added anchor is an ARPlaneAnchor. If it is, we call the createPlaneNode(with:) method to create a plane node and add it to the scene.

The createPlaneNode(with:) method takes an ARPlaneAnchor as input and creates an SCNPlane geometry with the anchor's dimensions. We set the plane's color to blue with 50% transparency. Then, we create an SCNNode with the plane geometry, position it at the anchor's center, and rotate it to match the anchor's orientation. Finally, we return the plane node.
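ARKit continually refines its estimate of a detected plane, so an anchor's extent and center change over time. As a sketch building on the code above (assuming the plane node is the anchor node's first child, as in our renderer(_:didAdd:for:) implementation), you can implement renderer(_:didUpdate:for:) to keep the overlay in sync:

```swift
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    // Only handle plane anchors whose child plane node we added earlier
    guard let planeAnchor = anchor as? ARPlaneAnchor,
          let planeNode = node.childNodes.first,
          let planeGeometry = planeNode.geometry as? SCNPlane else { return }

    // Resize the plane geometry to match the anchor's refined extent
    planeGeometry.width = CGFloat(planeAnchor.extent.x)
    planeGeometry.height = CGFloat(planeAnchor.extent.z)

    // Re-center the plane node on the updated anchor center
    planeNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)
}
```

Without this method, the first detected patch of a surface stays frozen at its initial size even as ARKit maps more of the tabletop or floor.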

Running the AR App

Now that we have implemented the basic setup and added functionality to display plane nodes, let's run the AR app on a compatible iOS device. Follow these steps:

  1. Connect your iOS device to your Mac.

  2. Select your iOS device as the build destination in Xcode.

  3. Click the "Play" button or press Command+R to build and run the app on your device.

Once the app launches, point the camera at a flat surface, such as a tabletop or floor. As ARKit detects the surface, the app will display a blue, semi-transparent plane overlay on it.


Conclusion

In this blog post, we learned how to use ARKit with Swift to create an AR application in iOS. We explored the project structure, understood the view controller, and added 3D plane nodes to the scene using the ARSCNViewDelegate methods.

This is just the beginning of what you can achieve with ARKit. You can further enhance your AR app by adding custom 3D models, interactive gestures, and more.
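For example, one common next step is to let the user tap a detected plane to place an object. Here is a minimal sketch; the helper names (addTapGesture, handleTap) and the red box are our own additions, not part of the template:

```swift
// Call this once, e.g. at the end of viewDidLoad().
func addTapGesture() {
    let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
    sceneView.addGestureRecognizer(tap)
}

@objc func handleTap(_ gesture: UITapGestureRecognizer) {
    let location = gesture.location(in: sceneView)
    // Hit-test against detected planes, limited to their visible extent
    guard let result = sceneView.hitTest(location, types: .existingPlaneUsingExtent).first else { return }

    // Create a small red box to place at the tapped point
    let box = SCNBox(width: 0.05, height: 0.05, length: 0.05, chamferRadius: 0.005)
    box.firstMaterial?.diffuse.contents = UIColor.red
    let boxNode = SCNNode(geometry: box)

    // Position the box on the plane, lifted by half its height so it sits on top
    let t = result.worldTransform
    boxNode.position = SCNVector3(t.columns.3.x, t.columns.3.y + 0.025, t.columns.3.z)
    sceneView.scene.rootNode.addChildNode(boxNode)
}
```

On newer iOS versions Apple recommends the raycasting API over hitTest(_:types:), but the above matches this tutorial's iOS 11 baseline.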

Have fun exploring the possibilities of AR with Swift and ARKit!

Happy coding!


This blog is from Finotes Team. Finotes is a lightweight mobile APM and bug detection tool for iOS and Android apps.

