Tastes Like Burning: An Example of ARKit and iOS Particle Systems

Derek Andre

We have reached a peak in computer science: I can make fire come out of my face. With an iPhone X, Apple has made it simple to track a user’s face and add special effects with a particle system file.

In this post, I will demonstrate how to “breathe fire” using Xcode 9.4.1, Swift 4.1.2, and iOS 11.4.1 on my iPhone X. For this tutorial, you will need a physical device with a TrueDepth camera. A virtual notch on the simulator will not cut it.  The completed project is available on GitHub.

File -> New -> Project

A lot of iOS tutorials start off with creating a Single View Application. That can get boring. Luckily, in this article, we are going to create an Augmented Reality Application. After you create your new project, you will have…cough…a Single View Application…with ARKit installed! That is different, right?

Before starting, remember to select SceneKit in the Content Technology dropdown when you are creating your new Augmented Reality Application project. This framework is like a “mini 3D animation suite” where you can manipulate 3D assets and make animations.

You can also choose to make a SpriteKit game or use Metal APIs from the Content Technology dropdown, but that will not be covered in this article.

One thing that is different is that the main storyboard has a control called ARSCNView. This will display the feed from your camera and keep track of how object coordinates change as you move your phone.

When you look at the structure in the Project Navigator, you will also notice something called art.scnassets. This is where you can import 3D models and images to use for textures. You can also erase the default content in art.scnassets. I will not be adding any 3D models or textures to my project, so we will leave art.scnassets empty.

MVCish

Next, we are going to override some of the functions in the UIViewController class that are listed below. You will want to copy the code into your ViewController class.

ViewController Class

import UIKit
import ARKit

class ViewController: UIViewController {

    // The ARSCNView from the main storyboard that displays the camera feed.
    @IBOutlet var sceneView: ARSCNView!

    // Wraps the ARSCNViewDelegate implementation and the session lifecycle.
    var arSceneView = ARSceneView()

    override func viewDidLoad() {
        super.viewDidLoad()

        sceneView.delegate = arSceneView
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        arSceneView.run(sceneView)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)

        arSceneView.pause(sceneView)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        // Keep the screen from sleeping while the AR session is running.
        UIApplication.shared.isIdleTimerDisabled = true
    }
}

The ARSCNView class has a delegate that conforms to the ARSCNViewDelegate protocol. I am going to encapsulate the implementation of that protocol in the ARSceneView class. The run and pause functions are the main interface for that class. We see the implementation of these functions and the rest of the ARSceneView class next.

Let’s Get Ob-SCN

Create a new Swift file in your project and call it ARSceneView. This class will inherit from NSObject, because ARSCNViewDelegate inherits from ARSessionObserver, which inherits from NSObjectProtocol. You can implement all of the methods for NSObjectProtocol, or inherit from NSObject.

ARSceneView Class

import SceneKit
import ARKit

class ARSceneView: NSObject, ARSCNViewDelegate {
    // Serial queue that keeps scene-graph work off the main thread.
    private let faceQueue = DispatchQueue(label: "com.derekandre.Tastes-Like-Burning.faceQueue")

    // The invisible node positioned over the user's mouth.
    private var mouth: Mouth?

    // The particle system loaded from Fire.scnp.
    private let fire = SCNParticleSystem(named: "Fire.scnp", inDirectory: nil)!

    private var isMouthBurning = false

    func run(_ sceneView: ARSCNView) {
        guard ARFaceTrackingConfiguration.isSupported else { return }

        let configuration = ARFaceTrackingConfiguration()
        configuration.isLightEstimationEnabled = true

        sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    }

    func pause(_ sceneView: ARSCNView) {
        sceneView.session.pause()
    }

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        faceQueue.async {
            let mouth = Mouth()
            self.mouth = mouth

            // Attach the mouth node to the face node and nudge it down and
            // forward so it sits roughly over the user's mouth.
            node.addChildNode(mouth)
            mouth.position.y = -0.06
            mouth.position.z = 0.07

            // Emit the fire particles from the mouth sphere's surface.
            self.fire.emitterShape = mouth.geometry
        }
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }

        if let mouth = self.mouth {
            if let jawOpenAmount = faceAnchor.blendShapes[.jawOpen] {
                if jawOpenAmount.floatValue > 0.4 {
                    if !isMouthBurning {
                        isMouthBurning = true
                        mouth.addParticleSystem(fire)
                    }

                    return
                }
            }

            mouth.removeAllParticleSystems()
            isMouthBurning = false
        }
    }
}

If you look at the run function, we start off by checking the isSupported property on the ARFaceTrackingConfiguration class. In this case, it checks for a front-facing TrueDepth camera. Next, we create an instance of ARFaceTrackingConfiguration and set its isLightEstimationEnabled property to true. This lets ARKit estimate the real-world lighting in the camera feed so the rendered content can match it. Then we pass that configuration to the run function of the sceneView‘s session object.
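If you want to do more than silently return on unsupported hardware, one option is to have run report whether face tracking actually started, so the view controller can react (say, by showing an alert). This is a minimal sketch of my own variation, not part of the sample project:

// A sketch: run(_:) returns false on devices without a TrueDepth camera.
func run(_ sceneView: ARSCNView) -> Bool {
    guard ARFaceTrackingConfiguration.isSupported else { return false }

    let configuration = ARFaceTrackingConfiguration()
    configuration.isLightEstimationEnabled = true

    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    return true
}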


The session, or ARSession, object handles the motion, image, and camera data needed to create the illusion of augmented reality. Each ARSCNView has exactly one ARSession.
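Because we enabled light estimation, the session’s current frame carries an ARLightEstimate you could inspect at any point. A small illustrative snippet (not in the project; sceneView is the ARSCNView outlet from the view controller):

if let lightEstimate = sceneView.session.currentFrame?.lightEstimate {
    // Around 1000 lumens represents neutral lighting.
    print("Ambient intensity: \(lightEstimate.ambientIntensity) lumens")
}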

The two renderer functions are used to capture when a SCNNode is added to the SceneKit scene and when properties on that node are updated to match its ARAnchor. SCNNode objects are placeholders for position, orientation, and scale data relative to their parent in a SceneKit scene. They can also contain geometry, particle systems, and child nodes. This means we can attach objects to these nodes, and their position, orientation, and scale will change along with their parent’s.

The corresponding ARAnchor object is also passed to the renderer functions. Anchors are placeholders, kept on your session object, that have a specific position and orientation in the augmented reality space. The ARAnchor that is passed into the renderer functions here is an ARFaceAnchor. This type of anchor has information on facial expressions, pose, and even the topology of the face being tracked by the TrueDepth camera.
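To get a feel for what an ARFaceAnchor exposes, here is a hypothetical helper (not part of the project) you could drop into ARSceneView and call from a renderer function:

// Prints a few pieces of data carried by an ARFaceAnchor.
func inspect(_ faceAnchor: ARFaceAnchor) {
    // The anchor's transform positions the face in world space.
    let position = faceAnchor.transform.columns.3
    print("Face position: \(position.x), \(position.y), \(position.z)")

    // The face geometry describes the topology of the tracked face.
    print("Vertex count: \(faceAnchor.geometry.vertexCount)")

    // Blend shapes describe facial expressions; we use jawOpen later.
    print("Jaw open: \(faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0)")
}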

The first renderer function, the one with the didAdd node argument, contains a DispatchQueue. I am going to add the fire particle system and everything else on a queue other than the main one, so the work does not affect the UI.
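One related note: anything on faceQueue that needs to update UIKit has to hop back to the main queue. A minimal, hypothetical sketch:

faceQueue.async {
    // Scene-graph work can stay on this background queue...
    DispatchQueue.main.async {
        // ...but UIKit updates (labels, alerts, etc.) belong on the main queue.
    }
}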

Big Mouth

Now create a new Swift file and call it Mouth.

Mouth Class

import SceneKit

class Mouth: SCNNode {
    override init() {
        super.init()

        self.geometry = createSphere()
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    // An invisible sphere that the fire particle system will emit from.
    private func createSphere() -> SCNSphere {
        let sphere = SCNSphere(radius: 0.03)

        let sphereMaterial = sphere.firstMaterial!
        sphereMaterial.lightingModel = SCNMaterial.LightingModel.constant
        sphereMaterial.diffuse.contents = UIColor.clear

        return sphere
    }
}

The Mouth class is a subclass of SCNNode. It is added as a child of the face node that is passed into the renderer function. Inside the Mouth class, a sphere is created and set as the node’s geometry. This is the 3D surface that the fire particle system will emit from. I also gave it a clear material so the user cannot see it.
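While positioning the node, it can help to see the sphere. A debugging variant of createSphere() (an assumption on my part, not part of the finished project) renders it semi-transparently before you switch back to UIColor.clear:

private func createDebugSphere() -> SCNSphere {
    let sphere = SCNSphere(radius: 0.03)

    let sphereMaterial = sphere.firstMaterial!
    sphereMaterial.lightingModel = SCNMaterial.LightingModel.constant
    // Semi-transparent red instead of clear, so the sphere is visible while debugging.
    sphereMaterial.diffuse.contents = UIColor.red.withAlphaComponent(0.3)

    return sphere
}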

Firestarter

It is time to get “arty.” Not “farty,” but “arty.”


Create a new group in your project navigator called “Fire.” Right-click on the “Fire” group and create a new file. Select “SceneKit Particle System File” and choose the “Fire” template. Call it “Fire.” Hot!

This will create a SceneKit Particle System file (.scnp) and add a PNG file that defines the shape of each emitted particle.

Select the Fire.scnp file. The particle system editor will appear and you should see…well…fire. There are a lot of settings for particle systems which you can find in the developer documentation on particle systems. You can also copy my scnp file from GitHub, or clone my repository and check out the settings in Xcode.

One thing to keep in mind is that the positive z-axis comes out of the mouth, toward the camera. When you create your particle system, make sure its direction points along the positive z-axis.
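If you prefer configuring a particle system in code rather than in the .scnp editor, a few of the relevant SCNParticleSystem properties look like this. The values are illustrative guesses, not the exact settings from my Fire.scnp:

let codeConfiguredFire = SCNParticleSystem()
codeConfiguredFire.emittingDirection = SCNVector3(0, 0, 1) // out of the mouth, toward the camera
codeConfiguredFire.spreadingAngle = 10
codeConfiguredFire.birthRate = 300
codeConfiguredFire.particleLifeSpan = 1.5
codeConfiguredFire.particleVelocity = 0.2
codeConfiguredFire.particleSize = 0.05
codeConfiguredFire.particleColor = .orange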

Back in the ARSceneView class, we keep a reference to the particle system in the fire property. The SCNParticleSystem(named:inDirectory:) initializer loads the Fire.scnp file from your project tree.

In the didAdd renderer function, we set the fire particle system’s emitter shape to the mouth’s geometry, which is the sphere positioned over the user’s mouth. Now the particle system will emit from the sphere’s surface.

Open Wide

The didUpdate renderer function is called when the parent SCNNode is updated to match its ARAnchor. You want to cast the anchor to an ARFaceAnchor so you can access its blend shapes dictionary.

The blend shapes dictionary is keyed by facial features and describes how much each one is deforming, like jawOpen. Each value is a Float from 0 to 1; basically, 0% to 100%. Here is the documentation for the blend shapes dictionary. This page also links to the list of blend shape locations.
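An exploratory sketch (not in the project): dump every blend shape and its current coefficient while a face is tracked, to get a feel for the data.

func logBlendShapes(for faceAnchor: ARFaceAnchor) {
    for (location, coefficient) in faceAnchor.blendShapes {
        print("\(location.rawValue): \(coefficient.floatValue)")
    }
}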

We are going to use the jawOpen location to check whether the mouth is open. If the jawOpen value is greater than 0.4, we add our fire particle system to the mouth with the addParticleSystem function.

This adds the particle system into your scene and attaches it to the mouth SCNNode. If the mouth closes, we remove the particle system from the scene.

Your Face is Burning, Bro

You now know how to breathe fire! Remember, you need a physical device with a TrueDepth camera to test this application. When you build and run the application you should see your face in the ARSCNView control.

Then, when you open your mouth, fire should shoot out of it. Fancy!
