Nowadays, new technologies expand the fields of application of Unity, from gaming to all kinds of software, such as simulations, training, apps, and so on. In the latest versions of Unity, we saw lots of improvements in the field of augmented reality (AR), which allows us to add a layer of virtuality on top of our reality, thereby augmenting what our device can perceive to create games that rely on real-world data, such as the camera's image, our real-world position, and the current weather. This can also be applied to work environments, such as when viewing the building map or checking the electrical ducts inside a wall. Welcome to the extra section of this book, where we are going to discuss how to create AR applications using Unity's AR Foundation package.
In this chapter, we will examine the following AR Foundation concepts:
- Using AR Foundation
- Building for mobile devices
- Creating a simple AR game
By the end of this chapter, you will be able to create AR apps using AR Foundation, and will have a fully functional game that uses its framework so that you can test the framework's capabilities.
Let's start by exploring the AR Foundation framework.
When it comes to AR, Unity has two main tools to create applications: Vuforia and AR Foundation. Vuforia is an AR framework that can work on almost any phone and contains all the needed features for basic AR apps, but with a paid subscription for more advanced features. On the other hand, the completely free AR Foundation framework supports the latest AR native features of our devices but is supported only by newer devices. Picking between one or the other depends a lot on the type of project you're going to build and the target audience. However, since this book aims to discuss the latest Unity features, we are going to explore how to use AR Foundation to create our first AR app for detecting the positions of images and surfaces in the real world. So, we'll start by exploring its API.
In this section, we will examine the following AR Foundation concepts:
- Creating an AR Foundation project
- Using tracking features
Let's start by discussing how to prepare our project so that it can run AR Foundation apps.
Something to consider when creating AR projects is that we will not only change the way we code our game, but also the game design aspect. AR apps have differences, especially in the way the user interacts, and also limitations, such as the user being in control of the camera all the time. We cannot simply port an existing game to AR without changing the very core experience of the game. That's why, in this chapter, we are going to work on a brand new project; it would be too difficult to change the game we've created so far so that it works well in AR.
In our case, we are going to create a game where the user controls a player by moving a "marker", a physical image you can print that will allow our app to recognize where the player is in the real world. We will be able to move the player by moving that image, and this virtual player will automatically shoot at the nearest Enemy. Those enemies will spawn from certain spawn points that the user will need to place in different parts of the home. As an example, we can put two spawn points on the walls and place our player marker on a table in the middle of the room so that the enemies will move toward it. In the following image, you can see a preview of what the game will look like:
We'll start by creating a new URP-based project in the same manner we created our first game. Something to consider is that AR Foundation works with other pipelines, including the built-in one, in case you want to use it in already existing projects. If you don't remember how to create a project, please refer to Chapter 2, Setting Up Unity. Once you're in your new blank project, install the AR Foundation package from the Package Manager, just like we've installed other packages previously; that is, from Window | Package Manager. Remember to set the Package Manager so that it shows all packages, not only the ones in the project (the Packages button at the top-left part of the window needs to be set to Unity Registry). At the time of writing this book, the latest version is 4.0.2. Remember that you can display other version options by clicking the triangle to the left of the package in the list and then clicking the See other versions button. If you find a newer version than mine, you can try using that one, but as usual, if something works differently to what we want, please install this specific version:
Before we install any other needed packages, now is a good moment to discuss some core ideas of the AR Foundation framework. This package, by itself, does nothing; it defines a series of AR features that mobile devices offer, such as image tracking, point clouds, and object tracking, but the actual implementation of how to do that is contained in the Provider packages, such as the ARKit and ARCore XR plugins. It is designed like this because the way those features are implemented changes depending on the target device you want to work with. As an example, on iOS, Unity implements those features using ARKit, while on Android, it uses ARCore; they are platform-specific frameworks.
Something to consider here is that not all iOS or Android devices support AR Foundation apps. You might find an updated list of supported devices when searching for AR Core and AR Kit supported devices on the internet. At the time of writing, the following links provide the supported devices lists:
- iOS: https://www.apple.com/lae/ios/augmented-reality/ (at the bottom of the page)
- Android: https://developers.google.com/ar/discover/supported-devices
Also, there isn't a PC Provider package, so the only way to test AR Foundation apps so far is directly on the device, but testing tools are going to be released soon. In my case, I will be creating an app for iOS, so aside from the AR Foundation package, I need to install the ARKit XR plugin. However, if you want to develop for Android, install the ARCore XR plugin instead (or both if you're targeting both platforms). I will be using the 4.0.2 version of the ARKit package, but at the moment of writing this book, the ARCore recommended version is 4.0.4. Usually, the versions of the AR Foundation and Provider packages match, but apply the same logic as when you picked the AR Foundation version. In the following screenshot, you can see the ARKit package in the Package Manager:
Now that we have the needed plugins, we need to prepare a scene for AR, as follows:
- Create a new Scene in File | New Scene.
- Delete Main Camera; we are going to use a different one.
- In the GameObject | XR menu, create an AR Session Object.
- In the same menu, create an AR Session Origin Object that has a Camera inside it:
- Your hierarchy should look as follows:
The AR Session object is responsible for initializing the AR framework and handles the update logic for all the AR systems. The AR Session Origin object allows the framework to locate tracked objects, such as images and point clouds, at positions relative to the scene. The device reports the positions of tracked objects relative to what it considers "the origin", which is usually the first area of your house you were pointing at when the app started detecting objects, so the AR Session Origin object represents that area. Finally, you can check the camera inside the origin, which contains some extra components, the most important being AR Pose Driver, which makes your Camera object move along with your device. Since the device's position is relative to the Session Origin's point, the camera needs to be inside the origin object.
One extra step in case you are working in a URP project (our case) is that you need to set up the render pipeline so that it supports rendering the camera image in the app. To do that, go to the Settings folder that was generated when we created the project, look for the Forward Renderer file, and select it. In the Renderer Features list, click the Add renderer feature button and select AR Background Renderer Feature. Consider that this option might be unavailable if you are working on versions older than 4.0.0 of the AR Foundation and Provider plugins. In the following screenshot, you can see what the Forward Renderer asset should look like:
And that's all! We are ready to start exploring the AR Foundation components so that we can implement tracking features.
For our project, we are going to need two of the most common tracking features in AR (but not the only ones): image recognition and plane detection. The first one consists of detecting the real-world position of a specific image so that we can place digital objects on top of it, such as the player. The second one, plane detection, consists of recognizing real-life surfaces, such as floors, tables, and walls, so that we have a reference of where we can put objects such as the enemies' spawn points. Only horizontal and vertical surfaces are recognized (only horizontal surfaces on some devices).
The first thing we need to do is tell our app which images it needs to detect, as follows:
- Add an image to the project that you can print or display in a cellphone. Having a way to display the image in the real world is necessary to test this. In this case, I will use the following image:
Try to get an image that contains as many features as you can. This means an image with lots of little details, such as contrasts, sharp corners, and so on. Those are what our AR systems use to detect it; the more detail, the better the recognition. In our case, the Unity logo we are using doesn't actually have too many details, but there's enough contrast (just black and white) and sharp corners for the system to recognize it. If your device has trouble detecting it, try other images (the classic QR code might help).
Consider that some devices might have trouble with certain images, such as the image suggested in this book. If this generates issues when testing, please try using another one. You will be testing this on your device in the upcoming sections of this chapter, so just keep this in mind.
- Create a Reference Image Library, an asset containing all the images we wish our app to recognize, by clicking the + button in Project Panel and selecting XR | Reference Image Library:
- Select the library asset and click the Add Image button to add a new image to the library.
- Drag the texture to the texture slot (the one that says None).
- Turn Specify Size on and set Physical Size to the size that your image will be in real life, in meters. Try to be accurate here; on some devices, not having this value right might result in the image not being tracked:
Now that we've specified the images to be detected, let's test this by placing a cube on top of the real-life image:
- Create a prefab of a cube and add the AR Tracked Image component to it.
- Add the AR Tracked Image Manager component to the AR Session Origin object. This will be responsible for detecting images and instantiating objects at their positions.
- Drag the Image Library asset to the Serialized Library property of the component to specify the images to recognize.
- Drag the Cube prefab to the Tracked Image Prefab property of the component:
And that's all! We will see a cube spawning in the same position the image is located at in the real world. Remember that you need to test this on the device, which we will do in the next section, so for now, let's keep coding our test app:
Let's also prepare our app so that it can detect and display the plane surfaces the camera has recognized. This is simply done by adding the AR Plane Manager component to the AR Session Origin object:
This component will detect surface planes over our house as we move the camera over it. It can take a while to detect them, so it's important to visualize the detected areas to get feedback about this to ensure it's working properly. We can manually get information about the plane from a component reference to the AR Plane Manager, but luckily, Unity allows us to visualize planes easily. Let's take a look:
- Create a prefab of a plane, first by creating the plane in GameObject | 3D Object | Plane.
- Add a Line Renderer to it. This will allow us to draw a line over the edges of the detected areas.
- Set the Width property of Line Renderer to a small value such as 0.01, the Color property to black, and uncheck Use World Space:
- Remember to create a material for Line Renderer with the proper shader and set it as the material of the renderer:
- Also, create a transparent material and use it in the plane's MeshRenderer. We want to see through it so that we can easily see the real surface beneath:
- Add the AR Plane and AR Plane Mesh Visualizer components to the Plane prefab.
- Drag the prefab to the Plane Prefab property of the AR Plane Manager component of the AR Session Origin object:
Now, we have a way to see the planes, but seeing them is not the only thing we can do (sometimes, we don't even want them to be visible). The real power of planes resides in placing virtual objects on top of real-life surfaces by tapping a specific plane area and getting its real-life position. We can access the plane data using the AR Plane Manager or by accessing the AR Plane component of our visualization planes, but something easier is to use the AR Raycast Manager component.
The AR Raycast Manager component provides us with the equivalent to the Physics.Raycast function of the Unity Physics system, which, as you may recall, is used to create imaginary rays that start from one position and go toward a specified direction in order to make them hit surfaces and detect the exact hit point. The version provided by AR Raycast Manager, instead of colliding with Physics Colliders, collides with tracked objects, mostly Point Clouds (we are not using them) and the "Planes" we are tracking. We can test this feature by following these steps:
- Add the AR Raycast Manager component to the AR Session Origin object.
- Create a custom script called InstanceOnPlane in the AR Session Origin object.
- In Awake, cache a reference to ARRaycastManager. You will need to add the using UnityEngine.XR.ARFoundation; line at the top of the script for this class to be usable in our script.
- Create a private field of the List<ARRaycastHit> type and instantiate it; the Raycast is going to detect every plane our ray hits, not just the first one:
- In Update, check whether the Left Mouse Button (KeyCode.Mouse0) is being pressed. In AR apps, the mouse is emulated with the device's touch screen (you can also use the Input.touches array for multi-touch support).
- Inside the if statement, add another condition that calls the Raycast function of AR Raycast Manager, passing the position of the mouse as the first parameter and the list of hits as the second.
- This will cast a ray in the direction the player touched the screen and store the hits inside the list we provided. It will return true if something has been hit, and false if not:
- Add a public field to specify the prefab to instantiate in the place we touched. You can just create a Sphere prefab to test this; there's no need to add any special component to the prefab here.
- Instantiate the prefab in the Position and Rotation fields of the Pose property of the first hit stored in the list. The hits are sorted by distance, so the first hit is the closest one. Your final script should look as follows:
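Putting those steps together, a minimal sketch of the InstanceOnPlane script could look like the following. The prefab field name and the TrackableType.PlaneWithinPolygon filter are assumptions; adjust them to your needs:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class InstanceOnPlane : MonoBehaviour
{
    // The prefab (for example, a simple Sphere) to place on tapped planes
    [SerializeField] GameObject prefab;

    ARRaycastManager raycastManager;
    List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake()
    {
        // Cache the reference; the AR Raycast Manager lives on the same
        // AR Session Origin object as this script
        raycastManager = GetComponent<ARRaycastManager>();
    }

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Mouse0) &&
            raycastManager.Raycast(Input.mousePosition, hits,
                TrackableType.PlaneWithinPolygon))
        {
            // Hits are sorted by distance, so the first one is the closest
            Pose pose = hits[0].pose;
            Instantiate(prefab, pose.position, pose.rotation);
        }
    }
}
```

Note that this script relies on GetComponent finding the AR Raycast Manager, so both must be attached to the AR Session Origin object.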
In this section, we learned how to create a new AR project using AR Foundation. We discussed how to install and set up the framework, as well as how to detect real-life image positions and surfaces, and then how to place objects on top of them.
As you may have noticed, we never hit play to test this, and sadly at the time of writing this book, we cannot test this in the Editor. Instead, we need to test this directly on the device. Due to this, in the next section, we are going to learn how to do builds for mobile devices such as Android and iOS.
Unity is a very powerful tool that solves the most common problems in game development very easily, and one of them is building the game for several target platforms. Now, the Unity part of building our project for such devices is easy to do, but each device has its non-Unity-related nuances for installing development builds. In order to test our AR app, we need to test it directly in the device. So, let's explore how we can make our app run on Android and iOS, the most common mobile platforms.
Before diving into this topic, it is worth mentioning that the following procedures change a lot over time, so you will need to find the latest instructions on the internet. The Unity Learn portal site (https://learn.unity.com/tutorial/building-for-mobile) may be a good alternative in case the instructions in this book fail, but try the steps here first.
In this section, we will examine the following mobile building concepts:
- Building for Android
- Building for iOS
Let's start by discussing how to build our app so that it runs on Android phones.
Creating Android builds is relatively easy compared to other platforms, so we'll start with Android. Remember that you will need an Android device capable of running AR Foundation apps, so please refer to the link regarding Android supported devices we mentioned in the first section of this chapter. The first thing we need to do is check if we have installed Unity's Android support and configured our project to use that platform. To do that, follow these steps:
- Close Unity and open Unity Hub.
- Go to the Installs section and locate the Unity version you are working on.
- Click the three dots button at the top-right corner of the Unity version and click Add Modules:
- Make sure Android Build Support and the sub-options that are displayed when you click the arrow on its left are checked. If not, check them and click the Done button at the bottom-right to install them:
- Open the AR project we created in this chapter.
- Go to Build Settings (File | Build Settings).
- Select the Android platform from the list and click the Switch Platform button at the bottom-right part of the window:
To build an app on Android, there are some requirements we need to meet, such as having the Java SDK (not the regular Java runtime) and Android SDK installed, but luckily, the new versions of Unity take care of that. Just to double-check that we have installed the needed dependencies, follow these steps:
- Go to Unity Preferences (Edit | Preferences on Windows, Unity | Preferences on Mac).
- Click External Tools.
- Check that all the options that say …Installed with Unity in the Android section are checked. This means we will be using all the dependencies installed by Unity:
There are some additional Android AR Core-specific related settings to check that you can find at https://developers.google.com/ar/develop/unity-arf/quickstart-android. These can change if you are using newer versions of AR Core. You can apply them by following these steps:
- Go to Player Settings (Edit | Project Settings | Player).
- Uncheck Multithreaded Rendering and Auto Graphics API.
- Remove Vulkan from the Graphics APIs list.
- Set Minimum API Level to Android 7.0:
Now, you can finally build the app from File | Build Settings as usual, by using the Build button. This time, the output will be a single APK file that you can install by copying the file to your device and opening it. Remember that in order to install APKs that weren't downloaded from the Play Store, you need to set your device to allow Install Unknown Apps. The location of that option varies a lot, depending on the Android version and the device you are using, but it is usually located in the Security settings. Some Android versions prompt you to view these settings when installing the APK.
Now, we can copy and install the generated APK build file every time we want to create a build. However, we can let Unity do that for us using the Build and Run button. This option, after building the app, will look for the first Android device connected to your computer via USB and will automatically install the app. For this to work, we need to prepare our device and PC, as follows:
- On your device, find the build number in the Settings section of the device, whose location, again, can change depending on the device. On my device, it is located in the About Phone | Software Information section:
- Tap it a few times until the device says you are now a developer. This procedure enables the hidden developer options on the device, which you can now find in the settings.
- Open the developer options and turn on USB Debugging, which allows your PC to have special permissions on your device. In this case, it allows you to install apps.
- Install the USB drivers from your phone manufacturer's site onto your computer. For example, if you have a Samsung device, search for Samsung USB Driver. Also, if you can't find that, you can look for Android USB Driver to get the generic drivers, but that might not work if your device manufacturer has their own. On Mac, this step is usually not necessary.
- Connect your device (or reconnect it if it's already connected). The option to Allow USB Debugging for your computer will appear on the device. Check Always Allow and click OK:
- Accept the Allow Data prompt that appears.
- If these options don't appear, check that the USB Mode of your device is set to Debugging and not any other.
- In Unity, build with the Build and Run button.
- Please remember to try another image if you have trouble detecting the image where we instantiate the player (the Unity logo, in my case). This might vary a lot, according to your device's capabilities.
And that's all! Now that you have your app running on your device, let's learn how to do the same for the iOS platform.
When developing on iOS, you will need to spend some money. You will need to run Xcode, a piece of software you can only run on OS X. Due to this, you'll need a device that can run it, such as a MacBook, a Mac mini, and so on. There may be ways to run OS X on PCs, but you will need to find this out and try it for yourself. Besides spending on a Mac and on an iOS device (iPhone, iPad, iPod, and so on), you'll need to pay for an Apple Developer account, which costs 99 USD per year, even if you are not planning to release the application on the App Store (there may be alternatives, but, again, you will need to find them).
So, to create an iOS build, you should do the following:
- Get a Mac computer.
- Get an iOS device.
- Create an Apple Developer account (at the time of writing this book, you can create one at https://developer.apple.com/).
- Install Xcode from the App Store onto your Mac.
- Check if you have iOS build support in Unity Install on the Unity Hub. Please refer to the Building on Android section for more information about this step:
- Switch to the iOS platform under Build Settings, selecting iOS and clicking the Switch Platform button:
- Click the Build button in the Build Settings window and wait.
You will notice that the result of the build process is a folder containing an Xcode project. Unity cannot create the build directly, so it generates a project you can open with the Xcode software we mentioned previously. The steps you need to follow to create a build with the Xcode version used in this book (11.4.1) are as follows:
- Double-click the .xcodeproj file inside the generated folder:
- Go to Xcode | Preferences.
- In the Accounts tab, hit the + button at the bottom-left part of the window and log in with the Apple account you registered as an Apple developer:
- Connect your device and select it from the top-left part of the window, which should now say Generic iOS device:
- In the left panel, click the folder icon and then the Unity-iPhone settings to display the project settings.
- From the TARGETS list, select Unity-iPhone and click on the Signing & Capabilities tab.
- In the Team settings, select the option that says Personal Team:
- If you see a Failed to register bundle identifier error, just change the Bundle Identifier setting to another one, always respecting the format (com.XXXX.XXXX), and then click Try Again until the error is solved. Once you find one that works, set it in Unity (Bundle Identifier under Player Settings) to avoid needing to change it in every build.
- Hit the Play button at the top-left part of the window and wait for the build to complete. You might be prompted to enter your password a couple of times in the process, so please do so.
- When the build completes, remember to unlock the device. A prompt will ask you to do that. Note that the process won't continue unless you unlock the phone.
- After completion, you may see an error saying that the app couldn't be launched but that it was installed anyway. If you try to open it, it will say you need to trust the developer of the app, which you can do by going to the settings of your device.
- From there, go to General | Device Management and select the first developer in the list.
- Click the blue Trust … button and then Trust.
- Try to open the app again.
- Please remember to try another image if you're having trouble detecting the image where we instantiate the player (the Unity logo, in my case). This might vary a lot, depending on your device's capabilities.
In this section, we discussed how to build a Unity project that can run on iOS and Android, thus allowing us to create mobile apps, specifically AR mobile apps. Like any build, there are methods we can follow to profile and debug, as we saw when we looked at PC builds, but we are not going to discuss that here. Now that we have created our first test project, we will convert it into a real game by adding some mechanics to it.
As we discussed previously, the idea is to create a simple game where we can move our player while moving a real-life image, and also put in some Enemy Spawners by just tapping where we want them to be, such as a wall, the floor, a table, and so on. Our player will automatically shoot at the nearest Enemy, and the enemies will shoot directly at the player, so our only task will be to move the Player so that they avoid bullets. We are going to implement these game mechanics using scripts very similar to the ones we used in this book's main project.
In this section, we will develop the following AR game features:
- Spawning the Player and Enemies
- Coding the Player and Enemy behavior
First, we are going to discuss how to make our Player and Enemies appear on the app, specifically in real-world positions, and then we will make them move and shoot each other to create the specified gameplay mechanics. Let's start with spawning.
Let's start with the Player, since that's the easiest one to deal with: we will create a prefab with the graphics we want the player to have (in my case, just a cube), a Rigidbody with Is Kinematic checked (the Player will move, but not through physics), and an AR Tracked Image script. We will set that prefab as the Tracked Image Prefab of the AR Tracked Image Manager component in the AR Session Origin object, which will put the Player on the tracked image. Remember to set the size of the Player in terms of real-life sizes. In my case, I scaled the Player to (0.05, 0.05, 0.05); since the original cube is 1 meter in size, my player will be 5x5x5 centimeters. Your Player prefab should look as follows:
- Create a prefab called Spawner with the graphic you want your Spawner to have (in my case, a cylinder) and its real-life size.
- Add a custom script that spawns a prefab every few seconds, such as the one shown in the following screenshot.
- You will notice the usage of Physics.IgnoreCollision to prevent the spawned object from colliding with the Spawner itself, getting the colliders of both objects and passing them to the function. You can also use the Layer Collision Matrix to prevent collisions, just like we did in this book's main project, if you prefer to:
- Create an Enemy prefab with the desired graphic (a Capsule, in my case) and a Rigidbody component with the Is Kinematic checkbox checked. This way, the Enemy will move but not with physics. Remember to consider the real-life size of the Enemy.
- Set the Prefab property of the Spawner so that it spawns our Enemy at your desired time frequency:
- Add a new SpawnerPlacer custom script to the AR Session Origin object that instantiates a prefab in the place the player tapped using the AR Raycast system, as shown in the following screenshot:
- Set the prefab of SpawnerPlacer so that it spawns the Spawner prefab we created earlier.
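As a reference, minimal versions of the Spawner and SpawnerPlacer scripts described in the previous steps could look like the following sketch. The field and method names, and the default values, are assumptions based on those steps:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Spawns a prefab every spawnFrequency seconds at the Spawner's position.
public class Spawner : MonoBehaviour
{
    [SerializeField] GameObject prefab;
    [SerializeField] float spawnFrequency = 2;

    void Start()
    {
        InvokeRepeating(nameof(Spawn), spawnFrequency, spawnFrequency);
    }

    void Spawn()
    {
        GameObject instance = Instantiate(prefab,
            transform.position, transform.rotation);

        // Prevent the spawned object from colliding with the Spawner itself
        Collider myCollider = GetComponent<Collider>();
        Collider otherCollider = instance.GetComponent<Collider>();
        if (myCollider != null && otherCollider != null)
            Physics.IgnoreCollision(myCollider, otherCollider);
    }
}

// Instantiates the Spawner prefab where the user taps on a detected plane;
// attach it to the AR Session Origin, next to the AR Raycast Manager.
public class SpawnerPlacer : MonoBehaviour
{
    [SerializeField] GameObject prefab;

    ARRaycastManager raycastManager;
    List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake()
    {
        raycastManager = GetComponent<ARRaycastManager>();
    }

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Mouse0) &&
            raycastManager.Raycast(Input.mousePosition, hits,
                TrackableType.PlaneWithinPolygon))
        {
            Instantiate(prefab, hits[0].pose.position, hits[0].pose.rotation);
        }
    }
}
```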
And that's all for the first part. If you test the game now, you will be able to tap on the detected planes in the app and see how the Spawner starts creating enemies. You can also look at the target image and see our Cube Player appear.
Now that we have the objects in the scene, let's make them do something more interesting, starting with the Enemies.
The Enemy must move toward the player in order to shoot at them, so it will need to have access to the player position. Since the Enemy is instantiated, we cannot drag the Player reference to the prefab. However, the Player has also been instantiated, so we can add a PlayerManager script to the player that uses the Singleton pattern (as we did with managers). To do that, follow these steps:
- Create a PlayerManager script similar to the one shown in the following screenshot and add it to the Player:
- Now that the Enemy has a reference to the player, let's make them look at the player by adding a LookAtPlayer script, as shown here:
- Also, add a simple MoveForward script like the one shown in the following screenshot to make the Enemy not only look at the player but also move toward them. Since the LookAtPlayer script is making the Enemy face the Player, moving along the local z axis is enough:
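The three scripts described in these steps could be sketched as follows. This is a minimal version: the singleton has no duplicate-instance protection, and the speed value is an assumption:

```csharp
using UnityEngine;

// Singleton so the instantiated enemies can find the
// (also instantiated) Player at runtime.
public class PlayerManager : MonoBehaviour
{
    public static PlayerManager instance;

    void Awake()
    {
        instance = this;
    }
}

// Rotates the object so it always faces the Player.
public class LookAtPlayer : MonoBehaviour
{
    void Update()
    {
        if (PlayerManager.instance != null)
            transform.LookAt(PlayerManager.instance.transform);
    }
}

// Moves the object along its local forward (z) axis.
// Also reused later for the Bullet prefab.
public class MoveForward : MonoBehaviour
{
    [SerializeField] float speed = 0.1f;

    void Update()
    {
        transform.Translate(0, 0, speed * Time.deltaTime);
    }
}
```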
Now, we will take care of the Player movement. Remember that our player is controlled through moving the image, so here, we are actually referring to the rotation, since the player will need to automatically look and shoot at the nearest Enemy. To do this, follow these steps:
- Create an Enemy script and add it to the Enemy prefab.
- Create an EnemyManager script like the one shown in the following screenshot and add it to an empty EnemyManager object in the scene:
- In the Enemy script, make sure to register the object in the all list of EnemyManager, as we did previously with WavesManager in this book's main project:
- Create a LookAtNearestEnemy script like the one shown in the following screenshot and add it to the Player prefab to make it look at the nearest Enemy:
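A minimal sketch of these three scripts might look like this. The all list name comes from the steps above, while the null checks are defensive additions:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Holds the list of all alive enemies, like WavesManager
// in this book's main project.
public class EnemyManager : MonoBehaviour
{
    public static EnemyManager instance;
    public List<Enemy> all = new List<Enemy>();

    void Awake()
    {
        instance = this;
    }
}

// Registers and unregisters the Enemy in the EnemyManager's list.
public class Enemy : MonoBehaviour
{
    void OnEnable()
    {
        if (EnemyManager.instance != null)
            EnemyManager.instance.all.Add(this);
    }

    void OnDisable()
    {
        if (EnemyManager.instance != null)
            EnemyManager.instance.all.Remove(this);
    }
}

// Rotates the Player to face the nearest alive Enemy.
public class LookAtNearestEnemy : MonoBehaviour
{
    void Update()
    {
        Enemy nearest = null;
        float nearestDistance = float.MaxValue;

        foreach (Enemy enemy in EnemyManager.instance.all)
        {
            float distance = Vector3.Distance(
                transform.position, enemy.transform.position);
            if (distance < nearestDistance)
            {
                nearestDistance = distance;
                nearest = enemy;
            }
        }

        if (nearest != null)
            transform.LookAt(nearest.transform);
    }
}
```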
Now that our objects are rotating and moving as expected, the only thing missing is shooting and damaging:
- Create a Life script like the one shown in the following screenshot and add it to both the Player and Enemy prefabs. Remember to set a value for the amount-of-life field. In this version of Life, instead of checking every frame whether the life has reached zero, we do the check inside a Damage function that is executed whenever damage is dealt; the version from this book's main project also works:
- Create a Bullet prefab with the desired graphics, the collider with the Is Trigger checkbox on the collider checked, a Rigidbody component with Is Kinematic checked (a Kinematic Trigger Collider), and the proper real-life size.
- Add the MoveForward script to the Bullet prefab to make it move. Remember to set the speed.
- Add a Spawner script to both the Player and the Enemy prefabs and set the Bullet prefab as the prefab to spawn, as well as the desired spawn frequency.
- Add a Damager script like the one shown in the following screenshot to the Bullet prefab to make bullets inflict damage on the objects it touches. Remember to set the damage:
- Add an AutoDestroy script like the one shown in the following screenshot to the Bullet prefab to make it despawn after a while. Remember to set the Destroy time:
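Minimal versions of the Life, Damager, and AutoDestroy scripts described here could look as follows. Field names, default values, and the decision to destroy the bullet on impact are assumptions:

```csharp
using UnityEngine;

// Stores the object's life; Damage is called by Damager when
// a bullet hits, instead of checking for zero life every frame.
public class Life : MonoBehaviour
{
    [SerializeField] float amount = 100;

    public void Damage(float damage)
    {
        amount -= damage;
        if (amount <= 0)
            Destroy(gameObject);
    }
}

// Applies damage to any object with a Life component that
// the bullet's trigger collider touches.
public class Damager : MonoBehaviour
{
    [SerializeField] float damage = 10;

    void OnTriggerEnter(Collider other)
    {
        Life life = other.GetComponent<Life>();
        if (life != null)
        {
            life.Damage(damage);
            Destroy(gameObject); // the bullet destroys itself on impact
        }
    }
}

// Despawns the object after a configurable amount of time.
public class AutoDestroy : MonoBehaviour
{
    [SerializeField] float destroyTime = 3;

    void Start()
    {
        Destroy(gameObject, destroyTime);
    }
}
```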
And that's all! As you can see, we basically created a new game using almost the same scripts we used in the main game, mostly because we designed them to be generic (and the game genres are almost the same). Of course, this project can be improved a lot, but we have a nice base project upon which to create amazing AR apps.
In this chapter, we introduced the AR Foundation Unity framework, explored how to set it up, and how to implement several tracking features so that we can position virtual objects on top of real-life objects. We also discussed how to build our project so that it can run on both iOS and Android platforms, which is the only way we can test our AR apps at the time of writing. Finally, we created a simple AR game based on the game we created in the main project but modified it so that it's suitable for use in AR scenarios.
With this new knowledge, you will be able to start your path as an AR app developer, creating apps that augment real objects with virtual objects by detecting the positions of the real objects. This can be applied to games, training apps, and simulations. You may even be able to find new fields of usage, so take advantage of this new technology and its new possibilities!