OrnARment AR App Prototype Demonstration
This is a demonstration of an iOS AR app I built in Unity. In this video I showcase the app working and provide an explanation of the Unity project.
This was my first venture into Augmented Reality (AR), besides messing about with Snapchat filters and a very brief experiment with BlippAR. It was also my first project made with the game development engine Unity. This blog post will discuss my experience of developing this app, both from the perspective of AR as a whole and, specifically, the use of Unity.
Firstly, let me put this out there – I don’t like AR apps. Why did I make one, then? I’ll explain shortly, but every AR ‘game’ app that predated Pokemon GO was a demonstration of ‘look what AR can do!’, and every app that came after Pokemon GO was just another remake of it. I say ‘game’ apps specifically because AR ‘utility’ apps are, for me, a different discussion – examples such as the IKEA Place app and the Dulux Visualiser app could be argued to be more useful implementations of the technology. The idea of visualising items in your home that aren’t there yet, but that you would like to be there soon, is what makes these apps from IKEA and Dulux so appealing. It almost constructs a mini narrative centred around your life and aspirations. The reason I do not like AR ‘game’ apps is mainly how unentertaining I find them. The medium was the selling point at the beginning, which is what made them fun – ‘the medium is the message’ (McLuhan, 2008) – yet AR is no longer a ‘new’ technology (kind of), and I find it disappointing that there are not many – maybe ANY – AR games that are fun to me personally. Pokemon GO is maybe different, yet sometimes I think this might be a nostalgia thing…
So, why did I make one? Glad you remembered to ask. It’s an answer in two parts. First, making an AR app similar to the ones everyone else has made was the quickest way to get into the realm of AR. There’s a reason so many app developers repeat this over and over again – it’s relatively straightforward. Placing digital items in the physical world, with minimal interaction, is clearly enough for some people to be entertained (and for developers to make a quick earning). I’m not interested in that, though. The second part of my answer is this: I am not interested in AR right now, but I am really, really interested in AR next year. It’s slightly annoying, but Apple, the company that has just surpassed a market value of $3 trillion, seems to dictate the consumer technology industry. Touchscreen phone? Industry follows. Remove the headphone jack? Industry follows. Put a massive notch at the top of the screen? Industry follows. Make a standardised AR headset implementation? Industry foll… you get the idea.
The reason I am disappointed time and time again by these AR applications is that it seems no one knows what they’re actually doing with the technology. Apple might not either, but they’ll sure make everyone think they do, and everyone will follow. Apple’s rumoured combined VR/AR headset will set a standard for what AR is meant to look like and meant to be, and it’s that standard that is missing right now. It’s really exciting.
Back to me though – where do I fit in with this? I wanted to learn the foundations of AR, the way it works at the moment, so that when the time comes to jump on the Apple AR train, I can do so without any resistance. I am very interested in the merging of physical and digital information; I quite like the way pressing a digital button makes something change in real life. AR is a medium that helps facilitate this, and this app was a very brief exploration of that. I did, however, try to make it somewhat different: I implemented photogrammetry data. This means that, rather than placing digital items into the physical world, I can place physical items BACK into the physical world. I used the concept of a ‘souvenir’-collecting application in this example, and it’s quite an interesting area to explore. It’s been done by developers before, but allowing users to upload their own photogrammetry 3D models to the app is something relatively new – so new that I couldn’t figure it out for this prototype, but also so new that Apple has only just released its Object Capture API, which allows third parties to implement this in their applications. This is exciting, and it’s different. I’m not sure what use it could be yet, but let’s leave Apple to figure that one out too.
My app – it works fine! It’s not the prettiest, nor the best implementation of AR: it doesn’t contain any shadows, reflections or shaders, which means it doesn’t use any object occlusion either (where a digital item can be hidden behind a physical object). This was difficult because I had multiple objects, each of which required a shader, and I couldn’t quite figure out how to attach that shader to each prefab every time the script requested it. However, I can work on these finer details later. You can’t place multiple objects at the same time either, which rather defeats the point of a souvenir app, but there are considerations around the poly-count of the 3D models – there are a lot of polygons as it stands! I did actually get it working with multiple objects, as you can see from the video below, but this was with a single model, and with plane detection constantly turned on it got a bit overwhelming… so I decided to forget about that for now. This is just another example of a simple AR app that, whilst boring, has taught me how to make AR applications in Unity.
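For what it’s worth, the shader problem I describe above can be tackled by applying a shared material to every renderer in a prefab at the moment it is instantiated, rather than configuring each prefab asset by hand. Here’s a minimal Unity C# sketch of that idea – the names (`souvenirPrefabs`, `occlusionMaterial`, `Place`) are my own illustrative placeholders, not the actual script from the project:

```csharp
using UnityEngine;

// Hypothetical placement script: spawns a souvenir prefab at a pose and
// applies one shared material/shader to every renderer inside it.
public class SouvenirPlacer : MonoBehaviour
{
    public GameObject[] souvenirPrefabs;  // photogrammetry models, set in the Inspector
    public Material occlusionMaterial;    // shared material asset, set in the Inspector

    public void Place(int index, Pose pose)
    {
        GameObject item = Instantiate(souvenirPrefabs[index], pose.position, pose.rotation);

        // Walk every renderer in the spawned prefab (including children) and
        // swap in the shared material, so no per-prefab setup is needed.
        foreach (Renderer r in item.GetComponentsInChildren<Renderer>())
        {
            r.material = occlusionMaterial;
        }
    }
}
```

This only sketches the material-swapping step; a real version would also need the AR raycast that produces the `Pose` from a screen tap.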
Speaking of Unity, I quite like it. It’s a fantastic games development engine that is easy to navigate, has a fantastic support community and is widely adopted by the games development industry – so much so that it has become the standard for independent games development. Notice how I keep saying games? That’s the catch: I found creating an AR app in Unity quite cumbersome (I’m differentiating between an AR game and an AR app). This is likely down to my inexperience, but unless I wanted complex variables, character profiles, scoring systems, event systems (all the typical game characteristics), I would probably not choose Unity as an AR application platform again. The first reason: what I made doesn’t need to be a separate app. I haven’t mentioned AR filters yet, but Snapchat filters are huge, and whilst intended to be fun, they have become a significant use of AR by nearly every teenager who owns a smartphone. It’s become a personality trait. The same can be said for Meta’s offerings, Facebook and Instagram. My AR app would fit perfectly into Snapchat’s or Meta’s AR collections: a fun, harmless and shareable filter that sits tightly within a pre-defined, pre-made ecosystem with a robust marketplace. Even better, the tools to make it are already there – Meta’s Spark AR and Snapchat’s Lens Studio are streamlined, entirely AR-focused (unlike Unity) platforms that work great – I just foolishly went straight past them!
I also think a phone doesn’t really lend itself to being a practical medium for this application. If we’re talking about those IKEA apps again, which let you temporarily place your new sofa in the living room until the real one arrives, then yes, using a phone as a window to look into the future is great. But if we’re talking about souvenirs – things that could replace ornaments, or even pets, in your own home – a phone as a medium does not facilitate this. ‘The physical and the digital rarely co-exist. There are anchors, or touch points where the physical is linked to the digital, but there are many places where the physical and the digital remain separate’ (Benyon, 2012). The app is the touch point in this case, and this connection is lost as soon as you ‘talk to a friend when they return to the physical space, or make some adjustments to the software when they move into the digital space’ (ibid). Again, Apple’s AR headset or glasses could solve this. We’ve got Microsoft’s HoloLens, which is really bulky; Snap’s Spectacles, which are expensive toys; and Google Glass, which… well… doesn’t exist anymore. Apple could yet again save the whole industry! 🤠
Finally – and I mention this in my demonstration video, and sorry to mention Apple again – their ARKit implementation is very attractive to developers. It’s a purpose-made platform that tightly integrates software and hardware (the ever-expanding Apple Silicon processors and LiDAR scanners) and is ‘easy’ to integrate through the Swift programming language. With the new Object Capture API, I can only see native Apple development growing stronger, rather than Unity remaining the go-between option. Of course, Android needs to be considered too, but I should probably build for the phone that’s in my pocket right now, and then build for the phone that’s in my friend’s pocket later on.
Benyon, D., 2012. Presence in blended spaces. Interacting with Computers, 24(4), pp.219-226.
McLuhan, M. and Fiore, Q., 2008. The medium is the massage. London: Penguin.