AR Projects: Behind the Scenes
AR game development is full of fun and uncertainty. Unlike traditional development in established engines like Unity and Unreal, AR development happens in less robust tools such as Spark AR, or through plugin APIs for engines like Unity. I was introduced to AR game design in a class associated with Niantic Labs, and the concepts behind AR and location-based games are absolutely fascinating.
Project One - Modified Object
Our first project prompt was to modify an existing object with AR. Being a fan of detective and crime shows, I built my prototype around the premise of a detective game: a fancy fingerprint-dusting tool in the detective's toolbox for finding and identifying the suspect.
Since the surface recognition tool in Spark AR could not recognize the surface of a cylindrical object, I built a mesh model that matches the juice bottle's volume and shape, then placed the fingerprints onto the surface of the model. Using the pattern recognition feature, I successfully placed the model in the same spot whenever my phone scanned the bottle's label. The process took quite some time of position adjustments to make the fingerprints look believable. Here's a video of early testing.
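The anchoring step amounts to composing the tracked label's pose with each fingerprint's offset in the bottle's local space. Here is a minimal sketch of that idea; the `TargetPose` and `FingerprintDecal` types and the simplified yaw-only rotation are illustrative assumptions, not Spark AR's actual API:

```typescript
type Vec3 = { x: number; y: number; z: number };

// Pose of the recognized label target in world space (position + yaw, simplified).
interface TargetPose {
  position: Vec3;
  yawRadians: number; // rotation about the vertical axis
}

// A fingerprint decal defined by an offset in the target's local space.
interface FingerprintDecal {
  localOffset: Vec3;
}

// Rotate the local offset by the target's yaw, then translate by its position,
// so the decal follows the bottle wherever the tracker finds the label.
function worldPosition(target: TargetPose, decal: FingerprintDecal): Vec3 {
  const { x, y, z } = decal.localOffset;
  const c = Math.cos(target.yawRadians);
  const s = Math.sin(target.yawRadians);
  return {
    x: target.position.x + c * x + s * z,
    y: target.position.y + y,
    z: target.position.z - s * x + c * z,
  };
}
```

In Spark AR itself this composition happens implicitly when the model is parented under the target tracker; the sketch just makes the underlying transform explicit.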
Project Two - Augmented Location
Under the same detective game setting, my second project aimed to create an augmented location by modifying an actual place. This time, I extended my premise: a murder had taken place in the apartment next to mine. The suspect broke into the crime scene through the underground garage entrance, then fled with the murder weapon, dripping blood on the ground along the way. The prototype is another tool in the detective's toolbox: it reveals the footprints and blood trail on the ground to help the detective determine the suspect's route in and out of the crime scene.
The footprints are designed and placed under the premise that the suspect walks with a limp, their left foot turning inward. The size and gait pattern of the footprints also give the player extra information about the suspect's height and the physical traits of their legs.
Here is a video of my early testing using the plane detection tool.
In-Engine Screenshot - Footprint and Blood Trails Placement on a Virtual Plane
Project Three - Augmented Fairytale
My third project was a team project, and the prompt was to recreate a scene from any fairytale. After group brainstorming, we settled on The Three Little Pigs and the mechanic of the big bad wolf blowing down the little pigs' houses. We designed the interaction so that the player blows into the microphone to imitate the big bad wolf, and we convert the volume from the microphone into a force vector that pushes the house over. In our pitch, we imagined this could be a fun interaction in a children's book app, letting young readers experience the story before their eyes through AR technology.
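The volume-to-force conversion can be sketched roughly as follows. This is an illustrative sketch rather than our actual Spark AR patch graph; `micVolume` stands in for whatever normalized loudness reading the engine provides, and `MAX_FORCE` is an assumed tuning constant:

```typescript
type Vec3 = { x: number; y: number; z: number };

const MAX_FORCE = 50; // force at full volume (illustrative tuning value)

// Map a normalized microphone volume (0..1) to a push force along the blow direction.
function blowForce(micVolume: number, direction: Vec3): Vec3 {
  const v = Math.min(Math.max(micVolume, 0), 1); // clamp to [0, 1]
  const magnitude = v * v * MAX_FORCE; // squared response: soft blows do little
  return {
    x: direction.x * magnitude,
    y: direction.y * magnitude,
    z: direction.z * magnitude,
  };
}
```

Squaring the volume (rather than mapping it linearly) is one way to make only deliberate, hard blows move the house, though the exact response curve is a design choice.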
Later in development, we added a restoring behavior: the house drifts back toward its original position if the player's blow isn't hard enough. The player therefore has to blow repeatedly until the house flips over, the same way the big bad wolf did in the original story.
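The restoring behavior can be sketched as a per-frame update in which the house's tilt decays back toward upright unless a blow pushes it past the tipping point. All names and constants here are illustrative assumptions, not our actual implementation:

```typescript
const TIP_ANGLE = Math.PI / 3; // past this tilt the house falls over (illustrative)
const RESTORE_RATE = 0.9;      // fraction of tilt kept per frame while settling back

interface HouseState {
  tiltRadians: number;
  fallen: boolean;
}

// Advance one frame: add the blow's contribution, then let the house settle back.
function stepHouse(state: HouseState, blowTilt: number): HouseState {
  if (state.fallen) return state; // a fallen house stays down
  const pushed = state.tiltRadians + blowTilt;
  if (pushed >= TIP_ANGLE) {
    return { tiltRadians: TIP_ANGLE, fallen: true };
  }
  return { tiltRadians: pushed * RESTORE_RATE, fallen: false }; // spring back toward upright
}
```

With a decay factor below 1, a single weak blow fades away, but quick repeated blows accumulate faster than the house can settle, which is exactly the huff-and-puff loop the story calls for.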
After many tries, we got the prototype working within the Spark AR engine, but the flipping-and-restoring animation and the volume conversion feature hit bugs that we weren't able to solve. The final demo was therefore done in the engine, where we could show our peers that the model placement works as expected on the phone.
In-Engine Screenshot - Early Models and Placements
In-Engine Screenshot - Visual Scripting within Spark AR Engine