I'm Fine Box Breathing AR App

A client project developed in Unity3D in partnership with The Look for the Good Project, helping kids work through their emotions using an Augmented Reality (AR) app.

ROLE: Product Owner, Product Manager, Designer, Developer
TIMELINE: 8-Month Graduation Project
SKILLS: Unity3D Development, 3D Character Implementation
TOOLS: Unity3D, Maya, Figma, C#

Context

After working with at-risk kids for the past 7 years running after-school programs in Vancouver, B.C., Canada, I found that some of the youth I met had a hard time understanding how to control their emotions. One specific situation I recall is the difficulty an eight-year-old girl named Stella faced with her feelings: she would burst out crying or fill with rage every time something she could not control happened around her. This experience showed me that there is a need for alternate methods to help kids understand how to work through their emotions. That’s where this app comes in! The I’m Fine Breathing App is an Augmented Reality (AR) app that helps kids find the inner strength they need when feelings get overwhelming, providing self-reflective breathing practices to work through difficult situations in a fun way.

Partnership

This project is based on content from the partner organization The Look for the Good Project (LFTG), a non-profit that gives kids access to social-emotional learning programs. I partnered with one of the founders and integrated insights on child development from their workbooks, which are based on in-depth research by child psychologists and industry experts.

This app was created on a contract basis: I licensed my IP to LFTG so it could be released as a free app for the community.

The Problem

Most AR apps currently on the market range across games, toys, animated filters, and very simple educational tools for younger children. There has not been much exploration of AR for social-emotional development with kids, which is why I wanted to use this medium to engage digitally native kids in activities that offer an alternate perspective on how they see their emotions.

The goal of this project was to create an augmented reality tool that helps kids work through their feelings using proven social-emotional reflective practices.

Objective

The app centers on 4 general emotions, each represented by a 3D character that acts as a companion, guiding the user through breathing techniques based on how they’re feeling.

Goal

How can I create an experience that allows kids to engage and connect with their emotions in order to enhance their social-emotional understanding?

The Tools

I designed all of the mockups and user flows in Figma.

I used the Unity platform as well as Maya to develop the final app. I worked in Unity3D to create the interactions using C# scripts, and used Apple ARKit plugins to develop the AR face-tracking mechanics.

The Design

After several iterations, I created an outline and flow of how the user would move through the activities.

  • The physical movements for the activities were chosen because the partner organization’s workbooks show they help focus hyper-emotional feelings and channel them into actions that release tension.
  • The 3D characters reflect the user’s actions back in the form of a companion, so kids feel less alone and the emotions feel less scary.
  • The interface asks how you are feeling because making a choice lets users name their own emotions instead of jumping straight into activities with no explanation behind them.
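The flow above can be sketched as a tiny state machine: the user picks how they feel, meets the companion character, does the breathing activity, then reflects. This is a minimal sketch in plain C#; the state names are illustrative, not taken from the actual app.

```csharp
using System;

// Illustrative stages of one pass through the app.
public enum FlowState { SelectEmotion, MeetCompanion, BreathingActivity, Reflection, Done }

public static class ActivityFlow
{
    // Advance one step through the flow in order.
    public static FlowState Next(FlowState current)
    {
        switch (current)
        {
            case FlowState.SelectEmotion:     return FlowState.MeetCompanion;
            case FlowState.MeetCompanion:     return FlowState.BreathingActivity;
            case FlowState.BreathingActivity: return FlowState.Reflection;
            case FlowState.Reflection:        return FlowState.Done;
            default:                          return FlowState.Done;
        }
    }
}
```

In the real app each stage maps to a screen or AR scene; keeping the order in one place makes the sequence easy to adjust during iteration.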

The Development

I created blendshapes in Maya, which allow the 3D character to move different parts of the face (eyes up, down, to the side, etc.).

I imported the 3D characters into Unity and connected the blendshapes to the face-tracking mechanism that makes AR face tracking work (i.e., when your eyes move, the character’s eyes move the same way).
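The core of that connection is a per-frame mapping from ARKit’s blendshape coefficients (0–1) to Unity’s blendshape weights (0–100). Here is a minimal sketch of that mapping as plain C# (class and method names are illustrative; in Unity the resulting weights would be applied with `SkinnedMeshRenderer.SetBlendShapeWeight`):

```csharp
using System;
using System.Collections.Generic;

// Maps ARKit face-tracking blendshape coefficients (0..1) to
// Unity SkinnedMeshRenderer blendshape weights (0..100).
public static class BlendshapeMapper
{
    // Convert one coefficient, clamping out-of-range tracking noise.
    public static float ToUnityWeight(float arkitCoefficient)
    {
        float clamped = Math.Max(0f, Math.Min(1f, arkitCoefficient));
        return clamped * 100f;
    }

    // Convert a whole frame of named coefficients at once,
    // e.g. eyeBlinkLeft 0.8 becomes weight 80.
    public static Dictionary<string, float> MapFrame(Dictionary<string, float> coefficients)
    {
        var weights = new Dictionary<string, float>();
        foreach (var pair in coefficients)
            weights[pair.Key] = ToUnityWeight(pair.Value);
        return weights;
    }
}
```

In the app this runs every frame the face is tracked, so the character mirrors the user’s expression in real time.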

I duplicated each of the characters and created the different colours that correspond to the different emotions for each portion of the app.
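The emotion-to-colour assignment can be sketched as a simple lookup. The emotion names and hex values below are illustrative placeholders, not the app’s actual palette; in Unity the hex string would be parsed with `ColorUtility.TryParseHtmlString` and assigned to the duplicated character’s material.

```csharp
using System;
using System.Collections.Generic;

// Illustrative emotion -> character tint lookup.
public static class EmotionPalette
{
    // Hypothetical palette: four general emotions, one tint each.
    private static readonly Dictionary<string, string> Tints =
        new Dictionary<string, string>
        {
            { "happy", "#F7C843" },
            { "sad",   "#4A90D9" },
            { "angry", "#D94A4A" },
            { "calm",  "#6BBF8A" },
        };

    // Fall back to white for an unrecognized emotion.
    public static string TintFor(string emotion)
    {
        return Tints.TryGetValue(emotion, out var hex) ? hex : "#FFFFFF";
    }
}
```

Centralizing the palette this way keeps the duplicated characters consistent and makes it easy to tune colours without touching each prefab.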

Testing and Release

I created the app for iOS only for now, as the front-facing face-tracking camera is only supported on iOS 12 and above with the Unity engine. I used Xcode to test and run the app, as it was the best way to support the project. I launched this project on TestFlight for preliminary beta users within the partner organization so they could try the app out for themselves.