Insights on our
FWA Award

Get the details on the FWA award-winning UFC Ultimate Staredown

Get the insights

What did you want to accomplish with the project?

We really wanted to bring people face to face with an actual UFC fighter, so they could experience what it would be like during that intense moment when fighters first meet before a UFC event.

What technical challenges did you encounter and how did you solve them?

Wow, ok. There were quite a few.

The first was the AI side of things, with two main areas: accuracy and performance.

We consulted on and trialled a lot of emotion APIs and found that accuracy was entirely dependent on how well the machine learning algorithm had been trained.

In many cases, we found that some AIs were more accurate for certain ethnicities and less accurate for others, and vice versa.

When we trialled Microsoft's Cognitive Services API, we found it to be the most accurate, and this was vital for the scoring mechanic.
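To give a rough sense of what scoring a single captured frame looks like, here's a minimal sketch against the Face API's emotion attributes. The region placeholder, key handling and response typing are assumptions for illustration, not the production code.

```typescript
// Minimal sketch: score one captured frame with the Face API's emotion
// attributes. The region placeholder and key handling are illustrative.
const ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com/face/v1.0/detect";

interface EmotionScores {
  anger: number; contempt: number; disgust: number; fear: number;
  happiness: number; neutral: number; sadness: number; surprise: number;
}

async function scoreFrame(jpeg: Uint8Array, apiKey: string): Promise<EmotionScores[]> {
  const res = await fetch(`${ENDPOINT}?returnFaceAttributes=emotion`, {
    method: "POST",
    headers: {
      "Ocp-Apim-Subscription-Key": apiKey,
      "Content-Type": "application/octet-stream",
    },
    body: jpeg,
  });
  if (!res.ok) throw new Error(`Face API error: ${res.status}`);

  // The API returns one entry per detected face, each carrying an emotion block.
  const faces: Array<{ faceAttributes: { emotion: EmotionScores } }> = await res.json();
  return faces.map((face) => face.faceAttributes.emotion);
}
```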

The other issue was mobile device performance - specifically in the browser.

Anyone who has tried to run a machine learning algorithm within a browser will tell you the performance load on the device is significant, because these algorithms are CPU-intensive by nature.

We had several prototypes running the algorithms on the device in real time, but the performance hit was crippling.

In the end, we settled on a solution where, throughout the staredown, we captured the user's face every half second, composited the 40-plus captures into a single square image, and asked the Cognitive Services API to return the emotion data for all of the captures in one go. That single AJAX request gave us time to bring in the 'decision pending' screen and help build tension, like in a real UFC fight.
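To illustrate the batching idea, here's a minimal browser-side sketch that draws the webcam feed into a grid canvas every half second and produces one composite image to upload in a single request. The cell size, grid layout, staredown duration and the /api/score endpoint are assumptions for the example.

```typescript
// Minimal browser-side sketch of the batching approach described above,
// assuming a <video> element showing the webcam. Cell size, grid layout and
// endpoint names are illustrative, not the production implementation.
const CELL = 128;          // each capture is scaled into a 128px square cell
const COLS = 7;            // a 7x7 grid holds the 40+ captures from the staredown
const INTERVAL_MS = 500;   // capture the user's face every half second

function startCapture(video: HTMLVideoElement, durationMs: number): Promise<Blob> {
  const canvas = document.createElement("canvas");
  canvas.width = canvas.height = CELL * COLS;     // single square composite image
  const ctx = canvas.getContext("2d")!;
  let index = 0;

  return new Promise((resolve) => {
    const timer = setInterval(() => {
      // Draw the current video frame, scaled, into the next free cell of the grid.
      const x = (index % COLS) * CELL;
      const y = Math.floor(index / COLS) * CELL;
      ctx.drawImage(video, x, y, CELL, CELL);
      index++;

      if (index * INTERVAL_MS >= durationMs || index >= COLS * COLS) {
        clearInterval(timer);
        // One composite image -> one request, instead of 40+ round trips.
        canvas.toBlob((blob) => resolve(blob!), "image/jpeg", 0.8);
      }
    }, INTERVAL_MS);
  });
}

// Usage (illustrative): capture a 20-second staredown, then send the composite
// off for scoring while the 'decision pending' screen plays.
// const composite = await startCapture(videoEl, 20_000);
// await fetch("/api/score", { method: "POST", body: composite });
```

Sending one image also means there is only a single round of request latency to hide behind the 'decision pending' screen, rather than dozens of separate requests competing with the experience.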

The other issue we ran into was the fighters' availability and collecting the actual footage before a UFC fight.

Obviously, fighters are extremely busy before an event, and scheduling made it a challenge to obtain this footage.

In the end, the fighters were brought to life by retouching still photos of their faces onto a head-and-body model. We then animated subtle movements like breathing and muscle twitches before using the VoluMax toolkit to create a realistic 3D effect.

What did winning the FWA award mean to you?

Everything.

Both founders of our company started out as Flash Developers. We’d check FWA every day for what agencies and people were doing to push the limits. For us, it’s the Gold Standard.

Years on, founding and building an agency that can deliver FWA-worthy work is a dream come true. We're incredibly proud of the people who make up our agency, and to be recognised alongside the calibre of talent within the FWA community is an amazing feat.

Tools used:

Node.js

Vue.js

Microsoft Cognitive Services (Emotion)

VoluMax

Three hot facts:

We actually created a partial deepfake of Khabib Nurmagomedov that never made it into the final project.

Not all cognitive services are equal; accuracy depends entirely on how well the algorithm has been trained.

It took close to 24 hours to render a single fighter animation, with 12 Macs in net render mode.