Samuele Cigardi

Aqualandia Scary Falls VR

Description

This is one of the greatest projects I've ever worked on. It was a huge challenge and I loved it. Everything started when Aqualandia, the massive waterpark based in Jesolo, asked for a brilliant idea: a VR waterslide.

At first I thought: "Guys, this is crazy and absurd; water and electronics are known for not being great friends! And how is the movement supposed to be tracked in the water? Wi-Fi and Bluetooth hate water; it's impossible to communicate with the device in a stable way!" And those are exactly the reasons why I said: "Of course, let's do this!"

 

First of all, I started working on the hardware. We decided to use a Google Cardboard-ready smartphone, but most importantly we had to find a solution for the water problem, so I developed a waterproof 3D-printed case for the chosen smartphone in collaboration with Solidcreation SNC.

 

The second big hardware problem was finding a fail-proof way to track the phone's position during the waterslide. I worked hard on testing the positional tracking systems already on the market, but none of them was stable enough for what can happen during the slide. Then an idea came up. Inspired by the well-known Bluetooth beacon tracking, I created a similar system using high-frequency audio signals, which act as "checkpoints" that realign the experience along the slide. The signals are triggered when the raft passes certain points of the waterslide. The frequency of the impulses was calibrated to the resonance frequency of the smartphone's microphone, and because of that the design of the waterproof case had to be modified by adding a small resonance chamber tuned to this frequency. Using this fail-proof communication method, I could transmit the bytes that identify each "beacon". The method worked well and turned out to be great, cheap and stable.
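To give a rough idea of how an audio checkpoint can be detected, here is a minimal Python sketch using the Goertzel algorithm, which measures the power of a single frequency in an audio buffer more cheaply than a full FFT. The 18 kHz beacon frequency and the power threshold are made-up values for illustration, not the ones used in the actual system:

```python
import math

def goertzel_power(samples, sample_rate, target_freq):
    """Power of one target frequency in an audio buffer (Goertzel algorithm:
    cheaper than a full FFT when only a single bin is needed)."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)   # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2

def beacon_present(samples, sample_rate, beacon_freq=18_000.0, threshold=1e4):
    """True when the beacon tone is loud enough to count as a checkpoint hit."""
    return goertzel_power(samples, sample_rate, beacon_freq) > threshold
```

In practice the detector would run continuously on short microphone buffers, and the byte payload would be modulated on top of the carrier tone; the sketch only shows the tone-detection step.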

 

The last problem I had to deal with was that the system has to enter a low-power mode at the end of the slide, like a fade-out, and it has to ignore the impulses of the "beacons" triggered by other people riding the slide. Along the same path, another issue was calibrating the slide direction, because the phone's compass is too unstable to rely on. I solved both problems using NFC. Here is where the magic happened: the solution was just a card. At the start, an attendant swipes the NFC card in front of the HMD while the users are looking in the direction of the slide, ready to begin. This action activates the system and calibrates its direction. Easy peasy. To do it in Unity I had to create a native plugin using Android Studio; the plugin will soon be available for free on the Unity Asset Store.
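The calibration step itself is simple once the NFC tap gives you a trusted moment in time: the rider's current yaw becomes the "forward" direction of the slide, and every later sensor reading is expressed relative to it. A toy Python sketch of the idea (not the actual Unity/Android plugin code; function names are hypothetical):

```python
def calibrate_slide_direction(current_yaw_deg):
    """Called once at the NFC swipe: the rider is looking straight down the
    slide, so the current raw yaw becomes the zero reference."""
    return current_yaw_deg

def slide_relative_yaw(raw_yaw_deg, reference_deg):
    """Map a raw sensor yaw onto the slide's frame, wrapped to [-180, 180)."""
    return (raw_yaw_deg - reference_deg + 180.0) % 360.0 - 180.0
```

The wrap-around is the only subtle part: without it, a reference near 0°/360° would produce huge jumps in the relative angle.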

 

Moving on to the software side, after analyzing the whole system we decided that the environment could not be rendered in real time; it had to be pre-rendered to reach the greatest quality on a smartphone-based system. The problem then was that the sequences needed a lot of calibration to perfectly fit the timing of the various parts of the slide, so we created a workflow based on Unity as the rendering engine. In collaboration with Bigrock's RED Team, headed by Paolo Zucchetto, we created different stereo 360° video sequences that are triggered in different situations by the smartphone app, which I developed in Unity as well.
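Conceptually, the playback logic ties the two halves together: the pre-rendered video advances on its own clock, and each audio beacon snaps it back to a known timestamp so drift between the raft and the video never accumulates. A simplified Python sketch (the beacon IDs and timestamps below are invented for the example):

```python
# Hypothetical checkpoint table: beacon ID byte -> timestamp (s) in the video
CHECKPOINTS = {0x01: 0.0, 0x02: 8.5, 0x03: 17.2, 0x04: 26.0}

class SequencePlayer:
    """Advances video playback time and realigns it on beacon hits."""

    def __init__(self, checkpoints):
        self.checkpoints = checkpoints
        self.position = 0.0  # current playback time in seconds

    def tick(self, dt):
        """Advance playback by dt seconds (called every frame)."""
        self.position += dt

    def on_beacon(self, beacon_id):
        """Snap playback to the checkpoint's timestamp, correcting any drift."""
        target = self.checkpoints.get(beacon_id)
        if target is not None:
            self.position = target
```

Unknown beacon IDs (for example, impulses belonging to another rider's run) are simply ignored.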

In the end, to make the final system stable, I created a custom Android distro for the project, in which I implemented a watchdog that keeps the application open and reopens it if it crashes.
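The watchdog logic boils down to a simple poll-and-relaunch loop. Here is a Python sketch of the idea; the package name and shell commands are hypothetical placeholders, since the real watchdog lives inside the custom distro:

```python
import subprocess
import time

# Hypothetical package/activity names, not the real ones from the project.
PACKAGE = "com.example.scaryfalls"
START_CMD = ["am", "start", "-n", PACKAGE + "/.MainActivity"]

def watchdog_step(is_running, relaunch):
    """One poll of the watchdog: relaunch the app when it has died.
    Returns True if a relaunch was triggered."""
    if not is_running():
        relaunch()
        return True
    return False

def run_watchdog(poll_seconds=2.0):
    """Poll forever, checking the process and reviving it when needed."""
    while True:
        watchdog_step(
            lambda: subprocess.run(["pidof", PACKAGE],
                                   capture_output=True).returncode == 0,
            lambda: subprocess.run(START_CMD),
        )
        time.sleep(poll_seconds)
```

Keeping `watchdog_step` separate from the polling loop makes the restart logic testable without touching the device.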

Sorry for the long post, here’s a potato.