This page hosts some Android Apps that were developed for the Sonic Tilt Competition at ICAD 2023. A Sonic Tilt App lets you hear how close the phone is to level by sonifying accelerometer sensor data in real time.

The Apps can be installed by downloading the APK file onto an Android device. They were built for an older version of Android that is no longer supported by the Play Store; however, they may (or may not) still work on your device if you want to try them.

The original Tiltification App was developed by Dr. Tim Ziemer and his students at the University of Bremen.

The competition inspired me to contribute three Apps that explore other sonic metaphors, sound designs and synthesis techniques.

SonicLevel-Tuning
The SonicLevel-Tuning App uses the metaphor of tuning a guitar to transform the visual task of levelling a two-dimensional bubble into the one-dimensional sonic task of tuning a string to a reference tone.
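To give a rough idea of the mapping, here is a simplified sketch (the constants and names are illustrative, not the actual App code): the two tilt axes are collapsed into a single distance from level, and that distance detunes a tone against a fixed reference pitch, so the task becomes bringing the two tones into unison.

```kotlin
import kotlin.math.hypot
import kotlin.math.pow

// Illustrative constants, not taken from the App.
const val REFERENCE_HZ = 440.0          // the fixed reference tone
const val MAX_TILT_DEG = 30.0           // tilt at which detuning is largest
const val MAX_DETUNE_SEMITONES = 4.0    // detuning at maximum tilt

// Collapse the two tilt axes into one distance from level and map it to the
// frequency of the "string" being tuned against the reference tone.
fun stringFrequency(tiltXDeg: Double, tiltYDeg: Double): Double {
    val tilt = hypot(tiltXDeg, tiltYDeg).coerceAtMost(MAX_TILT_DEG)
    val semitones = MAX_DETUNE_SEMITONES * (tilt / MAX_TILT_DEG)
    return REFERENCE_HZ * 2.0.pow(semitones / 12.0)
}

fun main() {
    // As the device approaches level, the string converges on the reference tone
    // and the beating between the two tones slows down and disappears.
    for (tilt in listOf(20.0, 10.0, 5.0, 1.0, 0.0)) {
        println("tilt %.1f deg -> %.2f Hz (reference %.1f Hz)"
            .format(tilt, stringFrequency(tilt, 0.0), REFERENCE_HZ))
    }
}
```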

I found it easy to get close to level by sound alone, but it took longer to find the exact level because, although the sound tells you how close you are, it does not tell you which direction to tilt when you are very close. It was frustrating to keep trying random movements to find the exact level. The sonification design could be improved by adding directional information when close to level, but I like the simplicity of the tuning metaphor and the implementation. Although I found the App fun and rewarding, people around me found it annoying and asked me to use headphones.

SonicLevel-Pobblebonk
The SonicLevel-Pobblebonk App sounds like a congregation of frogs in a pond. You navigate around the pond by tilting the spirit level. Different frog calls come from each quadrant and become more distinct from one another as you approach the centre, so you can tell which way to go as you get close to level. When you reach the level spot in the middle, the frogs stop calling and the pond goes quiet.
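A simplified sketch of the idea (the frog names, thresholds and scaling are illustrative, not the actual App code): the sign of each tilt axis selects the quadrant and therefore the call, the distance from the centre controls how distinct the calls are, and a small dead zone around level silences the pond.

```kotlin
import kotlin.math.hypot

// Illustrative frog names and thresholds, not taken from the App.
enum class FrogCall { POBBLEBONK, CRICKET_FROG, WHISTLING_FROG, MOANING_FROG }

data class PondSound(val call: FrogCall?, val distinctness: Double)

const val LEVEL_DEAD_ZONE_DEG = 0.2   // within this tilt the pond goes quiet
const val MAX_TILT_DEG = 30.0

fun pondSound(tiltXDeg: Double, tiltYDeg: Double): PondSound {
    val distance = hypot(tiltXDeg, tiltYDeg)
    if (distance < LEVEL_DEAD_ZONE_DEG) return PondSound(null, 0.0)  // level: silence

    // The sign of each tilt axis picks the quadrant, and therefore which frog you hear.
    val call = when {
        tiltXDeg >= 0 && tiltYDeg >= 0 -> FrogCall.POBBLEBONK
        tiltXDeg < 0 && tiltYDeg >= 0 -> FrogCall.CRICKET_FROG
        tiltXDeg < 0 && tiltYDeg < 0 -> FrogCall.WHISTLING_FROG
        else -> FrogCall.MOANING_FROG
    }
    // Calls become more distinct from one another as you approach the centre.
    val distinctness = 1.0 - (distance / MAX_TILT_DEG).coerceIn(0.0, 1.0)
    return PondSound(call, distinctness)
}

fun main() {
    println(pondSound(10.0, 5.0))   // a clear call from one quadrant
    println(pondSound(0.1, 0.05))   // inside the dead zone: the pond goes quiet
}
```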

I knew that it was exactly level when the sound stopped, but I often passed through this point and had to readjust, because by the time I had mentally registered the silence it was too late to stop moving. Perhaps an auditory cue that you are very close, but not quite level, could improve hand-ear coordination. In the Visual+Sonic mode I noticed that the graphical feedback on the screen did not show 0 when the sound went silent, which seems to indicate a technical discrepancy between the visual interface and the sound synthesis in how they process the tilt sensor data.
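One plausible cause (a guess, sketched below rather than taken from the actual code) is that the graphics and the synthesiser each smooth and threshold the raw tilt separately, so they can disagree about the moment the device counts as level. Deriving both from one shared filtered value and one shared threshold would keep them consistent.

```kotlin
import kotlin.math.abs

// One shared model that both the on-screen bubble and the synthesiser read from,
// so they cannot disagree about when the device counts as level.
// The smoothing factor and threshold are illustrative values.
class TiltModel(
    private val alpha: Double = 0.2,              // exponential smoothing factor
    private val levelThresholdDeg: Double = 0.1   // single definition of "level"
) {
    var filteredTiltDeg: Double = 0.0
        private set

    fun update(rawTiltDeg: Double) {
        // One smoothing stage used by BOTH the display and the sound synthesis.
        filteredTiltDeg = alpha * rawTiltDeg + (1 - alpha) * filteredTiltDeg
    }

    // The display shows 0 exactly when the sound goes silent.
    val isLevel: Boolean
        get() = abs(filteredTiltDeg) < levelThresholdDeg
}
```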

SonicLevel-Crickets
The SonicLevel-Crickets App is modelled on a singing cricket that stops chirping when you move toward it. This sonification provides information about relative movement rather than absolute position in the 2D space. It is a minimal approach where sound and silence provide binary information.
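A simplified sketch of this relative-movement gating (illustrative, not the actual App code): the "cricket" sits at the level point and falls silent whenever the distance to level is decreasing, so sound and silence carry a single bit of information about whether your current movement is helping.

```kotlin
import kotlin.math.hypot

// Illustrative gate: the cricket stops chirping whenever you move towards it.
class CricketGate {
    private var previousDistance = Double.MAX_VALUE

    // Returns true if the cricket should chirp for this sensor update.
    fun shouldChirp(tiltXDeg: Double, tiltYDeg: Double): Boolean {
        val distance = hypot(tiltXDeg, tiltYDeg)
        val approaching = distance < previousDistance
        previousDistance = distance
        // Moving towards level -> silence; moving away or holding still -> chirp.
        return !approaching
    }
}

fun main() {
    val gate = CricketGate()
    // Successive tilt readings: drifting away from level, then correcting back.
    val path = listOf(5.0 to 5.0, 6.0 to 5.0, 4.0 to 3.0, 2.0 to 1.0)
    for ((x, y) in path) println("tilt ($x, $y) -> chirp = ${gate.shouldChirp(x, y)}")
}
```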

I found a clearly audible lag between the movement of the phone and the update of the tilt sensor values, which means you have to move slowly to get sound feedback that is relevant to your current movement and position. The update rate of the tilt data seems to be about 200 ms, or 5 updates per second. This suggests that the sensor update rate needs to be an order of magnitude faster to allow for hand-ear coordination. This observation raises the question of how hand-ear coordination times differ from hand-eye coordination times. The time for hand-eye pointing in User Interfaces is predicted by Fitts’s Law, which estimates movement time from the distance to a target and its size. I couldn’t find an auditory version of Fitts’s Law, but such a thing would be useful for designing Sonic Interfaces that take hand-ear coordination times into account.
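For what it is worth, the 200 ms interval happens to match Android's default SENSOR_DELAY_NORMAL sensor rate, which delivers events roughly every 200 ms, so one possible cause is that the accelerometer listener is registered at the default rate. SENSOR_DELAY_GAME requests events roughly every 20 ms, which is the order-of-magnitude improvement suggested above. Here is a minimal sketch of registering a listener at the faster rate (the listener class is my own illustration, not the actual App code):

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Illustrative accelerometer listener registered at a faster sampling rate.
class TiltListener(context: Context) : SensorEventListener {
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val accelerometer: Sensor? =
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)

    fun start() {
        // SENSOR_DELAY_NORMAL delivers events roughly every 200 ms, which matches the
        // lag observed above. SENSOR_DELAY_GAME requests roughly 20 ms between events,
        // an order of magnitude faster; an explicit period in microseconds also works.
        accelerometer?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent?) {
        // event?.values holds the acceleration along x, y, z in m/s^2;
        // the tilt sonification would be driven from here.
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
}
```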