Multimodal Interfaces for Human Perception of Digital Datasets

This is an invited seminar at the Technical University of Chemnitz in Germany.

Data visualisation enables the human perception of complex information in digital datasets. However, everyday perception of the world around us does not rely on vision alone. In his seminal book "The Senses Considered as Perceptual Systems", J.J. Gibson proposed that human perception is an integrated multimodal system that actively seeks out information from the environment, rather than a set of five independent channels of passive reception. This theory of ecological perception suggests that multimodal interfaces will be more effective than visualisations alone for the human perception of information in digital datasets.

In this talk I will present experiments that apply this theory in a variety of multimodal interfaces combining interactive sonification, haptic feedback and 3D visualisation to perceptualise datasets from environmental monitoring, oil and gas exploration, car design, museum exhibitions, neuroscience, surgery training, elite sports coaching, self health monitoring, and climate change.

Through these experiments I have developed methods for designing stream-based sonifications using auditory grouping, a diagram for designing affect in interactive interfaces modelled on the Affect Grid and Russell's Circumplex of Emotions, and proposed Sonic Information Design as a third-wave design research paradigm for sonification.

During the talk I will describe and reflect on my experience with multimodal technologies such as the VR CyberStage, the Haptic Workbench, non-visual haptic-audio installations, a touch-sensitive floor, an animatronic couch that purrs when you pat it, 3D-printed acoustic sonifications, and the MozziByte microcontroller synth.

Seminar links