SOUND DESIGN, SOUND RESEARCH, EXPERIMENTATION
EXPERIMENTAL PROJECT IMAGE2SOUND — MULTIMODAL SOUND INTERACTION SYSTEM
This document presents the complete, official account of a research project dedicated to developing a synesthetic interactive system that combines two real-time acquisition technologies: the Playtronica tactile/conductive interface and the patented Image2Sound optical system (WO2022064416A1).
The aim of the experiment is to transform water into a dual control surface capable of generating a complex and organic soundscape, simultaneously shaped by human touch and by chromatic and morphological variations captured optically.
The project investigates the potential of water as a hybrid medium—both physical and visual—capable of activating a multimodal dialogue between body, color, and sound.
The choice of water as an interface is not only functional but conceptually meaningful: it reacts unpredictably to interactions, amplifies gestures, distorts images, alters conductivity, and introduces an element of natural variability that becomes an integral part of the sound composition.
Water is not a passive support, but a true generative matrix.
The system consists of two parallel acquisition channels.
The Playtronica component uses conductivity and human contact to detect variations in the electric field and ripples on the surface, converting these micro-events into discrete MIDI signals that trigger rhythmic events, impulses, or rapid sonic articulations.
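To make the discrete channel concrete, the following Python sketch reads trigger events from a class-compliant USB MIDI controller such as a Playtronica device, using the mido library. The port-name fragment and the interpretation of the notes are illustrative assumptions, not the project's actual configuration:

```python
# Minimal sketch: reading discrete trigger events from a Playtronica-style
# USB MIDI controller with the mido library. The port-name fragment and
# the note handling below are assumptions for illustration; actual values
# depend on the device and how its contacts are assigned.
import mido

PORT_NAME = "Playtron"  # hypothetical: substring of the device's MIDI port name

def find_port(name_fragment: str) -> str:
    """Return the first MIDI input port whose name contains the fragment."""
    for port in mido.get_input_names():
        if name_fragment.lower() in port.lower():
            return port
    raise RuntimeError(f"No MIDI input matching {name_fragment!r}")

with mido.open_input(find_port(PORT_NAME)) as inport:
    for msg in inport:
        # Each contact with the water surface arrives as a note-on;
        # velocity can scale the strength of the triggered sonic event.
        if msg.type == "note_on" and msg.velocity > 0:
            print(f"trigger: note={msg.note} velocity={msg.velocity}")
        elif msg.type in ("note_off", "note_on"):  # note_on with velocity 0
            print(f"release: note={msg.note}")
```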
In parallel, the Image2Sound system captures a continuous flow of visual data from beneath the container, analyzing hue, saturation, brightness, and RGB variations, and transforming the movement of immersed objects, or of the water itself, into continuous sonic parameters that shape timbre, frequency, intensity, and duration.
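As a hedged illustration of this optical stage, the sketch below samples a frame from a camera with OpenCV and computes mean hue, saturation, and brightness, the continuous features described above. The camera index and region of analysis are simplifying assumptions; the patented Image2Sound pipeline itself is not reproduced here:

```python
# Minimal sketch of the optical acquisition stage: sample a camera frame
# and compute average hue, saturation, and brightness over the image.
# Camera index 0 is an assumption; the actual Image2Sound pipeline
# (WO2022064416A1) is not reproduced here.
import cv2

cap = cv2.VideoCapture(0)  # camera placed beneath the water container

def frame_features(frame):
    """Return mean hue, saturation, and value (brightness) of a frame."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    h, s, v = cv2.split(hsv)
    return h.mean(), s.mean(), v.mean()

ok, frame = cap.read()
if ok:
    hue, sat, val = frame_features(frame)
    print(f"hue={hue:.1f} sat={sat:.1f} val={val:.1f}")
cap.release()
```

In the running system this analysis would execute once per frame, streaming the three values continuously rather than sampling a single image.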
The two data streams are routed and processed within an environment such as Max/MSP, where a sound-mapping system specifically designed for this research governs the relationship between physical gesture, visual variation, and sonic response.
The tactile layer produces the discrete structure of the piece, while the optical layer generates the ambient and modulatory component.
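One plausible way to route both streams into Max/MSP is over OSC, received there with [udpreceive]. The sketch below, using the python-osc library, shows the idea; the port number and the /touch and /optics address names are hypothetical choices for this illustration, not part of either system's documented interface:

```python
# Sketch of the routing stage: forwarding both acquisition streams to
# Max/MSP as OSC messages, received there with [udpreceive 9000].
# The addresses and port are illustrative assumptions.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # Max/MSP listening locally

def send_touch(note: int, velocity: int) -> None:
    # Discrete layer: one message per tactile micro-event.
    client.send_message("/touch/trigger", [note, velocity])

def send_optics(hue: float, sat: float, val: float) -> None:
    # Continuous layer: streamed at the camera frame rate.
    client.send_message("/optics/hsv", [hue, sat, val])
```

Keeping the two layers on separate OSC addresses mirrors the division of labor described above: one address carries discrete triggers, the other a continuous modulation stream.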
The resulting composition is a dynamic sonic ecosystem in which rapid events and fluid textures intertwine continuously, without predetermined repetition.
The interaction between color and sound also introduces an advanced synesthetic dimension: hue influences pitch and frequency, while saturation and brightness determine intensity and duration.
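A minimal sketch of such a mapping, assuming OpenCV's HSV ranges (hue 0-179, saturation and brightness 0-255) and illustrative output ranges rather than the project's actual calibration:

```python
# Hedged sketch of the synesthetic mapping described above: hue drives
# pitch, saturation drives intensity, brightness drives duration. The
# output ranges (MIDI notes 36-96, durations 0.1-4 s) are illustrative
# choices, not values taken from the project's actual mapping.

def scale(x, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale x from [in_lo, in_hi] to [out_lo, out_hi]."""
    t = (x - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

def map_color_to_sound(hue, sat, val):
    """OpenCV ranges assumed: hue 0-179, sat/val 0-255."""
    pitch = round(scale(hue, 0, 179, 36, 96))   # hue -> MIDI pitch
    intensity = scale(sat, 0, 255, 0.0, 1.0)    # saturation -> amplitude
    duration = scale(val, 0, 255, 0.1, 4.0)     # brightness -> seconds
    return pitch, intensity, duration
```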
From a research perspective, this project represents a significant advancement in the study of multimodal interfaces.
The simultaneous integration of two distinct sensory vectors demonstrates that it is possible to build complex yet intuitive sound-control systems based on parallel perceptual modalities.
The user can “play” the water both by touching it directly and by manipulating colored objects that never contact the sensors, an example of synesthetic interaction that overcomes the traditional dichotomy between discrete and continuous input.
Furthermore, the use of the patented Image2Sound system in a dynamic setup provides an important validation platform for its future applications.
Alessio Greco’s contribution is central and unfolds across three fundamental levels.
As a sound designer and researcher, he defines the mapping architecture, establishing the relationships between tactile data, chromatic variations, and sonic parameters.
As a developer of interactive systems, he implements the protocols and data-flow integration in Max/MSP, ensuring stability and coherence in multimodal behavior.
Finally, as an experimental artist, he guides the exploration of the system’s timbral and compositional possibilities, turning it into an expressive and performative language.
The project opens up a broad spectrum of applications.
In the museum context, it can serve as an interactive platform for responsive and immersive installations.
In the neuroscientific field, it offers a potentially valuable tool for sensory-substitution devices or multisensory feedback systems.
In performance settings, it enables the creation of live sets where the performer controls rhythmic structure through touch and timbral dynamics through color, resulting in truly hybrid performances.
The project documentation includes essential materials for a portfolio: demonstration videos showing the interaction between the two systems, technical diagrams of the data flow from sensor to sound, images of the experimental setup, and audio recordings illustrating the relationship between the rhythmic layer generated by Playtronica and the ambient texture produced by Image2Sound.
This document synthesizes and reorganizes the project materials into a professional format, presenting it as an advanced example of multimodal research capable of uniting technical rigor, artistic experimentation, and conceptual innovation.