Three physical controllers: a textured xylophone, a hexagonal pad grid, and a wireless joystick. Built through Research through Design, tested with 10 participants using the System Usability Scale, and grounded in conversations with Malay percussion practitioners.
Five 3D printed bars in a linear, xylophone-inspired layout. Each bar carries a distinct surface texture (default, grids, cloud, steps, organic wood), allowing tactile navigation without visual reference. The linear arrangement gives performers a spatial map they can feel during performance.
Piezoelectric sensors beneath each bar feed into an Arduino UNO via analog input pins; 10MΩ resistors to ground stabilise the signal. Five sensors is the UNO's practical capacity, a constraint that sharpened the design question: how much can five zones express?
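The strike-detection step on the receiving side can be sketched in plain JavaScript, mirroring what a p5.js sketch might do with incoming sensor values. The threshold value and the winner-takes-all rule here are assumptions for illustration, not details taken from the prototype.

```javascript
// Hypothetical strike detection over one frame of five analog readings
// (0-1023 range, as from Arduino's analogRead). The threshold is an
// assumed value, not measured from the actual piezo circuit.
const STRIKE_THRESHOLD = 100;

// Returns the index of the struck bar, or -1 if no reading clears
// the threshold this frame. Ties go to the strongest reading.
function detectStrike(readings) {
  let bestBar = -1;
  let bestValue = STRIKE_THRESHOLD;
  readings.forEach((value, bar) => {
    if (value > bestValue) {
      bestValue = value;
      bestBar = bar;
    }
  });
  return bestBar;
}
```

A sketch would call this once per frame on the latest serial values and trigger the matching visual zone whenever the result is not -1.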
Built under Research through Design, the prototype itself is the inquiry. The goal wasn't a finished product but to find out what the making revealed about the design problem.
Five 3D printed bars with distinct surface textures (default, grids, cloud, steps, organic wood), enabling tactile differentiation without visual reference. The linear xylophone layout gives performers a spatial map they can navigate by feel during live performance.
Five piezoelectric sensors, one beneath each bar, connected via 10MΩ resistors to Arduino UNO analog inputs. The 10MΩ value worked across all five piezos without individual calibration, confirming the approach could scale reliably to a multi-input system.
Sensor data feeds into a p5.js sketch. Each bar maps to a distinct zone: bar 1 to zone 1, bar 5 to zone 5. A deliberate linear simplification to prove the gesture-to-visual connection before adding complexity. This prototype received the most positive feedback for intuitiveness of the three.
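The one-to-one bar-to-zone mapping can be sketched as five equal-width vertical zones across the canvas. The zone geometry here is an assumption; the prototype's actual visual layout may differ.

```javascript
// Hypothetical zone layout: five equal-width vertical strips.
const NUM_ZONES = 5;

// Map a bar index (0-4) to the x-position and width of its visual zone.
function zoneBounds(barIndex, canvasWidth) {
  const zoneWidth = canvasWidth / NUM_ZONES;
  return { x: barIndex * zoneWidth, w: zoneWidth };
}
```

In a p5.js draw loop, a strike on bar n would light the rectangle at `zoneBounds(n, width)`, keeping the physical left-to-right order and the visual order identical.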
Testing the textured xylophone in context, observing how percussion-informed striking gestures map across five zones to produce corresponding visual responses.
Nine piezo sensors in a 3×3 spatial matrix, housed in a stacked two-layer structure: electronics below, sensor surface above. Compact and self-contained, built from laser-cut wood and 3D printed sensor mounts.
Arduino Mega for nine-channel analog input. A prototyping shield from Kuriosity (Sim Lim Square), soldered top and bottom to keep the build clean. Wiring documented in an Illustrator schematic before assembly.
Striking the top-right sensor activates the top-right visual region. Position itself becomes part of the expressive vocabulary, not just whether you hit, but where.
Two distinct layers: a bottom layer housing all wiring and electronics, and a top layer holding the sensor grid. This stacked approach made the object compact enough for a live setting, moving beyond breadboard-stage construction.
Nine piezo sensors require nine analog inputs, beyond the UNO's capacity. The Mega provides 16 analog inputs. A Mega prototyping shield from Kuriosity at Sim Lim Square was soldered directly to keep all nine connections clean and stable.
The 3×3 hexagonal grid maps spatially to a 3×3 region in the p5.js visual output. Performers who grasped the spatial mapping reported a stronger sense of agency; the correspondence needed to feel immediate, not learned through trial and error.
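The sensor-to-region correspondence can be sketched as a row-major index mapping. Row-major ordering (sensor 0 at top-left, sensor 8 at bottom-right) is an assumption for illustration.

```javascript
// Hypothetical 3x3 mapping: sensor index 0-8 (row-major) to a grid cell
// in the visual output. Striking sensor 8 lights the bottom-right cell.
const GRID = 3;

function sensorToCell(sensorIndex) {
  return {
    row: Math.floor(sensorIndex / GRID),
    col: sensorIndex % GRID,
  };
}
```

Because the mapping is pure arithmetic rather than a lookup table, physical position and visual position stay locked together, which is what makes the correspondence feel immediate rather than learned.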
Observing how the shift to spatial, area-based input changes the quality of performer interaction, from sequential triggering to positional expression.
A joystick in a 3D printed handheld casing. Continuous X/Y axis data suits slow drifts, circular movements, and held tension: a different expressive vocabulary from the percussive prototypes.
ESP Feather S3 with built-in Bluetooth. Joystick data transmitted wirelessly to the p5.js sketch, no cable, no tether, full stage freedom. Latency management was the central technical challenge.
Strongest at expressive duration (gradual shifts, sustained visual states) rather than precise event-triggering. The controller's strength is shaping over time, not punctuation.
Prototypes 1 and 2 were both tethered to a laptop via USB. Prototype 3 takes portability seriously as a design requirement: the controller works anywhere on stage without a cable, changing how a performer inhabits the space.
The ESP Feather S3 is significantly smaller than the Mega, making the handheld form factor possible. Built-in Bluetooth replaces the USB tether entirely. A small but real latency challenge: for continuous input, even minor lag can break the feeling of direct connection.
Piezo sensors produce discrete impact events. The joystick produces continuous X/Y data updated in real time. This opens different expressive possibilities, but also a steeper learning curve. The tool is most satisfying when producing sustained, evolving visual effects.
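One common way to handle continuous input like this is exponential smoothing, which trades jitter against responsiveness. A minimal sketch, with a hypothetical alpha value; this is a standard technique, not necessarily the prototype's actual latency strategy.

```javascript
// Exponential smoothing for a continuous joystick axis.
// alpha near 1 = responsive but jittery; near 0 = smooth but laggy.
// The default of 0.3 is an assumed value for illustration.
function makeSmoother(alpha = 0.3) {
  let state = null; // no history until the first sample arrives
  return (raw) => {
    state = state === null ? raw : state + alpha * (raw - state);
    return state;
  };
}
```

One smoother per axis (X and Y) lets slow drifts come through cleanly while damping sensor noise, supporting the sustained, evolving visuals the joystick is best at.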
Exploring what wireless freedom changes in a live performance context, the relationship between a performer, the tool, and the space.
The complete system in use — all three prototypes brought together into a single performance documentation.