By Thursday I'd made two prototypes: not two polished versions, but two iterations of figuring out what the thing actually is. I explored a hexagonal grip pattern and, as an alternative, a trapezoidal profile. The hexagonal one had something about it: the tactile quality, or maybe just that it felt less generic. Having both in hand made the comparison immediate.
The consultation on Friday was titled 'Refining', which was an accurate description of where Week 2 had landed. What I was showing: a breadboard setup with 3D-printed hexagonal keys mounted in a frame, connected to piezoelectric sensors in a grid arrangement, running into p5js on the laptop.
WEEK 2
20-23 January 2026
Grid Structure
Second form modelled in 3D before fabrication. Testing whether a fabric with the textured surface profile is more effective. Comparing both in hand resolved the question quickly.
Arduino + Piezo Setup
The Arduino Mega shield with the wiring connected to the 10 MΩ resistors.
Before fabricating the second form I modelled it in 3D first. Going through the modelling process forces decisions about proportions and grip zones that are easy to skip when you're just 3D printing and testing. The form went through a few variations before I settled on the one worth making.
Friday was the grad project consultation with Andreas; Ethel was also in the session. The electronics: an Arduino UNO connected to piezoelectric sensors via the analog pins and resistors. The schematic was in place, and I had an exploded diagram working through the layers: fabric on top, sensor underneath, circuit below that. The breadboard setup was functioning. The next phase is moving this onto a proper PCB.
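The Arduino side of that setup boils down to reading each piezo on an analog pin and streaming the values over serial for p5js to pick up. This is a minimal sketch of that framing logic, not the actual project code: the comma-separated, one-frame-per-line format is an assumption, and it's written as a plain function (no Arduino calls) so it runs off-hardware.

```cpp
#include <string>
#include <vector>

// Hypothetical serial framing for the piezo readings: raw 10-bit
// analogRead values (0..1023), joined with commas, one frame per line.
// On the p5js side a frame like "12,540,30,8" is easy to split(',').
std::string frameReadings(const std::vector<int>& readings) {
    std::string line;
    for (size_t i = 0; i < readings.size(); ++i) {
        if (i > 0) line += ',';               // comma between values, none trailing
        line += std::to_string(readings[i]);  // raw ADC value as text
    }
    return line;
}
```

On the UNO itself, each element of `readings` would come from `analogRead()` on one of the analog pins, and the frame would go out via `Serial.println()`.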
For this prototype the piezo sensors sit on top of the 3D-printed grid, with the resistors underneath. The sensors are pressure-sensitive, so they can be activated by pressing down on the keys, but they also pick up vibrations, which is why they sit on top: there they register the vibration of each key being tapped, which is the kind of interaction I'm aiming for.
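Because a piezo responds to vibration, a tap shows up as a spike in the readings rather than a steady level, so the software has to decide when a spike counts as a tap. A common way to do this (an assumption here, not necessarily what my sketch does) is a threshold with a simple software debounce: count a tap when the signal crosses above the threshold, and require it to fall back below before counting another.

```cpp
#include <vector>

// Count discrete taps in a stream of 10-bit piezo samples.
// A tap is counted when a reading rises above `threshold`; the signal
// must drop back to or below the threshold before the next tap counts,
// so one physical tap's ringing doesn't register as several.
int countTaps(const std::vector<int>& samples, int threshold) {
    int taps = 0;
    bool above = false;  // are we currently inside a spike?
    for (int s : samples) {
        if (!above && s > threshold) {
            ++taps;          // rising edge: new tap
            above = true;
        } else if (above && s <= threshold) {
            above = false;   // spike has decayed; re-arm
        }
    }
    return taps;
}
```

The threshold value would need tuning against the real sensors, since how hard the keys are tapped and how the piezos are mounted both change the spike height.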
I then tested the sensor-to-visual connection using the p5js sketch I had from Prototype 1. For the time being the sensors control horizontal bands, with one zone highlighted on activation. This was just to prove the connection between sensor input and visual output; eventually each zone will control a different area of the visual, allowing for more complex interactions.