One idea that I’ve been thinking about, straight out of Physical Computing’s Greatest Hits, is a pair of MIDI controller gloves. One glove would control drum beats via tapping:
The other glove would control the pitch of synth pad notes by sliding your fingers across a surface:
I’m imagining that the synth glove would trigger a note continuously while your finger was in contact with a surface, even if you weren’t sliding your finger to change the pitch. I still need to do some research to determine whether I’d need additional sensors beyond force-sensitive resistors to sense a sliding motion.
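To convince myself the "note sounds for as long as your finger is touching" behavior is workable, here's a minimal sketch of the triggering logic. It's in Python for readability (the actual glove would run something like Arduino C++), and the threshold, sensor ranges, and pitch mapping are all placeholder assumptions.

```python
FSR_THRESHOLD = 50             # hypothetical "finger is touching" cutoff (0-1023 ADC reading)
LOW_NOTE, HIGH_NOTE = 48, 72   # hypothetical pitch range (C3 to C5)

def position_to_note(position):
    """Map a 0-1023 slide position to a MIDI note number."""
    return LOW_NOTE + position * (HIGH_NOTE - LOW_NOTE) // 1023

def step(pressure, position, sounding_note):
    """One loop iteration of the synth glove.

    Takes the current FSR pressure, the finger's slide position, and the
    note currently sounding (or None). Returns the note that should now be
    sounding plus a list of MIDI events to send, e.g. ("note_on", 60).
    """
    events = []
    if pressure > FSR_THRESHOLD:
        note = position_to_note(position)
        if sounding_note is None:
            events.append(("note_on", note))            # finger just touched down
        elif note != sounding_note:
            events.append(("note_off", sounding_note))  # finger slid to a new pitch
            events.append(("note_on", note))
        return note, events
    if sounding_note is not None:
        events.append(("note_off", sounding_note))      # finger lifted
    return None, events
```

One design question this raises: re-triggering discrete notes as the finger slides would sound steppy, so for a smooth pad sound I might instead hold one note and send pitch-bend messages as the position changes.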
I’d also like to provide some sort of visual and/or haptic/rumble feedback to the user when notes are triggered. Visual feedback could be something like LEDs near the fingertips that light up with each note.
I’d also like to make these gloves wireless if possible, so I’d have to look into sending MIDI data over Bluetooth or Wi-Fi.
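One reassuring thing about going wireless is that the MIDI messages themselves are transport-independent: a Note On is just three bytes (a status byte of 0x90 plus the channel, then the note number and velocity), and transports like BLE-MIDI or RTP-MIDI over Wi-Fi just wrap those bytes with their own framing. A quick Python sketch of building the raw messages:

```python
def note_on(note, velocity, channel=0):
    """Build a standard 3-byte MIDI Note On message.

    Status byte is 0x90 | channel; note and velocity are 7-bit values.
    """
    return bytes([0x90 | channel, note & 0x7F, velocity & 0x7F])

def note_off(note, channel=0):
    """Build a 3-byte MIDI Note Off message (status 0x80 | channel)."""
    return bytes([0x80 | channel, note & 0x7F, 0])
```

So the research question is really about the wrapper: which microcontroller-plus-radio combination can frame and ship these bytes with low enough latency to feel playable.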
It might be unrealistic to try to make two working gloves in time for the final, in which case I would just pick one of the two to work on, but I like the idea of a pair that can be combined to play rhythm and melody simultaneously.
I also think that this project would benefit from collaboration, as it’s pretty technically complicated.
However, in order for this to qualify as a Physical Computing final project, I would need to come up with a way to make it physically interactive.
One way to make it interactive could be to use a Kinect and/or computer vision to detect the contours of a person’s body or face, and have the machine draw those contours, adjusting the drawing if the person moves. So, if you were to stand perfectly still, the machine would draw you accurately, but if you moved, you’d end up with a distorted drawing.
This is also a technically complicated project that would benefit from collaboration.