Mapping IMU to Gestures / by Maya Pruitt

Made a little progress:

I wanted to try mapping IMU values to the hand symbols. Instead of mapping the raw data, I was advised by my partner, who works with drones, to convert the IMU data to pitch, roll, and yaw, the angles of rotation about each axis. A few equations found here allow for this sensor fusion. This made the IMU values more reliable and easier to read.
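(The post links the exact equations it used; as a rough illustration, one common version of this kind of sensor fusion is a complementary filter. The sketch below assumes the raw readings are accelerometer values in g and gyro rates in degrees/second; yaw is left out because it can't be corrected from the accelerometer alone.)

```cpp
#include <math.h>

// Complementary filter: fuse gyro rates (deg/s) with accelerometer (g)
// readings to estimate pitch and roll in degrees. ALPHA near 1 trusts
// the gyro for fast motion; the accelerometer slowly corrects drift.
const float ALPHA = 0.98;
float pitch = 0, roll = 0;

void updateAngles(float ax, float ay, float az,
                  float gx, float gy, float gz, float dt) {
  // Tilt angles implied by gravity alone (noisy, but drift-free)
  float pitchAcc = atan2(-ax, sqrt(ay * ay + az * az)) * 180.0 / M_PI;
  float rollAcc  = atan2(ay, az) * 180.0 / M_PI;

  // Integrate the gyro rates, then nudge toward the accel estimate
  pitch = ALPHA * (pitch + gy * dt) + (1.0 - ALPHA) * pitchAcc;
  roll  = ALPHA * (roll + gx * dt) + (1.0 - ALPHA) * rollAcc;
}
```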

For a quick hacky fab start, I attached the Arduino to the top of my hand with a hair tie to test.

I performed the gestures and assigned ranges of pitch & roll values to correspond with each gesture. (i.e., if pitch is between these values and roll is between those, then that's the note "do"... maybe there is a better way to do this?) With the IMU alone I can only get to "So", so with the flex sensors added I think it'll become more precise.
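(In code, those range checks amount to a chain of if statements. A minimal sketch of the idea, where every threshold below is made up for illustration, not the values actually used:)

```cpp
// Map fused pitch/roll (degrees) to a solfège note.
// NOTE: these ranges are hypothetical placeholders; the real ones came
// from performing each gesture and reading off the fused values.
const char* classifyNote(float pitch, float roll) {
  if (pitch > -10 && pitch < 10 && roll > -10 && roll < 10) return "do";
  if (pitch >  20 && pitch < 45 && roll > -10 && roll < 10) return "re";
  if (pitch > -10 && pitch < 10 && roll >  30 && roll < 60) return "mi";
  if (pitch >  20 && pitch < 45 && roll >  30 && roll < 60) return "fa";
  if (pitch >  50 &&               roll > -10 && roll < 10) return "so";
  return "";  // no gesture recognized
}
```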

Here is a quick gif of the initial mapping.

[GIF: IMUmapping.gif]

I notice that when changing hand gestures, it’ll pass through notes I may not want to play. Not sure how to avoid this.
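(One possible fix, not something settled in this post: only commit to a note after the same classification has held steady for a short dwell time, so the transitional poses passed through while moving between gestures get ignored. A sketch, assuming `nowMs` comes from something like Arduino's `millis()` and `HOLD_MS` is tuned by feel:)

```cpp
#include <string.h>

const unsigned long HOLD_MS = 150;  // gesture must hold this long (ms)
const char* lastCandidate = "";
const char* activeNote = "";
unsigned long candidateSince = 0;

void updateNote(const char* candidate, unsigned long nowMs) {
  if (strcmp(candidate, lastCandidate) != 0) {
    lastCandidate = candidate;   // classification changed:
    candidateSince = nowMs;      // restart the hold timer
  } else if (nowMs - candidateSince >= HOLD_MS &&
             strcmp(candidate, activeNote) != 0) {
    activeNote = candidate;      // stable long enough: commit the note
  }
}
```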