This week, my job was to program my MyoBand to recognize six different hand gestures: cylinder grip, fine pinch, 3-jaw chuck, key grip, hook grasp, and index finger point. And now, I know several fantastic ways to fail at programming them! They are:
My first thought was to use code somebody else has written to classify the data. PyoConnect, the program I am using, is built on top of another program written by dzhu on GitHub (GitHub is a website where people share code). PyoConnect works fantastically, so I figured that dzhu's code would work well too. And dzhu's code contains a 'classifier', which basically takes in user data and trains an AI to recognize new gestures. Unfortunately, it keeps crashing! This is a known issue according to Google, but it hasn't been solved. PyoConnect itself has no classifier, so that route is closed.
My first failure was trying to figure out which data to use! You see, there are two types of data the band gets from the arm: motion data, and EMG data. At first, I mistook the motion data for EMG data — and I couldn’t figure out why my data was so weird! Motion data describes the roll/pitch/yaw of the hand, but what I wanted was the electrical muscle signals! So I found the correct variables, and went back to the drawing board.
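In case it helps anyone else avoid the same mix-up, here's roughly what the two streams look like side by side. The numbers below are made up for illustration; the real band streams eight signed EMG channels (one per electrode pod) plus orientation data:

```python
# Made-up samples illustrating the two streams the band reports.
# EMG: one signed reading per electrode pod (the Myo has eight of them).
# Motion (IMU): where the arm is pointing, not what the muscles are doing.
emg_sample = (-3, 12, 45, 102, 87, 20, -5, 1)
motion_sample = {"roll": 0.12, "pitch": -0.45, "yaw": 1.57}

# For gesture classification you want the EMG channels, e.g. which
# electrode is firing hardest right now:
strongest_channel = max(range(8), key=lambda i: abs(emg_sample[i]))
print(strongest_channel)  # channel 3 in this fake sample
```

The motion data is great for things like controlling a cursor, but it tells you nothing about finger position, which is why my gesture data looked so weird.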
So at first, I thought my task was fairly simple. I cannot fathom why I thought this, but I did. Basically, my hope was that when a certain person put their hand in a certain position, the EMG data would all be within a certain range — so if the signals from one muscle were from 345 to 567, then you knew that person A was pointing their index finger. So I'd write code where a person would put on the band, move through all six positions, and the program would record the ranges. Well, turns out it's way more complex than that. It's more of a ratio between muscles than a range per muscle, and I'd need a medical degree to understand it. So on to the next plan!
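For the curious, here's a sketch of what that (too-naive) range idea looked like. All the gesture names and EMG values are made up; the point is just the record-ranges-then-match logic, and why it's brittle — a real sample rarely stays inside every channel's box at once:

```python
# Naive range-based classifier: record (min, max) per EMG channel for each
# gesture during calibration, then classify a new sample by checking which
# gesture's ranges contain every channel. Values are fake, for illustration.

def learn_ranges(samples):
    """Per-channel (min, max) over a list of 8-channel EMG samples."""
    return [(min(ch), max(ch)) for ch in zip(*samples)]

def matches(sample, ranges):
    return all(lo <= v <= hi for v, (lo, hi) in zip(sample, ranges))

# "Calibration": a couple of fake samples per gesture
calibration = {
    "index point": [(340, 10, 5, 0, 0, 0, 0, 0), (360, 12, 8, 1, 0, 0, 0, 0)],
    "fine pinch":  [(20, 200, 180, 5, 0, 0, 0, 0), (25, 220, 190, 8, 0, 0, 0, 0)],
}
gesture_ranges = {g: learn_ranges(s) for g, s in calibration.items()}

def classify(sample):
    for gesture, ranges in gesture_ranges.items():
        if matches(sample, ranges):
            return gesture
    return "unknown"

print(classify((350, 11, 6, 0, 0, 0, 0, 0)))  # "index point"
```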
Why oh why did I do this?? I figured I'd have to use AI at this point, because I was classifying data. But instead of using the programs that had already been written, I decided to write it myself. After reading several papers on it, I remembered something: I was programming in Python. Python has two wonderful AI libraries, PyTorch and TensorFlow. I should really use those rather than writing my own AI library, which would involve enough research to get a PhD.
So I decided to use TensorFlow! I was actually reasonably far along with my code (by that I mean I had a few hundred lines of code that failed to run) when I showed it to my professor and he asked why I wasn't using the built-in AI in the armband. You see, whenever you put on a MyoBand, you have to make several gestures to teach it what the EMG signals from your gestures look like. However, if it tells you to make a fist, and instead you give it the finger, it will recognize you giving the finger as a fist. Technically, I could use this trick to teach it my gestures, but it can only handle up to five of them. It should be trivial to get it to hold more, and by trivial I mean it will take me all summer. So that's my next step!
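Here's a sketch of the remapping trick, with the catch visible. During the band's built-in calibration you perform a different gesture than the one it asks for, then translate the pose label it reports back to what you actually did. The five pose names below are the Myo's built-in ones; the mapping to my six gestures is hypothetical, and one gesture is inevitably left without a slot:

```python
# Remap the Myo's five built-in pose labels to custom gestures: during
# calibration you made these gestures when the band asked for its poses,
# so a reported "fist" really means "cylinder grip". The mapping itself
# is hypothetical -- the real problem is six gestures, five slots.
POSE_TO_GESTURE = {
    "fist": "cylinder grip",
    "wave_in": "fine pinch",
    "wave_out": "3-jaw chuck",
    "fingers_spread": "key grip",
    "double_tap": "hook grasp",
    # no slot left for "index finger point" -- the five-pose limit
}

def on_pose(pose):
    """Translate a built-in pose label into the gesture it stands for."""
    return POSE_TO_GESTURE.get(pose, "unknown")

print(on_pose("fist"))  # "cylinder grip"
```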
May you be ever victorious in your future endeavors,