101 Ways to Not Program a MyoBand

Hello world!

I have not failed 1,000 times. I have successfully discovered 1,000 ways not to make a lightbulb.

Thomas Edison

The quote is probably apocryphal, but I like it anyway. This week, my job was to program my MyoBand to recognize six different hand gestures: cylinder grip, fine pinch, 3-jaw chuck, key grip, hook grasp, and index finger point. And now, I know several fantastic ways to fail at programming them! They are:

Use the DZHU classifier

My first thought was to use code somebody else had written to classify the data. PyoConnect, the program I am using, is built off of another program written by DZHU on GitHub (GitHub is a website where people share code). PyoConnect works fantastically, so I figured that DZHU’s code would work well too. And DZHU’s code contains a ‘classifier’, which basically takes in user data and trains an AI to recognize new gestures. Unfortunately, it keeps crashing! According to Google, this is a known issue, but it hasn’t been solved. PyoConnect has no classifier of its own, so I can’t use it instead.

Use the Roll/Pitch/Yaw data

My next failure was figuring out which data to use! You see, the band gets two types of data from the arm: motion data and EMG data. At first, I mistook the motion data for EMG data, and I couldn’t figure out why my data was so weird! Motion data describes the roll/pitch/yaw of the hand, but what I wanted was the electrical muscle signals. So I found the correct variables and went back to the drawing board.
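For reference, the motion data I kept grabbing by mistake comes off the band as an orientation quaternion, and roll/pitch/yaw falls out of it with the standard conversion. Here’s a sketch (the function name and layout are mine, not PyoConnect’s actual code):

```python
import math

def quat_to_rpy(w, x, y, z):
    """Convert a unit orientation quaternion into roll/pitch/yaw, in radians.
    Standard aerospace-convention formulas; assumes the quaternion is normalized."""
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # Clamp the asin argument so floating-point drift can't crash it.
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return roll, pitch, yaw
```

Handy for waving the cursor around the screen, and completely useless for telling a fine pinch from a key grip.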

Approximate positions with data ranges

So at first, I thought my task was fairly simple. I cannot fathom why I thought this, but I did. Basically, my hope was that when a certain person put their hand in a certain position, the EMG data would all fall within a certain range, so if the signals from one muscle were between 345 and 567, you knew that person A was pointing their index finger. My plan was to have a person put on the band, move through all six positions, and record what the ranges were. Well, it turns out it’s way more complex than that. It’s more of a ratio than a range, and I’d need a medical degree to understand it. So on to the next plan!
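For the curious, the failed plan would have looked something like this. A minimal sketch with made-up numbers and only two channels (a real MyoBand has eight EMG channels):

```python
# Calibration: for each gesture, record the min and max seen on each channel.
def learn_ranges(samples_by_gesture):
    ranges = {}
    for gesture, samples in samples_by_gesture.items():
        n = len(samples[0])
        lo = [min(s[i] for s in samples) for i in range(n)]
        hi = [max(s[i] for s in samples) for i in range(n)]
        ranges[gesture] = (lo, hi)
    return ranges

# Classification: report the first gesture whose ranges contain every channel.
def classify(sample, ranges):
    for gesture, (lo, hi) in ranges.items():
        if all(l <= v <= h for v, l, h in zip(sample, lo, hi)):
            return gesture
    return None
```

The catch, as I learned, is that real EMG ranges from different gestures overlap heavily, so in practice this returns the wrong gesture or nothing at all.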

Write my own AI library

Why oh why did I do this?? I figured I’d have to use AI at this point, because I was classifying data. But instead of using the programs that had already been written, I decided to write it myself. After reading several papers on the subject, I remembered something: I was programming in Python. Python has two wonderful AI libraries, PyTorch and TensorFlow. I should really use those rather than writing my own AI library, which would involve enough research to earn a PhD.

Use TensorFlow!

So I decided to use TensorFlow! I was actually reasonably far along with my code (by that I mean I had a few hundred lines of code that failed to run) when I showed it to my professor and he asked why I wasn’t using the built-in AI in the armband. You see, whenever you put on a MyoBand, you have to make several gestures to teach it what the EMG signals from your gestures look like. However, if it tells you to make a fist, and instead you give it the finger, it will recognize you giving the finger as a fist. Technically, I could use this trick to teach it all the necessary signals, but it can only handle up to five of them. It should be trivial to get it to hold more, and by trivial I mean it will take me all summer. So that’s my next step!
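The fool-the-trainer trick amounts to a lookup table. The slot names below are the five poses I believe the official Myo software trains, but treat them as illustrative:

```python
# The band's onboard classifier only has five trainable pose slots
# (these names follow the official Myo pose list, as best I know it).
BUILTIN_SLOTS = ["fist", "wave_in", "wave_out", "fingers_spread", "double_tap"]

# What I would actually perform while "training" each slot.
MY_GESTURES = ["cylinder grip", "fine pinch", "3-jaw chuck", "key grip", "hook grasp"]

remap = dict(zip(BUILTIN_SLOTS, MY_GESTURES))

def translate(pose):
    # The band reports "fist"; I know that really means cylinder grip.
    return remap.get(pose, "unknown")

# The sixth gesture (index finger point) has nowhere to go --
# hence the all-summer project of getting past five slots.
```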

Hyperloop

The brakes team did some static testing recently, and got some really neat data!

[embedded email screenshot]

Basically, what this is showing you is how much force the Halbach array (which is a big bar covered with strong magnets) will exert on the I-beam.

[embedded email screenshot]

We also got a Kalman filter to work! The controls team wrote some code that filters noise out of our sensor data.
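In case Kalman filters are new to you: the filter blends a prediction of where the value should be with each noisy reading, weighted by how much it trusts each one. Here’s a minimal one-dimensional sketch (not the controls team’s actual code, which handles more than one variable):

```python
class Kalman1D:
    """Minimal scalar Kalman filter: smooths noisy readings of a
    slowly-changing value. q and r are tuning knobs, not magic."""

    def __init__(self, q=1e-3, r=0.5, x0=0.0, p0=1.0):
        self.q = q    # process noise: how fast the true value can drift
        self.r = r    # measurement noise: how untrustworthy the sensor is
        self.x = x0   # current best estimate
        self.p = p0   # uncertainty in that estimate

    def update(self, z):
        self.p += self.q                # predict: uncertainty grows a bit
        k = self.p / (self.p + self.r)  # Kalman gain: trust in the new reading
        self.x += k * (z - self.x)      # correct the estimate toward z
        self.p *= 1 - k                 # correcting shrinks the uncertainty
        return self.x
```

Feed it one raw reading at a time; the estimate settles onto the underlying value while the jitter averages out.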

At the moment, I am trying to recruit next year’s team. This is easy, because so many people have lost internships to COVID and want something neat to do over the summer. This is also hard, though, because I have no idea how to read resumes.

You see, while there is a wealth of information online about how to write resumes, there is considerably less on how to read them. I’m looking at over 90 resumes right now, and while some of them are easy decisions, many of them are kind of confusing. For example, I got a resume recently with a QR code that sent me to a YouTube video about a volunteer project the applicant did (if this is you, don’t worry, you are definitely in due to other stuff you wrote about). What am I supposed to do with this information?? Why was it a QR code instead of a link?? What does this mean?? If I were a professional I would probably have a pretty good idea of what this indicates, but I have never seen anything like it.

The resumes for people who want to be on the business team are significantly more confusing than the ones for engineers, which makes sense, because they are trying to demonstrate how good they are at grabbing attention. The problem is that since I’m an engineer rather than a businessperson, I have even less of an idea of what to look for in a business application, and they are making these already confusing resumes even more confusing!

My main worry is that I will turn down somebody qualified who deserves a position on the team. I have been turned down from many things, and I know how sucky it feels, and I don’t want to do that to somebody else.

May you be ever victorious in your future endeavors,

M.E.W