Robotics and Downloading

Hello World!

So this summer, I have two research gigs going on: my robotics research with Professor Schofield, and the AI one with Professor Tagkopoulos. It’s much less work than regular classes, so I’ve been having a pretty nice break!

BEAR Labs

Unfortunately, my professor and I have concluded that the MyoBand I have been using to detect myoelectric signals from people’s arms is probably not the greatest tool for what we are trying to do. It’s good for commercial use, but not great for hand positions other than the ones it has been trained on. If we had an entire lab to figure this out it could be done, but with one undergrad we aren’t really going anywhere. The primary issue is that it only detects large muscular signals. So if you are doing something relatively minor, like a pinch grip, it doesn’t pick it up, and a pinch grip will be pretty important for what we are going to use it for. Unfortunately, one of the gestures it does detect is bending the wrist backwards, and I’ve been doing that gesture quite a bit! My wrists are rather sore.

Professor Schofield, being the amazing human he is, decided that since I can’t do anything all that useful until one of his grad students has something for me in the fall (which could get me on a paper, I love this prof), I can have a bit of fun plumbing the depths of the MyoBand and making a proof of concept! So he gave me a robo-hand, which I hooked up to the Pi, and it can now move its fingers based on the program I wrote! Viewer discretion is advised for the second video.
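
For the curious, here’s a stripped-down sketch of the kind of program I mean, assuming each finger is driven by a hobby servo on its own GPIO pin (the pin numbers here are just placeholders, not my actual wiring):

```python
# Sketch only: curl and release each "finger" servo in turn.
# Assumes gpiozero and hobby servos on these (placeholder) GPIO pins.
from time import sleep
from gpiozero import AngularServo

FINGER_PINS = [17, 18, 27, 22, 23]  # placeholder BCM pin numbers

# One servo per finger, with 0 degrees = open and 90 degrees = curled.
fingers = [AngularServo(pin, min_angle=0, max_angle=90) for pin in FINGER_PINS]

def curl(finger):
    finger.angle = 90   # fully curled

def release(finger):
    finger.angle = 0    # fully open

# Wiggle each finger once, one after another.
for f in fingers:
    curl(f)
    sleep(0.5)
    release(f)
    sleep(0.5)
```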

Eventually I’m going to program new gestures into the MyoBand and have it control the hand through the Pi. I’m also going to see if I can adjust the gain on the band – basically make it more sensitive, so that although it will hear more ‘noise’, it will also be able to detect subtler gestures.
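
To give an idea of where this is headed, here’s a rough sketch of the gesture-to-pose mapping I have in mind; get_current_gesture() and set_fingers() are hypothetical placeholders, not real MyoBand or servo calls:

```python
# Just a sketch of the planned gesture -> hand-pose mapping.
from time import sleep

# Each pose is a curl angle (degrees) for [thumb, index, middle, ring, pinky].
POSES = {
    "fist":      [90, 90, 90, 90, 90],
    "pinch":     [60, 60, 0, 0, 0],
    "open_hand": [0, 0, 0, 0, 0],
}

def get_current_gesture():
    """Placeholder: would ask the MyoBand which gesture it currently sees."""
    return "open_hand"

def set_fingers(angles):
    """Placeholder: would push the angles out to the servos on the Pi."""
    print("moving fingers to", angles)

while True:
    gesture = get_current_gesture()
    set_fingers(POSES.get(gesture, POSES["open_hand"]))
    sleep(0.1)
```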

(Photo: the robot hand on the table.) Also, just for fun, I attached the hand to a cardboard tube to make it look like an arm.

Tagkopoulos Labs

My job for the AI lab is to train an AI to associate words with other words. The final goal is essentially a search engine that can search scientific databases for articles pertaining to food science. Word association programs are quite common. For example, when you are writing an email in Gmail, you can set it to predict the rest of your sentence, and it’s reasonably accurate for a machine! That’s because it has been trained on a dataset of human emails, and has a pretty good idea of how emails are written. I am going to be training mine on a food science dataset, so it will learn which words are associated with foods in a scientific context, and it will then be used to search the scientific literature for food science applications.
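
We haven’t settled on the exact model yet, but to give the flavour, here’s a tiny word-association example in the word2vec style using gensim (the “corpus” here is made up, and the real dataset is vastly bigger):

```python
# Rough idea only: a tiny word2vec model learning word associations.
# The real project uses a much larger scientific corpus; these sentences are made up.
from gensim.models import Word2Vec

corpus = [
    ["lactobacillus", "fermentation", "yields", "lactic", "acid"],
    ["maillard", "reaction", "browns", "bread", "crust"],
    ["pectin", "gels", "jam", "at", "low", "ph"],
]

model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, epochs=50)

# Words that ended up near "fermentation" in the learned vector space.
print(model.wv.most_similar("fermentation", topn=3))
```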

There is a catch, though. You see, I need to run this on a massive dataset, so the first step for me is to download the whole enormous thing! I wrote a script to download everything and have been running it since 3:00 today. It is 10:00 as I write this, and it is 7% complete. So I guess this program will just be running in the background for a couple of days!
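
The script itself is nothing fancy; here’s a simplified sketch of the idea (the file names are placeholders, and the real script has a bit more error handling):

```python
# Simplified version of the idea behind my download script.
# "article_urls.txt" and the output folder are placeholder names.
import os
import requests

URLS_FILE = "article_urls.txt"   # one URL per line
OUT_DIR = "dataset"

os.makedirs(OUT_DIR, exist_ok=True)

with open(URLS_FILE) as f:
    urls = [line.strip() for line in f if line.strip()]

for i, url in enumerate(urls, start=1):
    dest = os.path.join(OUT_DIR, url.rstrip("/").split("/")[-1])
    if os.path.exists(dest):
        continue  # lets me stop and restart without re-downloading
    with requests.get(url, stream=True, timeout=60) as r:
        r.raise_for_status()
        with open(dest, "wb") as out:
            for chunk in r.iter_content(chunk_size=1 << 20):
                out.write(chunk)
    print(f"{i}/{len(urls)} done ({100 * i / len(urls):.1f}%)")
```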

May you be ever victorious in your endeavors!