BEAR Laboratories

Hello World!

This week marked the beginning of my journey at UC Davis's BEAR Lab! I've had previous lab experience, but those were more about building a foundation for my resume, hoping to land a spot in cooler labs down the road. The BEAR Lab, on the other hand, is where I intend to stay until I graduate.

BEAR, or the Bionic Engineering and Assistive Robotics Laboratory, is under the guidance of Professor Jonathon Schofield from the Mechanical and Aerospace Engineering department. He's a specialist in human-machine interfaces, essentially dealing with the technical aspects of cyborgs. While a substantial part of the research focuses on prosthetics for individuals with injuries, some projects explore more unconventional territory. I'm currently involved in a project to design a third arm that astronauts can use as an extra appendage on their spacesuits. This arm can assist astronauts with various tasks, such as holding a flashlight while they use their other two arms. So, in essence, I've transitioned from programming massive lasers for the military-industrial complex to designing arms akin to Doc Ock's. If I could grow a mustache, I'd be twirling it right now!

The MyoBand

A while back, Thalmic Labs developed a device known as the MyoBand. This wristband senses 'myoelectric' signals from the arm. As a programmer, my understanding of myoelectric signals is rather limited, but the basic idea is that your brain triggers muscle contractions by sending electrical impulses through the nerves to the muscles. These impulses are strong enough to be picked up by a sensor resting on the skin. The MyoBand captures them and uses them to interpret the wearer's hand movements. Since the signals vary depending on the user and where the band sits on the arm, the band needs to be calibrated each time it's put on, by having the user perform a series of hand movements. This calibration step is what makes the band so versatile and adaptable.

The band was originally designed for controlling PowerPoint presentations, but our goal is to make it do something far more intriguing. It will detect the user's hand movements and send that data to a Raspberry Pi. I'll program this behavior using the Thalmic Labs software development kit. Then I'll program the Pi to relay the data to a robotic arm being designed by the PhD students, so that the arm moves in response to the wearer's hand motions.
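To give a feel for what the Pi's relay step might look like, here's a toy sketch in Python. Everything in it is my own invention for illustration: the gesture names, the `ArmCommand` message, and the mapping are all placeholders, not the real Thalmic Labs SDK or the actual protocol the PhD students' arm will speak.

```python
from dataclasses import dataclass

@dataclass
class ArmCommand:
    """A made-up message the Pi might relay to the robotic arm."""
    joint: str       # which part of the arm to move (hypothetical names)
    direction: int   # +1, -1, or 0 for "hold still"

# Hypothetical mapping from calibrated hand gestures to arm motions.
GESTURE_TO_COMMAND = {
    "fist":           ArmCommand(joint="gripper", direction=-1),  # close the gripper
    "fingers_spread": ArmCommand(joint="gripper", direction=+1),  # open it
    "wave_in":        ArmCommand(joint="wrist",   direction=-1),
    "wave_out":       ArmCommand(joint="wrist",   direction=+1),
    "rest":           ArmCommand(joint="none",    direction=0),
}

def relay(gesture: str) -> ArmCommand:
    """What the Pi would do for each gesture event coming off the band.
    Unknown gestures fall back to 'rest' so the arm never moves unexpectedly."""
    return GESTURE_TO_COMMAND.get(gesture, GESTURE_TO_COMMAND["rest"])
```

The one design choice worth noting even in a toy version: anything the classifier doesn't recognize maps to "do nothing," which is the behavior you want from a robot arm strapped to a person.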

The lab is brand new, and MUCH nicer than any of the other labs I have worked in. The whole place looks kind of like a hipster’s living room, complete with super snazzy whiteboards on every wall. There’s a 3D printer, a soldering iron, several breadboards, and plenty of other prototyping tools. Predictably, graduate labs keep less physical equipment on hand than undergraduate labs, since much of the fabrication gets outsourced. It seems like an awesome place to design an evil cyborg suit!

May you be ever victorious in your endeavors!