Gesture recognition usually involves some sort of optical system watching your hands, but researchers at UC Berkeley took a different approach. Instead, they monitor the electrical signals in the forearm that control the muscles, and train a machine learning model to recognize hand gestures from those signals.
The sensor system is a flexible PET armband with 64 electrodes screen printed onto it in silver conductive ink, attached to a standalone AI processing module. Since everyone’s arm is slightly different, the system needs to be trained for a specific user, but that also means the specific electrical signals don’t have to be isolated by hand, since the model learns to recognize that user’s patterns.
The challenging part is that the patterns don’t remain constant over time, and will change depending on factors such as sweat, arm position, and even just biological changes. To deal with this, the model can update itself on the device as the signal drifts. Another part of this research that we appreciate is that all the inferencing, training, and updating happens locally on the AI chip in the armband. There is no need to send data to an external device or the “cloud” for processing, updating, or third-party data mining. Unfortunately, the research paper with all the details is behind a paywall.
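To make the adapt-as-the-signal-drifts idea concrete, here is a minimal sketch of the general technique: per-electrode RMS features feeding a nearest-centroid classifier whose centroids are nudged toward fresh observations with an exponential moving average. Everything here is illustrative (the electrode count matches the article, but the feature choice, classifier, and `alpha` rate are our assumptions, not the paper’s method):

```python
import math
import random

N_ELECTRODES = 64  # electrode count from the armband


def rms_features(window):
    """Per-electrode RMS over a window of samples, shape (samples, 64)."""
    n = len(window)
    return [math.sqrt(sum(s[e] ** 2 for s in window) / n)
            for e in range(N_ELECTRODES)]


class AdaptiveGestureClassifier:
    """Nearest-centroid classifier with an exponential moving-average
    update, so the stored templates can drift along with the signal."""

    def __init__(self, alpha=0.1):
        self.centroids = {}  # gesture label -> 64-element feature vector
        self.alpha = alpha   # adaptation rate (assumed value)

    def train(self, label, feats):
        self.centroids[label] = list(feats)

    def predict(self, feats):
        # Pick the gesture whose centroid is closest (squared distance).
        return min(self.centroids,
                   key=lambda lbl: sum((a - b) ** 2
                                       for a, b in zip(self.centroids[lbl], feats)))

    def update(self, label, feats):
        # Nudge the stored centroid toward the newest observation.
        c = self.centroids[label]
        for i, f in enumerate(feats):
            c[i] = (1 - self.alpha) * c[i] + self.alpha * f


# Demo with synthetic "EMG" windows (random noise around a base level).
random.seed(0)

def fake_window(base, n_samples=50):
    return [[base + random.gauss(0, 0.05) for _ in range(N_ELECTRODES)]
            for _ in range(n_samples)]

clf = AdaptiveGestureClassifier()
clf.train("fist", rms_features(fake_window(1.0)))
clf.train("open", rms_features(fake_window(0.2)))
print(clf.predict(rms_features(fake_window(1.0))))  # → fist
```

On real hardware the `update` step is what keeps the classifier usable across sweat, arm position, and slow biological changes without a full retrain.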
Continue reading “A Gesture Recognizing Armband”
Walt and Molly Weber had just finished several long weeks of work. He was an FBI agent on an important case. She had a management job at Houghton Mifflin. On a sunny Friday evening in February of 1995, the two embarked on a much-needed weekend skiing getaway. They drove five hours to the Mammoth Lakes ski area in California’s Sierra Nevada. This was a last-minute trip, so most of the nicer hotels were booked. The tired couple checked in at a lower-cost motel at around 11:30pm on Friday night. They quickly settled in and went to bed, planning for an early start with a 7am wakeup call Saturday morning.
When the front desk called on Saturday, no one answered the phone. The desk manager figured they had gotten an early start and were already on the slopes. Sunday was the same. It wasn’t until a maid went to check on the room that the couple were found to be still in bed, unresponsive.
Continue reading “Carbon Monoxide: Hunting A Silent Killer”
[Malte Ahlers] from Germany, after completing a PhD in neurobiology, decided to build a human-sized humanoid robot torso. [Malte] has an interest in robotics and wanted to showcase some of his skills. The project is still in its early development, but as you will see in the video he has achieved a nice build so far.
A1 consists of a human-sized torso with two arms, each with five (or six, including the gripper) axes of rotation, based on the robolink joints from the German company igus.de. The joints are tendon-driven by stepper motors with planetary gearheads attached. Using an experimental controller he has built, [Malte] can track the position of each axis via the encoders embedded in the joints.
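The value of those joint encoders is closing the loop on an otherwise open-loop stepper drive: comparing where the joint should be (from commanded steps through the gearhead) with where the encoder says it is reveals missed steps or tendon stretch. A minimal sketch of that bookkeeping, with all the numbers (steps per revolution, gear ratio, encoder resolution) chosen purely for illustration rather than taken from [Malte]’s build:

```python
# Illustrative constants -- not the actual A1 hardware values.
STEPS_PER_REV = 200        # full steps per motor revolution (typical stepper)
GEAR_RATIO = 50            # assumed planetary gearhead reduction
ENC_COUNTS_PER_REV = 4096  # assumed joint-encoder resolution


def expected_counts(steps_commanded):
    """Encoder counts the joint should show after a number of motor steps."""
    motor_revs = steps_commanded / STEPS_PER_REV
    joint_revs = motor_revs / GEAR_RATIO
    return joint_revs * ENC_COUNTS_PER_REV


def position_error(steps_commanded, encoder_counts):
    """Positive error means the joint lags the command (missed steps or
    tendon stretch); a controller would fold this into the next move."""
    return expected_counts(steps_commanded) - encoder_counts


# 10,000 steps = 50 motor revs = 1 joint rev = one full encoder turn.
print(expected_counts(10000))        # → 4096.0
print(position_error(10000, 4000))   # → 96.0 counts of lag
```

Tendon drives make this check especially worthwhile, since cable compliance adds position error that the stepper itself can never see.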
The A1 torso features a head with two degrees of freedom, equipped with a Microsoft Kinect sensor and two Logitech QuickCam Pro 9000 cameras. With this hardware the head can spatially “see” and “hear”. The head also has speakers for voice output, which can be accompanied by animated gestures on the LCD screen (lip movements, for example). The hands feature a simple gripping tool based on FESTO FinGripper fingers to allow picking up miscellaneous items.