Electromyography

Electromyography Hack Chat

Join us on Wednesday, January 19 at noon Pacific as we kick off the 2022 Hack Chat season with the Electromyography Hack Chat with hut!

It’s one of the simplest acts most people can perform, but just wiggling your finger is a vastly complex process under the hood. Once you consciously decide to move your digit, a cascade of electrochemical reactions courses from the brain down the spinal cord and along nerves to reach the muscles of the forearm, where still more reactions occur to stimulate the muscle fibers and cause them to contract, setting that finger to wiggling.

The electrical activity going on inside you while you’re moving your muscles is actually strong enough to make it to the skin, and is detectable using electromyography, or EMG. But just because a signal exists doesn’t mean it’s trivial to make use of. Teasing a usable signal from one muscle group amidst the noise from everything else going on in a human body can be a chore, but not an insurmountable one, even for the home gamer.
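The details of how PsyLink conditions its signals will come up in the chat itself, but as a rough illustration of the kind of processing involved, here is a minimal sketch of one common EMG feature: a moving RMS envelope of the rectified signal, which turns noisy raw voltage into a smooth "muscle activity" level. All function names, parameters, and the simulated data are our own, not PsyLink's.

```python
import numpy as np

def emg_envelope(signal, fs=1000, window_ms=100):
    """Crude EMG feature extraction: remove the DC offset,
    rectify, then compute a moving RMS envelope."""
    signal = signal - np.mean(signal)      # remove DC offset
    rectified = np.abs(signal)             # full-wave rectification
    n = int(fs * window_ms / 1000)         # samples per RMS window
    kernel = np.ones(n) / n
    # moving average of the squared signal, then square root = RMS
    return np.sqrt(np.convolve(rectified ** 2, kernel, mode="same"))

# Simulated 1-second recording: baseline noise plus a
# "muscle contraction" burst between 0.4 s and 0.6 s.
np.random.seed(42)
fs = 1000
t = np.arange(fs) / fs
noise = 0.05 * np.random.randn(fs)
burst = np.where((t > 0.4) & (t < 0.6), 0.5, 0.0) * np.random.randn(fs)
env = emg_envelope(noise + burst, fs=fs)
```

A threshold on `env` is enough for a crude one-channel on/off switch; distinguishing gestures across several electrodes is where the machine learning comes in.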

To make EMG a little easier, our host for this Hack Chat, hut, has been hard at work on PsyLink, a line of prototype EMG interfaces that can be used to detect muscle movements and use them to control whatever you want. In this Hack Chat, we’ll dive into EMG in general and PsyLink in particular, and find out how to put our muscles to work for something other than wiggling our fingers.

Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This week we’ll be sitting down on Wednesday, January 19 at 12:00 PM Pacific time. If time zones have you tied up, we have a handy time zone converter.

Continue reading “Electromyography Hack Chat”

Quick Face Recognition With An FPGA

It’s the 21st century, and according to a lot of sci-fi movies we should have perfected AI by now, right? Well, we’re getting there, and this project from a group of Cornell University students, titled “FPGA kNN Recognition,” is a graceful attempt at facial recognition.

For the uninitiated, the k-nearest neighbors (kNN) algorithm is a very simple classification algorithm that uses similarities between given sets of data and the data point being examined to predict which class that data point belongs to. In this project, the authors use a camera to take an image and then save its histogram instead of the entire image. To train the system, the camera is made to take mug-shots of sorts and build a database of histograms that are tagged as belonging to the same face. This process is repeated for a number of faces, and it is shown as a relatively quick process in the accompanying video.
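The students implemented this in hardware, but the idea translates to a few lines of software. Here is a hedged sketch of histogram-based kNN classification; the function names, bin count, distance metric, and toy "face" images are our own assumptions, not taken from the project.

```python
import numpy as np

def histogram_feature(image, bins=64):
    """Reduce an image to a normalized intensity histogram,
    mirroring the project's trick of storing histograms
    instead of whole frames."""
    hist, _ = np.histogram(image, bins=bins, range=(0, 256))
    return hist / hist.sum()

def knn_classify(query_hist, database, k=3):
    """database: list of (histogram, label) pairs.
    Return the majority label among the k nearest histograms
    by Euclidean distance."""
    dists = [(np.linalg.norm(query_hist - h), label) for h, label in database]
    dists.sort(key=lambda pair: pair[0])
    top_labels = [label for _, label in dists[:k]]
    return max(set(top_labels), key=top_labels.count)

# Toy "faces": flat images that differ only in brightness.
db = [(histogram_feature(np.full((8, 8), 40 + i)), "alice") for i in range(3)]
db += [(histogram_feature(np.full((8, 8), 200 + i)), "bob") for i in range(3)]
guess = knn_classify(histogram_feature(np.full((8, 8), 42)), db)
```

Real images would need a more discriminative feature than a raw brightness histogram, but the distance-sort-vote structure is exactly what gets pipelined on the FPGA.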

The process of classification, or ‘guess who’, takes an image from the camera and compares it with all the faces already stored. The system selects the one with the highest similarity, and the results claimed are pretty fantastic, though that is not the brilliant part. The implementation is done on an FPGA, which means the whole process has been pipelined to reduce computation time. This makes the project worth a look, especially for people getting into FPGA-based development. There are hardware implementations of a k-distance calculator, a sorter, and a selector. Be sure to read through the text for the sorting algorithm, as we found it quite interesting.

Arduino recently released the Arduino MKR Vidor 4000 board, which has an FPGA, and there are many open-source boards out there in the wild that you can easily get started with today. We hope to see some of these in conference badges in the upcoming years.

Continue reading “Quick Face Recognition With An FPGA”

Rat Propulsion Via Brain-Machine Interface

Our little red-eyed friend can drive this vehicle around with his mind. WITH HIS MIND, MAN!

This is the product of research into adaptive technologies. The process is pretty invasive, implanting neural electrodes in the motor cortex of the brain. The hope is that some day this will be a safe and reliable prospect for returning mobility to paralysis victims.

We found it interesting that the vehicle was trained to react to the rats’ movements. They were allowed to move around a test space under their own power while brain signals were monitored through the electrodes. Video tracking was used to correlate their movements with those signals, and that data was then used to command the motors of what the Japanese researchers are calling RatCar.
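The write-up doesn’t spell out the decoder, but a common way to learn this kind of mapping is a least-squares fit from neural firing-rate features to the motion the video tracker observed. The sketch below is purely illustrative; the channel counts, variable names, and linear model are our assumptions, not the researchers’ method.

```python
import numpy as np

# Hypothetical linear decoder: map firing rates on n_channels
# electrodes to two motor commands (forward speed, turn rate).
rng = np.random.default_rng(0)
n_samples, n_channels = 200, 8

true_weights = rng.normal(size=(n_channels, 2))   # unknown to the decoder
rates = rng.normal(size=(n_samples, n_channels))  # recorded firing rates
commands = rates @ true_weights                   # movements from video tracking

# Fit weights so predicted commands match the tracked movements.
weights, *_ = np.linalg.lstsq(rates, commands, rcond=None)

# Decode one new time step of neural activity into motor commands.
drive = rates[0] @ weights
```

With noiseless synthetic data the fit recovers the mapping exactly; real neural recordings are far noisier, which is part of why this research is hard.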

We can see the possibilities opening up for a mechanized cockroach vs. RatCar free-for-all. Something of a BattleBots with a live tilt. But we kid; this is actually quite creepy.

[via Neatorama and PopSci]