A Surefire Way To Make Masks

By now, the wearing of a facemask to protect ourselves from pandemic infection is for many of us a daily fact of life. Perhaps that means a cheap disposable mask, but there’s no reason that has to be the case. It’s easy to make more durable masks that can be washed and re-used time and time again, and our Hackaday colleague [Kristina Panos] has shared her pattern and workflow to help you do it.

Her pattern isn’t a complex cut-out but a simple rectangle, and the trick of sewing them together and flipping them inside out makes for a very tidy result. With three pleats pressed in and the elastic sewn up the result is a mask that’s neat, attractive, effective, and cheap, which is a win in our book.

It’s worth repeating her important point that these are not for use in medical environments; they’re the standard street-wear aerosol catchers we’re all used to. This isn’t the first time we’ve looked at masks here at Hackaday, nor indeed the first time one of us has made one, though [Kristina]’s are by far the tidier. We looked at them in depth last year in our surviving the pandemic as a hacker series.

How A Quadriplegic Patient Was Able To Eat By Himself Again Using Artificial Limbs

Thirty years ago, [Robert “Buz” Chmielewski] suffered a surfing accident as a teenager. This left him as a quadriplegic due to a C6 spinal cord injury. After becoming a participant in a brain-computer interface study at Johns Hopkins, he was recently able to feed himself through the use of prosthetic arms. The most remarkable thing about these prosthetic arms is primarily the neural link with [Buz’s] brain, which allows him to not only control the artificial arms, but also feel what they are touching, due to a closed-loop system which transfers limb sensory input to the patient’s brain.

The prosthetic limb in question is the Modular Prosthetic Limb (MPL) from Johns Hopkins Applied Physics Laboratory (APL). The Johns Hopkins Medicine Brain-Computer Interface study began a year ago, when [Buz] had six microelectrode arrays (MEA) implanted into his brain: half in the motor cortex and half in the sensory cortex. During the following months, the study focused on using the signals from the first set of arrays to control actuators, such as the MPL. The second set of arrays was used to study how the sensory cortex had to be stimulated to allow a patient to feel the artificial limb much as one feels a biological limb.

What makes this study so interesting is not only the closed-loop approach, which provides the patient with feedback on the prosthetic’s position and the pressure it senses, but also that it involves both hemispheres of the brain. As a result, after only a year of the study, [Buz] was able to use two of the MPLs simultaneously to feed himself, a delicate and complicated task.

In the video embedded after the break one can see a comparison of [Buz] at the beginning of the study and today, as he manages to handle cutlery and eat cake, without assistance.

Continue reading “How A Quadriplegic Patient Was Able To Eat By Himself Again Using Artificial Limbs”

A Gesture Recognizing Armband

Gesture recognition usually involves some sort of optical system watching your hands, but researchers at UC Berkeley took a different approach. Instead they are monitoring the electrical signals in the forearm that control the muscles, and creating a machine learning model to recognize hand gestures.

The sensor system is a flexible PET armband with 64 electrodes screen printed onto it in silver conductive ink, attached to a standalone AI processing module. Since everyone’s arm is slightly different, the system needs to be trained for a specific user, but that also means the specific electrical signals don’t have to be isolated: the model simply learns to recognize that user’s patterns.

The challenging part of this is that the patterns don’t remain constant over time, and will change depending on factors such as sweat, arm position, and even just biological changes. To deal with this, the model can update itself on the device over time as the signal changes. Another part of this research that we appreciate is that all the inferencing, training, and updating happens locally on the AI chip in the armband. There is no need to send data to an external device or the “cloud” for processing, updating, or third-party data mining. Unfortunately the research paper with all the details is behind a paywall.
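With the paper paywalled we can only sketch the idea, but the adaptive scheme described above is easy to illustrate in miniature. Below is a toy nearest-centroid classifier over per-electrode EMG feature vectors that nudges the winning centroid toward each new sample, so it tracks slow signal drift without a full retrain. The electrode count matches the armband, but the gesture labels, feature model, and update rule are our own illustrative assumptions, not details from the research.

```python
import numpy as np

N_ELECTRODES = 64  # one feature per screen-printed electrode


class AdaptiveGestureModel:
    """Toy nearest-centroid gesture classifier with on-line drift tracking."""

    def __init__(self, lr=0.05):
        self.centroids = {}  # gesture name -> mean feature vector
        self.lr = lr         # how quickly centroids chase new samples

    def train(self, gesture, samples):
        # Initial per-user calibration: average the training windows
        self.centroids[gesture] = np.mean(samples, axis=0)

    def predict(self, features, update=True):
        names = list(self.centroids)
        dists = [np.linalg.norm(features - self.centroids[n]) for n in names]
        best = names[int(np.argmin(dists))]
        if update:
            # On-device adaptation: drift the winning centroid toward
            # the new sample, compensating for sweat, arm position, etc.
            c = self.centroids[best]
            self.centroids[best] = (1 - self.lr) * c + self.lr * features
        return best


# Toy usage: two synthetic "gestures" with distinct electrode patterns
rng = np.random.default_rng(0)
model = AdaptiveGestureModel()
model.train("fist", rng.normal(1.0, 0.1, (20, N_ELECTRODES)))
model.train("open_hand", rng.normal(-1.0, 0.1, (20, N_ELECTRODES)))
print(model.predict(rng.normal(1.0, 0.1, N_ELECTRODES)))
```

The key design point mirrored here is that everything happens locally: training, inference, and the incremental update are all a few vector operations that a small on-board accelerator can handle.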

Continue reading “A Gesture Recognizing Armband”

Seeking Enlightenment: The Quest To Restore Vision In Humans

Visual impairment has been a major issue for humankind for its entire history, but has become more pressing with society’s evolution into a world which revolves around visual acuity. Whether it’s about navigating a busy city or interacting with the countless screens that fill modern life, coping with reduced or no vision is a challenge. For countless individuals, the use of braille and accessibility technology such as screen readers is essential to interact with the world around them.

For refractive visual impairment we currently have a range of solutions, from glasses and contact lenses to more permanent options like LASIK and similar which seek to fix the refractive problem by burning away part of the cornea. When the eye’s lens itself has been damaged (e.g. with cataracts), it can be replaced with an artificial lens.

But what if the retina or optic nerve has been damaged in some way? For individuals with such nerve damage, there has for decades been the tempting, seemingly futuristic prospect of restoring vision, whether through biological or technological means. Quite recently, a number of studies have explored both approaches, with promising results.

Continue reading “Seeking Enlightenment: The Quest To Restore Vision In Humans”

Analyzing The “Source Code” Of The COVID-19 Vaccine

Computer programs are written in code, which comes in many forms. At the lowest level, there’s machine code and assembly, while higher-level languages like C and Python aim to be more human-readable. However, the natural world has source code too, in the form of DNA and RNA strings that contain the code for the building blocks of life. [Bert] decided to take a look at the mRNA source code of Tozinameran, the COVID-19 vaccine developed by BioNTech and Pfizer.

The analysis is simple enough for the general reader, while nonetheless explaining some highly complex concepts at the cutting edge of biology. From codon substitutions for efficiency and the Ψ-base substitution to avoid the vaccine being destroyed by the immune system, to the complex initialization string required at the start of the RNA sequence, [Bert] clearly explains the clever coding hacks that made the vaccine possible. Particularly interesting to note is the proline substitution, a technique developed in 2017. This allows the production of coronavirus spike proteins in isolation from the whole virus, in order to safely prime the immune system.
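The codon-substitution idea is easy to demonstrate in miniature: because several codons encode the same amino acid, you can swap each for a GC-richer synonym without changing the protein the ribosome builds. This toy Python sketch does exactly that; the tiny codon table covers only the example sequence, and the real sequence engineering is of course vastly more involved.

```python
# Illustrative subset of the genetic code, not the full 64-codon table
SYNONYMS = {
    "AUG": ["AUG"],                        # Met (start), no synonyms
    "UUU": ["UUU", "UUC"],                 # Phe
    "AAA": ["AAA", "AAG"],                 # Lys
    "GGU": ["GGU", "GGC", "GGA", "GGG"],   # Gly
}
AMINO = {"AUG": "M", "UUU": "F", "UUC": "F", "AAA": "K", "AAG": "K",
         "GGU": "G", "GGC": "G", "GGA": "G", "GGG": "G"}


def gc(codon):
    """Count G and C bases in a codon."""
    return sum(base in "GC" for base in codon)


def optimize(rna):
    """Replace each codon with its most GC-rich synonym."""
    codons = [rna[i:i + 3] for i in range(0, len(rna), 3)]
    return "".join(max(SYNONYMS[c], key=gc) for c in codons)


def translate(rna):
    """RNA -> one-letter amino acid string."""
    return "".join(AMINO[rna[i:i + 3]] for i in range(0, len(rna), 3))


virus = "AUGUUUAAAGGU"          # toy stand-in for a viral coding sequence
vax = optimize(virus)           # GC-enriched, same protein
assert translate(virus) == translate(vax)
print(vax)
```

The GC-enriched version encodes an identical protein but, as [Bert] explains, such substitutions improve how efficiently the mRNA is transcribed and expressed.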

It’s a great primer and we can imagine it might inspire some to delve further into the rich world of genetics and biology. We’ve featured other cutting edge stories on COVID-19 too; [Dan Maloney] took a look at how CRISPR techniques are helping with the testing effort. If there’s one thing the 2020 pandemic has shown, it’s humanity’s ability to rapidly develop new technology in the face of a crisis.

Webcam Heart Rate Monitor Brings Photoplethysmography To Your PC

It seems like within the last ten years, every other gadget to be released has some sort of heart rate monitoring capability. Most modern smartwatches can report your BPMs, and we’ve even seen some headphones with the same ability hitting the market. Most of these devices use an optical measurement method in which skin is illuminated (usually by an LED) and a sensor records changes in skin color and light absorption. This method is called photoplethysmography (PPG), and has even been implemented (in a simple form) in smartphone apps in which the data is generated by video of your finger covering the phone camera.

The basic theory of operation here has its roots in an experiment you probably undertook as a child. Did you ever hold a flashlight up to your hand to see the light, filtered red by your blood, shine through? That’s exactly what’s happening here. One key detail that is hard to perceive when a flashlight is illuminating your entire hand, however, is that deoxygenated blood is darker in color than oxygenated blood. By observing the frequency of the light-dark color change, we can back out the heart rate.

This is exactly how [Andy Kong] approached two methods of measuring heart rate from a webcam.

Method 1: The Cover-Up

The first detection scheme [Andy] tried is what he refers to as the “phone flashlight trick”. Essentially, you cover the webcam lens entirely with your finger. Ambient light shines through your skin and produces a video stream that looks like a dark red rectangle. Though it may be imperceptible to us, the color changes ever-so-slightly as your heart beats. An FFT of the raw data gives us a heart rate that’s surprisingly accurate. [Andy] even has a live demo up that you can try for yourself (just remember to clean the smudges off your webcam afterwards).
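The signal chain is simple enough to sketch in a few lines of numpy. Here we fake the “dark red rectangle” as a per-frame brightness trace with a 72 BPM ripple, some drift, and noise, then recover the rate from the strongest FFT peak in the plausible heart-rate band. The numbers are simulated for the sake of a self-contained example; [Andy]’s demo does the equivalent on real webcam frames.

```python
import numpy as np

fps = 30.0                               # typical webcam frame rate
t = np.arange(0, 10, 1 / fps)            # 10 seconds of frames
bpm_true = 72.0

# Simulated per-frame mean brightness of the covered lens: large DC
# level, tiny cardiac ripple, slow drift, and sensor noise.
rng = np.random.default_rng(0)
brightness = (100
              + 0.5 * np.sin(2 * np.pi * (bpm_true / 60) * t)
              + 0.2 * t
              + 0.1 * rng.standard_normal(t.size))

# Remove the DC level and linear drift, then take the FFT
x = brightness - brightness.mean()
x -= np.polyval(np.polyfit(t, x, 1), t)
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(x.size, d=1 / fps)

# Strongest peak within the plausible 40-180 BPM band is the pulse
band = (freqs >= 40 / 60) & (freqs <= 180 / 60)
bpm_est = 60 * freqs[band][np.argmax(spectrum[band])]
print(round(bpm_est, 1))
```

Restricting the search to a physiological band is what makes this robust: drift and flicker live at other frequencies and simply get ignored.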

Method 2: Remote Sensing

Now things are getting a bit more advanced. What if you don’t want to clean your webcam after each time you measure your heart rate? Well, thankfully there’s a remote sensing option as well.

For this method, [Andy] is actually using OpenCV to measure the cyclical swelling and shrinking of blood vessels in your skin by measuring the color change in your face. It’s absolutely mind-blowing that this works, considering the resolution of a standard webcam. He found the most success by focusing on fleshy patches of skin right below the eyes, though he says others recommend taking a look at the forehead.
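Conceptually, the remote method reduces to the same one-dimensional problem: average a patch of skin each frame, detrend the resulting trace, and measure the ripple period. The sketch below simulates a cheek ROI’s mean green value (the channel where the blood-volume signal is strongest) and estimates BPM from the median interval between rising zero-crossings; the real face tracking (e.g. with OpenCV) is omitted to keep the example self-contained.

```python
import numpy as np

fps = 30.0
t = np.arange(0, 10, 1 / fps)
bpm_true = 66.0

# Simulated per-frame mean green value of a cheek ROI: big baseline,
# faint cardiac ripple, slow lighting drift, and camera noise.
rng = np.random.default_rng(1)
roi_green = (120
             + 0.3 * np.sin(2 * np.pi * (bpm_true / 60) * t)
             + 0.5 * np.sin(2 * np.pi * 0.05 * t)
             + 0.03 * rng.standard_normal(t.size))

win = int(fps)                                    # 1-second moving average
baseline = np.convolve(roi_green, np.ones(win) / win, mode="same")
x = roi_green - baseline                          # detrended cardiac ripple
x = np.convolve(x, np.ones(5) / 5, mode="same")   # light denoising
x = x[win:-win]                                   # drop edge artifacts

# Median interval between rising zero-crossings gives the beat period
rising = np.flatnonzero((x[:-1] < 0) & (x[1:] >= 0))
period_s = np.median(np.diff(rising)) / fps
bpm_est = 60.0 / period_s
print(round(bpm_est, 1))
```

Using the median interval rather than a raw crossing count makes the estimate tolerant of the occasional noise-induced spurious crossing, which matters when the ripple is a fraction of a grey level.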

Every now and then we see something that works even though it really seems like it shouldn’t. How is a webcam sensitive enough to measure these minute changes in facial color? Why isn’t the signal uselessly noisy? This project is in good company with other neat heart rate measurement tricks we’ve seen. It’s amazing that this works at all, and even more incredible that it works so well.

TI EZ430-Chronos Turned Medical Alert Wearable

Long before the current smartwatch craze, Texas Instruments released the eZ430-Chronos. Even by 2010s standards, it was pretty clunky. Its simple LCD display and handful of buttons also limited what kind of “smart” tasks it could realistically perform. But it did have one thing going for it: its SDK allowed users to create a custom firmware tailored to their exact specifications.

It’s been nearly a decade since we’ve seen anyone dust off the eZ430-Chronos, but that didn’t stop [ogdento] from turning one into a custom alert device for a sick family member. A simple two-button procedure on the watch will fire off emails and text messages to a pre-defined list of contacts, all without involving a third party or having to pay for a service contract. Perhaps most importantly, the relatively energy-efficient eZ430 doesn’t need to be recharged weekly or even daily, as a modern smartwatch would.

To make the device as simple as possible, [ogdento] went through the source code for the stock firmware and commented out every function beyond the ability to show the time. With the watch’s menu stripped down to the minimum, a new alert function was introduced that can send out a message using the device’s 915 MHz CC1101 radio.

Messages and recipients can easily be modified.

The display even shows “HELP” next to the appropriate button so there’s no confusion. A second button press is required to send the alert, and there’s even a provision for canceling it should the button be pressed accidentally.

On the receiving side, [ogdento] is using a Raspberry Pi with its own CC1101 radio plugged into the USB port. When the Python script running on the Pi picks up a transmission from the eZ430, it starts working through a list of recipients to send messages to. A quick look at the source code shows it would be easy to provide your own contact list should you want to put together your own version of this system.
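The receiver-side dispatch logic boils down to “parse a contact list, then fan the alert out to everyone on it.” Here’s a minimal sketch of that shape, assuming a simple “name,email” text format and a caller-supplied delivery callback; [ogdento]’s actual script uses its own formats and sends real email, so treat the names and layout here as illustrative only.

```python
from dataclasses import dataclass


@dataclass
class Contact:
    name: str
    email: str


def parse_contacts(text):
    """One 'name,email' pair per line; blanks and # comments ignored."""
    contacts = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        name, email = (part.strip() for part in line.split(",", 1))
        contacts.append(Contact(name, email))
    return contacts


def dispatch_alert(packet, contacts, send):
    """Fan the received radio packet out to every contact."""
    body = f"ALERT from watch: {packet}"
    for c in contacts:
        send(c.email, body)


CONTACTS = """
# family first
alice,alice@example.com
bob,bob@example.com
"""

# Collect outgoing messages instead of actually emailing them
outbox = []
dispatch_alert("HELP 2024-01-01 12:00", parse_contacts(CONTACTS),
               lambda addr, body: outbox.append((addr, body)))
print(len(outbox))
```

Injecting the sender as a callback keeps the dispatch logic testable; in a real deployment it would wrap something like `smtplib` or an SMS gateway.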

We’ve seen custom alert hardware before, but as [ogdento] points out, using the eZ430-Chronos provides a considerable advantage in that it’s a turn-key platform. It’s comfortable to wear, reliable, and fairly rugged. While some would argue against trusting independently developed code for such a vital task, at least the hardware is a solved problem.