A Sleep Monitor For Minimum Outlay

There are a variety of instruments used in sleep studies to measure bodily activity during sleep and consequent sleep quality. Many of them use techniques that perhaps aren’t so easy to replicate on the bench, but an EEG or electroencephalograph to measure brain waves can be achieved using a readily-available module. [Ben Jabituya] shows us a sleep monitor using one of these modules, a Mikroe EEG Click.

The brains of the operation is an Adafruit Feather M0 Adalogger, which is hooked up to a headband containing the sensing electrodes. The write-up gives us a round-up of the available EEG boards, which should be handy for any experimenters in this field. The firmware, meanwhile, was written in the Arduino IDE; it logs the raw sample data to an SD card, and one surprise is just how little space a whole night’s worth of results takes up.
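A quick back-of-envelope calculation shows why the files stay small. The sample rate and resolution below are assumptions for the sake of illustration, not figures from the build:

```python
# Rough storage estimate for a night of raw EEG logging.
# Assumed figures: samples at 256 Hz, 2 bytes each; the actual project may differ.
SAMPLE_RATE_HZ = 256
BYTES_PER_SAMPLE = 2
HOURS = 8

samples = SAMPLE_RATE_HZ * 3600 * HOURS
size_mb = samples * BYTES_PER_SAMPLE / 1e6
print(f"{samples:,} samples, about {size_mb:.1f} MB per night")
# ~7.4 million samples, roughly 15 MB: easily within a small SD card.
```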

Finally, a Python script was used to process the data and turn it into a spectrogram to look at brain activity through the night. He envisages using the device for triggering lucid dreaming during REM sleep, but we can see it might be rather useful for sleep disorder sufferers, too. Take a look at it in the video below the break.
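The write-up doesn’t reproduce the script itself, but a minimal sketch of the same idea might look like the following. The file name, sample rate, and data format are all assumptions here:

```python
# Minimal sketch: turn a night of raw EEG samples into a spectrogram.
# Assumptions: samples logged as little-endian int16 at 256 Hz in "eeg.raw".
import numpy as np
import matplotlib.pyplot as plt
from scipy.signal import spectrogram

FS = 256  # assumed sample rate in Hz

samples = np.fromfile("eeg.raw", dtype=np.int16).astype(np.float64)
samples -= samples.mean()  # remove DC offset

# 30-second windows give reasonable frequency resolution for sleep-band activity.
f, t, Sxx = spectrogram(samples, fs=FS, nperseg=FS * 30, noverlap=FS * 15)

plt.pcolormesh(t / 3600, f, 10 * np.log10(Sxx + 1e-12), shading="gouraud")
plt.ylim(0, 30)  # delta through beta bands
plt.xlabel("Time (hours)")
plt.ylabel("Frequency (Hz)")
plt.title("Overnight EEG spectrogram")
plt.show()
```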

AI Watches You Sleep; Knows When You Dream

If you’ve never been a patient at a sleep laboratory, monitoring a person as they sleep is an involved process of wires, sensors, and discomfort. Seeking a better method, MIT researchers — led by [Dina Katabi] and in collaboration with Massachusetts General Hospital — have developed a device that can non-invasively identify the stages of sleep in a patient.

Approximately the size of a laptop and mounted on a wall near the patient, the device measures the minuscule changes in reflected low-power RF signals. The wireless signals are analyzed by a deep neural network, which predicts the patient’s sleep stage — light, deep, or REM — and spares anyone the task of manually combing through the data. Despite the sensitivity of the device, it is able to filter out irrelevant motions and interference, focusing on the breathing and pulse of the patient.

What’s novel here isn’t so much the hardware as it is the processing methodology. The researchers use both convolutional and recurrent neural networks along with what they call an adversarial training regime:

Our training regime involves 3 players: the feature encoder (CNN-RNN), the sleep stage predictor, and the source discriminator. The encoder plays a cooperative game with the predictor to predict sleep stages, and a minimax game against the source discriminator. Our source discriminator deviates from the standard domain-adversarial discriminator in that it takes as input also the predicted distribution of sleep stages in addition to the encoded features. This dependence facilitates accounting for inherent correlations between stages and individuals, which cannot be removed without degrading the performance of the predictive task.
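Purely to illustrate how those three players interact, here is a minimal Keras-style sketch of such a training loop. The layer sizes, label set, number of subjects, and loss weighting are all assumptions for the sake of the example; the paper’s actual model is considerably more involved:

```python
# Sketch of the three-player adversarial setup quoted above.
# All shapes, layer sizes, and the LAMBDA weighting are illustrative guesses.
import tensorflow as tf

N_STAGES = 4      # wake, light, deep, REM (assumed label set)
N_SUBJECTS = 25   # number of source individuals (assumed)

# Feature encoder: CNN front-end over an RF window, RNN summary over time.
encoder = tf.keras.Sequential([
    tf.keras.layers.Conv1D(32, 7, activation="relu", padding="same"),
    tf.keras.layers.MaxPool1D(4),
    tf.keras.layers.GRU(64),
])
predictor = tf.keras.layers.Dense(N_STAGES, activation="softmax")
# The discriminator sees the encoded features AND the predicted stage
# distribution, as in the quoted description.
discriminator = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(N_SUBJECTS, activation="softmax"),
])

cce = tf.keras.losses.SparseCategoricalCrossentropy()
opt_task = tf.keras.optimizers.Adam(1e-3)
opt_disc = tf.keras.optimizers.Adam(1e-3)
LAMBDA = 0.1  # weight of the adversarial term (assumed)

@tf.function
def train_step(rf_window, stage_label, subject_id):
    # 1) Update the discriminator: guess which subject the features came from.
    with tf.GradientTape() as tape:
        feats = encoder(rf_window)
        stages = predictor(feats)
        disc_in = tf.concat([feats, stages], axis=-1)
        d_loss = cce(subject_id, discriminator(disc_in))
    grads = tape.gradient(d_loss, discriminator.trainable_variables)
    opt_disc.apply_gradients(zip(grads, discriminator.trainable_variables))

    # 2) Update encoder + predictor: predict the sleep stage while fooling
    #    the discriminator (the minimax game).
    with tf.GradientTape() as tape:
        feats = encoder(rf_window)
        stages = predictor(feats)
        disc_in = tf.concat([feats, stages], axis=-1)
        task_loss = cce(stage_label, stages)
        adv_loss = -cce(subject_id, discriminator(disc_in))
        loss = task_loss + LAMBDA * adv_loss
    task_vars = encoder.trainable_variables + predictor.trainable_variables
    grads = tape.gradient(loss, task_vars)
    opt_task.apply_gradients(zip(grads, task_vars))
    return task_loss, d_loss
```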

Anyone out there want to give this one a try at home? We’d love to see a HackRF and GNU Radio used to record the RF data. The researchers compare their RF signals to WiFi, so repurposing a 2.4 GHz radio to send out repeating, uniform transmissions is a good place to start. Dump it into TensorFlow and report back.
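For the recording side, a bare-bones GNU Radio flowgraph that dumps raw IQ samples from a HackRF to a file might look something like this. The center frequency, sample rate, gains, and file name are placeholders, and you would still need a transmitter on the other side of the room:

```python
# Minimal sketch: record raw 2.4 GHz IQ samples from a HackRF for later analysis.
from gnuradio import gr, blocks
import osmosdr

class RfRecorder(gr.top_block):
    def __init__(self, freq=2.437e9, samp_rate=2e6, outfile="rf_capture.iq"):
        gr.top_block.__init__(self, "RF sleep capture")

        src = osmosdr.source(args="numchan=1 hackrf=0")
        src.set_sample_rate(samp_rate)
        src.set_center_freq(freq)
        src.set_gain(14)      # RF gain; adjust for your environment
        src.set_if_gain(24)
        src.set_bb_gain(20)

        sink = blocks.file_sink(gr.sizeof_gr_complex, outfile)
        self.connect(src, sink)

if __name__ == "__main__":
    tb = RfRecorder()
    tb.start()
    input("Recording... press Enter to stop.\n")
    tb.stop()
    tb.wait()
```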
