Open-Source Neuroscience Hardware Hack Chat

Join us on Wednesday, February 19 at noon Pacific for the Open-Source Neuroscience Hardware Hack Chat with Dr. Alexxai Kravitz and Dr. Mark Laubach!

There was a time when our planet still held mysteries, and pith-helmeted or fur-wrapped explorers could sally forth and boldly explore strange places for what they were convinced was the first time. But with every mountain climbed, every depth plunged, and every desert crossed, fewer and fewer places remained to be explored, until today there’s really nothing left to discover.

Unless, of course, you look inward to the most wonderfully complex structure ever found: the brain. In humans, the 86 billion neurons contained within our skulls make trillions of connections with each other, weaving the unfathomably intricate pattern of electrochemical circuits that make you, you. Wonders abound there, and anyone seeing something new in the space between our ears really is laying eyes on it for the first time.

But the brain is a difficult place to explore, and specialized tools are needed to learn its secrets. Lex Kravitz, from Washington University, and Mark Laubach, from American University, are neuroscientists who’ve learned that sometimes you have to invent the tools of the trade on the fly. While exploring topics as wide-ranging as obesity, addiction, executive control, and decision making, they’ve come up with everything from simple jigs for brain sectioning to full feeding systems for rodent cages. They incorporate microcontrollers, IoT, and tons of 3D-printing to build what they need to get the job done, and they share these designs on OpenBehavior, a collaborative space for the open-source neuroscience community.

Join us for the Open-Source Neuroscience Hardware Hack Chat this week where we’ll discuss the exploration of the real final frontier, and find out what it takes to invent the tools before you get to use them.

Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This week we’ll be sitting down on Wednesday, February 19 at 12:00 PM Pacific time. If time zones have got you down, we have a handy time zone converter.

Click that speech bubble to the right, and you’ll be taken directly to the Hack Chat group on Hackaday.io. You don’t have to wait until Wednesday; join whenever you want and you can see what the community is talking about.

Reverse-Engineering Brains, One Neuron At A Time

Most posts here are electrical or mechanical, with a few scattered hacks from other fields. Those who also keep up with advances in biomedical research may have noticed certain areas are starting to parallel the electronics we know. [Dr. Rajib Shubert] is in one such field, and picked up on the commonality as well. He thought it’d be interesting to bridge the two worlds by explaining his research using analogies familiar to the Hackaday audience. (Video also embedded below.)

He laid the foundation with a little background, establishing that we’ve been able to see individual static neurons for a while via microscope slides and such, and we’ve been able to see activity of the whole living brain via functional MRI. These methods gradually improved our understanding of neurons, and advances within the past few years have reached an intersection of those two points: [Dr. Shubert] and colleagues now have tools to peer inside a functional brain, teasing out how it works one neuron at a time.

[Dr. Shubert]’s talk makes analogies to electronics hardware, but we can also make a software analogy, treating the brain as a highly optimized (and/or obfuscated) piece of code. Under this analogy, virus stamping a single cell is like isolating a single function and seeing who calls it and what it calls. This pairs well with optogenetics techniques, which can be seen as modifying a function to see how it affects the results in real time. It certainly puts a different meaning on the phrase “working with live code”!
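To stretch that software analogy a bit further (this is purely illustrative, not anything from [Dr. Shubert]’s actual toolchain), here’s a minimal Python sketch that “stamps” a single function the way virus stamping tags a single neuron: wrap one function, then log who calls it and what it calls directly.

```python
import functools
import inspect
import sys

def stamp(func):
    """Tag one function so we can see who calls it and what it calls,
    loosely analogous to virus-stamping a single neuron in a circuit."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        caller = inspect.stack()[1].function          # the upstream "neuron"
        callees = []

        def profiler(frame, event, arg):
            # record only the functions that func itself calls directly
            if event == "call" and frame.f_back is not None \
                    and frame.f_back.f_code is func.__code__:
                callees.append(frame.f_code.co_name)

        sys.setprofile(profiler)
        try:
            result = func(*args, **kwargs)
        finally:
            sys.setprofile(None)
        print(f"{func.__name__}: called by {caller}, calls {callees}")
        return result
    return wrapper

@stamp
def hidden_layer(x):                  # the one "stamped" cell
    return output_stage(x * 2)

def output_stage(x):
    return x + 1

def input_stage():
    return hidden_layer(3)

input_stage()   # -> hidden_layer: called by input_stage, calls ['output_stage']
```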


Hackaday Prize Entry: Grasshopper Neurons

A plague of locusts descends on your garden, and suddenly you realize grasshoppers are very hard to catch. Grasshoppers are nature’s perfect collision avoidance system, thanks to a unique visual system that includes neurons extending directly from the eye to the animal’s legs. For this Hackaday Prize entry, and as a summer research project at Backyard Brains, [Dieu My Nguyen] is studying the neuroscience of grasshopper vision with stabs and shocks.

We visited Backyard Brains about two years ago and found three very interesting projects. The first was a project on optogenetics, or rewiring neurons so flies taste something sweet when they’re exposed to red light. The second was remote-controlled cockroaches. Number three will shock you: a device that allowed me to expand my megalomania by shocking people with the power of my mind. It’s not all fun and games, though. This grasshopper neuron probe will use the Backyard Brains SpikerBox to investigate when those neurons are activated in response to a stimulus.
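The SpikerBox is essentially a bioamplifier; once its output is digitized, picking out spikes comes down to thresholding. As a rough sketch of that kind of analysis (hypothetical code, not Backyard Brains’ own software), here’s a simple amplitude-threshold spike detector run over a recorded trace:

```python
import numpy as np

def detect_spikes(trace, sample_rate_hz, threshold_std=5.0, dead_time_ms=2.0):
    """Return spike times (seconds) where the trace crosses a threshold set
    at a multiple of the estimated baseline noise, with a short dead time so
    a single spike isn't counted twice."""
    noise_std = np.median(np.abs(trace)) / 0.6745      # robust noise estimate
    threshold = threshold_std * noise_std
    dead_samples = int(dead_time_ms * 1e-3 * sample_rate_hz)

    spike_times = []
    last_spike = -dead_samples
    for i, v in enumerate(trace):
        if abs(v) > threshold and (i - last_spike) > dead_samples:
            spike_times.append(i / sample_rate_hz)
            last_spike = i
    return spike_times

# Example on made-up data: 1 s of noise at 10 kHz with three injected "spikes"
rng = np.random.default_rng(0)
trace = rng.normal(0, 0.01, 10_000)
trace[[1200, 4800, 7700]] += 0.2
print(detect_spikes(trace, sample_rate_hz=10_000))     # ~[0.12, 0.48, 0.77]
```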

The utility of studying the common grasshopper to learn about collision and object avoidance may not be obvious at first, but the more you learn about neuroscience, the more apparent the connection to common computer vision tasks becomes. That makes this a great research project and an excellent entry in the Hackaday Prize.

Mouse Brain with neurons exhibiting GFP expression

UC Davis Researchers Use Light To Erase Memories In Genetically Altered Mice

Much like using UV light to erase data from an EPROM, researchers from UC Davis have used light to erase specific memories in mice. [Kazumasa Tanaka, Brian Wiltgen and colleagues] used optogenetic techniques to test current ideas about memory retrieval. Optogenetics has been featured on Hackaday before. It is the use of light to control specific neurons (nerve cells) that have been genetically sensitized to light, which lets researchers watch the effects in real time.

For their research, [Kazumasa Tanaka, Brian Wiltgen and colleagues] created genetically altered mice whose activated neurons expressed GFP, a protein that fluoresces green. This allowed the researchers to easily locate neurons and track which ones responded to learning and memory stimuli. The neurons produced an additional protein that made it possible to “switch them off” in response to light. This enabled the researchers to determine which specific neurons were involved in the learning and memory pathways, as well as to study the behavior of the mice when certain neurons were active or not.

Animal lovers may want to skip this paragraph. The mice were subjected to mild electric shocks after being placed in a cage. They were trained so that when they were put in the cage again, they remembered the previous shock and would freeze in fear. However, when specific neurons in the hippocampus (a structure in the brain) were exposed to light delivered through fiber optics (likely through a hole in each mouse’s skull), the mice happily scampered around the cage, with no memory of the earlier shock to terrify them. The neurons that stored the memory of the shock had been “turned off” by the light exposure.


THP Semifinalist: FNIR Brain Imager

The research tool du jour in the fields of neuroscience and psychology is fMRI, or functional magnetic resonance imaging. It’s basically the same as the MRI machine found in any well-equipped hospital, but with a key difference: it can detect very small variations in blood oxygen levels, and thus areas of activity in the brain. Why is this important? For researchers, finding out what area of the brain is active in response to certain stimuli is a ticket to Tenure Town with stops at Publicationton and Grantville.

fMRI machines are expensive, and [Jeremy]’s submission to The Hackaday Prize aims to do the same thing much more cheaply, in a way that could vastly increase the amount of research done with this technique. How is he doing it? Using the same technology found in high-tech vein finders: infrared light.

[Jeremy]’s idea works much like a photoplethysmograph, better known as a pulse oximeter. Instead of relatively common LEDs, though, [Jeremy] is using near-infrared LEDs, guided by a few papers from Cornell and Drexel that demonstrate the technique can be used to see blood oxygen concentrations in the brain.
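The math behind it is the same trick a pulse oximeter plays: oxygenated and deoxygenated hemoglobin absorb the two near-infrared wavelengths differently, so intensity changes measured at both wavelengths can be solved for the two concentration changes. Here’s a hedged sketch of that calculation using the textbook modified Beer-Lambert law, with placeholder coefficients rather than anything from [Jeremy]’s design:

```python
import numpy as np

# Extinction coefficients [1/(mM*cm)] at two near-infrared wavelengths.
# These are illustrative placeholders, not calibrated values.
EXTINCTION = np.array([
    # HbO2,  Hb
    [1.05,  1.55],   # ~730 nm
    [1.20,  0.78],   # ~850 nm
])

def hemoglobin_change(i_baseline, i_now, separation_cm=3.0, dpf=6.0):
    """Modified Beer-Lambert law: turn detected intensities at two wavelengths
    into relative changes in oxy- and deoxy-hemoglobin concentration (mM)."""
    delta_od = -np.log(np.asarray(i_now) / np.asarray(i_baseline))
    path_cm = separation_cm * dpf            # effective optical path length
    d_hbo2, d_hb = np.linalg.solve(EXTINCTION * path_cm, delta_od)
    return d_hbo2, d_hb

# Example: after activation a bit more 730 nm light and a bit less 850 nm
# light reaches the detector -- the model reads that as more HbO2, less Hb
print(hemoglobin_change(i_baseline=[1.00, 1.00], i_now=[1.02, 0.99]))
```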

Being based on light, the device doesn’t penetrate deeply into the brain. For many use cases this is fine: the motor cortex sits right next to your skull, stretching from ear to ear; vision is handled at the back of your head; and working memory lives right up against your forehead. Being able to scan these areas noninvasively with a wearable device has incredible applications, from letting amputees control prosthetics to controlling video game characters by thought alone.

[Jeremy]’s device is small, about the size of a cellphone, and uses an array of LEDs and photodiodes to assemble an image of what’s going on inside someone’s head. The image will be crude and low-resolution, and it won’t cover the entire brain the way an fMRI can. It also doesn’t cost millions of dollars, making this one of the most scientifically disruptive entries we have for The Hackaday Prize.

You can check out [Jeremy]’s intro video below.


The project featured in this post is a semifinalist in The Hackaday Prize.


One arm controlled by another person

Backyard Brains: Controlling Cockroaches, Fruit Flies, And People

[Greg Gage] and some of the other crew at Backyard Brains have done a TED talk, had a few successful Kickstarters, and, most surprisingly given that pedigree, are actually doing something interesting, fun, and educational. They’re bringing neuroscience to everyone with a series of projects and kits that mutilate cockroaches and send PETA into a tizzy.

[Greg] demonstrated some of his highly modified cockroaches by putting a small Bluetooth backpack on one. The roach had previously been ‘prepared’ by attaching small electrodes to each of its two antennae. The backpack sends a small electrical signal to an antenna every time I swipe the screen of an iPhone; the roach thinks it’s hitting a wall and turns in the direction of the swipe, turning it into a RoboRoach. We’ve seen something like this before, but it never gets old.
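As a rough idea of what the backpack’s logic boils down to (a hypothetical Python sketch with made-up parameter values, not the actual RoboRoach firmware), the swipe simply selects which antenna gets a short burst of pulses:

```python
import time

# Made-up stimulation settings; the real backpack lets you tune these
PULSE_FREQ_HZ = 55
PULSE_WIDTH_MS = 9
BURST_MS = 500

def set_electrode(side, level):
    """Placeholder for driving the left or right antenna electrode (GPIO)."""
    print(f"{side} electrode -> {level}")

def stimulate(side):
    """Send a burst of pulses to one antenna so the roach 'feels a wall'
    on that side and steers away from it."""
    period_s = 1.0 / PULSE_FREQ_HZ
    for _ in range(int(BURST_MS / 1000.0 / period_s)):
        set_electrode(side, 1)
        time.sleep(PULSE_WIDTH_MS / 1000.0)
        set_electrode(side, 0)
        time.sleep(period_s - PULSE_WIDTH_MS / 1000.0)

def on_swipe(direction):
    """Called when the phone reports a swipe over Bluetooth: stimulate the
    antenna opposite the desired turn, since the roach turns away from the
    'wall' it thinks it just touched."""
    stimulate("right" if direction == "left" else "left")

on_swipe("left")   # the roach should veer left
```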

Far from being your one-stop shop for cockroach torture devices, Backyard Brains also has a fairly impressive lab in the basement of their building, filled with grad students and genetically modified organisms. [Cort Thompson] is working with fruit flies genetically modified so that a neuron activates when they’re exposed to a specific pulse of light. It’s called optogenetics, and [Cort] has a few of these guys whose ‘I’m tasting something sweet’ neuron fires when they’re exposed to a pulse of red light.

Of course, controlling cockroaches is one thing, and genetically engineering fruit flies is a little more impressive. How about controlling other people? After I was hooked up to an EMG box that turned muscle activity in my arm into static on a speaker, [Greg] asked for a volunteer. [Jason Kridner], the guy behind the BeagleBone, was tagging along with us and stepped up to have two electrodes attached over his ulnar nerve. With a little bit of circuitry that’s available in the Backyard Brains store, I was able to control [Jason]’s wrist with my mind. Extraordinarily cool stuff.
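The signal chain on the controlling side is simple enough to sketch in a few lines: rectify the EMG, smooth it into an envelope, and fire the stimulator whenever the envelope crosses a threshold. This is a hypothetical Python illustration of that idea, not the analog circuit Backyard Brains actually sells:

```python
import numpy as np

def emg_envelope(samples, sample_rate_hz, window_ms=50):
    """Rectify the raw EMG and smooth it with a moving average to get a
    rough measure of how hard the muscle is contracting."""
    window = max(1, int(window_ms * 1e-3 * sample_rate_hz))
    return np.convolve(np.abs(samples), np.ones(window) / window, mode="same")

def stimulate_mask(samples, sample_rate_hz, threshold=0.3):
    """True wherever the controller's flex is strong enough to trigger a
    pulse on the volunteer's ulnar nerve (threshold value is made up)."""
    return emg_envelope(samples, sample_rate_hz) > threshold

# Example with fake data: quiet baseline, then a flex halfway through
rng = np.random.default_rng(1)
emg = rng.normal(0, 0.05, 2000)
emg[1000:1500] += rng.normal(0, 0.8, 500)            # the "flex"
print(stimulate_mask(emg, sample_rate_hz=1000).any())   # -> True
```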

There was far too much awesome stuff at Backyard Brains for a video of reasonable length. Not shown are projects with scorpions and an improved version of the RoboRoach that gives a roach a little encouragement to move forward. We’ll put up a ‘cutting room floor’ video of that a bit later.