Reverse-Engineering Brains, One Neuron At A Time

Most posts here are electrical or mechanical, with a few scattered hacks from other fields. Those who also keep up with advances in biomedical research may have noticed certain areas are starting to parallel the electronics we know. [Dr. Rajib Shubert] is in one such field, and picked up on the commonality as well. He thought it’d be interesting to bridge the two worlds by explaining his research using analogies familiar to the Hackaday audience. (Video also embedded below.)

He laid the foundation with a little background, establishing that we’ve been able to see individual static neurons for a while via microscope slides and the like, and that we’ve been able to see activity of the whole living brain via functional MRI. These methods gradually improved our understanding of neurons, and advances within the past few years have brought those two extremes together: [Dr. Shubert] and colleagues now have tools to peer inside a functioning brain, teasing out how it works one neuron at a time.

[Dr. Shubert]’s talk makes analogies to electronics hardware, but we can also make a software analogy, treating the brain as a highly optimized (and/or obfuscated) piece of code. Under this analogy, virus stamping a single cell is like isolating a single function and seeing who calls it and whom it calls. This pairs well with optogenetics techniques, which are like modifying a function to see how it affects the result in real time. It certainly puts a different meaning on the phrase “working with live code”!
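To make that analogy concrete, here’s a purely illustrative Python sketch (nothing from the talk, just hypothetical code): a decorator that “stamps” one function so we can see who calls it, and optionally perturbs its output so we can watch the downstream effect, much like optogenetic stimulation.

```python
import functools
import inspect

def stamp(func, perturb=None):
    """'Virus stamp' a single function: log who calls it, and optionally
    perturb its output to watch the downstream effect (the optogenetics
    half of the analogy)."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        caller = inspect.stack()[1].function       # who called this "neuron"?
        result = func(*args, **kwargs)
        print(f"{func.__name__} called by {caller}, returned {result}")
        return perturb(result) if perturb else result
    return wrapper

def weigh(x):          # an ordinary "neuron" buried deep inside the program
    return 2 * x

def network(x):        # the rest of the "brain" that depends on it
    return weigh(x) + 3

weigh = stamp(weigh, perturb=lambda r: r + 1)   # stamp it and stimulate it
print(network(5))      # logs "weigh called by network, returned 10", prints 14
```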

Continue reading “Reverse-Engineering Brains, One Neuron At A Time”

When The Going Gets Tough, These Wheels Transform To Tracks

When we want to build something to go where wheels could not, the typical solution is to use tracks. But the greater mobility comes with trade-offs, one example being that tracked vehicles can’t go as fast as their wheeled counterparts. Information released by DARPA’s Ground X-Vehicle Technologies (GXV-T) program showed what might come out of asking “why can’t we switch to tracks just when we need them?”

This ambitious goal to literally reinvent the wheel was tackled by Carnegie Mellon’s National Robotics Engineering Center. They delivered the “Reconfigurable Wheel-Track” (RWT), which can either roll like a wheel or travel on its tracks. An HMMWV serves as an appropriate demonstration chassis, with either two or all four of its wheels replaced by RWTs. In the video (embedded below) it is seen quickly transforming from one mode to the other while moving, an obviously desirable feature that looks challenging to implement. This might not be as dramatic a transformation as a walking robot that can roll up into a wheel, but it has the advantage of being more immediately feasible for human-scale vehicles.

The RWT is not the only terrain mobility project in this DARPA announcement, but this specific idea is one we would love to see scaled down into a 3D-printable robot module. And though our Hackaday Prize Robotics Module Challenge has already concluded, there are more challenges still to come. The other umbrella of GXV-T is “crew augmentation”, giving operators a better idea of what’s going on around them. The projects there might inspire something you can submit to our upcoming Human-Computer Interface Challenge, so check them out!

Continue reading “When The Going Gets Tough, These Wheels Transform To Tracks”

After The Sun Set On San Mateo, LED Takes Over Hackaday’s BAMF Meetup

After this Spring’s Bay Area Maker Faire closed down for Saturday night and kicked everybody out, the fun moved on to O’Neill’s Irish Pub where Hackaday and Tindie held our fifth annual meetup for fellow Maker Faire attendees. How do we find like-minded hackers in a crowded bar? It’s easy: look for tables lit by LEDs and say hello. It was impossible to see everything people had brought, but here are a few interesting samples.

Continue reading “After The Sun Set On San Mateo, LED Takes Over Hackaday’s BAMF Meetup”

Evolving The 3D Printed Linear Actuator

Our open source community invites anyone with an idea to build upon the works of those who came before. Many of us have encountered a need to control linear motion and adapted an inexpensive hobby servo for the task. [Michael Graham] evaluated existing designs and believed he had ideas to advance the state of the art. Our Hackaday Prize judges agreed, placing his 3D Printed Servo Linear Actuator among the twenty winners of our Robotics Module Challenge.

[Michael]’s actuator follows in the footsteps of other designs based on a rack-and-pinion gear, such as this one featured on these pages, but he approached the design problem from the perspective of a mechanical engineer. The design incorporates several compliant features to be tolerant of variances between 3D printers (and slicers, and filaments, etc.), improving the odds of a successful print and therefore a successful project. Beginners learning to design for 3D printing (and even some veterans) would find his design tips document well worth the few minutes of reading time.
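For a back-of-the-envelope feel for what a servo-driven rack-and-pinion can do, here’s a quick sketch with made-up numbers (a standard 180° hobby servo and a 12 mm pitch-diameter pinion, not figures from [Michael]’s design): rack travel is simply arc length along the pinion’s pitch circle.

```python
import math

# Hypothetical figures for illustration only -- not [Michael]'s actual design.
pinion_pitch_diameter_mm = 12.0   # pitch diameter of the printed pinion
servo_sweep_deg = 180.0           # sweep of a standard positional hobby servo

# Rack travel is arc length along the pinion's pitch circle.
stroke_mm = math.pi * pinion_pitch_diameter_mm * (servo_sweep_deg / 360.0)
print(f"Usable stroke: {stroke_mm:.1f} mm")          # about 18.8 mm

# Smallest linear step, assuming roughly 1 degree of servo resolution.
step_mm = math.pi * pinion_pitch_diameter_mm / 360.0
print(f"Resolution: {step_mm:.3f} mm per degree")    # about 0.105 mm
```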

Another useful feature of his actuator design is the 20mm x 20mm screw mounting system. Visible on either end of the output slider, it allows mixing and matching from a set of accessories that bolt onto the actuator. He is already off and running down this path, and now faces the challenge of having too many things to share while keeping them all organized and usable by everyone.

This flexible construction approach lets him realize different ideas within the modular system. He brought one item (a variant of his Mug-O-Matic) to the Hackaday + Tindie Meetup at Bay Area Maker Faire, and we’re sure there will be more. Given the thoughtful design and extensive documentation of his project, we expect to see his linear servos adopted by others and appearing in other contexts as well.

This isn’t the only linear actuator we’ve come across. It isn’t even the only winning linear actuator of our Robotics Module Challenge, but the other one is focused on meeting different constraints like compactness. They are different tools for different needs – and all worthy additions to our toolbox of mechanical solutions.

Badge Bling And More At LayerOne 2018

The security conference LayerOne 2018 took place this past weekend in Pasadena, California. A schedule conflict meant most of our crew was at Hackaday Belgrade, but I went to LayerOne to check it out as a first-time attendee. It was a weekend full of deciphering an enigmatic badge, hands-on learning about physical security, admiring impressive demos, and building a crappy robot.

Continue reading “Badge Bling And More At LayerOne 2018”

Hacking For Learning And Laughs: The Makers Of Oakwood School

The tagline of Bay Area Maker Faire is “Inspire the Future”, and there was plenty of inspiration for our future generation. There were exhibits encouraging children to get hands-on with projects to call their own, and many schools exhibiting student projects and telling the stories of what they’ve done. Then there were exhibitors like the Oakwood School STEAM Council, who earned a little extra recognition for masterfully accomplishing both simultaneously.

[Marcos Arias], chair of the council, explained that each exhibit on display has two layers. Casual booth visitors see inviting hands-on activities designed to delight kids. Less obvious is that each of these experiences is the culmination of work by Oakwood 7th through 12th grade students. Some students were on hand to staff the activities, and they were proud to talk about their work leading up to Maker Faire with any visitors who expressed interest.

Continue reading “Hacking For Learning And Laughs: The Makers Of Oakwood School”

Modern Wizard Summons Familiar Spirit

In European medieval folklore, a practitioner of magic may call for assistance from a familiar spirit that takes the form of an animal. [Alex Glow] is our modern-day Merlin who invoked the magical incantations of 3D printing, Arduino, and Raspberry Pi to summon her familiar Archimedes: The AI Robot Owl.

The key attraction in this build is Google’s AIY Vision kit, specifically the vision processing unit that tremendously accelerates image classification tasks running on the attached Raspberry Pi Zero W. Instead of taking several seconds to analyze each image, classification can now run several times per second, all performed locally with no connection to Google’s cloud required. (See our earlier coverage for more technical details.) The default demo application of a Google AIY Vision kit is a “joy detector” that looks for faces and attempts to determine whether each face is happy or sad. We’ve previously seen this functionality mounted on a robot dog.
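To give a sense of how little glue code the kit needs, here’s a minimal sketch of a joy-detector-style loop. It assumes the kit’s bundled Python API (picamera plus the aiy.vision face detection model, which reports a joy score per face); treat the names and the threshold as approximations of the shipped demo rather than [Alex]’s code.

```python
from picamera import PiCamera
from aiy.vision.inference import CameraInference
from aiy.vision.models import face_detection

with PiCamera(sensor_mode=4, resolution=(1640, 1232)) as camera:
    # Inference runs on the kit's vision bonnet, not the Pi Zero W's CPU,
    # so face results stream in several times per second.
    with CameraInference(face_detection.model()) as inference:
        for result in inference.run():
            for face in face_detection.get_faces(result):
                if face.joy_score > 0.8:   # threshold chosen arbitrarily
                    print("Happy face spotted -- time to dispense a sticker!")
```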

[Alex] aimed to go beyond the default app (and the default box) to create Archimedes, who was to reward happy people with a sticker. As a moving robotic owl, Archimedes had far more crowd appeal than the vision kit’s default cardboard box. All the kit components have been integrated into Archimedes’ head: one eye is the expected Pi camera, while the other is actually the kit’s piezo buzzer. The vision kit’s LED-illuminated button now tops the dapper owl’s hat.

Archimedes was created to join in Google’s promotional efforts. Their presence at this Maker Faire consisted of two tents: an introductory “Learn to Solder” tent where people could create a blinky LED badge, and another focused on their line of AIY kits like this vision kit, filled with demos of what the kits can do aside from really cool robot owls.

Hopefully these promotional efforts helped many AIY kits find new homes in the hands of creative makers. It’s pretty exciting that such a powerful and inexpensive neural net processor is now widely available, and we look forward to many more AI-powered hacks to come.

Continue reading “Modern Wizard Summons Familiar Spirit”