Hackaday Prize Entry: An AI Robot

For her Hackaday Prize entry, [ThunderSqueak] is building an artificial intelligence. P.A.L., the Self-Programming AI Robot, builds on the kind of intelligence displayed by Amazon’s Alexa, Apple’s Siri, and whatever the Google thing is called, with the goal of a robot that can learn from its environment, track objects, judge distances, and perform simple tasks.

As with any robotic intelligence, the first question that comes to mind is, ‘what does it look like’. The answer here is, ‘a little bit like Johnny Five.’ [ThunderSqueak] has designed a robotic chassis using treads for locomotion and a head that can emote by moving its eyebrows. Those treads are not a trivial engineering task – the tracks are 3D printed and bolted onto a chain – and building them has been a very, very annoying part of the build.

But an intelligent robot isn’t defined by how it moves. The real trick here is the software, and for this [ThunderSqueak] has a few tricks up her sleeve. She’s doing voice recognition on a microcontroller, correlating phonemes to their spectral signatures without using much power.
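The write-up doesn’t spell out the exact algorithm, but low-power phoneme matching along these lines usually means computing a handful of band energies and comparing them against stored templates. Here’s a minimal host-side sketch of that idea – the Goertzel frontend is a common microcontroller choice, and the band frequencies and templates are placeholders, not anything from the project:

```python
import math

def goertzel_power(samples, sample_rate, freq):
    """Power at one frequency via the Goertzel algorithm: one
    multiply-accumulate per sample, far cheaper than a full FFT."""
    k = 2.0 * math.cos(2.0 * math.pi * freq / sample_rate)
    s1 = s2 = 0.0
    for x in samples:
        s1, s2 = x + k * s1 - s2, s1
    return s1 * s1 + s2 * s2 - k * s1 * s2

def spectral_signature(samples, sample_rate, bands=(300, 800, 1500, 2500)):
    """Normalized band powers as a crude phoneme fingerprint.
    The band frequencies are illustrative placeholders."""
    powers = [goertzel_power(samples, sample_rate, f) for f in bands]
    total = sum(powers) or 1.0
    return [p / total for p in powers]

def classify(signature, templates):
    """Pick the stored template with the smallest Euclidean distance."""
    return min(templates, key=lambda name: sum(
        (a - b) ** 2 for a, b in zip(signature, templates[name])))
```

The Goertzel algorithm is popular on small chips precisely because it evaluates a single frequency bin in constant memory, so a few bins cost far less than buffering a frame for an FFT.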

The purpose of P.A.L. isn’t to have a conversation with a robotic friend in a weird 80s escapade. The purpose of P.A.L. is to build a machine that can learn from its mistakes and learn just a little bit about its environment. This is where the really cool stuff happens in artificial intelligence and makes for an excellent entry for the Hackaday Prize.

Hackaday Prize Entry: Augmented Reality For Firefighters

Augmented reality is all the rage right now, and it’s all because of Pokemon. Of course, this means the entire idea of augmented reality is now wrapped up in taking pictures of Pidgeys in their unnatural setting. There are more useful applications of augmented reality, as [vijayvictory]’s Hackaday Prize entry shows us. He’s built an augmented reality helmet for firefighters that will detect temperature, gases, smoke and the user’s own vital signs, displaying the readings on a heads-up display.

The core of the build is a Particle Photon, a WiFi-enabled microcontroller that also gives this helmet the ability to relay data back to a base station, ostensibly one that’s not on fire. To this, [vijayvictory] has added an accelerometer, gas sensor, and a beautiful OLED display mounted just behind a prism. This display overlays the relevant data to the firefighter without obstructing their field of vision.

Right now, this system is fairly basic, but [vijayvictory] has a few more tricks up his sleeve. By expanding this system to include a FLIR thermal imaging sensor, this augmented reality helmet will have the ability to see through smoke. By integrating this system into an existing network and adding a few cool WiFi tricks, this system will be able to locate a downed firefighter using signal trilateration. It’s a very cool device, and one that should be very useful, making it a great entry for The Hackaday Prize.
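Trilateration itself is simple once you have range estimates from three fixed points: subtract the circle equations pairwise and solve the resulting 2×2 linear system. A rough sketch follows – the RSSI-to-distance model and its parameters (RSSI at 1 m, path-loss exponent) are assumptions that would need calibration in a real, smoke-filled building, not anything from the project:

```python
import math

def rssi_to_distance(rssi_dbm, rssi_at_1m=-40.0, path_loss_exp=3.0):
    """Log-distance path-loss model. Both parameters are assumed
    values and would need on-site calibration."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exp))

def trilaterate(p1, p2, p3, d1, d2, d3):
    """2D position from three anchor points and ranges: subtract the
    circle equations pairwise, leaving a 2x2 linear system."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a, b = 2.0 * (x2 - x1), 2.0 * (y2 - y1)
    c = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d, e = 2.0 * (x3 - x2), 2.0 * (y3 - y2)
    f = d2**2 - d3**2 - x2**2 + x3**2 - y2**2 + y3**2
    den = a * e - b * d  # zero if the three anchors are collinear
    return (c * e - b * f) / den, (a * f - c * d) / den
```

In practice RSSI ranging indoors is noisy enough that you’d average many readings and treat the result as a rough zone rather than a precise point.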

Hackaday Prize Entry: Let Your Muscles Do The Work

Electromyography is a technique used to study and record the electrical signals generated when a muscle contracts. It’s used for medical diagnosis, rehab, kinesiological studies, and is the preferred method of control for robotic prosthetics and exoskeletons. There are a few companies out there with myoelectric products, and the use case for those products is flipping the slides on a PowerPoint presentation. Lucky for us, this project in the Hackaday Prize isn’t encumbered by such trivialities. It’s an open, expandable platform to turn muscle contractions into anything.

As you would expect, reading the electrical signals from muscles requires a little more technical expertise than plugging a cable into an Arduino. This project has opamps in spades, and is more than sensitive enough to serve as a useful sensor platform. Already this project is being used to monitor bruxism – inadvertent clenching or grinding of the jaw – and the results are great.
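The usual recipe for turning a raw EMG waveform into a discrete event like a jaw clench is rectification, a moving-RMS envelope, and a threshold. A minimal sketch of that pipeline – the window size and threshold here are placeholder values, not figures from the project:

```python
import math

def moving_rms(samples, window=8):
    """Rectify-and-smooth envelope: RMS over a sliding window."""
    env = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1):i + 1]
        env.append(math.sqrt(sum(x * x for x in chunk) / len(chunk)))
    return env

def detect_events(envelope, threshold):
    """Indices where the envelope rises through the threshold."""
    events, above = [], False
    for i, v in enumerate(envelope):
        if v > threshold and not above:
            events.append(i)
            above = True
        elif v <= threshold:
            above = False
    return events
```

For bruxism monitoring you’d log the timestamps of those events overnight and look at their frequency and duration, rather than reacting to any single contraction.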

While it’s doubtful this device will ever be used in a medical context, it is a great little board to add muscle control to a robot arm, or build a very cool suit of power armor. All in all, a very cool entry for The Hackaday Prize.

Hackaday Prize Entry: Piezo Gait Analysis

Go into a fancy drug store, and you might just find one of the most amazing sales demonstrations you’ll ever see. Step right up, take your shoes off, and place your feet onto the magical Dr. Scholl’s machine, and you’ll get a customized readout of how your feet touch the ground. As an added bonus, you’ll also get a recommendation for a shoe insert that will make your feet feel better and your shoes fit better.

There is, of course, one problem with this setup. You don’t stand on a footprint measuring device all day. A better solution to the problem of measuring how your feet hit the ground is doing it while you walk. That’s where [chiprobot]’s Alli-Gait-Or Analysis comes in. It’s that Dr. Scholl’s machine tucked into the sole of a shoe. It can be worn while you walk, and it can tell you exactly how your feet work.

[chiprobot]’s robotic shoes consist of a 3D printed insert that holds eighteen piezo transducers per shoe. These are connected to ADCs, which feed into a microcontroller which sends the data out to a computer. That’s simple enough, but making sense of the data is the real problem.
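One common way to start making sense of an array of pressure sensors is to collapse each frame into a center of pressure – the force-weighted average of the sensor positions – and watch how it travels across the insole during a step. A sketch, assuming you already have the 18 readings and their (x, y) positions (the coordinates here are illustrative, not from [chiprobot]’s layout):

```python
def center_of_pressure(positions, readings):
    """Force-weighted average of sensor positions for one frame.
    `positions` are (x, y) insole coordinates, one per transducer."""
    total = sum(readings)
    if total == 0:
        return None  # foot is off the ground
    x = sum(px * r for (px, _), r in zip(positions, readings)) / total
    y = sum(py * r for (_, py), r in zip(positions, readings)) / total
    return x, y
```

Plotting that point frame by frame gives the classic heel-to-toe progression line that gait analysts look for.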

To turn this data into something that could be used for selecting orthotics or simply finding a better shoe, [chiprobot] is plugging this data into Blender and creating some very cool visualizations. It’s good enough to get some serious data off a shoe, and since this Alli-Gait-Or is wearable, the data is much more valid than what you’d get from a machine sitting in a drug store.

Hackaday Prize Entry: A Simple Spectrophotometer

Building on the work of other Citizen Science efforts, [doctek]’s entry for the Hackaday Prize promises to detect pollution, identify chemicals, and perform other analyses with a simple handheld device. It’s a spectrophotometer, and [doctek] is putting some real engineering into this build.

A spectrophotometer is one of the simplest devices able to perform spectroscopy, requiring only a light source, a photoresistor, and some means of producing monochromatic light. By putting a sample in front of the photoresistor, the absorption spectrum of the sample can be measured. With this data, it’s a simple matter to identify the sample.
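The measurement rests on the Beer-Lambert law: absorbance is the negative log of the ratio between the light intensity through the sample and a reference intensity, and it scales linearly with concentration. In code, assuming you have intensity readings from the detector:

```python
import math

def absorbance(sample_intensity, reference_intensity):
    """Beer-Lambert: A = -log10(I / I0), where I0 is the intensity
    through a blank (no sample) at the same wavelength."""
    return -math.log10(sample_intensity / reference_intensity)

def concentration(a, molar_absorptivity, path_length_cm):
    """c = A / (epsilon * l); epsilon is found empirically from a
    calibration curve of known concentrations."""
    return a / (molar_absorptivity * path_length_cm)
```

So a sample that passes a tenth of the reference light has an absorbance of 1.0, and comparing absorbances across wavelengths is what lets you match a sample against known absorption spectra.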

A light and a photoresistor are simple enough, but as with every precision measurement device, the devil is in the details. [doctek] is using new, low-noise, low-offset opamps and precision references to get his data. Some of the parts in the schematic were actually designed in this century – a far cry from the ‘plug the photoresistor into the analog input’ projects we see so often.

[doctek] is using a Teensy 3.0 to drive the electronics and collect the data, and he already has the mechanics of this build pretty much figured out. It’s a solid project that shows off some real engineering skill, making it a great entry for The Hackaday Prize.

Hackaday Prize Entry: An Internet Of Things Microscope

For their entry into the Citizen Scientist portion of the Hackaday Prize, the folks at Arch Reactor, the St. Louis hackerspace, are building a microscope. Not just any microscope – this one is low-cost, digital, and has surprisingly high magnification and pretty good optics. It’s the Internet of Things Microscope, and like all good apparatus for citizen science, it’s a remarkable tool for classrooms and developing countries.

When you think of ‘classroom microscope’, you’re probably thinking about a pile of old optics sitting in the back of a storage closet. These microscopes are purely optical, without the ability to take digital pictures. The glass is good, but you’re not going to get a scanning stage when you’re dealing with 30-year-old gear made for a classroom full of sticky-handed eighth graders.

The Internet of Things Microscope includes a scanning stage that moves across the specimen on the X and Y axes, stitching digital images together to create a very large image. That’s a killer feature for a cheap digital microscope, and the folks at Arch Reactor are doing this with a few cheap stepper motors and stepper motor drivers.
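Scanning stages almost always raster in a serpentine (boustrophedon) pattern, so the slow axis only ever advances one row while the fast axis alternates direction – less travel, fewer direction reversals to fight backlash on cheap steppers. A sketch of generating that visit order for the stitcher (the grid dimensions here are arbitrary, not from the Arch Reactor design):

```python
def serpentine_scan(cols, rows):
    """Visit order for a raster scan: reverse every other row so the
    fast axis never has to fly back across the whole specimen."""
    path = []
    for r in range(rows):
        cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        path.extend((c, r) for c in cs)
    return path
```

Each grid cell would be spaced slightly less than one camera field of view apart, so adjacent tiles overlap enough for the stitching software to register them.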

The rest of the electronics are built around a Raspberry Pi, a Raspberry Pi camera (which recently got a nice resolution upgrade), and some microscope eyepieces and objectives. Everything else is 3D printed, making this a very cheap and very accessible microscope that has some killer features.

Hackaday Prize Entry: The Cheapest Logic Analyzer

There are piles of old 128MB and 256MB sticks of RAM sitting around in supply closets and in parts bins. For his Hackaday Prize project, [esot.eric] is turning these obsolete sticks of RAM into something useful – a big, fast logic analyzer. It’s cheap, and simple enough that it can be built on a breadboard.

If using old SDRAM in strange configurations seems familiar, you’re correct. This project is based on [esot.eric]’s earlier AVR logic analyzer project that used a slow AVR to measure 32 channels of logic at 30 megasamples per second. The only way this build was possible was by hacking an old stick of RAM to ‘free run’, automatically logging data to the RAM and reading it out with an AVR later.

This project expands on the earlier work by using bigger sticks of RAM at higher speeds, with the ultimate goal being a 32-bit, 133MS/s logic analyzer that is more of a peripheral than a single, monolithic project. With a Raspberry Pi Zero, a stick of RAM, and a few miscellaneous logic chips, this project can become anything from a logic analyzer to a data logger to an oscilloscope. It’s weird, yes, but the parts to make this very handy tool can be found in any hackerspace or workshop, making it a great trick for the enterprising hardware hacker.
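Once the RAM has been read back out, a capture is just an array of 32-bit words, one per sample clock. Turning that into something a human can use mostly means extracting per-channel edge timestamps – a sketch of that post-processing step, assuming one word per sample at a known sample rate (the function name and framing are illustrative, not [esot.eric]’s actual tooling):

```python
def edge_times(samples, channel, sample_rate_hz):
    """(time_s, new_level) for every transition on one bit lane of
    the captured words, one word per sample clock."""
    mask = 1 << channel
    prev, out = samples[0] & mask, []
    for i, word in enumerate(samples[1:], start=1):
        cur = word & mask
        if cur != prev:
            out.append((i / sample_rate_hz, 1 if cur else 0))
            prev = cur
    return out
```

An edge list like this is also the natural input for protocol decoders, which is what would let the same capture hardware act as a UART, SPI, or I2C analyzer.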