For their Hackaday Prize entry, [Jithin], [Praveen], [Varunbluboy], and [Georges] are working on SEELablet, a device that will equip budding citizen scientists with control and measurement equipment.
One of the best ‘all-in-one’ lab devices is National Instruments’ VirtualBench, a device that’s an oscilloscope, logic analyzer, function generator, multimeter, and power supply, all crammed into one box. There’s a lot you can do with a device like this, but as you would expect, the name-brand version isn’t priced with middle school students in mind.
In an effort to bring the cost of an all-in-one lab tool down to a price mere mortals can afford, the team behind the SEELablet has combined a single board computer with the capabilities of an oscilloscope, frequency counter, logic analyzer, waveform generator, and a programmable power supply.
This has been a multi-year project for the team, beginning with a Python-powered instrumentation tool, and later a device running this code that’s also a versatile lab tool. If the latest iteration of the project turns out to be all it promises, we can’t wait to see the data this box will produce. There’s a lot you can measure in a fully stocked electronics lab, and this project makes the whole setup much easier to obtain.
The microscope is one of the most useful instruments in the biological sciences, but it’s also an expensive one. Lucky for us, a factory in China can turn out webcams and plastic lenses and sell them for pennies. That’s the idea behind Flypi – a cheap microscope for scientific experiments and diagnostics that’s based on the ever-popular Raspberry Pi.
Flypi is designed to be a simple scientific tool and educational device. With that comes the challenges of being very cheap and very capable. It’s based around a Raspberry Pi and the Pi camera, with the relevant software for taking snapshots, recording movies, and controlling a few different modules that extend the capabilities of this machine. These modules include a Peltier element to heat or cool the sample, a temperature sensor, RGB LED, LED ring, LED matrix, and a special blue LED for activating fluorescent molecules in a sample.
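The Peltier module is the interesting control problem here: a Peltier element heats or cools depending on current direction, so the software only has to pick a drive state from the temperature reading. The sketch below is purely illustrative – the function name, setpoint handling, and hysteresis value are assumptions, not the Flypi’s actual code – but it shows the kind of bang-bang loop such a module typically runs.

```python
# Hypothetical sketch of a Peltier control loop like the Flypi's:
# bang-bang control with a hysteresis band around the setpoint.
# Names and the 0.5 degree deadband are illustrative assumptions.

def peltier_command(current_temp, setpoint, hysteresis=0.5):
    """Return 'heat', 'cool', or 'off' for a Peltier element."""
    if current_temp < setpoint - hysteresis:
        return "heat"   # drive current one way to warm the sample
    if current_temp > setpoint + hysteresis:
        return "cool"   # reverse the polarity to cool the sample
    return "off"        # inside the deadband: leave it alone
```

On a Raspberry Pi, the returned state would set the direction pins of an H-bridge driving the Peltier element; the hysteresis band keeps the element from chattering between heating and cooling.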
The brains behind the Flypi, [Andre Chagas], designed the Flypi to be cheap. He’s certainly managed that with a frame that is mostly 3D printed, and some surprisingly inexpensive electronics. Already the Flypi is doing real science, including tracking bugs wandering around a petri dish and fluorescence microscopy of a zebrafish’s heart. Not bad for a relatively simple tool, and a great entry for the Hackaday Prize.
For her Hackaday Prize entry, [ThunderSqueak] is building an artificial intelligence. P.A.L., the Self-Programming AI Robot, builds on the intelligence displayed by Amazon’s Alexa, Apple’s Siri, and whatever the Google thing is called to create a robot that’s able to learn from its environment, track objects, judge distances, and perform simple tasks.
As with any robotic intelligence, the first question that comes to mind is, ‘what does it look like’. The answer here is, ‘a little bit like Johnny Five.’ [ThunderSqueak] has designed a robotic chassis using treads for locomotion and a head that can emote by moving its eyebrows. Those treads are not a trivial engineering task – the tracks are 3D printed and bolted onto a chain – and building them has been a very, very annoying part of the build.
But an advanced, intelligent robot isn’t defined by how it moves. The real trick here is the software, and for this [ThunderSqueak] has a few tricks up her sleeve. She’s doing voice recognition on a microcontroller, correlating phonemes with their spectral signatures without using much power.
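The low-power angle makes sense: a microcontroller doesn’t need a full FFT to get a spectral signature, since the Goertzel algorithm can extract the energy in a handful of frequency bands cheaply. The sketch below is a guess at the general approach, not [ThunderSqueak]’s actual code – the band frequencies and the dot-product matcher are assumptions for illustration.

```python
# Illustrative sketch of phoneme matching by spectral signature.
# Band choices and template format are assumptions, not P.A.L.'s code.
import math

def goertzel_power(samples, sample_rate, freq):
    """Power in one frequency bin via the Goertzel algorithm --
    far cheaper than a full FFT when only a few bins are needed."""
    k = 2 * math.cos(2 * math.pi * freq / sample_rate)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + k * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - k * s_prev * s_prev2

def spectral_signature(samples, sample_rate, bands=(300, 800, 1500, 2500)):
    """Normalized energy in a few formant-ish bands."""
    sig = [goertzel_power(samples, sample_rate, f) for f in bands]
    norm = math.sqrt(sum(p * p for p in sig)) or 1.0
    return [p / norm for p in sig]

def best_phoneme(signature, templates):
    """Pick the stored template with the highest correlation."""
    return max(templates, key=lambda name: sum(
        a * b for a, b in zip(signature, templates[name])))
```

Each stored phoneme is just a short vector of band energies, so matching is a handful of multiply-accumulates – exactly the kind of workload a small microcontroller can handle on a power budget.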
The purpose of P.A.L. isn’t to have a conversation with a robotic friend in a weird 80s escapade. The purpose of P.A.L. is to build a machine that can learn from its mistakes and learn just a little bit about its environment. This is where the really cool stuff happens in artificial intelligence and makes for an excellent entry for the Hackaday Prize.
Augmented reality is all the rage right now, and it’s all because of Pokemon. Of course, this means the entire idea of augmented reality is now wrapped up in taking pictures of Pidgeys in their unnatural setting. There are more useful applications of augmented reality, as [vijayvictory]’s Hackaday Prize entry shows us. He’s built an augmented reality helmet for firefighters that will detect temperature, gases, smoke, and the user’s own vital signs, displaying the readings on a heads-up display.
The core of the build is a Particle Photon, a WiFi-enabled microcontroller that also gives this helmet the ability to relay data back to a base station, ostensibly one that’s not on fire. To this, [vijayvictory] has added an accelerometer, gas sensor, and a beautiful OLED display mounted just behind a prism. This display overlays the relevant data to the firefighter without obstructing their field of vision.
Right now, this system is fairly basic, but [vijayvictory] has a few more tricks up his sleeve. By expanding this system to include a FLIR thermal imaging sensor, this augmented reality helmet will have the ability to see through smoke. By integrating this system into an existing network and adding a few cool WiFi tricks, this system will be able to locate a downed firefighter using signal trilateration. It’s a very cool device, and one that should be very useful, making it a great entry for The Hackaday Prize.
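Trilateration itself is straightforward geometry: given distances to three known anchor points (derived, say, from WiFi signal strength), subtracting the circle equations pairwise leaves a linear system in the unknown position. The snippet below is a minimal 2-D sketch of that math, not anything from [vijayvictory]’s firmware.

```python
# Minimal 2-D trilateration sketch: locate a point from its distances
# to three anchors at known positions. Illustrative only -- real WiFi
# ranging is noisy and would need a least-squares fit over many anchors.

def trilaterate(p1, d1, p2, d2, p3, d3):
    """Return (x, y) given anchors p1..p3 and distances d1..d3.
    Subtracting the circle equations pairwise gives two linear
    equations: a*x + b*y = c."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1          # zero if the anchors are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

In practice the distances estimated from RSSI are noisy, so a real system would over-determine the fix with more than three base stations, but the core computation is this small.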
Electromyography is a technique used to study and record the electrical signals generated when a muscle contracts. It’s used for medical diagnosis, rehab, kinesiological studies, and is the preferred method of control for robotic prosthetics and exoskeletons. There are a few companies out there with myoelectric products, and the use case for those products is flipping the slides on a PowerPoint presentation. Lucky for us, this project in the Hackaday Prize isn’t encumbered by such trivialities. It’s an open, expandable platform to turn muscle contractions into anything.
As you would expect, reading the electrical signals from muscles requires a little more technical expertise than plugging a cable into an Arduino. This project has opamps in spades, and is more than sensitive enough to serve as a useful sensor platform. Already this project is being used to monitor bruxism – inadvertent clenching or grinding of the jaw – and the results are great.
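Once the op-amps have amplified the raw muscle signal, detecting an event like a jaw clench usually comes down to envelope detection: rectify the signal, smooth it, and compare against a threshold. The sketch below illustrates that standard pipeline; the window size and threshold are placeholder values, not taken from this project.

```python
# Illustrative EMG event detection: rectify, moving-average smooth,
# threshold. Window and threshold values are assumptions for the sketch.

def emg_envelope(samples, window=8):
    """Rectify the EMG signal, then smooth with a moving average."""
    rect = [abs(s) for s in samples]
    env = []
    for i in range(len(rect)):
        chunk = rect[max(0, i - window + 1):i + 1]
        env.append(sum(chunk) / len(chunk))
    return env

def clench_detected(samples, threshold=0.3):
    """Flag a muscle contraction when the envelope crosses a threshold."""
    return max(emg_envelope(samples), default=0.0) > threshold
```

For bruxism monitoring, the same loop would run continuously on samples from the electrodes over the jaw muscle, logging a timestamp whenever the envelope crosses the threshold.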
While it’s doubtful this device will ever be used in a medical context, it is a great little board to add muscle control to a robot arm, or build a very cool suit of power armor. All in all, a very cool entry for The Hackaday Prize.
Go into a fancy drug store, and you might just find one of the most amazing sales demonstrations you’ll ever see. Step right up, take your shoes off, and place your feet onto the magical Dr. Scholl’s machine, and you’ll get a customized readout of how your feet touch the ground. As an added bonus, you’ll also get a recommendation for a shoe insert that will make your feet feel better and your shoes fit better.
There is, of course, one problem with this setup. You don’t stand on a footprint measuring device all day. A better solution to the problem of measuring how your feet hit the ground is doing it while you walk. That’s where [chiprobot]’s Alli-Gait-Or Analysis comes in. It’s that Dr. Scholl’s machine tucked into the sole of a shoe. It can be worn while you walk, and it can tell you exactly how your feet work.
[chiprobot]’s robotic shoes consist of a 3D printed insert that holds eighteen piezo transducers per shoe. These are connected to ADCs, which feed into a microcontroller that sends the data out to a computer. That’s simple enough, but making sense of the data is the real problem.
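One of the simplest things to compute from an array of pressure sensors is the center of pressure – the pressure-weighted average of the sensor positions – whose path across the sole over a stride is a classic gait metric. This is a generic sketch of that calculation, not [chiprobot]’s actual processing code.

```python
# Illustrative center-of-pressure calculation for an in-sole sensor
# array: a weighted average of sensor positions by their readings.
# Sensor layout and units are assumptions, not the project's design.

def center_of_pressure(readings, positions):
    """readings: one ADC value per piezo sensor (18 per shoe here);
    positions: matching (x, y) coordinates of each sensor in mm."""
    total = sum(readings)
    if total == 0:
        return None  # foot is off the ground this sample
    x = sum(r * p[0] for r, p in zip(readings, positions)) / total
    y = sum(r * p[1] for r, p in zip(readings, positions)) / total
    return (x, y)
```

Sampling this at each ADC read and plotting the resulting track heel-to-toe is one way to turn eighteen raw channels into something a clinician, or Blender, can visualize.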
To turn this data into something that could be used for selecting orthotics or simply finding a better shoe, [chiprobot] is plugging this data into Blender and creating some very cool visualizations. It’s good enough to get some serious data off a shoe, and since this Alli-Gait-Or is wearable, the data is far more representative than anything captured by a machine sitting in a drug store.
Building on the work of other Citizen Science efforts, [doctek]’s entry for the Hackaday Prize promises to detect pollution, identify chemicals, and perform other analyses with a simple handheld device. It’s a spectrophotometer, and [doctek] is putting some real engineering into this build.
A spectrophotometer is one of the simplest devices able to perform spectroscopy, requiring only a light source, a photoresistor, and some means of producing monochromatic light. By placing a sample between the light source and the photoresistor, the absorption spectrum of the sample can be measured. With this data, it’s a simple matter to identify the sample.
A light and a photoresistor are simple enough, but as with every precision measurement device, the devil is in the details. [doctek] is using new low-noise, low-offset opamps and precision references to get his data. Some of the parts in the schematic were actually designed in this century – a far cry from the ‘plug the photoresistor into the analog input’ projects we see so often.
[doctek] is using a Teensy 3.0 to drive the electronics and collect the data, and he already has the mechanics of this build pretty much figured out. It’s a great project that shows off some engineering skill, making it a great entry for The Hackaday Prize.