Reflectance transformation imaging (RTI), also known as polynomial texture mapping, is an imaging technique that captures the fine surface detail of an object. It’s used in archaeology to photograph faint scrawlings on cave walls, to capture every detail of a coin for coin collectors, and to measure very slight surface changes in a work of art.
RTI does this by shining light over an object from a series of known angles and then using image processing to combine those shots into the best possible image. Despite being only a few LEDs and a bit of software, commercial RTI systems are outrageously expensive. For his Hackaday Prize entry, [leszekmp] is building his own RTI system. It’ll cost about $600, making this one of the most accessible ways for Citizen Scientists to capture the best image possible.
At its core, RTI is simply shining light onto an object from different directions and taking synchronized pictures of the object from directly above. As you can imagine, putting LEDs in a dome is the obvious solution to this problem, and already [leszekmp] has made three systems that work well on domes up to a meter in diameter. The electronics are as simple as an Arduino shield and a few MOSFETs, and the dome itself is an off-the-shelf component. It’s a great project that enables better photography, and one of the simplest and best entries we’ve seen for The Hackaday Prize.
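The “polynomial texture mapping” part of RTI is worth a closer look: for every pixel, the software fits a low-order polynomial mapping light direction to observed brightness, which then lets you relight the object from any angle after the fact. Here’s a minimal sketch of that per-pixel fit in Python with NumPy — the function names are ours, and this is the classic six-term biquadratic PTM model, not necessarily what [leszekmp]’s software does:

```python
import numpy as np

def fit_ptm(light_dirs, intensities):
    """Fit the six-term biquadratic PTM model for one pixel:
    I = a0*lu^2 + a1*lv^2 + a2*lu*lv + a3*lu + a4*lv + a5,
    given N light directions (lu, lv) and N observed intensities."""
    lu, lv = light_dirs[:, 0], light_dirs[:, 1]
    A = np.column_stack([lu**2, lv**2, lu * lv, lu, lv, np.ones_like(lu)])
    coeffs, *_ = np.linalg.lstsq(A, intensities, rcond=None)
    return coeffs

def relight(coeffs, lu, lv):
    """Evaluate the fitted polynomial for a new, unseen light direction."""
    return (coeffs[0] * lu**2 + coeffs[1] * lv**2 + coeffs[2] * lu * lv
            + coeffs[3] * lu + coeffs[4] * lv + coeffs[5])
```

With a dome of, say, 50 LEDs you get 50 samples per pixel, which is far more than the six unknowns — hence the least-squares fit.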
The goal for the Citizen Science portion of the Hackaday Prize is to empower people to create their own devices to perform their own analyses. For [Adam]’s project, he’s designing a device that measures the health of waterways simply by looking at the light availability through the water column. It’s called PULSE, the Profiling Underwater Light SEnsor, and is able to monitor changes that are caused by algal blooms, suspended sediments, or sewer runoff.
PULSE takes the form of a small electronic depth charge that can be lowered into a water column from anything from a research vessel to a kayak. On the top of this sinkable tube is a sensor to measure photosynthetically active radiation (PAR). This sensor provides data on light irradiance through the water column and gives great insight into the health of photosynthesis, marine plant life, and ultimately the health of any aquatic environment.
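The standard way to turn an irradiance profile like this into a single health metric is the diffuse attenuation coefficient, fitted from Beer–Lambert-style exponential decay of light with depth. A minimal sketch of that calculation — the function name is ours, and we’re assuming the usual log-linear fit rather than anything specific to PULSE’s firmware:

```python
import numpy as np

def attenuation_coefficient(depths_m, irradiance):
    """Estimate the diffuse attenuation coefficient Kd (1/m) from a
    profile of PAR readings, assuming exponential decay with depth:
    E(z) = E0 * exp(-Kd * z).  Taking logs makes it a linear fit."""
    slope, _intercept = np.polyfit(depths_m, np.log(irradiance), 1)
    return -slope  # Kd: larger means murkier water
```

A clear lake might show Kd around 0.1–0.2 per meter; an algal bloom or sediment plume pushes it much higher, which is exactly the kind of change PULSE is meant to catch.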
Measuring the light available for photosynthesis through a water column is great, but PULSE isn’t a one trick pony. On the bottom of the aquatic probe are three sensors designed to measure photosynthesis, dissolved organic matter, and turbidity. These sensors are really just a few LEDs and photodiodes, proving just how much science you can do with simple tools.
The goal of the Citizen Science portion of the Hackaday Prize is to put scientific discovery in the hands of everyone. PULSE is a great example of this: it’s a relatively simple device that can be thrown over the side of a boat, lowered to the bottom of a lake, and hoisted back up again. It’s inexpensive to build, but still provides great data. That’s remarkable, and an excellent example of what we’re looking for in the Hackaday Prize.
For their Hackaday Prize entry, [Jithin], [Praveen], [Varunbluboy], and [Georges] are working on SEELablet, a device that will equip budding citizen scientists with control and measurement equipment.
One of the best ‘all-in-one’ lab devices is National Instruments’ VirtualBench, a device that’s an oscilloscope, logic analyzer, function generator, multimeter, and power supply, all crammed into one box. There’s a lot you can do with a device like this, but as you would expect, the name-brand version isn’t priced for middle school students.
In an effort to bring the cost of an all-in-one lab tool down to a price mere mortals can afford, the team behind the SEELablet have combined a single board computer with the capabilities of an oscilloscope, frequency counter, logic analyzer, waveform generator, and a programmable power supply.
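To give a flavor of what the waveform generator half of a device like this does internally: most generate a signal by clocking a precomputed sample table out of a DAC. A minimal sketch, with illustrative names and no claim that SEELablet works this exact way:

```python
import numpy as np

def waveform_table(shape="sine", samples=256, amplitude=1.0):
    """Build one period of a waveform as a sample table, the way a
    DDS-style generator stores it before streaming it to a DAC."""
    t = np.arange(samples) / samples  # normalized phase 0..1
    if shape == "sine":
        return amplitude * np.sin(2 * np.pi * t)
    if shape == "triangle":
        return amplitude * (4 * np.abs(t - 0.5) - 1)
    if shape == "sawtooth":
        return amplitude * (2 * t - 1)
    raise ValueError(f"unknown shape: {shape}")
```

The output frequency is then just the DAC sample rate divided by the table length (or a phase-accumulator step through it), which is why these instruments can sweep frequency without recomputing anything.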
This has been a multi-year project for the team, beginning with a Python-powered instrumentation tool, and later a device running this code that’s also a versatile lab tool. If the latest iteration of the project turns out to be all it promises, we can’t wait to see the data this box will produce. There’s a lot you can measure in a fully stocked electronics lab, and this project makes the whole setup much easier to obtain.
The microscope is one of the most useful instruments for the biological sciences, but good ones are expensive. Lucky for us, a factory in China can turn out webcams and plastic lenses and sell them for pennies. That’s the idea behind Flypi – a cheap microscope for scientific experiments and diagnostics that’s based on the ever-popular Raspberry Pi.
Flypi is designed to be a simple scientific tool and educational device. With that comes the challenges of being very cheap and very capable. It’s based around a Raspberry Pi and the Pi camera, with the relevant software for taking snapshots, recording movies, and controlling a few different modules that extend the capabilities of this machine. These modules include a Peltier element to heat or cool the sample, a temperature sensor, RGB LED, LED ring, LED matrix, and a special blue LED for activating fluorescent molecules in a sample.
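The Peltier module is the interesting one from a control standpoint: a Peltier element heats or cools depending on drive polarity, so holding a sample at temperature is a small thermostat problem. Here’s a minimal sketch of the kind of hysteresis (bang-bang) logic involved — the function and its return convention are ours, not Flypi’s actual firmware:

```python
def peltier_command(temp_c, target_c, hysteresis=0.5):
    """Decide the Peltier drive for a sample-stage thermostat.
    Returns +1 (heat), -1 (cool), or 0 (hold).  The hysteresis band
    keeps the element from toggling on every noisy sensor reading."""
    if temp_c < target_c - hysteresis:
        return +1
    if temp_c > target_c + hysteresis:
        return -1
    return 0
```

For example, targeting 37 °C for a biological sample, a reading of 20 °C commands heating, 40 °C commands cooling, and anything within half a degree of target holds steady.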
The brains behind the Flypi, [Andre Chagas], designed the Flypi to be cheap. He’s certainly managed that with a frame that is mostly 3D printed, and some surprisingly inexpensive electronics. Already the Flypi is doing real science, including tracking bugs wandering around a petri dish and fluorescence microscopy of a zebrafish’s heart. Not bad for a relatively simple tool, and a great entry for the Hackaday Prize.
For her Hackaday Prize entry, [ThunderSqueak] is building an artificial intelligence. P.A.L., the Self-Programming AI Robot, is building on the intelligence displayed by Amazon’s Alexa, Apple’s Siri, and whatever the Google thing is called, to build a robot that’s able to learn from its environment, track objects, judge distances, and perform simple tasks.
As with any robotic intelligence, the first question that comes to mind is, ‘what does it look like’. The answer here is, ‘a little bit like Johnny Five.’ [ThunderSqueak] has designed a robotic chassis using treads for locomotion and a head that can emote by moving its eyebrows. Those treads are not a trivial engineering task – the tracks are 3D printed and bolted onto a chain – and building them has been a very, very annoying part of the build.
But an advanced, intelligent robot isn’t defined by how it moves. The real trick here is the software, and for this [ThunderSqueak] has a few tricks up her sleeve. She’s doing voice recognition on a microcontroller, correlating phonemes to their spectral signatures without using much power.
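Matching phonemes by spectral signature on a microcontroller usually means reducing each audio frame to a handful of frequency-band energies and comparing against stored templates. A minimal sketch of that idea in Python — the function names and band layout are illustrative, not [ThunderSqueak]’s implementation:

```python
import numpy as np

def band_energies(samples, n_bands=8):
    """Reduce an audio frame to a coarse spectral signature: the
    magnitude spectrum summed into a few frequency bands, roughly
    the amount of math a small microcontroller can afford."""
    spectrum = np.abs(np.fft.rfft(samples))
    bands = np.array_split(spectrum, n_bands)
    e = np.array([b.sum() for b in bands])
    return e / (e.sum() + 1e-12)  # normalize away overall loudness

def closest_phoneme(signature, templates):
    """Match a signature against stored phoneme templates by
    Euclidean distance; return the best-matching label."""
    return min(templates, key=lambda k: np.linalg.norm(signature - templates[k]))
```

A vowel like ‘ah’ concentrates energy in the low bands while a sibilant like ‘ss’ sits high, so even this crude signature separates them.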
The purpose of P.A.L. isn’t to have a conversation with a robotic friend in a weird 80s escapade. The purpose of P.A.L. is to build a machine that can learn from its mistakes and learn just a little bit about its environment. This is where the really cool stuff happens in artificial intelligence and makes for an excellent entry for the Hackaday Prize.
Augmented reality is all the rage right now, and it’s all because of Pokemon. Of course, this means the entire idea of augmented reality is now wrapped up in taking pictures of Pidgeys in their unnatural setting. There are more useful applications of augmented reality, as [vijayvictory]’s Hackaday Prize entry shows us. He’s built an augmented reality helmet for firefighters that will detect temperature, gases, smoke and the user’s own vital signs, displaying the readings on a heads-up display.
The core of the build is a Particle Photon, a WiFi-enabled microcontroller that also gives this helmet the ability to relay data back to a base station, ostensibly one that’s not on fire. To this, [vijayvictory] has added an accelerometer, gas sensor, and a beautiful OLED display mounted just behind a prism. This display overlays the relevant data to the firefighter without obstructing their field of vision.
Right now, this system is fairly basic, but [vijayvictory] has a few more tricks up his sleeve. By expanding this system to include a FLIR thermal imaging sensor, this augmented reality helmet will have the ability to see through smoke. By integrating this system into an existing network and adding a few cool WiFi tricks, this system will be able to locate a downed firefighter using signal trilateration. It’s a very cool device, and one that should be very useful, making it a great entry for The Hackaday Prize.
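Trilateration itself is straightforward once you have distance estimates (say, from WiFi signal strength) to three fixed base stations: subtracting one circle equation from the other two turns the problem into a linear system. A minimal 2D sketch — the function name is ours, and real RSSI-derived ranges would be far noisier than this clean math suggests:

```python
import numpy as np

def trilaterate(anchors, distances):
    """Locate a transmitter from its distances to three fixed anchors.
    Each anchor i gives (x - xi)^2 + (y - yi)^2 = ri^2; subtracting the
    first equation from the others cancels the squares, leaving a
    linear system in (x, y)."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = distances
    A = np.array([[2 * (x2 - x1), 2 * (y2 - y1)],
                  [2 * (x3 - x1), 2 * (y3 - y1)]])
    b = np.array([r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2,
                  r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2])
    return np.linalg.solve(A, b)
```

With noisy range estimates you’d use more than three anchors and a least-squares solve instead, but the geometry is the same.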
Electromyography is a technique used to study and record the electrical signals generated when a muscle contracts. It’s used for medical diagnosis, rehab, kinesiological studies, and is the preferred method of control for robotic prosthetics and exoskeletons. There are a few companies out there with myoelectric products, and the use case for those products is flipping the slides on a PowerPoint presentation. Lucky for us, this project in the Hackaday Prize isn’t encumbered by such trivialities. It’s an open, expandable platform to turn muscle contractions into anything.
As you would expect, reading the electrical signals from muscles requires a little more technical expertise than plugging a cable into an Arduino. This project has opamps in spades, and is more than sensitive enough to serve as a useful sensor platform. Already this project is being used to monitor bruxism – inadvertent clenching or grinding of the jaw – and the results are great.
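Once the opamps have amplified the raw muscle signal, the usual processing step is envelope detection: rectify the trace and smooth it, so a clench shows up as a sustained rise you can threshold. A minimal sketch of that pipeline — the function names and threshold scheme are illustrative, not this project’s actual firmware:

```python
import numpy as np

def emg_envelope(samples, window=50):
    """Full-wave rectify an EMG trace and smooth it with a moving
    average -- the standard way to turn raw muscle activity into a
    slowly varying 'effort' signal."""
    rectified = np.abs(samples - np.mean(samples))  # remove DC, rectify
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def clench_detected(envelope, threshold):
    """Flag samples where the envelope exceeds a calibration threshold,
    e.g. to count jaw-clenching (bruxism) events overnight."""
    return envelope > threshold
```

For bruxism monitoring you’d calibrate the threshold against a deliberate clench, then log every crossing with a timestamp.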
While it’s doubtful this device will ever be used in a medical context, it is a great little board to add muscle control to a robot arm, or build a very cool suit of power armor. All in all, a very cool entry for The Hackaday Prize.