Counting Eggs With A Webcam

You’ll have to dig out your French dictionary (or Google Translate) for this one, but it is worth it. [Nicolas Giraud] has been experimenting with ways to use a webcam to detect the number of eggs chickens have laid in a chicken coop. This page documents these experiments with a number of different algorithms for automatically counting the eggs and notifying the owner. The system is simple, built around a Pi running Debian Jessie Lite and a cheap USB webcam. An LED running off one of the GPIO pins illuminates the eggs, and the camera then captures the image for analysis.
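For a taste of what that analysis step can look like, here’s a minimal OpenCV sketch in the same spirit (our own illustration, not [Nicolas]’s code): treat the LED-lit eggs as bright circular blobs and count them with a Hough transform. The thresholds and radii are placeholders you’d tune for a real coop.

```python
import cv2

# Grab one frame from the USB webcam (device 0) with the LED on.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (9, 9), 2)
    # Eggs lit by the LED show up as bright, roughly circular blobs;
    # HoughCircles finds them. Radii are in pixels and depend on
    # camera placement, so tune minRadius/maxRadius for your coop.
    circles = cv2.HoughCircles(blur, cv2.HOUGH_GRADIENT, dp=1.2,
                               minDist=40, param1=100, param2=30,
                               minRadius=15, maxRadius=60)
    count = 0 if circles is None else circles.shape[1]
    print(f"Eggs detected: {count}")
```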


NASA Knows Where The Meteors Are

NASA has been tracking bright meteoroids (“fireballs”) using a distributed network of video cameras pointed upwards. While we usually think of NASA in the context of multi-bazillion-dollar rocket ships, this operation is clearly shoestring. This is a hack worthy of Hackaday.


The basic idea is that with many wide-angle video cameras capturing the night sky, and a little bit of image processing, identifying meteoroids in the night sky should be fairly easy. When enough cameras capture the same meteoroid, one can use triangulation to back out the path of the meteoroid in 3D, estimate its mass, and more. It’s surprising how many there are to see on any given night.
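The triangulation step is straightforward geometry: given each camera’s position and a unit direction vector toward the meteoroid (recovered from pixel coordinates and the camera’s calibration), the meteoroid sits where the two sight lines pass closest to each other. A minimal numpy sketch, assuming you already have those positions and directions:

```python
import numpy as np

def triangulate(p1, d1, p2, d2):
    """Least-squares intersection of two sight lines.

    p1, p2: 3D camera positions; d1, d2: unit direction vectors
    toward the meteoroid. Returns the point closest to both rays
    (assumes the rays aren't parallel).
    """
    # Solve for ray parameters t1, t2 minimizing
    # |(p1 + t1*d1) - (p2 + t2*d2)|^2.
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    t1, t2 = np.linalg.solve(A, b)
    return ((p1 + t1 * d1) + (p2 + t2 * d2)) / 2.0
```

Do that for every video frame and you get points along the trajectory; frame-to-frame differences give velocity, and a deceleration model turns brightness and velocity into a mass estimate.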

You can watch the videos of a meteoroid event from any camera, watch the cameras live, and even download the meteoroid’s orbital parameters. We’re bookmarking this website for the next big meteor shower.

The work is apparently based on [Rob Weryk]’s ASGARD system, for which the code is unfortunately unavailable. But it shouldn’t be all that hard to hack something together with a single-board computer, camera, and OpenCV. NASA’s project is limited to the US so far, but we wonder how much more data could be collected with a network of cameras all over the globe. So which of you are going to take up our challenge? Build your own version and let us know about it!
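As a starting point, the crudest workable detector is probably frame differencing: a fireball is a bright streak that wasn’t there a frame ago. A hypothetical OpenCV loop (the threshold values are guesses you’d tune against your camera’s noise):

```python
import cv2

cap = cv2.VideoCapture(0)  # wide-angle camera aimed at the night sky
ok, prev = cap.read()
if not ok:
    raise SystemExit("no camera found")
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Difference consecutive frames and threshold: stars barely move,
    # but a meteor lights up a trail of pixels all at once.
    diff = cv2.absdiff(gray, prev)
    _, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 50:
        cv2.imwrite("event.png", frame)  # save the frame for review
    prev = gray
```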

Between this project and the Radio Meteor Zoo, we’re surprised at how much public information there is out there about the rocky balls of fire that rain down on us every night, and will eventually be responsible for our extinction. At least we can be sure we’ll get it on film.

Hallucinating Machines Generate Tiny Video Clips

Hallucination is the erroneous perception of something that’s actually absent, or in other words: a possible interpretation of training data. Researchers from MIT and UMBC have developed and trained a generative machine-learning model that learns to generate tiny videos at random. The hallucination-like clips, just 64×64 pixels in size, are somewhat plausible, but also a bit spooky.

The machine-learning model behind these artificial clips is capable of learning from unlabeled “in-the-wild” training videos and relies mostly on the temporal coherence of subsequent frames as well as the presence of a static background. It learns to disentangle foreground objects from the background and extracts the overall dynamics from the scenes. The trained model can then be used to generate new clips at random (as shown above), or from a static input image (as shown in pairs below).
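That disentangling is baked into the architecture: the generator emits a moving foreground stream, one static background frame, and a per-pixel mask that blends the two, so the network is forced to put the motion in the foreground. The compositing step itself is just a weighted sum, sketched here in numpy with random stand-ins for the network outputs:

```python
import numpy as np

T, H, W = 32, 64, 64  # clip length and resolution from the paper

# In the real model these tensors come out of the generator network;
# here they are random stand-ins with the right shapes.
foreground = np.random.rand(T, H, W, 3)  # moving content, per frame
background = np.random.rand(1, H, W, 3)  # a single static frame
mask = np.random.rand(T, H, W, 1)        # per-pixel blend in [0, 1]

# Each pixel is either animated foreground or frozen background.
video = mask * foreground + (1 - mask) * background
print(video.shape)  # (32, 64, 64, 3)
```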

Currently, the team limits the clips to a resolution of 64×64 pixels and 32 frames in duration in order to decrease the amount of required training data, which is still at 7 TB. Despite obvious deficiencies in terms of photorealism, the little clips have been judged “more realistic” than real clips by about 20 percent of the participants in a psychophysical study the team conducted. The code for the project (Torch7/LuaJIT) can already be found on GitHub, together with a pre-trained model. The project will also be shown in December at the 2016 NIPS conference.

My Take On Assistive Tech For The Hackaday Prize

We’re in the last few weeks for entries in the 2016 Hackaday Prize — specifically, the challenge is to show off your take on assistive technology. This is a hugely broad category and I’ve been thinking about it for a while. I’m sure there’s a ton of low-hanging fruit that’s not obvious to everyone. This would be a great time to hit up the comments below and leave your “hey, I always thought someone should make…” ideas. I’m looking forward to reading them, and it might just inspire someone to spend the next couple of weeks hammering out a prototype to enter.

For me, it’s medication. I knew this could be a challenging problem, having gone through a few cycles of prescription medicines in my life. But recently I helped out a family member who was suddenly on many medications taken at eight different times a day — including once, twice, three times, and six times per day. This was further compounded by sleep deprivation (having to set alarms at night to take the medicine) and drowsy/woozy effects from the medicine. I can tell you firsthand that this is really tough for anyone to deal with, and it’s incredibly easy to make a mistake or not be able to remember if you took a dose.

Pill Organizers Do No More or Less

We’ve seen a number of pill organizers before, and that’s what I reached for in this case. However, the organizer I had only has four slots for each day. I didn’t hack it (other than writing on the doors with a Sharpie to mark when to take each dose), but even if I’d added buttons or LEDs, I’m not convinced it would be a marked improvement.

What you see above is my proposal for the medicine problem. Smartphones have become ubiquitous and the processing power and cameras of even budget phones are mind blowing. I think it is entirely possible to write an app that uses computer vision to recognize pills and sync them with the schedule. This may mean whipping the phone out of your pocket, or designing a pill box that has a phone stand next to it (saying that makes me think of using RPi and a Pi camera). Grab your pills and validate them under the camera.
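As a rough illustration of that recognition step (our own sketch, nowhere near a finished medical device): reduce each pill photo to a color histogram when the pill is first learned, then match the validation photo against those references. The pill names and file paths here are made up.

```python
import cv2

def pill_signature(image_path):
    """Boil a pill photo down to a hue/saturation histogram."""
    img = cv2.imread(image_path)
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [30, 32],
                        [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

# Reference photos captured when each prescription was learned.
known = {"metformin": pill_signature("metformin.png"),
         "lisinopril": pill_signature("lisinopril.png")}

def identify(photo_path):
    """Return the known pill whose histogram correlates best."""
    sig = pill_signature(photo_path)
    return max(known, key=lambda name:
               cv2.compareHist(known[name], sig, cv2.HISTCMP_CORREL))
```

In practice you’d want shape and imprint recognition too, since plenty of pills share a color, but histograms are enough to show the idea.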

Useful Augmented Reality

The screen of the phone would use augmented reality to overlay information about the pills it sees — you know, like Pokemon Go but in a way that enriches your life. ‘Pills, catch ’em all!’ New pills can be learned on the fly, delivering the user to a screen to identify the pill and the dosing schedule. Taking the validation picture will record when the medicine was taken, and the natural extension of this system is a pharmacy’s ability to push your dose schedule to your account when you pick up the prescription. A stretch goal would be keeping an eye out for drug interactions.

This is all very much like how hospitals do it — they’re scanning bar codes on the packaging and the patient bracelet and recording it. This would be an easier user experience and quite frankly I think companies already in this space (like Snapchat and Niantic) could whip this up in a single-day hackathon no problem.

Is it the perfect system? Maybe not. But there is no perfect system or we’d be using it by now. We need you, the world’s talent pool, to step up and make life a little better. Do it in prototype form by October 3rd and you’ll be eligible for one of twenty $1000 cash prizes and a chance at winning the Hackaday Prize. But even if you don’t build a single thing, one idea could be the spark that lets others change the world for the better. So let’s hear it!

Add Robotic Farming To Your Backyard With Farmbot Genesis

Growing your own food is a fun hobby and generally as rewarding as people say it is. However, it does have its quirks and it definitely requires quite the time input. That’s why it was so satisfying to watch Farmbot push a weed underground. Take that!

Farmbot is a project that has been going on for a few years now; it was a semifinalist in the 2014 Hackaday Prize, and that development time shows. The robot can plant, water, analyze, and weed a garden filled with arbitrarily chosen plant life. It’s low power and low maintenance. On top of that, every single bit is documented on their website, and it’s really well done and thorough. They are gearing up to sell kits, but if you want one now, just build it yourself.

The bot itself is exactly what you’d expect if you were to pick out the cheapest, most accessible way to build a robot: aluminum extrusions, plate metal, and 3D-printer parts make up the frame. The brain is a Raspberry Pi hooked to its regular companion, an Arduino. On top of all this is a fairly comprehensive software stack.

The user can lay out the garden graphically. They can get as macro or micro as they’d like about the routines the robot uses. The robot will happily come to life at intervals and manage the garden. They hope that by selling kits they’ll interest a whole slew of hackers who can contribute back to the problem of small-scale robotic farming.
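To make that concrete, here’s a toy version of the scheduling idea: plants laid out as coordinates on the bed, visited on an interval. The function names and G-code-style moves are stand-ins, not Farmbot’s actual API.

```python
import time

# Garden layout: plant name -> (x, y) position on the bed, in mm.
garden = {"basil": (200, 300), "kale": (600, 300), "carrot": (1000, 450)}

def move_to(x, y):
    print(f"G00 X{x} Y{y}")      # gantry move, G-code style

def water(seconds):
    print(f"  watering for {seconds}s")

def watering_pass():
    # Visit plants in bed order so the gantry doesn't zigzag.
    for plant, (x, y) in sorted(garden.items(), key=lambda p: p[1]):
        move_to(x, y)
        water(3)

while True:
    watering_pass()
    time.sleep(6 * 60 * 60)      # come to life every six hours
```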

Hackaday Prize Entry: Harmonicas, Candy, And Van Halen

Watch enough How It’s Made, and you’ll soon become very enthusiastic about computer vision and compressed air. In factories all around the world, production lines automatically sort the wheat from the chaff by running a product underneath a camera and blowing defective product off the line.

For his Hackaday Prize entry, [Fabien] is attempting this same task. He’s building a machine that will rapidly sort candy with computer vision and precisely controlled jets of air. He’s also planning for the Van Halen reunion and building a CNC harmonica.

Right now, the design has a hopper full of M&Ms dropping through a channel where a camera looks at each individual piece of candy. A Raspberry Pi, camera, and OpenMV detect all the red, yellow, brown, and blue M&Ms, and send that information to a computer controlling a suite of pneumatic valves. When these valves open, candy of each color is shuffled off into its own bin. It’s the perfect device for someone responsible for reading Van Halen’s rider.
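Here’s a hedged sketch of the color-classification half, using plain OpenCV rather than the project’s actual pipeline; the HSV ranges are rough guesses that would need calibrating under the machine’s lighting:

```python
import cv2
import numpy as np

# Approximate HSV ranges per candy color; calibrate on real frames.
COLOR_RANGES = {
    "red":    ((0, 120, 80),  (8, 255, 255)),
    "yellow": ((22, 120, 80), (35, 255, 255)),
    "blue":   ((95, 120, 80), (130, 255, 255)),
    "brown":  ((8, 80, 30),   (20, 255, 140)),
}

def classify(frame):
    """Return the color whose mask covers the most pixels."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    counts = {name: cv2.countNonZero(
                  cv2.inRange(hsv, np.array(lo), np.array(hi)))
              for name, (lo, hi) in COLOR_RANGES.items()}
    return max(counts, key=counts.get)
```

The winning color then picks which pneumatic valve to pulse as the candy falls past its bin.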

In an interesting little side project, [Fabien] needed a way to test the pneumatic valves before building the color sensor and candy chute. He had a harmonica lying around, and built something we’re surprised we’ve never seen before. It’s a CNC harmonica, capable of belting out a few tunes. You can check out that testing video after the break.


Eye Tracking Makes The Musical Eye Conductor For Everyone!

For his final project at the Copenhagen Institute of Interaction Design, [Andreas Refsgaard] decided to make something that matters: a system that allows anyone to control a musical instrument using only their eyes and facial expressions. Someone should enter this into a certain contest that’s running…

Dubbed the Eye Conductor, [Andreas]’s creation is a highly customizable system whose control interface can be operated using only your eyes and some facial expressions. Designed to let everyone enjoy playing music, the system was user-tested at schools, at housing communities for people with physical disabilities, and with anyone [Andreas] could find in a wheelchair. His intent is to continue the project so that all people can enjoy playing music.
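The Eye Conductor runs on a commercial eye tracker and its own software, but the core mapping is simple enough to sketch. This hypothetical Python fragment (using the mido MIDI library, with the gaze feed assumed to come from the tracker’s API) maps vertical gaze position to a pitch and uses a blink, or a dwell timer for users who can’t blink on demand, as the trigger:

```python
import mido

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # C major, one octave

def gaze_to_note(gaze_y, screen_height=1080):
    """Look higher on the screen, play a higher note."""
    idx = int((1 - gaze_y / screen_height) * len(SCALE))
    return SCALE[max(0, min(idx, len(SCALE) - 1))]

out = mido.open_output()  # default MIDI output port

def on_gaze_event(gaze_y, blink):
    # Gaze alone just aims; the blink (or dwell) fires the note.
    if blink:
        out.send(mido.Message('note_on', note=gaze_to_note(gaze_y)))
```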

The system is open, designed for inclusion, and can be customized to fit the physical abilities of whoever is using it.
