The Handsfree Icebucket Challenge Backpack


The ALS Ice Bucket Challenge has taken the internet by storm as TV stars, musicians, athletes, kids, students, and everyone in between have thrown freezing water all over themselves in an effort to raise awareness (and millions of dollars) to help cure the neurodegenerative disease. So when [Christopher] was challenged by a friend, he decided to make an ice bucket backpack that would pour the water from above without him having to use his hands.

The wearable device uses a barometric pressure sensor, attached to an Arduino Uno, that is triggered when air is blown into a tube. Once activated, the pouring process begins, drenching the person below in ice-cold water. It’s a little unnecessary, but it gets the job done in a fun, maker-style way. Now if you make something similar, don’t forget to actually support the cause and donate money.
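
[Christopher]’s firmware isn’t reproduced here, but the trigger logic is simple enough to sketch. A minimal Arduino example, assuming an analog-output pressure sensor on A0 and a bucket-tipping servo on pin 9 (both hypothetical stand-ins for his actual hardware), might look like this:

```cpp
#include <Servo.h>

// Hypothetical wiring: an analog-output pressure sensor on A0 and a servo
// that tips the bucket on pin 9. [Christopher]'s real build differs; treat
// this as a rough sketch of the idea only.
const int SENSOR_PIN = A0;
const int SERVO_PIN  = 9;
const int TRIGGER_DELTA = 40;   // how far above the resting reading a "blow" must rise

Servo bucketServo;
int restingLevel = 0;

void setup() {
  bucketServo.attach(SERVO_PIN);
  bucketServo.write(0);                    // bucket upright
  delay(500);
  restingLevel = analogRead(SENSOR_PIN);   // calibrate against ambient pressure
}

void loop() {
  int level = analogRead(SENSOR_PIN);
  if (level - restingLevel > TRIGGER_DELTA) {        // someone blew into the tube
    for (int angle = 0; angle <= 120; angle += 2) {  // tip the bucket slowly
      bucketServo.write(angle);
      delay(20);
    }
    while (true) { }                       // one-shot: stay dumped until reset
  }
  delay(10);
}
```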

To see the ice bucket backpack in action, check out the video after the break.


A Virtual Touchscreen (3D Ultrasonic Radar)


Drawing on a screen simply by touching the air is a marvelous thing. One way to accomplish it involves four HC-SR04 ultrasonic sensor units that feed data through an Arduino into a Linux computer. The end result is a virtual touchscreen that can be made at home.

The software for the device was developed by [Anatoly], who translated hand gestures into actionable commands. The sensors attached to the Arduino had an approximate scanning range of 3 m, and the ultrasonic units were modified to broadcast an analog signal at 40 kHz. There were a few limitations with the original hardware design, as [Anatoly] notes in the post. For example, at first only one unit was transmitting at a time, so there was no way the Arduino could identify two objects on the same sphere. However, [Anatoly] updated the blog with a second post showing that sensing multiple items at once could be done. Occasionally the range would be finicky when dealing with small items like pens, but besides that it seemed to work pretty well.
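
The Arduino firmware itself lives on [Anatoly]’s blog; as a rough idea of what the acquisition side could look like, here is a minimal sketch (with made-up pin assignments) that pings four HC-SR04 units one at a time, which is also why only one unit transmits at once, and streams the four distances over serial for the Linux host to turn into a position:

```cpp
// Hypothetical pin mapping for four HC-SR04 units; [Anatoly]'s real wiring differs.
const int TRIG[4] = {2, 4, 6, 8};
const int ECHO[4] = {3, 5, 7, 9};

void setup() {
  Serial.begin(115200);
  for (int i = 0; i < 4; i++) {
    pinMode(TRIG[i], OUTPUT);
    pinMode(ECHO[i], INPUT);
  }
}

// Fire one sensor and return the echo distance in centimetres.
long readDistanceCm(int i) {
  digitalWrite(TRIG[i], LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG[i], HIGH);              // 10 us pulse starts the 40 kHz burst
  delayMicroseconds(10);
  digitalWrite(TRIG[i], LOW);
  long us = pulseIn(ECHO[i], HIGH, 30000);  // time out past roughly 5 m
  return us / 58;                           // about 58 us per cm, round trip
}

void loop() {
  // Ping the sensors one at a time so their bursts do not interfere.
  for (int i = 0; i < 4; i++) {
    Serial.print(readDistanceCm(i));
    Serial.print(i < 3 ? ',' : '\n');
    delay(15);
  }
}
```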

Additional technical specifications can be found on [Anatoly]’s blog and videos of the system working can be seen after the break.


Open Source Marker Recognition For Augmented Reality


[Bharath] recently uploaded the source code for an OpenCV-based pattern recognition platform that can be used for augmented reality, or even robots. It was built with C++ and uses the OpenCV library to translate marker notations within a single frame.

The program started out by focusing on one object at a time. This method was chosen to avoid building additional arrays containing information on all of the blobs in the image, which could cause problems.
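
[Bharath]’s actual marker decoding is in his repository; purely to illustrate the one-object-at-a-time idea, a stripped-down OpenCV fragment (not his code) could threshold each frame, find contours, and keep only the largest candidate instead of carrying data for every blob:

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

int main() {
  cv::VideoCapture cap(0);               // default camera
  cv::Mat frame, gray, bin;
  while (cap.read(frame)) {
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    cv::threshold(gray, bin, 100, 255, cv::THRESH_BINARY_INV);

    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(bin, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    // Keep only the largest blob: the single candidate marker for this frame.
    int best = -1;
    double bestArea = 0;
    for (size_t i = 0; i < contours.size(); i++) {
      double a = cv::contourArea(contours[i]);
      if (a > bestArea) { bestArea = a; best = (int)i; }
    }
    if (best >= 0)
      cv::drawContours(frame, contours, best, cv::Scalar(0, 255, 0), 2);

    cv::imshow("marker", frame);
    if (cv::waitKey(1) == 27) break;      // Esc to quit
  }
  return 0;
}
```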

Although this implementation does not track marker information across multiple frames, it provides a nice foundation for integrating pattern recognition into computer systems. The tutorial is straightforward and easy to read. The entire program and source code can be found on GitHub under a zero license, so anyone can use it. A video of the program can be seen after the break.


A Geiger Counter For An Off-Road Apocalypse Vehicle


If the world comes to an end, it’s good to be prepared. And let’s say that the apocalypse is triggered by a series of nuclear explosions. If that is the case, then having a Geiger counter is a must, plus having a nice transport vehicle would be helpful too. So [Kristian] combined the two ideas and created his own Geiger counter for automotive use just on the off chance that he might need it one day.

It all started with a homemade counter that [Kristian] put together. Then came a display module with a built-in graphics controller, used to show all kinds of information in the vehicle; a couple of optocouplers serve as its inputs. In addition, a CAN bus interface was put in place. As an earlier post suggests, the display circuit is based on a Microchip PIC18F4680 microcontroller. After that, things got a little out of control and the counter evolved into more of a mobile communications center, mostly because [Kristian] wanted to learn how those systems worked. Sounds like a fun learning experience! Later, the CPU and gauge boards were redesigned to use low-quiescent-current regulators, and a filtering board was made that can kill transients and noise if needed.
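
[Kristian]’s firmware runs on the PIC, so the code below is only a generic illustration of the counting side, written as an Arduino-style sketch: count tube pulses with an interrupt, total them over a minute, and convert to a rough dose-rate estimate (the SBM-20-style conversion factor is an assumption, not his figure):

```cpp
// Rough illustration only -- [Kristian]'s build runs on a PIC18F4680, not an
// Arduino, and his tube and conversion factor may differ.
const byte PULSE_PIN = 2;               // Geiger tube pulses (e.g. via optocoupler)
volatile unsigned long pulses = 0;

void onPulse() { pulses++; }            // one count per detected particle

void setup() {
  Serial.begin(9600);
  pinMode(PULSE_PIN, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(PULSE_PIN), onPulse, FALLING);
}

void loop() {
  delay(60000);                         // integrate for one minute
  noInterrupts();
  unsigned long cpm = pulses;           // grab and reset the count atomically
  pulses = 0;
  interrupts();
  // A commonly quoted SBM-20 figure is ~0.0057 uSv/h per CPM; purely an assumption here.
  Serial.print(cpm);
  Serial.print(" CPM ~= ");
  Serial.print(cpm * 0.0057);
  Serial.println(" uSv/h");
}
```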

The full project can be seen on [Kristian]’s blog.

The Counter-Strike Airsoft Robot


[Jon] and his brother converted an RC car into a robot that can fire airsoft pellets into the air. The little motorized vehicle was disassembled and a handheld airsoft gun was attached to the top. A trigger-pulling mechanism was put in place, and a safety procedure was added to make sure no accidents occurred.

A chassis stand was created to hold the gun’s handle. The setup was tested at this point, and a Raspberry Pi server with a camera was configured to act as the eyes of the robot. Once everything was in place, the wheels hit the ground and the vehicle was able to move around, positioning itself to aim the servos at a designated target. Footage showing what the robot was looking at was transmitted over the web.
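
[Jon]’s own control code isn’t shown here; as a purely hypothetical sketch of the aiming side, an Arduino could drive a pan/tilt servo pair from simple angle commands sent over serial by the Raspberry Pi (the pins and the 'P'/'T' command format are invented for this example):

```cpp
#include <Servo.h>

// Hypothetical pan/tilt setup; the real robot's wiring and protocol are [Jon]'s own.
Servo pan, tilt;

void setup() {
  Serial.begin(9600);
  pan.attach(9);
  tilt.attach(10);
  pan.write(90);                         // start centred
  tilt.write(90);
}

void loop() {
  // Expect lines like "P120" or "T75" from the Raspberry Pi.
  if (Serial.available()) {
    char axis = Serial.read();
    if (axis == 'P' || axis == 'T') {
      int angle = constrain(Serial.parseInt(), 0, 180);
      if (axis == 'P') pan.write(angle);
      else tilt.write(angle);
    }
  }
}
```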

A video of the remote-controlled Counter-Strike robot can be seen after the break. You could consider this your toy army. That makes this one your toy air force.


Lucid Dreaming With Plastic Milk Cartons

Becoming aware that you are dreaming can be a difficult thing to accomplish. But as [Rob] showed on his blog, monitoring the lucid experience once it happens doesn’t have to be costly. Instead, household items can be fashioned together into a mask that senses REM sleep cycles. We were tipped off to the project by [Michael Paul Coder], who developed an algorithm to communicate from inside a dream.

[Rob] cut up plastic milk cartons for his ‘DreamJacker’ project and attached a webcam, producing a simple way to detect eye movements. A standard game adapter with a triangular array of white LEDs was added to the plastic cover to provide the illumination needed for the camera. After testing it out, he switched to red light to address sensitivity issues. Another iteration later, [Rob] attempted to create hypnagogic imagery during the drowsy state that occurs right before falling asleep. He did this by fitting a single tri-color LED salvaged from Christmas lights that had been dumped on his street.
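
The actual REM detection happens in [Michael Paul Coder]’s LucidScribe software, not in the mask itself; just to illustrate the basic idea of spotting eye twitches in a webcam feed, a simple frame-differencing loop like the one below (a simplification, not the real algorithm) reports how much the image changes from frame to frame:

```cpp
#include <opencv2/opencv.hpp>
#include <iostream>

int main() {
  cv::VideoCapture cam(0);                           // webcam inside the mask cup
  cv::Mat frame, gray, prev, diff;
  while (cam.read(frame)) {
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    cv::GaussianBlur(gray, gray, cv::Size(5, 5), 0); // tame sensor noise
    if (!prev.empty()) {
      cv::absdiff(gray, prev, diff);                 // pixel change between frames
      double motion = cv::mean(diff)[0];             // average change = eye-movement level
      std::cout << motion << std::endl;              // log it to build the night's graph
    }
    gray.copyTo(prev);
  }
  return 0;
}
```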

The mask is tied to the back of the head with shoelaces and acts like an eye patch during Wake Back to Bed (WBTB) sessions. The end result is an eerie-looking graph of eye twitching taken throughout the night. We would be interested in confirming that this setup helps the user experience a lucid dream, so it might be time to make our own.

Since writing his post, [Rob] has adapted a mouse for use inside the mask cup to integrate with the LucidScribe REM FIELD-mouse plugin developed by [Michael Paul Coder].