Geoffrey The Giraffe’s Last Call Of Toys For Hacking

Many of us in the United States frequently browse the shelves of Toys R Us for things to hack on. Sadly that era will soon end with the chain’s closing. In the meantime, the entire store becomes the clearance shelf as they start liquidating inventory. Depending on the store, the process may begin as soon as Thursday, March 22. (Warning: video ads on page.)

While not as close to hacker hearts as the dearly departed Radio Shack or Maplin, Toys R Us has provided the hacker community with a rich source of toys we’ve repurposed to suit our imagination. These toys have served in various roles: chassis, enclosure, or parts donor. They all had low prices made possible by the high-volume, mass-market economics that Toys R Us helped build. Sadly it was not able to keep its head above water in the low-margin, cutthroat competition of retail sales in America.

As resourceful consumers, we will find other project inspirations. Many projects on this site have sourced parts from Amazon. In commercial retail, Target has started popping up with increasing frequency. And no matter where new toys are sold, wait a few years and some fraction will end up at our local thrift store.

We’ll always have some nostalgia for Geoffrey the Giraffe, but toy hacking must go on.

An Introduction To Storm Detector Modules

Lightning storm detectors have been around for a surprisingly long time. The early designs consisted of a pair of metal bells and a pendulum. When a charge was applied, for example by connecting one bell to the ground and the other to a lightning rod, the bells would ring when a lightning storm was close by. In the mid-18th century, these devices were only practical for demonstration and research purposes, but very likely represent the earliest devices that convert electrostatic charge to mechanical force. A bit over a hundred years later, the first lightning detector was considered by some to be the first radio receiver as well.

As soon as I found out about storm detector chips, I knew I would have to get one working. For about $25, I ordered an AMS AS3935 module from China. This chip has been featured before in a number of excellent projects such as Twittering lightning detectors, and networks of Sub-Saharan weather stations. While there’s an Arduino library for interfacing with this IC, I’m going to be connecting it up to an ESP8266 running the NodeMCU firmware, which means digging into the datasheet and writing some SPI code. If any of the above tickles your fancy, read on! Continue reading “An Introduction To Storm Detector Modules”
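To give a flavor of what that SPI work involves, here is a minimal sketch of the AS3935’s byte-level framing and a couple of register decodes, expressed in Python for readability. The framing (top two address-byte bits select read/write) and the interrupt/distance registers are from the AS3935 datasheet; the actual bus transfer on a NodeMCU would of course go through its SPI API rather than these pure functions.

```python
# AS3935 SPI framing sketch. A read transaction sends 0b01 in the top two
# bits of the address byte, a write sends 0b00; register addresses are 6 bits.
READ = 0x40   # 0b01xx_xxxx marks a read
WRITE = 0x00  # 0b00xx_xxxx marks a write

def read_frame(reg):
    """Two-byte frame for reading one register (second byte just clocks data out)."""
    return bytes([READ | (reg & 0x3F), 0x00])

def decode_interrupt(reg3):
    """Decode the interrupt source bits in register 0x03."""
    src = reg3 & 0x0F
    return {0x01: "noise level too high",
            0x04: "disturber detected",
            0x08: "lightning detected"}.get(src, "none")

def decode_distance(reg7):
    """Register 0x07 holds the estimated storm distance in km (6 bits, 0x3F = out of range)."""
    d = reg7 & 0x3F
    return None if d == 0x3F else d
```

Reading register 0x03, for instance, means clocking out `0x43 0x00` and decoding the second byte that comes back.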

What To Do With Your Brand New Ultrasonic Transducer

We wager you haven’t heard the latest from ultrasonics. Sorry. [Lindsay Wilson] is a Hackaday reader who wants to share his knowledge of transducer tuning to make tools. The bare unit he uses to demonstrate might attach to the bottom of an ultrasonic cleaner tank; these have a different construction from the ones used for distance sensing. The first demonstration shows the technique for finding a transducer’s resonant frequency, and this technique is used throughout the video. On the YouTube page, his demonstrations are indexed by title and time for convenience.

For us, the most exciting part is when a tuned transducer is squeezed by hand. As the pressure increases, the current drops and goes out of phase in proportion to the grip. We see a transducer used as a pressure sensor. He later shows how temperature can affect the current level and phase.

Sizing horns is a science, but it has some basic rules which are well covered. The basic premise is to make the horn half of a wavelength long and be mindful of any tools which will go in the end. Nodes and antinodes are explained and their effects demonstrated with feedback on the oscilloscope.
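The half-wavelength rule turns into a one-line calculation once you know the speed of sound in the horn material. Here’s a quick estimator; the speed-of-sound figures are ballpark thin-rod values for illustration, so check your actual alloy before machining anything.

```python
# Half-wavelength horn length estimate: L = c / (2 * f).
SOUND_SPEED = {          # longitudinal (thin-rod) speed of sound, m/s (approximate)
    "aluminium": 5100,
    "titanium": 4950,
    "steel": 5050,
}

def horn_length_mm(freq_hz, material="aluminium"):
    """Horn length in mm for a half-wavelength resonator at freq_hz."""
    c = SOUND_SPEED[material]
    return c / (2 * freq_hz) * 1000
```

A 40 kHz aluminium horn, for example, comes out a little under 64 mm by this estimate.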

We recently featured an ultrasonic knife which didn’t quite cut the mustard, but your homemade ultrasonic tools should be submitted to our tip line.

Continue reading “What To Do With Your Brand New Ultrasonic Transducer”

3D Printed Raspberry Pi NAS With Dual Drive Bays

While it might not pack the computational punch you’d usually be looking for in a server platform, you can’t beat how cheap the Raspberry Pi is. As such, it’s at the heart of many a home LAN, serving up files as a network attached storage (NAS) device. But the biggest problem with using the Pi in a NAS is that it doesn’t have any onboard hard drive interface, forcing you to use USB. Not only is this much slower, but it also doesn’t leave you a lot of options for cleanly hooking up your drives.

This 3D printable NAS enclosure designed by [Paul-Louis Ageneau] helps address the issue by integrating two drive bays which can accommodate 2.5 inch laptop hard disk drives and their associated IB-AC6033-U3 USB adapters. The drives simply slide into the “rails” designed into the case without the need for additional hardware. There’s even space in the bottom of the case for a USB hub to connect the drives, and a fan on the top of the case to help keep the whole stack cool. It still isn’t perfect, but it’s compact and doesn’t look half bad.

The design is especially impressive as it doesn’t require any supports, an admirable goal to shoot for whenever designing for 3D printing. As an added bonus, the entire case is designed in OpenSCAD and licensed under the GPL v3; making modification easy if you want to tweak it for your specific purposes.

This certainly isn’t the strongest Raspberry Pi enclosure we’ve ever seen; that title would have to go to the ammo case that does double duty as a media streamer. But it looks like it would make a great home for that new 3 B+ you’ve got on order.

Bring Deep Learning Algorithms To Your Security Cameras

AI is quickly revolutionizing the security camera industry. Several manufacturers sell cameras which use deep learning to detect cars, people, and other events. These smart cameras are generally expensive, though, compared to their “dumb” counterparts. [Martin] was able to bring these detection features to a standard camera with a Raspberry Pi, and a bit of ingenuity.

[Martin’s] goal was to capture events of interest, such as a person on screen, or a car in the driveway. The data for the events would then be published to an MQTT topic, along with some metadata such as confidence level. OpenCV is generally how these pipelines start, but [Martin’s] camera wouldn’t send RTSP images over TCP the way OpenCV requires, only RTSP over UDP. To solve this, [Martin] captures the video stream with FFmpeg. The deep learning AI magic is handled by the darkflow library, which is itself based upon Google’s Tensorflow.
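One common way to wire FFmpeg in front of a detector (a sketch of the general technique, not necessarily [Martin’s] exact invocation) is to have it decode the RTSP-over-UDP stream into raw BGR frames on stdout, then read fixed-size chunks from the pipe. The URL and resolution below are placeholders.

```python
# Build an ffmpeg command that turns an RTSP/UDP stream into raw frames on stdout.
def ffmpeg_rawvideo_cmd(rtsp_url, width, height):
    """ffmpeg argv that decodes an RTSP stream to raw BGR frames on stdout."""
    return ["ffmpeg",
            "-rtsp_transport", "udp",      # force RTSP over UDP
            "-i", rtsp_url,
            "-f", "rawvideo",              # no container, just pixels
            "-pix_fmt", "bgr24",           # 3 bytes/pixel, OpenCV's native channel order
            "-vf", f"scale={width}:{height}",
            "pipe:1"]

def frame_size_bytes(width, height):
    """Bytes to read from the pipe per frame (bgr24 = 3 bytes per pixel)."""
    return width * height * 3
```

From there you would launch the command with `subprocess.Popen`, read `frame_size_bytes(w, h)` bytes at a time, and reshape the buffer into an `(h, w, 3)` array for the detector.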

[Martin] tested out his recognition system with some cheap Chinese PTZ cameras, with the processing running on a remote Raspberry Pi. So far the results have been good. The system is able to recognize people, animals, and cars pulling into the driveway. His code is available on GitHub if you want to give it a spin yourself!
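The MQTT side of such a pipeline is simple: each detection becomes a small JSON payload on a topic. The topic layout and field names below are illustrative, not [Martin]’s exact schema.

```python
import json
import time

def detection_event(label, confidence, camera="driveway"):
    """Serialize one detection as (topic, JSON payload), e.g. on cameras/<name>/events."""
    topic = f"cameras/{camera}/events"
    payload = json.dumps({
        "label": label,                     # e.g. "person", "car"
        "confidence": round(confidence, 3), # detector's confidence score
        "timestamp": int(time.time()),      # epoch seconds
    })
    return topic, payload
```

With a client library such as paho-mqtt, the publish step is then just `client.publish(topic, payload)`, and any subscriber on the topic sees the event.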

Look Upon Eyepot, And Weep For Mercy

Hope you weren’t looking forward to a night of sleep untroubled by nightmares. Doing his part to make sure Lovecraftian mechanized horrors have a lease on your subconscious, [Paul-Louis Ageneau] has recently unleashed the horror that is Eyepot upon an unsuspecting world. This cycloptic, four-legged robotic teapot takes inspiration from an enemy in the game Alice: Madness Returns, and seems to exist for no reason other than to creep people out.

Even if you aren’t physically manifesting nightmares, there’s plenty to learn from this project. [Paul-Louis Ageneau] has done a fantastic job of documenting the build, from the OpenSCAD-designed 3D printed components to the Raspberry Pi Zero and Arduino Pro Mini combo that control the eight servos in the legs. If you want to play along at home all the information and code is here, though feel free to skip the whole teapot with an eyeball thing.

A second post explains how the code is written for both the Arduino and Pi, making for some very illuminating reading. A Python script on the Pi breaks down the kinematics and passes the appropriate servo angles to the Arduino over a serial link. Combine that with a web interface for control and a stream from the teapot’s Raspberry Pi Camera module, and you’ve got the makings of the world’s creepiest telepresence robot. We’d love to see this one stomping up and down a boardroom table.
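The kinematics step in that Python script boils down to a classic two-link inverse-kinematics solve per leg. Here’s a minimal sketch of that idea; the link lengths are placeholders, not Eyepot’s actual geometry, and the real robot maps these angles onto its eight servos.

```python
import math

L1, L2 = 50.0, 50.0   # upper/lower leg segment lengths in mm (illustrative)

def leg_angles(x, y):
    """Return (hip, knee) angles in degrees for a foot target (x, y) in the leg plane."""
    d2 = x * x + y * y
    # Law of cosines gives the knee bend
    cos_knee = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if not -1.0 <= cos_knee <= 1.0:
        raise ValueError("target out of reach")
    knee = math.acos(cos_knee)
    # Hip angle: direction to the target, minus the offset from the bent knee
    hip = math.atan2(y, x) - math.atan2(L2 * math.sin(knee), L1 + L2 * math.cos(knee))
    return math.degrees(hip), math.degrees(knee)
```

The resulting angles would then be written over the serial link for the Arduino to drive the servos, which is exactly the division of labor the post describes.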

Seems we are on a roll recently with creepy robot pals. Seeing a collaboration between Eyepot and JARVIS might be too much for us to handle. Though we have a pretty good idea how we’d want to control them.