LEDs And Pi Let You Virtually Decorate This Online Christmas Tree

Anyone who has decorated a Christmas tree knows that the lights are what really make the look. But no matter how many strings you wrap around it, there never seems to be enough. Plus the standard sets either sit there and do nothing, or just blink on and off at regular intervals. Yawn.

But hackers aim higher, and [leo.currie]’s interactive “paintable” Christmas tree takes the lighting game a step beyond. The standard light strings are replaced with strings of WS2811 RGB LEDs wired to an ESP8266. A camera connected to a Raspberry Pi is set up to stream images of the tree to all and sundry on the Interwebz, but with a special twist: it also creates a map of every light on the tree. That allows the lights to be controlled individually in response to user inputs on a web page hosted on the Pi. The upshot is that you can paint the tree with any color you like in real time, or upload various animated GIFs to display on the tree. You can play with the tree directly, or watch a replay on the video below when that Pi inevitably gets hugged to death.
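To get a feel for how the “paint with a GIF” part could work, here is a minimal sketch that samples each frame of a GIF at every mapped LED position. The names (led_map, send_colors, the GIF filename) are placeholders for illustration, not taken from [leo.currie]’s actual code.

```python
# A rough sketch only: led_map, send_colors(), and the filename are placeholders.
from PIL import Image, ImageSequence

# led_map: LED index -> (x, y) pixel position found during the camera mapping step
led_map = [(120, 40), (130, 46), (125, 60)]          # ...one entry per LED

def frame_to_led_colors(frame, led_map):
    """Sample one GIF frame at each LED's mapped position, returning (r, g, b) tuples."""
    rgb = frame.convert("RGB")
    return [rgb.getpixel(xy) for xy in led_map]

gif = Image.open("snowfall.gif")
for frame in ImageSequence.Iterator(gif):
    colors = frame_to_led_colors(frame, led_map)
    # send_colors(colors) would push this list out to the ESP8266 driving the WS2811s
```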

Imagine the possibilities with this. Why not hang a lot of LED strings vertically from the eaves of your house and make a huge, low-resolution display? We’ve featured plenty of large, interactive LED Christmas displays before, and we’d love to see what you come up with.

Continue reading “LEDs And Pi Let You Virtually Decorate This Online Christmas Tree”

A Star-Trek-Inspired Robot With Raspberry Pi And AI

When [314Reactor] got a robot car kit, he knew he wanted to add some extra things to it. At about the same time, he was watching a Star Trek episode that featured exocomps, robots that worked in dangerous areas. He decided to use those fictional devices to inspire his modifications to the car kit. Granted, the fictional robots were intelligent and had replicators, so you know he won’t be making an actual working replica. But then again, the ones on the TV show were props, so they didn’t really have all that either.

A Raspberry Pi runs TensorFlow using the standard camera. This lets it identify objects of interest (assuming it gets them right) and send the image back to the operator along with some identifying information. The kit already had an Arduino onboard, and the new robot talks to it via a serial port. You can see a video about the project below.
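To give a sense of what that Pi-to-Arduino link might look like, here is a hedged sketch using pyserial; the port name, baud rate, and message format are assumptions rather than details from [314Reactor]’s code.

```python
# Assumed sketch: port, baud rate, and message format are placeholders.
import serial

arduino = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)   # serial link to the kit's Arduino

def report(detections):
    """Forward each recognized label and confidence score to the Arduino."""
    for label, score in detections:
        arduino.write(f"{label}:{score:.2f}\n".encode())

# detections would normally come from the TensorFlow object-detection step
report([("person", 0.91), ("bottle", 0.42)])
```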

Continue reading “A Star-Trek-Inspired Robot With Raspberry Pi And AI”

Neural Network Knows When Cat Wants To Go Outside

Neural networks are computer systems vaguely inspired by the construction of animal brains and, much like human brains, can be trained to obey the whims of the almighty domestic cat. [EdjeElectronics] has built just such a system, and his cat is better off for it.

The build uses a Raspberry Pi, fitted with the Pi Camera board, to image the area around the back door of the house. A Python script regularly captures images and passes them to a TensorFlow neural network for object recognition. The TensorFlow network returns object type and positions to the Python script. This information can be used to determine if there is a cat in the frame, and if it is inside or outside. If the cat remains in position for ten consecutive frames, a text message is sent via Twilio, prompting the owner to let the cat in or out, as the case may be.
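The ten-frame rule is easy to picture in code. Below is just a sketch of that logic, assuming the detection step has already decided whether a cat is in frame; the Twilio numbers and credentials are placeholders.

```python
# Sketch of the consecutive-frame rule; credentials and numbers are placeholders.
from twilio.rest import Client

client = Client("ACCOUNT_SID", "AUTH_TOKEN")
streak = 0

def handle_frame(cat_detected):
    """Call once per captured frame with the TensorFlow verdict for that frame."""
    global streak
    streak = streak + 1 if cat_detected else 0
    if streak == 10:                       # cat has held position for ten frames
        client.messages.create(body="The cat wants in (or out)!",
                               from_="+15550001111", to="+15552223333")
```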

Thirty years ago, object classification was a pie-in-the-sky technology, but now you can run it on a $30 computer to figure out where your pets are. What a time we live in! A similar solution to this problem may be a cat door that unlocks via facial recognition. Video after the break.

[Thanks to Baldpower for the tip!]

Continue reading “Neural Network Knows When Cat Wants To Go Outside”

Behold The WT-220: A ‘Clever’ VT-220 Terminal

[John Whittington] failed to win a bid for an old VT-220 serial terminal on eBay, so he decided to make his own version and improve it along the way. The result is the Whitterm-220 (or WT-220) which has at its core a Raspberry Pi and is therefore capable of more than just acting as a ‘dumb’ serial terminal.

Rear of the WT-220 with paint-filled laser engraving and all necessary connectors.

The enclosure is made from stacked panels of laser-cut plywood with an acrylic plate on the back for labels and connectors. [John] worked paint into the label engravings before peeling off the acrylic’s protective film; applied after laser-engraving but before the film comes off, the paint acts as a fill and really makes the text pop.

Near the front, one layer of clear acrylic among the plywood panels acts as a light guide, doing double duty as a power indicator and as TX/RX activity lights. When power is on, that layer glows, an attractive indicator that doesn’t interfere with looking at the screen. When data is sent or received, a simple buffer circuit tied to the serial lines lights up LEDs to show TX or RX activity, and the whole feature can be enabled or disabled by toggling a GPIO pin. A video overview is embedded below, where you can see the unit in action.
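That enable pin is about as simple as GPIO control gets; a hypothetical sketch (the pin number is a guess, not taken from [John]’s schematic) might look like this:

```python
# Hypothetical: BCM pin 17 is an assumption, not taken from the WT-220 design.
import RPi.GPIO as GPIO

ENABLE_PIN = 17                            # drives the buffer feeding the TX/RX LEDs
GPIO.setmode(GPIO.BCM)
GPIO.setup(ENABLE_PIN, GPIO.OUT)

def activity_lights(on):
    """Turn the TX/RX activity lights on or off."""
    GPIO.output(ENABLE_PIN, GPIO.HIGH if on else GPIO.LOW)

activity_lights(False)                     # e.g. keep the bezel dark at night
```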

Continue reading “Behold The WT-220: A ‘Clever’ VT-220 Terminal”

A Pi Cluster To Hang In Your Stocking With Care

It’s that time of year again: with the holidays fast approaching, friends and family will be hounding you about what trinkets and shiny baubles they can pretend to surprise you with. Unfortunately, there’s no person harder to shop for than the maker or hacker: if we want it, we’ve probably already built the thing. Or at least gotten it out of somebody else’s trash.

But if they absolutely, positively, simply have to buy you something that’s commercially made, then you could do worse than pointing them to this very slick Raspberry Pi cluster backplane from [miniNodes]. With the ability to support up to five of the often overlooked Pi Compute Modules, this little device will let you bring a punchy little ARM cluster online without having to build something from scratch.

The Compute Module is perfectly suited to clustering applications like this thanks to its much smaller size compared to the full-size Raspberry Pi, but we don’t see it used that often because it needs to be jacked into an appropriate SODIMM connector. This makes it effectively useless for prototyping and quickly thrown-together hacks (i.e. everything most people use the Pi for), and really only suitable for finished products and industrial applications. It’s really the line in the sand between playing around with the Pi and putting it to real work.

[miniNodes] calls their handy little device the Carrier Board, and beyond the obvious five SODIMM slots for the Pis to live in, there’s also an integrated gigabit switch with an uplink port to get them all connected to the network. The board powers all of the nodes through a single barrel connector on the side opposite the Ethernet jack, doing away with the spider’s web of USB cables we usually see with Pi clusters.

The board doesn’t come cheap at $259 USD, plus the five Pi Compute Modules, which will set you back another $150. But for the ticket price you’ll have a 20-core ARM cluster with 5 GB of RAM and 20 GB of flash storage in a 200 x 100 millimeter (roughly 8 x 4 inch) footprint, with an energy consumption of under 20 watts when running at wide-open throttle. This could be an excellent choice for mobile applications, or if you just want to experiment with parallel processing on a desktop-sized device.

Amazon is ready for the coming ARM server revolution; are you? Between products like this and the many DIY ARM clusters we’ve seen over the years, it looks like we’re going to be dragging the plucky architecture kicking and screaming into the world of high-performance computing.

[Thanks to Baldpower for the tip.]

Dozens Of Servos Flip The Segments Of This 3D-Printed Digital Clock

A digital clock based on seven-segment displays? Not exciting. A digital clock with seven-segment displays that’s really big and can be read across a football field? That’s a little more interesting. A large format digital clock that uses electromechanical seven-segment displays? Now that’s something to check out.

This clock comes to us by way of [Otvinta] and is a nice example of what you can do with 3D-printing and a little imagination. Each segment of the display is connected to a small hobby servo which can flip it 90°. Mounted in a printed plastic frame, the segments are flipped in and out of view to compose the numerals that display the time. The 28 servos need two Pololu controller boards, which talk to a Raspberry Pi running Windows IoT, an interesting design choice that we don’t often see. You’d think that 28 servos clattering back and forth might be intolerable, but the video below shows that the display is actually pretty quiet. We’d love to see this printed all in black with white segment faces, or even a fluorescent plastic; how cool would that look under UV light?
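The digit-to-segment logic behind those 28 servos is straightforward; here is a sketch of just that mapping. The real project runs on Windows IoT and talks to Pololu controllers, so flip_segment() below is only a stand-in for the actual servo command.

```python
# Sketch of the digit-to-segment logic only; flip_segment() is a hypothetical stand-in.
SEGMENTS = "abcdefg"                      # standard seven-segment naming
DIGITS = {                                # which segments are visible for each digit
    "0": "abcdef", "1": "bc", "2": "abdeg", "3": "abcdg", "4": "bcfg",
    "5": "acdfg", "6": "acdefg", "7": "abc", "8": "abcdefg", "9": "abcdfg",
}

def show_digit(position, digit):
    """Flip the seven segments of one digit position into or out of view."""
    for i, seg in enumerate(SEGMENTS):
        channel = position * 7 + i        # 4 digits x 7 segments = 28 servo channels
        flip_segment(channel, shown=seg in DIGITS[digit])

def flip_segment(channel, shown):
    """Rotate a segment ~90° into view (shown) or edge-on (hidden)."""
    pass  # hypothetical: would send a servo target to one of the Pololu controllers
```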

We’re not saying this is the only seven-segment servo clock we’ve seen, but it is a pretty slick build. And of course there’s more than one way to use servos to tell the time.

Continue reading “Dozens Of Servos Flip The Segments Of This 3D-Printed Digital Clock”

An Over-engineered LED Sign Board

Never underestimate the ability of makers to overthink and over-engineer the simplest of problems while demonstrating human ingenuity. The RGB LED sign made by [Hans and team] over at the [Hackheim hackerspace] in Trondheim is a testament to this fact.

As you would expect, WS2812 RGB LEDs illuminate the sign, with an individual strip responsible for each character. Powered by an ESP32 running FreeRTOS, the sign communicates over MQTT, and each letter gets a copy of a 6 x 20 framebuffer representing the color pattern to be displayed. A task on the ESP32 then calculates the color value to be shown by each LED.
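On the client side, pushing one of those 6 x 20 framebuffers to each letter over MQTT could look something like the sketch below; the topic scheme, broker address, and payload format are assumptions for illustration, not the project’s actual conventions.

```python
# A minimal sketch of the remote-client side, assuming paho-mqtt and a made-up
# topic scheme ("sign/<letter index>/frame"); the real format is in the repo.
import paho.mqtt.publish as publish

WIDTH, HEIGHT = 20, 6                          # the 6 x 20 per-letter framebuffer

def solid_frame(r, g, b):
    """One framebuffer filled with a single color, flattened to raw RGB bytes."""
    return bytes([r, g, b] * (WIDTH * HEIGHT))

for i, letter in enumerate("HACKHEIM"):
    publish.single(f"sign/{i}/frame", solid_frame(255, 40, 0), hostname="sign.local")
```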

The real question is how to calibrate the distributed strings of LEDs so that LEDs on adjacent letters of the sign display a coherent part of the overall pattern. The answer is to use OpenCV to build a lookup table mapping each LED to its position in the two-dimensional layout. A Python script sends a command to illuminate a single LED, and OpenCV finds that LED’s position in the captured image. This is repeated for all LEDs to generate a map that is then used by the ESP32 firmware. How cool is that?
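A stripped-down version of that calibration loop, assuming a light_led() helper that tells the ESP32 to turn on a single LED (the real script lives in the project’s repository), might look like this:

```python
# Sketch only: NUM_LEDS and light_led() are placeholders for the project's code.
import cv2

NUM_LEDS = 300                                     # assumed total LED count

def light_led(index):
    """Placeholder: ask the ESP32 (e.g. over MQTT) to light only this one LED."""
    pass

cam = cv2.VideoCapture(0)
led_map = {}
for i in range(NUM_LEDS):
    light_led(i)
    ok, frame = cam.read()
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (11, 11), 0)
    _, _, _, pos = cv2.minMaxLoc(gray)             # brightest pixel = the lit LED
    led_map[i] = pos                               # (x, y) goes into the lookup table
```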

And if you are wondering about the code, it is up on [GitHub], and we would love to see someone take this up a level. The calibration code, as well as the Remote Client and ESP32 code, are all there for your hacking pleasure.

It’s been a while since we have seen OpenCV in action, as with the Motion Tracking Turret and Face Recognition. The possibilities seem endless. Continue reading “An Over-engineered LED Sign Board”