[Emily]’s Eerie Educational Electric Eyeball Entertains

Like many of us, [Emily’s Electric Oddities] has had a lot of time for projects over the past year or so, including one that had been kicking around since late 2018. It all started at the Hackaday Superconference, when [Emily] encountered the Adafruit Hallowing board in the swag bag. Since then, [Emily] has wanted to display the example code’s eyeball animation on a CRT, but didn’t really know how to go about it. Spoiler alert: it works now.

See? It’s educational.

Eventually, [Emily] learned about the TVout library for Arduino and got everything working properly — the eyeball moved around with the joystick, blinked when the button was pressed, and the pupil responded visually to changes in ambient light. The only problem was that the animation ran at a lousy four frames per second. Well, until she got Hackaday’s own [Roger Cheng] involved.
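To get a feel for what that drawing loop involves, here is a minimal sketch of the idea built on the TVout library. The pin assignments, screen geometry, and blink handling are our own illustrative assumptions, not [Emily]’s actual code:

```cpp
#include <TVout.h>

TVout TV;

const int JOY_X     = A0;  // hypothetical pin choices
const int JOY_Y     = A1;
const int LIGHT     = A2;  // photoresistor divider for ambient light
const int BLINK_BTN = 2;

void setup() {
  pinMode(BLINK_BTN, INPUT_PULLUP);
  TV.begin(NTSC, 120, 96);  // monochrome composite video, 120x96
}

void loop() {
  // Joystick position steers the eye around the screen
  int x = map(analogRead(JOY_X), 0, 1023, 20, 100);
  int y = map(analogRead(JOY_Y), 0, 1023, 20, 76);
  // Pupil contracts as ambient light increases
  int pupil = map(analogRead(LIGHT), 0, 1023, 10, 3);

  TV.clear_screen();
  if (digitalRead(BLINK_BTN) == LOW) {
    TV.draw_line(x - 14, y, x + 14, y, WHITE);  // closed eyelid
  } else {
    TV.draw_circle(x, y, 14, WHITE);            // iris outline
    TV.draw_circle(x, y, pupil, WHITE, WHITE);  // filled pupil
  }
  TV.delay_frame(1);  // sync to the next video frame
}
```

Naively clearing and redrawing everything on every pass is exactly the kind of thing that can hold a sketch like this to a few frames per second.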

[Roger] was able to streamline the code to align with [Emily]’s dreams, and then it was on to our favorite part of this build — the cabinet design. Since the TVout library is limited to black and white output with no shades of gray, [Emily] took design cues from the late ’70s and early ’80s, particularly the yellow and wood grain of the classic PONG cabinet. We love it!

Is Your Pet Eye the worst video game ever, as [Emily] proclaims it to be? Not a chance, and we’re pretty sure the title still rests with Desert Bus anyway. Even though the game only lasts until the eye gets tired and goes to sleep, it’s way more fun than Your Pet Rock. Don’t miss the infomercial/explanation/demonstration video after the break. If one video is just not enough, learn more about [Emily]’s philosophy of building weird projects from the Supercon talk she presented. It’s also worth mentioning that this one fits right into the Reinvented Retro contest.

Why are eyeballs so compelling? We can’t say for sure, but boy, this eyeball webcam sure is disconcerting.


Speech Recognition On An Arduino Nano?

Like most of us, [Peter] had a bit of extra time on his hands during quarantine and decided to take a look back at speech recognition technology in the 1970s. Quickly, he started thinking to himself, “Hmm…I wonder if I could do this with an Arduino Nano?” We’ve all probably had similar thoughts, but [Peter] really put his theory to the test.

The hardware itself is pretty straightforward: an Arduino Nano runs the speech recognition algorithm, and a MAX9814 microphone amplifier captures the voice commands. The beauty of [Peter]’s approach lies in his software implementation. There’s an interplay between a custom PC program he wrote and the Arduino Nano: the learning is done on the PC, while recognition runs in real time on the Nano, the typical split for a machine learning algorithm deployed on a microcontroller. To capture sample audio commands, or utterances, [Peter] first had to optimize the Nano’s ADC to get sample rates sufficient for speech processing. With a bit of low-level programming, he achieved a sample rate of 9 ksps, which is plenty fast for audio processing.
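[Peter]’s exact register tweaks aren’t reproduced here, but on the Nano’s ATmega328P the usual trick is to put the ADC in free-running mode, something like the following (the channel and 8-bit sample depth are our assumptions):

```cpp
// Free-running ADC on an ATmega328P (Arduino Nano). Each conversion
// takes 13 ADC clocks; with a /128 prescaler on a 16 MHz part that
// works out to 16e6 / 128 / 13, or roughly 9.6 ksps.
#include <avr/interrupt.h>

volatile uint8_t latestSample;

void setup() {
  ADMUX  = (1 << REFS0)    // AVcc reference, channel 0 (A0, the mic amp)
         | (1 << ADLAR);   // left-adjust so ADCH holds the top 8 bits
  ADCSRA = (1 << ADEN)     // enable the ADC
         | (1 << ADATE)    // auto-trigger
         | (1 << ADIE)     // interrupt when a conversion completes
         | (1 << ADPS2) | (1 << ADPS1) | (1 << ADPS0)  // /128 prescaler
         | (1 << ADSC);    // start converting
  ADCSRB = 0;              // trigger source: free-running
  sei();
}

ISR(ADC_vect) {
  latestSample = ADCH;     // grab the 8-bit result
}

void loop() {
  // Feature extraction would consume latestSample here.
}
```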

To analyze the utterances, he first divided each one into 50 ms segments. Think of dividing a single spoken word into its syllables, like analyzing the “se-” in “seven” separately from the “-ven.” 50 ms might be too long or too short to capture each syllable cleanly, but hopefully that gives you a good mental picture of what [Peter]’s program is doing. He then calculated the energy in five frequency bands for every segment of every utterance. Normally that’s done with a Fourier transform, but the Nano doesn’t have the processing power to compute one in real time, so [Peter] took a different approach: he implemented a bank of five digital bandpass filters, which makes computing the energy of the signal in each frequency band much cheaper.
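As a sketch of that idea (not [Peter]’s actual implementation), each incoming sample can be pushed through five IIR bandpass filters, with the squared output accumulated over each 50 ms segment. The filter coefficients would be designed offline for the chosen bands:

```cpp
// Filter-bank energy extraction: five bandpass filters, one energy
// accumulator per band, reset at the end of every 50 ms segment.
struct Biquad {
  float b0, b1, b2, a1, a2;  // coefficients, designed offline per band
  float x1 = 0, x2 = 0, y1 = 0, y2 = 0;
  float process(float x) {
    float y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2;
    x2 = x1; x1 = x;
    y2 = y1; y1 = y;
    return y;
  }
};

const int NUM_BANDS   = 5;
const int SEG_SAMPLES = 450;     // ~50 ms of audio at 9 ksps

Biquad band[NUM_BANDS];          // coefficients filled in elsewhere
float  energy[NUM_BANDS];
int    segPos = 0;

void processSample(float sample) {
  for (int b = 0; b < NUM_BANDS; b++) {
    float y = band[b].process(sample);
    energy[b] += y * y;          // accumulate this band's energy
  }
  if (++segPos == SEG_SAMPLES) { // one 50 ms segment finished
    // ship energy[] off as one feature vector, then reset
    for (int b = 0; b < NUM_BANDS; b++) energy[b] = 0;
    segPos = 0;
  }
}
```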

The energy of each frequency band for every segment is then sent to a PC, where a custom-written program creates “templates” from the sample utterances. The crux of the algorithm is comparing how closely the band energies of an utterance, segment by segment, match each template. The PC program produces a .h file that can be compiled directly into the Nano firmware. He uses the example of recognizing the numbers 0-9, but you could just as easily swap in commands like “start” or “stop.”
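Recognition then reduces to a nearest-template search, something like the following. The dimensions, and the notion that the generated header holds one feature matrix per word, are our assumptions about the shape of the problem:

```cpp
const int NUM_WORDS = 10;  // the digits 0-9 in [Peter]'s example
const int NUM_SEGS  = 16;  // segments per utterance (assumed)
const int NUM_BANDS = 5;

// In the real project these values come from the PC-generated .h
// file; zeroed here just so the sketch stands alone.
const float templates[NUM_WORDS][NUM_SEGS][NUM_BANDS] = {};

int recognize(const float features[NUM_SEGS][NUM_BANDS]) {
  int best = -1;
  float bestDist = 3.4e38;  // start above any real distance
  for (int w = 0; w < NUM_WORDS; w++) {
    float dist = 0;
    for (int s = 0; s < NUM_SEGS; s++) {
      for (int b = 0; b < NUM_BANDS; b++) {
        float d = features[s][b] - templates[w][s][b];
        dist += d * d;  // squared distance to this word's template
      }
    }
    if (dist < bestDist) {
      bestDist = dist;
      best = w;
    }
  }
  return best;  // index of the closest-matching word
}
```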

[Peter] admits that you can’t implement the kind of speech recognition on an Arduino Nano that we’ve come to expect from those covert listening devices, but he mentions that small, hands-free devices like a head-mounted multimeter could benefit from single-word or single-phrase voice commands. And maybe it puts your mind at ease knowing that everything you say isn’t immediately getting beamed into the cloud and handed over to our AI overlords. Or maybe we’re all starting to get used to this. Whatever your position on the current state of AI, hopefully you’ve gained some inspiration for your next project.

Simple Probe Sniffs Out EMI

Unable to account for the strange glitches he was seeing on his DIY CNC router, [Daniël Van Den Berg] wondered if his electronics might be suffering from some form of electromagnetic interference (EMI). So he did what any good hacker would do, and rummaged through the parts bin to build an impromptu EMI detector.

[Daniël] is quick to point out that he’s not an electrical engineer, and makes no guarantees about the accuracy of his tossed-together gadget. But it does seem to work well enough in his testing that he’s able to identify particularly “noisy” electronic components, so it’s probably worth putting one together just to hear what your hardware is pumping into the environment.

The hardware here is very simple: [Daniël] attached a coil of solid copper wire to one of the analog pins on an Arduino Nano through a resistor, and hung a speaker off one of the digital pins. From there, it took just a few lines of code to read the voltage across the coil and convert it into a tone for the speaker. The basic idea is that a strong alternating magnetic field will induce voltage fluctuations in the coil large enough for the Arduino’s ADC to read.
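The firmware for something like this can’t amount to much more than the following sketch (the pin numbers, noise-floor threshold, and pitch range are guesses, not [Daniël]’s values):

```cpp
// Minimal EMI sniffer in the spirit of [Daniël]'s probe: read the
// voltage induced in the pickup coil and map it to an audible pitch.
const int COIL_PIN    = A0;  // coil plus series resistor (assumed wiring)
const int SPEAKER_PIN = 8;

void setup() {
  pinMode(SPEAKER_PIN, OUTPUT);
}

void loop() {
  int reading = analogRead(COIL_PIN);  // 0-1023
  if (reading > 5) {                   // ignore the noise floor
    // Stronger interference produces a higher pitch
    tone(SPEAKER_PIN, map(reading, 0, 1023, 100, 4000));
  } else {
    noTone(SPEAKER_PIN);
  }
  delay(10);
}
```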

If you’re looking for a bit more insight into what kind of interference your electronic creations might be putting out, [Alex Whittimore] gave a fantastic presentation during the 2020 Hackaday Remoticon about performing RF debugging using a cheap RTL-SDR dongle.

Perfecting A 3D Printed Camera Motion Control Rig

If you’ve ever watched one of those high production value YouTube videos and wondered how they get those smooth shots where the camera seems to spin around an object, you were probably looking at the product of a motorized camera motion system. There’s no question these rigs can produce visually striking shots, but their high cost usually keeps them out of the hands of us lowly hackers.

Unless, of course, you do like [Andy] and build your own. The latest version of this impressive rig can rotate continuously thanks to commercial 12-wire slip rings, with optical endstops so the machine can still be homed at the beginning of a move. An onboard Raspberry Pi and Arduino Uno are responsible for controlling the stepper motors, in a configuration reminiscent of a standard 3D printer.
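As an illustration of the homing idea (not [Andy]’s firmware), a continuously rotating axis can creep toward the optical endstop until its flag breaks the beam, and that position becomes zero:

```cpp
// Homing a continuously rotating axis against an optical endstop.
// Pins and step timing are illustrative assumptions.
const int STEP_PIN    = 3;
const int DIR_PIN     = 4;
const int ENDSTOP_PIN = 5;  // optical interrupter, LOW when blocked

void homeAxis() {
  digitalWrite(DIR_PIN, HIGH);             // rotate toward the flag
  while (digitalRead(ENDSTOP_PIN) == HIGH) {
    digitalWrite(STEP_PIN, HIGH);          // one step pulse
    delayMicroseconds(5);
    digitalWrite(STEP_PIN, LOW);
    delayMicroseconds(800);                // slow homing speed
  }
  // Flag found: this position becomes the zero angle
}

void setup() {
  pinMode(STEP_PIN, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);
  pinMode(ENDSTOP_PIN, INPUT_PULLUP);
  homeAxis();
}

void loop() {}
```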

The MQTT remote can hold a phone for live video.

The software [Andy] has come up with lets him synchronize the camera rig with a small rotating platform he built, which allows for even more complex shots as demonstrated in the video below. It also supports a very slick MQTT-enabled remote controller that he built as a previous project, which makes taking direct control over the camera and monitoring its status much easier.

Want to add a little polish to your own project videos? [Andy] has released all of the files and information you’d need to build your own version of his motion control rig, though we wouldn’t blame you for feeling a bit intimidated by this one. It might not be the most elaborate camera motion control system we’ve seen, but it’s certainly up there. If you just want an overhead video and don’t need those fancy tracking shots, perhaps a modified VESA arm would fit the bill.


Making Minty Fresh Music With Markov Chains: The After Eight Step Sequencer

Step sequencers are fantastic instruments, but they can be a little, well, repetitive. At its core, the step sequencer is a pretty simple device: it loops through a series of notes or phrases that are sequentially ordered into steps. The operator can change the steps while the sequencer is looping, but it generally keeps a repetitive feel, as the musician isn’t likely to erase all of the steps and enter an entirely new set between phrases.

Enter our old friend machine learning. If we introduce a little variability at each step of the loop, the instrument can help the musician out, making the final product a bit more interesting. Such an instrument is exactly what [Charis Cat] set out to make when she created the After Eight Step Sequencer.

The After Eight is an eight-step sequencer that allows the artist to set each note with a series of potentiometers (which are, of course, housed in an After Eight mint tin). The potentiometers are read by an Arduino, which passes MIDI information to a computer running the popular music-oriented visual programming language Max MSP. The software uses a series of Markov chains to augment the series of notes the musician put in, effectively working with the artist to create music. The result is a piece of music that’s different every time it’s performed. Make sure to check out the video at the end for a fantastic overview of the project (and to hear the After Eight in action, of course)!
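The Markov step itself is conceptually tiny. [Charis Cat] does it in Max MSP, but the idea translates to a few lines of Arduino-flavored C, where a transition matrix gives the probability of each possible next note (the matrix values here are placeholders):

```cpp
const int NUM_NOTES = 8;

// Hypothetical transition probabilities: transition[i][j] is the
// chance of moving from note i to note j, and each row sums to 1.0.
const float transition[NUM_NOTES][NUM_NOTES] = {};

int nextNote(int current) {
  float r = random(1000) / 1000.0;  // uniform sample in 0..1
  float cumulative = 0;
  for (int n = 0; n < NUM_NOTES; n++) {
    cumulative += transition[current][n];
    if (r < cumulative) return n;   // sampled the next note
  }
  return current;                   // fallback for rounding error
}
```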

[Charis Cat]’s wonderful creation reminds us of some of the work [Sara Adkins] has done blending human performance with complex algorithms. It’s exactly the kind of thing we love to see at Hackaday: the fusion of a musician’s artistic intent with the stochastic unpredictability of a machine learning system to produce something unique.

Thanks to [Chris] for the tip!


Rodriguez — IV Curve Tracer On The Cheap

In response to an online discussion on the Electrical Engineering Stack Exchange, [Joseph Eoff] decided to prove his point by slapping together a bare-bones IV curve tracer using an Arduino Nano and a handful of passives. But he kept tinkering with the circuit to see just how much improvement he could wring out of this simple setup. He squeezes a bit of extra resolution out of the PWM DAC circuit by using the Timer1 library to obtain 1024 steps instead of 256. For reading voltages, he implements oversampling (and in some cases oversampling again) to eke out a few extra bits of resolution from the Nano’s 10-bit ADC. The whole thing is controlled by a Python/Qt script that generates the desired plots.
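Both resolution tricks are easy to sketch, though the details below (PWM period, oversampling factor, pin choices) are our assumptions rather than [Joseph]’s exact code:

```cpp
#include <TimerOne.h>

const int PWM_PIN = 9;  // Timer1 PWM output, smoothed by an RC filter

void setup() {
  Serial.begin(9600);
  Timer1.initialize(100);    // 100 us period, ~10 kHz PWM
  Timer1.pwm(PWM_PIN, 512);  // duty is 0-1023: a 1024-step "DAC"
}

// Each extra bit of ADC resolution costs 4x the samples: summing
// 16 reads and shifting right by 2 turns 10 bits into roughly 12.
int read12bit(int pin) {
  long sum = 0;
  for (int i = 0; i < 16; i++) {
    sum += analogRead(pin);
  }
  return sum >> 2;  // 0..4092
}

void loop() {
  // Sweep the PWM duty and record voltage/current pairs here
  Serial.println(read12bit(A0));
  delay(100);
}
```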

While it works and gives him the IV curves, this simplicity comes at a price. It’s slow — [Joseph] reports that it takes several minutes to trace out five different values of base current on a transistor. It was this lack of speed that inspired him to name the project after cartoon character Speedy Gonzales’s cousin, Slowpoke Rodriguez, AKA “the slowest mouse in all of Mexico”. In addition to being painstakingly slow, the tracer is limited to 5 volts and currents under 5 milliamps.

[Joseph] documents the whole design and build process over on his blog, and has made the source code available on GitHub should you want to try this yourself. We covered another interesting IV curve tracer build on cardboard ten years ago, but that one is much bigger than the Rodriguez.

A Phased-Array Ultrasonic 3D Scanner From Scratch

Who wouldn’t want an autonomous drone to deliver cans of fizzy drink fresh from the fridge? [Alex Toussaint] did, and in thinking about how such a machine might work, he embarked on a path that eventually led him to create a fully functional ultrasonic 3D scanner. In writing it up he’s produced a straightforward description of how the system works, which should also be of interest to anyone curious about phased array radar. He starts with an easy-to-understand explanation of the principle behind phased-array beamforming, and there follows his journey into electronics as he uses this ambitious project to learn the art from scratch. That he succeeded is a testament to his ability as well as his sheer tenacity.
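The arithmetic at the heart of beam steering is compact: to tilt the wavefront by an angle theta, each emitter in a row fires d·sin(theta)/c later than its neighbor. Here is a sketch, with the element pitch and row length assumed rather than taken from [Alex]’s array:

```cpp
#include <math.h>

const float SPEED_OF_SOUND = 343.0;   // m/s in air
const float ELEMENT_PITCH  = 0.011;   // meters between emitters (assumed)
const int   NUM_ELEMENTS   = 10;      // one row of the grid (assumed 10x10)

// Fill delays[] with each element's firing delay, in seconds, for a
// steering angle theta given in radians.
void steeringDelays(float theta, float delays[NUM_ELEMENTS]) {
  float step = ELEMENT_PITCH * sin(theta) / SPEED_OF_SOUND;
  for (int i = 0; i < NUM_ELEMENTS; i++) {
    delays[i] = i * step;
  }
  // For negative angles, step is negative; shift everything so the
  // smallest delay is zero before loading the timers.
  if (step < 0) {
    for (int i = 0; i < NUM_ELEMENTS; i++) {
      delays[i] -= (NUM_ELEMENTS - 1) * step;
    }
  }
}
```

Applying the same delay pattern along the other axis of the grid is what steers the beam vertically as well as horizontally.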

He finally arrived at a grid of 100 ultrasonic emitters controlled from an Arduino through a series of shift register boards. Using this he can steer his ultrasonic beam horizontally as well as vertically, and receive echoes from objects in three-dimensional space. The ornamental bird example he uses for his scanning tests doesn’t quite emerge in startling clarity, but it’s still clear that an object of its size and rough shape is visible enough for the drone in his original idea to detect. If you would like to experiment with the same techniques and array, all the resources can be found in a GitHub repository. Meanwhile, we’re still impressed by the progress from relative electronics novice to this, and we hope the ideas will be developed further.

We’ve seen ultrasonic arrays before, but mainly used in levitation experiments.