ESP32 Powered Crunch-E Makes Beats On The Go

There’s no shortage of devices out there for creating electronic music, but if you’re just looking to get started, the prices on things like synthesizers and drum machines could be enough to give you second thoughts about the whole idea. But if you’ve got a well-stocked parts bin, there’s a good chance you already have most of what you need to build your own Crunch-E.

A Crunch-E built from stacked modules

Described by creator [Roman Revzin] as a “keychain form factor music-making platform”, the Crunch-E combines an ESP32, a MAX98357 I2S audio amplifier, an array of tactile buttons, and a sprinkling of LEDs and passives. It can be built on perfboard using off-the-shelf modules, or you can spin up a PCB if you want something a bit more professional. It sounds like there will eventually be an option to purchase a pre-built Crunch-E as well.

But ultimately, the hardware seems to be somewhat freeform — the implementation isn’t so important as long as you’ve got the major components and can get the provided software running on it.
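
If you’re wiring up your own, the audio path needs surprisingly little glue code. The minimal Arduino-style sketch below (not CrunchOS code, purely an illustration) pushes a sine tone to a MAX98357 over I2S using the ESP-IDF legacy I2S driver; the pin assignments are assumptions and should be matched to however you hook up your board.

```cpp
// Illustrative only -- not CrunchOS code. Pin numbers are assumptions for a
// hand-wired build; match them to however you connect your MAX98357.
#include <Arduino.h>
#include <driver/i2s.h>
#include <math.h>

constexpr int kBclkPin    = 26;     // MAX98357 BCLK (assumed wiring)
constexpr int kLrclkPin   = 25;     // MAX98357 LRC  (assumed wiring)
constexpr int kDataPin    = 22;     // MAX98357 DIN  (assumed wiring)
constexpr int kSampleRate = 44100;

void setup() {
  i2s_config_t cfg = {};
  cfg.mode = (i2s_mode_t)(I2S_MODE_MASTER | I2S_MODE_TX);
  cfg.sample_rate = kSampleRate;
  cfg.bits_per_sample = I2S_BITS_PER_SAMPLE_16BIT;
  cfg.channel_format = I2S_CHANNEL_FMT_RIGHT_LEFT;
  cfg.communication_format = I2S_COMM_FORMAT_STAND_I2S;
  cfg.dma_buf_count = 4;
  cfg.dma_buf_len = 256;

  i2s_pin_config_t pins = {};
  pins.bck_io_num = kBclkPin;
  pins.ws_io_num = kLrclkPin;
  pins.data_out_num = kDataPin;
  pins.data_in_num = I2S_PIN_NO_CHANGE;
  // (on newer cores you may also want to set pins.mck_io_num = I2S_PIN_NO_CHANGE)

  i2s_driver_install(I2S_NUM_0, &cfg, 0, nullptr);
  i2s_set_pin(I2S_NUM_0, &pins);
}

void loop() {
  // Stream a continuous 440 Hz sine wave -- a stand-in for a synth voice.
  static float phase = 0.0f;
  int16_t buf[256];                          // 128 stereo frames
  for (int i = 0; i < 256; i += 2) {
    int16_t s = (int16_t)(sinf(phase) * 12000);
    buf[i] = s;                              // left
    buf[i + 1] = s;                          // right (the amp uses one channel)
    phase += 2.0f * PI * 440.0f / kSampleRate;
    if (phase > 2.0f * PI) phase -= 2.0f * PI;
  }
  size_t written = 0;
  i2s_write(I2S_NUM_0, buf, sizeof(buf), &written, portMAX_DELAY);
}
```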

The software, which [Roman] is calling CrunchOS, currently provides four tracks, ten synth instruments, and two drum machine banks. Everything can be accessed from a 4 x 4 button array, and there’s a “cheat sheet” in the documentation that shows what each key does in the default configuration. Judging by the demo video below, it’s already an impressively capable platform. But this is just the beginning. If everything goes according to plan and more folks start jamming on their own Crunch-E hardware, it’s not hard to imagine how the software side can be expanded and adapted over time.

Over the years we’ve seen plenty of homebrew projects for producing electronic music, but the low-cost, simple construction, and instant gratification nature of the Crunch-E strikes us as a particularly compelling combination. We’re eager to see where things develop from here.

Continue reading “ESP32 Powered Crunch-E Makes Beats On The Go”

A business card-sized, solar-powered weather station.

2024 Business Card Challenge: Weather Or Not You Get The Job

What’s the easiest way to break the ice with someone you’ve just met? If you’re not immediately talking shop, then it’s probably the time-tested subject of the weather. So what better way to get the conversation started than with a lovely solar-powered circuit sculpture of a business card that displays the weather?

We love that the frame has a built-in stand; that’s a great touch that really turns this card into something someone might keep on their desk long-term. The brains of this operation is an ESP32 TTGO E-paper board, which checks the battery voltage before connecting to Wi-Fi and pulling data from the OpenWeatherMap API. It displays the information, then goes to sleep for 15 minutes.
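
That firmware loop is a classic wake-check-fetch-sleep cycle, and it’s simple enough to sketch. The Arduino-style ESP32 sketch below is not the card’s actual firmware; the battery pin, divider ratio, threshold, credentials, and API key are all placeholders, but it shows the general shape using the stock WiFi and HTTPClient libraries and the ESP32’s deep-sleep timer.

```cpp
// Illustrative sketch of the wake/fetch/sleep cycle -- not the card's actual
// firmware. Pin, divider ratio, threshold, credentials, and API key are
// placeholders.
#include <WiFi.h>
#include <HTTPClient.h>
#include <esp_sleep.h>

constexpr int kBatteryPin = 35;                    // ADC pin behind a divider (assumed)
constexpr float kMinBatteryVolts = 3.3f;           // skip Wi-Fi below this (assumed)
constexpr uint64_t kSleepUs = 15ULL * 60ULL * 1000000ULL;   // 15 minutes

float readBatteryVolts() {
  // Assumes a 2:1 divider into a 12-bit ADC referenced to ~3.3 V.
  return analogRead(kBatteryPin) * (3.3f / 4095.0f) * 2.0f;
}

void setup() {
  Serial.begin(115200);

  if (readBatteryVolts() >= kMinBatteryVolts) {
    WiFi.begin("your-ssid", "your-password");      // placeholders
    for (int i = 0; i < 40 && WiFi.status() != WL_CONNECTED; i++) delay(250);

    if (WiFi.status() == WL_CONNECTED) {
      HTTPClient http;
      http.begin("http://api.openweathermap.org/data/2.5/weather"
                 "?q=Paris&units=metric&appid=YOUR_API_KEY");   // placeholder key
      if (http.GET() == HTTP_CODE_OK) {
        String payload = http.getString();
        // Parse the JSON and draw it on the e-paper display here.
        Serial.println(payload);
      }
      http.end();
    }
  }

  // E-paper holds the last image without power, so just sleep until next time.
  esp_sleep_enable_timer_wakeup(kSleepUs);
  esp_deep_sleep_start();
}

void loop() {}   // never reached; the board resets on every wake-up
```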

For power, [BLANCHARD Jordan] is using a 5 V solar panel and a small battery from an old vape pen. We love to see projects that keep those things out of the landfills, so don’t sleep on using them.

You have just a few weeks left to enter the 2024 Business Card Challenge, so fire up those soldering irons and get hackin’!

Donkey Kong Bongos Ditch The GameCube, Go Mobile

Historically speaking, optional peripherals for game consoles tend not to be terribly successful. You’ll usually get a handful of games that support the thing, one of which will likely come bundled with it, and then the whole thing fades into obscurity to make way for the next new gimmick.

For example, did you know Nintendo offered a pair of bongos for the GameCube in 2003? They were used almost exclusively by the trio of Donkey Konga rhythm games, although only two of them were ever released outside of Japan. While the games might not have been huge hits, they were successful enough to stick in the memory of [bl3i], who wanted a way to keep the DK bongo experience alive.

The end result is, arguably, more elegant than the hokey musical controller deserves. While most people would have just gutted the plastic bongos and crammed in some new hardware, [bl3i] went through considerable effort so the original hardware would remain intact. His creation simply snaps onto the bongos and connects to them via the original cable.

Internally, the device uses an Arduino to read the output of the bongos (which appeared to the GameCube essentially as a standard controller) and play the appropriate WAV files from an SD card as hits are detected. Add in an audio amplifier module and a battery, and Nintendo’s bongos can finally go forth into the world and spread their beats.
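
At its core the firmware is just edge detection: poll the bongos, and when a pad goes from released to pressed, fire the matching sample. The Arduino-style sketch below shows that idea in the abstract; readBongoState() and playWav() are hypothetical stand-ins for the GameCube controller polling and the SD/WAV playback that the real build handles.

```cpp
// Conceptual sketch only. readBongoState() and playWav() are hypothetical
// stand-ins for the GameCube controller polling and the SD/WAV playback
// used in the real build; the file names are made up.
#include <Arduino.h>

struct BongoState {
  bool leftPad, rightPad, clap, start;
};

// Hypothetical: poll the bongos over the GameCube controller's one-wire bus.
BongoState readBongoState() {
  BongoState s = {};
  // ... request and decode the controller status packet here ...
  return s;
}

// Hypothetical: stream a WAV file from the SD card out through the amplifier.
void playWav(const char* path) {
  (void)path;
  // ... open the file and push samples to the audio output here ...
}

BongoState previous = {};

void setup() {}

void loop() {
  BongoState now = readBongoState();

  // Trigger a sample only on the released -> pressed edge of each pad, so a
  // held pad doesn't retrigger the sound on every pass through the loop.
  if (now.leftPad && !previous.leftPad)   playWav("/bongo_left.wav");
  if (now.rightPad && !previous.rightPad) playWav("/bongo_right.wav");
  if (now.clap && !previous.clap)         playWav("/clap.wav");

  previous = now;
  delay(5);   // ~200 Hz polling is plenty for hand drumming
}
```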

As far as we’re able to tell, this is the first time the Donkey Kong bongos have ever graced the pages of Hackaday in any form, so congratulations to [bl3i] for getting there first. But it’s certainly not the first time we’ve covered ill-conceived game gadgets — long-time readers will perhaps be familiar with Nintendo’s attempt to introduce the Robotic Operating Buddy (ROB) to households back in 1985.

Continue reading “Donkey Kong Bongos Ditch The GameCube, Go Mobile”

AI Kayak Controller Lets The Paddle Show The Way

Controlling an e-bike is pretty straightforward. If you want to just let it rip, it’s a no-brainer — or rather, a one-thumber, as a thumb throttle is the way to go. Or, if you’re still looking for a bit of the experience of riding a bike, sensing when the pedals are turning and giving the rider a boost with the motor is a good option.

But what if your e-conveyance is more of the aquatic variety? That’s an interface design problem of a different color, as [Braden Sunwold] has discovered with his DIY e-kayak. We’ve detailed his work on this already, but for a short recap, his goal is to create an electric assist for his inflatable kayak, to give you a boost when you need it without taking away from the experience of kayaking. To that end, he used the motor and propeller from a hydrofoil to provide the needed thrust, while puzzling through the problem of building an unobtrusive yet flexible controller for the motor.

His answer is to mount an inertial measurement unit (IMU) in a waterproof container that can clamp to the kayak paddle. The controller is battery-powered and uses an nRF link to talk to a Raspberry Pi in the kayak’s waterproof electronics box. The sensor also has an LED ring light to provide feedback to the pilot. The controller is set up to support both a manual mode, which just turns on the motor and turns the kayak into a (low) power boat, and an automatic mode, which detects when the pilot is paddling and provides a little thrust in the desired direction of travel.
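
Stripped to its essentials, the automatic mode has to answer one question: does this motion look like a paddle stroke? A heavily simplified, threshold-based Arduino sketch of that idea follows; the real controller is surely smarter, and the nRF24L01/RF24 radio choice, the pins, the thresholds, and the stubbed-out readAccelMagnitude() helper are all assumptions.

```cpp
// Simplified illustration of the auto-assist idea, not the project firmware.
// The radio choice (nRF24L01 + RF24 library), pins, thresholds, and the
// stubbed IMU helper are all assumptions.
#include <Arduino.h>
#include <SPI.h>
#include <RF24.h>

RF24 radio(9, 10);                   // CE, CSN (assumed wiring)
const byte kPipe[6] = "kayak";

// Hypothetical stand-in for the IMU driver: acceleration magnitude in g
// with gravity removed.
float readAccelMagnitude() {
  // ... read the IMU and return |a| - 1 g here ...
  return 0.0f;
}

struct ThrustCommand {
  uint8_t level;                     // 0 = motor off, 255 = full assist
};

void setup() {
  radio.begin();
  radio.openWritingPipe(kPipe);
  radio.stopListening();             // the paddle end only transmits
}

void loop() {
  // Crude stroke detector: acceleration above a threshold counts as paddling,
  // and a short hold-off keeps the assist running between strokes.
  static uint32_t lastStrokeMs = 0;
  if (readAccelMagnitude() > 0.6f) { // threshold is a guess
    lastStrokeMs = millis();
  }

  ThrustCommand cmd;
  cmd.level = (millis() - lastStrokeMs < 1500) ? 120 : 0;
  radio.write(&cmd, sizeof(cmd));    // the Raspberry Pi side drives the motor

  delay(50);                         // ~20 Hz command rate
}
```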

The video below shows the non-trivial amount of effort [Braden] and his project partner [Jordan] put into making the waterproof enclosure for the controller. The clamp is particularly interesting, especially since it has to keep the sensor properly oriented on the paddle. [Braden] is working on a machine-learning method to analyze paddle motions, to discern what the pilot is doing and where they want the kayak to go. Once he has that model built, it should be time to hit the water and see what this thing can do. We’re eager to see the results.

Continue reading “AI Kayak Controller Lets The Paddle Show The Way”

EMO: Alibaba’s Diffusion Model-Based Talking Portrait Generator

Alibaba’s EMO (or Emote Portrait Alive) framework is a recent entry in a series of attempts to generate a talking head using existing audio (spoken word or vocal audio) and a reference portrait image as inputs. At its core it uses a diffusion model trained on 250 hours of video footage and over 150 million images. But unlike previous attempts, it adds what the researchers call a speed controller and a face region controller, which serve to stabilize the generated frames, along with an additional module that stops the diffusion model from outputting frames that stray too far from the reference image used as input.

In the related paper by [Linrui Tian] and colleagues, a number of comparisons are shown between EMO and other frameworks, claiming significant improvements over them. The researchers provide a number of examples of talking and singing heads generated using the framework, which gives some idea of what are probably the ‘best case’ outputs. In some examples, like [Leslie Cheung Kwok Wing] singing ‘Unconditional’, big glitches are obvious and there’s a definite mismatch between the vocal track and the facial motions. Despite this, it’s quite impressive, especially with the fairly realistic movement of the head, including blinking of the eyes.

Meanwhile, some seem extremely impressed, such as [Matthew Berman] in a recent video on EMO, where he states that Alibaba releasing this framework to the public might be ‘too dangerous’. The level-headed folks over at PetaPixel, however, note the obvious visual imperfections that are a dead giveaway for this kind of generative technology. Much like other diffusion model-based generators, it would seem that EMO is still very much stuck in the uncanny valley, with no clear path to becoming a real human yet.

Continue reading “EMO: Alibaba’s Diffusion Model-Based Talking Portrait Generator”

Bidirectional Data Transfer Through Mud?

We take easy communications for granted these days. It’s no bother to turn on a lightbulb remotely via a radio link or to sense the water level in your petunias, but how does a drilling rig get data from the drill head while it’s deep underground, below the sea bed? The answer is mud pulse telemetry, about which a group of researchers has produced a study, specifically on modelling the signal impairments and on strategies for maintaining the data rate and improving signal quality.

If you’re still confused, mud pulse telemetry (MPT) works by sending a modulated pressure wave vertically through the column of mud inside the drilling tube. During drilling, it’s essential to have real-time data on the exact angle and direction the drill bit is pointing (so it can be corrected), along with details of the geological formations, so decisions can be made promptly. The goal is to reduce drilling time and, therefore, costs, and to minimize environmental impact — although some would strongly argue about that last point.
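
To make the idea concrete, here is a purely conceptual C++ toy that frames a single 16-bit downhole measurement as a schedule of valve open/close time slots, i.e. simple on-off keying at about one bit per second. Real MPT tools use far more sophisticated modulation and error coding; every name and timing below is invented for illustration.

```cpp
// Purely conceptual: a toy encoder that frames one 16-bit downhole value as a
// schedule of valve open/close slots (simple on-off keying). Real MPT tools
// use far more robust modulation and error coding; all timings are invented.
#include <cstdint>
#include <cstdio>
#include <vector>

struct PulseSlot {
  double startSeconds;
  bool valveOpen;   // true = pulser restricts mud flow, raising pressure
};

// Encode a 16-bit measurement (say, toolface angle in hundredths of a degree)
// as a sync pulse followed by one time slot per bit, MSB first.
std::vector<PulseSlot> encodeWord(uint16_t value, double slotSeconds) {
  std::vector<PulseSlot> schedule;
  schedule.push_back({0.0, true});                 // sync/start pulse
  for (int bit = 15; bit >= 0; --bit) {
    double t = slotSeconds * (16 - bit);           // one slot per bit
    bool one = (value >> bit) & 1;
    schedule.push_back({t, one});                  // pulse = 1, quiet = 0
  }
  return schedule;
}

int main() {
  // Send a reading of 123.45 degrees with 1-second slots: at roughly one bit
  // per second it is easy to see why MPT data rates are so low.
  for (const PulseSlot& slot : encodeWord(12345, 1.0)) {
    std::printf("t=%5.1f s  valve %s\n", slot.startSeconds,
                slot.valveOpen ? "OPEN (pulse)" : "closed");
  }
  return 0;
}
```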

Continue reading “Bidirectional Data Transfer Through Mud?”

A business card-sized love detector in a 3D-printed package.

2024 Business Card Challenge: Who Do You Love?

When you hand your new acquaintance one of your cards, there’s a chance you might feel an instant connection. But what if you could know almost instantly whether they felt the same way? With the Dr. Love card, you can erase all doubt.

As you may have guessed, the card uses Galvanic Skin Response. That’s the fancy term for the fact that your skin’s electrical properties change when you sweat, making it easier for electricity to pass through it. There are two sensors, one on each short end of the card where you would both naturally touch it upon exchange. Except this time, if you want to test the waters, you’ll have to wait 10-15 seconds while Dr. Love assesses your chemistry.

The doctor in this case is an RP2040-LCD-0.96, which is what it sounds like — a Raspberry Pi Pico with a small LCD attached. For the sensors, [Un Kyu Lee] simply used 8mm-wide strips of nickel. If you want to build your own, be sure to check out the build guide and watch the video after the break for a demonstration of Dr. Love in action.
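
A common way to read galvanic skin response is as a simple voltage divider: one electrode tied to 3.3 V, the other feeding an ADC pin with a pull-down resistor, so sweatier contact means lower skin resistance and a higher reading. The card may well do it differently, but the rough Arduino-style RP2040 sketch below captures that approach; the pin, the sampling window, and the ‘love’ thresholds are all invented, and driving the little LCD is left out.

```cpp
// Rough illustration of a GSR measurement, not [Un Kyu Lee]'s firmware.
// Assumes one nickel strip tied to 3.3 V and the other feeding an ADC pin
// with a pull-down resistor to ground; thresholds and timing are invented.
#include <Arduino.h>

constexpr int kSensePin = A0;                  // ADC input (assumed)
constexpr unsigned long kSampleMs = 12000;     // ~12 s of contact, per the card

void setup() {
  Serial.begin(115200);
}

void loop() {
  // Average the divider voltage while both people hold their ends of the card.
  unsigned long start = millis();
  uint32_t sum = 0, count = 0;
  while (millis() - start < kSampleMs) {
    sum += analogRead(kSensePin);              // higher reading = sweatier grip
    count++;
    delay(20);
  }
  uint32_t avg = count ? sum / count : 0;

  // Map the averaged reading to a tongue-in-cheek verdict.
  if (avg > 600)      Serial.println("Dr. Love says: sparks are flying!");
  else if (avg > 300) Serial.println("Dr. Love says: there's potential.");
  else                Serial.println("Dr. Love says: strictly business.");

  delay(2000);
}
```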

Continue reading “2024 Business Card Challenge: Who Do You Love?”