A clipping of the "3D Printing & Modelling" skill tree. An arrow pointing up is labeled "Advanced", and the page holds several hexagons for various skills, including blanks for writing in your own options and some of the more advanced entries like "Print in Nylon or ASA material".

Maker Skill Trees Help You Level Up Your Craft

Hacking and making are great fun due to their open-ended nature, but being able to try anything can make the task of selecting your next project daunting. [Steph Piper] is here with her Maker Skill Trees to give you a map to leveling up your skills.

Featuring a grid of 73 hexagonal tiles per discipline, there’s plenty of inspiration for what to tackle next in your journey. The trees start with the basics at the bottom and progressively move up in difficulty as you move up the page. With over 50 trees to select from (so far), you can probably find something to help you become better at anything from 3D printing and modeling to entrepreneurship or woodworking.

You’re spoiled for choice, but if there’s no tree for your particular interest (underwater basket weaving?), you can roll your own with the provided template and submit it for inclusion in the repository.

Want to get a jump on an AI Skill Tree? Try out these AI courses. Maybe you could use these to market yourself to potential employers or feel confident enough to strike out on your own?

[Thanks to Courtney for the tip!]

Continue reading “Maker Skill Trees Help You Level Up Your Craft”

A C64 SID Replacement With Built-in Games

Developer [frntc] has recently come up with a smaller and less expensive way not only to replace the SID chip in your Commodore 64 but also to make it a stereo SID! To top it off, it can also hold up to 16 games and launch them from a custom menu. The SIDKick Pico is a simple board with a Raspberry Pi Pico mounted on top. It uses a SID emulation engine based on reSID to emulate both major versions of the SID chip, the 6581 and the 8580. Unlike many other SID replacements, the SIDKick Pico also supports mouse and paddle inputs, meaning it replaces all functionality of the original SID!

Sound can be generated in three different ways: using PWM to create a mono audio signal routed out via the normal C64/C128 connectors, using an external PCM5102A DAC board, or using a different PCB design that has pads for an on-board DAC and TL072 op-amp. While many Commodore purists dislike using replacement chips, the reality is that all extant SID chips were made roughly 40 years ago, and as more and more of them fail, options like the SIDKick Pico are an excellent way to keep the sound of the SID alive.
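To give a rough idea of how that first, simplest option works, here is a minimal PWM-audio sketch for the Pico using the pico-sdk. It is not the SIDKick Pico firmware, which runs a full reSID-based emulation; the output pin, sample rate, and 440 Hz test tone are placeholder choices, and in the real device the sample would come from the SID engine rather than a sine function.

```cpp
// Minimal PWM-audio sketch for the Raspberry Pi Pico (pico-sdk).
// NOT the SIDKick Pico firmware: just an illustration of the "PWM as a
// budget DAC" idea behind its simplest output mode. The GPIO, the sample
// rate, and the 440 Hz sine test tone are placeholder choices.
#include <math.h>
#include "pico/stdlib.h"
#include "hardware/pwm.h"

static const uint AUDIO_PIN   = 20;     // assumed pin; RC low-pass filter to the audio path
static const uint SAMPLE_RATE = 31250;  // one sample every 32 us

int main() {
    gpio_set_function(AUDIO_PIN, GPIO_FUNC_PWM);
    uint slice = pwm_gpio_to_slice_num(AUDIO_PIN);

    // 8-bit resolution: the counter wraps at 255, so duty cycle == sample value.
    pwm_set_wrap(slice, 255);
    pwm_set_enabled(slice, true);

    const float two_pi = 6.28318530f;
    const float step = two_pi * 440.0f / SAMPLE_RATE;  // phase step for a 440 Hz tone
    float phase = 0.0f;

    while (true) {
        // In the real device this value would come from the SID emulation engine.
        uint8_t sample = (uint8_t)(127.5f + 127.5f * sinf(phase));
        phase += step;
        if (phase > two_pi) phase -= two_pi;

        pwm_set_gpio_level(AUDIO_PIN, sample);
        sleep_us(1000000 / SAMPLE_RATE);  // crude pacing; good enough for a demo tone
    }
}
```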

If you want to hear the SIDKick Pico in action, you can listen to the samples on the linked GitHub page, or check out the video below by YouTuber Wolfgang Kierdorf of the RETRO is the New Black channel. To get your hands on a SIDKick Pico, you can follow the instructions on the GitHub page for ordering either bare or pre-assembled PCBs from PCBWay or your board manufacturer of choice.

Continue reading “A C64 SID Replacement With Built-in Games”

A persistence-of-vision business card which displays information when shaken (not stirred).

2024 Business Card Challenge: Make Them Shake Your Handiwork

Before COVID, people traditionally sealed their initial introduction to each other with a handshake. Nowadays, that activity seems kind of questionable. But you can still give them something to shake if you build this persistence of vision (POV) business card from [chaosneon] to show your credentials in blinkenlights form.

As you might have guessed, the input comes from a tilt switch. The user simply shakes the card back and forth, and the sensor detects the direction and cadence of the shake. Cleverly, the pattern plays forwards on the swing and backwards on the back stroke, which just reinforces the POV effect. Don’t worry about how slow or fast to shake it, because the timing adjusts for your speed.

The first version used individual white LEDs, hand-soldered to an ATtiny2313. Now, in the updated version which you can see in the demo video after the break, [chaosneon] is using an RGB NeoPixel strip, which only needs one data wire to connect to the microcontroller. Thanks to this, [chaosneon] was able to downsize to an ATtiny85.
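To picture how little code that takes, here is a rough Arduino-style sketch of the swing-and-reverse playback described above. It is not [chaosneon]’s firmware: the pin numbers, the 16-column bitmap, and the timing constants are all assumptions made for illustration.

```cpp
// Rough sketch of the POV playback logic described above, not [chaosneon]'s
// firmware. Assumptions: a tilt switch between pin 2 and ground (internal
// pull-up), an 8-pixel NeoPixel strip on pin 0, and a made-up 16-column
// pattern. Each change of the tilt switch marks a direction reversal; the
// time between reversals sets how long each column is displayed.
#include <Adafruit_NeoPixel.h>

const uint8_t PIXEL_PIN = 0;
const uint8_t TILT_PIN  = 2;
const uint8_t NUM_PIX   = 8;
const uint8_t NUM_COLS  = 16;

// One byte per column, one bit per LED (placeholder diamond pattern).
const uint8_t pattern[NUM_COLS] = {
  0x18, 0x3C, 0x7E, 0xFF, 0xFF, 0x7E, 0x3C, 0x18,
  0x18, 0x3C, 0x7E, 0xFF, 0xFF, 0x7E, 0x3C, 0x18
};

Adafruit_NeoPixel strip(NUM_PIX, PIXEL_PIN, NEO_GRB + NEO_KHZ800);

void showColumn(uint8_t col) {
  for (uint8_t i = 0; i < NUM_PIX; i++) {
    bool on = pattern[col] & (1 << i);
    strip.setPixelColor(i, on ? strip.Color(16, 16, 16) : 0);
  }
  strip.show();
}

void setup() {
  pinMode(TILT_PIN, INPUT_PULLUP);
  strip.begin();
  strip.show();  // start with all pixels off
}

void loop() {
  static bool lastTilt = digitalRead(TILT_PIN);
  static unsigned long lastSwing = millis();
  static unsigned long columnTime = 5;   // ms per column, refined every swing
  static bool forward = true;

  bool tilt = digitalRead(TILT_PIN);
  if (tilt != lastTilt) {                // direction reversal detected
    unsigned long now = millis();
    // Spread the previous swing's duration over the columns, within sane bounds.
    columnTime = constrain((now - lastSwing) / NUM_COLS, 1UL, 20UL);
    lastSwing = now;
    lastTilt = tilt;
    forward = !forward;                  // play the pattern in reverse on the back stroke

    for (uint8_t c = 0; c < NUM_COLS; c++) {
      showColumn(forward ? c : (NUM_COLS - 1 - c));
      delay(columnTime);
    }
    strip.clear();
    strip.show();
  }
}
```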

Continue reading “2024 Business Card Challenge: Make Them Shake Your Handiwork”

ESP32 Powered Crunch-E Makes Beats On The Go

There’s no shortage of devices out there for creating electronic music, but if you’re just looking to get started, the prices on things like synthesizers and drum machines could be enough to give you second thoughts on the whole idea. But if you’ve got a well stocked parts bin, there’s a good chance you’ve already got most of what you need to build your own Crunch-E.

A Crunch-E built from stacked modules

Described by creator [Roman Revzin] as a “keychain form factor music-making platform”, the Crunch-E combines an ESP32, a MAX98357 I2S audio amplifier, an array of tactile buttons, and a sprinkling of LEDs and passives. It can be built on a perfboard using off-the-shelf modules, or you can spin up a PCB if you want something a bit more professional. It sounds like there will eventually be an option to purchase a pre-built Crunch-E as well.
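If you are breadboarding something along these lines, the I2S amplifier is the only mildly fiddly part. The sketch below is a bare-bones tone test for an ESP32 and a MAX98357 breakout, written against the classic ESP-IDF I2S driver as shipped with the 2.x Arduino-ESP32 core; the pin assignments are assumptions, and none of this is the actual CrunchOS audio code.

```cpp
// Bare-bones I2S tone test for an ESP32 and a MAX98357 breakout.
// NOT the CrunchOS audio engine; pin numbers are assumptions, and this
// targets the classic ESP-IDF I2S driver found in the 2.x Arduino-ESP32 core.
#include <Arduino.h>
#include <driver/i2s.h>
#include <math.h>

const int PIN_BCLK = 26;       // to MAX98357 BCLK
const int PIN_LRC  = 25;       // to MAX98357 LRC (word select)
const int PIN_DOUT = 22;       // to MAX98357 DIN
const int SAMPLE_RATE = 22050;

void setup() {
  i2s_config_t cfg = {};
  cfg.mode = (i2s_mode_t)(I2S_MODE_MASTER | I2S_MODE_TX);
  cfg.sample_rate = SAMPLE_RATE;
  cfg.bits_per_sample = I2S_BITS_PER_SAMPLE_16BIT;
  cfg.channel_format = I2S_CHANNEL_FMT_RIGHT_LEFT;
  cfg.communication_format = I2S_COMM_FORMAT_STAND_I2S;
  cfg.dma_buf_count = 4;
  cfg.dma_buf_len = 256;

  i2s_pin_config_t pins = {};
  pins.mck_io_num = I2S_PIN_NO_CHANGE;
  pins.bck_io_num = PIN_BCLK;
  pins.ws_io_num = PIN_LRC;
  pins.data_out_num = PIN_DOUT;
  pins.data_in_num = I2S_PIN_NO_CHANGE;

  i2s_driver_install(I2S_NUM_0, &cfg, 0, NULL);
  i2s_set_pin(I2S_NUM_0, &pins);
}

void loop() {
  // Fill a small buffer with a 220 Hz sine wave and stream it out over I2S.
  static float phase = 0.0f;
  const float step = 2.0f * PI * 220.0f / SAMPLE_RATE;
  int16_t buf[256];  // 128 stereo frames

  for (int i = 0; i < 256; i += 2) {
    int16_t s = (int16_t)(8000.0f * sinf(phase));
    buf[i]     = s;  // left
    buf[i + 1] = s;  // right
    phase += step;
    if (phase > 2.0f * PI) phase -= 2.0f * PI;
  }

  size_t written = 0;
  i2s_write(I2S_NUM_0, buf, sizeof(buf), &written, portMAX_DELAY);
}
```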

But ultimately, the hardware seems to be somewhat freeform — the implementation isn’t so important as long as you’ve got the major components and can get the provided software running on it.

The software, which [Roman] is calling CrunchOS, currently provides four tracks, ten synth instruments, and two drum machine banks. Everything can be accessed from a 4 x 4 button array, and there’s a “cheat sheet” in the documentation that shows what each key does in the default configuration. Judging by the demo video below, it’s already an impressively capable platform. But this is just the beginning. If everything goes according to plan and more folks start jamming on their own Crunch-E hardware, it’s not hard to imagine how the software side can be expanded and adapted over time.

Over the years we’ve seen plenty of homebrew projects for producing electronic music, but the low-cost, simple construction, and instant gratification nature of the Crunch-E strikes us as a particularly compelling combination. We’re eager to see where things develop from here.

Continue reading “ESP32 Powered Crunch-E Makes Beats On The Go”

Donkey Kong Bongos Ditch The GameCube, Go Mobile

Historically speaking, optional peripherals for game consoles tend not to be terribly successful. You’ll usually get a handful of games that support the thing, one of which will likely come bundled with it, and then the whole thing fades into obscurity to make way for the next new gimmick.

For example, did you know Nintendo offered a pair of bongos for the GameCube in 2003? They were used almost exclusively by the trio of Donkey Konga rhythm games, although only two of them were ever released outside of Japan. While the games might not have been huge hits, they were successful enough to stick in the memory of [bl3i], who wanted a way to keep the DK bongo experience alive.

The end result is, arguably, more elegant than the hokey musical controller deserves. While most people would have just gutted the plastic bongos and crammed in some new hardware, [bl3i] went through considerable effort so the original hardware would remain intact. His creation simply snaps onto the bongos and connects to them via the original cable.

Internally, the device uses an Arduino to read the output of the bongos (which appeared to the GameCube essentially as a standard controller) and play the appropriate WAV files from an SD card as hits are detected. Add in an audio amplifier module and a battery, and Nintendo’s bongos can finally go forth into the world and spread their beats.
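The overall shape of that firmware is easy to imagine, and the sketch below is only meant to illustrate it, not reproduce [bl3i]’s code. The joybus decoding that actually reads the bongos is reduced to a placeholder function, and playback uses the TMRpcm library to stream 8-bit mono WAVs from SD to a PWM pin, which is one simple way to do it and not necessarily the approach used in this project.

```cpp
// Illustration only: detect a bongo hit, play a matching WAV from SD.
// NOT [bl3i]'s code. readBongoHit() is a placeholder for the GameCube
// joybus decoding the real project performs, and TMRpcm (8-bit mono WAV
// out of a PWM pin) is just one simple playback option.
#include <SPI.h>
#include <SD.h>
#include <TMRpcm.h>

const int SD_CS_PIN   = 10;  // assumed SD card chip-select pin
const int SPEAKER_PIN = 9;   // assumed PWM pin feeding the amplifier module

TMRpcm audio;

// Placeholder: return 0 for no hit, or 1..4 for the four bongo pads.
// The real firmware gets this by polling the controller protocol.
int readBongoHit() {
  return 0;
}

void setup() {
  audio.speakerPin = SPEAKER_PIN;
  if (!SD.begin(SD_CS_PIN)) {
    return;  // no card, nothing to play
  }
  audio.setVolume(5);
}

void loop() {
  static char samples[4][12] = {"BONGO1.WAV", "BONGO2.WAV",
                                "BONGO3.WAV", "BONGO4.WAV"};  // placeholder names
  int hit = readBongoHit();
  if (hit > 0 && !audio.isPlaying()) {
    audio.play(samples[hit - 1]);
  }
}
```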

As far as we’re able to tell, this is the first time the Donkey Kong bongos have ever graced the pages of Hackaday in any form, so congratulations to [bl3i] for getting there first. But it’s certainly not the first time we’ve covered ill-conceived game gadgets — long-time readers will perhaps be familiar with Nintendo’s attempt to introduce the Robotic Operating Buddy (ROB) to households back in 1985.

Continue reading “Donkey Kong Bongos Ditch The GameCube, Go Mobile”

AI Kayak Controller Lets The Paddle Show The Way

Controlling an e-bike is pretty straightforward. If you want to just let it rip, it’s a no-brainer — or rather, a one-thumber, as a thumb throttle is the way to go. Or, if you’re still looking for a bit of the experience of riding a bike, sensing when the pedals are turning and giving the rider a boost with the motor is a good option.

But what if your e-conveyance is more of the aquatic variety? That’s an interface design problem of a different color, as [Braden Sunwold] has discovered with his DIY e-kayak. We’ve detailed his work on this already, but for a short recap, his goal is to create an electric assist for his inflatable kayak, one that gives a boost when needed without taking away from the experience of kayaking. To that end, he used the motor and propeller from a hydrofoil to provide the needed thrust, while puzzling through the problem of building an unobtrusive yet flexible controller for the motor.

His answer is to mount an inertial measurement unit (IMU) in a waterproof container that can clamp to the kayak paddle. The controller is battery-powered and uses an nRF link to talk to a Raspberry Pi in the kayak’s waterproof electronics box. The sensor also has an LED ring light to provide feedback to the pilot. The controller is set up to support both a manual mode, which just turns on the motor and turns the kayak into a (low) power boat, and an automatic mode, which detects when the pilot is paddling and provides a little thrust in the desired direction of travel.
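As a toy illustration of what the automatic mode has to do, the snippet below smooths the accelerometer magnitude and treats each sustained excursion above a threshold as a burst of paddling that could bump up the motor assist. It is a naive stand-in for the machine-learning approach [Braden] is actually pursuing, and every constant in it is a guess.

```cpp
// Toy paddling detector: NOT [Braden]'s controller code.
// It low-pass filters the acceleration magnitude and reports a "paddling
// event" whenever the smoothed value rises above a threshold, one event
// per sustained burst. All constants are guesses for illustration.
#include <cmath>
#include <cstdio>
#include <vector>

struct PaddleDetector {
    float smoothed = 1.0f;   // filtered |accel| in g (about 1.0 g at rest)
    bool active = false;     // currently above the threshold?
    int events = 0;

    // Feed one accelerometer sample (in g); returns true when a burst starts.
    bool update(float ax, float ay, float az) {
        float mag = std::sqrt(ax * ax + ay * ay + az * az);
        smoothed = 0.9f * smoothed + 0.1f * mag;   // exponential smoothing
        bool above = smoothed > 1.3f;              // threshold is a guess
        bool started = above && !active;
        active = above;
        if (started) events++;
        return started;
    }
};

int main() {
    PaddleDetector det;

    // Synthetic data: three bursts of "paddling" separated by quiet gliding.
    std::vector<float> accel;
    for (int burst = 0; burst < 3; burst++) {
        for (int i = 0; i < 100; i++) accel.push_back(1.0f);
        for (int i = 0; i < 40; i++)
            accel.push_back(1.0f + 0.8f * std::fabs(std::sin(i * 0.15f)));
    }

    for (float a : accel) {
        if (det.update(a, 0.0f, 0.0f)) {
            std::printf("paddling burst %d detected -> increase motor assist\n",
                        det.events);
        }
    }
    return 0;
}
```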

The video below shows the non-trivial amount of effort [Braden] and his project partner [Jordan] put into making the waterproof enclosure for the controller. The clamp is particularly interesting, especially since it has to keep the sensor properly oriented on the paddle. [Braden] is working on a machine-learning method to analyze paddle motions to discern what the pilot is doing and where the kayak should head. Once he has that model built, it should be time to hit the water and see what this thing can do. We’re eager to see the results.

Continue reading “AI Kayak Controller Lets The Paddle Show The Way”

EMO: Alibaba’s Diffusion Model-Based Talking Portrait Generator

Alibaba’s EMO (Emote Portrait Alive) framework is a recent entry in a series of attempts to generate a talking head using existing audio (spoken word or vocal audio) and a reference portrait image as inputs. At its core it uses a diffusion model trained on 250 hours of video footage and over 150 million images. But unlike previous attempts, it adds what the researchers call a speed controller and a face region controller to stabilize the generated frames, along with an additional module that stops the diffusion model from outputting frames which stray too far from the reference image used as input.

In the related paper, [Linrui Tian] and colleagues show a number of comparisons between EMO and other frameworks, claiming significant improvements over them. The researchers also provide a number of examples of talking and singing heads generated with the framework, which gives some idea of what are probably the ‘best case’ outputs. In some of them, like [Leslie Cheung Kwok Wing] singing ‘Unconditional’, big glitches are obvious and there’s a definite mismatch between the vocal track and the facial motions. Despite this, the results are quite impressive, especially the fairly realistic head movement and the blinking of the eyes.

Meanwhile, some seem extremely impressed, such as [Matthew Berman] in a recent video on EMO, where he states that Alibaba releasing this framework to the public might be ‘too dangerous’. The level-headed folks over at PetaPixel, however, also note the obvious visual imperfections that are a dead giveaway for this kind of generative technology. Much like other diffusion model-based generators, it would seem that EMO is still very much stuck in the uncanny valley, with no clear path to becoming a real human yet.

Continue reading “EMO: Alibaba’s Diffusion Model-Based Talking Portrait Generator”