New Part Day: RP2040 Chips In Single Unit Quantities

Since the launch of the Raspberry Pi Pico back in January, the little board with its newly-designed RP2040 microcontroller has really caught the imagination of makers everywhere, and we have seen an extremely impressive array of projects using it. So far the RP2040 has only been available on a ready-made PCB module, but we have news today direct from Eben Upton himself that, with around 600k units already shipped, single-unit sales of the chip are commencing via the network of Raspberry Pi Approved Resellers.

This news will doubtless result in a fresh explosion of clever projects using the chip, but perhaps more intriguingly it will inevitably result in its appearance at the heart of a new crop of niche products that go beyond simple clones of the Pico in different form factors. The special ingredient is the chip's pair of PIO (programmable I/O) blocks, whose state machines take repetitive tasks off the cores and lift the RP2040 above being merely yet another microcontroller, and we look forward to seeing that feature at the heart of those new designs.
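To give a flavour of what that offloading looks like in practice, here's a minimal MicroPython sketch for the Pico in which a PIO state machine drives a pin entirely on its own once started; the pin choice and clock rate are arbitrary demo values, and the C SDK exposes the same facility.

```python
# MicroPython on an RP2040: a PIO state machine toggles a pin forever
# with no CPU involvement after setup. Pin 25 (the Pico's on-board LED)
# and the 2 kHz PIO clock are arbitrary choices for this demo.
import rp2
from machine import Pin

@rp2.asm_pio(set_init=rp2.PIO.OUT_LOW)
def blink():
    set(pins, 1) [31]   # drive the pin high, then idle for 31 cycles
    set(pins, 0) [31]   # drive it low, then idle for 31 cycles

# 64 PIO cycles per loop at 2 kHz gives roughly a 31 Hz square wave.
sm = rp2.StateMachine(0, blink, freq=2000, set_base=Pin(25))
sm.active(1)            # from here on, both cores are free for other work
```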

The Broadcom systems-on-chip that power Raspberry Pi’s existing range of Linux-capable boards have famously remained unavailable on their own, meaning that this move to being a chip vendor breaks further new ground for the Cambridge-based company. It’s best not to think of it in terms of their entering into competition with the giants of the microcontroller market though, because a relative minnow such as the RP2040 will be of little immediate concern to the likes of Microchip, ST, or TI. A better comparison when evaluating the RP2040’s chances in the market is probably Parallax with their Propeller chip, in that here is a company with a very solid existing presence in the education and maker markets seeking to capitalise on that experience by providing a microcontroller with that niche in mind. We look forward to seeing where this will take them, and we’d hope to eventually see a family of RP2040-like chips with different package and on-board peripheral options.

Neural Networks Emulate Any Guitar Pedal For $120

It’s a well-established fact that a guitarist’s acumen can be accurately gauged by the size of their pedal board: the more stompboxes, the better the player. Why have one box that can do everything when you can have many that do just a few things?

Jokes aside, the idea of replacing an entire pedal collection with a single box is nothing new. Your standard, old-school stompbox is an analog affair, using a combination of filters and amplifiers to achieve a certain sound. Some modern multi-effects processors use software models of older pedals to replicate their sound. These digital pedals have been around since the 90s, but none have been quite like the NeuralPi project. Just released by [GuitarML], the NeuralPi takes about $120 of hardware (including — you guessed it — a Raspberry Pi) and transforms it into the perfect pedal.

The key here, of course, is neural networks. The LSTM at the core of NeuralPi can be trained on any pedal you’ve got lying around to accurately reproduce its sound, and it can even do so with incredibly low latency thanks to Elk Audio OS (which even powers Matt Bellamy’s synth guitar, as used in Muse‘s Simulation Theory World Tour). The result of training is exported as a VST3 plugin, a popular audio effect format.
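The actual training pipeline lives in the [GuitarML] repositories, but as a rough illustration of the idea, here's a minimal Keras sketch of an LSTM learning to map a dry guitar signal to a recording of the same signal through the pedal being modelled. The layer sizes, sample framing, and loss here are placeholder choices, not the project's.

```python
import numpy as np
import tensorflow as tf

# dry_audio / wet_audio: aligned mono recordings of the same performance,
# dry straight from the guitar and wet through the target pedal.
# Random stand-ins here; real training data comes from reamping.
dry_audio = np.random.randn(44100 * 10).astype("float32")
wet_audio = np.random.randn(44100 * 10).astype("float32")

frame = 1024  # samples per training sequence (placeholder)
n = len(dry_audio) // frame
x = dry_audio[: n * frame].reshape(n, frame, 1)
y = wet_audio[: n * frame].reshape(n, frame, 1)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, return_sequences=True, input_shape=(None, 1)),
    tf.keras.layers.Dense(1),   # one output sample per input sample
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=10, batch_size=16)
# Once trained, the weights are baked into the plugin/pedal runtime.
```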

This isn’t the first time we’ve seen some seriously cool stuff from [GuitarML], and it also hearkens back a bit to some sweet pedal simulation in LTSpice we saw last year. We can’t wait to see this project continue to develop — over time, it would be awesome to see a slick UI, or maybe somebody will design a cool enclosure with some knobs and an honest-to-god pedal for user input!

Thanks to [Mish] for the tip!

Continue reading “Neural Networks Emulate Any Guitar Pedal For $120”

Raspberry Pi RP2040: Hands-On Experiences From An STM32 Perspective

The release of the Raspberry Pi Foundation’s Raspberry Pi Pico board with RP2040 microcontroller has made big waves these past months in the maker community. Many have demonstrated how the two Programmable I/O (PIO) peripherals in particular can be used to create DVI video generators and other digital peripherals.

Alongside this excitement comes the question of whether any of this will cause a major upheaval for those of us using STM32, SAM and other Cortex-M based MCUs. Would the RP2040 perhaps be a valid option for some of our projects? With the RP2040 being a dual-core Cortex-M0+ MCU, it seems only fair to put it toe to toe with the offerings from one of the current heavyweights in the 32-bit ARM MCU space: ST Microelectronics.

Did the Raspberry Pi Foundation pipsqueak manage to show ST’s engineers how it’s done, or should the former revisit some of their assumptions? And just how hard is it going to be to port low-level code from STM32 to RP2040? Continue reading “Raspberry Pi RP2040: Hands-On Experiences From An STM32 Perspective”

Omnibot From The 80s Gets LED Matrix Eyes, Camera

[Ramin Assadollahi] has been busy rebuilding and improving an Omnibot 5402, and the last piece of hardware he wanted to upgrade was some LED matrix eyes and a high quality Raspberry Pi camera for computer vision. An Omnibot was something most technical-minded youngsters remember drooling over in the 80s, and when [Ramin] bought a couple of used units online, he went straight to the workbench to give the vintage machines some upgrades. After all, the Omnibot 5402 was pretty remarkable for its time, but it is capable of much more with some modern hardware. One area that needed improvement was the eyes.

The eyes on the original Omnibot could light up, but that’s about all they were capable of. The first upgrade was installing two 8×8 LED matrix displays to form what [Ramin] calls Minimal Expressive Eyes (MEE), powered by a Raspberry Pi. With the help of a 3D-printed adapter and some clever layout, the LED matrix displays fit behind the eye plate, maintaining the original look while opening loads of new output possibilities.
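The write-up doesn't say which driver chip sits behind the matrices, but if they're the common MAX7219-based modules, a few lines of Python on the Pi are enough to put an expression on them. The SPI wiring and the luma.led_matrix library below are our assumptions for illustration, not details from the build.

```python
from luma.core.interface.serial import spi, noop
from luma.core.render import canvas
from luma.led_matrix.device import max7219

# Two cascaded 8x8 MAX7219 modules on SPI0 (assumed wiring).
serial = spi(port=0, device=0, gpio=noop())
eyes = max7219(serial, cascaded=2, block_orientation=-90)

# Draw a simple "pupil" in each eye; shifting these rectangles between
# frames is all it takes for a glance left, a glance right, or a blink.
with canvas(eyes) as draw:
    draw.rectangle((2, 2, 4, 4), fill="white")    # left eye pupil
    draw.rectangle((10, 2, 12, 4), fill="white")  # right eye pupil
```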

Adding a high quality Raspberry Pi camera with a wide-angle lens was a bit more challenging and required an extra-long camera ribbon cable, but with the lens nestled just below the eyes, the camera has a good view and isn’t particularly noticeable when the eyes are lit up. Having already upgraded the rest of the hardware, all that remains now is software work, and we can’t wait to see the results.

Two short videos of the hardware are embedded below; be sure to give them a peek. And when you’re ready for more 80s robot-upgrading action, check out the Hero Jr.

Continue reading “Omnibot From The 80s Gets LED Matrix Eyes, Camera”

3D Printed Terminal Takes Computing Back In Time

It’s hard to look at today as anything but the golden age of computing. Even entry level machines have quad-core processors and a terabyte or more of storage space, to say nothing of the incredible amount of tech packed into the modern smartphone. But even so, there’s something to be said for the elegant simplicity of early desktop computers.

Looking to recreate the feeling of those bygone days, [Pigeonaut] created the Callisto II. Its entirely 3D printed case snaps together without glue or screws, making it easy to assemble, and the parts have been sized so they’ll be printable even on smaller machines like the Prusa Mini. Inside you’ll find a 1024×768 Pimoroni HDMI 8″ IPS LCD, 60% mechanical keyboard, four-port USB 3 hub, Raspberry Pi 4, and a 22 watt USB power supply to run it all.

The internal components can easily be accessed through the hatch on the rear of the case, and there’s plenty of room inside to add new hardware should you want to toss in a hard drive or even swap out the Pi for a different single-board computer.

To really drive home the faux-retro concept of the Callisto II, [Pigeonaut] has created a website for the fictional computer company behind the machine, replete with all the trappings you’d expect from the early web. There’s even a web-based “operating system” you can use to show off your freshly printed Callisto II.

Incidentally, the II suffix isn’t just part of the meme; there really was a Callisto before this one. We covered the earlier machine back in 2019, and while we’re a bit sad to see that the functional 3.5 inch floppy drive has been deleted, we can’t deny the overall aesthetics have been greatly improved in the latest version.

Continue reading “3D Printed Terminal Takes Computing Back In Time”

Pi Pico Project Plays Pong Perfectly

Even as technology keeps progressing, we find ourselves coming back to the classics again and again. Pong is quite possibly the classic game, and the Raspberry Pi Pico is one of the latest microcontrollers. So [Nick Bild] combined them expertly in his Pico Pong project, which includes gesture controls and a custom VGA output.

Rolling your own VGA signal is no simple feat, and this project takes full advantage of the Pico’s features to pull it off. Display data is buffered in memory, Direct Memory Access (DMA) streams it out of the buffer, and a Programmable I/O (PIO) program writes it straight to the display. This allows for nanosecond precision while leaving the CPU free to handle inputs and run the game. Even with the display work offloaded, the ARM processor had to be massively overclocked to 258 MHz, well over its 133 MHz specification, to make things run smoothly. And still [Nick] found himself limited to a 640×350 resolution and a serendipitously retro-accurate monochrome color scheme.
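In the real driver the DMA keeps the PIO's FIFO topped up straight from the framebuffer so the CPU never touches individual pixels. For a rough flavour of the PIO side of that arrangement, here's a MicroPython sketch of a state machine that shifts monochrome pixels out of its FIFO onto a pin; the pin, clock rate, and CPU-driven put() are stand-ins for the sync-timed, DMA-fed setup in the actual project.

```python
import rp2
from machine import Pin

# Shift one pixel bit per PIO clock onto a GPIO pin. autopull refills the
# output shift register from the TX FIFO every 32 bits, so whatever keeps
# the FIFO full (DMA in the real project, the CPU here) works in whole
# words rather than pixels.
@rp2.asm_pio(out_init=rp2.PIO.OUT_LOW, autopull=True, pull_thresh=32)
def pixel_stream():
    out(pins, 1)

sm = rp2.StateMachine(0, pixel_stream, freq=25_000_000, out_base=Pin(0))
sm.active(1)

# One 32-pixel chunk of a scanline; a real driver streams the whole
# framebuffer and runs separate state machines for HSYNC and VSYNC.
sm.put(0b10101010_11110000_00001111_01010101)
```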

Gesture controls come from a pair of IR light beams hooked up to the GPIO. IR LEDs shine up toward reflectors, and the light bounces back down to detectors. Blocking one of the beams causes your paddle to move up or down, which looks pretty responsive in the video (embedded below).
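The write-up doesn't give the exact pinout, but reading the beams boils down to polling a pair of GPIOs. In MicroPython on the Pico, something along these lines would nudge the paddle; the pin numbers, active-low detectors, and step size are all assumptions for the sake of the sketch.

```python
from machine import Pin
import time

# IR beam-break detectors (assumed active-low: 0 means the beam is blocked).
beam_up = Pin(14, Pin.IN, Pin.PULL_UP)
beam_down = Pin(15, Pin.IN, Pin.PULL_UP)

PADDLE_HEIGHT = 40
SCREEN_HEIGHT = 350
paddle_y = SCREEN_HEIGHT // 2

while True:
    if beam_up.value() == 0:      # upper beam blocked: move paddle up
        paddle_y = max(paddle_y - 2, 0)
    if beam_down.value() == 0:    # lower beam blocked: move paddle down
        paddle_y = min(paddle_y + 2, SCREEN_HEIGHT - PADDLE_HEIGHT)
    time.sleep_ms(5)              # the real game loop redraws the frame here
```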

We’ve seen [Nick] play Pong before, though at that time it was handheld and based on the venerable 6502. And just recently we wrote about the Raspberry Pi Pico powering another classic game: Snake.

Continue reading “Pi Pico Project Plays Pong Perfectly”

Mind-Controlled Flamethrower

Mind control might seem like something out of a sci-fi show, but like the tablet computer, universal translator, or virtual reality device, it’s actually a technology that has made it into the real world. While these devices often require advanced and expensive equipment to interpret brain waves properly, with the right machine learning system it’s possible to do things like this mind-controlled flamethrower on a much smaller budget. (Video, embedded below.)

[Nathaniel F] was already experimenting with brain-computer interfaces and machine learning, and wanted to see if he could build something practical combining the two technologies. Instead of turning to an EEG machine to read brain patterns, he picked up a much less expensive Mindflex and paired it with a machine learning system running TensorFlow to make up for some of its shortcomings. The processing is done by a Raspberry Pi 4, which sends commands to an Arduino to fire the flamethrower when it detects the proper thought patterns. Don’t forget the flamethrower part of this build either: it too was designed and built entirely by [Nathaniel F], using gas and an arc lighter.
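None of this is [Nathaniel F]'s actual code, but as a sketch of how the Pi-side glue between TensorFlow and the Arduino might hang together, it could look something like the following; the feature vector, network shape, serial port, and single-byte "fire" command are all invented for illustration.

```python
import numpy as np
import serial
import tensorflow as tf

# Tiny classifier over features pulled from the Mindflex's serial stream
# (e.g. attention/meditation values plus EEG band powers). Shapes invented.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # "fire" probability
])
model.compile(optimizer="adam", loss="binary_crossentropy")
# model.fit(recorded_features, recorded_labels, epochs=...)  # trained offline

arduino = serial.Serial("/dev/ttyACM0", 9600)  # assumed port and baud rate

def handle_sample(features):
    """Run one feature vector through the model and maybe trigger the flame."""
    p = float(model.predict(np.array([features], dtype="float32"), verbose=0)[0, 0])
    if p > 0.9:                  # demand a confident prediction before firing
        arduino.write(b"F")      # hypothetical single-byte fire command
```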

The build took many hours of training to gather enough data for the neural network, and it works as the proof of concept he was hoping for, but [Nathaniel F] notes that it could be improved by replacing the outdated Mindflex with a better EEG. For now, we appreciate seeing sci-fi in the real world in projects like this, or in other mind-controlled projects like this one, which converts a prosthetic arm into a mind-controlled music synthesizer.

Continue reading “Mind-Controlled Flamethrower”