The Strider mechanism might look similar to Strandbeest walkers, but it lifts its feet higher, allowing it to traverse rougher terrain. [Chen]’s little 3D printed version is driven by a pair of geared N20 motors, with three legs on each side. The ESP32 camera board allows for control and an FPV video feed using WiFi, with power coming from a 14500 LiFePO4 battery. The width required by the motors, leg mechanisms, and bearings means the robot is quite wide, to the point that it could get stuck on something that’s outside the camera’s field of view. [Chen] is working to make it narrower by using continuous rotation servos and a wire drive shaft.
We’ve seen no shortage of riffs on many-legged walkers, like the TrotBot and Strider mechanisms developed by [Wade] and [Ben Vagle], whose website is an excellent resource for prospective builders.
The Internet has brought us the ability to share data all over the globe, and nearly instantaneously at that. It’s revolutionized the sharing of science across the world, and taking advantage of this global data network is this earthquake display from [AndyGadget].
The build relies on an ESP32 fitted with an ILI9486 TFT display. The screen is in color and has a nice 480×320 resolution. This enables it to display a reasonably legible world map using the Web Mercator projection to fit the rectangular screen. The microcontroller then pulls in information from Seismic Portal, a site that aggregates data from seismographs and other sensors scattered all over the world. Data from the site is pulled into the device live and overlaid on the world map, allowing the viewer to see the location of any current earthquakes at a glance.
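The Web Mercator math that puts a quake on the right pixel is straightforward. Below is a minimal sketch of that projection, assuming the 480×320 screen mentioned above; the function name and the exact scaling are illustrative, not necessarily what [AndyGadget]'s firmware does.

```python
import math

def latlon_to_pixel(lat, lon, width=480, height=320):
    """Map a latitude/longitude pair to Web Mercator pixel coordinates.

    Longitude maps linearly to x; latitude goes through the Mercator
    transform before scaling to screen height. Standard Web Mercator
    clips latitude at about +/-85.05 degrees so the world fits the map.
    """
    x = (lon + 180.0) / 360.0 * width
    lat_rad = math.radians(lat)
    merc = math.log(math.tan(math.pi / 4 + lat_rad / 2))
    # merc spans roughly -pi..pi over the clipped latitude range
    y = height / 2 - merc * height / (2 * math.pi)
    return int(x), int(y)
```

With this, an earthquake at (0°, 0°) lands dead center at (240, 160), and northern-hemisphere events plot above the midline, as you'd expect.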
These days everyone’s excited about transparent OLED panels, but where’s the love for the classic Nokia 5110 LCD? As the prolific [Nick Bild] demonstrates in his latest creation, all you’ve got to do is peel the backing off the late-90s-era display, and you’ve got yourself a see-through cyberpunk screen for a couple of bucks.
In this case, [Nick] has attached the modified display to a pair of frames and used an Adafruit QT Py microcontroller to connect it to the ESP32-powered ESP-EYE development board and its OV2640 camera module. This lets him detect QR codes within the wearer’s field of vision and run a TensorFlow Lite neural network right on the hardware. Power is provided by a 2000 mAh LiPo battery running through an Adafruit PowerBoost 500.
The project, intended to provide augmented reality reminders for medical professionals, uses the QR codes to look up patient and medication information. Right now the neural network is being used to detect when the wearer has washed their hands, but obviously the training model could be switched out for something different as needed. By combining these information sources, the wearable can do things like warn the physician if a patient is allergic to the medication they’re currently looking at.
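The lookup-and-warn flow described above boils down to resolving two QR codes and intersecting a patient's allergy list with a medication's drug class. Here's a hypothetical sketch of that logic; all of the record data, field names, and QR formats are invented for illustration, not taken from the project.

```python
# Toy in-memory records standing in for whatever patient/medication
# database the real system would query.
PATIENTS = {"qr:patient:42": {"name": "J. Doe", "allergies": {"penicillin"}}}
MEDS = {"qr:med:amoxicillin": {"name": "Amoxicillin", "drug_class": "penicillin"}}

def check_interaction(patient_qr, med_qr):
    """Return a display string for the wearable, or None for unknown codes."""
    patient = PATIENTS.get(patient_qr)
    med = MEDS.get(med_qr)
    if patient is None or med is None:
        return None  # unknown QR code: nothing to overlay
    if med["drug_class"] in patient["allergies"]:
        return f"WARNING: {patient['name']} is allergic to {med['drug_class']}"
    return f"{med['name']}: no known allergy for {patient['name']}"
```

The neural-network side (hand-washing detection) would feed into the same display path, just triggered by the camera instead of a code scan.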
There are few things we can all agree we hate, and the shrill tone of an alarm clock dragging you out of a wonderful slumber is definitely high on that list. To wake up more naturally, [nutstobutts] created an automated curtain opener.
The curtain opener is very simple: a stepper motor in the control box pulls a string, which runs to an idler on the far side of the curtain rod and through two clips attached to the back of each curtain. This arrangement opens both curtains smoothly at the same time and always brings them closed again directly in the center. It’s especially well suited to students in dorms or apartment dwellers, since installation requires no screws in the wall or permanent modification to the curtains.
The curtains can be opened and closed either by pressing a button on the control box or by sending HTTP requests to the ESP32 that controls everything. This allows for integration with many different IoT systems, for instance [nutstobutts] has been having Home Assistant open the curtains every morning at 6:30 a.m. in lieu of an alarm clock, and then closing them automatically at 9:00 a.m. to help save on cooling costs.
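Since the ESP32 just listens for HTTP requests, any automation that can hit a URL can drive the curtains. Below is a hedged sketch of what the client side might look like; the hostname and endpoint paths are assumptions for illustration, as the project's exact URL scheme isn't documented here.

```python
from urllib.request import Request

def curtain_request(action, host="curtains.local"):
    """Build an HTTP request for the (hypothetical) curtain endpoint.

    Passing the returned Request to urllib.request.urlopen() would
    actually send it to the ESP32 on the local network.
    """
    if action not in ("open", "close"):
        raise ValueError(f"unknown action: {action}")
    return Request(f"http://{host}/{action}", method="GET")

morning = curtain_request("open")   # what a 6:30 a.m. automation would send
```

Home Assistant's REST integration can fire the same kind of request on a schedule, which is presumably how the 6:30 a.m. wake-up and 9:00 a.m. close are wired up.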
The best streamers keep their audience constantly engaged. They might be making quips and doing the funny voices that everyone expects them to do, but they’re also busy reading chat messages aloud and responding, managing different scenes and transitions, and so on. Many streamers use a type of macro keyboard called a stream deck to greatly improve the experience of juggling all those broadcasting balls.
Sure, there are dedicated commercial versions, but they’re kind of expensive. And what’s the fun in that, anyway? A stream deck is a great candidate for DIY because you can highly personalize the one you make yourself. Give it clicky switches, if that’s what your ears and fingers want. Or don’t. It’s your macro keyboard, after all.
[Patrick Thomas] and [James Wood] teamed up to build the perfect stream deck for [James]’ Twitch channel. We like the way they went about it, which was to start by assessing a macro pad kit and use what they learned from building and testing it to design their ideal stream deck. The current version supports both the Arduino Pro Micro and the ESP32. It has twelve key switches, a rotary encoder, an LED bar graph, and an OLED screen for choosing between the eight different color schemes.
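The rotary encoder paired with the eight color schemes suggests a simple wrap-around selection scheme. This is purely an illustrative sketch of that idea; the scheme names and selection logic are assumptions, not the pair's actual firmware.

```python
# Eight placeholder color schemes, matching the count mentioned above.
SCHEMES = ["ocean", "lava", "forest", "mono", "neon", "pastel", "ice", "amber"]

def next_scheme(current_index, encoder_delta):
    """Advance the scheme index by the encoder's step count, wrapping
    around in either direction so the knob can spin forever."""
    return (current_index + encoder_delta) % len(SCHEMES)
```

Each detent of the encoder nudges the index, and the OLED screen would redraw with the newly selected scheme's colors.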
Computer engineering student [sherwin-dc] had a rover project which required streaming video through an ESP32 to be accessed by a web server. He couldn’t find documentation for the ESP32’s standard camera interface, and even if he had, that approach used too many I/O pins. Instead, [sherwin-dc] decided to shoe-horn the video into an I2S stream. It helped that he had access to an Altera MAX 10 FPGA to process the video signal from the camera. He did succeed, but it took a lot of experimenting to work around the limited resources of the ESP32. Ultimately [sherwin-dc] settled on QVGA resolution of 320×240 pixels, with 8 bits per pixel. This means each frame takes just 77 KB of precious ESP32 RAM.
His design uses a 2.5 MHz SCK, which equates to about four frames per second. But he notes that with higher SCK rates in the tens of MHz, the frame rate could be significantly higher — in theory. But considering other system processing, the ESP32 can’t even keep up with four FPS. In the end, he was lucky to get 0.5 FPS throughput, but that was adequate for purposes of controlling the rover (see animated GIF below the break). That said, if you had a more powerful processor in your design, this technique might be of interest. [Sherwin-dc] notes that the standard camera drivers for the ESP32 use I2S under the hood, so the concept isn’t crazy.
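The article's numbers check out with a little back-of-envelope arithmetic, sketched here in Python for the skeptical:

```python
# QVGA at 8 bits per pixel, clocked over I2S at 2.5 MHz, per the article.
WIDTH, HEIGHT, BPP = 320, 240, 8
SCK_HZ = 2_500_000

frame_bits = WIDTH * HEIGHT * BPP    # 614,400 bits per frame
frame_bytes = frame_bits // 8        # 76,800 bytes, i.e. ~77 KB of RAM
max_fps = SCK_HZ / frame_bits        # ~4.07 frames/s at the bus level
```

That ~4 FPS is a hard ceiling set by the bit clock alone; the 0.5 FPS actually achieved reflects everything else the ESP32 has to do on top of shoveling bits.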
We’ve covered several articles about generating video over I2S before, including this piece from back in 2019. Have you ever commandeered a protocol for “off-label” use?
The ESP32 has enabled an uncountable number of small electronics projects and even some commercial products, thanks to its small size, low price point, and wireless capabilities. Plenty of remote sensors, lighting setups, and even home automation projects now run on this small faithful chip. But being relegated to an electronics enclosure controlling a small electrical setup isn’t all that these tiny chips can do as [Eirik Brandal] shows us with this unique piece of audio and visual art.
The project is essentially a small, automated synthesizer that has a series of arrays programmed into it that correspond to various musical scales. Any of these can be selected for the instrument to play through. The notes of the scale are shuffled through with some random variations, allowing for a completely automated musical instrument. The musical generation is entirely analog as well, created by some oscillators, amplifiers, and other filtering and effects. The ESP32 also controls a lighting sculpture that illuminates a series of LEDs as the music plays.
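The scale-array idea is easy to picture in code: each scale is a list of semitone offsets, and the sequencer walks through it with a bit of randomness. The sketch below is illustrative only; the scale data, the walk logic, and the octave-jump behavior are assumptions, not [Eirik]'s firmware (which drives analog oscillators rather than MIDI-style note numbers).

```python
import random

# Semitone offsets from the root note for a couple of example scales.
SCALES = {
    "major":      [0, 2, 4, 5, 7, 9, 11],
    "minor_pent": [0, 3, 5, 7, 10],
}

def generate_phrase(scale_name, length=8, root=60, rng=random):
    """Random-walk over scale degrees, occasionally leaping an octave."""
    scale = SCALES[scale_name]
    degree, notes = 0, []
    for _ in range(length):
        degree = (degree + rng.choice([-1, 1, 2])) % len(scale)
        octave = 12 * rng.choice([0, 0, 1])  # mostly stay put, sometimes jump
        notes.append(root + scale[degree] + octave)
    return notes
```

Because every note is drawn from the selected scale, the output always sounds coherent no matter how the random walk wanders, which is what keeps generative pieces like this listenable.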
The art installation itself creates quite haunting, mesmerizing tunes, illustrated in the video linked after the break. While it’s not quite in the realm of artificial intelligence, since it uses pre-programmed patterns with some randomness mixed in, it does hint at some other projects that have used AI to compose new music.