ESP32 Video Input Using I2S

Computer engineering student [sherwin-dc] had a rover project which required streaming video through an ESP32 so it could be served up by a web server. He couldn’t find documentation for the standard camera interface of the ESP32, but even if he had, that approach used too many I/O pins. Instead, [sherwin-dc] decided to shoe-horn the video into an I2S stream. It helped that he had access to an Altera MAX 10 FPGA to process the video signal from the camera. He did succeed, but it took a lot of experimenting to work around the limited resources of the ESP32. Ultimately [sherwin-dc] settled on QVGA resolution of 320×240 pixels at 8 bits per pixel, which means each frame occupies just under 77 KB (320 × 240 × 1 byte) of precious ESP32 RAM.

His design uses a 2.5 MHz SCK, which works out to about four frames per second (each 76,800-byte frame needs 614,400 bit times on the serial line). He notes that with SCK rates in the tens of MHz, the frame rate could in theory be significantly higher, but with everything else the system has to do, the ESP32 can’t even keep up with four FPS. In the end he was lucky to get 0.5 FPS of throughput, though that was adequate for controlling the rover (see the animated GIF below the break). That said, if you had a more powerful processor in your design, this technique might be of interest. [Sherwin-dc] notes that the standard camera drivers for the ESP32 use I2S under the hood, so the concept isn’t crazy.
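
For anyone curious what the receive side of such a scheme might look like in firmware, here’s a minimal sketch using the legacy ESP-IDF I2S driver. This is our illustration rather than [sherwin-dc]’s actual code: the pin numbers, slave-mode clocking, and buffer sizes are all assumptions.

```c
// Minimal I2S video receiver sketch (ESP-IDF v4.x legacy driver).
// Assumes the FPGA acts as I2S master and packs 8-bit pixels into the
// serial data stream; pins and buffer sizes are placeholders.
#include <stdint.h>
#include "freertos/FreeRTOS.h"
#include "driver/i2s.h"

#define FRAME_W     320
#define FRAME_H     240
#define FRAME_BYTES (FRAME_W * FRAME_H)   // 8 bpp QVGA: 76,800 bytes

static uint8_t frame[FRAME_BYTES];        // one frame of precious RAM

void video_i2s_init(void)
{
    i2s_config_t cfg = {
        .mode = I2S_MODE_SLAVE | I2S_MODE_RX,  // FPGA drives SCK/WS
        .sample_rate = 78125,  // nominal; puts the bit clock near 2.5 MHz,
                               // exact divider depends on channel format
        .bits_per_sample = I2S_BITS_PER_SAMPLE_32BIT,
        .channel_format = I2S_CHANNEL_FMT_ONLY_LEFT,
        .communication_format = I2S_COMM_FORMAT_STAND_I2S,
        .dma_buf_count = 8,
        .dma_buf_len = 1024,
    };
    i2s_pin_config_t pins = {
        .bck_io_num = 26,                      // placeholder pin numbers
        .ws_io_num = 25,
        .data_out_num = I2S_PIN_NO_CHANGE,
        .data_in_num = 33,
    };
    i2s_driver_install(I2S_NUM_0, &cfg, 0, NULL);
    i2s_set_pin(I2S_NUM_0, &pins);
}

// Block until a full frame of pixels has been clocked in via DMA.
size_t video_read_frame(void)
{
    size_t got = 0;
    i2s_read(I2S_NUM_0, frame, FRAME_BYTES, &got, portMAX_DELAY);
    return got;
}
```

The appeal of the hack is visible here: with the FPGA as bus master, the ESP32 treats pixels as if they were audio samples and lets the I2S peripheral’s DMA engine do the heavy lifting.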

We’ve run several articles about generating video over I2S before, including this piece from back in 2019. Have you ever commandeered a protocol for “off-label” use?

Continue reading “ESP32 Video Input Using I2S”

Incredibly Slow Films, Now Playing In Dazzling Color

Back in 2018 we covered a project that would break a video down into its individual frames and slowly cycle through them on an e-paper screen. With a new image pushed out every three minutes or so, it would take thousands of hours to “watch” a feature-length film: a 90-minute movie at 24 frames per second is roughly 130,000 frames, or about 6,500 hours on the display. Of course, that was never the point. The idea was to turn your favorite movie into an artistic conversation piece; a constantly evolving portrait you could hang on the wall.

[Manuel Tosone] was recently inspired to build his own version of this concept, and thanks to several years of e-paper development since then, he was even able to do it in color. Ever the perfectionist, he decided to drive the seven-color 5.65-inch Waveshare panel with a custom STM32 board that he estimates can wring nearly 300 days of runtime out of six standard AA batteries, and to wrap everything up in a very professional-looking 3D-printed enclosure. The end result is a one-of-a-kind video frame that any hacker would be proud to display on their mantle.

The Hackaday.io page for this project contains a meticulously curated collection of information, covering everything from the ffmpeg commands used to process the video file into a directory full of cropped and enhanced images, to flash memory lifetime estimates and energy consumption analyses. If you’ve ever considered setting up an e-paper display that needs to run for long stretches of time, regardless of what’s actually being shown on the screen, there’s an excellent chance that you’ll find some useful nuggets in the fantastic documentation [Manuel] has provided.
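
One step any pipeline for this panel needs, whether done in ffmpeg or elsewhere, is mapping full-color pixels onto the display’s seven inks. Here’s a hedged sketch of a nearest-color quantizer in C; the palette values are rough nominal colors, not [Manuel]’s calibrated ones.

```c
// Nearest-color quantizer for a seven-color ACeP e-paper panel.
// Palette entries are rough nominal values, not measured ink colors.
#include <stdint.h>

typedef struct { uint8_t r, g, b; } rgb_t;

static const rgb_t palette[7] = {
    {0, 0, 0},       // black
    {255, 255, 255}, // white
    {0, 128, 0},     // green
    {0, 0, 255},     // blue
    {255, 0, 0},     // red
    {255, 255, 0},   // yellow
    {255, 128, 0},   // orange
};

// Return the palette index with the smallest squared RGB distance.
uint8_t quantize(rgb_t px)
{
    uint32_t best = UINT32_MAX;
    uint8_t idx = 0;
    for (uint8_t i = 0; i < 7; i++) {
        int32_t dr = px.r - palette[i].r;
        int32_t dg = px.g - palette[i].g;
        int32_t db = px.b - palette[i].b;
        uint32_t d = (uint32_t)(dr * dr + dg * dg + db * db);
        if (d < best) { best = d; idx = i; }
    }
    return idx;
}
```

In practice you’d also dither (Floyd-Steinberg or similar) so that smooth gradients don’t collapse into flat bands of the seven colors.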

We always love to hear about people being inspired by a project they saw on Hackaday, especially when we get to bring things full circle and feature their own take on the idea. Who knows, perhaps the next version of the e-paper video frame to grace these pages will be your own.

Continue reading “Incredibly Slow Films, Now Playing In Dazzling Color”

Video Feedback Effects Make A Glorious Spectacle In HD

Video feedback is perhaps best known for its appearance in the film clip for Bohemian Rhapsody. It’s not an effect you see too often, as it’s rather messy to set up, what with cameras filming screens and whatnot. Regardless, the effects possible are glorious, as demonstrated by [Dave Blair] and his amazing video feedback kinetic sculpture.

No computer is involved in the process at all; it’s just classic, old-school video feedback, produced by pointing a camera at a screen and feeding the image back to that same screen. [Dave]’s sculpture combines three cameras with twin video switchers and a beam-splitting pane of glass, along with a source image fed in over HDMI.

By turning and spinning the various cameras, [Dave] is able to generate gorgeous curving, fractal-like effects from the source imagery, with a rainbow of color melting and warping together as he interacts with the sculpture. It’s a beautiful effect, and something we’re surprised we don’t see more of in the video industry.

Hopefully [Dave] is enlisted to put his machine to work on the next [Doja Cat] film clip so we can get more of this goodness. Video after the break.

Continue reading “Video Feedback Effects Make A Glorious Spectacle In HD”

Hackaday Links: July 18, 2021

Tell the world that something is in short supply, and you can bet that people will start reacting to that news in the ways that make the most sense to them — remember the toilet paper shortage? It’s the same with the ongoing semiconductor pinch, except that since the item in short supply is (arguably) more valuable than toilet paper, the behavior and the risks people are willing to take around it are even more extreme. Sure, we’ve seen chip hoarding, and a marked rise in counterfeit chips. But we’d imagine that this is the first time we’ve seen chip smuggling quite like this. The smuggler was caught at the Hong Kong-Macao border with 256 Core i7 and i9 processors, valued at about $123,000, strapped to his legs and chest. It reminds us more of “Midnight Express”-style heroin smuggling, although we have to say we love the fact that this guy chose a power of 2 when strapping these babies on.

Speaking of big money, let’s say you’ve pulled off a few chip heists without getting caught, and have retired from the smuggling business. What is one to do with the ill-gotten gains? Apparently, there’s a big boom in artifacts from the early days of console gaming, so you might want to start spreading some money around there. But you’d better prepare to smuggle a lot of chips: last week, an unopened Legend of Zelda cartridge for the NES sold for $870,000 at auction. Not to be outdone, two days later someone actually paid $1.56 million for a Super Mario 64 cartridge, this time apparently still in the tamperproof container that displayed it on a shelf somewhere in 1996. Nostalgia can be an expensive drug.

And it’s not just video games that are commanding high prices these days. If you’ve got a spare quarter million or so, why not bid on this real Apollo Guidance Computer and DSKY? The AGC is a non-flown machine that was installed in LTA-8, the “lunar test article” version of the Lunar Module (LM) that was used for vacuum testing. If the photos in the auction listing seem familiar, it’s with good reason: this is the same AGC that was restored to operating condition by Carl Claunch, Mike Stewart, Ken Shirriff, and Marc Verdiell. Sotheby’s estimates the value at $200,000 to $300,000; in a world of billionaire megalomaniacs with dreams of space empires, we wouldn’t be surprised if a working AGC went for much, much more than that.

Meanwhile, current day space exploration is going swimmingly. Just this week NASA got the Hubble Space Telescope back online, which is great news for astronomers. And on Mars, the Ingenuity helicopter just keeps on delivering during its “operations demonstration” mission. Originally just supposed to be a technology demonstration, Ingenuity has proven to be a useful companion to the Perseverance rover, scouting out locations of interest to explore or areas of hazard to avoid. On the helicopter’s recent ninth flight, it scouted a dune field for the team, providing photographs that showed the area would be too dangerous for the rover to cross. The rover’s on-board navigation system isn’t great at seeing sand dunes, so Ingenuity’s images are a real boon to mission planners, not to mention geologists and astrobiologists, who are seeing promising areas of the ancient lakebed to explore.

And finally, most of us know all too well how audio feedback works, and all the occasions to avoid it. But what about video feedback? What happens when you point a camera at a screen displaying the image from that same camera? Fractals are what happens, or at least something that looks a lot like fractals. Code Parade has been playing with what he calls “analog fractals”, which are generated purely by video feedback rather than by computation. While he’d prefer to do this old-school with analog video equipment, it’s easy enough to replicate on a computer; he even has a web page that lets you arrange a series of virtual monitors on your screen. Point a webcam at the screen, and you’re off on a fractal journey that constantly changes and shifts. Give it a try.
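
If you’d rather experiment without dragging out a webcam, the core of the effect fits in surprisingly little code: composite a source image with a slightly rotated and zoomed copy of the previous frame, over and over. Here’s a self-contained C sketch along those lines (all the knob settings are arbitrary), which writes its final frame out as a PGM file:

```c
// Software "video feedback": composite a seed image with a slightly
// rotated and zoomed copy of the previous frame, over and over, then
// dump the result as a PGM. All knob settings are arbitrary.
#include <math.h>
#include <stdio.h>
#include <string.h>

#define N 256                 // square frame, N x N pixels

static unsigned char seed[N][N], cur[N][N], next[N][N];

int main(void)
{
    // Seed pattern (the "HDMI source"): a bright square outline.
    for (int i = N / 3; i < 2 * N / 3; i++) {
        seed[N / 3][i] = seed[2 * N / 3][i] = 255;
        seed[i][N / 3] = seed[i][2 * N / 3] = 255;
    }

    double angle = 0.15, zoom = 1.08;      // twist and pull back a bit
    double c = cos(angle) * zoom, s = sin(angle) * zoom;

    for (int iter = 0; iter < 100; iter++) {
        for (int y = 0; y < N; y++) {
            for (int x = 0; x < N; x++) {
                // Sample the previous frame at a rotated/zoomed spot,
                // as if the camera were twisted relative to the screen.
                double dx = x - N / 2.0, dy = y - N / 2.0;
                int sx = (int)( c * dx + s * dy + N / 2.0);
                int sy = (int)(-s * dx + c * dy + N / 2.0);
                unsigned char fb = 0;
                if (sx >= 0 && sx < N && sy >= 0 && sy < N)
                    fb = (unsigned char)(cur[sy][sx] * 0.9); // fade a bit
                next[y][x] = seed[y][x] > fb ? seed[y][x] : fb;
            }
        }
        memcpy(cur, next, sizeof cur);
    }

    FILE *f = fopen("feedback.pgm", "wb");
    fprintf(f, "P5\n%d %d\n255\n", N, N);
    fwrite(cur, 1, sizeof cur, f);
    fclose(f);
    return 0;
}
```

Each pass nests another rotated, shrunken, dimmer copy of the seed inside the last, which is exactly where the fractal-like spirals come from.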

Analog Camera Goes Digital

The digital camera revolution swept through the world in the early 2000s, and aside from some unique situations and a handful of artists still using film, almost everyone has switched over to digital since then. Unfortunately that means there are a lot of high-quality film cameras in the world gathering dust, but with a few pieces of equipment it’s possible to convert them to digital and get some more use out of them.

[befinitiv]’s latest project handles this conversion by swapping in a Raspberry Pi Zero where the film cartridge would otherwise be inserted into the camera. The Pi is attached to a 3D-printed case which mimics the shape of the film cartridge and also houses a Pi camera right in front of the spot where the film would be exposed. With the Pi camera’s lens removed, the new setup can take advantage of the analog camera’s optics instead, and captures images of relatively decent quality.

There are some perks to this setup as well, namely that video can be streamed from the Raspberry Pi over a wireless connection to a phone or computer. It’s a pretty interesting build with excellent results for a remarkably low price tag, and it would be pretty straightforward to interface the camera’s shutter and other control dials to the Raspberry Pi to further replicate the behavior of an old film camera, as sketched below. And, if you enjoy [befinitiv]’s projects of bringing old tech into the modern world, be sure to check out his 80s-era DOS laptop which is able to run a modern Linux installation.
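
As a sketch of what that shutter interfacing might look like, here’s a hedged example that waits for a button press on a GPIO pin (using the libgpiod v1 API) and then fires off a capture. The pin number and the libcamera-still invocation are our illustrative assumptions, not details from [befinitiv]’s build:

```c
// Wait for a (hypothetical) shutter switch on GPIO 17 and trigger a
// capture with libcamera-still. Uses the libgpiod v1 API; build with
// -lgpiod. Pin number and capture command are illustrative only.
#include <gpiod.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    struct gpiod_chip *chip = gpiod_chip_open_by_name("gpiochip0");
    if (!chip) { perror("gpiod_chip_open_by_name"); return 1; }

    struct gpiod_line *line = gpiod_chip_get_line(chip, 17);
    if (!line || gpiod_line_request_falling_edge_events(line, "shutter") < 0) {
        perror("gpio line request");
        return 1;
    }

    for (int shot = 0; ; shot++) {
        struct gpiod_line_event ev;
        if (gpiod_line_event_wait(line, NULL) < 0) break; // block for a press
        if (gpiod_line_event_read(line, &ev) < 0) break;

        char cmd[128];
        snprintf(cmd, sizeof cmd,
                 "libcamera-still -n -o frame%04d.jpg", shot);
        system(cmd);   // one "exposure" per button press
    }

    gpiod_chip_close(chip);
    return 0;
}
```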

Continue reading “Analog Camera Goes Digital”

Video De-shaker Software Measures Linear Rail Quality

Here’s an interesting experiment that attempts to measure the quality of a linear rail using a form of visual odometry: mount a camera on the rail, then analyze the footage with open-source software normally used to stabilize shaky video. No linear rail is perfect, so recording video while the camera moves down the length of the rail should capture that imperfection; any wobble in the rail makes the image sway by a proportional amount, which makes it possible to characterize the rail’s quality.

To test this idea, [Saulius] attached a high-definition camera to a linear rail, pointed the camera at a high-contrast textured pattern (making the resulting video easier to analyze), and recorded footage while moving the camera along the rail at a fixed speed. That video then gets fed into the Deshaker plugin for VirtualDub; the important output is the deshaker.log file, which contains the X, Y, rotation, and zoom corrections required to stabilize the video. [Saulius] used these values to create a graph characterizing the linear rail’s quality.
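
Reproducing that analysis is straightforward, since the log is just whitespace-separated text with one line per frame. Here’s a hedged C sketch that pulls out the per-frame corrections and reports peak-to-peak sway; the five-column layout (frame, X, Y, rotation, zoom) is our assumption about the Deshaker log format, so check your own file first:

```c
// Parse a Deshaker-style log (assumed five columns per line: frame,
// pan X, pan Y, rotation, zoom) and report the peak-to-peak X and Y
// correction, a crude proxy for how much the rail makes the camera sway.
#include <float.h>
#include <stdio.h>

int main(void)
{
    FILE *f = fopen("deshaker.log", "r");
    if (!f) { perror("deshaker.log"); return 1; }

    int frame;
    double x, y, rot, zoom;
    double xmin = DBL_MAX, xmax = -DBL_MAX;
    double ymin = DBL_MAX, ymax = -DBL_MAX;

    while (fscanf(f, "%d %lf %lf %lf %lf",
                  &frame, &x, &y, &rot, &zoom) == 5) {
        if (x < xmin) xmin = x;
        if (x > xmax) xmax = x;
        if (y < ymin) ymin = y;
        if (y > ymax) ymax = y;
    }
    fclose(f);

    printf("X sway: %.3f px peak-to-peak\n", xmax - xmin);
    printf("Y sway: %.3f px peak-to-peak\n", ymax - ymin);
    return 0;
}
```

Scale those pixel counts by the field of view per pixel at the target distance and you get real units, with the calibration caveats discussed below.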

It’s a clever proof of concept, especially in how it uses no special tools and leverages a video-stabilizing algorithm in an unusual way. However, the results aren’t exactly easy to turn into concrete, real-world measurements. Turning image results into micrometers is a matter of counting pixels, and for this task video stabilization is an imperfect tool, since the algorithm prioritizes pleasing visual results over absolute accuracy. Still, it’s an interesting experiment, and perfectly capable of measuring rail quality in a relative sense. We can’t help but be a bit curious about how it would profile something like these cardboard CNC modules.

VGA From Scratch On A Homebrew 8-bit Computer

[James Sharman] has built an impressive 8-bit homebrew computer. Based on TTL logic chips, it has a pipelined design which makes it capable of Commodore-level computing, but [James] hasn’t quite finished everything yet. The machine lives on its own custom PCB, but its current LCD display just isn’t up to the standard of the rest of the build. To resolve this, he decided to implement VGA from scratch.

This isn’t a bit-bang VGA implementation, either. He plans for full resolution (640×480) which will push the limits of his hardware. He also sets goals of a 24-bit DAC which will allow for millions of colors, the ability to use sprites, and hardware scrolling. Since he’s doing all of this from scratch, the plan is to keep it as simple as possible and make gradual improvements to the build as he goes. To that end, the first iteration uses a single latching chip with some other passive components. After adding some code to the CPU to support the new video style, [James] is able to display an image on his monitor.
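
The numbers his counters have to hit are well documented, at least. Here’s a sketch of the standard 640×480 at 60 Hz timing expressed as C constants, with helpers showing what each tick of the pixel clock should be doing; his hardware does this with TTL counters and comparators rather than code, of course.

```c
// Standard 640x480 @ 60 Hz VGA timing, 25.175 MHz pixel clock.
// Horizontal numbers are in pixels, vertical numbers in lines.
#define H_VISIBLE 640
#define H_FRONT    16
#define H_SYNC     96
#define H_BACK     48
#define H_TOTAL   800   // 640 + 16 + 96 + 48

#define V_VISIBLE 480
#define V_FRONT    10
#define V_SYNC      2
#define V_BACK     33
#define V_TOTAL   525   // 480 + 10 + 2 + 33

// Both sync pulses are active-low in this mode.
static inline int hsync_active(int px)   // px: 0 .. H_TOTAL-1
{
    return px >= H_VISIBLE + H_FRONT && px < H_VISIBLE + H_FRONT + H_SYNC;
}

static inline int vsync_active(int line) // line: 0 .. V_TOTAL-1
{
    return line >= V_VISIBLE + V_FRONT && line < V_VISIBLE + V_FRONT + V_SYNC;
}

static inline int pixel_visible(int px, int line)
{
    return px < H_VISIBLE && line < V_VISIBLE;
}
```

That 800 × 525 tick grid refreshed at 60 Hz is where the 25.175 MHz pixel clock comes from, and it’s exactly why full resolution will push the limits of a TTL build.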

While the image of the parrot he’s displaying isn’t exactly perfect yet, it’s a great start for his build and he does plan to make improvements to it in future videos. We’d say he’s well on his way to reproducing a full 8-bit retrocomputer. Although VGA is long outdated for modern computers, the standard is straightforward to implement and limited versions can even be done with very small microcontrollers.

Thanks to [BaldPower] for the tip!

Continue reading “VGA From Scratch On A Homebrew 8-bit Computer”