Several frames from Bad Apple

PineTime Smartwatch And Good Code Play Bad Apple

PineTime is the open smartwatch from our friends at Pine64. [TT-392] wanted to prove the hardware can play a full-motion music video, and they are correct, to a point. When you watch the video below, you should notice the monochromatic animation maintaining a healthy framerate, and therein lies all the hard work. Without any modifications, video would top out at approximately eight frames per second.

To convert an MP4, you need to break it down into images, which will strip out the sound. Next, you load them into the Linux-only video processor, which looks for clusters of pixels that need changing and ignores the static ones. Selecting only the relevant pixels takes some of the load off the data running to the display and boosts the fps, since you don’t waste time reminding it that a block of black pixels should stay the way it is. Lastly, the process compresses everything to fit into the watch’s onboard memory. Even though it is a few minutes of black and white pictures, compiling can take a couple of hours.
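
The converter itself is Linux-only and purpose-built, but the core trick it pulls, diffing consecutive frames and keeping only the tiles that changed, is easy to sketch. Here is a minimal Python illustration of the idea; the 240×240 scaling, tile size, and filenames are our assumptions, not [TT-392]’s actual parameters:

```python
# Minimal sketch of delta-frame encoding, the idea behind the PineTime
# converter: only tiles that changed since the last frame get re-sent.
# Frame size, tile size, and filenames are illustrative assumptions.
import subprocess
from pathlib import Path

import numpy as np
from PIL import Image

TILE = 8  # side length of a tile, in pixels

def extract_frames(video, outdir):
    """Split an MP4 into grayscale PNGs with ffmpeg (audio is discarded)."""
    Path(outdir).mkdir(exist_ok=True)
    subprocess.run(
        ["ffmpeg", "-i", video,
         "-vf", "format=gray,scale=240:240",  # assumed 240x240 target
         f"{outdir}/frame_%05d.png"],
        check=True,
    )

def changed_tiles(prev, curr):
    """Yield (x, y, tile) for every TILE x TILE block that differs."""
    h, w = curr.shape
    for y in range(0, h, TILE):
        for x in range(0, w, TILE):
            block = curr[y:y + TILE, x:x + TILE]
            if prev is None or not np.array_equal(block, prev[y:y + TILE, x:x + TILE]):
                yield x, y, block

def encode(outdir):
    prev = None
    for path in sorted(Path(outdir).glob("frame_*.png")):
        curr = np.asarray(Image.open(path))
        tiles = list(changed_tiles(prev, curr))
        print(f"{path.name}: {len(tiles)} tiles to update")
        prev = curr

extract_frames("bad_apple.mp4", "frames")
encode("frames")
```

For hard-edged black and white footage like Bad Apple, consecutive frames share most of their tiles, which is exactly why this approach buys so much framerate.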

You will need access to the watch’s innards, so hopefully you have the developer kit or don’t mind cracking the seal. Who are we kidding, you aren’t here for intact warranties. The video resides in the flash chip, and you have to transfer blocks one at a time. Bad Apple needs fourteen, so you may want to practice on a shorter video. Finally, the core memory needs some updating to play correctly. Now you can sit back and…watch.

Pine64 had a rough start with the single-board computers, but they’re earning our trust with things like soldering irons and Google-less Linux mobile phones.

Continue reading “PineTime Smartwatch And Good Code Play Bad Apple”

Super 8 Camera Brought Back To Life

The Super 8 camera, while a groundbreaking video recorder in its time, is borderline unusable now. Even if you can get film for it (and afford its often enormous price), it still only records on 8mm film, which isn’t exactly the best quality of film around, not to mention that a good percentage of these cameras couldn’t even record audio. They were largely made obsolete by camcorders in the late ’80s and early ’90s, although some are still used for niche artistic purposes. If you’d rather not foot the bill for the film, though, you can still put one of these to work with the help of a Raspberry Pi.

[befinitiv] has a knack for repurposing antique analog equipment like this while preserving its aesthetic. While the bulk of the space inside this camera would normally be used for housing film, this makes a perfect spot to place a Raspberry Pi Zero, a rechargeable battery, and a power converter circuit, all in a 3D printed enclosure that snaps into the camera just as a film roll would have. It uses the Pi camera module but still makes use of the camera’s built-in optics, which include a zoom function. [befinitiv] also incorporated the original record button, so that from the outside this looks like a completely unmodified Super 8 camera.
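
[befinitiv]’s own firmware isn’t detailed here, but wiring the original trigger to start and stop a recording takes only a handful of lines on a Pi. A rough sketch using the gpiozero and legacy picamera libraries; the GPIO pin, resolution, and hold-to-record behavior are all assumptions:

```python
# Rough sketch: the Super 8 trigger wired to a GPIO pin starts and stops
# an H.264 recording. Pin number, resolution, and filenames are assumptions.
import time

from gpiozero import Button
from picamera import PiCamera

trigger = Button(17)  # record button wired to GPIO17 (assumed)
camera = PiCamera(resolution=(1280, 720), framerate=30)

while True:
    trigger.wait_for_press()
    filename = time.strftime("clip_%Y%m%d_%H%M%S.h264")
    camera.start_recording(filename)
    trigger.wait_for_release()  # hold-to-record, like the original camera
    camera.stop_recording()
    print("saved", filename)
```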

The camera can connect to a WiFi network and can stream live video to a computer, or it can record video files to an internal SD card. As a bonus, thanks to the power converter circuit, it is also capable of charging a cell phone. [befinitiv] notes that many of the aesthetic properties of 8mm film seem to be preserved when using this method, and he has several theories as to why but no definitive answer. If you’d like to take a look at some of his other projects like this, check out this analog camera that is now able to take digital pictures.

Continue reading “Super 8 Camera Brought Back To Life”

Creating Video From A ROM

We’re used to computers with display screens, yet how many of us have created the circuitry to drive one directly? Sure, we’ve coded up an SPI display driver on a microcontroller, but create the hardware to generate a usable video signal? That’s a little more difficult. [Jdh] has given it a go though, with a TTL video card.

In this case it’s not a card so much as a collection of breadboards, but all the logic is there to generate the complex array of video timings necessary for synchronisation, and to output the bits sequentially at the right voltage levels for the analogue monitor. It’s worth pointing out, though, that it’s not a composite video signal that’s being created, since it’s monochrome only with no subcarrier.

In the end, he encounters the problem that his ROM isn’t fast enough for the pixel rate, so the image has artefacts, but it does at least produce a recognisable and readable something on the screen. Old hands in the video business might point out that analogue TVs were a bit forgiving when it came to exact timings and line counts, so the circuit could quite possibly be simplified, and also that trading away some of the resolution might fix the ROM speed issue. But it’s an impressive piece of work, and should be of particular interest to anyone curious about how video works.
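
To put numbers on the ROM bottleneck: at a VGA-like pixel rate, the time budget per pixel is far shorter than a typical parallel EEPROM’s access time. The figures below are illustrative assumptions, since the project’s exact video mode and ROM part aren’t specified here:

```python
# Back-of-envelope: time available per pixel vs ROM access time.
# VGA-style numbers are assumptions; the project's video mode may differ.
pixel_clock_hz = 25.175e6           # standard 640x480@60 pixel clock
ns_per_pixel = 1e9 / pixel_clock_hz
rom_access_ns = 150                 # typical parallel EEPROM access time

print(f"time per pixel:  {ns_per_pixel:.1f} ns")   # ~39.7 ns
print(f"ROM access time: {rom_access_ns} ns")      # too slow -> artefacts

# Halving the horizontal resolution doubles the time between fetches:
print(f"per pixel at half resolution: {2 * ns_per_pixel:.1f} ns")
```

Packing several monochrome pixels into each ROM byte, or simply dropping resolution as suggested above, stretches the interval between fetches until a slow ROM can keep up.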

Fans of video cards on breadboards should also check out [Ben Eater’s] 7400-series video card.

Continue reading “Creating Video From A ROM”

ESP32 Video Input Using I2S

Computer engineering student [sherwin-dc] had a rover project which required streaming video through an ESP32 to be accessed by a web server. He couldn’t find documentation for the standard camera interface of the ESP32, but even if he had, that approach used too many I/O pins. Instead, [sherwin-dc] decided to shoehorn the video into an I2S stream. It helped that he had access to an Altera MAX 10 FPGA to process the video signal from the camera. He did succeed, but it took a lot of experimenting to work around the limited resources of the ESP32. Ultimately [sherwin-dc] decided on a QVGA resolution of 320×240 pixels, with 8 bits per pixel. This means each frame uses just 77 KB of precious ESP32 RAM.

His design uses a 2.5 MHz SCK, which equates to about four frames per second. But he notes that with higher SCK rates in the tens of MHz, the frame rate could be significantly higher — in theory. But considering other system processing, the ESP32 can’t even keep up with four FPS. In the end, he was lucky to get 0.5 FPS throughput, but that was adequate for purposes of controlling the rover (see animated GIF below the break). That said, if you had a more powerful processor in your design, this technique might be of interest. [Sherwin-dc] notes that the standard camera drivers for the ESP32 use I2S under the hood, so the concept isn’t crazy.
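
The four frames per second figure falls straight out of the clock math: I2S moves one bit per SCK cycle, and a QVGA frame at 8 bits per pixel is 614,400 bits. A quick sanity check:

```python
# Frame-rate ceiling for video shoehorned into I2S, from the figures above.
sck_hz = 2.5e6                 # I2S bit clock: one pixel bit per cycle
width, height, bpp = 320, 240, 8

bits_per_frame = width * height * bpp      # 614,400 bits
frame_bytes = bits_per_frame // 8          # 76,800 bytes, i.e. ~77 KB of RAM
max_fps = sck_hz / bits_per_frame

print(f"frame size: {frame_bytes} bytes")
print(f"theoretical max: {max_fps:.1f} fps")   # ~4.1 fps before any processing
```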

We’ve covered several articles about generating video over I2S before, including this piece from back in 2019. Have you ever commandeered a protocol for “off-label” use?

Continue reading “ESP32 Video Input Using I2S”

Incredibly Slow Films, Now Playing In Dazzling Color

Back in 2018 we covered a project that would break a video down into its individual frames and slowly cycle through them on an e-paper screen. With a new image pushed out every three minutes or so, it would take thousands of hours to “watch” a feature length film. Of course, that was never the point. The idea was to turn your favorite movie into an artistic conversation piece; a constantly evolving portrait you could hang on the wall.

[Manuel Tosone] was recently inspired to build his own version of this concept, and now thanks to several years of e-paper development, he was even able to do it in color. Ever the perfectionist, he decided to drive the seven-color 5.65 inch Waveshare panel with a custom STM32 board that he estimates can wring nearly 300 days of runtime out of six standard AA batteries, and wrap everything up in a very professional-looking 3D printed enclosure. The end result is a one-of-a-kind Video Frame that any hacker would be proud to display on their mantle.
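
Those numbers imply a remarkably small average current budget. As a back-of-envelope check, assuming ~2500 mAh alkaline cells in a series pack (the cell capacity and wiring are our assumptions; [Manuel]’s own energy analysis is on the project page):

```python
# Rough average-current budget implied by ~300 days on AA cells.
# Cell capacity and a series pack are assumptions, not figures from the build.
cell_mah = 2500          # typical alkaline AA
days = 300
pack_mah = cell_mah      # cells in series share one cell's capacity

avg_ma = pack_mah / (days * 24)
print(f"average draw budget: {avg_ma:.2f} mA")   # ~0.35 mA
```

A budget measured in fractions of a milliamp only works if the board spends nearly all of its life in deep sleep, which is presumably where the custom STM32 design earns its keep.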

The Hackaday.IO page for this project contains a meticulously curated collection of information, covering everything from the ffmpeg commands used to process the video file into a directory full of cropped and enhanced images, to flash memory lifetime estimates and energy consumption analyses. If you’ve ever considered setting up an e-paper display that needs to run for long stretches of time, regardless of what’s actually being shown on the screen, there’s an excellent chance that you’ll find some useful nuggets in the fantastic documentation [Manuel] has provided.
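
As a flavor of that pipeline, a single ffmpeg invocation can pull one frame every few minutes, scaled and cropped to fill the panel’s 600×448 pixels. This is an illustrative command in the project’s spirit, not [Manuel]’s exact filter chain:

```python
# Illustrative frame extraction in the spirit of the project's ffmpeg step:
# one frame every 3 minutes, cropped to fill a 600x448 e-paper panel.
# The interval, filter chain, and paths are assumptions.
import subprocess

subprocess.run(
    ["ffmpeg", "-i", "movie.mp4",
     "-vf", ("fps=1/180,"                                    # one frame per 180 s
             "scale=600:448:force_original_aspect_ratio=increase,"
             "crop=600:448"),
     "frames/frame_%05d.png"],
    check=True,
)
```

The panel can only show seven colors, so the extracted frames would still need dithering to its limited palette, a step we’re assuming lives in the “enhanced” part of the pipeline.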

We always love to hear about people being inspired by a project they saw on Hackaday, especially when we get to bring things full circle and feature their own take on the idea. Who knows, perhaps the next version of the e-paper video frame to grace these pages will be your own.

Continue reading “Incredibly Slow Films, Now Playing In Dazzling Color”

Video Feedback Effects Make A Glorious Spectacle In HD

Video feedback is perhaps best known for its appearance in the film clip for Bohemian Rhapsody. It’s not an effect you see very often, as it’s rather messy to set up, what with cameras filming screens and whatnot. Regardless, the effects possible are glorious, as demonstrated by [Dave Blair] and his amazing video feedback kinetic sculpture.

No computer is involved at all in the process – it’s just classic, old school video feedback. It’s produced by pointing a camera at a screen and feeding the image back to that same screen. Three cameras are combined with twin video switchers and a beam-splitting pane of glass, along with a source image via an HDMI input.

By turning and spinning the various cameras, [Dave] is able to generate beautiful curving fractal-like effects using the source imagery, with a rainbow of color melting and warping together as he interacts with the sculpture. It’s a beautiful effect and something we’re surprised we don’t see more of in the video industry.

Hopefully [Dave] is enlisted to put his machine to work on the next [Doja Cat] film clip so we can get more of this goodness. Video after the break.

Continue reading “Video Feedback Effects Make A Glorious Spectacle In HD”


Hackaday Links: July 18, 2021

Tell the world that something is in short supply, and you can bet that people will start reacting to that news in the ways that make the most sense to them — remember the toilet paper shortage? It’s the same with the ongoing semiconductor pinch, except that since the item in short supply is (arguably) more valuable than toilet paper, the behavior it provokes and the risks people are willing to take are even more extreme. Sure, we’ve seen chip hoarding, and a marked rise in counterfeit chips. But we’d imagine that this is the first time we’ve seen chip smuggling quite like this. The smuggler was caught at the Hong Kong-Macao border with 256 Core i7 and i9 processors, valued at about $123,000, strapped to his legs and chest. It reminds us more of “Midnight Express”-style heroin smuggling, although we have to say we love the fact that this guy chose a power of 2 when strapping these babies on.

Speaking of big money, let’s say you’ve pulled off a few chip heists without getting caught, and have retired from the smuggling business. What is one to do with the ill-gotten gains? Apparently, there’s a big boom in artifacts from the early days of console gaming, so you might want to start spreading some money around there. But you’d better prepare to smuggle a lot of chips: last week, an unopened Legend of Zelda cartridge for the NES sold for $870,000 at auction. Not to be outdone, two days later someone actually paid $1.56 million for a Super Mario 64 cartridge, this time apparently still in the tamperproof container that displayed it on a shelf somewhere in 1996. Nostalgia can be an expensive drug.

And it’s not just video games that are commanding high prices these days. If you’ve got a spare quarter million or so, why not bid on this real Apollo Guidance Computer and DSKY? The AGC is a non-flown machine that was installed in LTA-8, the “lunar test article” version of the Lunar Module (LM) that was used for vacuum testing. If the photos in the auction listing seem familiar, it’s with good reason: this is the same AGC that was restored to operating condition by Carl Claunch, Mike Stewart, Ken Shirriff, and Marc Verdiell. Sotheby’s estimates the value at $200,000 to $300,000; in a world of billionaire megalomaniacs with dreams of space empires, we wouldn’t be surprised if a working AGC went for much, much more than that.

Meanwhile, current-day space exploration is going swimmingly. Just this week NASA got the Hubble Space Telescope back online, which is great news for astronomers. And on Mars, the Ingenuity helicopter just keeps on delivering during its “operations demonstration” mission. Originally intended as a pure technology demonstration, Ingenuity has proven to be a useful companion to the Perseverance rover, scouting out locations of interest to explore or areas of hazard to avoid. On the helicopter’s recent ninth flight, it scouted a dune field for the team, providing photographs that showed the area would be too dangerous for the rover to cross. The rover’s on-board navigation system isn’t great at seeing sand dunes, so Ingenuity’s images are a real boon to mission planners, not to mention geologists and astrobiologists, who are seeing promising areas of the ancient lakebed to explore.

And finally, most of us know all too well how audio feedback works, and all the occasions to avoid it. But what about video feedback? What happens when you point a camera at a screen displaying the image from that camera? Fractals are what happens, or at least something that looks a lot like fractals. Code Parade has been playing with what he calls “analog fractals”, which are generated just by video feedback and not by computational means. While he’d prefer to do this old school with analog video equipment, it’s easy enough to replicate on a computer; he even has a web page that lets you arrange a series of virtual monitors on your screen. Point a webcam at the screen, and you’re off on a fractal journey that constantly changes and shifts. Give it a try.
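
If you’d rather skip the webcam-pointed-at-monitor setup entirely, the whole loop also simulates nicely in a few lines of OpenCV: blend the live camera image with a rotated, zoomed copy of the previous output frame. The angle, zoom, and blend weights below are arbitrary knobs to play with:

```python
# Simulated video feedback: each output frame is the camera image blended
# with a slightly rotated, zoomed copy of the previous output frame.
# Rotation angle, zoom, and blend weights are arbitrary starting points.
import cv2

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
if not ok:
    raise SystemExit("no camera found")

h, w = prev.shape[:2]
# rotate 2 degrees about the center and zoom in 2% each iteration
M = cv2.getRotationMatrix2D((w / 2, h / 2), 2.0, 1.02)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    warped = cv2.warpAffine(prev, M, (w, h))
    prev = cv2.addWeighted(frame, 0.3, warped, 0.7, 0)  # feed output back in
    cv2.imshow("feedback", prev)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

Press q to quit; nudging the rotation or zoom even slightly changes the character of the spirals completely.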