ESPFLIX Brings Streaming Video To The World Of Microcontrollers

These days, if you’ve got a TV that’s a little too old to directly access streaming services, you’ve got plenty of options. Apple TV, Chromecast, and a cavalcade of Android boxes are available to help get content on your screen. However, if you’re really stuck in the past, ESPFLIX might just be for you.

Control of the system is achieved with an Apple TV remote.

Yes, that’s right – it’s an online streaming service running on an ESP32. [rossumur] has achieved this feat through a careful use of codecs, and some efficient coding strategies to make it all come together. Video is MPEG1, at just 352×192 resolution. Audio is via the SBC codec, originally intended for use with Bluetooth devices. It’s chosen here for its tiny sample buffers, making it easier to decode in the limited RAM of the ESP32. Output is via composite video, generated on the ESP32 itself.
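For a sense of why those codec choices matter, here’s some back-of-the-envelope arithmetic in Python. The numbers are typical codec figures rather than anything pulled from the ESPFLIX source, but they show how tight the fit is:

```python
# Back-of-the-envelope arithmetic for why these codec choices suit an ESP32.
# Rough numbers only; the figures below are typical values, not taken from
# the ESPFLIX source itself.

WIDTH, HEIGHT = 352, 192          # MPEG1 resolution quoted above
ESP32_SRAM = 520 * 1024           # ~520 KB of internal SRAM on a stock ESP32

# A decoded MPEG1 frame is YCbCr 4:2:0, i.e. 1.5 bytes per pixel.
frame_bytes = int(WIDTH * HEIGHT * 1.5)
print(f"One decoded frame: {frame_bytes / 1024:.0f} KB "
      f"({frame_bytes / ESP32_SRAM:.0%} of total SRAM)")

# SBC decodes in very small chunks: 16 blocks x 8 subbands = 128 PCM samples
# per channel per frame. Compare with MP3's 1152 samples per frame.
for name, samples in (("SBC", 16 * 8), ("MP3", 1152)):
    pcm_bytes = samples * 2 * 2   # 16-bit stereo PCM
    print(f"{name}: {samples} samples/frame -> {pcm_bytes} bytes of PCM per frame")
```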

The titles themselves consist of public domain content, served from Amazon Web Services. With limited RAM on the ESP32 there’s not much buffering to be had, so [rossumur] is bankrolling an AWS CloudFront distribution, which should make it possible to use ESPFLIX from most places around the world with a solid internet connection.

We’ve seen [rossumur]’s work before, with the ESP_8_BIT serving as a prelude to this project’s capabilities. Video after the break.

Continue reading “ESPFLIX Brings Streaming Video To The World Of Microcontrollers”

Apollo Missions Get Upgraded Video

July 20th marked the anniversary of the first human setting foot on the moon. If you were alive back then, you probably remember being glued to the TV watching the high-tech images of Armstrong taking that first step. But if you go back and watch the video today, it doesn’t look the way you remember it. We’ve been spoiled by high-definition video with incredible frame rates. [Dutchsteammachine] has taken a great deal of old NASA footage and used frame-interpolation tools to bring it up to higher frame rates that look a lot better, as you can see below.

The original film from the moon landing ran at 12 frames per second, dropping as low as 1 frame per second in places. The new video is interpolated up to 24 frames per second, and some of the later Apollo mission film is jacked up to 60 frames per second. The results are great.
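The heavy lifting in projects like this is done by dedicated interpolation tools, but the basic idea is easy to sketch. The snippet below is a naive stand-in that doubles a clip’s frame rate by inserting a 50/50 blend between neighbouring frames with OpenCV; it’s purely illustrative, the filenames are placeholders, and it is emphatically not the method [Dutchsteammachine] used:

```python
# Minimal frame-interpolation sketch: double the frame rate of a clip by
# inserting a 50/50 blend between each pair of frames. This is a naive
# cross-fade, NOT the AI-based interpolation used for the Apollo footage;
# it only illustrates the idea. Filenames are placeholders.
import cv2

cap = cv2.VideoCapture("apollo_in.mp4")        # hypothetical input file
fps = cap.get(cv2.CAP_PROP_FPS)
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter("apollo_2x.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"), fps * 2, (w, h))

ok, prev = cap.read()
while ok:
    ok, cur = cap.read()
    if not ok:
        out.write(prev)
        break
    out.write(prev)
    # The in-between frame: real interpolators estimate motion vectors here;
    # we simply average the two neighbouring frames.
    out.write(cv2.addWeighted(prev, 0.5, cur, 0.5, 0))
    prev = cur

cap.release()
out.release()
```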

Continue reading “Apollo Missions Get Upgraded Video”


Hackaday Links: June 7, 2020

For many of us who were in college at the time, the 1989 release of Will Wright’s classic SimCity sounded the death knell of our GPAs. Being able to create virtual worlds and then smite them with a tornado or a kaiju attack was the stuff of a procrastinator’s dreams. We always liked the industrial side of the game best, and took great pains in laying out the factory zones, power plants, and seaports. Those of a similar bent will be happy to know that Maxis, the studio behind the game, had a business simulations division, and one of its products was a complete refinery simulator built for Chevron and called, unsurprisingly, SimRefinery. The game, which bears a striking resemblance to SimCity, has been recovered and is now available for download, which means endless procrastination by playing virtual petrochemical engineer is only a mouse click away.

Speaking of time wasters, we stumbled upon another simulation this week that sucked away a couple of hours of productivity. As RTL-SDR.com reports, a YouTuber called Information Zulu has a 24/7 live stream showing arrivals and departures at Los Angeles International Airport. That may sound boring, but the cameras used to watch the runways are virtual, and the planes are animated based on ADS-B data being scooped up by an RTL-SDR dongle. We pinged Information Zulu and asked for a rundown of the gear behind the system but never heard back; if we ever do, we’ll post a full article on what we learned, because the level of detail is amazing. The arriving and departing planes sport the correct livery for the airline, the current weather conditions are shown, taxiing is shown in real time, and there’s even an audio feed from air traffic control.
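We don’t know what [Information Zulu]’s software stack looks like, but the receive side of a system like this can be sketched with the pyModeS library: decode raw ADS-B messages into aircraft identities and positions, then hand them to whatever does the rendering. The snippet below is a hypothetical sketch along those lines; the message feed would come from an RTL-SDR via dump1090 or similar, and feed_renderer() stands in for the unknown animation side:

```python
# Hypothetical sketch of the receive side of such a setup: decode aircraft
# positions from raw ADS-B messages with the pyModeS library. Nothing here
# is taken from Information Zulu's actual system.
import pyModeS as pms

LAX_LAT, LAX_LON = 33.9416, -118.4085     # receiver reference position

def handle(msg_hex: str):
    """Decode one 112-bit ADS-B message given as a hex string."""
    if pms.df(msg_hex) != 17:              # DF17 = ADS-B extended squitter
        return
    icao = pms.adsb.icao(msg_hex)
    tc = pms.adsb.typecode(msg_hex)
    if 1 <= tc <= 4:                        # aircraft identification message
        print(icao, "callsign", pms.adsb.callsign(msg_hex))
    elif 9 <= tc <= 18:                     # airborne position message
        lat, lon = pms.adsb.position_with_ref(msg_hex, LAX_LAT, LAX_LON)
        alt = pms.adsb.altitude(msg_hex)
        print(icao, "at", lat, lon, alt)    # feed_renderer(icao, lat, lon, alt)
```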

If you’re looking to gain back a little of the productivity lost to the last two items, Digi-Key might be able to help with their new PCB Builder service. All you have to do is upload your gerbers and select your materials, and they’ll give you options for a bunch of different quick-turn fabrication houses. Looks mighty convenient.

Steve Mould dropped a video this week about vibration analysis. That might not sound very exciting, but the fascinating bit is how companies are now using motion amplification video techniques to show how and where industrial equipment is moving, even if those motions are too subtle to be seen by the naked eye. It’s frankly terrifying to see how pipes flex and tanks expand and contract, and how pumps and motors move relative to each other. The technique used is similar to the way a person’s pulse can be detected on a video by the subtle color change as blood rushes into capillaries. We’d love to see someone tackle a homebrew version of this so we can all see what’s going on around us.
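For anyone tempted by that homebrew challenge, the core trick is simpler than it sounds. Here’s a heavily stripped-down sketch in the spirit of Eulerian video magnification: keep a slow running average of each pixel, treat the deviation from it as the subtle signal, amplify that deviation, and add it back. Real implementations filter in both space and time, so treat this as a starting point rather than the real thing; the filename and gains are placeholders:

```python
# Very stripped-down motion/colour amplification: temporal low-pass each
# pixel, then exaggerate whatever the low-pass removed. Only a starting
# point for a homebrew version, not a production technique.
import cv2
import numpy as np

ALPHA = 0.05      # running-average update rate (slow = keeps only the DC part)
GAIN = 20.0       # how much to exaggerate the temporal variation

cap = cv2.VideoCapture("pump_room.mp4")       # hypothetical input clip
ok, frame = cap.read()
avg = frame.astype(np.float32)

while ok:
    f = frame.astype(np.float32)
    avg = (1 - ALPHA) * avg + ALPHA * f       # temporal low-pass
    amplified = f + GAIN * (f - avg)          # boost the subtle variation
    cv2.imshow("amplified", np.clip(amplified, 0, 255).astype(np.uint8))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
    ok, frame = cap.read()

cap.release()
cv2.destroyAllWindows()
```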

And finally, we want to remind everyone that the Hackaday Prize is back, and that you should get your entries going. What’s new this year is the Dream Team challenges, where four worthy non-profit organizations will each assemble a three-person team to work on a specific pain point in their process. The application deadline has been extended to June 9, and there are two $3,000 microgrants, one in June and one in July, for each team member. So look through the design briefs and see if your skills match their needs.

Receive Analog Video Radio Signals From Scratch

If you’ve been on the RTL-SDR forums lately, you may have seen that a lot of work has been going into DragonOS, a purpose-built, Debian-based Linux distribution from the software-defined radio community that can do plenty of SDR work out of the box. The latest and most exciting project to come out of that effort involves a method for using the software to receive and demodulate analog video.

[Aaron]’s video (linked below) demonstrates using a piece of software called SigDigger to analyze an incoming analog video stream from a drone with a HackRF. (Of course any incoming analog signal could be used; it doesn’t need to come from a drone.) The software shows the various active frequency ranges and allows the user to narrow in on one and start demodulating it. While it has to be dialed in just right to get anything that doesn’t look like snow, [Aaron] is able to get recognizable results in just a few minutes.
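SigDigger hides the signal processing behind a GUI, but the demodulation step itself isn’t magic. Analogue FPV video links are typically FM, and FM demodulation of captured IQ samples boils down to taking the phase difference between consecutive samples. The sketch below only illustrates that one step, assuming an FM signal and a capture file of complex samples; it isn’t code from [Aaron]’s video:

```python
# Rough sketch of what an FM demodulator for an analogue FPV video signal is
# doing under the hood. Assumes the link is FM (typical for analogue FPV) and
# that iq.bin holds complex64 samples already tuned to the video carrier.
import numpy as np

iq = np.fromfile("iq.bin", dtype=np.complex64)    # hypothetical capture file

# FM demodulation: the instantaneous frequency is the phase difference
# between consecutive complex samples.
baseband = np.angle(iq[1:] * np.conj(iq[:-1]))

# What comes out is the composite video waveform: sync pulses, blanking, and
# picture. The next step would be finding the horizontal sync pulses and
# slicing the signal into lines to rebuild an image.
baseband -= baseband.min()
baseband /= baseband.max()                         # normalise to 0..1
print(f"{len(baseband)} demodulated samples, mean level {baseband.mean():.2f}")
```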

Getting something like this to work completely in software is an impressive feat, especially considering that all of the software used here is free. Granted, this wouldn’t be as easy for a digital signal like most TV stations broadcast, but there’s still a lot of fun to be had. In case you missed the release of DragonOS, we covered it a few weeks ago, and it’s only gotten better since then, with this project as just one example.

Continue reading “Receive Analog Video Radio Signals From Scratch”

FPGA Raises Component Video From A Sinclair ZX Spectrum

An abiding memory of the early-80s heyday of 8-bit computing for many is operating their computer from the carpet in front of the family TV. While the kids in the computer adverts had parents who bought them a portable colour telly on which to play Jet Set Willy, the average kid had used up all the Christmas present money on the computer itself. The cable would have been an RF connection to the TV antenna socket, and the picture quality? At the time we thought it was amazing because we didn’t know any different, but with the benefit of nearly 40 years’ hindsight, it was awful.

For ZX Spectrum owners in 2020 a standard modification is to bring out a composite video signal, but [c0pperdragon] has gone a step or two beyond that with a component video interface. And this isn’t a mod in which the signals are lifted from the Spectrum’s colour encoder circuitry; instead it uses an FPGA hooked directly to the ULA chip to generate the component video itself.

The Altera chip sits on a little PCB designed to occupy the footprint of the original Astec modulator, and sports a neat bundle of wires hooked up to the various Spectrum signals it needs. There are a couple of jumpers to select the output type and resolution; it supports YPbPr or RGsB outputs at both 288p and 576p. If you think it looks a little familiar, that’s because it’s the sister project of an earlier board for the Commodore 64. So if you have a Spectrum and are annoyed by UHF and PAL, perhaps it’s worth a look.
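The colour-space conversion behind a YPbPr output is standard BT.601 maths, and it’s easy to sanity-check in a few lines of Python. This is just the arithmetic, not a reflection of how [c0pperdragon]’s gateware actually implements it; with the Spectrum’s small fixed palette, a lookup table would plausibly do the job in hardware:

```python
# The colour maths behind a YPbPr output, sketched as a numeric sanity check
# rather than anything resembling the FPGA implementation. Coefficients are
# the standard BT.601 values.
def rgb_to_ypbpr(r, g, b):
    """r, g, b in 0..1 -> (Y, Pb, Pr) with Y in 0..1 and Pb/Pr in -0.5..0.5."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    pb = 0.564 * (b - y)
    pr = 0.713 * (r - y)
    return y, pb, pr

# Example: the Spectrum's bright magenta (full red + full blue).
print(rgb_to_ypbpr(1.0, 0.0, 1.0))
```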

Full-Colour, Full-Motion Video – On An Audio Cassette!

A lot of projects we feature use video in some form or other, but that video is invariably digital: it exists as a stream of numbers in a computer’s memory or storage, and is often compressed. For some of us who grew up working with composite video there is a slight regret that we rarely get up close and personal with an analogue stream, so [Kris Slyka]’s project putting video on a conventional audio cassette is a rare opportunity.

It’s fair to say this isn’t the highest quality video.

Readers with long memories may recall the Fisher-Price PixelVision toy from the late 1980s, which recorded black-and-white video on a conventional cassette running at many times normal speed. This system doesn’t take that tack; instead it decreases resolution and frame rate to the point where the video can be recorded at conventional cassette speeds. The result is not particularly high quality, but with luminance on one channel of a stereo recording and chrominance on the other, it does work.
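Some rough arithmetic shows why the resolution and frame rate have to fall so far. The numbers below are illustrative guesses, not parameters from [Kris Slyka]’s encoder:

```python
# Why the picture has to shrink: back-of-the-envelope arithmetic for how many
# pixels fit into audio-cassette bandwidth. Illustrative guesses only, not
# figures taken from the actual encoder.
SAMPLE_RATE = 44100      # samples/s of luminance on one stereo channel
FPS = 10                 # an assumed, heavily reduced frame rate
SYNC_OVERHEAD = 0.10     # assume ~10% of samples spent on sync/blanking

samples_per_frame = SAMPLE_RATE / FPS * (1 - SYNC_OVERHEAD)
side = int(samples_per_frame ** 0.5)     # square frame, one sample per pixel
print(f"{samples_per_frame:.0f} usable samples per frame "
      f"-> roughly a {side}x{side} pixel image at {FPS} fps")
```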

The video below the break is a run-through of the system, with an explanation of how video signals work. Meanwhile the code for both encoder and decoder is available through the magic of GitHub. If you’re interested further, take a look at our examination of a video waveform.

Continue reading “Full-Colour, Full-Motion Video – On An Audio Cassette!”

P-51 Cockpit Recreated With Help Of Local Makerspace

It’s surprisingly easy to misjudge tips that come into the Hackaday tip line. After filtering out the omnipresent spam, a quick scan of tip titles will often form a quick impression that turns out to be completely wrong. Such was the case with a recent tip that seemed from the subject line to be a flight simulator cockpit. The mental picture I had was of a model cockpit hooked to Flight Simulator or some other off-the-shelf flying game, many of which we’ve seen over the years.

I couldn’t have been more wrong about the project that Grant Hobbs undertook. His cockpit simulator turned out to be so much more than what I thought, and after trading a few emails with him to get all the details, I felt like I had to share the series of hacks that led to the short video below and the story about how he somehow managed to build the set despite having no previous experience with the usual tools of the trade.

Continue reading “P-51 Cockpit Recreated With Help Of Local Makerspace”