Real-time flight data used to be something that was only available to air traffic controllers, hunched over radar scopes in darkened rooms watching the comings and goings of flights as glowing phosphor traces on their screens. But that was then; now, flight tracking is as simple as pulling up a web page. But where’s the fun in that?
To bring some of that old-school feel to his flight tracking, [Jarrett Cigainero] has been working on this ADS-B scope that uses a real radar CRT. As you can imagine, this project is pretty complex, starting with driving the 5FP7 CRT, a 5″ round-face tube with a long-persistence P7-type phosphor. The tube needs about 7 kV for the anode, which is delivered via a homebrew power supply complete with a custom flyback transformer. There’s also a lot going on with the X-Y deflection amps and beam intensity control.
The software side has a lot going on as well. ADS-B data comes from an SDR dongle using dump1090 running on a Raspberry Pi 3B. The latitude and longitude of each plane within range — about 5 nautical miles — is translated to vector coordinates, and as the “radar” sweeps past the location, a pip lights up on the scope. And no, you’re not seeing things if you see two colors in the video below; as [TubeTime] helpfully explains, P7 is a cascade phosphor that initially emits a bright-blue light with some UV in it, which then charges up a long-persistence green phosphor.
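The lat/lon-to-vector conversion described above can be sketched roughly as follows. This is a minimal Python sketch, not [Jarrett]'s actual code: the function names, the equirectangular approximation (fine over a ~5 nmi radius), and the 2° beam width are all my assumptions.

```python
import math

NM_PER_DEG_LAT = 60.0  # nautical miles per degree of latitude

def plane_to_polar(rx_lat, rx_lon, ac_lat, ac_lon):
    """Return (range_nm, bearing_rad) of an aircraft from the receiver."""
    d_lat = (ac_lat - rx_lat) * NM_PER_DEG_LAT
    d_lon = (ac_lon - rx_lon) * NM_PER_DEG_LAT * math.cos(math.radians(rx_lat))
    rng = math.hypot(d_lat, d_lon)
    brg = math.atan2(d_lon, d_lat) % (2 * math.pi)  # 0 = north, clockwise
    return rng, brg

def polar_to_xy(rng, brg, max_range_nm=5.0, full_scale=1.0):
    """Scale range to the scope radius and rotate into X-Y deflection."""
    r = min(rng / max_range_nm, 1.0) * full_scale
    return r * math.sin(brg), r * math.cos(brg)  # x east, y north

def sweep_hits(brg, sweep_angle, beam_width=math.radians(2)):
    """True when the rotating sweep is close enough to paint the pip."""
    diff = (brg - sweep_angle + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= beam_width
```

A plane 3 nmi due north of the receiver lands at bearing 0 and 60 % of the way up the scope face, and its pip lights only while the sweep angle is within the beam width of that bearing.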
Even though multicolored icons and satellite imagery may be more useful for flight tracking, we really like the simple retro look [Jarrett] has managed to pull off here, not to mention the hackery needed to do it.
I like the idea, but it somehow looks much worse than I was expecting. The sweeping line especially is *very* jittery.
The narration explains the source of the current limitations pretty well. I think it’s great for hackers to do this, because it lets others better understand the pitfalls. Posting a flawless project with no explanation is much less interesting.
I second this. A failure is only a failure if nothing can be learnt from it.
That’s why I respect those who admit their mistakes and talk about them.
That way, others can learn from it or better understand their own, similar mistakes.
“Hacking” the Raspi’s SPI bus for this was fun, and it allows a simple hookup for anyone who wants to feed the analog data straight into an oscilloscope in XY mode, or into their own dedicated XY scope. To do it right, or at least better, would be to use the SPI bus as it was meant to be used, i.e. 8-bit data instead of forcing it to 16-bit and making the CS change take too long; feed that into a custom circuit or FPGA that can buffer the data, and then either drive a custom DAC directly, or output to a second SPI bus that is designed and dedicated to the LTC1662 DACs.
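For anyone curious about the 16-bit framing the LTC1662 wants, it can be sketched like this. The helper function is hypothetical, and the control codes are from memory; check them against the LTC1662 datasheet before trusting this on hardware.

```python
def ltc1662_word(channel, value):
    """Pack a 16-bit input word for the LTC1662 dual 10-bit DAC.

    Word layout: 4 command bits, 10 data bits, 2 don't-care bits.
    Command codes below are from memory -- verify with the datasheet:
      0b1001  load DAC A input register and update outputs
      0b1010  load DAC B input register and update outputs
    """
    if not 0 <= value <= 0x3FF:
        raise ValueError("value must fit in 10 bits")
    cmd = 0b1001 if channel == 0 else 0b1010
    return (cmd << 12) | (value << 2)
```

On the Pi that word would then go out MSB-first as two bytes, e.g. `spi.xfer2([w >> 8, w & 0xFF])` with the spidev library.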
The other problem with this seems to be memory access speed in the Raspi. While the SPI bus is DMA’ing, or whatever kernel routines are driving it, the other threads are unable to run or process data to be buffered, and a bottleneck results.
To continue doing it digitally, what really needs to happen is to send the raw aircraft ‘blip’ positions and the sweep angle to an FPGA, and have it do the heavy lifting of generating the intensity bitstream and the XY analog signal calculations.
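The FPGA’s job described above amounts to something like this per sweep step. This is a Python model of the math only, not FPGA code; the names, step count, and blip tolerances are all illustrative.

```python
import math

def sweep_samples(sweep_angle, blips, steps=256,
                  blip_r_tol=0.02, blip_a_tol=math.radians(2)):
    """Step outward along the current sweep angle, emitting one X-Y
    sample pair per step plus an intensity bit that goes high where a
    blip sits.  'blips' holds (range, bearing) pairs, range in [0, 1]."""
    samples = []
    sin_a, cos_a = math.sin(sweep_angle), math.cos(sweep_angle)
    for i in range(steps):
        r = i / (steps - 1)  # 0 at the scope centre, 1 at the edge
        bright = any(
            abs(r - br) <= blip_r_tol and
            abs((sweep_angle - ba + math.pi) % (2 * math.pi) - math.pi)
            <= blip_a_tol
            for br, ba in blips)
        samples.append((r * sin_a, r * cos_a, bright))
    return samples
```

The intensity bits from a pass like this are exactly the bitstream the Raspi is struggling to generate in real time between SPI transfers.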
Another way of doing it would be to generate the sweep with analog circuits, or a rotating magnet like they did in the olden days, and send sync pulses to a circuit that buffers the bitstream generated by the Raspi, with the sweep-angle sync pulses also fed directly to the Raspi to generate an IRQ or trigger a callback function. I would say this bitstream could be sent directly out the SPI bus, but playing with the Raspi’s SPI bus has taught me that even with the byte delay set to 0, there is still a slight delay between every 8 bits sent. In the past, though, I have used a dsPIC’s SPI bus to generate composite video data with a fair amount of success and no delay between bytes, so long as the IRQ the SPI module generates is serviced quickly and without delay.
I wonder if part of the problem is the rolling shutter effect from the camera, I’d be curious to see it at a very high fps like 240.
Oh, that flickering? I’m using a small neon bulb as an ~80 V regulator/shunt for the intensity circuit, and it has begun flickering. It’s a real disappointment, because it worked really well at first while being a really simple solution.
Reminds me of the days of 8s/7.2s Slow Scan TV…
RADAR indicators that used P7 phosphors generally used a yellow filter over them to eliminate the blue-white short-persistence component of the light the screen produces, which reduces operator fatigue.
Wondering if you could do something similar with an old digital storage scope and just feed it via the two channel inputs…
Well, yeah, you could, but then you’re just plotting the data on a computer display, right?
If it looks cool, then what does it matter? I think I mentioned doing just this in one of my videos, or in the GitHub repo README, for those who don’t have an analog scope, or who want the persistence but don’t have a scope with long-persistence phosphors. Maybe someone just wants to get their feet wet with minimal investment, and they already have a digital scope, a Raspi, and some DACs they can use. Shoot, I even thought about writing a graphics routine that would simulate the P7 phosphor glow, but I’m not good at programming graphics. ;)
Could you explain in more detail the process of driving the 5FP7 CRT and the components involved, such as the custom flyback transformer?