By now you’ve almost certainly heard about the recent release of a high-resolution satellite image showing the aftermath of Iran’s failed attempt to launch its Safir liquid fuel rocket. The geopolitical ramifications of Iran developing this type of ballistic missile technology are certainly newsworthy in their own right, but in this case, there’s been far more interest in how the picture was taken. Given known variables such as the time and date of the incident and the location of the launch pad, analysts have determined it was likely taken by a classified American KH-11 satellite.
The image is certainly striking, showing a level of detail that far exceeds what’s available through any of the space observation services we as civilians have access to. Estimated to have been taken from a distance of approximately 382 km, the image appears to have a resolution of roughly ten centimeters per pixel. Given that the orbit of the satellite in question dips as low as 270 km on its closest approach to the Earth’s surface, it’s likely that the maximum attainable resolution is even finer.
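Those numbers line up nicely with basic optics. As a sanity check, here’s a quick back-of-the-envelope calculation using the Rayleigh diffraction limit; the 2.4 m mirror diameter is the figure commonly cited for KH-11-class optics, not something confirmed by the image itself, and atmospheric effects are ignored entirely.

```python
import math

def diffraction_limited_resolution(wavelength_m, altitude_m, aperture_m):
    """Rayleigh criterion: smallest resolvable ground distance for a
    circular aperture at a given altitude (ignores the atmosphere)."""
    return 1.22 * wavelength_m * altitude_m / aperture_m

# Assumed values: a 2.4 m primary mirror, green light at 550 nm, and the
# two altitudes mentioned above.
for altitude_km in (382, 270):
    res_m = diffraction_limited_resolution(550e-9, altitude_km * 1e3, 2.4)
    print(f"{altitude_km} km -> {res_m * 100:.1f} cm")
```

At 382 km this works out to about 10.7 cm, which matches the resolution estimated from the leaked image, and at the 270 km perigee it drops to roughly 7.5 cm.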
Of course, there are many aspects of the KH-11 satellites that remain highly classified, especially with regard to the latest hardware revisions. But their existence and general design have been common knowledge for decades. Images taken from earlier generation KH-11 satellites were leaked or otherwise released in the 1980s and 1990s, and while the Iranian image is certainly of a higher fidelity, this is not wholly surprising given the intervening decades.
What we know far less about are the orbital surveillance assets that supersede the KH-11. The satellite that took this image, known by its designation USA 224, has been in orbit since 2011. The National Reconnaissance Office (NRO) has launched a number of newer spacecraft since then, with several more slated to be lifted into orbit between now and 2021.
So let’s take a closer look at the KH-11 series of reconnaissance satellites, and compare them to what we can piece together about the capabilities of the next generation of orbital espionage technology that’s already circling overhead.
Continue reading “Watching The Watchers: The State Of Space Surveillance”
Synthetic-aperture radar, in which a moving radar is used to simulate a very large antenna and obtain high-resolution images, is typically not the stuff of hobbyists. Nobody told that to [Henrik Forstén], though, and so we’ve got this bicycle-mounted synthetic-aperture radar project to marvel over as a result.
Neither the electronics nor the math involved in making SAR work is trivial, so [Henrik]’s comprehensive write-up is invaluable to understanding what’s going on. First step: build a 6 GHz frequency-modulated continuous-wave (FMCW) radar, a project that [Henrik] undertook some time back that really knocked our socks off. His FMCW set is good enough to resolve human-scale objects at about 100 meters.
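The core trick of an FMCW radar is that after mixing the received sweep with the transmitted one, each target becomes a constant "beat" tone whose frequency is proportional to its range. Here’s a minimal NumPy sketch of that idea; the sweep bandwidth, sweep time, and sample rate are invented for illustration, not [Henrik]’s actual design values.

```python
import numpy as np

# Assumed sweep parameters -- illustrative only.
c = 3e8          # speed of light, m/s
B = 240e6        # sweep bandwidth, Hz
T = 1e-3         # sweep duration, s
fs = 1e6         # ADC sample rate, Hz

def beat_to_range(f_beat):
    """FMCW relation: beat frequency is proportional to round-trip delay."""
    return c * f_beat * T / (2 * B)

# Simulate the dechirped (mixed-down) signal for a target at 100 m.
target_range = 100.0
f_beat = 2 * target_range * B / (c * T)   # 160 kHz for these parameters
t = np.arange(int(fs * T)) / fs
signal = np.cos(2 * np.pi * f_beat * t)

# Range profile: FFT of one sweep; the peak bin gives the beat frequency.
spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
peak_hz = np.argmax(spectrum) * fs / len(signal)
print(f"estimated range: {beat_to_range(peak_hz):.1f} m")
```

One FFT per sweep gets you a range profile; it’s stringing many of those profiles together along a path that turns a ranging radar into an imaging one.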
Moving the radar and capturing data along a path are the next steps and are pretty simple, but figuring out what to do with the data is anything but. [Henrik] goes into great detail about the SAR algorithm he used, called Omega-K, a routine that makes use of the Fast Fourier Transform, which he implemented for a GPU using TensorFlow. We usually see that framework in neural net applications, but here the code turned out remarkably detailed 2D scans of a parking lot he rode through with the bike-mounted radar. [Henrik] added an auto-focus routine as well, and you can clearly see each parked car, light pole, and distant building within range of the radar.
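Omega-K itself works in the wavenumber domain and involves a Stolt interpolation step that’s well beyond a blog aside. But the underlying idea all SAR algorithms share, coherently summing many pulses so energy piles up at true scatterer locations, can be shown with a much simpler time-domain backprojection toy. This is a deliberately simplified stand-in, not [Henrik]’s code; the geometry and numbers are invented, and the phase term is omitted so only magnitudes are summed.

```python
import numpy as np

dr = 0.05                              # range-bin size, m
positions = np.linspace(-2, 2, 41)     # platform track along x, m
scatterer = np.array([0.0, 10.0])      # point target at (x, y)

# Simulated range profiles: one spike per pulse at the target's distance.
n_bins = 400
profiles = np.zeros((len(positions), n_bins))
for i, px in enumerate(positions):
    d = np.hypot(scatterer[0] - px, scatterer[1])
    profiles[i, int(round(d / dr))] = 1.0

# Backprojection: for every image pixel, sum the sample each pulse
# recorded at that pixel's distance. Energy piles up at the true target.
xs = np.linspace(-1, 1, 21)
ys = np.linspace(9, 11, 21)
image = np.zeros((len(ys), len(xs)))
for i, px in enumerate(positions):
    for yi, y in enumerate(ys):
        for xi, x in enumerate(xs):
            d = np.hypot(x - px, y)
            image[yi, xi] += profiles[i, int(round(d / dr))]

peak = np.unravel_index(np.argmax(image), image.shape)
print("brightest pixel:", xs[peak[1]], ys[peak[0]])   # lands near (0, 10)
```

Backprojection is brute force; frequency-domain methods like Omega-K get to an equivalent answer with FFTs instead of the triple loop, which is why a GPU FFT library was such a good fit for [Henrik]’s data volumes.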
We find it pretty amazing what [Henrik] was able to accomplish with relatively low-budget equipment. Synthetic-aperture radar has a lot of applications, and we’d love to see this refined and developed further.
[Henrik] is at it again. Another thoroughly detailed radar project has shown up on his blog. This time [Henrik] is making some significant improvements to his previous homemade radar with the addition of Synthetic Aperture Radar (SAR) to his previous Frequency Modulated Continuous Wave (FMCW) system.
[Henrik’s] new design uses an NXP LPC4320, which uniquely combines an ARM Cortex-M4 MCU with a Cortex-M0 co-processor. The HackRF also uses this micro, as it has some features that are particularly useful here: the Serial GPIO (SGPIO) peripheral, which can be (tediously) configured for high-speed data transfer, and high-speed USB, all for about $8 in single quantities. The mixed-signal design is split across two boards: a four-layer RF board and a two-layer digital board.
Like the gentleman he is, [Henrik] has included schematics, board files, and his modified source from the HackRF project in his GitHub repo. There is simply too much information in his post to summarize here; if you need instant gratification, check out the pictures after the break.
The write-up on his personal blog is impressive and worth a look if you didn’t catch our coverage of his single board Linux computer, or his previous radar design.
Continue reading “An Improvised Synthetic Aperture Radar”
What could possibly be better than printing out a few low-resolution voxels on a MakerBot? A whole lot of things, but how about getting those voxels with your own synthetic aperture radar? That’s what [Gregory Charvat] has been up to, and he’s documented the entire process for us.
The build began with an ultra wideband impulse radar we saw a while ago. The radar is built from scraps [Greg] picked up on eBay, and is able to image a scene in the time domain, creating nice linear sweeps on a MATLAB plot when [Greg] runs in front of the horns.
With an impulse radar under his belt, [Greg] moved up the technological ladder to something that can produce vaguely intelligible images with his setup. The synthetic aperture radar was made by putting his radar horns on the carriage of a garage door opener. The horns slowly scan back and forth along the linear rail, taking single impulse readings and adding them together into an image. In the video below, [Greg] is able to image a few pieces of copper pipe only a few inches in diameter. The necessary equipment for this build only cost [Greg] a few hundred bucks at the Dayton Hamvention, and a similar setup could be put together for even less.
If building an X-band impulse synthetic aperture radar isn’t impressive enough, [Greg] also 3D printed one of his radar images on a MakerBot. That’s just applying stlwrite to the 2D radar image and feeding it into MakerWare. Gotta have that blog cred, though. It also makes for the best headline I’ve ever written.
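The image-to-printable-object step is less magic than it sounds: treat each pixel’s radar return intensity as a height, and tile the grid with triangles. Here’s a rough NumPy sketch of that conversion to ASCII STL; it only approximates what MATLAB’s stlwrite produces (no side walls or base are generated, and the facet normals are left as placeholders, which most slicers recompute anyway).

```python
import numpy as np

def heightmap_to_ascii_stl(z, path, scale=1.0):
    """Write a 2D array as an ASCII STL surface: each pixel value becomes
    a height, and each grid cell becomes two triangles."""
    rows, cols = z.shape
    with open(path, "w") as f:
        f.write("solid radar\n")
        for r in range(rows - 1):
            for c in range(cols - 1):
                # Corner vertices of this grid cell: (x, y, height).
                v = [(c,     r,     z[r, c]         * scale),
                     (c + 1, r,     z[r, c + 1]     * scale),
                     (c,     r + 1, z[r + 1, c]     * scale),
                     (c + 1, r + 1, z[r + 1, c + 1] * scale)]
                for tri in ((v[0], v[1], v[2]), (v[1], v[3], v[2])):
                    f.write("facet normal 0 0 1\n outer loop\n")
                    for x, y, h in tri:
                        f.write(f"  vertex {x} {y} {h}\n")
                    f.write(" endloop\nendfacet\n")
        f.write("endsolid radar\n")

# Usage: a tiny fake "radar image" with one bright return in the middle.
img = np.zeros((8, 8))
img[4, 4] = 5.0
heightmap_to_ascii_stl(img, "radar.stl")
```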
Continue reading “DIY Ultra Wideband Impulse Synthetic Aperture Radar And A MakerBot”
Learn why you were pulled over, quantify the stealthiness of your favorite model aircraft, or see what various household items look like at 10 GHz. In this post we will describe the basics of Synthetic Aperture Radar (SAR) imaging, beginning with a historical perspective, showing the state of the art, and describing what can be done in your garage laboratory. Let’s image with microwaves!
Continue reading “Radar Imaging In Your Garage: Synthetic Aperture Radar”
A few profs from MIT’s Lincoln Lab are giving those poor MIT undergrads something to do over winter break: they’re teaching a three-week course on building a laptop-powered radar system capable of radar ranging, Doppler, and synthetic aperture imaging. Interestingly, the radar system that teams will build for the class has a BOM totaling $360, and they’re also putting the entire class online if you’d like to follow along and build your own.
From the lecture notes for the course, the radar system is made out of an off-the-shelf LNA, oscillator, and splitter. By connecting two coffee can ‘cantennas’, it’s possible to record a .WAV file from the signal coming off the radar and use MATLAB to turn that audio signal into Doppler radar data.
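That audio-to-Doppler step is surprisingly little code: a moving target shifts the carrier by f = 2v/λ, and after mixing, that shift lands in the audio band, so the target’s speed is just the dominant tone in the recording. Here’s a hedged NumPy sketch standing in for the course’s MATLAB scripts; the 2.4 GHz carrier matches the coffee-can radar, but the sample rate and simulated target speed are invented.

```python
import numpy as np

c = 3e8
fc = 2.4e9                   # assumed carrier frequency, Hz
wavelength = c / fc          # 0.125 m

fs = 44100                   # sample rate of the recorded .WAV
v_true = 10.0                # simulated target speed, m/s
f_doppler = 2 * v_true / wavelength   # 160 Hz beat tone after mixing

t = np.arange(fs) / fs       # one second of "audio"
audio = np.cos(2 * np.pi * f_doppler * t)

# The Doppler shift shows up as the dominant tone in the spectrum.
spectrum = np.abs(np.fft.rfft(audio))
peak_hz = np.argmax(spectrum) * fs / len(audio)
v_est = peak_hz * wavelength / 2
print(f"estimated speed: {v_est:.1f} m/s")
```

Swap the synthesized cosine for samples read out of the real .WAV file and you have the essence of the course’s Doppler mode.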
It’s a very ambitious project that goes deep down the rabbit hole of RF and analog design. One of the lecturers made a YouTube demo of the radar in ranging mode; you can check that out after the break.
Continue reading “Build A $360 Synthetic Aperture Radar With MIT’s OpenCourseware”