A neat visualization of wireless signals was released last week showing off what our world might look like if we could see radio signals. While it’s an awesome visual effect, it’s really not what we would see. At least not with our puny human eyes.
The app uses data like WiFi hotspots, cell towers, and other wireless devices to create an augmented reality effect showing where the signals are propagating from. Site-specific versions of the app also include the wired communication infrastructure, giving a complete window into the science-fiction-sounding “infosphere”.
But as a user on Gizmodo commented, if we could actually see radio signals, they would just be flashes of light. Radio waves are just electromagnetic waves with wavelengths longer than infrared light, after all. Though if we could see those wavelengths, what’s the chance we have light speed vision too?
Still, it’s a pretty cool visualization. [Vijen], the creator, says he hopes to release the app to the public after it’s been exhibited at the ZKM Center for Art and Media in Karlsruhe.
[via Gizmodo]
If image sensors are sensitive to visible light, why not make a sensor that’s sensitive to radio? With pixels too, but of a different size and material, each outputting a voltage.
Directly translated, the sensor would have to be enormous. You could do it with a combination of a tetrahedral antenna array, four SDRs, multilateration, lag-time analysis, and a type of Radon transform. You would also need a very accurate clock to get any kind of dimensional fidelity.
All in all, it’s a daunting project with lots of science and engineering, but the hardware could be relatively cheap. (A rough sketch of the multilateration step follows below the link.)
An excellent foundation:
http://hackaday.com/2015/06/05/building-your-own-sdr-based-passive-radar-on-a-shoestring/
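For what it’s worth, here is a minimal 2-D sketch of just the multilateration step, assuming the hard part (synchronized SDR capture and cross-correlating the channels to get the lag times) is already done. The receiver positions and the emitter location are made up, and scipy does the solving:

# Toy 2-D multilateration from time-difference-of-arrival (TDOA) measurements.
# The synchronized capture and lag-time analysis are assumed to have happened
# already; positions and delays below are invented for illustration.
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0  # speed of light, m/s

# Hypothetical receiver positions (metres) and the "true" emitter location
rx = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
emitter_true = np.array([3.0, 7.0])

# Simulated TDOAs relative to receiver 0 (what the lag-time analysis would yield)
dist = np.linalg.norm(rx - emitter_true, axis=1)
tdoa = (dist - dist[0]) / C

def residuals(p):
    d = np.linalg.norm(rx - p, axis=1)
    return (d - d[0]) / C - tdoa

est = least_squares(residuals, x0=np.array([5.0, 5.0])).x
print("estimated emitter position:", est)  # ~[3, 7]

This is also where the very accurate clock comes in: any timing error between receivers turns directly into position error at the speed of light.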
Wavelength is possibly the reason we can’t easily make radio image sensors. Longer wavelengths mean bigger pixels, a bigger sensor, and a bigger camera.
Also, how do you focus the photons to form an image? In the end, you’d essentially be building a small radio telescope.
Not small, gigantic. A radio telescope is a single sensor with “magnification”; to get an image you need thousands of them. So you would be building the world’s largest radio telescope, and it would probably take up a huge section of the Nevada desert, with structures to keep the sensors in a plane instead of following the curvature of the Earth. (Some back-of-envelope numbers follow below.)
It would also discover new science, as we haven’t as a species built one like that yet.
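To put a hedged number on “gigantic”: using the Rayleigh criterion and assuming you want roughly human-eye angular resolution (about one arcminute) at 2.4 GHz, the aperture works out to around half a kilometre. A throwaway Python check, with all of the figures being round assumptions:

import math

c = 3e8                       # speed of light, m/s
f = 2.4e9                     # 2.4 GHz Wi-Fi
wavelength = c / f            # ~0.125 m
theta = math.radians(1 / 60)  # ~1 arcminute, roughly human-eye resolution

# Rayleigh criterion: theta ~ 1.22 * wavelength / D, solved for the aperture D
D = 1.22 * wavelength / theta
print(f"aperture needed: {D:.0f} m")  # on the order of 500 m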
A 4″ telescope has an optical aperture hundreds of thousands of times the wavelength of the EM radiation of interest (405–775 nm).
A radio telescope like Parkes in Australia (instantly watch “The Dish” if you don’t know what it is, or visit http://www.atnf.csiro.au/outreach/visiting/parkes/qa.html for a more concise and boring view of its achievements), which is 64 m in diameter, may be only a few hundred times the wavelength in diameter at the frequencies it observes.
Thus, it can’t really make an “image”. Other EM properties can come in useful, though, like interference patterns, which give us some very large arrays (spanning the width of the globe).
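A quick sanity check on those ratios, using a 4-inch (~0.1 m) scope at 550 nm and a 64 m dish at the 21 cm hydrogen line as stand-in numbers:

# Aperture measured in wavelengths, which is what sets the
# diffraction-limited resolving power of each instrument.
optical = 0.1 / 550e-9   # 4-inch scope, green light
radio = 64.0 / 0.21      # 64 m dish, 21 cm hydrogen line
print(f"optical: ~{optical:,.0f} wavelengths across")  # ~180,000
print(f"radio:   ~{radio:,.0f} wavelengths across")    # ~300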
Also, don’t we already see at light speed? (Even if the response time of the retina and optic centres is about 20ms?)
Or a big one. A very, very big one: https://eiscat3d.se/
To put it in perspective, the physical length (wavelength) of common radio waves ranges from a few inches (2.4 GHz Wi-Fi) through about 10 feet (FM radio) to over a thousand feet (AM radio).
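Those figures check out with wavelength = c / f; a quick verification, with the FM and AM frequencies being just representative band values:

c = 3e8  # speed of light, m/s
for name, f in [("2.4 GHz Wi-Fi", 2.4e9),
                ("FM radio (~100 MHz)", 100e6),
                ("AM radio (~1 MHz)", 1e6)]:
    print(f"{name}: {c / f:.3g} m")
# 0.125 m (~5 in), 3 m (~10 ft), and 300 m (~1000 ft) respectively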
We have radars with huge phased arrays (think 1500 pixels in a 1 x 1 m plane). In passive mode they can measure wave direction, amplitude, and frequency, so it is basically analogous to a camera.
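As a toy illustration of how a passive array gets direction: with just two elements at half-wave spacing and a narrowband plane wave, the arrival angle falls straight out of the phase difference. All the numbers here are made up, not a description of any particular radar:

# Toy direction-of-arrival estimate from the phase difference between two
# array elements, assuming a narrowband plane wave and half-wave spacing.
import numpy as np

f = 2.4e9
wavelength = 3e8 / f
d = wavelength / 2            # element spacing
theta_true = np.radians(25)   # made-up arrival angle off boresight

# Simulated complex samples at the two elements (geometric phase delay only)
phase_delay = 2 * np.pi * d * np.sin(theta_true) / wavelength
x0 = np.exp(1j * 0.0)
x1 = np.exp(1j * phase_delay)

# Measure the phase difference and invert the geometry
dphi = np.angle(x1 * np.conj(x0))
theta_est = np.degrees(np.arcsin(dphi * wavelength / (2 * np.pi * d)))
print(f"estimated angle: {theta_est:.1f} deg")  # ~25 deg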
Exactly, was waiting for someone to say that.
And as for the wavelength requiring “huge pixels”: it all depends on the sensitivity of the amplifier. You can use antennas 1/8th or 1/16th (or whatever) of the wavelength and it’ll work, assuming you can boost the signal enough. Which is why those digital TV antennas are often silly little sticks instead of the big things they used to use for the same frequencies.
But when you think about it, visible light sensors work on another principle: they work by absorption, rather than like an antenna, which works by creating a current in a bit of conductor and amplifying that current. So now the question is, could you make a sensor like your eye has, but for radio frequencies? I know certain materials absorb the energy of radio signals, as is done with those microwave kilns, but can you turn that into pixels, tune it to a frequency, and then read it out? Over to you guys at MIT or Max Planck or RIT or even CERN or what have you; spend some of that research money. I think it’s worth researching.
But it would probably be used to power more horrendous NSA/Pentagon crap, dammit. (Even if Russia or China or Europe did the research.)
Quote:”But when you think about it, visible light sensors work on another principle”
Visible light consists of photons and they are converted to electrical energy anyway when they are ‘absorbed’ as you mentioned.
The real difference is the front end of the electronics behind the detector. An antenna is intended to receive only one transmission at a time, so the antenna most often forms part of a tuned circuit where resonance (or Q factor) is part of the detection (sensitivity), and because resonance is part of the process, wavelength becomes critical.
This is unlike SDR, where the whole range (bandwidth) of signals (channels) is amplified by a wideband amp and DSP filters out the required signal, i.e. resonance does not play a role and therefore a resonant wavelength is not required (see the toy sketch at the end of this comment).
The same applies to a wideband detector for two-dimensional detection: wavelength and resonance do not need to be primary considerations, because DSP can extract the required information from a wideband front-end sensor.
So in theory you could put a 2D optical sensor (a CMOS camera array) behind a black sheet of plastic, encase the whole detector in a Faraday-like filter to get rid of the unwanted signals, and wind the bias up so that radio signals trigger the CMOS. The tricky part is making the “lens” so that it uses nodal intermodulation to map a signal’s source location to its destination location on the sensor.
So this is a “do-able” thing, but I don’t know if you can get enough info from the low frame rates of the interface circuitry usually built into CMOS sensor arrays. It would be ideal to have a sensor array surrounded by FPGA (on-chip).
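To make the “wideband amp plus DSP” point concrete, here is a toy numerical sketch (not any particular SDR’s API): two carriers are captured at once, and one channel is pulled out purely in software by mixing it to baseband and low-pass filtering, with no tuned circuit anywhere:

import numpy as np
from scipy.signal import butter, filtfilt

fs = 1_000_000                    # sample rate of the wideband capture, Hz
t = np.arange(0, 0.01, 1 / fs)

capture = (np.cos(2 * np.pi * 100_000 * t) +    # wanted "channel" at 100 kHz
           np.cos(2 * np.pi * 250_000 * t))     # unwanted signal at 250 kHz

# Digital downconversion: mix the 100 kHz channel to DC...
baseband = capture * np.exp(-2j * np.pi * 100_000 * t)

# ...then the "DSP filter" step: a low-pass keeps only that channel.
b, a = butter(4, 20_000 / (fs / 2))
channel = filtfilt(b, a, baseband.real) + 1j * filtfilt(b, a, baseband.imag)

print("recovered channel amplitude ~", round(2 * np.abs(channel).mean(), 2))  # ~1.0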
Also, in radar you selectively illuminate with carefully constructed beams and intensively post-process your reception. It’s more like scanning with a laser in a 3D scanner than “seeing”.
You also use cm-wavelength EM, much of which is absorbed by the atmosphere over distance.
The highest-fidelity ones are also very, very large.
Very cool
Yeah, sure. Next thing you’re telling me the GUIs in movies don’t actually convey information.
I didn’t see in the article where they’re trying to “see” ham radio transmissions. If you stick to a shorter wavelength, like WiFi (~5 in), I don’t see why, in theory, you couldn’t build something of reasonable size. You might not want to carry it, but it wouldn’t need to be building-sized.
Put together an AESA similar to the radar mentioned above and see what develops.
This plus a VR headset? Hells yes.
http://hackaday.com/2015/02/17/mapping-wifi-signals-in-3-dimensions/
Precisely what I first thought of too!
If we could see radio waves the way we see visible light, everything would be a lot more transparent and fuzzy. Imagine the world looking like everything was made of various shades of glass, but being unable to focus sharply on anything.
Also, any RF sources would look like light sources. Your wifi router and cell phone would glow dimly, your microwave oven would look like a blast furnace, and a transmission tower would look like a lighthouse.
“Though if we could see those wavelengths, what’s the chance we have light speed vision too?”
What’s that supposed to mean, light speed vision?
Ditto. I was thinking “don’t we already have that?”
Though flicker detection really determines how ‘fast’ our eyes are. Welcome to the 50/60 Hz debate?
This is a new popular misconception. Your eyes work by the receptors receiving light and a chemical reaction occurring, and yes, there is a finite time for that to happen and a finite time for it to reset, a time that has been measured (I can’t recall it from memory), but both receptors and neurons simply need a certain time to do things and then reset. Or in other words, yes, there is a max “framerate” (or should I say pixel rate?) a receptor can handle.
And speaking of which, I saw a headline the other day saying a Canadian researcher found that Parkinson’s is neurons working overtime and then burning out.
Yes, and that response time determines how quickly we can see certain types of changes, like flicker. When I said the 50/60 Hz debate, I was referring to the power grid and lighting, not to 144, 120, and 240 Hz display refresh technologies.
I thought it was pretty clear he meant something equivalent to “we wouldn’t be able to perceive EM wavefronts”
Visible light consists solely of EM waves.
And we can’t see the wavefront of visible light either. At least, not without using stroboscopic effects q.v. http://web.media.mit.edu/~raskar/trillionfps/
er, or dealing with astronomical distances. We certainly can see the EM wavefronts in nebulae or in our own solar system.
I would expect that we would see varying levels of intensity as “rays” emitted from the antennas. I say this given that our existing visual ability is an integration of the light that hits our retinas over the given (1/refresh rate) of the human eye. Imagine a moving average filter with a coefficient that decays with respect to time; a signal is passed through it, effectively integrating the intensity over a given time period. (A toy sketch of that filter follows below.)
Antennas are roughly analogous to a point source at a reasonable distance. I say this only as an indication of the physical SIZE and the granularity of our vision, not in relation to the actual transmission mechanism a particular antenna/emitter employs. This is only a means of modelling.
Given the receptive area of our eyes, less than a square centimetre, we would see pulsating “dots” in our visual field. The pulsation would be approximately discrete, I expect. Think rounded square wave.
Just my thoughts on the matter
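For what it’s worth, a minimal sketch of that “moving average with a decaying coefficient” idea: a leaky integrator (exponential moving average) of instantaneous intensity, with a made-up 20 ms time constant standing in for the eye’s integration time:

import numpy as np

fs = 10_000                                      # samples per second (arbitrary)
t = np.arange(0, 1.0, 1 / fs)
signal = np.cos(2 * np.pi * 50 * t) * (t > 0.5)  # a source that switches on halfway

tau = 0.02                 # assumed 20 ms integration time
alpha = 1 / (fs * tau)     # per-sample decay coefficient

brightness = np.zeros_like(t)
for i in range(1, len(t)):
    # decaying average of instantaneous intensity (amplitude squared)
    brightness[i] = (1 - alpha) * brightness[i - 1] + alpha * signal[i] ** 2

print(round(brightness[-1], 1))  # settles near 0.5, the mean square of a unit cosine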
Just wanted to add to this conversation:
You would need a lens to narrow down the Field of View so that you can process the direction correctly.
Or some other way of processing direction like beam forming does for microphones.
You could do that by surrounding the array with a grounding shield except for a window, or, as with the AESA radar techniques mentioned above, you can use the reception timing between the elements to work out direction.
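A toy delay-and-sum (strictly, phase-shift) beamformer along those lines, with an eight-element line array and a simulated plane wave from 30 degrees; all the numbers are assumptions, just to show the timing-between-elements idea:

# Scan candidate angles and find the direction of maximum summed power.
import numpy as np

f = 2.4e9
wavelength = 3e8 / f
d = wavelength / 2                        # half-wave element spacing
n = np.arange(8)                          # element indices

theta_true = np.radians(30)               # made-up arrival angle
x = np.exp(1j * 2 * np.pi * d * n * np.sin(theta_true) / wavelength)

angles = np.radians(np.linspace(-90, 90, 721))
# Steer to each candidate angle and measure the summed power
steer = np.exp(-1j * 2 * np.pi * d * n[None, :] * np.sin(angles)[:, None] / wavelength)
power = np.abs(steer @ x) ** 2

print("peak at", round(np.degrees(angles[np.argmax(power)]), 1), "deg")  # ~30.0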
Are radio photons bigger than visible light ones? I know you can’t visualise features in a microscope smaller than some fraction of the wavelength of the light you’re using. So does that make a radio photon a big, spread-out thing across meters? Or, at least, would the resolution of radio-vision be on the order of meters for something with a wavelength of that size?
I think the problem here is that in order to capture a photon, you need to be able to capture one single wave of the wavelength you want to “see”. Visible light is in the nanometer range, whereas something like WiFi is in the 5 cm range, so you would need a sensor where one pixel is approximately the same size as one wavelength. If not, you’re not going to see enough of a change in energy for it to register: if you have a 2.5 cm pixel trying to capture a 5 cm signal, you’re only going to catch half a wave, which wouldn’t make enough of a change to trigger your sensor. For something where the wavelength is in the meters, you’d need a single pixel to match the wavelength. There was recently an X-ray telescope sent into orbit with a focal length of something like a hundred meters; they use a single pixel and then sweep it across the sky to generate the images.
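Setting the “size” question aside, one difference you can actually put a number on is photon energy, E = h * f. A quick comparison:

h = 6.626e-34    # Planck constant, J*s
eV = 1.602e-19   # joules per electron-volt

for name, f in [("green light (~540 THz)", 540e12),
                ("2.4 GHz Wi-Fi", 2.4e9),
                ("AM radio (~1 MHz)", 1e6)]:
    print(f"{name}: {h * f / eV:.1e} eV")
# roughly 2.2 eV, 1e-5 eV, and 4e-9 eV: individual radio photons are so feeble
# that radio gets detected as a wave (a voltage on an antenna), not photon by photon.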
I would like to point out, though perhaps someone already did, that if we could see in the radio spectrum it would look much like what we see now. Radio sources would look just like a light bulb, radiating energy in a sphere in all directions, with reflections showing which materials are opaque or translucent to radio. No difference, not counting color. Color does not actually exist as such; the wavelength received by your eye is translated by your brain, which supplies the color. The fact that nearly all people see colors the same way, I think, proves intelligent design, but that comparison is hard and true.