The Problem with Software Defined Radio

There’s a problem with software defined radio. It’s not that everyone needs to re-learn what TEMPEST shielding is, and it’s not that Bluetooth is horribly broken. SDR’s biggest problem is one of bandwidth and processing. With a simple USB TV Tuner, you can listen in on aircraft, grab Landsat images from hundreds of miles up, or sniff the low-power radios used in Internet of Things things. What you can’t do is make your own WiFi adapter, and you can’t create your own LTE wireless network. This is simply a problem of getting bits from the air to a computer for processing.

At HOPE last weekend, the folks behind the very capable LimeSDR and a new company working with Lime’s hardware laid out what software defined radio can do if you give the SDR a very fast link to a computer and add some processing on the SDR itself.

The key feature of the LimeSDR, and all boards derived from Lime Micro’s tech, is the LMS7002M. It’s a field-programmable RF transceiver with coverage from 100 kHz to 3.8 GHz, programmable IF filtering from 600 kHz to 80 MHz, and — this one is important — on-chip reconfigurable signal processing; on the LimeSDR board it’s paired with a fast USB 3.0 interface to a computer.

The Fairwaves XTRX

Aside from Lime, another company was also at HOPE showing off its latest SDR wares. Fairwaves was there with the XTRX, a software defined radio built around the same Lime Micro LMS7002M chip in a miniPCIe form factor.

This tiny card uses the same tech found in the LimeSDR with one key difference. Instead of a USB 3.0 port, the XTRX connects to a computer over PCI Express, sending data to RAM at 8 Gb/s. That’s fast.
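A quick back-of-the-envelope check shows what that link speed buys. This sketch assumes 12-bit I and 12-bit Q samples, a common SDR wire format; the actual XTRX framing may differ:

```python
# Rough sample-rate budget for an 8 Gb/s link, assuming 12-bit I/Q samples.
def max_sample_rate(link_bps, bits_i=12, bits_q=12):
    """Maximum complex sample rate the link can sustain, in samples/s."""
    return link_bps / (bits_i + bits_q)

rate = max_sample_rate(8e9)  # the XTRX's quoted 8 Gb/s to RAM
print(f"{rate / 1e6:.0f} MS/s")  # roughly 333 MS/s of complex samples
```

For comparison, the same arithmetic gives USB 3.0’s 5 Gb/s line rate about 208 MS/s before any protocol overhead, which is why the PCIe path matters.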

The miniPCIe form factor also has another interesting application. The folks at Fairwaves are working on putting this device in a miniPCIe to PCIe x1 adapter – that makes sense, it’s all the same signals, just a different form factor.

This also means you can run four XTRX boards with a yet-to-be-designed PCIe x16 adapter. Putting four of these SDRs in a single card means phased array antennas, 8×8 MIMO, and other techniques that make this massive SDR very interesting. The Fairwaves team only had a handful of these boards assembled, but when this goes on sale, you’ll be able to build a rig that blows the roof off the price/performance ratio of any other SDR.

In the talk presented at HOPE (not available independently of other talks yet, but starting 1:46:12 into this live recording), the folks behind the LimeSDR talked about the possible applications of this hardware. In a year or two, you’ll be able to build a portable 3G or 4G base station for about $2500. That’s an incredible advancement in the state of the art, and something that’s only possible because of on-chip processing and very fast access to a computer’s memory.

72 thoughts on “The Problem with Software Defined Radio”

      1. Ok, my ignorance does not allow me to understand why one would build a personal LTE station.
        I mean, I can imagine a standard GSM mobile network where there’s no signal coverage and you want to establish communication, or provide it in small villages, etc., but why LTE? Anyway, a walkie-talkie may be a cheaper solution there.
        The only thing that pops into my mind is that stuff the police did a few months ago somewhere in America (New Orleans?), where they were wandering around with a personal GSM station in a van to intercept communications, acting as the man in the middle.

        1. It’s not ignorance, it’s lack of imagination, Daryldee:

          1. As an amateur radio operator, to say, “I got this working!” and to use it. It’s the same as building anything else, but this is near-cutting-edge.

          2. As with the open-source OpenBTS, besides the above, there are places in the world that can’t afford it. This can give them LTE, especially since, at least in this country (USA), carriers are not only dropping 2G (CDMA / GSM) but are talking about dropping 3G (HSPA+ / EvDO) already as well! LTE is just *THAT* more efficient and lets people use a reasonably modern device as user equipment.

          3. Why not WiFi, then? Low range and lousy performance in busy networks. Remember WiMAX? That was SUPPOSED to be “long-range WiFi”. It couldn’t compete with LTE (either version). LTE is simply the best wide-area terrestrial high-speed data tech we have right now (2010’s and probably 2020’s).

      2. That same thread, again. {bored}
        So, as far as I can see, nobody really cared to try it IRL; it’s all just ranting about what’s technically possible.

        Here is the sad truth: the Ettus / BladeRF / HackRF SDR needed for your Osmocom BTS // IMSI catcher WILL FAIL EVERY FUCKING TIME. ( source? I’ve actually run the test )

        You can’t have a portable STINGRAY // IMSI CATCHER to pwn people on an open street.

        Ok: here is the very simple explanation everyone can understand: the output power of a typical USB-powered ( or powerbank-powered ) SDR will typically be under 100 mW:
        – HackRF: 50 mW ( lawl )
        – Ettus / bladeRF: 100 mW
        I’m not even opening the Pandora’s box that is antennas.

        Someone here would surely care to explain how you’re supposed to beat a regular street BTS’s output power with your toy? Because, you know, your phone will ALWAYS PICK THE STRONGEST BTS SIGNAL AROUND.

        So the only places you are able to pick up phones with your l33t package are INSIDE A FARADAY CAGE or an underground disco.


        FYI, a police stingray IS able to outpower a street BTS, but you have to power it with a car battery array; it has a 10 W RF amplifier built in, plus a funky van with huge antennas on it, and heatsinks to swallow that power output.
        Try to compete with this.

        1. @PsychoBilly, you are aware that the device is USB 3.0, so it can draw much more than 100 mA? And also, what stops you from adding an RF amplifier with the LiIon battery of your choice? You could easily get several watts from such a setup. Also, if you use a directional antenna you can concentrate the power on your “target” considerably. Furthermore, there are less urbanized areas where the towers are relatively far away and their signal is therefore weak. So it’s very easy to outpower a legitimate cell tower. By the way, there are several factors in a cell phone’s choice of cell; RSSI is not the only one.

          1. Obviously you don’t know what you’re talking about:

            – USB 3.0 is about data rate, not output power, which on most laptops peaks at around 150 mA @ 5 V ( 750 mW )

            – Additionally, the power bottleneck on RF boards isn’t the supplied amperage but the embedded RF amplifier, which will just BURN, so each manufacturer caps it at some nominal output power ( around 20 dBm ≈ 100 mW )

            – A power RF amplifier is a heavy beast that will vastly narrow your interception spectrum. So you would need several, along with various HUGE antennas ( remember? “don’t open the Pandora’s box of antennas” )
            Also: “several watts”, are you serious?? Do you know the output power of a regular street BTS? You might have a chance with a scooter battery, a well-tuned amplifier, and a huge antenna, maybe, which isn’t exactly stealthy nor portable.

            – I tried everything to get a hit on an urban street with an Ettus, and it failed miserably.
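Since this thread mixes dBm and milliwatts freely, a two-line converter keeps the numbers straight (100 mW is 20 dBm; a 10 W amplifier is 40 dBm):

```python
import math

def dbm_to_mw(dbm):
    """Convert power in dBm to milliwatts: mW = 10^(dBm/10)."""
    return 10 ** (dbm / 10)

def mw_to_dbm(mw):
    """Convert power in milliwatts to dBm: dBm = 10*log10(mW)."""
    return 10 * math.log10(mw)

print(mw_to_dbm(100))  # a 100 mW SDR -> 20.0 dBm
print(dbm_to_mw(40))   # a 10 W (40 dBm) amplifier -> 10000.0 mW
```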

          2. @PsychoBilly. I do know what I am talking about. A USB 3.0 port can deliver up to 100 W under the right conditions (not from a notebook), or up to 4.5 W in normal circumstances. A LiIon battery can easily deliver more Wh than a motorcycle sealed lead-acid battery. As I said, you could set up an external power amplifier that would not fry because it would be correctly designed. The antenna size is not important. As power decreases with d², you don’t need that much power unless the victim is significantly closer to the tower than to you. I am not making this up, so don’t be a troll.
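The inverse-square point can be made concrete with a free-space path loss sketch. The geometry below is entirely hypothetical (a 10 W tower 2 km away versus a 100 mW transmitter 50 m away, at 1.8 GHz), and it ignores fading, antenna gains, and cell-selection logic:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

f = 1.8e9
tower_rx = 40 - fspl_db(2000, f)  # 40 dBm (10 W) tower, 2 km away
sdr_rx = 20 - fspl_db(50, f)      # 20 dBm (100 mW) SDR, 50 m away
print(f"tower: {tower_rx:.1f} dBm, SDR: {sdr_rx:.1f} dBm")
```

In this made-up geometry the 100 mW radio arrives about 12 dB stronger, which is the “unless the victim is significantly closer to the tower” caveat in numbers.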

          3. I don’t know how this Stingray thread got started, but here is what I know as a ham radio and computer guy:

            1. USB has had both a battery charging spec (not much power) and power delivery (up to 20 V @ 5 A for, gee whiz, 100 watts) available since USB 2! But don’t take my word for it:


            2. This probably isn’t enough for a Stingray. The photo shows an omni-directional setup, which would require a lot more power. Also, the amplifier MUST be *VERY* linear due to both OFDM and high-order QAM (for LTE, anyway).

            3. There are forum commentaries in various places from cell company network engineers stating that any time one of these (or anything else, like someone trying to run their own base station) pops up, they instantly know about it. There is one particular comment from a Verizon guy who said that, when they were targeting a certain person (as opposed to the picture above, which would be used for blanket monitoring of protesters), VZW would send a message to force the phone onto the stingray.

            Again, while this could be used for a Stingray, I sincerely doubt that is the intent at all of any of this; that said, every tool can be used as a weapon, so…

      3. @marcus not quite. srsLTE lets you run up a User Equipment (UE). There is no eNodeB (“base station”) from SRS, yet. Present options for this include OpenAirInterface (F/OSS, albeit with FRAND patent licensing required for commercial use) and Amarisoft LTE (proprietary).

        The OAI folks demonstrated their stack running with LimeSDR, so you can have a LTE network-in-a-box for R&D use at a cost of $289 plus whatever you spend on the host computer. Say, maybe $600 all-in?

        Looking forward to having a Core i7 Intel NUC fitted out with an XTRX :o)

        Myriad-RF Community Manager

      1. yes, but that’s bound by the host interfaces. You’re right USB isn’t the perfect choice here, but to be honest, the problem is in the OS: even over PCIe and 10 Gigabit Ethernet, the time between a packet reaching the host computer’s hardware and being passed on to the user software (that does all the signal handling) is typically orders of magnitude larger than the pure bus latency.

        I honestly don’t see why this would be different for these devices than it is for USRPs – their PCIe driver might be a few µs cooler, but for USB, there’s pretty much nothing you can do.

        Also, as noted, there /are/ (multiple) existing LTE implementations (amarisoft, srs, openairinterface) that are available and already work with USRPs.

        1. You can either:

          1) Do it all in the FPGA. More design complexity, of course.
          2) Use a SoC where you have very low latency between the FPGA fabric and the CPU. However, most are based upon ARM, hence the CPUs are not the most powerful, but you can offload more processing into the FPGA fabric. Maybe Intel will put FPGA fabrics on their Xeons soon.
          3) Use a low latency interconnect like UltraPath. More complexity in the interface, and it might change with CPU generations.

      2. Amarisoft built a very capable LTE base station (eNB) using USRP N200 and N210 as the radio front-end. The I/Q processing as well as the full LTE stack (PHY, MAC, RLC, PDCP, RRC/X2AP/S1AP) are deployed on a standard x86 based server. By “very capable” I mean it can handle several hundred UEs (users). In LTE you have a TTI of 1ms so I’d say latency requirements are good here. See here:

      3. I donated quite a bit to the limeSDR campaign as it has huge potential for accelerating research, and plan to try a NIOS II soft CPU with FPGA based mapped kernel memory overlays. i.e. attempting to boot a minimal standalone linux kernel right on the front end itself to handle procedural parts of certain protocols.
        Note the CPU will be relatively low performance, but the io operations are very capable.

        Personal time limits what I can push into the repo, but the core functions should be made available eventually.

    1. Actually, my next laptop will probably have Thunderbolt for that reason: being able to plug in a 10GE card and do up to 300 MS/s complex short16 with a high-bandwidth SDR device.

  1. >With a simple USB TV Tuner, you can listen in on aircraft, grab Landsat images from hundreds of miles up…

    Can anybody please elaborate a bit more on receiving Landsat images? A few minutes with Google gave me no good results… :/

    1. Possibly meant imagery from NOAA weather satellites at 137MHz. It’d be pretty neat to get Landsat imagery, though! I haven’t heard of it being done.

        1. Storage isn’t the issue most of the time. A regular SSD or raid array can easily handle this throughput if all you care about is storing the data. The big issue is taking that data and then doing something with it. Moving the processing into the device itself is a huge improvement, because you’re only dealing with the decoded signals, not the raw signal data.

          1. Very true, but sometimes you do not have the processing power to do everything in realtime. A lot of people interested in Iridium split the capture and decoding functions. And then there is the SLC failure in Landsat 7, which makes its data slightly less desirable. And the 1.2 degree directional antennas, which make getting access to the signal a matter of physical location, which is a PITA.

    2. I suspect they meant NOAA. – just look at the data rates and frequencies for Landsat 7:

      2 omni-directional antennas (5W)
      telemetry data rates of 1.2 kbit/s and 4.8 kbit/s, and 256 kbit/s of playback data, 2 kbit/s of command data.
      frequencies of 2106.4 MHz (uplink) and 2287.5 MHz (downlink).

      3 steerable antennas (3.5W)
      75 Mbit/s per channel (total of 150 Mbit/s per antenna)
      frequencies: 8082.5 MHz, 8212.5 MHz, 8342.5 MHz
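Those downlink numbers also put the data volume in perspective. A sketch for a hypothetical 10-minute ground-station pass capturing one steerable antenna’s 150 Mbit/s:

```python
# Data volume for an assumed 10-minute pass at 150 Mbit/s.
rate_bps = 150e6
pass_seconds = 10 * 60
gigabytes = rate_bps * pass_seconds / 8 / 1e9
print(f"{gigabytes:.2f} GB per pass")  # 11.25 GB
```

That is far beyond anything a ~2.4 MS/s RTL-SDR dongle can even see, let alone demodulate, which supports the NOAA interpretation above.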

  2. As usual, HaD accurate articles … The LMS7002M *does not* have a USB 3.0 interface. It wouldn’t make any sense, because it is a chip meant for embedded applications where USB is not exactly common or even desired. USB is needed only for things interfacing to a personal computer.

    The chip has a proprietary digital interface and you need to roll your own interface logic – a fast CPU or an FPGA – if you want to talk USB to it. That’s what LimeSDR actually does – that big ass Altera FPGA is not there just for fun.

    1. WiFi is a local technology, and roaming doesn’t work as flawlessly; it’s not cell-based, so your APs overlap and interfere, and there’s a whole bunch of things beyond the pure transportation of packet data that make up a phone / mobile internet system. Also, WiFi will have a hard time accommodating high Doppler, really bad SNR, and loads and loads of users; in fact, WiFi’s multi-user capabilities are really bad … people complain about how they can’t make a phone call in a crowd of > 5000 people. Imagine 5000 WiFi devices associated to a single access point. They all try not to interfere by “listening, and if there’s nothing, asking, then hoping for luck”, whereas LTE and other mobile standards assign spectrum resources centrally – which at first doesn’t sound as democratic and cool as CSMA/CA, but if you consider that they squeeze a lot more out of the available spectrum, it becomes very clear that you just can’t scale self-organizing networks very well if two ends of the network can’t even hear each other, because the interference and attenuation between them is bad.

      1. By the way, that’s why I sometimes disagree with the Freifunk people, who try to bring free (as in both beer and speech) metropolitan access networks to the masses using WiFi.

        WiFi allows for choosing whether to send a packet in a “First I ask for permission with the access point, then I wait, then I get permission, then I wait, then I send” mode or “I hope this won’t be interfering with anything else” mode. The throughput for the first is very low if packets aren’t gigantic, and the second doesn’t scale well at all if more than a couple stations are active (if more than 10 people are sending 1/20th of the total time, how high is the chance they’ll interfere?) – which, in a crowded inner city, will almost certainly be the case.
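The parenthetical question has a tidy approximate answer. Assuming each of n stations transmits independently for a fraction p of the time (a crude model that ignores carrier sensing and backoff), the chance a given transmission overlaps someone else’s is:

```python
# P(collision) = 1 - (1 - p)^(n-1) for n independent stations with duty cycle p.
def collision_prob(n, p):
    """Probability that at least one of the other n-1 stations is also sending."""
    return 1 - (1 - p) ** (n - 1)

print(f"{collision_prob(10, 1/20):.0%}")   # 10 stations at 5% duty: ~37%
print(f"{collision_prob(100, 1/20):.0%}")  # 100 stations: ~99%
```

Even this optimistic model shows why contention collapses in a crowded city, and why centrally scheduled spectrum holds up better.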

        No, for access networks with lots of users, central assignment of spectrum resources is critical, though it would be very desirable if that could be more flexible than it is now – but 4G (i.e. LTE) took a very big step with regard to spectrum access granularity and hence bought a lot more bit/s per Hz – not to mention the coding and modulation advantages it has over 3G.

        1. I only sporadically read about SDR, so I don’t really know the topic. From your comments and from my knowledge of CSMA/CA based access protocols I understand the advantages of rolling your own LTE network instead of WiFi… but… isn’t it illegal? I mean, WiFi operates on unlicensed spectrum, LTE doesn’t, right?

          1. Lime Micro, from what I’ve read, have been making deals with some telcos. One major issue for telcos is erecting masts for coverage. Picocells and femtocells are all the rage these days to get around that pesky planning permission. My suspicion is that Lime may have future plans as middlemen to, at the owner’s option, install a telco binary base station and remunerate people a pittance for routing calls over SIP. So if you are in the middle of a desert you will get close to zero, and if you are in the middle of an extremely busy city in a spot that has poor coverage, you might make enough to fully pay for your LimeSDR + PAs + filters after a year. It will save operators site rental costs for masts.
            But honestly I’m just speculating here.

          2. Operating at any frequency is usually much less an issue of what communication standard you’re using than of whether you’re in a band that your device is licensed to operate in – i.e. telcos have paid literally billions to get the licenses for the 800, 900, 1800, 1900, … MHz bands that they have /exclusive/ rights to.

            In contrast, the 2.4 and 5 GHz bands are unlicensed, meaning that certain devices under certain limits can simply use them. Whether an SDR with LTE software behind it fulfills these criteria is up to individual legislation. LTE does have a specification in place that allows its operation in ISM bands, so this is totally an intended usage.

          3. @Truth: you make it sound like this is all future stuff!

            It’s not; there’s wide deployment of so-called pico-cells, which basically are indoor base stations that your telco bundles with your e.g. DSL access, so you (and other customers) get high-speed internet indoors, whilst the outdoor macrocells get a reduced load. Aside from you potentially sharing parts of your DSL bandwidth with others, that’s pretty much a win/win situation.

            So, yeah, the telco / telco-supplier market has always been one of the first to adopt SDR technology. In fact, any LTE basestation you meet (in fact, pretty much any basestation of the last decade, probably) is an SDR device – you’ll notice that LTE has new “releases” every few months; imagine if telcos had to replace base station hardware at that rate!

            Take the AD936x as an example: it’s a very popular RF frontend for SDR devices, used by the Ettus B2xx and E3xx devices, as well as numerous other self-built and commercial SDRs. If you look at its datasheet, you’ll notice a set of measurements that align closely with the desire to operate the device at ISM, 3G/4G, and mostly WiMAX frequencies – and I suspect WiMAX was the original intended application, but then that access technology didn’t really take off, and so Analog Devices looked for other markets.

            So yes, telco equipment manufacturers would definitely be a prime partner for Lime – which I guess is probably more interested in selling millions of their chipsets than thousands of their SDR devices. So, if you want to, look at the LimeSDR as a very interesting hybrid research/marketing/production project of an RF semiconductor company.

    1. TEMPEST = Telecommunications Electronics Material Protected from Emanating Spurious Transmissions. There are radio receivers sensitive enough to the EM leakage from your keyboard and screen that they can tell what you’re typing as you type it.

  3. One question from the Q&A made me smile –
    “You mentioned open source, are you working with such developers as SDR# and GnuRadio ?”
    One is closed source (and primarily focused on their own pretty good Airspy RX hardware) and one is fully open source.

    1. And the “other one” that is Open (for your normal “Maker”) is a ROYAL PAIN to install and get working properly – even on a Linux machine (AND it’s a one-way trip – forget about trying to easily and cleanly remove it should you need to upgrade).

        1. @Truth,

          Thanks for mentioning PyBOMBS – it’s included in my original comment. PyBOMBS tries to help resolve dependencies etc., but the problem is that it is a One-Way-Trip! Try getting rid of everything PyBOMBS installs (if you are lucky enough to get it to work in the first place). I can’t. This is unacceptable for me. I can’t afford to run the likes of PyBOMBS in a “virtual” instance; the CPU/GPU and memory requirements are too high for a usable VM. Regardless, why should we ever need to resort to a VM to install/test/use/upgrade software in Linux, especially if we want to uninstall everything cleanly without using a VM? Yeah, I know how to deal with logs, package managers, etc., to try to get my machine back to its pre-install state after a (failed) uninstall (and don’t forget to deal with your corrupted env settings – and more!).

          There is still a fundamental problem with Linux/Unix desktop OSes, and that’s the inability to properly, cleanly, and (most importantly) completely uninstall any and all software and its dependencies at will!

    1. No. USB 3.0 is 5 Gbit/s (small b, not capital B) and USB 3.1 is 10 Gbit/s. Practically, USB 3.0 is limited to about 390 MByte/s. My opinion is based on the highest measured speed I’ve seen (FX3).
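That practical ceiling translates directly into a sample-rate budget. Assuming the common 16-bit I + 16-bit Q wire format (4 bytes per complex sample):

```python
# Complex sample rate sustainable over a link, assuming 4 bytes per sample.
def usable_msps(link_bytes_per_s, bytes_per_sample=4):
    """Megasamples per second the link can carry."""
    return link_bytes_per_s / bytes_per_sample / 1e6

print(f"{usable_msps(390e6):.1f} MS/s")    # ~97.5 MS/s at the measured 390 MB/s
print(f"{usable_msps(5e9 / 8):.1f} MS/s")  # vs. the raw 5 Gb/s line rate
```

So even a well-behaved USB 3.0 host tops out well under the 8 Gb/s the XTRX claims over PCIe, before counting protocol overhead or flaky host controllers.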

      As far as “SDR’s biggest problem is one of bandwidth and processing” goes: a) can we just re-title this article to “NextGen SDRs bring higher host interface speeds”? And b) add a filter to HaD so I can exclude all Brian articles from my day-to-day browsing?

      1. That was a bit harsh, Alan.

        oops. I’m normally the guy to complain about Gb vs GB; my brain must have been on letting things slip mode.

        > based on highest measured speeds I’ve seen.

        Exactly my experiences, by the way; for example, take the B2xx: Frontend->FPGA->FX3->USB3->xHCI on host computer. In theory, rates shouldn’t be the problem, the FX3 isn’t overtasked, but I’ve yet to see any host controller that takes more than 50 MS/s reliably. 50 MS/s * (12b I + 12b Q) = 1.2 Gb/s in bog-normal USB bulk packets of max length. That should be the easiest thing for the host controller and its driver to process. In fact, it seems to be the case that some controllers and their drivers were optimized for storage devices and work better with those profiles, but many chipsets just “die” right on the spot if you try to put that many packets per second through them. They crash and disappear. Great thing to do for a host controller, isn’t it?

    1. It’s not quite that straightforward – you can only do fun things like MIMO if you can synchronize the oscillators of the SDRs and run them in lockstep together.

    1. It is funny, that was my exact same reaction on seeing the crapstore approach (made me wonder if the target market was only the gilded-cage iSheep folk). The problem is their sales pitch, in that it is a total turn-off to people who can program in C/C++ and create their own gateware. But since the hardware and software are open source, it is not really an issue. My one gripe would be them using Altium instead of KiCad for the PCB, but that should eventually be fixed: “In addition, a KiCAD recapture and layout is planned.”

    2. KiCAD does not deal as well with impedance controlled lines as Altium does. KiCAD is also not as good with multisheet design.
      But I imagine that the main reason for using Altium was that the engineers were familiar with it and could get the board to market faster that way.

      I have one LimeSDR on the way and I’m most interested in running LTE on the 2300MHz amateur band.
      2300-2450MHz is there just waiting to be used for something cool. >:)
      If the duplexers were smaller and more available, one might be able to build compact transverters to push GSM900/850 handsets with the 45MHz duplex spacing up to 2400-2450MHz. GSM2400 :D

  4. My problem (as just a casual SDR tinkerer, apparently lightyears behind you lot) was not having any hardware choices in the void between the $10 dongle and a $300+ board. But then, courtesy of some comments here, I found out about Airspy. So… thanks :-)

    If there are any other suggestions for lower-cost SDR options, for those of us who are not about building 4G base stations, thanks in advance. Especially HF, for us elderly shortwave/amateur radio holdouts.

    1. SDRPlay. There is matching software (SDRConsole), or you can use SDRSharp, SDRUno (a developing version of a formerly commercial program that the SDRPlay people now own and are giving away free) and various other packages under Windows or Linux.

  5. I understand the problem to be not getting data into the processor but rather OS latency. I assumed that the way they were able to get things like LTE working was by doing a lot of the processing in an FPGA and using the processor more as a supervisor than a workhorse. I don’t know if that is true. I believe the state-of-the-art solutions pair the SDR chip (Lime or AD9361, for example) with something like a Zynq SoC – a PicoZed SDR type of architecture. Low-latency activities seem best suited to gates in an FPGA.

  6. I’ve yet to see what I’d like, an SDR radio that interfaces with smartphones (including iPhones), letting them do a lot of the processing, and does PSK on the ham bands. That’d give us something that’d fit in a pocket and talk to the world.

      1. Thanks for the link, I might purchase one for watching local channels on camping trips.

        P.S. It would have been funny if you had put a comment about Stargate in here. ;)

          1. Haha, dammit. There are a few stations left, but they are few and far between.
            I have a friend that got a ‘great deal’ on a smartphone that could receive analog television. I tried to warn him…

        1. My experience is that a cheap USB RTL-SDR on my N900 (Maemo 5) is pretty weak on 2m and even worse on the air band. For example, sitting in the airport FBO I can barely read small aircraft coming out of the ramp, though the tower comes through fine, and that is with a 1/2 wave wire antenna cut for the center of the air band.

    1. “Fits in your pocket and talks to the world” – where would your antenna be? I could understand talks to your city, or even talks to your country. But you will need a big antenna and a lot of power to talk to the world. You could probably drop the bit-rate (à la WSPR) to get more range with lower power, but you would still need that big antenna.
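The WSPR aside deserves numbers. Receiver thermal noise scales with bandwidth (kTB, i.e. -174 dBm/Hz at room temperature), so shrinking the signal bandwidth directly buys link margin. A hypothetical comparison of a 2.4 kHz SSB voice channel against WSPR’s roughly 6 Hz occupied bandwidth, assuming a 6 dB receiver noise figure:

```python
import math

def noise_floor_dbm(bandwidth_hz, noise_figure_db=6):
    """Thermal noise floor: -174 dBm/Hz + 10*log10(B) + NF."""
    return -174 + 10 * math.log10(bandwidth_hz) + noise_figure_db

ssb = noise_floor_dbm(2400)  # a voice SSB channel
wspr = noise_floor_dbm(6)    # WSPR's narrow tone
print(f"extra margin: {ssb - wspr:.1f} dB")  # about 26 dB
```

That ~26 dB is why milliwatt-level WSPR beacons are heard worldwide while the same power on voice goes nowhere, though the antenna problem doesn’t go away.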

  7. Our team uses the PCIe interface between the SDR and the Ubuntu host. Using Amarisoft LTE (eNodeB + EPC), we get up to 118Mbps with an iperf test at 20 MHz bandwidth and 2 x 1 MIMO configurations over-the-air. We are able to lock the oscillators frequency for MIMO, but also calibrate the phase (less than -+ 0.25 degrees at 2.4 GHz) on many transmitters and receivers for beamforming (LTE-A).

  9. My interest in SDR is for amateur radio use – the cost of most transceivers is way beyond working people in many cases, and impossible for youngsters to afford. I want a simple SDR transceiver that uses external power amplification and switching, so I can work the HF and VHF amateur bands without having to find thousands of dollars, pounds, euros, or whatever. Most mobile phones, Arduinos, and Raspberry Pis can process the standard AM, FM, SSB, and some digital modes, so there is a market for equipment that splits the processing onto one device while another does the simple radio frequency handling. The idea of portable access nodes is dealt with by directional aerials bouncing signals off satellites, balloons, the moon, etc. Ham radio has done this for decades. And we have repeaters, crossband included, and nodes via the Internet to remote transceivers. So why reinvent the wheel? Build cheaper. Not more me-too stuff that’s been done to death.
