Wifibroadcast Makes WiFi FPV Video More Like Analog

Normal WiFi is not what you want to use to send video from your quadcopter back to the first-person-view (FPV) goggles strapped on your head, because it’s designed for 100% correct, two-way transmission of data between just two radios. Analog video transmission, on the other hand, is lossy, one-way, and one-to-many, which is why longer-range FPV flights still tend to use old-school analog video.

When you’re near the edge of your radios’ range, you care much more about getting any image in a timely fashion than about getting the entire video sequence correctly after a delay. While WiFi is retransmitting packets and your video is buffering, your quadcopter is crashing, and you don’t need every video frame to be perfect in order to get an idea of how to save it. And finally, it’s just a lot easier to optimize both ends of a one-way transmission system than it is to build antennas that must receive and transmit symmetrically.

And that’s why [Befinitiv] wrote wifibroadcast: to give his WiFi FPV video system some of the virtues of analog broadcast.

In particular, two Raspberry Pis combined with WiFi radios that can be put into monitor mode let him tailor exactly what packets get sent. His rig sidesteps WiFi’s acknowledgment scheme, adds a custom retransmission routine that helps limit lost packets, and even lets multiple receivers listen in on the same signal so that a diversity reception scheme can be implemented.
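To make that a little more concrete, here’s a rough sketch of what raw injection through a monitor-mode interface can look like with libpcap. This is our illustration rather than [Befinitiv]’s actual code: the interface name “mon0”, the addresses, and the frame layout are placeholders. The idea is simply that you hand the driver a radiotap header plus an 802.11 data frame addressed to broadcast, so no receiver ever sends an ACK back.

    /* Sketch: inject one raw 802.11 frame through a monitor-mode interface.
     * Build with: cc inject.c -lpcap
     * Interface name, addresses and payload are placeholders. */
    #include <pcap.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char errbuf[PCAP_ERRBUF_SIZE];
        pcap_t *ppcap = pcap_open_live("mon0", 800, 1, 20, errbuf); /* "mon0": assumed monitor-mode interface */
        if (ppcap == NULL) {
            fprintf(stderr, "pcap_open_live failed: %s\n", errbuf);
            return 1;
        }

        /* Minimal radiotap header: version 0, length 8, no optional fields. */
        static const uint8_t radiotap_hdr[8] = {
            0x00, 0x00,             /* version, pad */
            0x08, 0x00,             /* header length, little endian */
            0x00, 0x00, 0x00, 0x00  /* present flags: none */
        };

        /* Minimal 802.11 data frame header; the receiver address is broadcast,
         * so the 802.11 ACK machinery never kicks in. */
        static const uint8_t ieee80211_hdr[24] = {
            0x08, 0x00,                         /* frame control: data frame */
            0x00, 0x00,                         /* duration */
            0xff, 0xff, 0xff, 0xff, 0xff, 0xff, /* addr1: broadcast */
            0x13, 0x22, 0x33, 0x44, 0x55, 0x66, /* addr2: arbitrary transmitter */
            0x13, 0x22, 0x33, 0x44, 0x55, 0x66, /* addr3 */
            0x00, 0x00                          /* sequence control */
        };

        const char payload[] = "one chunk of video would go here";
        uint8_t frame[1500];

        memcpy(frame, radiotap_hdr, sizeof(radiotap_hdr));
        memcpy(frame + sizeof(radiotap_hdr), ieee80211_hdr, sizeof(ieee80211_hdr));
        memcpy(frame + sizeof(radiotap_hdr) + sizeof(ieee80211_hdr),
               payload, sizeof(payload));

        int len = sizeof(radiotap_hdr) + sizeof(ieee80211_hdr) + sizeof(payload);
        if (pcap_inject(ppcap, frame, len) != len)
            fprintf(stderr, "pcap_inject failed\n");

        pcap_close(ppcap);
        return 0;
    }

The receive side is the mirror image: open a monitor-mode interface, filter on the chosen transmitter address, and reassemble the payloads, with the retransmission scheme making up for whatever the air eats.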

If you’re flying WiFi video on planes or ‘copters, you should give wifibroadcast a look, and check out [Befinitiv]’s 3km WiFi video trial for inspiration.

23 thoughts on “Wifibroadcast Makes WiFi FPV Video More Like Analog”

    1. Yes, but with an RX antenna you don’t need to be concerned about the antenna reflecting power back into itself. On HF, some of the best RX antennas are terrible TX antennas (the long wire, for example). I don’t know of any examples of microwave antennas that work better for RX than TX, but there may be some.

      On an RX only antenna you can be less concerned about getting a perfect match (low SWR) and focus on getting more gain.
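      To put rough numbers on that, here’s a quick sketch of the standard mismatch-loss arithmetic (generic textbook formulas, nothing specific to this project). Even a 2:1 SWR only costs about half a dB, which is why trading a perfect match for more gain can make sense on the receive side.

          /* Mismatch loss for a few SWR values; build with: cc swr.c -lm */
          #include <math.h>
          #include <stdio.h>

          int main(void)
          {
              const double swr[] = { 1.0, 1.5, 2.0, 3.0 };
              for (int i = 0; i < 4; i++) {
                  double gamma = (swr[i] - 1.0) / (swr[i] + 1.0);     /* reflection coefficient */
                  double loss  = -10.0 * log10(1.0 - gamma * gamma);  /* mismatch loss in dB */
                  printf("SWR %.1f:1 -> %.2f dB lost to mismatch\n", swr[i], loss);
              }
              return 0;
          }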

      1. In all cases one desires ideal power transfer, and thus good conjugate impedance matching (low SWR), on both the receiving AND the transmitting side. The main thing that may differ is the radiation pattern. On the TX side an antenna with an isotropic pattern is desired, the cloverleaf being one of the closest approximations. On the RX side a directional antenna is desirable, depending on how motivated you are to keep pointing the thing.

  1. Looks like the video encoding end uses mostly default settings, which will not suit all use cases. One can get more out of a lossy transport stream by using MPEG data partitioning and tweaking the number of P/B frames, plus other encoding settings.

    1. I believe you’re right. [Befinitiv] did mention upping the key-frame rate, which makes the video recover faster after a big glitch, but otherwise I don’t think he’s co-optimizing video encoding/transmission parameters with the network stuff.

      If you know a bunch about this, and want to help out, I’m sure that the project can use you.

    1. I know of a project that did exactly this for the control channel (from RC transmitter to copter).

      One of the main advantages of both of these projects is that you can use radio modules that sell in the millions instead of RC-specific products, which means lower cost and higher quality.

        1. We will need a progressive H.264 encoder that processes video as it comes in and produces output data every 4+? lines (the 4×4 quantization block in H.264, plus some buffer for motion vectors) to get latency comparable to analog systems. Current encoders deal in whole frames: you first wait a whole frame for the camera picture buffer to fill, then another frame plus encoder processing time, then you send the whole thing (size/link speed), and then you decode the whole thing at once, adding even more delay.

          AFAIR OnLive, the failed and very stupid idea of remote game clients, had some solutions (and probably some obvious software patents) dealing with progressive encoding/decoding of a video stream.

          BTW, 120-150 ms is about the same latency you get out of a Mobius (or GoPro) if you use it for FPV and record video at the same time. It’s not terrible, just too high for acrobatics and racing, but perfectly fine for strolls over the park or what have you.

          1. At 60 fps this is not a big deal, as 1 s/60 is only about 16 ms, so waiting for the whole frame adds only ~16 ms. Actually, if you were displaying in async mode, you would probably see quite bad tearing in the video, just as you do on your PC if you disable V-sync. I believe the main root cause of the latency is the nature of H.264, and that MJPEG, or H.264 with a key frame in every frame (which is very similar to MJPEG), would be much better.
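            For reference, here is a back-of-the-envelope sketch of where the milliseconds go in a whole-frame pipeline; every figure in it is an illustrative assumption, not a measurement from this project.

                /* Rough whole-frame latency budget at 60 fps; all numbers are assumptions. */
                #include <stdio.h>

                int main(void)
                {
                    double fps            = 60.0;
                    double frame_ms       = 1000.0 / fps;  /* ~16.7 ms to scan out one camera frame */
                    double encode_ms      = frame_ms;      /* assume the encoder needs about one frame time */
                    double bits_per_frame = 4e6 / fps;     /* assumed 4 Mbit/s stream */
                    double link_bps       = 12e6;          /* assumed usable air rate */
                    double transmit_ms    = bits_per_frame / link_bps * 1000.0;
                    double decode_ms      = frame_ms;      /* decoder also works a whole frame at a time */

                    printf("capture %.1f + encode %.1f + send %.1f + decode %.1f = about %.0f ms\n",
                           frame_ms, encode_ms, transmit_ms, decode_ms,
                           frame_ms + encode_ms + transmit_ms + decode_ms);
                    return 0;
                }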

    1. With normal Ethernet, you can cut the RX wires in the cable and have a TX-only cable going into a secure, logging-only server. Because there is no possibility of retransmitting faulty packets, UDP must be used with this configuration. TCP would never be able to create a socket connection, because the three-way handshake (SYN, SYN-ACK, ACK) would time out every time on the second part of the handshake:
      http://www.inetdaemon.com/img/internet/3-way-handshake.gif

      UDP is required, but putting a WiFi chip into injection mode is not quite as simple as snipping a differential pair of wires in a Cat5/Cat6 cable.
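      For what it’s worth, a one-way UDP sender really is just a fire-and-forget sendto() with no handshake anywhere; a minimal sketch (address and port are placeholders):

          /* Minimal one-way UDP sender: no connection, no ACK, no retransmission. */
          #include <arpa/inet.h>
          #include <netinet/in.h>
          #include <stdio.h>
          #include <string.h>
          #include <sys/socket.h>
          #include <unistd.h>

          int main(void)
          {
              int sock = socket(AF_INET, SOCK_DGRAM, 0);
              if (sock < 0) {
                  perror("socket");
                  return 1;
              }

              struct sockaddr_in dst;
              memset(&dst, 0, sizeof(dst));
              dst.sin_family = AF_INET;
              dst.sin_port   = htons(5600);                       /* placeholder port */
              inet_pton(AF_INET, "192.168.1.10", &dst.sin_addr);  /* placeholder receiver */

              const char chunk[] = "one video chunk";
              /* Succeeds locally whether or not anything is listening on the far end. */
              if (sendto(sock, chunk, sizeof(chunk), 0,
                         (struct sockaddr *)&dst, sizeof(dst)) < 0)
                  perror("sendto");

              close(sock);
              return 0;
          }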

    2. It’s lower level than that – you don’t even connect to a network. So it’s more like the beacon packets that allow you to see what networks are available, but used to carry arbitrary data instead.

  2. Excellent stuff, and I think everybody owes a big thank you to Befinitiv. Not just for hacking this together but also for not going all protective and shit like that.
    Oh and making clear instructions too, imagine that.
    It all seems so normal, but we know it isn’t when we think about it.

  3. Might be worth looking at the new HD-over-coax video solutions in the security market, which are based on NTSC signaling and currently do up to 1080p over typical analogue coax cables at the same bandwidth (or thereabouts) as NTSC.

  4. There is actually a standard 802.11 mode for this: it is called OCB (Outside the Context of a BSS). It turns 802.11 into a simple broadcast radio. Support is in the Linux kernel.
