
(True) Unidirectional Wifi broadcasting of video data for FPV

January 25, 2015

This post shows how to broadcast data over 802.11 devices with a true unidirectional data flow. There is no need to be associated with a network (thus no risk of being disassociated) and no acknowledgements are sent from receiver to transmitter. Please note that this is ongoing work and just a proof of concept.

EDIT: For a complete overview, refer to this page:


My plan for this spring: Fly FPV! There are already a lot of devices on the market for transmitting analog video signals over 2.4GHz or 5.8GHz. But to me that always seemed a bit outdated. Of course, several people thought that too and tried to send their video data over Wifi. A good example for doing so:

“Sparky flight” took a RaspberryPi, encoded the video stream as h264 and sent it over Wifi to a PC. He was able to get a glass-to-glass latency down to 85ms, which is quite nice! But all Wifi solutions have the same problem: When you lose your Wifi connection, you are immediately blind. That is of course not acceptable in an FPV setup. This is where the old analog link has a really big advantage: When you are getting out of range, the signal degrades slowly. You would still have time to react and turn your plane around.


So I thought: Wouldn’t it be possible to achieve the same advantage of a slowly degrading link over Wifi? Not out of the box, but I’ll show you how.


The basic approach is: The video transmitter sends its data as a true broadcaster into the air, regardless of who is listening. The receiver listens all the time and receives the data whenever it is in range of the transmitter. When it starts getting out of range it will no longer receive every packet, but still some of them. This behaviour is comparable to that of an analog signal path.

The main problem is that the Wifi standard does not support such a mode. Devices always need to know to whom they are sending their data. This relationship is created by the “Wifi association”. If your PC is associated with your router, both devices know to whom they are talking. One of the reasons for this association is to make the data transfer more reliable. A receiver of a packet always acknowledges the reception to the transmitter. If no acknowledgement has been received, the transmitter has the chance to re-transmit the packet. Once they lose their association, they cannot exchange data anymore.

The Wifi ad-hoc mode comes pretty close to an unassociated “broadcast style” way of transmitting data. Unfortunately, it seems as if modern 802.11 standards no longer support ad-hoc mode properly. If you buy an 802.11ac card and put it into ad-hoc mode, it most likely falls back to 11 Mbps. The standard does not require ac rates to be supported in ad-hoc mode 😦


To solve this issue I wrote two small programs that serve as a raw transmitter and a raw receiver. They are pretty much hacked together out of a program called “packetspammer”.


You can find my programs here:


After compiling the sources with “make”, you’ll have two programs called “tx” and “rx”. Both take as a mandatory argument a wlan interface that has been put into monitor mode. The tx program reads data over stdin and sends it in raw 802.11 packets into the air. On the other side, the rx program listens on a device and outputs received data to stdout. The packets of the transmitter are recognized by their fake MAC address (13:22:33:44:55:66). The packets only contain a valid 802.11 header (so that they are not rejected by the wifi card); the rest of the packet is filled with raw data.
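To make the packet format a bit more concrete, here is a minimal sketch of how such a raw frame could be assembled. This is my own illustration, not the actual tx source: the radiotap flags, the frame-control bits and the exact layout are assumptions; the real code (derived from packetspammer) may differ. The assembled buffer would then be handed to something like pcap_inject() on the monitor-mode interface.

```c
#include <stdint.h>
#include <string.h>

/* Minimal radiotap header: version 0, length 8, empty present-flags
   bitmap. The driver needs this prefix to accept an injected frame. */
static const uint8_t radiotap_hdr[] = {
    0x00, 0x00,             /* version, pad */
    0x08, 0x00,             /* radiotap header length = 8 (little endian) */
    0x00, 0x00, 0x00, 0x00  /* present-flags bitmap: nothing present */
};

/* A valid-looking IEEE 802.11 data header carrying the fake address
   13:22:33:44:55:66 so that the receiver can recognize our packets. */
static const uint8_t ieee80211_hdr[] = {
    0x08, 0x01,                         /* frame control: data frame */
    0x00, 0x00,                         /* duration */
    0xff, 0xff, 0xff, 0xff, 0xff, 0xff, /* addr1: broadcast */
    0x13, 0x22, 0x33, 0x44, 0x55, 0x66, /* addr2: fake source MAC */
    0x13, 0x22, 0x33, 0x44, 0x55, 0x66, /* addr3: same fake MAC */
    0x00, 0x00                          /* sequence control */
};

/* Assemble radiotap + 802.11 header + raw payload into buf,
   returning the total frame length. */
size_t build_packet(uint8_t *buf, const uint8_t *payload, size_t len)
{
    size_t off = 0;
    memcpy(buf + off, radiotap_hdr, sizeof radiotap_hdr);
    off += sizeof radiotap_hdr;
    memcpy(buf + off, ieee80211_hdr, sizeof ieee80211_hdr);
    off += sizeof ieee80211_hdr;
    memcpy(buf + off, payload, len);
    return off + len;
}
```

Everything after the 802.11 header is opaque payload; the card transmits it as-is, which is exactly why the receiver only has the fake MAC to filter on.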
Following is an example of how to use the programs:



# Receiver side:
sudo ifconfig wlan0 down
sudo iwconfig wlan0 mode monitor
sudo iwconfig wlan0 channel 1
sudo ifconfig wlan0 up
sudo ./rx wlan0


# Transmitter side:
sudo ifconfig wlan0 down
sudo iwconfig wlan0 mode monitor
sudo iwconfig wlan0 channel 1
sudo iwconfig wlan0 rate 54M
sudo ifconfig wlan0 up
sudo ./tx wlan0

Everything you type into the tx shell should now appear on the rx shell. The tx program also accepts as parameters the maximum length of the packets and the number of retransmissions. A retransmission rate of 3, for example, will cause the transmitter to transmit each packet three times. This increases the chances that the receiver receives one of them correctly. To avoid the data being delivered three times, each packet contains a 32-bit sequence number. If a sequence number is received more than once, the subsequent packets with the same sequence number are ignored. I admit that this type of redundancy is rather stupid. The problem is that most Wifi cards completely discard packets with a wrong FCS (frame check sequence), so the more classical approaches to redundancy (hamming codes, …) are not so easy to use. There is definitely still some work to do here! Feel free to participate 🙂
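The receiver-side duplicate filter can be sketched roughly like this (a simplified illustration, not the actual rx code; the real program may track sequence numbers differently):

```c
#include <stdint.h>

/* Every packet carries a 32-bit sequence number; a packet is only
   accepted if its number is newer than the last accepted one.
   The signed difference makes the comparison robust against the
   eventual wrap-around of the 32-bit counter. */
static uint32_t last_seq;
static int have_seq;

int accept_packet(uint32_t seq)
{
    if (have_seq && (int32_t)(seq - last_seq) <= 0)
        return 0; /* retransmitted duplicate or stale packet: drop */
    last_seq = seq;
    have_seq = 1;
    return 1;     /* first copy of this packet: pass its data on */
}
```

With a retransmission count of three, the first copy of each packet returns 1 and the two retransmissions return 0, so the output stream contains each payload only once.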


Writing text over a true broadcast connection is nice. But how about video? Actually, it is really simple. GStreamer is a nice tool for this purpose. My test setup looks as follows:


(usb webcam) <—> (raspberry pi) <—> (wifi dongle)  ~~~~airgap~~~~> (wifi dongle)<—>(PC)


On the raspberry pi I execute:

gst-launch-1.0 -v v4l2src ! 'video/x-raw, width=640, height=480, framerate=30/1' !  omxh264enc target-bitrate=500000 control-rate=variable periodicty-idr=10 ! h264parse ! fdsink | sudo ./tx -r 2 -f 1024 wlan0


In words: Receive raw video from V4L2, transform it into 640×480 at 30fps, encode it as h264 at 500kbit/s with an increased keyframe rate (this helps if packets get dropped), and write the video data directly (without a container) to stdout. The video data is then piped into the tx program with two retransmissions and a packet length of 1024 bytes.
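As a rough sanity check, my own back-of-the-envelope estimate of the on-air bandwidth these settings need (the 32-byte header overhead is an assumed figure, not taken from the tx source):

```c
/* Estimate the on-air bitrate for a given video bitrate, number of
   transmissions per packet (-r 2 above means each packet is sent
   twice) and payload size (-f 1024 above). header_bytes is an
   assumed per-packet radiotap + 802.11 overhead. */
double on_air_bps(double video_bps, int transmissions,
                  int payload_bytes, int header_bytes)
{
    /* packets generated per second by the encoder output */
    double packets_per_s = video_bps / 8.0 / payload_bytes;
    /* each packet goes on air 'transmissions' times, headers included */
    return packets_per_s * transmissions
           * (payload_bytes + header_bytes) * 8.0;
}
```

For the pipeline above, on_air_bps(500000, 2, 1024, 32) comes out at roughly 1.03 Mbit/s, comfortably below the 54M rate configured earlier, so the channel capacity itself is not the bottleneck in this setup.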




And to receive and display it on my PC:

sudo ./rx wlan0 | gst-launch-1.0 fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false


In words: Receive data on interface wlan0 and pipe it into GStreamer. GStreamer receives the data, parses the raw h264 stream, decodes it and displays the video.





The video quality is ok, maybe a bit too low for actual flying. With the settings above the latency is acceptable, maybe between 100 and 200ms. I noticed that increasing the encoder bitrate also increased the latency, but I still need to look into that. I think that by using the original raspi-cam it should be possible to achieve the ~100ms of the “Sparky Flight” guy.

Dropped packets turn out to behave as expected. The video image is partly disturbed but continues to run. A rough estimate: Up to a loss of 5-10% of the packets the video should still be usable to rescue your plane. See below an example of a transmitted video with a packet loss rate of approximately 2.5%:

Unfortunately I wasn’t able to change the power of the transmitted packets. There is a field in the radiotap header which I have set, but it seems to be ignored. Otherwise my solution would be perfect for FPV. You could (as a Bolivian of course 😉 ) buy one of these cheap Wifi cards with 1W output power and have an extremely cheap long distance video link (and since this is a true unidirectional link, you would only need a single high power card in your plane). You could also use the Raspi to gather and transfer GPS information or battery capacity live. Of course this would then be realized as a side-channel and written into the image on the receiving device, in contrast to those (shitty) analog devices that write directly onto the transmitted image…

If you are interested in participating, please share your experiences in the comments, take my code, modify it, improve it 🙂 My gut feeling is that there are only a few small things left to do for a true digital (possibly HD) FPV system with low cost equipment.

My next step: Make some range experiments…



  1. Hi befinitiv,

    I’ve had a quick read of your post and it looks interesting. I have a couple of initial thoughts.
    1. How does it handle a 720p HD stream? Finding a low latency HD solution is my main aim. For an SD stream, which would include 640×480, current analogue systems still offer the lowest latency solution.
    2. Have you thought of using MJPEG for the stream? As each frame would be complete, you wouldn’t need to send them three times.

    I’ll try to get it compiled and tested in the next couple of weeks.


  2. Hi Gary

    At the moment I do not have a camera that supports higher resolutions. I think I’ll buy a Raspberry camera shortly. You are right, concerning latency we won’t win against analog SD. But I like the idea of using the Raspberry also for telemetry.

    MJPEG might be worth trying. Do you have experience with that format in terms of bandwidth required compared to h264?

    It seems as if you have already used your setup outside of your lab. What kind of 802.11 hardware do you use for that? (B, G, N or AC?) What is your experience of the point where you get out of range? Does the video stream break abruptly or are you warned by errors in the transmission?

    Best regards,

  3. Andy permalink

    Hey befinitiv

    very interesting article. For a few days I haven’t been able to get rid of the idea of doing FPV over Wifi, and I also had this idea of doing a unidirectional broadcast. So I found your page 😉 I think you’re right that you are getting better results with the original Raspberry Cam (see ). Please tell me your results of the range test, that would be very interesting.

    This is also a very nice video for that topic:

    Best regards

  4. Very nice progress! Have been thinking of this for a few years now after reading this article (2009) 🙂

    All my efforts for the last months have been on AvrMiniCopter.

    Let me know if you need any assistance. I’m really looking to get HD video

    • Hi Gregory

      You are right, the idea is basically the same as in the post. Speaking of assistance: Do you have any USB wifi dongles lying around? If so, maybe you could measure their injection rate by using my tx tool (it reports the rate). To do so, just execute “cat /dev/urandom | sudo ./tx -f 1024 wlan0”. Try to do that on a not so busy channel because busy channels will lower the injection rate. You should also try to set the card’s tx rate to 54M (“sudo iwconfig wlan0 rate 54M”) and check with a second card whether you are really sending at 54M. Many drivers ignore that setting 😦 I have a post in preparation where I’ve tested several cards using that method. But I only have a few models available, so assistance in that respect would be great!

      Best regards,

      • Gregory permalink

        Hi befinitiv,

        Sorry for the delay. I’ve just tried it and I can send some data, but it stops after a second:
        Raw data transmitter (c) 2015 befinitiv GPL2
        Trouble injecting packet#

        What could be the cause?


        You might want to send me a response on my email: g r e g d 7 2 0 0 2 (at) g m a i l . com

    • Hi Gregory

      Do you know the chipset of your card? Was it in monitor mode when you made your test? Some network managers like to fiddle around with that, so you have to make sure that they are disabled.


  5. Hi befinitiv

    Great work here! Have you tried receiving the video on a mobile device like an iPhone or iPad? Is that possible?


  6. Roberto Sale permalink

    Hi Befinitive, thanks for your help!

    When I use sudo ./tx -r 2 -f 1024 wlan0
    I can see the keyboard input on my RX (previously putting the interface into monitor mode, of course).
    But when I use:

    gst-launch-1.0 -v v4l2src ! 'video/x-raw, width=640, height=480, framerate=30/1' ! omxh264enc target-bitrate=500000 control-rate=variable periodicty-idr=10 ! h264parse ! fdsink | sudo ./tx -r 2 -f 1024 wlan0

    returns this:

    ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
    Additional debug info:
    gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
    streaming task paused, reason error (-5)
    Trouble injecting packet: send: No such device or address

    And wlan0 stops being in monitor mode and automatically becomes managed when I enter that command!

    Can you help me please? I’m using Raspbian on an RPi 2, with an rtl8187 wireless adapter and a C270 webcam.


    • Roberto Sale permalink

      This command works ok, it returns the stream in raw form:
      gst-launch-1.0 -v v4l2src ! 'video/x-raw, width=640, height=480, framerate=30/1' ! omxh264enc target-bitrate=500000 control-rate=variable periodicty-idr=10 ! h264parse ! fdsink

      sudo ./tx wlan0
      works ok too, but together they give me the error from my previous post :/

        Strange! Did you try to kill the ifplugd process? This one usually is responsible for changing the interface back to managed mode. You might also take a look at dmesg. Some Raspis have problems providing enough power to a wifi card. If you see there that the card made a “reconnect”, you should try using an externally powered hub. One of my commenters also noted that you can increase the current capabilities via a config file under /boot. But since I only have the A+ I don’t know anything more about that 😉

        Please note that the RTL8187 performs very poorly with wifibroadcast (refer to ). At best you would achieve a throughput of 500kbps with that chip.

  7. Roberto Sale permalink

    Thanks again, befinitiv! Today I bought a TP-Link WN722N and got the link working. With the 8187 I needed to run iwconfig wlan0 rate 54M; that was the former problem.
    The TP-Link works flawlessly. The only problem I have is latency. I think it is on the order of 400ms. Does the receiver wifi adapter have something to do with that?
    Thanks for everything!

  8. Constantin permalink

    First (I think I already heard about it, but I’m not sure):
    I played a little bit with hello_camera (an example using OpenMAX to control the camera) and I was surprised how similar the code looked to GStreamer code (pipes, pads).
    In GStreamer this functionality is available; does someone know if the following exists in OpenMAX: a pad which changes your fps (e.g. takes 30fps input and makes 60fps output)? You could simply put this between the camera source and the encoder and get higher fps, which leads to lower latency. Of course the image data wouldn’t be more (e.g. the stream would contain each frame two times), but latency is nearly everything 🙂

  9. Constantin permalink

    I wanted to say thank you for wifibroadcast, it definitely makes the link more reliable for long range fpv.
    I was using the rpi hardware for half a year before, with a normal network wifi hotspot link, and I did a lot of research and testing about latency that I want to share here, too.
    1) H.264 has 3 types of frames: I, P and B frames. The problem are the B-frames (bidirectionally predictive-coded frames), as they not only refer to “past” frames (like I and P) but to “future” frames, too. To encode/decode them you have to wait for more frames coming in, which increases latency. Fortunately, the baseline profile doesn’t include B-frames, so I compared the latency using pf=high and pf=baseline. But, surprisingly, it didn’t make any difference! The reason: although using high profile, the rpi encoder does not produce any B-frames, even though the standard would allow them; and the same is true for most mobile phone h.264 encoders. Result: on the rpi always use pf=high; even without B-frames it is more effective at reducing data than pf=baseline. With webcams it may be different.
    2) higher fps => lower latency. This seems to be true for all resolutions and is easy to explain: The encoder buffers a specific amount of frames, and so does the decoder. Overall latency is number of frame buffers * 1000ms/fps + latency by network buffers etc.
    But watch out: the rpi is only capable of handling 49fps at 720p; if you use 720p 60fps it will use the resolution which matches that fps (600*480 in this case) and upscale it.
    The best to use (in my opinion): 720p 49fps and 600*480 90fps
    3) how to reduce the number of frames the encoder buffers: no way (it is done in hardware);
    the number of frames the decoder buffers: = number of threads the decoder uses (only on a desktop pc, where decoding is done in the cpu). This information is according to an article I found on Intel Developer; but I found no way to control the number of threads in the gstreamer decoder (it simply ignores the parameter).

    4) latency testing using a rpi connected via ethernet to my 3 year old laptop:
    1.1) 720p, 49fps, pf high, variable bitrate
    1Mbit/s ~110ms
    2Mbit/s ~110ms
    7Mbit/s ~100ms up to 9 SECONDS

    1.2) 49fps, 3Mbit/s, variable resolution
    720p ~90-170ms
    600*480 ~90-140ms

    1.3) variable fps, 3Mbit/s, with 640*480 and 720p
    640*480 25fps 160ms
    640*480 35fps 160ms
    640*480 60fps 100ms
    640*480 90fps 90ms FAIRLY CONSTANT
    720p 25fps 170ms
    35fps 150ms
    49fps 130ms

    1.4) variable intra refresh period, 1280*720, 3Mbit/s, pf=high
    -g=1 1.5 seconds
    -g=10 90ms-170ms
    -g=100 fluctuates strongly, 90ms-170ms

    1.5) 720p, pf high and pf baseline
    720p 25fps baseline 170ms, high 170ms
    720p 49fps baseline 80-170ms, high 90-170ms

    Greetings from Germany

  10. Manuel permalink

    Thanks for that great article and source code. I was thinking about enhancing the setup a bit: What do you think about using several receiver stations wired together by ethernet. Maybe each using a different kind of antenna. This should greatly increase the chance of receiving the packets on at least one receiver. Since you already have a filter for duplicate packets, this should be an easy extension.

    • Hi Manuel

      This is already supported. You can connect up to 8 USB wifi dongles to your receiving computer and they’ll all work in parallel. I use this feature sometimes with two dongles, one with an omni antenna and the other with a double biquad antenna.

  11. Hello,

    I am trying to use your wifibroadcast tool for a robotics project. I have a PS3Eye webcam which is only capable of streaming YUY2. So I have used the following command to stream:

    gst-launch v4l2src device=/dev/video0 ! video/x-raw-yuv,format=\(fourcc)YUY2,width=640,height=480 ! ffmpegcolorspace ! x264enc tune=zerolatency byte-stream=true bitrate=5000 ! h264parse ! fdsink | ./tx -b 8 -r 4 -f 1024 wlan1

    And for receieve:
    ./rx -b 8 -r 4 -f 1024 wlan0 | gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false

    Both sides are a notebook both equipped with Intel Wifi cards (Tx: Taylor Peak, Rx: Condor Peak).

    I experience a huge amount of lag in the system (typically 4-5 seconds).

    I did the following experiment to see how much delay is added by the encoding, and it was not significant (a few hundred ms):
    gst-launch v4l2src device=/dev/video0 ! video/x-raw-yuv,format=\(fourcc)YUY2,width=640,height=480 ! ffmpegcolorspace ! x264enc tune=zerolatency byte-stream=true bitrate=5000 ! h264parse ! fdsink | gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false

    My question would be: where should I start troubleshooting?

    Both notebooks have internal antennas (omnidirectional AFAIK) and are lying on the same desk.

    The output of the rx shows:
    New clock: GstSystemClock
    Signal (card 0): -26dBm
    Signal (card 0): -27dBm
    Signal (card 0): -27dBm

    and similar results. Are these results bad/good?
    Thank you for your help in advance!

    • Hi

      The most common causes for lag are:

      1) Occupied channel: The channel is so busy that the WIFI cards cannot send the needed data rate. This way the data queues up and introduces lag.
      2) Bad injection rate of the wifi cards: In my tests I have seen several cards with bad injection rates. You can measure the rate by using the TX program and trying to inject as fast as possible (using a big file, “yes” or the like).
      3) Insufficient processing power

  12. David Posea permalink

    Great work, but I have problems with the fifo mode. You have to start tx first to create the pipes, then two programs to send data to them. For me, tx fails to connect on fifo0. Also, I can’t change the OSD data packet_size, blocks, fecs, etc. I’m going to write a new fifo control program, probably one that only deals with the most common case of needing two ports for OSD data. Do you have more thoughts on the subject? I certainly don’t mind doing the work; everyone is going to want mavlink working with their vehicle. Should I fork and exec a new tx process to handle the OSD data? I should have something to send to you in a day or two, a full mavlink implementation, even sending commands back to the vehicle. That won’t eat much bandwidth at all.


Trackbacks & Pingbacks

  1. Finding the right WIFI dongle (And patching its kernel driver and firmware) | befinitiv
  2. Improved retransmission scheme and multiplexing | befinitiv
  3. FPV range and latency test in a noisy environment | befinitiv
  4. FPV range test in a park | befinitiv
  5. Details about my flight setup and first tests of wifibroadcast in the air | befinitiv
  6. Porting wifibroadcast to Android | befinitiv
  7. Using Raspberry as a display device for FPV video received via wifibroadcast | befinitiv
