
Wifibroadcast – Analog-like transmission of live video data

Wifibroadcast is a project aimed at the live transmission of HD video (and other) data using wifi radios. One prominent use case is transmitting camera images for a first person view (FPV) of remote controlled aircraft.
In contrast to a normal wifi connection, wifibroadcast tries to mimic the advantageous properties of an analog link (graceful signal degradation, unidirectional data flow, no association between devices).
Note: Before using wifibroadcast you have to check whether the regulations of your country allow such a use of wifi hardware.

If you like the Wifibroadcast project please consider donating. This would help me buy additional wifi hardware so that hardware support can be extended.


The following video shows what you can expect from wifibroadcast. It has been recorded on the ground and also shows some transmission errors due to blocked line of sight.

Why normal wifi is a bad choice for FPV applications

  • Association: Video transmitter and receiver need to be associated. If one device loses the association (for example due to a too-weak signal) the video transmission stops instantly.
  • Error-free transmission: Wifi transmits either data that is correct or no data. In an FPV scenario this means that even if you received data with just small errors it would be rejected completely. This could result in stalling video although you have received useful data.
  • Two-way communication: Even if you are sending data only from source to sink a bi-directional data flow is required using wifi. The reason for this is that a wifi receiver needs to acknowledge the received packets. If the transmitter receives no acknowledgements it will drop the association. Therefore, you would need equally strong transmitters and antennas both on the aircraft and on the ground station. A setup with a strong transmitter in the air using an omnidirectional antenna and a weak device on the ground using a high-gain antenna is not possible with normal wifi.
  • Rate control: Normal wifi connections switch automatically to a lower transmission rate if signal strength is too weak. Due to this it is possible that the (automatically) selected rate is too low to transfer the video data. This way the data would queue up and introduce an unpredictable latency that can be up to several seconds.
  • One to one transfers: Unless you use broadcast frames or similar techniques a normal wifi data flow is a one to one connection. A scenario where a bystander just locks onto your “channel” as in analog video transmission to watch your stream is not easy to accomplish using traditional wifi.
  • Limited diversity: Normal wifi limits you to the number of diversity streams that your wifi card offers.

What wifibroadcast does differently

Wifibroadcast puts the wifi cards into monitor mode. This mode allows sending and receiving arbitrary packets without association. It also makes it possible to receive erroneous frames (where the checksum does not match). This way a true unidirectional connection is established which mimics the advantageous properties of an analog link. Those are:

  • The transmitter sends its data regardless of any associated receivers. Thus there is no risk of a sudden video stall due to loss of association.
  • The receiver receives video as long as it is in range of the transmitter. If it slowly gets out of range the video quality degrades but does not stall. Even erroneous frames are displayed instead of being rejected.
  • The traditional scheme “single broadcaster – multiple receivers” works out of the box. If bystanders want to watch the video stream with their devices they just have to “switch to the right channel”.
  • Wifibroadcast allows you to use several low-cost receivers in parallel and combine their data to increase the probability of correct reception. This so-called software diversity lets you use identical receivers to improve reliability as well as complementary receivers (think of one receiver with an omnidirectional antenna covering 360° and several directional antennas for long range, all working in parallel).
  • Wifibroadcast uses Forward Error Correction to achieve high reliability at low bandwidth requirements. It is able to repair lost or corrupted packets at the receiver.
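As a rough illustration, switching a card into monitor mode on a GNU/Linux system can be done with the standard iw tool. This is only a sketch: the interface name and channel are placeholders, and wifibroadcast's own setup scripts may do this differently.

```shell
# Sketch only: interface name (wlan0) and channel are placeholders.
# The interface must be down before its type can be changed.
sudo ip link set wlan0 down
sudo iw dev wlan0 set type monitor
sudo ip link set wlan0 up

# Tune transmitter and receiver to the same channel.
sudo iw dev wlan0 set channel 13
```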

What hardware is required?

A typical setup looks like this:


GNU/Linux computer

The choice of the embedded computer is free. However, Raspberry Pis are highly recommended due to their low price and easy camera integration. Normal PCs running GNU/Linux are also supported. An experimental port to Android has been shown here.

Wifi card

Not all wifi cards are compatible with wifibroadcast. This is because wifibroadcast uses injection mode, which is not fully supported by many wifi chipsets. Tests have shown that the Atheros AR9271 delivers good performance under wifibroadcast at 2.4GHz. You can find a list of wifi cards using this chip here.
For 5GHz operation Ralink RT5572 based devices are recommended since they work out of the box. The CSL-300 (with two antenna connectors!) and TP-LINK TL-WDN3200 are based on this chip (for more cards refer to here).

The following 2.4GHz cards have been successfully tested:

  • TP-LINK TL-WN722N: This card delivers 19dBm output power and is affordable (9€).
  • Alfa AWUS036NHA: This card is more expensive than the TP-LINK but it includes a 30dBm power amplifier. Using this card on the aircraft should give you a better range.

The following 5GHz cards have been successfully tested:

  • CSL-300: This is the recommended 5GHz RX card since it provides external antenna connectors (Only the model with two antenna connectors!)
  • ALFA AWUS051NH v2: This is the recommended 5GHz TX card since it provides better range when used as TX compared to the CSL-300.
  • TP-LINK TL-WDN3200: Wifibroadcast works also with this card. Note, however, that due to the missing external antenna connectors the maximum distance of a video link is limited.

Using Wifibroadcast for FPV

For a basic setup you need two Raspberry Pis, one Raspberry Pi camera, an HDMI display and two compatible wifi sticks.

For your convenience, there is a prebuilt image available for your Raspberry Pis. You can download it here:
Put the image onto two SD cards (as explained here).

Connect the camera and one WIFI stick to your TX Raspberry PI, insert the SD card and power it up. After ~30s you should see the LED of the WIFI card blink (this is only true for Atheros cards). This means that you are transmitting your video!

On the RX side, connect display and WIFI stick to the RX Raspberry PI, insert the SD card and power it up. When the boot is finished you should see the live image of the camera on the screen.

If you want to use an already existing Raspberry PI image, you can read here how to install everything manually. You’ll find there also information on how to record the video to a thumb drive.
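For reference, the manual setup boils down to two pipelines similar to the following sketch. The exact tx/rx flags depend on your wifibroadcast version, and the raspivid parameters are only illustrative.

```shell
# TX Raspberry Pi: capture h264 from the camera and pipe it into tx.
raspivid -t 0 -w 1280 -h 720 -fps 30 -b 4000000 -o - | sudo ./tx -b 8 wlan0

# RX Raspberry Pi: pipe the received stream into a decoder/display.
sudo ./rx -b 8 wlan0 | mplayer -fps 60 -cache 1024 -
```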


  • My 2.4GHz RC disturbs the video transmission. You could use 5GHz wifi equipment. Or if you are brave you can modify your RC system.
  • What is the recommended hardware? For the transmitter, Raspberry Pi B, B+, A+ and B2 can be used. The receiver can also use all these devices in the case of a single RX wifi card. If you want to use a diversity setup with bit rates higher than 3Mbps then the USB part of the first generation Raspberry Pis is a bottleneck. The Raspberry Pi 2, however, is more than capable of handling two RX wifi receivers at high bit rates.
  • What max. range can be expected? Tests showed that you can expect more than 3km ±30° (horizontally and vertically) on 2.4GHz with a double biquad antenna. If you use dipole antennas you can reach up to 900m with 360° (horizontal) coverage.
  • What about interference with other networks? The wifi cards still obey the standard wifi CSMA/CA (collision avoidance). This means that wifibroadcast will use free slots in the channel and does not disrupt other users (at least not more than someone transferring the same amount of data over normal wifi). As said above, it is up to the user to check the regulations of his country.
  • If I have packet drops the image stays disturbed for too long before it recovers. Try decreasing the interval between h264 key frames. You can do this using the “-g” option of raspivid.
  • What can I do to make reception more robust? Several options depending on the source of the problem: Switch to a less frequented channel, use software diversity, use better antennas, increase the number of FEC packets.
  • How do I transmit telemetry data in parallel to the video stream? Refer to this blogpost.
  • My latency is high and seems to be increasing over time. This is usually caused by insufficient throughput on the TX side. Try switching to a less frequented channel or lowering the data rate and the number of FEC packets.
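The key-frame interval from the FAQ above is set on the TX side. A sketch (parameters are illustrative, not recommendations):

```shell
# -g 30 inserts an h264 key frame every 30 frames (about once per
# second at 30 fps), so the picture recovers quickly after drops.
raspivid -t 0 -fps 30 -g 30 -b 4000000 -o - | sudo ./tx wlan0
```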

Other resources

  1. Jozsef Voros permalink

    I really like the idea, thank you very much for sharing it. As FPV pilot, I can’t wait for an affordable HD digital downlink.
    The wifi hw (considering the billion units) is probably one of the most advanced hw for digital transmission. I agree the shortcoming for FPV is the protocol, because it is optimised for different applications. Using a proprietary protocol it becomes similar to dvb (-c, -t) broadcasting, but on legal frequencies.
    I have several questions (I am EE but not familiar with the wifi system):
    – do you consider FEC? (dvb uses multiple level FEC with good results)
    – not sure how deep the proprietary protocol goes; I mean does it really use the bare metal, or is there still a low-level firmware layer between your code and the metal? (if there is such a low-level firmware layer, is it using FEC?)
    – is your solution strictly one way, or is there room for a slower uplink as well? (orders for the autopilot, info to trade quality vs reliability in case of higher packet loss etc)
    – do you think the DJI Lightbridge is a similar solution?

    Best regards,

    • Hi Jozsef

      Thanks for sharing your ideas! To answer your questions:

      FEC: The system uses 1/2 FEC. This is the standard method also used by wifi (refer to index 3). On top of this I also implemented dumb retransmission. At that point there is still some room for improvement.

      The wifi stack was only mildly modified by me. I patched the (existing) firmware of the wifi card to always send my packets at the desired rate. Additionally I patched the kernel drivers to set the TX power to a fixed level. With my adaptations you could still use the wifi card on most normal wifi networks.

      The solution is not strictly one way. You can easily send data in both ways. However, the main idea of the project is that you are able to setup a unidirectional link (in contrast to normal wifi). This way you can use an omni antenna on the quad with high tx power and a high gain antenna on the ground with no tx power. But you are free to change the antenna setup to your needs.
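For illustration: on drivers that honor it, the TX power can also be pinned from user space with iw, without any patches. This is a sketch; whether it takes effect depends on the chipset, driver and regulatory domain.

```shell
# Request a fixed TX power of 20 dBm (iw expects mBm, i.e. 100 * dBm).
sudo iw dev wlan0 set txpower fixed 2000
```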

      • Hi, I am excited to find out about this project, I saw a post on DIYDrones that led me here.
        I really enjoy FPV flying and have been doing this with typical 600 line scan cameras for 3 years, more recently taping the video from a go pro.

        I have been wanting HD for my goggles for a long time and this looks fantastic.
        Although I currently have Fatshark goggles, I will need HD-ready goggles.

        I am a newbie at working with the Raspberry and the languages and development environment, but I aim to dive in and learn to do what I need to do to get a system working.

        I am ready to buy all the components and hope someone can give me advice on my purchases.
        I have seen a Raspberry Pi with a kit that includes some distributions and a USB connection cable for my computer. I prefer using a Mac but also have a PC; which is best for development? I'm guessing that there is a compiler and other software pieces I will need.

        Any advice is very welcome and I hope my questions are not too annoying.



      • Alan permalink


        Can we use wifibroadcast with the Raspberry Pi Zero for $5.00?
        1GHz core, 512MB, mini HDMI, micro USB but no PiCam CSI port.
        Maybe with a USB hub and a USB cam?



      • jholster permalink

        Alan, it’s definitely possible to use Raspberry Pi Zero, but lack of CSI port makes it less ideal. Zero has only one usb port, so camera + wifi requires a hub, eliminating the weight/size saving of Zero. I would rather use A+ (connectors desoldered if weight is critical). Also the rpi cam works out-of-the-box with rpi’s gpu h264 encoding, which is mandatory for low-latency. It’s really pity that Zero does not have CSI port.

        A good alternative to the official Rpi is the odroid-w, a fully compliant rpi clone with a CSI port. I'm personally using two of these in my fixed wing fpv project. Note: only one batch was made, they will probably soon run out of stock.

  2. Nicolas permalink

    Very nice. Do you have a recommendation for a high power USB wireless adapter? Have you thought about adding some sort of FEC coding with interleaving? I think that should make the video feed more robust. There is a tool called udpcast that does that on the application layer, maybe it can be used as a starting point (?)

    • Thanks! As a high power adapter I would recommend this one: Alfa AWUS036NHA. This is the one I currently have on my quad. It includes a 1W amplifier. The transmission uses a 1/2 FEC at the wifi level. My software additionally does plain retransmissions. And you are right, there is potential for improvement at this point.

      • Nicolas permalink

        Hi, thanks for your reply.

        I have been doing more reading on your site about wifibroadcast; the more I read, the more I like it 🙂

        Sorry, did not realize that you already have a basic re-transmission algorithm implemented.

        I have a question about how the rx application works: What is the output of it?
        Packets? Or a “raw” video stream? I am asking, because I’m wondering if it would be possible to change the rx application in such a way, that it puts out valid (i.e. with header checksum re-calculated in case of bit-errors) packets? This would allow to distribute the video stream to another device, so that one can build a groundstation RPi with big antennas and then relay that video over another 5Ghz wifi link to the goggles.

        Regarding other wifi network’s beacons and traffic interfering:
        It seems that Atheros chipsets can also use custom frequencies from 2312 to 2732MHz. That would allow HAMs to use the 2300-2400MHz HAM range and normal people to use 2487MHz (still inside the legal ISM band, but 25MHz away from standard wifi channels 1-11).

        Here is more info on that topic:

      • Huh, the frequency changing idea is very interesting! Yet another point why I like the atheros chipset even more. I will definitely look into that.

        To your rx question: The rx and tx programs behave like an (unreliable) pipe. So in theory everything you pipe into tx should be output by rx. You can even try this by running rx and tx alone and typing into the tx shell. The entered line (with a latency of one line) should then appear on the rx side. This can be used for any data you like. My quad for example sends its video data by piping the raspivid output into the tx program. Additionally, a second tx instance is running that transmits the FrSky telemetry. It's as simple as “cat /dev/ttyUSB0 | tx wlan0”. Your bridge shouldn't be that hard to realize.

      • Nicolas permalink

        Thanks for the explanation. So I guess I could just pipe the output of the rx application to (e.g.) netcat and then receive that somewhere else. I’ll find a way to do that, great 🙂

        Regarding FEC and interleaving (sorry to bother you again about it … :))

        Your explanation about the rx and tx applications reading data from stdin on the tx side and sending it out to stdout on the rx side got me thinking again of udpcast. I didn't realize it before, but udpcast appears to be doing exactly this when using async mode (used for satellite links) with streaming. Just not using raw frames over a monitor interface; it needs the “normal” operating system functions to create sockets etc.

        Like you already said, there is still a lot of room for improvement in that area. But I guess that is far from trivial to implement. Looking at udpcast, its FEC and interleaving seem pretty advanced and are also fully configurable. I think the only thing that needs to be done is to find a way to make it send and receive raw frames over a monitor interface, and voila, we have a full-blown FEC/interleaving implementation.

        I think there are three ways to achieve this:

        – Patch udpcast so that it sends/receives raw frames over a monitor interface. It already has support for multicast, unicast and broadcast. I guess it wouldn't be too much effort to adapt the function that sends broadcasts.

        – Patch kernel/drivers/whateverelse to create a wlan interface that looks like a “normal” (non-monitor) interface to applications but has acks/re-transmits and the association logic disabled like a monitor interface. Not sure if that is possible at all and how much effort it is. The big advantage would be, that it could be used for other traffic types as well, like logging in via ssh or transferring whatever data to or from the copter.

        – Make some kind of virtual wrapper-interface using tun/tap or something. So that udpcast speaks to the wrapper-interface that behaves like a normal network interface. The wrapper-interface will then speak to the atheros monitor-mode interface. Also not sure if possible at all and how much effort.

        I guess I’ll need to do some tests to see what happens when piping the output of raspivid to udpcast udp-sender on the tx side and then piping the output of udpcast udp-receiver to hello_video or gstreamer on the rx side. If that works nicely with a videostream over a “normal” network interface, I’d say using udpcast would be definetely the way to go.

        What do you think?
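For what it's worth, the relay I mentioned above could be as simple as the following sketch. The address and port are made up, and the flags assume the OpenBSD variant of netcat.

```shell
# Ground station: receive via wifibroadcast, forward as UDP over a
# normal network link to the goggles machine.
sudo ./rx -b 8 wlan0 | nc -u 192.168.1.50 5600

# Goggles machine: listen for the UDP stream and decode it.
nc -lu 5600 | mplayer -fps 60 -cache 1024 -
```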

      • Frank permalink

        Hi befinitiv,

        Do you know if it is possible to run this virtualized on a Windows machine, with a Debian distro and a USB filter in VirtualBox?


      • I never tested that but I don’t see a reason why this would fail. If you know more please let us know 🙂

        “Do you know if it is possible to run this virtualized on a Windows machine, with a Debian distro and a USB filter in VirtualBox?”

        Yes, but not with the TL-WN722 or other ath9k-htc adapters.

      • frank permalink

        I have done some tests on Windows 7 with VirtualBox and Debian with lxd, including the guest additions and a TL-WN722 adaptor.
        When I run it with ./rx -b 8 wlan0 | mplayer -fps 60 -cache 1024 -
        I get messages like:
        No bind found for key ‘w’ (where ‘w’ is a random character)
        but it keeps running, just with no picture.

        The LED on the TL-WN722 is flashing rapidly,
        so the card is working, but no picture yet.

        When I do the same but with
        ./rx -b 8 wlan0 | gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false
        I get:
        could not initialise Xv output
        no Xv Port available

        Any tips?


      • I know the Xv error from my Android experiments using XSDL. I got around it by using ximagesink. My memory is a bit faded on that one but I think you need to add a color conversion into the pipeline to get ximagesink to work with the h264 decoder.
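Something along these lines might work (an untested sketch; videoconvert performs the color conversion mentioned above):

```shell
sudo ./rx -b 8 wlan0 | gst-launch-1.0 fdsrc ! h264parse ! avdec_h264 ! videoconvert ! ximagesink sync=false
```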

      • > including the guest additions and a TL-WN722 adaptor

        Wow, I used to think it's impossible. Usually ath9k-htc adapters cannot be properly initialized in the guest OS; I believe it has something to do with the firmware upload.

        > could not initialise Xv output

        AFAIK there's no Xv support in VirtualBox; xvinfo will tell you “no available adaptors”. That is because Xv needs 2D acceleration, and 2D acceleration in VirtualBox is only available for Windows guests.

  3. russell permalink

    Thank you for sharing your work. I tested it with a single wifi dongle and it is working great! I have a few questions: a. Using diversity, would the latency and range be improved in the field? b. Is there a quick way to save the video to an SD card with your rx code? I assume using gstreamer and a tee off would work. Thanks again for your effort

    • Thanks! Happy to hear that things work well for you 🙂

      I have no field experience with diversity. For sure I can say that latency is not affected by that feature (not better, not worse). Range might improve a bit but you should not expect miracles. I think the primary advantage is a more robust live image. I'm waiting for next weekend to try it out 🙂

      Concerning video storage: I use ‘tee’ on my receiving raspberry and it works quite well. This of course saves just the raw h264 stream. You can find some scripts that help you with starting things automatically and converting raw h264 into a playable .avi here: (still under development)
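The tee approach can look like this (paths are made up; converting the raw recording afterwards needs a tool such as ffmpeg):

```shell
# Display the stream and save the raw h264 at the same time.
sudo ./rx -b 8 wlan0 | tee /mnt/usb/flight.h264 | mplayer -fps 60 -cache 1024 -

# Afterwards, wrap the raw stream into a playable container.
ffmpeg -framerate 30 -i /mnt/usb/flight.h264 -c copy /mnt/usb/flight.mp4
```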

      • Nicolas permalink

        Oh cool, you already have some startup scripts. When I have time, I’ll be putting together a stripped-down minibian image with wifibroadcast automatically starting after bootup so that people without linux experience can give this a try without too much hassle.

        I think it should be possible to put a configfile under the fat32 /boot partition, right? This way, people could make config changes easily by putting the SD card into their computer or phone/tablet and editing the config file.
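Such a config file could be as simple as a sourced shell fragment. A sketch, with a made-up file name and variables:

```shell
#!/bin/sh
# Hypothetical startup snippet: settings live in a plain text file on
# the FAT32 /boot partition so they can be edited from any computer.
CONF="${CONF:-/boot/wifibroadcast.conf}"

# Defaults, overridden by the config file if it exists.
CHANNEL=13
BITRATE=4000000

if [ -f "$CONF" ]; then
    . "$CONF"
fi

echo "channel=$CHANNEL bitrate=$BITRATE"
```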

      • Nicolas permalink

        Well, tried minibian now. It’s still way too big and bloated, not good.

        Then found buildroot, that looks very good for our purpose. Bootup is very fast, about 5 seconds and it’s only 30MB or so including everything. Running completely from ramdisk is also possible, should make things more reliable.

        Managed to get the wireless card, raspivid and the tx/rx applications working, but somehow raspivid is buggy; sometimes it doesn't send the stream to stdout. Writing to a file seems to work all the time. Not sure why that is; I need to try another kernel or rpi firmware maybe.

      • Wow, cool, 5 seconds sounds great. And I also like the ram-disk approach since I always fear a failing SD card!

  4. russell permalink

    Thank you for the link to the startup script. I just ordered more TP-Link wifi dongles to try out your software diversity. Also, I did try your gstreamer script but could not get the pipe working on the RPi. I am going to play around with it and see if I can get something working and post it for others. Thanks again!

    • russell permalink

      Did a quick range test today and want to share with everyone that the TP-Link WN722N with the dipole antenna can easily reach 700 meters without any special setup and under not-so-ideal conditions: not exactly line of sight and over a canyon with vegetation. One thing which is interesting is that the lost pixels always appeared in the lower half of the video, not sure why. Also, has anyone tried to use gstreamer to store the video on the RPi? I have not been able to get it to work yet. Thx

  5. demi4816 permalink

    Really cool project. I have followed the progress in low-cost WIFI-FPV with great interest and your approach seems quite reliable.
    Have you tried a link over a range-extender? My dream is to be able to fly very low but still with a great range. So one plane could carry a range extender and maintain line-of-sight with the ground station and a second quad/plane could do proximity-flight.

  6. Jaime Machuca permalink

    What would I need to use instead of raspivid if I want to stream from a different camera like the Logitech C920?

    • Roberto Sale permalink

      Hi Jaime. The command is in the first post of befinitiv:

      gst-launch-1.0 -v v4l2src ! 'video/x-raw, width=640, height=480, framerate=30/1' ! omxh264enc target-bitrate=500000 control-rate=variable periodicty-idr=10 ! h264parse ! fdsink | sudo ./tx -r 2 -f 1024 wlan0

      I am actually trying to get this working. My latency is extremely high using a Logitech C270, and I don't know why.
      Does anyone?

      • The latency from a C270 is due to it not being attached to the GPU (like the Pi Camera is), so it has to do the compression on the CPU. This is a curse of all USB webcams, much to the dismay of users of Motion, the RasPi CCTV system.

        The C920, by contrast, has an integrated hardware h264 encoder and is supported by v4l and gstreamer, leading to near-as-dammit 1080p30 real time encoding.

      • I tested the C920 but I wasn’t really impressed with the latency in h264 mode. After I noticed that it is much higher than the PIs latency I didn’t even bother to measure it. But I guess it was around 300-400ms.
        MJPEG mode on the other hand was excellent with respect to latency. But with the (much) higher data rate I guess we won't gain anything…

  7. Justin permalink

    In your post you mention that you’ve setup the design to be unidirectional but it’s possible to use it bidirectional with other data streams. If you’ve got the TX resending data to use up the available bandwidth (“Dumb retransmission”) is it possible for the TX unit to receive too? I’d imagine you’d need some kind of duty cycle or CTS method of dividing the bandwidth between the TX and RX functions?
    I had been working on a dual computer octo with an on board RPi to enable more functionality, a much nicer OSD etc. but my sticking point thus far had been association with an AP at range (Although the Ubiquiti stuff isn’t too bad). By enabling it to be a high bandwidth, reliable 2-way link it would allow you send MAVLink to the UAV as well as receive decent video back (I’m assuming this is what 3DR have done with their yet-to-be-released Solo based on the range and specs). Imagine what would be possible having the WHOLE system open sourced and not locked down (Although in the case of the Solo and regular fliers it’s probably best to follow KISS protocol).

  8. Nicolas permalink

    Found some interesting read about wifi cards and their monitor and injection capabilities under linux:


    They managed to get injection working with different Atheros and Ralink 2.4/5GHz dual-band cards; there are also kernel patches and a library that allows creating a tap interface that looks like a normal interface to applications and then forwards the received packets to a wifi monitor interface.

  9. Trailblazer permalink

    Hi ! My HD Link just started to work and I am really excited! Thank you for sharing this awesome project with us 🙂 If everything is running fine, the setup will be planted either on my TBS Discovery or my MPX Heron Sailplane. I think the new Headplay goggles would be a perfect fit for this Videolink. It has an inbuilt HD Screen (1280×720) with HDMI input. Unfortunately it is barely available.

    What do you think about a little OLED + buttons extension on the GPIO pins for easy on-field configuration? Maybe a serial link to an Android device (FTDI or BT) is feasible.

    Greets Andre

  10. DAGA permalink

    I wonder if 3D would be possible. With two Raspberry Pis and two WN722Ns on the copter and two WN722Ns on one Pi on the ground. The WN722Ns of course would send on different channels. Then you just have to display the two streams side by side.

    Is this that simple or am I totally wrong?

    • Yes, I don’t see a reason why this would not work. I would suggest the highest possible framerate or synchronized cameras to minimize dizziness with movement.

      • DAGA permalink

        It would be cool if you could include a 3D option in your project as I don’t feel confident enough to do it on my own.

        I really don’t know what exactly is needed to achieve that. Would it be possible to receive both streams via one WN722N? So you could also use two WN722Ns for diversity on the ground?!

        So the setup would be:

        -Two pi’s with a pi camera and a wn722n for both on the copter.(i think odroid w would be perfect)
        -One pi on the ground with 2, 3 or even 4 wn722n for diversity.

        This would be great. I am already using HD FPV with a raspberry pi, but via normal wifi. I also recorded a video with two cameras for side by side and I have to say the effect is awesome!

  11. Roberto Sale permalink

    Hi befinitiv! Thanks for sharing this awesome project!
    Can you please post the code for streaming with a webcam instead of using the raspberry camera?
    I’m trying to compile your code for OpenWrt.
    Thanks a lot!

    • Roberto Sale permalink

      Hi, sorry for that. I got this command from a previous post:
      gst-launch-1.0 -v v4l2src ! 'video/x-raw, width=640, height=480, framerate=30/1' ! omxh264enc target-bitrate=500000 control-rate=variable periodicty-idr=10 ! h264parse ! fdsink | sudo ./tx -r 2 -f 1024 wlan0

      When I use sudo ./tx -r 2 -f 1024 wlan0, I can see the keyboard input on my RX, but when I use gst-launch… (the previous command) it returns this:
      ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
      Additional debug info:
      gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
      streaming task paused, reason error (-5)
      Trouble injecting packet: send: No such device or address
      And wlan0 stops being a monitor interface and becomes managed automatically when I enter that command!
      Can you help me please? I’m using Raspbian on an RPi 2 and an rtl8187 wireless adapter.

    • At the beginning I also worked with webcams. Please take a look at my first post “true unidirectional…” (can’t link it because I’m on my mobile phone)

  12. computersaysnah permalink

    Hey, very interesting! I will try this soon on my quad, which has a RPi for this same purpose.

    By the way, I reached your post while looking for possible problems between wifi and RC interference, out of fear of losing control of the quadcopter. I have the W722N mounted on the quad, near the edge of one of the booms, and a Turnigy 9x receiver, whose antenna is at the center of the quad.

    1. What would be the best orientation between antennas in order to minimize interference? 90 degrees between both?
    2. And more to the topic: does your code change the way standard wifi works, in the sense that it chooses a channel and only uses the 20MHz band around it? I’d like to know if using this code increases the chance of interference or if it operates at another level which should not change anything at the lowest layer.

    I appreciate your answer.

  13. Trailblazer permalink

    I found a nice code snippet that makes it possible to modify the omx output, like rescaling and positioning the output on the display; maybe you want to add it to hello_video and make it accessible through command line parameters:

    • Trailblazer permalink

      In addition to my post: erroneous picture content appears mostly in the lower half of the stream due to corrupted data. This is bad, because the ground is the more interesting part of the view when you are flying (instead of the sky). A quick solution is to rotate the camera 180° and flip the video in the renderer by adding:
      configDisplay.set = OMX_DISPLAY_SET_TRANSFORM;
      configDisplay.transform = OMX_DISPLAY_ROT180;
      to hello_video.c

  14. flow86 permalink

    thanks for this awesome project!
    one Question:
    do you believe it would run on a Raspberry Pi A+?
    (smaller than the normal Raspberry Pi, but only 256MB of RAM)
    with the same speed as on a normal Raspberry Pi?


    • Thanks! I use nothing but the A+. On the ground as well as in the air 🙂

      • schalonsus permalink

        For diversity with A+ you are using regular USB Hub?

      • Yes, but it seems as if the USB part of the raspi is a bottleneck. As soon as there are two sticks connected I couldn’t go over 3mbps without losing many packets (even when I used only one of them with RX -> this is USB related and therefore probably can’t be fixed inside the rx program)

  15. Iver permalink

    Hi befinitiv,

    Just bought two TP-LINK TL-WN722N, set up wifibroadcast, made a TX script and a RX script. Started up. And it rocks (after cleaning up own mistakes)!! well done!!

    I have also bought a 9dB stacked dipole for TX, two (claimed) 4W 2.4GHz bi-directional RF amps for TX and RX, and a PCB yagi antenna for RX. Will be interesting to see how that setup works out when I get the opportunity to test it away from the city…

    FYI: It seems that mplayer does a better job (lower latency, fewer image errors) of displaying the received h.264 data for some reason. I used:

    sudo ./rx -b 8 wlan1 | mplayer -fps 60 -cache 1024 -

    …instead of your:

    sudo ./rx -b 8 wlan1 | gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false

    …in my RX script.


  16. I have been searching for a solution to this (and other similar problems with WiFi) for years! Thank you!

  17. What sort of latency would you expect in this setup, in perfect conditions?

  18. Hey,
    I looked at the script; is the retransmission parameter -r missing?

  19. Hi, I have some problems with the current master branch after following the instructions (did not try the older versions yet).

    The tx program quits and says “Trouble injecting packet: send: Resource temporarily unavailable”

    In the dmesg log, the message “device wlan0 left promiscuous mode” will appear.

    If the interface rate gets too high, something seems to happen that makes the program (or the driver?) crash or reconfigure itself, and this can easily be simulated by setting the -f parameter to something like 256.

    For now, I put the script into a restart loop, but the restart delays would probably make it dangerous as a camera on my quadcopter, since the restart time is quite unpredictable 🙂

    Linux 3.18.8-1-ARCH if that helps. Frame rate is at 24 fps, h264 quality = “medium”.

    • Could you please post the complete wifi initialization code and also the tx line? What wifi card are you using?

      • I’m using two TL-WN722Ns.

        The WiFi config script:

        ip link set down wlan0
        iw dev wlan0 set monitor otherbss fcsfail
        ip link set up wlan0
        iw dev wlan0 set channel 6

        rx: ./rx -b 8 wlp0s20u1 | mplayer -fps 60 -cache 1024 -

        Maybe I have a different version of the ath9k_htc driver: info here ->

      • Could you provide the tx command line as well? If I understood correctly, your problem is on the tx side. Please make sure that you do not have any network managers running (ifplugd etc).

      • sorry, that was what I actually meant to copy:

        raspivid -ih -t 0 -w 1280 -h 720 -fps 24 -b 4000000 -n -g 60 -pf medium -o - | sudo ./tx -b 8 -r 2 wlan0

        This is basically the example, just 30 fps -> 24 and quality high -> medium

        Network managers don’t interfere; the problem occurs when there is a high rate of change in the video.

      • So far that looks good. Just a wild guess but I heard from other people that the USB hub from the raspi cannot provide enough power by default. Did you try the same with an externally powered usb hub? (I only use A+ so I cannot reproduce that)

      • Same problem. If that were the problem, you could also set max_usb_current=1 in /boot/config.txt (allows 1.2A).

        Maybe libpcap is to blame for this, since there are no errors in the dmesg log.

      • I used a simple way to test the transmission limits:

        cat /dev/zero | pv -L "${SIZE}k" | sudo ./tx -b 8 -r 2 wlp0s20u1

        (needs `pv` to rate limit the pipe)

        On my Intel computer, the message “Trouble injecting packet: send: Resource temporarily unavailable” occurs more frequently as the data rate is increased; from my observation (Intel MacBook Air) the problem starts to occur from around 300K/s.

        Shouldn’t it be possible to catch the error in the program and reinitialize the WiFi device quickly, without breaking the pipe and forcing the restart of the camera?
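A sketch of the idea in the question above: instead of aborting on an EAGAIN-style send failure, back off and retry so the input pipe (and thus the camera process) survives. This is illustrative only, not wifibroadcast's actual tx code; `send_fn` stands in for whatever injection call the program uses.

```python
import errno
import time

def send_with_retry(send_fn, packet, max_tries=50, delay_s=0.001):
    """Call send_fn(packet); on EAGAIN/ENOBUFS, back off and retry.

    Returns the number of attempts used, or raises after max_tries.
    """
    for attempt in range(1, max_tries + 1):
        try:
            send_fn(packet)
            return attempt
        except OSError as e:
            if e.errno not in (errno.EAGAIN, errno.ENOBUFS):
                raise  # unrelated error: do not mask it
            time.sleep(delay_s)  # give the driver queue time to drain
    raise OSError(errno.EAGAIN, "send queue stayed full")
```

With something like this, an overfull driver queue would cost a few milliseconds of delay (or a dropped frame) rather than a process restart.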

  20. acostaja2013 permalink

    Thank you for sharing. I have a few suggestions for the project (which I’m considering for a telepresence surveillance bot).

    On the receiver side a more “polite” solution could be done through a WiFi router running OpenWRT. If you get one with dual WiFi radios you can use one radio to RX wifibroadcast and then bridge-multicast to the other radio working as a standard AP, then monitor the video through VLC on a PC or smartphone (I envision a Cardboard-like solution could be done later).

    The “bridge-multicast” could require some server processing the “udp-like” streams from wifibroadcast and relaying them to N receivers.
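The relay server mentioned above could be sketched roughly like this: read the received byte stream and fan each chunk out to N UDP receivers. Everything here (function name, chunk size, addresses) is a made-up illustration, not part of wifibroadcast.

```python
import socket

def fan_out(stream, receivers, chunk_size=1024):
    """Read the video stream and send each chunk to every receiver address."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sent = 0
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        for addr in receivers:
            sock.sendto(chunk, addr)
            sent += 1
    return sent
```

In a real setup this would read from the rx pipe (e.g. `sys.stdin.buffer`) and the receiver list would hold the addresses of the VLC clients.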

    Another advantage is that this OpenWRT solution could run on very optimized hardware such as Ubiquiti’s Bullets, giving you both a solid solution and aesthetics. Of course this is a much more expensive setup, but it opens a wide range of possibilities for wifibroadcast beyond FPV.

  21. russell permalink

    I got between 0.17 s and 0.2 s ish. With two wifi receivers, it works extremely well; I cannot tell a difference between line of sight and not perfectly line of sight. I also used UDP and gstreamer before; in comparison, this is better in latency.

    Interesting. We have used WiFi broadcast for wireless distribution of OS images, but we never used monitor mode, only ad hoc. How fast can you transmit in monitor mode? Can you benefit from TXOP speedups?

    Our basic approach does not use IP (it works directly over L2), explores Fountain Codes without any feedback, and transmits at the maximum possible speed with 802.11g (54 Mbit/s); with TXOP we can reach a goodput of 42 Mb/s. We have a variant with limited feedback for accelerating decoders when close to their end. The following publications addressed this topic:

    Fast image file distribution with Fountain Codes via a Wi-Fi Ad-Hoc network, using low power processors, Carlos Faneca, José Vieira and André Zúquete, NETWORKS 2014

    DETIboot: A fast, wireless system to install operating systems on students laptops, Carlos Faneca, José Vieira, André Zúquete, João Cardoso, ACEC 2014

    Towards Dynamic Adaptation in Broadcasting with Hybrid Rateless Codes, Carlos Faneca, José Vieira, André Zúquete, Julio Cano, André Moreira, Luís Almeida, 7th Workshop on Adaptive and Reconfigurable Embedded Systems, 2015

    Orchestrating Feedback for Hybrid Rateless Codes, Carlos Faneca, José Vieira, André Zúquete, Julio Cano, André Moreira, Luís Almeida, European Wireless 2015

    Authenticated File Broadcast Protocol, Simão Reis, André Zúquete, Carlos Faneca, José Vieira,
    IFIP SEC 2015

  23. Robert permalink

    Hi selzer. I was having the same problem with the wifi card. The solution was using iwconfig wlan0 rate 54M. That keeps my wifi adapter away from this situation.

    • Thanks for the suggestion.

      Trying `iwconfig wlan0 rate 54M` gives me this error when wlan0 is in monitor mode:

      Error for wireless request “Set Bit Rate” (8B20) :
      SET failed on device mon0 ; Input/output error.

      I created a second interface, mon0, and set the wlan0 interface to 54M, but the same problem still occurs. Possibly I got the order wrong.

      The only unusual thing about my setup is the relatively recent Linux version (and therefore also the atheros drivers), so maybe it’s really a driver bug (list of possibly related bugs:

      I’ll probably try out Debian stable or an earlier kernel on the raspberry pi.

      • Roberto Sale permalink

        Selzer, these are the commands I use:
        First: iw dev, and there check which phy# is your adapter.
        sudo iw phy phy0 interface add mon0 type monitor  # creates a mon0 interface in monitor mode
        sudo iw dev wlan0 del  # delete the wlan0 interface, leaving only mon0
        sudo ifconfig mon0 up
        sudo iw dev mon0 set channel 9
        sudo iwconfig mon0 rate 54M

        If that doesn’t work, try changing the rate before creating the monitor mode interface.

      • Thanks @Roberto, but those exact commands give me the “Set Bit Rate” error. Changing the rate before does not help either. I seriously suspect some kind of driver problem (will try out an older kernel soon).

      • Seb permalink

        iwconfig mon0 rate 54M also gives me the same error on a Raspberry Pi 2 with the latest raspbian. max_usb_current=1 in /boot/config.txt helped increase the tx rate.
        I also reduced the bitrate to 2000000 and increased retransmission to 4, which also helped.
        Range without these modifications was around 5m :D, now I need to try in a free field to see if it really helped.

      • Try “iw dev mon0 set bitrates legacy-2.4 54” (54 Mbps bitrate) or “iw dev mon0 set bitrates ht-mcs-2.4 5” (MCS mode 5)

      • @Seb max_usb_current did not help in my case. 3x retransmission will immediately result in the error. -r 1 works fine, but the range is ~2m.

        @Oleg Artamonov tried both, they give me this error: command failed: Input/output error (-5)

        Anyway, it’s awesome that you are trying to get it to work on the blackswift, I’m one of the backers on Kickstarter, and a Raspberry Pi is a bit too big for my 250 quadcopter 😀

      • Seb permalink

        @Oleg Thanks, the syntax of your second command is not accepted by iw or the card. The capabilities show that setting bitrates is not an accepted command for the card.

  24. lesto permalink

    I’m extremely interested, as I’m developing something similar, but right now I’m more focused on removing as much latency as possible, and I was planning to use a normal wifi connection (never heard about this transmission mode)

    I’m really interested in having a chat together.

    ps. What is the latency of your full system? I managed to go down to ~100ms in full HD and ~80ms in HD at 60FPS

    • The latency for 720p is between 120 and 200ms, depending on the options and decoding device. How did you manage to get such low latencies? 80ms for HD is quite good. And how did you measure them?

  25. peter permalink

    I compiled the tx for OpenWrt and rx is on PC. I start the tx with your
    I get a lot of these “TX RESTART: Detected” messages, and lose a lot of packets.

    Changing the -d parameter up to 5 didn’t really help.

    Do you have any idea ?


    • This message appears if a sequence number is received that is significantly lower than the highest number seen. One example would be a restart of the tx, which then starts again with a sequence number of 0. Other reasons might be out of order packets. How many adapters are you using for rx and how busy is your CPU?
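The heuristic described above can be sketched as follows. This is an assumed illustration, not the actual rx source; the window size here is a made-up stand-in for whatever rx's -d parameter configures.

```python
RESTART_GAP = 256  # assumed window size; rx's real window is set via -d

def classify(seq, max_seen, gap=RESTART_GAP):
    """Rough sketch: is a received sequence number in order, late, or a restart?"""
    if seq >= max_seen:
        return "in-order"
    if max_seen - seq < gap:
        return "out-of-order"   # a late packet still inside the window
    return "tx-restart"         # e.g. tx rebooted and started again at 0
```

A small backwards jump is treated as reordering, while a jump far below the highest seen number triggers the restart message.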

      • lesto permalink

        I had written a long answer some time ago but it is not there. So: the test was done by recording a timer and taking a photo of the timer and the screen of the receiver together.
        Going HD@60fps gives the best results, probably because encoding latency depends on how fast the next frame arrives (so @120fps, if the processing power is fine, we should see more improvement)
        ps. Please contact me personally, as I would like to talk with you

      • Sorry, but your email address is not visible to me. Could you write it in plain-text?

      • lesto permalink

        mauro at inkonova dot se

  26. peter permalink

    Ok, understood: you put a sequence number somewhere in the header and order it on the rx side.
    Normally you should get 1, 2, 3, …, and if you see a very “unordered” number, you are out of sync.

    On the rx side, I have just one WLAN device.
    But on the tx side, I receive a UDP stream and forward it via netcat to tx. Maybe this could be a problem.

    • Using netcat with binary data could be problematic. Try socat instead (you’ll find an example in my “port to android” post)

      • peter permalink

        What is a rough performance number on the tx side?
        I get the message xx data packets sent, and the rate.
        What are the rate and the packet count?

      • The packet count is the number of packets at the input of tx (multiply this by the payload size to get the user data sent). The interface rate is the number of packets per second on the interface. The difference from the input packet count is due to retransmissions: in the case of two retransmissions, an input packet count of 1 corresponds to two interface packets.
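The arithmetic above can be written out explicitly. The payload size of 1024 bytes is an assumed example value, not a figure from the original explanation.

```python
def tx_stats(input_packets, retransmissions, payload_bytes=1024):
    """Relate tx's printed counters (1024-byte payload is an assumed example)."""
    interface_packets = input_packets * retransmissions  # what goes on the air
    user_bytes = input_packets * payload_bytes           # actual user data sent
    return interface_packets, user_bytes
```

So with -r 2, every input packet produces two interface packets, and the interface rate is twice the input rate.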

  27. cyberteque permalink

    this looks great, but….

    I’ve been googling and ogling the results, trying to find a way to get a text overlay.

    So far I’m thinking either do the heading, altitude and attitude data overlay at the transmitting end, or send the video stream and AHRS data separately.

    I’m told my 900MHz XBee link is not legal here in Australia (a real bummer as they were fairly pricey), so I’m only using it out “in the bush”.

    My usual way of developing this stuff is on the ground with my rover; I’m going to give wifibroadcast a go sending the control data as well as receiving the video and AHRS.

    Using a single data link would be so much better than 2 or 3 RF links.

  28. Hi guys, great work here!

    I am using a c920 and odroid U3 to do the same thing with 2 ar9271 dongles.

    I have successfully got the streaming working, however I’m currently getting ~700ms latency.

    The Logitech c920 has a built-in hardware encoder which makes 3mbit h264 video at any resolution (up to 1080p), so I am looking for a way to easily pipe the camera’s h264 into wifibroadcast with as little latency as possible.

    The odroid CPU is at 0.7% sending 1080p (which seems strangely low even for hardware encoding), so it can’t be a CPU bottleneck.

    The receiving PC is a hex-core i7 etc., so I doubt it is an issue there either.

    Quality seems really good, just the lag.


  29. schalonsus permalink

    Testing with ZMR250.
    Thank you befinitiv for your great project!

  30. Really great idea!

    I’m trying to port it to OpenWRT 14.07 (AR9331 SoC) — I need HD FPV to be smaller than RPi + USB Wi-Fi, and with AR9331 it is possible to fit everything (camera, computer, Wi-Fi) literally in the matchbox.

    But what I get is a lot of lost packets:
    Lost a packet ac020004! Lossrate: 0.874602 (3020 / 3453)
    Lost a packet ac020005! Lossrate: 0.874638 (3021 / 3454)
    Lost a packet ac020006! Lossrate: 0.874674 (3022 / 3455)
    Lost a packet ac020007! Lossrate: 0.874711 (3023 / 3456)
    TX RESTART: Detected blk 16004000 that lies outside of the current retr block buffer window (max_block_num = 16e04000) (if there was no tx restart, increase window size via -d)

    Bigger window size doesn’t help. Lossrate is always around 0.87.

    Transmitter is the AR9331 with integrated Wi-Fi or an external TL-WN722N; receiver is a CentOS VirtualBox installation with a Linksys WUSB600N. VirtualBox or the Linksys is not to blame, as it works well with an RPi-based transmitter. But with OpenWRT, no matter what the settings are, the result is the same: 0.87. With the WUSB600N (Ralink RT2870) as the transmitter’s adapter: 0.87 again.

    Probably some bloody mess in OpenWRT’s Wi-Fi subsystem, but maybe someone has other ideas to check?

    • That is quite strange. The TX restart message says that you lost 14680064 packets, which does not make sense. I guess that the packet contents are not right (like something being prepended to the packet content). This way the sequence number (the first 4 bytes of each packet) is assumed to be in the wrong place. The best way to debug this issue is to capture the transmitted packets (maybe with bogus data instead of actual video data) with wireshark and inspect whether the sequence number is where it is supposed to be. Really helpful for this is to set up a display filter for the wifibroadcast MAC address.

      • Here comes the fix 🙂

        wifibroadcast / tx.c

        *(uint32_t*)pu8 = pcnt; → *(uint32_t*)pu8 = le32_to_cpu(pcnt);

        AR9331 is big-endian.
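The bug above is a host-endianness dependency: writing the raw u32 into the packet buffer produces big-endian bytes on the AR9331, while rx expects little-endian. Serializing the field explicitly avoids this class of bug; as an illustration (not wifibroadcast's code, which is C), Python's struct module forces little-endian with the "<" prefix on any host:

```python
import struct

def pack_seq(pcnt):
    """Serialize the 32-bit sequence number little-endian, host-independent."""
    return struct.pack("<I", pcnt)

def unpack_seq(buf):
    """Read the sequence number back from the first 4 bytes of a packet."""
    return struct.unpack("<I", buf[:4])[0]
```

In C the equivalent would be an explicit byte-order conversion (as in the fix quoted above) rather than a raw pointer cast.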

      • Wow, cool! What hardware do you use? Do you have any pictures?

      • (it’s not selling yet, but I’m one of developers so I have a few)

        What I want to make, the final goal, is a ready-to-use matchbox-sized HD FPV. It’s even possible to create a special Black Swift version with an onboard power amplifier and a U.FL antenna connector.

      • BTW, if you want to discuss it in more detail, can you drop me a line at I have some thoughts I don’t want to share in public yet.

  31. Jozsef Voros permalink

    I think a GoPro, Mobius, keychain #16 etc. with a firmware mod for driving the wifi adapter (omitting the relatively big Raspberry on the Tx side) could be a revolution in FPV. Small, inexpensive, with the picture quality of these latest videos (or even better, as the Raspberry camera is inferior to a GoPro or Mobius), it could be a new age in FPV. Comparable to the DJI Lightbridge at a much better price level.
    Not sure what the availability of the camera development kits is, but using such a kit it can’t be too difficult to make the mods necessary to use wifibroadcast.

    • What I’m doing now is porting wifibroadcast to (25×35 mm microcomputer with onboard Wi-Fi). It has a USB port so one can use a USB webcam with H.264 hardware encoding.

      With an OEM camera module the whole system can fit inside a matchbox.

      • peter permalink

        I already have it running on openwrt. Carambola2….

        Looks really nice, the small board …

      • Josh permalink

        What latency are you achieving with the blackswift and carambola?
        Any posts on the limitations of these platforms?

  32. Alan permalink

    Hi ,

    Do you use the desktop on the raspberry or headless on HDMI?
    I see no picture on the rpi desktop.

    • You can use both. But I prefer using my modified “hello_video” program that displays directly on HDMI.

  33. Alan permalink

    Thanks for your support.
    I have one more question:
    When I connect the raspberry to my HDMI monitor and execute: sudo ./rx -b 8 wlan0 | /opt/vc/src/hello_pi/hello_video/hello_video.bin I don’t see any video signal.
    On the TX, I have a slowly flashing LED on the wifi card and the LED on the cam is on.

    How can i check if the TX works fine?

  34. russell permalink

    I ran into this a few times. Usually the problem is a typo on the tx side; please check whether the camera board LED is lit up as well to confirm you are streaming. In my case the wifi will blink even if the camera is not streaming.

    • Alan permalink

      Do I need to patch the wifi cards before I can use them?

  35. fc3sbob permalink

    Hey guys, I’m waiting for another TL-WN722N to arrive for the receiver, but I have set up my pi camera with one. One thing: I’ve set up my Raspberry Pi 2 exactly how it’s set up above. I even wrote a start bash script, and everything seems to start up correctly, but the light on the TP-Link card is on solid with maybe a blink here and there. I’ve gone through everything over and over again; I would expect it to blink a lot. Unfortunately, since I don’t have a second adapter yet, I can’t troubleshoot it.

    Here is my bash script, which is the same code from above

    #!/bin/bash -x
    sudo killall ifplugd
    sudo ifconfig wlan0 down
    sudo iw dev wlan0 set monitor otherbss fcsfail
    sudo ifconfig wlan0 up
    sudo iwconfig wlan0 channel 13
    raspivid -ih -t 0 -w 1280 -h 720 -fps 30 -b 4000000 -n -g 60 -pf high -o - | sudo ./tx -b 8 -r 2 wlan0

    • fc3sbob permalink

      I should mention that it just creates a huge video file until the filesystem is full, then crashes.

    • Alan permalink

      Hi fc3sbob,

      The same issue here: when I view wlan0 (TL-WN722N) with ifconfig, no tx packets have been sent (tx/0 rx/0), but tx and raspivid are running.

      When I change the wifi card to the Alfa AWUS036NH, nothing.

      Do we need to patch the wireless cards?

      • fc3sbob permalink

        Hi Alan, at least I’m not alone. We must be doing something wrong here. I’m using SSH and I’ve also tried with just a keyboard at the Pi, so that’s not the issue. I did copy the patched firmware files to /lib/firmware, and just to double check I looked at the last modified date on the file and it’s from the time I copied it. I also have the same issue: when I log in with another ssh session and check ifconfig, TX is at a very low number like 300kb and RX is 0.

      • fc3sbob permalink

        Hey Alan, I got it working. I don’t know why I didn’t try this before. I must have messed up somewhere beforehand while playing with this raspbian image, so I just copied a new raspbian image to my SD card, went through the initial raspbian setup, expanded the file system, enabled the camera, enabled SSH, and didn’t overclock it (like I usually do). Once it restarted and I got the IP address, I just SSH’d into it and followed the setup again at the top of this page, then I ran “sudo pico” and copied this TX script

        saved it as in my wifibroadcast folder, ran “sudo chmod 755” to make it executable, then “sudo ./” and it’s now broadcasting, easy as that!

      • Alan permalink

        Hi fc3sbob,

        That’s good news!
        Thanks for sharing your experience.

        I did the same thing as you and used your script.
        At least it’s working, for a few seconds, and then it stops with the message:

        Trouble injecting packet: send: No such device or address

        Which wifi cards are you using for TX and RX, and what do you use for the memory split?

        Will you share your RX script as well?

      • fc3sbob permalink

        I’m using a TP-Link TL-WN722N. I actually don’t have a second one yet; it’s on its way and should be here next week. I’m just getting the transmitter prepped. This particular script I didn’t write; I don’t remember the guy’s name, but I found it in a forum post. He also has pre-made raspbian images with the software pre-installed that you should try out, since you are having these issues. You can find the TX/RX scripts here along with the Raspberry Pi images, although I’m not sure if the images are for the original Pi or the Pi 2.

        I’m not using a raspberry pi as the receiver, just a laptop running Linux Mint, so my RX script will be the same as the one at the top of this page minus the hello_video/hello_pi stuff.

      • Alan permalink

        Hi fc3sbob,

        thanks for your help.
        Finally it works! The problem on my side was the wifi card on the TX: I used the Alfa AWUS036NH on the TX side and the TL-WN722N on the RX side, and got a latency of > 5 seconds(!).
        Not a good combination.
        When I swapped them, the Alfa AWUS036NH on the RX side and the TL-WN722N on the TX, it finally worked.
        So I ordered a TL-WN722N for the RX, and will forget the Alfa AWUS036NH.

        thanks again.

  36. Alan permalink

    I can’t get it to work.
    I set up an SSH session to the RX and start the RX script, but no stream output appears on the HDMI.
    Or is it only possible to start the rx on the HDMI console and not from SSH?

  37. Alan permalink

    Hi befinitiv,

    Thanks for sharing this awesome project!
    Is there a possibility to add encryption to the video feed, and to add an OSD with the mavlink protocol?

    • Hi Alan

      That should be doable. Wifibroadcast just transports data and does not care about the contents. So you could just add a “cipher” module between raspivid and wifibroadcast. I don’t know the mavlink protocol but as long as it is unidirectional I don’t see a problem transporting it with wifibroadcast.
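The "cipher module" idea could be sketched as a filter sitting in the pipe between raspivid and tx, XORing the stream with a keystream. This toy example (SHA-256 in counter mode) is an illustration of the pipe-filter concept only, not vetted cryptography; anything serious should use an established cipher from a real crypto library.

```python
import hashlib
import itertools

def keystream(key):
    """SHA-256 in counter mode -- illustration only, not vetted crypto."""
    for counter in itertools.count():
        yield from hashlib.sha256(key + counter.to_bytes(8, "little")).digest()

def xor_cipher(data, key):
    """XOR the stream with the keystream; applying it twice decrypts."""
    return bytes(b ^ k for b, k in zip(data, keystream(key)))
```

Because the operation is symmetric, the same filter runs on both ends of the link: once between raspivid and tx, once between rx and the player.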

      • Alan permalink

        Thanks for your information.

        Is there a possibility to run/compile RX on Windows 7, with VLC for instance?

  38. Roberto Sale permalink

    Hi. I’m trying with my setup; with 60fps I get ~160ms, and with 30fps I get 230ms of delay, but the sharpness and the light at 30fps are considerably better than at 60fps. Why is that?

    • If it is dark then 60fps leaves less time for exposure. The sensor then increases gain which lowers image quality. Could this be the case or did you observe the effect under bright conditions?

      • Roberto Sale permalink

        Yes befinitiv, it looks like that is the case. The lines have a lot of aliasing, but I will give this a shot, because the delay is considerably lower. Have you figured out any idea to improve the image quality?
        BTW, I have a method to measure the delay:
        On the screen of my pc, where I receive and decode the video, a stopwatch counts, and in a corner, the video feed. Then, taking a screenshot, I subtract the time on the video feed from the stopwatch time haha.

  39. Roberto Sale permalink

    Ok, I think I have the best so far at the moment.
    The best for playing is this command:
    sudo ./rx wlan0 | mplayer -fps 60 -cache 1024 -

    In the raspberry, I use:
    raspivid -ih -t 0 -w 1296 -h 730 -fps 49 -b 3000000 -n -g 60 -pf high -vf -o - | sudo ./tx -f 1024 -r 2 wlan0
    The 1296×730 at 49fps is because it is the full field of view, and I don’t lose any quality ( )
    The latency is 133ms!! 🙂 I can’t wait to put this in my glider :D.
    BTW, is there any difference in the receiver wifi dongle? I mean, can I use one other than the 722n, or does the injection rate have something to do with the receiving rate?

    • schalonsus permalink

      How did you get mplayer running?
      When I try to use your command I just get a black screen.

      • Roberto Sale permalink

        I don’t know why, but when the code is pasted here, the last “-” before the “|” gets longer.
        If you are copy-pasting the code, replace the long dash after “-vf -o ” with a normal one and try again!

      • schalonsus permalink

        I typed the code by hand, so this should not be the problem.
        Here is the message I get over ssh when trying to use the code

      • schalonsus permalink

        I tested playing a video from the internet, and mplayer works so far.
        And I also get a picture now with the stream, but it plays in super slow motion. Don’t know what’s wrong.

        I think the bottleneck of this system is the path from stdin to hdmi out, since the recorded videos on the receiving pi are super smooth.

    • fc3sbob permalink

      I’ll admit, that long dash was driving me crazy. I couldn’t find it on my keyboard! But then I realized that it could be replaced with a normal one.

  40. fc3sbob permalink

    So I got it working. I set up the raspberry pi camera in the front of my house and took my laptop to the far end of the backyard; right at the fence it started to lose signal. I would guess about 175ft. Not bad considering it’s going through 4 walls before coming outside. But I also realized that my SSH connection to the pi was still working out that far, so my laptop’s wifi itself is pretty good. Anyway, I added a second TP-Link card to the laptop, went out to the back of the yard again, and it didn’t lose a packet. I didn’t feel like walking down the street with my laptop in hand to test it just yet, but I’m sure in the open air with no obstacles it should give me a pretty good distance. I plan on getting better antennas for all 3 adapters too.

    • fc3sbob permalink

      I wonder if it would be possible to start and maintain an SSH session over a data pipe. This would eliminate the need for a second wifi adapter on my Pi connected to my home network, which won’t be available when I’m out in a field flying my quad.

  41. DAGA permalink

    Really great project! I just got 3D working:
    Two Pis are sending two separate 720p streams to one Pi with two TP-Link wn722n adapters. The first stream is sent over channel 1 and the second over channel 13. I modified video.c in order to position and resize the video and compiled two versions, one for the left image and one for the right image.
    As the streams are 720p and won’t fit side-by-side on a 1080p display, I modified config.txt to output a 2560×1440 image. Here is an indoor test with a 4k TV:

    I also tried to send the streams over the same channel on different ports. If I use one TP-Link for the RX it works, but I get some frame drops. If I use two TP-Links in diversity, the RX pi freezes. What could be the problem here? Diversity works fine with only one stream.

    • Wow, that looks cool! Must be quite an experience to fly 3D FPV. Do you plan to use an Oculus as a display device?

      Why your rx freezes is currently not clear to me. In principle the two rx instances should be independent. Sorry, can’t help you on this one.

    • Broccoli permalink

      Hi Daga,
      I had a very similar idea :), but without the 3D (I also have the Gear VR).
      Do you document/share your project somewhere? How much did the 6-incher cost?

      How will you mount the screen to the VR? Some 3D-printed mount? 🙂

      BR Mathias

    • jholster permalink

      Cool to know that an RPi can drive a 2560×1440 screen! There are some nice 6″ 2560×1440 displays with an HDMI controller board on aliexpress; I’m considering buying one. What framerate do you get? Can you please post your config.txt?

      • jholster permalink

        (My question was for you DAGA).

  42. Seb permalink

    Hey, while I’m trying to get a stable link (my RC seems to interfere with WiFi): have you guys ever tried setting the channel to 40MHz bandwidth to increase throughput? “sudo iw dev wlan0 set channel 13 HT40-” Theoretically that would reduce the power level per 20MHz and decrease range, right?

    • schalonsus permalink

      Throughput is not a real problem, I think.
      Even with a wn722n as tx and good antennas I manage to get 1km range. But it’s a good idea; I will try whether it makes any difference.

      The biggest bottleneck seems to be the path from stdin to hdmi out. The recorded videos on my receiving pi look really crisp and play super smooth, but on my TFT the feed from the pi looks like 20fps and not as crisp as the recording.

  43. Walkeer permalink

    Hi guys, this may be the first truly affordable HD FPV downstream available, if it is possible to overcome some details:
    5.8GHz spectrum: since the majority of FPV users have a 2.4GHz radio for RC, 5.8GHz would be best for the streaming; it is more or less the standard for fpv setups on miniquads. Therefore, we need some usable 5.8GHz wifi, preferably 802.11n or even ac for better range and more bandwidth (more retransmissions possible), which can be used with this project. Any ideas? Perhaps some integrated atheros solutions from ubnt or mikrotik?
    Better signal loss behavior: I believe this would be fixed by using MJPEG rather than h264, or using h264 with a key frame in every frame, as we do not want the dependency between frames which regular h264 has.
    Lower latency: perhaps MJPEG, with its significantly reduced processing needs, will lower the latency. Do we know what the main cause of the latency is? With the CSI/MIPI camera on a raspberry 2 the camera should not be an issue, so is it the wifi transmission? A regular wifi link has a latency around 1-5ms, so that should not be the case here either; probably the encoding/decoding? I will try to play with this part

  44. Nicolas permalink

    Hey befinitiv, I have seen you have started implementing FEC from udpcast. Thanks, thanks, thanks.

    Now I feel bad, because I haven’t had time to dig into the buildroot stuff further. But I will do that sometime 🙂

    But I have looked into the other options in the meantime, namely this one I wrote about earlier:

    “– Make some kind of virtual wrapper-interface using tun/tap or something. So that udpcast speaks to the wrapper-interface that behaves like a normal network interface. The wrapper-interface will then speak to the atheros monitor-mode interface. Also not sure if possible at all and how much effort.”

    I’d say it is actually possible and probably the cleaner solution because this way, we can use any normal network application we’d like to use and don’t need to touch wifibroadcast for new functionality. Drawback would be of course, that it’s harder to control buffers and latency because there is more stuff in the chain.

    And I had another idea regarding the TX side. One could use two wifi cards on two different channels for the TX side to increasy bandwidth and resiliency against noise/interference and other wifi networks. If the FEC/interleaving occurs before sending out the packets on the physical interfaces, that would mean we have an interleaved and FECed data stream running over two different frequencies. I’d say the probability of both links having heavy interference (that the FEC/interleaving can’t deal with) at the same should be around zero, giving a very very stable link even in the worst conditions.
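    The two-card idea above can be sketched in a few lines (a hypothetical helper, not wifibroadcast code): if each FEC block is spread round-robin over both cards, a total outage on one frequency costs at most half of the packets of any block, which a 1/2-rate code could still reconstruct.

```python
# Sketch of the two-card interleaving idea (not wifibroadcast code):
# spread each FEC block round-robin over two interfaces so that a total
# outage on one frequency never removes more than half of any block.

def interleave(block, num_links=2):
    """Assign packet i of a block to link i % num_links."""
    links = [[] for _ in range(num_links)]
    for i, pkt in enumerate(block):
        links[i % num_links].append(pkt)
    return links

# A block of 8 packets: 4 data + 4 FEC (1/2-rate coding assumed).
block = [f"pkt{i}" for i in range(8)]
link_a, link_b = interleave(block)

# If link B is completely jammed, only 4 of the 8 packets are lost --
# exactly what a 1/2-rate FEC code can still tolerate.
print(len(link_a))  # 4 packets of every block survive on link A
```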

    Here is how I think it could work (please bear with me, I’m not a programmer)

    Here is a tutorial about tun/tap programming in C:
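    For reference, attaching to a tap device on Linux boils down to a single ioctl. A minimal Python sketch (the interface name wbc0 is made up; the constants are the usual values from <linux/if_tun.h>; creating the device needs root):

```python
import fcntl
import os
import struct

# Constants from <linux/if_tun.h>
TUNSETIFF = 0x400454ca
IFF_TAP = 0x0002    # ethernet-level tap device (IFF_TUN = 0x0001 for IP level)
IFF_NO_PI = 0x1000  # no extra packet-info header on reads/writes

def open_tap(name="wbc0"):
    """Create/attach a tap interface; requires root and /dev/net/tun."""
    fd = os.open("/dev/net/tun", os.O_RDWR)
    ifr = struct.pack("16sH", name.encode(), IFF_TAP | IFF_NO_PI)
    fcntl.ioctl(fd, TUNSETIFF, ifr)
    return fd  # read()/write() on fd now moves whole ethernet frames

# The ioctl argument is the start of a struct ifreq: 16-byte name + flags.
ifr = struct.pack("16sH", b"wbc0", IFF_TAP | IFF_NO_PI)
print(len(ifr))  # 18
```

Everything read from the fd could then be handed to tx, and rx output written back, which is the wrapper-interface idea described above.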

    • Hi Nicolas

      You noticed the new branch quite fast 🙂 Udpcast was a very good tip of yours, so thank you for that! I was surprised to see how fast the code is. It can use the full bandwidth (22mbit/s) in the case of the Raspi A+ with 1/2 coding. A rough comparison of wifibroadcast retransmission vs FEC shows that, comparing just the error counts, a coding rate of 1/2 with retransmissions is roughly as reliable as a 3/4 rate with FEC. However, this was just at a block length of 4 packets. Using longer blocks significantly improves FEC compared to RETR. So using FEC might open up the possibility of using MCS1 for increasing the range/reliability.

      The code is close to being finished but it still needs lots and lots of testing… but I’m quite excited. Without FEC wifibroadcast was already superior to wifi FPV, but with FEC available you would be out of your mind to use normal wifi for FPV 🙂

      Right now I am putting all my effort into getting FEC into mainline. So there will be no spare time for TUN/TAP, sorry. But the idea sounds interesting!

      • Nicolas permalink

        Thanks for the reply. Yeah, I should’ve looked into that tun/tap thing earlier, not when you are almost done building something else 😉 For me personally, FEC inside wifibroadcast is better; I don’t need any functionality that wifibroadcast doesn’t have, and there’s less hassle with added latency and jitter from buffers and queues in logical interfaces and udpcast etc. Also, there is no bandwidth wasted on IP/UDP headers, allowing for smaller packet sizes without too much overhead penalty.

  45. Wiggles permalink

    I’m trying this with a pair of RT3070-based USB wifi adaptors. When I transmit I can get it to send packets no problem, however when I try the receiver commands all I get is:
    DLT_IEEE802_11_RADIO Encap

    And nothing more after that. Is this an issue with the RT3070 or another setup issue?

    • Try another channel on RX/TX. If I use channel 13 with these cards I get the same issue. If I use channel 1, for example, I get images, but with a strange slow-mo effect. Don’t know why.

    • EchoG permalink

      Hi Wiggles, did you have any luck getting it to work? I get the same problem, though with the AWUS051NH as the receiver.

  46. Nicolas permalink

    Here are some interesting links about how to change medium access strategy with ath9k_htc driver:

    Click to access acsac2014.pdf

  47. Roberto Sale permalink

    Hi again. Can you please explain what the keyframe rate or “intra refresh period” is, and what its consequences are for the streaming?
    And how do you record the stream?
    Thanks a lot!

    • Hi

      Roughly speaking: the h264 stream consists of two types of images, full frames (key frames, K) and difference frames (D). The stream looks like this: KDDDDDDKDDDDDD. At the receiver each image is computed from the keyframe and its difference frames. If there was some corruption in between, the error persists until the next key frame. So by increasing the keyframe rate you get back to a good image faster after a corruption. I would suggest having a keyframe at least every 0.5s.
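      The KDDDDDD structure can be illustrated with a toy decoder model (just an illustration, nothing to do with the real h264 decoder): a corrupted frame leaves the picture wrong until the next keyframe resets it.

```python
# Toy model of the KDDDDDD stream structure: damage to any frame
# persists in the decoded output until the next keyframe resets it.

def decode(stream, corrupted):
    """stream: one 'K'/'D' char per frame; corrupted: set of damaged indices.
    Returns, per frame, whether the displayed image is correct."""
    ok = []
    good = False
    for i, frame in enumerate(stream):
        if frame == "K":
            good = i not in corrupted  # keyframe resets the decoder state
        elif i in corrupted:
            good = False               # damage persists from here on
        ok.append(good)
    return ok

stream = "KDDDDDDKDDDDDD"
# Frame 2 is damaged: frames 2..6 stay wrong, frame 7 (the next K) recovers.
print(decode(stream, corrupted={2}))
```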

      For recording a stream you can take a look at my rx script:

  48. Robert permalink

    Thanks! That helps a lot. And in the -g option of raspivid, what is that number? I.e. if I have -g 8, is that 1 key frame and 8 difference frames?

    • I think it should be 1 keyframe and 7 difference frames. As an example, if you set your frame rate to 30fps and your keyframe interval to 15, then you would have two keyframes per second -> a keyframe every 0.5s.
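      The arithmetic is simply interval = g / fps; as a one-liner (just an illustration, not wifibroadcast code):

```python
def keyframe_interval(fps, g):
    """Seconds between keyframes when one I-frame is emitted every g frames."""
    return g / fps

print(keyframe_interval(30, 15))  # 0.5 -> two keyframes per second
print(keyframe_interval(30, 8))   # ~0.27 s with -g 8 at 30fps
```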

      • Hi,

        I was quite surprised that the video from the CSI port on the Raspberry Pi (physically connected to the GPU) is converted to h264 using only the GPU, which explains how the very slow CPU is able to keep up. My question is: is it possible to use a keyframe for every frame? If yes, why are you not using it? Is it for bandwidth reasons?

        Now, the ultimate question: what is the main cause of the latency? The wifi usually has around 1-10ms, so it must be the camera, the conversion to h264 in the GPU, or the receiver side, where the video might be cached in some buffer of the video player.

        Many thanks for any insights.

  49. Alexandre Licinio permalink

    Hi all, I want to share my experience and a tweak to decrease the latency on the rx side and to move the receiver far away (up to the max length of an ethernet cable). Maybe I’m repeating something that someone did already; sorry for that.


    What I wanted to do is to use this fork of hello_video by Dom (Raspberry Pi Engineer & Forum Moderator) with wifibroadcast. Latency is better without the clock.
    And I also wanted to move the rx side to the end of a long ethernet cable and have another rpi output the HDMI, because sometimes you can’t have 100 meters of HDMI!!


    [TX] rpi(A) [WIFI]: start wifibroadcast as usual.
    This is my command line for raspivid:
    raspivid -n -o - -t 0 -md 0 -pf high -g 0 -w 1280 -h 720 -b 4000000 -fps 30 -a CAM-A

    [RX] rpi(B): receive and forward over ethernet:
    sudo ./rx -b 8 -r 4 -f 1024 wlan0 | nc -l 5001

    [Display] rpi(C): build hello_video_simple and pull the stream from rpi(B):
    git clone
    cd userland/host_applications/linux/apps/hello_pi/
    cp -r hello_video_simple/ /opt/vc/src/hello_pi/
    cd /opt/vc/src/hello_pi/
    cd hello_video_simple
    nc "ip of the rpi(B)" 5001 | /opt/vc/src/hello_pi/hello_video_simple/hello_video_simple.bin

    If you stop the rpi(C) side it will break the pipe on rpi(B), so try to loop the command to avoid this breakdown (I didn’t test it yet).
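    The looping could look like this (untested sketch; RPI_B_IP is a placeholder for the address of rpi(B)):

```shell
#!/bin/sh
# Restart the display pipeline whenever the connection to rpi(B) drops,
# so stopping/starting rpi(B) does not permanently kill rpi(C)'s player.
while true; do
    nc RPI_B_IP 5001 | /opt/vc/src/hello_pi/hello_video_simple/hello_video_simple.bin
    sleep 1   # avoid busy-looping while rpi(B) is down
done
```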

    It would be nice to do diversity over ethernet, maybe with socat…


    Do you think wifibroadcast could work with Mikrotik devices?
    I want to use other standalone encoders with the capabilities of wifibroadcast, and not use OFDM devices that are not license-free.

    • Nicolas permalink

      Cool, that hello_video_simple with lower latency sounds really interesting. Now we have improvements in latency on both the TX and RX sides! (befinitiv already made some latency-related changes to raspivid)

      Regarding wifibroadcast on a Mikrotik device. You mean some kind of wireless AP? Not sure what those Mikrotiks run, but I have read on some other forum that somebody made it work on a wireless AP with OpenWRT.

      • Alexandre Licinio permalink

        Yes, I was thinking of using Mikrotik Routerboards such as the Basestation2 and Basestation5 (AP and CPE). Nice to hear about openwrt. The thing is I want to use a standalone encoder (not a linux device with raspivid) for other applications.

        Sometimes on the RX side I get the error “could not fully reconstruct block …” and no image appears. Why? Do you have any idea?

      • Alexandre Licinio permalink

        I just checked: the latency is the same even if you use hello_video_simple.

  50. natxopedreira permalink


    I set up the system using a rpi and a desktop ubuntu machine, using two WN722N.

    I compiled and got this running without issues (thanks for all the info here!!) but I’m getting one strange behavior.

    Sometimes, instead of losing frames or the quality getting worse, it seems like frames are stored somewhere in a buffer and then you see old frames coming from the past.

    Has anyone seen this?

    • Does this occur together with error messages on rx like “lost packet” or “unable to reconstruct block”? I have seen some h264 decoders that show the behavior you described when you give them invalid/corrupted data at the input.

  51. natxopedreira permalink

    Yes, when it happens I see those messages on the rx side.

    Any clue how I can avoid that?

  52. Thijs Sillen permalink

    Had a bit of trouble to get it working but it does now.
    Full details here:

    With link to image files , full details

  53. Kieran permalink

    It might be nice to discuss this in an interactive setting so I started #wifibroadcast on Freenode IRC

  54. Hornet WL permalink

    Regarding the “trouble injecting packet”: I think this problem is caused by a sudden rise in the bitrate of the video stream that exceeds the link capacity. I can trigger this problem reliably by pointing the camera into the light and moving my finger over the lens with a distance of a few millimeters.

    Unfortunately similar things happen quite often when handling the plane before launch. Any chance of catching this issue in software without needing to restart the tx program?

  55. I’ve joined the Wifibroadcast users; I ordered all the parts last week and got it working today, thanks to a nicely documented project.
    I have a couple of issues even though it is streaming nicely. I cannot end the streaming or get either Raspberry Pi to respond. It seems to be locked into doing only one thing: streaming the video.
    Is this just me? Is there a keystroke that would interrupt and kill the parent processes?

    The other issue, not really an issue but a question: to make this easy to use, what is the best approach to initiating the streaming, then ending it at the end of an FPV flight? Should I add touch-sensitive screens with icons linked to scripts for tx, rx, shutdown, (login?)
    How do we make this an embedded device rather than a computer? Any ideas to share?

    Richard Evans

    • Hi Richard

      Glad to hear about a new happy user 🙂 Showing you my current ground station should answer most of your questions. Could you remind me again in two weeks about that? Currently I’m on holidays.

  56. Eric Maglio permalink

    Thanks for posting such an awesome project! I’m still fairly new to programming the Raspberry Pi, but got everything running without much difficulty.

    I’m hoping I can find some help adding a MAVLINK serial stream over the Wifibroadcast link to handle the telemetry connection from a Pixhawk autopilot. Can anyone provide some tips? I’ve got MAVProxy running on the airborne Pi and successfully communicating with the Pixhawk, but I’m not sure how to get that data down to the ground. I figure I can’t just run another instance of tx, since the telemetry connection is 2-way.

  57. natxopedreira permalink


    In TX im using a raspberrypi B, wifi dongle wn722n, and i use the scripts from this repo.Installed following the info on this blog

    • Strange. Unfortunately, I have no RPi B to reproduce your setup. Using the A+ and the 2 with the 722, that problem never occurred. Maybe you could look at dmesg after the error message? A todo for me would be to improve the error message so that it also states the reason for the failure.

      • natxopedreira permalink

        Yep, it’s strange.

        Let me say that I have tested various methods to make the stream: gstreamer udp, mjpg, and rtsp stream…. All suffered from the same behaviour, which is very strange.

        I’m going to try to borrow a rpi2 to test, and look for a more descriptive error msg.

        I thought it was a known issue and felt a little bit of frustration…. but from your answer it seems like it shouldn’t happen, so I’m going to do more tests.

  58. natxopedreira permalink

    Well, I tried again using a rpi2 for TX and a rpiB for RX.

    And now it works!!! Yesterday I tested and was not able to see the strange behaviour.

    So it seems like you cannot use a rpiB for TX; I would love to know why……

    Thanks a lot for this!!!!! Its very nice!

    Now i only need to think a way to put the signal in an oculus dk1.

  59. Hello befinitiv, thanks for such a great software, finally we can FPV in style. 😉

    I have the system working between 2 pi’s and another PC running Ubuntu 14.04 vmware.

    On the PC the rx software is working, but when piping it into mplayer it only shows some strange color bars. Here is a picture.

    Any ideas what could be wrong?

    This is the command im using on the PC.

    ./rx -b 8 wlan0 | mplayer -fps 60 -cache 1024 -

    Thanks alot.

    • Did you check if all systems are running the same revision of wifibroadcast? (Check with “hg parent”). Especially if you are using the latest version with FEC make sure that all parameters match both on rx and tx. You will get it to work, I’m sure 🙂

      • Robert permalink

        Hi again. That’s the problem. If you don’t use the -f parameter on the TX, it uses the default bytes per frame (1470). So either you add -f 1024 to the TX command, or you replace -cache 1024 with -cache 1470 in the rx command.
        Hope that helps.
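        In general the -b/-r/-f values have to be identical on tx and rx. A sketch of a matched pair of commands (interface names and video settings taken from the commands in this thread):

```shell
# Air side: packet size set explicitly with -f
raspivid -ih -t 0 -w 1280 -h 720 -fps 30 -b 4000000 -n -g 60 -pf high -o - \
    | sudo ./tx -b 8 -r 2 -f 1024 wlan0

# Ground side: same -b/-r/-f values, otherwise rx mis-parses the stream
sudo ./rx -b 8 -r 2 -f 1024 wlan0 | mplayer -fps 60 -cache 1024 -
```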

    • Robert permalink

      Hi. Can you post the TX command? I’m pretty sure that is the problem.

      • Hi.
        Here is the command on the tx pi.

        raspivid -ih -t 0 -w 1280 -h 720 -fps 30 -b 4000000 -n -g 60 -pf high -o – | sudo /home/pi/wifibroadcast/tx -b 8 -r 2 wlan0

        And i made sure both tx and rx are the same version.

        changeset: 55:ea4b247f4415
        tag: tip
        user: befi
        date: Sun Jul 19 21:01:54 2015 +0200
        summary: added license text

        Thanks for the fast reply Robert and Befinitiv

      • Small update: I cloned the newest version onto the tx and rx pi, compiled the code and ran the same commands as above. Now the rx pi can’t decode the video via hello_video.bin from bitbucket. But if I pipe rx -b 8 -r 2 wlan0 to a file and then open it with omxplayer it works.

  60. Natxopedreira permalink

    Try with the scripts from the bitbucket repo

    • chris permalink

      Check your command line just before the pipe, you have (dash) (lowercase o) (space) (em(long)dash) -o – |

      What you need is (dash)(lowercase o) (space) (dash) -o – |

      • chris permalink

        Hmmm, it looks like this CMS system here automagically turns some dashes into emdashes (an emoji plugin?), this answers the question why this is a recurring problem. Just backspace over that long dash in the command line you copy pasted and replace it with the regular dash found on your keyboard. DON’T copy paste from my reply above (or any other comment here) without being aware you need to manually edit it after pasting.

    • It was the -f 1024… Thanks Robert.

      After that it worked fine until I moved the gear into my RC car. Then the picture started having a huge delay, more than 10 secs.

      I traced the source of the problem to a small DC-DC converter I had added that was injecting noise into the Alfa AWUS036NHA. I need to build a small shield case for both the TX and RX wifi devices for sure.

      Next step will be to have the picture side by side and feed it to my Oculus Rift. The Pi manages to do an ok job driving the DK2 at 1920×1080.

      Any ideas how to make the picture SBS?

      • Natxopedreira permalink

        I think you can do it with a gstreamer pipe, but I think it would add latency, because you have to resize the video and duplicate the render. Also you will need to do something about the chromatic aberration and distortion to be OK for the oculus.

        The distortion can be handled using a different lens on the camera, more in the 110 FOV range.

        I’m making tests to see if I can output the OpenMAX video render to an OpenGL texture or something to work on.

        If I can get a texture into OpenGL I can make an oculus player for this, for one camera or 3d (using two). I will post here.

      • DAGA permalink

        I already got side by side working in 1440p using a raspberry pi:

  61. natxopedreira permalink

    Befinitiv, I uploaded a modded version of raspivid with the possibility to set parameters on the output, like the position and size of the source and render. It’s a fragment of code found on the raspberry forums.

    Works great; maybe you can merge it with your version. I can also mod this to take the parameters from the console call if you want.

    Copyright (c) 2012, Broadcom Europe Ltd
    All rights reserved.
    Redistribution and use in source and binary forms, with or without
    modification, are permitted provided that the following conditions are met:
    * Redistributions of source code must retain the above copyright
    notice, this list of conditions and the following disclaimer.
    * Redistributions in binary form must reproduce the above copyright
    notice, this list of conditions and the following disclaimer in the
    documentation and/or other materials provided with the distribution.
    * Neither the name of the copyright holder nor the
    names of its contributors may be used to endorse or promote products
    derived from this software without specific prior written permission.
    // Video decode demo using OpenMAX IL through the ilclient helper library
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <unistd.h>

    #include "bcm_host.h"
    #include "ilclient.h"

    static int video_decode_test(void)
    {
       OMX_VIDEO_PARAM_PORTFORMATTYPE format;
       OMX_TIME_CONFIG_CLOCKSTATETYPE cstate;
       OMX_CONFIG_DISPLAYREGIONTYPE configDisplay;
       COMPONENT_T *video_decode = NULL, *video_scheduler = NULL, *video_render = NULL, *clock = NULL;
       COMPONENT_T *list[5];
       TUNNEL_T tunnel[4];
       ILCLIENT_T *client;
       int status = 0;
       unsigned int data_len = 0;

       memset(list, 0, sizeof(list));
       memset(tunnel, 0, sizeof(tunnel));

       if((client = ilclient_init()) == NULL)
          return -3;

       if(OMX_Init() != OMX_ErrorNone)
       {
          ilclient_destroy(client);
          return -4;
       }

       // create video_decode
       if(ilclient_create_component(client, &video_decode, "video_decode", ILCLIENT_DISABLE_ALL_PORTS | ILCLIENT_ENABLE_INPUT_BUFFERS) != 0)
          status = -14;
       list[0] = video_decode;

       // create video_render
       if(status == 0 && ilclient_create_component(client, &video_render, "video_render", ILCLIENT_DISABLE_ALL_PORTS) != 0)
          status = -14;
       list[1] = video_render;

       // create clock
       if(status == 0 && ilclient_create_component(client, &clock, "clock", ILCLIENT_DISABLE_ALL_PORTS) != 0)
          status = -14;
       list[2] = clock;

       memset(&cstate, 0, sizeof(cstate));
       cstate.nSize = sizeof(cstate);
       cstate.nVersion.nVersion = OMX_VERSION;
       cstate.eState = OMX_TIME_ClockStateWaitingForStartTime;
       cstate.nWaitMask = 1;
       if(clock != NULL && OMX_SetParameter(ILC_GET_HANDLE(clock), OMX_IndexConfigTimeClockState, &cstate) != OMX_ErrorNone)
          status = -13;

       // create video_scheduler
       if(status == 0 && ilclient_create_component(client, &video_scheduler, "video_scheduler", ILCLIENT_DISABLE_ALL_PORTS) != 0)
          status = -14;
       list[3] = video_scheduler;

       set_tunnel(tunnel, video_decode, 131, video_scheduler, 10);
       set_tunnel(tunnel+1, video_scheduler, 11, video_render, 90);
       set_tunnel(tunnel+2, clock, 80, video_scheduler, 12);

       // setup clock tunnel first
       if(status == 0 && ilclient_setup_tunnel(tunnel+2, 0, 0) != 0)
          status = -15;
       else
          ilclient_change_component_state(clock, OMX_StateExecuting);

       if(status == 0)
          ilclient_change_component_state(video_decode, OMX_StateIdle);

       memset(&format, 0, sizeof(OMX_VIDEO_PARAM_PORTFORMATTYPE));
       format.nSize = sizeof(OMX_VIDEO_PARAM_PORTFORMATTYPE);
       format.nVersion.nVersion = OMX_VERSION;
       format.nPortIndex = 130;
       format.eCompressionFormat = OMX_VIDEO_CodingAVC;
       format.xFramerate = 60 << 16;

       if(status == 0 &&
          OMX_SetParameter(ILC_GET_HANDLE(video_decode), OMX_IndexParamVideoPortFormat, &format) == OMX_ErrorNone &&
          ilclient_enable_port_buffers(video_decode, 130, NULL, NULL, NULL) == 0)
       {
          OMX_BUFFERHEADERTYPE *buf;
          int port_settings_changed = 0;
          int first_packet = 1;

          ilclient_change_component_state(video_render, OMX_StateIdle);

          // position the output in a 640x360 window instead of fullscreen
          memset(&configDisplay, 0, sizeof(OMX_CONFIG_DISPLAYREGIONTYPE));
          configDisplay.nSize = sizeof(OMX_CONFIG_DISPLAYREGIONTYPE);
          configDisplay.nVersion.nVersion = OMX_VERSION;
          configDisplay.nPortIndex = 90;
          configDisplay.set = OMX_DISPLAY_SET_FULLSCREEN | OMX_DISPLAY_SET_NOASPECT |
                              OMX_DISPLAY_SET_DEST_RECT | OMX_DISPLAY_SET_SRC_RECT;
          configDisplay.fullscreen = OMX_FALSE;
          configDisplay.noaspect = OMX_TRUE;
          // dest size
          configDisplay.dest_rect.x_offset = 0;
          configDisplay.dest_rect.y_offset = 0;
          configDisplay.dest_rect.width = 640;
          configDisplay.dest_rect.height = 360;
          // source size
          configDisplay.src_rect.x_offset = 0;
          configDisplay.src_rect.y_offset = 0;
          configDisplay.src_rect.width = 640;
          configDisplay.src_rect.height = 360;

          int stat = OMX_SetParameter(ILC_GET_HANDLE(video_render), OMX_IndexConfigDisplayRegion, &configDisplay);
          printf("stat= %#x\n", stat);

          ilclient_change_component_state(video_decode, OMX_StateExecuting);

          while((buf = ilclient_get_input_buffer(video_decode, 130, 1)) != NULL)
          {
             // feed data and wait until we get port settings changed
             unsigned char *dest = buf->pBuffer;

             data_len += read(STDIN_FILENO, dest, buf->nAllocLen-data_len);

             if(port_settings_changed == 0 &&
                ((data_len > 0 && ilclient_remove_event(video_decode, OMX_EventPortSettingsChanged, 131, 0, 0, 1) == 0) ||
                 (data_len == 0 && ilclient_wait_for_event(video_decode, OMX_EventPortSettingsChanged, 131, 0, 0, 1,
                                                           ILCLIENT_EVENT_ERROR | ILCLIENT_PARAMETER_CHANGED, 10000) == 0)))
             {
                port_settings_changed = 1;

                if(ilclient_setup_tunnel(tunnel, 0, 0) != 0)
                {
                   status = -7;
                   break;
                }

                ilclient_change_component_state(video_scheduler, OMX_StateExecuting);

                // now setup tunnel to video_render
                if(ilclient_setup_tunnel(tunnel+1, 0, 1000) != 0)
                {
                   status = -12;
                   break;
                }

                ilclient_change_component_state(video_render, OMX_StateExecuting);
             }

             if(!data_len)
                break;

             buf->nFilledLen = data_len;
             data_len = 0;

             buf->nOffset = 0;
             if(first_packet)
             {
                buf->nFlags = OMX_BUFFERFLAG_STARTTIME;
                first_packet = 0;
             }
             else
                buf->nFlags = OMX_BUFFERFLAG_TIME_UNKNOWN;

             if(OMX_EmptyThisBuffer(ILC_GET_HANDLE(video_decode), buf) != OMX_ErrorNone)
             {
                status = -6;
                break;
             }
          }

          buf->nFilledLen = 0;
          buf->nFlags = OMX_BUFFERFLAG_TIME_UNKNOWN | OMX_BUFFERFLAG_EOS;

          if(OMX_EmptyThisBuffer(ILC_GET_HANDLE(video_decode), buf) != OMX_ErrorNone)
             status = -20;

          // wait for EOS from render
          ilclient_wait_for_event(video_render, OMX_EventBufferFlag, 90, 0, OMX_BUFFERFLAG_EOS, 0,
                                  ILCLIENT_BUFFER_FLAG_EOS, -1);

          // need to flush the renderer to allow video_decode to disable its input port
          ilclient_flush_tunnels(tunnel, 0);
       }

       ilclient_disable_port_buffers(video_decode, 130, NULL, NULL, NULL);

       ilclient_teardown_tunnels(tunnel);

       ilclient_state_transition(list, OMX_StateIdle);
       ilclient_state_transition(list, OMX_StateLoaded);

       ilclient_cleanup_components(list);

       OMX_Deinit();

       ilclient_destroy(client);
       return status;
    }

    int main(int argc, char **argv)
    {
       bcm_host_init();
       return video_decode_test();
    }

  62. Daniel permalink


    I try to setup an UBUNTU PC as the RX.

    The first part of the setup is working just fine:

    sudo apt-get install mercurial libpcap-dev iw
    hg clone
    cd wifibroadcast

    But then I have serious trouble. It looks like some directories are not being created:

    Here is what my terminal is responding to my commands and some additional directory infos to track down the problem.

    On the Raspberry I have no problems at all…

    schluff@ubuntu:/opt$ cd $HOME
    schluff@ubuntu:~$ hg clone
    destination directory: hello_video
    abort: destination ‘hello_video’ is not empty
    schluff@ubuntu:~$ cp hello_video/video.c /opt/vc/src/hello_pi/hello_video
    cp: cannot create regular file ‘/opt/vc/src/hello_pi/hello_video’: No such file or directory
    schluff@ubuntu:~$ cd /opt/vc/src/hello_pi/
    bash: cd: /opt/vc/src/hello_pi/: No such file or directory
    schluff@ubuntu:~$ ls
    Desktop Downloads hello_video Pictures Templates wifibroadcast
    Documents examples.desktop Music Public Videos
    schluff@ubuntu:~$ cd \opt
    bash: cd: opt: No such file or directory
    schluff@ubuntu:~$ cd /opt
    schluff@ubuntu:/opt$ ls

    any suggestions whats going wrong?

    Thanks a lot,


    • natxopedreira permalink

      I tried to reply to your question on the german fpv forum but I don’t understand german, so I made a mistake with the reply form, anyway…

      hello_video is only for the raspberry; on ubuntu you can use a gstreamer pipeline, you have an example in this post.


    • natxopedreira permalink


      You cannot use hello_video on ubuntu; it will only run on raspbian.

      On ubuntu you can use a gstreamer pipeline; you have an example in the post.

  63. natxopedreira permalink

    I have this running on a compute module with two CSI cameras, sending one 3d SBS video stream and using an oculus rift to view the 3d video. It’s great!!!! Thanks for wifibroadcast!

    • Walkeer permalink

      Could you please share what HW you are using exactly? Or some more details on how you managed to connect it to the oculus rift? Many thanks.

      • natxopedreira permalink

        Yes of course.

        On TX:

        – compute module with io board
        – two CSI camera adapters with 2 cameras
        – one tp-link wifi stick
        – raspivid with -3d sbs flag

        On RX:
        – ubuntu 14.04
        – to view the stream in the oculus I modified a script that uses python + pygtk + gst. So basically I have a gstreamer pipeline that does:

        + makes a copy of the stream to have one frame for each eye, crops and rotates; later each eye can get an X-Y offset, and there is another setting to increase/decrease the IPD distance between the eye renders.

  64. Jolyon permalink

    I wonder if overclocking the tx pi will help reduce latency?

    The link below discusses the possibility of changing the GPU freq by adding “force_turbo=1” and then setting “h264_freq” and “isp_freq”. That will increase the speed of the camera bus and the h264 hardware encoding.


  65. Seb permalink

    My wifibroadcast platform for the 3dr y6…

  66. IverM permalink

    This project rocks! Again thx befinitiv for bringing this out!

    A few observations:

    The new(ish) FEC and 10 mbit mode works great!!

    Be careful with the -g option in raspivid: it can quickly increase the video size beyond the available bandwidth, producing framerate slowdowns and lag, especially with rapid scene changes. Precisely what we want to avoid in FPV.

    Also, I’ve noticed that raspicam/raspivid produces micro-stutter every time it transmits an I-frame. It is especially noticeable in e.g. cctv-like street scenes where you have an overall static scene with small moving parts (pedestrians, cars, bicycles etc. seen from a distance). Even if you don’t set the -g option, the default raspivid settings will send an I-frame roughly every 1.5 seconds and produce stutter.
    My workaround is to increase the time between I-frames drastically, e.g. to 10 seconds. This may of course be problematic in RF-noisy environments where the I-frames are needed more often, but at least it gets rid of the stutter (I personally don’t notice 1 stutter per 10 secs, but YMMV).

    These cheap and compact PCB yagi antennas can be soldered directly onto the WN722 adapter for use on your receiver side:

    I’ve done it and it works fine. But they are very directional and have some side lobes (as do all yagis) that may pick up noise (they certainly do here in my wifi-congested neighborhood).

    These amplifiers seem to add a little bit of power to the signal, but not enough to justify the additional weight and power draw on anything but a ground rover or a very large fixed-wing RC plane, I would say. The claimed 4 W is bullshit; they are max 2 W and I would even doubt that. Better antennas are the better solution:

    …although the 2.4ghz antenna supplied with the booster is quite good 🙂

    Two questions:

    I tried to make a relay with a third Rpi (model B) between the transmitter (model A) and the receiver (model B+), roughly like this on the relay:

    rx | tx

    But it only produced a lot of errors and no working relay retransmission. What could be wrong?

    Also, since using the new FEC and 10 mbit mode there seems to be no difference in the quality of the received signal when using diversity (two WN722 adapters). Any particular explanation for that?

    Feature request:
    Relay mode! Unless it’s already there by default and I was just too stupid to make it work… 8)


    • IverM permalink

      It may be hello_video that produces the micro-stutter, not raspivid/raspicam. Haven’t tested against mplayer lately though…

      Re diversity: I meant that there seems to be no difference between using 1 WN722 adapter (no diversity) and 2 adapters (diversity), now with the FEC and 10/11mbit mode…


    • Thanks! I also noticed the I-frame stutter. I like to fly with a 2Hz I-frame rate and there the stuttering is really noticeable. But I prefer safety over comfort 🙂 However, I would not recommend going over 2s between I-frames. If you are unlucky, the persisting disturbances can be quite harsh.

      Your relay approach might suffer from insufficient USB bandwidth on the relay station. Try first to get the relay mode working with a normal linux pc to make sure that it is a wifibroadcast and not a raspberry pi issue.

      I will take a look into your diversity observations. If I find something I will make a commit but probably not a new blogpost. So better check on bitbucket 🙂

      • IverM permalink

        Glad to hear that it’s not just me being picky about stutter 🙂 I should add that for normal FPV use I set -fps 49 -g 70, so about 1.4 Hz refresh, and I don’t notice the stutter. It’s only when observing a mainly static scene with small moving objects that it’s really annoying.
        Good idea to check the relay on a pc first; will do that!
        Will keep an eye on bitbucket also…

      • Iver permalink

        Ok, so I set up wifibroadcast on an Ubuntu linux pc as a relay. The transmitter is an Rpi A+ sending on ch 8 using a WN722N. The receiver is a B+ on ch 3 using a WN722N. wlan1 and wlan2 are two WN722N adapters sitting in an externally powered USB hub. All are fairly close together physically. On the relay pc I start this script:

        # relay script

        sudo killall ifplugd #stop management of interface

        sudo ifconfig wlan1 down
        sudo iw dev wlan1 set monitor otherbss fcsfail
        sudo ifconfig wlan1 up
        sudo iwconfig wlan1 channel 8

        sudo ifconfig wlan2 down
        sudo iw dev wlan2 set monitor otherbss fcsfail
        sudo ifconfig wlan2 up
        sudo iwconfig wlan2 channel 3

        sudo ./rx -b 8 -r 4 -f 1024 wlan1 | sudo ./tx -b 8 -r 4 -f 1024 wlan2

        A few packets get through to the receiver via the relay, but not nearly enough for live video; mainly just garbage, like when you have a very weak signal in a normal setup. The relay pc outputs a lot of these errors:

        Could not fully reconstruct block 2aa! Damage rate: 0.880974 (1917 / 2176 blocks)
        TX RESTART: Detected blk 2ab that lies outside of the current retr block buffer window (max_block_num = 1485) (if there was no tx restart, increase window size via -d)
        Could not fully reconstruct block 2ab! Damage rate: 0.881029 (1918 / 2177 blocks)
        Could not fully reconstruct block 2ac! Damage rate: 0.881084 (1919 / 2178 blocks)
        Could not fully reconstruct block 148d! Damage rate: 0.881138 (1920 / 2179 blocks)
        TX RESTART: Detected blk 2ac that lies outside of the current retr block buffer window (max_block_num = 148e) (if there was no tx restart, increase window size via -d)
        Could not fully reconstruct block 2ac! Damage rate: 0.881193 (1921 / 2180 blocks)
        Could not fully reconstruct block 1490! Damage rate: 0.881247 (1922 / 2181 blocks)
        TX RESTART: Detected blk 2ac that lies outside of the current retr block buffer window (max_block_num = 1491) (if there was no tx restart, increase window size via -d)
        Could not fully reconstruct block 2ac! Damage rate: 0.881302 (1923 / 2182 blocks)
        TX RESTART: Detected blk 2ad that lies outside of the current retr block buffer window (max_block_num = 1491) (if there was no tx restart, increase window size via -d)
        Could not fully reconstruct block 2ad! Damage rate: 0.881356 (1924 / 2183 blocks)
        TX RESTART: Detected blk 2ad that lies outside of the current retr block buffer window (max_block_num = 1495) (if there was no tx restart, increase window size via -d)
        Could not fully reconstruct block 2ad! Damage rate: 0.881410 (1925 / 2184 blocks)
        TX RESTART: Detected blk 2ad that lies outside of the current retr block buffer window (max_block_num = 1496) (if there was no tx restart, increase window size via -d)
        Could not fully reconstruct block 2ad! Damage rate: 0.881465 (1926 / 2185 blocks)
        TX RESTART: Detected blk 2ae that lies outside of the current retr block buffer window (max_block_num = 1498) (if there was no tx restart, increase window size via -d)
        Could not fully reconstruct block 2ae! Damage rate: 0.881116 (1927 / 2187 blocks)
        Could not fully reconstruct block 149b! Damage rate: 0.881170 (1928 / 2188 blocks)
        5504 data packets sent (interface rate: 38.761)
        Could not fully reconstruct block 14a0! Damage rate: 0.881224 (1929 / 2189 blocks)
        Could not fully reconstruct block 14a1! Damage rate: 0.881279 (1930 / 2190 blocks)
        Could not fully reconstruct block 14a2! Damage rate: 0.881333 (1931 / 2191 blocks)
        Could not fully reconstruct block 14a5! Damage rate: 0.880583 (1932 / 2194 blocks)
        Could not fully reconstruct block 14a8! Damage rate: 0.879836 (1933 / 2197 blocks)
        Could not fully reconstruct block 14aa! Damage rate: 0.879491 (1934 / 2199 blocks)
        TX RESTART: Detected blk 2b0 that lies outside of the current retr block buffer window (max_block_num = 14ab) (if there was no tx restart, increase window size via -d)
        Could not fully reconstruct block 2b0! Damage rate: 0.879545 (1935 / 2200 blocks)
        Could not fully reconstruct block 14ac! Damage rate: 0.879600 (1936 / 2201 blocks)
        5568 data packets sent (interface rate: 39.028)
        Could not fully reconstruct block 14af! Damage rate: 0.878857 (1937 / 2204 blocks)
        Trouble injecting packet: send: Resource temporarily unavailable

        Seems that the relay TX has trouble retransmitting what it gets directly from the relay RX…
        I know the relay is not a *supported* feature (yet 🙂 but do you think there could be some simple way to make it work, befinitiv?
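        As an aside, the damage rate printed in these logs is simply damaged blocks divided by total blocks (e.g. 1917 / 2176 ≈ 0.880974). If the rx output is saved to a file (rx.log is an assumed name), the last reported rate can be recomputed with awk:

```shell
# print the most recent damage rate from a saved rx log
awk -F'[(/]' '/Damage rate/ { d=$2; t=$3 } END { if (t) printf "%.6f\n", d/t }' rx.log
```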


      • Try to use a different port on the two links via the -p option. It seems as if the rx on the relay also sees the packets from the tx (via loopback or similar). Using different ports should filter those out.
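      Applied to the relay script above, that might look like the following (the port numbers are illustrative; the airborne tx would then use -p 1 and the ground rx -p 2 to match):

```shell
# relay: receive the camera stream on port 1 (wlan1, ch 8) and re-send it on
# port 2 (wlan2, ch 3), so the relay rx ignores the relay tx's own packets
sudo ./rx -p 1 -b 8 -r 4 -f 1024 wlan1 | sudo ./tx -p 2 -b 8 -r 4 -f 1024 wlan2
```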

      • Iver permalink

        Ok, will try that now.
        Something else: I removed the USB hub and plugged the two relay WN722N adapters directly into the Ubuntu Linux laptop. That alone helped quite a bit, although there are still a number of errors and eventually the tx still quits…

      • Iver permalink

        Added your -p option by putting TX and relay-RX on port 1 and relay-RX and RX on port 2. It is *almost* working now! By *almost* I mean that there are more errors than there would be at the same distance with a normal TX-RX link. But the data is being relayed and there is live video. After some time (10 sec to 3 min, varying) the relay tends to quit with output like this:

        52800 data packets sent (interface rate: 688.696)
        52864 data packets sent (interface rate: 689.530)
        52928 data packets sent (interface rate: 684.414)
        52992 data packets sent (interface rate: 685.241)
        53056 data packets sent (interface rate: 686.069)
        53120 data packets sent (interface rate: 686.897)
        53184 data packets sent (interface rate: 687.724)
        Could not fully reconstruct block 2a78! Damage rate: 0.051058 (350 / 6855 blocks)
        Could not fully reconstruct block 2a79! Damage rate: 0.051196 (351 / 6856 blocks)
        53248 data packets sent (interface rate: 688.552)
        Could not fully reconstruct block 2a85! Damage rate: 0.051252 (352 / 6868 blocks)
        Could not fully reconstruct block 2a86! Damage rate: 0.051390 (353 / 6869 blocks)
        53312 data packets sent (interface rate: 689.379)
        Could not fully reconstruct block 2a8d! Damage rate: 0.051483 (354 / 6876 blocks)
        53376 data packets sent (interface rate: 684.308)
        53440 data packets sent (interface rate: 685.128)
        53504 data packets sent (interface rate: 685.949)
        Trouble injecting packet: send: Resource temporarily unavailable


      • Iver permalink

        …should be relay-TX in post above, first line, last word…

      • Iver permalink

        UPDATE: It basically works now! As long as there is an ok RF link between all adapters it keeps running. There are more errors than normal at the same distance, but this is to be expected since there are now 2 links and 4 adapters in one chain, i.e. more scope for interference and errors. And the resulting end-stream accumulates all the errors generated in between.

      • Iver permalink

        UPDATE II: Can work for long periods and then suddenly quit, hmm:

        64704 data packets sent (interface rate: 698.245)
        64768 data packets sent (interface rate: 698.935)
        64832 data packets sent (interface rate: 699.626)
        64896 data packets sent (interface rate: 700.317)
        64960 data packets sent (interface rate: 701.007)
        65024 data packets sent (interface rate: 701.698)
        65088 data packets sent (interface rate: 702.388)
        65152 data packets sent (interface rate: 698.057)
        65216 data packets sent (interface rate: 698.743)
        65280 data packets sent (interface rate: 699.429)
        65344 data packets sent (interface rate: 700.114)
        65408 data packets sent (interface rate: 700.800)
        65472 data packets sent (interface rate: 701.486)
        65536 data packets sent (interface rate: 702.171)
        65600 data packets sent (interface rate: 697.872)
        65664 data packets sent (interface rate: 698.553)
        65728 data packets sent (interface rate: 699.234)
        Trouble injecting packet: send: Resource temporarily unavailable

  67. Jolyon permalink

    @IverM Sorry if this is a stupid question, but are you retransmitting on a different channel? If not it will interfere with itself. Just a thought.

    • IverM permalink

      @Jolyon: Not a stupid question at all, but yes, I was using a different channel… PS: did you test overclocking the GPU/h.264 part of the GPU?

      • Jolyon permalink

        Have not got results yet because I'm still waiting for the raspicam to arrive. Will definitely post results here. I was half expecting everyone here to be overclocking their hardware since it's so easy to configure.

      • jholster permalink

        I have tried overclocking (encoder, GPU, everything). Unfortunately it does not reduce latency, so I guess the latency isn't caused by a bottleneck in hardware performance; the (proprietary) camera/GPU code just isn't optimized for latency (a frame stuck in the pipeline, etc.).

      • Unfortunately, yes 😦

      • jholster permalink

        Hi befinitiv, I couldn't find info on how you measured signal strength in your range experiments; rx.c does not print the signal strength? Thanks for all your effort, this is such a cool project! I'm building a fixed-wing FPV drone with two Odroid-W boards (mini-sized RPi clones) transmitting dual streams for 3D video. I have an idea to use MAVLink over wifibroadcast not only for telemetry but also for controlling the plane (with the help of an autopilot & a high-gain antenna tracker).


      • So far I have not seen a fixed-wing wifibroadcast video, and especially not in 3D. Please do not forget to post here 🙂

        I used Wireshark a lot during the development of wifibroadcast. There you can extract the radiotap signal strength field and add it as a column in the packet view.
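        If you prefer the command line, tshark can print the same radiotap field per received packet (wlan1 already in monitor mode is an assumed setup):

```shell
# -I selects monitor mode; radiotap.dbm_antsignal is the per-packet RSSI in dBm
sudo tshark -i wlan1 -I -T fields -e radiotap.dbm_antsignal
```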

  68. With regards to 5GHz, how do you comply with the DFS regulations when you set a fixed channel?

  69. JosMon permalink

    Wifibroadcast was initially working great as I was bench testing.
    Using ALFA AWUS051NH for tx and rx.

    Now suddenly I'm seeing a large amount of lag (>3 min) and low interface rates on the tx side?

    Raw data transmitter (c) 2015 befinitiv GPL2
    64 data packets sent (interface rate: 32.000)
    128 data packets sent (interface rate: 27.429)
    192 data packets sent (interface rate: 36.000)
    256 data packets sent (interface rate: 25.600)
    320 data packets sent (interface rate: 30.000)
    384 data packets sent (interface rate: 32.000)
    448 data packets sent (interface rate: 33.600)
    512 data packets sent (interface rate: 28.444)
    576 data packets sent (interface rate: 24.686)
    640 data packets sent (interface rate: 24.615)
    704 data packets sent (interface rate: 24.558)
    768 data packets sent (interface rate: 25.600)
    832 data packets sent (interface rate: 26.000)
    896 data packets sent (interface rate: 26.353)
    960 data packets sent (interface rate: 26.667)
    1024 data packets sent (interface rate: 26.947)
    1088 data packets sent (interface rate: 27.661)
    1152 data packets sent (interface rate: 27.000)
    1216 data packets sent (interface rate: 26.824)

    On the rx:
    Pipeline is PREROLLED …
    Setting pipeline to PLAYING …
    New clock: GstSystemClock
    Could not fully reconstruct block b! Damage rate: 0.166667 (2 / 12 blocks)
    Could not fully reconstruct block 16! Damage rate: 0.130435 (3 / 23 blocks)
    Could not fully reconstruct block 172! Damage rate: 0.010782 (4 / 371 blocks)
    Could not fully reconstruct block 17c! Damage rate: 0.013123 (5 / 381 blocks)
    Could not fully reconstruct block 1ee! Damage rate: 0.012121 (6 / 495 blocks)
    Could not fully reconstruct block 21e! Damage rate: 0.012891 (7 / 543 blocks)
    Could not fully reconstruct block 276! Damage rate: 0.012678 (8 / 631 blocks)

    I tried: sudo iwconfig wlan0 rate 54M
    but get the following error:
    Error for wireless request "Set Bit Rate" (8B20):
    SET failed on device wlan0; Input/output error.

    I’m wondering if there’s maybe some driver issue.

    • Hi

      Unfortunately I have no experience with the AWUS051NH. What you are seeing looks like the card is running at 1 Mbps. You can check that by executing "iwconfig". The I/O error when setting the rate is also unknown to me. You could try the iw tool instead, but the chances are low that this will solve the problem… sorry.
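      A quick sketch of both checks (interface name assumed; whether the driver honors the iw command depends on the chipset):

```shell
iwconfig wlan0 | grep -i 'bit rate'           # shows the currently used rate
sudo iw dev wlan0 set bitrates legacy-2.4 54  # try pinning a legacy rate via iw
```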

  70. Iver permalink

    Cheaper Rpi cameras, lenses etc. here:
    Didn’t try them myself yet but thought I would mention…

  71. Hey,
    I'm trying to get this working on 5.8 GHz. I'm using the CSL 300, which was mentioned to be working with wifibroadcast. I managed to get the drivers working, and monitor mode works, so it'll work as a receiver, but I don't know how to do the transmitter side.

    Does anybody here have experience with the CSL 300? It would really help if there were some instructions.

    I really want 5.8 GHz because you get 10 times the legal power here in the UK, plus you can still use a 2.4 GHz RC link.


    • I've not managed to get 5.8 GHz to work either, with either the CSL or the ALFA.

  72. I think I'm going to buy the 2.4 GHz one and use the CSL for receiving; that should work!

    I think there is no reason why wifibroadcast couldn't work two ways, befinitiv? It would be great to use it for MAVLink uploads!

  73. Okay, I read all of Fnoop Dogg's posts, and it turns out quite a few people are having success with 5.8 GHz. I've ordered myself the exact same setup as him!

  74. Patrick permalink

    Hi hqdby1,

    can you post the setup you went for here?

  75. @patrick

    Okay, I’ve ordered:
    1 x CSL 300Mbit USB WLAN Adapter – this/these will be my receivers; they feature high sensitivity but low power.
    1 x Alfa AWUS051NH V2 – this will be my transmitter; it features high power with mediocre sensitivity.

    Antennas – I might experiment here, but I'm going to start off using some antennas I have to hand, a pair of cloverleafs:

    1 x Aomway 5.8GHz 4-Leaf Clover Antenna pair

    In the future I will probably upgrade to some SpiroNet antennas, if these aren’t any good.

    Once the system works I'll buy another CSL and use that for diversity; I'll put a helical on the other (I think this will be good due to the wide band).

    I'm running a 2.4 GHz link for my control; I'm using the Long Range LR9 receiver with a Turnigy 9XR Pro with XJT module. I have it hooked up to a 2.4 GHz booster that I may or may not use… Standard range with the system is 3 km; with the booster I may get 7-10 km with, say, a 3 dB omnidirectional antenna, which is far more than I really need. With a high-gain antenna… god knows! No need to worry about UHF systems 😉

    Either way, the video link will drop out a long way before the RC link.
    This leaves me space for the usual 900/433 MHz telemetry link 🙂

    In the future I might use a small dish with wifibroadcast; this should give me a lot of range. I'll think about using wifibroadcast on a plane.

    More info and pictures can be found on my blog at:

    However, this is a work in progress; I'll be updating the blog with my setup as I go. I shall describe how/if I got wifibroadcast to work, and I'll post the command lines etc.


  76. The raspivid source was modified 19 days ago.
    Changes: "Added support for network and (optional) fflush to decrease latency"
    Will this have an effect on wifibroadcast latency?

    • I don't think it will be huge. On Friday I developed an alternative version of wifibroadcast that hooks directly into raspivid and "takes over" the fwrite function (it's in the low_lat_raspivid_hook branch). This should have three advantages over normal wifibroadcast: 1. it hooks directly into raspivid, so no fflush is necessary; 2. the NALU header is moved to the end, which should allow the decoder to display frames one frame earlier; 3. it is a wifibroadcast variant without FEC and fixed block length, which does not hold any data back: every packet is directly forwarded to the decoder. However, I did not notice any big improvements. So after the first (disappointing) test I stopped working on it. It might be that there is still something wrong with it, but currently I have no time to continue working on it.
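      For anyone who wants to try it anyway, switching to the experimental branch should look roughly like this (assuming a cloned Mercurial working copy; build steps assumed to match the main version):

```shell
hg update low_lat_raspivid_hook   # check out the experimental branch
make                              # rebuild rx/tx
```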

  77. fc3sbob permalink

    I had this running perfectly for a while; I put it aside and came back a few weeks later to play around with it because I got an old netbook running and I want to use it as the receiver. All I get now is mostly a corrupted-looking green screen, and some of the transmitted image can be seen at the very top of the stream. I started again from scratch using my desktop and laptop as the receiver. I have 3 TL-WN722N cards, so I verified that I don't have a bad one, verified that the camera isn't messed up (it works fine on the Pi), tried different channels, and tried the "stock" setup commands in my start scripts. I get the same corrupted-looking video feed every time. The only thing that has changed is that the 2A power adapter for my Pi died and I've tried a few other ones, mostly around 1A (Samsung, Apple, a few others). I'm going to have to look for a better power supply and try again, because I am getting the rainbow image in the top left indicating a power issue no matter which adapter I try.

    • fc3sbob permalink

      I scrounged up a 2A power supply, which fixed the power issue; still the same thing. Hmm.

      • Ala permalink

        Hi fc3sbob,
        in the picture I see that you are using an Acer mini laptop. I have a question about this:
        I will also use that kind of 10″ notebook for wifibroadcast.
        What's your setup on the laptop, like the OS?
        I'm using Lubuntu but it doesn't run very well; I think the performance of the video card is not good. How is it running on your laptop?


    • Could you post here the command lines for tx and rx?

      • fc3sbob permalink

        Thanks for responding, but I actually got it working. I had saved the tx script off the Pi and reused it after I re-flashed the card. Somehow I didn't notice that the -r value was different, and I overlooked it even after comparing it with my rx. But it's working fine now.

        Here is my crappy test setup. Once I get some more time, I'll set the script up to run on startup so I don't have to SSH into the Pi to start the tx script, add the second receiving card, and do some distance testing with different antennas.

  78. Patrick permalink

    Right now everything is quite big and not integrated. The next step would be to put the hardware of a Raspberry Pi with minimal size plus the transmitter interface onto one small PCB. As soon as this is available it will really rock. There are some other nice features which could be done, like a virtual gimbal or a deshaker. That would be a great thing for Kickstarter.

  79. Okay, so my hardware has arrived and I've set the system up using 5.8 GHz, though I'm having a bit of trouble with GStreamer. I'm unfamiliar with it and am probably asking silly questions.

    First I wanted to ask: what should happen when I run the GStreamer command?
    At the moment I get these messages returned:
    Setting pipeline to PAUSED …
    Pipeline is PREROLLING …
    DLT_IEEE802_11_RADIO Encap

    If this is an error, can anybody help? If not, and the pipeline is working correctly, how do I actually view the stream? Does GStreamer open a window? Do I need some kind of player?

    • So it appears a stream window is meant to pop up… Any ideas why my pipeline isn't playing? I can supply further information.
      I made a bash script;

      Command lines are:

      cd /home/james/wifibroadcast/ #set target

      sudo echo "Configuring wifi card"

      sudo killall ifplugd #stop management of interface
      sudo ifconfig wlan1 down #turn off wifi card
      sudo iw dev wlan1 set monitor otherbss fcsfail #put card in monitor mode
      sudo ifconfig wlan1 up #turn on wifi card
      sudo iw reg set BO #set reg
      sudo iwconfig wlan1 rate 18M fixed #set data rate
      sudo iwconfig wlan1 channel 149 #set channel

      echo "Starting receiver"

      sudo ./rx -b 8 -r 4 -f 1024 wlan1 | gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false

      I get no errors up to the GStreamer part.

      • natxopedreira permalink

        Seems like it's not receiving anything?

        Check the TX part to see if it's transmitting, and that you are not getting any errors from the commands in the rx/tx parts.

        Which system are you using on rx?
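        One quick way to check the link itself (assumes wlan1 is already in monitor mode on the right channel): if tcpdump prints frames, packets are arriving and the problem is further down the pipeline:

```shell
sudo tcpdump -i wlan1 -c 10   # dump the first 10 captured frames
```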

    • WinerS permalink

      Hi, James.

      I had the same issue trying to receive video on a Linux laptop, and solved it in the following way. The original initialization line for the Pi, as displayed on this page,
      raspivid -ih -t 0 -w 1280 -h 720 -fps 30 -b 4000000 -n -g 60 -pf high -o – | sudo ./tx -b 8 -r 4 -f 1024 wlan0
      does not work for me, because the character after -o is a long dash (–). What works is the same line with a plain minus after -o:
      raspivid -ih -t 0 -w 1280 -h 720 -fps 30 -b 4000000 -n -g 60 -pf high -o - | sudo ./tx -b 8 -r 4 -f 1024 wlan0
      I have checked this several times: the first does not work, the second works. :-)
      Hope it helps. I spent 3 days to find it.

      • WinerS permalink

        To be clear, the web page encoding garbles the character: there should simply be a short minus (-) after (high -o) instead of a long dash (–).

  80. Jolyon permalink

    I've had some good luck using v4l2 instead of raspivid. No idea why, but my latency is on average 20-40 ms less. Might be something I've done wrong in the raspivid setup, but I couldn't get less than 220 ms; now I'm getting between 180-200 ms. The RX script was the same in both cases, on a laptop using mplayer.

    Here is what I’m using currently:

    sudo modprobe bcm2835-v4l2
    v4l2-ctl --set-fmt-video=width=1280,height=720,pixelformat=4
    v4l2-ctl --set-ctrl video_bitrate=3000000
    v4l2-ctl --set-parm=49
    cd wifibroadcast
    sudo killall ifplugd #stop management of interface
    sudo ifconfig wlan0 down
    sudo iw dev wlan0 set monitor otherbss fcsfail
    sudo ifconfig wlan0 up
    sudo iwconfig wlan0 channel 13
    v4l2-ctl --stream-mmap=0 --stream-to=- | sudo ./tx -b 8 -r 4 -f 1024 wlan0

    The issue with v4l2 is that my RX won't decode unless the TX starts *after* the RX. This would be bad if I hit a bad patch of signal.

    Any ideas how I can get latency closer to you guys using raspivid?


    • Aight! It worked! So it appears I had typed the long dash (–) instead of a plain minus (-), which is why the transmitter was sending useless packets the receiver couldn't use.

      At least it worked for a while… My Raspberry Pi crashed a few seconds in. I thought perhaps it was due to not being able to supply enough power, so I added an external USB power supply; it helped a bit but it still crashed. Any ideas?

      • Okay, I believe the external USB hub couldn't even supply enough power!!! This wifi card is a beast! Gonna need to make a custom cable 😉

        But amazing: 5.8 GHz / 2.4 GHz wifibroadcast dual channel works. I shall hold my celebrations till tomorrow when I get the cable sorted.

      • WinerS permalink

        Hi James,

        Not sure which Pi you use. The A+ provides up to 2 A on USB; all the other models are software limited. You can try to unlock additional power with max_usb_current=1 in the Pi config file (google it for details), which gives 1.2 A. My A+ works without a powered hub.
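        For reference, a sketch of that change (path and option as used on Raspbian; takes effect after a reboot):

```shell
# raise the USB current limit (B+/Pi 2; not needed on the A+)
echo "max_usb_current=1" | sudo tee -a /boot/config.txt
sudo reboot
```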

    • Jolyon permalink

      I've started using an RPi to RX instead of my laptop and now I'm getting about 150 ms. Startup scripts work a treat too. This is great; thanks so much to befinitiv!! You are a legend.

      I'll just mention one thing I noticed, if it hasn't been covered already: the shutdown script by default uses pin 25. This is GPIO numbering, not PHYSICAL numbering (they are different). If someone is silly enough to connect the 3.3 V button to physical pin 25 (GND) it will probably kill their Pi.

  81. Patrick permalink

    @James, do you mind posting some pictures? That would be great.

  82. natxopedreira permalink

    To check for power issues if you have one of the new Pis: when booting, if you see a "rainbow square" in the upper right it means that you have low voltage, i.e. a bad or insufficient power supply.
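    On more recent Raspbian firmware there is also a way to check this without watching for the icon (the command and the meaning of its bit flags are firmware-dependent):

```shell
vcgencmd get_throttled   # non-zero means under-voltage/throttling was detected
```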

  83. Uttam permalink

    Guys, how do you get the shutdown script to work? I have the button configured as described, and have verified that a button press takes the shutdown pin from 0 to about 2.7 V. How do I now get the script to work? I tried copying init.d/shutdown to /etc/init.d and then update-rc.d shutdown start, but nothing happens. Thanks!!

    • You could first try to run the shutdown script by hand. Or, one step before that, check whether you can read the value of the GPIO pin at all. Just look into the script and apply the commands by hand.
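      A minimal sysfs sketch of that check (assumes BCM pin 25, as in the shutdown script):

```shell
# export the pin, configure it as input, then read it before/after a press
echo 25 | sudo tee /sys/class/gpio/export
echo in | sudo tee /sys/class/gpio/gpio25/direction
cat /sys/class/gpio/gpio25/value   # should change from 0 to 1 while pressed
```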

  84. I made some experiments comparing an older wifibroadcast version vs. a newer version. Sadly, the newer version (example above) produces plenty of incorrectly decoded frames.
    I have 4 pieces of WN722 adapters and they show similar results. At first, when I updated to the newer version, I thought there was some problem with the hardware, but finally I figured out that frame drops appear at close range (1-2 meters). The same happens 10 meters and 30 meters away. Also, when connecting the WN722 to a 2 W booster, dropped frames appear over a wider range. Yet after changing the SD card in the Raspberry to one with the older wifibroadcast version, the same hardware works much better: no dropped frames 100-150 meters away and the picture is crystal clear.

    I also noticed that the older version is somehow more resistant to my FrSky 2.4 GHz RC control. I am using wifi channel 13, which the FrSky transmitter partly covers. With the newer version the RC controller has a strong influence on broadcast quality.

    The older version's reception is affected when the 150 mW RC transmitter is 0.5-1 meters away from the 4 dBi reception antenna. The newer version is affected at 0.5-1 meters and also at 5-30 meters distance of the RC from the reception antenna.

    I also tried changing wifi channels; the results with the newer version are quite similar, and again the older version shows better results, with visually almost no dropped frames in the 100-150 meter range.

    Has anyone experienced a similar effect?

    • What versions are you referring to? Could you give us the Mercurial commit id? ("hg log -l1") And also the command lines used to start them?

  85. RX/TX side:

    changeset: 50:baf91f383243
    branch: fec
    tag: tip
    user: befi
    date: Wed Jul 15 19:07:13 2015 +0200
    summary: corrected packet header types to be packed. What was I thinking??

    TX Param:
    raspivid -vs -ih -t 0 -w 1080 -h 600 -fps 35 -b 3000000 -n -pf high -o - | sudo /home/pi/wifibroadcast/tx -b 8 -r 3 wlan0

    RX Param:
    sudo /home/pi/wifibroadcast/rx -b 8 wlan0 | /opt/vc/src/hello_pi/hello_video/hello_video.bin

    Newer version:

    TX side: changeset: 57:424307cb6f61
    tag: tip
    user: befi
    date: Wed Aug 19 12:11:50 2015 +0200
    summary: added -x parameter to tx which defines the number of transmissions of a block. If -r is 0 then this parameter can be used to reproduce the old _retransmission count_ transmission. This transmission is less bandwidth effective than FEC but due to no processing faster (lower latency)

    TX Param:
    raspivid -sh 20 -ih -t 0 -w 1080 -h 600 -fps 35 -b 3000000 -n -g 10 -pf high -o - | sudo /home/pi/wifibroadcast/tx -b 8 -r 4 -f 1024 wlan0

    RX Side:

    changeset: 59:ca50865f0a5a
    branch: low_lat_raspivid_hook
    tag: tip
    user: befi
    date: Fri Sep 11 21:26:44 2015 +0200
    summary: first draft of low latency raspivid hook

    RX Param:
    sudo /home/pi/wifibroadcast/rx -b 8 -r 4 -f 1024 wlan0 | /opt/vc/src/hello_pi/hello_video/hello_video.bin

    Updated TX to the Sep 11 version:
    changeset: 59:ca50865f0a5a
    branch: low_lat_raspivid_hook
    tag: tip
    user: befi
    date: Fri Sep 11 21:26:44 2015 +0200
    summary: first draft of low latency raspivid hook

    The RX side had the version of Wed Aug 19 12:11:50 2015 +0200; it seems the TX side and RX side were a few weeks apart.

    I have updated the RX side to Fri Sep 11 21:26:44 2015 +0200 and repeated the test.

    I may say that the results were almost the same, but the flickers came more seldom. This version is also quite easily affected by the FrSky RC module, which works on 2.4 GHz, while the older version (Wed Jul 15 19:07:13 2015 +0200) possesses quite impressive immunity to it.

    I also noticed one common problem with the newer version: frame errors appear in particular frame rows.
    With Wed Jul 15 19:07:13 2015 +0200 (look at the bottom three pictures) frames are mostly corrupted in the top ~20% and at the bottom. It seems that some errors in reception corrupt almost the whole frame.

    Several times I also observed strange behavior: reception reports no errors in the received packets (no damage rate at all), yet the video is displayed corrupted.

    At first I thought there was some problem with the camera, but the preview on the TX side showed no corrupted frames. Thus it seems there may be some problem on the TX side when forming packets.

    • Ok, there are two problems: the old version (baf91f3) is an intermediate commit that should not be used; bad packets can cause the software to crash, so it is quite dangerous. A second problem is that the parameters of rx and tx don't match.

      I know that this does not quite add up with your findings. It is strange because there are no fundamental differences between the two versions. So possibly another change between the two SD cards has caused your observations (channel, kernel patches)? You could test this by staying on the same SD card and checking out and compiling the different wifibroadcast versions.

  86. Iver permalink

    Caution: don't use the USB cables that come with the WN722 adapter. They're shit.
    I was getting an unexplainable 7-10 sec lag despite using settings that I know normally work. The difference was that I had added the USB cable between the TX RPi A+ and the WN722 adapter. Once the cable was removed and the adapter fitted directly into the RPi USB slot, latency was back to normal, i.e. approx 200 ms.

  87. Jolyon permalink

    Just did a comparison between v4l2-ctl and raspivid. Looking at the photos, it appears as though v4l2-ctl is consistently faster than raspivid. I took about 5 photos of each case.

    Results (of the best setups I could achieve for each):
    v4l2-ctl, ~166 ms
    raspivid, ~183 ms

    I'd really be interested to know why this is the case, and whether it's a problem with my setup or if v4l2 really is faster.

  88. Kevin permalink

    Dear all,

    Here are two scripts (RX & TX) based on some I saw on this page, in case they help someone:

    Syntax: ./script INTERFACE_WIRELESS CHANNEL

    RX :


    echo "################## RX Script #################"
    echo "Start this script with root authority"

    # defaults for the wifibroadcast parameters; adjust so they match the tx side
    PORT=0; BLOCK_SIZE=8; FECS=4; PACKET_LENGTH=1024

    if [ "$1" != "" ] && [ "$2" != "" ]; then
      echo "$1 chosen by the user as the wireless interface (TP-LINK 722N)"
      echo "Setting wifi adapter to MONITOR mode"
      ifconfig $1 down && iw dev $1 set monitor otherbss fcsfail
      echo "Setting maximum Tx power"
      iw reg set BO
      iwconfig $1 txpower 30
      ifconfig $1 up
      echo "Setting channel $2"
      iwconfig $1 channel $2
      echo "Starting HD video reception..."
      #./rx -p $PORT -b $BLOCK_SIZE -r $FECS -f $PACKET_LENGTH $1
      #./rx -p $PORT -b $BLOCK_SIZE -r $FECS -f $PACKET_LENGTH $1 | mplayer -fps 15 -cache 1024 -
      ./rx -p $PORT -b $BLOCK_SIZE -r $FECS -f $PACKET_LENGTH $1 | gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false
    else
      echo "Please choose the interface of your TP-LINK 722N as the first argument"
      echo "Then the wifi channel as the second argument"
    fi

    TX :


    echo "################## TX Script #################"
    echo "Start this script with root authority"

    # defaults; adjust so they match the rx side
    PORT=0; BLOCK_SIZE=8; FECS=4; PACKET_LENGTH=1024
    WIDTH=1280; HEIGHT=720; FPS=30; BITRATE=4000000; KEYFRAMERATE=60

    if [ "$1" != "" ] && [ "$2" != "" ]; then
      echo "$1 chosen by the user as the wireless interface (TP-LINK 722N)"
      echo "Stopping ifplugd"
      service ifplugd stop
      killall ifplugd
      echo "Setting wifi adapter to MONITOR mode"
      ifconfig $1 down && iw dev $1 set monitor otherbss fcsfail
      echo "Setting maximum Tx power"
      iw reg set BO
      iwconfig $1 txpower 30
      ifconfig $1 up
      #iwconfig $1 rate 54M
      #iw dev $1 set bitrates legacy-2.4 54
      #iw dev $1 set bitrates ht-mcs-2.4 5
      echo "Setting channel $2"
      iwconfig $1 channel $2
      echo "Starting HD video transmission..."
      raspivid -ih -t 0 -w $WIDTH -h $HEIGHT -fps $FPS -b $BITRATE -n -g $KEYFRAMERATE -pf high -o - | ./tx -p $PORT -b $BLOCK_SIZE -r $FECS -f $PACKET_LENGTH $1
      #raspivid -n -w 1280 -h 720 -b 1000000 -fps 15 -t 0 -o - | ./tx -b 8 -r 2 -f 1024 $1
    else
      echo "Please choose the interface of your TP-LINK 722N as the first argument"
      echo "Then the wifi channel as the second argument"
    fi

    Feel free to add some comments

  89. Jolyon permalink

    Has anyone else got buildroot working with wifibroadcast? I have everything working except wifibroadcast itself, because it requires the libpcap 0.8 library, which I can't get working with buildroot.

    The time from power-on to loading raspivid and piping into wifibroadcast is about 5-6 seconds. It would be great if there were a way to get libpcap-dev on buildroot, or a way to compile wifibroadcast against the non-dev version of libpcap (which is compatible with buildroot).

    • Thomas Kindler permalink

      Have you tried patching the Makefile to use $(CC) instead of the hardcoded gcc? That way I got rx/tx to compile with buildroot 2015-08 just fine.

      But I'm stuck at getting the camera to work. How did you enable the raspi camera in buildroot?
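      In case it helps, a sketch of such a Makefile patch (assumes the Makefile contains a hardcoded `CC=gcc` line; the cross-compiler path depends on your buildroot config):

```shell
# make the compiler overridable instead of hardcoded
sed -i 's/^CC[[:space:]]*=[[:space:]]*gcc/CC ?= gcc/' Makefile
# then build with the buildroot toolchain, e.g.:
make CC="$HOME/buildroot/output/host/usr/bin/arm-linux-gcc"
```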

      • Jolyon permalink

        I installed the ‘extended’ rpi drivers and rpi-userland to get the camera working.

        I got libpcap-dev working with buildroot by copying the raspbian libs over. I don't know of another way to get it working. Can you explain how you did it?

        I eventually got wifibroadcast running on buildroot, but it doesn't get received by the other Pi. It is sending packets (apparently) but nothing comes in on the rx side. It was at that stage that I stopped trying.

      • Tommy Larsen permalink

        Thomas Kindler, how did you modify makefile to achieve this?

  90. Dave permalink

    The Atheros chipset supports operating in the 2.3 GHz band; has anyone tried this? Do you think operating on 2.3 GHz will eliminate the interference with 2.4 GHz radios? This page provides details on how to modify the drivers for the dongle to add the extra channels; however, I'm not sure how to install them.
    Any advice on how to install the patched drivers would be greatly appreciated.

    P.S. Using 2.3GHz is legal for licensed amateur radio operators in most countries.

  91. Walkeer permalink

    Wonderful news! Can you please test how it works with a Taranis or similar 2.4 GHz RC?

  92. I use a Ralink RT5572 dongle and it works perfectly with your system.
    Just change the following in your scripts:

    function prepare_nic {
    echo "updating wifi ($1, $2)"
    ifconfig $1 down
    iw dev $1 set monitor otherbss fcsfail
    ifconfig $1 up
    iw reg set BO
    iwconfig $1 channel $2
    iwconfig $1 rate 24M
    }

    Thank you very much for all your effort
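
    A quick way to tell which driver (and so which script branch) applies to a given stick is to read the driver name from sysfs; the real file is /sys/class/net/wlan0/device/uevent (interface name assumed), and a stand-in file is used below so the snippet runs anywhere.

```shell
# Extract the DRIVER= line the same way the stock scripts do,
# here against a stand-in uevent file instead of the sysfs one.
uevent=$(mktemp)
printf 'DRIVER=rt2800usb\nMODALIAS=...\n' > "$uevent"
DRIVER=$(sed -n 's/^DRIVER=//p' "$uevent")
echo "$DRIVER"
```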

  93. jholster permalink

    I received a shipment today: 4 Alfa AWUS036HNA adapters. These will be part of my experiment of building a flying wing with a stereo FPV system & RC control link through wifibroadcast. By using dual wifi cards I can transmit dual 1280×720@49fps streams at a 6 Mbps rate per card, improving receiver sensitivity by many dBm (vs. a higher data rate). I also plan to replace the traditional RC radio link with a wifibroadcast mavlink upstream. In theory I see no reason why this wouldn't work, but in practice there is a lot of code to write 🙂

  94. Luke permalink

    I have discovered the hard way that the Alfa adapter is quite power-hungry! Regular wall adapters for phones won't put out the amps needed to power the Alfa as a tx. Without enough power, my Alfa's status light would go solid blue.


  96. Lorbass permalink

    5 GHz version: transmission distance only 15 to 20 m.
    With the recommended CSL-300 WLAN sticks (two antennas) the system works, but only over
    a very short distance. The signal strength indication of the OSD shows -64 dBm at a distance
    of 2 m and -90 dBm at 15 m in free air, without telecommand.
    On the PC I can see about 6 weak WLAN activities, on an unknown frequency.

    Does anyone have an idea what the reason is, or experience with these 5 GHz WLAN sticks?

  97. natxopedreira permalink

    I'm also testing 5.8 GHz using a pair of CSL-300 sticks, and yes, I get very little range.

    I want to check two points:

    + USB power issue: I'm getting the rainbow square on my TX Pi, so I don't have enough amps to run the Pi and the dongle. I don't know if that affects the signal.

    + Antennas: is the omnidirectional antenna that comes with the CSL-300 actually a 5.8 GHz antenna? The dongle is meant for use on 2.4 or 5 GHz wifi but only comes with one antenna…

    Has anyone else tried those dongles?

  98. anemos permalink

    I tried to modify the tx script (in line with the rx script) in order to save the video to a USB key. But the system freezes when I try to do so. Do you know why?

    • Could you please post the command line where tx is being used? Also, the details of the freeze would be interesting: does it never work, or does the freeze occur after X seconds?

      • anemos permalink

        Now that I retried it, it works every time; strange, I don't know what happened before.
        Here is the full script below. I changed the sleep pause because I experienced some slow network card setup times, and the script is now launched from rc.local rather than from your regular init.d file; this way, the process is now fully reliable.

        # tx script

        # if we detect no camera, we fall asleep
        if vcgencmd get_camera | grep -q detected=0; then
        sleep 365d
        fi

        # wait a bit. this helps automatic starting
        sleep 10

        source /home/pi/wifibroadcast_fpv_scripts/

        function prepare_nic {
        DRIVER=`cat /sys/class/net/$1/device/uevent | grep DRIVER | sed 's/DRIVER=//'`

        case $DRIVER in
        ath9k_htc) # 2.4 GHz sticks (e.g. TL-WN722N)
        echo "Setting $1 to channel $CHANNEL2G"
        ifconfig $1 down
        iw dev $1 set monitor otherbss fcsfail
        ifconfig $1 up
        iwconfig $1 channel $CHANNEL2G
        ;;
        rt2800usb) # 5 GHz sticks (e.g. CSL-300)
        echo "Setting $1 to channel $CHANNEL5G"
        ifconfig $1 down
        iw dev $1 set monitor otherbss fcsfail
        ifconfig $1 up
        iw reg set BO
        iwconfig $1 rate 24M
        iwconfig $1 channel $CHANNEL5G
        ;;
        *) echo "ERROR: Unknown wifi driver on $1: $DRIVER" && exit
        ;;
        esac
        }

        # Make sure only root can run our script
        if [[ $EUID -ne 0 ]]; then
        echo "This script must be run as root" 1>&2
        exit 1
        fi

        prepare_nic $NIC
        RASPIVID_COMMAND="raspivid -ih -t 0 -w $WIDTH -h $HEIGHT -fps $FPS -b $BITRATE -n -g $KEYFRAMERATE -pf high -o -"

        echo $TX_COMMAND

        if [ -d "$SAVE_PATH" ]; then
        echo "Starting tx for $NIC with recording"
        FILE_NAME=$SAVE_PATH/`ls $SAVE_PATH | wc -l`.h264
        $RASPIVID_COMMAND | tee $FILE_NAME | $TX_COMMAND
        else
        echo "Starting tx for $NIC without recording"
        $RASPIVID_COMMAND | $TX_COMMAND
        fi

        killall raspivid
        killall tx
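
        The recording branch above picks its filename by counting the entries already in $SAVE_PATH, so each run writes the next numbered .h264 file. The trick in isolation (an empty temp directory stands in for $SAVE_PATH):

```shell
# Count existing files to get the next sequential recording name.
SAVE_PATH=$(mktemp -d)
FIRST=$SAVE_PATH/$(ls "$SAVE_PATH" | wc -l).h264   # 0 entries -> 0.h264
touch "$FIRST"
SECOND=$SAVE_PATH/$(ls "$SAVE_PATH" | wc -l).h264  # 1 entry -> 1.h264
echo "$FIRST"
echo "$SECOND"
```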

  99. Shave permalink

    Hi, is it possible to just repeat the existing signal of wifi-equipped cameras, such as Sony action cams, GoPros etc., and relay it back to the rx side? This could be useful for those who don't mind latency and just need it for taking photos or lining up shots. I know it can easily be done conventionally via a router/repeater, but doing this in a monitor-mode/wifibroadcast style is what I'm looking for.

    • Hi Shave

      This should be possible – although a bit hacky. These COTS cameras use, as you mentioned, standard wifi. Thus without a router they won’t send any data. So what you would need to do is to have a router close to the camera so that the camera constantly sends its data to the router. These packets could then be intercepted in monitor mode and used in a wifibroadcast style. But there are many issues here:

      – You would need to rewrite the wifibroadcast rx program. It is not compatible with this operation.
      – Since camera and router are close to each other, the camera will most likely send with low TX power and a high rate – both will have a negative effect on the range.

      • Shave permalink

        Hi, currently I'm using a TP-Link MR3020 running OpenWrt as a repeater, which indeed needs to be placed near the camera (Sony action cam), and my Android or iOS devices do the rx work using the onboard wifi and the Sony app. I'm satisfied with the quality and I'm able to start or stop recording or change camera settings too. I only wish I could convert it to your wifibroadcast / monitor mode for the obvious advantages. Let me know if you come across any ideas, because the Raspberry Pi camera just doesn't cut it for serious photo or video work (no 1080p60 or 4K) or the quality that these dedicated action cams provide.

    • ALicinio permalink

      Hi Shave, as befinitiv said it is impossible to do with a standard action cam, but I did what you want with an RPi & Picamera, check this >

      Ofc the image quality is not like a GoPro, but it is live, and at around 300€ for the complete tx+rx setup it is quite cheap compared to $10,000 for the same from GoPro + Vislink!!

      I think we have to improve quality and find good optics; with that it could be very nice. And for the price, the picamera is good when there is sunlight.


      • Shave permalink

        Well, GoPros are overpriced and I never owned one. But Sony cams are priced quite reasonably. I bought an AS100 for $148: 50 Mbps XAVC codec, backlit sensor, excellent low-light performance and the ability to change lenses without much DIY. Apples vs. oranges, bro. The Raspberry camera is great for fun and FPV but it's not for serious recording at the moment. Until it gets a hardware upgrade, these cheap action cams can't be ignored. The Xiaomi Yi is $65 with a big community and hack scripts available. And many more cheap cams are coming with wifi onboard, so it would be great to find an rx solution for them based on the wifibroadcast principle.

      • ALicinio permalink

        You're right, it would be awesome to have the best quality in a wifibroadcast application, but at this time it is impossible, because you need to compress the signal or feed it in through the CSI interface. You could use that > but I don't think you can compress the signal with raspivid, dunno.
        Or else use a tiny encoder and wifibroadcast as the pipe (ffmpeg, avconv, gstreamer; check Auvidea or others), but you will lose all the lightweight advantage that an RPi + picamera affords.
        Or else, dunno if it's possible, you could set up an AP really close to the cam, capture the packets and push them through wifibroadcast, but I did read that the latency of video over wifi is quite long for these action cameras (like the GoPro).

        From my point of view I prefer to work hard on the picamera + good optics and a high bitrate. The solution for a good rx is to have diversity over ethernet; that way we can put receivers (almost) everywhere, not just on a USB cable.

  100. Nathaniel permalink

    Hi, I found this project when I was searching for an HD video link. I tried it and it works very well; good work! But I'm struggling with viewing the stream on my MacBook. I want to use this computer because the screen is very good in sunlight and it has a long battery life, so it's perfect for flying in the field.
    Has someone already tried to view the stream on a Mac?
    I want to use an external USB wifi dongle so I can connect a better antenna, and if possible even more USB dongles for diversity.

    • natxopedreira permalink

      You can run Ubuntu in Parallels on OS X; there is no native implementation for the Mac…

      • Nathaniel permalink

        Hi, I also had this idea, so I already had Ubuntu installed on my Mac. The wifi card configuration works; the name of my wifi card is a bit different, but that's not the problem. When I start the line:
        sudo ./rx -b 8 -r 4 -f 1024 wlan0 | gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false
        I get the message: warning: no element "fdscr", and then it stops.
        What am I doing wrong? I have a working setup with 2 Raspberrys and that is working fine.
        If anybody knows how to fix my problem, please help me 🙂

  101. For anyone interested, I can confirm that the Zero can function as a receiver 😉

    • Athar Mian permalink

      All very interesting, and extremely generous of the blog author and others to contribute! Truly a new open-source movement…

      I quickly read through the implementation with the PiZero receiver.

      Has anyone tried to use the $3 ESP8266 WiFi module/MCU (instead of the $15 TP-LINK WN722N card) with the PiZero on either the tx or rx end? Here is the link:

      Of course, the $5 PiZero is now much more available, and comes with a cam ribbon.

  102. Alberto permalink

    Did you try to implement two cameras and 3D video transmission? How about x265 compression?

  103. Michael permalink

    Hi befinitiv,

    I am new to Linux. I set up the video transmission on an RPi2, and it's transmitting the video (I tested with another RPi2 connected to my TV), but I actually want to receive the video on a Linux Mint computer (same as Ubuntu, if you are not familiar with Mint), and I don't know how to set it up to receive the video. I am using the TL-WN722N WiFi modules on both sides.
    The problem is that Linux Mint does not come with the RPi camera libraries (source code for hello_video) for decoding the video.
    Please help me get started 🙂

    Thank You,

  104. Ben permalink

    Hello and thank you for this project. Thanks to you, FPV is finally entering the 21st century.

    I'm currently testing the whole setup with:

    Raspberry PI B + TL-WN722N
    Ubuntu 15.10 + TL-WN722N

    I'm using this command on the receiver (Ubuntu) side; I copied yours and changed the DISPLAY_PROGRAM:

    DISPLAY_PROGRAM="gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false"

    I did copy this from someone, but I have no gstreamer knowledge…

    It's working very nicely: the video is great and at normal speed, and it can also record a directly playable video file. The problem is that the recorded file plays back at 2x speed!

    I'm struggling with this problem. Have you or anyone here experienced it?

  105. Sebastian permalink

    Very cool project.
    I am running a Pi B (not +) as TX and a Pi 2 as RX. I have two CSL-300 (two antennas).
    I downloaded the wifibroadcast 0.4.
    Yesterday I managed to get the transmission working once, but with a lag of about 10 seconds. Today I get no video on the rx side. I noticed this message repeating in the systemd journal, with raspivid starting and stopping endlessly:

    wbc[675] mmal: Failed to write buffer data (4096 from xxxxxxxx) - aborting

  106. Falko permalink


    Awesome work. I did a week of research until I stumbled upon your solution while looking for a low-latency video app on Android. Seems like my plan to prefer MediaCodec over an embedded ffmpeg was the right decision.

    At the moment I have a video stream between a Raspberry Pi 2 and a laptop with about 90 ms latency. My setup for tx is slightly different from the one you are using.

    I use the v4l2 kernel module for the Pi camera with h264_v4l2_rtspserver, which creates a UDP RTP uni/multicast stream. You can force it to send out every single frame, so there is less buffering. The GOP structure currently cannot be changed, but if you go for 90 fps you drastically reduce the latency introduced by B-frames.

    The rx side in my test setup is just a laptop running mplayer rtsp://:8554/unicast -benchmark, which renders frames as they come in, ignoring any timing info and also ignoring jitter buffers.
    I planned to build an Android app from scratch using MediaCodec, but I will probably use myMediaCodecPlayer-for-FPV as a base. Are you the author? Is there any license information?

  107. Hello, I'm trying to fork the video from my laptop to my phone so I can view it in VR. I've been looking around but I'm not quite sure how to fork the gstreamer pipeline to send it over UDP to another IP.

    Can anyone help with this? I'm running:

    sudo ./rx -b 8 wlan1 | gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! autovideosink sync=false

    I'd like to send that to (my phone). What would the receiving line of code be?


    • Alexandre Licinio permalink

      Hi, with my colleague we did it from the Pi to an Android smartphone using gstreamer. The app you need to use is
      After that you have to launch your Pi with sudo ./rx -b 8 wlan1 | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay ! udpsink host= port=5700
      where host is the IP address of your phone; you should then see the video in the app.

      I hope this command line will work with your app.
      Please share your VR app on Android using gstreamer; it would be fun to see that.


  108. Alan Janssen permalink

    Hi befinitiv,

    I have a question about channel 14 in the 2.4 GHz band.
    I know that channel 14 is for the Asian market, so in Europe it's not used.
    When I scan the channels on the wifi band, there are indeed no access points or clients on channel 14, so it is very clear.

    When I put the rx side of wifibroadcast on channel 14, it seems that wifibroadcast accepts it (no error messages).
    The tx seems to accept channel 14 as well, but it does not transmit any data (the dongle's LED flashes once and then stays steady).

    Can we make wifibroadcast work on channel 14 with some modifications?



  109. Philippe Crochat - Anemos Technologies permalink

    Hello. Concerning the video on the phone: you have to download the beta version of Tower indicated in the link by Alexandre Licinio, then configure the port of the video widget to the one you put in your gst-launch command (5700 in the example).

    • Hello,

      So yes, I did get the Tower app working beforehand. I was wondering how to tee the video stream so I can (a) record it, (b) view it on the PC, (c) view it on my phone.

      I also found a good VR app that should in theory be able to display the video in VR. I just need to configure it correctly.

      • Jakdaw permalink

        James: you can use tee itself to split the output of "rx" before it's even fed into GStreamer. Instead of:

        sudo ./rx -b 8 wlan1 | gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! autovideosink sync=false

        use:

        sudo ./rx -b 8 wlan1 | tee >(gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! autovideosink sync=false) | gst-launch-1.0 -v 'some-other-pipeline'
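
        The heavy lifting here is done by tee, which hands every output an identical copy of the rx byte stream. The same mechanism covers the record/view/forward case, shown below with plain files standing in for the gstreamer pipelines:

```shell
# One input, three identical copies: two tee file arguments plus stdout.
# The files stand in for "record to disk", "UDP to phone", "local display".
printf 'frame-data' | tee copy_record.h264 copy_phone.h264 > copy_display.h264
cat copy_display.h264
```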

  110. Ronin permalink

    Hello everyone, please forgive the low-level question.
    I know wifibroadcast only works with 802.11 wifi, but is there a reason a Pi / ODROID / Arduino could not be programmed to encode the camera and then pass the stream to a traditional FPV transmitter, which is received, decoded, then passed out over HDMI or ribbon? You'd be modulating a digital signal rather than analog, so these FPV tx/rx can't handle that? Why bother with 802.11 wifi and injection at all?

    • Alexandre Licinio permalink

      I don't know if I understand your question correctly, but the "traditional" FPV modules made for analog can't be used for digital: the modulation type and rate these modules handle is too weak and too slow to carry this much information on the uplink frequency. With wifibroadcast we use wifi modules not because of "wifi" as such, but because their modulation rate can handle this.

  111. Paul A permalink

    Awesome project! I see the Raspberry Pi 3 is out now and includes built-in wifi. Would this be compatible with wifibroadcast and your project?

  112. Michael permalink

    Hello, this is wonderful! Thank you! I got the receiver to work in a VMWare guest running Ubuntu 15 with Win7 as the host. For some reason I have to add videoconvert to the gstreamer pipeline. That took me some time to find out (doh!). So my receiver command is now

    sudo wifibroadcast/rx -r 2 -f 1024 $1 | gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false

    I use two TP-LINK TL-WN722N. On Ubuntu I compiled the firmware from sources. There is probably a package for it but I didn't find it…

    I want to use wifibroadcast also to send control packets to a little remote-controlled vehicle. An RPi A+ will be on the vehicle and transmit video and telemetry back. Your rx and tx can indeed run on the same computer using different ports; I just tested that. I'll be doing more tests at greater distances, but it looks very promising so far 🙂


  113. Andre M permalink

    Have you considered working with projects such as OpenWRT or DD-WRT, to see if this could be included in those projects, even if only as an optional build item? I can imagine a scenario where you have a local broadcast point that gets received by the routers and then rebroadcast on the local network.


  115. Greg permalink

    Would there be any benefit to using an 802.11ac WiFi adapter that supports beamforming? Could we get even longer range? Is this even available for the Raspberry Pi?

  116. Ronin permalink

    Another question from a beginner: how do you install/run wifibroadcast on a Linux device (not a Raspberry Pi)? My tx device is a Pi, but the rx is a device running Ubuntu.

    • Ronin permalink

      When I do the:
      sudo apt-get install mercurial libpcap-dev iw
      hg clone
      cd wifibroadcast

      I get a "not used locking for read only lock file" error. This is on Ubuntu Touch.

    • Hi Ronin

      Please take a look at the “Manual Setup” instructions on my blog. In there you’ll find a command line for using wifibroadcast together with gstreamer.

  117. Ronin permalink

    mount /dev/loop0 / -o remount,rw worked; it was mounted with insufficient permissions. This probably won't be my last question. ;/

  118. Lewis Ryan permalink

    Great project, thanks 🙂 I have this working with an RPi3 + TP-LINK TL-WN722N for tx, and a Toshiba NB550D netbook running Mint Linux with a TP-LINK TL-WN722N for rx. I have noticed a significant reduction in latency when using the netbook as the receiver instead of another Pi.

  119. Peter Curran permalink

    Any thoughts on using an external USB-connected webcam? The Logitech C920 has a built-in H.264 encoder. I don't like the RPi camera in a multirotor: very noisy and difficult to shield to avoid GPS glitches.

    • Jakdaw permalink

      The C920 would certainly work; the question is whether it is tuned for very low latency. Given that the intended use case is video conferencing, one would hope it's up to it.

  120. Chris permalink

    What is the latency of this system? Not FPV-race speed, I guess?

    • Chris permalink

      Sorry, I was using my phone and didn't see the comments already made before I stumbled across this page.

      ~90 ms seems to be as good as it gets.

  121. With the same patched Atheros firmware, tx injection is very slow on kernel 4.4.

    With the 3.18 kernel on an RPI2, tx injection was correct:
    t:36.7 rtry:0 pkts:10608,byt:10862592 CUR:byt/s:1032184.9,mbit/s:7.9 AVG:byt/s:296180.7,mbit/s:2.3
    t:37.7 rtry:0 pkts:11436,byt:11710464 CUR:byt/s:847642.2,mbit/s:6.5 AVG:byt/s:310819.2,mbit/s:2.4
    t:38.7 rtry:0 pkts:12132,byt:12423168 CUR:byt/s:712664.8,mbit/s:5.4 AVG:byt/s:321194.6,mbit/s:2.5
    t:39.7 rtry:0 pkts:12960,byt:13271040 CUR:byt/s:844251.1,mbit/s:6.4 AVG:byt/s:334429.4,mbit/s:2.6
    t:40.7 rtry:0 pkts:13776,byt:14106624 CUR:byt/s:835442.8,mbit/s:6.4 AVG:byt/s:346739.4,mbit/s:2.6
    t:41.7 rtry:0 pkts:14400,byt:14745600 CUR:byt/s:607715.7,mbit/s:4.6 AVG:byt/s:353311.1,mbit/s:2.7
    t:42.7 rtry:0 pkts:14892,byt:15249408 CUR:byt/s:501980.8,mbit/s:3.8 AVG:byt/s:356798.6,mbit/s:2.7
    t:43.7 rtry:0 pkts:15384,byt:15753216 CUR:byt/s:502306.6,mbit/s:3.8 AVG:byt/s:360119.8,mbit/s:2.7
    t:44.8 rtry:0 pkts:15900,byt:16281600 CUR:byt/s:499568.8,mbit/s:3.8 AVG:byt/s:363409.4,mbit/s:2.8
    t:45.8 rtry:0 pkts:16356,byt:16748544 CUR:byt/s:466809.6,mbit/s:3.6 AVG:byt/s:365664.9,mbit/s:2.8
    t:46.8 rtry:0 pkts:16836,byt:17240064 CUR:byt/s:489968.2,mbit/s:3.7 AVG:byt/s:368327.4,mbit/s:2.8
    Now with the same patched Atheros firmware (26 Mbit/s) on kernel 4.4, tx injection is very slow.
    Could you tell me where I should look for the root cause of this very low injection rate?
    t:131.1 rtry:1285 pkts:14556,byt:14905344 CUR:byt/s:121318.6,mbit/s:0.9 AVG:byt/s:113684.6,mbit/s:0.9
    inject failure, but 8 retries remain
    spinning for 0.100000 seconds..
    inject failure, but 8 retries remain
    spinning for 0.100000 seconds..
    t:132.1 rtry:1295 pkts:14664,byt:15015936 CUR:byt/s:109259.2,mbit/s:0.8 AVG:byt/s:113650.6,mbit/s:0.9
    inject failure, but 8 retries remain
    spinning for 0.100000 seconds..
    inject failure, but 8 retries remain
    spinning for 0.100000 seconds..
    t:133.1 rtry:1305 pkts:14784,byt:15138816 CUR:byt/s:121335.8,mbit/s:0.9 AVG:byt/s:113709.0,mbit/s:0.9
    inject failure, but 8 retries remain
    spinning for 0.100000 seconds..
    inject failure, but 8 retries remain
    spinning for 0.100000 seconds..
    inject failure, but 8 retries remain
    spinning for 0.100000 seconds..
    inject failure, but 8 retries remain
    spinning for 0.100000 seconds..
    Linux image: Emlid for RPI2 + NAVIO+
    Visit my web site

    • I found out why my injection rate is low with kernel 4.4.

      The injection patch (26 Mbit/s): the firmware lives in two places.
      Here is the right one:

      According to dmesg:
      [4.682705] usb 1-1.4: ath9k_htc: Firmware ath9k_htc/htc_9271-1.4.0.fw requested

      And of course I had changed the wrong /lib/firmware/htc_9271.fw

      Now the flow rate is correct:

      t:9.1 rtry:0 pkts:7260,byt:7434240 CUR:byt/s:821469.6,mbit/s:6.3 AVG:byt/s:818787.6,mbit/s:6.2
      t:10.1 rtry:0 pkts:8100,byt:8294400 CUR:byt/s:849100.4,mbit/s:6.5 AVG:byt/s:821817.3,mbit/s:6.3
      t:11.1 rtry:0 pkts:8928,byt:9142272 CUR:byt/s:837782.6,mbit/s:6.4 AVG:byt/s:823260.8,mbit/s:6.3
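
      For anyone reading these tx statistics: the mbit/s columns appear to be the byt/s figures times 8, divided by 1024×1024 rather than 10^6, which would explain why 821817.3 byt/s shows up as 6.3 rather than 6.6.

```shell
# Cross-check against the AVG column of the healthy log above:
# 821817.3 bytes/s -> 6.3 "mbit/s" only if one Mbit = 1024*1024 bits.
awk 'BEGIN { printf "%.1f\n", 821817.3 * 8 / (1024 * 1024) }'
```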

  122. Any idea if this will work with a wifi booster? I bought this one:

    And I'm having issues: without the booster connected I get video fine; with the booster connected and powered I get no video, except maybe an occasional messed-up frame you cannot make out. Without the booster powered, the video feed comes back but with a very bad signal.

    It's possible I just got a bad one, but I wanted to see if there are any other reasons this might be happening.

    • With a bad wifi booster it is difficult to receive any "good" frames: first because the noise is also amplified, and also because such amplifiers don't respect the frame rate; they cannot handle the rate because of their cheap "modulators".
      By comparison, you'd have to buy a really expensive booster with noise filtering, rated for 54 Mbit/s, if you want this setup running.
      So really, don't use that.

      Instead, try to play with better antennas (in the correct band).
      This one is cheap but works great on both 2.4 and 5 GHz:

      Move to another channel or band (2.4 or 5 GHz).
      Try the trick to get channel 14 on 2.4 GHz.

      Use diversity (several dongles) on the receiver side.

      Why not try this >

      What is your configuration? Dongles? Setup?
      We have an IRC channel where you can discuss: #wifibroadcast


    • Nicolas permalink

      I think these bi-directional amps are not ideal when using wifibroadcast. Although the tx side of wifibroadcast does just spill out packets, carrier sense and collision avoidance are still used, i.e. the card first listens for another transmission already going on and doesn't transmit until the air is clear. With such a bi-directional amp, your wifi card will probably pick up more other stuff and not transmit because of that.

      And of course, in general, many of those ebay/amazon amps are not very good quality.

      Although I haven't tried it, I think it would make sense to hook up a unidirectional amp. But it needs to be a highly linear one that can deal with OFDM signals. Those are expensive, big and heavy if you want something like 2 W.

      I'd just get a higher-power wifi card (I think the 1 W advertised ones put out around 500 mW in reality) and go for receiver diversity and better antennas; that should help a lot.

  123. Nathaniel permalink

    I've been using this project for some time on my TBS Discovery and it works really well! First I started with 2.4 GHz, but I have a new RC transmitter on 2.4 GHz, so there was a lot of interference resulting in a bad picture; also, when I was near a building with wifi the picture got very bad, so I decided to try the 5 GHz band. I bought 2 ALFA AWUS051NH v2 and they worked perfectly: I got about 500-600 meters with the stock antenna. But I noticed that when I'm on the point of losing the signal, the RSSI on the OSD is -69 dB. According to the datasheet of the ALFA AWUS051NH v2, the 802.11bg sensitivity is -93 dB, so my question is: why does my RSSI not go lower than -69 dB? I know that sensitivity is inversely proportional to the data rate, so would setting the data rate lower solve this? In the script it is already set to 24M; do I need to go even lower?

    • You could try to lower the data rate. One little hint: often the drivers more or less make up the RSSI value. I would not trust the absolute value; use it more as a relative RSSI.

      • Nathaniel permalink

        Thank you for your reply; I will try to lower the data rate. For me the range is a bit disappointing, so I was looking for a cause. The wifi transmitter claims 27 dBm, and if I run ifconfig it even shows 30 dBm. With the stock 5 dBi antenna on both sides I get a range of about 500-600 meters; with two 5.8 GHz Fatshark cloverleafs I get around 250-350 meters, but with a more stable picture. My goal is around 1000 meters of range.
        Does anyone have experience with the same ALFA wifi stick? What range do you get?
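
        For rough range planning, free-space path loss explains part of the 2.4 vs. 5 GHz difference discussed here. The sketch below uses the standard FSPL formula and ignores antenna gain, multipath and ground reflections, so treat the numbers only as a lower bound on the real loss (channel frequencies are examples):

```shell
# FSPL(dB) = 20*log10(d_metres) + 20*log10(f_MHz) - 27.55
fspl() {
    awk -v d="$1" -v f="$2" \
        'BEGIN { printf "%.1f\n", 20*log(d)/log(10) + 20*log(f)/log(10) - 27.55 }'
}
fspl 600 5180   # 600 m on a 5 GHz channel: ~102.3 dB
fspl 600 2412   # 600 m on a 2.4 GHz channel: ~95.7 dB
```

        All else being equal, the 5 GHz link loses roughly 6-7 dB more over the same distance, which matches the shorter ranges reported with the 5 GHz sticks.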

  124. milkrong permalink

    Hi, I have a problem when setting the wifi card (TL-WN722N) to monitor mode:
    after "sudo iw dev wlan0 set monitor otherbss fcsfail"
    it shows "command failed: operation not supported (-95)"

  125. There is a new Raspberry camera! 8 Mpix, a better sensor, and it can do 720p60.
    Any idea if this will let us use 720p60 in raspivid anytime soon?

  126. paderb permalink

    I am considering buying a 5.8 GHz FPV receiver with a built-in 2.4 GHz wifi transmitter (ebay item 381570072105). The item heading is: "For DJI Gopro FPV System 5.8Ghz Receiver + WIFI Transmitter 2in1 For Iphone Ipad"

    I am told by the manufacturer/seller that the wifi receiver has to be an Android device with the correct app loaded. I wanted to know whether there is a way I can receive and monitor the output on a PC laptop which has a 17″ screen. The wifi dongle that I would be using is a NetGear WNDA3200, an 802.11n item.

    Do you have any ideas on the subject?

  127. Daniel permalink

    Great project! And a good heart, for choosing to share it and spending nights writing clear manuals for a lot of us who never gave you a single dime.

    Everyone looks at this choice to give it away as if it's normal, but we all know it isn't. Most people get protective of things like this and refuse to share.

    Thanks befinitiv . . for being different

    • I totally agree! The amount of work he has put into this is incredible, and to share it like this is phenomenal! Befinitiv has single-handedly created an open-source movement and inspired countless others!!

      Cheers to befinitiv!

  128. Joe permalink

    Hi Befinitiv,

    Fantastic effort. You inspired me to get a couple of pi and start my own experiments. I’ve written my own code to talk to the camera and encoder rather than use raspivid, although that code was a huge help in getting it working.

    I am experimenting with MJPEG rather than H.264; it is less efficient but faster to encode, and since there is no interframe coding it is less susceptible to lost packets (I'm hoping). So far I have 1280×720 @ 30fps with a very consistent 30 ms lag to the encoder callback, running at 5 Mbps. I've also tried it at 10 Mbps and the quality improves with no additional latency. Latency was measured similarly to your LED test; usually the LED is switched on and becomes visible in the next frame (sometimes it starts to turn on in the same frame), and I re-framed the JPEG stream with the timing information. By fiddling with the camera settings I can move this latency around, so there is still room for improvement.

    I haven’t built the decoding end yet; rather, I’ve just been recording the data to file and replaying it in a programme I’ve written on the PC, stripping out the custom timing data and overlaying it on top.

    My goal is to create a lower definition stream as a fallback and interleave it with the 720p, then use your tx/rx code to deliver the data. I will integrate this directly into the camera code.

    I really need to write all this up on a blog and share the code.

    Cheers, Joe

    • Wow, that sounds great! If you could share your code and experiments, that would be really helpful to other people.

      • Joe permalink

        I’ve made a little more progress and have the capture/encode pipeline working well now. I’ve dropped the resolution down to 800×600 whilst I sort out the rendering pipeline; this in itself should be much better than analogue, and it is the resolution of the Fatshark HD goggles.

        The experimental setup is as follows:
        Sensor capturing in 4:3 1296×972 at 42fps
        Jpeg IMAGE (not video) encoder running at 800×600 42fps
        UDP 100Mb/s wired network between pi and pc

        The capture programme on the Pi listens on a UDP port and receives commands to switch an LED on/off. It continually captures encoded JPEGs and writes them out (each frame fits in a single datagram).

        A programme on the PC sends commands to switch the LED on/off at 0.5Hz (on for 1s, off for 1s). This programme also listens for JPEG packets and, for each packet, decodes the JPEG and runs a very simple (<100us) analysis to determine whether the LED has switched on in that frame. This removes the host rendering pipeline from the test, which, since it is running in Java, isn't very controllable.

        Latency is ~31ms from the host sending the command to switch on the LED to receiving the frame with the LED on, which is pretty good. 800×600 at 42fps is 9.5Mbit/s and represents a latency of 1.3 frames, including the ~1ms to send the command to the Pi.

        Moving to the Pi decode+HDMI pipeline will require further tuning.

        I've noticed a number of interesting issues with memory corruption using zero_copy on the camera/encoder, so I am not using it.

        If nothing else, this shows it is possible to get very low latency frames out of the Pi.
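        The frames-of-latency arithmetic above (31ms at 42fps ≈ 1.3 frames) can be sketched as a tiny helper. The function name is made up for illustration; it is not part of wifibroadcast or Joe's code:

```c
/* Hypothetical helper: express a measured command-to-frame latency
 * in frames at a given capture rate. */
double frames_of_latency(double latency_ms, double fps)
{
    double frame_period_ms = 1000.0 / fps; /* ~23.8 ms at 42 fps */
    return latency_ms / frame_period_ms;
}
```

        With latency_ms = 31 and fps = 42 this gives roughly 1.3, matching the figure quoted above.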

    • Adam permalink

      This is awesome, great work. I and I’m sure many others would love to hear more and maybe help out if possible.

      Have you looked at all at this OpenMAX implementation?

      PS I wonder how MJPEG compares to H.264 with no interframe compression.

    • Don’t forget, at 60Hz it takes 16.67ms per frame. With 30ms (2×16.67 = 33.34ms), it’s about 2 frames behind what is happening in reality. This is already very, very fast. What you are seeing here is: capturing 1 frame, transmitting, decoding 1 frame. IMHO you can’t go faster than this.

      When I am doing only a capture (raspivid) with a camera recording the monitor that is attached to the same board, I get about 6 frames of delay, that is 100ms (6×16.67ms). Setup:
      Monitor resolution: 1920x1080x60Hz
      Capture video: Mode5-1280x720x48Hz

      But currently my 5 inch display (1080×1920) is driven by a MIPI/DSI 1.1 IC from Toshiba.

      Don’t forget, analogue can’t do much better: with its interlaced method it also takes about 2×16.67ms before there is a complete picture! Or it must do only half a frame (very low resolution), then it will be faster.

      Although I think H264 is more resilient for FPV use, by nature it introduces digital noise when the WiFi signal (my aim) is getting weaker. You don’t have any of this with MJPEG; you get lost frames instead.

      • >> monitor that is attached at the same board, i get about 6 frames delay, that is 100ms (6×16,67ms).

        Made a wrong statement here; here’s the correct one: at Mode 5 I am recording at 48Hz, which takes 20.83ms per frame. 6×20.83 = 124.98ms delay.

        Now with the WiFi transmission there is virtually *no* delay. All the delay is in the encoding/decoding. Of course encoding takes the biggest hit in GPU power. A higher bitrate results in 1 frame quicker encoding. I-frames don’t do anything that I can measure.

  129. Athar Mian permalink

    All very interesting and extremely generous of the blog author and others to contribute ! Truly a new open source movement…

    I quickly read the implementation with the PiZero receiver (c. Jan 1, 2016 comment).

    Has anyone tried to use the $3 ESP8266 WiFi module/MCU (instead of the $15 TP-Link WN722N card) with the PiZero on either the Tx or Rx end? Here is the link:

    Of course the $5 PiZero is now much more available, and comes with a cam ribbon.

  130. Hello, the ESP8266 baud rate is limited to 1Mbit/s, which is too low a limit to transmit video.
    Where did you see that the RPi Zero is now much more available? It is out of stock everywhere.

    • Athar Mian permalink


      1. See here for the latest, April 2016, datasheet:

      On p16, the UART speed example is 115,200 baud × 40 (clock) = ~4.6 Mbps. In a comment from June 2016, there is mention of a 960K baud rate update.

      Also this chipset supports WiFi b/g/n. And many other functions: PWM, IR, DAC, LEDs, 10 Mbps Ethernet,…

      2. The Raspberry Pi Foundation announced last month production of 40K PiZero units/month plus a 200K+ inventory. Since then I haven’t heard many complaints.

  131. Chris permalink

    I am really desperate by now. My problem is that I am using two Ubiquiti Rockets via LAN instead of USB WiFi sticks. I tried to change wlan0 to eth0 in every script, but every time I power up the system I only see the Raspberry booting and the console where I can navigate through the system. Any suggestions what else I can try? Thanks for your help!!!!

    • Hi Chris

      As far as I understand, these Ubiquiti devices are Ethernet-to-Wifi gateways. Such a gateway does not allow the use of wifibroadcast. The reason for this is that wifibroadcast needs to talk directly to the WIFI chipset to put it into a non-standard mode. Only a few chipsets are capable of doing so. The models are listed on this page.

      • Chris permalink

        Thanks a lot for your answer. Well, then I will have to change my equipment 🙂

  132. Hi folks,

    Has anyone tried to rewrite the Atheros AR9271 driver to be able to do 5MHz or 10MHz bandwidth?
    If we can do this, the signal will be more reliable for sure.
    I’m sure we have to move forward in that direction, because given the total data rate transmitted we don’t need 20MHz of bandwidth.
    And as you may know, the narrower the bandwidth, the more non-LOS you can do 🙂

    • jholster permalink

      Alexandre: There exist research/experimental drivers/hacks for the PCI versions of AR* chipsets, but AFAIK the ath9k_htc driver for the USB version (AR9271) does not support such features.

      • Alexandre Licinio permalink

        Ahhh ok, thanks. We did a driver with this support and setup instructions, but we got stuck when we wanted to activate the 5MHz feature with iw.
        So maybe with the SR71-E from Ubiquiti?

  133. I wrapped up today an ultra mobile Pi Zero transmitter. Works perfectly fine.

    • Athar Mian permalink


      Congrats! I always thought that the $5 PiZero had great potential, now that it is finally in mass production as of last month.

      Does anyone know how to use a PiZero with a cheap QAM tuner/demod to process a DVB-C cable TV signal? This way the PiZero could be a cheap but flexible set-top box as well.

      The $10 RTL-SDR dongle would be expensive. But then many low-end set-top boxes ($30-35) use mass-made digital QAM tuners from Realtek or STM ($2 STV-series QAM + ADC outputting MPEG-TS from a QAM input).

  134. la5xoa permalink

    For Micro-setup with Zero? Must have external 3.3v reg.

  135. Alex. permalink

    Thank you for your great work.
    I tried to use your program myself to communicate between two Raspberry Pis. One had a TL-WN722N dongle and the other an RT5370 dongle. When I tried to communicate command line to command line, it gave me an output like this:

    HP0H,4, ,


    “hahahha” was actually a part of the sent message. So it must have received it in some way, but I only get a new message on the RX side after every fifth line I send on the TX side. Is this maybe because of the RT5370 dongle, or is it another problem?

  136. An outstanding share! I have just forwarded this onto a coworker
    who has been doing a little homework on this.
    And he in fact ordered me dinner due to the fact that I stumbled upon it for him…
    lol. So allow me to reword this…. Thanks for the meal!!

    But yeah, thanks for spending the time to discuss this matter here on your website.

  137. Tony permalink

    2016-09-03 16:30EST

    Hi there, first and foremost congratulations on the efforts! I’m very excited to have found your project!
    I am very interested in contributing to this project, especially since I use Linux extensively and I have some experience with embedded systems.
    I’m building a quadcopter and simply don’t want to pay $400 for 1970s video technology. And the digital options seem too expensive.

    I didn’t read all the comments, but I’m sure many have pointed out that new digital receivers operate on the 2.4GHz band. I noticed your radio is 35 MHz?

    First, I’m afraid the wifi will interfere with RC. Second, you can get about 1Gbps on 5GHz, and that’s what I would like to aim for.

    I’m going to buy a few 5GHz USB dongles (I have a Linksys AC1200) and see if I can implement your code on these 5GHz dongles.

    I’m not very experienced in modifying firmware, but I speak C/C++ and many others.

    If you are still active in this project I hope we can collaborate.

    – Tony
    dr.guerau at gmail

    • If you could do that, it would be awesome! The new AC adapters are really good; please let us know how far you get. Thanks, man!

    • Hi Tony

      Thank you! Your help is very appreciated!

      You are right, I first used wifibroadcast with my 35MHz RC system. This caused some crashes of my quad so I upgraded to 2.4GHz RC. And you are right, that interferes heavily with wifibroadcast. There are basically 3 ways around that:

      1) Use 5GHz wifi dongles
      2) Run 2.4GHz wifi dongles at 2.3GHz
      3) Modify the RC system ( )

      Option 1 reduces the maximum range, since 5GHz does not penetrate as well as 2.4GHz.
      Option 2 is in most cases illegal.
      That’s why I use option 3. It’s a bit hacky, but it works very well and lets me use the legal wifi channels.

      Concerning the 1Gbps: you should only use such a high rate if you make use of the bandwidth. Using a high bandwidth lowers your receiver’s sensitivity quite a lot (→ reduced range) and increases the chances of bit errors. Also, most embedded systems will not be able to provide data to the card at that rate. Most likely even the card will not be able to process that data rate. For example, the card I am currently using for 2.4GHz provides only 32mbit/s of net throughput at a 130mbit/s air data rate.
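      As a back-of-the-envelope check of those numbers (an illustrative sketch, not wifibroadcast code; the helper name is made up):

```c
/* Net/air efficiency from the figures above: 32 mbit/s of net throughput
 * out of a 130 mbit/s air data rate is roughly a quarter. */
double net_efficiency(double net_mbps, double air_mbps)
{
    return net_mbps / air_mbps; /* ~0.25 for 32/130 */
}
```

      So only about a quarter of the advertised air rate is usable payload; the same kind of overhead should be expected at higher link rates.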

  138. Nicolas permalink

    Hey Befi,

    it’s been a year or so since I last posted on your blog, not sure if you still remember me 😉

    Finally made a pre-built image that is easier to use for non-Linux people (see here if interested) and have been doing a lot of testing lately.

    There seems to be an issue with diversity that has gone unnoticed so far:

    When using more than one card and there is just a little packet loss, all is good: diversity works as expected and the picture is stable. Also, when one card does not receive anything at all, as long as the other card is still receiving packets, the picture is stable.

    But when one card has heavy packet loss, bad blocks appear, although the other card still receives 100% (or almost 100%) of the packets. I have experimented with the “-d” parameter; it seems that with “-d 2” the problem gets better, but it does not disappear completely.

    Another thing I have observed (but haven’t been able to reproduce yet) is what user “Extra” in the comments on your blog already noticed: “Also several times I observed strange behavior, that reception shows that no errors in received packets (no damaged rate at all) video is displayed corrupted.”

    Here is a video that shows this behavior: it can be reproduced with both 722N and CSL-300Mbit sticks:

    • Hi Nicolas

      Of course I remember 🙂 I followed the work you have done there and I am very happy with what I have seen! One little request, however: it is quite hard to extract all the things you have done out of the image itself. People would really benefit if you shared your image creation process and made it transparent to others what exactly has been done. This serves three important purposes:

      – More “open spirit”. Everything is clear to everyone.
      – Users can modify/extend the image to their own needs.
      – Just like you took things from this blog and improved them, there might be someone out there who wants to improve your images even further. And by giving this person the ability to recreate your images, you would make it much more likely for those improvements to happen.

      Do you know about my image creation scripts?
      They are by no means perfect, but I found them quite handy. For the first image I hated them because everything took longer than just doing things by hand. But from the second image on I was really glad, because I could reuse everything from before with no extra work. If you have any questions about the image creation process, I am happy to help. I also have a forthcoming version somewhere in my stash supporting a later rpi-image as well as kernels for rpi1 and 2. I never published it because I could not test things sufficiently. Testing really starts to suck. There are now so many combinations… but I think you know that even better than me by now 😉

      To clarify: please don’t get me wrong – I really appreciate the amount of work you put into this FPV project! I am just a strong supporter of openness and I think (maybe together) we could make things more open 🙂

      Now to the actual question of your comment: Very interesting observation. I just looked into the source code of wifibroadcast and so far I have found no good explanation. If you look at rx.c:399 you see that a packet is only saved if at this slot we do not yet have a “sane” packet with a valid CRC. So this should work in both cases (first sane, second damaged or first damaged, second sane) and would not allow a sane packet to be replaced by a damaged one. However, if the damaged one is so damaged that the sequence number is corrupt, it might be that we are checking and writing the data into the wrong spot. But this is also not an explanation. Because if we assume that one card delivers mostly sane packets, then these packets should overwrite the data at the wrong spot. And in the other order the bad packet should not overwrite the good packet (because replacing a sane with a corrupted packet is excluded by line 399).
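      The rule around rx.c:399 described above could be sketched like this (illustrative types and names, not the actual wifibroadcast structures):

```c
#include <stdbool.h>
#include <string.h>

/* Simplified model of the slot logic: a packet is only saved if the slot
 * does not yet hold a "sane" packet with a valid CRC, so a sane packet
 * can never be replaced by a damaged one. */
typedef struct {
    bool received;
    bool crc_ok;
    unsigned char data[1024];
    size_t len;
} packet_slot;

bool store_packet(packet_slot *slot, const unsigned char *data,
                  size_t len, bool crc_ok)
{
    if (slot->received && slot->crc_ok)
        return false; /* already have a sane packet: keep it */
    memcpy(slot->data, data, len);
    slot->len = len;
    slot->received = true;
    slot->crc_ok = crc_ok;
    return true;
}
```

      Under this rule, both orders (sane then damaged, damaged then sane) end with the sane packet in the slot, which is why the corruption needs another explanation.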

      Very strange! We must be missing something. I would suggest to add a printf somewhere around rx.c:392 that shows the card number, sequence num, block num, crc status. Then try to reproduce the issue and check if there is something suspicious in these numbers. Maybe also add the debug in rx.c:210
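      Such a printout (around rx.c:392) could look like this sketch, which matches the format of the debug output posted later in this thread; the function is illustrative, not the actual rx.c code:

```c
#include <stdio.h>

/* Format one debug line with adapter number, sequence number, block
 * number, CRC status and payload length, e.g.
 * "adap 0 rec 23a4f blk 2f86 crc 1 len 1024". */
int format_debug_line(char *buf, size_t n, int adapter,
                      unsigned seq, unsigned blk, int crc_ok, int len)
{
    return snprintf(buf, n, "adap %d rec %x blk %x crc %d len %d",
                    adapter, seq, blk, crc_ok, len);
}
```

      Printing one such line per received packet makes it easy to spot whether both cards deliver the same sequence numbers back-to-back or delayed.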

      • Nicolas permalink


        thanks for your quick reply.

        You are right about my image not being very friendly for modification. I actually looked into your script, and alternatively into making .deb packages, long ago, but that would’ve been even more work in making the scripts and testing them, so I skipped that step.

        But the good thing is, Rangarid has started a Wifibroadcast image based on buildroot (what I wanted to do a year ago, but gave up on because it was way too much work…).

        When EZ-Wifibroadcast 1.3 is released, I will start to put things on Github and also contribute to the Buildroot Wifibroadcast image.

        Regarding the diversity issue: I will look into what you wrote, thanks.

        BTW: I chose to disable reception of packets with failed FCS because sometimes that caused something in the kernel to lock up, i.e. the wifi interfaces would simply not give out any packets anymore; ifconfig up/down etc. would not help, only a reboot.

      • Buildroot sounds like a very nice thing. Do you have a link to a repository? I think I’ll contribute to that as well.

        The FCS-thing you said is interesting. I assume that the video you showed me was also captured with no bad FCS-frames? If so, I’ll have to take another look at the WBC source. It sounds more and more like a real bug to me.

  139. Nicolas permalink

    The repository is

    Would be great if you also contribute to that.

    I have already started working on implementing TX diversity (seems to work, but the code is crude; I don’t have C skills …) and did some testing with transmitting SBUS and the Multiwii serial protocol for R/C control through wifibroadcast from ground to air. Although theory says it’s not that perfect because of the shared-medium nature of wifi and collisions etc., the mechanisms that avoid collisions seem to be working well enough. Maybe it would be possible to implement sending CTS-to-self frames before transmitting the R/C control packets to reduce packet loss and collisions to zero. Not sure if that would take up too much air-time with the additional packets though; needs some testing I guess.

    But the general problem with R/C control is that piping everything through wifibroadcast is not very good; I think it would be better to somehow integrate everything and make it just send one packet after the other, no FEC and blocks etc. If one gets lost, so be it; there will be another one with updated stick positions following shortly anyway. BTW, I’m quite stunned looking at the SBUS protocol: it has no checksum and also no good distinguishable packet end marker. Cleanflight relies on counting bytes and timing only. Quite dangerous; I had all channels go crazy during testing.
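    For reference, the SBUS framing being criticised looks roughly like this (a sketch from memory: 25-byte frames, 0x0F start byte, 0x00 end byte; verify against your receiver before relying on it). The marker check below is essentially all the validation the protocol offers, which is exactly the weakness described above:

```c
#include <stdbool.h>
#include <stdint.h>

#define SBUS_FRAME_LEN 25
#define SBUS_START     0x0F
#define SBUS_END       0x00

/* With no checksum in the protocol, this marker check is all a parser
 * can do; a 0x0F inside the payload can be mistaken for a frame start
 * once byte-counting loses sync. */
bool sbus_frame_plausible(const uint8_t frame[SBUS_FRAME_LEN])
{
    return frame[0] == SBUS_START && frame[SBUS_FRAME_LEN - 1] == SBUS_END;
}
```

    That is why implementations like Cleanflight fall back on byte counting and inter-frame timing, and why a single framing slip can corrupt all channels at once.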

    Yeah, the video was with no bad FCS frames. I’ll try and see what happens if I enable them again, maybe it gives a hint what might be wrong.

    • One additional question: did you use different types of adapters for diversity? This might explain the behavior. It could also be explained if the driver behaves strangely and does not forward frames immediately after reception. You should add a printout with just the adapter number and sequence number to check that. Ideally, the same sequence number should appear twice, with no other sequence numbers in between.

      • Nicolas permalink

        No, it was with either two 722N or two CSL 300Mbit sticks.

        I’ll try your suggestion.

      • Nicolas permalink

        Hmm, I did not manage to insert debug lines and find the right variables to use etc.
        But I have un-commented the existing debug lines; not sure if that helps.

        This is reception with both cards having near zero percent packet loss:
        24 packets coming in, 12 per card, 8 data + 4 fec
        adap 0 rec 23a4f blk 2f86 crc 1 len 1024
        adap 1 rec 23a50 blk 2f86 crc 1 len 1024
        adap 0 rec 23a50 blk 2f86 crc 1 len 1024
        adap 1 rec 23a51 blk 2f86 crc 1 len 1024
        adap 0 rec 23a51 blk 2f86 crc 1 len 1024
        adap 1 rec 23a52 blk 2f86 crc 1 len 1024
        adap 0 rec 23a52 blk 2f86 crc 1 len 1024
        adap 1 rec 23a53 blk 2f86 crc 1 len 1024
        adap 0 rec 23a53 blk 2f86 crc 1 len 1024
        adap 0 rec 23a54 blk 2f87 crc 1 len 1024
        removing block 2f86 at index 0 for block 2f87
        adap 1 rec 23a54 blk 2f87 crc 1 len 1024
        adap 0 rec 23a55 blk 2f87 crc 1 len 1024
        adap 1 rec 23a55 blk 2f87 crc 1 len 1024
        adap 0 rec 23a56 blk 2f87 crc 1 len 1024
        adap 1 rec 23a56 blk 2f87 crc 1 len 1024
        adap 0 rec 23a57 blk 2f87 crc 1 len 1024
        adap 1 rec 23a57 blk 2f87 crc 1 len 1024
        adap 0 rec 23a58 blk 2f87 crc 1 len 1024
        adap 1 rec 23a58 blk 2f87 crc 1 len 1024
        adap 0 rec 23a59 blk 2f87 crc 1 len 1024
        adap 1 rec 23a59 blk 2f87 crc 1 len 1024
        adap 0 rec 23a5a blk 2f87 crc 1 len 1024
        adap 1 rec 23a5a blk 2f87 crc 1 len 1024
        adap 0 rec 23a5b blk 2f87 crc 1 len 1024
        adap 1 rec 23a5b blk 2f87 crc 1 len 1024
        adap 0 rec 23a5c blk 2f87 crc 1 len 1024
        adap 1 rec 23a5c blk 2f87 crc 1 len 1024
        adap 0 rec 23a5d blk 2f87 crc 1 len 1024
        adap 1 rec 23a5d blk 2f87 crc 1 len 1024
        adap 0 rec 23a5e blk 2f87 crc 1 len 1024
        adap 1 rec 23a5e blk 2f87 crc 1 len 1024
        adap 0 rec 23a5f blk 2f87 crc 1 len 1024
        adap 1 rec 23a5f blk 2f87 crc 1 len 1024
        adap 0 rec 23a60 blk 2f88 crc 1 len 1024
        removing block 2f87 at index 0 for block 2f88
        adap 1 rec 23a60 blk 2f88 crc 1 len 1024
        adap 0 rec 23a61 blk 2f88 crc 1 len 1024
        adap 1 rec 23a61 blk 2f88 crc 1 len 1024
        adap 0 rec 23a62 blk 2f88 crc 1 len 1024

        This is reception with one card having 100% packet loss, other card near zero.
        As expected, only 12 packets coming in:
        adap 0 rec 32563 blk 431d crc 1 len 1024
        adap 0 rec 32564 blk 431d crc 1 len 1024
        adap 0 rec 32565 blk 431d crc 1 len 1024
        adap 0 rec 32566 blk 431d crc 1 len 1024
        adap 0 rec 32567 blk 431d crc 1 len 1024
        adap 0 rec 32568 blk 431e crc 1 len 1024
        removing block 431d at index 0 for block 431e
        adap 0 rec 32569 blk 431e crc 1 len 1024
        adap 0 rec 3256a blk 431e crc 1 len 1024
        adap 0 rec 3256b blk 431e crc 1 len 1024
        adap 0 rec 3256c blk 431e crc 1 len 1024
        adap 0 rec 3256d blk 431e crc 1 len 1024
        adap 0 rec 3256e blk 431e crc 1 len 1024
        adap 0 rec 3256f blk 431e crc 1 len 1024
        adap 0 rec 32570 blk 431e crc 1 len 1024
        adap 0 rec 32571 blk 431e crc 1 len 1024
        adap 0 rec 32572 blk 431e crc 1 len 1024
        adap 0 rec 32573 blk 431e crc 1 len 1024
        adap 0 rec 32574 blk 431f crc 1 len 1024
        removing block 431e at index 0 for block 431f
        adap 0 rec 32575 blk 431f crc 1 len 1024
        adap 0 rec 32576 blk 431f crc 1 len 1024
        adap 0 rec 32577 blk 431f crc 1 len 1024
        adap 0 rec 32578 blk 431f crc 1 len 1024

        This is reception in the problem-case, one adapter sees heavy packet loss.
        Not really sure how to interpret that, it looks like what you say, that packets come in delayed?

        adap 0 rec 100ba blk 1564 crc 1 len 1024
        adap 0 rec 100bb blk 1564 crc 1 len 1024
        adap 0 rec 100bc blk 1565 crc 1 len 1024
        removing block 1564 at index 0 for block 1565
        adap 1 rec 100bd blk 1565 crc 1 len 1024
        adap 0 rec 100bd blk 1565 crc 1 len 1024
        adap 1 rec 100bf blk 1565 crc 1 len 1024
        adap 0 rec 100be blk 1565 crc 1 len 1024
        adap 1 rec 100c0 blk 1565 crc 1 len 1024
        adap 0 rec 100bf blk 1565 crc 1 len 1024
        adap 1 rec 100c3 blk 1565 crc 1 len 1024
        adap 0 rec 100c0 blk 1565 crc 1 len 1024
        adap 0 rec 100c1 blk 1565 crc 1 len 1024
        adap 0 rec 100c2 blk 1565 crc 1 len 1024
        adap 0 rec 100c3 blk 1565 crc 1 len 1024
        adap 0 rec 100c4 blk 1565 crc 1 len 1024
        adap 0 rec 100c5 blk 1565 crc 1 len 1024
        adap 0 rec 100c6 blk 1565 crc 1 len 1024
        adap 1 rec 100c6 blk 1565 crc 1 len 1024
        adap 0 rec 100c7 blk 1565 crc 1 len 1024
        adap 0 rec 100c8 blk 1566 crc 1 len 1024
        removing block 1565 at index 0 for block 1566
        adap 1 rec 100ca blk 1566 crc 1 len 1024
        adap 0 rec 100c9 blk 1566 crc 1 len 1024
        adap 1 rec 100cc blk 1566 crc 1 len 1024
        adap 0 rec 100ca blk 1566 crc 1 len 1024
        adap 1 rec 100ce blk 1566 crc 1 len 1024
        adap 0 rec 100cb blk 1566 crc 1 len 1024
        adap 1 rec 100cf blk 1566 crc 1 len 1024
        adap 0 rec 100cc blk 1566 crc 1 len 1024
        adap 1 rec 100d8 blk 1567 crc 1 len 1024
        removing block 1566 at index 0 for block 1567
        vvvvmmmm vvmv
        adap 0 rec 100cd blk 1566 crc 1 len 1024
        adap 1 rec 100d9 blk 1567 crc 1 len 1024
        adap 0 rec 100ce blk 1566 crc 1 len 1024
        adap 1 rec 100dc blk 1567 crc 1 len 1024
        adap 0 rec 100cf blk 1566 crc 1 len 1024
        adap 0 rec 100d0 blk 1566 crc 1 len 1024
        adap 0 rec 100d1 blk 1566 crc 1 len 1024
        adap 0 rec 100d2 blk 1566 crc 1 len 1024
        adap 0 rec 100d3 blk 1566 crc 1 len 1024
        adap 0 rec 100d4 blk 1567 crc 1 len 1024
        adap 0 rec 100d5 blk 1567 crc 1 len 1024
        adap 0 rec 100d6 blk 1567 crc 1 len 1024
        adap 0 rec 100d7 blk 1567 crc 1 len 1024
        adap 0 rec 100d8 blk 1567 crc 1 len 1024
        adap 0 rec 100d9 blk 1567 crc 1 len 1024
        adap 0 rec 100da blk 1567 crc 1 len 1024
        adap 0 rec 100db blk 1567 crc 1 len 1024
        adap 0 rec 100dc blk 1567 crc 1 len 1024
        adap 0 rec 100dd blk 1567 crc 1 len 1024
        adap 0 rec 100de blk 1567 crc 1 len 1024
        adap 0 rec 100df blk 1567 crc 1 len 1024
        adap 0 rec 100e0 blk 1568 crc 1 len 1024
        removing block 1567 at index 0 for block 1568
        adap 1 rec 100e1 blk 1568 crc 1 len 1024
        adap 0 rec 100e1 blk 1568 crc 1 len 1024
        adap 1 rec 100e2 blk 1568 crc 1 len 1024
        adap 0 rec 100e2 blk 1568 crc 1 len 1024
        adap 1 rec 100e8 blk 1568 crc 1 len 1024
        adap 0 rec 100e3 blk 1568 crc 1 len 1024
        adap 1 rec 100eb blk 1568 crc 1 len 1024
        adap 0 rec 100e4 blk 1568 crc 1 len 1024
        adap 0 rec 100e5 blk 1568 crc 1 len 1024
        adap 0 rec 100e6 blk 1568 crc 1 len 1024
        adap 0 rec 100e7 blk 1568 crc 1 len 1024
        adap 0 rec 100e8 blk 1568 crc 1 len 1024
        adap 0 rec 100e9 blk 1568 crc 1 len 1024
        adap 0 rec 100ea blk 1568 crc 1 len 1024
        adap 0 rec 100eb blk 1568 crc 1 len 1024
        adap 0 rec 100ec blk 1569 crc 1 len 1024
        removing block 1568 at index 0 for block 1569
        adap 1 rec 100f2 blk 1569 crc 1 len 1024
        adap 0 rec 100ed blk 1569 crc 1 len 1024
        adap 1 rec 100f4 blk 1569 crc 1 len 1024
        adap 0 rec 100ee blk 1569 crc 1 len 1024
        adap 1 rec 100f6 blk 1569 crc 1 len 1024
        adap 0 rec 100ef blk 1569 crc 1 len 1024
        adap 1 rec 100f9 blk 156a crc 1 len 1024
        removing block 1569 at index 0 for block 156a
        vvmvvmvm vvmm
        adap 0 rec 100f0 blk 1569 crc 1 len 1024
        adap 1 rec 100fc blk 156a crc 1 len 1024
        adap 0 rec 100f1 blk 1569 crc 1 len 1024
        adap 1 rec 100fe blk 156a crc 1 len 1024
        adap 0 rec 100f2 blk 1569 crc 1 len 1024
        adap 0 rec 100f3 blk 1569 crc 1 len 1024
        adap 0 rec 100f4 blk 1569 crc 1 len 1024
        adap 0 rec 100f5 blk 1569 crc 1 len 1024
        adap 0 rec 100f6 blk 1569 crc 1 len 1024
        adap 0 rec 100f7 blk 1569 crc 1 len 1024
        adap 0 rec 100f8 blk 156a crc 1 len 1024
        adap 0 rec 100f9 blk 156a crc 1 len 1024
        adap 0 rec 100fa blk 156a crc 1 len 1024
        adap 0 rec 100fb blk 156a crc 1 len 1024
        adap 0 rec 100fc blk 156a crc 1 len 1024
        adap 0 rec 100fd blk 156a crc 1 len 1024
        adap 0 rec 100fe blk 156a crc 1 len 1024
        adap 0 rec 100ff blk 156a crc 1 len 1024
        adap 0 rec 10100 blk 156a crc 1 len 1024
        adap 1 rec 10101 blk 156a crc 1 len 1024
        adap 0 rec 10101 blk 156a crc 1 len 1024
        adap 1 rec 10102 blk 156a crc 1 len 1024
        adap 0 rec 10102 blk 156a crc 1 len 1024
        adap 0 rec 10103 blk 156a crc 1 len 1024
        adap 0 rec 10104 blk 156b crc 1 len 1024
        removing block 156a at index 0 for block 156b
        adap 1 rec 10104 blk 156b crc 1 len 1024
        adap 0 rec 10105 blk 156b crc 1 len 1024
        adap 1 rec 1010b blk 156b crc 1 len 1024
        adap 0 rec 10106 blk 156b crc 1 len 1024
        adap 1 rec 1010d blk 156b crc 1 len 1024
        adap 0 rec 10107 blk 156b crc 1 len 1024
        adap 0 rec 10108 blk 156b crc 1 len 1024
        adap 0 rec 10109 blk 156b crc 1 len 1024
        adap 0 rec 1010a blk 156b crc 1 len 1024
        adap 0 rec 1010b blk 156b crc 1 len 1024
        adap 0 rec 1010c blk 156b crc 1 len 1024
        adap 0 rec 1010d blk 156b crc 1 len 1024
        adap 0 rec 1010e blk 156b crc 1 len 1024
        adap 0 rec 1010f blk 156b crc 1 len 1024
        adap 1 rec 1010f blk 156b crc 1 len 1024
        adap 0 rec 10110 blk 156c crc 1 len 1024
        removing block 156b at index 0 for block 156c
        adap 1 rec 10114 blk 156c crc 1 len 1024
        adap 0 rec 10111 blk 156c crc 1 len 1024
        adap 1 rec 1011a blk 156c crc 1 len 1024
        adap 0 rec 10112 blk 156c crc 1 len 1024
        adap 1 rec 1011d blk 156d crc 1 len 1024
        removing block 156c at index 0 for block 156d
        vvvmmmvm vmmm
        adap 0 rec 10113 blk 156c crc 1 len 1024
        adap 1 rec 1011f blk 156d crc 1 len 1024
        adap 0 rec 10114 blk 156c crc 1 len 1024
        adap 1 rec 10120 blk 156d crc 1 len 1024
        adap 0 rec 10115 blk 156c crc 1 len 1024
        adap 1 rec 10121 blk 156d crc 1 len 1024
        adap 0 rec 10116 blk 156c crc 1 len 1024
        adap 1 rec 10124 blk 156d crc 1 len 1024
        adap 0 rec 10117 blk 156c crc 1 len 1024

      • Perfect, thanks a lot. This proves my assumption.

        With the default settings we have a buffer of one block (usually 8 DATA + 4 FEC packets). Let’s say the block that is currently in the buffer has ID 9. If wifibroadcast now sees a packet with block ID 10, it has to make room for the new ID. Since we only have space for one block, we need to remove block 9. If packets with block ID 9 still arrive after the packet with block ID 10, they will be ignored (replacing a block in the buffer is only done if the new ID is higher than the current one, which makes sense: we do not want to go backwards in time).

        In the example above, block 9 would come from your card with good reception and block 10 from the one with bad reception. Just a single packet of block 10 can therefore destroy block 9.

        You see it very well in the debug outputs:

        adap 0 rec 100cb blk 1566 crc 1 len 1024
        adap 1 rec 100cf blk 1566 crc 1 len 1024
        adap 0 rec 100cc blk 1566 crc 1 len 1024
        — everything fine ’til here —
        adap 1 rec 100d8 blk 1567 crc 1 len 1024
        removing block 1566 at index 0 for block 1567
        — BAM! Card 1 received already block 1567. WBC replaces block 1566 with the new block (even if block 1566 is not yet finished) —
        vvvvmmmm vvmv ( “v” means valid, “m” missing. So we have 4 missing DATA packets and only 3 valid FECS. This means we cannot reconstruct the block.)
        — All following packets with block 1566 will be ignored —
        adap 0 rec 100cd blk 1566 crc 1 len 1024
        adap 1 rec 100d9 blk 1567 crc 1 len 1024
        adap 0 rec 100ce blk 1566 crc 1 len 1024
        adap 1 rec 100dc blk 1567 crc 1 len 1024
        adap 0 rec 100cf blk 1566 crc 1 len 1024
        — the following four packets are the four “m”s (missing) of 1566 that we now just throw away… —
        adap 0 rec 100d0 blk 1566 crc 1 len 1024
        adap 0 rec 100d1 blk 1566 crc 1 len 1024
        adap 0 rec 100d2 blk 1566 crc 1 len 1024
        adap 0 rec 100d3 blk 1566 crc 1 len 1024

        So it does indeed seem that the busy card queues up packets and delivers them later than the mostly idling card.

        As a countermeasure you should increase the number of buffered blocks with the parameter -d. If, for example, -d were two, then in the examples above wifibroadcast would keep blocks 1566 and 1567 at the same time. Maybe a value of two is a better default setting. This, however, increases latency by one block.
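        The window behaviour described above can be modelled in a few lines (an illustrative sketch, not the actual wifibroadcast buffer code; `window` here corresponds to the -d parameter):

```c
#include <stdbool.h>

/* Model of the block window: packets for blocks older than the window
 * are dropped; a packet for a newer block slides the window forward,
 * evicting the oldest buffered block whether it is complete or not. */
typedef struct {
    int oldest;  /* lowest block id still buffered, -1 = empty */
    int window;  /* number of buffered blocks, i.e. the -d parameter */
} block_window;

bool accept_packet(block_window *w, int block_id)
{
    if (w->oldest < 0)
        w->oldest = block_id;
    if (block_id < w->oldest)
        return false;                          /* late packet: ignored */
    if (block_id >= w->oldest + w->window)
        w->oldest = block_id - w->window + 1;  /* evict old block(s) */
    return true;
}
```

        With window = 1, a single packet of block 1567 evicts block 1566 and all late 1566 packets are dropped, reproducing the log pattern; a larger window keeps older blocks alive at the cost of latency.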

  140. Nicolas permalink

    I have already experimented with the “-d 2” setting, but like I wrote, the problem did not go away completely. I’ll test again though with debugging enabled. I really would like to avoid the added latency. Wouldn’t it be possible to add an adaptive jitter buffer in front of the whole interleaving/fec decode logic to keep the latency to a minimum? I guess debugging through the whole kernel to see why and where the packets get queued up and fixing that is just too much work (?) I’ll see if I can add some timestamping to see how much the packets get delayed.

    • Maybe there are situations where the delay between the cards exceeds two blocks. Not unlikely, judging by the debug logs you provided.

      The adaptive buffer would be possible. Back when I added the -d parameter I thought about that option but dismissed it because it would introduce unpredictable jitter: assuming just the oldest block is incomplete, the algorithm would have to wait until this block drops out of the maximum buffer span (just like now). If, however, all the newer blocks were complete, WBC would spill all that buffered data out at once. In the current implementation WBC mostly follows the input data rate and tries to avoid bursts (except of course the burstiness of the blocks themselves).

      • Nicolas permalink

        Did a lot more testing. -d 2 or even -d 5 doesn’t fix this ‘late packets issue’ completely.

        This reduces the usable range quite a lot, considering that signal differences between RX cards can easily be 6dB or more when the aircraft is moving and tilting etc.

        Testing showed that it gets quite glitchy when one card is at its range limit, even though three other cards still have good reception. Those glitches wouldn’t be so bad if it were just single packets missing from the data stream. But since a late packet can destroy a whole block, the glitches get pretty bad and unflyable.

        Obviously, increasing the number of packets per block also doesn’t really help. It makes things better because the chance that a late packet falls outside the current block is lower, but at the same time it makes things worse, because even more data gets destroyed if a packet does fall outside the current block.

        However, it seems changing the kernel timer frequency to 1000Hz reduces this problem a lot.

        The only problem is that it makes the Pi lock up sometimes. According to a post I found somewhere, these options in cmdline.txt (in exactly that order) should fix that lock-up issue.

        dwc_otg.fiq_fsm_enable=0 dwc_otg.fiq_enable=0 dwc_otg.nak_holdoff=0

        So far it seems to work, but need to do more testing to make sure it’s really stable and see if there are still delayed packets occasionally. Maybe it also makes sense to play with other PREEMPT options or use RT patches, so far I have only changed the timer frequency …

        Now assuming that the higher timer frequency runs stable, maybe instead of the extra jitter buffer, would it be possible (and less work ;)) to modify the RX in such a way that it simply throws away those few remaining late packets?

        This would keep latency low and still be a lot better than like it is now, where a single packet at the wrong time can destroy a whole block.

  141. Hello.
    I just tested telemetry text overlaid on the video with picamera and the mavproxy module (MyPiModule) before transmission in wifibroadcast (255 chars max).
    Here you can find an example of a python script:
    This solution is very simple and effective.

  142. deathstroke999 permalink

    Any guidance on how to patch the firmware on a different Atheros (ath9k-driver-compatible) wireless card? Specifically, I was looking at the Mikrotik R11e-2HnD. It has true diversity with multiplexing. Also, since it’s multiplexed, you may be able to use a lower modulation and increase range. Furthermore, you can get up to 800mW of power out of this. It would be great to figure out how to use this with wifibroadcast.

    *Note that it should work with the RPi: when I contacted their support, they said the card should work with an mPCIe-to-USB adapter, though you would have to mod it to get the full 800mW over USB.

    *Also note that this card should support DFS, which I believe the USB adapters don’t, though it doesn’t seem wifibroadcast does either, nor does it seem to be updated any longer.

    *The card also advertises a range between 2.1GHz and 2.7GHz, which means out-of-band ham radio patches may be much more effective than with the USB cards Befi tried.

    This is the card 3DR Solo people use to upgrade, which is how I actually found it. After researching I did find some similar 3×3:3 MIMO cards with the same chipset, but they are much more expensive. There are also others, though they either don’t seem to be supported by the ath9k driver or are much older and provide very low power, like 20dBm, and lower sensitivity, so you’re not really gaining anything from the extra MIMO stream. Ranges with this card on the Solo have been reported up to 2.5mi with omni antennas on RX and some upgraded antennas on the controller. Getting these and properly setting them up between 2 RPis with better antennas, I can only imagine the range (even without wifibroadcast).

    In any case, if someone at least has guidance on how to change the modulation, or wants to help me find out, this may be a more viable future option than wifibroadcast in terms of range and reliability (I mean, the Solo works up to 2.5mi with these cards), while offering a full wifi link.

    Anyways check this out guys seems like a great option.

    • deathstroke999 permalink

      To be clear, I would be looking at how to patch the firmware (I believe it’s the Atheros AR9580) to make it work with wifibroadcast, if it will work at all. If the DFS hopping/multiplexing features won’t work on the card, that makes it a bit less helpful. On the other hand, FEC and retransmission might not work on the card either, because they would have to be implemented differently.

      On the other side, I’m looking at the alternative of keeping it as a regular wifi link (which poses some advantages) due to the already much higher reliability and performance of the card, with the modulation/power changed to the full 800mW if wanted (depends on use). Also, while browsing OpenWrt (they have real long-range wifi blogs) instead of the overused Alfa cards, it does seem there are ways to modify FEC and retransmission on these sorts of cards without writing whole programs (unless of course you want more functionality, such as the unidirectional link).

      So I hope this helps some people; if you can help, please let me know. Also, maybe this is something befinitiv can look into, possibly simplifying from a program-like structure to more of a script-like structure in allowing these features.

    • Nicolas permalink

      That Mikrotik card looks like a good deal. Another option might be the Ubiquiti SR71-E mPCIE card.

  143. Hello, a modified version of wifibroadcast that gets rid of the pipe and thus improves both reliability and latency has been written. I was just wondering where to put it. Please advise.

    Philippe Crochat
    Anemos Technologies

    • Hi philippe,

      I am very curious to see your changes. Do you have a bitbucket account? You could send me a pull request there. Or if it is easier for you you could just pack things up and send it to me by mail. As you wish 🙂

  144. hello befinitiv, I do have a github account, will that work with your bitbucket account? At what address should I send a PR?
    An image with the modified code has already been posted to the EZ-wbc forum but I’ll be glad to post it to the bitbucket as well.

    • Hi Philippe,

      I saw that you forked wbc on bitbucket. I’m curious to see what you will propose there 🙂

      Since you want to integrate wbc into raspivid: there is a much simpler way that requires no recompiling of raspivid. I already implemented all the required bits and pieces in the low_lat_raspivid_hook branch.
      It basically takes control over the fwrite function calls from raspivid.

      This also allows tricks such as moving the NALU header to the end of the NALU. Look at the fwrite function in txhook.c to see how you could make something similar.
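      The general technique can be illustrated with a generic LD_PRELOAD shim. This is only a minimal sketch of an fwrite hook under that assumption, not the actual txhook.c:

```c
/* Minimal sketch of an fwrite hook via LD_PRELOAD (not the actual
 * txhook.c). Build as a shared library and preload it, e.g.:
 *   gcc -shared -fPIC -o hook.so hook.c -ldl
 *   LD_PRELOAD=./hook.so raspivid ... -o -
 */
#define _GNU_SOURCE
#include <stdio.h>
#include <dlfcn.h>

static size_t (*real_fwrite)(const void *, size_t, size_t, FILE *);

size_t fwrite(const void *ptr, size_t size, size_t nmemb, FILE *stream) {
    if (!real_fwrite)
        real_fwrite = dlsym(RTLD_NEXT, "fwrite");  /* resolve libc's fwrite */
    /* Here the hook could hand the H.264 data directly to the tx logic
     * instead of going through a pipe; this sketch forwards it unchanged. */
    return real_fwrite(ptr, size, nmemb, stream);
}
```

      Because the shim exports its own fwrite, the dynamic linker resolves raspivid’s calls to it first; RTLD_NEXT then finds the original libc implementation.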

      • PR done, tell me if you received it well. I’ll look at your new raspivid branch, looks very promising 🙂

      • Thanks! I will take a look into that during the next days. One clarification: The low latency branch I’ve mentioned is “discontinued”. It was an experiment to further reduce latency of WBC that I stopped working on. I mentioned it only since you could use some ideas from it (like the fwrite-hook). You should not try to use it as is.

  145. Thanks befinitiv for your answer and clarification. May I ask why this branch, which looks promising, is discontinued?
    I’ll definitely try it as I love this idea of function overriding.
    Is it you who develops the mymediacodec FPV app?

  146. Nicolas permalink

    Looks like I found a possible cause for the issues with Atheros cards sometimes not coming up (seems to be more likely to happen with more than 2 RX cards).

    Here you can see me constantly re-starting the rx program; after some retries, one or more cards give weird log messages and crash. It seems that, for some reason, the cards sometimes don’t like being switched to promiscuous mode.

    [ 62.491439] device ec086b1c7834 left promiscuous mode
    [ 62.495125] ath: phy3: Set HW RX filter: 0x37
    [ 62.506186] device ec086b1c4645 left promiscuous mode
    [ 62.514370] ath: phy2: Set HW RX filter: 0x37
    [ 62.521136] device c46e1f21bbf1 left promiscuous mode
    [ 62.535178] ath: phy1: Set HW RX filter: 0x37
    [ 63.355188] device c46e1f21bbf1 entered promiscuous mode
    [ 63.368003] ath: phy1: Set HW RX filter: 0x37
    [ 63.375141] device ec086b1c4645 entered promiscuous mode
    [ 63.389368] device ec086b1c7834 entered promiscuous mode
    [ 63.392006] ath: phy3: Set HW RX filter: 0x37
    [ 63.392372] ath: phy2: Set HW RX filter: 0x37
    [ 66.795152] cfg80211: Verifying active interfaces after reg change
    [ 86.338279] device ec086b1c7834 left promiscuous mode
    [ 86.344201] ath: phy3: Set HW RX filter: 0x37
    [ 86.358185] device ec086b1c4645 left promiscuous mode
    [ 86.370162] device c46e1f21bbf1 left promiscuous mode
    [ 86.377554] ath: phy2: Set HW RX filter: 0x37
    [ 86.378298] ath: phy1: Set HW RX filter: 0x37
    [ 87.191206] device c46e1f21bbf1 entered promiscuous mode
    [ 87.210421] ath: phy1: Set HW RX filter: 0x37
    [ 87.215230] device ec086b1c4645 entered promiscuous mode
    [ 87.218165] ath: phy2: Set HW RX filter: 0x37
    [ 87.233157] device ec086b1c7834 entered promiscuous mode
    [ 87.237414] ath: phy3: Set HW RX filter: 0x37
    [ 94.811654] device ec086b1c7834 left promiscuous mode
    [ 94.829155] device ec086b1c4645 left promiscuous mode
    [ 94.841140] device c46e1f21bbf1 left promiscuous mode
    [ 94.911149] ath: phy3: Timeout waiting for WMI command: WMI_REG_READ_CMDID
    [ 94.911174] ath: phy3: REGISTER READ FAILED: (0x803c, -110)
    [ 94.929142] ath: phy2: Timeout waiting for WMI command: WMI_REG_READ_CMDID
    [ 94.929152] ath: phy2: REGISTER READ FAILED: (0x803c, -110)
    [ 94.941136] ath: phy1: Timeout waiting for WMI command: WMI_REG_READ_CMDID
    [ 94.941145] ath: phy1: REGISTER READ FAILED: (0x803c, -110)
    [ 95.011135] ath: phy3: Timeout waiting for WMI command: WMI_REG_READ_CMDID
    [ 95.011146] ath: phy3: REGISTER READ FAILED: (0x810c, -110)
    [ 95.029191] ath: phy2: Timeout waiting for WMI command: WMI_REG_READ_CMDID
    [ 95.029201] ath: phy2: REGISTER READ FAILED: (0x810c, -110)
    [ 95.041133] ath: phy1: Timeout waiting for WMI command: WMI_REG_READ_CMDID
    [ 95.041141] ath: phy1: REGISTER READ FAILED: (0x810c, -110)
    [ 95.111132] ath: phy3: Timeout waiting for WMI command: WMI_REG_RMW_CMDID
    [ 95.111152] ath: phy3: REGISTER RMW FAILED:(0x0034, -110)
    [ 95.129215] ath: phy2: Timeout waiting for WMI command: WMI_REG_RMW_CMDID
    [ 95.129224] ath: phy2: REGISTER RMW FAILED:(0x0034, -110)
    [ 95.141139] ath: phy1: Timeout waiting for WMI command: WMI_REG_RMW_CMDID
    [ 95.141148] ath: phy1: REGISTER RMW FAILED:(0x0034, -110)
    [ 95.211319] ath: phy3: Timeout waiting for WMI command: WMI_REG_WRITE_CMDID
    [ 95.211339] ath: phy3: REGISTER WRITE FAILED, multi len: 2
    [ 95.211351] ath: phy3: Set HW RX filter: 0x2137
    [ 95.229146] ath: phy2: Timeout waiting for WMI command: WMI_REG_WRITE_CMDID
    [ 95.229163] ath: phy2: REGISTER WRITE FAILED, multi len: 2
    [ 95.229172] ath: phy2: Set HW RX filter: 0x2137
    [ 95.241140] ath: phy1: Timeout waiting for WMI command: WMI_REG_WRITE_CMDID
    [ 95.241148] ath: phy1: REGISTER WRITE FAILED, multi len: 2
    [ 95.241157] ath: phy1: Set HW RX filter: 0x2137

    However, I think we don’t need promiscuous mode, as wifi cards in monitor mode forward all traffic anyway.

    I have now changed this line in rx.c:
    interface->ppcap = pcap_open_live(name, 2048, 1, -1, szErrbuf);

    to:

    interface->ppcap = pcap_open_live(name, 2048, 0, -1, szErrbuf);

    so that promiscuous mode is not set (the third argument is the promisc flag). Seems to have fixed the problem so far.

    • Nicolas permalink

      Something important for users of the TPLink 722N (and possibly others that have more than one antenna): the second antenna connection causes bad blocks!

      Disable the internal PCB antenna on these dongles by either de-soldering the white SMD component on the backside or shielding the antenna with copper or something similar.

      See here for more info:

  147. Nicolas permalink

    There is an issue with Alfa 051NH/052NH cards. The Ralink rt2800usb driver does a VCO calibration every 10 seconds for those cards/chipsets. This leads to 3-6 lost packets (depending on bitrate settings etc.) every 10 seconds. When more than 4 packets are lost (with default FEC settings), this leads to a bad block and thus a visible glitch.

    This can be fixed by using a longer interval in rt2x00lib.h in the kernel sources.

    Change this line:
    #define VCO_INTERVAL round_jiffies_relative(10 * HZ) /* 10 sec */

    to:

    #define VCO_INTERVAL round_jiffies_relative(120 * HZ) /* 120 sec */

    to increase the interval to 120 seconds.

  148. Nicolas permalink

    EZ-Wifibroadcast 1.4 is ready, try it out 🙂

  149. skpswamy permalink

    when i execute “sudo update-rc.d wbctxd start ” i am getting this error
    update-rc.d: warning: start and stop actions are no longer supported; falling back to defaults
    Use of uninitialized value in string eq at /usr/sbin/update-rc.d line 292.
    update-rc.d: warning: start runlevel arguments (none) do not match wbctxd Default-Start values (2 3 4 5)
    update-rc.d: warning: stop runlevel arguments (none) do not match wbctxd Default-Stop values (0 1 6)

  150. Nicolas permalink

    Hi befinitiv,

    did some testing with injecting CTS frames. They have less overhead (only a 10-byte 802.11 header) and, for whatever reason, Atheros cards seem to inject them faster (beyond the lower overhead). The cool thing is, with the other medium-access optimizations, throughput at the 18mbit data rate is now slightly higher than at the 24mbit data rate without the optimizations and CTS injection.

    With 8/4/1024 FEC:

    1.27MByte/s with 18mbit and CTS frames
    1.14MByte/s with 18mbit and DATA frames

    1.67MByte/s with 24mbit and CTS frames
    1.43MByte/s with 24mbit and DATA frames

    I managed to change the u8aIeeeHeader definition in tx.c, and also changed the set_port_no function and the byte-position defines to use the last byte of the (single) MAC address in the CTS 802.11 header.

    In rx.c I changed the sprintf(szProgram, “ether[0x0a:4]==0x13223344 && ether[0x0e:2] == 0x55%.2x”, port); lines to make the pcap filter listen to the cts packets, which seems to work.

    However, I must still be missing something: it’s not working, i.e. no valid data is received. If I see it correctly, the first four bytes of the payload are the wbc sequence number? Is there some other header parsing etc. going on in rx.c?

    Another thing I’ve noticed is that the FCS of the CTS frames is wrong; I’m not sure why. Isn’t the FCS usually added by the wifi drivers or firmware?
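    For reference, the 10-byte CTS header discussed above could be laid out as follows. This is a hypothetical sketch that reuses wbc’s 0x13223344/0x55 address bytes as the receiver address with the port in the last byte; it is not Nicolas’s actual modified tx.c:

```c
#include <stdint.h>

/* Hypothetical layout of a 10-byte 802.11 CTS control frame header
 * (not the actual modified tx.c). A CTS frame carries only frame
 * control, duration, and the receiver address (RA); the FCS is
 * normally appended by the driver or firmware. */
static void build_cts_header(uint8_t hdr[10], uint8_t port) {
    hdr[0] = 0xc4;   /* frame control: control type, CTS subtype */
    hdr[1] = 0x00;
    hdr[2] = 0x00;   /* duration (unused here) */
    hdr[3] = 0x00;
    hdr[4] = 0x13;   /* RA abused as a filterable pattern, matching */
    hdr[5] = 0x22;   /* wbc's 0x13223344 0x55 address bytes */
    hdr[6] = 0x33;
    hdr[7] = 0x44;
    hdr[8] = 0x55;
    hdr[9] = port;   /* wbc port in the last RA byte */
}
```

    Note that in a CTS frame the RA starts at byte offset 4 rather than offset 10 as in a DATA frame, which is consistent with having to adjust the pcap filter offsets on the rx side.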

  151. Alex permalink

    Anyone with problems to boot up on RPi3?

  152. Nicolas permalink

    Runs completely glitch-free now 🙂

  153. I believe this can be of some interest to us: (Secure Reliable Transport)

  154. Can this work with multiple transmitters and receivers in the same area?

  155. Rocky LaBarbera permalink

    Fantastic post and information. DJI needs a good push to innovate instead of marketing toys to selfie-obsessed zombies. Lightbridge/Ocusync have afforded them a complacency that seems unfair to the rest of the industry. Until other drone manufacturers wake up and implement similar technologies, DJI will continue to dominate the segment and face no real competition. I’m aware it’s easier said than done, but this proves it’s possible and that DJI has no actual “magic” in their back pocket.

  156. Nicolas permalink

    EZ-Wifibroadcast 1.6 RC2 has been released:

  157. Hello, first off I wanted to say thank you for your work on this project and for making it available to everyone. I am currently looking into integrating it into a raspberry pi 3 with a Navio2 for data, telemetry, and video streaming. Would this wifi card work with your software? I can have one sent to you for testing if necessary.

    Thank you for your time.

  158. Flyer permalink

    Hi befinitiv!

    I really appreciate your work! I’ve been looking for a usable video transmission solution for a long time, and I think I’ve now found it. But I have a question: is it possible to move the receiver side to a Windows-based solution? I have EFIS software written in C# and I want to integrate this wifibroadcast stuff into it, but I don’t have any idea where to start. I know there are several solutions on Linux for running my C# application, but the requirement is to run only on Windows 10. I think some managed code is needed in my app to switch the wifi card into monitor mode, but I cannot find anything about this on the internet.

    Thank you for your time

    • Nicolas permalink

      8km on omni antennas with EZ-Wifibroadcast:

      Setup is:
      TX: Ubiquiti Wifistation USB with 2dbi quarter-wave groundplane antenna

      RX: 3x TPLink 722N V1 with cheap rubber-duckies

      All settings left default except frequency changed and txpower set to “54”


  163. Yes21 permalink

    Does anyone know if the Realtek RTL8812AU chipset is supported?

    It is now supported in Kali Linux :

    In 2017, Kali Linux began supporting drivers for the RTL8812AU wireless chipsets. These drivers are not part of the standard Linux kernel and have been modified to allow for injection. This is a big deal because this chipset is one of the first to support 802.11 AC, bringing injection-related wireless attacks to this standard.
    This is the newest offering I’ve found that’s compatible with Kali, so if you’re looking for the fastest and longest range, this would be the adapter to start with.

    To use this, you may need to first run the following.

    apt update
    apt install realtek-rtl88xxau-dkms

    Do you think it could work with WifiBroadcast ?

  164. shahid hakro permalink

    Can I use a raspberry pi on one side to send the video stream and my laptop on the other side to receive it? If yes, could you please point me to instructions on how to set this up?

  165. tom permalink

    Hi befinitiv!

    i have a wifi card that uses the AR9344 chip, will it work with this project? thank u.

    • hi tom

      if i remember correctly, i never used this type of chip. you might take a look at the wiki of openhd. they have a list of well functioning chipsets.


      • tom permalink

        thank u so much for your information. we have some troubles in a wifi project: we are strong on compression but weak on the RF module. this is my mail: , i would be very grateful if u could leave me an email address, i will show u more information, thank u.

  166. Adriel permalink

    Hi, thanks for sharing this project. I saw it and I’m really interested; I’m doing something like this. I want to know how to start with transmitting a single image. I tried to understand the code, got a little stuck, and returned to basics. Do you have any starting point?

    • Sending single images might actually be more complicated than sending a video stream. The reason is that you would need some kind of framing that tells the receiver where the images start and end; video data usually contains this framing already (called NAL headers). So as an initial test I would recommend just sending text: start the rx program without anything else so that the output appears on the screen, do the same for tx, and then paste a very long text into the tx terminal. The text should be at least ~20000 characters long to overcome the internal buffers of the system. After pasting it, you should see the text on the rx side.
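    Instead of pasting by hand, a tiny helper could generate the test text. This is a hypothetical sketch (how you invoke tx depends on your build, e.g. piping the helper’s stdout into it):

```c
#include <stdio.h>

/* Hypothetical helper (not part of wifibroadcast): write enough text to
 * `out` to overcome the system's internal buffers (at least ~20000
 * characters, as suggested above). Returns the character count. */
long emit_test_text(FILE *out) {
    long n = 0;
    for (int i = 0; i < 1000; i++)
        n += fprintf(out, "hello wifibroadcast %05d\n", i);  /* 26 chars per line */
    return n;
}
```

    Piping the output into tx on one Pi should make the text appear on the rx terminal of the other.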


