This post describes a new feature of wifibroadcast: Software diversity.
Update 06/02/2015: The new diversity code from 06/01 worked fine on my desktop PC but showed some rare problems on a Raspberry A+. TX_RESTART was sometimes triggered by out-of-order packets (with a distance of 10 blocks!). Because of that I increased the threshold for the TX_RESTART trigger. The downside is that the empirical tuning of the -d parameter no longer works. So I’ll have to write a tool that determines the -d value automatically…
Update 06/01/2015: I’ve added a new parameter to wifibroadcast for diversity. With -d you can define the number of retransmission blocks that will be kept as a window buffer. The parameter defaults to one (which minimizes latency). However, if you use diversity with wifi adapters of different types, they might deliver their data at different times. The -d parameter should be set so that the window buffer spans over the time difference between the two adapters.
The empirical approach to finding the right setting is simple: If -d is too low, you will often see the message “TX RESTART”. Increase the -d parameter until this message disappears and you have found the correct setting.
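The effect of -d can be sketched with a small model (my own illustration for this post; the names and structure are not taken from the wifibroadcast sources): received retransmission blocks are kept in a window buffer and only passed on once -d newer blocks have been seen, so a late block from a slower adapter can still slot in before the stream continues.

```python
# Illustrative sketch of a retransmission-block window buffer (NOT the
# actual wifibroadcast code): blocks are held until WINDOW (-d) newer
# blocks have been seen, so late arrivals from a slower adapter can
# still be merged in order before the block is passed on.

WINDOW = 3  # corresponds to the -d parameter

def window_buffer(blocks, window=WINDOW):
    """Reorder a stream of (block_number, payload) pairs.

    A block is emitted once a block numbered `window` higher has
    arrived; anything older than an already-emitted block is dropped.
    """
    pending = {}
    emitted = -1
    for num, payload in blocks:
        if num <= emitted:
            continue  # arrived too late, the window already moved on
        pending[num] = payload
        newest = max(pending)
        # flush every block that is at least `window` behind the newest
        for n in sorted(pending):
            if n <= newest - window:
                emitted = n
                yield n, pending.pop(n)

# Out-of-order input: block 2 arrives late from the second adapter,
# but the window buffer still emits the blocks in order.
stream = [(0, "a"), (1, "b"), (3, "d"), (2, "c"), (4, "e"), (5, "f")]
print(list(window_buffer(stream)))
```

If the window (-d) were smaller than the reordering distance, block 2 would arrive after block 2 had already been flushed past and would be dropped, which is exactly the situation that triggers “TX RESTART”.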
An active commenter on this blog, malkauns, has started development on software support for diversity. The general idea is: You use more than one WIFI receiver and combine the received data. Due to multipath and other effects the probability that you receive a single packet correctly increases with each added receiver. Together with malkauns I integrated this feature into the default branch of wifibroadcast.
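The packet-level combining can be illustrated like this (a simplified model written for this post, not code from the project): each receiver delivers packets tagged with a sequence number, and the first correctly received copy of each number wins.

```python
# Simplified illustration of packet-level diversity (NOT the actual
# wifibroadcast implementation): two adapters deliver overlapping,
# lossy views of the same packet stream; keeping the first copy of
# each sequence number recovers more packets than either adapter alone.

def merge_diversity(*streams):
    seen = {}
    for stream in streams:
        for seq, data in stream:
            seen.setdefault(seq, data)  # first received copy wins
    return [seen[s] for s in sorted(seen)]

# Adapter 1 lost packet 2, adapter 2 lost packets 1 and 3:
rx0 = [(0, "p0"), (1, "p1"), (3, "p3")]
rx1 = [(0, "p0"), (2, "p2")]
print(merge_diversity(rx0, rx1))  # all four packets recovered
```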
The usage is extremely simple: Just call wifibroadcast with more WIFI adapters:
./rx wlan0 wlan1
Please note that currently you need two identical adapters. Different types can introduce a delay between the received data streams that currently cannot be processed by wifibroadcast. In this case the system works but you gain little advantage from diversity.
My first experiments showed that the reception quality increases drastically. I tested the feature in a noisy environment with lots of surrounding networks. In every case the packet drop rate was at least one order of magnitude lower when using two adapters. In 90% of the cases it was even better by two orders of magnitude.
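A back-of-the-envelope check (my own, for illustration) shows why a second receiver helps so much: if each adapter drops a packet independently with probability p, a packet is only lost when both miss it, so the combined loss is p².

```python
# With independent losses, two receivers only lose a packet when
# both miss it: p_combined = p ** 2. The improvement factor is 1/p,
# so the lower the single-adapter loss, the bigger the relative gain.
for p in (0.1, 0.03, 0.01):
    combined = p ** 2
    print(f"single-adapter loss {p:.0%} -> diversity loss {combined:.4%}")
```

A 10% single-adapter drop rate becoming roughly 1% is one order of magnitude; with better single-adapter conditions the gain approaches two orders, roughly in line with the measurements above (assuming the losses really are independent, which multipath fading makes plausible).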
Conclusion and outlook
The results are very promising. I think in the future I’ll use diversity on all of my flights. Since the TL-WN722N is very cheap, this feature should be useful to many adopters. There are still some open issues with diversity that need to be solved.
The support of different adapter types in parallel is one thing that is planned. Also, there is possibly some potential for smarter FEC in this case. Currently the diversity only works on packet basis. It may be a good idea to dive into their content. We’ll see :)
This post presents just another FPV video transmitted using wifibroadcast
After my last crash I rebuilt my quad and tried it out today. And I am getting more and more used to the FPV feeling. I am still a total noob compared to others but there is progress :)
Since I was a bit afraid to crash again I went for maximum safety: triple retransmissions, 5 Mbit/s, 30fps with a keyframe every 15 frames (thus 2Hz). Due to that and a dirty lens the image quality is not optimal. Again, I had the receiver in my pocket and my body occasionally blocked the line of sight. Therefore, there are some disturbed video frames. I really need something to put the antenna on!
Here is the video (recorded at the ground station):
This post presents my first experiences using the wifibroadcast video transmission.
To be honest, this post is a bit contradictory.
The title uses the word “reliable” and later in this post I’ll describe my first three FPV flights where the last one ended in a really bad crash.
Analyzing the cause showed that it was actually my fault, not the transmission's.
But before I come to the bad part let’s start with the good news which is a range test.
Extended range test
The last range test was a bit limited since the area I was in offered only 500m of line of sight (LOS).
This time I had a car available and drove to a small hill where I’ve placed the video sender.
You can see a picture of the valley down the hill here:
I used the ALFA AWUS36NHA with the omni antenna that came with it as a transmitter.
The H264 data rate was 7 Mbit/s with double retransmission.
On the rx side I used the TP-LINK TL-WN722N with my home-made double biquad antenna:
The results were really astonishing.
I drove down the valley all the way to the end right before I lost line of sight.
At this point (at 3km distance to the transmitter) I had rock-solid video image.
Since the double biquad antenna is a directional antenna, its sensitivity decreases when the transmitter moves out of the center of its beam.
And again I was surprised that I could move the antenna +-30° both horizontally and vertically until I lost too many packets.
This is really good news: As long as your quad stays within a radius of 3km you are safe to fly within a 60° sector.
This also indicates that the maximum distance is way beyond 3km.
My guess would be that at least 5km should be possible.
I also tested a dipole antenna at the receiver.
This gave a maximum distance of 900m with the advantage of a 360° coverage.
The end of my last FPV flight
As I have mentioned before my last flight ended quite brutally.
And I have learned a lot from it.
Two main problems have caused that crash: Strong wind (30km/h) and blocked line of sight.
I was flying at a distance of around 200m against the wind.
When I turned around the quad was moving with the wind and suddenly lost height which blocked the LOS between the antennas.
The stream was quickly so disrupted that I could not guess its current position and height.
While I was trying to move the quad back to my position (I did not know that the LOS was blocked due to the height) the strong wind pushed the quad away horizontally.
When I (blindly) increased altitude the quad was already pushed away so far that the LOS was now blocked horizontally due to trees close to me.
I still received data but that was mostly unusable.
Maybe every 5 seconds a good frame flashed briefly and was then disrupted again.
When I lost contact completely I was forced to “land”.
The problem now was: Where is it? I knew that the battery was almost dead, so the RC-activated beeper would soon be quiet.
I ran around like a mad dog listening for the beeping.
After one hour I gave up and “accepted my loss”.
Back home I converted the received video stream of the crash flight to single images using gstreamer.
This way I was able to recover a more or less intact image from 10s before the crash:
This image allowed me to localize the approximate position and finally I have found my quad sitting in a field (luckily the green antenna stood out of the long grass).
The impact took place at a distance of 700m to the rx station.
This means that the wind pushed the quad in 30s around 500m away from me…
Aftermath and conclusions
Although the crash was quite hard all the electronics and motors stayed intact.
Most likely this is a result of the sandwich construction that protects the electronics between two plates.
However, all of the 8mm carbon fiber rods that hold the motors broke quite brutally.
Under normal circumstances these rods are really durable.
So the crash seemed to be quite hard.
My conclusion is that a blocked line of sight using 2.4GHz transmission is deadly.
In the future I’ll take care that my rx antenna has a clear field of view of at least 180°.
Also my first reaction in case of a lost video connection will be to gain altitude.
This incident also motivates me to buy a GPS receiver that transmits the position of the quad and also provides an automatic “return to home” function.
PS: A little cheer up
So that this post doesn't end on a bad note, I'll show you the video of my first FPV-only flight.
I shot it using the ALFA AWUS36NHA in the air with a dipole antenna and a TP-LINK TL-WN722N also with a dipole on the ground. The video is decoded and displayed using a Raspberry Pi as shown here.
The occasional errors in the video are because I had the receiver in my pocket, so my body quite often blocked the line of sight between the two antennas. The images are wobbly because it was quite windy that day (around 20km/h). I know, compared to other FPV videos this is sooo boring. But hey, it was my first flight ;)
In this post I’ll explain my ground station that receives and displays video data using wifibroadcast
My ground station
Let’s start with a picture:
Here you see my ground station. To the left you see my FPV glasses which contain a 7″ HDMI display and a fresnel lens. The enclosure is made out of foam and at the back a strap from ski-goggles is attached. Right next to the “helmet” you see a 3-cell battery that powers the system. On the right you find the Raspberry Pi A+ that has been integrated into a USB hub enclosure (there was enough room to keep the hub :). The Raspberry is powered by a cheap 5V DC/DC converter.
Sample video of a weak link
Following is a video where I have intentionally removed the antenna from the receiving wifi dongle. This shows some of the artifacts you could experience under bad signal conditions. I know, it looks really bad but remember: A lot of bad images and some good images is better than no images at all. And to get images this bad with an antenna attached you really need a large distance or a solid block of the line of sight.
This post shows how video data transmitted using wifibroadcast can be received and displayed on a Raspberry Pi
As I have described here, the whole wifibroadcast pipeline has been ported to Android. However, since this is highly device specific it may be of interest to people to have a simple and cheap way to build a mobile device that can receive and display a video stream. As a bonus this also has a lower latency.
The hardware is very simple: I bought a 7″ HDMI display for 42€ and connected it to a Raspberry Pi A+. The Raspberry Pi also uses a TL-WN722N wifi card to receive the wifibroadcast stream. Together with a fresnel lens this display could be easily integrated into self-made FPV glasses (and I am planning to do that).
The software part is equally simple. First I tried to use gstreamer to decode and display the video. Gstreamer is able to use OMX to deliver accelerated decoding of h264 but displaying the content needs to be done outside of OMX. And it seemed as if the CPU was too weak to display the (already decoded) video on the screen. Therefore I looked for an OMX pipeline that goes all the way down to the display. And actually there is already a nearly fitting example program in every Raspbian installation: /opt/vc/src/hello_pi/hello_video
This program reads h264 data from a file and displays it on the screen. Since the whole pipeline is using OMX it is efficient enough to display a full-HD video.
I modified that program so that it reads its video data from stdin and fixed the framerate to 60Hz. This way the incoming data stream should always be processed faster than it arrives.
You can find the modified version of it here:
Just copy the video.c file to /opt/vc/src/hello_pi/hello_video and rebuild the software.
Then you can use the following command to receive video data and display it on the screen:
sudo ./rx -b 8 wlan1 | /opt/vc/src/hello_pi/hello_video/hello_video.bin
I am quite happy with that solution. The latency is as good as (if not better than) on my PC. Also, the CPU of the Raspberry is only about 10% loaded. So there is still enough power available for GUI stuff.
Since the latency on my Android tablet is higher I think I’ll use the Raspberry receiver for my FPV glasses and give the tablet to bystanders. That is also a point where wifibroadcast behaves just like good old analog: Other people can simply join your video stream with their compatible device.
This post shows my approach on getting my FPV wifi transmission system to run on an Android device
Motivation sometimes comes through strange ways. In this case it was my laptop’s battery that died completely. I had just finished my quad and one week later my laptop was bound to the wall due to the failed battery. Thus I could not fly anymore :( That was the perfect excuse to port the wifibroadcast stuff over to Android. To be honest: I wouldn’t want to do this a second time. It was quite painful. I can’t give you exact reproduction steps since this is highly device dependent. But you should gather enough info to redo the stuff for your device (I used a Nexus 7 2013 running Android 5). You’ll find a screenshot from the android device here (so the system has more or less photographed itself. How I love these recursive worm holes :):
This project needed to achieve the following sub-goals:
* Root the device (many manuals available on how to do that)
* Support TL-WN722N wifi card in android kernel
* Cross-compile wifibroadcast for android
* Decoding raw h264 streams
* Displaying video data
TL-WN722N wifi card kernel support
Although most modern Android devices support host mode over USB OTG, the number of activated USB drivers in the vanilla kernels is quite low. You can be pretty sure that there will be no USB wifi drivers enabled. Thus there is no way around a custom kernel build. This page has a detailed description of how to build your own kernel for the official “Google” Android. It seems as if kernel compilation for Cyanogenmod is a bit more complicated; that’s why I went with the official one. Using “make menuconfig” to enable the driver “ath9k_htc” for the TL-WN722N is all there is to do. After installing the new kernel on your device you also need the firmware for the card. Note that you probably can’t use the firmware from your Ubuntu machine (or similar) because that probably uses the newest v1.4 firmware. The Android kernel only supports v1.3, which you can get here. To copy it to the right location, enter on the android shell (root required):
# make system writable
mount -o rw,remount /system
cp /sdcard/htc_9271.fw /system/etc/firmware
Now your wifi card should run. Note that the card is by default “down”, so you won’t notice anything when you plug it in. Look into dmesg and see if everything went well. You can safely ignore error messages in dmesg about the “ASSERT in hdd_netdev_notifier_call”. That originates from some unclean Qualcomm wifi drivers for host mode…
Running wifibroadcast on Android
Now that the wifi card runs, how do we use it with wifibroadcast? Well, I was lazy and did not bother to port it over to Android. Instead I used a Ubuntu chroot environment where I could easily build and run it like I would be on a real linux machine. Again, you find a lot of pages describing how to create an “Ubuntu in a file”. I won’t repeat these steps here. Just the two lines which get me from the android shell into the system:
# this mounts the ubuntu image into the directory "ubuntu"
mount -o loop -t ext4 ubuntu.img ubuntu
# this chroots into the system
unset LD_PRELOAD
export USER=root
export HOME=/root
chroot ubuntu /bin/bash
Now you have a full powered ubuntu system available on your android device :) In there you can do the usual things:
apt-get install mercurial
hg clone https://bitbucket.org/befi/wifibroadcast
cd wifibroadcast
make
And then you have a usable wifibroadcast. Inside the ubuntu system you can then setup your wifi card:
ifconfig wlan1 down
iw dev wlan1 set monitor otherbss fcsfail
ifconfig wlan1 up
iwconfig wlan1 channel 13
Decode and display video
Remember, the rx program will receive raw h264 frames that need to be decoded and displayed. My first attempt was to install the “XServer XSDL” app (excellent app by the way) which gives your chroot system graphical output capabilities. From that I just used the standard ubuntu gstreamer tools to decode and display the video. Unfortunately this was a bit too slow and had a latency of around 500ms (too much to be usable). I also tried some of the video streaming apps from the Play store but I found that none of them was suitable. So I had to write my own… :(
I wrote an app that opens a UDP socket on port 5000 for receiving raw h264 data. The incoming data is parsed for “access units” and these are forwarded into a MediaCodec instance. The decoded frames are then directly rendered onto a Surface object. You can grab the source code of it here:
hg clone https://bitbucket.org/befi/h264viewer
This repository contains an “Android Studio” project. Use “import project” to open it. It targets Android API 21 (aka 5 Lollipop). There is also a ready-to-install .apk under “app/build/outputs/apk/app-debug.apk”
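A few words on the parsing step mentioned above: in a raw (Annex-B) h264 stream, NAL units are delimited by start codes (00 00 00 01), and splitting on those boundaries is what lets you hand complete units to a decoder. The sketch below is Python written for this post to show the idea; the app itself does this in Java before feeding MediaCodec.

```python
# Minimal sketch of Annex-B NAL unit splitting, as needed before
# feeding raw h264 into a decoder such as MediaCodec. This is an
# illustration written for this post, not code from the h264viewer app.

def split_nal_units(data: bytes):
    """Split an Annex-B byte stream on 00 00 00 01 start codes."""
    units = []
    start = data.find(b"\x00\x00\x00\x01")
    while start != -1:
        nxt = data.find(b"\x00\x00\x00\x01", start + 4)
        if nxt == -1:
            units.append(data[start:])
            break
        units.append(data[start:nxt])
        start = nxt
    return units

# A toy stream: SPS, PPS, then an IDR frame (payloads are fake).
stream = (b"\x00\x00\x00\x01\x67spspayload"
          b"\x00\x00\x00\x01\x68pps"
          b"\x00\x00\x00\x01\x65idrframe")
for nal in split_nal_units(stream):
    print(nal[4] & 0x1F)  # NAL unit type: 7 = SPS, 8 = PPS, 5 = IDR
```

(A real parser also has to handle the 3-byte 00 00 01 start code variant; the 4-byte form is enough to show the principle.)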
How to use that program? Very simple: Just send raw h264 data over UDP to port 5000 of your android device and it gets displayed. If your Raspi is in the same network as your android device you could use something like this:
raspivid -t 0 -w 1280 -h 720 -fps 30 -b 3000000 -n -pf baseline -o - | socat -b 1024 - UDP4-DATAGRAM:192.168.1.100:5000
where 192.168.1.100 is the IP address of your android device. Why have I used “socat” instead of “netcat”? Well, netcat is a line-based tool. Since we are sending binary data, it usually collects data until something similar to a newline is found (netcat triggers the packet transmission on this condition). Most often this will be more data than the maximum packet size. Therefore, netcat packets are usually fragmented IP packets (something you should avoid). Socat just sends unfragmented packets whenever the threshold (in this case 1024 bytes) is reached.
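The difference matters because UDP datagrams larger than the MTU get fragmented at the IP layer, and losing one fragment loses the whole datagram. What the -b 1024 option effectively does is cut the byte stream into fixed-size sends; a quick illustration (mine, not socat's source):

```python
# Illustration of what `socat -b 1024` effectively does with a byte
# stream: cut it into fixed-size chunks so each UDP send stays well
# below the usual 1500-byte Ethernet MTU and no IP fragmentation occurs.

def chunk_stream(data: bytes, size: int = 1024):
    return [data[i:i + size] for i in range(0, len(data), size)]

payload = bytes(3000)  # pretend this is 3000 bytes of binary video data
chunks = chunk_stream(payload)
print([len(c) for c in chunks])  # -> [1024, 1024, 952]
```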
How to use that with wifibroadcast? Again, very simple. On the raspberry execute the usual command to broadcast your video into the air:
raspivid -t 0 -w 1280 -h 720 -fps 30 -b 3000000 -n -pf baseline -o - | sudo ./tx -r 3 -b 8 wlan1
Inside the chrooted Ubuntu on your android device execute:
./rx -b 8 wlan1 | socat -b 1024 - UDP4-DATAGRAM:localhost:5000
All this does is forward the packets from “rx” to UDP port 5000 of your device. And voilà, you should see your camera’s image.
The latency is ok and should be usable. My PC is still a little bit faster but that is not too surprising. I have some ideas on how to optimize everything and will keep you posted. Concerning battery life I was quite surprised: the app and wifi dongle drained the battery by 14% after one hour of runtime. So I’m quite confident that a full battery would allow me to stream for at least 5 hours.
Another alternative I’m thinking about is to use a Raspberry with a HDMI display to show the video stream. I’ve fiddled around a bit and maybe in the next post I’ll present my OMX pipeline that does the decoding and displaying of the video on the Raspberry’s GPU (thus the CPU idles). I was actually surprised that this gave me even lower latency compared to a gstreamer pipeline running on my PC. The downside is that this is a bit more complicated hardware-wise. raspi+display+display-controller+wifi-card+battery+5v-dc-dc-converter could get a bit messy… we’ll see :)
This post describes my quad hardware including a (mostly) complete parts list. It also shows the results of the first test of my failsafe(r) wifibroadcast video transmission
The history of my quad
I spent the last weeks finishing my quad to actually test the video transmission in the air. At first I thought this would be done with no hassle. I grabbed my good old Mikrokopter (built in the year 2007) and tried to fly it after it had been sitting for 5 years in the attic. The result was horrible. The gyros were dead and the bearings of all four motors were damaged… so I repaired the motors, ordered a new flight control and noticed that the old brushless controllers were not capable of processing the high update rate required over PPM (previously they used I2C). So they also went into the bin. Long story short: all I kept were the 4 motors…
The new quad
So much has happened in the quad domain since 2007. I was really impressed. I just briefly got an overview of the electronics that are on the market today. So to all the quad aficionados: Don’t be mad at me if I haven’t bought the most optimal part here and there ;)
Let's start with some photos:
Here you can see the whole device. It is constructed using a sandwich out of two 1mm carbon fiber plates. The motor rods are round 8mm carbon fiber pipes. The mountings for the motors have been made out of 10mm square aluminium pipes. The inner diameter of them is 8mm. Due to this the carbon fiber rods fit perfectly. With a bit of epoxy glue they are very well connected:
On the side of the chassis the rods are glued into some custom “spacers” which also hold the two carbon plates together:
The inside of the sandwich is packed full with electronics (so they are well protected in case of a crash):
Sample of the live video stream
Following is a video of my first FPV flight with the new quad. The video was recorded at the ground station (-> it is the live stream). The sun was already gone, which explains the pale colors. Also, I set the key-frame interval to 0.5s (safe is safe for the first flight). This also had some impact on video quality. It was quite windy, which caused the camera to shake a lot. I was still too afraid to let my quad off the leash flying FPV only. Therefore, the maximum distance I achieved was around 100m. As soon as I’ve practiced a bit more I’ll do some real range tests.