This post shows my approach to getting my FPV wifi transmission system running on an Android device.
Motivation sometimes comes in strange ways. In this case it was my laptop's battery that died completely. I had just finished my quad and one week later my laptop was bound to the wall outlet due to the failed battery. Thus I could not fly anymore 😦 That was the perfect excuse to port the wifibroadcast stuff over to Android. To be honest: I wouldn't want to do this a second time. It was quite painful. I can't give you exact reproduction steps since this is highly device-dependent. But you should gather enough info to redo the stuff for your device (I used a Nexus 7 2013 running Android 5). You'll find a screenshot from the android device here (so the system has more or less photographed itself. How I love these recursive worm holes :):
This project needed to achieve the following sub-goals:
* Root the device (many manuals available on how to do that)
* Support TL-WN722N wifi card in android kernel
* Cross-compile wifibroadcast for android
* Decoding raw h264 streams
* Displaying video data
TL-WN722N wifi card kernel support
Although most modern Android devices support host mode over USB OTG, the number of activated USB drivers in the vanilla kernels is quite low. You can be pretty sure that there will be no USB wifi drivers enabled. Thus there is no way around a custom kernel build. This page has a detailed description of how to build your own kernel for the official "Google" Android. It seems as if kernel compilation for Cyanogenmod is a bit more complicated; that's why I went the official route. Using "make menuconfig" to enable the "ath9k_htc" driver for the TL-WN722N is all there is to do. After installing the new kernel on your device you also need the firmware for the card. Note that you probably can't use the firmware from your Ubuntu machine (or similar) because that probably uses the newest v1.4 firmware. The android kernel only supports v1.3, which you can get here. To copy it to the right location, enter on the android shell (root required):
# make system writable
mount -o rw,remount /system
cp /sdcard/htc_9271.fw /system/etc/firmware
Now your wifi card should run. Note that the card is by default "down", so you won't notice anything when you plug it in. Look into dmesg and see if everything went well. You can safely ignore error messages in dmesg about the "ASSERT in hdd_netdev_notifier_call". That originates from some unclean Qualcomm wifi drivers for host mode…
Running wifibroadcast on Android
Now that the wifi card runs, how do we use it with wifibroadcast? Well, I was lazy and did not bother to port it over to Android. Instead I used an Ubuntu chroot environment where I could easily build and run it as if I were on a real Linux machine. Again, you'll find a lot of pages describing how to create an "Ubuntu in a file", so I won't repeat these steps here. Just the lines which get me from the android shell into the system:
# this mounts the ubuntu image into the directory "ubuntu"
mount -o loop -t ext4 ubuntu.img ubuntu
# this chroots into the system
unset LD_PRELOAD
export USER=root
export HOME=/root
chroot ubuntu /bin/bash
Now you have a fully-fledged Ubuntu system available on your android device 🙂 In there you can do the usual things:
apt-get install mercurial
hg clone https://bitbucket.org/befi/wifibroadcast
cd wifibroadcast
make
And then you have a usable wifibroadcast. Inside the ubuntu system you can then set up your wifi card:
ifconfig wlan1 down
iw dev wlan1 set monitor otherbss fcsfail
ifconfig wlan1 up
iwconfig wlan1 channel 13
Decode and display video
Remember, the rx program will receive raw h264 frames that need to be decoded and displayed. My first attempt was to install the "XServer XSDL" app (excellent app by the way), which gives your chroot system graphical output capabilities. From there I just used the standard ubuntu gstreamer tools to decode and display the video. Unfortunately this was a bit too slow and had a latency of around 500ms (too much to be usable). I also tried some of the video streaming apps from the Play store but found that none of them was suitable. So I had to write my own… 😦
I wrote an app that opens a UDP socket on port 5000 for receiving raw h264 data. The incoming data is parsed for “access units” and these are forwarded into a MediaCodec instance. The decoded frames are then directly rendered onto a Surface object. You can grab the source code of it here:
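The access-unit parsing boils down to cutting the byte stream at h264 Annex-B start codes. Here is a minimal Python sketch of that idea (illustrative only; the actual app does this in Java before feeding MediaCodec):

```python
def split_nal_units(buf: bytes):
    """Split a raw Annex-B h264 byte stream at its start codes.

    Each NAL unit is prefixed with 00 00 01 (a 4-byte start code
    00 00 00 01 just has one extra leading zero, which then ends up
    as a harmless trailing byte of the previous unit).
    """
    # Collect the positions of all 3-byte start codes.
    positions = []
    i = buf.find(b'\x00\x00\x01')
    while i != -1:
        positions.append(i)
        i = buf.find(b'\x00\x00\x01', i + 3)
    # Cut the stream at every start code; bytes before the first
    # start code belong to an incomplete unit and are discarded.
    units = []
    for n, pos in enumerate(positions):
        end = positions[n + 1] if n + 1 < len(positions) else len(buf)
        units.append(buf[pos:end])
    return units
```

Each returned unit (still carrying its start code) would then be queued into a decoder input buffer.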
hg clone https://bitbucket.org/befi/h264viewer
This repository contains an "Android Studio" project. Use "import project" to open it. It targets Android API 21 (a.k.a. Android 5, Lollipop). There is also a ready-to-install .apk under "app/build/outputs/apk/app-debug.apk".
How to use that program? Very simple: just send raw h264 data over UDP to port 5000 of your android device and it gets displayed. If your Raspi is on the same network as your android device you could use something like this:
raspivid -t 0 -w 1280 -h 720 -fps 30 -b 3000000 -n -pf baseline -o - | socat -b 1024 - UDP4-DATAGRAM:192.168.1.100:5000
where 192.168.1.100 is the IP address of your android device. Why have I used "socat" instead of "netcat"? Well, netcat is a line-based tool: it buffers incoming data until it sees something like a newline and only then triggers a packet transmission. Since we are sending binary data, this usually means more data accumulates than fits into the maximum packet size. Therefore, netcat packets are usually fragmented IP packets (something you should avoid). Socat simply sends an unfragmented packet whenever the threshold (in this case 1024 bytes) is reached.
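The same fixed-size-datagram behaviour is easy to reproduce yourself. A Python sketch (the host/port are placeholders and `send_stream` is a hypothetical helper, not part of wifibroadcast):

```python
import socket

CHUNK = 1024  # stays safely below the ~1500 byte Ethernet MTU

def chunk_stream(data: bytes, size: int = CHUNK):
    """Cut a byte stream into datagram-sized pieces, which is what
    socat's '-b 1024' option does implicitly."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def send_stream(data: bytes, host: str, port: int = 5000):
    """Forward raw h264 bytes to the viewer as unfragmented UDP
    datagrams (each sendto() becomes exactly one IP packet here)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for pkt in chunk_stream(data):
        sock.sendto(pkt, (host, port))
    sock.close()
```

Because every datagram fits into a single Ethernet frame, a lost frame costs at most 1024 bytes of video data instead of a whole fragmented packet.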
How to use that with wifibroadcast? Again, very simple. On the raspberry execute the usual command to broadcast your video into the air:
raspivid -t 0 -w 1280 -h 720 -fps 30 -b 3000000 -n -pf baseline -o - | sudo ./tx -r 3 -b 8 wlan1
Inside the chrooted Ubuntu on your android device execute:
./rx -b 8 wlan1 | socat -b 1024 - UDP4-DATAGRAM:localhost:5000
All this does is forward the packets from "rx" to UDP port 5000 of your device. And voilà, you should see your camera's image.
The latency is ok and should be usable. My PC is still a little bit faster but that is not too surprising. I have some ideas on how to optimize everything and will keep you posted. Concerning battery life I was quite surprised: the app and wifi dongle drained the battery by 14% after one hour of runtime. So I'm quite confident that a full battery would allow me to stream for at least 5 hours.
Another alternative I'm thinking about is to use a Raspberry with an HDMI display to show the video stream. I've fiddled around a bit and maybe in the next post I'll present my OMX pipeline that does the decoding and displaying of the video on the Raspberry's GPU (thus the CPU idles). I was actually surprised that this gave me even lower latency compared to a gstreamer pipeline running on my PC. The downside is that this is a bit more complicated hardware-wise. raspi+display+display-controller+wifi-card+battery+5v-dc-dc-converter could get a bit messy… we'll see 🙂
This post describes my quad hardware including a (mostly) complete parts list. It also shows the results of the first test of my failsafe(r) wifibroadcast video transmission
The history of my quad
I spent the last weeks finishing my quad to actually test the video transmission in the air. At first I thought this would be done with no hassle. I grabbed my good old Mikrokopter (built in the year 2007) and tried to fly it after it had sat in the attic for 5 years. The result was horrible. The gyros were dead and the bearings of all four motors were damaged… so I repaired the motors, ordered a new flight control and noticed that the old brushless controllers were not capable of processing the high update rate required over PPM (previously they used I2C). So they also went into the bin. Long story short: all I kept were the 4 motors…
The new quad
So much has happened in the quad domain since 2007. I was really impressed. I only briefly got an overview of the electronics that are on the market today. So to all the quad aficionados: don't be mad at me if I didn't buy the most optimal part here and there 😉
Let's start with some photos:
Here you can see the whole device. It is constructed as a sandwich of two 1mm carbon fiber plates. The motor rods are round 8mm carbon fiber pipes. The mountings for the motors have been made out of 10mm square aluminium pipes whose inner diameter is 8mm, so the carbon fiber rods fit perfectly. With a bit of epoxy glue they are very well connected:
On the side of the chassis the rods are glued into some custom “spacers” which also hold the two carbon plates together:
The inside of the sandwich is packed full with electronics (so they are well protected in case of a crash):
Sample of the live video stream
Following is a video of my first FPV flight with the new quad. The video was recorded at the ground station (-> it is the live stream). The sun was already gone, which explains the pale colors. Also, I set the key-frame interval to 0.5s (safe is safe for the first flight), which also had some impact on video quality. It was quite windy, which caused the camera to shake a lot. I was still too afraid to let my quad off the leash flying FPV only. Therefore, the maximum distance I achieved was around 100m. As soon as I've practiced a bit more I'll do some real range tests.
Just a quick update on my FPV project: Today I went to a park and tested the video transmission over wifi. In contrast to the last test, this environment was free of other wifi networks, so it fits the usual FPV environment better.
I mounted the standard 3dBi dipole at a height of 50cm for tx and used my double biquad on the rx side. The air data rate was 26mbit/s, which allowed me to transfer 5mbit/s of video data at a retransmission rate of 3.
The results were quite promising. The longest line of sight I found in that park was 500m (with some trees in between. Actually the perfect line of sight was not usable since my FPV system guard insisted on a bench 😉 ). At that distance there wasn't a single packet drop, thus the video stream was perfect. That is quite good for HD video transmission at 100mW (which means it's perfectly legal to use). This is a satellite image of the setup:
The RSSI at 500m was -70dBm. I found that with the 3dBi dipole on the RX side I was still able to receive a more or less usable video at 300m, with an RSSI of -82dBm. So I'm quite confident to reach 1km with the double biquad and the antenna more elevated (-> in the air on the quad).
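Whether 1km is plausible can be sanity-checked with a free-space path loss estimate (a back-of-envelope sketch I'm adding here; real terrain, trees and ground reflections add extra loss on top):

```python
import math

def fspl_db(distance_m: float, freq_hz: float = 2.472e9) -> float:
    """Free-space path loss in dB (idealized), here for wifi
    channel 13 at 2.472 GHz."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Going from 300 m to 1000 m costs 20*log10(1000/300) ~ 10.5 dB of
# extra path loss - roughly the gain the double biquad provides over
# the 3dBi dipole, which makes the 1 km estimate plausible.
extra_loss = fspl_db(1000) - fspl_db(300)
```

So a ~10dB better antenna buys roughly a 3x range increase under free-space conditions.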
I did some latency tests using the usual “recursive” approach:
The results (using a block retransmission size of 8 packets) were largely consistent with the findings of PilotGary. At a resolution of 768×480 I was able to get 80ms and with 720p I got around 130ms. This is what I expected.
The range tests I did are not quite representative of a typical FPV scenario. I used my balcony (~18m above ground) for the transmitter. My street is full of flats, most of which have their own wifi. This is what the street looks like:
Following is a list of networks on the least used channel 13:
APs:
 PWR RXQ Beacons #Data #/s CH MB   ENC  CIPHER AUTH
 -70   3      26     5   0 11 54e. WPA2 CCMP   PSK
 -71  29     111     8   0 12 63e  WPA2 CCMP   PSK
 -73   0       0     0   0 11 54e  WPA2 CCMP   PSK
 -73   0     196     8   0 13 54e  WPA2 CCMP   PSK
 -74   7      42     1   0 11 54e  OPN
 -74   0      15     0   0 11 54e. WPA2 CCMP   PSK
 -76  18      10     0   0 11 54e. WPA2 CCMP   PSK
 -77   0      16     0   0 11 54e  WPA2 CCMP   PSK
 -78   0      19     0   0 12 54e  WPA2 CCMP   PSK
 -78   0       2     0   0 11 54e. WPA2 CCMP   PSK
 -80  39     174     5   0 13 54e  WPA2 CCMP   PSK
 -81   3      39     6   0 13 54e. WPA2 CCMP   PSK
 -81   0      31     0   0 12 54e  WPA  TKIP   PSK
 -81  11      43     0   0 13 54e  WPA2 CCMP   PSK
 -82  20     152     0   0 13 61e  WPA2
 -82  55     253    41   0 13 54e  WPA2 CCMP   PSK
 -82  33     137     0   0 13 54e  WPA2 CCMP   PSK
 -82   0       4     0   0 11 54e  WPA2 CCMP   PSK
 -80   0      33     0   0 13 54e. WPA2 CCMP   PSK
 -83   2      57     1   0 13 61e. WPA2 CCMP   PSK
 -84   0      63     0   0 13 54e  WPA2 CCMP   PSK
 -83  16      46     0   0 13 54e. WPA2 CCMP   PSK
 -83   0      17     4   0 13 54e. OPN
 -83  11      46     0   0 13 54e  WPA2 CCMP   PSK
 -83   0      47     0   0 13 63e. WPA2 CCMP   PSK
 -83   3      32     0   0 13 54e. WPA2 CCMP   PSK
 -83   0      11    11   0 13 55e  WPA2 CCMP   PSK
 -84   1      22     0   0 13 54e. WPA2 CCMP   PSK
 -86   0       2     0   0 13 54   WPA2 CCMP
 -86   0       7     0   0 13 54e  WPA2        PSK
 -87   0       0    15   0 13 -1   WPA
 -65   0       0     0   0 -1 -1
  -1   0       1     0   0 -1  0   OPN
  -1   0       0     2   0 13 -1   WEP  WEP40

CLIENTS:
 PWR Rate   Lost Packets
  -1 2e- 0     0       4
  -1 1e- 0     0       1
  -1 1e- 0     0       1
 -75 0 - 1     0       1
  -1 2e- 0     0       1
 -71 0 - 1     0       1
 -30 0 - 1     0       2
 -70 0 - 1     0       4
 -70 0 - 1     0       1
 -70 0 - 1     0       1
 -70 0 - 1     0       4
 -71 0 - 1     0       1
 -72 0 - 1     0       2
 -72 0 - 1     0       5
 -72 0 - 1     0       1
 -72 0 - 1    41       2
 -74 0 - 1     0       2
 -75 0 - 1    51       9
 -75 0 - 2     0       4
 -75 0 - 1     0       1
 -76 0 - 1     0       2
 -77 0 - 1     0       1
 -77 0 - 1     0       1
 -77 0 - 1     0       1
 -77 0 - 1     0       2
 -78 0 - 1    54       4
 -78 0 - 1     0       1
 -79 0 - 1     0       3
 -80 0 - 1     0       1
 -81 0 - 1     0       1
 -81 0 - 1     0       1
 -81 0 - 1     0       1
 -81 0 - 1     0       4
 -82 0 - 1     0       1
 -82 0 - 1     0       2
 -82 0 - 1     0       1
 -83 0 - 1     0       1
 -83 0 - 1     0       2
 -83 0 - 1     0      12
It is even worse than it looks since channel 11 overlaps with channel 13. I won’t spam you with the network list of channel 11. In summary: Around 50 networks with 100+ clients. And that is just what I’ve received using the standard 3dBi dipole that came with the TL-WN722N. All this traffic on channel 11 increases the noise level on channel 13 (that is the one that was used in this test).
You can find instructions to build one here.
The performance of that antenna is quite impressive. The RSSI was always around 10dB better compared to the dipole (assuming the unit the card reported was actually dB; often it is not). Still, its beam is not too narrow for FPV (in contrast to most yagis and dish antennas).
So as not to look like a tinfoil-hat guy running through the street with that weird antenna and a laptop, I put both laptop and antenna into my backpack. The laptop streamed its image to my phone via a second wifi network. Because of that I wasn't able to align the antenna correctly: it always pointed a bit downward instead of up to the transmitter. It also had no direct view of the transmitter. A plastic box, several shirts (protecting the antenna) and the backpack were in the direct line of sight, which on top of that was blocked by some trees. All in all, pretty bad circumstances.
Since the channel was so occupied I managed to transfer only 3mbit/s in total (1mbit/s of video times three retransmissions). Because of that I limited the image resolution to VGA. These were my test conditions: 768×480@30fps, 1mbit/s video rate, 26mbit/s air rate, retransmission block size of 8 packets, 1024 bytes of payload, triple retransmissions.
200m: Perfect image
300m: Noisy image but still flyable
400m: Very noisy images, not usable
600m: Still lots of packets received but most of them with bad FCS
300m of usable range might not sound like a lot. But I was really surprised it still worked at that distance. Imagine more than 200 networks and even more clients polluting the air around transmitter and receiver with noise. Together with the bad antenna alignment, I am quite confident that this solution could reach at least 1km+ @ 720p with a direct line of sight and fewer networks in between.
This post presents some changes to my wifibroadcast project (https://befinitiv.wordpress.com/2015/01/25/true-unidirectional-wifi-broadcasting-of-video-data-for-fpv/ ) that improve reliability and functionality
When playing around with my rx and tx tools I noticed something odd. I tried to find the right setting for the retransmission count. To recall: the retransmission count is the number of times an identical data packet is sent. The idea is to simply increase the probability that at least a single packet makes its way through. The strange thing I encountered was that it made no difference whether I was sending the data two times or six times. In both cases I had nearly the same packet loss. Using wireshark I was able to find the cause of the problem: beacon frames from hidden terminals.
The hidden terminal problem:
A ~~~~ B ~~~~ C
Assume A wants to talk to B. Before A sends a packet it monitors whether the channel is free. If so, it sends its packet. Unfortunately, C is doing the same thing as A at the exact same time. Since A cannot hear C (and vice versa, because they are too far apart), they both assume the channel to be free. At station B the frames from A and C collide and get lost.
What does this have to do with retransmission? Well, beacon frames are sent at the lowest possible rate (1Mbit/s). They usually carry between 150 and 250 bytes, so their duration can be up to two milliseconds. In contrast, the wifi card I used was sending its data at 26Mbit/s, so the duration of those frames was significantly shorter. Because of that, a single beacon frame from a hidden station was able to destroy a whole retransmission block. Let's visualize that. Assuming a retransmission rate of 3, the data packets a, b and c would be sent like this:
Now let's assume a beacon B starts right after the first transmission of a:
Whoops, you have just lost packet b because the beacon blocked every transmission of it.
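The duration mismatch is easy to verify with the numbers from above (a quick sketch that ignores preambles and headers, so the real airtimes are slightly longer):

```python
def airtime_ms(payload_bytes: int, rate_mbit: float) -> float:
    """Rough on-air duration of a frame in milliseconds,
    ignoring PHY preamble and MAC header overhead."""
    return payload_bytes * 8 / (rate_mbit * 1e6) * 1e3

beacon = airtime_ms(250, 1)    # 250-byte beacon at 1 Mbit/s: ~2 ms
data = airtime_ms(1024, 26)    # 1024-byte data frame at 26 Mbit/s: ~0.3 ms
# A single 2 ms beacon therefore spans roughly six back-to-back data
# frames - enough to blank all three copies of one packet when the
# retransmission rate is 3.
```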
The solution to this problem is simple: Retransmission blocks. Packets are gathered to form a block which is then sent several times. Let’s assume a block size of 8 packets, a retransmission rate of 3 and the packets a,b,c,d,e,f,g and h to send. On the air the packets are now sent as follows:
When our nasty beacon arrives it looks as follows:
The total number of packets is identical but we lost no packet 🙂
This comes at a little price: Latency. But if the blocks are small enough it is pretty low. I found block sizes of 8 or 16 packets quite useful. They really made the link quality much much better. If you choose a block size of 1 then the program behaves as before (this is also the default setting). A little caution is needed: both rx and tx need to agree on the block size!
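The block scheme can be sketched in a few lines of Python (an illustrative model of the on-air ordering, not the actual C implementation):

```python
def tx_order(packets, block_size, retransmissions):
    """On-air packet order with retransmission blocks: gather
    `block_size` packets, then send the whole block
    `retransmissions` times."""
    order = []
    for i in range(0, len(packets), block_size):
        block = packets[i:i + block_size]
        order.extend(block * retransmissions)
    return order

def survivors(order, lost_slots):
    """Packets that still get through when a burst (e.g. a beacon)
    wipes out the given set of transmission slot indices."""
    return {p for i, p in enumerate(order) if i not in lost_slots}

pkts = list("abcdefgh")
burst = {3, 4, 5}                 # a ~2 ms beacon kills 3 consecutive slots
naive = tx_order(pkts, 1, 3)      # a a a b b b c c c ...
blocked = tx_order(pkts, 8, 3)    # a..h a..h a..h
# With block size 1 the burst hits all three copies of 'b';
# with block size 8 every packet keeps at least two untouched copies.
```

Both orderings transmit exactly the same number of frames; only the interleaving differs.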
What retransmission rates are sensible? In my experience you need at least two. Then the video stream will be mostly error-free at good reception. If you have a bad link (long range or high noise), then you should set this value as high as you can. This really improves the range you can get. The limit for this factor is the available bandwidth: just multiply your video bitrate by the retransmission count and check that the product fits into the maximum bandwidth (assuming a TL-WN722N this would be roughly 14mbit/s at an MCS3 26mbit/s air data rate).
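That bandwidth check is simple enough to write down (the 14mbit/s figure is the rough net MCS3 throughput of the TL-WN722N mentioned above):

```python
def fits(video_mbit: float, retransmissions: int,
         net_capacity_mbit: float = 14.0) -> bool:
    """Does video_bitrate * retransmission_count fit into the
    card's usable net throughput?"""
    return video_mbit * retransmissions <= net_capacity_mbit

# 3 mbit/s video with triple retransmission needs 9 mbit/s: fits.
# 6 mbit/s with triple retransmission needs 18 mbit/s: does not.
```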
The first version of rx and tx was able to transfer only a single stream of data at once. Since it would be nice to transport also other data over the same channel (think of GPS, etc) I added a “port” feature to both programs. The last byte of the fake MAC address used by rx and tx is replaced by this port number. This allows you to define up to 256 channels.
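A sketch of the port scheme (assuming the MAC digits are hex, matching the 13:22:33:44:55:66 notation used by rx and tx):

```python
BASE_MAC = bytes([0x13, 0x22, 0x33, 0x44, 0x55, 0x66])

def mac_for_port(port: int) -> bytes:
    """The 'port' feature: rx and tx replace the last byte of the
    fake MAC address with the port number, giving up to 256
    independent data channels on a single wifi channel."""
    if not 0 <= port <= 255:
        raise ValueError("port must fit into one byte")
    return BASE_MAC[:5] + bytes([port])
```

A receiver for port 1 (GPS, say) simply filters for the MAC ending in 01 and never sees the video packets on port 0.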
Testing video transmission using the raspberry camera
On the receiver side:
sudo ./rx -p 0 -b 16 wlan0 | gst-launch-1.0 fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false
On the raspberry side:
raspivid -t 0 -w 1280 -h 720 -fps 30 -b 3000000 -n -pf baseline -o - | sudo ./tx -p 0 -r 3 -b 16 -f 1024 wlan1
This creates a transmission on port 0 (-p) with three retransmissions (-r), a retransmission block size of 16 (-b) and packets with the length of 1024 bytes (-f).
So far it looks pretty good. The latency is roughly (not measured) in the range of 100ms and the range is satisfying. I tested it with two TL-WN722N with 3dBi dipoles and had good video quality throughout my apartment. The longest distance covered was 20m with 4 concrete walls in between. There I already saw lots of packets dropping, but a retransmission rate of 5 fixed that and gave a clear image.
If you want to try that by yourself:
hg clone https://bitbucket.org/befi/wifibroadcast/
If you are using the same wifi cards you might also take a look at https://befinitiv.wordpress.com/2015/02/22/finding-the-right-wifi-dongle-and-patching-its-kernel-driver-and-firmware/ .
What’s next? I guess building a double biquad antenna for the receiver 🙂
This post is a follow-up on https://befinitiv.wordpress.com/2015/01/25/true-unidirectional-wifi-broadcasting-of-video-data-for-fpv/ . Here I describe my findings on several WIFI dongles that I have tested.
In the last post I presented my raw sender and raw receiver software. This post focuses on the right hardware to use with that software.
Therefore, I tested the following WIFI adapters:
My tests focused on tx power and injection rate. I also briefly looked into the receiver sensitivity.
Looking into the TX power was an interesting journey. The first thing I did was to google how others change their TX power. I guess most of you know about the wardriving kids changing their regulatory domain to BO to be able to execute "iwconfig wlan0 txpower 30" and get 30dBm. I tried that same command for lowering my tx output power and noticed no difference at all when looking at the receive strength (RSSI) on a second card. That is quite weird; there should be a noticeable difference between 1dBm and 20dBm. I started looking into the device drivers and noticed that the iwconfig command never reaches the power-set function of the cards (this was true for all of the tested cards). I haven't traced down where the call gets lost, but I was quite stunned. Could all the 1000 "change your regdomain" websites be wrong about this? As an additional check I hardcoded different tx power values in the drivers. This did change the RSSI! To be honest, I only tested the vanilla drivers. I didn't check whether Kali uses modified drivers, but I found no hints of this.
Let's look in detail at the extraordinary claims of the ALFA cards (1W, 2W, …). For example, the AWUS036NHR (sold as 2W) only contains a 1W amplifier. I increased the tx power in the kernel driver to 200mW and tested the card in a shielded cage (a microwave oven is the perfect enclosure for this 🙂 ). I already saw lots of packet drops. I suspect that the power distribution of the card is too bad to supply this amount of power. One indication is that the supply voltage of the tx amplifier dropped from 5V to 4V during the transmission of a packet. I wouldn't be surprised if this caused the drops. Using a tx power of 1W, nearly all packets got dropped; just a few made it through the air. I noticed the same behaviour on the AWUS036H, although not as bad as on the NHR.
The TP-LINK TL-WN722N supports only 100mW output power on paper. My tests showed that this claim is more or less true. The card shows reliable transmission at 100mW and the RSSI suggested that the output power is in that order of magnitude (compared to the AWUS cards' RSSI). Nice!
I did not bother to hack the Dlink card's driver since it does not have an external antenna connector.
I was surprised to see how much the injection rates differ from card to card. I had assumed that packets are injected at the maximum rate (as long as the channel is free), but this is not at all the case. I did not trace the cause of the low injection rates because I'm not motivated to optimize a driver stack in that respect. Following is a list of injection rates of the tested cards. I injected packets with 1024 bytes of payload and measured the rate. On the left you see the air data rate, on the right the net data throughput. Net means 100% user data.
len: 1024 bytes

REALTEK RTL8187
---------------
54 mbit/s OFDM: 480p/s ~ 3.9mbit/s

REALTEK RTL8192
---------------
54 mbit/s OFDM: 80p/s ~ 0.6mbit/s

RALINK RT2573
-------------
54 mbit/s OFDM: 2500p/s ~ 20.5mbit/s

ATHEROS AR9271
--------------
11 mbit/s CCK: 830p/s ~ 6.8mbit/s
24 mbit/s OFDM: 1750p/s ~ 14.3mbit/s
54 mbit/s OFDM: 2700p/s ~ 22mbit/s
6.5 mbit/s MCS0: 630p/s ~ 5.1mbit/s
13 mbit/s MCS1: 1100p/s ~ 9mbit/s
26 mbit/s MCS3: 1800p/s ~ 14.7mbit/s
52 mbit/s MCS5: 2700p/s ~ 22.1mbit/s
Again, the ALFA cards are pretty much crap. Luckily, the TP-LINK card showed excellent results. Depending on your data rate and packet-loss reliability requirements (the retransmission flag of the tx program) you can use these numbers to find the right modulation scheme. Assuming you have a video stream with 6mbit/s (enough for h264 HD), you can fit that easily into the MCS3 modulation, even with one retransmission.
The RX sensitivity of the TP-LINK was the best of all the tested cards. I have no numbers available since all the other cards were already disqualified by their injection rate. I measured it by sending packets at a known rate and looking at the reception rate. Again, the ALFA cards were the worst. The public opinion on these cards is really the opposite of my findings… quite interesting.
The winner of my tests is very clear: the TP-LINK TL-WN722N.
It has excellent injection rates and also an acceptable tx power level. But it is not only good for transmission: it can deliver frames with a wrong FCS to user space. These are, for example, frames where only a single bit is flipped. Normal cards just drop these frames and there is no way to see them from user space. It is quite obvious that a few flipped bits have less effect on a video stream than the loss of a whole packet (8*1024 = 8192 bits). This is exactly what I described in the last post: the link should behave as much as possible like an analog link. This feature of the card can be enabled with:
sudo iw dev wlan1 set monitor fcsfail
Another advantage of this card is that its firmware is open: https://github.com/qca/open-ath9k-htc-firmware.git
Lastly, this card is really cheap: 10€. I’m quite happy that I’ve found it.
As said before, changing the tx power of the card requires a driver change. I patched the driver of the TL-WN722N in the raspberry kernel. This patch should set the power to a constant 20dBm (although I haven't checked the unit of the tx-power value; this is still TODO). Use the following commands to apply it:
cd /tmp
hg clone https://bitbucket.org/befi/wifibroadcast/
git clone https://github.com/raspberrypi/linux.git
cd linux
git checkout fe4a83540ec73dfc298f16f027277355470ea9a0
git branch wifi_txpower
git checkout wifi_txpower
git apply ../wifibroadcast/patches/AR9271/kernel/fixed_channel_power_of_ath9k_to_20dbm.patch
From here on you can just compile the necessary modules the usual way (Refer to http://www.raspberrypi.org/documentation/linux/kernel/building.md ). When the compilation is finished you need to copy the following modules to the corresponding folders under /lib/modules/`uname -r`/kernel on the raspi:
drivers/net/wireless/ath/ath9k/ath9k.ko
drivers/net/wireless/ath/ath9k/ath9k_hw.ko
drivers/net/wireless/ath/ath9k/ath9k_htc.ko
drivers/net/wireless/ath/ath9k/ath9k_common.ko
Changing the tx rate was a bit more complicated. It seems as if the firmware of the card only takes rate suggestions from the kernel driver; the actual rate is decided by the firmware. Therefore, I needed to patch that as well. You'll find a pre-compiled firmware that uses MCS3 (26mbit/s) as the injection rate under patches/AR9271/firmware/htc_9271.fw. Copy this file to /lib/firmware and re-insert the card to use the modified firmware. If you want to compile the firmware with a different injection rate you can take a look at the patch that I supplied. The easiest way is to replace the first line in the list with a copy of the wanted injection rate. Instructions for compiling the firmware are given in the firmware repo: https://github.com/qca/open-ath9k-htc-firmware.git
The TL-WN722N is my choice for the receiving side. I’m still hesitating to buy a second one for the transmitter. Maybe I should give ALFA a second chance with the AWUS036NHA (https://wikidevi.com/wiki/ALFA_Network_AWUS036NHA ) which uses the same chip-set as the TP-Link. The advantage would be that it adds a 1W power amplifier to the output. However, if it is as bad as the other ALFA cards, I would have thrown away 25€… If you want me to test that card, feel free to donate 🙂
This post shows how to broadcast data over 802.11 devices with a true unidirectional data flow. No need to be associated to a network (thus no risk of being disassociated) and no acknowledgements are sent from receiver to transmitter. Please note that this is ongoing work and just a proof of concept.
EDIT: For a complete overview, refer to this page: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmission-of-live-video-data/
My plan for this spring: fly FPV! There are already a lot of devices on the market for transmitting analog video signals over 2.4GHz or 5.8GHz. But to me that always seemed a bit outdated. Of course, several people thought so too and tried to send their video data over Wifi. A good example: https://sparkyflight.wordpress.com/2014/02/22/raspberry-pi-camera-latency-testing-part-2/
"Sparky flight" took a RaspberryPi, encoded the video stream as h264 and sent it over Wifi to a PC. He was able to get a latency down to 85ms glass-to-glass, which is quite nice! But all Wifi solutions have the same problem: when you lose your Wifi connection, you are immediately blind. That is of course not acceptable in an FPV setup. This is where the old analog link has a really big advantage: when you get out of range, the signal degrades slowly. You would still have time to react and turn your plane around.
So I thought: wouldn't it be possible to achieve the same advantage of a slowly degrading link over Wifi? Not out of the box, but I'll show you how.
The basic approach is: the video transmitter sends its data as a true broadcaster into the air, regardless of who is listening. The receiver listens all the time and receives the data whenever it is in range of the transmitter. When it starts getting out of range it will no longer receive every packet, but still some. This behaviour is comparable to that of an analog signal path.
The main problem is that the Wifi standard does not support such a mode. Devices always need to know to whom they are sending their data. This relationship is created by the "Wifi association": if your PC is associated with your router, both devices know to whom they are talking. One of the reasons for this association is to make the data transfer more reliable. The receiver of a packet always acknowledges the reception to the transmitter; if no acknowledgement has been received, the transmitter has the chance to re-transmit the packet. Once the devices lose their association, they cannot exchange data anymore.
The Wifi ad-hoc mode comes pretty close to an unassociated "broadcast style" way of transmitting data. Unfortunately, it seems as if modern 802.11 standards don't support ad-hoc mode properly anymore. If you buy an 802.11ac card and put it into ad-hoc mode, it most likely falls back to 11Mbit/s. The standard does not require ac rates to be supported in ad-hoc mode 😦
To solve this issue I wrote two small programs that serve as a raw transmitter and raw receiver. They are pretty much hacked together out of a program called “packetspammer” http://wireless.kernel.org/en/users/Documentation/packetspammer
You find my programs here: https://bitbucket.org/befi/wifibroadcast
After compiling the sources with “make”, you’ll have two programs called “tx” and “rx”. Both take as a mandatory argument a wlan interface that has been put into monitor mode. The tx program reads data over stdin and sends it with a raw 802.11 packet into the air. On the other side the rx program listens on a device and outputs received data to stdout. The packets of the transmitter are recognized by their fake MAC address (13:22:33:44:55:66). The packets only contain a valid 802.11 header (so that they are not rejected by the wifi card). The rest of the packet is filled with raw data.
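The shape of such an injected frame can be sketched roughly like this (a simplified Python illustration; the real tx is written in C, also prepends a radiotap header, and its exact field values may differ):

```python
import struct

FAKE_MAC = bytes([0x13, 0x22, 0x33, 0x44, 0x55, 0x66])

def build_frame(payload: bytes, seq: int) -> bytes:
    """Sketch of the kind of raw frame tx injects: a minimal 802.11
    data header carrying the fake MAC (so rx can filter for it),
    followed directly by raw payload instead of a normal LLC body."""
    frame_control = struct.pack('<H', 0x0008)  # type: data frame
    duration = struct.pack('<H', 0)
    addr1 = FAKE_MAC                            # rx filters on this
    addr2 = FAKE_MAC
    addr3 = FAKE_MAC
    seq_ctrl = struct.pack('<H', (seq & 0xfff) << 4)
    return (frame_control + duration + addr1 + addr2 + addr3 +
            seq_ctrl + payload)
```

The 24-byte header keeps the card happy; everything after it is opaque user data.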
Following is an example on how to use the programs:
sudo ifconfig wlan0 down
sudo iwconfig wlan0 mode monitor
sudo iwconfig wlan0 channel 1
sudo ifconfig wlan0 up
sudo ./rx mon0
sudo ifconfig wlan0 down
sudo iwconfig wlan0 mode monitor
sudo iwconfig wlan0 channel 1
sudo iwconfig wlan0 rate 54M
sudo ifconfig wlan0 up
sudo ./tx mon0
Everything you type into the tx shell should now appear on the rx shell. The tx program also supports as parameters the maximum length of the packets and the number of retransmissions. A retransmission rate of 3, for example, will cause the transmitter to transmit each packet three times. This increases the chances that the receiver receives one of them correctly. To avoid the data being delivered three times, each packet contains a 32-bit sequence number. If a sequence number is received more than once, the subsequent packets with identical sequence numbers are ignored. I admit that this type of redundancy is rather stupid. The problem is that most Wifi cards discard packets with a wrong FCS (frame check sequence) completely, so the more classical approaches to redundancy (hamming codes, …) are not so easy to use. There is definitely still some work to do here! Feel free to participate 🙂
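The dedup logic on the receiving side boils down to something like this (a Python sketch; the real rx is C and may track sequence numbers differently):

```python
class Deduplicator:
    """Drop duplicate packets by their 32-bit sequence number.

    Retransmitted copies carry the same sequence number, so only the
    first copy of each number is passed on to stdout.
    """
    def __init__(self):
        self.last_seq = -1

    def accept(self, seq: int) -> bool:
        # A packet is new only if its sequence number advances past
        # everything seen so far (the stream is sent in order).
        if seq > self.last_seq:
            self.last_seq = seq
            return True
        return False
```

With a retransmission rate of 3, two of the three copies of every packet are silently discarded here.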
Writing text over a true broadcast-connection is nice. But how about video? Actually it is really simple. GStreamer is a nice tool for this purpose. My test-setup looks as follows:
(usb webcam) <—> (raspberry pi) <—> (wifi dongle) ~~~~airgap~~~~> (wifi dongle)<—>(PC)
On the raspberry pi I execute:
gst-launch-1.0 -v v4l2src ! 'video/x-raw, width=640, height=480, framerate=30/1' ! omxh264enc target-bitrate=500000 control-rate=variable periodicty-idr=10 ! h264parse ! fdsink | sudo ./tx -r 2 -f 1024 wlan0
In words: Receive raw video from V4L2, transform it into 640×480 30fps, encode it as h264 with 500kbit/s and increased keyframerate (this helps if packets get dropped), directly write the video data (without container) to stdout. The video data is then piped into the tx program with two retransmissions and a packet length of 1024 bytes.
And to receive and display it on my PC:
sudo ./rx wlan0 | gst-launch-1.0 fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false
In words: Receive data on interface wlan0 and pipe it into gstreamer. Gstreamer receives the data, parses the raw h264 data, decodes it and displays the video.
The video quality is ok, maybe a bit too low for actual flying. With the settings above the latency is quite ok, maybe between 100 and 200ms. I noticed that when I increased the encoder bitrate the latency got bigger, but I still need to look into that. I think by using the original raspi-cam it should be possible to achieve the ~100ms of the "Sparky Flight" guy.
Dropped packets turn out to behave as expected. The video image is partly disturbed but continues to run. A rough estimate: Up to a loss of 5-10% of the packets the video should still be usable to rescue your plane. See below an example of a transmitted video with a packet loss rate of approximately 2.5%:
Unfortunately I wasn't able to change the power of the transmitted packets. There is a field in the radiotap header which I have set, but it seems to be ignored. Otherwise my solution would be perfect for FPV. You could (as a Bolivian, of course 😉 ) buy one of these cheap Wifi cards with 1W output power and have an extremely cheap long distance video link (and since this is a true unidirectional link you would only need a single high power card in your plane). You could also use the Raspi to gather and transfer GPS information or battery capacity live. Of course this would then be realized as a side-channel and written onto the image on the receiving device, in contrast to those (shitty) analog devices that write directly onto the transmitted image…
If you are interested in participating, please share your experiences in the comments, take my code, modify it, improve it 🙂 My gut feeling is that only a few small things are left to do for a true digital (possibly HD) FPV system with low cost equipment.
My next step: Make some range experiments…