
Telemetry OSD for wifibroadcast

July 6, 2015

This post shows how to add an overlay with telemetry information onto a video stream.

Since the last flyaway of my quad, a telemetry link that transmits GPS coordinates has been high on my wish-list. Wifibroadcast was prepared a long time ago to transfer a second data stream in parallel to the video data by defining ports, but until now I did not make use of it. You can see a video of the OSD in action here:

This post describes how to set up the OSD. Before the actual OSD part, two changes to wifibroadcast that were required for it are described.

Setup

My quad uses a Naze32 flight controller that is able to provide FrSky telemetry data over a serial link. The serial port of the Naze32 is connected to a USB2SERIAL converter that is plugged into the tx Raspberry. The telemetry data is then transmitted using wifibroadcast over a second port in parallel to the video. On the receiving Raspberry the telemetry data is received, parsed and drawn onto the screen.

Wifibroadcast minimum packet length

Telemetry data is very different from video data in that it consists of very small packets. For example, most FrSky packets are only 5 bytes long. With the default settings of tx this would lead to a single WiFi packet being sent for each small telemetry data unit. This would of course create high overhead, since the ratio of payload to required header data is very low. To avoid this issue I added a command line parameter -m that can be used to define a minimum payload length. If given, tx waits until it has captured at least this number of bytes and only then sends a packet.
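
For example, a single low-rate telemetry stream could be sent like this (just a sketch; the serial device, port number and interface are assumptions that have to match your setup):

cat /dev/ttyUSB0 | sudo ./tx -p 1 -m 64 wlan0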

Several tx instances in parallel

As mentioned above, wifibroadcast has had the “port” feature for quite some time. The idea was to start several instances of tx, each with a different -p parameter. I used this method for my first telemetry experiments but noticed something odd: Whenever I started a second tx instance in parallel to the video tx process, the video tx was influenced. This was even the case when the second tx did not send any data at all. Quite strange. The result was that the video transmission was less fluid and stuttered a bit. Quite a high price for just a bit of telemetry data…

I basically had two options: Trace the cause of this issue in the kernel or adapt tx so that a single instance is able to send several data streams. I went for the latter because it required less effort.
The route I took was to give tx a new parameter -s that defines the number of parallel streams to transmit. If this parameter is omitted, tx behaves just like before, transmitting a single stream that is read from standard input. If however two or more streams are requested, tx changes the way it works. It then creates named FIFOs under /tmp. For example, if -s 2 is given, tx creates two named FIFOs called /tmp/fifo0 and /tmp/fifo1. Everything that gets written into fifo0 will be transported over the first port and everything written into fifo1 gets transported over the second port. Quite simple. The actual port numbers can be influenced by the -p parameter, which serves as an offset. Assuming that -p 100 is given, fifo0 sends on port 100 and fifo1 on port 101.

The next section will show a usage example of this new mode that should make things clearer.

This feature is not yet in the master branch since I have not had the time to test it thoroughly. After my first successful flight I will merge it into master.

Putting things together

These commands assume that you have already installed wifibroadcast (as described here).

Transmitting side

First, change to the “tx_fifo_input” branch in wifibroadcast:

cd
cd wifibroadcast
hg pull
hg update tx_fifo_input
make

Next, start a tx instance in the background with a retransmission block size of 4, a minimum packet size of 64 bytes and two parallel streams on the monitor-mode interface wlan0:

sudo ./tx -b 4 -m 64 -s 2 wlan0 &

All that is needed is to connect the data sources to the named FIFOs:

sudo su
stty -F /dev/ttyUSB0 -imaxbel -opost -isig -icanon -echo -echoe -ixoff -ixon 9600 #set up serial port

raspivid -ih -t 0 -w 1280 -h 720 -fps 30 -b 4000000 -n -g 60 -pf high -o - > /tmp/fifo0 & #video
cat /dev/ttyUSB0 > /tmp/fifo1 & #telemetry

Now video and telemetry data are transmitted in parallel.

Receiving side

The video reception can be left unchanged (refer to here) and should work out of the box.
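
For reference, the receive pipeline for the video looks roughly like this (assuming wlan0 and the hello_video player from the earlier posts; adjust paths to your installation):

sudo ./rx -b 4 -p 0 wlan0 | /opt/vc/src/hello_pi/hello_video/hello_video.bin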

To get the OSD working, we first need to check out and build my OSD project:

cd
hg clone https://bitbucket.org/befi/frsky_omx_osd
cd frsky_omx_osd
make

And finally you need to pipe the telemetry data into the OSD viewer (note the -p 1 for the telemetry port):

cd
cd wifibroadcast
sudo ./rx -p 1 -b 4 wlan0 | /home/pi/frsky_omx_osd/frsky_omx_osd /home/pi/frsky_omx_osd

Of course, you can also save the telemetry data to a file using the “tee” command.
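
A minimal sketch of that (the output file name is just an example):

sudo ./rx -p 1 -b 4 wlan0 | tee /home/pi/telemetry.frsky | /home/pi/frsky_omx_osd/frsky_omx_osd /home/pi/frsky_omx_osd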

In case you just want to test the OSD without the whole wifibroadcast stuff there is a FrSky telemetry log included in the repository:

./frsky_omx_osd . < testlog.frsky

Conclusion

The OSD works well for me. But be warned: Things are still a bit hacky around here. I wouldn’t guarantee that my FrSky protocol interpretations are always right. No wonder seeing how that protocol is designed. It is a perfect example of what happens if companies outsource their coding tasks to mental institutions 😉
It should be easy though to replace the frsky.c parser with a different one (also for a different protocol). Volunteers are welcome 🙂


70 Comments
  1. Indur permalink

    I’ve been following your blog for the last couple of months and I have to say – it’s really great what you are doing!
    I’m using this on my bigger quad and it looks very promising. The picture is so much better compared to the analog transmission. (Still using 2.4 and 5.8GHz systems on my racing quads though.)
    On my big quad I have a NAZA FC (I tried to avoid non-open-source flight controllers for years, but then I just gave it a try and.. damn, it’s so stable).
    Anyway, on my NAZA copter I’m using an Arduino Nano to get the telemetry data (battery voltage and GPS coordinates using the NazaDecoder library), but I connected my Nano directly to the RPI UART pins. Well, almost directly: since the Nano is a 5V device and the RPI UART is 3.3V, I had to use a couple of resistors to lower the voltage. The serial device on the RPI is /dev/ttyAMA0

    I’m wondering why you are using a USB2SERIAL converter to connect your Naze to the RPI?

    • Thanks 🙂 I’m using the USB2SERIAL because I really like having the serial console always enabled. Can be a real time saver if the Raspi refuses to boot for whatever reason.

      • John van der Berg permalink

        Hi befinitiv,

        It’s a great project, well done!
        I use your prebuilt v0.2 image on my DJI Phantom, but now I want to use the navdata from the DJI Naza FC (with a naza2frsky for the conversion).
        I have studied the shell TX scripts, but couldn’t find the serial telemetry entries. Is this enabled?
        And how can I change the serial port to /dev/ttyAMA0?

        thanks,

        John

      • Hi John

        Thanks! In the prebuilt images the telemetry transmission is disabled. This is because it is heavily device dependent (which serial port to use, which protocol, …). But you should find everything needed in this post. So what you would need to do is to add the lines from this post to the tx.sh you’ll find in the TX prebuilt images under /home/pi/wifibroadcast_fpv_scripts. Hope this helps!
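
        Roughly, the lines to add to tx.sh would look like this (a sketch only; the serial device and baud rate are assumptions, and the tx call in the script also needs the -s 2 and -m options from this post):

        stty -F /dev/ttyAMA0 -imaxbel -opost -isig -icanon -echo -echoe -ixoff -ixon 9600 # set up serial port
        cat /dev/ttyAMA0 > /tmp/fifo1 & # telemetry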

      • Burb permalink

        Hi befinitiv, do you know if I can connect the softserial TX from the Naze32 directly to the RPI RX? I don’t know how to find out if the Naze32 logic level is 5V or 3.3V.
        Thanks

    • Ronald permalink

      Hello Indur, do you have a tutorial for connecting the Naza?
      I also use a NAZA on my copter and want to transmit the telemetry via wifibroadcast.

      Thx Ronald

  2. @befinitiv Great work!

    I used 3 TP-Link WN722N with a USB hub on the RX side, but it doesn’t work. What could be the problem here?

    • Could you tell us something more about what’s not working here? Error messages together with command lines would be helpful. Does it work with 2 but not with 3?

      • I used an RPI2 with 3 WN722N on the rx side and used your rx script (https://bitbucket.org/befi/wifibroadcast_fpv_scripts/src).
        It works with 2 WN722N but not with 3. The third WN722N is not found.

      • I updated the RX script so that it automatically detects how many wifi dongles are connected. I tested it with 1, 2 and 3 dongles on an RPI2 and it worked fine. Maybe you could update your rx script as well as wifibroadcast and try again?
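
        The detection itself is nothing fancy. Roughly (a sketch, the actual script in the repository may differ):

        NICS=$(ls /sys/class/net/ | grep wlan)
        sudo ./rx -b 4 -p 0 $NICS | /opt/vc/src/hello_pi/hello_video/hello_video.bin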

    • Ronald permalink

      Hi David, I used 3 TP-Link WN722N on the rx side and this works fine. Could you show us your rx script?

      Best greetings, Ronald

    • Trailblazer permalink

      I tried to use 3 TP-Links with an RPi 2, but it didn’t work for me. They couldn’t be initialized and the OS froze. With a Pi Model B there is no problem for me, all interfaces are listed and initialized correctly.

  3. Indur permalink

    Ronald, take a look at this: http://www.rcgroups.com/forums/showthread.php?t=1995704
    Or if you prefer ready-to-use solutions, you may be interested in this: http://pleasantsoftware.com/developer/3d/zaggometry-naza2frsky-taranis-telemetry-adapter/

    You should be able to connect this adapter directly to your RPI UART.

    • Ronald permalink

      Hello Indur, I have the Zaggometry here. Can it be connected directly to the UART of the Pi (I have a Zero here) or does something else need to go in between?

      I had already connected the Zaggometry but so far no values have been shown. Maybe I still need to enable the UART on the Zero?

  4. cbl permalink

    @befinitiv you forgot the interface in the ‘./tx’ part. I think it should be ‘sudo ./tx -b 4 -m 64 -s 2 mon0 &’. Otherwise, great project. Currently I’m trying to code a Python interface to remote-control the camera with tx/rx in parallel on the same interface. I’m new to Python, so it could take a while. Will make it public if it works. Greetings

  5. Michael permalink

    Hey man, this is MAGIC!
    Great project, I love it!
    One wish: Mavlink please! This will be an early xmas present for the whole ardupilot community!!

    Thanks a lot,
    Michael

  6. shaddi permalink

    Is there a way to get an average RSSI-reading of the last x packets as an OSD-Element? Radiotap seems to provide this information as “IEEE80211_RADIOTAP_DBM_ANTSIGNAL”

  7. DAGA permalink

    Hi,
    I am currently working on a mavlink OSD for your project. But I have problems getting the new tx/rx system working. I updated both Raspberrys as described above.
    The problem is, I am not able to get the stream working again. I tried your commands using the fifos for telemetry and video and I also tried the standard tx command:

    “sudo /home/pi/wifibroadcast/rx -b 4 -p 0 wlan0 | /opt/vc/src/hello_pi/hello_video/hello_video.bin &”

    The tx Pi is telling me that it is sending packets, but the TP-Link of the tx is not blinking and I can’t receive anything on the rx side…

    Do you have any idea what could be wrong?

    • DAGA permalink

      I just tested my old image on the tx side. This is working fine, so the problem has to be somewhere on the tx pi.

    • DAGA permalink

      Okay, I don’t get it.
      I posted the wrong command above. This is the command I used on the tx side:

      raspivid -ih -t 0 -w 1280 -h 720 -fps 30 -b 4000000 -n -g 60 -pf high -o - | sudo /home/pi/wifibroadcast/tx -b 4 -r 2 wlan0

      The terminal is telling me that packets are sent, but the TP-Link just won’t start blinking and I am not receiving anything on rx.

      I really need some help. I will of course publish the code for the mavlink OSD once finished.
      I already compiled it but I can’t test it since I can’t get the tx working.

      • DAGA permalink

        Got it working. I re-downloaded wifibroadcast and configured everything again.
        The Mavlink OSD is working fine now. The only problem I have is that the renderer crashes if the display resolution is set to 1440p. With 1080p everything is working fine.

  8. Alan permalink

    Hi DAGA,

    Did you manage to show some mavlink data on the video stream?
    Can you post the source for the mavlink OSD?

  9. Dredge permalink

    I have wifibroadcast running great.

    How can I use the OSD with an rx PC running Debian and gstreamer?

    • Hi

      It depends on the format of the telemetry data. For FrSky I wrote a tool for the Raspberry Pi. Unfortunately, this won’t run on a PC. But you can use this program as a starting point. All you need to do is to replace the Raspberry-specific render function with your own.

  10. Kevin permalink

    Hello all,
    I need some help :).

    I’m working on a project using a Navio+ and a Raspberry Pi 2. I successfully transmit FPV over wifibroadcast but for telemetry I have a problem.
    I want to use APM Planner 2 to receive the telemetry data; I use ArduCopter-quad from APM, available here: http://docs.emlid.com/Navio-APM/installation-and-running/.

    My issue seems to be: APM Planner expects UDP data or serial data, and it seems that it doesn’t understand data received over broadcast. Here are the commands I used:

    TX side:
    First I emulate an open UDP port 14550:

    nc -l -u 14550

    then I transmit:

    ArduCopter-quad -A 127.0.0.1:14550 -| ./tx -b 1 -p 1 -f 80 $1

    RX side :

    ./rx -p 1 -b 1 $1 | socat - UDP:127.0.0.1:14550
    (I used socat but I can also use nc: ./rx -p 1 -b 1 $1 | nc -u 127.0.0.1 14550 )

    => $1 is wlan1

    APM Planner detects something but does not display telemetry data.

    I made a test using direct addressing:

    ArduCopter-quad -| socat - UDP:192.168.1.7:14550
    => doesn’t work (APM Planner detects something but does not display telemetry data)

    ArduCopter-quad -A udp:192.168.1.7:14550
    => works

    192.168.1.7 = ground station
    14550 = udp port of apm planner2

    • luke permalink

      Did you find a solution to this? I have the same problem…

  11. Francois permalink

    Hi, I’m trying to figure out how to make the basics work and I fail… here is my test:

    I have only one Raspberry Pi. I’m trying to watch the demo video with the OSD over it.

    ./hello_video.bin < test.h264
    here the video starts fullscreen

    And with another terminal (blind because the video is playing)
    ./frsky_omx_osd . < testlog.frsky
    assertion failure:graphics.c:269:gx_priv_restore():egl_result
    Aborted

    I think there is something I don't understand…

    • Does the OSD run without the video? Do you start everything from the console or from X?

      • Francois permalink

        Yes, the OSD runs well without the video. I start everything directly from the console, directly on the RPI via HDMI.

      • Francois permalink

        I found the problem. My version of Raspbian is Jessie and it cannot manage this well. With wheezy it works well.

      • Francois permalink

        EDIT: Even worse… It only works on the old Raspbian that supports the first-generation RPI, with the menu at the bottom of the screen. (My version: Linux 3.10.25+)

        But I just bought an RPI 2 for this project, that’s too bad.
        I like the fact that we can instantiate the video driver two times. But probably the Linux kernel programmers decided that it’s a bug…

  12. Samuel permalink

    Hi Befi,

    I’m currently working on a full graphics OSD that was inspired by your OSD project. You only use VGfont, which does not offer lines and other drawing primitives. I use openVG, which VGfont is actually based on. There you can draw everything you want.

    Here’s a sample of an AHI:

    I will add more features and then put the code on GitHub.

    Thanks for this project, there are a lot of improvements going on right now. Glad the open-source idea works very well here.

    Regards

    Samuel

    • Hi Samuel

      Cool, I would love to see a fully featured OSD! By the way, do you think it would be possible to encapsulate your calls to openVG? Like you are calling throughout your code a function called “my_draw_line” which in turn calls an openVG function? This would make it much simpler to port the code onto another platform like a PC.

      And you are right, I am also really excited about the open source movement that developed around this project. A lot of good ideas and much of my motivation comes from good feedback and contributions of the users!

      • Samuel permalink

        Hi,

        have a look at the openvg functions here:
        https://github.com/ajstarks/openvg

        It’s quite simple… For example, Line(x1, y1, x2, y2) draws a line, so it should be easy to replace that later with other drawing libraries.

        New video:

      • Indeed, a very, very simple API. Nice library, I should have used that for my OSD too. Do you have your code somewhere online so that we can follow (and contribute to) its development?

  13. Samuel permalink

    It’s on my other PC, I will upload it tomorrow. You will find it here: https://github.com/SamuelBrucksch/wifibroadcast_osd

    • Samuel permalink

      The first demo is up. I’m still new to C, so be nice ;). My main programming language is Java.

    • fantasiiio permalink

      Hello, I’m also working on a similar project. When I saw this library (openvg) I did not like it because it’s simply a wrapper of the real openVG in the Go language. But thanks for sharing your project, I’ll follow your development and maybe contribute.

      • Samuel permalink

        As far as I understand, this library has a wrapper for the Go language, but the library itself is native C. So no wrapper at all if you use C. This library directly uses the EGL functions.

      • Francois permalink

        I see. In that case, you don’t need the ajstarks/openvg library. The “VG/openvg.h” include file is already included on the Raspberry Pi.

    • Francois permalink

      These days, I’m working on the communication protocol. I’m using a port of the LightTelemetryProtocol from GhettoStation: https://github.com/KipK/Ghettostation/tree/master/GhettoProxy

      I can read mavlink data and convert it to LTM directly on the RPI (without an Arduino). So everyone will be able to use the OSD: ArduPilot/Pixhawk or CC3D/Naze32, and probably FrSky (without attitude data).

      • Samuel permalink

        FrSky can transfer attitude as well, over the acc sensor ID.

      • Francois permalink

        That’s a good thing. I think we can start a new thread on another forum?

  14. Samuel permalink

    Yes, I could use that as well, but the openVG lib from ajstarks is easier to use. So I prefer that one.

  15. Samuel permalink

    A working graphical OSD is on GitHub:
    https://github.com/SamuelBrucksch/wifibroadcast_osd

    It still needs some optimisation and so on, but it works and can be used for testing.

    • nobie permalink

      Hi Samuel… I have done some tests with mavlink and a modified openvg:

      …it should be possible to merge the mavlink parsing and openvg render stuff into your project!?

    • Samuel permalink

      Hi, I already saw that OSD on YouTube yesterday, good job. I will have a look at your code. Your OSD looks much more professional than mine, so I should merge my code into your OSD instead 😉

      • nobie permalink

        plz with mavlink support ;O) nice project

    • Samuel permalink

      Ok, looks like we followed the same structure from befi 😉 should be easy to merge then. BTW, td.osd_fix_type=”no fix”: I wouldn’t do stuff like that; make it an unsigned int, fix is 0, 2D (2), 3D (3) and so on… You could just use numbers, then you don’t have to compare text later.

      • Samuel permalink

        Oh ok, that is just the text for the osd… Still investigating your code 😉

    • fantasiiio permalink

      Good job guys.

  16. Alexandre Licinio permalink

    Hi,
    I want to send both audio and video but I’m stuck and I have a lot of packet loss using parallel streams as described.
    I have a video stream at 4Mbit/s and an audio stream at 1.5Mbit/s (both on the tx side).

    I tried many things on the tx side with all the parameters but I receive massively corrupted frames.
    So I don’t really understand the -m parameter, and do I have to specify the -f parameter like we did before?

    Parallel streams work and the FIFOs fill up, but there is too much packet loss. I mean a lot: I can’t decode video properly, nor audio. With the standard technique video and audio work great separately.

    #!/bin/bash
    sudo killall ifplugd
    sudo ifconfig wlan0 down
    sudo iw dev wlan0 set monitor otherbss fcsfail
    sudo ifconfig wlan0 up
    sudo iwconfig wlan0 channel 6
    sleep 1
    sudo ./tx -p 1 -b 4 -r 4 -m 64 -s 2 wlan0 &

    if anyone could help me 🙂

    thanks a lot,
    cheers

    • Hi Alexandre

      The purpose of the -m parameter is to avoid many small packets. Let’s assume your data source delivers data in chunks of 4 bytes. Without the -m parameter each chunk would be bundled into its own wifi packet. This is very inefficient, since the 4 bytes will be prepended with headers and so on that are more than 10x the size of your payload. If you give the -m parameter then tx waits until at least -m bytes have arrived from the source and then sends out a packet. So depending on your data source -m might help. And I think audio is a good candidate where the chunks would be small.

      Does audio alone work fine?

      • Alexandre Licinio permalink

        Hi Bennet, thanks for your answer.

        With default branch of wifibroadcast :

        Yes, audio alone works quite fine but not as fast as I want. Without wifibroadcast, using netcat for transmitting, I reach a low latency with or without opus.
        With wifibroadcast, I can reach a low latency without opus but not when using it.
        Maybe I’m wrong with the packet size. Another problem is that (with opus) I have to start the receiver before the tx side, otherwise I can’t decode the audio and aplay won’t start.
        How do you properly calculate -f?

        netcat (no compression) : latency less than 10ms, bitrate ~1.9Mbit/s
        no latency drift
        tx side >> sudo arecord -D plughw:1 -B16 -Vstereo -fdat -traw | nc 192.168.108.209 7001
        rx side >> nc -k -l 7001 | sudo aplay -B2000 -v -Dplughw:1 -fdat -Vstereo

        netcat (with opus) : latency less than ~100ms, bitrate ~150Kbit/s
        no latency drift
        tx side >> sudo arecord -D plughw:1 -B16 -Vstereo -fdat -traw | opusenc --bitrate 96 --max-delay 0 --comp 2 --framesize 10 --hard-cbr --ignorelength --raw --raw-rate 48000 --comment tag=intercom - - | nc 192.168.108.209 7001
        rx side >> nc -k -l 7001 | opusdec - --rate 48000 - | sudo aplay -B2000 -v -Dplughw:1 -fdat -Vstereo

        wifibroadcast (no compression) : latency less than 10ms
        no latency drift, I can start whichever side I want first
        tx side >>
        sudo killall ifplugd
        sudo ifconfig wlan0 down
        sudo iw dev wlan0 set monitor otherbss fcsfail
        sudo ifconfig wlan0 up
        sudo iwconfig wlan0 channel 6
        sleep 1
        sudo arecord -D plughw:1 -B16 -Vstereo -fdat -traw | sudo /home/ekla/c50/wifibroadcast/tx -b 8 -r 4 -m 64 wlan0

        rx side >>
        sudo rfkill unblock wifi; sudo rfkill unblock all
        sudo killall ifplugd
        sudo ifconfig wlan2 down
        sudo iw dev wlan2 set monitor otherbss fcsfail
        sudo ifconfig wlan2 up
        sudo iwconfig wlan2 channel 6
        sleep 1
        sudo ./rx -b 8 -r 4 wlan2 | sudo aplay -B2000 -v -Dplughw:1 -fdat -Vstereo

        wifibroadcast (opus) : latency less than half a second, why????
        no latency drift, I have to start rx before tx, otherwise it doesn’t work
        tx side >>
        sudo killall ifplugd
        sudo ifconfig wlan0 down
        sudo iw dev wlan0 set monitor otherbss fcsfail
        sudo ifconfig wlan0 up
        sudo iwconfig wlan0 channel 6
        sleep 1
        sudo arecord -D plughw:1 -B16 -Vstereo -fdat -traw | opusenc --bitrate 96 --max-delay 0 --comp 2 --framesize 10 --hard-cbr --ignorelength --raw --raw-rate 48000 --comment tag=intercom - - | sudo /home/ekla/c50/wifibroadcast/tx -b 2 -r 4 -m 10 wlan0

        rx side >>
        sudo rfkill unblock wifi; sudo rfkill unblock all
        sudo killall ifplugd
        sudo ifconfig wlan2 down
        sudo iw dev wlan2 set monitor otherbss fcsfail
        sudo ifconfig wlan2 up
        sudo iwconfig wlan2 channel 6
        sleep 1
        sudo ./rx -b 2 -r 4 wlan2 | opusdec - --rate 48000 - | sudo aplay -B2000 -v -Dplughw:1 -fdat -Vstereo

        So first, for audio I don’t understand why the latency is so much higher when using opus+wifibroadcast. I tried a lot of different configurations for the -b, -r, -f and even the -m parameter, but I never reached the same low latency as I did without wifibroadcast.

        For video, we use these parameters:
        -b 8 -r 4 -f 1318 for both the tx and rx side. The output bitrate of the camera is 7Mbit/s.
        It works perfectly.
        We have to use a fifo between our program and wifibroadcast.

        With the fifo branch of wifibroadcast: sudo ./tx -p 1 -b 4 -r 4 -m 64 -s 2 wlan0 &
        When I put things together, what are the best parameters I should give? Because now it is a big mess, I mean I receive almost only errors, packet loss, sometimes a few pixels for video and noise for audio. Is it because we use a fifo before your fifo in /tmp?
        Maybe it is because my parameters are not set correctly when I want to put things together?

        Thanks a lot,
        cheers

      • I introduced the FIFO interface to wifibroadcast because running two instances of TX caused a bit too much load on my RPI A+. If you do not have that problem then you should start two instances, each with different parameters.

        The latency of Opus is easy to explain: Wifibroadcast only starts to send data when a block is full. Assuming -f 1450 (the default value), filling even a single packet already takes about 180ms at a data rate of 64kbps, and with -b 8 a full block takes even longer. So for audio you might want to reduce -b and also -f. I don’t know enough about Opus but I think it will also group its data into “frames”. Setting -f close to the frame size is a good start for achieving low latency.
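
        A minimal sketch of what I mean for the audio stream (the numbers are only illustrative assumptions, not tested values; rx needs the same -b/-r/-f settings):

        sudo arecord -D plughw:1 -B16 -Vstereo -fdat -traw | opusenc --bitrate 96 --max-delay 0 --comp 2 --framesize 10 --hard-cbr --ignorelength --raw --raw-rate 48000 - - | sudo ./tx -p 4 -b 1 -r 2 -f 300 wlan0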

  17. Alexandre Licinio permalink

    Hi and thanks again for your answer.
    I tried again with parallel streams on the “default” branch but it won’t work correctly.
    I tried many, many different things, but what I notice is that the latency of the video I receive grows when I stream both audio and video. I decreased both bitrates but the issue stays the same.

    Thanks,
    Cheers,

    • Mh, ok. To be honest, the multi-channel feature of wifibroadcast has not been tested as much as a single video transmission. The transmission of telemetry data worked and did not disturb the video stream. However, that is a rather low bandwidth stream. And if something is missing in the telemetry, it is likely to be overlooked.

      One suggestion to narrow down the problem: Receive the two streams using two separate receivers. AKA two computers each with its own dongle. This way you can make sure that there is no interference on the RX computer if two rx processes are running.

      If the error still persists, then you have to take a closer look at the TX.

      • Alexandre Licinio permalink

        Thanks Bennet, it works with a separate adapter for each rx. But it only works with a low bitrate for the second stream (audio) and not with straight alsa > wifibroadcast.
        Also, with opus, I have a bigger latency with wifibroadcast than with netcat and sometimes the latency grows.
        When the bitrate of the second stream is too high, the video lags and the latency becomes huge.

        TX
        sudo arecord -D plughw:1 -B16 -Vstereo -fdat -traw | opusenc --bitrate 96 --max-delay 0 --comp 2 --framesize 10 --hard-cbr --ignorelength --raw --raw-rate 48000 --comment tag=intercom - - | sudo ./tx -p 4 -b 2 -r 2 -f 300 wlan0

        RX
        sudo ./rx -p 4 -b 2 -r 2 -f 300 wlan6 | opusdec - --rate 48000 - | sudo aplay -B16 -v -Dplughw:1 -fdat -Vstereo

  18. Ben permalink

    First of all, many thanks to befinitiv, but also to Constantin (MyMediaCodecFPVPlayer) and Samuel (wifibroadcast_osd).

    For info: I’ve got it running with
    TX= RPi b+ / 722
    RX= Ubuntu /722 on screen or Ubuntu /722 plus Oneplus one hotspot and (MyMediaCodecFPVPlayer)

    and a double bi-quad 2.4Ghz antenna !

    I’m still working on getting the OSD data from my Naze32 rev5, but +1 to all of you!!

  19. nathaniel akkermans permalink

    Hi Samuel, I tested your OSD and it looks really nice, but one question: everything is working except the LAT and LON. I run a NAZA with a Naza-to-mavlink Arduino converter; it shows the number of sats and the speed and altitude. Also the horizon is working with the NAZA gimbal output, works really nice. But the one thing I want the most is the GPS location, so in case my drone crashes I can find the last known location. But it only shows 0.0000. What could this be?

  20. Portakal permalink

    Hello Befinitiv,

    Thank you for a really amazing project. I would like to ask a question. I’m using the prebuilt images and I guess telemetry is disabled. I tried writing the lines from the TX part into tx.sh on the TX Raspberry. However, it doesn’t work. Could you please be more specific about which lines to write into tx.sh?

    Thank you.

    P.S. I am using a Pixhawk; the FrSky telemetry protocol is enabled.

  21. Reed Noel permalink

    Dear all,
    My FC is a Pixhawk, using an ESP8266 for telemetry and no OSD.
    Please tell me whether EZ Wifibroadcast can provide telemetry and OSD without external telemetry/OSD hardware?
    If yes, please help me with wiring the Pixhawk to EZ Wifibroadcast.
    Rgs
    Reed

