
FPV in 4K

April 29, 2019

This post describes how to achieve 4K resolution with a wifibroadcast-based FPV system using H265 encoding. The improved encoding can also be used to get better image quality or reduce bitrate at lower resolutions.

Note: This writeup will not give you a complete image that you can burn onto your SD card. It shows a way that makes 4K FPV possible. Also, due to the image sensor used, the format is not the traditional 16:9 but 4:3.

Most of the readers of my blog will know the wifibroadcast system that I developed four years ago. Unfortunately, due to lack of time I was not able to continue working on wifibroadcast. But the community came to my rescue: many others took up the idea and created incredible images based on the original wifibroadcast. They extended it with much more functionality, broader hardware support, and so on. I was really happy to see what they made out of the project.

So as a small disclaimer: this post does not mean that I will pick development of the wifibroadcast project back up. It is just a small hint to the community about new potential for the wifibroadcast project. That potential of course follows the wifibroadcast philosophy: to use cheap, readily available parts and to make something cool out of them 🙂

The choice of single board computer for wifibroadcast

Why did I choose the Raspberry PI for wifibroadcast? Obviously due to its low price and popularity? Wrong. That was just an extra bonus in the wifibroadcast soup. The main motivation was that it had a well-supported video encoder on board. Video encoding is one of the key ingredients required for wifibroadcast. It is not important whether you use a MIPS or ARM architecture, a USB or Raspberry camera, board A or board B… the only things you really need are a video encoder and a USB 2.0 interface.

And in terms of video encoding, the Raspberry PI sets the bar rather high. The encoder is so well integrated that it just works out of the box. It has gstreamer support if you want to build your own video pipeline, and if not, the camera app directly delivers compressed video data. It cannot get any easier.

Since the original wifibroadcast I have always been on the lookout for another SBC with a well-supported video encoder, always hoping to lower the latency. Unfortunately, my latest find does not seem to deliver improvements in that area. Instead, it improves image resolution and coding efficiency by using H265 instead of H264 (the Raspberry PI does not support H265 and also cannot go beyond 1920×1080 resolution).

NVIDIA Jetson Nano

Some weeks ago, NVIDIA announced the Jetson Nano, a board targeted towards makers with a rather low price tag of $99. The following features caught my attention:

  • Raspberry PI camera connector
  • Well-supported H264 and H265 encoding (through gstreamer)

I could not resist and bought one of these boards. Spoiler: It does not seem to improve in terms of latency but definitely in terms of resolution and bit-rate.

Other nice features of this board are the improved processor (quad-core A57 compared to the RPI's A53), more RAM (4GB DDR4 vs 1GB LPDDR2), and a much more capable GPU (128 CUDA cores vs the RPI's VideoCore IV).

Camera modes

The Jetson Nano supports only the Raspberry camera V2. With this camera, it provides the following modes (the most useful ones for wifibroadcast):

  • 4K (3280 x 2464) at 21fps
  • FHD (1920 x 1080) at 30fps
  • HD (1280 x 720) at 120fps

In theory it should also support other frame rates, but I could not manage to change that setting. I did not try very hard though, so it is probably my fault.
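These modes also make clear why a hardware encoder is the key ingredient. A rough back-of-the-envelope calculation (my own numbers, not from any datasheet) of the raw NV12 bandwidth of the 4K mode versus the 8 Mbit/s wifibroadcast link:

```shell
# Raw NV12 bandwidth of the 4K mode vs. the 8 Mbit/s wifibroadcast link.
W=3280; H=2464; FPS=21
BPP=12                                  # NV12: 8 bit luma + subsampled chroma = 12 bit/pixel
RAW_BPS=$((W * H * BPP * FPS))          # raw bits per second
LINK_BPS=8000000                        # typical wifibroadcast bitrate
echo "raw:   $RAW_BPS bit/s"            # ~2.04 Gbit/s
echo "ratio: $((RAW_BPS / LINK_BPS))x"  # encoder must compress ~254x
```

So the encoder has to squeeze the sensor data by a factor of roughly 250 in real time, which is far beyond what the CPU alone could do on a board of this class.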

Latency measurements

I did some very quick latency measurements. The setup I used was my laptop running gstreamer, with a wired Ethernet connection between laptop and Jetson. The reason for a wired connection instead of wifibroadcast is that my flat is heavily polluted on all wifi bands. Had I used a wifibroadcast transmission, the encoding latency and the transmission latency in that polluted environment would have mixed together. An Ethernet connection was a simple way to isolate the encoding latency, which was the only thing I was interested in.

The measurement setup was rather crude: I used the TestUFO page and took a picture of the setup to measure the time difference, as shown here (right: “live view”, left: “transmitted view”):

This setup of course introduced a couple of extra variables: the screen update and decoding latency of my laptop, and the update speed of my smartphone displaying the TestUFO page. Not quite ideal… but good enough for a first impression.

With this setup, I determined the following latencies (only one measurement point per setting):


  • 4K 21fps: 210ms
  • FHD 30fps: 150ms
  • HD 120fps: 140ms


  • 4K 21fps: 170ms
  • FHD 30fps: 160ms
  • HD 120fps: 160ms
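To put these numbers in perspective, they can be converted into frame periods (latency in frames = latency_ms × fps / 1000). A small sketch, using the figures from the first list above:

```shell
# Convert measured latency (ms) into frame periods at the given fps.
# Figures taken from the first list of measurements above.
for entry in "4K 21 210" "FHD 30 150" "HD 120 140"; do
  set -- $entry    # $1=mode $2=fps $3=latency_ms
  awk -v mode="$1" -v fps="$2" -v ms="$3" \
    'BEGIN { printf "%s: %d ms = %.1f frames at %d fps\n", mode, ms, ms * fps / 1000, fps }'
done
```

The HD 120fps mode has the lowest absolute latency but spans by far the most frame periods (~17), which suggests most of the delay sits in the pipeline rather than in the frame time itself.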

Interpretation of latencies

One thing to note is that the CPU of my laptop was smoking quite a bit while decoding H265 or the high frame rates. So in the case of 4K H265 I would definitely expect improvements from a proper decoding device (like a smartphone or even a second “decoding Jetson”).

Otherwise, I would say that the latencies are definitely usable for FPV scenarios. I think it would be worth investing more work into the Jetson Nano for wifibroadcast.

Command lines

Receiver: gst-launch-1.0 -v tcpserversrc host= port=5000 ! h265parse ! avdec_h265 ! xvimagesink sync=false

Transmitter: gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=3280, height=2464, framerate=21/1, format=NV12' ! omxh265enc bitrate=8000000 iframeinterval=40 ! video/x-h265,stream-format=byte-stream ! tcpclientsink host= port=5000

Some remarks: to use H264, replace all occurrences of h265 in the command lines with h264. To change the resolution, adapt the width and height parameters of the transmitter command. Note that lowering the resolution results in a cropped image, i.e. a smaller field of view.
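Those substitutions could be wrapped in a small helper script. The following is just a sketch of my own (the script name and argument layout are hypothetical, not part of the original commands): it only prints the transmitter pipeline for a chosen codec, resolution and frame rate, leaving the receiver address as a placeholder.

```shell
#!/bin/sh
# Hypothetical helper: build the transmitter pipeline instead of editing
# it by hand. Dry run only -- it prints the command rather than running it.
CODEC=${1:-h265}                 # h264 or h265
WIDTH=${2:-3280}
HEIGHT=${3:-2464}
FPS=${4:-21}
HOST=${5:-"<receiver-ip>"}       # placeholder, fill in your receiver

echo "gst-launch-1.0 nvarguscamerasrc !" \
  "'video/x-raw(memory:NVMM),width=$WIDTH,height=$HEIGHT,framerate=$FPS/1,format=NV12' !" \
  "omx${CODEC}enc bitrate=8000000 iframeinterval=40 !" \
  "video/x-$CODEC,stream-format=byte-stream !" \
  "tcpclientsink host=$HOST port=5000"
```

For example, `sh build_tx.sh h264 1920 1080 30` would print the FHD H264 variant; pipe the output to sh on the Jetson to actually start it.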

Quality comparison

One important question is of course: What do you gain from 4K+H265? I extracted some stills out of the video streams to compare things:

The difference in quality becomes quite clear when you switch back and forth between H264 and H265 (note: only 256 colors due to the GIF format):

The bad results from H264 are expected: even at high bitrates the codec was never really intended for 4K resolution. And wifibroadcast usually runs at lower-than-usual bitrates (all recordings in this post use the wifibroadcast-typical 8 Mbit/s).
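One way to see why this hurts at 4K (my own rough arithmetic, not from any codec specification): at a fixed 8 Mbit/s, the bit budget available per pixel shrinks sharply with resolution, which is exactly where H265's higher coding efficiency pays off.

```shell
# Bit budget per pixel at the wifibroadcast-typical 8 Mbit/s.
BITRATE=8000000
for entry in "4K 3280 2464 21" "FHD 1920 1080 30"; do
  set -- $entry    # $1=mode $2=width $3=height $4=fps
  awk -v mode="$1" -v w="$2" -v h="$3" -v fps="$4" -v br="$BITRATE" \
    'BEGIN { printf "%s: %.3f bit/pixel\n", mode, br / (w * h * fps) }'
done
```

The 4K stream gets roughly a third of the per-pixel budget of the FHD stream, so the encoder has to work much harder for the same subjective quality.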

Sample recordings

Here you can find some raw sample recordings. They have been created with the following command line (resolution and codec changed accordingly):

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=3280, height=2464, framerate=21/1, format=NV12' ! omxh265enc bitrate=8000000 iframeinterval=40 ! video/x-h265,stream-format=byte-stream ! filesink location=/tmp/4k.h265

(Please excuse the use of such shady file hosting site… )


In summary I can say: yes, the Jetson Nano is a useful SBC for a wifibroadcast transmitter. It is also very likely a good candidate for a wifibroadcast receiver: plenty of processor power plus a (compared to the Raspberry) capable GPU to draw even the fanciest OSD.

The weight and size might be a bit too high for certain applications. But there are options here as well. The heat sink (the heaviest part of the system) could easily be replaced with something lighter (since you have lots of airflow on a drone). Also, the Nano is in fact a system-on-module: the whole system is contained on a small plug-in PCB, and the mainboard is mostly just a breakout board for the connectors. A custom board or even an artful bundle of soldered wires might turn the Nano into a very compact, very powerful system.

Also, 4K resolution plus H265 seems to improve wifibroadcast image quality quite a bit. Together with a suitable display device (a high-resolution smartphone, for example) this has the potential to boost the video quality to a completely new level.

I really hope that the hint from this post will be picked up by somebody in the community. My gut feeling says that there is definitely potential in this platform and that the crude latency measurements in this post can be improved quite a bit (by using a different receiving unit and/or parameter tuning).

If someone has something they would like to share, please add a link in the comments. I will then happily integrate it into this post.



  1. Thomas E Elfstrom permalink

    I will send you an email tomorrow

  2. Aleksander Simonsen permalink

    What are the best results you guys have gotten with a webcam/action camera? And could you send/link me the image and the commands to run?
