[FFmpeg] Introduce FFmpeg-Rockchip for hyper fast video transcoding via CLI

Hi all, I’ve been playing with ffmpeg and gst-launch for the past few days, and I’m now fairly sure that my test case of using the Radxa 4K camera to stream 4K to a WHIP endpoint is currently not possible (plugins are missing on both sides, and I have no idea how to merge the RK hardware libraries with the available WHIP repos and vice versa). The best result I currently get is with this command:

gst-launch-1.0 v4l2src num-buffers=512 device=/dev/video11 io-mode=4 ! videoconvert ! video/x-raw,format=NV12,width=1920,height=1080,framerate=30/1 ! tee name=t ! queue ! mpph264enc ! queue ! h264parse ! mux. flvmux streamable=true name=mux ! rtmpsink location=rtmp://

When I try with ffmpeg it also works, but the encoding quality is much worse:

ffmpeg -f v4l2 -video_size 1920x1080 -i /dev/video11 -c:v h264_rkmpp -b:v 6M -maxrate 6M \
-bufsize 1M -profile:v high -g:v 120 -f flv rtmp://

Also, it seems the sensor is not centered: when I point the camera at myself, it’s as if I only see the top-left (or right) quarter of the sensor…

Any hints? Thank you in advance.

You didn’t limit the bitrate in gstreamer, but you manually limited it in ffmpeg and set an oddly small bufsize. Video quality is positively correlated with bitrate, and 6000 kbps is too low for 4K video. You should read up on video rate control, such as CQP/CBR/VBR, and how to set it appropriately.
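For illustration, a CBR-style invocation might look like the sketch below. The rate-control option names assume the ffmpeg-rockchip h264_rkmpp encoder; the bitrate figures and server URL are placeholders, not recommendations:

```shell
# Hedged sketch: CBR rate control for 4K30 with the rkmpp encoder.
# Some builds take a numeric rc_mode instead (e.g. -rc_mode 1);
# check `ffmpeg -h encoder=h264_rkmpp` for what your build supports.
ffmpeg -f v4l2 -framerate 30 -video_size 3840x2160 -i /dev/video11 \
  -c:v h264_rkmpp -rc_mode CBR -b:v 20M -maxrate 20M -bufsize 40M \
  -f flv "rtmp://<server>/<app>/<stream-key>"
```

For CBR, maxrate is conventionally set equal to the target bitrate, with a bufsize of one to two seconds’ worth of data.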


Hi there, thank you for answering. I know that; it was just one sample of the ~25 commands I tried. My main problem at the moment is not the quality, I was more focused on getting WHIP working somehow.

You missed this old post:

I think it is the kernel version again; some clues here:


Ohhhh… How didn’t I find your post here… I was searching and searching, asking AI which had absolutely no clue (of course, but I tried it anyways) :smiley: I’ll give it a shot and report back! Thank you!!

Hi there, thank you, it’s working as a server. Although I was looking to stream to an existing server, I’ll make it work somehow; I don’t think that function is implemented. I’ll have a look at the sensor topic this evening.

My testing here was successful, with one issue: ffmpeg ignores CBR.

Feeding 4K to an RTMP server looks like this:

ffmpeg -re -f v4l2 -pixel_format nv12 -framerate 30 -video_size 3840x2160 -i /dev/video11 -c:v h264_rkmpp -b:v 4M -rc_mode 1 -g 6 -f flv rtmp://
ffmpeg version 6.1 Copyright © 2000-2023 the FFmpeg developers
built with gcc 11 (Ubuntu 11.4.0-1ubuntu1~22.04)
configuration: --prefix=/usr --disable-libopenh264 --disable-vaapi --disable-vdpau --disable-decoder=h264_v4l2m2m --disable-decoder=vp8_v4l2m2m --disable-decoder=mpeg2_v4l2m2m --disable-decoder=mpeg4_v4l2m2m --disable-libxvid --disable-libx264 --disable-libx265 --enable-rkmpp --enable-nonfree --enable-gpl --enable-version3 --enable-libmp3lame --enable-libpulse --enable-libv4l2 --enable-libdrm --enable-libxml2 --enable-librtmp --enable-libfreetype --enable-openssl --enable-opengl --enable-libopus --enable-libvorbis --disable-shared --enable-decoder='aac,ac3,flac' --extra-cflags=-I/usr/src/linux-headers-5.10.110-rk3588-v4l2/include --disable-cuvid --disable-stripping --disable-optimizations --extra-cflags=-Og --extra-cflags=-fno-omit-frame-pointer --enable-debug=3 --extra-cflags=-fno-inline
libavutil 58. 29.100 / 58. 29.100
libavcodec 60. 31.102 / 60. 31.102
libavformat 60. 16.100 / 60. 16.100
libavdevice 60. 3.100 / 60. 3.100
libavfilter 9. 12.100 / 9. 12.100
libswscale 7. 5.100 / 7. 5.100
libswresample 4. 12.100 / 4. 12.100
libpostproc 57. 3.100 / 57. 3.100
[video4linux2,v4l2 @ 0x55d4359400] ioctl(VIDIOC_G_INPUT): Inappropriate ioctl for device
[video4linux2,v4l2 @ 0x55d4359400] ioctl(VIDIOC_G_PARM): Inappropriate ioctl for device
[video4linux2,v4l2 @ 0x55d4359400] Time per frame unknown
[video4linux2,v4l2 @ 0x55d4359400] Stream #0: not enough frames to estimate rate; consider increasing probesize
Input #0, video4linux2,v4l2, from '/dev/video11':
Duration: N/A, start: 3035.689535, bitrate: N/A
Stream #0:0: Video: rawvideo (NV12 / 0x3231564E), nv12, 3840x2160, 1000k tbr, 1000k tbn
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (h264_rkmpp))
Press [q] to stop, [?] for help
Output #0, flv, to 'rtmp://':
encoder : Lavf60.16.100
Stream #0:0: Video: h264 (High) ([7][0][0][0] / 0x0007), nv12(progressive), 3840x2160, q=2-31, 4000 kb/s, 1000k fps, 1k tbn
encoder : Lavc60.31.102 h264_rkmpp
frame=   16 fps=0.0 q=-0.0 size=     427kB time=00:00:00.50 bitrate=6989.0kbits/s
[… repeated progress updates trimmed …]
frame=  608 fps= 30 q=-0.0 size=    2353kB time=00:00:20.23 bitrate= 952.9kbits/s speed=0.997x

[q] command received. Exiting.

[flv @ 0x55d435b630] Failed to update header with correct duration.
[flv @ 0x55d435b630] Failed to update header with correct filesize.
[out#0/flv @ 0x55d435ab10] video:2360kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.526353%
frame=  618 fps= 30 q=-0.0 Lsize=    2373kB time=00:00:20.56 bitrate= 945.1kbits/s speed=0.997x

Just a small correction: I tested with an old mpp and 4 GB of RAM, and the bitrate worked fine, so it’s either an mpp issue or a >4 GB RAM issue.


Hi there, today I wanted to continue getting the whole sensor working, but without doing anything I got the full image on the first try… I confirmed it by slowly moving a pointy object into the sensor’s view… I don’t know what happened.

Sounds like you get the full frame now, and not the cropped frame. Maybe you updated the kernel or the camera engine?

By the way, in my case I found that by using libav in C to encode, or gstreamer to encode, I can change the bitrate with some granularity, on my setup with 16 GB RAM and the latest mpp.

Using ffmpeg (CLI) I need to increase the bps value 10 times: for example, if I want 2 Mbps I need to pass 20000000 on the command line.
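In other words, the value passed is the target bitrate times ten. A tiny sketch (the factor of 10 is this poster’s empirical observation on their setup, not documented behavior):

```shell
# Empirical 10x workaround: to request 2 Mbps, pass 2000000 * 10.
want_bps=2000000
pass_bps=$((want_bps * 10))
echo "$pass_bps"    # prints 20000000
# then e.g.: ffmpeg ... -c:v h264_rkmpp -b:v "$pass_bps" ...
```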

The next goal is to use libav to encode, feed, and display the result in a web browser, something like videostreamer does with H.264.

I played with videostreamer some years ago, and with some changes I could even display H.265 in Microsoft Edge.

But videostreamer is mostly written in the Go language, with a small part using ffmpeg.

Anyway, I have my ff-rknn (POC) that encodes the result and records it to a file.

I can get 25 FPS with the “performance” governor, which is sub-optimal and needs a lot of optimization to be useful.

There is an issue with the frame rate: since I get 25 fps on average (get the frame, apply the NPU, display the result on screen, encode, and save), I need to record at 25 fps, not 30 fps, to stay in sync; I don’t know why.

For example, if I record for about 30.5 seconds, the resulting file is ~32 seconds long.

gst-discoverer-1.0 test4.h264
Analyzing file:///apps/rockchip/ffmpeg/ff-rknn-new-sdl2/test4.h264
Done discovering file:///apps/rockchip/ffmpeg/ff-rknn-new-sdl2/test4.h264

Duration: 0:00:32.705865149
Seekable: yes
Live: no
video #0: H.264 (High Profile)
Stream ID: 5f1940d29bcb7361d31d7d489d0fb9eaa6cbc9b8c1e2d1f829c4ca1c3fec2bfb
Width: 1920
Height: 1080
Depth: 24
Frame rate: 25/1
Pixel aspect ratio: 1/1
Interlaced: false
Bitrate: 0
Max bitrate: 0

I use mediamtx to make the stream visible in a web browser. It works well, though not perfectly; it’s much better than VLC etc.

Regarding the sensor, it’s a bit complicated. It seems that as soon as I use Full HD once, I can’t go back. For example, full-sensor 4K works; then I run this command:

gst-launch-1.0 v4l2src device=/dev/video11 io-mode=dmabuf ! video/x-raw,format=NV12,width=1920,height=1080,framerate=30/1 ! mpph264enc ! rtph264pay name=pay0 pt=96 ! rtmpsink location=rtmp://

and after that the picture is cropped from then on. The first command, which worked before, is then cropped too…
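If the driver really is latching a crop rectangle, it may be visible through the V4L2 selection API. A hedged sketch for inspecting and resetting it (the device path is this thread’s; whether this particular camera pipeline honors --set-selection is an assumption):

```shell
# Inspect the current crop rectangle on the capture node.
v4l2-ctl -d /dev/video11 --get-selection target=crop

# Attempt to restore the full-sensor crop (sizes assumed for this sensor;
# the driver may clamp or reject this depending on the media pipeline).
v4l2-ctl -d /dev/video11 --set-selection target=crop,top=0,left=0,width=3840,height=2160
```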

It is also built with the Go language.

I don’t have a 4K monitor here (only 1920x1080) to reproduce this issue. I will check it later.

I don’t think it’s the monitor. I don’t even have a monitor connected to my 4k-rock :smiley: It streams via WLAN to my mediamtx, and I try to watch the stream in my browser.

And so far I can either stream a 1080p cropped image, or view the 4K stream via the rtsp-server and VLC you suggested. But I wasn’t able to get either full-sensor 1080p or any 4K WHIP working.

Sorry, I did not mean that the monitor is the problem; it’s just for me to check the image… :wink:


Thank you for your help!!

Edit: and for the record, when I try to ingest the rtsp-server stream, I get the following:

[WebRTC] [session 5056f211] closed: the stream doesn’t contain any supported codec, which are currently AV1, VP9, VP8, H264, Opus, G722, G711

Then I tried using VP8 and I get similar errors:

[WebRTC] [session 63209a5b] peer connection established, local candidate: host/udp/, remote candidate: prflx/udp/
[WebRTC] [session 63209a5b] is reading from path ‘4k’, 1 track (VP8) 2
[path 4k] packets containing single partitions are not supported

When I try AV1, it kind of works, but the picture is just a colorful mush; nothing is even remotely recognizable as an image, as if it were random data.

Any chance this version of ffmpeg could be used to utilize the HDMI input and feed that to the rk_mpp h264 encoders?



ffmpeg -re -f v4l2 -i /dev/video0 -c:v h264_rkmpp -qp_init 25 /path/to/output.mp4

Had a try and unfortunately came up with no luck.

I installed all dependencies (MPP & RGA) and tried multiple commands, and got “Cannot open video device /dev/video0: No such device”, which is better than the “not a capture device” I got in the past. I ran v4l2-ctl --list-devices, and my rk_hdmirx shows up as hdmirx-controller at /dev/video20, if that’s relevant. No matter what, whenever /dev/video0 is called it always reports “/dev/video0: No such device”.

I went to https://trac.ffmpeg.org/wiki/CompilationGuide/Ubuntu, installed all libraries and dependencies, and also tried the example at the bottom of the page as you mentioned, with the same result (no such device). When I try Orange Pi’s HDMI-input test using gstreamer, it works perfectly with test_hdmiin.sh; it would be interesting to see how that works to begin with.
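Since the HDMI receiver registers as /dev/video20 rather than /dev/video0, pointing the tools at the node reported by v4l2-ctl may be all that is needed. A sketch (the device path is the one reported earlier in this thread):

```shell
# List V4L2 devices to find the HDMI RX node (rk_hdmirx / hdmirx-controller).
v4l2-ctl --list-devices

# Check which pixel formats, sizes, and frame rates that node actually offers.
v4l2-ctl -d /dev/video20 --list-formats-ext
```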

Is there a specific installation method I’m not doing correctly? I must obviously be overlooking something. All this seems a bit overkill just to get HDMI input without terrible latency.

This person here has the same issue:

Okay, after a week of trial and error I finally got it to work, smh lol.

I used your commented example as

ffmpeg -re -f v4l2 -i /dev/video20 -c:v hevc_rkmpp -qp_init 25 /root/Desktop/test.mp4

I got some errors after execution; it makes a file on my desktop, but how can I make it display the live video on my Pi in fullscreen mode? VLC? gstreamer? And how can I change the video to 120 fps?
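One low-effort option for a fullscreen local preview is ffplay. A hedged sketch (the device path comes from this thread; 120 fps only works if the source actually advertises such a mode, which you can verify with v4l2-ctl --list-formats-ext):

```shell
# Fullscreen live preview of the HDMI capture node with ffplay.
# -framerate and -video_size must match a mode the device advertises.
ffplay -fs -f v4l2 -framerate 120 -video_size 1920x1080 /dev/video20
```

Note this path uses software decoding/display; for encoding at the same time you would still run a separate ffmpeg process with the rkmpp encoder.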

The MPP library provided by OrangePi is outdated, so ffmpeg fails to run.

See also https://github.com/nyanmisaka/ffmpeg-rockchip/wiki/Compilation


Make sure you have built and installed the latest MPP runtime and headers. Otherwise undefined behavior may occur, such as segmentation fault. Be careful if you use MPP deb packages from a vendor/BSP image, these packages are likely to be out of date.

RK3566 datasheet mentions this about H.265 encoding:

H.265/HEVC MP@level4.1, up to 1920x1080@100fps (4096x4096@10fps with TILE)

Is it possible to use H.265 encoding at 4096x4096 or lower? ( 3840x2160 ). How is “TILE” activated?