Rock 3A extremely low performance in GStreamer

Hi there. I’ve been trying to use hardware acceleration for video capture on the 3A with GStreamer and the Rockchip plugin, but it barely works at all. I’m using the legacy kernel. I’ve built OpenCV from source with GStreamer support and made sure the plugin is installed. I put together a pipeline based on examples I found online, but performance is terrible.

This is what my VideoCapture looks like:
cv2.VideoCapture("v4l2src device=/dev/video0 ! image/jpeg,format=MJPG,width=1280,height=720,framerate=30/1 ! mppjpegdec ! videoconvert ! video/x-raw,format=BGRx ! appsink drop=1", cv2.CAP_GSTREAMER)

It reads frames on the board at less than 2 FPS. Any thoughts on this?
Or ideas on how to accelerate reading raw frames?
Normal methods (i.e. without GStreamer) only manage 17-24 FPS, and I need 30 FPS for an HD image.
And yes, I’m sure my camera can put out a stable 30 FPS.

The problem is that MPP uses a buffer mapped without caching, so reading the decoded frames out of mppjpegdec is very slow. This is made worse because _backup_video_orc_unpack_NV12, the function that reads the pixel data, is (at least on my system) not compiled to use SIMD, so it reads only a single byte at a time.

Fixing this would require changing the plugin so that it can pass a dma-buf to the next plugin in the chain, as that can be imported into the GPU device where caching can be used and reads will be fast.

Another way would be to make the plugin use the RGA blitter to copy into CPU-cached memory.

Thanks for the answer. So this has basically no solution. Understood.

eh? It’s solvable, and probably not hard to do. I bet I can write a patch to fix it by this time tomorrow…

I think the problem is actually in your GStreamer pipeline.

If I put glimagesink after mppjpegdec in the pipeline, it easily handles 1280x720 at 30 fps in real time, using about 40% load on a single A55 CPU core (though note that I am using a Rock 5B with a different OpenGL driver). It will probably be even lower for you, because you are using a hardware device as the source rather than the file I am testing with.

So mppjpegdec correctly puts the output in a dma-buf, and the problem is that GStreamer is trying to map the dma-buf directly rather than importing it into an OpenGL context and using glReadPixels.

Perhaps that can be fixed by adding new filters in the GStreamer pipeline, perhaps it will require a new plugin, I’ll continue investigating.

Does a pipeline like this work any better? (I replaced videoconvert with glupload ! glcolorconvert ! gldownload):

v4l2src device=/dev/video0 ! image/jpeg,format=MJPG,width=1280,height=720,framerate=30/1 ! mppjpegdec ! glupload ! glcolorconvert ! gldownload ! video/x-raw,format=BGRx ! appsink drop=1

That will use OpenGL for converting from Y’CbCr to BGR, but more importantly the download to normal CPU memory is done by the OpenGL driver rather than directly through the uncached dma-buf.
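Since the original capture code is Python, the suggested pipeline could be wired into cv2.VideoCapture like this. This is just a sketch: the helper only assembles the pipeline string, with the device path, caps, and element names taken from the posts above.

```python
# Build the suggested GL-based GStreamer pipeline string for cv2.VideoCapture.
# Device path, caps and element names are copied from the thread above and
# may need adjusting for other setups.

def build_gl_pipeline(device="/dev/video0", width=1280, height=720, fps=30):
    """Return a pipeline that decodes MJPG with mppjpegdec and converts to
    BGRx via OpenGL (glupload ! glcolorconvert ! gldownload) instead of
    videoconvert, so the download from the dma-buf goes through the GL driver."""
    return (
        f"v4l2src device={device} ! "
        f"image/jpeg,format=MJPG,width={width},height={height},framerate={fps}/1 ! "
        "mppjpegdec ! glupload ! glcolorconvert ! gldownload ! "
        "video/x-raw,format=BGRx ! appsink drop=1"
    )

print(build_gl_pipeline())

# Usage (requires OpenCV built with GStreamer support):
# import cv2
# cap = cv2.VideoCapture(build_gl_pipeline(), cv2.CAP_GSTREAMER)
```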

Thanks a lot for your help! I will check this pipeline in a few days and report the results.

I’ve just tested the pipeline you posted.
On one hand it works. On the other hand it fluctuates a lot.

First it works terribly, at 3-10 FPS; then it suddenly jumps to 30 FPS for a few seconds, and then it’s back in that lower range.

What is the bottleneck? Is it the CPU? Userspace or the kernel? Use a profiling tool, either a generic one like perf or something GStreamer-specific, to find what is preventing it from running faster. Even just looking at the CPU usage could be helpful.

But if you don’t mind compiling GStreamer (or at least the gst-plugins-base subproject), applying this patch might help. It at least made things work better for me.

diff --git a/subprojects/gst-plugins-base/gst-libs/gst/gl/gstglmemorypbo.c b/subprojects/gst-plugins-base/gst-libs/gst/gl/gstglmemorypbo.c
index 13fdba3..5d629c0 100644
--- a/subprojects/gst-plugins-base/gst-libs/gst/gl/gstglmemorypbo.c
+++ b/subprojects/gst-plugins-base/gst-libs/gst/gl/gstglmemorypbo.c
@@ -187,6 +187,8 @@ _gl_mem_create (GstGLMemoryPBO * gl_mem, GError ** error)
   GstGLContext *context = gl_mem->mem.mem.context;
   GstGLBaseMemoryAllocatorClass *alloc_class;
 
+  return TRUE;
+
   alloc_class = GST_GL_BASE_MEMORY_ALLOCATOR_CLASS (parent_class);
   if (!alloc_class->create ((GstGLBaseMemory *) gl_mem, error))
     return FALSE;

Since last time I have had to reinstall the OS, and now I can’t install mpp at all.
So I can’t check anything anymore.

The following packages have unmet dependencies:
 gstreamer1.0-rockchip1 : Depends: librockchip-mpp1 but it is not installable
E: Unable to correct problems, you have held broken packages.

Btw, what would the pipeline be for CSI? I can’t get the OV5647 to change resolution and framerate.
Interfacing with the sensor itself works, but it always defaults to 2592x1944 @ 15 FPS.
I don’t think v4l2 is meant to work here.

Search for ov5647 on Rock 3; there you’ll find how to change resolution and framerate using v4l2-ctl.

That doesn’t work for me, since I need 720p@30 in OpenCV.
I don’t see anything about it on this forum.

v4l2-ctl does seem to change the parameters but it doesn’t matter since they reset after opening VideoCapture.

That’s why I provided this: Rock 3A Camera support

camera-mode = <0>;
or
camera-mode = <1>;

Edit:

Maybe you should also use:

vid_capture.set(CAP_PROP_FRAME_WIDTH, capture_width);
vid_capture.set(CAP_PROP_FRAME_HEIGHT, capture_height);
vid_capture.set(CAP_PROP_FPS, framerate);

I think the modern way (v4l2) is:

cv::VideoCapture cap;
cap.open(index, cv::CAP_V4L2);
if (!cap.isOpened())
{
    std::cerr << "***Could not initialize capturing...***" << std::endl;
    return -1;
}
cap.set(cv::CAP_PROP_FRAME_WIDTH, capture_width);
cap.set(cv::CAP_PROP_FRAME_HEIGHT, capture_height);
cap.set(cv::CAP_PROP_FPS, framerate);

This certainly is not a simple topic.
But I don’t know what I’m supposed to do with what you linked.

By default OpenCV opens the CSI camera with GStreamer.
It doesn’t open at all if I specify the V4L2 API (opening by index).
And if it does open (/dev/video-camera0), the set method returns False, so that doesn’t work either. Wish it were that simple.

I don’t have the board with me right now, but try this pipeline and see if you can get an output:

sudo gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,format=NV12,width=1280,height=720,framerate=30/1 ! xvimagesink

Note that you need to find the correct /dev/videoX node with v4l2-ctl.

No, no. You misunderstood me.

What you’ve just written is the most basic pipeline, which I have been trying to get to work for weeks.
But I don’t have the knowledge to go deeper than installing packages or editing device tree overlays according to instructions. And there are no packages or instructions for this.

What I meant by “not knowing what to do with what you linked” is that, ok, you’ve written some patch that let you change resolution and framerate through v4l2. I don’t understand what that patch is, what it’s for, or where to put it (if I even have to). There was also something about editing drivers… I can’t really do that because I don’t know where and what to change, or how to load the changes back.

Long story short - the pipeline has never worked.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.

I see.

I don’t know what skills you have. And I don’t know whether gstreamer works or not; it should. You need to tell the sensor which mode (resolution/framerate) to use prior to calling any v4l2 function.
You do that with this set of commands:

media-ctl -d /dev/media0 --set-v4l2 '"ov5647 5-0036":0[fmt:SGBRG10_1X10/2592x1944]'
media-ctl -d /dev/media0 --set-v4l2 '"rkisp-isp-subdev":0[fmt:SGBRG10_1X10/2592x1944]'
media-ctl -d /dev/media0 --set-v4l2 '"rkisp-isp-subdev":0[crop:(0,0)/2592x1944]'
media-ctl -d /dev/media0 --set-v4l2 '"rkisp-isp-subdev":2[crop:(0,0)/2592x1944]'
v4l2-ctl -d /dev/video0 --set-selection=target=crop,top=0,left=0,width=2592,height=1944

media-ctl -d /dev/media0 --set-v4l2 '"ov5647 5-0036":0[fmt:SGBRG10_1X10/1920x1080]'
media-ctl -d /dev/media0 --set-v4l2 '"rkisp-isp-subdev":0[fmt:SGBRG10_1X10/1920x1080]'
media-ctl -d /dev/media0 --set-v4l2 '"rkisp-isp-subdev":0[crop:(0,0)/1920x1080]'
media-ctl -d /dev/media0 --set-v4l2 '"rkisp-isp-subdev":2[crop:(0,0)/1920x1080]'
v4l2-ctl -d /dev/video0 --set-selection=target=crop,top=0,left=0,width=1920,height=1080

media-ctl -d /dev/media0 --set-v4l2 '"ov5647 5-0036":0[fmt:SGBRG10_1X10/640x480]'
media-ctl -d /dev/media0 --set-v4l2 '"rkisp-isp-subdev":0[fmt:SGBRG10_1X10/640x480]'
media-ctl -d /dev/media0 --set-v4l2 '"rkisp-isp-subdev":0[crop:(0,0)/640x480]'
media-ctl -d /dev/media0 --set-v4l2 '"rkisp-isp-subdev":2[crop:(0,0)/640x480]'
v4l2-ctl -d /dev/video0 --set-selection=target=crop,top=0,left=0,width=640,height=480

And then you call gstreamer for 640x480 or it will not work. The patch is supposed to free you from running this set of commands every time it resets. If you set it for 640x480 and call gstreamer with 1280x720, you get an error.
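If you end up rerunning that sequence often, it could be scripted from Python before opening the capture. A minimal sketch: the entity names ("ov5647 5-0036", "rkisp-isp-subdev") and device nodes are copied verbatim from the commands above and may differ on other boards or kernels.

```python
import subprocess

def sensor_mode_cmds(width, height, media="/dev/media0", video="/dev/video0"):
    """Return the media-ctl / v4l2-ctl command list that configures the
    OV5647 -> rkisp pipeline for the given resolution (mirrors the sequence
    posted above; entity names are board/kernel specific)."""
    fmt = f"SGBRG10_1X10/{width}x{height}"
    crop = f"(0,0)/{width}x{height}"
    return [
        ["media-ctl", "-d", media, "--set-v4l2", f'"ov5647 5-0036":0[fmt:{fmt}]'],
        ["media-ctl", "-d", media, "--set-v4l2", f'"rkisp-isp-subdev":0[fmt:{fmt}]'],
        ["media-ctl", "-d", media, "--set-v4l2", f'"rkisp-isp-subdev":0[crop:{crop}]'],
        ["media-ctl", "-d", media, "--set-v4l2", f'"rkisp-isp-subdev":2[crop:{crop}]'],
        ["v4l2-ctl", "-d", video,
         f"--set-selection=target=crop,top=0,left=0,width={width},height={height}"],
    ]

def set_sensor_mode(width, height):
    """Run the whole sequence, stopping on the first failing command."""
    for cmd in sensor_mode_cmds(width, height):
        subprocess.run(cmd, check=True)

# e.g. set_sensor_mode(640, 480) before opening cv2.VideoCapture
```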

If this is basic for you, ignore everything I have written.

Edit:
I am not sure the driver supports 1280x720; try 1920x1080 just to check what you get.

Ok. So this is the current state of things.

The pipeline with a resolution of 1280x720 opens with the framerate set at 15 FPS.
It just “corrects” the resolution to 1280x1080 from what I can see, which is not a big deal.
However, v4l2-ctl reported the sensor can do half its original resolution, i.e. 1296x972, at 40-something FPS. But it’s impossible to set with any method known to me.

And I need the higher framerates.
P.S. Your commands don’t work (first two kinds: invalid argument; second two: syntax errors with '(').