Using custom camera in OpenCV (via GStreamer)

Mahyar · May 9, 2014 · Viewed 18.5k times

I'm using a Nitrogen6x board with an ov5640 camera (MIPI).

The camera does not use the standard v4l/v4l2 interface, but we can stream video with GStreamer through its driver (mfw_v4l):

gst-launch mfw_v4lsrc ! autovideosink

I want to use the camera in OpenCV by calling it via GStreamer (GStreamer inside OpenCV). I asked a question about calling GStreamer inside OpenCV here, and this is the follow-up.

If I enable GStreamer support, it is detected in the source code, but OpenCV still tries to use the standard V4L/V4L2 sources for the GStreamer capture, which is what I want to change. The section that calls GStreamer is in cap_gstreamer.cpp:

CvCapture* cvCreateCapture_GStreamer(int type, const char* filename )
{
    CvCapture_GStreamer* capture = new CvCapture_GStreamer;

    if( capture->open( type, filename ))
        return capture;

    delete capture;
    return 0;
}

I guess this is the section I should work on to somehow point it to the camera's driver. ("type" here is probably a number related to the driver, as defined in precomp.hpp, but what is the "filename"?)
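
For reference, here is my rough sketch of how I think those two arguments get filled in on a stock 2.4 build; the constant names are the ones I see in precomp.hpp, and the rest is my reading of cap.cpp, so treat it as a guess rather than verified behaviour:

#include <opencv2/highgui/highgui.hpp>

int main()
{
    // Opening by index (what happens today): the camera path in highgui ends up
    // calling cvCreateCapture_GStreamer() with one of the V4L-style types
    // (CV_CAP_GSTREAMER_V4L / CV_CAP_GSTREAMER_V4L2 from precomp.hpp), which is
    // presumably why the standard v4l/v4l2 sources get used -- my guess.
    cv::VideoCapture by_index(0);

    // Opening by string: whatever is passed here seems to be forwarded as the
    // "filename" argument of cvCreateCapture_GStreamer() -- and figuring out
    // what that string should contain for mfw_v4l is exactly my question.
    cv::VideoCapture by_string("what goes here for mfw_v4l?");

    return (by_index.isOpened() || by_string.isOpened()) ? 0 : 1;
}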

Any suggestions about how to access the camera via GStreamer would be helpful and appreciated. Thanks!

Answer

Mahyar · May 22, 2014

It looks like we can open the camera with a GStreamer pipeline like the one below:

VideoCapture cap("mfw_v4lsrc ! ffmpegcolorspace ! video/x-raw-rgb ! appsink");

As the camera output is in YUV, we need to convert it to RGB before handing the frames to OpenCV; the ffmpegcolorspace element and the video/x-raw-rgb caps are what make sure OpenCV gets an RGB colorspace.
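
For completeness, here is a minimal end-to-end sketch that reads frames from that pipeline, assuming OpenCV 2.4 built with GStreamer support and a GStreamer 0.10 install that provides ffmpegcolorspace; the pipeline string is the one from above, while the window name and key handling are only illustrative:

#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <cstdio>

int main()
{
    // appsink hands the converted buffers over to OpenCV as cv::Mat frames.
    cv::VideoCapture cap("mfw_v4lsrc ! ffmpegcolorspace ! video/x-raw-rgb ! appsink");

    if (!cap.isOpened())
    {
        std::fprintf(stderr, "Could not open the GStreamer pipeline\n");
        return 1;
    }

    cv::Mat frame;
    for (;;)
    {
        if (!cap.read(frame) || frame.empty())
            break;

        cv::imshow("mfw_v4lsrc", frame);
        if (cv::waitKey(30) == 27) // Esc quits
            break;
    }
    return 0;
}

If the colours come out swapped, note that imshow() expects BGR order, so a cvtColor() from RGB to BGR right after read() fixes the display.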