How to use OpenCV VideoWriter with GStreamer? - c++

I am trying to send an H.264 stream with OpenCV's VideoWriter so that another PC on the network can receive it with VideoCapture. However, I am stuck on VideoWriter: this code fails with an error, and out.isOpened() is always false.
    int FOURCC = cv::VideoWriter::fourcc('H', '2', '6', '4');
    cv::VideoWriter out;
    out.open("appsrc ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast ! rtph264pay ! udpsink host=127.0.0.1 port=5000",
        cv::CAP_GSTREAMER,
        FOURCC,
        16,
        cv::Size(640, 480),
        true);
    if (!out.isOpened()) {
        qDebug() << "\n***** Failed to open videowriter *****";
        return -1;
    }
[ WARN:0] global D:\Libs\opencv-4.3.0\modules\videoio\src\cap_gstreamer.cpp (1274) cv::CvVideoWriter_GStreamer::close_ OpenCV | GStreamer warning: No source in GStreamer pipeline. Ignore
[ERROR:0] global D:\Libs\opencv-4.3.0\modules\videoio\src\cap.cpp (527) cv::VideoWriter::open VIDEOIO(GSTREAMER): raised OpenCV exception:
OpenCV(4.3.0) D:\Libs\opencv-4.3.0\modules\videoio\src\cap_gstreamer.cpp:144: error: (-215:Assertion failed) ptr in function 'cv::`anonymous-namespace'::GSafePtr<struct _GstElement>::get'
***** Failed open videowriter *****
Even a simple example returns an error, and out.isOpened() is false.
    out.open("autovideosrc ! videoconvert ! appsink",
        cv::CAP_GSTREAMER,
        FOURCC,
        16,
        cv::Size(640, 480),
        true);
[ WARN:0] global D:\Libs\opencv-4.3.0\modules\videoio\src\cap_gstreamer.cpp (1500) cv::CvVideoWriter_GStreamer::open OpenCV | GStreamer warning: OpenCV backend does not support this file type (extension): autovideosrc ! videoconvert ! appsink
[ WARN:0] global D:\Libs\opencv-4.3.0\modules\videoio\src\cap_gstreamer.cpp (1274) cv::CvVideoWriter_GStreamer::close_ OpenCV | GStreamer warning: No source in GStreamer pipeline. Ignore
***** Failed to open videowriter *****
OpenCV 4.3.0 was compiled from source with GStreamer support.
cv::getBuildInformation() reports:
    Video I/O:
      DC1394: NO
      FFMPEG: YES (prebuilt binaries)
        avcodec: YES (58.54.100)
        avformat: YES (58.29.100)
        avutil: YES (56.31.100)
        swscale: YES (5.5.100)
        avresample: YES (4.0.0)
      GStreamer: YES (1.16.2)
      DirectShow: YES
      Media Foundation: YES
        DXVA: YES
How can I stream the video? What parameters should be passed to VideoWriter? I tried various tips from Google, but none of them helped. I would be grateful for a simple example of how to send a stream from VideoWriter on one side and receive it with VideoCapture on the other.
I am using Windows 10 x64 and Qt 5.13 with MSVC2017.

You need to feed raw video to appsrc. Setting the fourcc to H264 forces VideoWriter to encode the video itself instead of handing raw frames to the GStreamer pipeline. Set the fourcc to 0 to push raw video. The following should work.
    cv::VideoWriter out;
    out.open("appsrc ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast ! rtph264pay ! udpsink host=127.0.0.1 port=5000",
        cv::CAP_GSTREAMER,
        0,
        16,
        cv::Size(640, 480),
        true);
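To cover the second half of the question (receiving with VideoCapture), here is a minimal sketch of matching pipeline strings. The receiver caps are an assumption based on rtph264pay defaults (payload type 96, 90 kHz clock), and the helper functions are mine; they only assemble the strings that would be handed to cv::VideoWriter / cv::VideoCapture with cv::CAP_GSTREAMER:

```cpp
#include <string>

// Sender: raw BGR frames pushed into appsrc are H.264-encoded and sent as RTP
// over UDP. Pass the returned string to cv::VideoWriter with fourcc 0.
std::string senderPipeline(const std::string& host, int port, int bitrateKbps) {
    return "appsrc ! videoconvert ! x264enc tune=zerolatency bitrate=" +
           std::to_string(bitrateKbps) +
           " speed-preset=superfast ! rtph264pay ! udpsink host=" + host +
           " port=" + std::to_string(port);
}

// Receiver: depayload and decode the RTP stream back to raw frames for appsink.
// Pass the returned string to cv::VideoCapture with cv::CAP_GSTREAMER.
std::string receiverPipeline(int port) {
    return "udpsrc port=" + std::to_string(port) +
           " caps=\"application/x-rtp, media=video, clock-rate=90000, "
           "encoding-name=H264, payload=96\" ! rtph264depay ! avdec_h264 ! "
           "videoconvert ! appsink";
}
```

On the sender, open the writer with fourcc 0 as shown above and push frames with out.write(frame); on the receiver, open cv::VideoCapture(receiverPipeline(5000), cv::CAP_GSTREAMER) and read frames with cap.read(frame).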

I believe that this is a bug in OpenCV, which should be fixed by this PR, or maybe this one (or maybe both).
Indeed, it seems like the OpenCV code used to work with some version of GStreamer that would set the error pointer to NULL when the function succeeds, but that is not guaranteed with 1.16.2 in my experience (and from the documentation).
With those PRs, it should properly use the return value instead of relying on the state of the error pointer.
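As an aside on the fourcc argument discussed in the first answer: it is just four ASCII characters packed into one int, and 0 is the special "no codec" value that makes the backend hand raw frames to the pipeline. A small illustration of the packing (this helper mirrors what cv::VideoWriter::fourcc does, least-significant byte first):

```cpp
// Mirrors cv::VideoWriter::fourcc: each character occupies one byte, with the
// first character in the least-significant byte. 0 is not a packed code at
// all; it tells the GStreamer backend to push raw, unencoded frames to appsrc.
int fourcc(char c1, char c2, char c3, char c4) {
    return (c1 & 255) | ((c2 & 255) << 8) | ((c3 & 255) << 16) | ((c4 & 255) << 24);
}
```

For example, fourcc('H', '2', '6', '4') yields 0x34363248, i.e. '4', '6', '2', 'H' read from the most significant byte down.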

Related

GStreamer Pipeline with OpenCV on Coral Dev Board

I am trying to access the CSI camera on the Coral Dev Board via OpenCV and GStreamer in C++.
This is my pipeline, which seems to work fine when testing with gst-launch:
gst-launch-1.0 v4l2src device = /dev/video0 ! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1
But when trying to open it with OpenCV, it doesn't seem to do the trick:
#include "opencv2/opencv.hpp"
#include <iostream>

int main() {
    std::string pipeline = "v4l2src device=/dev/video0 ! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1";
    cv::VideoCapture capture(pipeline, cv::CAP_GSTREAMER);
    if (!capture.isOpened()) {
        std::cout << "Could not open Camera" << std::endl;
    }
    return 0;
}
Also, is there somehow a more detailed error message available via OpenCV? When running gst-launch-1.0, it tells me quite specifically what it didn't like about my pipeline string, but OpenCV just seems to tell me that it didn't work.
OpenCV's VideoCapture has five overloaded constructors, and none of them implements this kind of input. You can use the following as VideoCapture inputs:
video file or image file sequence
a capturing device index
IP video stream link
As I see in your command, your CSI camera is located at device=/dev/video0, so you can simply open it with the line:
cv::VideoCapture capture(0, cv::CAP_GSTREAMER);
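On the sub-question about more detailed error messages: GStreamer's own logging can be turned up via the GST_DEBUG environment variable, which must be set before the first pipeline is created. A minimal sketch (the helper name is mine; level 1 is ERROR, 2 is WARNING, and higher values are increasingly verbose):

```cpp
#include <cstdlib>
#include <string>

// Raise GStreamer's log level before any pipeline is parsed. With this set,
// the parse/link errors behind OpenCV's generic failure are printed to the
// console, similar to what gst-launch-1.0 reports.
std::string enableGstDebug(int level) {
    std::string value = std::to_string(level);
    setenv("GST_DEBUG", value.c_str(), /*overwrite=*/1);
    return value;
}
```

Call enableGstDebug(3) at the top of main(), before constructing cv::VideoCapture, or simply launch the program with GST_DEBUG=3 set in the shell.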

How can I save a GStreamer RTSP stream of unknown type to a file

I'm using this GStreamer pipeline to send an RTSP stream from a camera.
./gst-rtsp-launch --port 8554 "( v4l2src device=/dev/video0 ! video/x-raw,framerate=30/1,width=640,height=480 ! rtpvrawpay name=pay0 pt=96 )"
I want to use playbin, so that I don't need to specify the type of video in the RTSP stream. With this pipeline I can grab a single image from the camera:
gst-launch-1.0 playbin uri=rtsp://(ip-of-camera):8554 video-sink="jpegenc ! filesink location=capture1.jpeg"
But if I try this pipeline, to save it to a file:
gst-launch-1.0 playbin uri=rtsp://(ip-of-camera):8554 video-sink="videoconvert ! video/x-h264,width=320,height=240 ! mp4mux ! filesink location=test.mp4"
I get this error:
ERROR: from element /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source/GstUDPSrc:udpsrc1: Internal data stream error.
Additional information for debugging:
../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source/GstUDPSrc:udpsrc1:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.094968753
Defining the processing queue to NULL...
Freeing the processing queue...
Note: I had to translate the last two lines.
Is there a problem in the pipeline I'm using to save the stream as a file?
I copied your pipeline and it worked.
I only added -e after gst-launch-1.0, so that GStreamer writes all the necessary information to the file when Ctrl-C is pressed:
gst-launch-1.0 -e playbin uri=(rtsp url) video-sink="videoconvert ! video/x-h264,width=320,height=240 ! mp4mux ! filesink location=test.mp4"
Maybe you are using an older version of GStreamer?
I am using GStreamer 1.20.0

OpenCV returns no error when open is called, but GStreamer does

I have a problem: when I open a camera with GStreamer and the camera is not connected, I don't get an error code back from OpenCV. GStreamer prints an error in the console, but when I check whether the camera is open with .isOpened(), the return value is true. When the camera is connected, everything works without issue.
std::string pipeline = "nvarguscamerasrc sensor_id=0 ! video/x-raw(memory:NVMM), width=(int)3264, height=(int)2464, format=(string)NV12, framerate=(fraction)21/1 ! nvvidconv flip-method=2 ! video/x-raw, width=(int)3264, height=(int)2464, format=(string)BGRx ! videoconvert ! video/x-raw, format=(string)BGR ! appsink";
cap_device_.open(pipeline, cv::CAP_GSTREAMER);
bool err = cap_device_.isOpened();
if (!err) {
    printf("Not possible to open camera");
    return EXIT_FAILURE;
}
The GStreamer error code in the console is:
(Argus) Error Timeout: (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function openSocketConnection(), line 219)
(Argus) Error Timeout: Cannot create camera provider (in src/rpc/socket/client/SocketClientDispatch.cpp, function createCameraProvider(), line 106)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:720 Failed to create CameraProvider
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (933) open OpenCV | GStreamer warning: Cannot query video position: status=0, value=-1, duration=-1
If I understand everything correctly, .isOpened() should return false. If not, how can I check whether the pipeline was initialized correctly?
My system runs with Ubuntu 18.04 on an Nvidia Jetson Nano with a MIPI-CSI camera.
GStreamer version 1.14.5, OpenCV version 4.1.1
This may just be because of a typo. nvarguscamerasrc has no property sensor_id but has sensor-id. It should work after fixing this.
In the non-working case, cap.read() should return false.
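The rename can be applied mechanically to the pipeline string; a small illustrative helper (my own, not part of OpenCV or GStreamer) that swaps the misspelled property name:

```cpp
#include <string>

// Replace every occurrence of `from` with `to` in the pipeline string,
// e.g. the misspelled sensor_id property with the real sensor-id one.
std::string fixProperty(std::string pipeline, const std::string& from,
                        const std::string& to) {
    for (std::size_t pos = pipeline.find(from); pos != std::string::npos;
         pos = pipeline.find(from, pos + to.size())) {
        pipeline.replace(pos, from.size(), to);
    }
    return pipeline;
}
```

For the pipeline above, fixProperty(pipeline, "sensor_id", "sensor-id") produces the corrected string to pass to open().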

GStreamer pipeline to convert MPEG-4 video to MPEG-TS format

I am trying to write a GStreamer pipeline to convert MPEG-4 video to MPEG-TS format.
I tried the pipeline below, but with no luck:
$ gst-launch-1.0 -e filesrc location=20200818125158_00001.ts.mp4 ! qtdemux name=mdemux ! \
h264parse ! video/x-h264,stream-format=byte-stream ! mpegtsmux name=mux ! filesink location=20200818125158_00001.ts
I get the error below when I execute the above pipeline:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
WARNING: from element /GstPipeline:pipeline0/GstQTDemux:mdemux: Delayed linking failed.
Additional debug info:
./grammar.y(510): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstQTDemux:mdemux:
failed delayed linking some pad of GstQTDemux named mdem
My Input format
gst-discoverer-1.0 vid.mp4
Analyzing file:///vid.mp4
Done discovering file:///vid.mp4
Topology:
container: Quicktime
video: MPEG-4 Video (Simple Profile)
Properties:
Duration: 0:00:07.267000000
Seekable: yes
Live: no
Tags:
video codec: MPEG-4 video
maximum bitrate: 8400000
bitrate: 298925
encoder: Lavf57.83.100
container format: ISO MP4/M4A
My target format
Topology:
container: MPEG-2 Transport Stream
video: H.264 (High Profile)
Properties:
Duration: 0:00:09.900164000
Seekable: yes
Live: no
Tags:
video codec: H.264
video codec: MPEG-4 video
Your video codec in the source file is MPEG-4 Video. That is different from H.264. Try mpegvideoparse instead of h264parse.
Also, forcing the video/x-h264,stream-format=byte-stream caps should not be required; the parser and muxer should agree on caps by themselves.
After spending several hours and reading more about GStreamer, I figured out the correct pipeline. The one below works for me:
gst-launch-1.0 filesrc location=vid.mp4 ! qtdemux ! avdec_mpeg4 ! x264enc ! mpegtsmux ! filesink location=vid.ts

Error while streaming MJPEG video with RTSP protocol

I want to stream MJPEG video from my Ethernet camera with OpenCV and GStreamer.
I have tried to open the stream with the following GStreamer pipeline:
std::string pipe(
"rtspsrc location=rtsp://192.168.1.219:554/video.pro1 ! application/x-rtp,encoding-name=JPEG,payload=26 ! "
"rtpjpegdepay ! jpegdec ! xvimagesink sync=false ! appsink"
);
cv::VideoCapture cap(pipe, cv::CAP_GSTREAMER);
but the previous code returns the following error:
OpenCV Error: Unspecified error (GStreamer: cannot find appsink in manual pipeline
) in cvCaptureFromCAM_GStreamer, file /home/nvidia/Documents/CameraTests/src/opencv-3.3.0/modules/videoio/src/cap_gstreamer.cpp, line 796
VIDEOIO(cvCreateCapture_GStreamer (CV_CAP_GSTREAMER_FILE, filename)): raised OpenCV exception:
/home/nvidia/Documents/CameraTests/src/opencv-3.3.0/modules/videoio/src/cap_gstreamer.cpp:796: error: (-2) GStreamer: cannot find appsink in manual pipeline
in function cvCaptureFromCAM_GStreamer
I managed to stream the same RTSP source with H.264 and H.265; MJPEG is the only format that doesn't work, so I suppose it is a GStreamer pipeline problem.
Thanks