I am trying to find a way to do video scaling on the GPU, and the only thing I could find was the glcolorscale filter. I am running GStreamer 1.8.0 on my ARM device and I tried to execute the following:
gst-launch-1.0 -v videotestsrc ! "video/x-raw-yuv" ! glcolorscale ! ximagesink
This is an example that I found in the documentation for glcolorscale, but it returns an error:
"could not link videotestsrc0 to glcolorscale0"
I am trying to access the CSI camera on the Coral Dev Board via OpenCV and GStreamer in C++.
This is my pipeline, which seems to work fine when testing with gst-launch:
gst-launch-1.0 v4l2src device = /dev/video0 ! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1
But when trying to open it with OpenCV, it doesn't seem to do the trick:
#include "opencv2/opencv.hpp"
#include <iostream>
int main() {
std::string pipline = "v4l2src device = /dev/video0 ! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1";
cv::VideoCapture capture(pipline, cv::CAP_GSTRAMER);
if (!_capture.isOpened()) {
std::cout << "Could not open Camera" << std::endl;
}
return 0;
}
Also, is there a way to get a more detailed error message via OpenCV? When running gst-launch-1.0 it tells me quite specifically what it didn't like about my pipeline string, but OpenCV just seems to tell me that it didn't work.
OpenCV's VideoCapture has 5 overloaded constructors, and none of them covers this kind of input. You can use the following as VideoCapture inputs:
video file or image file sequence
a capturing device index
IP video stream link
As I can see in your command, your CSI camera is located at /dev/video0, so you can open it simply with the line:
cv::VideoCapture capture(0, cv::CAP_GSTREAMER);
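A minimal sketch of the suggested call (assuming OpenCV was built with the GStreamer backend enabled):
#include "opencv2/opencv.hpp"
#include <iostream>

int main() {
    // Open device index 0, preferring the GStreamer backend.
    cv::VideoCapture capture(0, cv::CAP_GSTREAMER);
    if (!capture.isOpened()) {
        std::cout << "Could not open camera" << std::endl;
        return 1;
    }
    cv::Mat frame;
    // Grab one frame to confirm that data actually flows.
    if (capture.read(frame)) {
        std::cout << "Got frame " << frame.cols << "x" << frame.rows << std::endl;
    }
    return 0;
}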
I have a problem when I open a camera with GStreamer: if the camera is not connected, I don't get an error code back from OpenCV. GStreamer prints an error to the console, but when I check whether the camera is open with .isOpened(), the return value is true. When the camera is connected, everything works without any issue.
std::string pipeline = "nvarguscamerasrc sensor_id=0 ! video/x-raw(memory:NVMM), width=(int)3264, height=(int)2464, format=(string)NV12, framerate=(fraction)21/1 ! nvvidconv flip-method=2 ! video/x-raw, width=(int)3264, height=(int)2464, format=(string)BGRx ! videoconvert ! video/x-raw, format=(string)BGR ! appsink";
cap_device_.open(pipeline, cv::CAP_GSTREAMER);
bool err = cap_device_.isOpened();
if (!err) {
    printf("Not possible to open camera\n");
    return EXIT_FAILURE;
}
The GStreamer error code in the console is:
(Argus) Error Timeout: (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function openSocketConnection(), line 219)
(Argus) Error Timeout: Cannot create camera provider (in src/rpc/socket/client/SocketClientDispatch.cpp, function createCameraProvider(), line 106)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:720 Failed to create CameraProvider
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (933) open OpenCV | GStreamer warning: Cannot query video position: status=0, value=-1, duration=-1
If I understand everything correctly, .isOpened() should return false. If not, how can I check whether the pipeline was initialized correctly?
My system runs with Ubuntu 18.04 on an Nvidia Jetson Nano with a MIPI-CSI camera.
GStreamer version 1.14.5, OpenCV version 4.1.1
This may just be because of a typo: nvarguscamerasrc has no sensor_id property, but it does have sensor-id. It should work after fixing this.
In the non-working case, cap.read() should return false.
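A minimal sketch of that check, continuing the snippet from the question (cap_device_ is the question's variable name):
cv::Mat frame;
// Even when isOpened() returns true, a broken pipeline shows up here,
// because no buffer ever reaches the appsink and read() returns false.
if (!cap_device_.read(frame)) {
    printf("Pipeline opened, but no frames are arriving\n");
    return EXIT_FAILURE;
}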
I want to preview an RTMP stream using GStreamer's xvimagesink. I can see the output if I use autovideosink like this:
gst-launch-1.0 -v rtmpsrc location='rtmp://127.0.0.1:1935/live/stream' ! decodebin3 ! autovideosink
but if I replace "autovideosink" with "xvimagesink" I get this:
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: Could not initialise Xv output
Additional debug info:
xvimagesink.c(1773): gst_xv_image_sink_open (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
Could not open display (null)
Setting pipeline to NULL ...
Freeing pipeline ...
Both decodebin3 and autovideosink are auto-plugging GStreamer elements: each automatically selects the most appropriate available GStreamer plugins to demux/decode (decodebin3) and render video (autovideosink) from, in this case, a live RTMP stream.
So it is very possible that, for example:
decodebin3 decodes video into a format that xvimagesink cannot display on your platform/hardware and/or with your GStreamer version, or
xvimagesink is not set up properly on your platform, independently of whether a display/monitor is available.
To find out more details about
the video format decoded by decodebin3,
the video sink element "chosen" by autovideosink,
you can set a higher (more detailed) GStreamer debug level with, for example, export GST_DEBUG=3, rerun the pipeline, and inspect the output.
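For example, reusing the failing pipeline from the question:
export GST_DEBUG=3
gst-launch-1.0 -v rtmpsrc location='rtmp://127.0.0.1:1935/live/stream' ! decodebin3 ! xvimagesink
The -v output and the debug log then show which elements were auto-plugged, the negotiated caps, and more context on why xvimagesink could not open the Xv output.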
I have dvblast successfully multicasting an MPEG2 stream originating from DVB-T onto a network, and I am trying to pick up this multicast MPEG2 stream and convert it to HLS on a Raspberry Pi 2 using GStreamer 1.0 as follows:
gst-launch-1.0 udpsrc port=5004 multicast-group=239.255.1.30 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000" ! rtpbin ! rtpmp2tdepay ! tsdemux ! mpegvideoparse ! omxmpeg2videodec ! queue ! videoconvert ! omxh264enc ! mpegtsmux ! hlssink max-files=5 location=/var/www/stream/segment%05d.ts playlist-location=/var/www/stream/output.m3u8 playlist-root=http://192.168.225.2/stream/
The HLS files are successfully created and are served via httpd to mediastreamvalidator, which is happy with the results:
Processed 7 out of 7 segments: OK
Segment bitrate: Average: 430.90 kbits/sec, Max: 741.38 kbits/sec
The MPEG2 license is in place and works.
Neither Safari nor an iPhone can view this stream; in both cases the play button appears but no video or audio plays. Eventually Safari claims "Missing plugin". I am struggling to see where I have gone wrong, and to find any documentation or examples covering this specific scenario. Can anyone point out where in the pipeline this has gone wrong?
I discovered that the current GStreamer gst-omx code doesn't include AU delimiters, and the following patch is required to make omxh264enc generate a stream that Safari and/or iOS will play:
https://bugzilla.gnome.org/show_bug.cgi?id=736211
Using the June 9 2015 version of mediastreamvalidator shows the following issues, but the stream does now play on Safari and iOS:
WARNING: Video segment does not contain an IDR frame
--> Track ID 1
ERROR: (-12642) Playlist vs segment duration mismatch
--> Segment duration 4.7600, Playlist duration: 2.4000
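For the duration-mismatch error, one knob that may be worth trying (an assumption on my part, not something verified in this setup) is hlssink's target-duration property, which controls the segment duration advertised in the playlist; aligning it with the real segment length keeps the two consistent:
gst-launch-1.0 ... ! mpegtsmux ! hlssink max-files=5 target-duration=5 location=/var/www/stream/segment%05d.ts playlist-location=/var/www/stream/output.m3u8 playlist-root=http://192.168.225.2/stream/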
I am trying to run certain pipelines from the command prompt to play a video, and I often get these errors/messages/warnings:
WARNING: erroneous pipeline: no element "qtdemux"
WARNING: erroneous pipeline: no element "playbin2"
WARNING: erroneous pipeline: no element "decodebin2"
ERROR: pipeline could not be constructed: no element "playbin".
Following are the pipelines:
gst-launch filesrc location=path to the mp4 file ! playbin2 ! queue ! ffmpegcolorspace ! autovideosink
or
gst-launch -v filesrc location=path to the mp4 file ! qtdemux name=demuxer ! { queue ! decodebin ! sdlvideosink } { demuxer. ! queue ! decodebin ! alsasink }
or
gst-launch -v playbin uri=path to the mp4 file
or
gst-launch -v playbin2 uri=path to the mp4 file
Questions
I wanted to know if I am missing the plugins needed to execute these.
How do I know which plugin is responsible for what, and where it can be found?
What is the benefit of implementing the pipeline via C code? Are the missing plugins still required?
Is it better to install the missing plugins from the Synaptic manager or from the GStreamer site (base, good, bad, ugly)?
When we do gst-inspect we get output like this:
postproc: postproc_hdeblock: LibPostProc hdeblock filter
libvisual: libvisual_oinksie: libvisual oinksie plugin plugin v.0.1
flump3dec: flump3dec: Fluendo MP3 Decoder (liboil build)
vorbis: vorbistag: VorbisTag
vorbis: vorbisparse: VorbisParse
vorbis: vorbisdec: Vorbis audio decoder
vorbis: vorbisenc: Vorbis audio encoder
coreindexers: fileindex: A index that stores entries in file
coreindexers: memindex: A index that stores entries in memory
amrnb: amrnbenc: AMR-NB audio encoder
amrnb: amrnbdec: AMR-NB audio decoder
audioresample: audioresample: Audio resampler
flv: flvmux: FLV muxer
flv: flvdemux: FLV Demuxer
In output of the form x: y, what do x and y mean?
Answers:
It looks like GStreamer at your end was not installed correctly; playbin2 and decodebin2 are basic elements and part of the base plugins.
1. Yes, you may be missing some plugins.
2. Use the gst-inspect command to check whether an element is available (see the example after this list).
3. From C code you can manage states, register callbacks, and learn more. Yes, the missing plugins are still required.
4. I guess the GStreamer site would be better.
5. Not sure about this one; it would help if you arranged the output in a proper way.
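For example, to check whether one of the missing elements is installed (gst-inspect is the 0.10 tool matching the gst-launch commands above; on 1.x it is gst-inspect-1.0):
gst-inspect playbin2
If the element is present, this prints its details (pads, properties); if not, it reports that there is no such element or plugin.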
Most probably the GST_PLUGIN_PATH is incorrect. Please set the correct path to where GStreamer has been installed.
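For example (the directory below is an assumption; point it at wherever your GStreamer plugins were actually installed):
export GST_PLUGIN_PATH=/usr/lib/gstreamer-0.10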