I'm trying to open a UDP video stream on a Raspberry Pi using this pipeline:
VideoCapture video("udpsrc port=5600 ! application/x-rtp,payload=96,encoding-name=H264 ! "
"rtpjitterbuffer mode=1 ! rtph264depay ! h264parse ! decodebin ! videoconvert ! "
"appsink emit-signals=true sync=false max-buffers=2 drop=true", cv::CAP_GSTREAMER);
// Exit if video is not opened
if(!video.isOpened())
{
cout << "Could not read video file" << endl;
return 1;
}
However, video.isOpened() returns false and I can't open the stream with this code. It works in a loopback test and on another Ubuntu 18.04 PC, but the RPi 4 (Buster OS) can't run it. Also, the following command does play the incoming GStreamer video:
gst-launch-1.0 udpsrc port=5600 caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264' ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink fps-update-interval=1000 sync=false
Furthermore, a dedicated code stack (e.g. [video_udp.cpp][1]) can handle the video easily, but it is hard to use together with OpenCV.
NOTE: OpenCV version is 4.2.0-pre
The problem is about using the GStreamer library as a plugin of OpenCV: OpenCV doesn't throw an exception even if you build it from source without GStreamer support. (By default, the GStreamer library was found directly on Ubuntu, whereas the Raspberry Pi 4 couldn't find it.)
First, I checked the build information of OpenCV with std::cout << cv::getBuildInformation(); on the Ubuntu 18.04 machine and found:
GStreamer: YES (1.14.5)
Then I checked the same thing on the Raspberry Pi 4 side, and the build information was:
GStreamer:NO
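For reference, a small check program built around cv::getBuildInformation() can report this at runtime; this is just a sketch, and the string search over the build summary is a convenience that assumes the usual layout of the report:

#include <opencv2/core/utility.hpp>
#include <iostream>
#include <string>

int main()
{
    // Print the full build summary and pick out the GStreamer line from the Video I/O section
    const std::string info = cv::getBuildInformation();
    std::cout << info << std::endl;

    const auto pos = info.find("GStreamer:");
    if (pos != std::string::npos)
    {
        const std::string line = info.substr(pos, info.find('\n', pos) - pos);
        std::cout << (line.find("YES") != std::string::npos
                          ? "GStreamer backend is available"
                          : "GStreamer backend is NOT available")
                  << std::endl;
    }
    return 0;
}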
Before building OpenCV I compared the GStreamer plugins on both machines with the gst-inspect-1.0 command and installed some missing packages such as gstreamer1.0-tools. Since I didn't know the cause before checking the build information, I also installed some other GStreamer plugins that I no longer remember.
Lastly, I rebuilt OpenCV with the -D WITH_GSTREAMER=ON flag added, and now it works well.
I'll edit this answer if the problem turns out to be related to the plugins that were installed later; to check this, I'll retest the issue on a clean Buster OS image.
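With GStreamer support compiled in, a minimal read loop around the pipeline from the question might look like this (a sketch, not the exact project code; the imshow display is only for illustration):

#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    // RTP/H264 over UDP, decoded and converted so appsink hands BGR frames to OpenCV
    cv::VideoCapture video("udpsrc port=5600 ! application/x-rtp,payload=96,encoding-name=H264 ! "
                           "rtpjitterbuffer mode=1 ! rtph264depay ! h264parse ! decodebin ! "
                           "videoconvert ! appsink emit-signals=true sync=false max-buffers=2 drop=true",
                           cv::CAP_GSTREAMER);
    if (!video.isOpened())
    {
        std::cout << "Could not open UDP stream" << std::endl;
        return 1;
    }

    cv::Mat frame;
    while (video.read(frame))       // blocks until the next decoded frame is available
    {
        if (frame.empty())
            continue;               // drop empty frames instead of processing them
        cv::imshow("udp stream", frame);
        if (cv::waitKey(1) == 27)   // ESC quits
            break;
    }
    return 0;
}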
Related
I am creating a video stream from my headless Raspberry Pi using a USB camera and OpenCV.
Ubuntu 18.04 LTS
Raspi OS: Bullseye
USB Camera
GStreamer syntax test in the terminal:
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert! videoscale ! video/x-raw, width=640, height=480 ! autovideosink -v
In OpenCV C++:
string pipeline;
pipeline = "v4l2src device=/dev/video0 ! videoconvert! videoscale ! video/x-raw, width=640, height=480 ! autovideosink -v";
cv::VideoCapture cap(pipeline, cv::CAP_GSTREAMER);
When I launch the pipeline from the terminal, it creates a window with my USB webcam feed. However, when I try to use the same pipeline from C++, it throws this error:
[ WARN:0] global ../modules/videoio/src/cap_gstreamer.cpp (734) open OpenCV | GStreamer warning: Error opening bin: syntax error
[ WARN:0] global ../modules/videoio/src/cap_gstreamer.cpp (501) isPipelinePlaying OpenCV | GStreamer warning: GStreamer: pipeline have not been created
terminate called after throwing an instance of 'std::invalid_argument'
what(): cannot predict with an empty OpenCV image
Aborted
What am I missing in the syntax in the code?
Edit:
After some trials I could get the pipeline created, with this:
pipeline = "v4l2src device=/dev/video0 ! videoconvert! video/x-raw, width=640, height=480 ! appsink";
and warning:
[ WARN:0] global ../modules/videoio/src/cap_gstreamer.cpp (961) open OpenCV | GStreamer warning: Cannot query video position: status=0, value=-1, duration=-1
...
some stuff for that frame
...
[ WARN:0] global ../modules/videoio/src/cap_gstreamer.cpp (509) isPipelinePlaying OpenCV | GStreamer warning: unable to query pipeline state
[ WARN:0] global ../modules/videoio/src/cap_gstreamer.cpp (1824) handleMessage OpenCV | GStreamer warning: Embedded video playback halted; module v4l2src0 reported: Could not read from resource.
When I searched for it, I found this post, which led to a bug report here and here.
I am using OpenCV 4.5, and I assume that bug would have been fixed in a later version, since it was reported for OpenCV 3.4.
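For reference, a hedged sketch of the working appsink pipeline wired into VideoCapture; the empty-frame check is there so an empty Mat never reaches the prediction code that triggered the std::invalid_argument abort:

#include <opencv2/opencv.hpp>
#include <iostream>
#include <string>

int main()
{
    // appsink (not autovideosink) lets OpenCV pull the frames itself
    const std::string pipeline =
        "v4l2src device=/dev/video0 ! videoconvert ! "
        "video/x-raw, width=640, height=480 ! appsink";
    cv::VideoCapture cap(pipeline, cv::CAP_GSTREAMER);
    if (!cap.isOpened())
    {
        std::cerr << "Failed to open GStreamer pipeline" << std::endl;
        return 1;
    }

    cv::Mat frame;
    for (int i = 0; i < 100; ++i)   // grab a fixed number of frames for this sketch
    {
        if (!cap.read(frame) || frame.empty())
        {
            std::cerr << "Empty frame, skipping" << std::endl;
            continue;               // never pass an empty Mat to the model
        }
        // ... run detection / prediction on 'frame' here ...
    }
    return 0;
}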
I am using this command: gst-launch-1.0 v4l2src ! xvimagesink
to stream video over USB on my NVIDIA Jetson Nano, and I am getting this output:
Setting pipeline to PAUSED...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: Could not initialise Xv output
Additional debug info:
xvimagesink.c(1773): gst_xv_image_sink_open (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
Could not display (null)
Setting pipeline to NULL..
Freeing pipeline...
I am trying to play a sequence of images (.jpeg) from a local folder as an H.264 video in VLC player over a localhost connection.
I am using the GStreamer library for streaming support, together with test-launch.c from the GitHub samples.
OS: Linux 18.04
GStreamer version: 1.14.2, built from source using Cerbero
The following command is used to set the pipeline:
multifilesrc location=/home/user/Downloads/gen/img.%4d.jpeg loop=true caps=image/jpeg,framerate=10/1 ! jpegdec ! x264enc ! h264parse ! rtph264pay name=pay0 pt=96
but no streaming happens; instead I see an error like the one below:
(Gstremer_img2vid2:21273): GStreamer-CRITICAL **: 15:57:21.547: gst_bin_get_by_name: assertion 'GST_IS_BIN (bin)' failed
0:01:26.143446201 21273 0x55555592b850 ERROR rtspclient rtsp-client.c:3105:handle_setup_request: client 0x5555559d5260: no control in path '/test'
Any help will be appreciated, thanks.
You may have to provide the frame size for JPEG decoding (here using 640x480):
test-launch "multifilesrc location=/home/user/Downloads/gen/img.%4d.jpeg loop=true ! image/jpeg,width=640,height=480,framerate=10/1 ! jpegdec ! x264enc insert-vui=1 ! h264parse config-interval=1 ! rtph264pay name=pay0"
It should then be OK to receive on localhost using VLC:
cvlc rtsp://127.0.0.1:8554/test
I'm working on the AI-Thermometer project using an NVIDIA Jetson Nano.
The project uses a Pi camera v2 for video capture.
Here's the command for showing the video stream from the Pi camera v2:
gst-launch-1.0 nvarguscamerasrc sensor_mode=0 ! 'video/x-raw(memory:NVMM),width=3264, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=2 ! 'video/x-raw,width=960, height=720' ! nvvidconv ! nvegltransform ! nveglglessink -e
I want to use a normal USB webcam (such as the Logitech C930) instead of the Pi camera v2.
To do so, I need to stream the USB webcam data using GStreamer in the same way as the pipeline above.
I installed v4l-utils on the Jetson Nano's Ubuntu and tried this:
gst-launch-1.0 v4l2src device="/dev/video0" ! 'video/x-raw(memory:NVMM),width= ...
but it gave a warning and didn't work.
How can I show the video stream from the webcam?
There should not be quotes around the device parameter, i.e. device=/dev/video0. If the error persists, then it's probably something else.
gst-launch-1.0 v4l2src device="/dev/video0" ! \
"video/x-raw, width=640, height=480, format=(string)YUY2" ! \
xvimagesink -e
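If the feed is meant to be consumed from OpenCV rather than displayed with xvimagesink, a hedged variant of the same pipeline would swap the sink for appsink and convert to BGR (this assumes OpenCV was built with the GStreamer backend):

#include <opencv2/opencv.hpp>
#include <iostream>
#include <string>

int main()
{
    // Same v4l2src caps as above, plus videoconvert to BGR and appsink for OpenCV
    const std::string pipeline =
        "v4l2src device=/dev/video0 ! "
        "video/x-raw, width=640, height=480, format=(string)YUY2 ! "
        "videoconvert ! video/x-raw, format=(string)BGR ! appsink";
    cv::VideoCapture cap(pipeline, cv::CAP_GSTREAMER);
    if (!cap.isOpened())
    {
        std::cerr << "Could not open webcam pipeline" << std::endl;
        return 1;
    }

    cv::Mat frame;
    if (cap.read(frame) && !frame.empty())
        std::cout << "Got frame: " << frame.cols << "x" << frame.rows << std::endl;
    return 0;
}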
Hi, I tried to play a Doorbird live audio stream via GStreamer gst-launch.
I found a code snippet in an FHEM server script.
It should be possible with the command gst-launch-1.0 filesrc location=<http://12.0.0.231/bha-api/audio-receive.cgi user=xxxx passwd=xxxx> ! wavparse ! audioconvert ! lame ! filesink location=a.mp3
But I get the following error: lame not found. If I change lame to lamemp3enc, I get the error file not found http://12.0.0.231/bha-api/audio-receive.cgi
What am I doing wrong?