Write GStreamer source with OpenCV - C++

My goal is to write a GigEVision-to-GStreamer application.
The first approach was to read the frames via a GigEVision API and then send them via GStreamer as a raw RTP/UDP stream.
This stream can then be received by any GStreamer application.
Here is a minimal example for a webcam: https://github.com/tik0/mat2gstreamer
The drawback of this is a lot of serialization and deserialization when the packets are sent via UDP to the next application.
So the question: is it possible to easily write a GStreamer source pad with OpenCV to overcome these drawbacks? (Or do you have any other suggestions?)
Greetings

I think I've found the best solution for my setup (where the data is exchanged between applications on the same PC).
Just using the shared-memory plugin allows data exchange with minimal effort.
So my OpenCV pipeline looks like:
appsrc ! shmsink socket-path=/tmp/foo sync=true wait-for-connection=false
And any other receiver (in this case gstreamer-1.0) looks like:
gst-launch-1.0 shmsrc socket-path=/tmp/foo ! video/x-raw,format=BGR,width=<myWidth>,height=<myHeight>,framerate=<myFps> ! videoconvert ! autovideosink
Works very nicely, even with multiple readers attached at once.
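For reference, a minimal, untested C++ sketch of the sender side could look like the following, assuming OpenCV was built with GStreamer support; the 640x480/30 fps values and the webcam capture are placeholders for the actual GigEVision source:

#include <opencv2/opencv.hpp>

int main() {
    const int width = 640, height = 480;  // placeholder frame size
    const double fps = 30.0;              // placeholder frame rate

    // Push BGR frames into the shmsink pipeline shown above.
    cv::VideoWriter writer(
        "appsrc ! shmsink socket-path=/tmp/foo sync=true wait-for-connection=false",
        cv::CAP_GSTREAMER, 0 /* fourcc unused for raw */, fps,
        cv::Size(width, height), true /* BGR color frames */);
    if (!writer.isOpened()) return 1;

    cv::VideoCapture cap(0);  // stand-in for the GigEVision camera
    cv::Mat frame;
    while (cap.read(frame)) {
        cv::resize(frame, frame, cv::Size(width, height));
        writer.write(frame);  // goes out as raw BGR over shared memory
    }
    return 0;
}

Keep in mind that shmsrc does not transmit caps, so the format, width, height, and framerate on the receiving side must match exactly what the sender produces.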

Related

Stream a color map to RTSP and kvssink

My goal is to stream a color map to RTSP, and from there to stream it to KVS.
By color map I mean that I've created a numpy matrix, 32x24x3, that I write to the screen at around 7 fps, as shown here:
Sending OpenCV output to VLC stream
and convert it to an RTSP stream.
My line is:
python3 rtsp.py | vlc -I dummy --demux=rawvideo --rawvid-fps=7 --rawvid-width=32 --rawvid-height=24 --rawvid-chroma=RV24 - --sout "#transcode{vcodec=mp4v,vb=200,fps=7,width=32,height=24,samplerate=44100}:rtp{sdp=rtsp://127.0.0.1:8557/color.sdp}"
This works: if I open the VLC GUI I can open the stream and see the frames.
My next goal is to stream it to KVS using GStreamer.
Usually I stream an H.264 source to kvssink, which looks like this (from Amazon's tutorial):
gst-launch-1.0 rtspsrc location="rtsp://127.0.0.1:8557/color.sdp" ! rtph264depay ! video/x-h264, format=avc,alignment=au ! kvssink stream-name="colorStream" storage-size=512
But it seems that GStreamer can't read the frames, and I get:
INFO - freeKinesisVideoStream(): Freeing Kinesis Video stream.
DEBUG - curlApiCallbacksShutdownActiveRequests(): pActiveRequests hashtable is empty
I've tried all kinds of codecs but nothing works.
Does anybody maybe have another solution?
Amazon's documentation is either outdated or only works for some system configurations. I had the same issue and had to use a different GStreamer incantation. For you it would look like:
gst-launch-1.0 rtspsrc location="rtsp://127.0.0.1:8557/color.sdp" ! rtph264depay ! h264parse ! kvssink stream-name="colorStream" storage-size=512
Note the h264parse in place of video/x-h264, format=avc,alignment=au. I spent hours figuring this one out, and the lightbulb went off after I used the debug guide to raise the debug level precisely for the part of the pipeline that was dropping frames, and then read this issue thread on the KVS producer SDK repo.
I'm not a GStreamer expert and can't explain why this works, or why this plugin is considered "bad", but I hope this helps others stranded on the same problem.
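For anyone retracing those steps, such a debug run could look something like this (a sketch; the per-element debug levels are illustrative, set through the standard GST_DEBUG variable):
GST_DEBUG=2,rtph264depay:6,h264parse:6 gst-launch-1.0 rtspsrc location="rtsp://127.0.0.1:8557/color.sdp" ! rtph264depay ! h264parse ! kvssink stream-name="colorStream" storage-size=512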

GStreamer RTSP webcam server

I want to read the feed from a webcam and host an RTSP stream without encoding the feed. I have access to a high-bandwidth network, but the CPUs are very low-end and have other tasks to fulfill, which is why I want to skip the encoding/decoding steps to save CPU usage. Before jumping to RTSP I tried a simple MJPG stream and tried to skip jpegenc (JPEG encoding), since it can be done directly with a simple gst pipeline:
gst-launch-1.0 -v autovideosrc ! videoconvert ! videoscale ! video/x-raw,format=I420,width=800,height=600,framerate=25/1 ! rtpjpegpay ! udpsink host=10.0.1.10 port=5000
However, I got a warning:
WARNING: erroneous pipeline: could not link videoscale0 to
rtpjpegpay0, rtpjpegpay0 can't handle caps video/x-raw,
format=(string)I420, width=(int)800, height=(int)600,
framerate=(fraction)25/1
I'm new to GStreamer and not sure whether this is possible or how to move forward. The same command above works if I include the JPEG encoding. Any suggestions would be appreciated.
rtpjpegpay is an element that takes in a Motion JPEG stream and translates it into RTP packets. The input you're giving it, however, is video/x-raw, which means it is unencoded rather than encoded with Motion JPEG. If you want to use this element, you'll first have to encode the video as Motion JPEG, using something like jpegenc.
Like @vermaete already mentions: if you really, really don't want to encode your video, you can use something like rtpvrawpay, which will translate your raw video into RTP packets. However, sending raw, unencoded video over the network is not really advisable (it's barely workable on a bad connection, and impossible if you plan on sending it over the Internet). You might also end up using a lot of CPU just to get everything payloaded properly and sent to your network card.
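If you do go the raw route, the pipelines could look something like this (a sketch adapted from the question; the receiver caps are illustrative, and the exact values to use are the ones gst-launch-1.0 -v prints on the sender):
gst-launch-1.0 -v autovideosrc ! videoconvert ! videoscale ! video/x-raw,format=I420,width=800,height=600,framerate=25/1 ! rtpvrawpay ! udpsink host=10.0.1.10 port=5000
And on the receiving machine:
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW,sampling=(string)YCbCr-4:2:0,depth=(string)8,width=(string)800,height=(string)600" ! rtpvrawdepay ! videoconvert ! autovideosink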

GStreamer dropping frames: ARM processor

I am running a 9-second video on my NVIDIA Tegra Jetson TK1 board using GStreamer:
gst-launch-0.10 playbin uri=file:///home/ubuntu/widescreen.avi
I notice this drops a lot of frames, and GStreamer prints these messages:
WARNING: from element /GstPlayBin:playbin0/GstBin:vbin/GstAutoVideoSink:videosink/GstXvImageSink:videosink-actual-sink-xvimage: A lot of buffers are being dropped.
Additional debug info:
gstbasesink.c(2875): gst_base_sink_is_too_late (): /GstPlayBin:playbin0/GstBin:vbin/GstAutoVideoSink:videosink/GstXvImageSink:videosink-actual-sink-xvimage:
There may be a timestamping problem, or this computer is too slow.
I ran top while executing this, and indeed GStreamer is taking 95% of the CPU.
Now, when I play this video through the default media player, it plays completely fine and without any lag. I was wondering if anyone knows why GStreamer is unable to play it properly. I am new to GStreamer and wondering if I can do something to alleviate this.
Bins are many elements combined into a single element-like structure (i.e. with an appropriate number of sinks and sources). playbin is a standard bin with an "autovideosink" element which automatically detects video sinks.
Firstly, I would suggest you upgrade to gstreamer-1.0, in which many bugs were fixed.
Secondly, the problem seems to be with your xvimagesink, so try ximagesink by specifying it explicitly; your autovideosink is selecting xvimagesink by default.
Try this pipeline:
(for hardware decoding)
gst-launch-1.0 filesrc location="location of h264 video/file.avi" ! avidemux ! h264parse ! omxh264dec ! videoconvert ! ximagesink
or, for CPU decoding, use avdec_h264 instead of omxh264dec.
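Spelled out, that software-decoding variant of the same pipeline would be:
gst-launch-1.0 filesrc location="location of h264 video/file.avi" ! avidemux ! h264parse ! avdec_h264 ! videoconvert ! ximagesink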
The default media player is likely using hardware video decoding if the media file is H.264, which lets it decode with fewer CPU resources. Try creating a GStreamer pipeline that explicitly uses the OMX H.264 decoding element, as discussed here (http://elinux.org/Jetson/H264_Codec).
e.g. gst-launch-0.10 filesrc location=/home/ubuntu/widescreen.avi ! avidemux ! h264parse ! nv_omx_h264dec ! autovideosink

GStreamer RTSP stream to appsink to OpenCV

I need a bit of your help: I'm trying to receive an RTSP stream via GStreamer and then feed it into OpenCV to process the video. What's worse, I will need it back out of OpenCV afterwards, but first things first. I'm quite new to this and don't know GStreamer well, so I'm counting on you guys. Simple examples would be best, but I'll use what I can get ;)
Thanks in advance
You can use something like this:
uridecodebin uri=rtsp:// name=uridec ! queue ! tee name=t ! queue ! <some encoder and muxer> ! filesink t. ! queue ! videoconvert ! "video/x-raw, format=BGR" ! appsink t. ! queue ! <restream>
In this possible solution you are receiving and decoding at uridecodebin, which means that for re-streaming you need to encode again, as well as encode for storing to a file. If that's not what you want, you can replace uridecodebin with rtspsrc, which will give you RTP streams instead of decoded raw streams. Something like:
rtspsrc ! rtpXdepay ! tee name=t ! ...
Replace X with the format you are receiving (this can be done dynamically from your application). Now the output is an encoded stream that you can use in a similar way to the sample pipeline above.
Note that these suggestions assume your RTSP input is a single stream (likely video); if you want video and audio you need to add two branches out of uridecodebin or rtspsrc. I also assumed that 'rtspStream' is some sort of external library/application that you are going to use to retransmit, instead of using GStreamer itself. In any case, this should give you an idea.
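As a concrete starting point for the OpenCV side, here is a minimal, untested C++ sketch of the receiving branch only, assuming OpenCV was built with GStreamer support and the camera sends H.264 (the URL is a placeholder):

#include <opencv2/opencv.hpp>

int main() {
    // Decode the RTSP stream and hand BGR frames to OpenCV via appsink.
    cv::VideoCapture cap(
        "rtspsrc location=rtsp://127.0.0.1:8554/test latency=0 ! "
        "rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! "
        "video/x-raw,format=BGR ! appsink drop=true",
        cv::CAP_GSTREAMER);
    if (!cap.isOpened()) return 1;

    cv::Mat frame;
    while (cap.read(frame)) {
        // ... process the frame here ...
        cv::imshow("rtsp", frame);
        if (cv::waitKey(1) == 27) break;  // Esc quits
    }
    return 0;
}

Pushing processed frames back out would be the mirror image: a cv::VideoWriter whose pipeline starts with appsrc, as in the shared-memory example earlier.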

How to get the pipeline created by playbin in textual format in GStreamer?

I'm playing a transport stream file (*.ts) using the following pipeline:
gst-launch-0.10 playbin2 uri=file:///c:/bbb.ts
But I need to build that pipeline myself, and I'm not sure how to achieve this.
So far I have tried the following (works fine):
gst-launch-0.10 -v filesrc location=c:/bbb.ts ! tsdemux ! audio/x-ac3 ! fakesink
But if I replace fakesink with autoaudiosink, it fails with a not-linked error.
And even the fakesink doesn't work for video:
gst-launch-0.10 -v filesrc location=c:/bbb.ts ! tsdemux ! video/x-mpeg2 ! fakesink
So I have two questions:
How to find out the pipeline created by the playbin element.
How to play an MPEG2-TS file using a GStreamer pipeline.
Answer to question 1:
There is a way to get graphs of the created pipeline, described in the documentation of Basic tutorial 11.
A brief excerpt from the page:
Getting pipeline graphs
For those cases where your pipeline starts to grow too large and you lose track of what is connected with what, GStreamer has the capability to output graph files. These are .dot files, readable with free programs like GraphViz, that describe the topology of your pipeline, along with the caps negotiated in each link.
This is also very handy when using all-in-one elements like playbin2 or uridecodebin, which instantiate several elements inside them.
I hope this resolves what you want.
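In practice, generating the graphs looks something like this (a sketch; the dump directory is arbitrary, and on Windows you would set the environment variable with set first):
GST_DEBUG_DUMP_DOT_DIR=/tmp/dots gst-launch-0.10 playbin2 uri=file:///c:/bbb.ts
gst-launch then writes a .dot file into that directory at each pipeline state change, which you can render with GraphViz, e.g.:
dot -Tpng /tmp/dots/<one-of-the-dumped-files>.dot -o pipeline.png
Make sure the directory exists beforehand; GStreamer will not create it for you.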