I am trying to write a small C++ module that live-streams from a camera. I am using GStreamer 1.0 with plugins 1.6 and gst-rtsp-server 1.6.0. I am able to start the stream, but when I try to stop and start it again, it does not restart.
my pipeline is:
std_pipeline = "videotestsrc ! video/x-raw ! "
"videoconvert ! x264enc ! rtph264pay"
and I am initializing it with the calls below:
server = gst_rtsp_server_new();
mounts = gst_rtsp_server_get_mount_points(server);
factory = gst_rtsp_media_factory_new();
gst_rtsp_media_factory_set_launch(factory, std_pipeline);
gst_rtsp_mount_points_add_factory(mounts, "/stream", factory);
server_id = gst_rtsp_server_attach(server, NULL);
To start the stream, I call this in another thread:
g_main_loop_run(loop);
and I try to destroy everything with:
GstRTSPSessionPool *pool = gst_rtsp_server_get_session_pool (server);
gst_rtsp_session_pool_filter (pool,
    (GstRTSPSessionPoolFilterFunc) remove_func, server);
g_object_unref (pool);
g_source_remove(server_id);
mounts = gst_rtsp_server_get_mount_points(server);
gst_rtsp_mount_points_remove_factory(mounts, "/stream");
gst_object_unref(mounts);
g_main_loop_quit(loop);
The stream stops, but when I try to initialize and start the stream again, it does not start.
Related
I am able to play a video on the command line with gstreamer's gst-launch like this:
gst-launch gnlfilesource location=file:///tmp/myfile.mov start=0 duration=2000000000 ! autovideosink
This plays the first 2 seconds of the file in /tmp/myfile.mov; afterwards the video playback stops. Is there any way to get this to loop repeatedly, i.e. turn the 2-second-long gnlfilesource into an infinite-length video that plays those 2 seconds again and again and again?
If using gst-launch then you may have to use while true; do [your command]; done as Fredrik has stated. However, if you are interested in C code, I have written code that may help you: it loops the video by issuing a seek whenever the end of the stream is reached after the first run.
//(c) 2011 enthusiasticgeek
// This code is distributed in the hope that it will be useful,
// but WITHOUT ANY WARRANTY; without even the implied warranty of
// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
#include <gst/gst.h>
gboolean bus_callback(GstBus *bus, GstMessage *msg, gpointer data)
{
GstElement *play = GST_ELEMENT(data);
switch (GST_MESSAGE_TYPE(msg))
{
case GST_MESSAGE_EOS:
/* restart playback if at end */
if (!gst_element_seek(play,
1.0, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH,
GST_SEEK_TYPE_SET, 2000000000, //2 seconds (in nanoseconds)
GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE)) {
g_print("Seek failed!\n");
}
break;
default:
break;
}
return TRUE;
}
gint
main (gint argc,
gchar *argv[])
{
GMainLoop *loop;
GstElement *play;
GstBus *bus;
/* init GStreamer */
gst_init (&argc, &argv);
loop = g_main_loop_new (NULL, FALSE);
/* make sure we have a URI */
if (argc != 2) {
g_print ("Usage: %s <URI>\n", argv[0]);
return -1;
}
/* set up */
play = gst_element_factory_make ("playbin", "play");
g_object_set (G_OBJECT (play), "uri", argv[1], NULL);
bus = gst_pipeline_get_bus (GST_PIPELINE (play));
gst_bus_add_watch (bus, bus_callback, play);
gst_object_unref (bus);
gst_element_set_state (play, GST_STATE_PLAYING);
/* now run */
g_main_loop_run (loop);
/* also clean up */
gst_element_set_state (play, GST_STATE_NULL);
gst_object_unref (GST_OBJECT (play));
return 0;
}
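To try it out, compile the program against gstreamer-1.0 (for example via pkg-config) and pass a single file URI argument such as file:///tmp/myfile.mov.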
Update:
See the following link
http://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/chapter-dataaccess.html
[Section 19.1.2. Play a region of a media file]. This could be used in conjunction with my code.
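For reference, playing only a region boils down to a seek that sets both a start and a stop position; below is a minimal sketch reusing the play element from my code above (the 0 to 2 second range is just an example):
/* Restrict playback to the 0-2 second region; on EOS the bus_callback
   above can issue the seek again to loop the region. */
if (!gst_element_seek (play, 1.0, GST_FORMAT_TIME,
        GST_SEEK_FLAG_FLUSH,
        GST_SEEK_TYPE_SET, 0,
        GST_SEEK_TYPE_SET, 2 * GST_SECOND)) {
  g_print ("Region seek failed!\n");
}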
This seems to be possible with the multifilesrc element:
gst-launch-1.0 multifilesrc location=alien-age.mpg loop=true ! decodebin ! autovideosink
The loop property seems to have been added back in June 2011.
According to folks on the #gstreamer IRC channel, you can't do this with GStreamer itself; you'd need something outside the GStreamer pipeline to loop it.
multifilesrc is the easiest way, but it won't work on media files whose "media length" is known. You can loop a video file only if the file does not carry any information about its time or length.
Open your file with any media player: if it shows the media length, or if you can seek forward or backward, that means it knows the media length and multifilesrc won't loop it.
How to convert a video file into a file without a time track (a stream file) with GStreamer:
You need to run two pipelines on the command line. First, run the recorder:
gst-launch-1.0 udpsrc port=10600 ! application/x-rtp-stream ! rtpstreamdepay name=pay1 ! rtph264depay ! h264parse ! video/x-h264,alignment=nal ! filesink location=my_timeless_file.mp4
It starts and waits for the incoming stream.
On another terminal, run the playback pipeline:
gst-launch-1.0 filesrc location=my_file_with_time_track ! queue ! decodebin ! videoconvert ! x264enc ! h264parse config-interval=-1 ! rtph264pay pt=96 ! rtpstreampay name=pay0 ! udpsink host=127.0.0.1 port=10600
The playback pipeline starts and eventually terminates once it has streamed the whole file; now go back to the first terminal and stop the recording pipeline with Ctrl+C.
(instead of udpsrc/udpsink you can use any other mechanisms to make the stream, like appsrc/appsink)
Now you have a new file which can be used in multifilesrc with loop:
gst-launch-1.0 multifilesrc location=my_timeless_file.mp4 loop=true ! queue ! decodebin ! videoconvert ! ximagesink
Why doesn't multifilesrc loop files with a known length?
Because when the length of the media is known, an EOS message is sent downstream and causes the whole pipeline to go to the NULL state. With that information removed, multifilesrc reaches the end of the file (byte stream) and tries to find the next file to play (remember, it is a "multi" file source, and by default it accepts wildcard locations like "image_%d.png"). When there is no wildcard pointing to a next file, it loops back to the only file it knows.
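If you prefer driving the same loop from C rather than gst-launch, here is a minimal sketch using gst_parse_launch with the pipeline string from above (my_timeless_file.mp4 is the file produced by the recording step; error handling kept minimal):
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GError *err = NULL;
  GstElement *pipeline;
  GMainLoop *loop;

  gst_init (&argc, &argv);

  /* Same pipeline as the gst-launch command above. */
  pipeline = gst_parse_launch ("multifilesrc location=my_timeless_file.mp4 "
      "loop=true ! queue ! decodebin ! videoconvert ! ximagesink", &err);
  if (pipeline == NULL) {
    g_printerr ("Failed to build pipeline: %s\n", err->message);
    g_clear_error (&err);
    return -1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (loop);

  /* Not reached in this sketch; cleanup would go here. */
  return 0;
}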
This is not looping the file inside GStreamer, but I was able to do it with ffmpeg's -stream_loop option.
https://ffmpeg.org/ffmpeg.html#Main-options
$ ffmpeg -re -stream_loop -1 -i /tmp/sample.mp4 -f rtsp rtsp://localhost:8554/stream
Assuming bash...
Wrap it in a while-loop?
while true; do [your command]; done
where true does nothing, successfully, i.e.
true: true
Return a successful result.
Exit Status:
Always succeeds.
It allows you to create infinite loops, e.g.
$ while true; do echo "run..."; sleep 1; done
run...
run...
run...
run...
run...
...
I am new to GStreamer. I wrote a simple RTSP server that generates a pipeline like:
appsrc name=vsrc is-live=true do-timestamp=true ! queue ! h264parse ! rtph264pay name=pay0 pt=96
The SDP response is generated after the DESCRIBE request, but only once a few frames have been delivered to the appsrc input via the signal handler:
vsrc = gst_bin_get_by_name_recurse_up(GST_BIN(element), "vsrc"); // appsrc
if (nullptr != vsrc)
{
gst_util_set_object_arg(G_OBJECT(vsrc), "format", "time");
g_signal_connect(vsrc, "need-data", (GCallback)need_video_data, streamResource);
}
The time from which the video is to be played is passed in the RTSP PLAY request, in the Range header, as an absolute time:
PLAY rtsp://172.19.9.65:554/Recording/ RTSP/1.0
CSeq: 4
Immediate: yes
Range: clock=20220127T082831.039Z- // Start from ...
I attached a handler to a signal on the GstRTSPClient object; in it I process this request and seek to the requested time in my appsrc:
g_signal_connect(client, "pre-play-request", (GCallback)pre_play_request, NULL);
The problem is that by this point the frames from my appsrc's start time have already arrived in the pipeline, so I watch them first, and only then does playback continue from the time specified in the PLAY request.
Can you please tell me how I can cut off these initial frames that arrived before the PLAY call?
I've tried:
gst_element_seek - it doesn't help, because of peculiarities of the appsrc implementation.
Flushing didn't help either; I tried resetting the sink pad of the rtph264pay element:
gst_pad_push_event(sinkPad, gst_event_new_flush_start());
GST_PAD_STREAM_LOCK(sinkPad);
// ... seek in appsrc
gst_pad_push_event(sinkPad, gst_event_new_flush_stop(TRUE));
GST_PAD_STREAM_UNLOCK(sinkPad);
gst_object_unref(sinkPad);
Thank You!
I am building my pipeline like this:
gst-launch-1.0 -v filesrc location=" + video_filename + " ! qtdemux ! queue max-size-buffers=0 max-size-time=0 ! vpudec frame-drop=0 ! queue ! imxv4l2sink device=/dev/video16
This is how I stop video playback
gst_element_set_state (m_gst_pipeline, GST_STATE_NULL);
g_main_loop_quit (m_mainloop);
gst_object_unref(m_gst_pipeline);
Stopping itself works fine; the only issue is that the last video frame stays displayed even after exiting the program. How do I flush the pipeline before destruction so that I get a black screen?
I want to improve the stream-writing performance of my GStreamer pipeline.
When I read from an RTSP stream, the CPU usage is about 1%~2%, which is very nice. But when I write the frames back to an RTMP/UDP stream over the network, it becomes an expensive task: the CPU usage goes up to 17%, even though I do not perform any image processing (just read and write straight back). So if I start processing 2+ channels of IP camera images simultaneously, the CPU will be almost fully loaded.
So the question is: how can I keep writing as cheap as reading (1~2% CPU usage)? Below is how I call the read/write process:
Read:
String url = "rtspsrc location=rtsp://admin:admin#xxx.xxx.xxx.xxx:554/Streaming/Channels/102" +
" latency=0 ! decodebin ! videoconvert ! appsink";
if (!capture.open(url, Videoio.CAP_GSTREAMER)) {
System.out.println("error opening capture");
return;
}
Write:
String urlWriter = "appsrc ! videoconvert ! x264enc tune=zerolatency threads=2 ! mpegtsmux ! udpsink host=localhost port=5000";
writer.open(urlWriter, Videoio.CAP_GSTREAMER, fourcc, 15, new Size(width, height), true);
if (!writer.isOpened()) {
System.out.println("error open writer");
return;
}
Thanks!!
UPDATE::
I want to stream video data (H264) over RTSP with GStreamer.
gst_rtsp_media_factory_set_launch (factory, "videotestsrc ! x264enc ! rtph264pay name=pay0 pt=96 ");
I want "videotestsrc ! x264enc ! rtph264pay name=pay0 pt=96" this pipeline would also be in C programming in place of direct command.
Actually I have custom pipeline, i want to pass this pipeline to GstRTSPMediaFactory.
With launch i am not able to pass my pipline.
source = gst_element_factory_make("videotestsrc", "test-source");
parse = gst_element_factory_make("x264enc", "parse");
sink = gst_element_factory_make("rtph264pay", "sink");
gst_bin_add_many(GST_BIN(pipeline), source, parse, sink, NULL);
gst_element_link_many(source, parse, sink, NULL);
Now I want to stream this pipeline over RTSP. I can stream with gst_rtsp_media_factory_set_launch,
but I want to pass only the pipeline variable and have it stream the video.
Is that possible, and if so, how?
I modified rtsp-media-factory.c as follows:
Added GstElement *pipeline to struct _GstRTSPMediaFactoryPrivate,
and added two more functions, get_pipeline and set_pipeline:
void
gst_rtsp_media_factory_set_launch_pipeline (GstRTSPMediaFactory * factory, GstElement *pipeline)
{
g_print("PRASANTH :: SET LAUNCH PIPELINE\n");
GstRTSPMediaFactoryPrivate *priv;
g_return_if_fail (GST_IS_RTSP_MEDIA_FACTORY (factory));
g_return_if_fail (pipeline != NULL);
priv = factory->priv;
GST_RTSP_MEDIA_FACTORY_LOCK (factory);
// g_free (priv->launch);
priv->pipeline = pipeline;
Bin = priv->pipeline;
GST_RTSP_MEDIA_FACTORY_UNLOCK (factory);
}
I implemented the getter in the same way.
Finally, in place of gst_parse_launch in the function default_create_element, I added these lines:
element = priv->pipeline; // priv is of type GstRTSPMediaFactoryPrivate
return element;
but I am not able to receive the data.
When I set the name pay0 on rtpmp2pay it works, but only once: if the client stops and starts again, it does not work, and to make it work I have to restart the server.
What is the problem?
** (rtsp_server:4292): CRITICAL **: gst_rtsp_media_new: assertion 'GST_IS_ELEMENT (element)' failed
To have some answer here:
It solves the main problem according to the comments discussion, but there is still a problem with requesting another stream (when the client stops and starts again).
The solution was to add a proper name for the payloader element, as stated in the docs:
The pipeline description should contain elements named payN, one for each
stream (ex. pay0, pay1, ...). Also, for increased compatibility each stream
should have a different payload type which can be configured on the payloader.
So this has to be changed to:
sink = gst_element_factory_make("rtph264pay", "pay0");
Notice the change in the element's name from sink to pay0.
For the stopping-client issue, I would check whether this also works with the parse (set_launch) version.
If it does, then check whether the parsed pipeline string (in the original source code of the RTSP server) is saved anywhere and reused after a restart; you need to debug this.
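As a reference point, here is a minimal sketch of the set_launch variant with the correctly named payloader; note that gst_rtsp_media_factory_set_shared is my own addition (not from the question) to reuse one media pipeline across client reconnects, and /test is just an example mount path:
GstRTSPServer *server = gst_rtsp_server_new ();
GstRTSPMountPoints *mounts = gst_rtsp_server_get_mount_points (server);
GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new ();

/* The payloader must be named pay0 so the RTSP server can find the stream. */
gst_rtsp_media_factory_set_launch (factory,
    "( videotestsrc ! x264enc ! rtph264pay name=pay0 pt=96 )");

/* Optional: reuse one media pipeline for all clients instead of creating a
   new one per client. */
gst_rtsp_media_factory_set_shared (factory, TRUE);

gst_rtsp_mount_points_add_factory (mounts, "/test", factory);
g_object_unref (mounts);
gst_rtsp_server_attach (server, NULL);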