Gstreamer 1.0 error: Missing element: HTTP protocol source - gstreamer

I have installed GStreamer 1.0 with the base, good, ugly and bad plugins. When I run the following command:
gst-launch-1.0 playbin uri=http://-somr url to video src-
It gives me the following error:
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
Missing element: HTTP protocol source
ERROR: from element /GstURIDecodeBin:uridecodebin0: No URI handler implemented for "http".
Additional debug info:
gsturidecodebin.c(1416): gen_source_element (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0
Setting pipeline to NULL ...
Freeing pipeline ...
I am new to this, and most of the solutions I found are for gstreamer-0.10, which do not work for me. By the way, I am using Ubuntu 14.04.
I am badly in need of help.
Thanks in advance.

The error
Missing element: HTTP protocol source
and this code path in gsturidecodebin.c:
no_source:
  {
    /* whoops, could not create the source element, dig a little deeper to
     * figure out what might be wrong. */
    if (err != NULL && err->code == GST_URI_ERROR_UNSUPPORTED_PROTOCOL) {
      gchar *prot;

      prot = gst_uri_get_protocol (decoder->uri);
      if (prot == NULL)
        goto invalid_uri;

      gst_element_post_message (GST_ELEMENT_CAST (decoder),
          gst_missing_uri_source_message_new (GST_ELEMENT (decoder), prot));

      GST_ELEMENT_ERROR (decoder, CORE, MISSING_PLUGIN,
          (_("No URI handler implemented for \"%s\"."), prot), (NULL));

      g_free (prot);
    } else {
      GST_ELEMENT_ERROR (decoder, RESOURCE, NOT_FOUND,
          ("%s", (err) ? err->message : "URI was not accepted by any element"),
          ("No element accepted URI '%s'", decoder->uri));
    }

    g_clear_error (&err);
    return NULL;
  }
mean that your GStreamer installation is missing the HTTP client source plugin. Try gst-inspect-1.0 souphttpsrc; if it reports nothing, reinstall gst-plugins-good.
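If you prefer to check this from code rather than the command line, here is a minimal sketch (assuming GStreamer 1.x; souphttpsrc from gst-plugins-good is the usual provider of the http handler):

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElementFactory *factory;

  gst_init (&argc, &argv);

  /* Ask the registry whether any element can act as an http:// source */
  if (!gst_uri_protocol_is_supported (GST_URI_SRC, "http"))
    g_print ("No URI handler registered for http sources\n");

  /* souphttpsrc is normally the element that provides it */
  factory = gst_element_factory_find ("souphttpsrc");
  if (factory == NULL)
    g_print ("souphttpsrc not found -- reinstall gst-plugins-good\n");
  else
    gst_object_unref (factory);

  return 0;
}

On Ubuntu the plugin usually ships with the gstreamer1.0-plugins-good package (it relies on libsoup), so reinstalling that package should bring souphttpsrc back.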

Related

Removing RTP Extensions in GStreamer

In a gstreamer (v1.17.0) application coded in C++, I have an MPEG-TS/RTP stream with a special-purpose RTP extension. I suspect that rtpmp2tdepay has a problem with RTP extensions, because when I bypass the part of the chain that adds the RTP extensions, everything seems OK. So I decided to remove the RTP extension before the rtpmp2tdepay element.
But since there is no remove_extension kind of function in GstRTPBuffer, I have run into problems with that.
This is my code, which results in segmentation faults:
GstRTPBuffer rtp = GST_RTP_BUFFER_INIT;
GstBuffer *buf = gst_buffer_make_writable(inputbuf);
gst_rtp_buffer_map(buf, GST_MAP_READWRITE, &rtp);
gst_rtp_buffer_set_extension(&rtp, 0);
if ((&rtp)->map[1].memory != NULL)
{
    gst_buffer_unmap(buf, &((&rtp)->map[1]));
}
gst_rtp_buffer_unmap(&rtp);
return buf;
The error I encounter is this:
(receiver:1903): GStreamer-CRITICAL **: 13:19:43.409: gst_mini_object_unlock: assertion '(state & access_mode) == access_mode' failed
(receiver:1903): GStreamer-WARNING **: 13:19:43.409: free_priv_data: object finalizing but still has parent (object:0x7f4ea800d000, parent:0x7f4eb0107d80)
and later on:
(receiver:1903): GStreamer-CRITICAL **: 13:19:43.409: gst_mini_object_lock: assertion 'GST_MINI_OBJECT_IS_LOCKABLE (object)' failed
What is wrong with this code? Is there a better way to do this?
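For reference, the GstRTPBuffer helpers are expected to do all of the mapping and unmapping themselves; a minimal sketch that clears the extension bit without touching the internal map array (the extra gst_buffer_unmap on rtp.map[1] is what the criticals above point at) could look like the following. Note this only clears the X bit; the extension words themselves remain in the packet, so it may not be a complete fix for rtpmp2tdepay:

#include <gst/gst.h>
#include <gst/rtp/gstrtpbuffer.h>

/* Hypothetical helper: clear the RTP extension bit on a writable copy of
 * the buffer and let gst_rtp_buffer_unmap() release everything that
 * gst_rtp_buffer_map() mapped. */
static GstBuffer *
clear_extension_bit (GstBuffer * inputbuf)
{
  GstRTPBuffer rtp = GST_RTP_BUFFER_INIT;
  GstBuffer *buf = gst_buffer_make_writable (inputbuf);

  if (gst_rtp_buffer_map (buf, GST_MAP_READWRITE, &rtp)) {
    gst_rtp_buffer_set_extension (&rtp, FALSE);
    gst_rtp_buffer_unmap (&rtp);
  }

  return buf;
}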

FFmpeg - RTCP BYE packets

I'm working on a C++ project which depends on a Wi-Fi RAK5206 electronic board. I'm using the ffmpeg library to obtain the video and audio stream, and I have an issue where I can start and stop the stream four times, but when I want to start it for the fifth time I get an error. The error description is Invalid data found when processing input, it happens when I call the avformat_open_input function, and I then need to restart the electronic board, reconnect to the Wi-Fi, etc.
I figured out with Wireshark that VLC works and that it sends some BYE packets when TEARDOWN is called. I wonder if the error is related to them, because my application is not sending any. How can I set things up to force ffmpeg to send BYE packets?
I found some declarations in the rtpenc.h file showing which options to set, and tried them when connecting, but obviously without success.
The code that I used for setting options and opening input:
AVDictionary* stream_opts = 0;
av_dict_set(&stream_opts, "rtpflags", "send_bye", 0);
avformat_open_input(&format_ctx, url.c_str(), NULL, &stream_opts);
Make sure you are calling the av_write_trailer function from your application.
If not, please debug and check it.
/* Write the trailer, if any. The trailer must be written before you
* close the CodecContexts open when you wrote the header; otherwise
* av_write_trailer() may try to use memory that was freed on
* av_codec_close(). */
av_write_trailer(oc);
Call-flow snippet from the ffmpeg source:
av_write_trailer ->
....
ret = s->oformat->write_trailer(s);
} else {
s->oformat->write_trailer(s);
}
...
.write_trailer = rtp_write_trailer ->
...
if (s1->pb && (s->flags & FF_RTP_FLAG_SEND_BYE))
rtcp_send_sr(s1, ff_ntp_time(), 1)
The issue was resolved by adding flag 16 (binary: 10000) to the AVFormatContext object's flags:
formatCtx->flags = formatCtx->flags | 16;
According to rtpenc.h:
#define FF_RTP_FLAG_SEND_BYE 16
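For the record, rtpflags is a private option of the RTP muxer, so on the sending side it can also be requested by passing it to avformat_write_header as a muxer option. A rough sketch (octx here is a hypothetical, already-configured output AVFormatContext for the RTP/RTSP muxer):

#include <libavformat/avformat.h>

/* Sketch: ask the RTP muxer to send RTCP BYE on teardown, then make sure
 * av_write_trailer() actually runs so the BYE gets written. */
static int
write_with_bye (AVFormatContext *octx)
{
  AVDictionary *opts = NULL;
  int ret;

  av_dict_set (&opts, "rtpflags", "send_bye", 0);

  ret = avformat_write_header (octx, &opts);   /* muxer-private options go here */
  av_dict_free (&opts);
  if (ret < 0)
    return ret;

  /* ... av_interleaved_write_frame() loop ... */

  return av_write_trailer (octx);              /* rtp_write_trailer -> rtcp_send_sr(..., 1) */
}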

Gstreamer: why can't I send data over UDP on localhost?

I'm trying to test udp streaming on localhost but it's not showing anything:
videotestsrc (or audiotestsrc) -> udpsink (port: 5078, host: 127.0.0.1)
Here is the code:
console_out_inf("TESTING", "Starting work with test elements");
gint port = 5078;
// TEST PIPELINE OUT
gst_bin_add_many(GST_BIN(GSD->pipetest_out), GSD->testsrc, GSD->udpsink, NULL);
gchar* host = "127.0.0.1";
g_object_set(GSD->udpsink, "port", port, NULL);
g_object_set(GSD->udpsink, "host", host, NULL);
if (!gst_element_link(GSD->testsrc, GSD->udpsink))
console_out_bad("STREAMING", "Error linking test udp elements -- SEND");
else
console_out_yes("STREAMING", "Correctly linked test udp elements -- SEND");
// TEST PIPELINE IN
gst_bin_add_many(GST_BIN(GSD->pipetest_in), GSD->udpsrc, GSD->autovideosink, NULL);
gst_element_set_state(GSD->udpsrc, GST_STATE_NULL);
g_object_set(GSD->udpsrc, "port", port, NULL);
if (!gst_element_link(GSD->udpsrc, GSD->autovideosink))
console_out_bad("STREAMING", "Error linking test udp elements -- RECEIVE");
else
console_out_yes("STREAMING", "Correctly linked test udp elements -- RECEIVE");
// PLAY TEST PIPELINE OUT
GstStateChangeReturn ret1;
ret1 = gst_element_set_state(GSD->pipetest_out, GST_STATE_PLAYING);
if (ret1 == GST_STATE_CHANGE_FAILURE)
console_out_bad("TESTING", "Failed playing pipetest out");
else
console_out_yes("TESTING", "Correctly played pipetest out");
// PLAY TEST PIPELINE IN
GstStateChangeReturn ret2;
ret2 = gst_element_set_state(GSD->pipetest_in, GST_STATE_PLAYING);
if (ret2 == GST_STATE_CHANGE_FAILURE)
console_out_bad("TESTING", "Failed playing pipetest in");
else
console_out_yes("TESTING", "Correctly played pipetest in");
// PRINT PIPELINES
GST_DEBUG_BIN_TO_DOT_FILE(GST_BIN(GSD->pipetest_out), GST_DEBUG_GRAPH_SHOW_ALL, "pipetest_out");
GST_DEBUG_BIN_TO_DOT_FILE(GST_BIN(GSD->pipetest_in), GST_DEBUG_GRAPH_SHOW_ALL, "pipetest_in");
This is "my own console output":
EDIT: not relevant anymore! Everything is instanciated fine, the pipeline was built correctly, yet with
PIPELINE OUT: videotestsrc --> udpsink (host:127.0.0.1, port: 5078)
PIPELINE IN: udpsrc (port: 5078) --> autovideosink
The autovideosink does not display anything!
Checking netstat -a, no connection on that port is shown.
Additional INFO:
The graph generated with "gstreamer debugging" contains of course only the video/audio testsrc element connected to udpsink.
The first time I ran this code, the Windows Firewall dialog appeared, so I guess something is being sent/received.
This is inside a Visual Studio 2013/Qt5 Add-In project, but that should not be an issue.
Does anyone know what I am doing wrong?
This code seems fine, but it doesn't relate to the console output you posted.
Try testing your pipeline piece by piece with the command-line gst-launch:
gst-launch-1.0 -e -v videotestsrc ! udpsink host="127.0.0.1"
Connect it to a fakesink first and then swap in the udpsink; once you have it working on the command line, mirror the command in code.
Try it with host="localhost" or host="192.168.0.1"; I can't remember, but I think udpsink might have trouble sending to the loopback address.
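Once the command line behaves, mirroring it in code does not need the element-by-element setup above; a sketch with gst_parse_launch (pipeline descriptions taken from the gst-launch test, port and host are just examples) could look like this:

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *sender, *receiver;
  GMainLoop *loop;
  GError *err = NULL;

  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  /* Same descriptions as the gst-launch test.  Note: for raw video the
   * receiving udpsrc usually also needs its caps property set (or an RTP
   * payloader/depayloader pair), otherwise autovideosink has nothing it
   * can negotiate and stays blank. */
  sender = gst_parse_launch (
      "videotestsrc ! udpsink host=127.0.0.1 port=5078", &err);
  if (sender == NULL) {
    g_printerr ("sender parse error: %s\n", err->message);
    return -1;
  }

  receiver = gst_parse_launch ("udpsrc port=5078 ! autovideosink", &err);
  if (receiver == NULL) {
    g_printerr ("receiver parse error: %s\n", err->message);
    return -1;
  }

  gst_element_set_state (receiver, GST_STATE_PLAYING);
  gst_element_set_state (sender, GST_STATE_PLAYING);

  g_main_loop_run (loop);
  return 0;
}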

Data Transfer through RTSP in Gstreamer

UPDATE:
I want to stream video data (H264) through RTSP in Gstreamer.
gst_rtsp_media_factory_set_launch (factory, "videotestsrc ! x264enc ! rtph264pay name=pay0 pt=96 ");
I want "videotestsrc ! x264enc ! rtph264pay name=pay0 pt=96" this pipeline would also be in C programming in place of direct command.
Actually I have custom pipeline, i want to pass this pipeline to GstRTSPMediaFactory.
With launch i am not able to pass my pipline.
source = gst_element_factory_make("videotestsrc", "test-source");
parse = gst_element_factory_make("x264enc", "parse");
sink = gst_element_factory_make("rtph264pay", "sink");
gst_bin_add_many(GST_BIN(pipeline), source, parse, sink, NULL);
gst_element_link_many(source, parse, sink, NULL);
Now I want to stream this pipeline using RTSP. I can stream with gst_rtsp_media_factory_set_launch, but I want to pass only the pipeline variable and have it stream the video.
Is that possible, and if so, how?
I modified rtsp-media-factory.c as follows:
Added GstElement *pipeline in struct _GstRTSPMediaFactoryPrivate.
And added two more functions, get_pipeline and set_pipeline:
void
gst_rtsp_media_factory_set_launch_pipeline (GstRTSPMediaFactory * factory, GstElement *pipeline)
{
  g_print ("PRASANTH :: SET LAUNCH PIPELINE\n");

  GstRTSPMediaFactoryPrivate *priv;

  g_return_if_fail (GST_IS_RTSP_MEDIA_FACTORY (factory));
  g_return_if_fail (pipeline != NULL);

  priv = factory->priv;

  GST_RTSP_MEDIA_FACTORY_LOCK (factory);
  // g_free (priv->launch);
  priv->pipeline = pipeline;
  Bin = priv->pipeline;
  GST_RTSP_MEDIA_FACTORY_UNLOCK (factory);
}
The getter is implemented in the same way.
And finally, in place of gst_parse_launch in the function default_create_element, I added this line:
element = priv->pipeline; // priv is of type GstRTSPMediaFactoryPrivate
return element;
but I am not able to receive the data.
When I put pay0 as the name for rtpmp2pay it works, but only once. If the client stops and starts again, it does not work; to make it work again I have to restart the server.
What is the problem?
** (rtsp_server:4292): CRITICAL **: gst_rtsp_media_new: assertion 'GST_IS_ELEMENT (element)' failed
To have an answer here: this solves the main problem according to the comments discussion, but there is still a problem with requesting another stream (when the client stops and starts again).
The solution was to give the payloader element the proper name, as stated in the docs:
The pipeline description should contain elements named payN, one for each
stream (ex. pay0, pay1, ...). Also, for increased compatibility each stream
should have a different payload type which can be configured on the payloader.
So this has to be changed to:
sink = gst_element_factory_make("rtph264pay", "pay0");
Notice the change in the element's name from sink to pay0.
For the stopping-client issue, I would check whether this works with the parse (launch string) version. If yes, then check whether the parsed pipeline string (in the original RTSP server source code) is saved anywhere and reused after restart; you need to debug this.
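For comparison, a minimal server that uses the plain launch-string route with a correctly named payloader looks roughly like this (a sketch; the /test mount point and the default 8554 port are arbitrary choices). A cleaner way to supply a ready-made GstElement, instead of patching rtsp-media-factory.c, is usually to subclass GstRTSPMediaFactory and override its create_element vfunc.

#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>

int
main (int argc, char *argv[])
{
  GstRTSPServer *server;
  GstRTSPMountPoints *mounts;
  GstRTSPMediaFactory *factory;
  GMainLoop *loop;

  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  server = gst_rtsp_server_new ();
  mounts = gst_rtsp_server_get_mount_points (server);

  factory = gst_rtsp_media_factory_new ();
  /* The payloader must be named pay0 so the factory can find it */
  gst_rtsp_media_factory_set_launch (factory,
      "( videotestsrc ! x264enc ! rtph264pay name=pay0 pt=96 )");

  gst_rtsp_mount_points_add_factory (mounts, "/test", factory);
  g_object_unref (mounts);

  gst_rtsp_server_attach (server, NULL);

  /* Stream is then served at rtsp://127.0.0.1:8554/test */
  g_main_loop_run (loop);
  return 0;
}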

Set rtsp_flags to listen ffmpeg in c code

I am trying to create a client/server application to stream and then receive video over RTSP using the ffmpeg libraries. I am done with the client part, which streams the video, and I can receive the video in ffplay using the following command:
ffplay -rtsp_flags listen rtsp://127.0.0.1:8556/live.sdp
My problem is that I want to receive the video in C code, and I need to set the rtsp_flags option in it. Can anyone please help?
P.S. I cannot use ffserver because I am working on Windows, and ffserver is not available for Windows as far as I know.
You need to add the option when opening the stream:
AVDictionary *d = NULL; // "create" an empty dictionary
av_dict_set(&d, "rtsp_flags", "listen", 0); // add an entry
//open rtsp
if ( avformat_open_input( &ifcx, sFileInput, NULL, &d) != 0 ) {
printf( "ERROR: Cannot open input file\n" );
return EXIT_FAILURE;
}
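After that call returns, the usual demuxing sequence applies. A short sketch continuing the snippet above (ifcx as before; packet handling omitted, and av_packet_unref assumes a reasonably recent FFmpeg):

AVPacket pkt;

if ( avformat_find_stream_info( ifcx, NULL ) < 0 ) {
  printf( "ERROR: Cannot find stream info\n" );
  return EXIT_FAILURE;
}

while ( av_read_frame( ifcx, &pkt ) >= 0 ) {
  // ... handle pkt.stream_index / pkt.data / pkt.size ...
  av_packet_unref( &pkt );
}

avformat_close_input( &ifcx );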