I'm creating a player in Python-GStreamer, on a pretty dated GStreamer 0.10.32, like this:
import pygst
pygst.require("0.10")
import gst
import gobject

self.__player = gst.parse_launch(
    'filesrc name="source" location="/file/here.mp3" '
    '! audio/mpeg, mpegversion=1, layer=3 '
    '! ffdec_mp3 '
    '! audioconvert ! audioresample ! volume name="vol" '
    '! alsasink name="sink" sync=false')
It works fine, but I never get a tag message from the player's bus, and I do need the ID3 tags.
So I replaced the caps filter (audio/mpeg, mpegversion=1, layer=3) with id3demux, and on certain MP3s an error appears: "streaming task paused, reason not-linked (-1)".
Putting an identity or queue in front and linking to it doesn't help with id3demux.
For some reason, the mad element is not available on my platform.
Why won't my second pipeline work, or is there another way to get ID3 tags from the stream?
EDIT: Apparently this is caused by specific files; I have no idea yet what is so special about those MP3s. It also happens when I simply test the pipeline with gst-launch.
With GST_DEBUG=2, I'm getting:
0:00:00.046048767 32720 0x22388a0 WARN tagdemux gsttagdemux.c:680:gst_tag_demux_chain:<id3demux0> Downstream did not handle newsegment event as it should
0:00:00.046096615 32720 0x22388a0 WARN basesrc gstbasesrc.c:2625:gst_base_src_loop:<source> error: Internal data flow error.
0:00:00.046106087 32720 0x22388a0 WARN basesrc gstbasesrc.c:2625:gst_base_src_loop:<source> error: streaming task paused, reason not-linked (-1)
Putting the caps filter back in place of id3demux helps, but then I never get the tags.
I ended up resorting to playbin2. It manages to build a working pipeline that does send the tags message, somehow.
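For reference, reading the tags off playbin2's bus looks roughly like this (a minimal sketch for pygst 0.10, which requires Python 2; the file URI is a placeholder):

import pygst
pygst.require("0.10")
import gst
import gobject

player = gst.element_factory_make("playbin2", "player")
player.set_property("uri", "file:///file/here.mp3")  # placeholder URI

def on_message(bus, message):
    # the ID3 tags arrive as TAG messages on the pipeline's bus
    if message.type == gst.MESSAGE_TAG:
        taglist = message.parse_tag()
        for key in taglist.keys():
            print key, "=", taglist[key]

bus = player.get_bus()
bus.add_signal_watch()
bus.connect("message", on_message)

player.set_state(gst.STATE_PLAYING)
gobject.MainLoop().run()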
I am running the below pipeline on Mac, but it shows an error while running:
$ gst-launch-1.0 osxaudiosrc device=92
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
0:00:00.048601000 16777 0x7fafe585d980 WARN osxaudio gstosxcoreaudio.c:500:gst_core_audio_asbd_to_caps: No sample rate
0:00:00.048699000 16777 0x7fafe585d980 ERROR audio-info audio-info.c:304:gboolean gst_audio_info_from_caps(GstAudioInfo *, const GstCaps *): no channel-mask property given
0:00:00.048736000 16777 0x7fafe585d980 WARN basesrc gstbasesrc.c:3072:void gst_base_src_loop(GstPad *):<osxaudiosrc0> error: Internal data stream error.
0:00:00.048744000 16777 0x7fafe585d980 WARN basesrc gstbasesrc.c:3072:void gst_base_src_loop(GstPad *):<osxaudiosrc0> error: streaming stopped, reason not-negotiated (-4)
New clock: GstAudioSrcClock
ERROR: from element /GstPipeline:pipeline0/GstOsxAudioSrc:osxaudiosrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3072): void gst_base_src_loop(GstPad *) ():
/GstPipeline:pipeline0/GstOsxAudioSrc:osxaudiosrc0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.000101000
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
The device ID mentioned in the command was fetched from gst-inspect and is that of the MacBook speakers. I am using GStreamer 1.16.2 on Catalina.
What is wrong/missing in this pipeline?
TL;DR: you have an incomplete pipeline.
Once osxaudiosrc starts producing buffers, where are they supposed to go? Do you want to encode them and/or write them to a file? Should they be streamed somewhere? Should they be plotted? ...
This is also the reason GStreamer is erroring out. There is no element after your source element, so if it were to start playing, those buffers would end up in the void, with no destination to go to (to be a bit more thorough: you're trying to push data on a pad which has no peer, so it would try to dereference an invalid sink pad). Since this is not possible, GStreamer just plainly stops.
An example pipeline is given in the osxaudiosrc documentation:
gst-launch-1.0 osxaudiosrc ! wavenc ! filesink location=audio.wav
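Applied to the command from the question, a complete capture-to-file pipeline might look like this (keeping device=92 from the question; audioconvert and audioresample are added so the raw capture format can be negotiated with wavenc):
gst-launch-1.0 osxaudiosrc device=92 ! audioconvert ! audioresample ! wavenc ! filesink location=audio.wav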
I am trying to use RidgeRun's Interpipe plugin. When the input video is I420, it works fine, but negotiation fails with RGB video.
Below are the gst-launch commands I use in my app.
"videotestsrc ! video/x-raw,format=xRGB ! interpipesink name=camera sync=false"
"interpipesrc name=display listen-to=camera accept-events=false accept-eos-event=false enable-sync=false allow-renegotiation=false ! autovideosink sync=false async=false"
And the error I get:
ERROR: from element /GstPipeline:pipeline1/GstInterPipeSrc:display: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline1/GstInterPipeSrc:display:
streaming stopped, reason not-negotiated (-4)
WARN basesrc gstbasesrc.c:3055:gst_base_src_loop:<display1> error: Internal data stream error.
WARN basesrc gstbasesrc.c:3055:gst_base_src_loop:<display1> error: streaming stopped, reason not-negotiated (-4)
Is RGB not supported? How do I find out what formats are supported?
I get the same error from the intervideosrc/sink plugin, and was hoping the interpipe plugin would not have this limitation.
If this is so, what other solutions do I have?
Can someone provide pointers on how to add support for additional formats? (I've started looking at the interpipe source code, but any pointers would be appreciated.)
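For what it's worth, one way to see which formats an element advertises is to inspect its pad templates, either with gst-inspect-1.0 interpipesink or programmatically. A minimal sketch with the Python GI bindings (assuming the interpipe plugin is installed):

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

factory = Gst.ElementFactory.find("interpipesink")
if factory is None:
    raise SystemExit("interpipesink is not installed")

# Print the caps each pad template advertises
for tmpl in factory.get_static_pad_templates():
    print(tmpl.name_template, tmpl.direction.value_nick,
          tmpl.get_caps().to_string())

Note that template caps only show what the element claims to accept; the actual negotiation can still fail downstream, as the error above shows.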
I'm designing a program to stream from an Icecast server (radio.clarkson.edu). Ultimately it will be written in Python 3, but for now I'm using gst-launch to test the pipeline. I've been working on Debian Jessie and using gstreamer-1.0. Using a file on Wikimedia, I was able to play it pretty easily using:
url=https://upload.wikimedia.org/wikipedia/commons/0/0c/Muriel-Nguyen-Xuan-Korsakov-Flight-of-the-bumblebee.flac.oga
gst-launch-1.0 -v souphttpsrc location=$url ! decodebin ! audioconvert ! audioresample ! alsasink
Running the same command with my stream, I get this output:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = text/uri-list
Missing element: text/uri-list decoder
ERROR: from element /GstPipeline:pipeline0/GstDecodeBin:decodebin0: Your GStreamer installation is missing a plug-in.
Additional debug info:
gstdecodebin2.c(3977): gst_decode_bin_expose (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0:
no suitable plugins found
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = "NULL"
Freeing pipeline ...
I have tried too many other pipelines to put in one post, but I can answer any other questions.
Thank you
By now you have probably solved that problem, but still, here's an idea: text/uri-list indicates that you didn't hand an actual stream to GStreamer, but rather a (textual) playlist that contains stream addresses. I guess GStreamer can't handle those, hence you need to parse the playlist beforehand and then hand an actual audio stream address to it.
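That pre-parsing step is easy in Python 3, which the question mentions as the eventual target. A minimal sketch (the playlist URL is a hypothetical placeholder, and the body is assumed to be a plain text/uri-list, i.e. one URI per line with '#' comment lines):

import urllib.request

playlist_url = "http://radio.clarkson.edu/playlist"  # hypothetical placeholder

with urllib.request.urlopen(playlist_url) as response:
    body = response.read().decode("utf-8", errors="replace")

# text/uri-list: one URI per line; lines starting with '#' are comments
uris = [line.strip() for line in body.splitlines()
        if line.strip() and not line.lstrip().startswith("#")]

# hand the first actual stream URI to the GStreamer pipeline instead
print(uris[0])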
I am new to GStreamer, and I'm trying to use the emit-stats property of tsdemux.
How can I use it in my pipeline?
I'm trying to get the program clock reference (PCR) value of a transport stream, but I have found no way to get it.
Properties in GStreamer are normally accessed using the normal GLib APIs: g_object_set and g_object_get. Doing g_object_set (v1_demux, "emit-stats", TRUE, NULL);, supposing that v1_demux is a GstTSDemux*, will make it start emitting messages containing the PTS and DTS of the packets that flow into the demuxer.
Element messages in GStreamer are emitted via gst_element_post_message. To receive them in your application, set up a bus watch on the main pipeline's GstBus.
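In Python with the GI bindings, the whole setup might look like this (a minimal sketch; the file name and the element name "demux" are placeholders):

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

pipeline = Gst.parse_launch(
    'filesrc location=stream.ts ! tsdemux name=demux ! fakesink')
pipeline.get_by_name("demux").set_property("emit-stats", True)

def on_message(bus, message):
    # the stats arrive as ELEMENT messages posted by the demuxer
    if (message.type == Gst.MessageType.ELEMENT
            and message.src.get_name() == "demux"):
        print(message.get_structure().to_string())

bus = pipeline.get_bus()
bus.add_signal_watch()
bus.connect("message", on_message)

pipeline.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()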
Just for the record, you can test how the property works and see the content of the messages by running this example pipeline in gst-launch:
gst-launch-1.0 -m filesrc location="$YOUR_TRANSPORT_STREAM" ! tsdemux emit-stats=1 ! fakesink
Running this with one of the transport streams on my HDD, I can see messages with the PTS and DTS being emitted from the demuxer element:
Got message #77 from element "tsdemux0" (element): tsdemux, pid=(uint)1803, offset=(guint64)266020, pts=(guint64)8429319339;
Got message #78 from element "tsdemux0" (element): tsdemux, pid=(uint)1805, offset=(guint64)273540, pts=(guint64)8429311261;
Got message #79 from element "tsdemux0" (element): tsdemux, pid=(uint)1802, offset=(guint64)282564, dts=(guint64)8429444461;
However, it doesn't look like PCR and OPCR values are emitted. You'll have to add this functionality yourself.
Thanks for the info.
I tested the commands, watched the output, and checked the values, but I'm finding it hard to add the emit-stats messages to my own pipeline.
I created a bus watch on the main pipeline's GstBus to see the video duration and play time, but I cannot see the stats messages and the video messages simultaneously. I'm still investigating how to do it, storing the PTS and DTS information in some way.
My idea is to get the timestamps of two videos and subtract them to calculate an automatic offset for one video.
I'm having difficulties retrieving an RTSP stream from a specific camera, because the RTP payload type the camera provides is 35 (unassigned), while the payload types accepted by the rtph264depay plugin are in the range [96-127]. The result is that GStreamer displays an error like:
<udpsrc0> error: Internal data flow error.
<udpsrc0> error: streaming task paused, reason not-linked (-1)
Other cameras that I have tested work fine, because they define a proper payload type.
FFmpeg, MPlayer and other tools play the stream, although they may display a warning about the unknown type; for instance, MPlayer prints:
rtsp_session: unsupported RTSP server. Server type is 'unknown'
Is there any way in GStreamer to fake the payload type, ignore the mismatching property, force linking between the plugins, or otherwise work around my problem?
The pipeline I am using is:
gst-launch-0.10 rtspsrc location="..." ! rtph264depay ! capsfilter caps="video/x-h264,width=1920,height=1080,framerate=(fraction)25/1" ! h264parse ! matroskamux ! filesink location="test.mkv"
I figured it out and got it working. Posting an answer here in the hope that it might benefit someone; there are multiple similar questions out there, but they lack proper answers.
The following does the trick:
GstElement* depay = gst_element_factory_make("rtph264depay", "video_demux");
assert(depay);

/* Build less restrictive caps: no "payload" or "clock-rate" fields,
 * so any payload type (and clock rate) will be accepted. */
GstCaps* depay_sink_caps = gst_caps_new_simple("application/x-rtp",
    "media", G_TYPE_STRING, "video",
    "encoding-name", G_TYPE_STRING, "H264",
    NULL);

/* Force these caps onto the depayloader's sink pad. */
GstPad* depay_sink = gst_element_get_static_pad(depay, "sink");
gst_pad_use_fixed_caps(depay_sink);
gst_pad_set_caps(depay_sink, depay_sink_caps);

gst_caps_unref(depay_sink_caps);  /* the pad holds its own reference now */
gst_object_unref(depay_sink);
This overrides the rtph264depay plugin's sink pad caps to be less restrictive: it now accepts any payload type (and any clock-rate), as long as the stream is RTP and H.264-encoded.
I don't think this is possible with gst-launch.
There is a select-stream signal in the rtspsrc element, documented here: http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-good-plugins/html/gst-plugins-good-plugins-rtspsrc.html#GstRTSPSrc-select-stream
It's a callback where you check the stream: if you return TRUE, GStreamer will SETUP and PLAY the stream; if you return FALSE, it will ignore it. This should let you skip the unsupported stream. In my case, I'm having trouble with an ONVIF metadata stream: rtspsrc always tries to play it, and there is no parser for it. I really wish GStreamer would just ignore the streams it can't play and work with what it has, or at least offer a flag to toggle that behaviour.
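Hooking that signal up from Python with the GI bindings might look like this (a minimal sketch for GStreamer 1.x; the RTSP URL is a placeholder, and filtering on the "media" field is just one possible criterion):

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

src = Gst.ElementFactory.make("rtspsrc", "src")
src.set_property("location", "rtsp://camera.example/stream")  # placeholder

def on_select_stream(rtspsrc, num, caps):
    # return True to SETUP/PLAY this stream, False to skip it
    structure = caps.get_structure(0)
    media = structure.get_string("media")
    print("stream", num, "media:", media)
    return media == "video"  # e.g. skip audio/metadata streams

src.connect("select-stream", on_select_stream)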