GStreamer pipeline fails to run with osxaudiosrc plugin

I am running the pipeline below on a Mac, but it errors out:
$ **gst-launch-1.0 osxaudiosrc device=92**
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
0:00:00.048601000 16777 0x7fafe585d980 WARN osxaudio gstosxcoreaudio.c:500:gst_core_audio_asbd_to_caps: No sample rate
0:00:00.048699000 16777 0x7fafe585d980 ERROR audio-info audio-info.c:304:gboolean gst_audio_info_from_caps(GstAudioInfo *, const GstCaps *): no channel-mask property given
0:00:00.048736000 16777 0x7fafe585d980 WARN basesrc gstbasesrc.c:3072:void gst_base_src_loop(GstPad *):<osxaudiosrc0> error: Internal data stream error.
0:00:00.048744000 16777 0x7fafe585d980 WARN basesrc gstbasesrc.c:3072:void gst_base_src_loop(GstPad *):<osxaudiosrc0> error: streaming stopped, reason not-negotiated (-4)
New clock: GstAudioSrcClock
**ERROR: from element /GstPipeline:pipeline0/GstOsxAudioSrc:osxaudiosrc0: Internal data stream error.**
Additional debug info:
gstbasesrc.c(3072): void gst_base_src_loop(GstPad *) ():
/GstPipeline:pipeline0/GstOsxAudioSrc:osxaudiosrc0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.000101000
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
The device ID used in the command was fetched via gst-inspect and belongs to the MacBook speakers. I am using GStreamer 1.16.2 on Catalina.
What is wrong/missing in this pipeline?

TL;DR: you have an incomplete pipeline.
Once the osxaudiosrc starts producing buffers, where are they supposed to go? Do you want to encode them and/or write them to a file? Should they be streamed somewhere? Should they be plotted? ...
This is also the reason GStreamer is erroring out. There is no element after your source element, so if it were to start playing, those buffers would end up in the void with no destination to go to (to be a bit more thorough: you're trying to push data on a pad which has no peer, so it would try to dereference an invalid sink pad). Since this is not possible, GStreamer simply stops.
An example pipeline is given in the osxaudiosrc documentation:
gst-launch-1.0 osxaudiosrc ! wavenc ! filesink location=audio.wav
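For reference, a complete sketch that keeps the device from the question might look like this (assumption: device 92 actually exposes a capture stream; the question says it is the MacBook speakers, i.e. an output device, so the device property may need to point at an input device instead):
gst-launch-1.0 osxaudiosrc device=92 ! audioconvert ! wavenc ! filesink location=audio.wav
or, to simply monitor the captured audio locally:
gst-launch-1.0 osxaudiosrc device=92 ! audioconvert ! audioresample ! autoaudiosink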

Related

Errors when running ximagesink/xvimagesink on an imx8mm board

I was trying GStreamer with xvimagesink/ximagesink on an imx8mm board, together with a gstvideooverlay program. The errors are listed below:
gst-launch-1.0 videotestsrc ! videoconvert ! ximagesink
Setting pipeline to PAUSED ...
ERROR: from element /GstPipeline:pipeline0/GstXImageSink:ximagesink0: Could not initialise X output
Additional debug info:
../sys/ximage/ximagesink.c(867): gst_x_image_sink_xcontext_get (): /GstPipeline:pipeline0/GstXImageSink:ximagesink0:
Could not open display
ERROR: pipeline doesn't want to preroll.
Failed to set pipeline to PAUSED.
Setting pipeline to NULL ...
Freeing pipeline ...
./overlay
error: XDG_RUNTIME_DIR not set in the environment.
error: XDG_RUNTIME_DIR not set in the environment.
QMetaObject::invokeMethod: No such method MainWindow::quit()
Any suggestions about how to debug? Thanks for your reply.
BR
GUO
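A generic first debugging step, sketched under the assumption that an X server is actually running on the board, is to make sure the display environment variables are set and to raise the GStreamer log level before launching:
export DISPLAY=:0                          # assumption: an X server is listening on display :0
export XDG_RUNTIME_DIR=/run/user/$(id -u)  # assumption: this runtime directory exists on the board
GST_DEBUG=3 gst-launch-1.0 videotestsrc ! videoconvert ! ximagesink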

GStreamer: moov-recovery info with qtmux?

I am trying to experiment with GStreamer and moov-recovery via qtmux.
When I try to get the recovery moov from a non-corrupted .mp4 file
gst-launch-1.0 filesrc location=full.mp4 ! qtdemux ! qtmux moov-recovery-file=moov_recov.mrf ! filesink location=recovered_video.mp4
then I get
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 0:00:00.112361582
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
What is the reason for the Got EOS from element "pipeline0"?
And what would be the correct way to pull the recovery moov from the .mp4 file?
Thanks.
Your muxing process was a success; it took about a tenth of a second, hence the EOS. Since nothing crashed, the recovery file probably gets removed after a successful muxing run. There is no point in keeping it.
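If the goal is to actually consume that recovery file later (e.g. after an interrupted recording), the qtmoovrecover element is the intended counterpart. A rough, untested sketch with placeholder file names:
gst-launch-1.0 qtmoovrecover recovery-input=moov_recov.mrf broken-input=broken.mp4 fixed-output=fixed.mp4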

GStreamer plugins 'interpipe' (RidgeRun) and intervideo do not handle RGB format

I am trying to use RidgeRun's Interpipe plugin. When the input video is I420, it works fine, but negotiation fails with RGB video.
Below are the gst-launch commands I use in my app.
"videotestsrc ! video/x-raw,format=xRGB ! interpipesink name=camera sync=false"
"interpipesrc name=display listen-to=camera accept-events=false accept-eos-event=false enable-sync=false allow-renegotiation=false ! autovideosink sync=false async=false"
And the error I get:
ERROR: from element /GstPipeline:pipeline1/GstInterPipeSrc:display: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline1/GstInterPipeSrc:display:
streaming stopped, reason not-negotiated (-4)
WARN basesrc gstbasesrc.c:3055:gst_base_src_loop:<display1> error: Internal data stream error.
WARN basesrc gstbasesrc.c:3055:gst_base_src_loop:<display1> error: streaming stopped, reason not-negotiated (-4)
Is RGB not supported? How do I find out what formats are supported?
I get the same error from the intervideosrc/sink plugin, and was hoping the interpipe plugin would not have this limitation.
If this is so, what other solution do I have?
Can someone provide pointers on how to add support for additional formats? (I've started looking at the interpipe source code, but any hints would be appreciated.)
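Since I420 is reported to negotiate fine, one untested workaround sketch is to let videoconvert produce I420 before the interpipesink; gst-inspect-1.0 also shows which formats the element's pad templates actually advertise:
gst-inspect-1.0 interpipesink
"videotestsrc ! videoconvert ! video/x-raw,format=I420 ! interpipesink name=camera sync=false"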

Which GStreamer RTP payloader element should I use to wrap AAC audio?

I am trying to figure out the proper GStreamer element to use to transmit AAC audio over RTP.
By dumping the dot graph of a playbin on the file, I can conclude that the caps coming out of tsdemux are audio/mpeg,mpegversion:2,stream-format:adts.
If I use the following pipeline
gst-launch-1.0 -v filesrc location=$BA ! tsdemux ! audio/mpeg ! rtpmpapay ! filesink location=/tmp/test.rtp
it fails:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1: caps = audio/mpeg
WARNING: from element /GstPipeline:pipeline0/GstTSDemux:tsdemux0: Delayed linking failed.
Additional debug info:
/var/tmp/portage/media-libs/gstreamer-1.12.3/work/gstreamer-1.12.3/gst/parse/grammar.y(510): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstTSDemux:tsdemux0:
failed delayed linking some pad of GstTSDemux named tsdemux0 to some pad of GstRtpMPAPay named rtpmpapay0
ERROR: from element /GstPipeline:pipeline0/GstTSDemux:tsdemux0: Internal data stream error.
Additional debug info:
/var/tmp/portage/media-libs/gst-plugins-bad-1.12.3/work/gst-plugins-bad-1.12.3/gst/mpegtsdemux/mpegtsbase.c(1613): mpegts_base_loop (): /GstPipeline:pipeline0/GstTSDemux:tsdemux0:
streaming stopped, reason not-linked (-1)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
Which GStreamer element should I be using to wrap AAC audio in an RTP packet?
I guess it's rtpmp4apay (RTP MPEG4 audio payloader). You may also want/need aacparse before the payloader.
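Putting that together, an untested sketch (reusing the $BA variable from the question) would be:
gst-launch-1.0 -v filesrc location=$BA ! tsdemux ! aacparse ! rtpmp4apay ! filesink location=/tmp/test.rtp
If rtpmp4apay still rejects the MPEG-2 ADTS caps, rtpmp4gpay is another payloader that may be worth trying.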

Stream Icecast using Gstreamer

I'm designing a program to stream from an Icecast server (radio.clarkson.edu). Ultimately it will be written in Python 3, but for now I'm using gst-launch to test the pipeline. I've been working on Debian Jessie with gstreamer-1.0. Using a file from Wikimedia, I was able to play it pretty easily with:
url=https://upload.wikimedia.org/wikipedia/commons/0/0c/Muriel-Nguyen-Xuan-Korsakov-Flight-of-the-bumblebee.flac.oga
gst-launch-1.0 -v souphttpsrc location=$url ! decodebin ! audioconvert ! audioresample ! alsasink
Running the same commands with my stream, I get the output:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = text/uri-list
Missing element: text/uri-list decoder
ERROR: from element /GstPipeline:pipeline0/GstDecodeBin:decodebin0: Your GStreamer installation is missing a plug-in.
Additional debug info:
gstdecodebin2.c(3977): gst_decode_bin_expose (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0:
no suitable plugins found
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = "NULL"
Freeing pipeline ...
I have tried too many other pipelines to put in one post, but I can answer any other questions.
Thank you
By now you have probably solved the problem, but here's an idea anyway: text/uri-list indicates that you didn't hand an actual stream to GStreamer, but rather a (textual) playlist that contains stream addresses. I guess GStreamer can't handle those, so you need to parse the playlist beforehand and then hand an actual audio stream address to it.
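For example, if the server hands back a plain-text playlist (an assumption; adjust the parsing to whatever format it actually returns), a rough shell sketch of that idea would be:
# hypothetical: fetch the playlist, drop comment lines, take the first remaining line as the stream URL
stream_url=$(curl -s "$url" | grep -v '^#' | head -n 1)
gst-launch-1.0 -v souphttpsrc location="$stream_url" ! decodebin ! audioconvert ! audioresample ! alsasink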