I cannot figure out a gst-launch invocation to simply play an Opus file to PulseAudio. Any help?
Things I've tried
130 % file foo.opus
foo.opus: Ogg data, Opus audio, version 0.1, stereo, 44100 Hz (Input Sample Rate)
0 % gst-launch-1.0 filesrc location=foo.opus ! pulsesink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/GstPulseSink:pulsesink0: The stream is in the wrong format.
1 % gst-launch-1.0 filesrc location=foo.opus ! opusparse ! pulsesink
WARNING: erroneous pipeline: could not link opusparse0 to pulsesink0
1 % gst-launch-1.0 filesrc location=foo.opus ! opusdec ! pulsesink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc0: Internal data stream error.
# this runs, but only makes one burst of noise
0 % gst-launch-1.0 filesrc location=foo.opus ! opusparse ! opusdec ! pulsesink
Here is a working example:
gst-launch-1.0 filesrc location=foo.opus ! oggdemux ! opusparse ! opusdec ! alsasink
I was able to piece this together from the example pipelines here
https://gstreamer.freedesktop.org/documentation/opus/opusdec.html?gi-language=c
and
https://gstreamer.freedesktop.org/documentation/opus/opusenc.html?gi-language=c
In those examples, though, there is no opusparse. I don't know why opusparse is needed for my files but not for sine.ogg created with opusenc, or what the difference is between parsing and decoding the Opus file data ...
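For the original goal of playing to PulseAudio rather than ALSA, the same demux/parse/decode chain should work with pulsesink in place of alsasink. This is a sketch I haven't verified; audioconvert and audioresample are inserted defensively so the sink can negotiate a raw format it accepts:

```shell
# Sketch: same demux/parse/decode chain as the working example,
# but ending in pulsesink. audioconvert/audioresample convert the
# decoded audio to whatever caps pulsesink negotiates.
gst-launch-1.0 filesrc location=foo.opus ! oggdemux ! opusparse ! opusdec ! \
    audioconvert ! audioresample ! pulsesink
```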
Related
I'm using the following pipeline to listen to a RTSP stream and save a video file:
gst-launch-1.0 -q rtspsrc location=rtsp://<ip>:<port>/video ! decodebin ! autovideoconvert ! x264enc pass=5 quantizer=25 speed-preset=6 ! h264parse ! matroskamux ! filesink location=<filename>
But even though I can see the generated files, the video duration is missing when they are played in VLC.
I can fix it by passing them through ffmpeg afterwards, but I want gstreamer to generate a completely valid video in the first place. How can I fix this pipeline?
gst-launch-1.0 -e rtspsrc location=rtsp://<ip>:<port>/video ! decodebin ! videoconvert ! x264enc ! mp4mux ! filesink location="filename.mp4"
This will create a video with the duration shown correctly; the -e flag makes gst-launch send EOS on shutdown, so mp4mux can finalize the file.
I'm trying to convert an mp3 file to wav with gstreamer. Here's the pipeline:
gst-launch-1.0 filesrc location=audio.mp3 ! audio/mpeg ! mpg123audiodec ! wavenc ! filesink location=audio.wav
Also, I'd like the output to be 24-bit/48 kHz.
I get this error:
ERROR: from element /GstPipeline:pipeline0/GstCapsFilter:capsfilter0: Filter caps do not completely specify the output format
I saw another similar thread here and tried to comment, but I would need 50 reputation points or whatever ;)
I would make use of decodebin to make your life easier. I came up with this:
gst-launch-1.0 filesrc location=in.mp3 ! decodebin ! audioresample ! audioconvert ! \
audio/x-raw,format=S24LE,rate=48000 ! wavenc ! filesink location=out.wav
Which gives me this result:
$ file out.wav
out.wav: RIFF (little-endian) data, WAVE audio, Microsoft PCM, 24 bit, stereo 48000 Hz
Initial Issue
I would like to use the gstreamer plugin opusdec to decode an Opus bitstream. The final purpose is to wrap it with appsrc and appsink as input/output, to decode 20 ms Opus packets coming from an RTP packet payload and provide PCM samples.
Remark: I can't use gstreamer rtpopusdepay
The following pipeline works:
gst-launch-1.0 filesrc location=testvector01.bit.opus ! oggdemux ! opusdec ! fakesink
In my final application I'm not expecting Ogg-contained data, so I did the following:
1) De-encapsulate the Opus bitstream
gst-launch-1.0 filesrc location=testvector01.bit.opus ! oggdemux ! filesink location=testvector01.bit.demux
That works. And then:
2) Decode Opus bitstream
gst-launch-1.0 filesrc location=testvector01.bit.demux ! opusdec ! fakesink
and I have the following error:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstFileSrc:filesrc0:
streaming task paused, reason error (-5)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ..
Input File
testvector01.bit.opus
From the Opus test vectors: https://people.xiph.org/~greg/opus_testvectors/
My question is:
What is the proper way to use the gstreamer plugin opusdec without a transport container?
Update
GStreamer version 1.2.4
As recommended, I tried adding opusparse after filesrc and got the following error:
Pipeline is PREROLLING ...
(gst-launch-1.0:5147): GStreamer-WARNING **: gstpad.c:4555:store_sticky_event:<opusparse0:src> Sticky event misordering, got 'caps' before 'stream-start'
(gst-launch-1.0:5147): GStreamer-WARNING **: gstpad.c:4555:store_sticky_event:<opusdec0:sink> Sticky event misordering, got 'caps' before 'stream-start'
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstAudioSinkClock
ERROR: from element /GstPipeline:pipeline0/GstOpusDec:opusdec0: Decoding error: -4
Additional debug info:
gstopusdec.c(460): opus_dec_chain_parse_data (): /GstPipeline:pipeline0/GstOpusDec:opusdec0
Execution ended after 0:00:00.063372478
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
GStreamer 1.8.1
The following pipeline
gst-launch-1.0 filesrc location=testvector01.bit.demux ! opusparse ! opusdec ! audioconvert ! alsasink
halts here:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
GStreamer 1.13.1
gst-launch-1.0 filesrc location=testvector01.bit.demux ! opusparse ! opusdec ! alsasink
Playback just produces a short audio glitch, and no GStreamer error is raised.
gst-launch-1.0 filesrc location=testvector01.bit.opus ! oggdemux ! opusparse ! opusdec ! alsasink
Playback is choppy, and no GStreamer error is raised.
On GStreamer 1.14.1, the following works fine:
appsrc is-live=true do-timestamp=true name=audiosrc ! opusparse ! oggmux ! filesink location=test.ogg
You need a parser (opusparse) in between, as opusdec doesn't know what format the data is in. Try the following pipeline:
gst-launch-1.0 filesrc location=testvector01.bit.demux ! opusparse ! opusdec ! fakesink dump=true
I'm trying to get UPnP streaming to work. Rygel runs fine; however, all I get is a mono stream, even if the input is stereo. Doing some debugging, I replicated Rygel's gstreamer pipeline with
gst-launch-1.0 pulsesrc device=upnp.monitor num-buffers=100 ! audioconvert ! lamemp3enc target=quality quality=6 ! filesink location=test.mp3
where the problem is also apparent:
mp3info -x test.mp3
...
Media Type: MPEG 1.0 Layer III
Audio: Variable kbps, 44 kHz (mono)
...
Where does this pipeline lose the second channel? How can I debug this?
You never ask for stereo:
gst-launch-1.0 pulsesrc device=upnp.monitor num-buffers=100 ! "audio/x-raw,channels=2" ! audioconvert ! lamemp3enc target=quality quality=6 ! filesink location=test.mp3
Add a -v to the launch-line to see all the caps negotiated on all pads of the pipeline. Look for "channels" and see where it goes from 2 to 1.
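For instance, applying that to the pipeline from the question (an untested sketch; the grep merely filters the verbose caps output down to the relevant lines):

```shell
# Print the caps negotiated on every pad and filter for the channel count.
gst-launch-1.0 -v pulsesrc device=upnp.monitor num-buffers=100 ! audioconvert ! \
    lamemp3enc target=quality quality=6 ! filesink location=test.mp3 2>&1 | grep channels
```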
I used the following GStreamer pipeline to store my encoded stream in a binary file:
gst-launch v4l2src ! videorate ! video/x-raw-yuv, framerate=\(fraction\)10/1 \
! videoscale ! video/x-raw-yuv, format=\(fourcc\)I420, width=640, height=480\
! ffmpegcolorspace ! x264enc ! fdsink > vid.bin
Now I want to play the previously recorded files in GStreamer using the following pipeline:
cat vid.bin | gst-launch fdsrc ! ffdec_h264 ! autovideosink
But then it gives the following error:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/ffdec_h264:ffdec_h2640: Internal GStreamer error: negotiation problem. Please file a bug at http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer.
Additional debug info:
gstffmpegdec.c(2804): gst_ffmpegdec_chain (): /GstPipeline:pipeline0/ffdec_h264:ffdec_h2640:
ffdec_h264: input format was not set before data start
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
I know that the best way to capture video is to use a muxer, but is there any way to play my existing files?
Thanks
I'm not sure your pipeline is right.
If you want to write to a file, why not simply use filesink and filesrc?
fdsink > vid.bin will not work well, because the messages gst-launch prints also go into the file. [Just open vid.bin in a text editor and you will see what I mean.]
Also, for an x264 stream to be stored without a muxer, you need to set byte-stream=true on x264enc so that the stream is stored in Annex B format and is decodable.
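Putting those two fixes together (filesink instead of a shell redirect, plus byte-stream=true), the recording pipeline from the question would become something like this untested sketch:

```shell
# Sketch of the corrected 0.10-era recording pipeline:
# write via filesink (no shell redirect) and emit an Annex B byte-stream.
gst-launch v4l2src ! videorate ! video/x-raw-yuv,framerate=\(fraction\)10/1 \
    ! videoscale ! video/x-raw-yuv,format=\(fourcc\)I420,width=640,height=480 \
    ! ffmpegcolorspace ! x264enc byte-stream=true ! filesink location=vid.bin
```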
To play back a raw x264 stream you need a colorspace converter before the video sink:
gst-launch filesrc location=inputfile ! legacyh264parse ! ffdec_h264 ! queue ! ffmpegcolorspace ! autovideosink
This plays just fine at my end.
Or, to playback a raw h264 file with gstreamer 1.0:
gst-launch-1.0 filesrc location=/tmp/video.h264 ! h264parse ! avdec_h264 ! autovideosink