Using gst-launch to stream video? - gstreamer

I want to stream a media file (video or audio). I used the command:
gst-launch-0.10 filesrc location="/home/ms/GStreamerTest/test.ogg" ! vorbisenc \
! rtpvorbispay pt=96 ! udpsink host=127.0.0.1 port=5000
to stream the file test.ogg, but I got an error:
"ERROR: from element /GstPipeline:pipeline0/GstVorbisEnc:vorbisenc0: Internal GStreamer error: negotiation problem. Please file a bug at http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer.
Additional debug info:
gstvorbisenc.c(1227): gst_vorbis_enc_chain (): /GstPipeline:pipeline0/GstVorbisEnc:vorbisenc0:
encoder not initialized (input is not audio?)
ERROR: pipeline doesn't want to preroll.
Please help me solve this problem, thanks.

You plugged an encoded and muxed bitstream into an audio encoder. That cannot possibly work.
In your case, filesrc ! udpsink would send your file across the network, and on the other side you would have to receive it (udpsrc), demux it (oggdemux), decode it (theoradec or vorbisdec), and pipe it into a sink (autovideosink or autoaudiosink).
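A minimal sketch of the two sides the answer describes, in the 0.10 syntax the question uses (untested here; pushing raw file bytes over UDP is fragile, since lost or reordered packets corrupt the Ogg stream, which is why RTP payloading is normally preferred):
gst-launch-0.10 filesrc location=/home/ms/GStreamerTest/test.ogg ! udpsink host=127.0.0.1 port=5000
gst-launch-0.10 udpsrc port=5000 ! oggdemux ! vorbisdec ! audioconvert ! autoaudiosink
For a video stream, the decode branch would use theoradec and autovideosink instead.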

Related

How can I save a GStreamer RTSP stream of unknown type to a file

I'm using this GStreamer pipeline to serve an RTSP stream from a camera.
./gst-rtsp-launch --port 8554 "( v4l2src device=/dev/video0 ! video/x-raw,framerate=30/1,width=640,height=480 ! rtpvrawpay name=pay0 pt=96 )"
I want to use playbin, so I don't need to specify the type of video coming from the RTSP stream. If I use this pipeline, I can get a single image from the camera:
gst-launch-1.0 playbin uri=rtsp://(ip-of-camera):8554 video-sink="jpegenc ! filesink location=capture1.jpeg"
But if I try this pipeline, to save as a file:
gst-launch-1.0 playbin uri=rtsp://(ip-of-camera):8554 video-sink="videoconvert ! video/x-h264,width=320,height=240 ! mp4mux ! filesink location=test.mp4"
I get this error:
ERROR: from element /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source/GstUDPSrc:udpsrc1: Internal data stream error.
Additional information for debugging:
../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source/GstUDPSrc:udpsrc1:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.094968753
Setting pipeline to NULL ...
Freeing pipeline ...
Note: the last two lines are translated; my system prints them in another language.
Is there a problem in the pipeline I'm using to save the stream as a file?
I copied your pipeline and it worked.
I only added -e after gst-launch-1.0, so that GStreamer writes all the necessary information to the file when Ctrl-C is pressed:
gst-launch-1.0 -e playbin uri=(rtsp url) video-sink="videoconvert ! video/x-h264,width=320,height=240 ! mp4mux ! filesink location=test.mp4"
Maybe you are using an older version of GStreamer?
I am using GStreamer 1.20.0
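If in doubt, the CLI itself reports the installed version:
gst-launch-1.0 --version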

Storing AAC Audio and Retrieving

I would like to store a file which has AAC audio frames.
For that I used the pipeline below:
gst-launch-1.0 filesrc location=Test_44100Hz_2ch_s16le.wav ! "audio/x-raw,rate=44100,format=s16le,channels=2" ! audioparse format=raw raw-format=s16le rate=44100 channels=2 ! faac ! aacparse ! queue ! filesink location=a1
While reading that file back to pulsesink using the pipeline below,
gst-launch-1.0 filesrc location=a1 ! aacparse ! faad ! audioconvert ! audioresample ! pulsesink
I am receiving the error below. I used GST_DEBUG=3, but I am not able to find the solution.
0:00:00.031924804 3379 0x2231d60 WARN basesrc gstbasesrc.c:3483:gst_base_src_start_complete:<filesrc0> pad not activated yet
Pipeline is PREROLLING ...
0:00:00.033044700 3379 0x2231050 WARN baseparse gstbaseparse.c:3255:gst_base_parse_loop:<aacparse0> error: No valid frames found before end of stream
ERROR: from element /GstPipeline:pipeline0/GstAacParse:aacparse0: No valid frames found before end of stream
Additional debug info:
gstbaseparse.c(3255): gst_base_parse_loop (): /GstPipeline:pipeline0/GstAacParse:aacparse0
ERROR: pipeline doesn't want to preroll.
Can anybody help me solve this? I need to store AAC audio frames and stream that file back as an AAC audio stream.
This is it, tested working:
gst-launch-1.0 filesrc location=WAV_44_16bit.wav ! decodebin ! audioconvert ! queue ! voaacenc ! aacparse ! queue ! mp4mux ! filesink location=aac.mp4
gst-launch-1.0 filesrc location=aac.mp4 ! decodebin ! audioconvert ! audioresample ! alsasink
The container stores metadata; without it, the decoder does not know how to process the data.
AAC audio streams require a container in order to be useful within GStreamer.
For decoder initialization it is necessary to know the sampling frequency and the Audio Object Type. In GStreamer we are unable to pass this metadata directly to the parser or the decoder; the parser collects this data from the MP4 header instead, and the decoder then inherits the frame structure/size and sample rate. So this is a deficiency in either aacparse (the parser) or avdec_aac/faad (the decoder), none of which expose parameters to specify the frame size of a raw file, i.e. the aforementioned metadata. That being said, I haven't found a compelling reason why anyone would need to do this. I found myself trying to do it before I discovered that the AAC simply needed to be muxed into an MP4 (mp4mux) or another container to work and be portable. The container/framing only adds a small amount of data to the stream.
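To confirm that the container really carries this metadata, the stored file can be inspected (a quick check, assuming the gst-discoverer-1.0 tool from gst-plugins-base is available):
gst-discoverer-1.0 aac.mp4
It should report an AAC audio stream together with the sample rate and channel count recovered from the MP4 header, which is exactly the information a raw AAC dump lacks.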

Gstreamer: Could not switch codebooks: rtpvorbisdepay

I am trying to stream audio with the following GStreamer pipeline:
Server:
gst-launch-1.0 -v audiotestsrc ! audioconvert ! vorbisenc ! rtpvorbispay ! udpsink host=127.0.0.1 port=5000
Client:
gst-launch-1.0 udpsrc port=5000 ! "application/x-rtp, media=audio, clock-rate=44100, encoding-name=VORBIS, encoding-params=1, payload=96" ! rtpvorbisdepay ! vorbisdec ! audioconvert ! autoaudiosink
I get the following message from GStreamer:
WARNING: from element /GstPipeline:pipeline0/GstRtpVorbisDepay:rtpvorbisdepay0: Could not decode stream.
Additional debug info:
gstrtpvorbisdepay.c(614): gst_rtp_vorbis_depay_process (): /GstPipeline:pipeline0/GstRtpVorbisDepay:rtpvorbisdepay0: Could not switch codebooks
And I don't get any sound on the client. Can anyone help?
[EDIT:]
When I copy-paste the caps from the server side, it works! But among those caps there is a configuration parameter which looks really ugly (link here). I noticed that if I just delete this parameter, it doesn't work anymore. Moreover, I ran gst-inspect on the udpsrc and rtpvorbisdepay elements, and there is nothing about this parameter. Can someone explain to me what this parameter corresponds to? Is there a way to avoid it?
I think this is a Theora/Vorbis thing; those are configuration parameters for the initialization of the decoder, if I understand it properly.
Theora makes the same controversial design decision that Vorbis made to include the entire probability model for the DCT coefficients and all the quantization parameters in the bitstream headers. This is often several hundred fields. It is therefore impossible to decode any frame in the stream without having previously fetched the codec info and codec setup headers.
~ from here
some similar question
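In practical terms, the ugly parameter is the configuration field that rtpvorbispay generates; it carries the Vorbis codebook headers, so the depayloader cannot work without it. A sketch of the workaround described in the edit above (the blob shown is a placeholder; copy the real value from the server's -v output for rtpvorbispay0's src pad):
gst-launch-1.0 -v audiotestsrc ! audioconvert ! vorbisenc ! rtpvorbispay ! udpsink host=127.0.0.1 port=5000
gst-launch-1.0 udpsrc port=5000 ! "application/x-rtp, media=audio, clock-rate=44100, encoding-name=VORBIS, encoding-params=1, configuration=(string)<blob-copied-from-server>, payload=96" ! rtpvorbisdepay ! vorbisdec ! audioconvert ! autoaudiosink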

Gstreamer - streaming with TCP sources

I am trying to stream video using TCP. The following is the server pipeline:
gst-launch filesrc location=<movie>.mkv ! decodebin ! ffenc_mpeg4 bitrate=5000000 ! rtpmp4vpay mtu=1400 pt=96 ssrc=0 timestamp-offset=0 seqnum-offset=0 send-config=true ! tcpserversink host=0.0.0.0 port=5000
And the client pipeline is:
gst-launch tcpclientsrc host=192.168.1.93 port=5000 ! capsfilter caps="application/x-rtp, media=(string)video, clock-rate=(int)2147483647, encoding-name=(string)MP4V-ES, profile-level-id=(string)1, config=(string)000001b001000001b58913000001000000012000c48d88007d0a041e1463000001b24c61766335322e3132332e30, payload=(int)96, ssrc=(uint)298758266, clock-base=(uint)3097828288, seqnum-base=(uint)63478" ! rtpmp4vdepay ! ffdec_mpeg4 ! autovideosink
I had to include a capsfilter between tcpclientsrc and rtpmp4vdepay; this does not seem to be required if I use udpsrc, however. (I can't seem to get multicast-based streaming to work through my BT router.) When started, the client begins to play the video, but it is garbage, and I can see the following warning on the client side.
WARNING: from element /GstPipeline:pipeline0/GstRtpMP4VDepay:rtpmp4vdepay0: Could not decode stream.
Additional debug info:
gstbasertpdepayload.c(387): gst_base_rtp_depayload_chain (): /GstPipeline:pipeline0/GstRtpMP4VDepay:rtpmp4vdepay0:
Received invalid RTP payload, dropping
This warning is emitted continuously.
What am I doing wrong here?
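One likely cause, not confirmed in this thread: TCP delivers a byte stream, so the RTP packet boundaries are lost between tcpserversink and tcpclientsrc, and the depayloader then sees broken packets. A common 0.10-era workaround is to frame each buffer with GDP; a sketch, assuming the gdppay/gdpdepay elements are installed:
gst-launch filesrc location=<movie>.mkv ! decodebin ! ffenc_mpeg4 bitrate=5000000 ! rtpmp4vpay mtu=1400 pt=96 send-config=true ! gdppay ! tcpserversink host=0.0.0.0 port=5000
gst-launch tcpclientsrc host=192.168.1.93 port=5000 ! gdpdepay ! rtpmp4vdepay ! ffdec_mpeg4 ! autovideosink
Because GDP also transports the caps, the hand-written capsfilter on the client side becomes unnecessary.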

Compress H264 Stream Using Gstreamer

I am trying to create a GStreamer pipeline (v 1.0) in order to record and play a special file format.
For recording purposes I use the following pipeline:
gst-launch-1.0 videotestsrc ! video/x-raw-yuv, format=\(fourcc\)I420, width=640, height=480 ! videoconvert ! x264enc byte-stream=1 ! queue ! appsink
In appsink (using the new_sample() callback) I use a compression method to compress the H264 stream and finally store it in an output file.
I use the following pipeline to play the recorded file:
gst-launch-1.0 appsrc ! video/x-h264 ! avdec_h264 ! autovideosink
In appsrc I decompress the H264 stream and send it to the appsrc buffer (using push-buffer). The size of each buffer is 4095 bytes.
Unfortunately, after pushing 2 buffers, GStreamer prints the following debug message:
Error: Internal data flow error.
Is there any way to fix the problem?
Add legacyh264parse or h264parse (depending on your version of gst components) before your decoder. You need to be able to send full frames to the decoder.
After avdec_h264 it would be nice to have a videoconvert (ffmpegcolorspace in 0.10) to convert the video format to your display's requirements.
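A sketch of the playback side with the parser added, in the same gst-launch shorthand the question uses (the stream-format caps are an assumption based on byte-stream=1 in the recording pipeline):
gst-launch-1.0 appsrc ! video/x-h264,stream-format=byte-stream ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
h264parse accumulates the arbitrarily sized 4095-byte chunks pushed into appsrc and emits complete frames, which is what the decoder needs.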