As the title says, how can I change this pipeline so that it also plays the file's audio?
gst-launch-1.0 filesrc location='/usr/share/myfile.mp4' ! qtdemux ! h264parse ! imxvpudec ! imxipuvideosink framebuffer=/dev/fb2 &
I can then get the file to play with audio using:
gst-launch-1.0 -v playbin uri=file:///path/to/somefile.mp4
But I need the video output to go to device fb2, as in the first example.
Many thanks
I posted a link to this question on the GStreamer subreddit and a hero called Omerzet saved the day.
The following is the solution:
gst-launch-1.0 filesrc location='/usr/share/myfile.mp4' ! qtdemux name=demux demux.video_0 ! queue ! h264parse ! imxvpudec ! imxipuvideosink framebuffer=/dev/fb2 demux.audio_0 ! queue ! decodebin ! audioconvert ! audioresample ! alsasink device="sysdefault:CARD=imxhdmisoc"
Where framebuffer diverts the video to device /dev/fb2, and
alsasink device="sysdefault:CARD=imxhdmisoc"
diverts the audio to the sound card I defined.
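If you need the right device string for your own board (the card name above is specific to my i.MX setup), listing the available ALSA PCM devices with aplay from alsa-utils should show the candidate names to pass to alsasink's device property:
aplay -L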
I'm using the following pipeline to listen to a RTSP stream and save a video file:
gst-launch-1.0 -q rtspsrc location=rtsp://<ip>:<port>/video ! decodebin ! autovideoconvert ! x264enc pass=5 quantizer=25 speed-preset=6 ! h264parse ! matroskamux ! filesink location=<filename>
But even though I can see that the files are generated, they lack the video duration when played in VLC.
I can fix it by passing the file through ffmpeg afterwards, but I want gstreamer to generate a completely valid video in the first place. How can I fix this pipeline?
gst-launch-1.0 -e rtspsrc location=rtsp://<ip>:<port>/video ! decodebin ! videoconvert ! x264enc ! mp4mux ! filesink location="filename.mp4"
This will create a video whose duration is shown correctly. The -e flag makes gst-launch-1.0 send an EOS event down the pipeline when you interrupt it, which gives mp4mux the chance to finalize the file and write the duration metadata; without it the recording is simply cut off and the header is never completed.
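If you'd rather keep your original Matroska output, the same -e flag should help there as well, since matroskamux can also finalize the file and fix up the duration once it receives EOS. A minimal variant of your original pipeline (untested, keeping your encoder settings):
gst-launch-1.0 -e rtspsrc location=rtsp://<ip>:<port>/video ! decodebin ! autovideoconvert ! x264enc pass=5 quantizer=25 speed-preset=6 ! h264parse ! matroskamux ! filesink location=<filename>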
I have a third-party application that reads data from a thermal camera and sends an RTP stream to a given UDP port. I am trying to wrap this RTP stream in RTSP, but I am running into problems...
The third-party application basically runs GStreamer with this command:
appsrc format=GST_FORMAT_TIME is-live=true block=true caps=video/x-raw,width=640,height=480,format=GRAY8,clock-rate=90000,framerate=10/1 ! openjpegenc ! rtpj2kpay ! udpsink host=127.0.0.1 port=3000
Using the command below I can visualize the stream on my machine
gst-launch-1.0 udpsrc port=3000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG2000, sampling=(string)GRAYSCALE, width=(int)640, height=(int)480, payload=(int)96" ! queue ! rtpj2kdepay ! openjpegdec ! videoconvert ! xvimagesink
However, when I try to simply forward the stream over RTSP using the standard RTP-to-RTSP example from https://github.com/freedesktop/gstreamer-gst-rtsp-server/blob/master/examples/test-launch.c, the connection fails in VLC. Command below:
./rtp-src-to-rtsp '( udpsrc port=3000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG2000, sampling=(string)GRAYSCALE, width=(int)640, height=(int)480, payload=(int)96" ! queue ! rtpj2kdepay ! rtpj2kpay )'
Any light on what I am doing wrong? VLC gives only a non-descriptive error:
live555 error: Nothing to play for rtsp://{IP}:{PORT}/test
It might be a lack of J2K support in VLC (I'm using revision 3.0.8-0).
Simulating your source with:
gst-launch-1.0 videotestsrc ! video/x-raw,width=640,height=480,framerate=10/1,format=GRAY8 ! openjpegenc ! rtpj2kpay ! udpsink host=127.0.0.1 port=3000
and relaying as RTSP with:
./test-launch "udpsrc port=3000 auto-multicast=0 ! application/x-rtp,encoding-name=JPEG2000,sampling=GRAYSCALE ! queue ! rtpj2kdepay ! image/x-jpc ! jpeg2000parse ! rtpj2kpay name=pay0 "
works on Linux with X using:
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! application/x-rtp, encoding-name=JPEG2000,sampling=GRAYSCALE ! rtpj2kdepay ! jpeg2000parse ! openjpegdec ! videoconvert ! xvimagesink -v
That said, I haven't been able to receive the stream with VLC, nor to craft a correct J2K/RTP SDP for VLC or ffmpeg. Someone more skilled may be able to advise further.
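If VLC's missing J2K support is indeed the blocker, one possible workaround (an untested sketch, trading server CPU for client compatibility) is to decode the JPEG 2000 stream on the server and re-pay it as plain RTP/JPEG, which VLC does support:
./test-launch "udpsrc port=3000 auto-multicast=0 ! application/x-rtp,encoding-name=JPEG2000,sampling=GRAYSCALE ! queue ! rtpj2kdepay ! jpeg2000parse ! openjpegdec ! videoconvert ! jpegenc ! rtpjpegpay name=pay0"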
I'm trying to use this command to create multiple files from a stream, but they have no audio playback. I think decodebin should be dealing with it; what am I doing wrong?
gst-launch-1.0 -e filesrc location=video.mp4 ! queue ! decodebin ! queue ! videoconvert ! queue ! timeoverlay ! x264enc key-int-max=10 ! h264parse ! splitmuxsink location=videos/test%02d.mp4 max-size-time=1000000000000
Why do you assume that decodebin will handle it? decodebin decodes the audio track to raw audio and exposes an audio pad. If you don't make use of that pad, the audio will not make it into the file.
Since you are transcoding, you will have to re-encode the audio too:
gst-launch-1.0 -e filesrc location=video.mp4 ! queue ! decodebin ! queue ! \
videoconvert ! queue ! timeoverlay ! x264enc key-int-max=10 ! h264parse ! \
splitmuxsink location=videos/test%02d.mp4 max-size-time=1000000000000 \
decodebin0. ! queue ! voaacenc ! aacparse ! splitmuxsink0.
If you don't want to re-encode but rather pass the audio through, decodebin is the wrong tool; parsebin may be a better fit in that case.
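For completeness, a rough passthrough sketch with parsebin (untested, and assuming the file carries H.264 video and AAC audio; note that with passthrough the split points are constrained by the keyframe spacing already present in the file):
gst-launch-1.0 -e filesrc location=video.mp4 ! parsebin name=pb \
pb. ! queue ! h264parse ! splitmuxsink name=mux location=videos/test%02d.mp4 max-size-time=1000000000000 \
pb. ! queue ! aacparse ! mux.audio_0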
I can successfully stream HD video using the following pipelines:
stream server:
gst-launch-1.0 filesrc location="Gravity.2013.720p.BluRay.x264.YIFY.mp4" ! decodebin ! x264enc ! rtph264pay pt=96 ssrc=0 timestamp-offset=0 seqnum-offset=0 ! gdppay ! tcpclientsink host=192.168.1.93 port=5000
client:
gst-launch-1.0 tcpserversrc host=192.168.1.93 port=5000 ! gdpdepay ! rtph264depay ! decodebin ! autovideosink
I want to add the audio stream too.
I guess it is possible to use a different port and a second tcpserver/tcpclient pair to stream the audio in parallel with the video. But I am not certain how gstreamer would synchronize the two streams properly to play the movie at the client end. Apart from this method, are there any others, such as muxing the two streams before sending and demuxing them at the client end?
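Muxing before sending is indeed the usual answer: the container carries both elementary streams with shared timestamps, so the receiver can keep them in sync automatically. A rough sketch using mpegtsmux (untested; voaacenc is just one possible AAC encoder, substitute whatever your build provides):
server:
gst-launch-1.0 filesrc location="Gravity.2013.720p.BluRay.x264.YIFY.mp4" ! decodebin name=dec \
mpegtsmux name=mux ! tcpclientsink host=192.168.1.93 port=5000 \
dec. ! queue ! videoconvert ! x264enc ! h264parse ! queue ! mux. \
dec. ! queue ! audioconvert ! voaacenc ! aacparse ! queue ! mux.
client:
gst-launch-1.0 tcpserversrc host=192.168.1.93 port=5000 ! decodebin name=dec \
dec. ! queue ! videoconvert ! autovideosink \
dec. ! queue ! audioconvert ! audioresample ! autoaudiosink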