Save H264 encoded stream without re-encoding - gstreamer

I have a GStreamer pipeline that streams using:
v4l2src ! x264enc ! rtph264pay pt=96 ! udpsink host=ip port=8554
And this pipeline receives the stream:
udpsrc ! capsfilter ! rtpjitterbuffer ! rtph264depay ! tee
  / queue ! avdec_h264 ! appsink
  \ queue ! h264parse ! mp4mux ! filesink
The simplified receiver pipeline without the tee is:
gst-launch-1.0 udpsrc port=8080 caps="lots-of-caps" ! rtpjitterbuffer ! rtph264depay ! h264parse ! mp4mux ! filesink location=/home/rish/Desktop/recorded.264 -e
Question:
Is there a way to save the H264 encoded stream received from udpsrc without having to re-encode it? And how do I correctly close the filesink?
What I've tried so far: the discussion in this thread suggests the pipeline I've tried above, but the file is still corrupt (not correctly closed).
A similar question has been asked before; however, I do not want to decode and re-encode. Another answer in that thread suggests using the matroskamux element instead of mp4mux. This works, but I'd prefer to use mp4mux (no particular reason, but I'd like to know why matroskamux works and mp4mux doesn't).

Your pipeline is already muxing without re-encoding; there is no encoder in it, and h264parse is just a parser. As for matroskamux vs. mp4mux: mp4mux can only write its moov header once it receives EOS, so a file that is never closed properly stays unplayable, whereas matroskamux output remains largely playable even when it is cut off.
You've already got an answer on how to close the stream here: Sending EoS to filesink while removing branch from tee
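For reference, a minimal sketch of what that shutdown usually looks like from application code when stopping the whole receiver (the linked answer covers the per-branch case, where EOS is sent on the tee's pad instead); the function name and structure here are illustrative, not taken from the question:

#include <gst/gst.h>

/* Sketch: drain the pipeline and wait for EOS before tearing it down.
 * mp4mux only finalizes the file (writes the moov header) once EOS has
 * passed through it, which is why an aborted run leaves a corrupt file. */
static void
stop_recording (GstElement *pipeline)
{
  GstBus *bus = gst_element_get_bus (pipeline);

  /* EOS is injected at the sources and travels downstream through
   * rtph264depay, h264parse and mp4mux until it reaches the filesink. */
  gst_element_send_event (pipeline, gst_event_new_eos ());

  /* Block until the EOS (or an error) shows up on the bus. */
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_EOS | GST_MESSAGE_ERROR);
  if (msg)
    gst_message_unref (msg);
  gst_object_unref (bus);

  /* Only now is it safe to stop; the muxer has written its headers. */
  gst_element_set_state (pipeline, GST_STATE_NULL);
}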

Related

How to combine appsink and filesink using GStreamer?

I'm new to GStreamer and I'm trying to create a pipeline to display a video and record it at the same time. I've managed to make the display part using:
ss << "filesrc location=/home/videos/video1.avi ! avidemux name=demux demux.video_0 ! mpeg4videoparse ! avdec_mpeg4 ! nvvidconv ! video/x-raw,format=I420 ! appsink name=mysink";
Also, I've read that filesink location=somepath is used for saving data into a file, but I don't know how to combine it with the rest of the pipeline.
So, how do I use appsink and filesink in the same pipeline?
GStreamer offers a tee element for such cases. Note, however, that in most cases you will want a queue at the start of each branch of the tee to prevent deadlocks. E.g.:
filesrc location=/home/videos/video1.avi ! avidemux name=demux demux.video_0 ! mpeg4videoparse ! avdec_mpeg4 ! nvvidconv ! video/x-raw,format=I420 ! tee name=mytee ! queue ! appsink name=mysink mytee. ! queue ! filesink location=out.raw
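If you end up building this from code rather than with gst-launch, a rough sketch of the same tee layout with an appsink callback could look like the following (GStreamer 1.x assumed; the callback name and the out.raw location are illustrative, everything else is taken from the pipeline above):

#include <gst/gst.h>
#include <gst/app/gstappsink.h>

/* Called for every frame that reaches the appsink branch; the filesink
 * branch keeps writing to out.raw independently. */
static GstFlowReturn
on_new_sample (GstAppSink *sink, gpointer user_data)
{
  GstSample *sample = gst_app_sink_pull_sample (sink);
  if (!sample)
    return GST_FLOW_EOS;
  /* ... process gst_sample_get_buffer (sample) here ... */
  gst_sample_unref (sample);
  return GST_FLOW_OK;
}

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GstElement *pipeline = gst_parse_launch (
      "filesrc location=/home/videos/video1.avi ! avidemux name=demux "
      "demux.video_0 ! mpeg4videoparse ! avdec_mpeg4 ! nvvidconv ! "
      "video/x-raw,format=I420 ! tee name=mytee "
      "mytee. ! queue ! appsink name=mysink "
      "mytee. ! queue ! filesink location=out.raw", NULL);

  GstElement *appsink = gst_bin_get_by_name (GST_BIN (pipeline), "mysink");
  GstAppSinkCallbacks callbacks = { .new_sample = on_new_sample };
  gst_app_sink_set_callbacks (GST_APP_SINK (appsink), &callbacks, NULL, NULL);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  /* ... run a GMainLoop or wait on the bus for EOS/errors ... */
  return 0;
}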

record camera stream from gstreamer

I have a gstreamer pipeline which works perfectly and takes a camera stream, encodes it as H.264 video, saves it to a file AND displays it on the screen as follows:
gst-launch-1.0 -v -e autovideosrc ! tee name=t ! queue ! omxh264enc ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! qtmux ! filesink location=test.mp4 t. ! queue ! videoscale ! video/x-raw,width=480,height=270 ! xvimagesink sync=false
Now, I am trying to do something even simpler and just record the stream to a file (without displaying it on screen), and this does not seem to work! It writes a file, but I cannot play it. What I have tried so far is:
gst-launch-1.0 -v autovideosrc ! queue ! omxh264enc ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! qtmux ! filesink location=test.mp4 sync=false
I can also remove the queue element but with the same result:
gst-launch-1.0 -v autovideosrc ! omxh264enc ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! qtmux ! filesink location=test.mp4 sync=false
It does not give any errors but just does not write a valid stream to my filesink, it seems.
How do you stop the stream? Will the camera correctly inject an EOS signal? If not, and you just press Ctrl-C to stop the operation, the .mp4 file will be missing important headers which are required for proper playback.
Add -e to your command line. In that case, when you press Ctrl-C, the pipeline is not just stopped but properly shut down by sending an EOS event through the pipeline.
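If this later moves from gst-launch into an application, the equivalent of -e is to catch Ctrl-C yourself and push an EOS event instead of killing the pipeline. A sketch under that assumption (the element chain is copied from the question; omxh264enc is platform-specific, and the handler names are illustrative):

#include <gst/gst.h>
#include <glib-unix.h>
#include <signal.h>

/* On Ctrl-C, ask the pipeline to drain instead of dying, so qtmux can
 * write the headers the .mp4 needs to be playable. */
static gboolean
on_sigint (gpointer user_data)
{
  GstElement *pipeline = user_data;
  gst_element_send_event (pipeline, gst_event_new_eos ());
  return G_SOURCE_CONTINUE;   /* keep running until EOS reaches the bus */
}

static gboolean
on_bus_message (GstBus *bus, GstMessage *msg, gpointer user_data)
{
  GMainLoop *loop = user_data;
  if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_EOS ||
      GST_MESSAGE_TYPE (msg) == GST_MESSAGE_ERROR)
    g_main_loop_quit (loop);
  return TRUE;
}

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);
  GMainLoop *loop = g_main_loop_new (NULL, FALSE);

  GstElement *pipeline = gst_parse_launch (
      "autovideosrc ! omxh264enc ! video/x-h264,stream-format=(string)byte-stream "
      "! h264parse ! qtmux ! filesink location=test.mp4", NULL);

  GstBus *bus = gst_element_get_bus (pipeline);
  gst_bus_add_watch (bus, on_bus_message, loop);
  gst_object_unref (bus);

  g_unix_signal_add (SIGINT, on_sigint, pipeline);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_main_loop_run (loop);

  /* EOS (or an error) hit the bus: the muxer has finished the file. */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}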

How to demux audio and video from rtspsrc and then save to file using matroska mux?

I have been working on an application where I use rtspsrc to gather audio and video from one network camera to another. However, I cannot watch the stream from the camera and thereby can't verify that the stream works as intended. To verify that the stream is correct, I want to record it on an SD card and then play the file on a computer. The problem is that I want the camera to do as much of the parsing, decoding, and depayloading as possible, since that is the purpose of the application.
I therefore have to separate the audio and video streams with a demuxer, do the parsing, decoding, etc., and thereafter mux them back into a Matroska file.
The video decoder has been omitted since it is not done yet for this camera.
Demux to live playback sink (works)
gst-launch-0.10 -v rtspsrc location="rtsp://host:pass#192.168.0.91/XXX/XXXX?resolution=1280x720&audio=1&audiocodec=g711&audiosamplerate=8000&audiobitrate=64000" latency=0 name=d d. ! rtppcmudepay ! mulawdec ! audioresample ! audioconvert ! autoaudiosink d. ! rtph264depay ! ffdec_h264 ! queue ! ffmpegcolorspace ! autovideosink
Multiple rtspsrc to matroska (works)
gst-launch-1.0 -v rtspsrc location="rtsp://host:pass#192.168.0.91/XXX/XXXX?audio=1&audiocodec=g711&audiosamplerate=8000&audiobitrate=64000" latency=0 ! rtppcmudepay ! mulawdec ! audioresample ! audioconvert ! queue ! matroskamux name=mux ! filesink location=/var/spool/storage/SD_DISK/testmovie.mkv rtspsrc location="rtsp://root:pass#192.168.0.91/axis-media/media.amp?resolution=1280x720" latency=0 ! rtph264depay ! h264parse ! mux.
Single rtspsrc to matroska (fails)
gst-launch-1.0 -v rtspsrc location="rtsp://host:pass#192.168.0.91/XXX/XXXX?resolution=1280x720&audio=1&audiocodec=g711&audiosamplerate=8000&audiobitrate=64000" latency=0 name=d d. ! queue ! rtppcmudepay ! mulawdec ! audioresample ! audioconvert ! queue ! matroskamux name=mux d. ! queue ! rtph264depay ! h264parse ! queue ! mux. ! filesink location=/var/spool/storage/SD_DISK/testmoviesinglertsp.mkv
The last example fails with the error message
WARNING: erroneous pipeline: link without source element
Have I misunderstood the usage of matroskamux, and why do the two examples above work but not the last?
The problem is here:
queue ! mux. ! filesink
You need to do
queue ! mux. mux. ! filesink
mux. means that gst-launch should automatically select a pad from mux and link it. You could also specify a pad name manually, like mux.src. So, syntactically, you are missing another element/pad reference there to link the muxer to the filesink.
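For what it's worth, this is roughly what the mux. shorthand expands to if you build the pipeline in code: matroskamux exposes request pad templates (audio_%u and video_%u on GStreamer 1.x), each branch is linked to one of those, and the muxer itself still has to be linked onward to the filesink. A sketch with illustrative variable names:

#include <gst/gst.h>

static void
link_into_muxer (GstElement *audio_queue, GstElement *video_queue,
                 GstElement *mux, GstElement *filesink)
{
  /* Request one sink pad per stream from matroskamux. */
  GstPad *audio_sink = gst_element_get_request_pad (mux, "audio_%u");
  GstPad *video_sink = gst_element_get_request_pad (mux, "video_%u");

  GstPad *audio_src = gst_element_get_static_pad (audio_queue, "src");
  GstPad *video_src = gst_element_get_static_pad (video_queue, "src");

  gst_pad_link (audio_src, audio_sink);
  gst_pad_link (video_src, video_sink);

  /* This is the link that "queue ! mux. ! filesink" leaves out. */
  gst_element_link (mux, filesink);

  gst_object_unref (audio_src);
  gst_object_unref (video_src);
  gst_object_unref (audio_sink);
  gst_object_unref (video_sink);
}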

gstreamer recording m3u8 stream

I'm trying to record a stream from an m3u8 file.
This pipeline works:
gst-launch-0.10 -e souphttpsrc location=(mysrc.m3u8) ! queue ! hlsdemux ! queue ! mpegtsparse ! queue ! mpegtsdemux ! queue ! audio/mpeg ! queue ! filesink location=test.ts
and (sometimes) records the audio stream.
But I can't record video; whatever I do, it crashes.
I tried something like this:
gst-launch-0.10 souphttpsrc location=(mysrc.m3u8) ! queue ! hlsdemux ! queue ! mpegtsparse ! queue ! mpegtsdemux ! queue ! video/x-264 ! queue ! filesink location=test.ts
But it does nothing.
You are using GStreamer 0.10, which is obsolete and unmaintained; all users should upgrade to the 1.x series.
Given that warning, it is not clear whether you want to save the mpegts stream or the streams inside it.
To save the mpegts stream you can just do:
gst-launch-1.0 souphttpsrc location=http://path/to/your/stream.m3u8 ! hlsdemux ! filesink location=test.ts
Be aware that if the HLS playlist contains multiple bitrates, hlsdemux might switch bitrates and the pipeline will fail, as gst-launch-1.0 isn't capable of handling this (it is a debugging and testing tool). You can likely set a fixed connection-speed to make it always use the bitrate you desire and overcome this issue.
If you want to get only the video stream and you know it is H264, try:
gst-launch-1.0 souphttpsrc location=http://path/to/your/stream.m3u8 ! hlsdemux ! tsdemux ! queue ! video/x-h264 ! filesink location=test.h264
It might be a better idea to save it to a container format to allow easier use later, with something like:
gst-launch-1.0 souphttpsrc location=http://path/to/your/stream.m3u8 ! hlsdemux ! tsdemux ! queue ! video/x-h264 ! h264parse ! qtmux ! filesink location=test.mp4
But, as I said, please move to 1.x; HLS support is much better in 1.x than it was in 0.10, and it should work.
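If you do pin the bitrate from application code, connection-speed is an ordinary element property given in kbps. A tiny sketch, assuming the pipeline was built with gst_parse_launch and hlsdemux was given the name hls (both assumptions, not from the answer above):

#include <gst/gst.h>

static void
pin_hls_bitrate (GstElement *pipeline, guint kbps)
{
  GstElement *hls = gst_bin_get_by_name (GST_BIN (pipeline), "hls");
  /* connection-speed is in kbps; 0 means adapt from measured bandwidth. */
  g_object_set (hls, "connection-speed", kbps, NULL);
  gst_object_unref (hls);
}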

gstreamer output-selector does not allow saving to file

I want to listen to an audio stream from alsasrc continuously and, at the same time, be able to save snippets to file. I will push a button for 'record' or 'stop'.
I think I need the following gst-launch-0.10 command to work:
gst-launch-0.10 alsasrc do-timestamp=true ! tee name=t ! queue ! alsasink t. ! queue ! audioconvert ! wavenc ! output-selector name=s s. ! filesink location=test1.wav s. ! fakesink
I will program the output-selector to switch between the filesink and the fakesink when I push the record/stop button. I know the gst-plugins-bad/tests/icles/output-selector-test.c example in the plugins and want to hack that a bit.
Now the problem arises in the output-selector: it creates the file test1.wav but does not write to it. To focus on this problem I created:
gst-launch-0.10 audiotestsrc is-live=true do-timestamp=true ! wavenc ! output-selector name=s s. ! filesink location=test1.wav s. ! filesink location=test2.wav
and this also does not work (while gst-launch-0.10 audiotestsrc is-live=true do-timestamp=true ! wavenc ! filesink location=test1.wav works as expected). The two files are created but not written to. Can anybody point me in the right direction?
In posting "[gst-devel] how to link multiple filesinks to a output-selector before playing pipeline" I read that the 2nd sink is blocking on preroll. That why the example in output-selector-test.c uses a live-src as a trick, I also do that with audiotestsrc but it does not do the trick for me.
Prerolling is not your friend here, so set async=0 on both filesink and fakesink:
gst-launch-0.10 alsasrc do-timestamp=true ! tee name=t ! queue ! alsasink t. ! queue ! audioconvert ! wavenc ! output-selector name=s s. ! filesink location=test1.wav async=0 s. ! fakesink async=0
What about putting the wavenc element into the s. ! filesink location=test1.wav branch? wavenc does some work when the stream starts and stops; output-selector can only ensure that this works if the element sits behind it.
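The switching itself, once the pipeline is running, is just flipping the selector's active-pad property from the record/stop button handler. A sketch against the 1.x API, assuming the selector keeps the name s from the launch string and that src_0/src_1 were created in the order the branches appear (filesink first, fakesink second):

#include <gst/gst.h>

static void
set_recording (GstElement *pipeline, gboolean recording)
{
  GstElement *selector = gst_bin_get_by_name (GST_BIN (pipeline), "s");

  /* Request pads created by gst_parse_launch can be looked up by name. */
  GstPad *file_pad = gst_element_get_static_pad (selector, "src_0");
  GstPad *fake_pad = gst_element_get_static_pad (selector, "src_1");

  g_object_set (selector, "active-pad",
      recording ? file_pad : fake_pad, NULL);

  gst_object_unref (file_pad);
  gst_object_unref (fake_pad);
  gst_object_unref (selector);
}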