Can we write binary audio data directly to a GStreamer pipeline?

I want to write binary data directly to a GStreamer pipeline, but I'm unable to do so.
I tried the rawaudioparse plugin: I wrote the binary data into a .raw file and used this command to play it back:
gst-launch-1.0 filesrc location=audio.raw ! rawaudioparse use-sink-caps=false \
format=pcm pcm-format=s16le sample-rate=48000 num-channels=2 ! \
audioconvert ! audioresample ! autoaudiosink
My goal is to write binary audio data to a GStreamer pipeline and play it out as an RTMP stream.

Yes, you can achieve this with the fdsrc element, which reads data from a file descriptor (by default: standard input).
Your GStreamer pipeline will then look like this:
# Replace "cat audio.raw" with your actual commands
cat audio.raw | gst-launch-1.0 fdsrc ! rawaudioparse (...)
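To test such a pipeline end to end you first need a valid s16le/48000/2-channel file. Here is a minimal Python sketch that generates one second of a stereo test tone in exactly that layout (the 440 Hz tone and the audio.raw filename are arbitrary choices of mine, not part of the original question):

```python
import math
import struct

SAMPLE_RATE = 48000   # must match sample-rate=48000 in the caps
CHANNELS = 2          # must match num-channels=2
TONE_HZ = 440         # arbitrary test tone
DURATION_S = 1

with open("audio.raw", "wb") as f:
    for n in range(SAMPLE_RATE * DURATION_S):
        sample = int(32767 * 0.5 * math.sin(2 * math.pi * TONE_HZ * n / SAMPLE_RATE))
        # interleaved little-endian signed 16-bit, same sample on both channels
        f.write(struct.pack("<hh", sample, sample))
```

The resulting file can then be fed to the pipeline either via filesrc, or piped in with cat audio.raw as shown above.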

Related

How can I save a GStreamer RTSP stream of unknown type to a file?

I'm using this GStreamer pipeline to send an RTSP stream from a camera:
./gst-rtsp-launch --port 8554 "( v4l2src device=/dev/video0 ! video/x-raw,framerate=30/1,width=640,height=480 ! rtpvrawpay name=pay0 pt=96 )"
I want to use playbin so that I don't need to specify the type of video coming from the RTSP stream. With this pipeline I can get a single image from the camera:
gst-launch-1.0 playbin uri=rtsp://(ip-of-camera):8554 video-sink="jpegenc ! filesink location=capture1.jpeg"
But if I try this pipeline, to save as a file:
gst-launch-1.0 playbin uri=rtsp://(ip-of-camera):8554 video-sink="videoconvert ! video/x-h264,width=320,height=240 ! mp4mux ! filesink location=test.mp4"
I get this error:
ERROR: from element /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source/GstUDPSrc:udpsrc1: Internal data stream error.
Additional information for debugging:
../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source/GstUDPSrc:udpsrc1:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.094968753
Setting pipeline to NULL ...
Freeing pipeline ...
Note: the last two lines are translated back into English from my localized output.
Is there a problem in the pipeline I'm using to save the stream as a file?
I copied your pipeline and it worked.
I only added -e after gst-launch-1.0, so that GStreamer writes all the necessary information to the file when Ctrl-C is pressed:
gst-launch-1.0 -e playbin uri=(rtsp url) video-sink="videoconvert ! video/x-h264,width=320,height=240 ! mp4mux ! filesink location=test.mp4"
Maybe you are using an older version of GStreamer?
I am using GStreamer 1.20.0

gstreamer: pass frame PTS in command line API

Currently I have a setup like this.
my-app | gst-launch-1.0 -e fdsrc ! \
videoparse format=GST_VIDEO_FORMAT_BGR width=640 height=480 ! \
videoconvert ! 'video/x-raw, format=I420' ! x265enc ! h265parse ! \
matroskamux ! filesink location=my.mkv
From my-app I am streaming raw BGR frame buffers to gst. How can I also pass presentation timestamps (PTS) for those frames? I have fairly full control over my-app and can open other pipes from it to gst.
I know I have the option to use gstreamer C/C++ API or write a gstreamer plugin, but I was trying to avoid this.
I guess you can set a framerate on the videoparse element. You can also try do-timestamp=true on the fdsrc; maybe it requires a combination of both.
If you have the PTS in my-app, you would probably need to wrap the buffers and their PTS in real GstBuffers and use gdppay and gdpdepay as the payload across the link.
For example, if your my-app dumped the images in the following format:
https://github.com/GStreamer/gstreamer/blob/master/docs/random/gdp
(not sure how recent this info document is)
You could receive the data with the following pipeline:
fdsrc ! gdpdepay ! videoconvert ! ..
No need for resolution and format either, as they are part of the protocol too. And you will have the PTS as well, if it was set.
If you can use the GStreamer library in my-app, you could use a pipeline like this:
appsrc ! gdppay ! fakesink dump=true
And you would push your image buffers, with PTS set, into the appsrc.
See https://github.com/GStreamer/gst-plugins-bad/tree/master/gst/gdp for some examples of how GDP is used as a protocol.
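Whichever route is taken (do-timestamp, GDP, or appsrc), GStreamer buffer timestamps are plain nanosecond counts from the start of the stream. A small sketch of the arithmetic for a constant-framerate source (the 30 fps default is just an example of mine):

```python
GST_SECOND = 1_000_000_000  # GStreamer timestamps are in nanoseconds

def frame_pts(frame_index, fps_num=30, fps_den=1):
    """PTS of frame N for a constant-framerate stream, in nanoseconds."""
    return frame_index * GST_SECOND * fps_den // fps_num

def frame_duration(fps_num=30, fps_den=1):
    """Duration of one frame, in nanoseconds."""
    return GST_SECOND * fps_den // fps_num

# At 30 fps, frame 30 starts exactly one second into the stream:
print(frame_pts(30))     # 1000000000
print(frame_duration())  # 33333333
```

These are the values you would assign to a buffer's pts and duration fields before pushing it to appsrc.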

Saving webcam jpeg stream to multiple files with gstreamer

I'm trying to save a MJPEG stream from a logitech C920 webcam to multiple video files (matroska).
I've got this pipeline: (1 mkv file every 60s)
gst-launch-1.0 -ev v4l2src device=/dev/video0 \
! image/jpeg,width=1280,height=720,framerate=24/1 \
! matroskamux ! multifilesink next-file=max-duration max-file-duration=60000000000 location='test1-%02d.mkv'
It outputs several files, as expected, but the files have errors, so tools like avidemux can't play them back. mkvalidator reports these:
WRN080: Unknown element [FF] at 293 size 88
WRN080: Unknown element [FF] at 494 size 64
WRN080: Unknown element [7D][01] at 566 size w98603107602
WRN801: The segment has no SeekHead section
WRN0B8: Track #1 is defined but has no frame
BTW, saving to a single file using filesink produces an mkv file without errors.
Is there a way to save multiple mkv files properly?
Any other container is also OK, but I cannot transcode (need low CPU load) and I cannot use raw (need HD with high fps).
I'm using GStreamer 1.8.2 on Ubuntu 16.04.1.
Thanks.
Update:
Following the advice below, I tried with splitmuxsink:
gst-launch-1.0 -e v4l2src device=/dev/video1 \
! image/jpeg,width=1280,height=720,framerate=24/1 \
! splitmuxsink muxer=matroskamux location='test1-%02d.mkv' \
max-size-time=10000000000
But it doesn't work: The file is never split and keeps growing in size.
The following pipeline seems to work:
gst-launch-1.0 -e v4l2src ! x264enc key-int-max=10 ! h264parse ! splitmuxsink muxer=matroskamux location='test1-%02d.mkv' max-size-time=60000000000
multifilesink doesn't know anything about the container format, so you must use splitmuxsink to do the splitting.
Here is the relevant quote from the multifilesink documentation:
It is not possible to use this element to create independently
playable mp4 files, use the splitmuxsink element for that instead.
I had success with an upgraded GStreamer (Ubuntu 18.04):
$ gst-launch-1.0 --gst-version
GStreamer Core Library version 1.14.1
Here is a pipeline with an AVI container, where a new file is generated every ten seconds:
gst-launch-1.0 -e v4l2src device=/dev/video1 \
! image/jpeg,width=1280,height=720,framerate=24/1 \
! splitmuxsink muxer=avimux location='test1-%02d.avi' max-size-time=10000000000
It also works with matroskamux.
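One thing worth spelling out: max-size-time (and multifilesink's max-file-duration) are given in nanoseconds, and the long runs of zeros above are easy to miscount. A trivial helper:

```python
def to_max_size_time(seconds):
    """splitmuxsink's max-size-time property expects nanoseconds."""
    return int(seconds * 1_000_000_000)

print(to_max_size_time(60))  # 60000000000, as in the pipelines above
print(to_max_size_time(10))  # 10000000000
```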

Compress H264 Stream Using Gstreamer

I am trying to create a GStreamer pipeline (v 1.0) in order to record and play special file format.
For recording purpose I use the following pipeline:
gst-launch-1.0 videotestsrc ! video/x-raw, format=I420, width=640, height=480 ! videoconvert ! x264enc byte-stream=1 ! queue ! appsink
In the appsink (in the new_sample() callback) I compress the H264 stream with a custom method and store the result in an output file.
I use the following pipeline to play the recorded file:
gst-launch-1.0 appsrc ! video/x-h264 ! avdec_h264 ! autovideosink
In the appsrc I decompress the H264 stream and push it into the appsrc buffer (using push-buffer). The size of each buffer is 4095 bytes.
Unfortunately, after two buffers have been pushed, GStreamer prints the following error:
Error: Internal data flow error.
Is there any way to fix the problem?
Add legacyh264parse or h264parse (depending on your version of the GStreamer components) before your decoder. You need to be able to send full frames to the decoder.
After avdec_h264 it would be good to have a ffmpegcolorspace (videoconvert in GStreamer 1.0) to convert the video format to your display's requirements.
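The "full frames" requirement can be illustrated outside GStreamer: an H.264 byte-stream is a sequence of NAL units delimited by Annex-B start codes, and fixed 4095-byte pushes will usually cut a unit in half. Below is a rough Python sketch of the splitting that h264parse performs for you (h264parse does considerably more in practice, and the sample bytes here are made up):

```python
import re

# Annex-B start codes: 00 00 01 or 00 00 00 01
START_CODE = re.compile(b"\x00\x00\x00?\x01")

def split_nals(stream):
    """Split an H.264 Annex-B byte stream into NAL units (start codes stripped)."""
    matches = list(START_CODE.finditer(stream))
    nals = []
    for i, m in enumerate(matches):
        end = matches[i + 1].start() if i + 1 < len(matches) else len(stream)
        nals.append(stream[m.end():end])
    return nals

# A made-up stream with three NAL units:
stream = (b"\x00\x00\x00\x01\x67\xaa"       # "SPS"
          b"\x00\x00\x01\x68\xbb"           # "PPS", 3-byte start code
          b"\x00\x00\x00\x01\x65\xcc\xdd")  # "IDR slice"
print(split_nals(stream))  # [b'g\xaa', b'h\xbb', b'e\xcc\xdd']
```

Pushing one whole NAL unit (or access unit) per buffer is what the decoder expects; arbitrary fixed-size slices are not.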

Gstreamer : transcoding Matroska video to mp4

The hardware we are working on doesn't support playing mkv files,
so I need to transcode Matroska (mkv) video files to mp4 video files.
As I understand from the material available online on transcoding, I need to do the following:
separate out the different streams of the mkv file using the matroskademux element;
decode the audio and video streams into raw format using the available mkv decoders; and
supply this data to the mp4 muxer element, re-encoding to the required format.
Could anyone please tell me if I'm applying the right approach?
Any information/links on this would be very helpful.
vikram
Depending on what is in the Matroska file, you might not need to decode it at all, just remux.
If the video is, for instance, H264, just remux it.
Below is an example gst-launch pipeline for remuxing a file with H264 video and MP3 audio:
gst-launch-0.10 -v filesrc location=$file \
! matroskademux name="demux" demux. ! h264parse ! queue \
! mp4mux name=mux ! filesink location=$file._out.mp4 demux. \
! mp3parse ! queue ! mux.
You can also look at the Transmageddon transcoder (www.linuxrising.org), which should give you what you want.