I want to send raw PCM audio data to the VLC player over RTP, using GStreamer, so that VLC plays the PCM.
Here is the command to send the PCM:
gst-launch-1.0 -v filesrc location=/home/webos/pcm_data_dump ! audio/x-raw, rate=44100, channels=2, endianness=1234, format=S16LE, layout=interleaved, clock-rate=44100 ! audioconvert ! audioresample ! audio/x-raw, rate=44100, channels=2, format=S32LE, layout=interleaved ! audioconvert ! rtpL16pay pt=10 ! application/x-rtp, pt=10, encoding-name=L16, payload=10, clock-rate=44100, channels=2 ! udpsink host=192.168.0.2 port=5555
Here is the VLC option to receive the PCM
rtp://192.168.0.2:5555
VLC can receive the PCM from GStreamer, but it cannot play it.
VLC shows the debug messages below; at the end, "core debug: Buffering 0%" is printed repeatedly.
core debug: output 'f32l' 44100 Hz Stereo frame=1 samples/8 bytes
core debug: looking for audio volume module matching "any": 2 candidates
core debug: using audio volume module "float_mixer"
core debug: input 's16l' 44100 Hz Stereo frame=1 samples/4 bytes
core debug: looking for audio filter module matching "scaletempo": 14 candidates
scaletempo debug: format: 44100 rate, 2 nch, 4 bps, fl32
scaletempo debug: params: 30 stride, 0.200 overlap, 14 search
scaletempo debug: 1.000 scale, 1323.000 stride_in, 1323 stride_out, 1059 standing, 264 overlap, 617 search, 2204 queue, fl32 mode
core debug: using audio filter module "scaletempo"
core debug: conversion: 's16l'->'f32l' 44100 Hz->44100 Hz Stereo->Stereo
core debug: looking for audio converter module matching "any": 12 candidates
audio_format debug: s16l->f32l, bits per sample: 16->32
core debug: using audio converter module "audio_format"
core debug: conversion pipeline complete
core debug: conversion: 'f32l'->'f32l' 44100 Hz->44100 Hz Stereo->Stereo
core debug: Buffering 0%
core debug: conversion pipeline complete
core debug: looking for audio resampler module matching "any": 3 candidates
core debug: Buffering 0%
core debug: Buffering 0%
core debug: Buffering 0%
core debug: Buffering 0%
core debug: Buffering 0%
.......
Also, the log below is shown once the GStreamer send command starts.
Normally, gst-launch keeps running after printing "New clock: GstSystemClock", but here the pipeline reaches EOS almost immediately:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = audio/x-raw, format=(string)S32LE, layout=(string)interleaved, rate=(int)44100, channels=(int)2
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = audio/x-raw, format=(string)S32LE, layout=(string)interleaved, rate=(int)44100, channels=(int)2
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = audio/x-raw, format=(string)S32LE, layout=(string)interleaved, rate=(int)44100, channels=(int)2
/GstPipeline:pipeline0/GstAudioConvert:audioconvert0.GstPad:src: caps = audio/x-raw, layout=(string)interleaved, rate=(int)44100, format=(string)S16BE, channels=(int)2, channel-mask=(bitmask)0x0000000000000003
/GstPipeline:pipeline0/GstQueue:queue1.GstPad:sink: caps = audio/x-raw, layout=(string)interleaved, rate=(int)44100, format=(string)S16BE, channels=(int)2, channel-mask=(bitmask)0x0000000000000003
/GstPipeline:pipeline0/GstQueue:queue1.GstPad:sink: caps = audio/x-raw, layout=(string)interleaved, rate=(int)44100, format=(string)S16BE, channels=(int)2, channel-mask=(bitmask)0x0000000000000003
/GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0.GstPad:src: caps = application/x-rtp, media=(string)audio, clock-rate=(int)44100, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)10, ssrc=(uint)2226113402, timestamp-offset=(uint)1744959080, seqnum-offset=(uint)62815
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)audio, clock-rate=(int)44100, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)10, ssrc=(uint)2226113402, timestamp-offset=(uint)1744959080, seqnum-offset=(uint)62815
/GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0.GstPad:sink: caps = audio/x-raw, layout=(string)interleaved, rate=(int)44100, format=(string)S16BE, channels=(int)2, channel-mask=(bitmask)0x0000000000000003
/GstPipeline:pipeline0/GstAudioConvert:audioconvert0.GstPad:sink: caps = audio/x-raw, format=(string)S32LE, layout=(string)interleaved, rate=(int)44100, channels=(int)2
/GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0: timestamp = 1744959080
/GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0: seqnum = 62815
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 0:00:00.622147167
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
gst-launch-0.10 has no problem; only 1.0 has this problem.
What is wrong here?
If I replace your filesrc with audiotestsrc, the example works for me. Still, let me point out some room for improvement:
Use audioparse instead of the first capsfilter.
Don't audioconvert twice.
Here is a simplified pipeline that works for me:
gst-launch-1.0 -v audiotestsrc ! audioresample ! audioconvert ! rtpL16pay pt=10 ! application/x-rtp, pt=10, encoding-name=L16, payload=10, clock-rate=44100, channels=2 ! udpsink host=localhost port=5555
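Note also that, depending on the VLC build, a bare rtp:// URL may not carry enough information for VLC to set up the stream; opening a small SDP file that describes the session is often more reliable. A minimal sketch (assuming the receiver address 192.168.0.2 and port 5555 from the question; untested):

```
v=0
o=- 0 0 IN IP4 192.168.0.2
s=PCM over RTP
c=IN IP4 192.168.0.2
t=0 0
m=audio 5555 RTP/AVP 10
a=rtpmap:10 L16/44100/2
```

Save this as, e.g., stream.sdp and open that file in VLC instead of the rtp:// URL.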
I have an RTP streaming app which implements the following pipeline using the C API.
gst-launch-1.0 -v rtpbin name=rtpbin \
videotestsrc ! x264enc ! rtph264pay ! rtpbin.send_rtp_sink_0 \
rtpbin.send_rtp_src_0 ! udpsink port=5002 host=127.0.0.1 \
rtpbin.send_rtcp_src_0 ! udpsink port=5003 host=127.0.0.1 sync=false async=false \
udpsrc port=5007 ! rtpbin.recv_rtcp_sink_0
I want to add header extensions to the RTP packets, so I created an extension using the new GstRTPHeaderExtension class introduced in GStreamer 1.20. I want to set the attributes of the extension (e.g. the color-space properties in the example below). AFAIU this should be done by providing them as caps to the payloader element, but I can't figure out how exactly to provide these caps. Do I need to use a capsfilter here, or what is the right way? In the current state I can send the RTP packets and see that the extension is added, but I can't set the attributes.
Related parts of the code are below:
#define URN_COLORSPACE "http://www.webrtc.org/experiments/rtp-hdrext/color-space"
const GstVideoColorimetry colorimetry = {
GST_VIDEO_COLOR_RANGE_0_255,
GST_VIDEO_COLOR_MATRIX_BT601,
GST_VIDEO_TRANSFER_BT2020_10,
GST_VIDEO_COLOR_PRIMARIES_BT2020};
const GstVideoChromaSite chroma_site = GST_VIDEO_CHROMA_SITE_MPEG2;
ext = gst_rtp_header_extension_create_from_uri(URN_COLORSPACE);
gst_rtp_header_extension_set_id(ext, 1);
g_signal_emit_by_name(videopay, "add-extension", ext);
// other element definitions, links..
videopay = gst_element_factory_make("rtph264pay", "videopay");
colorimetry_str = gst_video_colorimetry_to_string(&colorimetry);
// How to provide these caps to the payloader set the extension properties?
caps = gst_caps_new_simple("application/x-rtp",
"media", G_TYPE_STRING, "video",
"clock-rate", G_TYPE_INT, 90000,
"encoding-name", G_TYPE_STRING, "H264",
"colorimetry", G_TYPE_STRING, colorimetry_str,
"chroma-site", G_TYPE_STRING,
gst_video_chroma_to_string(chroma_site), NULL);
The caps should be provided to the sink of the RTP payloader element using a capsfilter element:
GstElement *capsfilt;
capsfilt = gst_element_factory_make("capsfilter", "capsfilter");
g_object_set(capsfilt, "caps", caps, NULL);
gst_element_link_many(videosrc, videoenc, capsfilt, videopay, NULL);
where videosrc, videoenc, videopay are the source, encoder and payloader elements, respectively.
Also, the caps should have a media type matching the encoder element, e.g. video/x-h264 if the encoder is an instance of x264enc.
If auto-header-extension is enabled (it is true by default), the payloader will automatically enable the extension with the attributes set in the caps by passing the caps to the extension.
In a gst-launch pipeline, the caps are passed automatically when the header extension is inserted after the payloader.
Constructing this pipeline from code in a few steps, using Gstreamermm.
Create appsrc_1:
appsrc = Gst::ElementFactory::create_element("appsrc", name);
appsrc->set_property("block", false);
appsrc->set_property("min-latency", 0);
appsrc->set_property("max-latency", 100);
appsrc->set_property("do-timestamp", false);
appsrc->set_property("format", 3); // 3 = GST_FORMAT_TIME
const char * format = nullptr;
switch (m_type)
{
case CV_8UC1: format = "GRAY8"; break;
case CV_8UC3: format = "BGR"; break;
default:
assert(false);
return;
}
appsrcCaps = Gst::Caps::create_simple("video/x-raw",
"format", format,
"framerate", Gst::Fraction(m_fps, 1),
"width", m_frameSize.width,
"height", m_frameSize.height);
appsrc->set_property("caps", appsrcCaps);
Create bin0:
bin = Glib::RefPtr<Gst::Bin>::cast_dynamic(Gst::Parse::create_bin("queue ! videoconvert", false));
binSinkPad = bin->add_ghost_pad(bin->get_element("queue0"), "sink");
binSrcPad = bin->add_ghost_pad(bin->get_element("videoconvert0"), "src");
Create tee:
tee = Gst::ElementFactory::create_element("tee");
tee->set_property("allow-not-linked", true);
Then, new pipeline was created and existing elements were added:
pipeline = Gst::Pipeline::create("webrtc_pipeline");
pipeline->add(appsrc);
pipeline->add(tee)->add(bin);
Link step:
appsrc->link(bin)->link(tee);
Then, at some point when a new client connects, bin1 is spawned with a placeholder ximagesink:
pipeline->set_state(Gst::STATE_READY);
bin1 = Glib::RefPtr<Gst::Bin>::cast_dynamic(Gst::Parse::create_bin("queue name=q ! ximagesink", false));
videoSinkPad = bin1->add_ghost_pad(bin1->get_element("q"), "sink", "videosinkpad");
bin1->set_state(Gst::STATE_PAUSED);
pipeline->add(bin1);
tee->link(bin1);
pipeline->set_state(Gst::STATE_PLAYING);
After those steps I get this strange log and error:
0:01:13.399425680 32822 0x7fd578002800 DEBUG videoconvert gstvideoconvert.c:405:gst_video_convert_transform_caps:<videoconvert0> transformed video/x-raw, format=(string)GRAY8, framerate=(fraction)8/1, width=(int)640, height=(int)480 into video/x-raw, format=(string){ I420, YV12, YUY2, UYVY, AYUV, VUYA, RGBx, BGRx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR, Y41B, Y42B, YVYU, Y444, v210, v216, Y210, Y410, NV12, NV21, GRAY8, GRAY16_BE, GRAY16_LE, v308, RGB16, BGR16, RGB15, BGR15, UYVP, A420, RGB8P, YUV9, YVU9, IYU1, ARGB64, AYUV64, r210, I420_10BE, I420_10LE, I422_10BE, I422_10LE, Y444_10BE, Y444_10LE, GBR, GBR_10BE, GBR_10LE, NV16, NV24, NV12_64Z32, A420_10BE, A420_10LE, A422_10BE, A422_10LE, A444_10BE, A444_10LE, NV61, P010_10BE, P010_10LE, IYU2, VYUY, GBRA, GBRA_10BE, GBRA_10LE, BGR10A2_LE, GBR_12BE, GBR_12LE, GBRA_12BE, GBRA_12LE, I420_12BE, I420_12LE, I422_12BE, I422_12LE, Y444_12BE, Y444_12LE, GRAY10_LE32, NV12_10LE32, NV16_10LE32, NV12_10LE40 }, width=(int)640, height=(int)480, framerate=(fraction)8/1
0:01:13.399443298 32822 0x7fd578002800 DEBUG videoconvert gstvideoconvert.c:405:gst_video_convert_transform_caps:<videoconvert0> transformed video/x-raw, format=(string)GRAY8, framerate=(fraction)8/1, width=(int)640, height=(int)480 into video/x-raw, framerate=(fraction)8/1, width=(int)640, height=(int)480
0:01:13.399507923 32822 0x7fd578002800 DEBUG videoconvert gstvideoconvert.c:342:gst_video_convert_fixate_caps:<videoconvert0> trying to fixate othercaps video/x-raw, framerate=(fraction)8/1, width=(int)640, height=(int)480, format=(string)BGRx, pixel-aspect-ratio=(fraction)5/3 based on caps video/x-raw, format=(string)GRAY8, framerate=(fraction)8/1, width=(int)640, height=(int)480
0:01:13.399515137 32822 0x7fd578002800 DEBUG videoconvert gstvideoconvert.c:353:gst_video_convert_fixate_caps:<videoconvert0> now fixating video/x-raw, framerate=(fraction)8/1, width=(int)640, height=(int)480, format=(string)BGRx, pixel-aspect-ratio=(fraction)5/3
0:01:13.399519528 32822 0x7fd578002800 DEBUG videoconvert gstvideoconvert.c:290:gst_video_convert_fixate_format:<videoconvert0> source format GRAY8
0:01:13.399524394 32822 0x7fd578002800 DEBUG videoconvert gstvideoconvert.c:300:gst_video_convert_fixate_format:<videoconvert0> iterate 1 structures
0:01:13.399528710 32822 0x7fd578002800 DEBUG videoconvert gstvideoconvert.c:264:score_value:<videoconvert0> score GRAY8 -> BGRx = 3
0:01:13.399532151 32822 0x7fd578002800 DEBUG videoconvert gstvideoconvert.c:269:score_value:<videoconvert0> found new best 3
0:01:13.399550504 32822 0x7fd578002800 ERROR videoconvert gstvideoconvert.c:490:gst_video_convert_set_info:<videoconvert0> input and output formats do not match
...
0:01:13.399741982 32822 0x7fd578002800 WARN basetransform gstbasetransform.c:1370:gst_base_transform_setcaps:<videoconvert0> FAILED to configure incaps video/x-raw, format=(string)GRAY8, framerate=(fraction)8/1, width=(int)640, height=(int)480 and outcaps video/x-raw, framerate=(fraction)8/1, width=(int)640, height=(int)480, format=(string)BGRx, pixel-aspect-ratio=(fraction)5/3
0:01:13.399756085 32822 0x7fd578002800 INFO task gsttask.c:312:gst_task_func:<queue0:src> Task going to paused
0:01:13.524512587 32822 0x7fd578002360 WARN basesrc gstbasesrc.c:3072:gst_base_src_loop:<appsrc_1> error: Internal data stream error.
0:01:13.524526933 32822 0x7fd578002360 WARN basesrc gstbasesrc.c:3072:gst_base_src_loop:<appsrc_1> error: streaming stopped, reason not-negotiated (-4)
The question is: how can I fix this?
PS: How can this happen if upstream is limited to format=GRAY8?
0:01:13.399507923 32822 0x7fd578002800 DEBUG videoconvert gstvideoconvert.c:342:gst_video_convert_fixate_caps:<videoconvert0> trying to fixate othercaps video/x-raw, framerate=(fraction)8/1, width=(int)640, height=(int)480, format=(string)BGRx, pixel-aspect-ratio=(fraction)5/3 based on caps video/x-raw, format=(string)GRAY8, framerate=(fraction)8/1, width=(int)640, height=(int)480
0:01:13.399515137 32822 0x7fd578002800 DEBUG videoconvert gstvideoconvert.c:353:gst_video_convert_fixate_caps:<videoconvert0> now fixating video/x-raw, framerate=(fraction)8/1, width=(int)640, height=(int)480, format=(string)BGRx, pixel-aspect-ratio=(fraction)5/3
I think your problem is not the pixel format GRAY8 but the pixel-aspect-ratio=5/3, which is not the same upstream and downstream of your videoconvert element.
videoconvert cannot convert between different pixel aspect ratios, as can be seen in its source.
Check that you have the right resolution set for your ximagesink (see the ximagesink docs).
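For intuition about what that caps field means: pixel-aspect-ratio stretches each pixel horizontally by n/d, so the display aspect ratio is (width × n) : (height × d). With PAR 5/3, a 640×480 frame would no longer be 4:3; a quick shell check:

```shell
# display aspect ratio with pixel-aspect-ratio = 5/3 on a 640x480 frame
echo $(( 640 * 5 ))   # 3200
echo $(( 480 * 3 ))   # 1440 -> 3200:1440 reduces to 20:9 instead of 4:3
```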
GStreamer raises an error when trying to stream side-by-side video from a stereoscopic UVC camera.
I have a stereoscopic camera attached via USB to an ARM board, but at the highest resolution setting the camera allows, GStreamer raises an "Invalid Dimension 0x0" error.
v4l2-ctl --list-formats-ext -d /dev/video2
ioctl: VIDIOC_ENUM_FMT
Index : 0
Type : Video Capture
Pixel Format: 'MJPG' (compressed)
Name : Motion-JPEG
Size: Discrete 2560x960
Interval: Discrete 0.017s (60.000 fps)
Interval: Discrete 0.033s (30.000 fps)
Size: Discrete 2560x720
Interval: Discrete 0.017s (60.000 fps)
Interval: Discrete 0.033s (30.000 fps)
Size: Discrete 1280x480
Interval: Discrete 0.017s (60.000 fps)
Interval: Discrete 0.033s (30.000 fps)
Size: Discrete 640x240
Interval: Discrete 0.017s (60.000 fps)
Interval: Discrete 0.033s (30.000 fps)
$ gst-launch-1.0 -v v4l2src device=/dev/video2 ! "image/jpeg, width=2560, height=960, framerate=60/1" ! progressreport ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = image/jpeg, width=(int)2560, height=(int)960, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = image/jpeg, width=(int)2560, height=(int)960, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstProgressReport:progressreport0.GstPad:src: caps = image/jpeg, width=(int)2560, height=(int)960, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)60.000000, x-dimensions=(string)"2560\,960", payload=(int)26, ssrc=(uint)1656850644, timestamp-offset=(uint)2590317031, seqnum-offset=(uint)18356
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)60.000000, x-dimensions=(string)"2560\,960", payload=(int)26, ssrc=(uint)1656850644, timestamp-offset=(uint)2590317031, seqnum-offset=(uint)18356
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:sink: caps = image/jpeg, width=(int)2560, height=(int)960, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstProgressReport:progressreport0.GstPad:sink: caps = image/jpeg, width=(int)2560, height=(int)960, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = image/jpeg, width=(int)2560, height=(int)960, framerate=(fraction)60/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
0:00:00.163107430 7652 0x55d47a920a30 WARN v4l2bufferpool gstv4l2bufferpool.c:790:gst_v4l2_buffer_pool_start:<v4l2src0:pool:src> Uncertain or not enough buffers, enabling copy threshold
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: timestamp = 2590320450
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: seqnum = 18356
progressreport0 (00:00:05): 4 seconds
progressreport0 (00:00:10): 9 seconds
progressreport0 (00:00:15): 14 seconds
0:00:15.770955137 7652 0x55d47a920a30 WARN v4l2src gstv4l2src.c:968:gst_v4l2src_create:<v4l2src0> lost frames detected: count = 2 - ts: 0:00:15.622372937
progressreport0 (00:00:20): 19 seconds
progressreport0 (00:00:25): 24 seconds
progressreport0 (00:00:30): 29 seconds
progressreport0 (00:00:35): 34 seconds
Then on the viewing machine (currently just using localhost on the same laptop):
$ gst-launch-1.0 -e -v udpsrc port=5000 ! application/x-rtp, encoding-name=JPEG, payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, encoding-name=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:sink: caps = application/x-rtp, encoding-name=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
WARNING: from element /GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0: Invalid Dimension 0x0.
Additional debug info:
gstrtpjpegdepay.c(741): gst_rtp_jpeg_depay_process (): /GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0
WARNING: from element /GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0: Invalid Dimension 0x0.
The two lowest-resolution modes work with this configuration, but the 720p side-by-side mode throws the errors above.
What am I doing wrong here? And does this perhaps have anything to do with gst-launch-1.0 not supporting full-screen mode?
Thanks in advance
See RFC 2435 (https://www.rfc-editor.org/rfc/rfc2435) on width and height:
3.1.5. Width: 8 bits
This field encodes the width of the image in 8-pixel multiples (e.g.,
a width of 40 denotes an image 320 pixels wide). The maximum width
is 2040 pixels.
3.1.6. Height: 8 bits
This field encodes the height of the image in 8-pixel multiples
(e.g., a height of 30 denotes an image 240 pixels tall). When
encoding interlaced video, this is the height of a video field, since
fields are individually JPEG encoded. The maximum height is 2040
pixels.
Here you can see that the limit is 2040 pixels in both dimensions. This is a protocol limitation.
Check the GStreamer source code (https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/blob/master/gst/rtp/gstrtpjpegdepay.c):
/* allow frame dimensions > 2040, passed in SDP session or media attributes
* from gstrtspsrc.c (gst_rtspsrc_sdp_attributes_to_caps), or in caps */
if (!width)
width = rtpjpegdepay->media_width;
if (!height)
height = rtpjpegdepay->media_height;
Here you can see that the GStreamer developers offer a way around this protocol limitation: when receiving images over 2040 pixels in either dimension, you will probably have to add the width and height information to the receiver's caps yourself.
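To make the limit concrete: the Width and Height fields are single bytes counting 8-pixel units, so the largest encodable dimension is 255 × 8 = 2040, while your 2560-pixel width would need a field value of 320. Plain shell arithmetic, nothing GStreamer-specific:

```shell
# RFC 2435 encodes width/height as 8-bit fields in units of 8 pixels
echo $(( 255 * 8 ))    # 2040: the largest dimension the field can carry
echo $(( 2560 / 8 ))   # 320: the camera's width does not fit in 8 bits
```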
How can I write a pipeline that streams videotestsrc H.265-encoded over RTSP, and another that plays back the former?
As far as I understand, this should be a valid server:
gst-launch-1.0 -v videotestsrc ! video/x-raw,width=1280,height=720 ! x265enc ! rtph265pay ! udpsink host=127.0.0.1 port=5000
The output is
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = "video/x-raw\,\ format\=\(string\)I420\,\ width\=\(int\)1280\,\ height\=\(int\)720\,\ framerate\=\(fraction\)30/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = "video/x-raw\,\ format\=\(string\)I420\,\ width\=\(int\)1280\,\ height\=\(int\)720\,\ framerate\=\(fraction\)30/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive"
/GstPipeline:pipeline0/GstX265Enc:x265enc0.GstPad:sink: caps = "video/x-raw\,\ format\=\(string\)I420\,\ width\=\(int\)1280\,\ height\=\(int\)720\,\ framerate\=\(fraction\)30/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive"
Redistribute latency...
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = "video/x-raw\,\ format\=\(string\)I420\,\ width\=\(int\)1280\,\ height\=\(int\)720\,\ framerate\=\(fraction\)30/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive"
/GstPipeline:pipeline0/GstX265Enc:x265enc0.GstPad:src: caps = "video/x-h265\,\ stream-format\=\(string\)byte-stream\,\ alignment\=\(string\)au\,\ level\=\(string\)3.1\,\ tier\=\(string\)main\,\ profile\=\(string\)main\,\ width\=\(int\)1280\,\ height\=\(int\)720\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ framerate\=\(fraction\)30/1"
/GstPipeline:pipeline0/GstRtpH265Pay:rtph265pay0.GstPad:sink: caps = "video/x-h265\,\ stream-format\=\(string\)byte-stream\,\ alignment\=\(string\)au\,\ level\=\(string\)3.1\,\ tier\=\(string\)main\,\ profile\=\(string\)main\,\ width\=\(int\)1280\,\ height\=\(int\)720\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ framerate\=\(fraction\)30/1"
/GstPipeline:pipeline0/GstRtpH265Pay:rtph265pay0.GstPad:src: caps = "application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H265\,\ ssrc\=\(uint\)2573237941\,\ timestamp-offset\=\(uint\)1713951204\,\ seqnum-offset\=\(uint\)27727\,\ a-framerate\=\(string\)30"
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = "application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H265\,\ ssrc\=\(uint\)2573237941\,\ timestamp-offset\=\(uint\)1713951204\,\ seqnum-offset\=\(uint\)27727\,\ a-framerate\=\(string\)30"
/GstPipeline:pipeline0/GstRtpH265Pay:rtph265pay0.GstPad:src: caps = "application/x-rtp\,\ media\=\(string\)video\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H265\,\ sprop-parameter-sets\=\(string\)\"QAEMAf//AWAAAAMAkAAAAwAAAwBdlZgJAA\\\=\\\=\\\,QgEBAWAAAAMAkAAAAwAAAwBdoAKAgC0WWVmkkyuAQAAAAwBAAAAHggA\\\=\\\,RAHBcrRiQAA\\\=\"\,\ payload\=\(int\)96\,\ seqnum-offset\=\(uint\)27727\,\ timestamp-offset\=\(uint\)1713951204\,\ ssrc\=\(uint\)2573237941\,\ a-framerate\=\(string\)30"
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = "application/x-rtp\,\ media\=\(string\)video\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H265\,\ sprop-parameter-sets\=\(string\)\"QAEMAf//AWAAAAMAkAAAAwAAAwBdlZgJAA\\\=\\\=\\\,QgEBAWAAAAMAkAAAAwAAAwBdoAKAgC0WWVmkkyuAQAAAAwBAAAAHggA\\\=\\\,RAHBcrRiQAA\\\=\"\,\ payload\=\(int\)96\,\ seqnum-offset\=\(uint\)27727\,\ timestamp-offset\=\(uint\)1713951204\,\ ssrc\=\(uint\)2573237941\,\ a-framerate\=\(string\)30"
/GstPipeline:pipeline0/GstRtpH265Pay:rtph265pay0: timestamp = 1713951204
/GstPipeline:pipeline0/GstRtpH265Pay:rtph265pay0: seqnum = 27727
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
So I tried:
gst-launch-1.0 udpsrc uri=udp://127.0.0.1:5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H265, payload=(int)96, seqnum-offset=(uint)27727, timestamp-offset=(uint)1713951204, ssrc=(uint)2573237941, a-framerate=(string)30" ! rtph265depay ! vaapidecode ! vaapipostproc ! vaapisink
but with no luck:
libva info: VA-API version 0.39.3
libva info: va_getDriverName() returns 0
libva info: Trying to open /usr/lib/dri/i965_drv_video.so
libva info: Found init function __vaDriverInit_0_39
libva info: va_openDriver() returns 0
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Got context from element 'pipeline0': gst.vaapi.Display=context, gst.vaapi.Display=(GstVaapiDisplay)NULL;
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstVaapiDecode:vaapidecode0: No valid frames decoded before end of stream
Additional debug info:
gstvideodecoder.c(1167): gst_video_decoder_sink_event_default (): /GstPipeline:pipeline0/GstVaapiDecode:vaapidecode0:
no valid frames found
Execution ended after 0:00:00.025823038
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
Try adding a parser before the decoder, like: gst-launch-1.0 udpsrc uri=udp://127.0.0.1:5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H265, payload=(int)96, seqnum-offset=(uint)27727, timestamp-offset=(uint)1713951204, ssrc=(uint)2573237941, a-framerate=(string)30" ! rtph265depay ! h265parse ! vaapidecode ! vaapipostproc ! vaapisink
Note the h265parse element between the depayloader and the decoder.
I am new to GStreamer and I am trying to swap the color channels of an RGB video (e.g. red with blue). How can I do this with gst-launch?
I went through this list but was unable to find an element that does it: http://gstreamer.freedesktop.org/documentation/plugins.html
I have now written my own element. I used "Colorflip" as my base element and changed the name to "ChannelFlip" (you must rename all methods from gst_video_flip_* to gst_channel_flip_* and rename the structs).
Then I was able to register my element with:
gst_element_register(NULL, "channelflip", GST_RANK_NONE, GST_TYPE_CHANNEL_FLIP);
Then I added my enums to GstChannelFlipMethod and my properties to _GstChannelFlip, changed the caps to "RGB", added my code to gst_channel_flip_packed_simple, and called it in gst_channel_flip_transform_frame instead of videoflip->process(videoflip, out_frame, in_frame):
GST_OBJECT_LOCK (videoflip);
//videoflip->process (videoflip, out_frame, in_frame);
gst_channel_flip_packed_simple(videoflip, out_frame, in_frame);
GST_OBJECT_UNLOCK (videoflip);
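For reference, the transform this element performs is tiny: in packed RGB, swapping red and blue just exchanges the first and third byte of every 3-byte pixel. A byte-level illustration in plain shell (two hard-coded example pixels, not a GStreamer pipeline):

```shell
# two RGB pixels, (10,20,30) and (40,50,60), dumped as a hex string
hex=$(printf '\x0a\x14\x1e\x28\x32\x3c' | od -An -tx1 | tr -d ' \n')
echo "$hex"                                     # 0a141e28323c
# swap the first and third byte of every 3-byte pixel (R <-> B)
echo "$hex" | sed -E 's/(..)(..)(..)/\3\2\1/g'  # 1e140a3c3228
```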
You can actually trick GStreamer by replacing the caps:
gst-launch-1.0 -v videotestsrc ! video/x-raw, format=RGBx ! capssetter replace=true caps="video/x-raw, format=(string)BGRx, width=(int)320, height=(int)240, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive" ! videoconvert ! ximagesink
Please note that:
"width=(int)320, height=(int)240, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive"
are the default settings for videotestsrc. If you want a different resolution, for example, you need to declare it twice:
gst-launch-1.0 -v videotestsrc ! video/x-raw, format=RGBx, width=640, height=480 ! capssetter replace=true caps="video/x-raw, format=(string)BGRx, width=(int)640, height=(int)480, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive" ! videoconvert ! ximagesink
But of course, a dedicated element is the better solution, as it supports proper dynamic caps negotiation.