GStreamer hangs while generating timelapse from JPEGs on Raspberry Pi

Situation:
I want to generate a timelapse on my Raspberry Pi (512 MB), using the onboard H.264 encoder.
Input: 300+ JPEG files (2592 x 1944 pixels), example: http://i.imgur.com/czohiki.jpg
Output: H.264 video file (2592 x 1944 pixels)
GStreamer 1.0.8 + omxencoder (http://pastebin.com/u8T7mE18)
Raspberry Pi version: Jun 17 2013 20:45:38 version d380dde43fe729f043befb5cf775f99e54586cde (clean) (release)
Memory: gpu_mem_512=400
Gstreamer pipeline:
sudo gst-launch-1.0 -v multifilesrc location=GOPR%04d.JPG \
    start-index=4711 stop-index=4750 \
    caps="image/jpeg,framerate=(fraction)25/1" do-timestamp=true ! \
    omxmjpegdec ! videorate ! video/x-raw,framerate=1/5 ! videoconvert ! \
    omxh264enc ! "video/x-h264,profile=high" ! h264parse ! \
    queue max-size-bytes=10000000 ! matroskamux ! filesink location=test.mkv \
    --gst-debug=4
Problem:
GStreamer hangs and no output is generated.
--gst-debug=4:
0:00:01.027331700 2422 0x17824f0 INFO GST_EVENT
gstevent.c:709:gst_event_new_segment: creating segment event time
segment start=0:00:00.000000000, stop=99:99:99.999999999,
rate=1.000000, applied_rate=1.000000, flags=0x00,
time=0:00:00.000000000, base=0:00:00.000000000, position
0:00:00.000000000, duration 99:99:99.999999999
0:00:29.346875982 2422 0x17824f0 INFO basesrc
gstbasesrc.c:2619:gst_base_src_loop: pausing after
gst_base_src_get_range() = eos
--gst-debug=5:
0:01:16.089222125 2232 0x1fa8f0 DEBUG basesrc
gstbasesrc.c:2773:gst_base_src_loop: pausing task,
reason eos
0:01:16.095962979 2232 0x1fa8f0 DEBUG GST_PADS
gstpad.c:5251:gst_pad_pause_task: pause task
0:01:16.107724723 2232 0x1fa8f0 DEBUG task
gsttask.c:662:gst_task_set_state: Changing task
0x2180a8 to state 2
0:01:16.435800597 2232 0x1fa8f0 DEBUG GST_EVENT
gstevent.c:300:gst_event_new_custom: creating new event 0x129f80 eos
28174
0:01:16.436191588 2232 0x1fa8f0 DEBUG GST_PADS
gstpad.c:4628:gst_pad_push_event: event eos updated
0:01:16.436414584 2232 0x1fa8f0 DEBUG GST_PADS
gstpad.c:3333:check_sticky: pushing all sticky
events
0:01:16.436620579 2232 0x1fa8f0 DEBUG GST_PADS
gstpad.c:3282:push_sticky: event stream-start was
already received
0:01:16.436816575 2232 0x1fa8f0 DEBUG GST_PADS
gstpad.c:3282:push_sticky: event caps was already
received
0:01:16.437001571 2232 0x1fa8f0 DEBUG GST_PADS
gstpad.c:3282:push_sticky: event segment was
already received
0:01:16.440457495 2232 0x1fa8f0 DEBUG GST_EVENT
gstpad.c:4771:gst_pad_send_event_unchecked:
have event type eos event at time 99:99:99.999999999: (NULL)
0:01:16.449986289 2232 0x1fa8f0 DEBUG videodecoder
gstvideodecoder.c:1144:gst_video_decoder_sink_event:
received event 28174, eos
0:01:16.462165024 2232 0x1fa8f0 DEBUG omxvideodec
gstomxvideodec.c:2489:gst_omx_video_dec_drain:
Draining component
0:01:16.463930986 2232 0x1fa8f0 DEBUG omx
gstomx.c:1223:gst_omx_port_acquire_buffer:
Acquiring video_decode buffer from port 130
0:01:16.465537951 2232 0x1fa8f0 DEBUG omx
gstomx.c:1334:gst_omx_port_acquire_buffer:
video_decode port 130 has pending buffers
0:01:16.466576928 2232 0x1fa8f0 DEBUG omx
gstomx.c:1353:gst_omx_port_acquire_buffer:
Acquired buffer 0x21f938 (0xb2068550) from video_decode port 130: 0
0:01:16.468237892 2232 0x1fa8f0 DEBUG omx
gstomx.c:1375:gst_omx_port_release_buffer:
Releasing buffer 0x21f938 (0xb2068550) to video_decode port 130
0:01:16.470360846 2232 0x1fa8f0 DEBUG omx
gstomx.c:1420:gst_omx_port_release_buffer:
Released buffer 0x21f938 to video_decode port 130: None (0x00000000)
0:01:16.472046809 2232 0x1fa8f0 DEBUG omxvideodec
gstomxvideodec.c:2544:gst_omx_video_dec_drain:
Waiting until component is drained
Full console dump: https://mega.co.nz/#!eI1ASBSY!R4mnuGqRH7M8dT4q6j03mBKsQ1A-7oCXU4stu50LnOw
Question:
What am I doing wrong?
Is there another, more efficient way to create high-res timelapses from JPEGs on a Raspberry Pi?

Sorry about the necro, but I think this is trying to use the Raspberry Pi HW H.264 encoder at a higher resolution than it is capable of. It can manage just over 1080p30 and has a maximum line length of 2048 pixels, so your source images are too large.
You could try MJPEG instead, which does not have the same limitation.
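Since the input frames are already JPEG, one untested sketch of that idea is to mux the JPEGs straight into a container as an MJPEG stream, skipping the encoder entirely (file names and indices taken from the question; jpegparse is there so the muxer gets complete caps):
gst-launch-1.0 multifilesrc location=GOPR%04d.JPG start-index=4711 stop-index=4750 caps="image/jpeg,framerate=1/5" ! jpegparse ! avimux ! filesink location=test.avi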

I don't have a Pi to test on right now, but I'd suspect one possible issue is that you have two OMX elements in the same process. GStreamer is just wrapping OMX and IIRC the OMX API doesn't really want you running two things at once, particularly in the same process...
I'd try it with a jpegdec instead of omxmjpegdec, with a pipeline more along these lines:
gst-launch-1.0 multifilesrc location="GOPR%04d.JPG" start-index=4711 stop-index=4750 ! image/jpeg,framerate=1/5 ! jpegdec ! videoconvert ! omxh264enc ! h264parse ! matroskamux ! filesink location=test.mkv
I don't think there is any point in using queue elements on the Pi either.
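Given the 2048-pixel line limit mentioned in the other answer, an untested variant that scales the decoded frames down before the hardware encoder (here to 1920x1440, preserving the 4:3 aspect ratio) might also be worth a try:
gst-launch-1.0 multifilesrc location="GOPR%04d.JPG" start-index=4711 stop-index=4750 ! image/jpeg,framerate=1/5 ! jpegdec ! videoscale ! video/x-raw,width=1920,height=1440 ! videoconvert ! omxh264enc ! h264parse ! matroskamux ! filesink location=test.mkv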

Related

How to skip frames that came into the pipeline before the RTSP PLAY call?

I am new to GStreamer. I wrote a simple RTSP server that generates a pipeline like:
appsrc name=vsrc is-live=true do-timestamp=true ! queue ! h264parse ! rtph264pay name=pay0 pt=96
The SDP response to the DESCRIBE request is generated only after a few frames have been received on the appsrc input:
vsrc = gst_bin_get_by_name_recurse_up(GST_BIN(element), "vsrc"); // appsrc
if (nullptr != vsrc)
{
    gst_util_set_object_arg(G_OBJECT(vsrc), "format", "time");
    g_signal_connect(vsrc, "need-data", (GCallback)need_video_data, streamResource);
}
The time from which the video is to be played is passed in the RTSP PLAY request, in the Range header, as an absolute time:
PLAY rtsp://172.19.9.65:554/Recording/ RTSP/1.0
CSeq: 4
Immediate: yes
Range: clock=20220127T082831.039Z- // Start from ...
I attached a handler to the corresponding signal on the GstRTSPClient object, in which I process this request and seek to the right time in my appsrc:
g_signal_connect(client, "pre-play-request", (GCallback)pre_play_request, NULL);
The problem is that by this point, frames from my appsrc's original start time have already arrived in the pipeline, so I watch them first, and only then does playback continue from the time specified in the PLAY request.
Can you please tell me how I can cut off these initial frames that came in before the PLAY call?
I've tried:
gst_element_seek: doesn't help, because of peculiarities of the appsrc implementation.
Flushing didn't help either; I tried resetting the sink pad of the rtph264pay element:
// Sink pad obtained from the rtph264pay element ("payloader" is an assumed variable name)
GstPad *sinkPad = gst_element_get_static_pad(payloader, "sink");
gst_pad_push_event(sinkPad, gst_event_new_flush_start());
GST_PAD_STREAM_LOCK(sinkPad);
// ... seek in appsrc
gst_pad_push_event(sinkPad, gst_event_new_flush_stop(TRUE));
GST_PAD_STREAM_UNLOCK(sinkPad);
gst_object_unref(sinkPad);
Thank You!

Which GStreamer RTP payloader element should I use to wrap AAC audio?

I am trying to figure out the proper GStreamer element to use to transmit AAC audio over RTP.
By dumping the dot graph of a playbin playing the file, I can conclude that the caps coming out of tsdemux are audio/mpeg, mpegversion: 2, stream-format: adts.
If I use the following pipeline
gst-launch-1.0 -v filesrc location=$BA ! tsdemux ! audio/mpeg ! rtpmpapay ! filesink location=/tmp/test.rtp
it fails:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1: caps = audio/mpeg
WARNING: from element /GstPipeline:pipeline0/GstTSDemux:tsdemux0: Delayed linking failed.
Additional debug info:
/var/tmp/portage/media-libs/gstreamer-1.12.3/work/gstreamer-1.12.3/gst/parse/grammar.y(510): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstTSDemux:tsdemux0:
failed delayed linking some pad of GstTSDemux named tsdemux0 to some pad of GstRtpMPAPay named rtpmpapay0
ERROR: from element /GstPipeline:pipeline0/GstTSDemux:tsdemux0: Internal data stream error.
Additional debug info:
/var/tmp/portage/media-libs/gst-plugins-bad-1.12.3/work/gst-plugins-bad-1.12.3/gst/mpegtsdemux/mpegtsbase.c(1613): mpegts_base_loop (): /GstPipeline:pipeline0/GstTSDemux:tsdemux0:
streaming stopped, reason not-linked (-1)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
Which gstreamer element should I be using to wrap AAC audio in an RTP packet?
I guess it's rtpmp4apay (RTP MPEG4 audio payloader). Maybe you want/need aacparse before the payloader as well.
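An untested sketch of that suggestion, applied to the pipeline from the question (whether the payloader accepts the parsed stream may depend on the MPEG version in the caps):
gst-launch-1.0 -v filesrc location=$BA ! tsdemux ! aacparse ! rtpmp4apay ! filesink location=/tmp/test.rtp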

Multicast streaming of music file (wav, mp3, etc.) with GStreamer: can receive, but the data is intermittent

I want to implement multicast streaming in an embedded Linux (Yocto) system.
I thought GStreamer would make this easy to implement, but when the filesrc is an mp3, the received audio is choppy and sounds as if it has passed through a low-pass filter.
When the filesrc is a wav, the received audio is choppy and sounds as if it has passed through a high-pass filter.
Here is the gst-launch command (mp3).
Tx:
GST_DEBUG=3 gst-launch-1.0 filesrc location="background.mp3" ! decodebin ! \
audioconvert ! rtpL16pay ! queue ! udpsink host=239.0.0.1 auto-multicast=true port=5004
Rx:
GST_DEBUG=3 gst-launch-1.0 udpsrc multicast-group=239.0.0.1 auto-multicast=true port=5004 \
caps="application/x-rtp, media=audio, clock-rate=44100, payload=0" ! rtpL16depay !\
audioconvert ! alsasink
The GST_DEBUG=3 output is as follows:
Tx:
Setting pipeline to PAUSED ...
0:00:00.115165875 936 0x7b8c40 WARN basesrc gstbasesrc.c:3486:gst_base_src_start_complete:<filesrc0> pad not activated yet
Pipeline is PREROLLING ...
====== BEEP: 4.1.4 build on Feb 14 2017 13:39:18. ======
Core: MP3 decoder Wrapper build on Mar 21 2014 15:04:50
file: /usr/lib/imx-mm/audio-codec/wrap/lib_mp3d_wrap_arm12_elinux.so.3
CODEC: BLN_MAD-MMCODECS_MP3D_ARM_02.13.00_CORTEX-A8 build on Jul 12 2016 13:15:30.
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Rx:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:00.269585916 1232 0x772320 WARN alsa conf.c:4974:snd_config_expand: alsalib error: Unknown parameters {AES0 0x02 AES1 0x82 AES2 0x00 AES3 0x02}
0:00:00.269914500 1232 0x772320 WARN alsa pcm.c:2495:snd_pcm_open_noupdate: alsalib error: Unknown PCM default:{AES0 0x02 AES1 0x82 AES2 0x00 AES3 0x02}
0:00:00.283770666 1232 0x772320 WARN alsa pcm_hw.c:1250:snd_pcm_hw_get_chmap: alsalib error: Cannot read Channel Map ctl
: No such file or directory
Redistribute latency...
0:00:06.335845459 1232 0x772320 WARN audiobasesink gstaudiobasesink.c:1807:gst_audio_base_sink_get_alignment:<alsasink0> Unexpected discontinuity in audio timestamps of -0:00:00.120430839, resyncing
0:00:07.167036751 1232 0x772320 WARN audiobasesink gstaudiobasesink.c:1512:gst_audio_base_sink_skew_slaving:<alsasink0> correct clock skew -0:00:00.020498109 < -+0:00:00.020000000
0:00:07.178596167 1232 0x772320 WARN audiobasesink gstaudiobasesink.c:1484:gst_audio_base_sink_skew_slaving:<alsasink0> correct clock skew +0:00:00.020102330 > +0:00:00.020000000
0:00:08.215633667 1232 0x772320 WARN audiobasesink gstaudiobasesink.c:1807:gst_audio_base_sink_get_alignment:<alsasink0> Unexpected discontinuity in audio timestamps of -0:00:00.128480725, resyncing
0:00:08.962452751 1232 0x772320 WARN audiobasesink gstaudiobasesink.c:1512:gst_audio_base_sink_skew_slaving:<alsasink0> correct clock skew -0:00:00.020283552 < -+0:00:00.020000000
0:00:09.095737543 1232 0x772320 WARN audiobasesink gstaudiobasesink.c:1484:gst_audio_base_sink_skew_slaving:<alsasink0> correct clock skew +0:00:00.020221135 > +0:00:00.020000000
0:00:10.135542001 1232 0x772320 WARN audiobasesink gstaudiobasesink.c:1807:gst_audio_base_sink_get_alignment:<alsasink0> Unexpected discontinuity in audio timestamps of -0:00:00.125238095, resyncing
Here is the gst-launch command (wav).
Tx:
GST_DEBUG=3 gst-launch-1.0 filesrc location="background.wav" ! decodebin ! \
audioconvert ! rtpL16pay ! queue ! udpsink host=239.0.0.1 auto-multicast=true port=5004
Rx:
GST_DEBUG=3 gst-launch-1.0 udpsrc multicast-group=239.0.0.1 auto-multicast=true port=5004 \
caps="application/x-rtp, media=audio, clock-rate=44100, payload=0" ! rtpL16depay !\
audioconvert ! alsasink
The GST_DEBUG=3 output is as follows:
Tx:
Setting pipeline to PAUSED ...
0:00:00.116759125 958 0x1c0cc40 WARN basesrc gstbasesrc.c:3486:gst_base_src_start_complete:<filesrc0> pad not activated yet
Pipeline is PREROLLING ...
0:00:00.136465125 958 0x1c1f460 FIXME default gstutils.c:3764:gst_pad_create_stream_id_internal:<wavparse0:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:00:00.137230750 958 0x1c1f460 WARN riff riff-read.c:794:gst_riff_parse_info:<wavparse0> Unknown INFO (metadata) tag entry IPRT
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
0:00:00.152916625 958 0x1c0cc40 WARN bin gstbin.c:2597:gst_bin_do_latency_func:<pipeline0> did not really configure latency of 0:00:00.000000000
New clock: GstSystemClock
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:03.435631250
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
Rx:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:00.270927792 1238 0x120d320 WARN alsa conf.c:4974:snd_config_expand: alsalib error: Unknown parameters {AES0 0x02 AES1 0x82 AES2 0x00 AES3 0x02}
0:00:00.271261625 1238 0x120d320 WARN alsa pcm.c:2495:snd_pcm_open_noupdate: alsalib error: Unknown PCM default:{AES0 0x02 AES1 0x82 AES2 0x00 AES3 0x02}
0:00:00.284991583 1238 0x120d320 WARN alsa pcm_hw.c:1250:snd_pcm_hw_get_chmap: alsalib error: Cannot read Channel Map ctl
: No such file or directory
Redistribute latency...
0:00:04.227007167 1238 0x120d320 WARN audiobasesink gstaudiobasesink.c:1807:gst_audio_base_sink_get_alignment:<alsasink0> Unexpected discontinuity in audio timestamps of +0:00:00.053514739, resyncing
0:00:04.314387751 1238 0x120d320 WARN audiobasesink gstaudiobasesink.c:1807:gst_audio_base_sink_get_alignment:<alsasink0> Unexpected discontinuity in audio timestamps of +0:00:00.055510204, resyncing
0:00:04.396900334 1238 0x120d320 WARN audiobasesink gstaudiobasesink.c:1807:gst_audio_base_sink_get_alignment:<alsasink0> Unexpected discontinuity in audio timestamps of +0:00:00.052607709, resyncing
0:00:04.483605876 1238 0x120d320 WARN audiobasesink gstaudiobasesink.c:1807:gst_audio_base_sink_get_alignment:<alsasink0> Unexpected discontinuity in audio timestamps of +0:00:00.055215419, resyncing
0:00:04.570297626 1238 0x120d320 WARN audiobasesink gstaudiobasesink.c:1807:gst_audio_base_sink_get_alignment:<alsasink0> Unexpected discontinuity in audio timestamps of +0:00:00.055215419, resyncing
If I use pulsesink instead of alsasink, the following appears:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Redistribute latency...
0:00:00.410499500 1255 0x70813120 WARN pulse pulsesink.c:702:gst_pulsering_stream_underflow_cb:<pulsesink0> Got underflow
0:00:00.423478917 1255 0x7e7920 WARN audiobasesink gstaudiobasesink.c:1807:gst_audio_base_sink_get_alignment:<pulsesink0> Unexpected discontinuity in audio timestamps of +0:00:00.038095238, resyncing
0:00:00.450453459 1255 0x70813120 WARN pulse pulsesink.c:702:gst_pulsering_stream_underflow_cb:<pulsesink0> Got underflow
What is the problem? Can anybody solve this?
I look forward to your kind reply.
Thank you for reading.
I reckon the problem is that the application/x-rtp parameters don't match between the transmitter and the receiver.
This is easily solved by running the transmitter with verbose output and then using the same parameters in the receiver.
Let's see an example:
TX:
gst-launch-1.0 -v filesrc location="test.mp3" ! decodebin ! audioconvert ! rtpL16pay ! queue ! udpsink host=239.0.0.1 auto-multicast=true port=5004
The last lines of its output (thanks to -v) are:
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps =
"application/x-rtp\,\ media\=(string)audio\,\
clock-rate\=(int)44100\,\ encoding-name\=(string)L16\,\
encoding-params\=(string)2\,\ channels\=(int)2\,\
payload\=(int)96\,\ ssrc\=(uint)1806894235\,\
timestamp-offset\=(uint)468998694\,\ seqnum-offset\=(uint)20785"
/GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0.GstPad:sink: caps =
"audio/x-raw\,\ layout\=(string)interleaved\,\ rate\=(int)44100\,\
format\=(string)S16BE\,\ channels\=(int)2\,\
channel-mask\=(bitmask)0x0000000000000003"
/GstPipeline:pipeline0/GstAudioConvert:audioconvert0.GstPad:sink: caps
= "audio/x-raw\,\ format\=(string)S32LE\,\ layout\=(string)interleaved\,\ rate\=(int)44100\,\
channels\=(int)2\,\ channel-mask\=(bitmask)0x0000000000000003"
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_0.GstProxyPad:proxypad1:
caps = "audio/x-raw\,\ format\=(string)S32LE\,\
layout\=(string)interleaved\,\ rate\=(int)44100\,\
channels\=(int)2\,\ channel-mask\=(bitmask)0x0000000000000003"
/GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0: timestamp = 468998694
/GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0: seqnum = 20785
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Using the same parameters in the player or receiver:
RX:
gst-launch-1.0 udpsrc caps='application/x-rtp, media=(string)audio, clock-rate=(int)44100, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96' ! rtpL16depay ! pulsesink
And this plays perfectly.
Moving on to the .wav file, in my case the transmitter is:
gst-launch-1.0 -v filesrc location="test.wav" ! decodebin ! audioconvert ! rtpL16pay ! queue ! udpsink host=239.0.0.1 auto-multicast=true port=5004
The last lines of output:
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps =
"application/x-rtp\,\ media\=(string)audio\,\ clock-rate\=(int)44100\,\ encoding-name\=(string)L16\,\
encoding-params\=(string)1\,\ channels\=(int)1\,\
payload\=(int)96\,\ ssrc\=(uint)620824608\,\
timestamp-offset\=(uint)433377669\,\ seqnum-offset\=(uint)7103"
/GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0.GstPad:sink: caps =
"audio/x-raw\,\ layout\=(string)interleaved\,\ rate\=(int)44100\,\
format\=(string)S16BE\,\ channels\=(int)1"
/GstPipeline:pipeline0/GstAudioConvert:audioconvert0.GstPad:sink: caps
= "audio/x-raw\,\ format\=(string)S16LE\,\ layout\=(string)interleaved\,\ channels\=(int)1\,\
rate\=(int)44100"
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_0.GstProxyPad:proxypad1:
caps = "audio/x-raw\,\ format\=(string)S16LE\,\
layout\=(string)interleaved\,\ channels\=(int)1\,\
rate\=(int)44100" /GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0:
timestamp = 433377669 /GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0:
seqnum = 7103 Pipeline is PREROLLED ... Setting pipeline to PLAYING
... New clock: GstSystemClock
Using this information in the receiver:
gst-launch-1.0 udpsrc caps='application/x-rtp, media=(string)audio, clock-rate=(int)44100, encoding-name=(string)L16, encoding-params=(string)1, channels=(int)1, payload=(int)96' ! rtpL16depay ! pulsesink
The audio also plays smoothly.
Hope this helps.

GStreamer: correct pipeline to decode an RTSP stream to raw video

Using GStreamer 1.10, I have been trying several versions of a pipeline to decode an RTSP stream that starts as a WebRTC connection.
ffprobe reports the stream as:
Duration: N/A, start: 0.128000, bitrate: N/A
Stream #0:0: Audio: aac (LC), 48000 Hz, stereo, fltp
Stream #0:1: Video: h264 (Constrained Baseline), yuv420p, 512x288 [SAR 1:1 DAR 16:9], 30 fps, 30 tbr, 90k tbn, 60 tbc
Using variations of the following pipeline:
GST_DEBUG=3 gst-launch-1.0 -e rtspsrc location=rtsp://xxx.xxx.xxx.xxx:1935/alpha/Stream1 \
! decodebin name=decode \
decode. \
! x264enc bitrate=512 speed-preset=6 \
! video/x-h264, profile=baseline \
! queue ! mp4mux name=mp4mux ! filesink location=file.mp4 \
decode. ! avenc_aac bitrate=96000 ! aacparse ! queue ! mp4mux.
I get the following errors
0:00:00.299416405 7705 0x7f0d48001e80 WARN default grammar.y:510:gst_parse_no_more_pads:<decode> warning: Delayed linking failed.
0:00:00.299435518 7705 0x7f0d48001e80 WARN default grammar.y:510:gst_parse_no_more_pads:<decode> warning: failed delayed linking some pad of GstDecodeBin named decode to some pad of GstX264Enc named x264enc0
WARNING: from element /GstPipeline:pipeline0/GstDecodeBin:decode: Delayed linking failed.
Additional debug info:
./grammar.y(510): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstDecodeBin:decode:
failed delayed linking some pad of GstDecodeBin named decode to some pad of GstX264Enc named x264enc0
0:00:01.296295371 7705 0x7f0d6402a8f0 WARN basesrc gstbasesrc.c:2951:gst_base_src_loop:<udpsrc3> error: Internal data stream error.
0:00:01.296324999 7705 0x7f0d6402a8f0 WARN basesrc gstbasesrc.c:2951:gst_base_src_loop:<udpsrc3> error: streaming stopped, reason not-linked (-1)
What is the correct pipeline to decode this stream?

Error: Failed to write input into the OpenMAX buffer

I am trying to encode uncompressed video in H.265; however, when I run the following pipeline I receive an error message that I cannot resolve. I am following the example code in the Tegra X1 Multimedia User Guide, and I do not understand why this pipeline does not work. I am a beginner in video compression, so any help would be very useful. The command and error message:
ubuntu@tegra-ubuntu:~$ gst-launch-1.0 filesrc location=small_mem_vid.mov ! 'video/x-raw, format=(string)I420, framerate=(fraction)30/1, width=(int)1280, height=(int)720' ! omxh265enc ! filesink location=new_encode.mov -e
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingPipeline is PREROLLING ...
Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 8
===== MSENC =====
NvMMLiteBlockCreate : Block : BlockType = 8
ERROR: from element /GstPipeline:pipeline0/GstOMXH265Enc-omxh265enc:omxh265enc-omxh265enc0: Could not write to resource.
Additional debug info:
/dvs/git/dirty/git-master_linux/external/gstreamer/gst-omx/omx/gstomxvideoenc.c(2139): gst_omx_video_enc_handle_frame (): /GstPipeline:pipeline0/GstOMXH265Enc-omxh265enc:omxh265enc-omxh265enc0:
Failed to write input into the OpenMAX buffer
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
ubuntu@tegra-ubuntu:~$
Are you sure the .mov file really contains uncompressed video? The .mov extension is commonly used for QuickTime video. You could use "mediainfo" on Linux to discover more details about the format of the file. If the video is compressed, I don't think you can go directly from filesrc to the encoder; you probably need a qtdemux and a decoder, maybe avdec_h264, depending on what mediainfo shows. A sketch of that is below.
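If mediainfo does show H.264 in a QuickTime container, an untested sketch of that approach could look like this (avdec_h264 assumed; substitute the decoder matching the actual codec):
gst-launch-1.0 filesrc location=small_mem_vid.mov ! qtdemux ! h264parse ! avdec_h264 ! videoconvert ! omxh265enc ! filesink location=new_encode.h265 -e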
You also might want to enable some more detailed debugging:
export GST_DEBUG=*:4