Error: Failed to write input into the OpenMAX buffer - gstreamer

I am trying to encode uncompressed video to H.265; however, when I run the following pipeline I receive an error message that I cannot resolve. I am following the example code in the Tegra X1 Multimedia User Guide, and I do not understand why this pipeline does not work. I am a beginner in video compression, so any help would be very useful. The code/error message:
ubuntu@tegra-ubuntu:~$ gst-launch-1.0 filesrc location=small_mem_vid.mov ! 'video/x-raw, format=(string)I420, framerate=(fraction)30/1, width=(int)1280, height=(int)720' ! omxh265enc ! filesink location=new_encode.mov -e
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingPipeline is PREROLLING ...
Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 8
===== MSENC =====
NvMMLiteBlockCreate : Block : BlockType = 8
ERROR: from element /GstPipeline:pipeline0/GstOMXH265Enc-omxh265enc:omxh265enc-omxh265enc0: Could not write to resource.
Additional debug info:
/dvs/git/dirty/git-master_linux/external/gstreamer/gst-omx/omx/gstomxvideoenc.c(2139): gst_omx_video_enc_handle_frame (): /GstPipeline:pipeline0/GstOMXH265Enc-omxh265enc:omxh265enc-omxh265enc0:
Failed to write input into the OpenMAX buffer
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
ubuntu@tegra-ubuntu:~$

Are you sure the .mov file is really uncompressed video? The .mov extension is commonly used for QuickTime video; you could use "mediainfo" on Linux to discover more details about the file's format. If it is a QuickTime file, I don't think you can go directly from filesrc to the encoder. You probably need a qtdemux and a decoder, maybe avdec_h264, depending on what mediainfo shows.
You also might want to enable some more detailed debugging:
export GST_DEBUG=*:4
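If mediainfo does show H.264 video inside a QuickTime container, a pipeline along the following lines might work. This is only a sketch: the qtdemux/h264parse/avdec_h264/videoconvert chain assumes an H.264 stream, the output filename is illustrative, and without a muxer such as qtmux the result is a raw H.265 elementary stream rather than a playable .mov file:
gst-launch-1.0 filesrc location=small_mem_vid.mov ! qtdemux ! h264parse ! avdec_h264 ! videoconvert ! omxh265enc ! filesink location=new_encode.h265 -e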

Related

GStreamer preview RTMP using xvimage

I want to preview an RTMP stream using the GStreamer xvimagesink. I can see the output if I use autovideosink like this:
gst-launch-1.0 -v rtmpsrc location='rtmp://127.0.0.1:1935/live/stream' ! decodebin3 ! autovideosink
but if I replace "autovideosink" with "xvimagesink" I get this:
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: Could not initialise Xv output
Additional debug info:
xvimagesink.c(1773): gst_xv_image_sink_open (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
Could not open display (null)
Setting pipeline to NULL ...
Freeing pipeline ...
Both decodebin3 and autovideosink are auto-plugging GStreamer elements: each automatically selects the most appropriate available GStreamer plugins to demux/decode (decodebin3) and render video (autovideosink) from, in this case, the live RTMP stream.
So it is very possible that, for example,
decodebin3 decodes the video into a format that xvimagesink cannot display on your platform/hardware and/or with your GStreamer version, or
xvimagesink is not set up properly on your platform, independently of whether a display/monitor is available.
To find out more details about
the video format decoded by decodebin3, and
the video sink element "chosen" by autovideosink,
you can set a higher (more detailed) GStreamer debug level with, for example, export GST_DEBUG=3, rerun the pipeline and inspect the output.
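As a concrete example, you could rerun with more logging as below. The export DISPLAY line is only a guess based on the "Could not open display (null)" message, which typically means the DISPLAY environment variable is not set (for instance when running from an SSH session); :0 is an assumption about your setup:
export GST_DEBUG=3
export DISPLAY=:0
gst-launch-1.0 -v rtmpsrc location='rtmp://127.0.0.1:1935/live/stream' ! decodebin3 ! xvimagesink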

which gstreamer rtp payloader element should I use to wrap AAC audio?

I am trying to figure out the proper gstreamer element to use to transmit AAC audio over RTP.
By dumping the dot graph of a playbin playing the file, I can conclude that the caps coming out of tsdemux are audio/mpeg, mpegversion=2, stream-format=adts.
If I use the following pipeline
gst-launch-1.0 -v filesrc location=$BA ! tsdemux ! audio/mpeg ! rtpmpapay ! filesink location=/tmp/test.rtp
it fails:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1: caps = audio/mpeg
WARNING: from element /GstPipeline:pipeline0/GstTSDemux:tsdemux0: Delayed linking failed.
Additional debug info:
/var/tmp/portage/media-libs/gstreamer-1.12.3/work/gstreamer-1.12.3/gst/parse/grammar.y(510): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstTSDemux:tsdemux0:
failed delayed linking some pad of GstTSDemux named tsdemux0 to some pad of GstRtpMPAPay named rtpmpapay0
ERROR: from element /GstPipeline:pipeline0/GstTSDemux:tsdemux0: Internal data stream error.
Additional debug info:
/var/tmp/portage/media-libs/gst-plugins-bad-1.12.3/work/gst-plugins-bad-1.12.3/gst/mpegtsdemux/mpegtsbase.c(1613): mpegts_base_loop (): /GstPipeline:pipeline0/GstTSDemux:tsdemux0:
streaming stopped, reason not-linked (-1)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
Which gstreamer element should I be using to wrap AAC audio in an RTP packet?
I guess it's rtpmp4apay: RTP MPEG-4 audio payloader. Maybe you want/need aacparse before the payloader as well.
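A sketch of that suggestion might look like the line below. It assumes the demuxed stream negotiates as AAC that aacparse and rtpmp4apay will both accept; since the dot graph showed mpegversion 2, the link between the parser and the payloader may still need adjusting:
gst-launch-1.0 -v filesrc location=$BA ! tsdemux ! aacparse ! rtpmp4apay ! filesink location=/tmp/test.rtp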

Stream Icecast using Gstreamer

I'm designing a program to stream from an Icecast server (radio.clarkson.edu). Ultimately it will be written in Python 3, but for now I'm using gst-launch to test the pipeline. I've been working on Debian Jessie and using gstreamer-1.0. Using a file on Wikimedia, I was able to play it pretty easily using:
url=https://upload.wikimedia.org/wikipedia/commons/0/0c/Muriel-Nguyen-Xuan-Korsakov-Flight-of-the-bumblebee.flac.oga
gst-launch-1.0 -v souphttpsrc location=$url ! decodebin ! audioconvert ! audioresample ! alsasink
Running the same commands with my stream, I get the output:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = text/uri-list
Missing element: text/uri-list decoder
ERROR: from element /GstPipeline:pipeline0/GstDecodeBin:decodebin0: Your GStreamer installation is missing a plug-in.
Additional debug info:
gstdecodebin2.c(3977): gst_decode_bin_expose (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0:
no suitable plugins found
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = "NULL"
Freeing pipeline ...
I have tried too many other pipelines to put on one post, but I can answer any other questions.
Thank you
By now you have probably solved that problem, but here's an idea anyway: text/uri-list indicates that you didn't hand an actual stream to GStreamer, but rather a (textual) playlist that contains stream addresses. I guess GStreamer can't handle those, so you need to parse the playlist beforehand and then hand an actual audio stream address to it.
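A minimal sketch of that approach, assuming the address you are currently passing ($url) returns a textual playlist (.m3u/.pls style); the stream URL below is a placeholder you would copy out of that playlist:
curl -s "$url"
gst-launch-1.0 -v souphttpsrc location=<stream-url-from-playlist> ! decodebin ! audioconvert ! audioresample ! alsasink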

Gstreamer Missing plugins

I am trying to run certain pipelines on the command line to play a video, and I often get these errors/messages/warnings:
WARNING: erroneous pipeline: no element "qtdemux"
WARNING: erroneous pipeline: no element "playbin2"
WARNING: erroneous pipeline: no element "decodebin2"
ERROR: pipeline could not be constructed: no element "playbin".
Following are the pipelines :
gst-launch filesrc location=path to the mp4 file ! playbin2 ! queue ! ffmpegcolorspace ! autovideosink
or
gst-launch -v filesrc location=path to the mp4 file ! qtdemux name=demuxer ! { queue ! decodebin ! sdlvideosink } { demuxer. ! queue ! decodebin ! alsasink }
or
gst-launch -v playbin uri=path to the mp4 file
or
gst-launch -v playbin2 uri=path to the mp4 file
Questions
1. I wanted to know if I am missing the plugins needed to execute these pipelines.
2. How do I know which plugin provides which element, and where to find it?
3. What is the benefit of implementing the pipeline in C code? Are the missing plugins still required?
4. Is it better to install the missing plugins from the Synaptic manager or from the GStreamer site (base, good, bad, ugly)?
5. When we do gst-inspect we get output like this:
postproc: postproc_hdeblock: LibPostProc hdeblock filter
libvisual: libvisual_oinksie: libvisual oinksie plugin plugin v.0.1
flump3dec: flump3dec: Fluendo MP3 Decoder (liboil build)
vorbis: vorbistag: VorbisTag
vorbis: vorbisparse: VorbisParse
vorbis: vorbisdec: Vorbis audio decoder
vorbis: vorbisenc: Vorbis audio encoder
coreindexers: fileindex: A index that stores entries in file
coreindexers: memindex: A index that stores entries in memory
amrnb: amrnbenc: AMR-NB audio encoder
amrnb: amrnbdec: AMR-NB audio decoder
audioresample: audioresample: Audio resampler
flv: flvmux: FLV muxer
flv: flvdemux: FLV Demuxer
What do the x and y in each "x: y" pair mean?
Answers:
It looks like GStreamer at your end was not installed correctly; playbin2 and decodebin2 are basic elements and part of the base plugins.
1. Yes, you may be missing some plugins.
2. Use the gst-inspect command to check whether an element is available (see the example after this list).
3. From C code you can manage states, register callbacks, and learn more. Yes, the missing plugins are still required.
4. I guess the GStreamer site would be better.
5. Not sure about this one; it would help if you arranged the output in a proper way.
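For example, assuming GStreamer 0.10 (which the playbin2/decodebin2 element names suggest), you can check a single element directly or grep the full element list:
gst-inspect-0.10 playbin2
gst-inspect-0.10 | grep qtdemux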
Most probably the GST_PLUGIN_PATH is incorrect. Please set it to the correct path where GStreamer has been installed.
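For example, on many Linux systems the 0.10 plugins live in a directory like the one below, but the exact path depends on your distribution and how GStreamer was installed:
export GST_PLUGIN_PATH=/usr/lib/gstreamer-0.10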

GStreamer: Play mpeg2

I'm trying to play a local MPEG-2 TS file with GStreamer using this:
gst-launch filesrc location=open_season.mpg ! mpeg2dec ! xvimagesink
The first frame appears as big blocks of color and then playback stops. Any thoughts about what I'm doing wrong here? Does a TS file need to be handled differently than this?
Here's the log:
$ gst-launch filesrc location=open_season.mpg ! mpeg2dec ! xvimagesink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ....
WARNING: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: Internal data flow problem..
Additional debug info:.
gstbasesink.c(3492): gst_base_sink_chain_unlocked (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
Received buffer without a new-segment. Assuming timestamps start from 0.
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 6866757291 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ..
I think you should first try to play the file with the help of playbin2. If you are able to play it, then you should use decodebin2, debug its output, and construct your pipeline accordingly.
The syntax for playbin2 is as follows:
gst-launch playbin2 uri=file:///home/user1031040/Desktop/file.mpg
The syntax for decodebin2 is as follows:
gst-launch filesrc location=file.mpg ! decodebin2 ! autovideosink
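If the file really is an MPEG-2 transport stream, a more explicit pipeline would demux it before decoding. This is only a sketch: it assumes the mpegtsdemux element (from gst-plugins-bad in GStreamer 0.10) is installed, that the stream contains MPEG-2 video, and that no extra parser is needed between the demuxer and the decoder:
gst-launch filesrc location=open_season.mpg ! mpegtsdemux ! mpeg2dec ! ffmpegcolorspace ! xvimagesink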