GStreamer: get the length of an MP3 file from the console

I want to retrieve the duration of an MP3 file with GStreamer using a command on the console, but I don't know how.
I tried the following command:
gst-launch filesrc location=$myMediaFile ! decodebin2 ! fakesink
but I got the following result:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 370731000 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
The reported time of 370731000 ns (about 0.37 s) does not correspond to the duration of the media, which is 86 seconds.

That number is the wall-clock time gst-launch took to run the pipeline; with a fakesink the file is decoded as fast as possible, so it says nothing about the media duration. If you have gst-discoverer you can get the duration with this command line:
gst-discoverer-0.10 -v $myMediaFile
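If you only want the duration rather than the full verbose dump, a small sketch (assuming a POSIX shell, that $myMediaFile points at the MP3, and that the "Duration" label matches your gst-discoverer build's output format):

```shell
# Print only the duration line from gst-discoverer's output.
# "$myMediaFile" is a placeholder for your MP3's path.
gst-discoverer-0.10 "$myMediaFile" | grep -i duration

# On GStreamer 1.x installs the tool is named gst-discoverer-1.0:
gst-discoverer-1.0 "$myMediaFile" | grep -i duration
```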

Related

errors when run ximagesink/xvimagesink on imx8mm board

I was trying GStreamer with xvimagesink/ximagesink on an imx8mm board, along with a gstvideooverlay program. The error output is listed below:
gst-launch-1.0 videotestsrc ! videoconvert ! ximagesink
Setting pipeline to PAUSED ...
ERROR: from element /GstPipeline:pipeline0/GstXImageSink:ximagesink0: Could not initialise X output
Additional debug info:
../sys/ximage/ximagesink.c(867): gst_x_image_sink_xcontext_get (): /GstPipeline:pipeline0/GstXImageSink:ximagesink0:
Could not open display
ERROR: pipeline doesn't want to preroll.
Failed to set pipeline to PAUSED.
Setting pipeline to NULL ...
Freeing pipeline ...
./overlay
error: XDG_RUNTIME_DIR not set in the environment.
error: XDG_RUNTIME_DIR not set in the environment.
QMetaObject::invokeMethod: No such method MainWindow::quit()
Any suggestions on how to debug this? Thanks for any replies.
BR
GUO
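Both errors point at a missing graphical environment rather than at GStreamer itself: ximagesink cannot open an X display, and the XDG_RUNTIME_DIR complaints come from the runtime directory being unset. A sketch of things to try before re-running — the display number, runtime-directory path, and the choice of waylandsink are assumptions about the board's setup, not a confirmed fix:

```shell
# If the board runs an X server, tell clients which display to use.
export DISPLAY=:0

# Wayland/EGL clients need XDG_RUNTIME_DIR; point it at the usual
# per-user runtime directory (created here if it does not exist).
export XDG_RUNTIME_DIR=/run/user/$(id -u)
mkdir -p "$XDG_RUNTIME_DIR"

# Retry with X11...
gst-launch-1.0 videotestsrc ! videoconvert ! ximagesink

# ...or, if the board runs Weston/Wayland instead of X11:
gst-launch-1.0 videotestsrc ! videoconvert ! waylandsink
```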

Gstreamer: moov-recovery info with qtmux?

I am trying to experiment with Gstreamer and moov-recovery via qtmux.
When I try to get the recovery moov from a non-corrupted .mp4 file
gst-launch-1.0 filesrc location=full.mp4 ! qtdemux ! qtmux moov-recovery-file=moov_recov.mrf ! filesink location=recovered_video.mp4
then I get
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 0:00:00.112361582
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
What is the reason for the Got EOS from element "pipeline0"?
And what would be the correct way to pull the recovery moov from the .mp4 file?
Thanks.
Your muxing process was a success; it took about a tenth of a second, hence the EOS ("end of stream"), which is simply how the pipeline signals that all input has been processed. Since the mux completed without crashing, the recovery file probably gets removed afterwards: it only matters if muxing is interrupted, so there is no point in keeping it after a successful run.
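For actually consuming a recovery file, gst-plugins-good ships a dedicated qtmoovrecover element. A hedged sketch — the property names are taken from the qtmoovrecover documentation, and broken.mp4 stands in for a recording that was cut short; the .mrf file must come from the same interrupted qtmux run:

```shell
# Rebuild the moov atom of an interrupted recording from the
# moov-recovery file that qtmux wrote while muxing.
gst-launch-1.0 qtmoovrecover recovery-input=moov_recov.mrf \
    broken-input=broken.mp4 fixed-output=fixed.mp4
```

The recovery file is only useful together with the broken output of the same muxing run, which is consistent with it being discarded after a successful one.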

Gstreamer pipeline fails to run with osxaudiosrc plugin

I am running the pipeline below on macOS, but it shows an error while running:
$ gst-launch-1.0 osxaudiosrc device=92
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
0:00:00.048601000 16777 0x7fafe585d980 WARN osxaudio gstosxcoreaudio.c:500:gst_core_audio_asbd_to_caps: No sample rate
0:00:00.048699000 16777 0x7fafe585d980 ERROR audio-info audio-info.c:304:gboolean gst_audio_info_from_caps(GstAudioInfo *, const GstCaps *): no channel-mask property given
0:00:00.048736000 16777 0x7fafe585d980 WARN basesrc gstbasesrc.c:3072:void gst_base_src_loop(GstPad *):<osxaudiosrc0> error: Internal data stream error.
0:00:00.048744000 16777 0x7fafe585d980 WARN basesrc gstbasesrc.c:3072:void gst_base_src_loop(GstPad *):<osxaudiosrc0> error: streaming stopped, reason not-negotiated (-4)
New clock: GstAudioSrcClock
ERROR: from element /GstPipeline:pipeline0/GstOsxAudioSrc:osxaudiosrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3072): void gst_base_src_loop(GstPad *) ():
/GstPipeline:pipeline0/GstOsxAudioSrc:osxaudiosrc0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.000101000
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
The device ID in the command was fetched from gst-inspect and corresponds to the MacBook's speakers. I am using GStreamer 1.16.2 on Catalina.
What is wrong or missing in this pipeline?
TL;DR: you have an incomplete pipeline.
Once osxaudiosrc starts producing buffers, where are they supposed to go? Do you want to encode them and/or write them to a file? Should they be streamed somewhere? Should they be plotted? ...
This is also why GStreamer errors out. There is no element after your source element, so if the pipeline started playing, those buffers would have no destination (more precisely: the source would try to push data on a pad that has no peer). Since this is not possible, GStreamer simply stops.
An example pipeline is given in the osxaudiosrc documentation:
gst-launch-1.0 osxaudiosrc ! wavenc ! filesink location=audio.wav

Error: Failed to write input into the OpenMAX buffer

I am trying to encode uncompressed video as H.265; however, the following pipeline fails with an error message I cannot resolve. I am following the example code in the Tegra X1 Multimedia User Guide, and I do not understand why this pipeline does not work. I am a beginner in video compression, so any help would be appreciated. The command and error message:
ubuntu@tegra-ubuntu:~$ gst-launch-1.0 filesrc location=small_mem_vid.mov ! 'video/x-raw, format=(string)I420, framerate=(fraction)30/1, width=(int)1280, height=(int)720' ! omxh265enc ! filesink location=new_encode.mov -e
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingPipeline is PREROLLING ...
Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 8
===== MSENC =====
NvMMLiteBlockCreate : Block : BlockType = 8
ERROR: from element /GstPipeline:pipeline0/GstOMXH265Enc-omxh265enc:omxh265enc-omxh265enc0: Could not write to resource.
Additional debug info:
/dvs/git/dirty/git-master_linux/external/gstreamer/gst-omx/omx/gstomxvideoenc.c(2139): gst_omx_video_enc_handle_frame (): /GstPipeline:pipeline0/GstOMXH265Enc-omxh265enc:omxh265enc-omxh265enc0:
Failed to write input into the OpenMAX buffer
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
ubuntu@tegra-ubuntu:~$
Are you sure the .mov file really contains uncompressed video? The .mov extension is commonly used for QuickTime containers. You could use "mediainfo" on Linux to discover more details about the format of the file. If the video is compressed, you cannot go directly from filesrc to the encoder; you probably need a qtdemux and a decoder, maybe avdec_h264, depending on what mediainfo shows.
You also might want to enable more detailed debugging:
export GST_DEBUG="*:4"
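As a concrete illustration of that suggestion, a hypothetical pipeline assuming mediainfo reports H.264 video in a QuickTime container — the parser/decoder names are assumptions, so substitute whatever matches your file, and note that raw H.265 output is better wrapped in a container than written bare to a .mov:

```shell
# Demux the QuickTime file, decode the H.264 stream, then re-encode
# to H.265 and mux into Matroska. -e sends EOS on Ctrl-C so the
# output file gets finalized properly.
gst-launch-1.0 filesrc location=small_mem_vid.mov ! qtdemux ! \
    h264parse ! avdec_h264 ! videoconvert ! omxh265enc ! \
    h265parse ! matroskamux ! filesink location=new_encode.mkv -e
```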

GStreamer: Play mpeg2

I'm trying to play a local mpeg2 TS file with gstreamer with this:
gst-launch filesrc location=open_season.mpg ! mpeg2dec ! xvimagesink
The first frame appears as big blocks of color and then playback stops. Any thoughts on what I'm doing wrong here? Does a TS file need to be handled differently?
Here's the log:
$ gst-launch filesrc location=open_season.mpg ! mpeg2dec ! xvimagesink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
WARNING: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: Internal data flow problem.
Additional debug info:
gstbasesink.c(3492): gst_base_sink_chain_unlocked (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
Received buffer without a new-segment. Assuming timestamps start from 0.
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 6866757291 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
I think you should first try to play the file with playbin2. If you are able to play it, then use decodebin2, debug its output, and construct your pipeline accordingly.
The syntax for playbin2 is as follows (note there must be no spaces around the = in gst-launch property assignments):
gst-launch playbin2 uri=file:///home/user1031040/Desktop/file.mpg
The syntax for decodebin2 is as follows:
gst-launch filesrc location=file.mpg ! decodebin2 ! autovideosink
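To answer the TS question directly: yes, an MPEG transport stream needs a demuxer before mpeg2dec, which is likely why the original pipeline showed blocks of color and stalled. decodebin2 inserts the demuxer automatically; an explicit 0.10-era sketch (mpegtsdemux ships in gst-plugins-bad, so availability depends on your install):

```shell
# Demux the transport stream before handing the video elementary
# stream to mpeg2dec.
gst-launch filesrc location=open_season.mpg ! mpegtsdemux ! \
    mpeg2dec ! xvimagesink
```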