How to use GStreamer to play a GIF file on Windows

As the title says:
I use this command to play a GIF on Windows, but it just shows the first frame and then closes.
gst-launch-1.0 filesrc location=demo.gif ! gdkpixbufdec ! videoconvert ! autovideosink
I want to play the whole GIF file. Is there some GStreamer element or parameter I forgot to set up?

Since GStreamer 1.14, you can use elements from the libav library to build a GIF pipeline.
Here is a sample command:
gst-launch-1.0 filesrc location=demo.gif ! avdemux_gif ! avdec_gif ! autovideosink
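If autovideosink rejects the decoder's output format on your system, a safe variation is to add a videoconvert before the sink, the same element the original attempt already used:
gst-launch-1.0 filesrc location=demo.gif ! avdemux_gif ! avdec_gif ! videoconvert ! autovideosink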

Related

How to record video (1080p 30fps) from raspberry pi camera and stream the 'recording in progress' file simultaneously?

The objective I am trying to achieve is to stream 1080p video from a Raspberry Pi camera and record the video simultaneously.
I tried recording with the HTTP stream as the source, but it didn't work at 30 fps; a lot of frames were missing and I got only about 8 fps.
As a second approach, I am trying to record the file directly from the camera and then stream the "recording in progress"/buffer file. For this I am trying to use GStreamer. Please suggest whether this is a good option or whether I should try something else.
For recording using GStreamer I used:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter caps="video/x-raw,width=1920,height=1080,framerate=30/1" !
videoflip method=clockwise ! videoflip method=clockwise ! videoconvert ! videorate ! x264enc ! avimux ! filesink location=test_video.avi
Result: the recorded video shows 1080p and 30 fps, but frames are dropping heavily.
For streaming the video buffer I used UDP in GStreamer:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter caps="video/x-raw,width=640,height=480,framerate=30/1" ! x264enc ! queue ! rtph264pay ! udpsink host=192.168.5.1 port=8080
Result: no specific errors on the terminal, but I can't get the stream in VLC.
Please suggest the best method here.
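A common pattern for this requirement (a sketch, not from the original thread) is to encode once and split the pipeline with tee, so the same H.264 stream feeds both the file branch and the network branch; note also that VLC generally needs an .sdp file describing the RTP session before it can play a bare UDP/RTP feed. Host, port, and filenames below are placeholders:
gst-launch-1.0 -v v4l2src device=/dev/video0 \
  ! video/x-raw,width=1920,height=1080,framerate=30/1 \
  ! videoconvert ! x264enc tune=zerolatency speed-preset=ultrafast ! tee name=t \
  t. ! queue ! h264parse ! matroskamux ! filesink location=test_video.mkv \
  t. ! queue ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.5.1 port=8080
The zerolatency/ultrafast settings trade compression efficiency for CPU headroom, which is usually the limiting factor for 1080p30 on a Raspberry Pi.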

How do I save a video with an alpha channel in GStreamer?

I have a collection of RGBA PNG files, and have verified the presence of an alpha layer in each file:
gst-launch-1.0 multifilesrc location="pics/%d.png" ! decodebin ! videorate ! videoconvert ! video/x-raw,format=BGRA,framerate=60/1 ! videomixer background=checker ! videoconvert ! ximagesink
I want to take these files and make them into a video file (in any format that GStreamer will readily handle with a simple decodebin). What would be a good set of encoders, containers, and elements to use for this?
I've tried avimux, but no alpha data was saved. I also tried avenc_huffyuv, and that would decode fine as raw data using avdec_huffyuv, but decodebin could not detect it.
Nothing like a good night's sleep to solve an issue...
Apparently the huffyuv encoder and AVI muxer work nicely together to preserve transparency:
gst-launch-1.0 multifilesrc location="pics/%d.png" ! decodebin ! videorate ! videoconvert ! video/x-raw,format=BGRA,framerate=60/1 ! avenc_huffyuv ! avimux ! filesink location=/tmp/test.avi
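To verify that the alpha channel survived the round trip, the resulting file can be decoded and composited over a checker background, mirroring the verification pipeline from the question (a quick sanity check, not part of the original answer):
gst-launch-1.0 filesrc location=/tmp/test.avi ! decodebin ! videoconvert ! video/x-raw,format=BGRA ! videomixer background=checker ! videoconvert ! ximagesink
If the checker pattern shows through the transparent regions, the alpha data was preserved by the encoder and muxer.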

gstreamer pipeline code to display text over live streaming from camera and not image?

Can anyone please suggest a working pipeline to display text over live streaming using v4l2src and mfw_isink?
I have a working pipeline to display textoverlay, clockoverlay, and timeoverlay, but with a test source and test sink, not with the desired source and sink.
gst-launch videotestsrc pattern=blue ! textoverlay font-desc="Sans 32" text="CAM1 Disconnected" valign=top halign=left ! ximagesink
You can just replace your elements with the needed ones, and insert videoconvert if needed. I suggest you learn how to write pipeline descriptions, however.
Here I replaced videotestsrc with v4l2src and added a videoconvert element, because v4l2src and ximagesink have no common video format in their lists of supported formats:
gst-launch-1.0 v4l2src ! videoconvert ! textoverlay font-desc="Sans 32" text="CAM1 Disconnected" ! ximagesink
I used GStreamer 1.x here.
You can replace ximagesink with your custom sink.
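Since the question also mentions clockoverlay and timeoverlay, all three overlays can simply be chained in one pipeline (a sketch for GStreamer 1.x; substitute mfw_isink for ximagesink on i.MX if the formats allow, possibly with a platform colorspace converter in between):
gst-launch-1.0 v4l2src ! videoconvert ! textoverlay font-desc="Sans 32" text="CAM1 Disconnected" valign=top halign=left ! clockoverlay valign=bottom halign=right ! timeoverlay valign=bottom halign=left ! videoconvert ! ximagesink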

How do I use GStreamer gst-launch to re-mux an interlaced h264 stream from an RTSP source?

Is there any way to take an interlaced H.264 stream and re-mux it into Matroska or QT container form in a way that players will correctly identify it as interlaced? My basic pipeline is:
gst-launch-1.0 -e souphttpsrc location="http://hostname/stream1.sdp" ! application/sdp ! sdpdemux ! rtpjitterbuffer ! rtph264depay ! h264parse ! matroskamux ! filesink location=test.mkv
This saves the file just fine, but in all the players I've tried, the interlaced nature of the video was not detected.
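For the QT-container variant of the same re-mux, qtmux can be swapped in for matroskamux (a sketch; whether a player flags the result as interlaced still depends on the interlacing being signalled in the stream's own SPS/VUI, which h264parse passes through rather than rewrites):
gst-launch-1.0 -e souphttpsrc location="http://hostname/stream1.sdp" ! application/sdp ! sdpdemux ! rtpjitterbuffer ! rtph264depay ! h264parse ! qtmux ! filesink location=test.mov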

gstreamer: display various image on top of video

I wrote a video player based on GStreamer. Now I need to display status images on top of the playing video when some event occurs. I tried the following pipeline for testing purposes
gst-launch-1.0 videotestsrc ! videomixer name=mix ! videoconvert ! autovideosink filesrc location=pic.jpg ! jpegdec ! videoconvert ! imagefreeze ! mix.
to display the image (implemented in C). To hide the image I set the pipeline to GST_STATE_READY, unlink and remove the filesrc, jpegdec, videoconvert, and imagefreeze elements, and set the pipeline back to the playing state, but that doesn't work (the video is not playing anymore).
Could someone suggest the right way of showing and hiding images on top of a playing video?
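One approach that avoids tearing the pipeline apart (a sketch, not an answer from the thread) is to leave the image branch linked and toggle the alpha property of the videomixer sink pad between 1.0 (shown) and 0.0 (hidden), via g_object_set on the request pad from the C code. videomixer implements GstChildProxy, so the hidden state can be expressed directly in gst-launch syntax for testing:
gst-launch-1.0 videotestsrc ! videomixer name=mix sink_1::alpha=0.0 ! videoconvert ! autovideosink filesrc location=pic.jpg ! jpegdec ! videoconvert ! imagefreeze ! mix.
Which sink pad the image branch receives (sink_1 here) depends on link order, so in application code it is safer to keep a reference to the pad returned by gst_element_get_request_pad() and set alpha on that.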