How to convert a ueyesrc buffer to an OpenGL texture?

I'm trying to make a UI-3370CP-C-HQ R2 camera work on a Coral Dev Board with GStreamer.
Since the camera is not a standard v4l2 camera, I downloaded and compiled the ueyesrc GStreamer plugin (https://github.com/atdgroup/gst-plugin-ueye) on the Dev Board.
In my application I need the frames as OpenGL textures, and I'm stuck on building a working pipeline.
So far the only way I have managed to get anything from the camera is to save a frame as a JPEG:
gst-launch-1.0 ueyesrc num-buffers=10 ! jpegenc ! filesink location=ueyesrc-frame.jpg
The example pipeline provided with ueyesrc, gst-launch-1.0 ueyesrc ! videoconvert ! xvimagesink, doesn't work in my case because there is no X server on the device (only Wayland).
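Since the device runs Wayland, waylandsink (from gst-plugins-bad) might be worth a try as a plain display sink; a sketch, untested on this board:
gst-launch-1.0 ueyesrc ! videoconvert ! waylandsink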
gst-launch-1.0 ueyesrc ! videoconvert ! glimagesink returns the following error:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Got context from element 'sink': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayWayland\)\ gldisplaywayland0";
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstGLImageSinkBin:glimagesinkbin0/GstGLImageSink:sink: Failed to convert multiview video buffer
Additional debug info:
gstglimagesink.c(1741): gst_glimage_sink_prepare (): /GstPipeline:pipeline0/GstGLImageSinkBin:glimagesinkbin0/GstGLImageSink:sink
Execution ended after 0:00:00.486558117
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
With a standard USB webcam (Logitech HD Pro Webcam C920), gst-launch-1.0 v4l2src ! videoconvert ! glimagesink works fine.
I don't really understand what is going wrong or how to find more clues about it. I suppose I'm missing a conversion step in the middle, but I don't know how to fix it. Does someone have an idea?
Edit 1: it is indeed a conversion issue. I got it to work by specifying the format in the videoconvert caps: gst-launch-1.0 ueyesrc exposure=2 ! videoconvert ! video/x-raw,format=YUY2 ! glimagesink sync=false
However, the CPU usage is very high (>90% on all 4 cores of the i.MX8) and the framerate reaches a maximum of 6.5 fps.
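If the bottleneck is videoconvert doing the colorspace conversion on the CPU, one thing worth trying (a sketch, untested on the i.MX8) is to upload the raw frames to the GPU first and let the GL elements do the conversion there:
gst-launch-1.0 ueyesrc exposure=2 ! glupload ! glcolorconvert ! glimagesink sync=false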

Related

GStreamer pipeline stops playing after fast and shaky camera movement

I'm working on a video-streaming wearable device. During testing, it came up that the pipeline clock and stream stop while fast walking or running. It's bizarre behaviour, because the debug messages show no errors about a broken pipeline, apart from lost frames. It just freezes, and only a restart helps. Can you guess what causes the problem?
The pipelines I use:
streaming device:
gst-launch-1.0 -vem --gst-debug=3 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=\(fraction\)30/1 ! v4l2h264enc extra-controls=s,video_bitrate=250000 capture-io-mode=4 output-io-mode=4 ! "video/x-h264,level=(string)4" ! rtph264pay config-interval=1 ! multiudpsink clients="127.0.0.1:5008,10.123.0.2:5008"
client:
udpsrc port=5008 do-timestamp=true ! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96 ! rtpjitterbuffer latency=100 drop-on-latency=true drop-messages-interval=100000000 ! queue max-size-buffers=20000 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! glupload ! qmlglsink name=qmlglsink sync=false
The hardware I use is a PS3 Eye cam and an LTE modem to transmit video over a pretty low uplink of 1-2 Mbit/s, everything running on a Raspberry Pi 3B+ (1 GB).
For more debug info, there are also pictures of the log file: after the last registered dropped frame, every next "cycle" sends a new query, loops over the GstElements from sink to source (which is my camera), and ends with the maximum query duration (the highlighted query to v4l2src).
Do you know how to overcome this problem?
The problem has been resolved. The issue was not a variable encoder bitrate.
A more detailed inspection, and the pipeline that works for me, can be found on this GStreamer issue page.
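For anyone hitting something similar: a common first mitigation on the sender side is a leaky queue in front of the encoder, so a briefly stalled downstream branch can't back-pressure v4l2src into freezing the whole live pipeline. A sketch (not the fix from the issue page):
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=30/1 ! queue leaky=downstream max-size-buffers=5 ! v4l2h264enc extra-controls=s,video_bitrate=250000 ! "video/x-h264,level=(string)4" ! rtph264pay config-interval=1 ! multiudpsink clients="127.0.0.1:5008,10.123.0.2:5008"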

GStreamer screenshot from RTSP stream is always gray

I'm trying to create a screenshot (i.e. grab one frame) from an RTSP camera stream using a GStreamer pipeline.
The pipeline used looks like this:
gst-launch-1.0 rtspsrc location=$CAM_URL is_live=true ! decodebin ! videoconvert ! jpegenc snapshot=true ! filesink location=/tmp/frame.jpg
The problem is that the resulting image is always gray, with random artifacts. It looks like it grabs the very first frame and doesn't wait for a key frame.
Is there any way I can modify the pipeline to actually grab the first valid frame of the video? Or just wait long enough to be sure there has been at least one key frame?
I'm unsure why, but after some trial and error it is now working with decodebin3 instead of decodebin. The documentation is still a bit discouraging though, stating that decodebin3 is "still experimental API and a technology preview. Its behaviour and exposed API is subject to change."
Full pipeline looks like this:
gst-launch-1.0 rtspsrc location=$CAM_URL is_live=true ! decodebin3 ! videoconvert ! jpegenc snapshot=true ! filesink location=/tmp/frame.jpg
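If decodebin3's experimental status is a concern, an explicit decode chain should behave similarly, assuming the camera sends H.264 over RTP (a sketch; adjust the depayloader and decoder to the actual codec):
gst-launch-1.0 rtspsrc location=$CAM_URL is_live=true ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! jpegenc snapshot=true ! filesink location=/tmp/frame.jpg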

GStreamer Playbin video speed is too fast

I am trying to create a virtual web camera using GStreamer and v4l2loopback. The problem is that I want to use playbin, but the video speed is too fast when I use it. For instance, it happens when I execute the following command:
gst-launch-1.0 -v playbin uri=file:/vagrant/test.avi
video-sink="videoconvert
! videoscale
! video/x-raw,format=YUY2,width=320,height=320
! v4l2sink device=/dev/video0"
Adding "framerate=20/1" to the caps throws "Not negotiated error" while setting it to "30/1" works but doesn't help to fix the issue with the speed.
On the other hand, I am getting normal speed when executing the following command:
gst-launch-1.0 -v filesrc location=/vagrant/test.avi
! avidemux
! decodebin
! videoconvert
! videoscale
! "video/x-raw,format=YUY2,width=320,height=320"
! v4l2sink device=/dev/video0
I tried a lot of combinations with filters from the last example with the Playbin but none of them helped.
Any help would be highly appreciated!
The problem was with the virtual machine running on top of VirtualBox. To be more precise, I had 3D acceleration turned on, which caused all videos to play at 2x speed.
Turning off 3D acceleration by setting --accelerate3d=off solved the issue.
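For reference, with VBoxManage that would be something like the following, where "myvm" is a placeholder for the actual VM name:
VBoxManage modifyvm "myvm" --accelerate3d off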

How to use GStreamer to play a GIF file on Windows

As the title says,
I use this command to play a GIF on Windows, but it just shows the first frame and then closes:
gst-launch-1.0 filesrc location=demo.gif ! gdkpixbufdec ! videoconvert ! autovideosink
I want to play the whole GIF file; is there some GStreamer element or parameter I forgot to set up?
Since GStreamer 1.14, you can use elements from the libav library to build a GIF pipeline.
This is a sample command:
gst-launch-1.0 filesrc location=demo.gif ! avdemux_gif ! avdec_gif ! autovideosink
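If the GIF should also loop, the loop property of multifilesrc is one option (untested; assumes GStreamer >= 1.14 with gst-libav installed):
gst-launch-1.0 multifilesrc location=demo.gif loop=true ! avdemux_gif ! avdec_gif ! videoconvert ! autovideosink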

GStreamer: display various images on top of video

I wrote a video player based on GStreamer. Now I need to display status images on top of the playing video when certain events occur. I tried the following pipeline for testing purposes:
gst-launch-1.0 videotestsrc ! videomixer name=mix ! videoconvert ! autovideosink filesrc location=pic.jpg ! jpegdec ! videoconvert ! imagefreeze ! mix.
to display the image (implemented in C). To hide the image I set the pipeline to GST_STATE_READY, unlink and remove the filesrc, jpegdec, videoconvert and imagefreeze elements, and set the pipeline back to the playing state, but that doesn't work (the video is not playing anymore).
Could someone suggest the right way of showing and hiding images on top of a playing video?
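One approach that avoids relinking altogether: keep the image branch in the pipeline permanently and toggle the alpha property of the videomixer sink pad the image is attached to (in C, via g_object_set on that request pad). A gst-launch sketch of the same idea, with the overlay hidden via the pad property:
gst-launch-1.0 videotestsrc ! videomixer name=mix sink_1::alpha=0.0 ! videoconvert ! autovideosink filesrc location=pic.jpg ! jpegdec ! videoconvert ! imagefreeze ! mix.
Setting the alpha back to 1.0 shows the image again without ever changing the pipeline state.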