Green screen when taking camera output and converting it to RTP on Mac OS X - gstreamer

I'm wondering if anyone can see an obvious reason why the output I'm seeing is a green screen. The source is my MacBook's webcam.
I'm encoding the stream in VP8 and then sending it as RTP to a remote source.
Relevant pipeline:
"autovideosrc ! videoconvert ! vp8enc deadline=1 ! rtpvp8pay ! queue ! " RTP_CAPS_VP8 "96 ! sendrecv. "
Critical error:
(webrtc-sendrecv:48489): GStreamer-Video-CRITICAL **: 10:24:08.124: gst_video_frame_map_id: assertion 'info->width <= meta->width' failed
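Not sure this is the fix, but one hedged thing to try: the assertion suggests a stride/width mismatch between the frames the camera delivers and what vp8enc maps, so pinning the format and size after videoconvert sometimes avoids it. The I420 caps and 640x480 resolution below are assumptions, not a verified fix for this exact setup:
"autovideosrc ! videoconvert ! video/x-raw,format=I420,width=640,height=480 ! vp8enc deadline=1 ! rtpvp8pay ! queue ! " RTP_CAPS_VP8 "96 ! sendrecv. "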

Related

Streaming live video after OpenCV image processing

I am trying to stream a live video feed from a camera connected to a Jetson NX to a computer on the same network. The link is wireless Ethernet: the Jetson sees it as a wired connection, but in reality it is wireless and limited in bitrate.
On the Jetson side, I am using OpenCV's VideoWriter to send frames over the network with this pipeline:
cv::VideoWriter video_write(
    "appsrc ! video/x-raw,format=BGR ! queue ! videoconvert ! video/x-raw,format=BGRx ! nvvidconv "
    "! video/x-raw(memory:NVMM),format=NV12,width=640,height=360,framerate=30/1 "
    "! nvv4l2h264enc insert-sps-pps=1 insert-vui=1 idrinterval=30 bitrate=1800000 EnableTwopassCBR=1 "
    "! h264parse ! rtph264pay ! udpsink host=169.254.84.2 port=5004 auto-multicast=0",
    cv::CAP_GSTREAMER, 0, 30.0, cv::Size(640, 360), true);
On the receiving computer my video capture is:
cv::VideoCapture video(
    "udpsrc port=5004 auto-multicast=0 "
    "! application/x-rtp,media=video,encoding-name=H264 ! rtpjitterbuffer latency=0 "
    "! rtph264depay ! decodebin ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1",
    cv::CAP_GSTREAMER);
The problem is, if the camera jitters or moves a lot in any direction, the video stream either freezes or completely pixelates. I was hoping to get suggestions for either a better encoder (I'm not limited to nvv4l2h264enc or to H.264 in general), a fix for the pixelation and freezing, or perhaps a better way to stream the video than VideoWriter.
I am trying to stream 360p video at 30 fps; my bitrate is limited to either 6 Mbit/s or 2.5 Mbit/s depending on the distance I want to allow. It does not seem to be a network problem, simply because changing parameters such as the codec (MJPG instead of the GStreamer pipeline, for example) changes the behaviour of the video feed; in my case it reduces the freezing but makes the pixelation worse.
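One hedged thing to try before swapping encoders: rtpjitterbuffer latency=0 gives the receiver no time to reorder or wait for late packets, so any burst of motion (more bits per frame) tends to show up as freezes or smearing until the next IDR frame. A receive pipeline with a modest jitterbuffer, where the 200 ms value is an assumption to tune against your link:
udpsrc port=5004 auto-multicast=0 ! application/x-rtp,media=video,encoding-name=H264 ! rtpjitterbuffer latency=200 ! rtph264depay ! decodebin ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1
On the sender, a short idrinterval (you already use 30) plus constant bitrate keeps recovery after packet loss fast; the trade-off is slightly worse quality per bit.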

GStreamer pipeline stops playing after fast and shaky camera movement

I'm working on a video-streaming wearable device. During testing it turned out that the pipeline clock and stream stop while fast walking or running. It's bizarre behaviour, because the debug messages show no errors about a broken pipeline, apart from lost frames. The stream freezes and only a restart helps. Can anyone guess what causes the problem?
The pipelines I use:
streaming device:
gst-launch-1.0 -vem --gst-debug=3 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=\(fraction\)30/1 ! v4l2h264enc extra-controls=s,video_bitrate=250000 capture-io-mode=4 output-io-mode=4 ! "video/x-h264,level=(string)4" ! rtph264pay config-interval=1 ! multiudpsink clients="127.0.0.1:5008,10.123.0.2:5008"
client:
udpsrc port=5008 do-timestamp=true ! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96 ! rtpjitterbuffer latency=100 drop-on-latency=true drop-messages-interval=100000000 ! queue max-size-buffers=20000 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! glupload ! qmlglsink name=qmlglsink sync=false
The hardware I use is a PS3 Eye cam and an LTE modem with a pretty low uplink of 1-2 Mbit/s, everything running on a Raspberry Pi 3B+ (1 GB).
For more debug info there are also pictures of the log file: after the last registered dropped frame, every following "cycle" sends a new query that loops over the GstElements from sink to source (my camera) and ends with the maximum query duration (the highlighted query goes to v4l2src).
Do you know how to overcome this problem?
The problem has been resolved. The issue was not the variable encoder bitrate.
A more detailed inspection and a pipeline that works for me are on this GStreamer issue page.
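For readers without access to that page, a hedged client-side variant along the lines usually suggested for lossy LTE links; the larger jitterbuffer and the leaky queue are assumptions to tune, not the exact fix from the issue:
udpsrc port=5008 do-timestamp=true ! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96 ! rtpjitterbuffer latency=500 ! queue max-size-buffers=1000 leaky=downstream ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! glupload ! qmlglsink name=qmlglsink sync=false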

Trying to stream a virtually created camera video0 device as an RTSP stream for VLC using GStreamer in 24-bit RGB format

I have created a video0 device using v4l2loopback and used the sample code V4l2Loopback_cpp as an application to stream JPG images from a folder sequentially, after altering some conditions in the code. The code reads the images as 24-bit RGB and sends them to the video0 device, which is fine, because the images play like a proper video in VLC's video device capture. As mentioned earlier, checking the video's properties in VLC shows the content summarized below.
I need this video0 device to stream RTSP H.264 video in VLC using the GStreamer library.
I used the following command for testing on the command line, but it shows some internal process error:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=590,height=332,framerate=30/1 ! x264enc ! rtph264pay name=pay0 pt=96
I don't know what the problem is here: is it the 24-bit RGB format or the GStreamer command I use? I need a proper GStreamer command line that turns the video0 device into an H.264 RTSP stream. Any help is appreciated, thank you.
Image format - JPG (sequence of images passed)
Video0 receives - 24-bit RGB image
Output needed - H.264 RTSP stream from video0
Not sure this is a solution, but the following may help.
You may try adjusting the resolution according to what V4L reports (width=584):
v4l2src device=/dev/video0 ! video/x-raw,format=RGB,width=584,height=332,framerate=30/1 ! videoconvert ! x264enc insert-vui=1 ! h264parse ! rtph264pay name=pay0 pt=96
Note that this error may happen on the v4l2loopback receiver side while it is actually a sender error. If you're feeding v4l2loopback with a GStreamer pipeline into v4l2sink, you could try adding an identity element such as:
... ! video/x-raw,format=RGB,width=584,height=332,framerate=30/1 ! identity drop-allocation=1 ! v4l2sink
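If the goal is an RTSP URL that VLC can open directly, the pay0 naming suggests gst-rtsp-server; a hedged sketch using its test-launch example binary (the mount point and port below are test-launch's defaults, and the caps are the ones suggested above):
./test-launch "( v4l2src device=/dev/video0 ! video/x-raw,format=RGB,width=584,height=332,framerate=30/1 ! videoconvert ! x264enc tune=zerolatency ! h264parse ! rtph264pay name=pay0 pt=96 )"
VLC should then be able to play rtsp://<server-ip>:8554/test.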

How to convert a ueyesrc buffer to an OpenGL texture?

I'm trying to make a UI-3370CP-C-HQ R2 Camera work on a Coral DevBoard with gstreamer.
Since the camera is no standard v4l2 camera, I downloaded and compiled the ueyesrc gst plugin (https://github.com/atdgroup/gst-plugin-ueye) on the devboard.
In my application I need to have the frame as opengl textures and I'm stuck at building a working pipeline.
So far the only way I have managed to get something from the camera is to save a frame as JPEG:
gst-launch-1.0 ueyesrc num-buffers=10 ! jpegenc ! filesink location=ueyesrc-frame.jpg
The pipeline example provided with ueyesrc, gst-launch-1.0 ueyesrc ! videoconvert ! xvimagesink, doesn't work in my case because there is no X server on the device (only Wayland).
gst-launch-1.0 ueyesrc ! videoconvert ! glimagesink returns the following error:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Got context from element 'sink': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayWayland\)\ gldisplaywayland0";
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstGLImageSinkBin:glimagesinkbin0/GstGLImageSink:sink: Failed to convert multiview video buffer
Additional debug info:
gstglimagesink.c(1741): gst_glimage_sink_prepare (): /GstPipeline:pipeline0/GstGLImageSinkBin:glimagesinkbin0/GstGLImageSink:sink
Execution ended after 0:00:00.486558117
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
With a standard USB webcam (Logitech HD Pro Webcam C920), gst-launch-1.0 v4l2src ! videoconvert ! glimagesink works fine.
I don't really understand what is going wrong or how to find more clues about it. I suppose I'm missing a conversion step in the middle, but I don't know how to fix it. Does someone have an idea?
edit 1: it is indeed a conversion issue. I got it to work by specifying the format in the videoconvert caps: gst-launch-1.0 ueyesrc exposure=2 ! videoconvert ! video/x-raw,format=YUY2 ! glimagesink sync=False
However, the CPU usage is very high (>90% on all 4 cores of the i.MX8) and the framerate reaches a maximum of 6.5 fps.
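A hedged follow-up idea for the CPU load, assuming ueyesrc can negotiate a raw format that glupload accepts: let the GPU do the colorspace conversion with glcolorconvert instead of videoconvert, e.g.
gst-launch-1.0 ueyesrc exposure=2 ! glupload ! glcolorconvert ! glimagesink sync=False
If caps negotiation fails, the conversion really has to stay on the CPU, and reducing the capture resolution or framerate at the source may be the only lever left.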

How to record video (1080p 30fps) from a Raspberry Pi camera and stream the 'recording in progress' file simultaneously?

The objective I am trying to achieve is to stream 1080p video from the Raspberry Pi camera and record the video simultaneously.
I tried recording with the HTTP stream as the source, but that didn't work at 30 fps: a lot of frames were missing and I got only about 8 fps.
As a second approach, I am trying to record the file directly from the camera and then stream the "recording in progress"/buffer file. For this I am trying to use GStreamer. Please suggest whether this is a good option or whether I should try another.
For recording with GStreamer I used:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter caps="video/x-raw,width=1920,height=1080,framerate=30/1" ! videoflip method=clockwise ! videoflip method=clockwise ! videoconvert ! videorate ! x264enc ! avimux ! filesink location=test_video.h264
Result: the recorded video is 1080p at 30 fps, but frames are dropping heavily.
For streaming the video buffer I used UDP in GStreamer:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter caps="video/x-raw,width=640,height=480,framerate=30/1" ! x264enc ! queue ! rtph264pay ! udpsink host=192.168.5.1 port=8080
Result: no specific errors on the terminal, but I can't get the stream in VLC.
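One hedged explanation for the VLC side: VLC cannot join a bare RTP/UDP stream without a session description. Adding config-interval=1 to rtph264pay (so SPS/PPS repeat in-band) and opening an SDP file in VLC is the usual route; the file below assumes the host and port from the udpsink line above:
v=0
c=IN IP4 192.168.5.1
m=video 8080 RTP/AVP 96
a=rtpmap:96 H264/90000
Save it as stream.sdp and open that file in VLC on the receiving machine.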
Please suggest the best method here.
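If a single pipeline is acceptable, a hedged sketch that encodes once and uses tee to record and stream at the same time; the bitrate, container, and host are assumptions to adjust, and on a Raspberry Pi the hardware encoder (e.g. v4l2h264enc) may be needed instead of x264enc at 1080p30:
gst-launch-1.0 -e v4l2src device=/dev/video0 ! video/x-raw,width=1920,height=1080,framerate=30/1 ! videoconvert ! x264enc tune=zerolatency bitrate=4000 ! tee name=t t. ! queue ! h264parse ! matroskamux ! filesink location=test_video.mkv t. ! queue ! h264parse ! rtph264pay config-interval=1 ! udpsink host=192.168.5.1 port=8080
The -e flag makes Ctrl-C send EOS so the recording file is finalized cleanly.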