GStreamer: How to play video from a USB camera with the spca561 driver?

I have a Logitech QuickCam Chat camera. When I run the "v4l2-ctl" tool, I can see that it uses the spca561 driver.
I tried the "cheese" tool, but it said "No device was found". However, if I run the following, it works:
vlc v4l2:///dev/video0
I want to use the "gstreamer" tool. I run the following command:
gst-launch-1.0 v4l2src ! xvimagesink
but it's not working.
How can I capture video with gstreamer? Why does cheese fail to capture when VLC works?

I was trying to use a very old Logitech USB camera, 'Logitech, Inc. QuickCam Express' ... yes, the one from 1999 :-O .
The kernel detects it, and it seems to be spca561: gspca_main: spca561-2.14.0 probing 046d:0928
VLC shows nice live video captured from /dev/video0: 352x288 at 30 fps.
guvcview works very well too.
To make it work from gst-launch you can try this:
gst-launch-1.0 -vvv v4l2src device=/dev/video0 ! video/x-bayer,width=176,height=144 ! bayer2rgb ! videoconvert ! autovideosink
It seems that GStreamer detects the supported formats incorrectly while VLC just works, so it is probably a GStreamer problem.
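Before fighting caps negotiation, it can help to compare what the driver actually advertises with what GStreamer thinks it can get. A quick check (a sketch, assuming the v4l-utils package is installed and the camera is /dev/video0):

```shell
# List every pixel format, resolution and frame rate the driver advertises
v4l2-ctl --device=/dev/video0 --list-formats-ext

# Ask GStreamer (verbosely) which caps v4l2src negotiates for one buffer
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=1 ! fakesink
```

If v4l2-ctl reports a Bayer format that the verbose gst-launch output never shows, that supports the theory above that GStreamer's format detection, not the driver, is at fault.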

Related

Trying to stream a virtually created camera video0 device to make an RTSP stream for VLC using gstreamer in 24-bit RGB format

I have created a video0 device using v4l2loopback and used the sample Git code V4l2Loopback_cpp as an application to stream JPG images from a folder sequentially, after altering some conditions in the code. The code reads the images as 24-bit RGB and sends them to the video0 device, which is fine, because the images play like a proper video in VLC's video device capture. As I mentioned earlier, when I check the VLC properties of the video, it shows the following content.
I need this video0 device to stream RTSP H.264 video in VLC using the gstreamer lib.
I have used the following command for testing on the command line, but it shows some internal process error:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=590,height=332,framerate=30/1 ! x264enc ! rtph264pay name=pay0 pt=96
I don't know what the problem is here. Is it the 24-bit JPEG format or the gstreamer command I use? I need a proper gstreamer command line to stream H.264 RTSP video from the video0 device. Any help is appreciated, thank you.
Image format - JPG (sequence of images passed)
video0 receives - 24-bit RGB image
Output needed - H.264 RTSP stream from video0
Not sure this is a solution, but the following may help.
You may try adjusting the resolution to what V4L reports (width=584):
v4l2src device=/dev/video0 ! video/x-raw,format=RGB,width=584,height=332,framerate=30/1 ! videoconvert ! x264enc insert-vui=1 ! h264parse ! rtph264pay name=pay0 pt=96
Note that this error may appear on the v4l2loopback receiver side while actually being a sender error. If you're feeding v4l2loopback from a gstreamer pipeline into v4l2sink, you could try adding an identity element such as:
... ! video/x-raw,format=RGB,width=584,height=332,framerate=30/1 ! identity drop-allocation=1 ! v4l2sink
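A pipeline ending in rtph264pay name=pay0 pt=96 is the launch-line form expected by gst-rtsp-server rather than by gst-launch itself, which is one possible source of the "internal process error". One way to actually serve it over RTSP is the test-launch example binary built from gst-rtsp-server's sources (a sketch, not a verified setup; binary path and availability vary by distro):

```shell
# Serve the loopback device over RTSP at rtsp://<host>:8554/test
./test-launch "( v4l2src device=/dev/video0 ! \
    video/x-raw,format=RGB,width=584,height=332,framerate=30/1 ! \
    videoconvert ! x264enc tune=zerolatency ! h264parse ! \
    rtph264pay name=pay0 pt=96 )"

# Then, on a client:
vlc rtsp://127.0.0.1:8554/test
```

The parentheses around the pipeline are part of test-launch's expected syntax; the encoder settings here are illustrative, not tuned.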

Need to decode a RTSP stream to V4l2loopback with gstreamer

I am trying to decode an RTSP stream to a V4L2 device on a Raspberry Pi Zero.
First I create a loopback device this way:
sudo modprobe v4l2loopback video_nr=25
Then I try to decode the RTSP stream to a virtual V4L2 device. I would prefer to sink it in YUYV format:
sudo gst-launch-1.0 rtspsrc location=rtsp://192.168.1.138:8554/unicast ! rtph264depay ! h264parse ! decodebin ! videoconvert ! video/x-raw,format=YUY2 ! tee ! v4l2sink device=/dev/video25
When I inspect the V4L2 device with
v4l2-ctl -d 25 --list-formats, I get this:
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture
When I try to play it back with VLC, nothing happens.
sudo cvlc --v4l2-dev=25 --v4l2-width=640 --v4l2-height=480 --v4l2-fps=30 --vout=mmal_vout v4l2:///dev/video25
I suspect the gstreamer pipeline is not correct. I am new to gstreamer and I am poking a little bit in the dark here.
Could anybody give me some tips on what I am doing wrong?
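One way to narrow this down is to take the camera and RTSP out of the equation first: feed the loopback from a synthetic source and re-check the format list (a debugging sketch, reusing the device number from the question):

```shell
# Write a YUY2 test pattern into the loopback device
gst-launch-1.0 videotestsrc ! \
  video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! \
  v4l2sink device=/dev/video25

# In another terminal: the format list should now be non-empty
v4l2-ctl -d 25 --list-formats
```

If VIDIOC_ENUM_FMT still lists nothing, the problem is on the v4l2loopback side; if it now lists YUYV, the issue is in the rtspsrc pipeline (for instance, the tee with only one branch can simply be dropped).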

GStreamer + OpenCV video processing problem

I am planning on using OpenCV's VideoCapture for a video file stream / live RTSP stream. However, VideoCapture has a lot of latency when used in my program, so I decided to use a gstreamer pipeline instead. For example, I used
VideoCapture capVideo("filesrc location=CarsDriving.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw, format=BGRx ! videoconvert ! video/x-raw, format=BGR ! appsink ", CAP_GSTREAMER);
My program runs, but if I do something like
capVideo.get(CAP_PROP_FRAME_COUNT)
it always returns -1 because GStreamer gives these warnings:
[ WARN:0] global /home/nvidia/Downloads/opencv-4.4.0/source/modules/videoio/src/cap_gstreamer.cpp (898) open OpenCV | GStreamer warning: unable to query duration of stream
[ WARN:0] global /home/nvidia/Downloads/opencv-4.4.0/source/modules/videoio/src/cap_gstreamer.cpp (935) open OpenCV | GStreamer warning: Cannot query video position: status=1, value=1, duration=-1
How do I get the frame count in OpenCV if I use gstreamer for the video pipeline? I need the frame count for exception handling and also for video processing techniques.
This is a bug which #alekhin mentioned here and here, along with how to fix it. After changing it you should rebuild OpenCV.
Also you said:
However, the VideoCapture has alot of latency when used in my program
so i decided to use the gstreamer pipeline instead.
RTSP cameras generally stream H.264/H.265-encoded data. If you decode that data on the CPU rather than the GPU, it will not give you much of a speed increase. Why don't you choose the CAP_FFMPEG flag instead of CAP_GSTREAMER? CAP_FFMPEG will be faster than CAP_GSTREAMER.
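If the pipeline itself has to stay on GStreamer and only the frame count of the file is needed, one workaround is to query the container outside OpenCV. A sketch with ffprobe, reusing the file name from the question (note that -count_frames decodes the whole stream, so it is slow on long files):

```shell
# Exact frame count, obtained by decoding every frame
ffprobe -v error -count_frames -select_streams v:0 \
  -show_entries stream=nb_read_frames -of csv=p=0 CarsDriving.mp4

# Faster: the container-reported value (may be missing or approximate)
ffprobe -v error -select_streams v:0 \
  -show_entries stream=nb_frames -of csv=p=0 CarsDriving.mp4
```

For a live RTSP stream there is no total frame count to query; in that case the program has to count frames as it reads them.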

gstreamer and Qt with nvidia on ARM

I have cross compiled Qt 5.5.1 for my ARM board and been trying to play video files using gstreamer and Qt. I have the following pipeline on gstreamer which works fine.
gst-launch-1.0 filesrc location=tracked.mp4 ! \
qtdemux name=demux demux.video_0 ! queue ! h264parse ! omxh264dec ! \
nveglglesink -e
Now I try to play the same video with the video player examples that come with Qt Multimedia, and the video is shown in grayscale and replicated 4 times across the screen. I am not sure why, but my ARM board does have 4 processors. See the attached screenshot.
Has anyone come across this problem and perhaps have an idea on how to run such gstreamer pipelines with Qt successfully?
Qt samples usually use decodebin or playbin to play video,
so it is not abnormal for Qt to play the video differently from your pipeline.
Try to play this video in GStreamer with decodebin or playbin, and check whether the same phenomenon occurs.
One more point is that you use nveglglesink in your pipeline, but Qt always uses its own sink element (qtvideorendersink or something similar).
There is a chance that your decoded format is not handled well by the Qt sink.
(The "gray and duplicated images" phenomenon usually happens because the sink element does not handle the format correctly.)
If that is the case, converting to another format before sending to the Qt sink may solve it.
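The suggestions above can be tried from the command line before touching the Qt code (a sketch; element availability depends on the board's GStreamer build, and the file path is the one from the question):

```shell
# Rough equivalent of what a playbin-based Qt player does internally
gst-launch-1.0 playbin uri=file:///path/to/tracked.mp4

# Same decoder as the working pipeline, but with an explicit
# colorspace conversion before a generic sink
gst-launch-1.0 filesrc location=tracked.mp4 ! \
  qtdemux ! h264parse ! omxh264dec ! videoconvert ! autovideosink
```

If the playbin run shows the same grayscale/4-tile image, the problem is in decodebin's format negotiation rather than in Qt itself; if only Qt misbehaves, the Qt sink's format handling is the suspect.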

Streaming H264 video for Logitech C920 using GStreamer produces a relief

I'm trying to stream the native H264 video from a Logitech C920 camera using GStreamer 1.2.4:
gst-launch-1.0 v4l2src ! video/x-h264,width=640,height=480,framerate=10/1 ! \
h264parse ! decodebin ! videoconvert ! ximagesink sync=false
This produces an image output which is a kind of a relief:
As soon as I add more movement to the scene, the image quality improves but is still far from being sufficient. It seems that the static parts of the video stream are not decoded.
Any ideas?
I'm using gstreamer 1.2.4 on a Banana Pi (Debian Wheezy).
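Since decodebin picks a decoder automatically, one way to check whether the artifact comes from that choice rather than from the camera is to pin the decoder explicitly (a diagnostic sketch; avdec_h264 comes from the gst-libav plugin set and may need to be installed):

```shell
# Same pipeline, but with an explicit software H.264 decoder
gst-launch-1.0 v4l2src device=/dev/video0 ! \
  video/x-h264,width=640,height=480,framerate=10/1 ! \
  h264parse ! avdec_h264 ! videoconvert ! ximagesink sync=false
```

If the relief effect disappears with the explicit decoder, the element decodebin selected on this board is the likely culprit; if it persists, the camera's H.264 stream itself is worth inspecting.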