I am trying to create a virtual web camera using GStreamer and v4l2loopback. The problem is that I want to use playbin, but the video plays too fast when I use it. For instance, it happens when I execute the following command:
gst-launch-1.0 -v playbin uri=file:/vagrant/test.avi
video-sink="videoconvert
! videoscale
! video/x-raw,format=YUY2,width=320,height=320
! v4l2sink device=/dev/video0"
Adding "framerate=20/1" to the caps throws "Not negotiated error" while setting it to "30/1" works but doesn't help to fix the issue with the speed.
On the other hand, I am getting normal speed when executing the following command:
gst-launch-1.0 -v filesrc location=/vagrant/test.avi
! avidemux
! decodebin
! videoconvert
! videoscale
! "video/x-raw,format=YUY2,width=320,height=320"
! v4l2sink device=/dev/video0
I tried a lot of combinations of the filters from the last example with playbin, but none of them helped.
Any help would be highly appreciated!
The problem was with the virtual machine running on top of VirtualBox. To be more precise, I had 3D acceleration turned on, which caused all videos to play at 2x speed.
Turning off 3D acceleration by setting --accelerate3d=off solved the issue.
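For reference, this is the VBoxManage command that toggles the setting (a sketch; "my-vm" stands for the VM's name, and the VM has to be powered off while you change it):
VBoxManage modifyvm "my-vm" --accelerate3d off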
Related
I'm working on a video-streaming wearable device. During testing it turned out that the pipeline clock and the stream stop while fast walking or running. It's bizarre behaviour, because the debug messages show no errors about a broken pipeline, apart from lost frames. It simply freezes, and only restarting helps. Can you guess what causes the problem?
The pipelines I use:
streaming device:
gst-launch-1.0 -vem --gst-debug=3 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=\(fraction\)30/1 ! v4l2h264enc extra-controls=s,video_bitrate=250000 capture-io-mode=4 output-io-mode=4 ! "video/x-h264,level=(string)4" ! rtph264pay config-interval=1 ! multiudpsink clients="127.0.0.1:5008,10.123.0.2:5008"
client:
udpsrc port=5008 do-timestamp=true ! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96 ! rtpjitterbuffer latency=100 drop-on-latency=true drop-messages-interval=100000000 ! queue max-size-buffers=20000 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! glupload ! qmlglsink name=qmlglsink sync=false
The hardware I use is a PS3 Eye cam and an LTE modem to transmit the video, with a pretty low uplink of 1-2 Mbit/s, everything running on a Raspberry Pi 3B+ (1 GB).
For more debug info there are also pictures of the log file after the last registered dropped frame: every following "cycle" sends a new query, loops over the GstElements from sink to source (which is my camera), and ends with the maximum query duration (the highlighted query to v4l2src).
Do you know how to overcome this problem?
The problem has been resolved. The issue was the encoder bitrate not being variable.
A more detailed inspection and the pipeline that works for me are on this GStreamer issue page.
I am trying to get a camera to capture images at a specific time interval using GStreamer. While searching the web I found the following line:
gst-launch-1.0 -v videotestsrc is-live=true ! clockoverlay font-desc="Sans, 48" ! videoconvert ! videorate ! video/x-raw,framerate=1/3 ! jpegenc ! multifilesink location=file-%02d.jpg
I believe it would work great, but unfortunately I don't know how to get it to work with my specific camera; that is, I don't know how to identify my video source on the RPi4, or whether that is the only thing I have to change. I would appreciate either help with the video source or any other method to get those images.
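A sketch of the same pipeline with the test source swapped for a real camera, assuming it shows up as /dev/video0 via V4L2 (on newer Raspberry Pi OS images the camera module may instead require libcamerasrc):
gst-launch-1.0 -v v4l2src device=/dev/video0 ! videoconvert ! clockoverlay font-desc="Sans, 48" ! videorate ! video/x-raw,framerate=1/3 ! videoconvert ! jpegenc ! multifilesink location=file-%02d.jpg
You can list the available devices with v4l2-ctl --list-devices to find the right one.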
The objective I am trying to achieve is to stream 1080p video from the Raspberry Pi camera and record the video simultaneously.
I tried recording with the HTTP stream as the source, but it didn't work at 30 fps; a lot of frames were missing and I got only about 8 fps.
As a second approach, I am trying to record the file directly from the camera and then stream the "recording in progress/buffer" file. For this I am trying to use GStreamer. Please suggest whether this is a good option or whether I should try something else.
For recording with GStreamer I used:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter caps="video/x-raw,width=1920,height=1080,framerate=30/1" !
videoflip method=clockwise ! videoflip method=clockwise ! videoconvert ! videorate ! x264enc ! avimux ! filesink location=test_video.h264
Result: the recorded video shows 1080p and 30 fps, but frames are dropping heavily.
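The heavy frame drops are most likely x264enc running with its default preset, which is too slow for 1080p30 on a Pi. A sketch of a lighter variant (untested; it uses x264enc's ultrafast/zerolatency settings, a single 180° flip instead of two clockwise flips, and a Matroska container so the file extension matches the contents; the bitrate value is only an example):
gst-launch-1.0 -e -v v4l2src device=/dev/video0 ! video/x-raw,width=1920,height=1080,framerate=30/1 !
videoflip method=rotate-180 ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency bitrate=8000 ! h264parse ! matroskamux ! filesink location=test_video.mkv
If the Pi's hardware H.264 encoder is available, v4l2h264enc would offload the encoding from the CPU entirely.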
For streaming the video buffer I have used UDP in GStreamer:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter caps="video/x-raw,width=640,height=480,framerate=30/1" ! x264enc ! queue ! rtph264pay ! udpsink host=192.168.5.1 port=8080
Result: no specific errors in the terminal, but I can't get the stream in VLC.
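One note: a raw RTP/H.264 stream sent with udpsink cannot be opened in VLC from a plain udp:// URL; VLC needs an SDP file describing the stream. To verify that packets are actually arriving, a receiving sketch for the other machine (the caps are assumed to match rtph264pay's defaults):
gst-launch-1.0 -v udpsrc port=8080 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink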
Please suggest the best method here.
So I have a fisheye camera piped through GStreamer, over the internet, to another PC where I want to display it on an Oculus Rift. The Oculus expects a 1280×800 input just like a normal monitor, but the left 640×800 of the screen displays in the left eye and the other 640×800 in the right eye.
I need to modify this:
gst-launch-1.0 -e -v udpsrc port=5001 ! application/x-rtp, payload=96 ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! fpsdisplaysink sync=false text-overlay=false
to show the stream twice, side by side. If I run this command and press WinKey+LeftArrow, it displays really well in one eye. The Oculus even crops out the edges (read: window decorations). But GStreamer won't let me run gst-launch twice at the same time. Any way to make it work? Admittedly it's quite a hack, but it seemed to work quite well in the one eye.
Alternatively, can someone help me use videomixer?
Windows 8, btw.
Thanks!
You should be able to duplicate the video with a
... ! tee name=t ! queue ! videomixer name=m sink_0::xpos=0 sink_1::xpos=640 ! ... t. ! queue ! m.
The key is to use the videomixer's pad properties to position the copies.
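Put together with the receiving pipeline from the question, a full side-by-side sketch might look like this (untested; it assumes the decoded stream is 640×800, so the second copy is offset by 640 pixels):
gst-launch-1.0 -e -v udpsrc port=5001 ! application/x-rtp, payload=96 ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! tee name=t ! queue ! videomixer name=m sink_0::xpos=0 sink_1::xpos=640 ! videoconvert ! fpsdisplaysink sync=false text-overlay=false t. ! queue ! m.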
I want to use the GStreamer library to work with sound in my C++ application. Can you tell me whether there are ways to change the sound tempo, pitch, etc.?
Thanks.
With the pitch plugin you can change the sound pitch:
$ gst-launch filesrc location=sound.mp3
! decodebin ! audioconvert
! pitch pitch=3
! autoaudiosink
Or tempo:
$ gst-launch filesrc location=sound.mp3
! decodebin ! audioconvert
! pitch tempo=2
! autoaudiosink
Or rate.
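For example (a sketch; the pitch element's rate property changes speed and pitch together, like playing the file faster):
$ gst-launch filesrc location=sound.mp3
! decodebin ! audioconvert
! pitch rate=1.5
! autoaudiosink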
There is also ladspa, a wrapper for the huge LADSPA plugin library.
I had some bad experience with it, but maybe it is more stable now.
It has several plugins to control pitch, tempo and much more.
This may also be of interest if you are planning to work with sound:
the audiofx plugin library with various filters, e.g. the compressor/expander plugin audiodynamic;
the equalizer plugin.
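As an illustration of audiodynamic (a sketch; the property names are the element's own, the values are arbitrary):
$ gst-launch filesrc location=sound.mp3
! decodebin ! audioconvert
! audiodynamic mode=compressor threshold=0.5 ratio=0.5
! autoaudiosink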
Starting from max taldykin's suggestion, which returned an error in my version of GStreamer (0.10.35), I found a pipeline that does work. For example, to shift the song one step up while maintaining the tempo, you should raise the pitch by 6%:
gst-launch-0.10 filesrc location=02-have_you_ever.mp3 ! \
decodebin ! audioconvert ! pitch pitch=1.06 tempo=1.0 ! \
audioconvert ! audioresample ! autoaudiosink
does the job for me.