GStreamer fallbacksrc plugin cannot link to glimagesink

I want to use the gstreamer plugin fallbacksrc on linux.
fallbacksrc (with ximagesink) -> OK
gst-launch-1.0 fallbacksrc \
uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm \
fallback-uri=file:///path/to/some/jpg \
! videoconvert \
! ximagesink
It works as expected.
fallbacksrc (with glimagesink) -> NG (fails)
gst-launch-1.0 fallbacksrc \
uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm \
fallback-uri=file:///path/to/some/jpg \
! videoconvert \
! glimagesink
It does not work.
Error message:
thread '<unnamed>' panicked at 'called `Result::unwrap()` on an `Err` value: Noformat', utils/fallbackswitch/src/fallbacksrc/imp.rs:1897:36
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
fatal runtime error: failed to initiate panic, error 5
It seems to be a pad linking (capability) problem, but even when I insert a video/x-raw capsfilter before videoconvert, the same error happens.
How can I use fallbacksrc with glimagesink?
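One variant worth trying (untested; it may fail the same way if the panic is not a caps negotiation issue) is to pin an explicit raw format right before glimagesink:
gst-launch-1.0 fallbacksrc \
uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm \
fallback-uri=file:///path/to/some/jpg \
! videoconvert \
! video/x-raw,format=RGBA \
! glimagesink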
Environment
gstreamer version
1.22 (installed from the Arch Linux package repository)
gst-plugins-rs
0.9.7 (installed from the Arch Linux user repository, AUR)

Related

Link error when compiling opencv from source when including ffmpeg compiled from source

I've been stuck on this problem for weeks.
I'm trying to build opencv on a raspberry pi 4 with x264 support. To do this I need to include ffmpeg, and also build that from source.
However when compiling opencv, I'm consistently getting these linker errors:
/usr/bin/ld: ../../lib/libopencv_videoio.so.4.5.3: undefined reference to `avcodec_get_context_defaults3'
/usr/bin/ld: ../../lib/libopencv_videoio.so.4.5.3: undefined reference to `av_lockmgr_register'
/usr/bin/ld: ../../lib/libopencv_videoio.so.4.5.3: undefined reference to `av_register_all'
I'm new to Linux, so I'm not exactly sure how to get started troubleshooting this. I believe I have ffmpeg correctly compiled and installed.
Here are the steps I use:
Configure ffmpeg:
sudo ./configure \
--prefix=/usr \
--extra-ldflags="-latomic" \
--enable-shared \
--extra-libs='-lpthread -lm' \
--ld=g++ \
--enable-gpl \
--disable-debug \
--enable-nonfree \
--enable-libx264 \
--enable-omx \
--enable-omx-rpi \
--enable-gnutls \
--enable-libfreetype \
--enable-libmp3lame
Then build it:
sudo make -j4
Then install it as a package (so OpenCV's CMake will detect it):
sudo checkinstall -y --deldoc=yes --pkgversion=9999 --pkgname=ffmpeg
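As a quick check (not part of the original steps), pkg-config can confirm which FFmpeg libraries are now installed under the /usr prefix, assuming checkinstall also installed the .pc files:
pkg-config --modversion libavcodec libavformat libavutil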
Then configure OpenCV (FFmpeg is detected):
sudo cmake ../opencv_sources -D CMAKE_BUILD_TYPE=RELEASE \
-D OPENCV_EXTRA_MODULES_PATH=$PWD/../opencv_contrib/modules \
-D ENABLE_NEON=ON \
-D ENABLE_VFPV3=ON \
-D BUILD_TESTS=ON \
-D INSTALL_PYTHON_EXAMPLES=OFF \
-D OPENCV_ENABLE_NONFREE=ON \
-D CMAKE_SHARED_LINKER_FLAGS='-latomic -L/usr/lib' \
-D WITH_V4L=ON \
-D WITH_QT=OFF \
-D BUILD_EXAMPLES=OFF \
-D CPU_BASELINE=NATIVE \
-D CMAKE_INSTALL_PREFIX="$HOME/opencv_build" \
-D BUILD_opencv_apps=OFF \
-D BUILD_opencv_python2=OFF \
-D BUILD_SHARED_LIBS=ON \
-D WITH_FFMPEG=ON
Then build opencv:
sudo make -j4
And this is where I get linking errors.
I checked in /usr/lib and the .so files appear to be there:
ls /usr/lib | grep libav
libavcodec.a
libavcodec.so
libavcodec.so.59
libavcodec.so.59.4.101
libavdevice.a
libavdevice.so
libavdevice.so.59
libavdevice.so.59.0.100
libavfilter.a
libavfilter.so
libavfilter.so.8
libavfilter.so.8.1.103
libavformat.a
libavformat.so
libavformat.so.59
libavformat.so.59.4.101
libavutil.a
libavutil.so
libavutil.so.57
libavutil.so.57.3.100
How do I check where OpenCV is looking when linking? And is there a way I can check the shared libraries that FFmpeg generated to make sure they work?
The ffmpeg version is: git-2021-08-10-c245963
And opencv is 4.5.3-dev
TL;DR:
I was compiling the master branch of ffmpeg. This version removed some deprecated functions and was incompatible with my opencv version. Or at least the build tests failed. By switching to the release/4.4 branch and repeating my previous build steps I was able to successfully build opencv.
Details:
I was able to determine that it's not a linking problem. I wrote a short program that calls avcodec_get_context_defaults3(...) as well as avcodec_version(), which is another function that should be in there. When linking against libavcodec.so, avcodec_version() is found, but avcodec_get_context_defaults3() is not. Therefore I believe this is a version compatibility issue between FFmpeg and OpenCV.
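An equivalent check that avoids writing a test program is to list the dynamic symbols of the installed library directly (generic binutils and CMake commands, not from the original answer):
# does the installed libavcodec still export the symbol OpenCV needs? (no output means it is gone)
nm -D /usr/lib/libavcodec.so | grep avcodec_get_context_defaults3
# this one should still be listed
nm -D /usr/lib/libavcodec.so | grep avcodec_version
# and to see the exact linker command OpenCV uses for the failing step, re-run the build verbosely
# from the OpenCV build directory:
make VERBOSE=1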
I then confirmed that the function was removed from ffmpeg. The latest version does not have it, old versions do. I used git's search function and found this commit:
Author: Andreas Rheinhardt <andreas.rheinhardt@gmail.com>
Date: Thu Feb 25 20:37:24 2021 +0100
avcodec: Remove deprecated avcodec_get_context_defaults3
Deprecated in 04fc8e24a091ed1d77d7a3c0cbcfe60baec19a9f.
Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@gmail.com>
Signed-off-by: James Almer <jamrial@gmail.com>
I switched to the release/3.4 branch of FFmpeg and repeated the steps, but that was missing some other things. I repeated the steps a third time with release/4.4 and that worked.
The command I used to download the correct branch was this:
sudo git clone https://github.com/FFmpeg/FFmpeg --branch release/4.4 --depth 1 ffmpeg_sources
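After rebuilding and reinstalling from that branch, a quick way to confirm that the 4.4 build is the one actually installed (not part of the original steps) is:
ffmpeg -version | head -n 1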

GStreamer RTSP server pipeline on Raspberry Pi 4B

I'm trying to set up an RTSP server for my camera on my Raspberry Pi 4B. I installed GStreamer and libgstrtspserver-1.0-dev (version 1.14.4), and am using the test-launch.c script, compiled with gcc test-launch.c -o test-launch $(pkg-config --cflags --libs gstreamer-1.0 gstreamer-rtsp-server-1.0)
I then try to use the pipeline:
./test-launch "v4l2src ! 'video/x-raw, width=1280, height=400, framerate=20/1, format=GRAY8' \
! videoconvert ! 'video/x-raw, format=I420' ! v4l2h264enc ! 'video/x-h264, stream-format=byte-stream, \
alignment=au' ! h264parse ! rtph264pay name=pay0 pt=96"
but when I try to connect to it from my other computer (through VLC or OpenCV), I get these GStreamer-CRITICAL messages on the Raspberry Pi (I've set GST_DEBUG=4):
j@JRaspi:~/server $ ./test-launch "v4l2src device='/dev/video0' ! 'video/x-raw, width=1280, height=400, format=GRAY8, framerate=20/1' ! videoconvert ! 'video/x-raw, format=I420' ! v4l2h264enc ! 'video/x-h264, stream-format=byte-stream, alignment=au' ! h264parse ! rtph264pay name=pay0 pt=96"
stream ready at rtsp://127.0.0.1:8554/test
(test-launch:8246): GStreamer-CRITICAL **: 14:56:54.865: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed
(test-launch:8246): GStreamer-CRITICAL **: 14:56:54.865: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed
(test-launch:8246): GStreamer-CRITICAL **: 14:56:54.865: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed
and a failed connection.
I can confirm that
gst-launch-1.0 v4l2src ! 'video/x-raw, width=1280, height=400, framerate=20/1, format=GRAY8' \
! videoconvert ! 'video/x-raw, format=I420' ! autovideosink
works for me (it displays the camera feed well), so I'm wondering what's wrong with the back half of my pipeline, or whether the test-launch script doesn't work on the Raspberry Pi for some reason.
I've got a similar pipeline working just fine on my Jetson Nano:
./test-launch "v4l2src ! video/x-raw, width=1280, height=400, framerate=20/1, format=GRAY8 \
! nvvidconv ! video/x-raw(memory:NVMM), format=(string)I420 ! omxh264enc ! video/x-h264, \
stream-format=byte-stream, alignment=au ! h264parse ! rtph264pay name=pay0 pt=96"
so I'm kind of lost as to what's wrong on the Raspberry Pi.
EDIT:
I did some further testing and this circuitous route works for me:
gst-launch-1.0 v4l2src ! 'video/x-raw, width=1280, height=400, \
format=GRAY8' ! videoconvert ! 'video/x-raw, format=I420'\
! v4l2h264enc ! 'video/x-h264, stream-format=byte-stream, alignment=au' \
! h264parse ! rtph264pay name=pay0 pt=96 ! rtph264depay ! h264parse \
! decodebin ! videoconvert ! autovideosink
so I'm wondering if it's a problem with test-launch and Raspberry Pi?
EDIT2:
So I ran G_DEBUG=fatal-criticals gdb -ex run --args test-launch "v4l2src ! 'video/x-raw, width=1280, height=400, format=GRAY8' ! videoconvert ! 'video/x-raw, format=I420' ! v4l2h264enc ! 'video/x-h264, stream-format=byte-stream, alignment=au' ! h264parse ! rtph264pay name=pay0 pt=96"
which printed out:
Reading symbols from test-launch...(no debugging symbols found)...done.
Starting program: /home/joel/Arducam-Stereo-RPI/server/test-launch v4l2src\ \!\ \'video/x-raw,\ width=1280,\ height=400,\ format=GRAY8\'\ \!\ videoconvert\ \!\ \'video/x-raw,\ format=I420\'\ \!\ v4l2h264enc\ \!\ \'video/x-h264,\ stream-format=byte-stream,\ alignment=au\'\ \!\ h264parse\ \!\ rtph264pay\ name=pay0\ pt=96
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/arm-linux-gnueabihf/libthread_db.so.1".
stream ready at rtsp://127.0.0.1:8554/test
[New Thread 0xb657b3a0 (LWP 2529)]
(test-launch:2524): GStreamer-CRITICAL **: 16:46:07.769: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed
(test-launch:2524): GStreamer-CRITICAL **: 16:46:07.787: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed
(test-launch:2524): GStreamer-CRITICAL **: 16:46:07.788: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed
[New Thread 0xb57743a0 (LWP 2530)]
(test-launch:2524): GStreamer-CRITICAL **: 16:46:27.960: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed
(test-launch:2524): GStreamer-CRITICAL **: 16:46:27.960: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed
(test-launch:2524): GStreamer-CRITICAL **: 16:46:27.960: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed
[Thread 0xb57743a0 (LWP 2530) exited]
[Thread 0xb657b3a0 (LWP 2529) exited]
^C
Thread 1 "test-launch" received signal SIGINT, Interrupt.
__GI___poll (timeout=-1, nfds=2, fds=0x2e7e0)
--Type <RET> for more, q to quit, c to continue without paging--
at ../sysdeps/unix/sysv/linux/poll.c:29
29 ../sysdeps/unix/sysv/linux/poll.c: No such file or directory.
(gdb) bt
#0 0xb6b4dcd0 in __GI___poll (timeout=-1, nfds=2, fds=0x2e7e0)
at ../sysdeps/unix/sysv/linux/poll.c:29
#1 0xb6b4dcd0 in __GI___poll (fds=0x2e7e0, nfds=2, timeout=-1)
at ../sysdeps/unix/sysv/linux/poll.c:26
#2 0xb6c45eb4 in () at /usr/lib/arm-linux-gnueabihf/libglib-2.0.so.0
(gdb)
So I tried sudo apt-get install --reinstall libglib2.0-0, but it didn't change anything.
The extra single quotation marks in the pipeline caused the problem: the shell strips them when running gst-launch-1.0 directly, but inside the test-launch argument they reach the pipeline parser literally and break the parse.
So a working pipeline would be:
./test-launch "v4l2src ! video/x-raw, width=1280, height=400, format=GRAY8 \
! videoconvert ! video/x-raw, format=I420 ! v4l2h264enc ! video/x-h264, \
stream-format=byte-stream, alignment=au ! h264parse ! rtph264pay name=pay0 pt=96"
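For reference, once the server is running, the stream can be sanity-checked from another machine with a plain GStreamer client before involving VLC or OpenCV; this is a generic receive pipeline, not from the original answer:
# set PI_ADDRESS to the Raspberry Pi's IP address first
gst-launch-1.0 rtspsrc location=rtsp://$PI_ADDRESS:8554/test latency=100 \
! rtph264depay ! h264parse ! decodebin ! videoconvert ! autovideosink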

Building C++ OpenCV 4 with CUDA on Ubuntu 16.04

I'm trying to build OpenCV with CUDA 10.2. When I run the following command:
cmake -DCMAKE_BUILD_TYPE=RELEASE \
-DOPENCV_EXTRA_MODULES_PATH=../../opencv_contrib/modules \
-DWITH_TBB=ON -DWITH_CUDA=ON \
-DBUILD_opencv_cudacodec=OFF \
-DENABLE_FAST_MATH=1 \
-DWITH_CUBLAS=1 \
-DWITH_V4L=ON \
-DWITH_OPENGL=ON \
-DWITH_GSTREAMER=ON \
-DOPENCV_GENERATE_PKGCONFIG=ON \
-DOPENCV_ENABLE_NONFREE=ON \
-DBUILD_EXAMPLES=TRUE \
-DBUILD_PERF_TESTS=FALSE \
-DBUILD_TESTS=FALSE ../../opencv
I get the following issue:
Could NOT find CUDNN (missing: CUDNN_LIBRARY CUDNN_INCLUDE_DIR) (Required is at least version "6")
Of course I have installed cuDNN 7 corresponding to CUDA 10.2, and the installation test passed.
Can someone help?
To fix the problem I added the following options for cmake:
-DCUDNN_INCLUDE_DIR=/usr/local/cuda/include \
-DCUDNN_LIBRARY=/usr/local/cuda/lib64/libcudnn.so.7.6.5 \
You may also need to add:
-DCUDNN_VERSION='7.6'
Or in my case it was:
-DCUDNN_VERSION='8.0'
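If cuDNN is installed somewhere other than /usr/local/cuda (for example from the Debian packages), locating the header and library first makes it easier to fill in the paths above; these are generic commands, adjust as needed:
find /usr -name 'cudnn*.h' 2>/dev/null
find /usr -name 'libcudnn.so*' 2>/dev/null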

Can raspistill connect with rtpjpegpay via a pipeline?

pi@raspberrypi:~ $ raspistill -w 800 -h 600 -tl 0 -t 0 -o - | test-launch "fdsrc ! image/jpeg ! jpegparse ! rtpjpegpay"
stream ready at rtsp://127.0.0.1:8554/test
But I failed to use VLC on Windows 10 to connect to the Raspberry Pi 3. I don't know where the problem is, and I can't get any error information. I succeed in running this command:
raspivid -t 0 -h 1920 -w 1080 -fps 30 -o - | ./test-launch " fdsrc ! h264parse ! rtph264pay name=pay0 pt=96 "
So I think my command is okay. Could someone help me? Thanks.
Haha, I think I have resolved my own question. Now, let me help you.
The server is
raspistill -w 800 -h 600 -tl 0 -t 0 -o - | test-launch "fdsrc ! image/jpeg,width=800,height=600 ! jpegparse ! rtpjpegpay name=pay0 pt=96"
The client is
gst-launch-1.0.exe rtspsrc location=rtsp://192.168.1.30:8554/test ! rtpjpegdepay ! jpegparse ! multifilesink location="%d.jpeg"
or
gst-launch-1.0.exe rtspsrc location=rtsp://192.168.1.30:8554/test ! rtpjpegdepay ! jpegparse ! jpegdec ! videoconvert ! autovideosink
For now, VLC can't handle the stream from this server, so don't use it!

Cross-compile OpenCV with FFmpeg support

I am trying to cross-compile OpenCV with FFmpeg support for an ARM board running a Buildroot-based custom Linux. The host is an Ubuntu PC.
I want static opencv libraries.
I have downloaded the FFmpeg source and cross-compiled it with the following configuration:
./configure \
--enable-cross-compile \
--cross-prefix=arm-linux-gnueabihf- \
--target-os=linux \
--arch=arm \
--disable-static \
--enable-shared \
--enable-nonfree \
--enable-ffmpeg \
--enable-gpl \
--enable-swscale \
--enable-pthreads \
--disable-yasm \
--disable-stripping \
--prefix=../build \
--extra-cflags=-I../build/include --extra-ldflags=-L../build/lib
The paths are all correct and FFmpeg builds successfully. Then, when I try to configure OpenCV with cmake-gui, I have to specify all the paths manually.
But despite this, the OpenCV configuration is unable to resolve FFmpeg correctly.
I have attached the screenshot below.
It cannot resolve the FFmpeg versions. I tried to ignore this and build anyway, but it fails with a linker error:
Linking CXX static library ../../lib/libopencv_features2d.a
[ 49%] Built target opencv_features2d
make: *** [all] Error 2
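For reference, on Linux OpenCV typically resolves FFmpeg through pkg-config, so cross builds usually point pkg-config at the cross-built .pc files instead of entering library paths by hand in cmake-gui. The sketch below uses placeholder paths (the pkgconfig directory would sit under the --prefix=../build used above, and the toolchain file must match the Buildroot toolchain); it also starts with a quick check that the cross-built libraries really are ARM binaries:
# confirm the cross build produced ARM libraries (path follows the --prefix=../build above)
file ../build/lib/libavcodec.so*

# make pkg-config resolve only the cross-compiled FFmpeg
export PKG_CONFIG_LIBDIR=/path/to/ffmpeg/build/lib/pkgconfig
# only needed if the generated .pc files contain absolute target paths
export PKG_CONFIG_SYSROOT_DIR=/path/to/buildroot/sysroot

# OpenCV ships sample ARM toolchain files under platforms/linux/; pick one matching the toolchain
cmake ../opencv \
-DCMAKE_TOOLCHAIN_FILE=/path/to/arm-toolchain.cmake \
-DBUILD_SHARED_LIBS=OFF \
-DWITH_FFMPEG=ON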