Capturing jpegs from an h264 stream with gstreamer on a Raspberry Pi

I have one of the new camera add-ons for a Raspberry Pi. It doesn't yet have video4linux support but comes with a small program that spits out a 1080p h264 stream. I have verified this works and got it pushing the video to stdout with:
raspivid -n -t 1000000 -vf -b 2000000 -fps 25 -o -
I would like to process this stream such that I end up with a snapshot of the video taken once a second.
Since it's 1080p I will need to use the rpi's hardware support for H264 decoding. I believe gstreamer is the only app to support this, so solutions using ffmpeg or avconv won't work. I've used the build script at http://www.trans-omni.co.uk/pi/GStreamer-1.0/build_gstreamer to build gstreamer and the gst-omx plugin for hardware H264 support, and it appears to work:
root@raspberrypi:~/streamtest# GST_OMX_CONFIG_DIR=/etc/gst gst-inspect-1.0 | grep 264
...
omx: omxh264enc: OpenMAX H.264 Video Encoder
omx: omxh264dec: OpenMAX H.264 Video Decoder
So I need to construct a gst-launch pipeline that takes video on stdin and spits out a fresh jpeg once a second. I know I can use gstreamer's 'multifilesink' sink to do this, so I have come up with the following short script to launch it:
root@raspberrypi:~/streamtest# cat test.sh
#!/bin/bash
export GST_OMX_CONFIG_DIR=/etc/gst
raspivid -n -t 1000000 -vf -b 2000000 -fps 25 -o - | \
gst-launch-1.0 fdsrc fd=0 ! decodebin ! videorate ! video/x-raw,framerate=1/1 ! jpegenc ! multifilesink location=img_%03d.jpeg
Trouble is it doesn't work: gstreamer just sits forever in the prerolling state and never spits out my precious jpegs.
root@raspberrypi:~/streamtest# ./test.sh
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
[waits forever]
In case it's helpful, output with gstreamer's -v flag set is at http://pastebin.com/q4WySu4L
Can anyone explain what I'm doing wrong?

We finally found a solution to this. My gstreamer pipeline was mostly right but two problems combined to stop it working:
raspivid doesn't add timestamps to the h264 frames it produces
recent versions of gstreamer have a bug which stops them handling untimestamped frames
Run a 1.0 build of gstreamer (be sure to build from scratch & remove all traces of previous attempts) and the problem goes away.
See http://gstreamer-devel.966125.n4.nabble.com/Capturing-jpegs-from-an-h264-stream-tt4660254.html for the mailing list thread.
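For reference, here is a hedged sketch of the rebuilt script; the explicit h264parse element is an assumption on my part, not something the thread confirms, but it gives decodebin properly framed H.264 input:
#!/bin/bash
# Same pipeline as above, run against a clean GStreamer 1.0 build.
export GST_OMX_CONFIG_DIR=/etc/gst
raspivid -n -t 1000000 -vf -b 2000000 -fps 25 -o - | \
gst-launch-1.0 fdsrc fd=0 ! h264parse ! decodebin ! videorate ! \
  video/x-raw,framerate=1/1 ! jpegenc ! multifilesink location=img_%03d.jpeg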

Related

How to fix a gstreamer v4l2src error when streaming video over USB?

I am using this command: gst-launch-1.0 v4l2src ! xvimagesink
to stream video over USB on my NVIDIA Jetson Nano, and I am getting this output:
Setting pipeline to PAUSED...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: Could not initialise Xv output
Additional debug info:
xvimagesink.c(1773): gst_xv_image_sink_open (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
Could not display (null)
Setting pipeline to NULL..
Freeing pipeline...
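The error says the Xv video output could not be opened, which usually means no X display is reachable from the shell running the pipeline. A hedged workaround sketch follows; the DISPLAY value is an assumption about the setup, and swapping xvimagesink for autovideosink is a suggestion rather than a confirmed fix:
# Point the pipeline at a reachable X display (value is an assumption),
# or let autovideosink choose a working sink instead of forcing Xv.
export DISPLAY=:0
gst-launch-1.0 v4l2src ! videoconvert ! autovideosink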

MP4 file created using gstreamer won't play in Windows Media Player

I created an mp4 file using gstreamer:
gst-launch-1.0 videotestsrc num-buffers=10 ! "video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720, framerate=(fraction)30/1" ! videoconvert ! avenc_mpeg4 ! avmux_mp4 ! filesink location=/tmp/test.mp4
I can play it using MPlayer, VLC, etc., but I can't play it in Windows Media Player.
I checked it with MP4Box, and it doesn't seem to have a problem.
What else can I try?
[Mplayer]
$ mplayer test.mp4
MPlayer 1.3.0 (Debian), built with gcc-7 (C) 2000-2016 MPlayer Team
do_connect: could not connect to socket
connect: No such file or directory
Failed to open LIRC support. You will not be able to use your remote control.
Playing test.mp4.
libavformat version 57.83.100 (external)
libavformat file format detected.
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fc21b56d2a0]Protocol name not provided, cannot determine if input is local or a network protocol, buffers and access patterns cannot be configured optimally without knowing the protocol
[lavf] stream 0: video (mpeg4), -vid 0
VIDEO: [MP4V] 1280x720 24bpp 29.970 fps 11338.9 kbps (1384.1 kbyte/s)
==========================================================================
Opening video decoder: [ffmpeg] FFmpeg's libavcodec codec family
libavcodec version 57.107.100 (external)
Selected video codec: [ffodivx] vfm: ffmpeg (FFmpeg MPEG-4)
==========================================================================
Clip info:
major_brand: isom
minor_version: 512
compatible_brands: isomiso2mp41
creation_time: 2020-04-09T11:11:51.000000Z
encoder: Lavf57.71.100
Load subtitles in ./
Audio: no sound
Starting playback...
Movie-Aspect is 1.78:1 - prescaling to correct movie aspect.
VO: [vdpau] 1280x720 => 1280x720 Planar YV12
Movie-Aspect is 1.78:1 - prescaling to correct movie aspect.
VO: [vdpau] 1280x720 => 1280x720 Planar YV12
V: 0.3 0/ 0 ??% ??% ??,?% 0 0
Exiting... (End of file)
See https://support.microsoft.com/en-us/help/316992/file-types-supported-by-windows-media-player
MP4 Video file (.mp4, .m4v, .mp4v, .3g2, .3gp2, .3gp, .3gpp)
[..] Windows Media Player does not support the playback of the .mp4 file format. You can play back .mp4 media files in Windows Media Player when you install DirectShow-compatible MPEG-4 decoder packs [..]
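If installing decoder packs isn't an option, a hedged alternative (my suggestion, not part of the quoted answer) is to encode H.264 instead of MPEG-4 Part 2, which Windows Media Player 12 on Windows 7 and later plays natively; this assumes the x264enc and mp4mux elements are installed:
# Produce an H.264 MP4 instead of MPEG-4 Part 2:
gst-launch-1.0 videotestsrc num-buffers=300 ! \
  "video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720, framerate=(fraction)30/1" ! \
  videoconvert ! x264enc ! h264parse ! mp4mux ! filesink location=/tmp/test_h264.mp4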

OpenCV GStreamer pipeline doesn't work on Raspberry Pi 4

I'm trying to open a UDP video stream on a Raspberry Pi using this pipeline:
VideoCapture video("udpsrc port=5600 ! application/x-rtp,payload=96,encoding-name=H264 ! "
                   "rtpjitterbuffer mode=1 ! rtph264depay ! h264parse ! decodebin ! videoconvert ! "
                   "appsink emit-signals=true sync=false max-buffers=2 drop=true",
                   cv::CAP_GSTREAMER);
// Exit if the video is not opened
if (!video.isOpened())
{
    cout << "Could not read video file" << endl;
    return 1;
}
However, video.isOpened() returns false, so I can't open the stream with this code. It works in a loopback test and on another Ubuntu 18.04 PC, but the RPi 4 (Buster OS) can't run it. The following command, on the other hand, does play the incoming stream:
gst-launch-1.0 udpsrc port=5600 caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264' ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink fps-update-interval=1000 sync=false
Furthermore, a dedicated code stack (e.g. [video_udp.cpp][1]) can handle the video easily, but it's hard to use with OpenCV.
NOTE: OpenCV version is 4.2.0-pre
The problem is with using the GStreamer library as an OpenCV backend. OpenCV doesn't throw an exception even if you build it from source without GStreamer support. (By default the GStreamer library was found automatically on Ubuntu, whereas the Raspberry Pi 4 couldn't find it.)
First I checked the build information of OpenCV with std::cout<<cv::getBuildInformation(); on the Ubuntu 18.04 machine and found:
GStreamer: YES (1.14.5)
Then I checked the same on the Raspberry Pi 4 side, and the build information showed:
GStreamer:NO
Before rebuilding OpenCV I compared the GStreamer plugins on both machines with the gst-inspect-1.0 command and installed some missing packages like gstreamer1.0-tools. Since I didn't know the cause until I checked the build information, I had also installed some other GStreamer plugins that I don't remember now.
Finally, I rebuilt OpenCV with the -D WITH_GSTREAMER=ON flag added, and now it works well.
I'll edit this answer if the problem turns out to be related to the plugins installed along the way; to check, I'll retest on a clean Buster OS image.
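A hedged sketch of the rebuild steps; the build directory, job count, and verification command are assumptions about a typical source checkout:
# Reconfigure and rebuild OpenCV with the GStreamer backend enabled:
cd ~/opencv/build
cmake -D WITH_GSTREAMER=ON ..
make -j4
sudo make install
# Verify the backend is now reported:
python3 -c "import cv2; print(cv2.getBuildInformation())" | grep -i gstreamer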

How to use a video stream as input in a Python/OpenCV program

I am able to stream and receive a webcam feed in two terminals via UDP.
command for streaming:
ffmpeg -i /dev/video0 -b 50k -r 20 -s 858x500 -f mpegts udp://127.0.0.1:2000
command for receiving:
ffplay udp://127.0.0.1:2000
Now I have to use this received video stream as input in Python/OpenCV. How can I do that?
I will be doing this using RTP and RTSP as well.
In the case of RTSP it is essential to initiate the receiving terminal, but if I do that the port becomes busy and my program is not able to take the feed. How can this be resolved?
I am currently using OpenCV 2.4.13 and Python 2.7 on Ubuntu 14.04.
Check this tutorial, and use cv2.VideoCapture("udp://127.0.0.1:2000"). You will need to build opencv with FFmpeg so that it works.
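As a hedged check (my addition, assuming the cv2 bindings are importable), you can confirm the FFmpeg backend is present before debugging further, since a udp:// URL in VideoCapture goes through FFmpeg:
# Inspect OpenCV's build info for FFmpeg support:
python -c "import cv2; print(cv2.getBuildInformation())" | grep -i ffmpeg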

Capture and play an MJPEG network video stream over UDP with OpenCV and ffmpeg

I'm trying to receive and display a live MJPEG network video stream over UDP from a network cam.
I can play the video stream by starting VLC with the argument --demux=mjpeg and then typing udp://@:1234 in the network stream field, or with gstreamer via the command line: gst-launch -v udpsrc port=1234 ! jpegdec ! autovideosink. My cam has the IP address 192.168.1.2 and it sends the stream to the address 192.168.1.1:1234.
I've tried to capture the stream with OpenCV with:
cv::VideoCapture cap;
cap.open("udp://#192.168.1.1:1234");
I tried also:
cap.open("udp://#:1234")
cap.open("udp://#localhost:1234")
cap.open("udp://192.168.1.1:1234")
cap.open("udp://192.168.1.1:1234/")
But the function hangs until I press Ctrl+C. I have the same problem when I use ffmpeg with: ffmpeg -i udp://@192.168.1.1:1234 -vcodec mjpeg
What did I do wrong? When I installed ffmpeg I couldn't install the dependency libsdl1.2-dev. Is that the problem?
If so, is there any way to read the UDP frames from the socket, decode the JPEG pictures, and display them with OpenCV?
I'm running Ubuntu Linaro Oneiric 11.10 with kernel 3.0.35 from Freescale.
Thanks anyway. I fixed this problem by installing a newer version of ffmpeg and using ffmpeg's C API.
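A hedged sanity check (address and port taken from the question): forcing ffmpeg's MJPEG demuxer mirrors VLC's --demux=mjpeg and confirms the stream itself decodes before involving OpenCV:
# Force the MJPEG demuxer on the incoming UDP stream:
ffplay -f mjpeg udp://192.168.1.1:1234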