I am trying to add a UDP source dynamically to a running pipeline, e.g.:
void addAudioSource(std::string const ip, int const port, int const payloadtype)
{
    std::string description = "autoaudiosrc ! queue ! audioconvert ! audio/x-raw,rate=16000 ! avenc_g722 ! rtpg722pay";
    audiosrc = Gst::Parse::create_bin(description, true);
    pipeline->add(audiosrc);
    {
        auto srcpad = audiosrc->get_static_pad("src");
        auto sinkpad = rtpbin->get_request_pad("send_rtp_sink_1");
        srcpad->link(sinkpad);
    }
    rtpudpsinkAudio->set_property("host", ip);
    rtpudpsinkAudio->set_property("port", port);
    rtpudpsinkAudio->set_property("sync", true);
    rtpudpsinkAudio->set_property("async", false);
    pipeline->add(rtpudpsinkAudio);
    {
        auto srcpad = rtpbin->get_static_pad("send_rtp_src_1");
        auto sinkpad = rtpudpsinkAudio->get_static_pad("sink");
        srcpad->link(sinkpad);
    }
    pipeline->set_state(Gst::State::STATE_PLAYING);
}
--- and ---
void addAudioSink(std::string const ip, int const port, int const payloadtype)
{
    char const caps[] = "application/x-rtp,media=(string)audio,clock-rate=(int)8000,payload=(int)%d";
    char buffer[128] = {0};
    sprintf(buffer, caps, payloadtype);
    pipeline->add(rtpudpsrcAudio);
    rtpudpsrcAudio->set_property("caps", Gst::Caps::create_from_string(buffer));
    {
        auto srcpad = rtpudpsrcAudio->get_static_pad("src");
        auto sinkpad = rtpbin->get_request_pad("recv_rtp_sink_1");
        srcpad->link(sinkpad);
    }
    pipeline->set_state(Gst::State::STATE_PLAYING);
}
Individually, when I am not calling the other function, the pipeline works fine. If I call addAudioSink some time after addAudioSource, I always get this error when I debug through the application:
0:00:18.190302584  6945 0x555556669450 INFO  GST_EVENT gstevent.c:814:gst_event_new_caps: creating caps event application/x-rtp, media=(string)audio, clock-rate=(int)8000, payload=(int)9, ssrc=(uint)1388635048
0:00:18.190323116  6945 0x555556669450 INFO  basesrc gstbasesrc.c:2965:gst_base_src_loop:<rtpudpsrcaudio-AVP-d80367f9-8361-458d-a52d-23db4d185996> pausing after gst_pad_push() = not-linked
0:00:18.190333169  6945 0x555556669450 WARN  basesrc gstbasesrc.c:3055:gst_base_src_loop:<rtpudpsrcaudio-AVP-d80367f9-8361-458d-a52d-23db4d185996> error: Internal data stream error.
0:00:18.190337616  6945 0x555556669450 WARN  basesrc gstbasesrc.c:3055:gst_base_src_loop:<rtpudpsrcaudio-AVP-d80367f9-8361-458d-a52d-23db4d185996> error: streaming stopped, reason not-linked (-1)
0:00:18.190350252  6945 0x555556669450 INFO  GST_ERROR_SYSTEM gstelement.c:2145:gst_element_message_full_with_details:<rtpudpsrcaudio-AVP-d80367f9-8361-458d-a52d-23db4d185996> posting message: Internal data stream error.
0:00:18.190358717  6945 0x555556669450 INFO  GST_ERROR_SYSTEM gstelement.c:2172:gst_element_message_full_with_details:<rtpudpsrcaudio-AVP-d80367f9-8361-458d-a52d-23db4d185996> posted error message: Internal data stream error.
The other thing is that this pipeline works most of the time. I am only hit by this error when I debug through the application, and sometimes on a release build. The only clue I have been able to find is that sometimes it says rtpssrcdemux0:src_2345243 is not linked, and then the udpsrc fails with gst_pad_push() = not-linked. What I don't understand is that the pipeline works most of the time; it fails about 25% of the time. Please help.
Related
I have a device running embedded linux that can show RTSP streams from a camera. The user can change the stream from a windowed stream to a full screen stream, and vice versa. If the stream is changed 32 times, the stream stops working. I have possibly narrowed down the problem to the rtspsrc itself.
My question is, how does one clear the memory for the gst "stuff" without re-starting the program?
If I use gst-launch-1.0 with the pipeline, it works for more than 32 re-starts because the program is being killed every time.
However, if I run my program and let the rtspsrc count reach 31 (by switching between the two streams), and then run gst-launch-1.0 with an RTSP pipeline, the stream does not show up! It appears that until every program that is using gst is killed, the rtspsrc count will not reset back to 0.
I enabled debugging the rtspsrc:
export GST_DEBUG="rtspsrc:6"
Lots of log messages are shown each time the stream is started. They print the rtspsrcX, which increases even though the previous stream is stopped:
First run log print:
**rtspsrc gstrtspsrc.c:8834:gst_rtspsrc_print_sdp_media:<rtspsrc0> RTSP response message**
Second run:
**rtspsrc gstrtspsrc.c:8855:gst_rtspsrc_print_sdp_media:<rtspsrc1> RTSP response message**
Continue stopping/starting the stream, and it increases up to 31, at which point the stream no longer shows up:
**rtspsrc gstrtspsrc.c:8855:gst_rtspsrc_print_sdp_media:<rtspsrc31> RTSP response message**
I'm not sure how to "reset" the stream each time the user stops it. It seems that gst can't release memory unless I kill the whole program (all programs using gst).
I have tried creating a new context each time the stream is re-started, but this doesn't help.
When I call gst_is_initialized each subsequent time, it returns true.
The main loop is stopped by calling the following from another thread:
g_main_loop_quit(loop_);
The video feeds are controlled with the following:
GMainLoop *loop_;
pipeline = "rtspsrc location=rtsp://192.168.0.243/0 latency=0 ! rtph264depay ! h264parse ! imxvpudec ! imxipuvideosink window-width=512 window-height=384 sync=false"
or
pipeline = "rtspsrc location=rtsp://192.168.0.243/0 latency=0 ! rtph264depay ! h264parse ! imxvpudec ! imxipuvideosink window-width=1024 window-height=768 sync=false"
void stream_video(std::string pipeline)
{
    GMainContext* context;
    GstElement *pipelineElement;
    GstBus *bus = NULL;
    guint bus_watch_id = 0;
    GstState state;
    try
    {
        if(!gst_is_initialized())
        {
            std::cout << "GST Is not initialized - initializing " << pipeline.c_str();
            gst_init_check(nullptr, nullptr, nullptr);
        }
        context = g_main_context_new(); // Creating a new context to see if the camera can be started more than 32 times, but the rtspsrc still increases when debugging
        loop_ = g_main_loop_new (context, FALSE);
        pipelineElement = gst_parse_launch(pipeline.c_str(), NULL);
        bus = gst_pipeline_get_bus (GST_PIPELINE (pipelineElement));
        bus_watch_id = gst_bus_add_watch (bus, bus_call, loop_);
        gst_object_unref (bus);
        bus = NULL;
        gst_element_set_state(pipelineElement, GST_STATE_READY );
        gst_element_set_state(pipelineElement, GST_STATE_PAUSED );
        gst_element_set_state(pipelineElement, GST_STATE_PLAYING);
        if (gst_element_get_state (pipelineElement, &state, NULL, 2*GST_SECOND) == GST_STATE_CHANGE_FAILURE)
        {
            std::cout << "gst: Failed to change states State:" << state << " ID: " << stream_id_;
        }
        else
        {
            std::cout << "gst: Running..." << " ID: " << stream_id_ << " State:" << state << " Loop:" << loop_;
            g_main_loop_run (loop_); // blocks until loop_ exits (EOS, error, stop request)
        }
        gst_element_set_state(pipelineElement, GST_STATE_PAUSED);
        gst_element_set_state(pipelineElement, GST_STATE_READY );
        gst_element_set_state(pipelineElement, GST_STATE_NULL); // Can only switch between certain states, see https://gstreamer.freedesktop.org/documentation/additional/design/states.html?gi-language=c
        g_source_remove (bus_watch_id);
        std::cout << "gst: Removing pipelineElement " << pipelineElement;
        gst_object_unref (GST_OBJECT (pipelineElement));
        pipelineElement = NULL;
        g_main_context_unref (context);
        context = NULL;
        g_main_loop_unref (loop_);
        loop_ = nullptr;
        std::cout << "gst: Deleted pipeline" << " ID: " << stream_id_ << " State: " << state;
    }
    catch(const std::exception& e)
    {
        std::cout << "Error Caught: stream_video " << e.what();
    }
    return;
}
I am trying to use the QCamera functionality of Qt to scan barcodes. My device is a headless device that uses Wayland. I'm using Qt 5.9.1 and GStreamer 1.0 (v1.14). My device has the camerabin2 plugin on it, and I can use GStreamer on its own to record video and take pictures just fine. I was previously taking a picture with a separate GStreamer pipeline, gathering the file, and feeding the image to QZXing to process, but it was kind of slow. I wanted to see if I could get better performance by using QCamera directly with QZXing, but unfortunately I'm encountering some GStreamer errors.
Here is my code:
#include <QFile>
#include <QZXing.h>
#include <QCamera>
#include <QCameraInfo>
#include <QCameraImageCapture>
#include <QCameraViewfinder>
#include <QRegularExpressionMatch>
#include "wirelesscontrollermain.hpp"
#include "barcodereader.hpp"
#include "calibrationhandler.hpp"
#include "jsonhelper.hpp"
namespace
{
const int BARCODE_TIMEOUT_15S = 15000;
QZXing* mDecoder;
QCamera* mCamera;
QCameraImageCapture* mImageCapture;
QCameraViewfinder* mViewFinder;
}
BarcodeReader::BarcodeReader(WirelessControllerMain &parent)
: mParent(parent)
, mIsScanningActive(false)
, mTimeoutExpired(false)
, mScanningTimeout(this)
{
mScanningTimeout.setInterval(BARCODE_TIMEOUT_15S);
mScanningTimeout.setSingleShot(true);
QObject::connect(&mScanningTimeout, SIGNAL(timeout()), this, SLOT(onBarcodeFailureTimeout()));
}
BarcodeReader::~BarcodeReader()
{
delete mDecoder;
mDecoder = NULL;
delete mCamera;
mCamera = NULL;
delete mImageCapture;
mImageCapture = NULL;
}
void BarcodeReader::init()
{
if(QCameraInfo::availableCameras().count() > 0)
{
qWarning() << "we have cameras";
const QList<QCameraInfo> cameras = QCameraInfo::availableCameras();
for (const QCameraInfo &cameraInfo : cameras)
{
qCritical() << cameraInfo.deviceName();
}
}
mCamera = new QCamera(QCameraInfo::defaultCamera());
mImageCapture = new QCameraImageCapture(mCamera);
QObject::connect(mImageCapture, SIGNAL(imageCaptured(int,QImage)), this, SLOT(pictureReady(int,QImage)));
QObject::connect(mImageCapture, SIGNAL(imageAvailable(int,QVideoFrame)), this, SLOT(pictureReady(int,QVideoFrame)));
QObject::connect(mImageCapture, SIGNAL(error(int,QCameraImageCapture::Error,QString)), this, SLOT(onImgCapError(int, QCameraImageCapture::Error, QString)));
QObject::connect(mImageCapture, SIGNAL(readyForCaptureChanged(bool)), this, SLOT(onReadyForCapture(bool)));
QObject::connect(mCamera, SIGNAL(lockStatusChanged(QCamera::LockStatus,QCamera::LockChangeReason)), this, SLOT(onLockStatusChange(QCamera::LockStatus, QCamera::LockChangeReason)));
mCamera->setCaptureMode(QCamera::CaptureStillImage);
mCamera->start();
mViewFinder = new QCameraViewfinder();
mViewFinder->show();
mCamera->setViewfinder(mViewFinder);
}
void BarcodeReader::onReadyForCapture(bool isReadyForCapture)
{
if(isReadyForCapture)
{
qCritical() << "ready for capture";
}
else
{
qCritical() << "not ready for capture";
}
}
void BarcodeReader::onImgCapError(int value, QCameraImageCapture::Error error, QString string)
{
qWarning() << "integer: " <<value << "error:" << error << "string:" << string;
}
void BarcodeReader::onLockStatusChange(QCamera::LockStatus lockstatus, QCamera::LockChangeReason reason)
{
qWarning() << "lock status: " << lockstatus << "reason: " << reason;
}
/*!
* \brief BarcodeReader::startBarcodeScan
* called when the barcode scan button has been pressed
*/
void BarcodeReader::startBarcodeScan()
{
if(!mIsScanningActive)
{
qDebug() << "start barcode scan";
qWarning() <<"is capture mode supported?" << mCamera->isCaptureModeSupported(QCamera::CaptureStillImage);
if(mDecoder != NULL)
{
delete mDecoder;
mDecoder = NULL;
}
mDecoder = new QZXing();
QObject::connect(mDecoder, SIGNAL(error(QString)), this, SLOT(decodeError(QString)));
mDecoder->setDecoder(QZXing::DecoderFormat_EAN_8 | QZXing::DecoderFormat_EAN_13 | QZXing::DecoderFormat_QR_CODE | QZXing::DecoderFormat_DATA_MATRIX);
mIsScanningActive = true;
mCurrentBarcodeValue.clear();
mScanningTimeout.stop();
mScanningTimeout.start();
qCritical() << "camera state: " << mCamera->state();
qCritical() << "camera availability" << mCamera->availability();
if(mImageCapture->isReadyForCapture())
{
mImageCapture->capture();
}
else
{
qWarning() << "not ready for capture";
}
}
else
{
qWarning() << "barcode scanning already active";
}
}
void BarcodeReader::pictureReady(int id, const QVideoFrame &preview)
{
qWarning() << "PICTURE READY - videoframe";
}
void BarcodeReader::pictureReady(int id, const QImage &preview)
{
qWarning() << "PICTURE READY";
QString result = mDecoder->decodeImage(preview);
if(result.isEmpty())
{
if(mTimeoutExpired)
{
//if the timeout expired while we were waiting for a picture to be taken
//process the last image and then call it quits
endBarcodeScan();
}
else
{
qWarning() << "try another barcode pic";
mCamera->start();
//take a new picture and hope for the best
if(mImageCapture->isReadyForCapture())
{
mImageCapture->capture();
mCamera->unlock();
}
else
{
qWarning() << "not ready for capture";
}
}
}
}
/*!
* \brief BarcodeReader::decodeError
* \param err - gets called when QZXing encounters a scan error
*/
void BarcodeReader::decodeError(QString err)
{
qWarning() << "Decode Error: " << err;
}
/*!
* \brief BarcodeReader::onBarcodeFailureTimeout
* if we've been scanning for barcodes for X time and haven't found anything,
* this function gets called by the timer
*/
void BarcodeReader::onBarcodeFailureTimeout()
{
qWarning() << "Could not find barcode: timeout expired";
mTimeoutExpired = true;
}
And here is the relevant debug output of the Qt application:
BarcodeReader::init - we have cameras
BarcodeReader::init - "/dev/video0"
BarcodeReader::init - "/dev/video1"
(qWirelessController:29981): GStreamer-CRITICAL **: gst_element_link_pads_full: assertion 'GST_IS_ELEMENT (src)' failed
(qWirelessController:29981): GStreamer-CRITICAL **: gst_object_unref: assertion 'object != NULL' failed
- CameraBin error: "GStreamer error: negotiation problem."
- Unable to query the parameter info: "Invalid argument"
- Unable to query the parameter info: "Invalid argument"
- Unable to query the parameter info: "Invalid argument"
- Unable to query the parameter info: "Invalid argument"
- Unable to query the parameter info: "Invalid argument"
- Unable to query the parameter info: "Invalid argument"
BarcodeReader::startBarcodeScan - start barcode scan
BarcodeReader::startBarcodeScan - is capture mode supported? true
BarcodeReader::startBarcodeScan - camera state: QCamera::ActiveState
BarcodeReader::startBarcodeScan - camera availability 0
BarcodeReader::startBarcodeScan - not ready for capture
BarcodeReader::onBarcodeFailureTimeout - Could not find barcode: timeout expired
And here is a small portion of the GST_DEBUG=4 output where the error occurs:
0:00:28.200701097 19677 0x1722a00 INFO viewfinderbin gstviewfinderbin.c:312:gst_viewfinder_bin_set_video_sink:<vf-bin> Setting video sink to <qgstvideorenderersink0>
0:00:28.218930847 19677 0x1722a00 INFO wrappercamerabinsrc gstwrappercamerabinsrc.c:995:set_capsfilter_caps:<camera_source> new_caps:ANY
0:00:28.233847264 19677 0x1722a00 INFO wrappercamerabinsrc gstwrappercamerabinsrc.c:884:gst_wrapper_camera_bin_src_set_zoom:<camera_source> setting zoom 1.000000
0:00:28.251089431 19677 0x1722a00 INFO wrappercamerabinsrc gstwrappercamerabinsrc.c:890:gst_wrapper_camera_bin_src_set_zoom:<camera_source> zoom set using digitalzoom
0:00:28.268826264 19677 0x1722a00 INFO GST_EVENT gstevent.c:1517:gst_event_new_reconfigure: creating reconfigure event
0:00:28.283702722 19677 0x1722a00 INFO GST_EVENT gstevent.c:1517:gst_event_new_reconfigure: creating reconfigure event
0:00:28.298587347 19677 0x1722a00 INFO wrappercamerabinsrc gstwrappercamerabinsrc.c:1003:set_capsfilter_caps:<camera_source> updated
0:00:28.313344514 19677 0x1722a00 INFO GST_STATES gstbin.c:2089:gst_bin_get_state_func:<preview-pipeline> getting state
0:00:28.328236139 19677 0x1722a00 INFO GST_ELEMENT_PADS gstpad.c:2134:gst_pad_unlink: unlinking preview-appsrc:src(0x18ced98) and preview-vscale:sink(0x18d2460)
0:00:28.346451306 19677 0x1722a00 INFO GST_ELEMENT_PADS gstpad.c:2188:gst_pad_unlink: unlinked preview-appsrc:src and preview-vscale:sink
0:00:28.362642306 19677 0x1722a00 INFO GST_ELEMENT_PADS gstutils.c:1774:gst_element_link_pads_full: trying to link element preview-appsrc:src to element preview-vscale:sink
0:00:28.381914181 19677 0x1722a00 INFO GST_ELEMENT_PADS gstelement.c:920:gst_element_get_static_pad: found pad preview-appsrc:src
0:00:28.397408972 19677 0x1722a00 INFO GST_ELEMENT_PADS gstelement.c:920:gst_element_get_static_pad: found pad preview-vscale:sink
0:00:28.412992139 19677 0x1722a00 INFO GST_PADS gstutils.c:1588:prepare_link_maybe_ghosting: preview-appsrc and preview-vscale in same bin, no need for ghost pads
0:00:28.432092806 19677 0x1722a00 INFO GST_PADS gstpad.c:2378:gst_pad_link_prepare: trying to link preview-appsrc:src and preview-vscale:sink
0:00:28.449303764 19677 0x1722a00 INFO GST_PADS gstpad.c:2586:gst_pad_link_full: linked preview-appsrc:src and preview-vscale:sink, successful
0:00:28.466614764 19677 0x1722a00 INFO GST_EVENT gstevent.c:1517:gst_event_new_reconfigure: creating reconfigure event
0:00:28.481535556 19677 0x1722a00 INFO GST_EVENT gstpad.c:5808:gst_pad_send_event_unchecked:<preview-appsrc:src> Received event on flushing pad. Discarding
0:00:28.499853681 19677 0x1722a00 INFO GST_STATES gstbin.c:2954:gst_bin_change_state_func:<camerabin> child 'imagebin-filesink' changed state to 2(READY) successfully
0:00:28.518643139 19677 0x1722a00 INFO GST_STATES gstbin.c:2954:gst_bin_change_state_func:<camerabin> child 'videobin-filesink' changed state to 2(READY) successfully
0:00:28.537619556 19677 0x1722a00 INFO GST_STATES gstbin.c:2506:gst_bin_element_set_state:<vf-bin> current NULL pending VOID_PENDING, desired next READY
0:00:28.555408306 19677 0x1722a00 INFO GST_ELEMENT_PADS gstpad.c:2134:gst_pad_unlink: unlinking vfbin->videoscale:src(0x196f780) and fakesink0:sink(0x18d2868)
0:00:28.573421056 19677 0x1722a00 INFO GST_ELEMENT_PADS gstpad.c:2188:gst_pad_unlink: unlinked vfbin->videoscale:src and fakesink0:sink
0:00:28.589460389 19677 0x1722a00 INFO GST_PARENTAGE gstbin.c:1801:gst_bin_remove_func:<vf-bin> removed child "fakesink0"
0:00:28.604571931 19677 0x1722a00 INFO GST_PARENTAGE gstbin.c:4468:gst_bin_get_by_name: [vf-bin]: looking up child element vfbin-videscale
(qWirelessController:19677): GStreamer-CRITICAL **: gst_element_link_pads_full: assertion 'GST_IS_ELEMENT (src)' failed
0:00:28.632689764 19677 0x1722a00 WARN viewfinderbin gstviewfinderbin.c:237:gst_viewfinder_bin_create_elements:<vf-bin> error: linking videoscale and viewfindersink failed
0:00:28.651664431 19677 0x1722a00 INFO GST_ERROR_SYSTEM gstelement.c:2145:gst_element_message_full_with_details:<vf-bin> posting message: GStreamer error: negotiation problem.
0:00:28.671232931 19677 0x1722a00 INFO GST_ERROR_SYSTEM gstelement.c:2172:gst_element_message_full_with_details:<vf-bin> posted error message: GStreamer error: negotiation problem.
(qWirelessController:19677): GStreamer-CRITICAL **: gst_object_unref: assertion 'object != NULL' failed
0:00:28.701280723 19677 0x1722a00 INFO GST_STATES gstbin.c:2506:gst_bin_element_set_state:<qgstvideorenderersink0> current NULL pending VOID_PENDING, desired next READY
0:00:28.720489598 19677 0x1722a00 INFO GST_STATES gstelement.c:2676:gst_element_continue_state:<qgstvideorenderersink0> completed state change to READY
0:00:28.738063973 19677 0x1722a00 INFO GST_STATES gstelement.c:2579:_priv_gst_element_state_changed:<qgstvideorenderersink0> notifying about state-changed NULL to READY (VOID_PENDING pending)
Does anyone know what is causing this error? I can provide the full logs if someone finds them useful.
Yes.
0:00:28.218930847 19677 0x1722a00 INFO wrappercamerabinsrc gstwrappercamerabinsrc.c:995:set_capsfilter_caps:<camera_source> new_caps:ANY
0:00:28.233847264 19677 0x1722a00 INFO wrappercamerabinsrc gstwrappercamerabinsrc.c:884:gst_wrapper_camera_bin_src_set_zoom:<camera_source> setting zoom 1.000000
0:00:28.251089431 19677 0x1722a00 INFO wrappe
You need to remove the redundancies of the wrapper.
I'm trying to take a video frame into OpenCV, do some processing on it (to be exact, aruco detection) and then package the resultant frame into a RTSP stream with GStreamer.
I've seen a Python solution to this problem, but I'm having trouble translating it to C++.
Here's my attempt at recreating the SensorFactory class:
#include <glib-object.h>
#include <iostream>
#include "SensorFactory.h"
SensorFactory::SensorFactory(std::string launch) {
    launchString = launch;
    cap = cv::VideoCapture(0);
    // should be incremented once on each frame for timestamping
    numberFrames = 0;
    // simple struct with only the cap (int*), lastFrame (cv::Mat*) and numberFrames (int* again) fields
    CVData cvData;
    cvData.cap = &cap;
    cvData.lastFrame = &lastFrame;
    cvData.numberFrames = &numberFrames;
}
GstFlowReturn SensorFactory::on_need_data(GstElement *src, CVData *datum) {
    if (datum->cap->isOpened()) {
        if (datum->cap->read(*(datum->lastFrame))) {
            std::string data = std::string(reinterpret_cast<char *>(datum->lastFrame->data));
            GstBuffer *buf = gst_buffer_new_allocate(nullptr, data.max_size(), nullptr);
            gst_buffer_fill(buf, 0, &data, data.max_size());
            buf->duration = static_cast<GstClockTime>(duration);
            GstClockTimeDiff timestamp = *(datum->numberFrames) * duration;
            buf->pts = buf->dts = static_cast<GstClockTime>(timestamp);
            buf->offset = static_cast<guint64>(timestamp);
            int *numf = datum->numberFrames;
            *numf += 1;
            g_signal_emit_by_name(src, "push-buffer", buf);
            gst_buffer_unref(buf);
            return GST_FLOW_OK;
        }
    }
    // never reached
    return GST_FLOW_NOT_LINKED;
}
GstElement *SensorFactory::create_element(const GstRTSPUrl *url) { return gst_parse_launch(launchString.c_str(), nullptr); }
void SensorFactory::configure(GstRTSPMedia *rtspMedia) {
    numberFrames = 0;
    GstElement *appsrc;
    appsrc = gst_rtsp_media_get_element(rtspMedia);
    g_signal_connect(appsrc, "need-data", (GCallback) on_need_data, &cvData);
}
The header for SensorFactory is nothing special:
#include <gst/rtsp-server/rtsp-media-factory.h>
#include <gst/rtsp-server/rtsp-media.h>
#include <gst/app/gstappsrc.h>
#include <opencv2/videoio.hpp>
class SensorFactory : public GstRTSPMediaFactory {
public:
    typedef struct _CVData {
        cv::VideoCapture *cap;
        cv::Mat *lastFrame;
        int *numberFrames;
    } CVData;

    CVData cvData;
    std::string launchString;
    cv::VideoCapture cap;
    cv::Mat lastFrame;
    int numberFrames = 0;
    const static int framerate = 30;
    const static GstClockTimeDiff duration = 1 / framerate * GST_SECOND;

    explicit SensorFactory(std::string launch);
    static GstFlowReturn on_need_data(GstElement *src, CVData *datum);
    GstElement *create_element(const GstRTSPUrl *url);
    void configure(GstRTSPMedia *media);
};
And then main.cpp looks like so:
#include <gst/gst.h>
#include "src/SensorFactory.h"
int main() {
    gst_init(nullptr, nullptr);
    GstRTSPServer *server;
    server = gst_rtsp_server_new();
    SensorFactory sensorFactory("appsrc name=source is-live=true block=true format=GST_FORMAT_TIME"
                                "caps=video/x-raw,format=BGR ! "
                                "videoconvert ! video/x-raw,format=I420 ! "
                                "x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0");
    g_print("setting shared\n");
    gst_rtsp_media_factory_set_shared(&sensorFactory, true);
    g_print("set shared\n");
    GstRTSPMountPoints *mounts;
    mounts = gst_rtsp_server_get_mount_points(server);
    gst_rtsp_mount_points_add_factory(mounts, "/test", &sensorFactory);
    GMainLoop *loop;
    loop = g_main_loop_new(nullptr, false);
    g_main_loop_run(loop);
}
The program compiles fine, and will even start running, but segfaults on gst_rtsp_media_factory_set_shared(&sensorFactory, true);. There isn't any other hacky memory management in this program.
You can try the steps below to write the stream as RTMP.
if (platform is "Windows") {
// if the platform is windows, then add the head data of the video
// otherwise it will not work on the HTML flash player
headData = " ! video/x-h264,profile=high";
}
// to rtmp (media server e.g: NGINX)
rtmpUrl = "appsrc ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency "+headData+" ! flvmux ! rtmpsink location=rtmp://192.168.1.25/mylive/test";
// using UDP broadcast to all 1~255 IPs
rtmpUrl = "appsrc ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency "+headData+" ! flvmux ! udpsink host=192.168.1.255 port=5000";
// using UDP broadcast specific IP
rtmpUrl = "appsrc ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency "+headData+" ! flvmux ! udpsink host=192.168.1.25 port=5000";
// give the FPS and the size of the video
VideoWriter writer = new VideoWriter(rtmpUrl, Videoio.CAP_GSTREAMER, FOURCC, currentFps, new Size(width, height));
// then you can write the video using writer
NOTE: Make sure you build OpenCV with GStreamer.
Here is an alternative approach.
Separate your SensorFactory from the RTSP code for now.
Start your SensorFactory with the pipeline.
appsrc name=source is-live=true block=true format=GST_FORMAT_TIME caps=video/x-raw,format=BGR,width=640,height=480,framerate=30/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency ! udpsink port=5050
We end that pipeline by piping the h264 over a udpsink on port 5050.
Then compile the GStreamer RTSP server example here
and launch it with the pipeline:
./test-launch "( udpsrc port=5050 ! rtph264pay name=pay0 pt=96 )"
Assuming your SensorFactory works as you intend, this should get you an RTSP Stream serving at rtsp://localhost:8554/test
I have a problem linking two elements: avdec_h264 and avenc_mpeg4. I think these elements somehow can't negotiate the capabilities of the data.
I've tested my pipeline with gst-launch:
gst-launch-1.0 rtspsrc location="rtsp://camera" ! rtph264depay ! h264parse ! avdec_h264 ! avenc_mpeg4 ! fakesink
It worked fine.
When I use my application, where the pipeline is implemented like this:
pipeline_ = gst_pipeline_new("default");
if (!pipeline_)
{
    return false;
}

receiver_ = gst_element_factory_make("rtspsrc", "receiver");
demuxer_  = gst_element_factory_make("rtph264depay", "demuxer");
parser_   = gst_element_factory_make("h264parse", "parser");
decoder_  = gst_element_factory_make("avdec_h264", "decoder");
encoder_  = gst_element_factory_make("avenc_mpeg4", "encoder");
output_   = gst_element_factory_make("fakesink", "output");

if (!receiver_ || !demuxer_ || !parser_ ||
    !decoder_ || !encoder_ || !output_)
{
    return false;
}

g_object_set(GST_OBJECT(receiver_), "location", "rtsp://camera", nullptr);

// On this signal the source pad of the receiver is connected to
// the sink pad of the demuxer.
g_signal_connect(receiver_, "pad-added", G_CALLBACK(on_pad_added), this);

gst_bin_add_many(GST_BIN(pipeline_), receiver_, demuxer_, parser_,
                 decoder_, encoder_, output_, nullptr);

if (!gst_element_link_many(demuxer_, parser_, decoder_,
                           encoder_, output_, nullptr))
{
    return false;
}
Everything links successfully. All elements change their state to PLAYING, but I get nothing: I do not get GST_MESSAGE_STREAM_START on the pipeline's bus.
Here are the graphs from gst-launch and my application:
If I change avenc_mpeg4 to a videoconvert element, which is not an encoder, everything works well. If I use another encoder instead, I still have the same problem.
Probably there is something particular about working with encoders that I don't know, but I could not find a solution.
Thank you.
A few points:
The code you posted should listen for the pad-added signal of the decodebin. I would be surprised if the code works as is (maybe put the full code in a gist and link it from here). See https://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/chapter-pads.html
Insert a videoconvert between the decoder and the encoder.
Where are you linking the receiver to the demuxer? That is needed, as I understand it.
I need to mux KLV metadata into an H.264 stream. I have created an application, but the stream plays only as long as KLV data is being inserted. When I stop pushing KLV data, the whole stream stops. What is the right way to mux asynchronous KLV data with mpegtsmux?
The KLV data needs to be inserted into the following working pipeline:
v4l2src input-src=Camera ! videorate drop-only=true ! 'video/x-raw, format=(string)NV12, width=1920, height=1088, framerate=25/1' ! ce_h264enc target-bitrate=6000000 idrinterval=25 intraframe-interval=60 ! queue ! mpegtsmux alignment=7 ! udpsink host=192.168.0.1 port=3000 -v
This pipeline is assembled in the application. To insert the KLV metadata, an appsrc is created:
appSrc = gst_element_factory_make("appsrc", nullptr);
gst_app_src_set_caps (GST_APP_SRC (appSrc), gst_caps_new_simple("meta/x-klv", "parsed", G_TYPE_BOOLEAN, TRUE, "sparse", G_TYPE_BOOLEAN, TRUE, nullptr));
g_object_set(appSrc, "format", GST_FORMAT_TIME, nullptr);
Then appsrc is linked to the pipeline:
gst_bin_add(GST_BIN(pipeline), appSrc);
gst_element_link(appSrc, mpegtsmux);
Here is the push function:
void AppSrc::pushData(const std::string &data)
{
    GstBuffer *buffer = gst_buffer_new_allocate(nullptr, data.size(), nullptr);
    GstMapInfo map;
    GstClock *clock;
    GstClockTime abs_time, base_time;

    gst_buffer_map (buffer, &map, GST_MAP_WRITE);
    memcpy(map.data, data.data(), data.size());
    gst_buffer_unmap (buffer, &map);

    GST_OBJECT_LOCK (element);
    clock = GST_ELEMENT_CLOCK (element);
    base_time = GST_ELEMENT (element)->base_time;
    gst_object_ref (clock);
    GST_OBJECT_UNLOCK (element);

    abs_time = gst_clock_get_time (clock);
    gst_object_unref (clock);

    GST_BUFFER_PTS (buffer) = abs_time - base_time;
    GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int (1, GST_SECOND, 1);

    gst_app_src_push_buffer(GST_APP_SRC(element), buffer);
}
The GStreamer version is 1.6.1.
What can be wrong with my code? I'd appreciate your help.
I can push dummy KLV packets to maintain the video stream, but I don't want to pollute the stream, and I am sure there should be a more delicate solution.
I have found that I can send an event with GST_STREAM_FLAG_SPARSE, which should be appropriate for subtitles, but as a result I have no output at all.
GstEvent* stream_start = gst_event_new_stream_start("klv-04");
gst_event_set_stream_flags(stream_start, GST_STREAM_FLAG_SPARSE);
GstPad* pad = gst_element_get_static_pad(GST_ELEMENT(element), "src");
gst_pad_push_event (pad, stream_start);
While debugging, I found that after applying the following patch to GStreamer and using GST_STREAM_FLAG_SPARSE, the stream doesn't stop when the appsrc stops pushing packets.
diff --git a/libs/gst/base/gstcollectpads.c b/libs/gst/base/gstcollectpads.c
index 8edfe41..14f9926 100644
--- a/libs/gst/base/gstcollectpads.c
+++ b/libs/gst/base/gstcollectpads.c
@@ -1440,7 +1440,8 @@ gst_collect_pads_recalculate_waiting (GstCollectPads * pads)
     if (!GST_COLLECT_PADS_STATE_IS_SET (data, GST_COLLECT_PADS_STATE_WAITING)) {
       /* start waiting */
       gst_collect_pads_set_waiting (pads, data, TRUE);
-      result = TRUE;
+      if (!GST_COLLECT_PADS_STATE_IS_SET (data, GST_COLLECT_PADS_STATE_LOCKED))
+        result = TRUE;
     }
   }
 }
Anyway, the receiver stops updating the screen 10 seconds after the last KLV packet.
This is a bit of an old thread, but in my experience, if there is no queue between the appsrc and the muxer, you will get this behavior. I would change this:
gst_element_link(appSrc, mpegtsmux);
to this:
gst_element_link(appSrc, appSrcQueue);
gst_element_link(appSrcQueue, mpegtsmux);
I'm not sure whether mpegtsmux has the capability, but the muxer we used has a property called do-timestamping, and when it was set to TRUE we had a better experience.
Another tip: use the gst-inspect tool to see what options each element has.