I need to build a GStreamer audio pipeline to redirect an audio stream.
GStreamer has been built from vcpkg (v1.19.2).
I also tried an install from the MSI file,
with the same issue.
The project is built with Visual Studio 2019.
I succeed in getting some elements from factories:
GstElement* queue2 = gst_element_factory_make("queue", "queue");
GstElement* audio_sink = gst_element_factory_make("autoaudiosink", "sink");
but I still fail to get the appsrc element:
GstElement* app_source = gst_element_factory_make("appsrc", "source"); // null !!!
It appears that the relevant plugin exists (gst-inspect):
appsrc: Factory Details:
appsrc: Rank none (0)
appsrc: Long-name AppSrc
appsrc: Klass Generic/Source
appsrc: Description Allow the application to feed buffers to a pipeline
appsrc: Author David Schleef <ds#schleef.org>, Wim Taymans <wim.taymans#gmail.com>
appsrc:
appsrc: Plugin Details:
appsrc: Name app
appsrc: Description Elements used to communicate with applications
appsrc: Filename C:\src\vcpkg\installed\x64-windows\bin\gstapp.dll
appsrc: Version 1.19.2
appsrc: License LGPL
appsrc: Source module gst-plugins-base
appsrc: Source release date 2021-09-23
appsrc: Binary package GStreamer Base Plug-ins source release
appsrc: Origin URL Unknown package origin
appsrc:
appsrc: GObject
appsrc: +----GInitiallyUnowned
appsrc: +----GstObject
appsrc: +----GstElement
appsrc: +----GstBaseSrc
appsrc: +----GstAppSrc
...
I tested:
GstPlugin* pl = gst_plugin_load_file("C:\\src\\vcpkg\\installed\\x64-windows\\bin\\gstapp.dll",&error);
// still NULL... after a "specified module not found" warning message
However, the same code works with other plugins, e.g. "gstcoreelements.dll".
Strangely, looping over element factories and plugins shows me what I need:
GList* list, * walk;
list = gst_registry_feature_filter(registry, filter_vis_features, FALSE, NULL);
for (walk = list; walk != NULL; walk = g_list_next(walk)) {
    const gchar* name;
    GstElementFactory* factory;
    factory = GST_ELEMENT_FACTORY(walk->data);
    name = gst_element_factory_get_longname(factory);
    g_print(" %s\n", name);
    // returns notably: AppSink and AppSrc
}
GList *list2 = gst_registry_plugin_filter(registry, filter_vis_plugins, FALSE, NULL);
for (walk = list2; walk != NULL; walk = g_list_next(walk)) {
    const gchar* name;
    GstPlugin* plugin;
    plugin = GST_PLUGIN(walk->data);
    name = gst_plugin_get_filename(plugin);
    g_print(" ---- %s\n", name);
    name = gst_plugin_get_name(plugin);
    g_print(" ---- %s\n", name);
    // ---- C:\src\vcpkg\installed\x64-windows\bin\gstapp.dll
    // ---- app
}
Using breakpoints, it seems that module loading fails in GLib's
"g_module_open_full" function.
I don't know why at the moment, because debugging stops with "gmodule-win32.c not found".
Any idea, tip, clue (or solution) would be highly appreciated.
I finally got it working by adding gstapp-1.0-0.dll to the .exe directory.
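For anyone hitting the same thing, here is a minimal diagnostic sketch (assuming the GStreamer 1.x C API; the helper name and error handling are illustrative only, not from the original post). It distinguishes "factory missing from the registry" from "plugin registered but failing to load", which is what a missing dependency DLL such as gstapp-1.0-0.dll produces:

#include <gst/gst.h>

static GstElement *
make_appsrc_checked (void)
{
  GstElementFactory *factory = gst_element_factory_find ("appsrc");
  if (factory == NULL) {
    /* the registry does not know appsrc at all */
    g_printerr ("appsrc factory not found in registry\n");
    return NULL;
  }
  gst_object_unref (factory);

  GstElement *src = gst_element_factory_make ("appsrc", "source");
  if (src == NULL) {
    /* the factory is registered but the plugin could not be loaded,
     * typically because a dependency of gstapp.dll is missing */
    GstPlugin *plugin = gst_plugin_load_by_name ("app");
    g_printerr ("appsrc creation failed; loading plugin 'app' %s\n",
        plugin != NULL ? "succeeded" : "failed");
    if (plugin != NULL)
      gst_object_unref (plugin);
  }
  return src;
}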
Related
Using GStreamer 1.16.3 on Ubuntu 20, C/C++. At some point during runtime, I'm trying to link a tee element (its src_0 is already linked and playing) to a recording bin which ends with a filesink.
The following snippet creates the recording bin, gets the new filesink element, sets its location property, adds the recording bin to the pipeline, syncs the recording bin's state with the pipeline, gets the pipeline's tee element, requests one of its src pads, gets the recording bin's sink pad, and links the src and sink pads to complete the process.
auto error = (GError *)nullptr;
auto recording_bin = gst_parse_bin_from_description("queue ! filesink name=recording_filesink", true, &error);
auto filesink = gst_bin_get_by_name(GST_BIN(recording_bin), "recording_filesink");
g_object_set(G_OBJECT(filesink), "location", "/home/.../record.ts", NULL);
g_object_unref(filesink);
gst_bin_add(GST_BIN(pipeline), recording_bin);
gst_element_sync_state_with_parent(recording_bin);
auto tee = gst_bin_get_by_name(GST_BIN(pipeline), "tee");
auto tee_src = gst_element_get_request_pad(tee, "src_%u");
auto recording_sink = gst_element_get_static_pad(recording_bin, "sink");
auto ret = gst_pad_link(tee_src, recording_sink);
g_object_unref(recording_sink);
g_object_unref(tee_src);
g_object_unref(tee);
After executing the above, ret is GST_PAD_LINK_OK, but the pipeline gets stuck: not displaying (1st tee src) and not recording (2nd tee src).
Is this the right way to dynamically link a tee src to a recording bin?
I have the following pipeline:
v4l2src -> queue -> h264parse -> avdec_h264 -> identity ->
imagefreeze(added/removed dynamically) -> glupload -> glcolorconvert ->
gltransformation -> glimagesink
I have added a probe on the identity element's srcpad.
Based on the user input, I dynamically add or remove the imagefreeze element.
Here is the pseudo code:
# show live video on the rendering window as long as there is no user input
# if user_input == 1:
insert_imagefreeze  # analogous to a frozen image being displayed on the rendering window
# if user_input == 2:
delete_imagefreeze  # resume showing live video as before
Inserting imagefreeze is no problem; it works fine, and I can observe the results I would expect with an imagefreeze.
However, after the imagefreeze element is added, the v4l2src element's task goes to a paused state. Here is the info log:
0:03:39.608226968 29510 0x1561c00 INFO  v4l2src gstv4l2src.c:949:gst_v4l2src_create:<source> sync to 0:03:39.066664476 out ts 0:03:39.375180156
0:03:39.608449406 29510 0x1561c00 INFO  basesrc gstbasesrc.c:2965:gst_base_src_loop:<source> pausing after gst_pad_push() = eos
0:03:39.608561724 29510 0x1561c00 INFO  task gsttask.c:316:gst_task_func:<source:src> Task going to paused.
Can anyone explain why the source element of the pipeline goes to a paused state once a new element is added to the pipeline?
And here are snippets from the actual code:
def add_delete(self):
    if ui_input_cnt == 1:  # updated based on user input
        self.idsrcpad = self.identity.get_static_pad("src")
        self.in_idle_probe = False
        self.probeID = self.idsrcpad.add_probe(Gst.PadProbeType.IDLE, self.lengthen_pipeline)
    if ui_input_cnt == 2:
        self.probeID2 = self.idsrcpad.add_probe(Gst.PadProbeType.IDLE, self.shorten_pipeline)

def lengthen_pipeline(self, pad, info):
    print("entered callback")
    global pipeline
    # restrict this callback to one call
    if self.in_idle_probe == True:
        print("callback for now restricted to one call per thread")
        return Gst.PadProbeReturn.OK
    if self.in_idle_probe == False:
        self.in_idle_probe = True
        # create imagefreeze element
        self.ifreeze = Gst.ElementFactory.make("imagefreeze", "ifreeze")
        # increment reference
        self.ifreeze.ref()
        # add imagefreeze to the pipeline
        pipeline.add(self.ifreeze)
        # sync imagefreeze state with the main pipeline
        self.ifreeze.sync_state_with_parent()
        # unlink identity and upload
        # 1. get the sink pad of upload and the src pad of identity
        sinkpad = self.upload.get_static_pad("sink")
        srcpad = self.identity.get_static_pad("src")
        print("unlinking identity srcpad - upload sinkpad")
        if self.check_and_unlink(srcpad, sinkpad):
            # 2. get the sink pad of imagefreeze
            sinkpad = self.ifreeze.get_static_pad("sink")
            # 3. link the identity src pad to the imagefreeze sink pad
            print("linking identity srcpad - ifreeze sinkpad")
            self.check_and_link(srcpad, sinkpad)
            # 4. link the imagefreeze src pad to the upload sink pad
            # get the imagefreeze src pad and the sink pad of upload
            srcpad = self.ifreeze.get_static_pad("src")
            sinkpad = self.upload.get_static_pad("sink")
            print("linking ifreeze srcpad - upload sinkpad")
            if self.check_and_link(srcpad, sinkpad):
                return Gst.PadProbeReturn.REMOVE
            else:
                print("ERROR: unlinking")
                return -1
The functions check_and_link(srcpad, sinkpad) and check_and_unlink(srcpad, sinkpad) do no more than check the src and sink pads and then link or unlink them accordingly.
The sink ends up with an EOS event when imagefreeze is added dynamically. This makes the pipeline go into a PAUSED state. However, why the sink receives an EOS event is still unclear and needs further investigation. More information on the observations is provided here.
I am using GStreamer 1.0 with Python bindings.
Below is the pipeline I am trying to build using the OpenGL plugins:
gltestsrc -> gltransformation -> glimagesink
I am trying to modify the properties of the 'gltransformation' element dynamically, based on values received from an external hardware device. Here is a link to a similar question, but it did not help me much in my use case.
Below is a snippet of the Python script:
import sys
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstController', '1.0')
from gi.repository import Gst, GstController

Gst.init(None)

# global variables
# the values of a, b, c get updated dynamically for certain events based on external hardware
a = 0
b = 0
c = 0
source = Gst.ElementFactory.make("gltestsrc", "source")
gltrnsfrm = Gst.ElementFactory.make("gltransformation","gltrnsfrm")
sink = Gst.ElementFactory.make("glimagesink", "sink")
# create the empty pipeline
pipeline = Gst.Pipeline.new("test-pipeline")
if not pipeline or not source or not gltrnsfrm or not sink:
    print("ERROR: Not all elements could be created")
    sys.exit(1)

# build the pipeline
pipeline.add(source, gltrnsfrm, sink)
if not source.link(gltrnsfrm):
    print("ERROR: Could not link source to gltrnsfrm")
    sys.exit(1)
if not gltrnsfrm.link(sink):
    print("ERROR: Could not link gltrnsfrm to sink")
    sys.exit(1)
# modify the gltransformation properties
gltrnsfrm.set_property("rotation-z", a)
gltrnsfrm.set_property("rotation-x", b)
gltrnsfrm.set_property("rotation-y", c)

# dynamic controller
cs = GstController.InterpolationControlSource()
cs.set_property('mode', GstController.InterpolationMode.LINEAR)
cb = GstController.DirectControlBinding.new(gltrnsfrm, "rotation-x", cs)
gltrnsfrm.add_control_binding(cb)
# modify the values
cs.set(0 * Gst.SECOND, b)  # use updated values of b
cs.set(1 * Gst.SECOND, b)
The above example shows modification of only one element property; however, I have other properties to modify as well, based on the values of a, b and c.
Executing the above script gives me the following error:
GStreamer-CRITICAL : gst_object_add_control_binding: assertion 'binding->pspec' failed.
I think I have to set some more attributes in Python to get this working.
Can anyone lend a hand with this issue?
EDIT:
After the suggestions from Hugh Fisher, I tried to trace back into the source file. Here is a snippet from the original code:
GST_INFO_OBJECT (object, "trying to put property '%s' under control",
binding->name);
/* check if the object has a property of that name */
if ((pspec =
g_object_class_find_property (G_OBJECT_GET_CLASS (object),
binding->name))) {
GST_DEBUG_OBJECT (object, " psec->flags : 0x%08x", pspec->flags);
/* check if this param is witable && controlable && !construct-only */
if ((pspec->flags & (G_PARAM_WRITABLE | GST_PARAM_CONTROLLABLE |
G_PARAM_CONSTRUCT_ONLY)) ==
(G_PARAM_WRITABLE | GST_PARAM_CONTROLLABLE)) {
binding->pspec = pspec;
} else {
GST_WARNING_OBJECT (object,
"property '%s' on class '%s' needs to "
"be writeable, controlable and not construct_only", binding->name,
G_OBJECT_TYPE_NAME (object));
}
} else {
GST_WARNING_OBJECT (object, "class '%s' has no property '%s'",
G_OBJECT_TYPE_NAME (object), binding->name);
}
gst_object_unref (object);
And this is the log output for my script:
0:00:00.174410648  8309 0xd1b750 TRACE GST_REFCOUNTING gstobject.c:207:gst_object_init:<GstObject#0x10b0020> 0x10b0020 new
0:00:00.174697421  8309 0xd1b750 TRACE GST_REFCOUNTING gstobject.c:207:gst_object_init:<GstObject#0x10b20f0> 0x10b20f0 new
0:00:00.174716708  8309 0xd1b750 INFO  gstcontrolbinding gstcontrolbinding.c:144:gst_control_binding_constructor:<gltrnsfrm> trying to put property 'rotation-x' under control
0:00:00.174723927  8309 0xd1b750 DEBUG gstcontrolbinding gstcontrolbinding.c:150:gst_control_binding_constructor:<gltrnsfrm> psec->flags : 0x000000e3
0:00:00.174729088  8309 0xd1b750 WARN  gstcontrolbinding gstcontrolbinding.c:161:gst_control_binding_constructor:<gltrnsfrm> property 'rotation-x' on class 'GstGLTransformation' needs to be writeable, controlable and not construct_only
0:00:00.174733951  8309 0xd1b750 TRACE GST_REFCOUNTING gstobject.c:264:gst_object_unref:<gltrnsfrm> 0x10a60e0 unref 4->3
(python3:8309): GStreamer-CRITICAL **: 10:37:00.609: gst_object_add_control_binding: assertion 'binding->pspec' failed
As per the documentation of 'gltransformation', the properties rotation-x/y/z are readable and writable. Also, here is a link to an application that takes input from a GUI and changes rotation-x/y/z on 'gltransformation'.
I have no clue why this is an issue in my case.
#gst-inspect-1.0 gltransformation
translation-x : Translates the video at the X-Axis, in universal [0-1] coordinate.
flags: readable, writable
Float. Range: -3,402823e+38 - 3,402823e+38 Default: 0
translation-y : Translates the video at the Y-Axis, in universal [0-1] coordinate.
flags: readable, writable
Float. Range: -3,402823e+38 - 3,402823e+38 Default: 0
translation-z : Translates the video at the Z-Axis, in universal [0-1] coordinate.
flags: readable, writable
Float. Range: -3,402823e+38 - 3,402823e+38 Default: 0
rotation-x : Rotates the video around the X-Axis in degrees.
flags: readable, writable
Float. Range: -3,402823e+38 - 3,402823e+38 Default: 0
rotation-y : Rotates the video around the Y-Axis in degrees.
flags: readable, writable
Float. Range: -3,402823e+38 - 3,402823e+38 Default: 0
rotation-z : Rotates the video around the Z-Axis in degrees.
flags: readable, writable
Float. Range: -3,402823e+38 - 3,402823e+38 Default: 0
Edit 2: updated the code with a workaround for the issue:
import threading

class Thread(object):
    def __init__(self):
        thread = threading.Thread(target=self.get)
        self.gltrnsfrm = Gst.ElementFactory.make("gltransformation", "gltrnsfrm")
        thread.start()

    def get(self):
        try:
            global a, b, c
            while True:
                self.gltrnsfrm.set_property("rotation-z", a)
                self.gltrnsfrm.set_property("rotation-x", b)
                self.gltrnsfrm.set_property("rotation-y", c)
                # time.sleep(0.01)
        except KeyboardInterrupt:
            pass
The rest of the code is the same as described earlier in the post (with minor adaptations to use the thread). However, the following code was omitted:
# dynamic controller
cs = GstController.InterpolationControlSource()
cs.set_property('mode', GstController.InterpolationMode.LINEAR)
cb = GstController.DirectControlBinding.new(gltrnsfrm, "rotation-x", cs)
gltrnsfrm.add_control_binding(cb)
# modify the values
cs.set(0 * Gst.SECOND, b)  # use updated values of b
cs.set(1 * Gst.SECOND, b)
The problem here is that the gltransformation properties are not controllable, i.e. they are not created with the GST_PARAM_CONTROLLABLE flag in the element's code.
You can either add controllability to gltransformation by proposing a patch upstream, or avoid relying on controllable properties.
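For reference, here is a minimal C sketch of the same check that the binding constructor quoted above performs (this helper is illustrative, not part of the original post): it tests whether a property carries the GST_PARAM_CONTROLLABLE flag before you try to bind a control source to it.

#include <gst/gst.h>

static gboolean
property_is_controllable (GstElement * element, const gchar * prop_name)
{
  GParamSpec *pspec =
      g_object_class_find_property (G_OBJECT_GET_CLASS (element), prop_name);
  if (pspec == NULL)
    return FALSE;
  /* control bindings only accept properties flagged as controllable */
  return (pspec->flags & GST_PARAM_CONTROLLABLE) != 0;
}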
I know this question has been posted several times, but none of the solutions worked for me.
I am trying to play .wav files using gstreamer apis.
The below command plays .wav files (with or without the audioconvert):
gst-launch-1.0 filesrc location=sound.wav ! wavparse ! audioconvert ! alsasink
I have written simple C++ code for the above command, referring to the GStreamer Hello World example. But it ends with "Internal data flow error". This is my code:
{
    gst_init(NULL, NULL);
    m_pipeline = gst_pipeline_new("audio-player");
    m_fileSource = gst_element_factory_make("filesrc", "file-source");
    m_parser = gst_element_factory_make("wavparse", "wav-parser");
    m_sink = gst_element_factory_make("alsasink", "audio-output");
    if (!m_pipeline || !m_fileSource || !m_parser || !m_sink) {
        g_printerr("One or more elements could not be created !! \n");
        return false;
    }
    /* Set up the pipeline */
    else {
        /* set the input filename to the source element */
        g_object_set(G_OBJECT(m_fileSource), "location", path.toLatin1().data(), NULL);
        /* set a message handler on a bus */
        GstBus *bus = gst_pipeline_get_bus(GST_PIPELINE(m_pipeline));
        gst_bus_add_watch(bus, bus_call, this);
        gst_object_unref(bus);
        /* add all elements into the pipeline */
        gst_bin_add_many(GST_BIN(m_pipeline), m_fileSource, m_parser, m_sink, NULL);
        /* link the elements together */
        gst_element_link(m_fileSource, m_parser);
        g_signal_connect(m_parser, "pad-added", G_CALLBACK(on_pad_added), m_sink);
    }
    gst_element_set_state(m_pipeline, GST_STATE_READY);
    gst_element_set_state(m_pipeline, GST_STATE_PAUSED);
}
static void
on_pad_added(GstElement *src_element, GstPad *src_pad, gpointer data)
{
    GstElement *sink_element = (GstElement *)data;
    GstPad *sink_pad = gst_element_get_static_pad(sink_element, "sink");
    gst_pad_link(src_pad, sink_pad);
    gst_object_unref(sink_pad);
    src_element = NULL;
}
I even tried the solutions suggested in this link: I replaced "oggdemux" with "wavparse" and "vorbisdec" with "identity" in the GStreamer Hello World example, as suggested there. The error I received:
0:00:00.289443936 1624 0x1234e00 WARN basesrc gstbasesrc.c:3470:gst_base_src_start_complete:<file-source> pad not activated yet
0:00:00.290993573 1624 0x1740030 FIXME default gstutils.c:3643:gst_pad_create_stream_id_internal:<wav-parser:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:00:00.291883886 1624 0x1740030 WARN wavparse gstwavparse.c:1564:gst_wavparse_stream_headers:<wav-parser> Ignoring chunk bext
0:00:00.292198262 1624 0x1740030 WARN wavparse gstwavparse.c:1564:gst_wavparse_stream_headers:<wav-parser> Ignoring chunk junk
0:00:00.305086920 1624 0x1234e00 WARN basesrc gstbasesrc.c:3470:gst_base_src_start_complete:<file-source> pad not activated yet
0:00:00.306444838 1624 0x17400c0 FIXME default gstutils.c:3643:gst_pad_create_stream_id_internal:<wav-parser:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:00:00.307224214 1624 0x17400c0 WARN wavparse gstwavparse.c:1564:gst_wavparse_stream_headers:<wav-parser> Ignoring chunk bext
0:00:00.307636819 1624 0x17400c0 WARN wavparse gstwavparse.c:1564:gst_wavparse_stream_headers:<wav-parser> Ignoring chunk junk
0:00:00.526277506 1624 0x17400c0 WARN wavparse gstwavparse.c:2186:gst_wavparse_loop:<wav-parser> error: Internal data flow error.
0:00:00.526495475 1624 0x17400c0 WARN wavparse gstwavparse.c:2186:gst_wavparse_loop:<wav-parser> error: streaming task paused, reason not-linked (-1)
0:00:00.527296570 1624 0x1740030 WARN wavparse gstwavparse.c:2186:gst_wavparse_loop:<wav-parser> error: Internal data flow error.
0:00:00.527439278 1624 0x1740030 WARN wavparse gstwavparse.c:2186:gst_wavparse_loop:<wav-parser> error: streaming task paused, reason not-linked (-1)
ERROR: Internal data flow error.
ERROR: Internal data flow error.
What am I missing in the code?
You get a not-linked error: some element was left unlinked.
The element is wavparse. Your code waits for it to add a pad before linking it, but wavparse has an 'always' source pad, meaning that the pad is there from the creation of the element, so you can just link it directly, just like you did with your filesrc.
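A minimal sketch of that change (assumptions: an extra audioconvert element, here called m_convert, is created the same way as the others to mirror the working gst-launch line, and the pipeline is finally set to PLAYING so data flows; error handling trimmed):

/* wavparse exposes its source pad from creation, so the whole chain
 * can be linked statically; m_convert is a hypothetical audioconvert element */
gst_bin_add_many(GST_BIN(m_pipeline), m_fileSource, m_parser, m_convert, m_sink, NULL);
if (!gst_element_link_many(m_fileSource, m_parser, m_convert, m_sink, NULL)) {
    g_printerr("Failed to link filesrc ! wavparse ! audioconvert ! alsasink\n");
}
gst_element_set_state(m_pipeline, GST_STATE_PLAYING);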
I am muxing H.264-encoded video data and G.711 PCM-encoded audio data into a .mov media container. I am trying to write metadata in the header, but the metadata does not show up when I go to file -> right click -> Properties -> Details on Windows, and likewise on Ubuntu. This is my code:
// Instead of creating new AVDictionary object, I also tried following way
// stated here: http://stackoverflow.com/questions/17024192/how-to-set-header-metadata-to-encoded-video
// but no luck
AVDictionary* pMetaData = m_pFormatCtx->metadata;
av_dict_set(&pMetaData, "title", "Cloud Recording", 0);
av_dict_set(&pMetaData, "artist", "Foobar", 0);
av_dict_set(&pMetaData, "copyright", "Foobar", 0);
av_dict_set(&pMetaData, "filename", m_sFilename.c_str(), 0);
time_t now = time(0);
struct tm tStruct = *localtime(&now);
char date[100];
strftime(date, sizeof(date), "%c", &tStruct); // i.e. Thu Aug 23 14:55:02 2001
av_dict_set(&pMetaData, "date", date, 0);
av_dict_set(&pMetaData, "creation_time", date, 0);
av_dict_set(&pMetaData, "comment", "This video has been created using Eyeball MSDK", 0);
// ....................
// .................
/* write the stream header, if any */
int ret = avformat_write_header(m_pFormatCtx, &pMetaData);
I also tried to see if the file contains any metadata using mediainfo and exiftool on Linux. I also tried ffmpeg -i output.mov, but no metadata is shown.
What's the problem? Is the flags value of 0 in av_dict_set okay? Do I need to set different flags for different platforms (Windows/Linux)?
I saw this link, and it stated that for Windows I have to use id3v2_version 3 and -write_id3v1 1 to make the metadata work. If so, how can I do this in C++?
I have something similar to your code, but I'm assigning the AVDictionary to my AVFormatContext's metadata field, and it works for me that way. Here's a snippet based on your code.
AVDictionary *pMetaData = NULL;
av_dict_set(&pMetaData, "title", "Cloud Recording", 0);
m_pFormatCtx->metadata = pMetaData;
avformat_write_header(m_pFormatCtx, NULL);
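Worth noting on the design choice: the second argument of avformat_write_header() takes muxer options, not metadata, so passing &pMetaData there never attaches the tags to the output. A minimal sketch (assuming m_pFormatCtx is the initialized output context) that writes straight into the context's own dictionary instead:

/* set the tags on the format context itself, then write the header */
av_dict_set(&m_pFormatCtx->metadata, "title", "Cloud Recording", 0);
av_dict_set(&m_pFormatCtx->metadata, "artist", "Foobar", 0);
if (avformat_write_header(m_pFormatCtx, NULL) < 0) {
    /* header could not be written */
}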