gst_element_request_pad_simple fails for hlssink2 - gstreamer

My appsrc pipeline is as follows:
appsrc ! openh264enc ! h264parse ! hlssink2
I first retrieve each element with gst_element_factory_make successfully; each element is checked for nullptr.
Then I add and link all the elements whose pads have 'always' presence via:
gst_bin_add_many(GST_BIN(pipeline), appsrc, h264Encoder, h264Parse, NULL);
assert(gst_element_link_many((GstElement*)appsrc, h264Encoder, h264Parse, NULL));
According to the GStreamer plugin docs, h264parse's 'src' pad has 'always' presence, but hlssink2's 'video' pad is a 'request' pad. So I try to retrieve that pad in order to link the two:
//hlspad returns null
GstPad* hlspad = gst_element_request_pad_simple(hlssink2, "video");
//videoParsePad is non-null
GstPad* videoParsePad = gst_element_get_static_pad(h264Parse, "src");
This is on the native side on Android. Does anyone know why this isn't working? Everything should be compatible.
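For reference, once both pads come back non-null, the linking would normally be completed like this (a sketch only; note that gst_element_request_pad_simple() exists only in GStreamer 1.20 and later, where it replaced gst_element_get_request_pad(), which may be worth checking on an Android build):
if (gst_pad_link(videoParsePad, hlspad) != GST_PAD_LINK_OK)
    g_printerr("failed to link h264parse to hlssink2\n");
gst_object_unref(videoParsePad);
// and on teardown, release the request pad:
// gst_element_release_request_pad(hlssink2, hlspad);
// gst_object_unref(hlspad);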

Using hlssink instead of hlssink2 works, which is a good enough workaround.
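A sketch of that workaround (untested, reusing the element variables from the question): hlssink expects an already-muxed stream, so an MPEG-TS muxer goes in front of it, and because hlssink's sink pad has 'always' presence, plain linking works:
GstElement* mux = gst_element_factory_make("mpegtsmux", NULL);
GstElement* sink = gst_element_factory_make("hlssink", NULL);
gst_bin_add_many(GST_BIN(pipeline), mux, sink, NULL);
// gst_element_link_many requests mpegtsmux's request pads
// automatically from its pad templates
assert(gst_element_link_many(h264Parse, mux, sink, NULL));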


what is the use of the failIfNotExists parameter in the Sitecore pipeline Run method

I'm trying to get a response from my custom Sitecore pipeline's Run method; below is my code:
Sitecore.Pipelines.CorePipeline.Run("customPipelineName", args, true);
The third parameter is failIfNotExists. Can anybody explain the use of this parameter, and how to get a response from the pipeline when I abort it in a processor?
Below is the declaration in the Kernel DLL:
public static void Run(string pipelineName, PipelineArgs args, bool failIfNotExists);
After decompiling, I found the implementation below for the Run method. If the failIfNotExists parameter only controls whether the pipeline runs, then why can't we decide that ourselves before calling Run?
if (pipeline == null && !failIfNotExists)
    return;
Assert.IsNotNull((object) pipeline, "Could not get pipeline: {0} (domain: {1})", new object[2]
{
    (object) pipelineName,
    (object) pipelineDomain
});
The usage of this parameter is quite simple. As the comment says: if set to true, the code will throw an exception if the pipeline is not found. Taking into consideration that comment and the decompiled code below:
public override void Run(string pipelineName, PipelineArgs args, string pipelineDomain, bool failIfNotExists)
{
    Assert.ArgumentNotNullOrEmpty(pipelineName, "pipelineName");
    Assert.ArgumentNotNull((object) args, "args");
    Assert.ArgumentNotNull((object) pipelineDomain, "pipelineDomain");
    CorePipeline pipeline = this.GetPipeline(pipelineName, pipelineDomain);
    if (pipeline == null && !failIfNotExists)
        return;
    Assert.IsNotNull((object) pipeline, "Could not get pipeline: {0} (domain: {1})", new object[2]
    {
        (object) pipelineName,
        (object) pipelineDomain
    });
    pipeline.Run(args);
}
you can see that it is only about whether an exception is thrown when the pipeline is not found. If you are not sure that the pipeline exists, you can set this parameter to true and use a try/catch statement in your code to learn whether the pipeline was run or doesn't exist. It has nothing to do with the results of the pipeline when it gets aborted.
Regarding the pipeline results: you are passing args to the pipeline:
Sitecore.Pipelines.CorePipeline.Run("customPipelineName", args, true);
Those can be your custom ones or standard Sitecore ones. Everything the pipeline does will be (should be) reflected in those args. If this is your custom pipeline with custom args, add extra properties to those arguments and fill them in the pipeline processors, so that even if one of the processors aborts the pipeline, the state is still reflected in the args.
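A sketch of that pattern (CustomPipelineArgs, Result, and the processor are hypothetical names used for illustration; PipelineArgs, AbortPipeline, and CorePipeline.Run are the real Sitecore APIs):
// Hypothetical args class carrying a result property.
public class CustomPipelineArgs : Sitecore.Pipelines.PipelineArgs
{
    public string Result { get; set; }
}

// In a processor: record state in the args before any abort.
public void Process(CustomPipelineArgs args)
{
    args.Result = "computed in processor";
    if (/* some failure condition */ false)
        args.AbortPipeline(); // the caller still sees args.Result
}

// Caller: read the result off the args object after Run returns.
var args = new CustomPipelineArgs();
Sitecore.Pipelines.CorePipeline.Run("customPipelineName", args, true);
var result = args.Result;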

Gstreamer 1.0 - Creating custom message/event/signal

I am writing a custom plugin for GStreamer 1.0 in C.
This plugin performs some processing on frames and should send an event to the application whenever some conditions are met.
It should not block the pipeline nor interfere with it; just a signal so the application can trigger an action on the side, unrelated to the pipeline.
The processing is working well, but... I don't know what to do next.
There are a lot of existing message types like EOS or seek, but how do I create my own?
The message should contain custom data, and therefore I must create one myself that I can send.
Whether by sending events or signals, I could not find any examples/documentation/explanations on how to emit custom events from a plugin.
I don't even have sample code to start with.
Any insight would be appreciated.
Take a look at the fpsdisplaysink element:
https://github.com/GStreamer/gst-plugins-bad/blob/master/gst/debugutils/fpsdisplaysink.c
This one emits signals which the application can connect to. Most interesting is probably the signal creation:
g_signal_new ("fps-measurements", G_TYPE_FROM_CLASS (klass),
G_SIGNAL_RUN_LAST, 0, NULL, NULL, NULL,
G_TYPE_NONE, 3, G_TYPE_DOUBLE, G_TYPE_DOUBLE, G_TYPE_DOUBLE);
and the periodic triggering of said signal:
g_signal_emit (G_OBJECT (self),
    fpsdisplaysink_signals[SIGNAL_FPS_MEASUREMENTS], 0, rr, dr,
    average_fps);
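On the application side, connecting to such a signal is an ordinary g_signal_connect call (a sketch; fpssink stands for the fpsdisplaysink instance obtained elsewhere, and the callback signature mirrors the three G_TYPE_DOUBLE parameters declared above):
static void
on_fps_measurements (GstElement *sink, gdouble fps, gdouble droprate,
    gdouble avgfps, gpointer user_data)
{
  g_print ("fps: %f, drop rate: %f, average: %f\n", fps, droprate, avgfps);
}

g_signal_connect (fpssink, "fps-measurements",
    G_CALLBACK (on_fps_measurements), NULL);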
Detailed information can be found in the GObject signals documentation:
https://developer.gnome.org/gobject/stable/gobject-Signals.html
Alternatively, you can create your own GstMessage and post it on the bus. See the GstMessage documentation:
https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/GstMessage.html
GstMessage *gst_message_new_application (GstObject *src,
                                         GstStructure *structure);
You can then wrap your data inside the GstStructure, and post the message to the bus with gst_bus_post().
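From inside an element, that could look roughly like this (a sketch; the structure name and fields are invented for illustration, and gst_element_post_message() is used since it looks up the element's bus for you):
// Build a structure carrying the custom data (names are illustrative).
GstStructure *s = gst_structure_new ("my-custom-event",
    "frame-number", G_TYPE_UINT64, (guint64) frame_number,
    "score", G_TYPE_DOUBLE, score,
    NULL);

// Wrap it in an application message; the message takes ownership
// of the structure.
GstMessage *msg = gst_message_new_application (GST_OBJECT (self), s);

// Post it on the bus.
gst_element_post_message (GST_ELEMENT (self), msg);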
Thank you Florian for your insight, which helped me a lot.
I ended up using gst_message_new_application and gst_bus_post.
For those who might be interested, here is the code in Python, where I implemented a run loop:
def connect(bus, name):
    def _connect(f):
        bus.connect(name, f)
        return f
    return _connect

....

bus = self.pipeline.get_bus()
bus.add_signal_watch()

ret = self.pipeline.set_state(Gst.State.PLAYING)
if ret == Gst.StateChangeReturn.FAILURE:
    logger.error("ERROR: Unable to set the pipeline to the playing state")

loop = GObject.MainLoop()

@connect(bus, "message::" + Gst.MessageType.get_name(Gst.MessageType.ERROR))
def on_error(bus, message):
    err, dbg = message.parse_error()
    print("ERROR:", message.src.get_name(), ":", err.message)
    if dbg:
        print("debugging info:", dbg)
    loop.quit()

@connect(bus, "message::" + Gst.MessageType.get_name(Gst.MessageType.EOS))
def on_eos(bus, message):
    logger.info("End-Of-Stream reached")
    loop.quit()

.... other events

try:
    loop.run()
except KeyboardInterrupt:
    pass

print("START : Pipeline has stopped")
self.pipeline.set_state(Gst.State.NULL)
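To tie this back to the original question: a handler for the custom application message posted by the plugin fits the same pattern (a sketch; the structure name "my-custom-event" is hypothetical):
@connect(bus, "message::" + Gst.MessageType.get_name(Gst.MessageType.APPLICATION))
def on_application(bus, message):
    s = message.get_structure()
    if s is not None and s.get_name() == "my-custom-event":
        logger.info("custom event from plugin: %s", s.to_string())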

InterpolationControlSource with Gst.parse_launch()

My app (in Python) loads the GStreamer library, parses and launches a pipeline spec that composites subtitles from an SRT file on top of a prepared video from an MP4 file, then creates a control source bound to the 'alpha' property of the videomixer sink pad that is linked to the subtitle image source.
First I wrote a small proof of concept, which works like a champ. If you run it with an X server (on Unix or Linux, for example), you will see a black square on a green background. After a second, the black square gradually fades out over several seconds.
My app has a pipeline that is a bit more complex. Below is a summary of the relevant code:
pipeline_spec = '''
videomixer name=mixer ! ... other stuff downstream
filesrc location=sample_videos/my-video.mp4 ! decodebin name=demuxer ! mixer.sink_0
filesrc location=subtitles.srt ! subparse ! textrender ! mixer.sink_1
demuxer. ! audioconvert ! audioresample ! faac ! muxer.
'''
self.pipeline = Gst.parse_launch(pipeline_spec)
mixer = self.pipeline.get_by_name('mixer')
#vidpad = mixer.get_static_pad('sink_0')
srtpad = mixer.get_static_pad('sink_1')
self.logger.debug([ pad.name for pad in mixer.pads ])
cs = GstController.InterpolationControlSource()
cs.set_property('mode', GstController.InterpolationMode.LINEAR)
binding = GstController.DirectControlBinding.new(srtpad, 'alpha', cs)
srtpad.add_control_binding(binding)
with open(srtfilepath) as srtfile:
    for timestamps in parsesrt.parse(srtfile):
        start, end = timestamps
        self._set_subtitle_fade(cs, start, end)

def _set_fade_effect(self, controlsource, start, duration, alpha_begin, alpha_end):
    controlsource.set(start, alpha_begin)
    controlsource.set(start + duration, alpha_end)
    self.logger.debug('set fade-{0} from {1} to {2}'.format(
        'in' if alpha_begin < alpha_end else 'out', start, start + duration))

def _set_subtitle_fade(self, controlsource, start_subtitle, end_subtitle):
    self._set_fade_effect(controlsource, start_subtitle, self.DURATION_FADEIN, 0, 1)
    self._set_fade_effect(controlsource, end_subtitle - self.DURATION_FADEOUT, self.DURATION_FADEOUT, 1, 0)
One difference between the two pipelines is that in the first example, the videomixer pads are request pads, but in the real app they turn out to be static pads. And only 'sink_1' is present in the log statement:
DEBUG, ['src', 'sink_1']
I'm not sure why this is so, or whether it makes a difference.
When I run the app in a web server and check in a browser, the subtitles appear, but they do not fade in or out.
I checked the timestamps and they look good; they are in nanoseconds (10^9):
set fade-in from 2440000000 to 3440000000
set fade-out from 2375000000 to 4375000000
set fade-in from 7476000000 to 8476000000
...
So what stone have I left unturned?
The other big difference between your first and second prototypes is videotestsrc changing to filesrc ! decodebin. gst_parse_launch won't immediately connect decodebin to videomixer. What happens is this:
The pipeline is parsed, but decodebin doesn't know the contents of filesrc until it demuxes them; they could be audio or a PowerPoint presentation or a PGP signature or anything. So it exposes no src pads initially.
You play the pipeline. decodebin begins receiving data from filesrc, identifies the content as MP4, and demuxes it. It discovers it has video content that matches videomixer's pads and makes the connection to the first open pad.
So what you probably need to do is listen for the pad-added signal on decodebin, check that it's the right pad, and then make your binding:
def decodebin_pad_added(self, decodebin, pad):
    caps = pad.get_current_caps()
    if not caps or not caps.to_string().startswith('video/'):
        return  # wrong pad type
    # make the binding to the pad here

decodebin.connect('pad-added', self.decodebin_pad_added)
You can tell in advance that this behavior will be present by running gst-inspect-1.0 on the element in question and examining the pads. You can see that decodebin has a 'Sometimes' pad template, versus the constant ('Always') pad that's present on subparse:
subparse:
Pads:
  ...
  SRC: 'src'
    Implementation:
      Has custom eventfunc(): gst_sub_parse_src_event
      Has custom queryfunc(): gst_sub_parse_src_query
      Has custom iterintlinkfunc(): gst_pad_iterate_internal_links_default
    Pad Template: 'src'

decodebin:
Pad Templates:
  SRC template: 'src_%u'
    Availability: Sometimes
    Capabilities:
      ANY

how do I add attributes to XML using Xerces?

I have currently generated some XML using Xerces in C++, with the following code:
XMLCh tempStr[100];
XMLString::transcode("ad", tempStr, 99);
doc = impl->createDocument(0, tempStr, 0);
root = doc->getDocumentElement();

XMLString::transcode("imageAd", tempStr, 99);
element = doc->createElement(tempStr);
root->appendChild(element);
However, I am attempting to set the attributes on the top "ad" element (as below), and I have had little luck in doing so. Can someone with experience using Xerces please advise?
Thanks in advance!
<ad xsi:noNamespaceSchemaLocation="smaato_ad_v0.9.xsd" modelVersion="0.9">
<imageAd>
Maybe you didn't see the call to setAttribute in my previous answer, but you can set any attribute on any element with calls like:
root->setAttribute(L"modelVersion", L"0.9");
root->setAttribute(L"xsi:noNamespaceSchemaLocation", L"smaato_ad_v0.9.xsd");
where root is the pointer to your root element.
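Note that the L"..." literals only match XMLCh on platforms where wchar_t is 16-bit, such as Windows. A portable variant uses the same XMLString::transcode pattern as the question (a sketch):
// Transcode attribute name and value into XMLCh buffers, as the
// question already does for element names.
XMLCh attrName[100];
XMLCh attrValue[100];

XMLString::transcode("modelVersion", attrName, 99);
XMLString::transcode("0.9", attrValue, 99);
root->setAttribute(attrName, attrValue);

XMLString::transcode("xsi:noNamespaceSchemaLocation", attrName, 99);
XMLString::transcode("smaato_ad_v0.9.xsd", attrValue, 99);
root->setAttribute(attrName, attrValue);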

Putting a CGImageRef on the clipboard

I'm trying to copy a CGImageRef to the clipboard pasteboard. I found a function that claims to do this by creating an image destination backed by a zero-sized data buffer, adding the image to the destination, finalizing it, and then handing the data to PasteboardPutItemFlavor.
However, it doesn't work, so two questions:
Is this the correct way to go about it? (That is, is there just a small bug, or am I doing it wrong?)
What type should I make the destination? The source had it as TIFF, but Word doesn't seem to know how to deal with that. I changed it to PICT, which at least gave me the "paste" option, but then it said the data was too big...
Code:
void copyCGImageRefToPasteboard(CGImageRef ref)
{
    OSStatus err = noErr;
    PasteboardRef theClipboard;
    err = PasteboardCreate( kPasteboardClipboard, &theClipboard );
    err = PasteboardClear( theClipboard );

    // Encode the image into an in-memory data buffer.
    CFMutableDataRef data = CFDataCreateMutable(kCFAllocatorDefault, 0);
    CFStringRef type = kUTTypePICT;
    size_t count = 1;
    CFDictionaryRef options = NULL;
    CGImageDestinationRef dest = CGImageDestinationCreateWithData(data, type, count, options);
    CGImageDestinationAddImage(dest, ref, NULL);
    CGImageDestinationFinalize(dest);

    // Put the encoded data on the pasteboard under the same flavor type.
    err = PasteboardPutItemFlavor( theClipboard, (PasteboardItemID)1, type, data, 0 );
}
Enter "The Cupertino Tongue Twister" by James Dempsey
Peter put a PICT upon the pasteboard.
Deprecated PICT's a poor pasteboard type to pick.
For reference see: http://developer.apple.com/mac/library/documentation/cocoa/Conceptual/PasteboardGuide106/Articles/pbUpdating105.html
In short: it's deprecated to put PICT on the pasteboard.
OK, I'm answering my own question here, but here's what I've found:
Apple wants you to use PDF for pasteboards. So if you swap out PICT for PDF, it pretty much just works. However, MS Word (what I was testing with) only started to allow pasting of PDF in the newest version (which I don't have).
So, that's the solution: use PDF, and require Word 2008.
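For completeness, here is a sketch of that PDF variant (the question's function with kUTTypePDF substituted and the CF objects released; untested):
void copyCGImageRefToPasteboardAsPDF(CGImageRef ref)
{
    PasteboardRef theClipboard;
    PasteboardCreate( kPasteboardClipboard, &theClipboard );
    PasteboardClear( theClipboard );

    // Encode the image as PDF data in memory.
    CFMutableDataRef data = CFDataCreateMutable(kCFAllocatorDefault, 0);
    CGImageDestinationRef dest = CGImageDestinationCreateWithData(data, kUTTypePDF, 1, NULL);
    CGImageDestinationAddImage(dest, ref, NULL);
    CGImageDestinationFinalize(dest);
    CFRelease(dest);

    // The pasteboard retains the flavor data, so release our references.
    PasteboardPutItemFlavor( theClipboard, (PasteboardItemID)1, kUTTypePDF, data, kPasteboardFlavorNoFlags );
    CFRelease(data);
    CFRelease(theClipboard);
}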