I'm continuing my journey through GStreamer and am stuck once again. I'm running the first tutorial and can hear sound but no video.
The error that I'm seeing is:
gldisplay gstgldisplay_cocoa.m:175:gst_gl_display_cocoa_setup_nsapp: Custom NSApp initialization failed
Am I missing a plugin or some required library here? I'm on an M1 Mac.
I should note that everything works just fine via the command line:
gst-launch-1.0 playbin uri="file:///Users/vukasin/Downloads/sintel_trailer-480p.webm"
Maybe this is OpenGL related? I'm guessing it's trying, and failing, to open the video.
I just figured out what was going on with this error after spending hours thinking my system was not set up correctly. The issue is that a GMainLoop must be run on the main thread for the windowing to work correctly. Interestingly enough, tutorial 12 does exactly the same thing, but does it correctly: https://gstreamer.freedesktop.org/documentation/tutorials/basic/streaming.html?gi-language=c
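Here's a minimal sketch of what I mean (the file URI is a placeholder and error handling is omitted): run a GMainLoop on the main thread instead of blocking on the bus, so the Cocoa windowing code gets a chance to run.

#include <gst/gst.h>

/* Sketch: quit the loop on error or end-of-stream. */
static gboolean on_bus_message(GstBus *bus, GstMessage *msg, gpointer user_data)
{
    GMainLoop *loop = (GMainLoop *)user_data;
    if (GST_MESSAGE_TYPE(msg) == GST_MESSAGE_ERROR ||
        GST_MESSAGE_TYPE(msg) == GST_MESSAGE_EOS)
        g_main_loop_quit(loop);
    return TRUE;   /* keep the watch installed */
}

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GstElement *pipeline = gst_parse_launch(
        "playbin uri=file:///path/to/video.webm", NULL);  /* placeholder URI */
    GMainLoop *loop = g_main_loop_new(NULL, FALSE);

    GstBus *bus = gst_element_get_bus(pipeline);
    gst_bus_add_watch(bus, on_bus_message, loop);
    gst_object_unref(bus);

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    g_main_loop_run(loop);   /* must spin on the main thread */

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    g_main_loop_unref(loop);
    return 0;
}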
I am trying to write a service that converts an m3u8 link from YouTube.
I am loading the stream info, then setting all but one video and one audio stream to AVDISCARD_ALL.
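Roughly what that looks like in my code (a trimmed sketch; the function name and URL are placeholders, not the real service code):

extern "C" {
#include <libavformat/avformat.h>
}

// Open the stream, probe it, then discard everything except the
// first video and first audio stream.
int open_and_discard(const char *url)
{
    AVFormatContext *fmt = NULL;
    if (avformat_open_input(&fmt, url, NULL, NULL) < 0)
        return -1;
    if (avformat_find_stream_info(fmt, NULL) < 0) {
        avformat_close_input(&fmt);
        return -1;
    }

    int kept_video = -1, kept_audio = -1;
    for (unsigned i = 0; i < fmt->nb_streams; i++) {
        AVStream *st = fmt->streams[i];
        if (st->codecpar->codec_type == AVMEDIA_TYPE_VIDEO && kept_video < 0)
            kept_video = i;               // keep the first video stream
        else if (st->codecpar->codec_type == AVMEDIA_TYPE_AUDIO && kept_audio < 0)
            kept_audio = i;               // keep the first audio stream
        else
            st->discard = AVDISCARD_ALL;  // drop everything else
    }
    return 0;
}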
Everything works fine, except that after several seconds of playback av_read_frame() blocks for anywhere from 30 seconds to a couple of minutes.
I'm not that well versed in ffmpeg, so I'm not exactly sure what to do from here.
Playing one of these links directly in ffplay seems to work fine, but I don't see what it might be doing differently. I'm just hoping someone can point me in the right direction of anything to try.
Attached is a call stack from when this is happening. I am not sure whether anything here looks amiss.
I have an issue with SDL. After I successfully initialize SDL audio and call SDL_OpenAudio(), everything goes well, but if I disconnect the current audio device or switch to another one, my SDL audio callback never runs again and SDL_CloseAudio() / SDL_QuitSubSystem() never return.
So, how can I fix this? Thanks a lot!
Solved: SDL uses XAudio2 by default, but that turns out to be unsuitable here; maybe an issue in SDL or in XAudio2.
After switching to DirectSound as the driver, the code runs fine:
SetEnvironmentVariable(L"SDL_AUDIODRIVER", L"directsound"); // tell SDL to use DirectSound (set this before SDL_Init)
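If you're on SDL2, SDL_setenv() should achieve the same thing without the Win32 call (a sketch; it has to run before SDL_Init()):

#include <SDL.h>

int main(int argc, char *argv[])
{
    // Pick the DirectSound audio backend before SDL initializes.
    SDL_setenv("SDL_AUDIODRIVER", "directsound", 1);  // 1 = overwrite

    if (SDL_Init(SDL_INIT_AUDIO) != 0) {
        SDL_Log("SDL_Init failed: %s", SDL_GetError());
        return 1;
    }
    // ... SDL_OpenAudio() and the rest as before ...
    SDL_Quit();
    return 0;
}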
First of all, you should know this question is titled that way because that's where I ended up stuck after narrowing down my problem for quite a while. Since there are probably better approaches to my problem, I'm also explaining below what the problem is and what I've been doing to try to solve it. Suggestions on other approaches would be very welcome.
The problem
I'm using a GStreamer port for Android to render video from remote cameras via the RTSP protocol (UDP is the transport).
Using playbin, things were working quite well until they suddenly weren't anymore for a subset of these cameras.
Unfortunately I don't have access to the cameras themselves, since they belong to our company's client, but the first thing that sprang to my mind was that it had to be a problem with them.
However, another Android app, which we're using as a reference, is still able to play video from these cameras normally, so I'm now doing my best to investigate the issue further on my end (our Android app).
The problem has been quite deterministic: some cameras always fail, others always work. When they fail, it is sometimes with not-linked reported as the cause.
I managed to dump the pipeline graph associated with each of these cameras when the application tries to play video from them. I noticed that for each of the cameras that fail, the associated pipeline is always missing something. Some miss just the sink element; others miss both the source and the sink:
Dump of pipeline with source only:
Dump of pipeline without a source or a sink:
Dump of pipeline with both (these are the cases where we can indeed play):
These are dumps of pipelines built by the playbin.
Attempted solution
I've been testing what happens if I build the pipeline manually from scratch (the same one playbin builds in the third image above) and force video from all cameras to be processed by it. Since all the cameras used to work, my guess is that negotiation is somehow failing now for some of them, so playbin no longer builds the pipeline properly for those cameras; if I assemble it myself, everything should eventually work as expected. (I'm assuming that rtspsrc in combination with glimagesink was also the pipeline playbin chose for playing video from these cameras.)
This is how I'm trying to build this pipeline myself:
priv->pipeline = gst_pipeline_new("rtspstreamer");

source = gst_element_factory_make("rtspsrc", NULL);
if (!source) {
    GST_DEBUG("Source could not be created");
}

sink = gst_element_factory_make("glimagesink", NULL);
if (!sink) {
    GST_DEBUG("Sink could not be created");
}

if (!gst_bin_add(GST_BIN(priv->pipeline), source)) {
    GST_DEBUG("Could not add source to pipeline");
}
if (!gst_bin_add(GST_BIN(priv->pipeline), sink)) {
    GST_DEBUG("Could not add sink to pipeline");
}

if (!gst_element_link(source, sink)) {
    GST_DEBUG("Source and sink could not be linked");
}

g_object_set(source, "location", uri, NULL);
So, running the code above, I get the following error:
Source and sink could not be linked
This is where I'm stuck. How could I investigate further why these elements are unable to link to each other? Maybe there should be some other element between them in the pipeline, but judging from the dump of the successful pipeline (third image above), that doesn't seem to be the case.
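One idea I'm going to try next, for whoever can comment on it: since rtspsrc creates its source pads dynamically once the RTSP session is negotiated, a plain gst_element_link() right after creation may simply have nothing to link yet. Here is a sketch of listening for the pad-added signal to see which pads actually appear (standard GStreamer callback; I haven't verified this in our app yet):

// Log every pad rtspsrc exposes at runtime and try linking at that point.
static void on_pad_added(GstElement *src, GstPad *new_pad, gpointer user_data)
{
    GstElement *sink = GST_ELEMENT(user_data);
    GstPad *sink_pad = gst_element_get_static_pad(sink, "sink");

    GST_DEBUG("New pad '%s' on '%s'", GST_PAD_NAME(new_pad),
              GST_ELEMENT_NAME(src));

    if (!gst_pad_is_linked(sink_pad) &&
        GST_PAD_LINK_FAILED(gst_pad_link(new_pad, sink_pad)))
        GST_DEBUG("Linking failed; probably a caps mismatch");

    gst_object_unref(sink_pad);
}

// ... after creating source and sink:
g_signal_connect(source, "pad-added", G_CALLBACK(on_pad_added), sink);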
Thanks in advance for any help.
I've been playing around with the QMediaPlayer class. I was curious about how it would work with a streaming video source, so I've used VLC to stream some videos over the UDP protocol.
To make a quick test, I've used the Qt example named MediaPlayer. As the example is designed to work only with offline files, I've added one dumb function to the Player implementation:
void setM(const QUrl &url) { player->setMedia(url); player->play(); }
Then, in the main.cpp file, I call this function like this:
...
player.setM(QUrl("udp://239.1.1.1:1234"));
return app.exec();
What this does is start playing the stream source once the program is ready.
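For reference, the whole test boils down to something like this as a standalone program (a sketch; QVideoWidget stands in for the example's UI, and the multicast address is the one from my VLC test):

#include <QApplication>
#include <QMediaPlayer>
#include <QVideoWidget>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QMediaPlayer player;
    QVideoWidget video;
    player.setVideoOutput(&video);  // render the stream into a widget
    video.show();

    player.setMedia(QUrl("udp://239.1.1.1:1234"));  // the VLC test stream
    player.play();

    return app.exec();
}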
The problem here is that Qt throws me the following error:
DirectShowPlayerService::doSetUrlSource: Unresolved error code 800c000d
Doing this with local files and HTTP streaming works... but when I try it with UDP or RTP I always get the same error.
I've spent a few hours looking for more information, but I always get the same response... use QMLVLC... For example, look at this.
Has anyone tried this before? What is wrong here?
PS: I know that there is a VLC plugin to make this work, but I would like to make this work with Qt only (or at least understand what is happening here).
PS2: I'm on Windows 8.1, Qt 5.5 (MinGW 4.9.2), and I have all the important codecs installed.
Thanks in advance,
UPDATE
Finally, I managed to find my way around the new http://code.qt.io, and here is the code I suspect is blocking the UDP (and other) protocols -> here.
Maybe only "http" and "https" are accepted as valid stream sources in DirectShowPlayerService... I'll try to get some extra time this week to recompile just the multimedia module for Windows in order to add the UDP protocol to the doSetUrlSource function and see what happens. If anyone tests it first, please let me know here!
UPDATE 2
First of all, I suspect QMediaPlayer couldn't play UDP/RTP content because of the AddFilter method... Anyway, http, https and rtsp work perfectly.
Secondly, I've found some strange behavior with the UDP protocol.
I'm using "udp://#239.1.1.1:1234" as the test multicast address. The strange thing is that during one test I entered the address "udp://#239.1.1.1:1234z" by mistake, and this time no error was thrown. It seems that the address needs to contain a letter.
This is going to be my first question on Stack Overflow after several days of looking for an explanation. Please be gentle with me for asking, because I know my problem is a bit too bizarre to be a general one.
I made an MF video capture application based on the Microsoft sample 'CaptureToFile'. It worked on Windows 7 x64, and I upgraded to Visual Studio 2013 without problems. Problems arose when I tried to move the whole development to a Windows 8.1 x64 machine.
The app compiles and executes without error, but it is UNABLE to capture samples using m_pReader->ReadSample() in asynchronous mode; only the first two samples arrive at the OnReadSample method, and those must be 'control' samples, because the IMFSample is null in both. After that, the app hangs waiting for data.
I've tried the original MFCaptureToFile sample with the same sad results.
Of course, hardware and software are comparable (the same capture card with the same driver version; both are desktop PCs...).
Do you know any possible reason for this behaviour? In Windows 7 everything works flawlessly! Or at least, could you point me toward new paths for finding out what's happening?
Thanks in advance
UPDATE: There is another 'player' in the game. Looking into the threads, I see that a worker thread is in 'RTWorkQ.dll', the real-time work queue container, which is specific to Windows 8. I will keep investigating. In the meantime, if you have any idea or something to share, I'll be glad to hear it.
UPDATE 2: I've modified the MFCaptureToFile sample to get the video samples synchronously, because I thought the problem could be due to the asynchronous behaviour, related to the queues. I have to say that the problem persists even with this change: the second time it tries to read a sample, the application hangs waiting for a read that never arrives.
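For reference, this is roughly the synchronous read I'm doing (simplified; the out-parameters follow the IMFSourceReader::ReadSample signature):

// Simplified synchronous read; in my test the second call never returns.
DWORD streamIndex = 0, streamFlags = 0;
LONGLONG timestamp = 0;
IMFSample *pSample = NULL;

HRESULT hr = m_pReader->ReadSample(
    (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM,
    0,             // no control flags
    &streamIndex,  // which stream produced the sample
    &streamFlags,  // e.g. MF_SOURCE_READERF_STREAMTICK
    &timestamp,    // presentation time (100 ns units)
    &pSample);     // can be NULL for stream ticks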
UPDATE 3: I've tried the CaptureEngine sample application, which uses another MF way to capture video (MFCaptureEngine). It builds and runs flawlessly, but it doesn't show any images when starting the 'preview' and doesn't record anything useful, only non-playable files.
UPDATE 4: I've installed Visual Studio 2010 Ultimate on Windows 8 PRO. The MFCaptureToFile sample fails again: it's unable to read a second sample from the frame grabber. I'm starting to think there may be an incompatibility between the capture card (Datapath VisionRGB-E1S) and Windows 8 PRO, even though the driver claims to work fine on this platform and the test program shows images. Tomorrow I'm going to repeat the test with an external USB webcam.
Finally, I have figured out the reason for this problem.
With the Windows 8.1 release, Microsoft introduced New AVStream Interfaces for Windows 8.1.
There is a small but very important change in the KS_FRAME_INFO structure: the new FrameCompletionNumber member.
"An identifying sequence number for the frame in the completed queue. This number is used to verify proper frame order. When this value is 0, the frame was cancelled. This member is available starting with Windows 8.1."
DirectShow doesn't care about this number, but Media Foundation does.
So you cannot just fix this on the user-mode side; the manufacturer's developers must release a driver update. By the way, I have two webcams, a Logitech C270 and a Creative Live Socialize HD: Logitech supports Metro while Creative does not.
I have successfully updated my driver with only a few lines of code (to set up FrameCompletionNumber properly).
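I can't post the vendor's code, but the core of the change is roughly this (a hypothetical sketch: the function name and the way the extended header is reached are illustrative only, and all the surrounding AVStream plumbing is omitted):

#include <ksmedia.h>

// Stamp each completed frame with a non-zero, increasing sequence
// number. Per the docs, 0 means the frame was cancelled, so the
// counter is pre-incremented.
static ULONG g_FrameCounter = 0;   // per-pin state in a real driver

void StampCompletedFrame(PKSSTREAM_HEADER StreamHeader)
{
    // KS_FRAME_INFO lives in the extended header that follows the
    // stream header (when ExtendedHeaderSize says it is present).
    PKS_FRAME_INFO FrameInfo = (PKS_FRAME_INFO)(StreamHeader + 1);

    FrameInfo->FrameCompletionNumber = ++g_FrameCounter;  // never 0
}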
UPD. similar thread http://www.osronline.com/showthread.cfm?link=255004
It must be a problem with the Datapath VisionRGB-E1S frame grabber. I've tried with a brand-new USB webcam, a LifeCam Studio, and everything worked fine.
I will leave the question of why this behaviour differs between Windows 8 and Windows 7 for a future thread, but it could be something related to user-mode access...
I had the same kind of issue:
IMFSourceReader was obtained successfully
reader->SetCurrentMediaType() reported no error.
reader->ReadSample() was successful.
then OnReadSample() was called only once, with an hrStatus argument of 0x80070491.
For me, the issue was that I modified the video subtype on an IMFMediaType and then applied it to the reader as the current media type.
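The pattern that avoids this (a sketch based on the source reader documentation, with error handling trimmed; pReader stands in for your IMFSourceReader) is to hand the reader a fresh partial media type instead of a modified copy of the native one:

#include <mfapi.h>
#include <mfreadwrite.h>

// Request a converted output format with a fresh partial media type
// (major type + subtype only). RGB32 here is just an example subtype.
IMFMediaType *pType = NULL;
HRESULT hr = MFCreateMediaType(&pType);
if (SUCCEEDED(hr))
    hr = pType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
if (SUCCEEDED(hr))
    hr = pType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32);
if (SUCCEEDED(hr))
    hr = pReader->SetCurrentMediaType(
        (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM,
        NULL,    // reserved
        pType);
if (pType)
    pType->Release();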