Playing IMFSamples using Media Foundation - C++

I am creating a sample application using Windows Media Foundation.
I use the Source Reader (IMFSourceReader) to read the media file, and then I process the samples (IMFSample) with a custom MFT (IMFTransform).
Once the MFT has processed the IMFSamples, how can I play/display them in a window? I don't want to use the EVR for display.
I have also read this question:
How to play IMFMediaSample in media foundation?
The suggestion there is to use MFPlay to play the samples, but how exactly can this be done?
In the IMFPMediaPlayer interface I cannot find any method that lets me push media samples.
https://msdn.microsoft.com/en-us/library/windows/desktop/dd374329(v=vs.85).aspx

IMFSample is a wrapper over raw data. If you choose to waive the standard API offering for playback/presentation (such as the EVR for video), you will have to extract the data from the media sample object and consume it some other way, using another API at your discretion.
This does not have to be visualization exactly; you are not limited in consumption ideas: writing to a file, sending over the network, etc. For visualization you have other Windows APIs at your disposal: DirectX, DirectShow, legacy DirectDraw, GDI, GDI+, Direct2D, etc.
IMFSample is not accepted by other APIs right away because that is not what it is designed for. In the Media Foundation API, the EVR is designed for presentation, and the EVR is what you are supposed to use:
"The video sample object is a specialized implementation of the IMFSample interface for use with the Enhanced Video Renderer (EVR)..."
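
As an illustration of "extract the data and consume it otherwise", here is a minimal sketch that pulls the raw frame out of an IMFSample and blits it with GDI. It assumes the MFT output type is top-down RGB32, and hwnd, width and height are hypothetical caller-supplied names, not anything from the original post:

#include <mfapi.h>
#include <mfidl.h>
#include <windows.h>

HRESULT DrawSampleWithGdi(IMFSample* pSample, HWND hwnd, LONG width, LONG height)
{
    IMFMediaBuffer* pBuffer = nullptr;
    // Merge all buffers in the sample into one contiguous buffer.
    HRESULT hr = pSample->ConvertToContiguousBuffer(&pBuffer);
    if (FAILED(hr)) return hr;

    BYTE* pData = nullptr;
    DWORD cbData = 0;
    hr = pBuffer->Lock(&pData, nullptr, &cbData);
    if (SUCCEEDED(hr))
    {
        BITMAPINFO bmi = {};
        bmi.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
        bmi.bmiHeader.biWidth = width;
        bmi.bmiHeader.biHeight = -height; // negative height = top-down DIB
        bmi.bmiHeader.biPlanes = 1;
        bmi.bmiHeader.biBitCount = 32;    // assumes MFVideoFormat_RGB32 output
        bmi.bmiHeader.biCompression = BI_RGB;

        HDC hdc = GetDC(hwnd);
        StretchDIBits(hdc, 0, 0, width, height, 0, 0, width, height,
                      pData, &bmi, DIB_RGB_COLORS, SRCCOPY);
        ReleaseDC(hwnd, hdc);
        pBuffer->Unlock();
    }
    pBuffer->Release();
    return hr;
}

For anything beyond a quick preview you would rather hand the bytes to Direct2D/DirectX, but the Lock/consume/Unlock pattern stays the same.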

Related

Getting error MF_E_STREAMSINKS_FIXED for EVR - Windows Media Foundation

I am implementing a Media Foundation example from the link below:
https://msdn.microsoft.com/en-us/library/windows/desktop/ms701605(v=vs.85).aspx
The change I made to this example is that I added two streams by calling:
CreateMediaSource(wFile1, &m_pSource_1);
CreateMediaSource(wFile2, &m_pSource_2);
CreateAggregatedSource(m_pSource_1, m_pSource_2, &m_pAggregatedSource);
m_pAggregatedSource->CreatePresentationDescriptor(&pSourcePD);
m_pSession->SetTopology(0, pTopology);
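(CreateAggregatedSource is my own helper; a minimal sketch of it, assuming it wraps the Windows 8+ MFCreateAggregateSource function, might look like this:

HRESULT CreateAggregatedSource(IMFMediaSource* pSource1,
                               IMFMediaSource* pSource2,
                               IMFMediaSource** ppAggSource)
{
    IMFCollection* pCollection = nullptr;
    HRESULT hr = MFCreateCollection(&pCollection);
    if (SUCCEEDED(hr)) hr = pCollection->AddElement(pSource1);
    if (SUCCEEDED(hr)) hr = pCollection->AddElement(pSource2);
    if (SUCCEEDED(hr)) hr = MFCreateAggregateSource(pCollection, ppAggSource);
    if (pCollection) pCollection->Release();
    return hr;
}
)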
The issue I am facing is that I get the error below when I run the application:
Code: 0xC00D4A3B
Enum: MF_E_STREAMSINKS_FIXED
Message: Stream Sinks cannot be added to or removed from this Media Sink because its set of streams is fixed.
What I want to implement:
I want to display two video streams in one video renderer using the EVR in Windows Media Foundation.
After a lot of investigation into the EVR and using its video mixer to display two videos, my conclusion is that the EVR is not a solution for this (at least on Windows 7).
The EVR and its video mixer simply fail to render two or more videos even in a simple case. Perhaps it is a lack of documentation, perhaps...
For me, the best way would be a custom EVR presenter that does the mixing itself, without following the video mixer design (no IMFTransform needed). The presenter handles the DirectX side, so it can handle the video mixing directly.
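
For context, the usual attempt that runs into this error is binding both topology branches to a single EVR activate and selecting the stream sink with MF_TOPONODE_STREAMID; a sketch (hwndVideo is an assumed window handle):

IMFActivate* pRendererActivate = nullptr;
MFCreateVideoRendererActivate(hwndVideo, &pRendererActivate);

// First branch: reference stream, stream sink 0.
IMFTopologyNode* pOutputNode0 = nullptr;
MFCreateTopologyNode(MF_TOPOLOGY_OUTPUT_NODE, &pOutputNode0);
pOutputNode0->SetObject(pRendererActivate);
pOutputNode0->SetUINT32(MF_TOPONODE_STREAMID, 0);

// Second branch: substream, stream sink 1. MF_E_STREAMSINKS_FIXED is what
// you see when the media sink refuses to create this extra stream sink.
IMFTopologyNode* pOutputNode1 = nullptr;
MFCreateTopologyNode(MF_TOPOLOGY_OUTPUT_NODE, &pOutputNode1);
pOutputNode1->SetObject(pRendererActivate);
pOutputNode1->SetUINT32(MF_TOPONODE_STREAMID, 1);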

Grabber for splitting in UWP

I need your advice. I'd like to develop an app for audio/video splitting with a Metro interface.
Usually I use DirectShow for this, with the following scheme: create a grabber, add it to the DirectShow graph, capture the audio/video streams with it, and pass them to my AVStream drivers for splitting. But in the new program I want to use Media Foundation and build it on UWP.
How I see my new app: it must have a Metro interface for common controls (choice of sources, setting parameters, changing modes, etc.). I'd like to use the MediaCapture class both to capture the streams and to render them. I don't see any problems here; MSDN has many samples for it. But I have no idea how to insert a grabber between the source and the renderer.
The operations the grabber must do:
Receive the input stream from MediaCapture.
Convert the stream (YUV -> RGB), add effects, etc.
Send the output stream for rendering (MediaCapture) and to my AVStream driver for splitting with other apps (Skype, Adobe Flash Player, Edge, ...).
How to make the grabber? In MSDN I found three ways:
Sample Grabber Sink (https://msdn.microsoft.com/en-us/library/windows/desktop/hh184779(v=vs.85).aspx). No problem receiving/controlling/sending the stream in an MF DLL, but I don't know how to link that DLL with MediaCapture.
Source Reader (https://msdn.microsoft.com/en-us/library/windows/desktop/dd940436(v=vs.85).aspx). The same problem, plus the Source Reader doesn't work for playback.
A custom MFT? In any case, MediaCapture allows connecting to an MFT via AddEffectAsync() (see the sketch below).
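
A minimal C++/CX sketch of that third option, assuming the custom MFT is registered as an activatable class named "MyGrabberMFT" (a hypothetical name):

#include <ppltasks.h>

auto mediaCapture = ref new Windows::Media::Capture::MediaCapture();
Concurrency::create_task(mediaCapture->InitializeAsync())
.then([mediaCapture]()
{
    // Attach the grabber MFT to the video preview stream.
    return Concurrency::create_task(mediaCapture->AddEffectAsync(
        Windows::Media::Capture::MediaStreamType::VideoPreview,
        "MyGrabberMFT",   // hypothetical activatable class ID of the MFT
        nullptr));        // optional configuration property set
})
.then([](Windows::Media::IMediaExtension^ extension)
{
    // The MFT now sits between the capture source and the renderer.
});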
My environment: MS Windows 10, MS Visual Studio Community 2015.
Thank you for any ideas.
This question and UWP are no longer relevant for me at all. I found the following:
"Some apps can work intensively in the background, for example video converting, online financial data processing and more. Now a UWP application will be suspended when it goes offscreen."
https://wpdev.uservoice.com/forums/110705-universal-windows-platform/suggestions/9950598-exclude-suspending-in-desktop
So if the user minimizes the program window, the program stops the video stream.

C++ Winapi - MPEG movies as animated background

MPEG is a really nice format, especially because it compresses files to unimaginably small sizes. A 140 MB raw AVI becomes only 4 MB and the quality is still very good. With the Animation Control that Windows provides I can play only raw AVI, but I would really like to play an MPEG instead, due to the size of the video files.
Now, how would I do that with C++ and the Win32 API? Do I have to use some ActiveX components? How do I make sure other users can run my application without being harassed about missing plug-ins/codecs/third-party programs? Can I use the Animation Control in some way to display the MPEG video?
Thanks
I took a look at the MSDN documentation and it looks like you cannot use the Animation Control to play MPEG video. You seem to have two choices:
1. DirectShow.
2. The newer Microsoft Media Foundation.
Both choices are based on COM (and not ActiveX, as I stated earlier).
As for making sure your users can run your application, see the page on Building DirectShow Applications, which answers that question for DirectShow. For Microsoft Media Foundation, your users need to be running Windows Vista or later.
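
If Media Foundation is an option (the MFPlay helper API requires Windows 7), a minimal sketch of playing a video file inside an existing window looks like this; pwszFile and hwnd are assumed to come from your application:

#include <mfplay.h>
#pragma comment(lib, "mfplay.lib")

IMFPMediaPlayer* g_pPlayer = nullptr;

HRESULT PlayVideoInWindow(PCWSTR pwszFile, HWND hwnd)
{
    // fStartPlayback = TRUE starts playback as soon as the file is opened;
    // the player renders its video into hwnd.
    return MFPCreateMediaPlayer(pwszFile, TRUE, 0, nullptr, hwnd, &g_pPlayer);
}

Whether an .mpg file actually plays still depends on a suitable decoder being present on the system.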

How to stream raw synthesized PCM audio in C++/CX Windows 8 apps?

Simply put, I want my C++/CX XAML Windows 8 app to output continuous synthesized sound (not sound effects). However, I've been looking all over the web and I cannot figure out how to feed the system buffers of PCM samples (or, better, have it ask me for them through callbacks) so that they get played. I would use the old waveOut* APIs, but they are banned in Store app development.
So, what is the simplest way to do this? Please note that I am not interested in playing media files (.wav, .mp3) or in web audio streaming.
Thanks in advance.
You need to use WASAPI, which is enabled in Windows Store apps. This article will get you started on how to use the API to render audio. One annoyance is that WASAPI devices generally don't resample for you, so you'll either have to go with what the device is using (probably 44.1 kHz or 48 kHz) or do the resampling yourself (for which you can use the Resampler Media Foundation transform).
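
The core render loop looks roughly like this. This is a sketch, not Store-ready code: in a Store app you obtain the IAudioClient via ActivateAudioInterfaceAsync rather than through IMMDevice, and you would pace with an event instead of Sleep:

#include <windows.h>
#include <audioclient.h>
#include <math.h>

void RenderSine(IAudioClient* pAudioClient)
{
    WAVEFORMATEX* pwfx = nullptr;
    pAudioClient->GetMixFormat(&pwfx);   // take the device's format as-is
    pAudioClient->Initialize(AUDCLNT_SHAREMODE_SHARED, 0,
                             10000000 /* 1 s in 100 ns units */, 0, pwfx, nullptr);

    IAudioRenderClient* pRenderClient = nullptr;
    pAudioClient->GetService(__uuidof(IAudioRenderClient), (void**)&pRenderClient);

    UINT32 bufferFrames = 0;
    pAudioClient->GetBufferSize(&bufferFrames);
    pAudioClient->Start();

    double phase = 0.0;
    for (;;)   // real code: wait on an event, check for an exit condition
    {
        UINT32 padding = 0;
        pAudioClient->GetCurrentPadding(&padding);
        UINT32 framesFree = bufferFrames - padding;
        if (framesFree == 0) { Sleep(10); continue; }

        BYTE* pData = nullptr;
        if (FAILED(pRenderClient->GetBuffer(framesFree, &pData))) break;

        // The shared-mode mix format is typically 32-bit float; verify
        // pwfx before assuming this in real code.
        float* pSamples = (float*)pData;
        for (UINT32 i = 0; i < framesFree; ++i)
        {
            float s = (float)(0.2 * sin(phase));   // 440 Hz test tone
            phase += 2.0 * 3.14159265358979 * 440.0 / pwfx->nSamplesPerSec;
            for (WORD ch = 0; ch < pwfx->nChannels; ++ch)
                pSamples[i * pwfx->nChannels + ch] = s;
        }
        pRenderClient->ReleaseBuffer(framesFree, 0);
    }
}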

Display streaming video in desktop app

I have a native Windows desktop app (C++/Delphi), and I'm successfully using DirectShow to display live video in it from a local video capture device.
The next thing I want to do is display video from a remote capture device, streamed over the LAN.
To stream the video, I guess I can use something like Expression Encoder or VLC, but I'm not sure what the easiest way to receive/decode the streamed video is. Inserting an ActiveX VLC or Flash player might be one option (although licensing may then be an issue), but I was wondering if there's any way to achieve this with DirectShow...
The application needs to run on XP, and the video decoding should ideally be royalty free.
Suggestions, please!
Using DirectShow to receive and display your video can work, but the simplicity, "openness" and performance will depend on the video format and streaming method you'll be using.
A lot of open/free source filters exist for RTSP (e.g. based on live555), but you may also find that creating your own source filter is a better fit.
The best solution won't be the same for H.264 delivered over RTP/RTSP as for MJPEG delivered over plain UDP.
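
If a source filter for your protocol is installed and registered (which filter answers the URL depends entirely on what is on the machine; this is an illustration, not a guarantee), the receiving side can be as small as letting the graph build itself:

#include <dshow.h>
#pragma comment(lib, "strmiids.lib")

HRESULT PlayNetworkStream(PCWSTR url)   // e.g. L"rtsp://camera/stream"
{
    // Assumes CoInitialize(Ex) has already been called on this thread.
    IGraphBuilder* pGraph = nullptr;
    HRESULT hr = CoCreateInstance(CLSID_FilterGraph, nullptr, CLSCTX_INPROC_SERVER,
                                  IID_IGraphBuilder, (void**)&pGraph);
    if (FAILED(hr)) return hr;

    hr = pGraph->RenderFile(url, nullptr);   // source -> decoder -> renderer
    if (SUCCEEDED(hr))
    {
        IMediaControl* pControl = nullptr;
        pGraph->QueryInterface(IID_IMediaControl, (void**)&pControl);
        hr = pControl->Run();
        // ... run your message loop, then pControl->Stop();
        pControl->Release();
    }
    pGraph->Release();
    return hr;
}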