I have used the x264 DirectShow filter from Monogram for decoding H.264/AVC video. I need to create an instance of it and add the filter to a graph in DirectShow. I checked its CLSID in GraphEdit, and it shows 'x264'. I guess that to create an instance we need the GUID of that filter, but I have no clue how to create the filter instance from that 'x264' value.
I am using DirectShow with VC++.
Does anybody have an idea on this?
As this filter is open source, you only need to look in the right headers. You just need to copy CLSID_MonogramX264 from here and create the filter with CoCreateInstance.
You can use Monogram GraphStudio to see the CLSID; as far as I remember, when I checked it last time everything was OK.
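For illustration, a minimal sketch of that approach (assuming CLSID_MonogramX264 has been copied from the filter's headers and COM is already initialized; error handling trimmed):
#include <dshow.h>
// CLSID_MonogramX264 copied from the Monogram filter's public headers.
IGraphBuilder *pGraph = NULL;
IBaseFilter *pX264 = NULL;
// Create the filter graph manager.
HRESULT hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                              IID_IGraphBuilder, (void**)&pGraph);
// Create the Monogram filter directly from its CLSID and add it to the graph.
if (SUCCEEDED(hr))
    hr = CoCreateInstance(CLSID_MonogramX264, NULL, CLSCTX_INPROC_SERVER,
                          IID_IBaseFilter, (void**)&pX264);
if (SUCCEEDED(hr))
    hr = pGraph->AddFilter(pX264, L"MONOGRAM x264");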
Currently I'm creating a WMV file with the Windows Media Foundation SDK, using the method (Transcode API) from the following article: https://learn.microsoft.com/en-us/windows/desktop/medfound/tutorial--using-the-sink-writer-to-encode-video
What I noticed is that if I don't set MFPKEY_ASFMEDIASINK_AUTOADJUST_BITRATE, the output WMV file may be corrupted (terrible artifacts show up during playback) when the encoded content exceeds the specified bitrate.
According to the following article, the flag needs to be set on the IMFASFContentInfo:
https://learn.microsoft.com/en-us/windows/desktop/medfound/mfpkey-asfmediasink-autoadjust-bitrate-property
So I tried the following:
IMFSinkWriter* pSinkWriter = NULL;
//initialize sink writer with MFCreateSinkWriterFromURL
...
IMFASFContentInfo* pContentInfo = NULL;
pSinkWriter->GetServiceForStream((DWORD)MF_SINK_WRITER_MEDIASINK, GUID_NULL, __uuidof(IMFASFContentInfo), (LPVOID*)&pContentInfo);
IPropertyStore* pPropStore = NULL;
pContentInfo->GetEncodingConfigurationPropertyStore(0, &pPropStore);
PROPVARIANT var;
PropVariantInit(&var);
var.vt = VT_BOOL;
var.boolVal = VARIANT_TRUE;
pPropStore->SetValue(MFPKEY_ASFMEDIASINK_AUTOADJUST_BITRATE, var);
PropVariantClear(&var);
//Add Video stream and BeginWriting, then start passing samples
...
But the setting does not seem to be applied to the muxer, and I still see obvious artifacts from a corrupted stream.
My guess is that since MFCreateSinkWriterFromURL creates the media sink and the underlying IMFASFMultiplexer internally, while MFASF_MULTIPLEXER_AUTOADJUST_BITRATE needs to be set when the IMFASFMultiplexer is created, setting the flag after the sink writer has been created is too late.
If I don't use the Transcode API and instead create the IMFASFWriter myself, I think I can set MFASF_MULTIPLEXER_AUTOADJUST_BITRATE manually while creating it. But since I already have working code using the Transcode API, apart from setting this flag, I would like to keep the current approach if possible.
If anybody has any clue/solution/workaround, please let me know.
You should be able to query for the IPropertyStore of the ASF file sink from the Sink Writer directly, using GetServiceForStream and specifying MF_SINK_WRITER_MEDIASINK, like this:
pSinkWriter->GetServiceForStream(MF_SINK_WRITER_MEDIASINK, GUID_NULL, IID_PPV_ARGS(&pPropertyStore));
where pPropertyStore points to an IPropertyStore.
After that you should set the MFPKEY_ASFMEDIASINK_AUTOADJUST_BITRATE property as explained here.
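Putting the two pieces together, a minimal sketch (assuming pSinkWriter has already been created with MFCreateSinkWriterFromURL; error handling trimmed):
IPropertyStore* pPropertyStore = NULL;
HRESULT hr = pSinkWriter->GetServiceForStream(MF_SINK_WRITER_MEDIASINK, GUID_NULL,
                                              IID_PPV_ARGS(&pPropertyStore));
if (SUCCEEDED(hr))
{
    // Ask the ASF media sink to adjust the bitrate automatically.
    PROPVARIANT var;
    PropVariantInit(&var);
    var.vt = VT_BOOL;
    var.boolVal = VARIANT_TRUE;
    hr = pPropertyStore->SetValue(MFPKEY_ASFMEDIASINK_AUTOADJUST_BITRATE, var);
    PropVariantClear(&var);
    pPropertyStore->Release();
}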
Is there a way (or hack) to let me use a "custom" video capturer to create a VideoTrack and provide frames to it?
The classic way to build a VideoTrack is:
1. Get a VideoCapturer instance:
std::unique_ptr<cricket::VideoCapturer> capturer;
2. Create a VideoSource with the provided capturer:
rtc::scoped_refptr<webrtc::VideoTrackSourceInterface> videoSource = peer_connection_factory_->CreateVideoSource(std::move(capturer), NULL);
3. Create a VideoTrack using the VideoSource:
rtc::scoped_refptr<webrtc::VideoTrackInterface> video_track;
video_track = peer_connection_factory_->CreateVideoTrack(kVideoLabel, videoSource);
I was wondering if there is a way to override step one: instead of using the native capturer, use a custom one, so that I can provide the frames to the video track through a callback. That would let me use any video source (a file, a YUV stream, ...) and be very flexible.
Any advice on this one?
This question is the C++ counterpart of: Create a WebRTC VideoTrack with a "custom" Capturer on Android with libjingle
I finally found a way to make my own native C++ Video Capture. Basically you have to override some functions from webrtc::I420BufferInterface and cricket::VideoCapturer.
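For what it's worth, a rough sketch of the cricket::VideoCapturer part, based on the WebRTC revision I was using at the time (header paths and the OnFrame signature may differ in newer revisions):
#include "webrtc/media/base/videocapturer.h"
#include "webrtc/api/video/i420_buffer.h"
#include "webrtc/api/video/video_frame.h"
// Capturer that lets external code push I420 frames (from a file, a YUV stream, ...)
// into the track via PushFrame().
class CustomCapturer : public cricket::VideoCapturer {
 public:
  cricket::CaptureState Start(const cricket::VideoFormat& format) override {
    SetCaptureFormat(&format);
    running_ = true;
    return cricket::CS_RUNNING;
  }
  void Stop() override {
    running_ = false;
    SetCaptureFormat(nullptr);
  }
  bool IsRunning() override { return running_; }
  bool IsScreencast() const override { return false; }
  bool GetPreferredFourccs(std::vector<uint32_t>* fourccs) override {
    fourccs->push_back(cricket::FOURCC_I420);
    return true;
  }
  // Called from my own source (decoder, file reader, ...) for every new frame.
  void PushFrame(const rtc::scoped_refptr<webrtc::I420Buffer>& buffer,
                 int64_t timestamp_us) {
    webrtc::VideoFrame frame(buffer, webrtc::kVideoRotation_0, timestamp_us);
    OnFrame(frame, buffer->width(), buffer->height());
  }
 private:
  bool running_ = false;
};
An instance of this class is then handed to CreateVideoSource exactly as in step two of the question.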
If someone wants any further explanations please feel free to ask.
I am using the nokiatech HEIF API (github.com/nokiatech/heif) to process HEIC files produced by the iOS betas.
I am able to get the tiles and metadata like rotation and dimensions, but I am unable to locate the capture date of the image. I found some timestamp functions, but they complain about "Forced FPS not set for meta context", which leads me to think these functions are related to tracks rather than items.
Any help would be appreciated.
EDIT:
So there is a typo in the documentation for getReferencedToItemListByType (and getReferencedFromItemListByType): it says it takes "cdcs" as the referenceType parameter. It is of course "cdsc" (content describes).
So to get the metadata blob from a still image, as of now you can do the following:
// Find the grid item(s) for the still image.
ImageFileReaderInterface::IdVector gridItemIds;
reader.getItemListByType(contextId, "grid", gridItemIds);
// Find the items that describe the grid item via a "cdsc" reference (the Exif item).
ImageFileReaderInterface::IdVector cdscItemIds;
reader.getReferencedToItemListByType(contextId, gridItemIds.at(0), "cdsc", cdscItemIds);
// Read the raw metadata blob.
ImageFileReaderInterface::DataVector data;
reader.getItemData(contextId, cdscItemIds.at(0), data);
Then you need to decode the Exif data. You can easily use the ExifTool CLI or an API like Exiv2.
I'm trying to deinterlace video with FFmpeg in my C++ program.
First of all, I used avpicture_deinterlace, but it is deprecated.
Looking for more information, I tried avfilter_get_by_name("yadif") after avfilter_register_all(), but it always returns NULL. I've tried the code below too, but it still doesn't work. I've tried different parameters in the avfilter_init_str function, but err is always less than 0, which means there is an error.
int err;
// Register all built-in filters
avfilter_register_all();
// Find the yadif filter
AVFilter *yadif_filter = avfilter_get_by_name("buffer");
AVFilterContext *filter_ctx;
// Create the filter context with yadif filter
avfilter_open(&filter_ctx, yadif_filter, NULL);
// Init the yadif context with "1:-1" option
err = avfilter_init_str(filter_ctx, "\"yadif=1:-1\"");
I know the filtering_video.c example is a good starting point to understand how to build a filter, but I don't want to build one; I only need to use the yadif deinterlacing filter. I have the AVFrame, but I don't know how to apply the yadif filter to it.
Any help would be welcome.
In older FFmpeg releases, yadif was only compiled if the --enable-gpl configure option was used. You probably need to update to a later release or re-compile the old release with --enable-gpl.
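Once yadif is available, you don't need to write a filter yourself; you can drive the existing one through libavfilter, much like filtering_video.c does. A condensed sketch under those assumptions (decCtx is your open decoder context, timeBase the stream time base, decodedFrame/filteredFrame are AVFrames you allocate; error checks omitted):
extern "C" {
#include <libavfilter/avfilter.h>
#include <libavfilter/buffersrc.h>
#include <libavfilter/buffersink.h>
#include <libavutil/mem.h>
}
avfilter_register_all();
AVFilterGraph *graph = avfilter_graph_alloc();
AVFilterContext *srcCtx = NULL, *sinkCtx = NULL;
// "buffer" feeds decoded AVFrames into the graph, "buffersink" pulls the
// deinterlaced frames back out.
char args[512];
snprintf(args, sizeof(args),
         "video_size=%dx%d:pix_fmt=%d:time_base=%d/%d:pixel_aspect=%d/%d",
         decCtx->width, decCtx->height, decCtx->pix_fmt,
         timeBase.num, timeBase.den,
         decCtx->sample_aspect_ratio.num, decCtx->sample_aspect_ratio.den);
avfilter_graph_create_filter(&srcCtx, avfilter_get_by_name("buffer"), "in", args, NULL, graph);
avfilter_graph_create_filter(&sinkCtx, avfilter_get_by_name("buffersink"), "out", NULL, NULL, graph);
// Describe the chain in -> yadif=1:-1 -> out and let libavfilter wire it up.
AVFilterInOut *outputs = avfilter_inout_alloc();
AVFilterInOut *inputs = avfilter_inout_alloc();
outputs->name = av_strdup("in");  outputs->filter_ctx = srcCtx;  outputs->pad_idx = 0; outputs->next = NULL;
inputs->name  = av_strdup("out"); inputs->filter_ctx  = sinkCtx; inputs->pad_idx = 0; inputs->next  = NULL;
avfilter_graph_parse_ptr(graph, "yadif=1:-1", &inputs, &outputs, NULL);
avfilter_graph_config(graph, NULL);
// For every decoded frame: push it in, then drain the sink.
av_buffersrc_add_frame_flags(srcCtx, decodedFrame, AV_BUFFERSRC_FLAG_KEEP_REF);
while (av_buffersink_get_frame(sinkCtx, filteredFrame) >= 0) {
    // ... use filteredFrame ...
    av_frame_unref(filteredFrame);
}
Note the option string passed to the parser is plain yadif=1:-1, without the extra escaped quotes from the question.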
I am trying to load PNGs into an HBITMAP. I found this post from Stack Overflow. When I run the code I get REGDB_E_CLASSNOTREG from CoCreateInstance(CLSID_WICPngDecoder, NULL, CLSCTX_INPROC_SERVER, __uuidof(ipDecoder), reinterpret_cast<void**>(&ipDecoder)). I am using Visual Studio 2012 RC and I've called CoInitialize, yet I still get the same error. What could possibly be wrong?
The WIC API suggests that you create the decoder from a factory rather than instantiating it directly via its CLSID. See the IWICImagingFactory interface and the sample code there.
You have options to create it from a file, from a stream, or by specifying a container format GUID.
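A minimal sketch of the factory route (the file name is a placeholder; error handling trimmed):
#include <wincodec.h>
IWICImagingFactory *pFactory = NULL;
IWICBitmapDecoder *pDecoder = NULL;
IWICBitmapFrameDecode *pFrame = NULL;
// Create the WIC factory instead of instantiating a specific decoder CLSID.
HRESULT hr = CoCreateInstance(CLSID_WICImagingFactory, NULL, CLSCTX_INPROC_SERVER,
                              IID_PPV_ARGS(&pFactory));
// Let WIC pick the matching decoder (PNG here) from the file contents.
if (SUCCEEDED(hr))
    hr = pFactory->CreateDecoderFromFilename(L"image.png", NULL, GENERIC_READ,
                                             WICDecodeMetadataCacheOnDemand, &pDecoder);
if (SUCCEEDED(hr))
    hr = pDecoder->GetFrame(0, &pFrame);
// ... convert the frame to 32bpp BGRA and build the HBITMAP as in the linked post.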