DirectX11 two-window rendering - C++

How can I render my objects with DirectX into two separate windows?

You need to create one swap chain and one render target view for every window.
If you created your device via D3D11CreateDeviceAndSwapChain, you need to obtain an IDXGIFactory first:
IDXGIDevice * device;
d3ddevice->QueryInterface(__uuidof(IDXGIDevice), (void**)&device);
IDXGIAdapter * adapter;
device->GetParent(__uuidof(IDXGIAdapter), (void**)&adapter);
IDXGIFactory * factory;
adapter->GetParent(__uuidof(IDXGIFactory), (void**)&factory);
With the IDXGIFactory you can create an additional swap chain for the new window:
factory->CreateSwapChain(g_pd3dDevice, &sd, &g_pSwapChain2);
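Here sd is the DXGI_SWAP_CHAIN_DESC describing the second window. A minimal sketch of how it might be filled, assuming g_hWnd2, width2 and height2 stand for the second window's handle and client size (these names are placeholders, not from the original code):
DXGI_SWAP_CHAIN_DESC sd = {};
sd.BufferCount = 1;
sd.BufferDesc.Width = width2;                       // client width of the second window
sd.BufferDesc.Height = height2;                     // client height of the second window
sd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
sd.BufferDesc.RefreshRate.Numerator = 60;
sd.BufferDesc.RefreshRate.Denominator = 1;
sd.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
sd.OutputWindow = g_hWnd2;                          // HWND of the second window
sd.SampleDesc.Count = 1;
sd.SampleDesc.Quality = 0;
sd.Windowed = TRUE;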
Then create a render target view:
ID3D11Texture2D* pBackBuffer = NULL;
hr = g_pSwapChain2->GetBuffer( 0, __uuidof( ID3D11Texture2D ), ( LPVOID* )&pBackBuffer );
if( FAILED( hr ) )
return hr;
hr = g_pd3dDevice->CreateRenderTargetView( pBackBuffer, NULL, &g_pRenderTargetView );
pBackBuffer->Release();
if( FAILED( hr ) )
return hr;
And finally, set your render target(s) and draw something!
g_immediateContext->OMSetRenderTargets(1, &g_pRenderTargetView, NULL);
....
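To tie it together, here is a minimal per-frame sketch of driving both windows; it assumes each window has its own swap chain and render target view, and g_pSwapChain1/g_pRenderTargetView1 and g_pSwapChain2/g_pRenderTargetView2 are illustrative names rather than variables from the snippets above:
void RenderBothWindows()
{
    const float clearColor[4] = { 0.0f, 0.2f, 0.4f, 1.0f };

    // Window 1: bind its render target view, draw, then present its swap chain.
    // (Remember to set a viewport matching each window's client size.)
    g_immediateContext->OMSetRenderTargets(1, &g_pRenderTargetView1, NULL);
    g_immediateContext->ClearRenderTargetView(g_pRenderTargetView1, clearColor);
    // ... issue draw calls for window 1 ...
    g_pSwapChain1->Present(0, 0);

    // Window 2: same pattern with the second swap chain and render target view.
    g_immediateContext->OMSetRenderTargets(1, &g_pRenderTargetView2, NULL);
    g_immediateContext->ClearRenderTargetView(g_pRenderTargetView2, clearColor);
    // ... issue draw calls for window 2 ...
    g_pSwapChain2->Present(0, 0);
}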
I hope this has been helpful.
Best regards, Quest :)

Related

following "What's a Creel?"s tutorial: Can't create a IWICBitmapDecoder, in visual studio 2015

I've been following "what's a Creel?"'s tutorial for direct 2d. I got to tutorial 8: 'Loading an image'. I didn't have the spritesheet object save the pointer to the Graphics object as this caused problems with this version of visual studio, so it's passed every time something needing it is called. main point: when I try creating a IWICBitmapDecoder with the wicfactory->CreateDecoderFromFile() method, I get the following error:
Exception thrown at 0x008C70A7 in Project8.exe: 0xC0000005: Access violation reading location 0x00000000.
and in the Autos I get:
hr | E_NOINTERFACE No such interface supported.
this | 0x00c1a5c8 {bmp=0x00000000 <NULL> } spritesheet *
wicfactory | 0x00000000<NULL>
wicdecoder | 0xcccccccc{...}
the code being this:
#pragma once
#include <wincodec.h> //include windowscodecs.lib in the linker input
#include "Graphics.h"
#include <d2d1.h>
#include<string>
class spritesheet {
public:
ID2D1Bitmap* bmp;
spritesheet() {}
spritesheet(LPCWSTR file, graphics* gfx) {
//this->gfx = gfx;
//bmp = NULL;
HRESULT hr;
//create an image factory
IWICImagingFactory *wicFactory;
hr = CoCreateInstance(
CLSID_WICImagingFactory,
NULL,
CLSCTX_INPROC_SERVER,
CLSID_WICImagingFactory,
(LPVOID*)&wicFactory
);
//create a decoder
IWICBitmapDecoder *wicdecoder;
hr = wicFactory->CreateDecoderFromFilename(
file,
NULL,
GENERIC_READ,
WICDecodeMetadataCacheOnLoad,
&wicdecoder
);
IWICBitmapFrameDecode* wicframe = NULL;
hr = wicdecoder->GetFrame(0, &wicframe);
IWICFormatConverter *wicconverter = NULL;
hr = wicFactory->CreateFormatConverter(&wicconverter);
hr = wicconverter->Initialize(
wicframe,
GUID_WICPixelFormat32bppPBGRA,
WICBitmapDitherTypeNone,
NULL,
0.0,
WICBitmapPaletteTypeCustom
);
gfx->gettarget()->CreateBitmapFromWicBitmap(
wicconverter,
NULL,
&bmp
);
if (wicdecoder) wicdecoder->Release();
if (wicFactory) wicFactory->Release();
if (wicconverter) wicconverter->Release();
if (wicframe) wicframe->Release();
}
void init(wchar_t * file, graphics * gfx) {
//this->gfx = gfx;
//bmp = NULL;
HRESULT hr;
//create an image factory
IWICImagingFactory* wicFactory;
hr = CoCreateInstance(
CLSID_WICImagingFactory,
NULL,
CLSCTX_INPROC_SERVER,
CLSID_WICImagingFactory,
(LPVOID*)&wicFactory
);
//create a decoder
IWICBitmapDecoder* wicdecoder;
hr = wicFactory->CreateDecoderFromFilename(
file,
NULL,
GENERIC_READ,
WICDecodeMetadataCacheOnLoad,
&wicdecoder
);
IWICBitmapFrameDecode* wicframe = NULL;
hr = wicdecoder->GetFrame(0, &wicframe);
IWICFormatConverter *wicconverter = NULL;
hr = wicFactory->CreateFormatConverter(&wicconverter);
hr = wicconverter->Initialize(
wicframe,
GUID_WICPixelFormat32bppPBGRA,
WICBitmapDitherTypeNone,
NULL,
0.0,
WICBitmapPaletteTypeCustom
);
gfx->rendertarget->CreateBitmapFromWicBitmap(
wicconverter,
NULL,
&bmp
);
if (wicdecoder) wicdecoder->Release();
if (wicFactory) wicFactory->Release();
if (wicconverter) wicconverter->Release();
if (wicframe) wicframe->Release();
gfx->rendertarget->DrawBitmap(
bmp,
D2D1::RectF(0.0f, 0.0f, 10, 10), //dest rect
1.0f,
D2D1_BITMAP_INTERPOLATION_MODE::D2D1_BITMAP_INTERPOLATION_MODE_NEAREST_NEIGHBOR, //effect for scaling
D2D1::RectF(0, 0, 10, 10)); //scource rect
}
void draw(graphics *gfx) {
gfx->rendertarget->DrawBitmap(
bmp,
D2D1::RectF(0.0f, 0.0f, 10, 10), //dest rect
1.0f,
D2D1_BITMAP_INTERPOLATION_MODE::D2D1_BITMAP_INTERPOLATION_MODE_NEAREST_NEIGHBOR, //effect for scaling
D2D1::RectF(0, 0, 10, 10)); //scource rect
}
};
Now, just to test things, I put an ID2D1Bitmap* bmp; at the start of each method just to see how far things got, but the wicdecoder error message just changed to a random place in memory.
Found the issue. CoCreateInstance() itself was fine; the problem was how it was being called (the riid parameter must be IID_IWICImagingFactory, not CLSID_WICImagingFactory), plus a few other problems with his code related to it being outdated.
Now, before I begin, I must say this was partly taken from: https://msdn.microsoft.com/en-us/library/windows/desktop/dd756686(v=vs.85).aspx
That page still gives no usable example of the CoCreateInstance() call, and it requires the boilerplate code from that tutorial. Also, the tutorial's code is from 2008 and its architecture grows the app's memory usage every time you resize the window... but I digress. Main point: DON'T use that tutorial's program example as a model for your own programs, though it does give a good example of how to declare/initialize the proper WIC and D2D objects. The Microsoft documentation for DX12 is also out of date, but this is what I will clarify here.
The correct implementation of CoCreateInstance(), and the correct way to load an image, is:
void spritesheet::load(PCWSTR uri, ID2D1HwndRenderTarget * gfx) {
IWICBitmapDecoder *pDecoder = NULL;
IWICBitmapFrameDecode *pSource = NULL;
IWICStream *pStream = NULL;
IWICFormatConverter *pConverter = NULL;
IWICBitmapScaler *pScaler = NULL;
IWICImagingFactory *wicFactory = NULL;
CoCreateInstance(
CLSID_WICImagingFactory,
NULL,
CLSCTX_INPROC_SERVER,
IID_IWICImagingFactory,
(LPVOID*)&wicFactory);
HRESULT hr = wicFactory->CreateDecoderFromFilename(
uri,
NULL,
GENERIC_READ,
WICDecodeMetadataCacheOnLoad,
&pDecoder
);
if (SUCCEEDED(hr)) {
// Create the initial frame.
hr = pDecoder->GetFrame(0, &pSource);
}
if (SUCCEEDED(hr)) {
// Convert the image format to 32bppPBGRA
// (DXGI_FORMAT_B8G8R8A8_UNORM + D2D1_ALPHA_MODE_PREMULTIPLIED).
hr = wicFactory->CreateFormatConverter(&pConverter);
}
if (SUCCEEDED(hr)) {
hr = pConverter->Initialize(
pSource,
GUID_WICPixelFormat32bppPBGRA,
WICBitmapDitherTypeNone,
NULL,
0.f,
WICBitmapPaletteTypeMedianCut
);
}
if (SUCCEEDED(hr))
{
// Create a Direct2D bitmap from the WIC bitmap.
hr = gfx->CreateBitmapFromWicBitmap(
pConverter,
NULL,
&bmp
);
}
if (pDecoder) {
pDecoder->Release();
}
if (pSource) {
pSource->Release();
}
if (pConverter) {
pConverter->Release();
}
if (wicFactory) {
wicFactory->Release(); // also release the factory, otherwise it leaks
}
}
Note:
uri is simply the file name, as a PCWSTR, passed to the function as L"name.bmp".
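For example, a hypothetical call might look like this (sheet and pRenderTarget are illustrative names; pRenderTarget is the ID2D1HwndRenderTarget* your graphics wrapper owns):
spritesheet sheet;
sheet.load(L"name.bmp", pRenderTarget); // uri is just the file name as a wide-string literal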
The correct way to draw an image is:
void spritesheet::draw(ID2D1HwndRenderTarget * gfx, float bx, float by, float rx, float ry, float dx, float dy) {
D2D1_SIZE_F size = bmp->GetSize();
D2D1_POINT_2F upperLeftCorner = D2D1::Point2F(dx, dy);
gfx->DrawBitmap(
bmp,
D2D1::RectF(
upperLeftCorner.x,
upperLeftCorner.y,
upperLeftCorner.x + rx,
upperLeftCorner.y + ry),
1,
D2D1_BITMAP_INTERPOLATION_MODE::D2D1_BITMAP_INTERPOLATION_MODE_NEAREST_NEIGHBOR,
D2D1::RectF(
bx,
by,
rx,
ry)
);
}
Now, the rest of the D2D documentation is not up to date, but it still gives you a good idea of what's out there to use. Some of the objects don't exist anymore (D2D1::Rect has been canned, and now you only have D2D1::RectF), and some of the documentation and tutorials are rather disjointed and untested, but if you dig long enough you can create a game. Just remember to run the x86/x64 redistributable packages as part of your installer, written in C# (because C++ won't run natively on Windows anymore, and C# is half as fast), and your C++ program will run where you need it to.
I'm trying the solution you gave here, but no luck: wicFactory is still null.
D2DGauge* d2dg = reinterpret_cast<D2DGauge*>(args);
HRESULT hr;
bmp = nullptr;
IWICBitmapDecoder *pDecoder = NULL;
IWICBitmapFrameDecode *pSource = NULL;
IWICFormatConverter *pConverter = NULL;
//Creating a factory
IWICImagingFactory *wicFactory = NULL;
CoCreateInstance(
CLSID_WICImagingFactory,
NULL,
CLSCTX_INPROC_SERVER,
IID_IWICImagingFactory,
(LPVOID*)&wicFactory);
MessageBox(NULL, (LPCWSTR)wicFactory, NULL, MB_OK);
//Creating a decoder
hr = wicFactory->CreateDecoderFromFilename(
filename,
NULL,
GENERIC_READ,
WICDecodeMetadataCacheOnLoad,
&pDecoder);
//Read Frame from image
if (SUCCEEDED(hr)) {
// Create the initial frame.
hr = pDecoder->GetFrame(0, &pSource);
}
//Creating a Converter
if (SUCCEEDED(hr)) {
// Convert the image format to 32bppPBGRA
// (DXGI_FORMAT_B8G8R8A8_UNORM + D2D1_ALPHA_MODE_PREMULTIPLIED).
hr = wicFactory->CreateFormatConverter(&pConverter);
}
//Setup the converter
if (SUCCEEDED(hr)) {
hr = pConverter->Initialize(
pSource,
GUID_WICPixelFormat32bppPBGRA,
WICBitmapDitherTypeNone,
NULL,
0.f,
WICBitmapPaletteTypeMedianCut
);
}
//Use the converter to create an D2D1Bitmap
//ID2D1Bitmap* bmp;
if (SUCCEEDED(hr))
{
hr = d2dg->pRT->CreateBitmapFromWicBitmap(
pConverter,
NULL,
&bmp
);
}
if (wicFactory)wicFactory->Release();
if (pDecoder)pDecoder->Release();
if (pConverter)pConverter->Release();
if (pSource)pSource->Release();
How could I make it not a nullptr?

Direct2D Loading and Drawing Bitmap

I'm trying to draw a bitmap to the screen; the code that loads the bitmap from a file works, but the actual drawing part seems to be crashing the program.
I followed a tutorial from MSDN; the only change I made is that instead of an ID2D1RenderTarget I'm using an ID2D1DCRenderTarget.
Here's the Load method that works:
HRESULT LoadBitmapFromFile(
ID2D1DCRenderTarget *pRenderTarget,
IWICImagingFactory *pIWICFactory,
PCWSTR uri,
UINT destinationWidth,
UINT destinationHeight,
ID2D1Bitmap **ppBitmap
)
{
IWICBitmapDecoder *pDecoder = NULL;
IWICBitmapFrameDecode *pSource = NULL;
IWICStream *pStream = NULL;
IWICFormatConverter *pConverter = NULL;
IWICBitmapScaler *pScaler = NULL;
HRESULT hr = pIWICFactory->CreateDecoderFromFilename(
uri,
NULL,
GENERIC_READ,
WICDecodeMetadataCacheOnLoad,
&pDecoder
);
if (SUCCEEDED(hr))
{
// Create the initial frame.
hr = pDecoder->GetFrame(0, &pSource);
}
if (SUCCEEDED(hr))
{
// Convert the image format to 32bppPBGRA
// (DXGI_FORMAT_B8G8R8A8_UNORM + D2D1_ALPHA_MODE_PREMULTIPLIED).
hr = pIWICFactory->CreateFormatConverter(&pConverter);
}
if (SUCCEEDED(hr))
{
// If a new width or height was specified, create an
// IWICBitmapScaler and use it to resize the image.
if (destinationWidth != 0 || destinationHeight != 0)
{
UINT originalWidth, originalHeight;
hr = pSource->GetSize(&originalWidth, &originalHeight);
if (SUCCEEDED(hr))
{
if (destinationWidth == 0)
{
FLOAT scalar = static_cast<FLOAT>(destinationHeight) / static_cast<FLOAT>(originalHeight);
destinationWidth = static_cast<UINT>(scalar * static_cast<FLOAT>(originalWidth));
}
else if (destinationHeight == 0)
{
FLOAT scalar = static_cast<FLOAT>(destinationWidth) / static_cast<FLOAT>(originalWidth);
destinationHeight = static_cast<UINT>(scalar * static_cast<FLOAT>(originalHeight));
}
hr = pIWICFactory->CreateBitmapScaler(&pScaler);
if (SUCCEEDED(hr))
{
hr = pScaler->Initialize(
pSource,
destinationWidth,
destinationHeight,
WICBitmapInterpolationModeCubic
);
}
if (SUCCEEDED(hr))
{
hr = pConverter->Initialize(
pScaler,
GUID_WICPixelFormat32bppPBGRA,
WICBitmapDitherTypeNone,
NULL,
0.f,
WICBitmapPaletteTypeMedianCut
);
}
}
}
else // Don't scale the image.
{
hr = pConverter->Initialize(
pSource,
GUID_WICPixelFormat32bppPBGRA,
WICBitmapDitherTypeNone,
NULL,
0.f,
WICBitmapPaletteTypeMedianCut
);
}
}
if (SUCCEEDED(hr))
{
// Create a Direct2D bitmap from the WIC bitmap.
hr = pRenderTarget->CreateBitmapFromWicBitmap(
pConverter,
NULL,
ppBitmap
);
}
if (pDecoder) pDecoder->Release();
if (pSource) pSource->Release();
if (pStream) pStream->Release();
if (pScaler) pScaler->Release();
return hr;
}
And in between the BeginDraw() and EndDraw() for the render target, I have the following code to draw the bitmap. This is the part that crashes the program when I attempt to run it:
d2dg->pRT->BeginDraw();
// Create a WIC Factory
HRESULT hr = CoCreateInstance(
CLSID_WICImagingFactory,
NULL,
CLSCTX_INPROC_SERVER,
IID_IWICImagingFactory,
(LPVOID*)&wicFactory);
hr = LoadBitmapFromFile(d2dg->pRT, wicFactory, L"image1.png", 400, 400, &pBitmap);
D2D1_SIZE_F size = pBitmap->GetSize();
D2D1_POINT_2F upperLeftCorner = D2D1::Point2F(100.f, 10.f);
// Draw a bitmap.
d2dg->pRT->DrawBitmap(
pBitmap,
D2D1::RectF(
upperLeftCorner.x,
upperLeftCorner.y,
upperLeftCorner.x + size.width,
upperLeftCorner.y + size.height)
);
d2dg->pRT->EndDraw();
Found the issue: the bitmap I wanted to draw was still NULL.
This can be fixed by removing the ID2D1Bitmap **ppBitmap parameter from LoadBitmapFromFile and working directly with pBitmap (and &pBitmap) for any bitmap references inside the method.
That way pBitmap actually gets assigned and no longer stays NULL.
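A minimal sketch of what that change might look like, assuming pBitmap is an ID2D1Bitmap* member (or global) that the drawing code uses directly; names mirror the snippets above and error handling is trimmed to the essentials:
// Hypothetical reduced LoadBitmapFromFile that fills the shared pBitmap
// directly instead of going through an out-parameter.
extern ID2D1Bitmap *pBitmap; // assumed to be defined elsewhere (member or global)
HRESULT LoadBitmapIntoMember(ID2D1DCRenderTarget *pRenderTarget,
                             IWICImagingFactory *pIWICFactory,
                             PCWSTR uri)
{
    IWICBitmapDecoder *pDecoder = NULL;
    IWICBitmapFrameDecode *pSource = NULL;
    IWICFormatConverter *pConverter = NULL;
    HRESULT hr = pIWICFactory->CreateDecoderFromFilename(
        uri, NULL, GENERIC_READ, WICDecodeMetadataCacheOnLoad, &pDecoder);
    if (SUCCEEDED(hr)) hr = pDecoder->GetFrame(0, &pSource);
    if (SUCCEEDED(hr)) hr = pIWICFactory->CreateFormatConverter(&pConverter);
    if (SUCCEEDED(hr))
        hr = pConverter->Initialize(pSource, GUID_WICPixelFormat32bppPBGRA,
                                    WICBitmapDitherTypeNone, NULL, 0.f,
                                    WICBitmapPaletteTypeMedianCut);
    if (SUCCEEDED(hr))
        hr = pRenderTarget->CreateBitmapFromWicBitmap(pConverter, NULL, &pBitmap);
    if (pDecoder) pDecoder->Release();
    if (pSource) pSource->Release();
    if (pConverter) pConverter->Release();
    return hr; // check this before calling DrawBitmap with pBitmap
}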

GmfBridge doesn't connect sink filter with source filter

I am trying to use the GmfBridge library to dynamically change source filters from cam to file and vice versa. All functions return S_OK (well, almost all: pMediaControlOutput->Run() actually returns S_FALSE, but MSDN says that can be OK too), so I assumed everything was fine, yet no data is transferred to the other side of the bridge. I use GraphStudio to connect to the remote graph and everything looks right: all filters in both graphs are connected as they should be. But while playing I receive the following messages in the bridge log file:
0: Started 2011-09-10 15:58:38
0: Buffer minimum 100
0: Added stream 1: 畡楤o, Decompressed Only, Discard mode
1: Sink 0x7ae0ca8 has 1 pins
1: Sink filter 0x7ae0cf8 in graph 0x193bf0
2: Source 0x7ae1030 has 1 pins
2: Source filter 0x7ae1080 in graph 0x194c18
14: ReceiveConnection Aware: false
14: Bridging 0x194c18 to 0x193bf0
14: Pin 0x7ae3438 disconnect
25: Source 0x7ae1030 pause from 0
25: Source pin 0x7ae3618 active
2234: Pin 0x7ae3438 receive 0x721ec68
2234: Sink pin 0x7ae3438 disconnected: discarding 0x721ec68
3389: Pin 0x7ae3438 receive 0x721ec68
3389: Sink pin 0x7ae3438 disconnected: discarding 0x721ec68
3940: Pin 0x7ae3438 receive 0x721ec68
3940: Sink pin 0x7ae3438 disconnected: discarding 0x721ec68
4440: Pin 0x7ae3438 receive 0x721ec68
So as you can see, the left graph isn't connected to the right one even though BridgeGraphs() returned S_OK, and the media samples are discarded. My code is below. Where am I going wrong?
// Create graphs
HRESULT hr = m_graphInput.CreateInstance(CLSID_FilterGraph);
ATLASSERT( SUCCEEDED( hr ) );
hr = m_graphOutput.CreateInstance(CLSID_FilterGraph);
ATLASSERT( SUCCEEDED( hr ) );
// Get IMediaControl interfaces
hr = m_graphInput.QueryInterface( IID_IMediaControl, (void**)&pMediaControlInput );
ATLASSERT( SUCCEEDED( hr ) );
hr = m_graphOutput.QueryInterface( IID_IMediaControl, (void**)&pMediaControlOutput );
ATLASSERT( SUCCEEDED( hr ) );
// Get builder interfaces
hr = m_graphInput.QueryInterface( IID_IGraphBuilder, (void**)&pBuilderInput );
ATLASSERT( SUCCEEDED( hr ) );
hr = m_graphOutput.QueryInterface( IID_IGraphBuilder, (void**)&pBuilderOutput );
ATLASSERT( SUCCEEDED( hr ) );
// Load source filter (on sink side)
LocateFilter( SOURCE_FILTER_NAME, CLSID_LegacyAmFilterCategory, &inputDevice );
hr = m_graphInput->AddFilter( inputDevice, SOURCE_FILTER_NAME );
ATLASSERT( SUCCEEDED( hr ) );
// Load render filter (on bridge's source side)
LocateFilter( _T( "Default DirectSound Device" ), CLSID_AudioRendererCategory, &audioOutputPreview );
hr = m_graphOutput->AddFilter( audioOutputPreview, _T( "Default DirectSound Device" ) );
ATLASSERT( SUCCEEDED( hr ) );
// Init bridge
bridge.CreateInstance( __uuidof(GMFBridgeController) );
hr = bridge->SetBufferMinimum( 100 );
ATLASSERT( SUCCEEDED( hr ) );
hr = bridge->AddStream( false, eUncompressed, false );
ATLASSERT( SUCCEEDED( hr ) );
// Add sink filter and connect to input graph
IUnknownPtr pSinkFilter;
{
hr = bridge->InsertSinkFilter( m_graphInput, (IUnknown**)&pSinkFilter );
ATLASSERT( SUCCEEDED( hr ) );
// Using own functions get pins
IPin* pInAudio = CPinController::getOutputPin( inputDevice, _T("Audio"));
IPin* pOutAudio = CPinController::getInputPin( pSinkFilter );
hr = pBuilderInput->Connect( pOutAudio, pInAudio );
ATLASSERT( SUCCEEDED( hr ) );
}
// Add output filter and connect to output graph
IUnknownPtr pFeederFilter;
{
hr = bridge->InsertSourceFilter( pSinkFilter, m_graphOutput, &pFeederFilter );
ATLASSERT( SUCCEEDED( hr ) );
// Get pins
IPin* pInAudio = CPinController::getOutputPin( pFeederFilter/*, _T("Audio")*/);
IPin* pOutAudio = CPinController::getInputPin( audioOutputPreview );
hr = pBuilderOutput->Connect( pInAudio, pOutAudio );
ATLASSERT( SUCCEEDED( hr ) );
}
// Run left
hr = pMediaControlInput->Run();
ATLASSERT( SUCCEEDED( hr ) );
// Run right
hr = pMediaControlOutput->Run();
ATLASSERT( SUCCEEDED( hr ) );
hr = bridge->BridgeGraphs( m_graphOutput, m_graphInput );
ATLASSERT( SUCCEEDED( hr ) );
It's really ridiculous, but a few minutes ago, after a day of searching, we found the answer. It all came down to a really huge hole in GmfBridge. I was passing the wrong interfaces here (the graphs instead of the bridge's sink and source filters), which went unnoticed because the function takes IUnknown pointers:
hr = bridge->BridgeGraphs( m_graphOutput, m_graphInput );
And the GmfBridge library doesn't handle this situation properly: there is no "else" branch to report the error, so the function just returns hr, which was initialized to S_OK at the start:
HRESULT STDMETHODCALLTYPE BridgeGraphs(
/* [in] */ IUnknown *pSourceGraphSinkFilter,
/* [in] */ IUnknown *pRenderGraphSourceFilter)
{
HRESULT hr = S_OK;
...
// if we are not given both filters, then
// we need do nothing
IBridgeSinkPtr pSink = pSourceGraphSinkFilter;
IBridgeSourcePtr pSource = pRenderGraphSourceFilter;
if ((pSink != NULL) && (pSource != NULL))
{
...
}
return hr;
}
So as you can see, it claims that nothing is wrong and then simply does nothing! I think it's a good idea to notify the library's authors about this bug.
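For reference, a minimal sketch of what the corrected call looks like, using the pSinkFilter and pFeederFilter variables from the setup code above (i.e. passing the bridge's filters rather than the graphs):
// Pass the sink filter from the input graph and the source filter
// from the output graph, not the graphs themselves.
hr = bridge->BridgeGraphs( pSinkFilter, pFeederFilter );
ATLASSERT( SUCCEEDED( hr ) );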
Hope this info will help someone.

ITuner::put_TuneRequest() call ignored

I have a DirectShow graph with a "Microsoft DVBT Network Provider", "AVerMedia BDA DVBT Tuner", "AVerMEdia BDA Digital Capture", "Sample Grabber" and "NULL Renderer".
These filters are connected.
Besides that, I also have an "MPEG-2 Demultiplexer" and a "BDA MPEG2 Transport Information Filter", but these two filters are NOT connected! It seems like they have to be present in order to run the graph.
When I start the graph I'm receiving TS data, but no matter what I do I'm not able to put the tune request. I can only capture the MUX data from the frequency that was last tuned by some other application like Windows Media Center.
Here is the code for putting the tune request:
// creating tuning space
CComPtr<IDVBTuningSpace> pDVBTuningSpace;
hr = pDVBTuningSpace.CoCreateInstance( __uuidof( DVBTuningSpace ) );
WCHAR szFriendlyName[ 64 ] = L"Local DVB-T Digital Antenna";
BSTR bstrFriendlyName = SysAllocString( szFriendlyName );
hr = pDVBTuningSpace->put_UniqueName( bstrFriendlyName );
hr = pDVBTuningSpace->put_FriendlyName( bstrFriendlyName );
SysFreeString( bstrFriendlyName );
CComBSTR clsid_dvbt = ("{216C62DF-6D7F-4e9a-8571-05F14EDB766A}");
hr = pDVBTuningSpace->put_NetworkType( clsid_dvbt );
hr = pDVBTuningSpace->put_SystemType( DVB_Terrestrial );
// creating tune request
CComPtr<ITuneRequest> pTuneRequest;
hr = pDVBTuningSpace->CreateTuneRequest( &pTuneRequest );
CComQIPtr<IDVBTuneRequest> pDVBTuneRequest( pTuneRequest );
hr = pDVBTuneRequest->put_ONID( -1 );
hr = pDVBTuneRequest->put_TSID( -1 );
hr = pDVBTuneRequest->put_SID( -1 );
// locator
CComPtr<IDVBTLocator> pDVBTLocator;
hr = pDVBTLocator.CoCreateInstance( __uuidof( DVBTLocator ) );
hr = pDVBTLocator->put_Bandwidth( 8 );
hr = pDVBTLocator->put_CarrierFrequency( 506000 );
hr = pDVBTuneRequest->put_Locator( pDVBTLocator );
CComQIPtr<ITuner> pTuner( pNetworkProvider_ );
hr = pTuner->put_TuneRequest( pDVBTuneRequest );
This is executed immediately after adding the "Microsoft DVBT Network Provider" filter to the graph.
All "hr" values from the above code are S_OK.
What am I doing wrong? Or did I miss something big in this "tune request" thing?
(The bandwidth and frequency values are correct.)
I think put_Bandwidth( 8 ) is wrong; it should be a bandwidth in Hz. Anyway, here is some code I use. Maybe it helps.
HRESULT hr;
CComBSTR TuningName;
hr = pDVBTuningSpace2.CoCreateInstance(CLSID_DVBTuningSpace);
hr = pDVBTuningSpace2->put_SystemType(DVB_Terrestrial);
TuningName = L"My DVB-T";
hr = pDVBTuningSpace2->put__NetworkType(CLSID_DVBTNetworkProvider);
CComPtr <IDVBTLocator> pDVBTLocator;
hr = pDVBTLocator.CoCreateInstance(CLSID_DVBTLocator);
hr = pDVBTLocator->put_CarrierFrequency(config->GetFreq());
hr = pDVBTLocator->put_Bandwidth(config->GetSymbolRate());
hr = pDVBTuningSpace2->put_DefaultLocator(pDVBTLocator);
hr = pDVBTuningSpace2->put_UniqueName(TuningName);
hr = pDVBTuningSpace2->put_FriendlyName(TuningName);
hr = pDVBTuningSpace2->put_FrequencyMapping(L"");
CComPtr <ITuningSpaceContainer> pTuningSpaceContainer;
hr = pTuningSpaceContainer.CoCreateInstance(CLSID_SystemTuningSpaces);
VARIANT tiIndex;
hr = pTuningSpaceContainer->Add(pDVBTuningSpace2,&tiIndex);
if (!SUCCEEDED(hr)) {
// Get the enumerator for the collection.
CComPtr<IEnumTuningSpaces> pTuningSpaceEnum;
hr = pTuningSpaceContainer->get_EnumTuningSpaces(&pTuningSpaceEnum);
if (SUCCEEDED(hr)) {
// Loop through the collection.
CComPtr<ITuningSpace> pTuningSpace;
//ITuningSpace *pTuningSpace;
tiIndex.intVal=0;
while (S_OK == pTuningSpaceEnum->Next(1, &pTuningSpace, NULL)) {
USES_CONVERSION;
BSTR Name;
hr = pTuningSpace->get_UniqueName(&Name);
if (SUCCEEDED(hr)) {
if (wcscmp(OLE2W(Name), TuningName) == 0) {
hr = pTuningSpaceContainer->put_Item(tiIndex,pDVBTuningSpace2);
}
SysFreeString(Name);
}
tiIndex.intVal++;
//pTuningSpace->Release();
pTuningSpace.Release();
}
}
}
CComPtr<ITuneRequest> pTuneRequest;
hr = pDVBTuningSpace2->CreateTuneRequest(&pTuneRequest);
CComQIPtr<IDVBTuneRequest> pDVBTuneRequest(pTuneRequest);
if(pDVBTuneRequest) {
hr = pDVBTuneRequest->put_SID(config->GetSid());
hr = pDVBTuneRequest->put_TSID(config->GetTsid());
hr = pDVBTuneRequest->put_ONID(config->GetOnid());
}
GUID CLSIDNetworkType;
hr = pDVBTuningSpace2->get__NetworkType(&CLSIDNetworkType);
hr = CoCreateInstance(CLSIDNetworkType, NULL, CLSCTX_INPROC_SERVER,
IID_IBaseFilter, (void **) &pNetworkProvider);
hr = graph->AddFilter(pNetworkProvider,L"Network Provider");
// Query for ITuner.
CComQIPtr<ITuner> pTuner(pNetworkProvider);
if (pTuner) {
// Submit the tune request to the network provider.
hr = pTuner->put_TuneRequest(pTuneRequest);
}
hr = graph->AddFilter(pBdaNetworkTuner,L"BDA Source");
hr = ConnectFilters(pNetworkProvider,pBdaNetworkTuner);
CComPtr<IBaseFilter> pBdaReceiver;
hr = FindDevice(KSCATEGORY_BDA_RECEIVER_COMPONENT, &pBdaReceiver, 0, 0, 0);
hr = graph->AddFilter(pBdaReceiver,L"BDA Receiver");
hr = ConnectFilters(pBdaNetworkTuner,pBdaReceiver);
CComPtr<IBaseFilter> pMpegDemux;
hr = pMpegDemux.CoCreateInstance(CLSID_MPEG2Demultiplexer);
hr = graph->AddFilter(pMpegDemux,L"MPEG Demux");
hr = ConnectFilters(pBdaReceiver,pMpegDemux);
You are doing some things in a different order, but I'm not sure if it matters.

How to get the width and height of a DirectShow webcam video stream

I found a bit of code that gets me access to the raw pixel data from my webcam. However, I need to know the image width, height, pixel format and, preferably, the data stride (pitch, memory padding, or whatever you want to call it) in case it's ever something other than width * bytes per pixel.
#include <windows.h>
#include <dshow.h>
#pragma comment(lib,"Strmiids.lib")
#define DsHook(a,b,c) if (!c##_) { INT_PTR* p=b+*(INT_PTR**)a; VirtualProtect(&c##_,4,PAGE_EXECUTE_READWRITE,&no);\
*(INT_PTR*)&c##_=*p; VirtualProtect(p, 4,PAGE_EXECUTE_READWRITE,&no); *p=(INT_PTR)c; }
// Here you get the image video data in buf / len. Process it before calling Receive_ because the renderer deallocates it.
HRESULT ( __stdcall * Receive_ ) ( void* inst, IMediaSample *smp ) ;
HRESULT __stdcall Receive ( void* inst, IMediaSample *smp ) {
BYTE* buf; smp->GetPointer(&buf); DWORD len = smp->GetActualDataLength();
//AM_MEDIA_TYPE* info;
//smp->GetMediaType(&info);
HRESULT ret = Receive_ ( inst, smp );
return ret;
}
int WINAPI WinMain(HINSTANCE inst,HINSTANCE prev,LPSTR cmd,int show){
HRESULT hr = CoInitialize(0); MSG msg={0}; DWORD no;
IGraphBuilder* graph= 0; hr = CoCreateInstance( CLSID_FilterGraph, 0, CLSCTX_INPROC,IID_IGraphBuilder, (void **)&graph );
IMediaControl* ctrl = 0; hr = graph->QueryInterface( IID_IMediaControl, (void **)&ctrl );
ICreateDevEnum* devs = 0; hr = CoCreateInstance (CLSID_SystemDeviceEnum, 0, CLSCTX_INPROC, IID_ICreateDevEnum, (void **) &devs);
IEnumMoniker* cams = 0; hr = devs?devs->CreateClassEnumerator (CLSID_VideoInputDeviceCategory, &cams, 0):0;
IMoniker* mon = 0; hr = cams->Next (1,&mon,0); // get first found capture device (webcam?)
IBaseFilter* cam = 0; hr = mon->BindToObject(0,0,IID_IBaseFilter, (void**)&cam);
hr = graph->AddFilter(cam, L"Capture Source"); // add web cam to graph as source
IEnumPins* pins = 0; hr = cam?cam->EnumPins(&pins):0; // we need output pin to autogenerate rest of the graph
IPin* pin = 0; hr = pins?pins->Next(1,&pin, 0):0; // via graph->Render
hr = graph->Render(pin); // graph builder now builds whole filter chain including MJPG decompression on some webcams
IEnumFilters* fil = 0; hr = graph->EnumFilters(&fil); // from all newly added filters
IBaseFilter* rnd = 0; hr = fil->Next(1,&rnd,0); // we find last one (renderer)
hr = rnd->EnumPins(&pins); // because data we are intersted in are pumped to renderers input pin
hr = pins->Next(1,&pin, 0); // via Receive member of IMemInputPin interface
IMemInputPin* mem = 0; hr = pin->QueryInterface(IID_IMemInputPin,(void**)&mem);
DsHook(mem,6,Receive); // so we redirect it to our own proc to grab image data
hr = ctrl->Run();
while ( GetMessage( &msg, 0, 0, 0 ) ) {
TranslateMessage( &msg );
DispatchMessage( &msg );
}
};
Bonus points if you can tell me how to get this thing not to render a window but still give me access to the image data.
That's really ugly. Please don't do that. Insert a pass-through filter like the sample grabber instead (as I replied to your other post on the same topic). Connecting the sample grabber to the null renderer gets you the bits in a clean, safe way without rendering the image.
To get the stride, you need to get the media type, either through ISampleGrabber or IPin::ConnectionMediaType. The format block will be either a VIDEOINFOHEADER or a VIDEOINFOHEADER2 (check the format GUID). The BITMAPINFOHEADER's biWidth and biHeight define the bitmap dimensions (and hence the stride). If the RECT is not empty, that defines the relevant image area within the bitmap.
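As a rough sketch of that approach (illustrative only: it assumes pin is the connected input IPin* you already obtained, and that the stream is uncompressed RGB, so the stride is the DWORD-aligned row size; VIDEOINFOHEADER2 needs <dvdmedia.h>):
// Query the negotiated media type on the connected pin and derive
// the frame dimensions and stride from its format block.
AM_MEDIA_TYPE mt = {};
if (SUCCEEDED(pin->ConnectionMediaType(&mt))) {
    BITMAPINFOHEADER* bih = NULL;
    if (mt.formattype == FORMAT_VideoInfo && mt.cbFormat >= sizeof(VIDEOINFOHEADER))
        bih = &reinterpret_cast<VIDEOINFOHEADER*>(mt.pbFormat)->bmiHeader;
    else if (mt.formattype == FORMAT_VideoInfo2 && mt.cbFormat >= sizeof(VIDEOINFOHEADER2))
        bih = &reinterpret_cast<VIDEOINFOHEADER2*>(mt.pbFormat)->bmiHeader;
    if (bih) {
        LONG width  = bih->biWidth;
        LONG height = bih->biHeight < 0 ? -bih->biHeight : bih->biHeight; // negative = top-down
        LONG stride = ((width * bih->biBitCount + 31) & ~31) / 8;         // rows are DWORD-aligned
        // width, height, stride, and bih->biCompression (pixel format) are the values you asked for
    }
    // Manually free what ConnectionMediaType allocated.
    if (mt.pbFormat) CoTaskMemFree(mt.pbFormat);
    if (mt.pUnk) mt.pUnk->Release();
}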
I'm going to have to wash my hands now after touching this post.
I am sorry for you. When that interface was created, they probably didn't have the best programmers on it.
// Here you get the image video data in buf / len. Process it before calling Receive_ because the renderer deallocates it.
BITMAPINFOHEADER bmpInfo; // current bitmap header info
int stride;
HRESULT ( __stdcall * Receive_ ) ( void* inst, IMediaSample *smp ) ;
HRESULT __stdcall Receive ( void* inst, IMediaSample *smp )
{
BYTE* buf; smp->GetPointer(&buf); DWORD len = smp->GetActualDataLength();
HRESULT ret = Receive_ ( inst, smp );
AM_MEDIA_TYPE* info = NULL;
HRESULT hr = smp->GetMediaType(&info);
if ( hr != S_OK || info == NULL )
{
//TODO: error (GetMediaType returns S_FALSE and a NULL pointer if the type has not changed)
}
else
{
if ( info->formattype == FORMAT_VideoInfo )
{
const VIDEOINFOHEADER * vi = reinterpret_cast<VIDEOINFOHEADER*>( info->pbFormat );
const BITMAPINFOHEADER & bmiHeader = vi->bmiHeader;
//! now bmiHeader.biWidth contains the data stride
stride = bmiHeader.biWidth;
bmpInfo = bmiHeader;
int width = ( vi->rcTarget.right - vi->rcTarget.left );
//! replace the data stride by the actual width
if ( width != 0 )
bmpInfo.biWidth = width;
}
else
{
// unsupported format
}
}
if ( info ) DeleteMediaType( info );
return ret;
}
Here's how to add the Null Renderer that suppresses the rendering window. Add directly after creating the IGraphBuilder*
//create null renderer and add null renderer to graph
IBaseFilter *m_pNULLRenderer; hr = CoCreateInstance(CLSID_NullRenderer, NULL, CLSCTX_INPROC_SERVER, IID_IBaseFilter, (void **)&m_pNULLRenderer);
hr = graph->AddFilter(m_pNULLRenderer, L"Null Renderer");
That dshook hack is the only elegant directshow code of which I am aware.
In my experience, the DirectShow API is a convoluted nightmare, requiring hundreds of lines of code to do even the simplest operation and forcing you to adopt a whole programming paradigm just to access your web camera. So if this code does the job for you, as it did for me, use it and enjoy having fewer lines of code to maintain.