Good afternoon,
I have a barebones Direct3D app that works on a host PC but fails to initialize DirectX when running via Remote Desktop.
I traced the failure to this call:
result = adapterOutput->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, DXGI_ENUM_MODES_INTERLACED, &numModes, NULL);
if(FAILED(result))
{
return false;
}
It fails with:
result = 0x887A0022 (DXGI_ERROR_NOT_CURRENTLY_AVAILABLE): A resource is not available at the time of the call, but may become available later.
The full initialization code is from Rastertek tutorials, found here:
http://www.rastertek.com/dx11tut03.html
Does anyone know a workaround for this problem?
Remote Desktop involves some corner cases; keep in mind it sometimes uses the 'Microsoft Basic Renderer' (a.k.a. the software WARP driver). See this blog post.
You can also guard your use of GetDisplayModeList in the remote scenario by detecting it in the first place. For example, the legacy DXUT sample framework did this in its enumeration code:
// For a remote session, report a single display mode matching the current screen resolution.
if( 0 != GetSystemMetrics( SM_REMOTESESSION) )
{
DEVMODE DevMode = {}; // zero-initialize before use
DevMode.dmSize = sizeof( DEVMODE );
if( EnumDisplaySettings( nullptr, ENUM_CURRENT_SETTINGS, &DevMode ) )
{
NumModes = 1;
pDesc[0].Width = DevMode.dmPelsWidth;
pDesc[0].Height = DevMode.dmPelsHeight;
pDesc[0].Format = DXGI_FORMAT_R8G8B8A8_UNORM;
pDesc[0].RefreshRate.Numerator = 0;
pDesc[0].RefreshRate.Denominator = 0;
pDesc[0].ScanlineOrdering = DXGI_MODE_SCANLINE_ORDER_PROGRESSIVE;
pDesc[0].Scaling = DXGI_MODE_SCALING_CENTERED;
}
}
You also can't use 'full-screen exclusive' mode in remote desktop:
if( GetSystemMetrics(SM_REMOTESESSION) != 0 )
{
sd.Windowed = TRUE;
}
You don't really need to use GetDisplayModeList at all. Just pick a reasonable starting size or start your window 'maximized'. See the directx-vs-templates for an approach that just uses the 'native resolution' of the desktop for both windowed and 'fake full screen'. It all works well over Remote Desktop, too.
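For illustration, here is a minimal sketch of that idea (not the template code itself; the window class name and surrounding setup are placeholders): create the window at the desktop's current resolution and size the swap chain from the resulting client rect.
#include <windows.h>

void CreateGameWindow(HINSTANCE hInstance)
{
    // Use the primary monitor's current resolution; this also works in a
    // remote session, where mode enumeration may fail.
    const int width  = GetSystemMetrics(SM_CXSCREEN);
    const int height = GetSystemMetrics(SM_CYSCREEN);

    // Assumes L"MyWindowClass" was registered earlier (placeholder name).
    HWND hwnd = CreateWindowEx(0, L"MyWindowClass", L"Direct3D App",
                               WS_OVERLAPPEDWINDOW, 0, 0, width, height,
                               nullptr, nullptr, hInstance, nullptr);

    // Alternatively, just start maximized and read the client rect afterwards.
    ShowWindow(hwnd, SW_SHOWMAXIMIZED);

    RECT rc;
    GetClientRect(hwnd, &rc);
    // Create the swap chain with (rc.right - rc.left) x (rc.bottom - rc.top).
}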
Another 'corner-case' with remote desktop is "raw input" for mouse. See the implementation of Mouse from the DirectX Tool Kit.
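One common mitigation, sketched below under the assumption that you register raw input yourself rather than using the Mouse helper (which handles this internally), is to skip raw-input registration in a remote session and fall back to standard mouse messages.
#include <windows.h>

// Sketch: only register for WM_INPUT mouse data on a local console session;
// over Remote Desktop, rely on WM_MOUSEMOVE instead.
bool RegisterMouseRawInput(HWND hwnd)
{
    if (GetSystemMetrics(SM_REMOTESESSION) != 0)
        return false; // remote session: skip raw input, use standard messages

    RAWINPUTDEVICE rid = {};
    rid.usUsagePage = 0x01; // HID_USAGE_PAGE_GENERIC
    rid.usUsage     = 0x02; // HID_USAGE_GENERIC_MOUSE
    rid.dwFlags     = 0;
    rid.hwndTarget  = hwnd;
    return RegisterRawInputDevices(&rid, 1, sizeof(rid)) != FALSE;
}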
Not technically a solution, but the problem was in the refresh-rate initialization; bypassing it with a try{}/catch{} block allowed me to run with a default refresh rate via Remote Desktop. Everything else initialized without issues.
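For reference, the same fallback can be written without exceptions. A sketch against the tutorial's variable names; defaulting the refresh rate to 0/1 matches what the tutorial already does when vsync is off:
// Sketch: fall back to a default refresh rate when mode enumeration fails
// (as it does over RDP) instead of aborting initialization.
unsigned int numerator = 0;    // 0/1 means "default refresh rate", as the
unsigned int denominator = 1;  // tutorial already uses when vsync is off

result = adapterOutput->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM,
                                           DXGI_ENUM_MODES_INTERLACED,
                                           &numModes, NULL);
if (SUCCEEDED(result) && numModes > 0)
{
    // ... allocate the mode list and pick the matching numerator/denominator ...
}
// On failure, keep the 0/1 default; swap-chain creation still succeeds over RDP.

swapChainDesc.BufferDesc.RefreshRate.Numerator = numerator;
swapChainDesc.BufferDesc.RefreshRate.Denominator = denominator;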
Related
I have a problem I am not able to solve. My application should be able to switch the default audio device at runtime. To achieve this I am using XAudio2 from the DirectXTK.
I implemented IMMNotificationClient in my audio class to be able to react to default-device changes.
When the default device changes, I stop the current source voice, reset the audio engine, and start the source voice again. Everything works as expected.
However, when the default device is a USB sound card and I unplug it while the source voice is playing, the application freezes.
The reason is that the source voice hangs when stopping the voice, and sometimes when flushing the source buffers. It seems the source voice can no longer be stopped once the audio device it was using has been removed.
Has anyone had the same problem and managed to solve it?
Here's the function I am using to reset the audio engine.
HRESULT DX::XAudioEngine::ResetAudioDevice()
{
HRESULT hr = S_OK;
this->m_retryAudio = TRUE;
// Stop the voice before tearing it down; guard against a null voice pointer.
if (SUCCEEDED(hr) && m_pSourceVoice)
{
hr = m_pSourceVoice->Stop();
}
if (SUCCEEDED(hr) && m_pSourceVoice)
{
hr = m_pSourceVoice->FlushSourceBuffers();
}
if (m_audEngine && m_pSourceVoice)
{
// Get the source voice back from the smart pointer
IXAudio2SourceVoice* ptmpSrcVoice = nullptr;
ptmpSrcVoice = m_pSourceVoice.release();
// Destroy the voice
ptmpSrcVoice->DestroyVoice();
}
m_audEngine->Reset(&m_waveFormat, NULL);
// Create source voice
IXAudio2SourceVoice* ptmpSrcVoice = nullptr;
m_audEngine->AllocateVoice(
(WAVEFORMATEX*)&m_waveFormat,
SoundEffectInstance_Default,
false,
&ptmpSrcVoice
);
// Add source voice to smart pointer
m_pSourceVoice.reset(ptmpSrcVoice);
// Set the input volume
if (this->m_inputVolume != 1.0f) {
hr = m_pSourceVoice->SetVolume(this->m_inputVolume);
}
hr = m_pSourceVoice->Start(0);
this->m_retryAudio = FALSE;
return hr;
}
I am developing an application which shows something like a video in its window. I use the technologies described here: Introducing Direct2D 1.1. In my case the only difference is that eventually I create a bitmap using
ID2D1DeviceContext::CreateBitmap
then I use
ID2D1Bitmap::CopyFromMemory
to copy raw RGB data to it and then I call
ID2D1DeviceContext::DrawBitmap
to draw the bitmap. I use the high-quality cubic interpolation mode D2D1_INTERPOLATION_MODE_HIGH_QUALITY_CUBIC for scaling to get the best picture, but in some cases (RDP, Citrix, virtual machines, etc.) it is very slow and consumes a lot of CPU. That happens because a non-hardware video adapter is used in those cases. So for non-hardware adapters I try to turn off the interpolation and use faster methods. The problem is that I cannot reliably check whether the system has a true hardware adapter.
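For context, the drawing path looks roughly like this (a sketch, not the application's actual code; m_d2dContext, the frame variables, destRect and bSupported are placeholders, with bSupported coming from the adapter check below):
// Create the bitmap once, copy the raw frame into it each frame, then draw
// it with an interpolation mode chosen by the adapter check.
Microsoft::WRL::ComPtr<ID2D1Bitmap1> m_bitmap;

D2D1_BITMAP_PROPERTIES1 props = D2D1::BitmapProperties1(
    D2D1_BITMAP_OPTIONS_NONE,
    D2D1::PixelFormat(DXGI_FORMAT_B8G8R8A8_UNORM, D2D1_ALPHA_MODE_IGNORE));

HRESULT hr = m_d2dContext->CreateBitmap(
    D2D1::SizeU(frameWidth, frameHeight),
    nullptr, 0,            // no initial data; filled per frame
    &props, &m_bitmap);

// Per frame:
hr = m_bitmap->CopyFromMemory(nullptr, pFrameData, framePitch);

m_d2dContext->BeginDraw();
m_d2dContext->DrawBitmap(
    m_bitmap.Get(), &destRect, 1.0f,
    bSupported ? D2D1_INTERPOLATION_MODE_HIGH_QUALITY_CUBIC
               : D2D1_INTERPOLATION_MODE_LINEAR);
hr = m_d2dContext->EndDraw();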
When I call D3D11CreateDevice, I use it with D3D_DRIVER_TYPE_HARDWARE, but on virtual machines it typically returns "Microsoft Basic Render Driver", which is a software driver and does not use the GPU (it consumes CPU instead). So currently I check the vendor ID: if the vendor is AMD (ATI), NVIDIA or Intel, I use the cubic interpolation; otherwise I use the fastest method, which does not consume much CPU.
Microsoft::WRL::ComPtr<IDXGIDevice> dxgiDevice;
if (SUCCEEDED(m_pD3dDevice->QueryInterface(...)))
{
Microsoft::WRL::ComPtr<IDXGIAdapter> adapter;
if (SUCCEEDED(dxgiDevice->GetAdapter(&adapter)))
{
DXGI_ADAPTER_DESC desc;
if (SUCCEEDED(adapter->GetDesc(&desc)))
{
// NVIDIA
if (desc.VendorId == 0x10DE ||
// AMD
desc.VendorId == 0x1002 || // 0x1022 ?
// Intel
desc.VendorId == 0x8086) // 0x163C, 0x8087 ?
{
bSupported = true;
}
}
}
}
It works for a physical (console) Windows session, even in virtual machines. But in RDP sessions on real machines, IDXGIAdapter still returns the real vendor even though the GPU is not used (I can see this via Process Hacker 2 and, in the case of an ATI Radeon, AMD System Monitor), so I still get high CPU consumption with cubic interpolation. In the case of an RDP session to Windows 7 with an ATI Radeon, CPU usage is 10% higher than via the physical console.
Or am I mistaken, and RDP somehow does use GPU resources, which is why it returns a real hardware adapter via IDXGIAdapter::GetDesc?
DirectDraw
I also looked at the DirectX Diagnostic Tool. It looks like the "DirectDraw Acceleration" info field returns exactly what I need: for physical (console) sessions it says "Enabled", while for RDP and virtual-machine sessions (without hardware video acceleration) it says "Not Available". I looked at the sources, and theoretically I could use the same verification algorithm, but it is for DirectDraw, which I do not use in my application. I would like to use something directly linked to ID3D11Device, IDXGIDevice, IDXGIAdapter and so on.
IDXGIAdapter1::GetDesc1 and DXGI_ADAPTER_FLAG
I also tried to use IDXGIAdapter1::GetDesc1 and check the flags.
Microsoft::WRL::ComPtr<IDXGIDevice> dxgiDevice;
if (SUCCEEDED(m_pD3dDevice->QueryInterface(...)))
{
Microsoft::WRL::ComPtr<IDXGIAdapter> adapter;
if (SUCCEEDED(dxgiDevice->GetAdapter(&adapter)))
{
Microsoft::WRL::ComPtr<IDXGIAdapter1> adapter1;
if (SUCCEEDED(adapter->QueryInterface(__uuidof(IDXGIAdapter1), reinterpret_cast<void**>(adapter1.GetAddressOf()))))
{
DXGI_ADAPTER_DESC1 desc;
if (SUCCEEDED(adapter1->GetDesc1(&desc)))
{
// desc.Flags
// DXGI_ADAPTER_FLAG_NONE = 0,
// DXGI_ADAPTER_FLAG_REMOTE = 1,
// DXGI_ADAPTER_FLAG_SOFTWARE = 2,
// DXGI_ADAPTER_FLAG_FORCE_DWORD = 0xffffffff
}
}
}
}
Information about the DXGI_ADAPTER_FLAG_SOFTWARE flag
Virtual Machine RDP Win Serv 2012 (Microsoft Basic Render Driver) -> (0x02) DXGI_ADAPTER_FLAG_SOFTWARE
Physical Win 10 (Intel Video) -> (0x00) DXGI_ADAPTER_FLAG_NONE
Physical Win 7 (ATI Radeon) -> (0x00) DXGI_ADAPTER_FLAG_NONE
RDP Win 10 (Intel Video) -> (0x00) DXGI_ADAPTER_FLAG_NONE
RDP Win 7 (ATI Radeon) -> (0x00) DXGI_ADAPTER_FLAG_NONE
In the case of an RDP session on a real machine with a hardware adapter, Flags == 0, yet as I can see via Process Hacker 2 the GPU is not used. At least on Windows 7 with an ATI Radeon, I see higher CPU usage during an RDP session. So it looks like DXGI_ADAPTER_FLAG_SOFTWARE is set only for the Microsoft Basic Render Driver, and the issue is not solved.
The question
Is there a correct way to check if a real hardware video card (GPU) is used for the current Windows session? Or maybe it is possible to check if a specific interpolation mode of ID2D1DeviceContext::DrawBitmap has hardware implementation and uses GPU for the current session?
UPD
The topic is not about detecting RDP or Citrix sessions, nor about detecting whether the application is running inside a virtual machine. I already have all those checks and use linear interpolation in those cases. The topic is about detecting whether a real GPU is used for the current Windows session to display the desktop. I am looking for a more sophisticated solution that makes the decision using features of DirectX and DXGI.
If you want to detect the Microsoft Basic Renderer, the best option is to use its VID/PID combo:
ComPtr<IDXGIDevice> dxgiDevice;
if (SUCCEEDED(device.As(&dxgiDevice)))
{
ComPtr<IDXGIAdapter> adapter;
if (SUCCEEDED(dxgiDevice->GetAdapter(&adapter)))
{
DXGI_ADAPTER_DESC desc;
if (SUCCEEDED(adapter->GetDesc(&desc)))
{
if ( (desc.VendorId == 0x1414) && (desc.DeviceId == 0x8c) )
{
// WARNING: Microsoft Basic Render Driver is active.
// Performance of this application may be unsatisfactory.
// Please ensure that your video card is Direct3D10/11 capable
// and has the appropriate driver installed.
}
}
}
}
See Microsoft Docs and Anatomy of Direct3D 11 Create Device
You will probably find, for testing/debugging, that you don't want to explicitly block these scenarios, but you do want to provide some kind of warning or notice to the user that they are using software rather than hardware rendering.
Remote Desktop detection from Win32 classic desktop applications is better done directly via GetSystemMetrics( SM_REMOTESESSION ).
See Microsoft Docs
Answering a 3-year-old question, as I struggled with this myself.
I had to go through the registry. The first step is to find the adapter LUID in the registry to get the adapter GUID:
private string GetAdapterGuid(long luid)
{
var directXRegistryKey = Registry.LocalMachine.OpenSubKey(@"SOFTWARE\Microsoft\DirectX");
if (directXRegistryKey == null)
return "";
var subKeyNames = directXRegistryKey.GetSubKeyNames();
foreach (var subKeyName in subKeyNames)
{
var subKey = directXRegistryKey.OpenSubKey(subKeyName);
// Guard against subkeys without an AdapterLuid value (GetValueKind throws otherwise)
if (subKey.GetValue("AdapterLuid") == null ||
subKey.GetValueKind("AdapterLuid") != RegistryValueKind.QWord)
continue;
var luidValue = (long)subKey.GetValue("AdapterLuid");
if (luidValue == luid)
return subKeyName;
}
return "";
}
Once you have that GUID, you can search HKLM for the details of the graphics card like this. If it is virtual, the service name will be "INDIRECTKMD":
private bool IsVirtualAdapter(string adapterGuid)
{
var videoRegistryKey = Registry.LocalMachine.OpenSubKey($@"SYSTEM\CurrentControlSet\Control\Video\{adapterGuid}\Video");
if (videoRegistryKey == null)
return false;
// Guard against a missing Service value (GetValueKind throws otherwise)
if (videoRegistryKey.GetValue("Service") == null ||
videoRegistryKey.GetValueKind("Service") != RegistryValueKind.String)
return false;
var serviceName = (string)videoRegistryKey.GetValue("Service");
return serviceName.ToUpper() == "INDIRECTKMD";
}
Checking the service name felt easier than parsing the DeviceDesc value.
My use case involved having the GUID ready, so I split this into two functions; you could merge them into one.
Also, this only detects RDP/MSTSC; additional service names might be needed for other virtual adapters. Or you could try to detect only Nvidia/AMD/Intel driver names... up to you.
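For the native side, the LUID that GetAdapterGuid matches against can be read from DXGI. A minimal C++ sketch, assuming an existing D3D11 device; packing the LUID parts into a 64-bit value mirrors what the AdapterLuid registry entry stores:
#include <d3d11.h>
#include <dxgi.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Sketch: read the adapter LUID via DXGI; the resulting 64-bit value is
// what the AdapterLuid registry lookup above compares against.
long long GetAdapterLuid(ID3D11Device* device)
{
    ComPtr<IDXGIDevice> dxgiDevice;
    if (FAILED(device->QueryInterface(IID_PPV_ARGS(&dxgiDevice))))
        return 0;

    ComPtr<IDXGIAdapter> adapter;
    if (FAILED(dxgiDevice->GetAdapter(&adapter)))
        return 0;

    DXGI_ADAPTER_DESC desc = {};
    if (FAILED(adapter->GetDesc(&desc)))
        return 0;

    return (static_cast<long long>(desc.AdapterLuid.HighPart) << 32) |
           desc.AdapterLuid.LowPart;
}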
I have a DirectShow-based media player application. It works very well without any issues during normal playback, but occasionally I face an issue when the media player is started just after system boot.
HRESULT CSDirectShow::RenderOutputPins (IBaseFilter* pFilter)
{
const char* funcName = "CSDirectShow::RenderOutputPins()";
HRESULT hr = S_OK;
// Enumerate all pins on the source filter,
// looking for the output pins so that I can call Render() on them
//
CComPtr< IEnumPins > pEnumPin;
if (!FAILED (pFilter->EnumPins (&pEnumPin)))
{
while (true)
{
// get the next pin
//
CComPtr< IPin > pPin;
if (pEnumPin->Next (1L, &pPin, NULL) != S_OK) break;
// I'm not interested in connected pins
// if this pin is an unconnected output pin, then render it.
//
CComPtr< IPin > pConnectedPin;
if (pPin->ConnectedTo (&pConnectedPin) == VFW_E_NOT_CONNECTED)
{
PIN_DIRECTION pinDirection;
PIN_INFO pinInfo;
//Get the information of the pin
if (pPin->QueryDirection (&pinDirection) == S_OK
&& pinDirection == PINDIR_OUTPUT
&& pPin->QueryPinInfo (&pinInfo) == S_OK
// achName is a WCHAR string, so use wcsstr rather than casting to char*
&& wcsstr (pinInfo.achName, L"~") == NULL)
{
if (FAILED (hr = m_pGB->Render (pPin)))
{
SafeRelease(&pinInfo.pFilter);
return hr;
}
}
SafeRelease(&pinInfo.pFilter);
}
}
}
TraceMsg ("%s: exit",funcName);
return S_OK;
}
When m_pGB->Render(pPin) is called, the function never returns; it blocks inside, which I confirmed using logs. This issue happens only when I start my application immediately after boot-up. When the issue occurs, closing and restarting my application makes it work like a charm. Since the application is designed to start automatically after system boot-up, this behaviour has become a bigger concern. Kindly help.
The IGraphBuilder::Render call does a lot internally; specifically, it enumerates potentially suitable filters, which in turn attempts to load additional DLLs registered with the DirectShow environment. Such a file could have missing dependencies, or dependencies on remote or temporarily inaccessible drivers (just one example).
If you experience a deadlock, you can troubleshoot it further (debug it) and get details on the locked state and on the activity during the Render call.
If the problem is caused by third-party filters (especially codec packs that register a collection of filters at once without thinking too much about compatibility) registered with the system in a not-so-good way, perhaps you can identify and uninstall them.
If you want to improve the player on your side, you should avoid the Render call and build your filter graph in smaller increments, adding specific filters and connecting pins, without leaving big tasks at the mercy of Intelligent Connect, which works well in general but is sensitive to the compatibility problems mentioned above.
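A rough sketch of that incremental approach (the filter and pin variables are placeholders; a real player would create concrete filters for its formats and look up the pins to connect):
// Sketch: build the graph manually instead of calling Render().
// pSourceFilter and pDecoderFilter are assumed to be created already
// (e.g. via CoCreateInstance with concrete CLSIDs for your format).
HRESULT BuildGraphManually(IGraphBuilder* pGraph,
                           IBaseFilter* pSourceFilter,
                           IBaseFilter* pDecoderFilter,
                           IPin* pSourceOut, IPin* pDecoderIn)
{
    HRESULT hr = pGraph->AddFilter(pSourceFilter, L"Source");
    if (FAILED(hr)) return hr;

    hr = pGraph->AddFilter(pDecoderFilter, L"Decoder");
    if (FAILED(hr)) return hr;

    // ConnectDirect avoids Intelligent Connect entirely: no enumeration of
    // registered third-party filters, so no chance of loading a broken one.
    hr = pGraph->ConnectDirect(pSourceOut, pDecoderIn, NULL);
    return hr;
}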
Overview:
We're currently building an app that interfaces with the Windows Sensor API using the SensorManager COM object. This is a C++, non-UWP project. Specifically, we're integrating with an aggregated orientation sensor (SENSOR_TYPE_AGGREGATED_DEVICE_ORIENTATION) on a Surface tablet. This works perfectly on Windows 8. We recently upgraded our machines to Windows 10 and are now experiencing extremely slow reporting intervals from the orientation sensor: one report every ~500-1500 ms, regardless of setting SENSOR_PROPERTY_MIN_REPORT_INTERVAL, etc. Has anyone observed this behavior on Windows 10, or have a reason/solution for this issue? Thanks!
Details:
This is a Visual Studio 2015 C++ project using Cinder. Our sensor manager is initialized as follows:
hr = ::CoCreateInstance(
CLSID_SensorManager,
nullptr,
CLSCTX_INPROC_SERVER,
IID_PPV_ARGS( &mSensorManager ) );
After acquiring a collection of applicable sensors and validating, we persist our sensor, then set the event sink. Code below:
STDMETHODIMP OrientationSensor::OnDataUpdated( ISensor *sensor, ISensorDataReport *report )
{
HRESULT hr;
PROPVARIANT pvQuat;
PropVariantInit( &pvQuat );
hr = report->GetSensorValue( SENSOR_DATA_TYPE_QUATERNION, &pvQuat );
if( SUCCEEDED( hr ) ) {
if( pvQuat.vt == ( VT_VECTOR | VT_UI1 ) ) {
float* pElement = (float*) pvQuat.caub.pElems;
// SET orientation quaternion
mOrientation = ci::quat( pElement[3], pElement[0], pElement[1], pElement[2] );
}
}
// CLEAR prop variant
PropVariantClear( &pvQuat );
return hr;
}
AddRef(), Release() and QueryInterface() are implemented as defined in all development guides.
We do receive correct data, just at slow, long intervals, regardless of settings. We're trying to understand why this is, and why we only see it on Windows 10.
To confirm that the sensor was nominal, we compiled the UWP Orientation Sample. Indeed, this reported fast and correct readings from the orientation sensor. We're specifically looking for a solution that allows us to interface with the orientation sensor, through the sensor manager COM API.
I saw the same behavior with the acceleration sensor on my Surface Pro. I explicitly set the sensor's sensitivity property, and that solved the issue.
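For reference, here is a hedged sketch of what setting the sensitivity can look like through the COM Sensor API; the quaternion data field and the 0.0001 threshold are example choices, not recommendations:
#include <sensorsapi.h>
#include <sensors.h>
#include <portabledevicetypes.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Sketch: lower the change sensitivity so the sensor reports more often.
HRESULT SetOrientationSensitivity(ISensor* pSensor)
{
    ComPtr<IPortableDeviceValues> propsToSet, sensitivity, propsReturned;

    HRESULT hr = CoCreateInstance(CLSID_PortableDeviceValues, nullptr,
                                  CLSCTX_INPROC_SERVER,
                                  IID_PPV_ARGS(&sensitivity));
    if (FAILED(hr)) return hr;

    hr = CoCreateInstance(CLSID_PortableDeviceValues, nullptr,
                          CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&propsToSet));
    if (FAILED(hr)) return hr;

    // Map the quaternion data field to a per-field sensitivity threshold.
    PROPVARIANT pv;
    PropVariantInit(&pv);
    pv.vt = VT_R8;
    pv.dblVal = 0.0001; // example threshold, not a recommendation
    hr = sensitivity->SetValue(SENSOR_DATA_TYPE_QUATERNION, &pv);
    PropVariantClear(&pv);
    if (FAILED(hr)) return hr;

    hr = propsToSet->SetIPortableDeviceValuesValue(
        SENSOR_PROPERTY_CHANGE_SENSITIVITY, sensitivity.Get());
    if (FAILED(hr)) return hr;

    return pSensor->SetProperties(propsToSet.Get(), &propsReturned);
}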
I am attempting to programmatically make the 2nd monitor show a duplicate display. My function below should change the 2nd monitor's display to 'duplicate display', i.e., make the 2nd monitor display everything that is on the 1st/primary monitor.
My problem: when I run my function, it successfully finds the 2nd monitor and changes that monitor's display x coordinate to 0, i.e., the left of the primary monitor's screen, by changing the DEVMODE dmPosition.x property. Both of my monitors refresh themselves (they go black, then reshow their screens), but the 2nd monitor still has the extended display instead of a duplicate display.
Any ideas how I can make my 2nd Monitor have a duplicate display?
Some relevant information:
- My 2nd monitor is a LCD TV and is connected to my laptop via HDMI
- My function code is exactly the same as the example on this MSDN page, which describes how to attach a 2nd monitor without having to restart. I have changed LINE 30, though.
- I am aware that I can change the display on Windows 7 using a single WinAPI function call, but I need my program to work on Windows 2000 and up.
// From http://support.microsoft.com/kb/308216/en-gb Title: You must restart...
BOOL TVManager::AddUnattachedDisplayDeviceToDesktop()
{
DWORD DispNum = 0;
DISPLAY_DEVICE DisplayDevice;
DEVMODE defaultMode;
HDC hdc;
int nWidth;
BOOL bFoundSecondary = FALSE;
hdc = GetDC(0);
nWidth = GetDeviceCaps(hdc, HORZRES);
ReleaseDC(0, hdc);
// Initialize DisplayDevice.
ZeroMemory(&DisplayDevice, sizeof(DisplayDevice));
DisplayDevice.cb = sizeof(DisplayDevice);
// Get display devices.
while ((EnumDisplayDevices(NULL, DispNum, &DisplayDevice, 0)) && (bFoundSecondary == FALSE))
{
ZeroMemory(&defaultMode, sizeof(DEVMODE));
defaultMode.dmSize = sizeof(DEVMODE);
if (!EnumDisplaySettings((LPTSTR)DisplayDevice.DeviceName, ENUM_REGISTRY_SETTINGS, &defaultMode)) {
printf("1\n");
return FALSE; // Store default failed
}
if (!(DisplayDevice.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)) {
//Found the first secondary device.
_tprintf(_T("Found the first secondary device: Name: %s, Pos: %d, Width: %d\n"), DisplayDevice.DeviceName, defaultMode.dmPosition.x, nWidth);
bFoundSecondary = TRUE;
defaultMode.dmPosition.x = 0; // LINE CHANGED: ONLY CHANGE FROM MSDN'S CODE
defaultMode.dmFields = DM_POSITION;
ChangeDisplaySettingsEx((LPTSTR)DisplayDevice.DeviceName, &defaultMode, NULL, CDS_NORESET|CDS_UPDATEREGISTRY, NULL);
_tprintf(_T("Check for error: %u\n"), GetLastError()); // prints "Check for error: 0" which means no error occurred
// A second call to ChangeDisplaySettings updates the monitor.
ChangeDisplaySettings(NULL, 0);
_tprintf(_T("Check for error: %u\n"), GetLastError()); // prints "Check for error: 0" which means no error occurred
}
// Reinitialize DisplayDevice.
ZeroMemory(&DisplayDevice, sizeof(DisplayDevice));
DisplayDevice.cb = sizeof(DisplayDevice);
DispNum++;
} // End while the display devices.
return TRUE;
}
Windows XP and earlier use a different display driver model (XPDM) from Vista and later (WDDM). Mirroring on XPDM depends very much on your graphics card vendor. The general idea is that for extending the desktop you provide an extend driver, and for mirroring a portion of the desktop you provide a mirror driver.
In most cases, each extend driver is responsible for one output on your graphics card. Let's say you have a dual-DVI card; then you should see two extend drivers in your Device Manager, each responsible for one of the DVI ports. When you want your monitor to extend the desktop, you enable the extend driver and give it a sensible location.
Mirroring is trickier. This is where the behaviour can vary a bit between the different card vendors. From the perspective of the OS, this is what happens: the extend driver associated with the graphics card port is disabled; the mirror driver is enabled if it was not enabled already; the mirror driver is then placed at (0, 0). Then some trickery happens inside your graphics card/driver, and the monitor shows what's in the mirror driver's screen buffer.
In order to set a monitor into mirror mode on XPDM, you need to find the extend driver it's currently showing output from and disable it. This may be all you have to do: some vendors will automatically do the rest for you and start mirroring the primary display; others will do whatever your monitor was last doing before it was put into extend mode. If you find your monitor not showing anything, you can try to enable the mirror driver. If you manage to find the mirror driver and enable it, what happens after that is anyone's guess; there isn't a universal way to wire up a monitor to a mirror driver.