OpenGL supports only the default graphics adapter

For some reason I can no longer run the OculusRoomTiny sample program, because I keep getting this pop-up: "OpenGL supports only the default graphics adapter."
It's being triggered by this code shown below in main.cpp:
if (Compare(luid, GetDefaultAdapterLuid())) // If luid that the Rift is on is not the default adapter LUID...
{
    VALIDATE(false, "OpenGL supports only the default graphics adapter.");
}
and
static ovrGraphicsLuid GetDefaultAdapterLuid()
{
    ovrGraphicsLuid luid = ovrGraphicsLuid();

#if defined(_WIN32)
    IDXGIFactory* factory = nullptr;
    if (SUCCEEDED(CreateDXGIFactory(IID_PPV_ARGS(&factory))))
    {
        IDXGIAdapter* adapter = nullptr;
        if (SUCCEEDED(factory->EnumAdapters(0, &adapter)))
        {
            DXGI_ADAPTER_DESC desc;
            adapter->GetDesc(&desc);
            memcpy(&luid, &desc.AdapterLuid, sizeof(luid));
            adapter->Release();
        }
        factory->Release();
    }
#endif

    return luid;
}
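For reference, a LUID comparison equivalent to what the sample's Compare() helper needs to do can be sketched like this (my own sketch, assuming ovrGraphicsLuid is a plain struct as in the SDK headers):
// Sketch of a LUID comparison: returns non-zero when the two LUIDs differ
// (i.e. the Rift is NOT on the adapter that DXGI enumerates first).
// Needs <string.h> / <cstring> for memcmp.
static int CompareLuids(const ovrGraphicsLuid& lhs, const ovrGraphicsLuid& rhs)
{
    return memcmp(&lhs, &rhs, sizeof(ovrGraphicsLuid));
}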
I've never had this issue before. I haven't changed any code, I reinstalled the SDK, and I still get the same problem. Did something happen to my headset? Why isn't the LUID the same? I'm using the DK2 and SDK 1.9.0.
When I comment out the VALIDATE statement, the program runs, but the Oculus just gets stuck on the "please wait" screen forever.
Thanks for your help in advance!

I had the same problem.
I noticed that the app was trying to use my onboard Intel graphics.
I solved the problem by changing the NVIDIA driver settings in Windows to make the NVIDIA card the default graphics adapter.
Hope that helps.
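If it helps to see what's going on, here is a small diagnostic sketch (my own code, not from the sample) that lists the adapters in DXGI enumeration order. The sample's check only ever looks at adapter 0, so if the Intel GPU enumerates first, the VALIDATE fires:
// Diagnostic sketch: list adapters in DXGI enumeration order so you can see
// which one is "adapter 0" (the one GetDefaultAdapterLuid() picks) and compare
// its LUID with the one ovr_Create() returns for the Rift.
// Link with dxgi.lib.
#include <dxgi.h>
#include <cstdio>

void ListAdapters()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(IID_PPV_ARGS(&factory))))
        return;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; SUCCEEDED(factory->EnumAdapters(i, &adapter)); ++i)
    {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        wprintf(L"Adapter %u: %ls  LUID: %08lx:%08lx\n", i, desc.Description,
                (unsigned long)desc.AdapterLuid.HighPart,
                (unsigned long)desc.AdapterLuid.LowPart);
        adapter->Release();
    }
    factory->Release();
}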

Related

Can't go fullscreen under DirectX12

I have an app which uses DirectX 12, and I wanted to support fullscreen mode in it.
However, each time I called IDXGISwapChain::SetFullscreenState(), I got this error:
DXGI ERROR: IDXGISwapChain::GetContainingOutput: The swapchain's adapter
does not control the output on which the swapchain's window resides.
The error code returned by IDXGISwapChain::SetFullscreenState() was
0x887a0004 (DXGI_ERROR_UNSUPPORTED).
My computer has two GPUs:
Intel(R) HD Graphics 630 and
NVIDIA GeForce GTX 1060;
the latter was the adapter used to create the D3D12 device when the error occurred.
If the adapter was the former instead, there was no error.
The code flow used to create an IDXGISwapChain3 was:
// variables used in the code example
ID3D12Device* pDevice;
IDXGIFactory4* pDXGIFactory;
IDXGIAdapter* pAdapter;
ID3D12CommandQueue* pCommandQueue;
IDXGISwapChain1* pTempSwapchain;
IDXGISwapChain3* pSwapchain;

// the code flow (other arguments abbreviated)
CreateDXGIFactory2(..., IID_PPV_ARGS(&pDXGIFactory));
pDXGIFactory->EnumAdapters(..., &pAdapter);
D3D12CreateDevice(pAdapter, ..., IID_PPV_ARGS(&pDevice));
pDevice->CreateCommandQueue(..., IID_PPV_ARGS(&pCommandQueue));
pDXGIFactory->CreateSwapChainForHwnd(pCommandQueue, ..., &pTempSwapchain);
pTempSwapchain->QueryInterface(IID_PPV_ARGS(&pSwapchain));
The IDXGISwapChain::SetFullscreenState() call should succeed, but it failed.
Note that the problem is a specific combination of a quirk of the DXGI debug layer and the way "hybrid graphics" solutions are implemented, depending on which device you are using at the time.
The best option here is to just suppress the error:
#if defined(_DEBUG)
    // Enable the debug layer (requires the Graphics Tools "optional feature").
    //
    // NOTE: Enabling the debug layer after device creation will invalidate the active device.
    {
        ComPtr<ID3D12Debug> debugController;
        if (SUCCEEDED(D3D12GetDebugInterface(IID_PPV_ARGS(debugController.GetAddressOf()))))
        {
            debugController->EnableDebugLayer();
        }
        else
        {
            OutputDebugStringA("WARNING: Direct3D Debug Device is not available\n");
        }

        ComPtr<IDXGIInfoQueue> dxgiInfoQueue;
        if (SUCCEEDED(DXGIGetDebugInterface1(0, IID_PPV_ARGS(dxgiInfoQueue.GetAddressOf()))))
        {
            dxgiInfoQueue->SetBreakOnSeverity(DXGI_DEBUG_ALL, DXGI_INFO_QUEUE_MESSAGE_SEVERITY_ERROR, true);
            dxgiInfoQueue->SetBreakOnSeverity(DXGI_DEBUG_ALL, DXGI_INFO_QUEUE_MESSAGE_SEVERITY_CORRUPTION, true);

            DXGI_INFO_QUEUE_MESSAGE_ID hide[] =
            {
                80 /* IDXGISwapChain::GetContainingOutput: The swapchain's adapter does not control the output on which the swapchain's window resides. */,
            };
            DXGI_INFO_QUEUE_FILTER filter = {};
            filter.DenyList.NumIDs = _countof(hide);
            filter.DenyList.pIDList = hide;
            dxgiInfoQueue->AddStorageFilterEntries(DXGI_DEBUG_DXGI, &filter);
        }
    }
#endif
I use this in all my templates on GitHub.
I have found a solution: use the IDXGIFactory6::EnumAdapterByGpuPreference() method instead of IDXGIFactory::EnumAdapters(), and the error disappears.
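For reference, a minimal sketch of that enumeration (assuming the factory can be obtained as an IDXGIFactory6, which needs dxgi1_6.h and a recent Windows 10 SDK; the helper name here is mine):
#include <dxgi1_6.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Pick the first adapter DXGI considers "high performance" (typically the
// discrete GPU on a hybrid-graphics laptop) instead of relying on the
// default EnumAdapters() ordering.
ComPtr<IDXGIAdapter1> PickHighPerformanceAdapter(IDXGIFactory6* factory)
{
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         SUCCEEDED(factory->EnumAdapterByGpuPreference(
             i, DXGI_GPU_PREFERENCE_HIGH_PERFORMANCE, IID_PPV_ARGS(&adapter)));
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (!(desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)) // skip WARP / software adapters
            return adapter;
    }
    return nullptr;
}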

SDL_JOYHATMOTION different when using Windows?

I'm using SDL2 in my program.
The gamepad is initialised using:
SDL_Joystick* Pad1 = NULL;
Pad1 = SDL_JoystickOpen( 0 );
In my event-handling function, I included this:
switch (event.type) {
    // Button event, as an example:
    case SDL_JOYBUTTONDOWN:
        //printf("Button: %d ", event.jbutton.button);
        if (event.jbutton.button == ControllP1.MoveLeftButton)
            MoveLeft = true;
        break;
    // lots of other cases
    case SDL_JOYHATMOTION:
        if (event.jhat.value == SDL_HAT_UP)        { MoveUp = true;    MoveLeft = false; MoveRight = false; MoveDown = false; }
        if (event.jhat.value == SDL_HAT_DOWN)      { MoveDown = true;  MoveUp = false;   MoveLeft = false;  MoveRight = false; }
        if (event.jhat.value == SDL_HAT_LEFT)      { MoveLeft = true;  MoveDown = false; MoveUp = false;    MoveRight = false; }
        if (event.jhat.value == SDL_HAT_RIGHT)     { MoveRight = true; MoveDown = false; MoveUp = false;    MoveLeft = false; }
        if (event.jhat.value == SDL_HAT_CENTERED)  { MoveDown = false; MoveUp = false;   MoveLeft = false;  MoveRight = false; }
        if (event.jhat.value == SDL_HAT_LEFTUP)    { MoveDown = false; MoveUp = true;    MoveLeft = true;   MoveRight = false; }
        if (event.jhat.value == SDL_HAT_RIGHTUP)   { MoveDown = false; MoveUp = true;    MoveLeft = false;  MoveRight = true; }
        if (event.jhat.value == SDL_HAT_RIGHTDOWN) { MoveDown = true;  MoveUp = false;   MoveLeft = false;  MoveRight = true; }
        if (event.jhat.value == SDL_HAT_LEFTDOWN)  { MoveDown = true;  MoveUp = false;   MoveLeft = true;   MoveRight = false; }
        break;
Note that this code isn't targeting only the specified pad; it should react to input on any gamepad.
On OpenSuse/Linux this works fine: as soon as I use the hat on any gamepad, it triggers the event. It doesn't work on Windows, however. The rest of the code runs as intended (including the specified axis, button, etc. events), but using the hat doesn't cause any reaction. What is the reason for this? Do I need to specify a gamepad when using SDL2 under Windows?
Thanks and greetings, mumbo
Edit1:
Surfing around, I probably found an explanation for my problem:
https://forums.libsdl.org/viewtopic.php?p=39991
I suppose that the DPAD isn't detected as a hat but rather as an analog stick under Windows when using the Joystick API?
Edit2:
It was a bug in the SDL2.dll on the Windows machine I used for testing. Replacing SDL2.dll with a fresh one solved the problem; hats are responding as intended :)
Thanks for the help guys, good to know about the GameController API.
I updated SDL2 on the target Windows machine and the whole thing is working as intended. The code is fine.
Thanks for the help everyone, good to have learned about the GameController API.
tl;dr: On Windows you might be having driver problems if your device is an unusual one, and you might want to use the GameController API if you're targeting gamepads, as it gives you a more consistent interface.
Mumbo: The hat usually has the form of a cross (or a circle with a cross shape on top). You can usually find it on the left side of your gamepad.
So you mean the DPAD.
First, the Joystick API from SDL is a bit lower level, handling things like actual joysticks, steering wheels and (in your use case) gamepads without distinguishing between device types. This means the API might not be consistent across devices; for example, two different gamepads might map the same button to different indexes.
Although I think the joy hat is probably always mapped to the DPAD on the more common devices, the other buttons might not be (triggers, x, y, a, b, star, circle, etc.). This is where the GameController API comes to the rescue: it gives you a more consistent way to handle the controller (by presenting an Xbox 360-like gamepad and a database of mappings for several devices).
In the source tree of SDL there is a database of controllers you can load (or it is loaded by default, I didn't check); you can also check this link, where I think there is another database of mappings for all kinds of controllers that you can load into your program by hand.
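Loading such a database file at startup looks roughly like this (a sketch; the file name is only an example, use whichever gamecontrollerdb.txt you downloaded):
// Sketch: load extra controller mappings after SDL_Init(SDL_INIT_GAMECONTROLLER)
// and before opening any controllers. The call returns the number of mappings
// added, or -1 on error.
int added = SDL_GameControllerAddMappingsFromFile("gamecontrollerdb.txt");
if (added < 0) {
    SDL_Log("Could not load mappings: %s", SDL_GetError());
} else {
    SDL_Log("Loaded %d controller mappings", added);
}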
This example uses the GameController API instead of the Joystick API and prints values when the DPAD is pressed. I tested it on Linux only; I might hop on Windows later to try it out.
#include <SDL2/SDL.h>
#include <SDL2/SDL_image.h>
#include <chrono>
#include <thread>

#define HEIGHT 600
#define WIDTH 800

using namespace std;

int main() {
    SDL_Init(SDL_INIT_VIDEO | SDL_INIT_GAMECONTROLLER);
    SDL_Window *window = SDL_CreateWindow("Test", SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, WIDTH, HEIGHT, SDL_WINDOW_SHOWN);
    SDL_Event event;
    SDL_GameController *controller = SDL_GameControllerOpen(0);
    bool quit = false;
    //SDL_Joystick *joy = SDL_GameControllerGetJoystick(controller);

    while (!quit) {
        while (SDL_PollEvent(&event)) {
            if (event.type == SDL_QUIT) {
                quit = true;
            }
            if (event.type == SDL_CONTROLLERBUTTONDOWN || event.type == SDL_CONTROLLERBUTTONUP) {
                SDL_ControllerButtonEvent ev = event.cbutton;
                if (ev.button == SDL_CONTROLLER_BUTTON_DPAD_DOWN)
                    printf("SDL_DPAD_HAT_DOWN_UP\n");
                if (ev.button == SDL_CONTROLLER_BUTTON_DPAD_UP)
                    printf("SDL_DPAD_HAT_UP_UP\n");
                if (ev.button == SDL_CONTROLLER_BUTTON_DPAD_RIGHT)
                    printf("SDL_DPAD_HAT_RIGHT_UP\n");
                if (ev.button == SDL_CONTROLLER_BUTTON_DPAD_LEFT)
                    printf("SDL_DPAD_HAT_LEFT_UP\n");
            }
            if (event.type == SDL_CONTROLLERBUTTONDOWN) { puts("DPAD DOWN STATE"); }
        }
        std::this_thread::sleep_for(std::chrono::milliseconds{33});
    }

    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}
On the other hand, you might have driver problems (not uncommon on Windows with off-brand controllers) or be dealing with a gamepad that isn't mapped yet. (I tried on Linux with a PS4 controller and it worked correctly, but with a cheap knockoff of a PS2 controller it didn't.)

SDL 2 Cannot Open Controller but Joystick Recognized

I am writing a C++ program using SDL 2 for the platform layer and OpenGL for graphics and rendering. I have a fully working prototype with keyboard and mouse input. Now I am trying to use SDL's Game Controller API to connect a gamepad (to replace or supplement keyboard controls). Unfortunately the controller does not seem to be recognized, despite the fact that it works perfectly with other software. It's a Sony DualShock 4 (for the PlayStation 4). My system is Mac OS 10.9.5, and I am using SDL 2.0.5 with the official community controller database for SDL 2.0.5, which contains PS4 controller mappings:
030000004c050000c405000000000000,PS4 Controller,a:b1,b:b2,back:b8,dpdown:h0.4,dpleft:h0.8,dpright:h0.2,dpup:h0.1,guide:b12,leftshoulder:b4,leftstick:b10,lefttrigger:a3,leftx:a0,lefty:a1,rightshoulder:b5,rightstick:b11,righttrigger:a4,rightx:a2,righty:a5,start:b9,x:b0,y:b3,platform:Mac OS X,
4c05000000000000c405000000000000,PS4 Controller,a:b1,b:b2,back:b8,dpdown:h0.4,dpleft:h0.8,dpright:h0.2,dpup:h0.1,guide:b12,leftshoulder:b4,leftstick:b10,lefttrigger:a3,leftx:a0,lefty:a1,rightshoulder:b5,rightstick:b11,righttrigger:a4,rightx:a2,righty:a5,start:b9,x:b0,y:b3,platform:Mac OS X
I also added a new mapping using one of the official tools. That also loads successfully according to the relevant function call.
The following is my code, and it's about as close to a minimal example as I can get:
// in main
// window and graphics context initialization here

// initialize SDL
if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_GAMECONTROLLER | SDL_INIT_HAPTIC) < 0) {
    fprintf(stderr, "%s\n", "SDL could not initialize");
    return EXIT_FAILURE;
}

// load controller mappings; I tested this and 35 mappings load successfully, which is expected
SDL_GameControllerAddMappingsFromFile("./mapping/gamecontrollerdb_205.txt");

// the controller handle
SDL_GameController* controller = nullptr;

// max_joysticks is 1, which means that the device connects at least
int max_joysticks = SDL_NumJoysticks();
if (max_joysticks < 1) {
    return EXIT_FAILURE;
}

// this returns, which means that the joystick exists but isn't recognized as a game controller
if (!SDL_IsGameController(0)) {
    return EXIT_FAILURE;
}

// I never get past this.
controller = SDL_GameControllerOpen(0);
fprintf(stdout, "CONTROLLER: %s\n", SDL_GameControllerName(controller));
Has anyone encountered this problem? I've done some preliminary searching, as I mentioned, but it seems that usually either the number of joysticks is 0, or everything is recognized.
Also, SDL_CONTROLLERDEVICEADDED isn't firing when I connect the controller.
The controller is connected via USB before I start the program. Also, this is one of the new controllers, and I'm not sure whether the mappings work with that new revision. I assume so, considering that there are two distinct entries.
Thank you.
EDIT:
I double checked, and the PS4 controller works fine as a joystick, but it isn't recognized as a controller, which means that the mapping is incorrect or non-existent. This may be because my controller is "version 2" of the DualShock 4, and I'm not sure whether a 2.0.5-compatible mapping was added. Hmmm.
The controller was recognized as a joystick but not as a controller, meaning that none of the available mappings I could find (in the 2.0.5 controller mapping format) corresponded to the controller. Updating from SDL 2.0.5 to 2.0.8 also updated the available mappings, it seems, and now the controller is recognized as a game controller.
Note: normally it is a terrible idea to upgrade tools mid-project, but in this case it was safe to do.
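If upgrading SDL is not an option, it should also be possible to register a mapping for the newer revision at runtime with SDL_GameControllerAddMapping(). A rough sketch follows; the GUID and the button/axis layout below are placeholders, not a verified DualShock 4 v2 mapping (get the real GUID from SDL_JoystickGetGUIDString() and the layout from a mapping tool):
// Sketch: add a mapping by hand before SDL_GameControllerOpen().
// The GUID must be the one SDL reports for the device; the layout string
// here is only illustrative.
const char* mapping =
    "03000000xxxxxxxxxxxxxxxxxxxxxxxx,PS4 Controller v2,"
    "a:b1,b:b2,x:b0,y:b3,back:b8,start:b9,guide:b12,"
    "leftshoulder:b4,rightshoulder:b5,leftstick:b10,rightstick:b11,"
    "leftx:a0,lefty:a1,rightx:a2,righty:a5,lefttrigger:a3,righttrigger:a4,"
    "dpup:h0.1,dpright:h0.2,dpdown:h0.4,dpleft:h0.8,platform:Mac OS X";
if (SDL_GameControllerAddMapping(mapping) == -1) {
    fprintf(stderr, "Could not add mapping: %s\n", SDL_GetError());
}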

OpenGL on Linux: unable to use a correct version

I tried to build a program using GLFW + GLEW on Fedora 25.
Part of it is:
int main()
{
    glfwInit();
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_ANY_PROFILE);
    //glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    //glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    mainWindow = glfwCreateWindow(1024, 768, "NONE", nullptr, nullptr);
    if (mainWindow == nullptr)
    {
        std::cout << "Creating window ERROR.\n" << std::endl;
        glfwTerminate();
        return 1;
    }
    .....
}
If I use glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE); to get version 3.3, though, it is unable to create the window.
My hardware supports OpenGL 4.1.
Update: got the answer.
Just uncomment the glfwWindowHint(GLFW_CONTEXT_VERSION_*, *) lines above.
When using the core profile with glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE),
the explicit version needs to be requested as well.
Thanks to the comment of Dietrich Epp.
By the way, can anyone tell me how to get such information?
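For completeness, here is a minimal sketch of the corrected hints (assuming a 3.3 core context is wanted); the glGetString(GL_VERSION) query at the end is one way to see which version the driver actually gave you:
// Sketch: request a specific core-profile version explicitly.
glfwInit();
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

GLFWwindow* mainWindow = glfwCreateWindow(1024, 768, "NONE", nullptr, nullptr);
glfwMakeContextCurrent(mainWindow);

// After the context is current, the driver reports what it actually created:
std::cout << glGetString(GL_VERSION) << std::endl;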

GLX/GLEW order of initialization catch-22: GLXEW_ARB_create_context, glXCreateContextAttribsARB, glXCreateContext

Currently I'm working on an application that uses GLEW and GLX (on X11).
The logic works as follows...
glewInit(); /* <- needed so 'GLXEW_ARB_create_context' is set! */

if (GLXEW_ARB_create_context) {
    /* OpenGL >= 3.0 */
    .. get fb_config ..
    context = glXCreateContextAttribsARB(...);
}
else {
    /* legacy context */
    context = glXCreateContext(...);
}
The problem I'm running into is that GLXEW_ARB_create_context is initialized by GLEW, but initializing GLEW calls glGetString, which crashes if it's called before glXCreateContextAttribsARB / glXCreateContext.
Note that this only happens with Mesa's software rasterizer (libGL.so compiled with swrast), so it's possibly a problem with Mesa too.
Correction: this works with Mesa swrast and NVIDIA's proprietary OpenGL drivers, but segfaults with Intel's OpenGL.
Though it's possible this is a bug in the Intel drivers. Need to check how other projects handle this.
The cause, in the case of Intel, is that glXGetCurrentDisplay() returns NULL before GLX is initialized (another catch-22).
So for now, as far as I can tell, it's best to avoid GLEW before the GLX context is created, and instead use GLX directly, e.g.:
if (glXQueryExtension(display, NULL, NULL)) {
    const char *glx_ext = glXGetClientString(display, GLX_EXTENSIONS);
    if (check_string_for_extension(glx_ext, "GLX_SOME_EXTENSION")) {
        printf("We have the extension!\n");
    }
}
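check_string_for_extension() above is a stand-in, not a GLX function; a minimal sketch of such a helper (the name is just what the snippet assumes):
/* Sketch: look for a whole extension name inside the space-separated
 * extension string; substring matches ("GLX_EXT_foo" vs "GLX_EXT_foobar")
 * must not count. */
#include <string.h>

static int check_string_for_extension(const char *ext_string, const char *name)
{
    if (ext_string == NULL || name == NULL) {
        return 0;
    }
    size_t len = strlen(name);
    const char *p = ext_string;
    while ((p = strstr(p, name)) != NULL) {
        int starts_ok = (p == ext_string) || (p[-1] == ' ');
        int ends_ok = (p[len] == ' ') || (p[len] == '\0');
        if (starts_ok && ends_ok) {
            return 1;
        }
        p += len;
    }
    return 0;
}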
Old answer...
Found the solution (seems obvious in retrospect!):
1. First call glxewInit().
2. Check GLXEW_ARB_create_context.
3. Create the context with glXCreateContextAttribsARB or glXCreateContext.
4. Call glewInit().