OpenGL version stuck at 1.1 software - opengl

I've been trying to use OpenGL 4, and the first obstacle was actually loading GL 4 instead of the disgusting software-only GL 1.1 that ships with MS Windows. I tried using GLEW and then updated my drivers a second time, and GL still reported the version as 1.1.0. It turned out not to be a problem with GLEW (nor did the fix even require GLEW, it seems), and SDL wasn't breaking anything internally either (which I considered, since that is what I use to create the context).

Andon M. Coleman brought up the issue of pixel format, which is something I had completely overlooked. I had been requesting 8 bits for each of red, green, blue, and alpha, plus 32 bits for depth. I thought that was a nice even 64 bits: 32 for the color and 32 for the distance. However, since SDL assumes you also want a stencil buffer (which I now realize is actually needed), the requested pixel format came out to 72 bits, which is not allowed for an accelerated context, since GPUs typically handle at most 64 bits per pixel. So it was falling back to the ancient GL 1.1 support Windows provides for use in the absence of drivers, which is also software-only.
The code has a lot of other stuff in it, so I have put together a basic sample of what I was trying to do. It is valid C++ and compiles on MinGW, assuming you link it properly.
#include <GL/gl.h>
#include <SDL2/SDL.h>
#include <stdio.h>

// Undo SDL's renaming of main so this builds as a plain program on MinGW.
#define SDL_main main

int main()
{
    SDL_Init(SDL_INIT_EVERYTHING);

    SDL_GL_SetAttribute(SDL_GL_RED_SIZE,   8);
    SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
    SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,  8);
    SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8);

    /// What I *was* doing...
    /* SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 32); */
    // ...and I didn't set SDL_GL_STENCIL_SIZE at all, so the total size of
    // each pixel ended up being more than 64 bits.

    /// What I *am* doing...
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE,   24);
    SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE,  8);
    // A nice even 32 bits for depth+stencil plus 32 bits for the color is 64.

    SDL_Window* window = SDL_CreateWindow(
        "Hay GPU y is u no taek pixelz ovar 64 bitz?",
        SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
        1024, 768,
        SDL_WINDOW_FULLSCREEN | SDL_WINDOW_OPENGL
    );
    SDL_GLContext context = SDL_GL_CreateContext(window);

    printf("GL Version [%s]\n", glGetString(GL_VERSION));

    // Tear down in reverse order of creation: the context first, then the window.
    SDL_GL_DeleteContext(context);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}
Hopefully other people who have been having a similar issue can learn from my mistake, or at least be able to mark it off a list of possible problems.

From SDL docs:
SDL_GL_CONTEXT_PROFILE_MASK determines the type of context created,
while both SDL_GL_CONTEXT_MAJOR_VERSION and
SDL_GL_CONTEXT_MINOR_VERSION determine which version. All three
attributes must be set prior to creating the first window, and in
general you can't change the value of SDL_GL_CONTEXT_PROFILE_MASK
without first destroying all windows created with the previous
setting.
Try setting the right values for SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_MAJOR_VERSION, and SDL_GL_CONTEXT_MINOR_VERSION; that will probably solve the issue (if your drivers actually support OpenGL 4.x contexts).
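For example, something along these lines (a minimal sketch, assuming SDL2 and a driver that actually exposes a 4.0 core profile; the 4.0 here is just an illustrative version number):

// Request a core-profile context *before* creating the window.
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 4);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 0);

SDL_Window* window = SDL_CreateWindow("GL 4 test",
    SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
    1024, 768, SDL_WINDOW_OPENGL);
SDL_GLContext context = SDL_GL_CreateContext(window);
if (!context)
    printf("Context creation failed: %s\n", SDL_GetError());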

Related

OpenGL / SDL2 : stencil buffer bits always 0 on PC

I'm writing an app using SDL2 / OpenGL, and doing some stencil operations.
Everything works as expected on Mac, however on PC the stenciling doesn't work.
Upon closer inspection I realized that the following code provides different outcomes on my Mac and PC:
SDL_Init(SDL_INIT_VIDEO);
SDL_GL_SetAttribute( SDL_GL_STENCIL_SIZE, 8 );
SDL_CreateWindow( ... );
SDL_CreateRenderer( ... )
... do stuff ...
When I print out the stencil bits ( SDL_GL_STENCIL_SIZE ) on Mac I get 8. When I do the same on PC, I get 0.
The same happens whether I run it on an actual PC, or on a PC emulator on the Mac.
What am I missing? How can I force SDL2 to request a context with a stencil buffer?
It looks to me like the Mac's OpenGL implementation has different defaults than the PC one, so I'm probably forgetting to do something to specifically request a stencil buffer, but I can't find any good information online ...
Help ^_^' ?
Never mind, I found the answer:
On PC SDL2 was defaulting to Direct3D (which I guess would explain why my opengl stencil buffer was not there ^_^').
To force SDL2 to use a specific driver, you can use the second parameter in SDL_CreateRenderer.
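For anyone else hitting this, a rough sketch of that driver selection (assuming SDL2 and that window already exists; this is my reconstruction, not the original code):

// Find the index of the "opengl" render driver and pass it to SDL_CreateRenderer.
int openglIndex = -1;
for (int i = 0; i < SDL_GetNumRenderDrivers(); ++i) {
    SDL_RendererInfo info;
    if (SDL_GetRenderDriverInfo(i, &info) == 0 && SDL_strcmp(info.name, "opengl") == 0) {
        openglIndex = i;
        break;
    }
}

// Passing -1 would mean "first driver that supports the flags"; here we insist on OpenGL.
SDL_Renderer* renderer = SDL_CreateRenderer(window, openglIndex, SDL_RENDERER_ACCELERATED);

// Alternatively, set the hint before creating the renderer:
// SDL_SetHint(SDL_HINT_RENDER_DRIVER, "opengl");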
Problem solved :D
StackOverflow, the biggest rubber duck available... ^-^'

OpenGL version and color depth

I made a program that changes resolution, color depth, etc., and then renders a simple texture on screen. It all works without any problem until I switch to 8-bit color depth. Then I end up calling non-existent functions (the function pointer is 0x00), such as glCreateShader. That made me wonder, and I had an idea which proved correct: the created context has a really low version.
After calling glGetString(GL_VERSION) I found that the context version was 1.1.0. With higher color depths it returns 4.4.
Is there any reason for the lower version? I searched Google and some opengl.org pages, but I did not find anything about 8-bit color depth being deprecated. Even Windows CAN switch to this color depth, so there is no reason why OpenGL shouldn't be able to handle it.
Sure, I can emulate it by decreasing the number of colors; memory is not what I am concerned about. I just want to know why this is happening. The program is a prototype for lab experiments, so I need as many options as possible, and this cuts a third of them away.
The last thing I should add is that the program is written in C/C++ with WinAPI and some WGL functions, but I don't think that matters much.
Your graphics driver is falling back to the software implementation because no hardware accelerated pixel format matching your criteria could be found.
Most drivers will not give you hardware accelerated 8-bit per-pixel formats, especially if you request an RGB[A] (WGL_TYPE_RGBA_ARB) color mode.
Sure, I can emulate it by decreasing the number of colors; memory is not what I am concerned about. I just want to know why this is happening.
To get an 8-bit format, you must use an indexed color mode (WGL_TYPE_COLORINDEX_ARB); paletted rendering. I suspect modern drivers will not even support that sort of thing unless they offer a compatibility profile (which rules out platforms like OS X).
The smallest RGB color depth you should realistically attempt is RGB555 or RGB565. 15/16-bit color is supported on modern hardware. Indexed color modes, on the other hand, are really pushing your luck.
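For reference, here is a rough Win32/WGL sketch of how you might check whether a given color depth actually gets you an accelerated RGBA format (assuming hdc is the device context of your window; this is a generic illustration, not the original program):

#include <windows.h>

// Returns 1 if ChoosePixelFormat finds a hardware-accelerated RGBA format
// with the requested color depth, 0 if only the generic (software) one matches.
int HasAcceleratedFormat(HDC hdc, BYTE colorBits)
{
    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize        = sizeof(pfd);
    pfd.nVersion     = 1;
    pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType   = PFD_TYPE_RGBA;
    pfd.cColorBits   = colorBits;   // e.g. 32, 16, or the problematic 8
    pfd.cDepthBits   = 24;
    pfd.cStencilBits = 8;

    int format = ChoosePixelFormat(hdc, &pfd);
    if (format == 0) return 0;

    // Ask what we actually got; PFD_GENERIC_FORMAT without PFD_GENERIC_ACCELERATED
    // means Microsoft's software GL 1.1 implementation.
    DescribePixelFormat(hdc, format, sizeof(pfd), &pfd);
    int softwareOnly = (pfd.dwFlags & PFD_GENERIC_FORMAT) &&
                      !(pfd.dwFlags & PFD_GENERIC_ACCELERATED);
    return !softwareOnly;
}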

masked_blit Allegro 4.2.1

When I use the masked_blit function:
masked_blit( animations[which], buffer, 0, 0, x, y, animations[which]->w, animations[which]->h )
I get wrong colors on my buffer; my bmp ends up looking shifted or garbled.
That is my problem.
Thanks for the help.
The most likely cause of your problem is that you are using a video bitmap with this call and your hardware does not support it, OR the two bitmaps (animations[] and buffer) have different color depths.
To rule out the possibility of the hardware not supporting this feature, check that the GFX_HW_VRAM_BLIT_MASKED bit in the gfx_capabilities flag is set on your PC.
If they are the same color depth but your hardware does not support the feature, you can always use the calls with memory bitmaps as the source, so the animations will reside in RAM instead of video memory.
Source:
Allegro 4.2.1 manual (pdf) - Sections 1.15.3 masked_blit and 1.9.13 gfx_capabilities
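A quick way to rule out both causes (a hedged sketch, assuming Allegro 4 has already been initialised with allegro_init(); the bitmap file name is just a placeholder):

// Load everything at the depth the screen actually uses, so the two
// bitmaps passed to masked_blit cannot end up with different depths.
set_color_depth(desktop_color_depth());
set_gfx_mode(GFX_AUTODETECT_WINDOWED, 640, 480, 0, 0);

BITMAP* buffer = create_bitmap(SCREEN_W, SCREEN_H);     // created at the current depth
BITMAP* sprite = load_bitmap("animation.bmp", NULL);    // converted to it on load

// Unlike blit(), masked_blit() does not convert between color depths,
// so a mismatch here is the usual cause of garbled colors.
int depths_match = bitmap_color_depth(sprite) == bitmap_color_depth(buffer);

// Only relevant when blitting between *video* bitmaps; if this bit is not
// set, keep the animations in ordinary memory bitmaps (as above) instead.
int vram_masked_ok = (gfx_capabilities & GFX_HW_VRAM_BLIT_MASKED) != 0;

if (depths_match)
    masked_blit(sprite, buffer, 0, 0, 0, 0, sprite->w, sprite->h);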

Weird results when running my program in gDebugger

I'm trying to run an OpenGL program through gDEBugger (http://www.gremedy.com) and I'm seeing a couple of strange things:
1) The frames seem to be rendering MUCH faster with gDEBugger. For example, if I update some object's position every frame, it'll just fly across the screen really fast, but when the program is run without gDEBugger, it moves at a much slower speed.
2) Strangely, gDEBugger reports 8 GL frames/second, which doesn't seem realistic: clearly, the FPS is higher than 8 (by the way, I have checked every possible OpenGL Render Frame Terminator in the Debug Settings dialog).
My program uses SDL to create an OpenGL rendering context:
Uint32 flags = SDL_HWSURFACE | SDL_DOUBLEBUF | SDL_OPENGL;
if (fullscreen) flags |= SDL_FULLSCREEN;

// Initialize SDL's video subsystem
if (SDL_Init(SDL_INIT_VIDEO) == -1) { /* handle the error */ }

// Attempt to set the video mode
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_Surface* s = SDL_SetVideoMode(width, height, 0, flags);
I'm using Windows 7 and an NVidia graphics card (geforce gtx 660m).
My question is: how does one explain the strange behavior I'm seeing in 1) and 2)? Could it be that for some reason the rendering is being performed in software instead of on the graphics card?
UPD: Obviously, I'm calling SDL_GL_SwapBuffers (which isn't listed as one of the render frame terminators) at the end of each frame, but I assume it should just call the Windows SwapBuffers function.
Regarding issue 1: apparently gDEBugger disables wait-for-vsync, which is why the framerate is much higher than 60 fps.
Regarding issue 2: for some reason, when working with SDL, two OpenGL contexts are created. You can see the correct number by adding performance counters for the second context.
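Incidentally, the "objects fly across the screen" part of issue 1 goes away if movement is scaled by real elapsed time instead of assuming one fixed step per frame. A minimal sketch using SDL 1.2's SDL_GetTicks (running, object and render are placeholders for whatever the actual program uses):

Uint32 lastTicks = SDL_GetTicks();

while (running)
{
    Uint32 now = SDL_GetTicks();
    float dt = (now - lastTicks) / 1000.0f;   // seconds since the last frame
    lastTicks = now;

    // Move at 100 units per *second*, so the speed is the same whether
    // vsync is on or gDEBugger has switched it off.
    object.x += 100.0f * dt;

    render();
    SDL_GL_SwapBuffers();
}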

SDL Won't Use OpenGL driver, defaults to DirectX

I am using SDL 1.2 for a project. It renders things just fine, but I want to do some small pixel shader effects. All of the examples for this show using the OpenGL driver for SDL's video subsystem.
So, I start the video subsystem with opengl as the driver and tell SDL_SetVideoMode() to use SDL_OPENGL. When I go to run the program, it now crashes on the SetVideoMode() call, which worked fine without forcing OpenGL.
I went back and ran the program without forcing OpenGL and dumped out SDL_VideoDriverName(), and it says I am using the "directx" driver.
My question is two-pronged: what is wrong that it doesn't like the opengl driver, and how do I get SDL to use OpenGL without problems here? Or, how do I get the SDL surface into DirectX to apply pixel shader effects?
I would prefer to use OpenGL, as it would be easier to port the code to other platforms.
As an example, I have added this code, which breaks when I try to use the OpenGL system:
#define WIN32_LEAN_AND_MEAN
#include <windows.h>
#include <SDL.h>

INT WINAPI WinMain( HINSTANCE hInst, HINSTANCE, LPSTR strCmdLine, INT )
{
    SDL_putenv("SDL_VIDEODRIVER=opengl");
    SDL_Init( SDL_INIT_EVERYTHING );
    SDL_VideoInit("opengl", 0);
    SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 ); // crashes here
    SDL_Surface *mWindow = SDL_SetVideoMode(1024, 768, 32,
        SDL_HWSURFACE | SDL_HWPALETTE | SDL_DOUBLEBUF | SDL_OPENGL);
    SDL_Quit();
    return 0;
}
SDL without the OpenGL option uses DirectX to obtain direct access to a 3D drawing surface. Using OpenGL triggers a whole different codepath in SDL, and with OpenGL you can no longer use the SDL surface for direct pixel access. It's very likely your program crashes because you're still trying to access the surface directly.
Anyway, if you want to use pixel shaders, you can no longer use the direct pixel buffer access provided by plain SDL. You have to do everything through OpenGL then.
Update
Some of the parameters you give to SDL are mutually exclusive. Also, the driver name given to SDL_VideoInit makes no sense when used together with OpenGL (it's only relevant together with DirectDraw to select a specific output device).
Also, because you already called SDL_Init(SDL_INIT_EVERYTHING), the call to SDL_VideoInit is redundant and may actually be harmful.
See this for a fully working OpenGL example:
http://sdl.beuc.net/sdl.wiki/Using_OpenGL_With_SDL
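For comparison, a stripped-down version of that setup (a sketch assuming SDL 1.2; window size and clear color are arbitrary):

#include <SDL.h>
#include <SDL_opengl.h>
#include <stdio.h>

int main(int argc, char* argv[])
{
    SDL_Init(SDL_INIT_VIDEO);

    // Ask for a double-buffered OpenGL window. No SDL_HWSURFACE/SDL_HWPALETTE:
    // those belong to the 2D surface path and don't mix with SDL_OPENGL.
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    SDL_Surface* screen = SDL_SetVideoMode(1024, 768, 32, SDL_OPENGL);
    if (!screen) {
        fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    // All drawing now goes through OpenGL, not through the SDL surface.
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    SDL_GL_SwapBuffers();

    SDL_Delay(2000);
    SDL_Quit();
    return 0;
}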