Resource Initialization and OpenGL Contexts

We have an OpenGL application (using Ogre3D and SDL, not calling OpenGL directly) and we are trying to change the resolution at runtime. It seems that we need to re-initialize our OpenGL context with the new resolution, but a number of things break along the way. On Linux it seems to work for a while, then we get graphical corruption on screen. On Windows it simply crashes the next time we try to render a frame. We have forced Ogre to reload its textures, and if we render nothing but textures (no 3D models) this works fine, but any 3D model causes a crash, and reloading the models before rendering them has no effect.
Here is a link to an in-depth explanation of the Ogre3D calls we are doing: http://www.ogre3d.org/forums/viewtopic.php?f=2&t=62825
All we really need to know is: when re-initializing an OpenGL context, what resources need to be restored?
Why does adjusting an OpenGL context affect other resources? Is this just the way OpenGL works, or did one of the libraries we use introduce the issue? Could we have introduced it ourselves without knowing?

Did you have a look at this forum thread?
SDL seems to destroy the OpenGL context when changing resolution. In that case, all your GL resources are destroyed along with the context.
One possible solution would be to create another 'dummy' GL context that shares resources with your 'real' GL context, and to keep it alive while SDL destroys the 'main' context. That way most of your resources should survive.
Note that some resources can't be shared: textures and VBOs are fine, but VAOs cannot be shared.
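As a concrete illustration of the dummy-context idea: the SDL in use at the time of this question had no public API for context sharing, but SDL2 does, so here is a minimal sketch assuming SDL2 and a window created with SDL_WINDOW_OPENGL:

#include <SDL.h>

SDL_Window *win = SDL_CreateWindow("game",
    SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
    1280, 720, SDL_WINDOW_OPENGL);
SDL_GLContext mainCtx = SDL_GL_CreateContext(win); // also made current

// Ask SDL to share the *next* context with the one current now.
SDL_GL_SetAttribute(SDL_GL_SHARE_WITH_CURRENT_CONTEXT, 1);
SDL_GLContext keeperCtx = SDL_GL_CreateContext(win); // shares textures/VBOs
SDL_GL_MakeCurrent(win, mainCtx);                    // back to the main context

// When the resolution change destroys mainCtx, keeperCtx still references
// the shared objects, so they survive. Recreate the main context sharing
// with keeperCtx and carry on.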

OpenGL support was added to SDL after its surface code had been established. That's why changing the size of an SDL window is destructive. You were already pointed to OpenGL context sharing and its caveats. However, I'd avoid the problem altogether by not using SDL to create the OpenGL window. You can use all the other facilities SDL provides without a window managed by SDL, so the only things that change are input event processing and how the window is created. Instead of SDL I'd use GLFW for the window, which, like SDL, requires you to implement your own event processing loop, so using GLFW as a drop-in replacement for OpenGL window and context creation is straightforward. A sketch of this split follows below.
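A minimal sketch of that split, assuming modern GLFW 3 (this answer predates it; GLFW 2's API differs) and SDL initialized without its video subsystem:

#include <GLFW/glfw3.h>
#include <SDL.h>

int main()
{
    SDL_Init(SDL_INIT_AUDIO | SDL_INIT_TIMER); // SDL, but no SDL window
    glfwInit();
    GLFWwindow *win = glfwCreateWindow(1280, 720, "game", NULL, NULL);
    glfwMakeContextCurrent(win);

    while (!glfwWindowShouldClose(win))
    {
        // ... render with OpenGL, play audio through SDL ...
        glfwSwapBuffers(win);
        glfwPollEvents(); // GLFW now handles keyboard/mouse events
    }
    glfwTerminate();
    SDL_Quit();
    return 0;
}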

Related

How to Get Unity Context into OpenGL Window

I want to get the Unity context into OpenGL so I can display a Unity render texture in an OpenGL GLFW window. I tried using
oldContext = glfwGetCurrentContext(); but the value of oldContext is just null.
I am trying to use Unity's low-level native plugin interface and Texture.GetNativeTexturePtr.
Any help would be greatly appreciated!
An OpenGL context cannot be queried like OpenGL state objects via some glGet* API. The context is not part of the OpenGL API; it is part of the system you're running on, and it exists to let you maintain OpenGL state and issue commands to the driver. You must access a system-specific handle that points to the context via a system-specific API. On Windows (WinGDI) that would be
HGLRC wglGetCurrentContext();
On Linux, see the related GLX API; glXGetCurrentContext() gives you the GLXContext.
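A minimal sketch of the platform split (either call returns NULL when no context is current on the calling thread, which is exactly the situation the GLFW call above runs into):

// Fetch the current context handle on each platform.
#ifdef _WIN32
#include <windows.h>
void *currentGLContext(void) { return (void *)wglGetCurrentContext(); }
#else
#include <GL/glx.h>
void *currentGLContext(void) { return (void *)glXGetCurrentContext(); }
#endif
// NULL means no GL context is current on the calling thread.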
I did this once in Unity3D (a framebuffer readout plugin), but it only used Unity's OpenGL or DirectX context to issue API commands.
Also, I am not sure you can 'inject' or share a context for a window that doesn't own that context. You see, when you (or Unity) initialize the display, it creates the context and related GL resources, like the default FBO with all required attachments, on its own, and that FBO is mapped to some system resource (device) which actually takes care of presenting those pixels on the screen. Therefore, I am not sure a display context can be moved from window to window in the same manner that a context can be shared between threads. (But I could be wrong on this one.)
You can create your plugin window on some thread, with its own GL context, then create a texture object and share it between the two contexts. Remember, GL textures are shareable. If you copy the contents of Unity's screen FBO into that texture, you can then copy from the texture into your plugin's screen FBO. A sketch of this follows below.
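A rough sketch of that idea on Windows; pluginDC (the DC of our own window), sharedTex (an already-created texture id), width and height are all assumptions here, and wglShareLists() is used as the classic WGL sharing mechanism:

#include <windows.h>
#include <GL/gl.h>

HGLRC createSharedPluginContext(HDC pluginDC)
{
    HGLRC unityCtx  = wglGetCurrentContext(); // context Unity made current
    HGLRC pluginCtx = wglCreateContext(pluginDC);
    wglShareLists(unityCtx, pluginCtx);       // textures are now shared
    return pluginCtx;
}

void copyFrame(GLuint sharedTex, int width, int height)
{
    // With Unity's context current and its screen FBO bound for reading,
    // snapshot the current frame into the shared texture.
    glBindTexture(GL_TEXTURE_2D, sharedTex);
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, width, height);
}

// On the plugin thread, with pluginCtx current, draw a quad textured with
// sharedTex into the plugin window's default framebuffer.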
By the way, look at this SO question. You can see there vendor-specific GL extensions which allow copying data into a texture from a different context, without requiring a shared context or share-lists setup.
Regarding why GLFW returns you NULL: in your example you call the GLFW function
glfwGetCurrentContext()
But if you look at its source code, you see this:
GLFWAPI GLFWwindow* glfwGetCurrentContext(void)
{
    _GLFW_REQUIRE_INIT_OR_RETURN(NULL);
    return _glfwPlatformGetTls(&_glfw.contextSlot);
}
This means it retrieves the GLFWwindow pointer from GLFW's own thread-local storage, not from the system. If you didn't create the context via GLFW, you won't get a valid pointer back. So try working directly with your system's API as explained above.

Multisampled framebuffer in OpenGL under Windows

How do I set the number of samples in the default framebuffer, given by Windows?
I found this page, but although I use GLEW, there is no wglChoosePixelFormatARB function available in my context. What could be the cause of this?
With WGL you can only set the pixel format for a window once, when you create the context; you can only call wglChoosePixelFormatARB() once the client driver is loaded; and the client driver is only loaded once you have an OpenGL context. Yes, that's circular. So you have to do the following (a sketch follows the list):
1. Create a window with an OpenGL context.
2. Get the function pointer for wglChoosePixelFormatARB().
3. Destroy the window, and create a new window with the desired pixel format.
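Here is a minimal sketch of that dance, with all error handling omitted; the "STATIC" window class is used as a throwaway, and <GL/wglext.h> is assumed to provide the function pointer typedef and the WGL_* tokens:

#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h> // PFNWGLCHOOSEPIXELFORMATARBPROC, WGL_* tokens

PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB = NULL;

void loadWglChoosePixelFormat(void)
{
    // 1. Throwaway window with a plain pixel format, just to get a context.
    HWND dummy = CreateWindowA("STATIC", "", WS_OVERLAPPED,
                               0, 0, 1, 1, NULL, NULL, NULL, NULL);
    HDC dc = GetDC(dummy);

    PIXELFORMATDESCRIPTOR pfd = { sizeof(pfd), 1,
        PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,
        PFD_TYPE_RGBA, 32 };
    SetPixelFormat(dc, ChoosePixelFormat(dc, &pfd), &pfd);

    HGLRC ctx = wglCreateContext(dc);
    wglMakeCurrent(dc, ctx);

    // 2. With a driver loaded, the extension pointer can be fetched.
    wglChoosePixelFormatARB = (PFNWGLCHOOSEPIXELFORMATARBPROC)
        wglGetProcAddress("wglChoosePixelFormatARB");

    // 3. Tear the dummy down; create the real window afterwards.
    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(ctx);
    ReleaseDC(dummy, dc);
    DestroyWindow(dummy);
}

// Later, on the real window's DC, ask for a 4x multisampled format:
// const int attribs[] = {
//     WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
//     WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
//     WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
//     WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
//     WGL_COLOR_BITS_ARB,     32,
//     WGL_SAMPLE_BUFFERS_ARB, 1,
//     WGL_SAMPLES_ARB,        4,
//     0 };
// int fmt; UINT n;
// wglChoosePixelFormatARB(realDC, attribs, NULL, 1, &fmt, &n);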
If you've got any sense in you, you'll use SDL or GLFW to do this for you, because it's just a bunch of plumbing you have to write, there's no value in learning how to do it, and you probably want to get some real work done. SDL/GLFW/etc. is how 99% of the OpenGL game devs out there do it.
If you really want to do this yourself and get stuck, look at the SDL or GLFW source code to see how they do it.
In SDL, the src/video/windows/SDL_windowsopengl.c file has a function WIN_GL_ChoosePixelFormatARB() which does what you want. Also note the function WIN_GL_LoadLibrary().
In GLFW, the src/win32_window.c file has a function _glfwPlatformCreateWindow() which does what you want.
P.S. GLEW is a bit broken with core contexts and modern cards, so watch out. There are other GL loaders out there.

Using a SDL window with OpenGL the other way around

I just recently changed all rendering in my application/game from SDL to OpenGL. I still use SDL for keyboard input, loading images and creating the window.
But since I only render with OpenGL, do you think I should change the window to an OpenGL-initialized window instead of an SDL window? The header I use for OpenGL functions at the moment is "SDL_opengl.h". Does that affect things?
What are the advantages and disadvantages if I do this? Right now it feels like the logical "next step", since I got rid of all the other SDL rendering code.
Just keep on using SDL for input and window management.
There's no provision for either in the OpenGL spec so there's really no such thing as an "OpenGL initialized window".
As genpfault said, you should keep using SDL for your window initialization. It's more clean and portable than OS-specific methods of initializing a window for OpenGL rendering.
I would also recommend switching from SDL_opengl.h to SDL.h and gl.h; the definitions in SDL_opengl.h aren't necessary for basic window management, and they conflict with the definitions in other OpenGL libraries like GLEW (which you may want to use later for pixel shaders and framebuffers).
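For illustration, a minimal sketch using SDL2 and GLEW (the question predates SDL2, but the include-order point is the same: GL/glew.h must come before any other GL header):

#include <GL/glew.h> // must precede any other GL header
#include <SDL.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *win = SDL_CreateWindow("app",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        800, 600, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);
    glewInit(); // load extension entry points into the current context

    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    SDL_GL_SwapWindow(win);

    SDL_Delay(2000);
    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}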

WGL: possible to find offscreen context and render to window?

There is an interesting browser framework called Awesomium, which is basically a wrapper around the Chromium browser engine.
I'm interested in using it to redistribute WebGL-based games for the desktop. However Awesomium only supports rendering using a pixel buffer sent to the CPU, even though the WebGL context itself is based on a real hardware-accelerated OpenGL context. This is inefficient for real-time high-performance games and can kill the framerate on low-end machines.
Awesomium may eventually fix this, but it got me thinking: is it possible to search a process for an offscreen OpenGL context and render it directly to a window? This would avoid the inefficient rendering method, keeping rendering entirely on the GPU. I'm using a native C++ app on Windows, so presumably this will involve WGL specifics. Also, since Chromium is a multithreaded browser engine, it may involve finding an OpenGL context on a different thread or even a different process. Is it possible?
is it possible to search a process for an offscreen OpenGL context and render it directly to a window?
No, it is not possible. If the OpenGL context was created for one OS buffer, its output cannot be redirected to another buffer or another OpenGL context.
Maybe you can use shared OpenGL resources, but only if both OpenGL contexts are created with that option.

Can you create OpenGL context without opening a window?

Occasionally I hit places where I'd want to get an OpenGL framebuffer object, but where I'm not interested in opening a window of any kind.
Is it possible to create an OpenGL context without attaching it to a window of any kind?
Yes! You can use the desktop window as the window passed to OpenGL, as long as you don't try to display anything on it ;)
Just call GetDesktopWindow() and pass the result as the window handle when creating your OpenGL context.
http://www.opengl.org/wiki/Creating_an_OpenGL_Context
According to this web page, WGL_ARB_create_context can be used to create a context without a window. I have not actually tried it myself. I used freeGLUT to create the context and then rendered off-screen to a framebuffer + renderbuffer, exiting the program without ever calling glutMainLoop. It is kludgy, but it works for my purposes.
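A minimal sketch of that freeGLUT trick, assuming GLEW provides the FBO entry points and a 512x512 off-screen target:

#include <GL/glew.h>
#include <GL/freeglut.h>
#include <vector>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutCreateWindow("offscreen"); // a GL context exists from here on
    glewInit();

    // Build an FBO with a single color renderbuffer.
    GLuint fbo, color;
    glGenRenderbuffers(1, &color);
    glBindRenderbuffer(GL_RENDERBUFFER, color);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA8, 512, 512);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                              GL_RENDERBUFFER, color);

    // Render into the FBO instead of the (never-shown) window.
    glViewport(0, 0, 512, 512);
    glClearColor(1.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    std::vector<unsigned char> pixels(512 * 512 * 4);
    glReadPixels(0, 0, 512, 512, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());

    // ... write pixels out ... then exit without calling glutMainLoop().
    return 0;
}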
Yes, you can perform off-screen rendering with OpenGL, but the exact way to set it up is dependent on the operating system.
The closest you get to an OS-independent way would be to use Mesa 3D, but then your off-screen rendering would not be hardware accelerated.