OpenGL and SDL: resizing the window while keeping the internal resolution unaltered?

I am making a game in OpenGL, using SDL to manage the window, set the icons, and all that.
Now that I have set up rendering of the scene to a framebuffer, I wondered whether I could resize the SDL window while keeping my original GL settings (I am trying to emulate an exact resolution, so resizing the window should just rescale the framebuffer to the window size).
I tried giving the SDL window double the resolution that I pass to glOrtho, but it gives unexpected results. Is this possible at all, or do I need to adapt my working resolution to the screen resolution all the time?
I use this code to initialize video
SDL_SetVideoMode(XRES, YRES, bpp, SDL_OPENGL | SDL_HWPALETTE);
gl_init(XRES,YRES);
And in gl_init I set the projection with glOrtho(0, width, 0, height, -1, 1), and size the framebuffer's "blank" texture to width and height as well.
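Roughly, gl_init does something like this (a sketch, not my exact code; fbo_texture is a placeholder name):
/* Sketch: projection and FBO texture both use the internal
 * resolution, not the window resolution. */
static GLuint fbo_texture;

void gl_init(int width, int height)
{
    glViewport(0, 0, width, height);   /* viewport at the internal resolution */
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, width, 0, height, -1, 1);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    /* the framebuffer's "blank" texture, sized to the same resolution */
    glGenTextures(1, &fbo_texture);
    glBindTexture(GL_TEXTURE_2D, fbo_texture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}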
When the functions are called with matching sizes, as in the first snippet above, all is well. But if I try something like
SDL_SetVideoMode(XRES*2, YRES*2, bpp, SDL_OPENGL | SDL_HWPALETTE);
gl_init(XRES,YRES);
Instead of getting my expected result (scaled output), I find that the output sits at the far left of the X axis and somewhere in the middle of the Y axis, as if the GL size were even bigger than the screen and the rest were cropped out. Is there anything I am missing?

Simply set the FBO texture to 1/4 of the window's size (1/2 of each edge length), then render the FBO's color buffer texture to the entire SDL window.
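A minimal sketch of that final pass, assuming legacy OpenGL and a hypothetical fbo_texture color attachment (window_w/window_h stand for the SDL window size):
/* back to the default framebuffer, covering the whole window */
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glViewport(0, 0, window_w, window_h);
/* draw the quad in normalized device coordinates */
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, fbo_texture);
glBegin(GL_QUADS);
glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
glEnd();
SDL_GL_SwapBuffers();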

I know this is an old question, but it is a top result on Google and does not have an answer.
You'll need to call glViewport(). Suppose you want your internal resolution to be 1024x768, and your window resolution is windowWidth by windowHeight. Before you write to your FBO, call glViewport(0, 0, 1024, 768). Then, before writing your FBO to the window, call glViewport(0, 0, windowWidth, windowHeight).
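In outline, with fbo standing in for your framebuffer object's name:
/* pass 1: render the scene at the fixed internal resolution */
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glViewport(0, 0, 1024, 768);
/* ... draw the scene ... */

/* pass 2: stretch the FBO's color texture over the whole window */
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glViewport(0, 0, windowWidth, windowHeight);
/* ... draw a fullscreen quad textured with the FBO's color buffer ... */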

Use this code in your game loop:
int w, h;
SDL_GetWindowSize(Window, &w, &h);
glViewport(0, 0, w, h);
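(Note that SDL_GetWindowSize is the SDL2 API; with SDL 1.2's SDL_SetVideoMode, as used in the question, the window size is simply whatever you requested.)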

Related


OpenGL FrameBuffer and glViewport

I am going through a render pass onto a FrameBufferObject such that I can use the result as a texture in the next rendering pass. Looking at tutorials, I need to call:
glBindFramebuffer(GL_FRAMEBUFFER, mFrameBufferObjectID);
glViewport(0, 0, mWidth, mHeight);
where mWidth and mHeight are the width and height of the framebuffer object. It seems that without this glViewport call, nothing gets drawn correctly. What's strange is that upon starting the next frame, I need to call:
glViewport(0, 0, window_width, window_height);
so that I can go back to my previous window width/height; but calling it seems to render my stuff at only half of the original window size, so I physically see only a quarter of my screen getting rendered onto (yes, the entire scene is on it). I tried putting a breakpoint there and looking at the width and height values; they are the original values (1024, 640). Why is this happening? I tried doubling those, and it correctly draws on my entire window.
I'm doing this on a Mac via Xcode and with GLEW.
The viewport settings are stored in the global state. So if you change them, they stay at the new values until you call glViewport again. As you already noted, you'll have to set the size whenever you want to render to an FBO (or backbuffer) that has a different size than the previously bound one.
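One way to avoid forgetting this is a tiny helper that binds a target and sets the matching viewport in one step (a sketch; the helper name is made up):
/* Bind a render target and set the viewport to its size. */
void bind_target(GLuint fbo, int width, int height)
{
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);  /* 0 binds the backbuffer */
    glViewport(0, 0, width, height);
}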
Try adjusting the scissor box as well as the viewport if you have the scissor test enabled using
glEnable(GL_SCISSOR_TEST);
To fix your problem write
glViewport(0, 0, mWidth, mHeight);
glScissor(0, 0, mWidth, mHeight);
and
glViewport(0, 0, window_width, window_height);
glScissor(0, 0, window_width, window_height);
every time you switch to a framebuffer with a different size than the previous one, or always, just to be safe.
See the glScissor reference for details, or this post explaining its purpose: https://gamedev.stackexchange.com/questions/40704/what-is-the-purpose-of-glscissor
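If you prefer, wrap the two calls so they can't get out of sync (a sketch; the helper name is made up):
/* Keep the viewport and scissor box in sync for the current target. */
void set_render_size(int width, int height)
{
    glViewport(0, 0, width, height);
    glScissor(0, 0, width, height);
}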

Why retina screen coordinate value is twice the value of pixel value

My computer is a Mac Pro with a 13-inch retina screen. The screen resolution is 1280x800 (the default).
Using the following code:
gWindow = glfwCreateWindow(800, 600, "OpenGL Tutorial", NULL, NULL);
//case 1
glViewport(0,0,1600,1200);
//case 2
glViewport(0,0,800,600);
Case 1 results in a triangle that fits the window.
Case 2 results in a triangle that is 1/4th the size of the window.
Half of viewport:
The GLFW documentation indicates the following (from here):
While the size of a window is measured in screen coordinates, OpenGL
works with pixels. The size you pass into glViewport, for example,
should be in pixels. On some machines screen coordinates and pixels
are the same, but on others they will not be. There is a second set of
functions to retrieve the size, in pixels, of the framebuffer of a
window.
Why is my retina screen's coordinate value twice the pixel value?
As Sabuncu said, it is hard to know which result should be correct without knowing how you draw the triangle.
But I guess your problem is related to the fact that with a retina screen, when you use the 2.0 scale factor, you need to render twice as many pixels in each dimension as you would with a regular screen - see here
The method you're after is shown just a few lines below your GLFW link:
There is also glfwGetFramebufferSize for directly retrieving the current size of the framebuffer of a window.
int width, height;
glfwGetFramebufferSize(window, &width, &height);
glViewport(0, 0, width, height);
The size of a framebuffer may change independently of the size of a window, for example if the window is dragged between a regular monitor and a high-DPI one.
In your case I'm betting the framebuffer size you'll get will be twice the window size, and your gl viewport needs to match it.
The framebuffer size need not be equal to the size of the window, which is why you need to use glfwGetFramebufferSize:
This function retrieves the size, in pixels, of the framebuffer of the specified window. If you wish to retrieve the size of the window in screen coordinates, see glfwGetWindowSize.
Whenever you resize your window you need to retrieve the size of its framebuffer and update the viewport accordingly:
glfwGetFramebufferSize(gWindow, &framebufferWidth, &framebufferHeight);
glViewport(0, 0, framebufferWidth, framebufferHeight);
With a retina display, the default framebuffer (the one rendered onto the canvas) is twice the resolution of the display in each dimension. Thus, if the display is 800x600, the internal canvas is 1600x1200, and therefore your viewport should be 1600x1200, since this is the "window" into the framebuffer.
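GLFW can also notify you whenever the framebuffer size changes (for example when the window is dragged between a regular and a high-DPI monitor), so the viewport can be kept in sync automatically. A minimal sketch using glfwSetFramebufferSizeCallback, with gWindow taken from the question:
/* Called by GLFW whenever the framebuffer size changes. */
void framebuffer_size_callback(GLFWwindow* window, int width, int height)
{
    glViewport(0, 0, width, height);
}

/* after creating the window: */
glfwSetFramebufferSizeCallback(gWindow, framebuffer_size_callback);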

Drawing to different size FBO

I'm having an issue while using FBO.
My window size is 1200x300.
When I create a FBO that's 1200x300, everything is fine.
However, when I create the FBO at 2400x600 (twice as big on both axes) and try to render the exact same primitives, only one quarter of the FBO's actual area gets used.
FBO same size as window:
FBO twice as big (note the triangle clipping):
I render these two triangles into the FBO, then render a fullscreen quad with the FBO's texture over it. I clear the FBO with this pine-green color, so I know for sure that all the empty space in the second picture actually comes from the FBO.
// init() of the program
albedo = new RenderTarget(2400, 600, 24 /*depth*/); // in first case, params are 1200, 300, 24
// draw()
RenderTarget::set(albedo); // render to fbo
RenderTarget::clearColor(0.0f, 0.3f, 0.3f, 1.0f);
RenderTarget::clear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
// render triangles ...
glDrawArrays(GL_TRIANGLES, 0, 6);
// now it's time to render a fullscreen quad
RenderTarget::set(); // render to back-buffer
RenderTarget::clearColor(0.3f, 0.0f, 0.0f, 1.0f);
RenderTarget::clear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, albedo->texture());
glUniform1i(albedoUnifLoc, 0);
RenderTarget::drawFSQ(); // draw fullscreen quad
I have no cameras of any kind, I don't use glViewport anywhere, and I always send the coordinates of the primitives in unit-square space (both the x and y coordinates are in the [-1,1] range).
The question is: what am I doing wrong, and how do I fix it?
A side question: is glViewport related in any way to the currently bound framebuffer? As far as I could understand, that function is just used to set the rectangular area of the window in which drawing will occur.
Any suggestion would be greatly appreciated. I tried searching for the problem online; the only similar thing was in this SO question, but it hasn't helped me.
You need to call glViewport() with the size of your render target. The only time you can get away without calling it is when you render to the window, and the window is never resized. That's because the default viewport matches the initial window size. From the spec:
In the initial state, w and h are set to the width and height, respectively, of the window into which the GL is to do its rendering.
If you want to render to an FBO with a size different from your window, you have to call glViewport() with the size of the FBO. And when you go back to rendering to the window, you need to call glViewport() with the window size again.
The viewport dimensions are not per framebuffer state. I always thought that would have made sense, but it is not defined that way. So whenever you call glViewport(), you are changing global (i.e. per context) state, independent of the currently bound framebuffer.
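Applied to the code in the question, with the sizes given there, that looks something like this:
RenderTarget::set(albedo);        // render to the 2400x600 FBO
glViewport(0, 0, 2400, 600);      // viewport must match the FBO size
// ... clear and draw the triangles ...

RenderTarget::set();              // back to the 1200x300 backbuffer
glViewport(0, 0, 1200, 300);      // viewport must match the window size
// ... clear and draw the fullscreen quad ...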

C++ GLFW3 fullscreen stretch issue in linux

I am porting one of my games from Windows to Linux, using GLFW3 for window creation. The code runs perfectly well on Windows (using GLFW3 and OpenGL), but when I compile and run it on Ubuntu 12.10, there is an issue in fullscreen mode (windowed mode runs fine) where the right part (about 25%) of the frame gets stretched and goes off screen.
Here's how I am creating GLFW window:
window = glfwCreateWindow(1024, 768, "Chaos Shell", glfwGetPrimaryMonitor(), NULL);
And here's my OpenGL initialisation code:
glViewport(0, 0, 1024, 768);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(-512.0f, 512.0f, -384.0f, 384.0f, 0.0f, 1.0f);
glMatrixMode(GL_MODELVIEW);
glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
The above code should load the game in fullscreen mode at 1024 by 768 resolution.
When I run it, glfwCreateWindow changes the screen resolution from my current resolution (1366 by 768) to 1024 by 768, but the right part of the frame goes off screen. If I manually change the resolution to 1024 by 768 first and then run the game, everything looks alright.
Also, running this same code on Windows doesn't show any issue no matter what my current screen resolution is. It just changes the resolution to 1024 by 768, and then everything looks perfect.
If someone can find out why it is acting weird in Ubuntu, I will really appreciate it...
You're probably running into an issue with the window manager. In short, the window manager didn't notice the change in resolution and, due to the fullscreen flag, expands the window to the old resolution.
Or you didn't get 1024x768 at all, because your screen doesn't support it, and instead got a smaller 16:9 resolution. So don't use hardcoded values for setting the viewport.
Honestly: you shouldn't change the screen resolution at all! Hardly anybody uses CRT displays anymore, and for displays using discrete pixels (LCDs, AMOLEDs, DLP projectors, LCoS projectors) it makes little sense to run them at anything other than their native resolution. So just create a fullscreen window without making the system change the resolution.
When setting the viewport, query the actual window size from GLFW instead of relying on your hardcoded values (this could also fix your issue with the resolution change).
If you want to reduce the load on the GPU when rendering, use an FBO to render to a texture of the desired resolution, and in a last step draw that texture to a fullscreen quad to stretch it up to display size. It looks better than what most screen scalers produce, and your game doesn't mess with the rest of the system.
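A minimal sketch of creating such a render-to-texture FBO at a fixed resolution (assuming OpenGL 3.0-style framebuffer functions; the names fbo, color_tex, render_w and render_h are placeholders):
GLuint fbo, color_tex;
const int render_w = 1024, render_h = 768;   /* fixed internal resolution */

/* color texture the scene will be rendered into */
glGenTextures(1, &color_tex);
glBindTexture(GL_TEXTURE_2D, color_tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, render_w, render_h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

/* framebuffer object with the texture as its color attachment */
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, color_tex, 0);
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    /* handle incomplete framebuffer */
}
glBindFramebuffer(GL_FRAMEBUFFER, 0);        /* back to the window */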
Update due to comment
Setting the screen resolution because the game cannot cope with non-4:3 resolutions is very bad style. It took long enough for large game studios to adapt to widescreens. Which is unacceptable, because it's so easy to fix.
Don't cover up mistakes by forcing something on users that they might not want. And if a user has a nice display, give them the opportunity to actually use it!
Your problem is not the display resolution; it's the hardcoded viewport and projection setup. You need to fix that.
To fix your "game looks horrible at different resolution" issue you need to set the viewport and projection in response to the window's size. Like this:
int window_width, window_height;
glfwGetWindowSize(window, &window_width, &window_height);
if (0 == window_width || 0 == window_height) {
    /* window has no area, so there's nothing to draw to */
    return;
}

float const window_aspect = (float)window_width / (float)window_height;

/* we want to draw with a total of 768 units vertically, as if we
 * were drawing to a screen 768 pixels in height */
float const projection_scale = 768./2;

glViewport(0, 0, window_width, window_height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho( -window_aspect * projection_scale,
          window_aspect * projection_scale,
         -projection_scale,
          projection_scale,
          0.0f,
          1.0f );
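With this setup the scene always spans 768 units vertically, and wider windows simply see proportionally more horizontal range, so nothing is stretched no matter what aspect ratio the display has.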