OpenGL: Trouble setting up viewport and glOrtho

I am trying to set up my OpenGL view for some texture rendering. Following some advice on the forum, I set up my viewport and ortho matrix as follows:
First I try to compute the screen width and height that I can use while maintaining the aspect ratio of my image:
void resize(int w, int h)
{
    // Cast so the division happens in floating point; if image_width and
    // image_height are integers, the division would otherwise truncate.
    float target_aspect_ratio = (float)image_width / (float)image_height;
    width = w;
    height = (int)(width / target_aspect_ratio + 0.5f);
    if (height > h) {
        height = h;
        width = (int)(height * target_aspect_ratio + 0.5f);
    }
    // I want to center my image, so I have these offsets.
    off_x = (w - width) / 2;
    off_y = (h - height) / 2;
    glViewport(off_x, off_y, width, height);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, width, 0, height, 0.0f, 1.0f);
}
Now when I want to render my texture I do:
void paint()
{
    // texture binding etc.
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(0, 0);
    glTexCoord2f(1, 0); glVertex2f(width, 0);
    glTexCoord2f(1, 1); glVertex2f(width, height);
    glTexCoord2f(0, 1); glVertex2f(0, height);
    glEnd();
}
However, this does not show the image as expected. The aspect ratio is not maintained as I resize the window. It is almost as if glViewport has no effect, and I can verify that resize() is called every time my window is resized.
Update:
It is strange, almost as if these calls have no effect. I even hard-coded the values:
_off_x = _off_y = 0;
_width = 500;
_height = 500;
So I expected the viewport to be a box in the lower-left corner of my screen, but the image is drawn as before, basically using the whole screen as the viewport.
Update 2:
Ok, so if I call
glViewport(_off_x, _off_y, _width, _height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, _width, 0, _height, 0, 1);
in my paint event handler, it works as expected! However, I thought it was enough to put this in the resize event handler.

Before you start drawing, you need to switch the matrix mode back to GL_MODELVIEW. You don't need to set your projection matrix inside your render function every frame.
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
Here is a detailed analysis I wrote about the glMatrixMode() modes:
OpenGL glMatrixMode(GL_PROJECTION) vs glMatrixMode(GL_MODELVIEW)
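A minimal sketch of that division of labor, based on the question's own resize() (the only new lines are the GL_MODELVIEW switch at the end):

void resize(int w, int h)
{
    /* ...compute off_x, off_y, width, height as above... */
    glViewport(off_x, off_y, width, height);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, width, 0, height, 0.0f, 1.0f);
    glMatrixMode(GL_MODELVIEW); /* hand control back to the modelview stack */
    glLoadIdentity();
}

With this in place, paint() can draw right away without touching the projection matrix each frame.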

Related

OpenGL 2d rectangle not being rendered

I am trying to render a rectangle onto the screen. When the program is run, only the clear color shows up, and no rectangle.
Here's the code:
glClearColor(0.0, 0.0, 0.0, 0.0);
glViewport(0, 0, 1280, 720);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, 1280, 720, 0, -10, 10);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // Clear the screen and depth buffer (bitwise |, not logical ||).
int x = 100;
int y = 100;
while (!glfwWindowShouldClose(window)) {
    glfwPollEvents();
    glBegin(GL_QUADS);
    glVertex2f(x, y);
    glVertex2f(x + 10, y);
    glVertex2f(x + 10, y + 10);
    glVertex2f(x, y + 10);
    glEnd();
    gsm->update();
    gsm->render();
    glfwSwapBuffers(window);
}
It got culled. You inverted the Y axis with your projection by supplying bottom = 720, which is larger than top = 0. Your quad is counter-clockwise in your local coordinates, but in normalized device coordinates it becomes clockwise; remember, the projection matrix is part of the overall transform! If you are in the default state, then of those two winding directions GL_CCW is the one considered "front", and OpenGL culls with the mode glCullFace(GL_BACK) (a quad is internally treated as a pair of triangles).
Either change the order of the vertices:
glBegin(GL_QUADS);
glVertex2f(x, y);
glVertex2f(x, y + 10);
glVertex2f(x + 10, y + 10);
glVertex2f(x + 10, y);
glEnd();
or change the culling mode to match the left-handedness of your coordinate system, or disable culling entirely, as sketched below.
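A minimal sketch of those last two alternatives (state calls only; this assumes culling was enabled somewhere in your setup code):

// Keep the vertex order and instead tell OpenGL that clockwise faces
// are the front faces under the flipped-Y projection:
glFrontFace(GL_CW);
// ...or simply draw both windings:
glDisable(GL_CULL_FACE);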
See also:
1. https://www.khronos.org/opengl/wiki/Viewing_and_Transformations
2. The answer to "Is OpenGL coordinate system left-handed or right-handed?"

Why is my Texture Quad positioned incorrectly with glTexCoord2f using LWJGL?

I am going to make an adventure game in 2D, with grass, trees, and other things, if I can make these. My problem is that when I use glTexCoord2f to map the texture onto a quad, the quads get separated from each other by about 25 pixels. The quads are supposed to be connected together, like in any 2D game.
I'm loading the textures with SlickUtil, and the size of the texture is 100x100.
Here's my source code for rendering the quads:
public static void Render(){
    example--;
    GL11.glLoadIdentity();
    Color.white.bind();
    system.Game.ground.bind();
    GL11.glTranslatef(example, 0, 0);
    // I used a for loop for cloning the quad at each side.
    for(int x = 0; x <= width; x++){
        GL11.glBegin(GL11.GL_QUADS);
        GL11.glTexCoord2f(0, 0);
        GL11.glVertex2f(x * 100, 0);
        GL11.glTexCoord2f(1, 0);
        GL11.glVertex2f(x * 100 + 100, 0);
        GL11.glTexCoord2f(1, 1);
        GL11.glVertex2f(x * 100 + 100, 100);
        GL11.glTexCoord2f(0, 1);
        GL11.glVertex2f(x * 100, 100);
        GL11.glEnd();
    }
    GL11.glLoadIdentity();
}
Here's my InitGL code:
GL11.glEnable(GL11.GL_TEXTURE_2D);
GL11.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
GL11.glEnable(GL11.GL_BLEND);
GL11.glBlendFunc(GL11.GL_SRC_ALPHA, GL11.GL_ONE_MINUS_SRC_ALPHA);
GL11.glViewport(0,0,width,height);
GL11.glMatrixMode(GL11.GL_MODELVIEW);
GL11.glMatrixMode(GL11.GL_PROJECTION);
GL11.glLoadIdentity();
GL11.glOrtho(0, width, height, 0, 1, -1);
GL11.glMatrixMode(GL11.GL_MODELVIEW);
SlickUtil only loads textures with power-of-two dimensions; OpenGL itself only gained support for non-power-of-two textures in OpenGL 2.0. Your 100x100 image therefore ends up padded into a 128x128 texture, which is why your quads appear separated. Try loading a texture with dimensions of 128 by 128 pixels, then display it at 100 by 100 pixels (as you did in your example).
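Alternatively, you can keep the 100x100 image and compensate by only sampling the sub-rectangle of the padded texture that actually contains image data. A rough sketch of the idea in plain C-style GL (in LWJGL these calls map one-to-one onto the GL11.* methods; the 100.0f/128.0f ratio assumes the loader pads 100x100 up to 128x128):

/* Fraction of the padded 128x128 texture occupied by the 100x100 image. */
float u = 100.0f / 128.0f;
float v = 100.0f / 128.0f;
glTexCoord2f(0, 0); glVertex2f(px, py);
glTexCoord2f(u, 0); glVertex2f(px + 100, py);
glTexCoord2f(u, v); glVertex2f(px + 100, py + 100);
glTexCoord2f(0, v); glVertex2f(px, py + 100);

Here px, py stand for the quad's top-left corner (x * 100 and 0 in the loop above).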

How do you adjust an opengl viewport size without scaling its contents

When my window is resized, I don't want the contents to scale, just the viewport size to increase. I found this while searching on Stack Overflow (http://stackoverflow.com/questions/5894866/resize-viewport-crop-scene), which is pretty much the same as my problem. However, I'm confused as to what to set the zoom to, and where; I tried it with 1.0f, but then nothing was shown at all.
This is my resize function at the moment, which does scaling:
void GLRenderer::resize() {
    RECT rect;
    int width, height;
    GLfloat aspect;
    GetClientRect(hWnd, &rect);
    width = rect.right;
    height = rect.bottom;
    if (height == 0) {
        height = 1;
    }
    aspect = (GLfloat) width / height;
    glViewport(0, 0, width, height);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(45.0, aspect, 0.1, 100.0);
    glMatrixMode(GL_MODELVIEW);
}
And my function to render a simple triangle:
void GLRenderer::render() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
    glTranslated(0, 0, -20);
    glBegin(GL_TRIANGLES);
    glColor3d(1, 0, 0);
    glVertex3d(0, 1, 0);
    glVertex3d(1, -1, 0);
    glVertex3d(-1, -1, 0);
    glEnd();
    SwapBuffers(hDC);
}
You can change the zoom in y (height) with the "field of view" parameter to gluPerspective, the one that is 45 degrees in your code. As it is currently always 45 degrees, you will always get the same view angle in y. How to change this value as a function of the window height is not obvious: a linear relation would fail for large values (180 degrees and up). I would try arctan(height/k), where k is something like 500.
Notice also that when you widen the window in x, you already get what you want with your current code; that is, you get a wider field of view. That is because the aspect (second argument) changes with the ratio between x and y.
Height and width are measured in pixels, so a value of 1 is not good.
Notice that you are using deprecated legacy OpenGL. See Legacy OpenGL for more information.
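A minimal sketch of that suggestion (k = 500 is just the tuning constant proposed above, and fov_for_height() is a hypothetical helper, not part of the original code):

#include <math.h>

/* Derive the vertical field of view from the window height so that
   resizing reveals more of the scene instead of scaling it.
   atan() returns radians; gluPerspective() expects degrees. */
static GLfloat fov_for_height(int height_px)
{
    const double k = 500.0;
    return (GLfloat)(atan(height_px / k) * 180.0 / 3.14159265358979323846);
}

/* Inside resize():
   gluPerspective(fov_for_height(height), aspect, 0.1, 100.0); */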

Move 2D Object to a Point in OpenGL

How do I move a 2D object toward a point (not GL_POINTS, but a coordinate) using OpenGL?
For a better understanding of my code: I've split most of my code into different source files, but this is the one that actually creates the shapes and sets up the scene:
void setupScene(int clearColor[]) {
    glClearColor(clearColor[0], clearColor[1], clearColor[2], clearColor[3]);
    //glClearColor(250, 250, 250, 1.0); // Set the cleared screen colour to black.
    glViewport(0, 0, WINDOW_WIDTH, WINDOW_HEIGHT); // This sets up the viewport so that the coordinates (0, 0) are at the top left of the window.
    // Set up the orthographic projection so that coordinates (0, 0) are in the top left.
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, WINDOW_WIDTH, WINDOW_HEIGHT, 0, -10, 10);
    // Back to the modelview so we can draw stuff.
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // Clear the screen and depth buffer.
}
void drawScene() {
    setupScene((int[]){250, 250, 250, 1});
    triangle(210, WINDOW_WIDTH, WINDOW_HEIGHT);
    glBegin(GL_QUADS);
    glColor3f(RGB(80), RGB(80), RGB(80));
    glPushMatrix();
    glTranslatef(400, 400, 0);
    glVertex2d(200, 100);
    glVertex2d(100, 100);
    glVertex2d(100, 200);
    glVertex2d(200, 200);
    glPopMatrix();
    glEnd();
    glutSwapBuffers(); // Send the scene to the screen.
}
void update(int value) {
    glutPostRedisplay(); // Tell GLUT that the display has changed.
    glutTimerFunc(25, update, 0); // Tell GLUT to call update again in 25 milliseconds.
}
You need to translate the modelview matrix. Assuming you're in modelview mode already:
glPushMatrix();
glTranslatef(x, y, z);
// Draw your shape
glPopMatrix();
[Edit]
@paddy: Something like this? I tried it, but the square isn't moving: pastebin.com/2PCsy5kC
Try explicitly selecting the modelview matrix. Your example does not tell us which mode it's currently in:
glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glTranslatef(x, y, z);
// Draw your shape
glPopMatrix();
Normally, at the beginning of your render, you reset everything: you enter GL_PROJECTION mode, call glLoadIdentity() to reset it, and set up your camera; then you do the same for the GL_MODELVIEW matrix, as in the sketch below.
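A minimal sketch of that per-frame reset, reusing the projection values from the question's setupScene() (illustrative boilerplate, not code from the thread):

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, WINDOW_WIDTH, WINDOW_HEIGHT, 0, -10, 10); // camera setup
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
// ...now draw, wrapping moved objects in glPushMatrix()/glTranslatef()/glPopMatrix()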
Answer on behalf of the OP:
Thanks @paddy, I was trying to understand the use of glTranslatef and came up with the solution. Here is the working code; it creates a square at (100, 100) and moves it to (400, 200):
void setupScene(int clearColor[]) {
    glClearColor(clearColor[0], clearColor[1], clearColor[2], clearColor[3]);
    //glClearColor(250, 250, 250, 1.0); // Set the cleared screen colour to black.
    glViewport(0, 0, WINDOW_WIDTH, WINDOW_HEIGHT); // This sets up the viewport so that the coordinates (0, 0) are at the top left of the window.
    // Set up the orthographic projection so that coordinates (0, 0) are in the top left.
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, WINDOW_WIDTH, WINDOW_HEIGHT, 0, -10, 10);
    // Back to the modelview so we can draw stuff.
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // Clear the screen and depth buffer.
}
int a = 100;
int b = 200;
int x = 100;
int y = 100;
void drawScene() {
    setupScene((int[]){250, 250, 250, 1});
    triangle(210, WINDOW_WIDTH, WINDOW_HEIGHT);
    glPushMatrix();
    glTranslatef(x, y, 0);
    glBegin(GL_QUADS);
    glColor3f(RGB(80), RGB(80), RGB(80));
    glVertex2d(b, a);
    glVertex2d(a, a);
    glVertex2d(a, b);
    glVertex2d(b, b);
    glEnd();
    glPopMatrix();
    glutSwapBuffers(); // Send the scene to the screen.
}
void update(int value) {
    // Advance each axis independently so the square actually reaches (400, 200);
    // with a combined test like "x != 400 && y != 200", motion would stop as
    // soon as either coordinate hit its target.
    if (x < 400) x += 4;
    if (y < 200) y += 2;
    glutPostRedisplay(); // Tell GLUT that the display has changed.
    glutTimerFunc(25, update, 0); // Tell GLUT to call update again in 25 milliseconds.
}

gluLookAt trouble

I have some code which draws lines along the x, y, and z axes. My problem is that these lines are being clipped so that they are invisible near the origin.
This sounds like a far clipping plane issue, but I gave zFar=50 to gluPerspective, which should be plenty. Making it even larger doesn't seem to help. What else could be causing the clipping?
Here is my code:
import static org.lwjgl.opengl.GL11.*;
import org.lwjgl.opengl.*;
import org.lwjgl.util.glu.GLU;

public class Test {
    static int width = 300, height = 200;

    public static void main(String[] _) throws Exception {
        Display.setDisplayMode(new DisplayMode(width, height));
        Display.create();
        glClear(GL_COLOR_BUFFER_BIT);

        // projection matrix
        glMatrixMode(GL_PROJECTION_MATRIX);
        glLoadIdentity();
        GLU.gluPerspective(50, width / (float) height, .1f, 50);

        // modelview matrix
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        GLU.gluLookAt(
            .8f, .8f, .8f,
            0, 0, 0,
            0, 1, 0);

        // draw a line for each axis
        glBegin(GL_LINES);
        // x axis in red
        glColor3f(1, 0, 0);
        glVertex3i(0, 0, 0);
        glVertex3i(10, 0, 0);
        // y axis in green
        glColor3f(0, 1, 0);
        glVertex3i(0, 0, 0);
        glVertex3i(0, 10, 0);
        // z axis in blue
        glColor3f(0, 0, 1);
        glVertex3i(0, 0, 0);
        glVertex3i(0, 0, 10);
        glEnd();
        Display.update();

        // wait for a close event
        while (!Display.isCloseRequested()) {
            Thread.sleep(20);
            Display.processMessages();
        }
        Display.destroy();
    }
}
Update - Removing glLoadIdentity(); after glMatrixMode(GL_MODELVIEW); gives the desired result, but I don't understand why. Isn't the default modelview matrix the identity matrix?
Update - I wrote a C version of the same code and it works as desired. Why the difference?
Indeed, after testing it, it turns out that glMatrixMode(GL_PROJECTION_MATRIX); should be glMatrixMode(GL_PROJECTION); instead.
GL_PROJECTION_MATRIX is not a valid matrix mode, so that call fails with GL_INVALID_ENUM and the matrix mode stays at its default, GL_MODELVIEW. GLU.gluPerspective(50, width / (float) height, .1f, 50); was therefore applied to the modelview matrix, and the glLoadIdentity() that followed wiped it out again.
Edit: By the way, in case you wonder what GL_PROJECTION_MATRIX is for: it is used to retrieve the current matrix from the top of the matrix stack, via glGetFloatv(GL_PROJECTION_MATRIX, output); or glGetDoublev(GL_PROJECTION_MATRIX, output);.
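A short sketch of the difference between the two tokens, in C-style GL matching the calls above (proj is just a scratch buffer for the query):

GLfloat proj[16];

glMatrixMode(GL_PROJECTION);              /* valid: selects the projection stack */
glGetFloatv(GL_PROJECTION_MATRIX, proj);  /* valid: reads back the top matrix */

glMatrixMode(GL_PROJECTION_MATRIX);       /* invalid enum: the mode does not change */
/* glGetError() now returns GL_INVALID_ENUM; this is how the bug stays silent. */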