Fill entire widget with OpenGL - C++

So I'm working with a 2D array, and I'm trying to display it on a widget using OpenGL. It seems to work, but it does not fill the widget properly: rather than filling it evenly, the drawing is pushed to the top right, as seen in the image below. How can I get this to be centered?
int x = -0.1;
int y = -0.1;
float lengthX = 0.9 / ROW;
float lengthY = 0.9 / COLM;
for (int i = 0; i < ROW; i++) {
    for (int j = 0; j < COLM; j++) {
        if (arr[i][j] == 1) {
            glColor3f(1.0f, 1.0f, 1.0f);
        } else {
            glColor3f(0.0f, 0.0f, 0.0f);
        }
        glBegin(GL_QUADS);
        // Q3 Q4 Q1 Q2
        glVertex2f((-x) + 2 * i * lengthX, (-y) + 2 * j * lengthY);
        glVertex2f(x + (2 * i + 1) * lengthX, (-y) + (2 * j + 1) * lengthX);
        glVertex2f(x + (2 * i + 1) * lengthX, y + (2 * j + 1) * lengthY);
        glVertex2f((-x) + 2 * i * lengthX, y + 2 * j * lengthY);
        glEnd();
    }
}

First off, your code has a bug in its variables: int x = -0.1; declares x as an int, so the -0.1 is truncated to 0. x and y should be declared as float.
Now, to fix the centering problem, just add this to the beginning of your drawing code:
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, 1, 1, 0, -1, 1);
glMatrixMode(GL_MODELVIEW);
The problem is that you were using the default matrix setup, which ranges from (-1, -1) to (1, 1) instead of (0, 0) to (1, 1). I can't quite follow your coordinate math, but you can edit the glOrtho function's first 4 parameters (the last two shouldn't affect anything here) to change the visible range of what you're drawing.
Just to explain a bit more: the first parameter sets the left side, the second the right, the third the bottom, and the fourth the top. So with glOrtho(0, 800, 600, 0), a vertex at (0, 0) shows up at the top left, while a vertex at (800, 600) shows up at the bottom right.
Your setup was only showing part of what you drew because the corner of your grid, (0, 0), ended up at the center of the screen instead of at its corner.
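For reference, here is a minimal sketch of how the drawing loop could then look; it assumes your ROW, COLM and arr, keeps everything inside the 0-to-1 range of the glOrtho call above, and sizes each cell so the grid fills the widget edge to edge:
float cellW = 1.0f / ROW;   // note: float cell sizes, not int offsets
float cellH = 1.0f / COLM;
for (int i = 0; i < ROW; i++) {
    for (int j = 0; j < COLM; j++) {
        if (arr[i][j] == 1)
            glColor3f(1.0f, 1.0f, 1.0f);
        else
            glColor3f(0.0f, 0.0f, 0.0f);
        glBegin(GL_QUADS);
        glVertex2f( i      * cellW,  j      * cellH);
        glVertex2f((i + 1) * cellW,  j      * cellH);
        glVertex2f((i + 1) * cellW, (j + 1) * cellH);
        glVertex2f( i      * cellW, (j + 1) * cellH);
        glEnd();
    }
}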

Related

Calculate grid width given a set of points

I am drawing a grid in OpenGL using this -
void draw_grid()
{
    glColor3f(0.32, 0.32, 0.32);
    int iterations = int(WIDTH / grid_width) + 1;
    glBegin(GL_LINES);
    for (int i = 0; i < iterations; i++)
    {
        int x = i * grid_width;
        glVertex2f(x, 0);
        glVertex2f(x, HEIGHT);
    }
    for (int i = 0; i < iterations; i++)
    {
        int y = i * grid_width;
        glVertex2f(0, y);
        glVertex2f(WIDTH, y);
    }
    glEnd();
}
And then I plot points at the intersections of this grid.
I am implementing the simple DDA method for drawing lines.
For endpoints (2,3) and (15, 8) I get this as output -
But, for endpoints (2,3) and (35, 8) I get this -
You can see that for the second case some points get plotted outside the window and thus are not visible.
This happens because I have hardcoded the grid_width.
I understand that the greater the difference between the endpoints, the smaller grid_width is supposed to be.
But I can't figure out how exactly to calculate the grid_width so that no matter what endpoints are given they are drawn in the bounds of the window.
You have to determine grid_width from the maximum coordinate. The largest grid width that still fits is WIDTH / maxC, where maxC is the largest endpoint coordinate, e.g.:
grid_width = int(WIDTH / maxC);
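For example, a small sketch of how that could be wired up before drawing: fit_grid is a hypothetical helper name, WIDTH and grid_width are the same globals used by draw_grid, and the +1 is just a safety margin (not part of the formula above) so the farthest point does not land exactly on the window border:
#include <algorithm> // std::max

void fit_grid(int x1, int y1, int x2, int y2)
{
    int maxC = std::max({x1, y1, x2, y2}) + 1; // largest endpoint coordinate
    grid_width = WIDTH / maxC;                 // integer division, as above
}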

Rotating Vertex Array Object not working

I am using vertex arrays to store circle vertices and colors.
Here is the setup function:
void setup1(void)
{
    glClearColor(1.0, 1.0, 1.0, 0.0);
    // Enable two vertex arrays: co-ordinates and color.
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_COLOR_ARRAY);
    // Specify locations for the co-ordinates and color arrays.
    glVertexPointer(3, GL_FLOAT, 0, Vertices1);
    glColorPointer(3, GL_FLOAT, 0, Colors1);
}
The global declaration of the arrays is here:
static float Vertices1[500] = { 0 };
static float Colors1[500] = { 0 };
The arrays are all set up here (R is the radius, X and Y are the (X,Y) center, and t is the angle parameter of the circle)
void doGlobals1()
{
    for (int i = 0; i < numVertices1 * 3; i += 3)
    {
        Vertices1[i] = X + R * cos(t);
        Vertices1[i + 1] = Y + R * sin(t);
        Vertices1[i + 2] = 0.0;
        t += 2 * PI / numVertices1;
    }
    for (int j = 0; j < numVertices1 * 3; j += 3)
    {
        Colors1[j] = (float)rand() / (float)RAND_MAX;
        Colors1[j + 1] = (float)rand() / (float)RAND_MAX;
        Colors1[j + 2] = (float)rand() / (float)RAND_MAX;
    }
}
Finally, this is where the shape is drawn.
// Window 1 drawing routine.
void drawScene1(void)
{
    glutSetWindow(win1);
    glLoadIdentity();
    doGlobals1();
    glClear(GL_COLOR_BUFFER_BIT);
    glRotatef(15, 1, 0, 0);
    glDrawArrays(GL_TRIANGLE_FAN, 0, numVertices1);
    glFlush();
}
Without the Rotation, the circle draws just fine. The circle also draws fine with any Scale/Translate function. I suspect there is some special protocol necessary to rotate an object drawn with vertex arrays.
Can anyone tell me where I have gone wrong, what I will need to do in order to rotate the object, or offer any advice?
glRotatef(15, 1, 0, 0);
^ why the X axis?
The default ortho projection matrix has pretty tight near/far clipping planes: -1 to 1.
Rotating your circle of X/Y coordinates outside of the X/Y plane will tend to make those points get clipped.
Rotate around the Z axis instead:
glRotatef(15, 0, 0, 1);
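As a side note, if you actually do want to rotate out of the X/Y plane, another option (not needed for the fix above) is to give the orthographic projection a wider depth range so the rotated points are no longer clipped. A minimal sketch, keeping the default -1..1 range for X and Y:
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(-1.0, 1.0, -1.0, 1.0, -10.0, 10.0); // near/far widened from the default -1/1
glMatrixMode(GL_MODELVIEW);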

Finding center of image for rotation in OpenGL

So I have this piece of code, which pretty much draws various 2D textures on the screen, though there are multiple sprites that have to be 'dissected' from the texture (spritesheet). The problem is that rotation is not working properly; while it rotates, it does not rotate on the center of the texture, which is what I am trying to do. I have narrowed it down to the translation being incorrect:
glTranslatef(x + sr->x/2 - sr->w/2,
             y + sr->y/2 - sr->h/2, 0);
glRotatef(ang, 0, 0, 1.f);
glTranslatef(-x + -sr->x/2 - -sr->w/2,
             -y + -sr->y/2 - -sr->h/2, 0);
X and Y are the position it's being drawn to; the sheet rect struct contains the X and Y position of the sprite within the texture, along with w and h, which are the width and height of the 'sprite' from the texture. I've tried various other formulas, such as:
glTranslatef(x, y, 0);
and the three below, also tried with the negative signs switched to positive (x - y to x + y):
glTranslatef(sr->x/2 - sr->w/2, sr->y/2 - sr->h/2, 0 );
glTranslatef(sr->x - sr->w/2, sr->y - sr->h/2, 0 );
glTranslatef(sr->x - sr->w, sr->y - sr->w, 0 );
glTranslatef(.5,.5,0);
It might also be helpful to say that:
glOrtho(0,screen_width,screen_height,0,-2,10);
is in use.
I've tried reading various tutorials, going through various forums, and asking various people, but there doesn't seem to be a solution that works, nor can I find any useful resources that explain how to find the center of the image in order to translate it to (0,0). I'm pretty new to OpenGL, so a lot of this stuff takes a while for me to digest.
Here's the entire function:
void Apply_Surface( float x, float y, Sheet_Container* source, Sheet_Rect* sr, float ang = 0, bool flipx = 0, bool flipy = 0, int e_x = -1, int e_y = -1 ) {
    float imgwi, imghi;
    glLoadIdentity();
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, source->rt());
    // rotation
    imghi = source->rh();
    imgwi = source->rw();
    Sheet_Rect t_shtrct(0, 0, imgwi, imghi);
    if ( sr == NULL ) // in case a sheet rect is not provided, assume the full
                      // width and height of the texture with x/y at 0/0
        sr = &t_shtrct;
    glPushMatrix();
    //
    int wid, hei;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &wid);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &hei);
    glTranslatef(-sr->x + -sr->w,
                 -sr->y + -sr->h, 0);
    glRotatef(ang, 0, 0, 1.f);
    glTranslatef(sr->x + sr->w,
                 sr->y + sr->h, 0);
    // Yeah, out-dated way of drawing to the screen but it works for now.
    GLfloat tex[] = {
        (sr->x + sr->w * flipx)  / imgwi, 1 - (sr->y + sr->h * !flipy) / imghi,
        (sr->x + sr->w * flipx)  / imgwi, 1 - (sr->y + sr->h * flipy)  / imghi,
        (sr->x + sr->w * !flipx) / imgwi, 1 - (sr->y + sr->h * flipy)  / imghi,
        (sr->x + sr->w * !flipx) / imgwi, 1 - (sr->y + sr->h * !flipy) / imghi
    };
    GLfloat vertices[] = { // vertices to put on screen
        x,           (y + sr->h),
        x,           y,
        (x + sr->w), y,
        (x + sr->w), (y + sr->h)
    };
    // index array
    GLubyte index[6] = { 0, 1, 2, 2, 3, 0 };
    float fx = (x / (float)screen_width)  - (float)sr->w / 2 / (float)imgwi;
    float fy = (y / (float)screen_height) - (float)sr->h / 2 / (float)imghi;
    // activate arrays
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    // pass vertices and texture information
    glVertexPointer(2, GL_FLOAT, 0, vertices);
    glTexCoordPointer(2, GL_FLOAT, 0, tex);
    glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_BYTE, index);
    glDisableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glPopMatrix();
    glDisable(GL_TEXTURE_2D);
}
Sheet container class:
class Sheet_Container {
    GLuint texture;
    int width, height;
public:
    Sheet_Container();
    Sheet_Container(GLuint, int = -1, int = -1);
    void Load(GLuint, int = -1, int = -1);
    float rw();
    float rh();
    GLuint rt();
};
Sheet rect class:
struct Sheet_Rect {
    float x, y, w, h;
    Sheet_Rect();
    Sheet_Rect(int xx, int yy, int ww, int hh);
};
Image loading function:
Sheet_Container Game_Info::Load_Image(const char* fil) {
    ILuint t_id;
    ilGenImages(1, &t_id);
    ilBindImage(t_id);
    ilLoadImage(const_cast<char*>(fil));
    int width = ilGetInteger(IL_IMAGE_WIDTH), height = ilGetInteger(IL_IMAGE_HEIGHT);
    return Sheet_Container(ilutGLLoadImage(const_cast<char*>(fil)), width, height);
}
Your quad (two triangles) is centered at:
( x + sr->w / 2, y + sr->h / 2 )
You need to move that point to the origin, rotate, and then move it back:
glTranslatef ( (x + sr->w / 2.0f), (y + sr->h / 2.0f), 0.0f); // 3rd
glRotatef (0,0,0,1.f); // 2nd
glTranslatef (-(x + sr->w / 2.0f), -(y + sr->h / 2.0f), 0.0f); // 1st
Here is where I think you are getting tripped up. People naturally assume that OpenGL applies transformations in the order they appear (top-to-bottom), but that is not the case. OpenGL effectively swaps the operands every time it multiplies two matrices:
M1 x M2 x M3
~~~~~~~
   (1)
~~~~~~~~~~~~
     (2)

(1) M2 * M1
(2) M3 * (M2 * M1)  -->  M3 * M2 * M1   (row-major / textbook math notation)
The technical term for this is post-multiplication; it all has to do with the way matrices are implemented in OpenGL (column-major). Suffice it to say, you should generally read glTranslatef, glRotatef, glScalef, etc. calls from bottom-to-top.
With that out of the way, your current rotation does not make any sense.
You are telling GL to rotate 0 degrees around an axis: <0,0,1> (the z-axis in other words). The axis is correct, but a 0 degree rotation is not going to do anything ;)
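Putting that together with the ang parameter from your function, the fixed transform block could look like this sketch (read bottom-to-top, as explained above):
glTranslatef( (x + sr->w / 2.0f),  (y + sr->h / 2.0f), 0.0f); // applied 3rd: move the sprite back into place
glRotatef(ang, 0.0f, 0.0f, 1.0f);                             // applied 2nd: rotate about Z by a non-zero angle
glTranslatef(-(x + sr->w / 2.0f), -(y + sr->h / 2.0f), 0.0f); // applied 1st: move the sprite's center to the origin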

glRotatef results on a wirecube

I am using the rand() function for an x variable and a y variable to be used in glRotatef. rand() is working just fine. What is happening is that when I get a 0 value in both X and Y, the cube fluctuates by shrinking and inflating. I have tried a couple of ways to make sure this does not happen, but alas, I am here. Here is the function I am working with:
void initRandoms()
{
    maxCubes = rand() % (CUBE_HIGH - CUBE_LOW + 1) + CUBE_LOW;
    for (int i = 0; i < maxCubes; i++)
    {
        cubeOrigins[i].x = X_LOW + (float)rand() / (float)RAND_MAX * (X_HIGH - X_LOW);
        cubeOrigins[i].y = Y_LOW + (float)rand() / (float)RAND_MAX * (Y_HIGH - Y_LOW);
        //cubeOrigins[i].z =
        cubeOrigins[i].size = SIZE_LOW + (double)rand() / (double)RAND_MAX * (SIZE_HIGH - SIZE_LOW);
        cubeOrigins[i].rotateX = rand() % 2;
        cubeOrigins[i].rotateY = rand() % 2;
    }
}
As I said before, each cube will rotate on either the X axis, the Y axis, or both, or it will shrink and inflate. It is the shrinking and inflating I need to remove, which corresponds to X = 0 AND Y = 0; having X = 0 OR Y = 0 on its own is fine. I have tried some if/else conditionals, but while they take the shrinking and inflating out, all the cubes then rotate in the same direction. I am hoping someone can figure out what I am doing wrong and show me what I need to do. I appreciate everyone's help. I will put more code up if needed.
Here is the myDisplay function where the above function is used (it is also called in main):
void myDisplay()
{
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    for (int i = 0; i < maxCubes; i++)
    {
        glLoadIdentity();
        glTranslatef(0.0f, 0.0f, -120.0);
        glTranslatef(cubeOrigins[i].x, cubeOrigins[i].y, cubeZ);
        glRotatef(rotateAxis, cubeOrigins[i].rotateX, cubeOrigins[i].rotateY, 0.0f);
        //glRotatef(rotateAxis, 1, 1, 0);
        glutWireCube(cubeOrigins[i].size);
    }
    cubeZ += 0.050f;
    glutSwapBuffers();
    glFlush();
    if (cubeZ > 120.0f)
    {
        cubeZ -= 100.f;
        glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        for (int i = 0; i < maxCubes; i++)
        {
            initRandoms();
            glLoadIdentity();
            glTranslatef(0.0f, 0.0f, -120.0);
            glTranslatef(cubeOrigins[i].x, cubeOrigins[i].y, cubeZ);
            glRotatef(rotateAxis, cubeOrigins[i].rotateX, cubeOrigins[i].rotateY, 0.0f);
            //glRotatef(rotateAxis, 1, 1, 0);
            glutWireCube(cubeOrigins[i].size);
        }
        cubeZ += 0.050f;
        glutSwapBuffers();
        glFlush();
    }
}
Looking at your code, I could see that you are rotating your cubes by using this function:
glRotatef(rotateAxis, cubeOrigins[i].rotateX, cubeOrigins[i].rotateY, 0.0f);
The question is: what happens when both cubeOrigins[i].rotateX and cubeOrigins[i].rotateY are zero? OpenGL builds the rotation matrix from the axis and angle you pass (http://www.opengl.org/sdk/docs/man2/xhtml/glRotate.xml), and when x, y and z are all zero you are violating one of its premises, namely that ||(x, y, z)|| must be 1. In that degenerate case the matrix effectively reduces to:
| cos(angle)      0           0       0 |
|     0       cos(angle)      0       0 |
|     0           0       cos(angle)  0 |
|     0           0           0       1 |
which is essentially a uniform scale matrix. This explains the shrinking and inflating you described: as the angle changes, cos(angle) oscillates between -1 and 1, scaling the cube up and down.
So, as you observed, you need to avoid that situation.
Second, you are using the same rotateAxis angle for all cubes, so every cube with the same cubeOrigins[i].rotateX and cubeOrigins[i].rotateY gets exactly the same rotation. You need some variation here (for example, a different rotation angle for each cube).
Finally, you don't need to call both:
glutSwapBuffers();
glFlush();
glutSwapBuffers() already calls glFlush() internally.
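To tie the first two points together, here is a small sketch of how the loop body in initRandoms() could be changed; angleOffset is a hypothetical extra field on the cube struct, not something from the original code:
// Re-roll the axis until it is not (0, 0), and give every cube its own
// angle offset so they do not all rotate in lockstep.
do {
    cubeOrigins[i].rotateX = rand() % 2;
    cubeOrigins[i].rotateY = rand() % 2;
} while (cubeOrigins[i].rotateX == 0 && cubeOrigins[i].rotateY == 0);
cubeOrigins[i].angleOffset = (float)rand() / (float)RAND_MAX * 360.0f;
In myDisplay(), glRotatef(rotateAxis + cubeOrigins[i].angleOffset, cubeOrigins[i].rotateX, cubeOrigins[i].rotateY, 0.0f) then keeps the shared animation while desynchronising the cubes.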
First things first (I'm not sure if you already did this, but):
before using the C rand() function you need to give it a seed via srand(), so that you get "real" random numbers.
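A minimal sketch of the usual way to do that, seeding once at startup (the surrounding main() is assumed, not taken from the question):
#include <cstdlib> // srand, rand
#include <ctime>   // time

int main(int argc, char** argv)
{
    srand((unsigned)time(NULL)); // seed once, before the first rand() call
    // ... the rest of the GLUT/window setup goes here ...
    return 0;
}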

Create a 3D playing surface/game board in OpenGL - C++

I am trying to create a 3D box, that would play the role of a playing board/surface/room for a 3D game in C++, using OpenGl. As a starting point I found some code that does that for a 2D surface. My question would be how to modify the following code to serve my purpose:
for (float i = -width; i + 0.1 <= width; i += 0.1) {
    for (float j = -height; j + 0.1 <= height; j += 0.1) {
        glBegin(GL_QUADS);
        glNormal3f(0, 1, 0);
        glVertex3f(i, 0, j);
        glVertex3f(i, 0, j + 0.1);
        glVertex3f(i + 0.1, 0, j + 0.1);
        glVertex3f(i + 0.1, 0, j);
        glEnd();
    }
}
Thanks a lot.
You can either use the above code 6 times, applying a different rotation/translation matrix each time, or you can do it the right way and generate the correct geometry for the 5 missing walls of your box. There are lots of samples for drawing cubes, e.g.:
http://www.opengl.org/resources/code/samples/glut_examples/examples/examples.html
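If you go the geometry route, a rough sketch could look like the following. draw_box and the w/h/d extents are illustrative names, not from your code; each face is a single quad with an inward-facing normal, since the camera is meant to sit inside the room. You can subdivide each face into 0.1 steps exactly as in your floor loop if you need that for lighting.
// Axis-aligned room spanning [-w, w] x [0, h] x [-d, d], one quad per face.
void draw_box(float w, float h, float d)
{
    glBegin(GL_QUADS);
    // floor (y = 0)
    glNormal3f(0, 1, 0);
    glVertex3f(-w, 0, -d); glVertex3f(-w, 0, d); glVertex3f(w, 0, d); glVertex3f(w, 0, -d);
    // ceiling (y = h)
    glNormal3f(0, -1, 0);
    glVertex3f(-w, h, -d); glVertex3f(w, h, -d); glVertex3f(w, h, d); glVertex3f(-w, h, d);
    // back wall (z = -d)
    glNormal3f(0, 0, 1);
    glVertex3f(-w, 0, -d); glVertex3f(w, 0, -d); glVertex3f(w, h, -d); glVertex3f(-w, h, -d);
    // front wall (z = d)
    glNormal3f(0, 0, -1);
    glVertex3f(-w, 0, d); glVertex3f(-w, h, d); glVertex3f(w, h, d); glVertex3f(w, 0, d);
    // left wall (x = -w)
    glNormal3f(1, 0, 0);
    glVertex3f(-w, 0, -d); glVertex3f(-w, h, -d); glVertex3f(-w, h, d); glVertex3f(-w, 0, d);
    // right wall (x = w)
    glNormal3f(-1, 0, 0);
    glVertex3f(w, 0, -d); glVertex3f(w, 0, d); glVertex3f(w, h, d); glVertex3f(w, h, -d);
    glEnd();
}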