I was trying to draw a cube using the glDrawElements function, but even the simple code below gives me only a black screen. If it helps, I'm programming in Xcode 6.
//
// main.cpp
// Copyright (c) 2014 Guilherme Cardoso. All rights reserved.
//
#include <iostream>
#include <OpenGL/OpenGL.h>
#include <GLUT/GLUT.h>
#include <vector>
#include <math.h>
const GLfloat width = 500;
const GLfloat height = 500;
GLubyte cubeIndices[24] = {0,3,2,1, 2,3,7,6, 0,4,7,3,
                           1,2,6,5, 4,5,6,7, 0,1,5,4};
GLfloat vertices[][3] =
{{-1.0,-1.0,-1.0},{1.0,-1.0,-1.0},
{1.0,1.0,-1.0}, {-1.0,1.0,-1.0}, {-1.0,-1.0,1.0},
{1.0,-1.0,1.0}, {1.0,1.0,1.0}, {-1.0,1.0,1.0}};
GLfloat colors[][3] =
{{0.0,0.0,0.0},{1.0,0.0,0.0},
{1.0,1.0,0.0}, {0.0,1.0,0.0}, {0.0,0.0,1.0},
{1.0,0.0,1.0}, {1.0,1.0,1.0}, {0.0,1.0,1.0}};
void display(){
glEnableClientState(GL_COLOR_ARRAY);
glEnableClientState(GL_VERTEX_ARRAY);
glColorPointer(3, GL_FLOAT, 0, colors);
glVertexPointer(3, GL_FLOAT, 0, vertices);
//glDrawArrays(GL_QUADS, 0, 24);
glDrawElements(GL_QUADS, 24, GL_UNSIGNED_BYTE, cubeIndices);
glDisableClientState(GL_VERTEX_ARRAY);
glutSwapBuffers();
}
void mouse(int button, int state, int x, int y){
}
void keyboard(unsigned char key, int x, int y){
if(key=='q' || key == 'Q') exit(0);
}
void init(){
glEnable(GL_DEPTH_TEST);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluOrtho2D(0,width ,height, 0);
glMatrixMode(GL_MODELVIEW);
glClearColor (1.0, 1.0, 1.0,1.0);
//glColor3f(0.0,0.0,0.0);
}
void idle(){
}
int main(int argc, char **argv) {
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
glutInitWindowSize(width, height);
glutCreateWindow("Assignment 3");
glutPositionWindow(0, 0);
glutDisplayFunc(display);
glutMouseFunc(mouse);
glutKeyboardFunc(keyboard);
glutIdleFunc(idle);
init();
glutMainLoop();
}
I already checked some tutorials, and they aren't much different from what I'm doing.
You never clear the color and especially the depth buffer. You should do that at the beginning of your display() function.
Also, your matrix setup is a bit weird. You set up an ortho projection in pixel units, but then try to draw a cube in the range [-1,1], so it will be 2 pixels wide on the screen.
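For reference, a minimal sketch of display() with the clear added, reusing the arrays from the question:

void display(){
    // clear color and depth buffers before drawing the new frame
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glEnableClientState(GL_COLOR_ARRAY);
    glEnableClientState(GL_VERTEX_ARRAY);
    glColorPointer(3, GL_FLOAT, 0, colors);
    glVertexPointer(3, GL_FLOAT, 0, vertices);
    glDrawElements(GL_QUADS, 24, GL_UNSIGNED_BYTE, cubeIndices);
    glDisableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_COLOR_ARRAY);
    glutSwapBuffers();
}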
Actually your code is drawing the cube just fine. Look closely :P
The main issue is your projection. The initial GL viewing volume is a -1 to 1 cube, and the cube you're drawing spans -1 to 1, so it would fill the view. But the call to gluOrtho2D redefines the projection so that OpenGL coordinates match pixels, and since your cube still spans -1 to 1, it ends up only two pixels big, with the rest offscreen.
Instead, drop the gluOrtho2D, which fixes the Z range at -1 to 1 and only lets you set X/Y, and create a slightly bigger projection...
glOrtho(-2, 2, -2, 2, -2, 2);
Note: As @derhass suggests, calling glClear is important, especially with depth testing enabled (without it, the cube from the last draw call will hide the updated cube).
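Putting the projection fix together, a sketch of the question's init() with gluOrtho2D replaced (the glClear belongs in display(), as noted above):

void init(){
    glEnable(GL_DEPTH_TEST);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(-2, 2, -2, 2, -2, 2);  // viewing volume slightly larger than the [-1,1] cube
    glMatrixMode(GL_MODELVIEW);
    glClearColor(1.0, 1.0, 1.0, 1.0);
}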
Related
I followed the code tutorial from the OpenGL programming book, but it doesn't work. It only shows a white rectangle at the top left of my window. Could you please tell me what could be wrong with it?
#include <windows.h>
#include <GL/glut.h>
float yRot=0.0;
void Render()
{
//clear color and depth buffer
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();//load identity matrix
glTranslatef(0.0f,0.0f,-4.0f);//move forward 4 units
//rotate along the y-axis
glRotatef(yRot,0.0f,1.0f,0.0f);
glColor3f(0.0f,0.0f,1.0f); //blue color
glBegin(GL_POLYGON);//begin drawing of polygon
glVertex3f(-0.5f,0.5f,0.0f);//first vertex
glVertex3f(0.5f,0.5f,0.0f);//second vertex
glVertex3f(1.0f,0.0f,0.0f);//third vertex
glVertex3f(0.5f,-0.5f,0.0f);//fourth vertex
glVertex3f(-0.5f,-0.5f,0.0f);//fifth vertex
glVertex3f(-1.0f,0.0f,0.0f);//sixth vertex
glEnd();//end drawing of polygon
yRot+=0.1f;//increment the yRot variable
}
//method to reshape the entire figure
void reshape(int x, int h){
glViewport(0,0,x,h);
}
void init()
{
glClearColor(0.0,0.0,0.2,0.8);
}
int main(int argc, char** argv)
{
glutCreateWindow("simple triangles");
glutDisplayFunc(Render);
glutReshapeFunc(reshape);
init();
glutMainLoop();
}
First of all, you're not calling glutInit(&argc, argv) in main() before all the other GLUT-related calls. Second, you're not calling glutSwapBuffers() at the end of Render().
Besides that, you aren't changing the projection matrix, and thus don't have the same resize function as the one presented at the beginning of the tutorial.
void Resize(int width, int height)
{
glViewport(0, 0, (GLsizei)width, (GLsizei)height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(45.0f, (GLfloat)width / (GLfloat)height, 1.0f, 1000.0f);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
}
Change those things and your code should work.
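A minimal sketch of the missing pieces; the GLUT_DOUBLE/GLUT_DEPTH flags, the window size, and the idle callback are assumptions, since the tutorial's exact settings aren't shown (glutSwapBuffers() only makes sense with a double-buffered window):

int main(int argc, char** argv)
{
    glutInit(&argc, argv);                                     // must come before other GLUT calls
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);  // double buffering + depth buffer
    glutInitWindowSize(640, 480);                              // assumed size
    glutCreateWindow("simple triangles");
    glutDisplayFunc(Render);
    glutReshapeFunc(Resize);          // the Resize function shown above
    glutIdleFunc(glutPostRedisplay);  // keep redrawing so yRot actually animates
    init();
    glutMainLoop();
    return 0;
}

and, at the end of Render():

    yRot += 0.1f;       // increment the rotation
    glutSwapBuffers();  // present the finished frame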
Write a C++ program which will draw a triangle having vertices at (300,210),
(340,215) and (320,250). Center of the triangle lies at (320,240).
#include <GL/glut.h>
#include <stdlib.h>
void display(void)
{
glClearColor(1,1,0,0);
glClear(GL_COLOR_BUFFER_BIT);
glBegin(GL_TRIANGLES);
glColor3f(0.5,0,0);
glVertex2f(300.0,210.0);
glVertex2f(340.0,215.0);
glVertex2f(320.0,250.0);
glEnd();
glFlush();
}
int main(int argc, char *argv[])
{
glutInit(&argc, argv);
glutInitWindowSize(640,500);
glutInitWindowPosition(1,1);
glutCreateWindow("Triangle");
glutDisplayFunc(display);
glutMainLoop();
return EXIT_SUCCESS;
}
Issue: the triangle isn't appearing; only a yellow screen appears.
Your program needs an appropriate view/projection matrix. glOrtho(0, 640, 500, 0, -1, 1), matching your 640x500 window, should do the trick. Ideally it should be called with the matrix mode set to GL_PROJECTION.
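In context, that could look like this (matching the question's window):

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, 640, 500, 0, -1, 1);  // x runs 0..640 left to right, y runs 0..500 top to bottom
glMatrixMode(GL_MODELVIEW);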
The default coordinate system in OpenGL runs from -1 to 1. You'll have to convert your desired pixel coordinates into that range.
This can be done by some linear interpolation. Something like this should work:
float c = -1.0 + 2.0*desiredPixel/pixelWidth
You would need to do this conversion for all your triangle coordinates.
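A small helper along those lines; pixelToNDC is a hypothetical name, and note that OpenGL's y axis points up, so the y coordinate may also need flipping:

float pixelToNDC(float desiredPixel, float pixelExtent)
{
    // linearly map [0, pixelExtent] to [-1, 1]
    return -1.0f + 2.0f * desiredPixel / pixelExtent;
}

// usage for the question's 640x500 window:
// glVertex2f(pixelToNDC(300.0f, 640.0f), pixelToNDC(210.0f, 500.0f));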
Below is a simple, valid piece of triangle code:
glBegin(GL_TRIANGLES);
glColor3f(0.1, 0.2, 0.3);
glVertex3f(0, 0, 0);
glVertex3f(1, 0, 0);
glVertex3f(0, 1, 0);
glEnd();
glMatrixMode(GL_PROJECTION);
gluOrtho2D(0,400,0,500);
This solved my issue; mostly, the 3D perspective was not working, I think.
I am in the process of building a simple 3D game engine that is built on top of OpenGL, and for windowing and I/O, GLUT. I have run into a problem with the OpenGL accumulation buffer when trying to build a motion-blur option into the engine. Essentially, here is the small block of code that is supposed to do this for me:
glAccum(GL_MULT, 0.99f);          // scale what is already in the accumulation buffer
glAccum(GL_ACCUM, 1.0f - 0.99f);  // add the current frame, weighted by 0.01
glAccum(GL_RETURN, 1.0f);         // write the blended result back to the color buffer
I first tried this block of code by planting it in my Render() method, but it showed a corrupt-looking view where only a select few pixels were visible. So, I then tried it with the rest of the source from the website from which I found the code. I still got the same issue. Below is an image of the issue:
Then, I just took out the accumulation buffer portion (the three lines that are supposed to achieve the motion blur), and here is what I got:
Of course, there would be no motion blur since I removed the glAccum() lines, but that at least told me there is either a problem with my graphics card (it doesn't like accumulation buffers?) or those lines of code don't work.
I don't know if it matters, but I am running the code through NetBeans 7.2 (C++) on a MacBook Pro from 2011. Also, I did request an accumulation buffer in the following line:
glutInitDisplayMode(GLUT_DEPTH | GLUT_ACCUM | GLUT_DOUBLE | GLUT_RGBA);
Here is a sample piece of code I just threw together. I'm not sure if something is wrong in the code, and I know I probably didn't use best practices either, but it gets the point across. I still experienced the error with this code:
#include <iostream>
#include <GLUT/GLUT.h>
using namespace std;
float Rotation = 0.0f;
void Reshape(int width, int height)
{
glViewport(0, 0, width, height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(-1, 1, -1.0f * ((float)height / (float)width), 1.0f * ((float)height / (float)width), 0.1f, 200.0f);
glMatrixMode(GL_MODELVIEW);
}
void Update(int value)
{
Rotation++;
glutPostRedisplay();
glutTimerFunc(17, Update, 0);
}
void InitGL()
{
glEnable(GL_DEPTH_TEST);
glEnable(GL_CULL_FACE);
glEnable(GL_COLOR_MATERIAL);
glClearDepth(100.0f);
}
void Render(void)
{
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();
gluLookAt(0, 0, 5.0f, 0, 0, 0, 0, 1, 0);
glPushMatrix();
{
glRotatef(Rotation, 0.0, 1.0, 0.0);
/* Render Icosahedron */
glColor3f(0.5f, 0.5f, 0.5f);
glutSolidIcosahedron();
/* Render wireframe */
glColor4f(1.0, 1.0, 1.0, 1.0);
glLineWidth(2.0);
glutWireIcosahedron();
}
glPopMatrix();
/* Blur */
glAccum(GL_MULT, 0.99);
glAccum(GL_ACCUM, 0.01);
glAccum(GL_RETURN, 1.0);
glFlush();
glutSwapBuffers();
}
int main(int argc, char** argv)
{
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA | GLUT_ACCUM);
glutInitWindowSize(400, 400);
glutCreateWindow("Test");
glutDisplayFunc(Render);
glutReshapeFunc(Reshape);
InitGL();
Reshape(400, 400);
glutTimerFunc(17, Update, 0);
glutMainLoop();
return 0;
}
I'm using a vertex with three floats for the position (XYZ) and three more for the color (RGB):
XYZ RGB
XYZ RGB
...
I'm currently trying to draw a red triangle. Unfortunately, I end up with a white window. I think there's a problem with the stride, but I can't figure it out. I've tried many values for the stride and the size; still, it doesn't seem to display anything.
//main.cpp
#include "data.h"
GLuint ID;
int size,el_size;
void init(){
vector<float>data_vector(18);
data_vector[0]=0; //x
data_vector[1]=0; //y
data_vector[2]=0; //z
data_vector[3]=1;
data_vector[4]=0;
data_vector[5]=0;
data_vector[6]=1; //x
data_vector[7]=0; //y
data_vector[8]=0; //z
data_vector[9]=1;
data_vector[10]=0;
data_vector[11]=0;
data_vector[12]=0; //x
data_vector[13]=1; //y
data_vector[14]=0; //z
data_vector[15]=1;
data_vector[16]=0;
data_vector[17]=0;
size=data_vector.size();
// Init GLEW
if ( glewInit() != GLEW_OK ){
cerr << "Failed to initialize GLEW." << endl;
exit(-1);
}
if ( !glewIsSupported("GL_VERSION_1_5") && !glewIsSupported( "GL_ARB_vertex_buffer_object" ) ){
cerr << "ARB_vertex_buffer_object not supported!" << endl;
exit(-2);
}
glOrtho(-1, 1,1,-1, -5.0f, 5.0f);
glClearColor(1.0f, 1.0f, 1.0f, 0.0f);
glShadeModel(GL_SMOOTH);
glEnableClientState(GL_VERTEX_ARRAY);
glGenBuffers(1,&ID);
glBindBuffer(GL_ARRAY_BUFFER, ID);
glBufferData(GL_ARRAY_BUFFER,size*sizeof(float), &data_vector[0], GL_STATIC_DRAW);
el_size=3*sizeof(data_vector[0]);
}
void reshape(int w, int h){
cout<<"reshape"<<endl;
glViewport(0,0, (GLsizei) w, (GLsizei) h);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluOrtho2D(0.0f, (GLdouble) w, 0.0f, (GLdouble) h);
}
void display(){
cout<<"display"<<endl;
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glBindBuffer(GL_ARRAY_BUFFER, ID);
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, el_size, 0);
glEnableClientState(GL_COLOR_ARRAY);
glColorPointer(3,GL_FLOAT, el_size,(void*)(el_size));
glDrawArrays(GL_TRIANGLES,0,size/6);
glFlush();
}
int main(int argc, char **argv){
cout<<"main"<<endl;
glutInit(&argc,argv);
glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
glutInitWindowSize(500,500);
glutInitWindowPosition(300,300);
glutCreateWindow(argv[0]);
init();
glutDisplayFunc(display);
// glutReshapeFunc(reshape);
glutMainLoop();
return 0;
}
You do appear to be using the incorrect stride. The stride should be the distance from the start of one vertex to the start of the next, which in your case is 6 floats (6 * sizeof(float) = 24 bytes).
You've set the stride to el_size, which is only 3 floats.
Also take care: your resize function sets an ortho matrix from 0 to the screen width, while your init function sets it from -1 to 1. If resize ever gets called, your scene will change radically.
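A sketch of the corrected pointer setup, assuming the interleaved XYZ RGB layout from the question:

el_size = 6 * sizeof(float);  // stride: one full vertex (position + color)
glBindBuffer(GL_ARRAY_BUFFER, ID);
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, el_size, (void*)0);                   // XYZ at offset 0
glEnableClientState(GL_COLOR_ARRAY);
glColorPointer(3, GL_FLOAT, el_size, (void*)(3 * sizeof(float)));  // RGB right after XYZ
glDrawArrays(GL_TRIANGLES, 0, size / 6);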
One problem I see is the call to glOrtho in the init function. Whatever you intend to do, this surely doesn't do what you want. As a general rule, put all drawing-state-related commands only into the display function, nowhere else. Setting the transformation matrices (and the viewport) should happen there. Doing so saves a lot of headaches later.
The other problem is that the stride you define is too short. The stride is the distance from vertex to vertex in an interleaved array, not the length of a single attribute, so el_size is calculated incorrectly.
I don't have much OpenGL experience. I am trying to draw a teapot and move a camera around the teapot. To this end I am using the gluLookAt function. The problem is that when I call gluLookAt the screen is blank and I can't see my teapot.
#include "openGLer.h"
void openGLer::simulate(grid* toSim, int* argc, char** argv)
{
myGrid = toSim;
glutInit(argc, argv);
glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
glutInitWindowSize(400, 400); //Set the window size
glutCreateWindow("");
glutDisplayFunc(display);
glutIdleFunc(display);
glutKeyboardFunc(handleKeypress);
glEnable(GL_DEPTH_TEST);
glutMainLoop();
}
void openGLer::handleKeypress(unsigned char key, //The key that was pressed
int x, int y)
{
switch (key)
{
case 27: exit(0);
}
}
void openGLer::camera()
{
gluLookAt(3, 3, 0,
0, 0, 0,
0, 1, 0
);
}
void openGLer::draw()
{
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();
camera();
glutWireTeapot(0.5);
glutSwapBuffers();
}
void openGLer::display()
{
draw();
}
Why does gluLookAt() make the screen blank, and how do I fix this? When camera() is not called, the code performs as expected, with a teapot being displayed.
Have you set up your projection matrix correctly? If not, your call to gluLookAt will place the teapot too far away, and it will be clipped by the far plane.
Try adding this to your initialization code (and also your resize handler to fix the aspect ratio when the window is resized). I've set the far plane at 100, which should be plenty for your teapot.
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(60.0, width / (float)height, 0.1, 100.0);
glMatrixMode(GL_MODELVIEW);
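Wired into a resize handler, that might look like this; the handler name and its registration are assumptions:

void resize(int width, int height)
{
    glViewport(0, 0, width, height);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0, width / (float)height, 0.1, 100.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}

// registered during setup with: glutReshapeFunc(resize);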