Why is exit(0) giving me errors? - c++

So I've been following a tutorial and when I tried to compile the below code:
#include <glut.h>
#include <iostream>
void render(void);
void keyboard(unsigned char c, int x, int y);
void mouse(int button, int state, int x, int y);
int main(int argc, char** argv)
{
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
glutInitWindowPosition(100, 100);
glutInitWindowSize(640, 480);
glutCreateWindow("Test GLUT App");
glutDisplayFunc(render); // render
glutKeyboardFunc(keyboard);
glutMouseFunc(mouse);
glutMainLoop(); // initialization finished. start rendering
}
void render(void)
{
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glBegin(GL_TRIANGLES);
glColor3f(0.5, 0.2, 0.9);
glVertex2f(-0.5, -0.5);
glColor3f(0.1, 0.2, 0.5);
glVertex2f(0.0, -0.5);
glColor3f(0.3, 0.9, 0.7);
glVertex2f(0.0, 0.5);
glEnd();
glutSwapBuffers();
}
void keyboard(unsigned char c, int x, int y)
{
if(c == 27)
{
exit(0);
}
}
void mouse(int button, int state, int x, int y)
{
if(button == GLUT_RIGHT_BUTTON)
{
exit(0);
}
}
I get 3 errors out of nowhere:
Error 1 error C2381: 'exit' : redefinition; __declspec(noreturn) differs c:\program files (x86)\microsoft visual studio 10.0\vc\include\stdlib.h 353
Error 2 error C3861: 'exit': identifier not found ....main.cpp 45
Error 3 error C3861: 'exit': identifier not found ....main.cpp 53
Does anyone see why these errors appear? I'm using VS2010.

You need to #include <cstdlib>.
edit:
You are probably following a well-known tutorial that provides a header file for you.
In that case, this should help: GLUT exit redefinition error
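For example, a minimal sketch of the fix applied to the top of your file (only the includes and the keyboard handler are shown; the rest of your code is unchanged). Putting <cstdlib> before <glut.h>, as the linked question and the answer below suggest, also avoids the C2381 redefinition clash:
#include <cstdlib>   // declares exit(); include it before glut.h
#include <glut.h>
#include <iostream>

void keyboard(unsigned char c, int x, int y)
{
    if (c == 27) // Escape key
    {
        exit(0); // now resolves to the declaration from <cstdlib>
    }
}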

If Visual Studio throws a build error saying IntelliSense did not identify 'exit', then you need to include process.h.

Try adding using namespace std at the top. I'm not sure if this will fix it, but I had a similar error earlier and that fixed it. Good luck.

You need to include the right headers, simple as that. This works on my machine:
#include <stdlib.h>
#include <cstdlib>
#include <glut.h>
#include <iostream>

Related

OpenGL: Bresenham's Line Drawing Algorithm Implementation

I've been trying to generate a line using Bresenham's algorithm (yes, I know built-in functions exist, but this is something I've been asked to implement) with the following code.
But for some reason I am not able to see the line in the window. I just get an empty window.
I initially tried drawing points with SetPixel(), but that takes two more arguments (HDC and COLORREF) apart from just the X and Y coordinates. I don't know what those two arguments do, so I had to try something else.
So I used a solution I found here on Stack Overflow to draw the point. I don't get any compilation errors or warnings, but the code just doesn't seem to work:
#include<iostream>
#include<GL/glut.h>
#include<stdlib.h>
#include<math.h>
using namespace std;
int x00;
int y00;
int xEnd;
int yEnd;
void init(){
glClearColor(1,0,0,0);
glMatrixMode( GL_PROJECTION );
gluOrtho2D(0,500,0,500);
}
void bres()
{
int dx = fabs(xEnd - x00), dy = fabs(yEnd - y00);
int p = 2*dy-dx;
int x, y;
if(x00>xEnd){
x=xEnd;
y=yEnd;
xEnd=x00;
}
else{
x=x00;
y=y00;
}
//Stack Overflow Solution to generate a point:
glBegin(GL_POINTS);
glColor3f(0,0,0);
glVertex2i(x,y);
glEnd();
while(x<xEnd){
x++;
if(p<0){
p = p + 2*dy;
}
else{
y++;
p= p + 2*dy - 2*dx;
}
glBegin(GL_POINTS);
glColor3f(0,0,0);
glVertex2i(x,y);
glEnd();
}
}
void display()
{
glClear(GL_COLOR_BUFFER_BIT);
bres();
glFlush();
}
int main(int argc, char* argv[])
{
cout<<"Enter the co ordinates for 2 points: ";
cin>>x00>>y00>>xEnd>>yEnd;
glutInit(&argc, argv);
glutInitWindowSize(600,600);
glutInitWindowPosition(10,10);
glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
glutCreateWindow("Bresenham's Algo");
init();
glutDisplayFunc(display);
glutMainLoop();
return 0;
}
Since you are using a double-buffered window
glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
you have to call glutSwapBuffers instead of glFlush.
If you were using a single-buffered window
glutInitDisplayMode(GLUT_RGB | GLUT_SINGLE | GLUT_DEPTH);
then glFlush would work.
Change your code like this:
void display()
{
glClear(GL_COLOR_BUFFER_BIT);
bres();
//glFlush();
glutSwapBuffers();
}
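Alternatively, if you want to keep glFlush, a sketch of the single-buffered variant mentioned above; only these two spots differ from your code:
// in main(): request a single-buffered window instead of a double-buffered one
glutInitDisplayMode(GLUT_RGB | GLUT_SINGLE | GLUT_DEPTH);

// in display(): glFlush is then enough to push the drawing to the window
void display()
{
    glClear(GL_COLOR_BUFFER_BIT);
    bres();
    glFlush();
}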

GLSDK - Can't execute example OpenGL program

I am in the process of checking that my OpenGL install works, but my example program crashes at execution (with no real error message or hint). I am using the unofficial GLSDK distribution (http://glsdk.sourceforge.net/docs/html/index.html) and compiled it under Windows 8.
The program (from http://www.transmissionzero.co.uk/computing/using-glut-with-mingw/):
#include <glload/gl_3_2_comp.h>
#include <GL/freeglut.h>
void keyboard(unsigned char key, int x, int y);
void display(void);
int main(int argc, char** argv)
{
glutInit(&argc, argv);
glutCreateWindow("GLUT Test");
glutKeyboardFunc(&keyboard);
glutDisplayFunc(&display);
glutMainLoop();
return EXIT_SUCCESS;
}
void keyboard(unsigned char key, int x, int y)
{
switch (key)
{
case '\x1B':
exit(EXIT_SUCCESS);
break;
}
}
void display()
{
glClear(GL_COLOR_BUFFER_BIT);
glColor3f(1.0f, 0.0f, 0.0f);
glBegin(GL_POLYGON);
glVertex2f(-0.5f, -0.5f);
glVertex2f( 0.5f, -0.5f);
glVertex2f( 0.5f, 0.5f);
glVertex2f(-0.5f, 0.5f);
glEnd();
glFlush();
}
I know that #include <glload/gl_3_2_comp.h> is the culprit here because if I change this line to #include <GL/gl.h> then the example program runs fine and displays a nice red square on a black background... Alternatively if I remove the contents of the display() function the program runs fine too.
The problem is: I need to use the OpenGL 3.x (or above) API, so I can't just include the OS header, which is ridiculously outdated (Windows 8).
My linker settings (in Code::Blocks):
glloadD
glimgD
glutilD
glmeshD
freeglutD
glu32
opengl32
gdi32
winmm
user32
With include paths:
glsdk\glload\lib
glsdk\glimg\lib
glsdk\glutil\lib
glsdk\glmesh\lib
glsdk\freeglut\lib
And #defines:
FREEGLUT_STATIC
_LIB
FREEGLUT_LIB_PRAGMAS=0
Based on the GL Load documentation, it looks like you need to explicitly initialize it:
#include <glload/gl_load.h>
...
ogl_LoadFunctions();
where the ogl_LoadFunctions() call needs to be after you set up GLUT.
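For example, a sketch of your main() with the call added right after glutCreateWindow (which is when a GL context exists). Check the GL Load documentation if you also want to test the return value of ogl_LoadFunctions() for failure:
#include <glload/gl_load.h>      // ogl_LoadFunctions
#include <glload/gl_3_2_comp.h>
#include <GL/freeglut.h>

void keyboard(unsigned char key, int x, int y);
void display(void);

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutCreateWindow("GLUT Test");   // a GL context exists from here on

    ogl_LoadFunctions();             // load the GL function pointers before any GL call is made

    glutKeyboardFunc(&keyboard);
    glutDisplayFunc(&display);
    glutMainLoop();
    return EXIT_SUCCESS;
}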

I tried to make a window using GLUT to use OpenGL, but it gave me errors I didn't understand

I just started using the OpenGL Utility Toolkit and the Open Graphics Library (GLUT + OpenGL).
I wanted to create a window, and I tried this code based on various tutorials, but it didn't work.
#include <OpenGL/OpenGL.h>
#include <GLUT/GLUT.h>
void display(void) {
glClearColor(0.0, 0.0, 0.0, 1.0);
glClear(GL_COLOR_BUFFER_BIT);
glLoadIdentity();
gluLookAt (0.0, 0.0, 5.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0);
glFlush();
}
int main (int argc, char **argv[]) {
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_DOUBLE);
glutInitWindowSize(400, 400);
glutInitWindowPosition(100, 100);
glutCreateWindow("My First GLUT/OpenGL Window");
glutDisplayFunc(display);
glutMainLoop();
return 0;
}
It gave me the error:
cannot convert 'char***' to 'char**' for argument '2' to 'void glutInit(int*, char**)'"
The error points at line 13.
Could anybody please give me some information on this? Thank you!
The error message is telling you that you're passing a pointer-to-pointer-to-pointer-to-char as a second argument to glutInit. That's not what that function expects, it takes a pointer-to-pointer-to-char.
The problem comes from your signature for main. The standard two-argument main function takes an int and a pointer-to-pointer-to-char.
int main(int argc, char **argv)
Which can also be written:
int main(int argc, char *argv[])
(The two forms are equivalent.)
You're adding one more indirection level.
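Applied to your code, the corrected main looks like this (everything else stays the same):
int main(int argc, char **argv) {   // was: char **argv[]
    glutInit(&argc, argv);          // argv is now a char**, which matches glutInit's signature
    glutInitDisplayMode(GLUT_DOUBLE);
    glutInitWindowSize(400, 400);
    glutInitWindowPosition(100, 100);
    glutCreateWindow("My First GLUT/OpenGL Window");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}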

An OpenGL question: why doesn't the glutSwapBuffers() function work?

This code is from the red book, example 2-15 (well, the code is not exactly the one in the book). Note the comments I added in display().
#include <fstream>
#include <stdlib.h>
#include <GL/glew.h>
#include <GL/glut.h>
#pragma comment(lib,"glew32.lib")
using namespace std;
#define BUFFER_OFFSET(offset) ((GLubyte *)NULL+offset)
#define XStart -0.8
#define XEnd 0.8
#define YStart -0.8
#define YEnd 0.8
#define NumXPoints 11
#define NumYPoints 11
#define NumPoints (NumXPoints * NumYPoints)
#define NumPointsPerStrip (2*NumXPoints)
#define NumStrips (NumYPoints-1)
#define RestartIndex 0xffff
void display(void)
{
int i,start;
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glColor3f(1,1,1);
glDrawElements(GL_TRIANGLE_STRIP,NumStrips*(NumPointsPerStrip+1),GL_UNSIGNED_SHORT,BUFFER_OFFSET(0));
//glFlush(); // this works: it shows a white square on a black background
glutSwapBuffers(); // this doesn't work: it shows whatever that area looked like before
}
void init (void)
{
GLuint vbo,ebo;
GLfloat *vertices;
GLushort *indices;
glewInit();
glGenBuffers(1,&vbo);
glBindBuffer(GL_ARRAY_BUFFER,vbo);
glBufferData(GL_ARRAY_BUFFER,2*NumPoints*sizeof(GLfloat),NULL,GL_STATIC_DRAW);
vertices=(GLfloat *)glMapBuffer(GL_ARRAY_BUFFER,GL_WRITE_ONLY);
if(vertices==NULL)
{
fprintf(stderr,"Unable to map vertex buffer\n");
exit(EXIT_FAILURE);
}
else
{
int i,j;
GLfloat dx=(XEnd-XStart)/(NumXPoints-1);
GLfloat dy=(YEnd-YStart)/(NumYPoints-1);
GLfloat *tmp=vertices;
int n=0;
for(j=0;j<NumYPoints;++j)
{
GLfloat y=YStart+j*dy;
for(i=0;i<NumXPoints;++i)
{
GLfloat x=XStart + i*dx;
*tmp++=x;
*tmp++=y;
}
}
glUnmapBuffer(GL_ARRAY_BUFFER);
glVertexPointer(2,GL_FLOAT,0,BUFFER_OFFSET(0));
glEnableClientState(GL_VERTEX_ARRAY);
}
glGenBuffers(1,&ebo);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER,ebo);
glBufferData(GL_ELEMENT_ARRAY_BUFFER,NumStrips*(NumPointsPerStrip+1)*sizeof(GLushort),NULL,GL_STATIC_DRAW);
indices=(GLushort *)glMapBuffer(GL_ELEMENT_ARRAY_BUFFER,GL_WRITE_ONLY);
if(indices==NULL)
{
fprintf(stderr,"Unable to map index buffer\n");
exit(EXIT_FAILURE);
}
else
{
int i,j;
GLushort *index=indices;
for(j=0;j<NumStrips;++j)
{
GLushort bottomRow=j*NumYPoints;
GLushort topRow=bottomRow+NumYPoints;
for(i=0;i<NumXPoints;++i)
{
*index++=topRow+i;
*index++=bottomRow+i;
}
*index++=RestartIndex;
}
glUnmapBuffer(GL_ELEMENT_ARRAY_BUFFER);
}
glPrimitiveRestartIndex(RestartIndex);
glEnable(GL_PRIMITIVE_RESTART);
}
void reshape (int w, int h)
{
glMatrixMode (GL_PROJECTION);
glLoadIdentity ();
gluOrtho2D (-1,1,-1,1);
glViewport (0,0,w,h);
}
void keyboard(unsigned char key, int x, int y)
{
switch (key) {
case 27:
exit(0);
break;
}
}
int main(int argc, char** argv)
{
glutInit(&argc, argv);
glutInitDisplayMode (GLUT_SINGLE | GLUT_RGB);
glutInitWindowSize (200, 200);
glutInitWindowPosition (100, 100);
glutCreateWindow (argv[0]);
init ();
glutDisplayFunc(display);
glutReshapeFunc(reshape);
glutKeyboardFunc (keyboard);
glutMainLoop();
return 0;
}
As http://www.opengl.org/resources/libraries/glut/spec3/node21.html says:
An implicit glFlush is done by glutSwapBuffers before it returns. Subsequent OpenGL commands can be issued immediately after calling glutSwapBuffers, but are not executed until the buffer exchange is completed.
If the layer in use is not double buffered, glutSwapBuffers has no effect.
How do I know whether the layer in use is double buffered? Can you give a double-buffered example? Is the code I wrote double buffered?
In your main function, you call
glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
That needs to be GLUT_DOUBLE. See the documentation for glutInitDisplayMode().
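In this code that means changing just that one line (your display() already calls glutSwapBuffers(), so nothing else has to change):
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);   // was GLUT_SINGLE
As for how to tell whether the current window is double buffered: once the window exists you can ask GLUT at runtime, for example with a small check like this (glutGet returns non-zero for a double-buffered window):
// e.g. at the start of display()
if (glutGet(GLUT_WINDOW_DOUBLEBUFFER))
    fprintf(stderr, "window is double buffered\n");
else
    fprintf(stderr, "window is single buffered\n");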

OpenGL window crash

I am working on OpenGL in VC6.
Every time I run the following simple code, the output window crashes:
#include <stdio.h>
#include <gl/glut.h>
//#include <gl/glaux.h>
void display(void)
{
glColor3f(255.0f,255.0f,255.0f);
glBegin(GL_QUADS);
glVertex3f(0.0f,0.0f,0.0f);
glVertex3f(0.0f,5.0f,0.0f);
glVertex3f(5.0f,5.0f,0.0f);
glVertex3f(5.0f,0.0f,0.0f);
glVertex3f(0.0f,0.0f,0.0f);
glEnd();
glFlush();
}
void init(void)
{
glViewport(0,0,400,400);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(45.0,4/3,4.0,1000.0);
glMatrixMode(GL_MODELVIEW);
gluLookAt(2.0,2.0,2.0,1.0,2.0,1.0,0.0,1.0,0.0);
}
int main(int argc, char *argv[])
{
glutInit(&argc,argv);
init();
glutInitDisplayMode(GLUT_DOUBLE|GLUT_RGB|GLUT_DEPTH);
glutInitWindowPosition(400,400);
glutInitWindowSize(400,400);
glutCreateWindow("Trial");
glutDisplayFunc(display);
glutMainLoop();
return 0;
}
I don't know what is going wrong. Can anybody please help?
You are using OpenGL functions before you have an OpenGL context (which is a requirement to call any GL functions at all). The context is created by glutCreateWindow, but your first call to GL functions happens in init(). To fix this, you could move your init() call right below the glutCreateWindow call.
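For example, a sketch of main() with that one change; everything else can stay as it is:
int main(int argc, char *argv[])
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowPosition(400, 400);
    glutInitWindowSize(400, 400);
    glutCreateWindow("Trial");   // the GL context exists from this point on
    init();                      // now it is safe to call glViewport, gluPerspective, etc.
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
(Since you request GLUT_DOUBLE, you will also eventually want to end display() with glutSwapBuffers() rather than just glFlush(), as discussed in the earlier double-buffering question.)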