Is it possible to switch from "VBO mode" to immediate mode in OpenGL and back?
When debugging, I find it easier to just call glBegin(...) than to set up VAOs, VBOs, etc.
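By immediate mode I mean a quick hypothetical debug draw like this, with no buffers involved:
glBegin(GL_TRIANGLES);
glColor3f(1.0f, 0.0f, 0.0f);
glVertex3f(-0.5f, -0.5f, 0.0f);
glVertex3f( 0.5f, -0.5f, 0.0f);
glVertex3f( 0.0f,  0.5f, 0.0f);
glEnd();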
In my init() function, there's this:
if (!(GLEW_ARB_vertex_shader && GLEW_ARB_fragment_shader))
{
    Log::instance() << "glsl not ready.\n";
    return false;
}
Does this put OpenGL into a shader-based state and mean I can't go back to the fixed-function pipeline?
Edit:
My initialization:
int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_MULTISAMPLE);
    glutInitWindowSize(1600, 900);
    glutInitWindowPosition(200, 50);
    glutCreateWindow("OpenGL4");
    glutIgnoreKeyRepeat(0);
    glutKeyboardUpFunc(keyboardUp);
    glutSpecialFunc(specialKeyboard);
    glutSpecialUpFunc(specialKeyboardUp);
    glutSetCursor(GLUT_CURSOR_NONE);
    ...
    GLenum result = glewInit();
    if (result != GLEW_OK) {
        Log::instance() << "glewInit() error.\n";
        return false;
    }
    if (!(GLEW_ARB_vertex_shader && GLEW_ARB_fragment_shader))
    {
        Log::instance() << "glsl not ready.\n";
        return false;
    }
    //opengl stuff
    glEnable(GL_DEPTH_TEST);
    glEnable(GL_CULL_FACE);
    glClearColor(0.15f, 0.15f, 0.15f, 1.0f);
}
Related
The background color of my scene is black. How can I change this color?
Looks like I'm doing something wrong, because the glClearColor() function is not working: I tried changing the values but nothing happened. I'm new to OpenGL and to programming in general.
#include <GL/glut.h>

void Ayarlar(void);
void CizimFonksiyonu(void);

int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutInitWindowPosition(200, 200);
    glutInitWindowSize(400, 400);
    glutCreateWindow("ilk OpenGL programim");
    glutDisplayFunc(CizimFonksiyonu);
    glutMainLoop();
    Ayarlar();
    return 0;
}

void Ayarlar(void) {
    glClearColor(1, 0, 0, 1);
    glShadeModel(GLU_FLAT);
}

void CizimFonksiyonu(void) {
    glClear(GL_COLOR_BUFFER_BIT);
    glFlush();
}
Ayarlar() has to be called before glutMainLoop(). glutMainLoop() enters the GLUT event-processing loop and never returns, so you have to set the OpenGL state before entering it:
glutDisplayFunc(CizimFonksiyonu);
Ayarlar();
glutMainLoop();
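For completeness, a minimal fixed version of the whole program might look like this (note that glShadeModel() expects GL_FLAT, not GLU_FLAT, which is a GLU constant):
#include <GL/glut.h>

void Ayarlar(void) {
    glClearColor(1, 0, 0, 1);  // red background
    glShadeModel(GL_FLAT);     // GL_FLAT, not GLU_FLAT
}

void CizimFonksiyonu(void) {
    glClear(GL_COLOR_BUFFER_BIT);
    glFlush();
}

int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutInitWindowPosition(200, 200);
    glutInitWindowSize(400, 400);
    glutCreateWindow("ilk OpenGL programim");
    glutDisplayFunc(CizimFonksiyonu);
    Ayarlar();       // set state before entering the event loop
    glutMainLoop();  // never returns
    return 0;
}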
I am writing a C++ application which works with OpenGL on Mac OS X.
I have tried GLFW and FreeGLUT for window management.
Both glfw and freeglut were installed with brew.
There is something I do not understand.
Here is my C++ code for FreeGLUT:
int main(int argc, char* argv[])
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
    glutInitContextVersion(3, 3);
    glutInitContextFlags(GLUT_CORE_PROFILE | GLUT_DEBUG);
    glutInitWindowSize(WIDTH, HEIGHT);
    glutCreateWindow("Test");
    glewExperimental = GL_TRUE;
    GLenum err = glewInit();
    if (GLEW_OK != err)
    {
        return -1;
    }
    cout << "GL_SHADING_LANGUAGE_VERSION: " << glGetString(GL_SHADING_LANGUAGE_VERSION) << endl;
    ...
Here is the output:
GL_SHADING_LANGUAGE_VERSION: 1.20
And here is my C++ code with GLFW:
int main(int argc, const char * argv[])
{
    if (!glfwInit())
    {
        return -1;
    }
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    GLFWwindow* window = glfwCreateWindow(640, 480, "Test", NULL, NULL);
    if (window == NULL) {
        return -1;
    }
    glfwMakeContextCurrent(window);
    glewExperimental = true;
    if (glewInit() != GLEW_OK) {
        return -1;
    }
    std::cout << "GL_SHADING_LANGUAGE_VERSION: " << glGetString(GL_SHADING_LANGUAGE_VERSION) << std::endl;
Here is the output:
GL_SHADING_LANGUAGE_VERSION: 4.10
My question is: why is the GLSL version not the same?
Thanks
The GLUT initialization is wrong: GLUT_CORE_PROFILE is not a valid parameter for glutInitContextFlags; the profile has to be requested with glutInitContextProfile instead. The correct code should look like this:
glutInitContextVersion(3, 3);
glutInitContextProfile(GLUT_CORE_PROFILE);
glutInitContextFlags(GLUT_DEBUG);
Also note that you are not requesting the same context in both examples: the GLUT example asks for 3.3 Core with Debug, while the GLFW example asks for 3.3 Core with Forward Compatibility. This is also the likely reason for the different output: because of the invalid flag, a core profile was never actually requested on the GLUT side, so Mac OS X created a legacy context, which only supports GLSL 1.20, while the GLFW program did get a core profile context and therefore reports 4.10.
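If you want both programs to request the same kind of context, the GLFW side would presumably also need the debug hint (GLFW_OPENGL_DEBUG_CONTEXT is GLFW's counterpart to GLUT_DEBUG):
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE); // required for core contexts on Mac OS X
glfwWindowHint(GLFW_OPENGL_DEBUG_CONTEXT, GL_TRUE);  // counterpart of GLUT_DEBUG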
I'm a complete novice in OpenGL and have started following a book on this topic. The first program that they demonstrate is the Sierpinski Gasket program. Here's what the code looks like in the book:
#include <GL/glut.h>
#include <stdlib.h>

void myinit()
{
    //attributes
    glClearColor(1.0, 1.0, 1.0, 1.0);
    glColor3f(1.0, 0.0, 0.0); //draw in red

    //set up viewing
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluOrtho2D(0.0, 50.0, 0.0, 50.0);
    glMatrixMode(GL_MODELVIEW);
}

void display()
{
    GLfloat vertices[3][2] = {{0.0,0.0},{25.0,50.0},{50.0,0.0}}; //triangle
    GLfloat p[2] = {7.5,5.0}; //arbitrary point
    glClear(GL_COLOR_BUFFER_BIT); //clear the window
    glBegin(GL_POINTS);
    //Gasket Program
    for (int i = 0; i < 5000; i++) {
        int j = rand() % 3;
        //compute points halfway between random vertex and arbitrary point
        p[0] = (vertices[j][0] + p[0]) / 2;
        p[1] = (vertices[j][1] + p[1]) / 2;
        //plot the point
        glVertex2fv(p);
    }
    glEnd();
    glFlush();
}

int main(int argc, char* argv[])
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DEPTH | GLUT_DOUBLE);
    glutInitWindowSize(500, 500);
    glutInitWindowPosition(0, 0);
    glutCreateWindow("Sierpinski Gasket");
    glutDisplayFunc(display);
    myinit();
    glutMainLoop();
    return 0;
}
However, when I compile and run the program, it only displays a window completely filled with white and nothing else. Why isn't the above code working the way it should?
Swap glFlush() for glutSwapBuffers():
#include <GL/glut.h>
#include <stdlib.h> // for rand()

void display()
{
    glClearColor(1.0, 1.0, 1.0, 1.0);
    glClear(GL_COLOR_BUFFER_BIT); //clear the window

    //set up viewing
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluOrtho2D(0.0, 50.0, 0.0, 50.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    GLfloat vertices[3][2] = {{0.0,0.0},{25.0,50.0},{50.0,0.0}}; //triangle
    GLfloat p[2] = {7.5,5.0}; //arbitrary point
    glColor3f(1.0, 0.0, 0.0); //draw in red

    glBegin(GL_POINTS);
    //Gasket Program
    for (int i = 0; i < 5000; i++)
    {
        int j = rand() % 3;
        //compute points halfway between random vertex and arbitrary point
        p[0] = (vertices[j][0] + p[0]) / 2;
        p[1] = (vertices[j][1] + p[1]) / 2;
        //plot the point
        glVertex2fv(p);
    }
    glEnd();

    glutSwapBuffers();
}

int main(int argc, char* argv[])
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    glutInitWindowSize(500, 500);
    glutCreateWindow("Sierpinski Gasket");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
glFlush() won't actually swap the back/front buffers requested by GLUT_DOUBLE; you need glutSwapBuffers() for that.
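Alternatively, if you don't need double buffering, you can request a single-buffered window, in which case the book's glFlush() is appropriate again. A minimal sketch:
glutInitDisplayMode(GLUT_RGB | GLUT_SINGLE); // single-buffered window
// ... at the end of display():
glFlush(); // no back buffer to swap, flushing is enough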
#include <stdio.h>
#include <stdlib.h>
#include <windows.h>
#include <GL/glut.h>

void display(void)
{
    glClearColor(1.f, 0.f, 0.f, 1.f);
    glEnd();
    glFlush();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE);
    glutInitWindowSize(500, 500);
    glutInitWindowPosition(100, 100);
    glutCreateWindow("Colorcube Viewer");
    glutDisplayFunc(display);
    glEnable(GL_DEPTH_TEST);
    glutMainLoop();
    return 0;
}
I am not able to figure out what the problem with this code is.
It does not give me a red window.
You need to call glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); after setting the clear color. Because you have the depth test enabled, make sure that you're clearing both the color buffer AND the depth buffer.
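A minimal corrected display() might look like this (the stray glEnd() without a matching glBegin() can simply be dropped):
void display(void)
{
    glClearColor(1.f, 0.f, 0.f, 1.f);                   // set the clear color to red
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // actually clear both buffers
    glFlush();
}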
Hi, my program is supposed to display a solid red sphere in the center of the screen, but all I am getting is the boundary of the sphere:
int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowSize(800, 600);
    glutInitWindowPosition(0, 0);
    glutCreateWindow("Sphere");
    glutDisplayFunc(renderScene);
    glutReshapeFunc(changeSize);
    glutMainLoop();
    return 0;
}

void renderScene() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glColor3f(1.0f, 0.0f, 0.0f);
    glutSolidSphere(2.5, 50, 40);
    glutSwapBuffers();
}
Try adding glPolygonMode(GL_FRONT_AND_BACK, GL_FILL); before your glutSolidSphere(2.5, 50, 40); call.
What do you mean by "boundary"?
"Solid" doesn't mean filled; it means the surface contains no openings. This is in contrast to glutWireSphere, which is just the wireframe.
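In context, the suggested fix would look something like this (assuming some earlier code left the polygon mode set to GL_LINE, which would explain seeing only the outline):
void renderScene() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glColor3f(1.0f, 0.0f, 0.0f);
    glPolygonMode(GL_FRONT_AND_BACK, GL_FILL); // make sure polygons are filled, not outlined
    glutSolidSphere(2.5, 50, 40);
    glutSwapBuffers();
}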