I am trying to write a small OpenGL program that draws a single triangle using only vertex buffer objects (no VAOs), but when I run it, all I get is a blue screen.
Here is my code:
#include <iostream>
#include <GLUT/glut.h>
#include <OpenGL/gl3.h>

GLuint VBO;
GLuint VAO;

void display();

float vertex[] = {-1.0, 0.0, 0.0,
                   0.0, 1.0, 0.0,
                   1.0, 0.0, 0.0};

int main(int argc, char *argv[])
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE);
    glutInitWindowSize(1000, 400);
    glutInitWindowPosition(100, 100);
    glutCreateWindow("My First GLUT/OpenGL Window");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}

void display()
{
    glClearColor(0, 0, 1, 1);
    glClear(GL_COLOR_BUFFER_BIT);
    glGenBuffers(1, &VBO);
    glBindBuffer(GL_ARRAY_BUFFER, VBO);
    glBufferData(GL_ARRAY_BUFFER, 9 * sizeof(vertex), vertex, GL_STATIC_DRAW);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_TRUE, 0, 0);
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glDisableVertexAttribArray(0);
    glutSwapBuffers();
}
Three problems:
Your code never sets the viewport (if the window happens to be created with a size of 0×0 and only gets resized later, the initial viewport will be 0×0).
Your use of the sizeof operator is wrong. vertex is a statically allocated array, so the sizeof operator returns the total size of the vertex array, not just the size of a single element. In that particular case just sizeof(vertex), without multiplying by 9, would suffice.
And last but not least, and the true cause of your problem:
Where are your shaders? When using generic vertex attributes (and it is in any case mandatory in OpenGL 3) you must supply a valid combination of a vertex shader and a fragment shader. Without those, nothing will render.
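To illustrate, a minimal pair could look something like the sketch below (written against GLSL 1.20 so it also runs on a legacy GLUT context; the attribute name position and the initShaders() helper are my own, not part of your code). Call initShaders() once after glutCreateWindow(), and the glVertexAttribPointer(0, ...) call in display() will then feed the position attribute:
static const char *vsSrc =
    "#version 120\n"
    "attribute vec3 position;\n"
    "void main() { gl_Position = vec4(position, 1.0); }\n";

static const char *fsSrc =
    "#version 120\n"
    "void main() { gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0); }\n";

static GLuint compile(GLenum type, const char *src)
{
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &src, NULL);
    glCompileShader(shader);

    GLint ok = GL_FALSE;                        // always check the compile log
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[512];
        glGetShaderInfoLog(shader, sizeof(log), NULL, log);
        std::cerr << "shader error: " << log << std::endl;
    }
    return shader;
}

static void initShaders()
{
    GLuint prog = glCreateProgram();
    glAttachShader(prog, compile(GL_VERTEX_SHADER, vsSrc));
    glAttachShader(prog, compile(GL_FRAGMENT_SHADER, fsSrc));
    glBindAttribLocation(prog, 0, "position"); // must happen before linking
    glLinkProgram(prog);
    glUseProgram(prog);
}
Note that glBindAttribLocation has to be called before glLinkProgram so that attribute 0 in the shader matches the glVertexAttribPointer(0, ...) call.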
Related
I'm attempting to draw a single large triangle in a window in OpenGL. My program compiles and runs, but I get just a black screen in my window.
I've checked and double-checked multiple tutorials and it seems like my steps are correct... Am I missing something obvious?
Here is the program in its entirety:
#include <stdlib.h>
#include <stdio.h>
#include <GL/glew.h>
#include <GLUT/glut.h>

GLuint VBO;

struct vector {
    float _x;
    float _y;
    float _z;

    vector() { }
    vector(float x, float y, float z) { _x = x; _y = y; _z = z; }
};

void render()
{
    glClear(GL_COLOR_BUFFER_BIT);

    glEnableVertexAttribArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, VBO);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glDisableVertexAttribArray(0);

    glutSwapBuffers();
}

void create_vbo()
{
    vector verts[3];
    verts[0] = vector(-1.0f, -1.0f, 0.0f);
    verts[1] = vector(1.0f, -1.0f, 0.0f);
    verts[2] = vector(0.0f, 1.0f, 0.0f);

    glGenBuffers(1, &VBO);
    glBindBuffer(GL_ARRAY_BUFFER, VBO);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE|GLUT_RGBA);
    glutInitWindowSize(1024, 768);
    glutInitWindowPosition(100, 100);
    glutCreateWindow("Triangle Test");
    glutDisplayFunc(render);
    glewInit();
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    create_vbo();
    glutMainLoop();
    return 0;
}
Update: It turns out that drawing this way without a "program" (that is, compiled shader files) produces undefined behavior (the newer your graphics card, the more likely it is to work, however).
Because my card is right on the edge and only supports OpenGL 2.1, it was a little difficult to find an appropriate shader example that would work; it seems like there are many different tutorials out there, written at different stages in the evolution of OpenGL.
My vertex shader (entire file):
void main()
{
    gl_Position = ftransform();
}
My fragment shader (entire file):
void main()
{
    gl_FragColor = vec4(0.4, 0.4, 0.8, 1.0);
}
I used the example LoadShaders function from this OpenGL Tutorial Site to create the program, and now, I, too, can see the triangle!
(Thanks to #chbaker0 for pointing me in the right direction.)
I do not know if this will help you or not, but in your create_vbo() function, where you have:
glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
try this instead:
glBufferData( GL_ARRAY_BUFFER, sizeof( verts[0] ) * 3, &verts[0], GL_STATIC_DRAW );
And after that call, add the following at the end of your create_vbo() function:
// This MUST BE LAST! It unbinds (stops) the buffer.
glBindBuffer( GL_ARRAY_BUFFER, 0 );
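Putting both suggestions together, your create_vbo() would end up looking something like this (just a sketch based on your existing code):
void create_vbo()
{
    vector verts[3];
    verts[0] = vector(-1.0f, -1.0f, 0.0f);
    verts[1] = vector(1.0f, -1.0f, 0.0f);
    verts[2] = vector(0.0f, 1.0f, 0.0f);

    glGenBuffers(1, &VBO);
    glBindBuffer(GL_ARRAY_BUFFER, VBO);
    // Upload the size in bytes of the three vertices.
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts[0]) * 3, &verts[0], GL_STATIC_DRAW);
    // Unbind so later buffer calls don't accidentally touch this VBO.
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}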
It is hard for me to see your error. In my projects I do have some VBOs, but I am also using VAOs as well. My code works in OpenGL 2.0 through 4.5, but for the older versions there is a split in logic because of the deprecated functions in the API. I also do not use GLUT. I hope this helps.
The other thing I noticed: did you pay attention to your vertex winding order? Meaning, are the vertices being used by OpenGL in CCW order or CW order? Is back-face culling turned on or off? There are a lot of elements to consider when setting up and configuring an OpenGL context. It has been a while since I worked with older versions of OpenGL, but I do know that once you start working with a specific version or newer you will have to supply your own model-view-projection matrix; just something to consider.
The issue I ran into was using pipeline features without defining a shader program. The spec says this should work, but on my graphics card it did not. (See my update in the question for more specifics.)
Thanks to all the commenters for nudging me in the right direction.
For the past three hours I have been trying to figure out how to draw two different triangles with different colours using shaders in OpenGL, and I still cannot figure it out. Here is my code:
void setShaders(void)
{
    vshader = loadShader("test.vert", GL_VERTEX_SHADER_ARB);
    fshader = loadShader("test.frag", GL_FRAGMENT_SHADER_ARB);
    vshader2 = loadShader("test2.vert", GL_VERTEX_SHADER_ARB);
    fshader2 = loadShader("test2.frag", GL_FRAGMENT_SHADER_ARB);

    shaderProg = glCreateProgramObjectARB();
    glAttachObjectARB(shaderProg, vshader);
    glAttachObjectARB(shaderProg, fshader);
    glLinkProgramARB(shaderProg);

    shaderProg2 = glCreateProgramObjectARB();
    glAttachObjectARB(shaderProg2, vshader2);
    glAttachObjectARB(shaderProg2, fshader2);
    glLinkProgramARB(shaderProg2);
}

void makeBuffers(void)
{
    // smaller orange triangle
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(points), points, GL_STATIC_DRAW);

    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);
    glEnableVertexAttribArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, NULL);

    // larger purple triangle
    glGenBuffers(1, &vbo2);
    glBindBuffer(GL_ARRAY_BUFFER, vbo2);
    glBufferData(GL_ARRAY_BUFFER, sizeof(points2), points2, GL_STATIC_DRAW);

    glGenVertexArrays(1, &vao2);
    glBindVertexArray(vao2);
    glEnableVertexAttribArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, vbo2);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, NULL);
}

void window::displayCallback(void)
{
    Matrix4 m4;                                  // MT = UT * SpinMatrix
    m4 = cube.getMatrix();                       // make copy of the cube main matrix
    cube.get_spin().mult(m4);                    // mult

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);   // clear color and depth buffers
    glMatrixMode(GL_MODELVIEW);
    glLoadMatrixd(cube.get_spin().getPointer()); // pass the pointer to new MT matrix

    // draw smaller orange triangle
    glUseProgramObjectARB(shaderProg);
    glBindVertexArray(vao);
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glDeleteObjectARB(shaderProg);

    // draw the larger purple triangle
    glUseProgramObjectARB(shaderProg2);
    glBindVertexArray(vao2);
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glDeleteObjectARB(shaderProg2);

    glFlush();
    glutSwapBuffers();
}
shaders:
test.vert and test2.vert are the same and are:
#version 120
//varying vec3 vp;
void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
test.frag:
#version 120
void main()
{
    gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0);
}
test2.frag:
#version 120
void main()
{
    gl_FragColor = vec4(0.5, 0.0, 0.5, 1.0);
}
But what I get is two triangles that are both coloured purple. What am I doing wrong that causes my smaller orange triangle to be drawn in purple as well?
You are deleting the shader programs after you use them in the displayCallback() method:
...
glDrawArrays (GL_TRIANGLES, 0, 3);
glDeleteObjectARB(shaderProg);
...
glDrawArrays (GL_TRIANGLES, 0, 3);
glDeleteObjectARB(shaderProg2);
If displayCallback() is called more than once, which you certainly need to expect since a window will often have to be redrawn multiple times, the shaders will be gone after the first time. In fact, the second one will not be immediately deleted because it is the currently active program, which explains why it continues to be used for both triangles.
Shader programs are only actually deleted once glDelete*() has been called on them and they are no longer referenced as the active program. So on your first glDelete*() call for shaderProg, that program is deleted as soon as you make shaderProg2 active, because shaderProg is then no longer active, which releases its last reference.
You should not delete the shader programs until shutdown, or until you don't plan to use them anymore for rendering, e.g. because you're creating new programs. So in your case, you can delete them when the application exits. At least that's often considered good style, even though it's not technically necessary. OpenGL resources are cleaned up automatically when an application exits, similar to regular memory allocations.
BTW, if you are using at least OpenGL 2.0, all the calls for using shaders and programs are core functionality. There's no need to use the ARB version calls.
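For example, a cleanup along these lines is enough (just a sketch; the cleanup() function and the atexit() registration are one possible arrangement), and the glDeleteObjectARB() calls inside displayCallback() simply go away:
#include <cstdlib>   // for atexit

void cleanup(void)
{
    glUseProgram(0);               // make sure neither program is still active
    glDeleteProgram(shaderProg);
    glDeleteProgram(shaderProg2);
}

// In your initialization, after setShaders():
atexit(cleanup);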
I was trying to draw a cube using the glDrawElements function, but even the simple code below only gives me a black screen. If it helps, I'm programming in Xcode 6.
//
//  main.cpp
//  Copyright (c) 2014 Guilherme Cardoso. All rights reserved.
//

#include <iostream>
#include <OpenGL/OpenGL.h>
#include <GLUT/GLUT.h>
#include <vector>
#include <math.h>

const GLfloat width = 500;
const GLfloat height = 500;

GLubyte cubeIndices[24] = {0,3,2,1, 2,3,7,6, 0,4,7,3,
                           1,2,6,5, 4,5,6,7, 0,1,5,4};

GLfloat vertices[][3] = {{-1.0,-1.0,-1.0}, {1.0,-1.0,-1.0},
                         {1.0,1.0,-1.0},   {-1.0,1.0,-1.0},
                         {-1.0,-1.0,1.0},  {1.0,-1.0,1.0},
                         {1.0,1.0,1.0},    {-1.0,1.0,1.0}};

GLfloat colors[][3] = {{0.0,0.0,0.0}, {1.0,0.0,0.0},
                       {1.0,1.0,0.0}, {0.0,1.0,0.0},
                       {0.0,0.0,1.0}, {1.0,0.0,1.0},
                       {1.0,1.0,1.0}, {0.0,1.0,1.0}};

void display(){
    glEnableClientState(GL_COLOR_ARRAY);
    glEnableClientState(GL_VERTEX_ARRAY);

    glColorPointer(3, GL_FLOAT, 0, colors);
    glVertexPointer(3, GL_FLOAT, 0, vertices);

    //glDrawArrays(GL_QUADS, 0, 24);
    glDrawElements(GL_QUADS, 24, GL_UNSIGNED_BYTE, cubeIndices);

    glDisableClientState(GL_VERTEX_ARRAY);
    glutSwapBuffers();
}

void mouse(int button, int state, int x, int y){
}

void keyboard(unsigned char key, int x, int y){
    if(key == 'q' || key == 'Q') exit(0);
}

void init(){
    glEnable(GL_DEPTH_TEST);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluOrtho2D(0, width, height, 0);
    glMatrixMode(GL_MODELVIEW);
    glClearColor(1.0, 1.0, 1.0, 1.0);
    //glColor3f(0.0,0.0,0.0);
}

void idle(){
}

int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(width, height);
    glutCreateWindow("Assignment 3");
    glutPositionWindow(0, 0);
    glutDisplayFunc(display);
    glutMouseFunc(mouse);
    glutKeyboardFunc(keyboard);
    glutIdleFunc(idle);
    init();
    glutMainLoop();
}
I already checked some tutorials, and this is not much different from what they do.
You never clear the color and especially the depth buffer. You should do that at the beginning of your display() function.
Also, your matrix setup is a bit weird. You set up an ortho projection in pixel units, but try to draw a cube in the range [-1,1], so it will only be 2 pixels wide on the screen.
Actually your code is drawing the cube just fine. Look closely :P
The main issue is your projection. The default GL viewing volume is a -1 to 1 cube, which would contain your cube nicely, but the call to gluOrtho2D redefines the projection so that OpenGL coordinates match pixels. Since your cube only spans -1 to 1, it ends up just two pixels big, with the rest offscreen.
Instead, drop the gluOrtho2D, which sets the Z dimension -1 to 1 and only allows you to set X/Y, and create a slightly bigger projection...
glOrtho(-2, 2, -2, 2, -2, 2);
Note: As #derhass suggests, calling glClear is important especially with depth testing enabled (without it, the cube from the last draw call will hide the updated cube).
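Concretely, with only those two changes your init() and display() would look roughly like this (a sketch, everything else unchanged):
void init(){
    glEnable(GL_DEPTH_TEST);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(-2, 2, -2, 2, -2, 2);   // big enough to contain the -1..1 cube
    glMatrixMode(GL_MODELVIEW);
    glClearColor(1.0, 1.0, 1.0, 1.0);
}

void display(){
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);   // was missing entirely

    glEnableClientState(GL_COLOR_ARRAY);
    glEnableClientState(GL_VERTEX_ARRAY);
    glColorPointer(3, GL_FLOAT, 0, colors);
    glVertexPointer(3, GL_FLOAT, 0, vertices);
    glDrawElements(GL_QUADS, 24, GL_UNSIGNED_BYTE, cubeIndices);
    glDisableClientState(GL_VERTEX_ARRAY);

    glutSwapBuffers();
}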
I'm trying to draw some basic triangles using OpenGL, but it's not rendering on screen. These are the relevant functions:
glewInit();
glClearColor(0.0, 0.0, 0.0, 1.0);
glFrontFace(GL_CW);
glCullFace(GL_BACK);
glEnable(GL_CULL_FACE);
glEnable(GL_DEPTH_TEST);
Vertex vertices[] = {Vertex(Vector3f(0.0, 1.0, 0.0)),
Vertex(Vector3f(-1.0, -1.0, 0.0)),
Vertex(Vector3f(1.0, -1.0, 0.0))};
mesh.addVertices(vertices, 3);
Pastebin links to Vertex.hpp and Vector3f.hpp:
Vertex.hpp
Vector3f.hpp
/*
 * Mesh.cpp:
 */
Mesh::Mesh()
{
    glGenBuffers(1, &m_vbo); // unsigned int Mesh::m_vbo
}

void Mesh::addVertices(Vertex vertices[4], int indexSize)
{
    m_size = indexSize * 3;
    glBindBuffer(GL_ARRAY_BUFFER, m_vbo);
    glBufferData(GL_ARRAY_BUFFER, m_size, vertices, GL_STATIC_DRAW);
}

void Mesh::draw()
{
    glEnableVertexAttribArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, m_vbo);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 4 * sizeof(Vertex), 0);
    glDrawArrays(GL_TRIANGLES, 0, m_size);
    glDisableVertexAttribArray(0);
}
It's just black if I call glClear; otherwise I get just the random noise of a default window. I can make it draw a triangle by using the most primitive method:
glBegin(GL_TRIANGLES);
glColor3f(0.4, 0.0, 0.0);
glVertex2d(0.0, 0.5);
glVertex2d(-0.5, -0.5);
glVertex2d(0.5, -0.5);
glEnd();
That works and displays what it should correctly, so I guess my application is not 100% busted. The tutorial I'm following is in Java, and I'm translating it to C++ and SFML as I go along, so it's possible that something got lost in translation, so to speak, unless I'm just missing something really basic (more likely).
How do we fix this so it uses the Vertex list to draw the triangle like it's supposed to?
So many mistakes. There are truly a lot of examples, in any language, so why?
const float pi = 3.141592653589793; is a member field of Vector3f. Do you realise this is a non-static member, so it is included in each and every Vector3f you use, and your vectors actually have four elements: x, y, z, and pi? Did you inform GL about that, so it could skip this garbage data? I don't think so.
You are using glVertexAttribPointer, but you don't have an active shader. There is no guarantee that position is in slot 0. Either use glVertexPointer, or use a shader with the position attribute bound to 0.
void Mesh::addVertices(Vertex vertices[4], int indexSize): what is the [4] supposed to mean here? While it is not an error, it is at least misleading.
glBufferData(GL_ARRAY_BUFFER, m_size, vertices, GL_STATIC_DRAW); m_size is 3*3 in your example, while the documentation says it should be the array size in bytes, which is sizeof(Vertex) * indexSize.
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 4 * sizeof(Vertex), 0); why is the stride parameter 4*sizeof(Vertex)? Either set it to 0 or write the correct stride, which is sizeof(Vertex).
glDrawArrays(GL_TRIANGLES, 0, m_size); m_size is already [incorrectly] set as the "vertex buffer size", while DrawArrays expects the number of vertices to draw, which is m_size / sizeof(Vertex) (given m_size is calculated correctly).
I'm following this guide and I'm trying to draw a quad to the screen. I also looked at the source code; it's the same and it should work, but in my case nothing is displayed on the screen. I'm using OpenGL 2.0 with a vertex shader that just sets the color to red, so the quad should be visible on the screen.
Before calling glutMainLoop I generate the vertex buffer object:
#include <GL/glew.h>   // glew.h must be included before headers that pull in gl.h
#include <GL/glut.h>
#include <vector>

using std::vector;

vector<GLfloat> quad;
GLuint buffer;

void init()
{
    // This routine gets called before glutMainLoop(); I omitted all the code
    // that has to do with shaders, since it's correct.
    glewInit();
    quad = vector<GLfloat>{-1,-1,0,  1,-1,0,  1,1,0,  -1,1,0};
    glGenBuffers(1, &buffer);
    glBindBuffer(GL_ARRAY_BUFFER, buffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(GLfloat) * 12, quad.data(), GL_STATIC_DRAW);
}
This is my rendering routine:
void display()
{
    glClearColor(0, 0, 0, 0);
    glClear(GL_COLOR_BUFFER_BIT);

    glEnableVertexAttribArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, buffer);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);
    // I also tried passing quad.data() as the last argument, but no luck.
    glDrawArrays(GL_QUADS, 0, 12);
    glDisableVertexAttribArray(0);

    glutSwapBuffers();
}
The problem is that nothing is drawn to the screen; I just see a black window. The quad should be red, because I set the red color in the vertex shader.
So maybe the problem is the count in glDrawArrays(GL_QUADS, 0, 12);, which must be glDrawArrays(GL_QUADS, 0, 4); the count is the number of vertices to draw, not the number of floats.
I was missing glEnableClientState like this:
glEnableClientState(GL_VERTEX_ARRAY);
glDrawArrays(GL_QUADS,0,12);
glDisableClientState(GL_VERTEX_ARRAY);
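For reference, combining both answers, the drawing code ends up roughly like this (a sketch only; it assumes the shader program is already bound and that its vertex shader reads the classic gl_Vertex attribute, so the fixed-function vertex array path is used):
void display()
{
    glClearColor(0, 0, 0, 0);
    glClear(GL_COLOR_BUFFER_BIT);

    glBindBuffer(GL_ARRAY_BUFFER, buffer);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, 0);

    glDrawArrays(GL_QUADS, 0, 4);   // 4 vertices, not 12 floats

    glDisableClientState(GL_VERTEX_ARRAY);
    glutSwapBuffers();
}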