I searched for this and only found a post from 2014 asking about a somewhat similar situation. However, as I couldn't understand what was done there, I'm asking again, specifically for my implementation, hoping this sheds some light on the topic in general as well. I am fairly new to C++ and OpenGL, so please be kind enough to excuse any silly mistakes.
I'm trying to implement a simple 2D HUD for my 3D game. The game itself already renders correctly; because of the bloom effect, the whole scene is rendered onto a screen quad.
What I now want to do is place a HUD over this rendered scene, but I can't seem to get that to work.
My screen quad for the game is drawn like so:
unsigned int quadVAO = 0;
unsigned int quadVBO;
void renderQuad()
{
if (quadVAO == 0)
{
float quadVertices[] = {
// vertex attributes for a quad that fills the entire screen in Normalized Device Coordinates.
// texCoords
0.0f, 1.0f,
0.0f, 0.0f,
1.0f, 0.0f,
0.0f, 1.0f,
1.0f, 0.0f,
1.0f, 1.0f
};
// setup plane VAO
glGenVertexArrays(1, &quadVAO);
glGenBuffers(1, &quadVBO);
glBindVertexArray(quadVAO);
glBindBuffer(GL_ARRAY_BUFFER, quadVBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(quadVertices), &quadVertices, GL_STATIC_DRAW);
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 2 * sizeof(float), (void*)0);
}
glBindVertexArray(quadVAO);
glDrawArrays(GL_TRIANGLES, 0, 6);
glBindVertexArray(0);
}
What I tried to do is change my renderQuad method into a renderHUDQuad one by basically just changing the dimensions of the quad so it appears in the bottom left corner of the screen.
The code looks as follows:
unsigned int HUDquadVAO = 0;
unsigned int HUDquadVBO;
void renderHUDQuad()
{
if (HUDquadVAO == 0)
{
float HUDquadVertices[] = {
// vertex attributes for a small quad in the bottom-left corner of the screen
// texCoords
0.0f, 0.02f,
0.0f, 0.0f,
0.2f, 0.0f,
0.0f, 0.02f,
0.2f, 0.0f,
0.2f, 0.02f
};
// setup plane VAO
glGenVertexArrays(1, &HUDquadVAO);
glGenBuffers(1, &HUDquadVBO);
glBindVertexArray(HUDquadVAO);
glBindBuffer(GL_ARRAY_BUFFER, HUDquadVBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(HUDquadVertices), &HUDquadVertices, GL_STATIC_DRAW);
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 2 * sizeof(float), (void*)0);
}
glBindVertexArray(HUDquadVAO);
glDrawArrays(GL_TRIANGLES, 0, 6);
glBindVertexArray(0);
}
As this only needs to be a small green quad, i.e. a health bar for the player, I was thinking of just assigning it a green texture or something similar.
However, when drawing my two quads like this:
// Third pass = Combined bloom pictures
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
bloomShader->use();
// Set uniform for multiple layout uniforms
bloomShader->setUniform("scene", 0);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, colorAndLightBuffers[0]);
// Set uniform for multiple layout uniforms
bloomShader->setUniform("bloomBlur", 1);
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, pingpongBuffer[horizontal == 0 ? 1 : 0]);
bloomShader->setUniform("bloom", bloom);
bloomShader->setUniform("exposure", exposure);
renderQuad();
renderHUDQuad();
// Swap buffers
glfwSwapBuffers(window);
I only get the HUD element without any of the stuff I drew before, as if the rest of the screen was rendered black. I thought I could just draw on top of the old buffer; is there a way to do this?
You messed up your GL state quite badly:
void renderHUDQuad() {
if (HUDquadVAO == 0)
{
[...]
glGenVertexArrays(1, &quadVAO);
You actually use quadVAO in the rest of this function, so you overwrite your fullscreen quad with the smaller one, which means that from the next frame on the rest of your scene will be scaled down to this small quad...
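So the fix is to make sure renderHUDQuad() only ever creates and binds its own objects. A minimal sketch of the relevant setup lines, assuming the rest of the function stays as posted:
// Generate and fill objects that belong to the HUD quad only;
// quadVAO/quadVBO of the fullscreen quad are never touched here.
glGenVertexArrays(1, &HUDquadVAO);
glGenBuffers(1, &HUDquadVBO);
glBindVertexArray(HUDquadVAO);
glBindBuffer(GL_ARRAY_BUFFER, HUDquadVBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(HUDquadVertices), HUDquadVertices, GL_STATIC_DRAW);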
I have looked up almost all related questions regarding flickering in OpenGL. They mostly have something to do with the z-buffer or perspective projection. However, I'm rendering a single quad on screen, and without depth testing at that. I update the model uniform every frame with the same value and I get flickering. However, if I have my object translate around the screen by updating the uniform, then it all works fine.
mat4 model = mat4_identity();
model = mat4_translatev(make_vec3(100.0f, 200.0f, 0.0f));
vec4 color = make_vec4(1.0f, 0.8f, 0.7f, 1.0f);
mat4 projection = mat4_ortho(0.0f, 800.0f, 600.0f, 0.0f, -1.0f, 1.0f);
Shader shader("generic_shader.vs", "generic_shader.fs");
shader.use();
//shader.set_vec4("color", &color);
shader.set_mat4("model", &model);
shader.set_mat4("projection", &projection);
float vertices[] = {
0.0f, 0.0f,
0.0f, 200.0f,
200.0f, 0.0f,
200.0f, 200.0f,
};
unsigned int indices[] = {
0, 1, 2,
2, 1, 3,
};
unsigned int vao, vbo, ebo;
glGenVertexArrays(1, &vao);
glGenBuffers(1, &vbo);
glGenBuffers(1, &ebo);
glBindVertexArray(vao);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ebo);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);
glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 2 * sizeof(float), (void*)0);
glEnableVertexAttribArray(0);
glBindVertexArray(0);
while (!glfwWindowShouldClose(window))
{
float currentFrame = static_cast<f32>(glfwGetTime());
deltaTime = currentFrame - lastFrame;
while(deltaTime < REQUIRED_FRAME_TIME)
{
currentFrame = static_cast<f32>(glfwGetTime());
deltaTime = currentFrame - lastFrame;
}
deltaTime = currentFrame - lastFrame;
lastFrame = currentFrame;
processInput(window);
glDisable(GL_DEPTH_TEST);
glClearColor(1.0f, 0.5f, 0.5f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);
shader.use();
//model = mat4_translatev(make_vec3(16.0f * currentFrame, 12.0f * currentFrame, 0.0f)); // <- if I uncomment this then it does not flicker
shader.set_mat4("model", &model);
glBindVertexArray(vao);
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_INT, 0);
glBindVertexArray(0);
glfwPollEvents();
glfwSwapBuffers(window);
}
This is what shader.use() does:
void Shader::use()
{
glUseProgram(this->program_id);
}
My matrices are column-major, and this is how the shader function sets the uniform:
void Shader::set_mat4(const char* uniform_name, mat4* value)
{
*value = mat4_transpose(*value);
glUniformMatrix4fv(glGetUniformLocation(this->program_id, uniform_name), 1, GL_TRUE, &value->E[0][0]);
}
processInput() doesn't do anything; consider it an empty function.
I'm using my own math library for vector and matrix operations. I'm trying to learn OpenGL and have made notes on the things I have learned. I hope someone already familiar with how OpenGL works can help me understand what is happening here.
Here is a GIF depicting the flickering. Please note that the flickering stops if I uncomment the one line marked in the code above.
Flickering GIF
The problem is not with the OpenGL part of your code, but with the way you transpose your model matrix.
The following code
*value = mat4_transpose(*value);
overwrites value in place with its transposed representation, which means that every second frame the screen is rendered with the wrong matrix. Stop storing the result in value (use a local variable instead) and everything should work.
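A minimal sketch of that fix, assuming the mat4/mat4_transpose types from the question: transpose into a local variable so the caller's matrix is left untouched between frames.
void Shader::set_mat4(const char* uniform_name, mat4* value)
{
    // Transpose a copy; do not modify the caller's matrix in place.
    mat4 transposed = mat4_transpose(*value);
    glUniformMatrix4fv(glGetUniformLocation(this->program_id, uniform_name),
                       1, GL_TRUE, &transposed.E[0][0]);
}
Since the matrices are already column-major, another option is to drop the manual transpose entirely and pass GL_FALSE, which is the layout glUniformMatrix4fv expects by default.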
I've been looking into how to make the bubble texture transparent but haven't found anything that would fit into my code. I have tried glColor4f(), but I might be putting it in the wrong place. I'm a total beginner in OpenGL; I've been given some basic code to which I need to add other objects in order to make a 2D scene, and I don't know exactly what every single line does.
These are the relevant pieces of code:
// Globals
GLuint locT; // location of "T" uniform variable in myShaderProgram
GLuint locT2;
// Textures
GLuint myBubblesTexture = 0;
// Shader program object for applying textures to our shapes
GLuint myShaderProgram;
// Vertex Buffer Object IDs for the ground texture object
GLuint bubblesPosVBO, bubblesColourVBO, bubblesTexCoordVBO, bubblesIndicesVBO;
// 1) Position Array - Store vertices as (x,y) pairs
static GLfloat bubblesVertices[] = {
-0.2f, -0.2f,
-0.2f, 0.2f,
0.2f, -0.2f,
0.2f, 0.2f
};
// 2) Colour Array - Store RGB values as unsigned bytes
static GLubyte bubblesColors[] = {
255, 0, 0, 255,
255, 255, 0, 255,
0, 255, 0, 255,
0, 255, 255, 255
};
// 3) Texture coordinate array (store uv coordinates as floating point values)
static float bubblesTextureCoords[] = {
0.0f, 1.0f,
0.0f, 0.0f,
1.0f, 1.0f,
1.0f, 0.0f
};
// 4) Index Array - Store indices to quad vertices - this determines the order the vertices are to be processed
static GLubyte bubblesVertexIndices[] = { 0, 1, 2, 3 };
void init(int argc, char* argv[]);
void setupBubblesTextureVBO(void);
void display(void);
void drawTexturedBubblesVBO(void);
int _tmain(int argc, char* argv[])
{
init(argc, argv);
glutMainLoop();
// Shut down COM
shutdownCOM();
return 0;
}
void init(int argc, char* argv[]) {
// Initialise COM so we can use Windows Imaging Component
initCOM();
// Initialise FreeGLUT
glutInit(&argc, argv);
glutInitContextVersion(3, 3);
glutInitContextProfile(GLUT_COMPATIBILITY_PROFILE);
glutInitDisplayMode(GLUT_RGBA | GLUT_DEPTH | GLUT_DOUBLE);
glutInitWindowSize(1200, 800);
glutInitWindowPosition(64, 64);
glutCreateWindow("Funky Fish");
// Register callback functions
glutDisplayFunc(display);
// Initialise GLEW library
GLenum err = glewInit();
// Setup colour to clear the window
glClearColor(0.2f, 0.2f, 0.8f, 0.0f);
glLineWidth(9.0f);
//Load textures
myBubblesTexture = fiLoadTextureA("bubbles.png");
//Shader setup
myShaderProgram = setupShaders(string("Shaders\\basic_vertex_shader.txt"), string("Shaders\\basic_fragment_shader.txt"));
// Get uniform location of "T" variable in shader program (we'll use this in the play function to give the uniform variable "T" a value)
locT = glGetUniformLocation(myShaderProgram, "T");
//Setup the bubbles using VBO
setupBubblesTextureVBO();
}
// our bubbles
void setupBubblesTextureVBO(void) {
// setup VBO for the quad object position data
glGenBuffers(1, &bubblesPosVBO);
glBindBuffer(GL_ARRAY_BUFFER, bubblesPosVBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(bubblesVertices), bubblesVertices, GL_STATIC_DRAW);
// setup VBO for the quad object colour data
glGenBuffers(1, &bubblesColourVBO);
glBindBuffer(GL_ARRAY_BUFFER, bubblesColourVBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(bubblesColors), bubblesColors, GL_STATIC_DRAW);
// setup VBO for the quad object texture coord data
glGenBuffers(1, &bubblesTexCoordVBO);
glBindBuffer(GL_ARRAY_BUFFER, bubblesTexCoordVBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(bubblesTextureCoords), bubblesTextureCoords, GL_STATIC_DRAW);
// setup quad vertex index array
glGenBuffers(1, &bubblesIndicesVBO);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, bubblesIndicesVBO);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(bubblesVertexIndices), bubblesVertexIndices, GL_STATIC_DRAW);
}
void display(void) {
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
// draw our bubble
drawTexturedBubblesVBO();
glutSwapBuffers();
}
void drawTexturedBubblesVBO(void) {
glUseProgram(myShaderProgram);
glEnable(GL_TEXTURE_2D);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_BLEND);
// Move our bubble to the centre of the screen
GUMatrix4 T = GUMatrix4::translationMatrix(0.0f, 0.0f, 0.0f);
glUniformMatrix4fv(locT, 1, GL_FALSE, (GLfloat*)&T);
// Bind each vertex buffer and enable
// The data is still stored in the GPU but we need to set it up (which also includes validation of the VBOs behind-the-scenes)
// Bind texture
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, myBubblesTexture);
glUniform1i(glGetUniformLocation(myShaderProgram, "texture"), 0);
glEnable(GL_TEXTURE_2D);
glBindBuffer(GL_ARRAY_BUFFER, bubblesPosVBO);
glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, (const GLvoid*)0);
glEnableVertexAttribArray(0);
glBindBuffer(GL_ARRAY_BUFFER, bubblesColourVBO);
glVertexAttribPointer(1, 4, GL_UNSIGNED_BYTE, GL_TRUE, 0, (const GLvoid*)0);
glEnableVertexAttribArray(1);
glBindBuffer(GL_ARRAY_BUFFER, bubblesTexCoordVBO);
glVertexAttribPointer(2, 2, GL_FLOAT, GL_FALSE, 0, (const GLvoid*)0);
glEnableVertexAttribArray(2);
// Bind the index buffer
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, bubblesIndicesVBO);
// Draw the object - same function call as used for vertex arrays but the last parameter is interpreted as an offset into the currently bound index buffer (set to 0 so we start drawing from the beginning of the buffer).
glDrawElements(GL_TRIANGLE_STRIP, 4, GL_UNSIGNED_BYTE, (GLvoid*)0);
// glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glDisable(GL_TEXTURE_2D);
// use to force disable our shaderprogram
// glUseProgram(0);
}
The code looks OK, so:
Does your texture have an alpha channel?
Does your OpenGL context's pixel format have an alpha channel buffer?
What kind of transparency do you want (should the whole texture have the same transparency, or should the transparency be modulated)?
glColor4f(1.0, 1.0, 1.0, alpha) will work only if your textures are configured to be modulated by glColor and have a nonzero alpha channel set, so you need to add a
glTexEnvi(GL_TEXTURE_ENV,GL_TEXTURE_ENV_MODE,GL_MODULATE);
call after binding the texture to make it work.
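For example, inside drawTexturedBubblesVBO() the relevant calls could be ordered roughly like this (a sketch of the above suggestion; the 0.5f alpha is just an example value):
glBindTexture(GL_TEXTURE_2D, myBubblesTexture);
// Let the texture colour be modulated by the current colour, so the alpha
// given to glColor4f controls the transparency of the whole bubble.
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glColor4f(1.0f, 1.0f, 1.0f, 0.5f);

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);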
In case you do not have an alpha channel (in the texture or in the pixel format), you can use:
glBlendFunc(GL_SRC_COLOR, GL_ONE_MINUS_SRC_COLOR);
Also take a look at:
OpenGL - How to create Order Independent transparency?
I know that in OpenGL 2.0 and earlier we can draw a line simply like this:
glBegin(GL_LINES);
glVertex3f(20.0f,150.0f,0.0f);
glVertex3f(220.0f,150.0f,0.0f);
glVertex3f(200.0f,160.0f,0.0f);
glVertex3f(200.0f,160.0f,0.0f);
glEnd();
But how do I do a similar thing in modern OpenGL (3.0+)?
I have read Drawing round points using modern OpenGL, but the answer there is not about points at specific coordinates; since I want to draw a polygon whose points have certain coordinates, it's not quite helpful.
I use the code below, but it shows nothing except a blue background. What did I miss?
GLuint VertexArrayID;
glGenVertexArrays(1, &VertexArrayID);
glBindVertexArray(VertexArrayID);
static const GLfloat g_vertex_buffer_data[] = {
20.0f, 150.0f, 0.0f, 1.0f,
220.0f, 150.0f, 0.0f, 1.0f,
200.0f, 160.0f, 0.0f, 1.0f
};
GLuint vertexbuffer;
glGenBuffers(1, &vertexbuffer);
glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(g_vertex_buffer_data), g_vertex_buffer_data, GL_STATIC_DRAW);
do{
// Clear the screen
glClear( GL_COLOR_BUFFER_BIT );
// 1rst attribute buffer : vertices
glEnableVertexAttribArray(0);
glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
glVertexAttribPointer(
0, // attribute 0. No particular reason for 0, but must match the layout in the shader.
4, // size
GL_FLOAT, // type
GL_FALSE, // normalized?
0, // stride
(void*)0 // array buffer offset
);
// Draw the triangle !
glDrawArrays(GL_LINES, 0, 2); // 2 vertices starting at 0 -> 1 line
glDisableVertexAttribArray(0);
// Swap buffers
glfwSwapBuffers(window);
} // Check if the ESC key was pressed or the window was closed
while( glfwGetKey(window, GLFW_KEY_ESCAPE ) != GLFW_PRESS &&
glfwWindowShouldClose(window) == 0 );
1) You have to define an array of vertices that contains the points of your polygon lines, like in your example:
GLfloat vertices[] =
{
20.0f, 150.0f, 0.0f, 1.0f,
220.0f, 150.0f, 0.0f, 1.0f,
200.0f, 160.0f, 0.0f, 1.0f
};
2) You have to define and bind a Vertex Buffer Object (VBO) to be able to pass your vertices to the vertex shader. Like this:
// This is the identifier for your vertex buffer
GLuint vbo;
// This creates our identifier and puts it in vbo
glGenBuffers(1, &vbo);
// This binds our vbo
glBindBuffer(GL_ARRAY_BUFFER, vbo);
// This hands the vertices into the vbo and to the rendering pipeline
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
3) Now we are ready to draw. Do this:
// "Enable a port" to the shader pipeline
glEnableVertexAttribArray(0);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
// pass information about how vertex array is composed
glVertexAttribPointer(0, // same as in glEnableVertexAttribArray(0)
4, // # of coordinates that build a vertex
GL_FLOAT, // data type
GL_FALSE, // normalized?
0, // stride
(void*)0);// vbo offset
glDrawArrays(GL_LINES, 0, 2);
glDisableVertexAttribArray(0);
Steps 1) and 2) can be done before rendering, as initialization. Step 3) is done in your rendering loop. You'll also need a vertex shader and a fragment shader to visualize the line with color; a minimal example is sketched below.
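A minimal shader pair could look roughly like this (illustrative only: the names vertexShaderSource and fragmentShaderSource are placeholders, and the position attribute is assumed to already be in clip space, otherwise a projection matrix would also be needed):
// Minimal GLSL 3.30 shaders embedded as C++ string literals (hypothetical names).
const char* vertexShaderSource = R"(
    #version 330 core
    layout(location = 0) in vec4 position; // matches glVertexAttribPointer(0, 4, ...)
    void main()
    {
        gl_Position = position;
    }
)";

const char* fragmentShaderSource = R"(
    #version 330 core
    out vec4 fragColor;
    void main()
    {
        fragColor = vec4(1.0, 1.0, 1.0, 1.0); // draw the line in white
    }
)";
These still need to be compiled and linked into a program (glCreateShader, glCompileShader, glAttachShader, glLinkProgram) and activated with glUseProgram before the glDrawArrays call.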
If you don't know anything about these things and would like to start with OpenGL 3, I'd suggest starting with a tutorial like this:
http://www.opengl-tutorial.org/beginners-tutorials/tutorial-1-opening-a-window/
I wrote a program to draw a simple triangle, using a VAO, a VBO and GLSL shaders. The result is the following:
But if I enable the depth test using:
glEnable(GL_DEPTH_TEST)
nothing appears in the window.
Here is some code from my program:
float positionData[] = {
-0.8f, -0.8f, 0.0f,
0.8f, -0.8f, 0.0f,
0.0f, 0.8f, 0.0f };
float colorData[] = {
1.0f, 0.0f, 0.0f,
0.0f, 1.0f, 0.0f,
0.0f, 0.0f, 1.0f };
void initVBO()
{
// Create and populate the buffer objects
GLuint vboHandles[2];
glGenBuffers(2, vboHandles);
GLuint positionBufferHandle = vboHandles[0];
GLuint colorBufferHandle = vboHandles[1];
glBindBuffer(GL_ARRAY_BUFFER,positionBufferHandle);
glBufferData(GL_ARRAY_BUFFER,9 * sizeof(float),
positionData,GL_STATIC_DRAW);
glBindBuffer(GL_ARRAY_BUFFER,colorBufferHandle);
glBufferData(GL_ARRAY_BUFFER,9 * sizeof(float),
colorData,GL_STATIC_DRAW);
glGenVertexArrays(1,&vaoHandle);
glBindVertexArray(vaoHandle);
glEnableVertexAttribArray(0);
glEnableVertexAttribArray(1);
glBindBuffer(GL_ARRAY_BUFFER, positionBufferHandle);
glVertexAttribPointer( 0, 3, GL_FLOAT, GL_FALSE, 0, (GLubyte *)NULL );
glBindBuffer(GL_ARRAY_BUFFER, colorBufferHandle);
glVertexAttribPointer( 1, 3, GL_FLOAT, GL_FALSE, 0, (GLubyte *)NULL );
}
void display()
{
glClear(GL_COLOR_BUFFER_BIT);
glBindVertexArray(vaoHandle);
glDrawArrays(GL_TRIANGLES,0,3);
glBindVertexArray(0);
glutSwapBuffers();
}
My question is: why can't I draw the triangle after enabling the depth test?
There are typically multiple (types of) buffers used when rendering. One is the color buffer, which contains the pixel data in some pixel format (e.g. RGB with 8 bits per color channel). Another typical buffer is the depth buffer. Depth testing and writing to the depth buffer are two different things: depth testing checks a fragment's depth value against the depth value of the associated pixel(s) in the depth buffer and decides whether to accept or reject the fragment, while depth writing actually writes that value to the depth buffer.
Your program probably writes to and tests against the depth buffer, but you never clear the depth buffer. So even though the color buffer has been cleared, OpenGL believes there are already things written at/in front of (or whatever comparison is configured) the pixels you're trying to write to, and it rejects your fragments.
Typically you clear your depth buffer each frame. You do this by passing the GL_DEPTH_BUFFER_BIT flag to glClear.
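Applied to the display() function from the question, that means something like this (a sketch):
void display()
{
    // Clear the depth buffer together with the color buffer every frame;
    // otherwise last frame's depth values reject the new fragments.
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glBindVertexArray(vaoHandle);
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glBindVertexArray(0);
    glutSwapBuffers();
}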
You need to explicitly clear the depth buffer, too:
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
I'm just using the OpenGL SDL template in Xcode, and everything runs fine. I removed the Atlantis code and changed the main file's extension to .mm, then added some testing code to drawGL. Drawing a simple triangle in immediate mode at this point inside drawGL gives me a white triangle, but when I add the code to draw using a vertex buffer object, I just get a black window.
Here is my VBO drawing code:
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // Clear The Screen And The Depth Buffer
glLoadIdentity();
GLuint buffer;
float vertices[] = {
0.0f, 1.0f, 0.0f,
-1.0f,-1.0f, 0.0f,
1.0f,-1.0f, 0.0f
};
// VBO doesn't work :(
glGenBuffers(1, &buffer);
glBindBuffer(GL_ARRAY_BUFFER, buffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(float) * 9, vertices, GL_STATIC_DRAW);
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, vertices);
glDrawArrays(GL_TRIANGLES, 0, 3);
glDisableClientState(GL_VERTEX_ARRAY);
Your glVertexPointer() call looks suspect for VBO usage. With a buffer bound to GL_ARRAY_BUFFER, the last parameter is interpreted as a byte offset into that buffer, not as a client-memory pointer, so you need a BUFFER_OFFSET construct (or simply an offset of 0) instead of the vertices pointer.
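A sketch of the corrected call sequence, keeping the rest of the posted code as it is:
glBindBuffer(GL_ARRAY_BUFFER, buffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(float) * 9, vertices, GL_STATIC_DRAW);

glEnableClientState(GL_VERTEX_ARRAY);
// With a VBO bound, the last argument is a byte offset into the buffer,
// not a pointer to the CPU-side vertices array.
glVertexPointer(3, GL_FLOAT, 0, (const GLvoid*)0);
glDrawArrays(GL_TRIANGLES, 0, 3);
glDisableClientState(GL_VERTEX_ARRAY);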