What do "in vec" and "out vec" mean? - c++

In GLSL I don't understand the "in" and "out" variables. What do they mean?
Here is a sample of my code that I copied from a tutorial.
// Shader sources
const GLchar* vertexSource =
"#version 150 core\n"
"in vec2 position;"
"in vec3 color;"
"out vec3 Color;"
"void main() {"
" Color = color;"
" gl_Position = vec4(position, 0.0, 1.0);"
"}";
const GLchar* fragmentSource =
"#version 150 core\n"
"in vec3 Color;"
"out vec4 outColor;"
"void main() {"
" outColor = vec4(Color, 1.0);"
"}";

Variables declared in and out at "file" scope like that refer to stage input/output.
In a vertex shader, a variable declared in is a vertex attribute and is matched by an integer location to a vertex attribute pointer in OpenGL.
In a fragment shader, a variable declared in should match, by name, an output from the vertex shader (same name, but out).
In a fragment shader, a variable declared out is a color output and has a corresponding color attachment in the framebuffer you are drawing to.
In your vertex shader, you have two vertex attributes (position and color) used to compute the interpolated input in the fragment shader (Color). The fragment shader writes the interpolated color to the color buffer attachment identified by outColor.
From the shader code alone, it is impossible to tell which vertex attributes position and color are associated with, or which color buffer outColor writes to. Those associations must be set in GL code through calls like glBindAttribLocation (...) and glBindFragDataLocation (...) prior to linking.
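For the shaders above, those calls could look like this (a sketch; shaderProgram is assumed to be the program object with both shaders already attached):

```cpp
// Names must match the declarations in the shader source.
glBindAttribLocation(shaderProgram, 0, "position");   // vertex attribute 0
glBindAttribLocation(shaderProgram, 1, "color");      // vertex attribute 1
glBindFragDataLocation(shaderProgram, 0, "outColor"); // color attachment 0
glLinkProgram(shaderProgram); // bindings take effect at link time
```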

Related

How does the shader know that I'm editing color and not something else

There's this fragment shader which does work, but I do not understand its logic:
#version 330 core
out vec4 FragColor;
// the input variable from the vertex shader (same name and same type)
in vec4 vertexColor;
void main()
{
FragColor = vertexColor;
}
So how does the shader know that FragColor is supposed to represent the color of the fragment? There is no assignment anywhere.
I'm saying it because in the vertex shader it's clear what is happening:
#version 330 core
// the position variable has attribute position 0
layout (location = 0) in vec3 aPos;
// specify a color output to the fragment shader
out vec4 vertexColor;
void main()
{
// see how we directly give a vec3 to vec4's constructor
gl_Position = vec4(aPos, 1.0);
// set the output variable to a dark-red color
vertexColor = vec4(0.5, 0.0, 0.0, 1.0);
}
The fragment shader does not know that FragColor represents a "color". To the shader it is just a vector with 4 components. The outputs of the fragment shader are written into the framebuffer. A fragment shader has no other outputs (apart from the depth buffer and stencil buffer). Therefore the shader does not need to know that the output variable represents a color.
See also Fragment Shader - Output buffers.
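As an illustration (not from the question), core-profile GLSL even lets a fragment shader declare several such outputs, each routed to a different color attachment; which numbers end up being "the color" is decided entirely by that routing:

```glsl
#version 330 core
in vec4 vertexColor;

// Each output is just a vec4; its location selects the color attachment.
layout (location = 0) out vec4 FragColor;   // goes to GL_COLOR_ATTACHMENT0
layout (location = 1) out vec4 BrightColor; // goes to GL_COLOR_ATTACHMENT1

void main()
{
    FragColor   = vertexColor;
    BrightColor = vec4(1.0); // e.g. data for a second render target
}
```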

Why does the GL still work correctly when glBindFragDataLocation is commented out?

const GLchar* vertexSource1 = "#version 330 core\n"
"layout (location = 0) in vec2 position;\n"
"layout (location = 1) in vec3 color;\n"
"out vec3 Color;\n"
"void main()\n"
"{\n"
"gl_Position = vec4(position, 0.0, 1.0);\n"
"Color = color;\n"
"}\0";
const GLchar* fragmentSource1 = "#version 330 core\n"
" in vec3 Color;\n"
" out vec4 outColor;\n"
" void main()\n"
" {\n"
" outColor = vec4(Color, 1.0);\n"
" }\n";
GLuint shaderProgram1 = glCreateProgram();
glAttachShader(shaderProgram1, vertexShader1);
glAttachShader(shaderProgram1, fragmentShader1);
// glBindFragDataLocation(shaderProgram1, 0, "Color");
glLinkProgram(shaderProgram1);
Whether I add glBindFragDataLocation or not, the GL works correctly. Why?
Because you're "lucky". The OpenGL specification provides no guarantees about how fragment shader output locations are assigned if you don't assign them. It only says that each one will have a separate location; what locations those are is up to the implementation.
However, considering the sheer volume of code that writes to a single output variable without explicitly assigning it to a location, it's highly unlikely that an OpenGL implementation would ever assign the first FS output location to anything other than 0. So while it isn't a spec guarantee, at this point, it is a de-facto requirement of implementations.
Note: That doesn't mean you shouldn't assign that location manually. It's always best to be on the safe and explicit side.
FYI: layout(location) works for fragment shader outputs too. So you should use that if you're using it on vertex attributes. Then you don't have to worry about doing it from code.
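Applied to the fragment shader above, that explicit form would look like this:

```glsl
#version 330 core
in vec3 Color;

// Pin the output to location 0 instead of relying on the
// implementation's de-facto (but not spec-guaranteed) default.
layout (location = 0) out vec4 outColor;

void main()
{
    outColor = vec4(Color, 1.0);
}
```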

Vertex shaders in and out?

I've got two shaders like this:
const char* vertexShaderData =
"#version 450 \n"
"in vec3 vp;"
"in vec3 color;\n"
"out vec3 Color;\n"
"void main(){"
"Color=color;"
"gl_Position = vec4(vp, 1.0);"
"}";
const char* fragShaderData =
"#version 410\n"
"uniform vec4 incolor;\n"
"in vec3 Color;"
"out vec4 outColor;"
"void main(){"
"outColor = vec4(Color, 1.0);"
"}";
I understand that each shader is called for each vertex.
Where do the in parameters in my vertexShaderData get their values? At no point in the code do I specify what vp is or what color is. In the second shader, I get that the in value comes from the first shader's out value. But where do those initial ins come from?
About the out value of the fragShaderData: How is this value used? In other words, how does OpenGL know that this is an RGB color value and know to paint the triangle with this color?
For the vertex shader,
you can use glGetAttribLocation in C++ to query the driver-assigned location, or set it manually in GLSL like this: layout (location = 0) in vec3 vp;. Then you upload the data in C++ like this:
// (Vertex buffer must be bound at this point)
glEnableVertexAttribArray( a ); // 'a' would be 0 if you did the latter
glVertexAttribPointer( a, 3, GL_FLOAT, GL_FALSE, sizeof( your vertex ), nullptr );
For the fragment shader,
'in' variables must match vertex shader's 'out' variables, like in your sample code out vec3 Color; -> in vec3 Color;
gl_Position controls where outColor is painted.
You feed the data to the vertex shader from your OpenGL calls (on the CPU). Once you have compiled and linked the program (vertex shader + fragment shader), you feed it the vertices you want.
Unlike the vertex shader, this fragment shader will run once for EVERY pixel inside the triangle you are rendering. The outColor will be a vec4 (R,G,B,A) that "goes to your framebuffer". As for the color: in theory, it is abstract to OpenGL. The channels are called RGBA for convenience; you can even access the same data as XYZW (an alias for RGBA). OpenGL outputs NUMBERS to the framebuffer you choose (according to the rules of color attachments, etc.). In practice you have 4 channels that the monitor uses to display RGB (with A used for transparency). In other words, you can use GL programs to create triangles that output 1 channel or 2 channels, depending on your needs, and those channels can mean anything you need. For example, you can interpolate a YUV image, or a UV plane (2 channels). If you output these to the monitor the colors won't be correct, since the monitor expects RGB, but the OpenGL concept is broader than RGB. It will interpolate numbers for every pixel inside the triangle. That's it.
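To make the vertex-attribute side concrete, here is a sketch of how both ins could be fed from C++ (the interleaved layout and data values are my assumptions, and a VBO must already be bound and filled):

```cpp
// Hypothetical interleaved data: x, y, z, r, g, b for each vertex.
GLfloat vertices[] = {
     0.0f,  0.5f, 0.0f,   1.0f, 0.0f, 0.0f,
    -0.5f, -0.5f, 0.0f,   0.0f, 1.0f, 0.0f,
     0.5f, -0.5f, 0.0f,   0.0f, 0.0f, 1.0f,
};

GLint vpLoc    = glGetAttribLocation(shaderProgram, "vp");
GLint colorLoc = glGetAttribLocation(shaderProgram, "color");

glEnableVertexAttribArray(vpLoc);
glVertexAttribPointer(vpLoc, 3, GL_FLOAT, GL_FALSE,
                      6 * sizeof(GLfloat), (void*)0);

glEnableVertexAttribArray(colorLoc);
glVertexAttribPointer(colorLoc, 3, GL_FLOAT, GL_FALSE,
                      6 * sizeof(GLfloat), (void*)(3 * sizeof(GLfloat)));
```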

How to draw red lines if I used a fragment shader for texture?

I am writing a simple video player using opengl. I used Qt and followed its basic texture examples.
The vertex and fragment shaders are here:
QOpenGLShader *vshader = new QOpenGLShader(QOpenGLShader::Vertex, this);
const char *vsrc =
"attribute highp vec4 vertex;\n"
"attribute mediump vec4 texCoord;\n"
"varying mediump vec4 texc;\n"
"uniform mediump mat4 matrix;\n"
"void main(void)\n"
"{\n"
" gl_Position = matrix * vertex;\n"
" texc = texCoord;\n"
"}\n";
vshader->compileSourceCode(vsrc);
QOpenGLShader *fshader = new QOpenGLShader(QOpenGLShader::Fragment, this);
const char *fsrc =
"uniform sampler2D texture;\n"
"varying mediump vec4 texc;\n"
"void main(void)\n"
"{\n"
" gl_FragColor = texture2D(texture, texc.st);\n"
"}\n";
fshader->compileSourceCode(fsrc);
And I did this to display a image:
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, texture_cv.cols, texture_cv.rows, GL_RGB, GL_UNSIGNED_BYTE, texture_cv.data);
//then draw a quad
...
Then after this, how could I draw several red lines on the screen? Since I am using the fragment shader (I am very new to shaders), I cannot turn off the texture.
By far the easiest solution is to use a different shader program for drawing your red lines. Since it just draws a solid color, it will be very easy. The fragment shader could be something like:
void main()
{
gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
The vertex shader will be very similar to what you have, except that it does not need to produce texture coordinates. You might even be able to use the existing vertex shader.
It is very common to use multiple shader programs for rendering. You have a shader program for each different type of rendering, and switch between them with glUseProgram().
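A sketch of what the draw code could look like with two programs (the helper names are hypothetical; with Qt's QOpenGLShaderProgram you would call bind()/release() rather than raw glUseProgram):

```cpp
// Pass 1: the existing textured-quad program.
textureProgram->bind();
drawVideoQuad();   // hypothetical helper: draws the textured video quad
textureProgram->release();

// Pass 2: the solid-red program from above.
lineProgram->bind();
drawRedLines();    // hypothetical helper: e.g. glDrawArrays(GL_LINES, ...)
lineProgram->release();
```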

OpenGL Rotation with vertices not working

I am trying to make a rotation with shaders on vertices, here is the code of my shader :
"#version 150 core\n"
"in vec2 position;"
"in vec3 color;"
"out vec3 Color;"
"uniform mat4 rotation;"
"void main() {"
" Color = color;"
" gl_Position = rotation*vec4(position, 0.0, 2.0);"
"}";
I am using it with a quat; here is the code producing the matrix and uploading it to the shader:
glm::quat rotation(x,0.0,0.0,0.5);
x+=0.001;
ctm = glm::mat4_cast(rotation);
GLint matrix_loc;
// get from shader pointer to global data
matrix_loc = glGetUniformLocation(shaderProgram, "rotation");
if (matrix_loc == -1)
std::cout << "pointer for rotation of shader not found" << matrix_loc << std::endl;
// put local data in shader :
glUniformMatrix4fv(matrix_loc, 1, GL_FALSE, glm::value_ptr(ctm));
But when it rotates, the object gets bigger and bigger. I know I don't need to call glGetUniformLocation on every iteration of my loop, but this is test code. glUniformMatrix is supposed to make the rotation happen, as far as I know. After these calls I simply draw my vertex array.
Given it's still drawing, the rotation matrix in the shader is probably valid. If it were an issue with the uniform, it'd probably be all zeroes and nothing would draw.
As #genpfault says, ctm needs to be initialized:
ctm = glm::mat4_cast(rotation);
See: Converting glm quaternion to rotation matrix and using it with opengl
Also, shouldn't the 2.0 in vec4(position, 0.0, 2.0) be a 1.0?