I'm currently learning the differences between OpenGL 2 and 3, and I noticed that many functions like glVertex, glVertexPointer, glColor, glColorPointer, etc. have disappeared.
I'm used to using Cg to handle shaders. For example I'd write this simple vertex shader:
// Cg vertex shader: pass the position straight through
void main(in float4 inPos : POSITION, out float4 outPos : POSITION) {
    outPos = inPos;
}
And then I'd use either glVertex or glVertexPointer to set the values of inPos.
But since these functions are no longer available in OpenGL 3, how are you supposed to do the bindings?
First, I recommend you take a look at the answer to this question: What's so different about OpenGL 3.x?
Secondly, Norbert Nopper has lots of examples of using OpenGL 3 and GLSL here.
Finally, here's a simple GLSL example which shows how to bind both a vertex and a fragment shader into a shader program.
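As a hedged sketch of what replaces glVertex/glVertexPointer: in OpenGL 3 you declare your own inputs in GLSL and feed them as generic vertex attributes. The names inPos and fragColor below are my own; on the C side you would compile and attach both shaders with glCompileShader/glAttachShader, link with glLinkProgram, then look the attribute up with glGetAttribLocation and supply its data with glVertexAttribPointer.

// vertex shader (GLSL 1.30): a user-declared input replaces glVertex/glVertexPointer
#version 130

in vec4 inPos;   // fed from a buffer via glVertexAttribPointer

void main() {
    gl_Position = inPos;
}

// fragment shader: a user-declared output replaces the old built-ins
#version 130

out vec4 fragColor;

void main() {
    fragColor = vec4(1.0);   // solid white
}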
I'm new to OpenGL, and I'm trying to understand vertex and fragment shaders. It seems you can use a vertex shader to make a gradient if you define the color you want each vertex to be, but you can also make gradients using a fragment shader, for example with the gl_FragCoord variable.
My question is: since you can make color gradients with both kinds of shaders, which one is better to use? I'm guessing vertex shaders are faster or something, since everyone seems to use them, but I just want to make sure.
... since everyone seems to use them
Using vertex and fragment shaders is mandatory in modern OpenGL for rendering absolutely everything.† So everyone uses both. It's the vertex shader's responsibility to compute the color at the vertices, OpenGL's to interpolate it between them, and the fragment shader's to write the interpolated value to the output color attachment.
† OK, you can also use a compute shader with imageStore, but I'm talking about the rasterization pipeline here.
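To make the comparison concrete, here's a minimal sketch of the usual vertex-colored gradient (the names vertexColor and interpColor are mine): the vertex shader only forwards a per-vertex color, and the interpolation step produces the gradient for free.

// vertex shader: one color per vertex; interpolation creates the gradient
#version 130

in vec4 position;
in vec3 vertexColor;
out vec3 interpColor;   // interpolated across the primitive by OpenGL

void main() {
    gl_Position = position;
    interpColor = vertexColor;
}

// fragment shader: just writes the already-interpolated value
#version 130

in vec3 interpColor;
out vec4 outColor;

void main() {
    outColor = vec4(interpColor, 1.0);
}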
Hey, I currently have a system in OpenGL that uses glBlendFunc for blending different shaders, but I would like to do something like this:
fragColor = currentColor * lightAmount;
I tried to use gl_Color, but it's deprecated and my engine will not let me use it.
According to this document, there is no built-in way to read the current framebuffer color in the fragment shader.
What you could do is render your previous passes into textures, bind those textures as sampler uniforms, and do the blending in your last pass.
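A sketch of that last pass, assuming the previous pass was rendered into a texture (e.g. via a framebuffer object) bound to a sampler uniform; the names sceneTex, lightAmount, and texCoord are mine:

// fragment shader for the final pass: read the previous pass from a texture
#version 130

uniform sampler2D sceneTex;   // previous pass, rendered to a texture
uniform float lightAmount;

in vec2 texCoord;             // forwarded from the vertex shader
out vec4 fragColor;

void main() {
    vec4 currentColor = texture(sceneTex, texCoord);
    fragColor = currentColor * lightAmount;
}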
I want to write shaders to use in WebGL, and specifically three.js. Is there a specific version of GLSL that WebGL and three.js use?
WebGL shaders follow the GLSL ES 1.0.17 spec:
https://www.khronos.org/registry/gles/specs/2.0/GLSL_ES_Specification_1.0.17.pdf
That's different from desktop OpenGL in several ways. For one, it's the 1.0 version of GLSL ES, whereas desktop GL is at version 4.2 of GLSL (not ES).
One big difference between WebGL GLSL and the shaders in many articles found on the internet is that there's no fixed-function pipeline in OpenGL ES 2.0, and therefore no fixed-function pipeline in WebGL.
The fixed-function pipeline is left over from OpenGL 1.0, where you'd use commands like glLight, glVertex, and glNormal, and your shaders needed a way to reference that data. In OpenGL ES and WebGL all that is gone, because everything a shader does is 100% up to the user. All WebGL does is let you define your own inputs (attributes, uniforms) and name them whatever you want.
WebGL2 shaders follow the GLSL ES 3.00 spec:
https://www.khronos.org/registry/OpenGL/specs/es/3.0/GLSL_ES_Specification_3.00.pdf
As for three.js: it is a 3D engine and provides its own set of standard inputs, names, and other features when it generates a shader. See the docs for some of the details. The uniforms and attributes provided by default are documented here. You can also look at the source or check out an example.
Three.js also provides something called a RawShaderMaterial, which apparently does not add any predefined things, in which case you just write standard WebGL GLSL.
You can find three.js's standard attributes and uniforms here.
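For instance, with RawShaderMaterial you have to declare even the standard inputs yourself; here's a rough sketch of a vertex shader for it (three.js still supplies values for these standard names, as its docs describe):

// vertex shader for a three.js RawShaderMaterial: nothing is predeclared
precision mediump float;

uniform mat4 projectionMatrix;   // supplied by three.js
uniform mat4 modelViewMatrix;    // supplied by three.js
attribute vec3 position;         // supplied from the geometry

void main() {
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}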
As for a place to learn GLSL I don't really have a suggestion. It really depends on your level of experience with programming in general and how you like to learn. I learn by looking at examples better than reading manuals. Maybe someone else can add some links.
Shaders as a concept are pretty simple. You create a pair of shaders, set them up with inputs, then call gl.drawArrays or gl.drawElements and pass in a count. Your vertex shader will be called count times and needs to set gl_Position. Every 1 to 3 times it's called, WebGL will then draw a point, line, or triangle. To do this it will call your fragment shader, asking for each pixel it's about to draw what color to make that pixel. The fragment shader needs to set gl_FragColor.

The shaders get data from attributes, uniforms, textures, and varyings. Attributes are per-vertex data; they pull their data from buffers, one piece of data per attribute per iteration of your vertex shader. Uniforms are like setting global variables before the shader runs. You can pass data from a vertex shader to a fragment shader with a varying. That data will be interpolated, or varied ;), between the values set for each vertex of a primitive (triangle) as the fragment shader is called to provide a color for each pixel.
It's up to you to creatively supply data to the shader and use that data creatively to set gl_Position and gl_FragColor. I get most ideas from looking at examples.
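For instance, here's a minimal WebGL (GLSL ES 1.00) pair wired together the way just described; the names position, color, and v_color are mine:

// vertex shader: runs once per vertex
attribute vec4 position;   // pulled from a buffer
attribute vec4 color;
varying vec4 v_color;      // handed off to the fragment shader

void main() {
  gl_Position = position;  // required output
  v_color = color;
}

// fragment shader: runs once per covered pixel
precision mediump float;

varying vec4 v_color;      // interpolated between the primitive's vertices

void main() {
  gl_FragColor = v_color;  // required output
}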
GLSL itself is pretty straightforward. There are a few types: int, float, vec2, vec3, vec4, mat2, mat3, mat4. They respond to operators +, -, *, /, etc. There are some built-in functions.
You can find a terse version of all GLSL info on the last 2 pages of the WebGL Reference Card.
That was enough for me. That and looking at working programs.
The one thing that was interesting to me vs. most languages was the synonyms for vec fields, and the swizzling. Take a vec4, for example:
vec4 v = vec4(1.0, 2.0, 3.0, 4.0);
You can reference the various components of v using x,y,z,w or s,t,p,q or r,g,b,a or array style. So, for example:
float red = v.r; // OR
float red = v.x; // same thing OR
float red = v.s; // same thing OR
float red = v[0]; // same thing
The other thing you can do is swizzle
vec4 color = v.bgra; // swap red and blue
vec4 bw = v.ggga; // make a monotone color just green & keep alpha
And you can also get subcomponents
vec2 just_xy = v.xy;
I want to control depth for objects in OpenGL ES 2.0 so that I can specify one object to be in front of other object(s) just by sending a uniform variable. If I want to draw two lines separately, I just specify which one is closer.
I work with Qt and its OpenGL support. When I try to use gl_FragDepth, I get a linking error saying that gl_FragDepth is an undeclared identifier.
Also, I tried, in the vertex shader, something like gl_Position.z = depthAttr;, where depthAttr is a uniform variable for the object being rendered.
Can someone tell me what else I can do, or what I did wrong? Is there a preferred way?
The pre-defined fragment shader output variable gl_FragDepth is not supported in ES 2.0. It is only available in full OpenGL, and in ES 3.0 or later.
If you really want to specify the depth with a uniform variable, you need to have the uniform variable in the vertex shader, and use it to calculate gl_Position. This approach from your question looks fine:
uniform float depthAttr;
...
gl_Position = ...;
// note: the depth buffer value is derived from gl_Position.z / gl_Position.w,
// so this gives exactly depthAttr only when w is 1.0
gl_Position.z = depthAttr;
A much more standard approach is to make the desired depth part of your position coordinates. If you currently use 2D coordinates for drawing your lines, simply add the desired depth as a third coordinate, and change the position attribute in the vertex shader from vec2 to vec3. The depth will then arrive in the shader as part of the attribute, and there's no need for additional uniform variables.
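A sketch of that in an ES 2.0 vertex shader, with the attribute name position assumed:

// vertex shader: the depth travels with the vertex as a third coordinate
attribute vec3 position;   // was vec2; z now carries the desired depth

void main() {
    gl_Position = vec4(position, 1.0);
}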
OpenGL ES 2.0 supports an optional extension, EXT_frag_depth, which allows the fragment shader to assign gl_FragDepthEXT.
See the specification, but of course, be prepared to fall back when it's not available.
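When the extension is present, the fragment shader looks roughly like this (the uniform name depthValue is mine):

// fragment shader, usable only where EXT_frag_depth is supported
#extension GL_EXT_frag_depth : require

precision mediump float;
uniform float depthValue;

void main() {
    gl_FragColor = vec4(1.0);
    gl_FragDepthEXT = depthValue;   // extension-provided depth output
}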
So I have an OpenGL program that draws a group of objects. When I draw these objects I want to use a shader program that consists of a vertex shader, and a vertex shader exclusively. Basically, I am aiming to adjust the height of the model inside the vertex shader depending on a texture calculation, and that is it. Otherwise I want the object to be drawn as if using naked OpenGL (no shaders). I do not want to implement a fragment shader.
However, I haven't been able to find out how to make a shader program with only a vertex shader and nothing else. Forgetting the part about adjusting my model's height, so far I have:
gl_FrontColor = gl_Color;
gl_Position = modelViewProjectionMain * Position;
It transforms the object to the correct position alright; however, when I do this I lose the texture coordinates and also the lighting information (normals are lost). What am I missing? How do I write a "do-nothing" vertex shader? That is, a vertex shader you could turn on and off when drawing a textured .obj with normals, and there would be no difference?
You can't write a shader with a partial implementation. Either you do everything in a shader or you rely completely on the (deprecated) fixed functionality for a given object.
What you can do is this:
glUseProgram(handle);   // draw objects with the shader
// ...
glUseProgram(0);        // draw objects with fixed functionality
// ...
To expand a little on the entirely correct answer by Abhishek Bansal, what you want to do would be nice but is not actually possible. You're going to have to write your own vertex and fragment shaders.
From your post, by "naked OpenGL" you mean the fixed-function pipeline in OpenGL 1 and 2, which included built-in lighting and texturing. Shaders in OpenGL entirely replace the fixed-function pipeline rather than extending it. And in the OpenGL 3+ core profile the old functionality has been removed, so there shaders are compulsory.
The good news is that vertex/fragment shaders that perform the same function as the original OpenGL lighting and texturing are easy to find and easy to modify for your purpose. The OpenGL Shading Language book by Rost, Licea-Kane, et al. has a whole chapter, "Emulating OpenGL Fixed Functionality". Or you could get a copy of the 5th edition OpenGL SuperBible book and code (not the 6th edition), which came with a bunch of useful predefined shaders. Or, if you prefer online resources to books, there are the NeHe tutorials.
Writing shaders seems a bit daunting at first, but it's easier than you might think, and the extra flexibility is well worth it.
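As a taste, here's a sketch of a compatibility-profile (GLSL 1.20) pair that does almost nothing but keeps texturing intact by forwarding the built-in inputs the fixed pipeline would otherwise consume; emulating the lighting as well takes more code, for which see the resources above. The sampler name tex0 is my own.

// vertex shader: forward color and texture coordinates along with the position
#version 120

void main() {
    gl_FrontColor = gl_Color;
    gl_TexCoord[0] = gl_MultiTexCoord0;   // without this, texcoords are "lost"
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// fragment shader: modulate the texture by the vertex color, as fixed function does
#version 120

uniform sampler2D tex0;   // bound to texture unit 0

void main() {
    gl_FragColor = texture2D(tex0, gl_TexCoord[0].st) * gl_Color;
}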