Shaders in WebGL vs OpenGL? [closed] - opengl

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 7 years ago.
I want to write shaders to use in WebGL, and specifically three.js. Is there a specific version of GLSL that WebGL and three.js use?

WebGL shaders follow the GLSL ES 1.0.17 spec
https://www.khronos.org/registry/gles/specs/2.0/GLSL_ES_Specification_1.0.17.pdf
That's different from desktop OpenGL in several ways. For one, it's version 1.0 of GLSL ES, whereas desktop GL is at version 4.2 of GLSL (not ES).
One big difference between WebGL GLSL and many articles found about shaders on the internet is there's no fixed function pipeline in OpenGL ES 2.0 and therefore no fixed function pipeline in WebGL.
The fixed function pipeline is left over from OpenGL 1.0 where you'd use commands like glLight and glVertex and glNormal. And then your shaders needed a way to reference that data. In OpenGL ES and WebGL all that is gone because everything a shader does is 100% up to the user. All WebGL does is let you define your own inputs (attributes, uniforms) and name them whatever you want.
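For example, a WebGL vertex shader declares whatever inputs it needs, under names of the author's choosing (the names below are made up for illustration):

```glsl
// GLSL ES 1.00 vertex shader: every input is user-defined.
// There is no gl_Vertex, gl_Normal, or glLight equivalent.
attribute vec3 a_position;          // fed from a buffer via gl.vertexAttribPointer
attribute vec3 a_normal;            // likewise, under any name you choose
uniform mat4 u_modelViewProjection; // set with gl.uniformMatrix4fv

varying vec3 v_normal;              // handed on to the fragment shader

void main() {
  v_normal = a_normal;
  gl_Position = u_modelViewProjection * vec4(a_position, 1.0);
}
```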
WebGL2 shaders follow the GLSL ES 3.00 spec
https://www.khronos.org/registry/OpenGL/specs/es/3.0/GLSL_ES_Specification_3.00.pdf
As for three.js: three.js is a 3D engine and provides its own set of standard inputs, names, and other features when it generates a shader. See the docs for some of the details. The uniforms and attributes provided by default are documented here. You can also look at the source or check out an example.
Three.js also provides something called a RawShaderMaterial, which does not add any of those predefined things, in which case you just write standard WebGL GLSL.
As for a place to learn GLSL, I don't really have a suggestion. It really depends on your level of experience with programming in general and how you like to learn. I learn better by looking at examples than by reading manuals. Maybe someone else can add some links.
Shaders as a concept are pretty simple. You create a pair of shaders, set them up with inputs, then call gl.drawArrays or gl.drawElements and pass in a count. Your vertex shader will be called count times and needs to set gl_Position. Every 1 to 3 times it's called, WebGL will then draw a point, line, or triangle. To do this it calls your fragment shader for each pixel it's about to draw, asking what color to make that pixel. The fragment shader needs to set gl_FragColor.

The shaders get data from attributes, uniforms, textures, and varyings. Attributes are per-vertex data: they pull their data from buffers, one piece of data per attribute per iteration of your vertex shader. Uniforms are like setting global variables before the shader runs. You can pass data from a vertex shader to a fragment shader with a varying. That data will be interpolated, or varied ;), between the values set for each vertex of a primitive (triangle) as the fragment shader is called to provide a color for each pixel.
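A minimal pair showing those pieces might look like this (names such as a_position and u_color are arbitrary):

```glsl
// Vertex shader: runs once per vertex, must set gl_Position.
attribute vec2 a_position;   // per-vertex data pulled from a buffer

void main() {
  gl_Position = vec4(a_position, 0.0, 1.0);
}
```

```glsl
// Fragment shader: runs once per pixel, must set gl_FragColor.
precision mediump float;
uniform vec4 u_color;        // a "global" set before the draw call

void main() {
  gl_FragColor = u_color;
}
```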
It's up to you to creatively supply data to the shader and use that data creatively to set gl_Position and gl_FragColor. I get most ideas from looking at examples.
GLSL itself is pretty straightforward. There are a few types: int, float, vec2, vec3, vec4, mat2, mat3, mat4. They respond to the operators +, -, *, / etc., and there are some built-in functions.
You can find a terse version of all GLSL info on the last 2 pages of the WebGL Reference Card.
That was enough for me. That and looking at working programs.
The one interesting thing for me, versus most languages, was the synonyms for vec fields and the swizzling. Take a vec4, for example:
vec4 v = vec4(1.0, 2.0, 3.0, 4.0);
You can reference the various components of v using x,y,z,w or s,t,p,q or r,g,b,a, or array style. So for example
float red = v.r; // OR
float red = v.x; // same thing OR
float red = v.s; // same thing OR
float red = v[0]; // same thing
The other thing you can do is swizzle
vec4 color = v.bgra; // swap red and blue
vec4 bw = v.ggga; // make a monotone color just green & keep alpha
And you can also get subcomponents
vec2 just_xy = v.xy;

Related

OpenGL: Fragment vs Vertex shader for gradients?

I'm new to OpenGL, and I'm trying to understand vertex and fragment shaders. It seems you can use a vertex shader to make a gradient if you define the color you want each of the vertices to be, but it seems you can also make gradients using a fragment shader if you use the gl_FragCoord variable, for example.
My question is, since you seem to be able to make color gradients using both kinds of shaders, which one is better to use? I'm guessing vertex shaders are faster or something since everyone seems to use them, but I just want to make sure.
... since everyone seems to use them
Using vertex and fragment shaders is mandatory in modern OpenGL for rendering absolutely everything.† So everyone uses both. It's the vertex shader's responsibility to compute the color at the vertices, OpenGL's to interpolate it between them, and the fragment shader's to write the interpolated value to the output color attachment.
† OK, you can also use a compute shader with imageStore, but I'm talking about the rasterization pipeline here.
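A sketch of that division of labor (attribute and varying names here are illustrative):

```glsl
// Vertex shader: assign a per-vertex color...
attribute vec4 a_position;
attribute vec4 a_color;
varying vec4 v_color;

void main() {
  v_color = a_color;           // color at this vertex
  gl_Position = a_position;
}
```

```glsl
// Fragment shader: ...receive it interpolated across the primitive.
precision mediump float;
varying vec4 v_color;          // interpolated by the rasterizer

void main() {
  gl_FragColor = v_color;      // the gradient comes "for free"
}
```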

Rendering object depth in OpenGL ES 2.0

I want to organize depth for objects in OpenGL ES 2.0 so that I can specify one object to be in front of other objects (just by sending a uniform variable). If I want to draw two lines separately, I just specify which one is closer.
I work with Qt and its OpenGL support. When I try to use gl_FragDepth, I get a linking error saying that gl_FragDepth is an undeclared identifier.
Also, I tried, in the vertex shader, something like gl_Position.z = depthAttr; where depthAttr is a uniform variable for the object being rendered.
Can someone tell me what else I can do, or what I did wrong? Is there a preferred way?
The pre-defined fragment shader output variable gl_FragDepth is not supported in ES 2.0. It is only available in full OpenGL, and in ES 3.0 or later.
If you really want to specify the depth with a uniform variable, you need to have the uniform variable in the vertex shader, and use it to calculate gl_Position. This approach from your question looks fine:
uniform float depthAttr;
...
gl_Position = ...;
gl_Position.z = depthAttr;
A much more standard approach is to make the desired depth part of your position coordinates. If you currently use 2D coordinates for drawing your lines, simply add the desired depth as a third coordinate, and change the position attribute in the vertex shader from vec2 to vec3. The depth will then arrive in the shader as part of the attribute, and there's no need for additional uniform variables.
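Under that approach, the vertex shader might look like this (attribute and uniform names assumed for illustration):

```glsl
// The depth travels with the vertex position as a third coordinate.
attribute vec3 a_position;   // was vec2; z now carries the desired depth
uniform mat4 u_matrix;

void main() {
  gl_Position = u_matrix * vec4(a_position, 1.0);
}
```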
OpenGL ES 2.0 supports an optional extension - EXT_frag_depth, which allows the fragment shader to assign gl_FragDepthEXT.
See the specification, but of course, be prepared to fall back when it's not available.

GLSL - A do-nothing vertex shader?

So I have an OpenGL program that draws a group of objects. When I draw these objects I want my shader program to consist of a vertex shader, and a vertex shader exclusively. Basically, I am aiming to adjust the height of the model inside the vertex shader depending on a texture calculation. And that is it. Otherwise I want the object to be drawn as if using naked OpenGL (no shaders). I do not want to implement a fragment shader.
However, I haven't been able to find out how to make a shader program with only a vertex shader and nothing else. Forgetting the part about adjusting my model's height, so far I have:
gl_FrontColor = gl_Color;
gl_Position = modelViewProjectionMain * Position;
It transforms the object to the correct position all right, but when I do this I lose texture coordinates and also lighting information (normals are lost). What am I missing? How do I write a "do-nothing" vertex shader? That is, a vertex shader you could turn off and on when drawing a textured .obj with normals, and there would be no difference?
You can't write a shader with a partial implementation. Either you do everything in a shader, or you rely completely on the (deprecated) fixed functionality for a given object.
What you can do is this:
glUseProgram(handle)
// draw objects with shader
glUseProgram(0)
// draw objects with fixed functionality
To expand a little on the entirely correct answer by Abhishek Bansal: what you want to do would be nice, but is not actually possible. You're going to have to write your own vertex and fragment shaders.
From your post, by "naked OpenGL" you mean the fixed-function pipeline in OpenGL 1 and 2, which included built-in lighting and texturing. Shaders in OpenGL entirely replace the fixed-function pipeline rather than extending it. And in OpenGL 3+ the old functionality has been removed, so shaders are now compulsory.
The good news is that vertex/fragment shaders to perform the same function as the original OpenGL lighting and texturing are easy to find and easy to modify for your purpose. The OpenGL Shading Language book by Rost, Licea-Kane, etc has a whole chapter "Emulating OpenGL Fixed Functionality" Or you could get a copy of the 5th edition OpenGL SuperBible book and code (not the 6th edition) which came with a bunch of useful predefined shaders. Or if you prefer online resources to books, there are the NeHe tutorials.
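As a rough sketch of what "emulating fixed functionality" means for plain, unlit texturing in the compatibility profile (single texture unit; the sampler uniform name is made up):

```glsl
// Compatibility-profile vertex shader reproducing fixed-function texturing.
void main() {
  gl_FrontColor = gl_Color;                  // pass the glColor through
  gl_TexCoord[0] = gl_MultiTexCoord0;        // pass texture coordinates through
  gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
```

```glsl
// Matching fragment shader: modulate the texture by the vertex color,
// as the fixed-function GL_MODULATE environment mode did.
uniform sampler2D u_texture;                 // bind to texture unit 0

void main() {
  gl_FragColor = gl_Color * texture2D(u_texture, gl_TexCoord[0].st);
}
```

Adding lighting means reimplementing the gl_Light calculations the same way, which is what the book chapters linked above walk through.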
Writing shaders seems a bit daunting at first, but it's easier than you might think, and the extra flexibility is well worth it.

Why gl_Color is not a built-in variable for the fragment shader?

The vertex shader is expected to output vertex positions in clip space:
"Vertex shaders, as the name implies, operate on vertices. Specifically, each invocation of a vertex shader operates on a single vertex. These shaders must output, among any other user-defined outputs, a clip-space position for that vertex." (source: Learning Modern 3D Graphics Programming, by Jason L. McKesson)
It has a built-in variable named gl_Position for that.
Similarly, the fragment shader is expected to output colors:
"A fragment shader is used to compute the output color(s) of a fragment. [...] After the fragment shader executes, the fragment output color is written to the output image." (source: Learning Modern 3D Graphics Programming, by Jason L. McKesson)
but there is no gl_Color built-in variable defined for that as stated here: opengl44-quick-reference-card.pdf
Why that (apparent) inconsistency in the OpenGL API?
That is because the OpenGL pipeline uses gl_Position for several tasks. The manual says: "The value written to gl_Position will be used by primitive assembly, clipping, culling and other fixed functionality operations, if present, that operate on primitives after vertex processing has occurred."
In contrast, the pipeline logic does not depend on the final pixel color.
The accepted answer does not adequately explain the real situation:
gl_Color was already used once-upon-a-time, but it was always defined as an input value.
In compatibility GLSL, gl_Color is the color vertex pointer in vertex shaders, and in a fragment shader it takes on the value of gl_FrontColor or gl_BackColor depending on which side of the polygon you are shading.
However, none of this behavior exists in newer versions of GLSL. You must supply your own vertex attributes, your own varyings and you pick between colors using the value of gl_FrontFacing. I actually explained this in more detail in a different question related to OpenGL ES 2.0, but the basic principle is the same.
In fact, since gl_Color was already used as an input variable this is why the output of a fragment shader is called gl_FragColor instead. You cannot have a variable serve both as an input and an output in the same stage. The existence of an inout storage qualifier may lead you to believe otherwise, but that is for function parameters only.
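In compatibility GLSL, that dual input role looks like this:

```glsl
// Compatibility vertex shader: gl_Color is the per-vertex color INPUT
// (fed by glColor / glColorPointer).
void main() {
  gl_FrontColor = gl_Color;   // forward it as the front-face color
  gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
```

```glsl
// Compatibility fragment shader: gl_Color is again an INPUT, resolved
// from gl_FrontColor/gl_BackColor by facing. The OUTPUT is gl_FragColor,
// precisely because the name gl_Color was already taken.
void main() {
  gl_FragColor = gl_Color;
}
```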

GLSL Editor program [closed]

I'm looking for a GLSL editor program. I found some by googling, but I want to know if there are any preferred ones based on user experience.
Possible features:
Syntax Highlighting
Intellisense
Automatic compile and link
P.S.
I'm not even sure if it's meaningful/possible for GLSL to be compiled automatically (any comments?).
EDIT:
Here's what I found:
Shader Maker
Try out KickJS's Shader Editor. It currently supports syntax highlighting and compiles the code as you write.
http://www.kickjs.org/example/shader_editor/shader_editor.html
If you are running OS X you should try the OpenGL Shader Builder, even though this tool feels a little outdated:
/Developer/Applications/Graphics Tools/OpenGL Shader Builder.app
There is also GLman, which is maybe more a GLSL sandbox environment than an editor. A good introduction to the program is found in the excellent book 'Graphics Shaders: Theory and Practice, Second Edition'.
http://web.engr.oregonstate.edu/~mjb/glman/
I found Shadertoy helpful. It contains some predefined shaders you can tweak and see instant results. It's all online and covers WebGL, OpenGL ES 1.1 / (some) 2.0, and probably various desktop OpenGL versions too.
https://www.shadertoy.com/
It passes in some predefined uniforms, as well as up to 4 textures you can link to.
Here are the following inputs:
uniform vec4 mouse: xy contain the current pixel coords (if LMB is down). zw contain the click pixel.
uniform vec2 resolution: the rendering viewport resolution.
uniform float time: current time in seconds.
uniform sampler2D tex0: sampler for input texture 0.
uniform sampler2D tex1: sampler for input texture 1.
uniform sampler2D tex2: sampler for input texture 2.
uniform sampler2D tex3: sampler for input texture 3.
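Using the uniforms from that list, a fragment shader in this environment might look like the following sketch (the site declared these uniforms for you; they are repeated here for clarity):

```glsl
// Animated horizontal gradient driven by the predefined uniforms above.
uniform vec2 resolution;
uniform float time;

void main() {
  vec2 uv = gl_FragCoord.xy / resolution;  // normalize coords to 0..1
  float pulse = 0.5 + 0.5 * sin(time);     // oscillate over time
  gl_FragColor = vec4(uv.x * pulse, uv.y, 1.0 - uv.x, 1.0);
}
```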
I've found http://glsl.heroku.com interesting; you can edit only the fragment shader, but it's quite useful for testing some effects.
And it's open source! You can get the source on github: https://github.com/mrdoob/glsl-sandbox
Example of a shader using this editor: http://glsl.heroku.com/e#7310.0 (it's not mine, btw)