WebGL `uniform uint` Causes Syntax Error - glsl

I am using Google Chrome Version 59.0.3071.115 (Official Build) (64-bit) on Windows 10.
I have a vertex shader that looks like so:
attribute vec3 aPosition;
attribute vec2 aTextureCoordinate;
uniform uint uLayer;
uniform vec2 uLocation;
varying highp vec2 vTextureCoordinate;
void main(void)
{
    gl_Position = vec4(aPosition + vec3(uLocation, 0.0), 1.0);
    vTextureCoordinate = aTextureCoordinate;
}
A prior version did not have line four (uniform uint uLayer;), and it compiled fine. Adding that line causes an ERROR: 0:5: 'uLayer' : syntax error. As far as I can tell, there is nothing wrong with this line syntactically, and I cannot find anything stating that uniform uint is not valid in a vertex shader. Is there something I am missing here?

WebGL 1 uses GLSL ES 1.00, which has no unsigned integer types, so uniform uint is a syntax error there. WebGL 2 uses GLSL ES 3.00, which adds uint (and requires the shader to start with #version 300 es).
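For illustration, here is a sketch of what the same shader could look like on WebGL 2, assuming the context is created with canvas.getContext('webgl2'); the #version 300 es directive must be the very first line, and attribute/varying become in/out:
#version 300 es
in vec3 aPosition;
in vec2 aTextureCoordinate;
uniform uint uLayer;
uniform vec2 uLocation;
out highp vec2 vTextureCoordinate;
void main(void)
{
    gl_Position = vec4(aPosition + vec3(uLocation, 0.0), 1.0);
    vTextureCoordinate = aTextureCoordinate;
}
On the JavaScript side, a uint uniform is then set with gl.uniform1ui(location, value), which only exists on a WebGL 2 context.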

Related

Vertex/Fragment Shader using Xcode output a syntax error

I'm following a shader tutorial done using Visual Studio; I'm using Xcode on El Capitan.
My readShaderCode() function passes the contents of the shader files (vertex.glsl and fragment.glsl) to the console, just to make sure, along with the OpenGL version.
So this code works on Windows but not on my machine! I can't see what the problem is, and if somebody could figure it out for me I would really appreciate it. Thanks, and thanks for all the posts that have been helping me along for some time now :)
Here is the output:
Working with OpenGl: 2.1 INTEL-10.14.66
Here is the shader Code:
#version 120
#extension GL_ARB_separate_shader_objects : enable
in layout(location=0) vec2 position;
in layout(location=1) vec3 vertexColor;
out vec3 theColor;
void main() {
    gl_Position = vec4(position, 0.0, 1.0);
    theColor = vertexColor;
}
*****************************************
Working with OpenGl: 2.1 INTEL-10.14.66
Here is the shader Code:
#version 120
#extension GL_ARB_separate_shader_objects : enable
out vec4 daColor;
in vec3 theColor;
void main() {
    daColor = vec4(theColor, 1.0);
}
*****************************************
WARNING: 0:2: extension 'GL_ARB_separate_shader_objects' is not supported
ERROR: 0:4: '(' : syntax error: syntax error
You get perfectly reasonable errors and warnings.
WARNING: 0:2: extension 'GL_ARB_separate_shader_objects' is not supported
That means exactly what it says. Your shader depends on an extension that isn't supported.
ERROR: 0:4: '(' : syntax error: syntax error
There is only one ( character in line 4 of your vertex shader, so that makes it clear what the problem is.
This:
in layout(location=0) vec2 position;
Is not valid GLSL 1.20 code. GLSL 1.20 does not permit in-qualified global variables, and it has no idea what layout means.
Your code is not valid GLSL for the #version it is declared with. If another implementation accepted it, then you were relying on implementation-defined behavior.
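For reference, a rough GLSL 1.20 equivalent (my sketch, not the tutorial's code) would use attribute/varying instead of in/out, drop the layout qualifier, and bind the attribute locations from C++ with glBindAttribLocation before linking; the fragment shader would write gl_FragColor instead of an out variable:
#version 120
attribute vec2 position;
attribute vec3 vertexColor;
varying vec3 theColor;
void main() {
    gl_Position = vec4(position, 0.0, 1.0);
    theColor = vertexColor;
}
and:
#version 120
varying vec3 theColor;
void main() {
    gl_FragColor = vec4(theColor, 1.0);
}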

c++/OpenGL/GLSL, textures with "random" artifacts

I'd like to know if someone has experienced this and knows the reason. I'm getting these strange artifacts after using "texture arrays":
http://i.imgur.com/ZfLYmQB.png
(My gpu is AMD R9 270)
Ninja edit: I deleted the rest of the post for readability, since it was just showing code where the problem could have been; the project is open source now, so I only show the source of the problem (the fragment shader).
Frag:
#version 330 core
layout (location = 0) out vec4 color;
uniform vec4 colour;
uniform vec2 light;
in DATA{
    vec4 position;
    vec2 uv;
    float tid;
    vec4 color;
}fs_in;
uniform sampler2D textures[32];
void main()
{
    float intensity = 1.0 / length(fs_in.position.xy - light);
    vec4 texColor = fs_in.color;
    if(fs_in.tid > 0.0){
        int tid = int(fs_in.tid - 0.5);
        texColor = texture(textures[tid], fs_in.uv);
    }
    color = texColor; // * intensity;
}
Edit: github repos (sorry if It is missing some libs, having trouble linking them to github) https://github.com/PedDavid/NubDevEngineCpp
Edit: Credit to derhass for pointing out that I was doing something with undefined results (indexing the sampler array with a non-constant index, [tid]). I now have it working with:
switch(tid){
    case 0: textureColor = texture(textures[0], fs_in.uv); break;
    ...
    case 31: textureColor = texture(textures[31], fs_in.uv); break;
}
Not the prettiest, but fine for now!
I'm getting these strange artifacts after using "texture arrays"
You are not using "texture arrays". You are using arrays of texture samplers. From your fragment shader:
#version 330 core
// ...
in DATA{
    // ...
    float tid;
}fs_in;
//...
if(fs_in.tid > 0.0){
    int tid = int(fs_in.tid - 0.5);
    texColor = texture(textures[tid], fs_in.uv);
}
What you are trying to do here is not allowed, as per the GLSL 3.30 specification, which states:
Samplers aggregated into arrays within a shader (using square brackets
[ ]) can only be indexed with integral constant expressions (see
section 4.3.3 “Constant Expressions”).
Your tid is not a constant, so this will not work.
In GL 4, this constraint has been somewhat relaxed to (quote is from GLSL 4.50 spec):
When aggregated into arrays within a shader, samplers can only be
indexed with a dynamically uniform integral expression, otherwise
results are undefined.
But your input isn't dynamically uniform either, so you would still get undefined results.
I don't know what you are trying to achieve, but maybe you can get it done by using array textures, which represent a complete set of images as a single GL texture object and do not impose such constraints when accessing them.
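As a rough sketch of that array-texture approach (assuming all layers have the same size and format, and keeping the convention from the original shader that tid == 1 means the first image), the sampler array is replaced by a single sampler2DArray and the layer is selected through the third texture coordinate, so no constant-index restriction applies:
#version 330 core
layout (location = 0) out vec4 color;
uniform sampler2DArray textureAtlas; // hypothetical name, replaces "sampler2D textures[32]"
in DATA{
    vec4 position;
    vec2 uv;
    float tid;
    vec4 color;
}fs_in;
void main()
{
    vec4 texColor = fs_in.color;
    if(fs_in.tid > 0.0){
        // the layer is an ordinary value here, not a sampler index
        texColor = texture(textureAtlas, vec3(fs_in.uv, fs_in.tid - 1.0));
    }
    color = texColor;
}
On the C++ side the layers would be allocated once with glTexImage3D(GL_TEXTURE_2D_ARRAY, ...) and filled per layer with glTexSubImage3D.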
That looks like the shader is rendering whatever random data it finds in memory.
Have you tried checking that glBindTexture(...) is called at the right time (before render) and that the value used (as returned by glGenTextures(...)) is valid?
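In case it helps, a minimal sketch of that binding order (the names texIDs and program, and the loop bound of 32, are my assumptions, not taken from the repository):
// once, at load time: create the texture objects
GLuint texIDs[32];
glGenTextures(32, texIDs);
// ... for each i: glBindTexture(GL_TEXTURE_2D, texIDs[i]); glTexImage2D(...); ...

// every frame, before the draw call: bind each texture to its own unit
for (int i = 0; i < 32; ++i) {
    glActiveTexture(GL_TEXTURE0 + i);
    glBindTexture(GL_TEXTURE_2D, texIDs[i]);
}
// tell the sampler array which units to sample from (the program must be bound)
GLint units[32];
for (int i = 0; i < 32; ++i) units[i] = i;
glUseProgram(program);
glUniform1iv(glGetUniformLocation(program, "textures"), 32, units);
Note that GL 3.3 only guarantees 16 texture image units in the fragment stage, so binding 32 textures may already exceed the limit on some hardware.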

Opengl error 1282 (invalid operation) when using texture()

I have the following fragment shader:
#version 330 core
layout (location = 0) out vec4 color;
uniform vec4 colour;
uniform vec2 light_pos;
in DATA
{
    vec4 position;
    vec2 texCoord;
    float tid;
    vec4 color;
} fs_in;
uniform sampler2D textures[32];
void main()
{
    float intensity = 1.0 / length(fs_in.position.xy - light_pos);
    vec4 texColor = fs_in.color;
    if (fs_in.tid > 0.0)
    {
        int tid = int(fs_in.tid + 0.5);
        texColor = texture(textures[tid], fs_in.texCoord);
    }
    color = texColor * intensity;
}
If I run my program, I get OpenGL error 1282, which is invalid operation. If I don't use texture(), and instead write texColor = vec4(...), it works perfectly. I'm always passing in tid (texture ID) as 0 (no texture), so that part shouldn't even run. I've set the textures uniform to some placeholder, but as far as I know this shouldn't even matter. What could cause the invalid operation then?
Your shader compilation has most likely failed. Make sure that you always check the compile status after trying to compile the shader, using:
GLint val = GL_FALSE;
glGetShaderiv(shaderId, GL_COMPILE_STATUS, &val);
if (val != GL_TRUE)
{
    // compilation failed
}
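To see why it failed, you can also pull the shader info log right after that check; a minimal sketch:
GLchar log[1024];
GLsizei len = 0;
glGetShaderInfoLog(shaderId, sizeof(log), &len, log);
fprintf(stderr, "shader compile log: %.*s\n", (int)len, log);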
In your case, the shader is illegal because you're trying to access an array of samplers with a variable index:
texColor = texture(textures[tid], fs_in.texCoord);
This is not supported in GLSL 3.30. From the spec (emphasis added):
Samplers aggregated into arrays within a shader (using square brackets [ ]) can only be indexed with integral constant expressions (see section 4.3.3 “Constant Expressions”).
This restriction is relaxed in later OpenGL versions. For example, from the GLSL 4.50 spec:
When aggregated into arrays within a shader, samplers can only be indexed with a dynamically uniform integral expression, otherwise results are undefined.
This change was introduced in GLSL 4.00. But it would still not be sufficient for your case, since you're trying to use an in variable as the index, which is not dynamically uniform.
If your textures are all the same size, you may want to consider using an array texture instead. That will allow you to sample one of the layers in the array texture based on a dynamically calculated index.
I know this answer is late, but in case it helps anybody:
As per Cherno's video, this should work. However, he passes the texture-index attribute ('fs_in.tid') as GL_BYTE in the glVertexAttribPointer call, and for some reason, due to the casting, 1.0f always got converted to 0.0f and hence it did not work.
Changing GL_BYTE to GL_FLOAT resolved this issue for me.
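For reference, a hedged sketch of what that attribute setup might look like (the attribute index 2, the interleaved Vertex struct, and the offsets are placeholders of mine, not code from the series):
// tid is stored as one float per vertex, so describe it as GL_FLOAT, not GL_BYTE
glVertexAttribPointer(2, 1, GL_FLOAT, GL_FALSE,
                      sizeof(Vertex),
                      (const void*)offsetof(Vertex, tid));
glEnableVertexAttribArray(2);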
Regarding the 'OpenGL error 1282': it's a very common mistake I ran into. I used to forget to call glUseProgram(ShaderID) before setting any of the uniforms. Because of this, the uniforms, even though not being used at the time, can cause that error, namely 1282. This could be one of the solutions; it solved it for me.
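In code, the ordering that avoids that particular 1282 looks roughly like this (light_pos is the uniform from the shader above; shaderID is assumed to be the linked program):
// wrong: no program bound yet, so glUniform* raises GL_INVALID_OPERATION (1282)
// glUniform2f(glGetUniformLocation(shaderID, "light_pos"), 4.0f, 1.5f);
// right: bind the program first, then set its uniforms
glUseProgram(shaderID);
glUniform2f(glGetUniformLocation(shaderID, "light_pos"), 4.0f, 1.5f);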

When switching to GLSL 300, met the following error

When I switch to OpenGL ES 3 with GLSL ES 3.00, I get the following error in my fragment shader:
undeclared identifier gl_FragColor
When using GLSL 100, everything is fine.
Modern versions of GLSL handle fragment shader outputs simply by declaring them as out values, and gl_FragColor is no longer supported, hence your error. Try this:
#version 300 es
precision mediump float; // GLSL ES 3.00 has no default float precision in fragment shaders
out vec4 fragColor;
void main()
{
    fragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
Note that gl_FragDepth hasn't changed and is still available.
For more information see https://www.opengl.org/wiki/Fragment_Shader
The predefined variable gl_FragColor does not exist anymore in GLSL ES 3.00. You need to define your own out variable for the output of the fragment shader. You can use any name you want, for example:
out vec4 FragColor;
void main() {
    ...
    FragColor = ...;
}
This follows the Core Profile of desktop OpenGL. The reason for not having a predefined fragment shader output is that it does not scale well to multiple render targets, or to render targets that need types other than float vectors.
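For example (my illustration, not part of the question), with user-declared outputs a single fragment shader can feed several render targets of different types, which gl_FragColor could never express:
#version 300 es
precision mediump float;
layout(location = 0) out vec4 sceneColor; // written to color attachment 0
layout(location = 1) out ivec2 pickingId; // written to color attachment 1 (an integer texture)
void main() {
    sceneColor = vec4(1.0, 0.5, 0.25, 1.0);
    pickingId = ivec2(42, 0);
}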

OpenGL shader version error

I am using Visual Studio 2013 but running under Visual Studio 2010 compiler.
I am running Windows 8 in bootcamp on a Macbook Pro with intel iris pro 5200 graphics.
I have a very simple vertex and fragment shader; I am just displaying simple primitives in a window, but I am getting warnings in the console stating:
OpenGL Debug Output: Source(Shader Comiler), type(Other), Priority(Medium), GLSL compile warning(s) for shader 3, "": WARNING: -1:65535: #version : version number deprecated in OGL 3.0 forward compatible context driver
Anyone have any idea how to get rid of these annoying warnings?
Vertex Shader Code:
#version 330 core
uniform mat4 modelMatrix;
uniform mat4 viewMatrix;
uniform mat4 projMatrix;
in vec3 position;
in vec2 texCoord;
in vec4 colour;
out Vertex {
    vec2 texCoord;
    vec4 colour;
} OUT;
void main(void) {
    gl_Position = (projMatrix * viewMatrix * modelMatrix) * vec4(position, 1.0);
    OUT.texCoord = texCoord;
    OUT.colour = colour;
}
Frag Shader code
#version 330 core
in Vertex {
    vec2 texCoord;
    vec4 colour;
} IN;
out vec4 color;
void main() {
    color = IN.colour;
    //color = vec4(1,1,1,1);
}
I always knew Intel drivers were bad, but this is ridiculous. The #version directive is NOT deprecated in GL 3.0. In fact it is more important than ever beginning with GL 3.2, because in addition to the number you can also specify core (default) or compatibility.
Nevertheless, that is not an actual error. It is an invalid warning, and having OpenGL debug output set up is why you keep seeing it. You can ignore it. AMD seems to be the only vendor that uses debug output in a useful way; NV almost never outputs anything, opting instead to crash... and Intel appears to be spouting nonsense.
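If the noise bothers you and your context exposes KHR_debug / GL 4.3 (a guess on my part, but you are clearly receiving debug output already), you can ask the driver to mute medium-severity shader-compiler messages:
// disable medium-severity messages from the shader compiler source
glDebugMessageControl(GL_DEBUG_SOURCE_SHADER_COMPILER, GL_DONT_CARE,
                      GL_DEBUG_SEVERITY_MEDIUM, 0, NULL, GL_FALSE);
On an older context the same call exists with an ARB suffix (glDebugMessageControlARB) if ARB_debug_output is present.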
It is possible that what the driver is really trying to tell you is that you have an OpenGL 3.0 context and you are using a GLSL 3.30 shader. If that is the case, this has got to be the stupidest way I have ever seen of doing that.
Have you tried #version 130 instead? If you do this, the interface blocks (e.g. in Vertex { ...) should generate parse errors, but it would at least rule out the only interpretation of this warning that makes any sense.
There is another possibility that makes a lot more sense in the end. The debug output mentions that this is related to shader object #3. While there is no guarantee that shader names are assigned sequentially beginning with 1, this is usually the case. You have only shown a total of 2 shaders here; #3 would imply a third shader your software loaded.
Are you certain that these are the shaders causing the problem?