When I try to link my vertex and fragment shaders into a program, WebGL throws the error "Varyings with the same name but different type, or statically used varyings in fragment shader are not declared in vertex shader: textureCoordinates".
I have varying vec2 test in both my vertex and fragment shaders, and can't see any reason why the compiler wouldn't be able to find the same varying in both.
Vertex Shader:
varying vec2 test;
void main(void) {
    gl_Position = vec4(0.0, 0.0, 0.0, 0.0);
    test = vec2(1.0, 0.0);
}
Fragment Shader:
precision highp float;
varying vec2 test;
void main(void) {
    gl_FragColor = vec4(test.xy, 0.0, 1.0);
}
Test code:
const canvas = document.createElement('canvas');
const gl = canvas.getContext('webgl');
let vert = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vert, "varying vec2 test;\nvoid main(void) {\n gl_Position = vec4(0.0, 0.0, 0.0, 0.0);\n test = vec2(1.0, 0.0);\n}");
gl.compileShader(vert);
let frag = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(frag, "precision highp float;\nvarying vec2 test;\nvoid main() {\n\tgl_FragColor = vec4(test.xy, 0.0, 1.0);\n}");
gl.compileShader(frag);
let program = gl.createProgram();
gl.attachShader(program, vert);
gl.attachShader(program, frag);
gl.linkProgram(program);
gl.useProgram(program);
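Note that this test code never checks the compile and link results; reading the info logs is the quickest way to see exactly which varying the driver is complaining about. A minimal sketch using only standard WebGL calls on the variables above:
if (!gl.getShaderParameter(vert, gl.COMPILE_STATUS)) {
    console.error('vertex shader: ' + gl.getShaderInfoLog(vert));
}
if (!gl.getShaderParameter(frag, gl.COMPILE_STATUS)) {
    console.error('fragment shader: ' + gl.getShaderInfoLog(frag));
}
if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
    console.error('link: ' + gl.getProgramInfoLog(program));
}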
Just a guess, but I wonder if it's because you're not using the textureCoordinates in your fragment shader. The names and types match just fine, so I don't think that's the issue. I've done the same thing here:
Frag:
// The fragment shader is the rasterization process of webgl
// use float precision for this shader
precision mediump float;
// the input texture coordinate values from the vertex shader
varying vec2 vTextureCoord;
// the texture data, this is bound via gl.bindTexture()
uniform sampler2D texture;
// the colour uniform
uniform vec3 color;
void main(void) {
    // gl_FragColor is the output colour for a particular pixel.
    // use the texture data, specifying the texture coordinate, and modify it by the colour value.
    gl_FragColor = texture2D(texture, vec2(vTextureCoord.s, vTextureCoord.t)) * vec4(color, 1.0);
}
Vert:
// setup passable attributes for the vertex position & texture coordinates
attribute vec3 aVertexPosition;
attribute vec2 aTextureCoord;
// setup a uniform for our perspective * lookat * model view matrix
uniform mat4 uMatrix;
// setup an output variable for our texture coordinates
varying vec2 vTextureCoord;
void main() {
    // take our final matrix to modify the vertex position to display the data on screen in a perspective way.
    // With shader code here, you can modify the look of an image in all sorts of ways.
    // The 4th value here is the w component of the homogeneous coordinates (x, y, z, w);
    // it effectively allows the perspective math to work. With 3D graphics it should be set to 1:
    // less than 1 will appear too big, greater than 1 will appear too small.
    gl_Position = uMatrix * vec4(aVertexPosition, 1);
    vTextureCoord = aTextureCoord;
}
The issue was resolved by updating Chrome for OS X from v51.something to 52.0.2743.82 (64-bit). Weird.
Related
As per my previous question here: what if I want to rotate a sampler2D texture inside the fragment shader?
In that question I rotated the texture inside the vertex shader:
#version 120
attribute vec3 a_position;
attribute vec2 a_texCoord;
varying vec2 v_texCoord;
void main()
{
    const float w = 1.57;
    mat3 A = mat3(cos(w), -sin(w), 0.0,
                  sin(w),  cos(w), 0.0,
                  0.0,     0.0,    1.0);
    gl_Position = vec4(A * a_position, 1.0);
    v_texCoord = a_texCoord;
}
but my fragment shader applies a heavy modification that was designed for a texture rotated clockwise, so with the rotation done in the vertex shader I get a horizontal effect applied to the vertical coordinates by the fragment shader.
Is it possible to rotate a sampler2D before apply the modification?
You cannot rotate a sampler2D, but you can rotate the texture coordinates:
#version 120
attribute vec3 a_position;
attribute vec2 a_texCoord;
varying vec2 v_texCoord;
void main()
{
    const float w = 1.57;
    mat2 uvRotate = mat2(cos(w), -sin(w),
                         sin(w),  cos(w));
    gl_Position = vec4(a_position, 1.0);
    v_texCoord = uvRotate * a_texCoord;
}
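Note that this rotates the coordinates around the UV origin (0, 0), which is a corner of the texture. If the effect should pivot around the middle of the texture instead, shift to the center, rotate, and shift back; a minimal sketch of that variant:
v_texCoord = uvRotate * (a_texCoord - vec2(0.5)) + vec2(0.5);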
I have a plane, made from a NURBS surface, with many vertices so it can form a curved surface depending on the vertex positions (control points).
I bind the plane object with two different textures: one is the color texture to be displayed on the object, the other is a height map (black and white), which has to alter the y positions of the plane's vertices depending on how white the corresponding texture coordinate is.
I know the problem is in my shaders. I do not have much experience with OpenGL.
Here is the shader.vert that I use:
attribute vec3 aVertexPosition;
attribute vec3 aVertexNormal;
attribute vec2 aTextureCoord;
uniform mat4 uMVMatrix;
uniform mat4 uPMatrix;
uniform mat4 uNMatrix;
varying vec2 vTextureCoord;
uniform sampler2D uSampler2;
uniform float heightScale;
void main() {
    // change texture coordinates
    vec2 texInver = vec2(1.0, -1.0);
    vTextureCoord = aTextureCoord * texInver;
    // change vertex position
    vec4 filter = texture2D(uSampler2, vTextureCoord);
    float offset = filter.r;
    vec3 inc = vec3(0.0, offset, 0.0);
    gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition + inc, 1.0);
}
Since the image is black and white, R = G = B; that is why I only check filter.r.
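As an aside, the heightScale uniform declared in this vertex shader is never actually used; if the intent was to scale the displacement, the increment would presumably look something like this (a sketch, not the poster's code):
vec3 inc = vec3(0.0, offset * heightScale, 0.0);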
And my shader.frag is:
#ifdef GL_ES
precision highp float;
#endif
varying vec2 vTextureCoord;
uniform sampler2D uSampler;
void main() {
    gl_FragColor = texture2D(uSampler, vTextureCoord);
}
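For reference, using two samplers like uSampler and uSampler2 means each texture must be bound to its own texture unit, with the sampler uniforms pointing at those units. A minimal WebGL-style sketch, where program, colorTexture and heightTexture are hypothetical handles:
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, colorTexture);   // hypothetical color texture handle
gl.uniform1i(gl.getUniformLocation(program, 'uSampler'), 0);
gl.activeTexture(gl.TEXTURE1);
gl.bindTexture(gl.TEXTURE_2D, heightTexture);  // hypothetical height map handle
gl.uniform1i(gl.getUniformLocation(program, 'uSampler2'), 1);
Also note that sampling a texture in a vertex shader requires vertex texture fetch support (MAX_VERTEX_TEXTURE_IMAGE_UNITS must be greater than 0 on the device).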
This is the height map (.jpg):
The result I get is a plane with all vertices incremented by 1 in the y coordinate.
The result I expect is for SOME vertices of the plane to be incremented by a 0-1 value in the y coordinate.
I was forgetting to change the number of the object's vertices.
This was the problem; after I did that, it was solved.
I'm working on a simple particle system in OpenGL; so far I've written two fragment shaders to update velocities and positions in response to my mouse, and they seem to work! I've looked at those two textures and they both seem to respond properly (going from random noise to an orderly structure in response to my mouse).
However, I'm having issues with how to draw the particles. I'm rather new to vertex shaders (having previously only used fragment shaders); it's my understanding that the usual way is a vertex shader like this:
uniform sampler2DRect tex;
varying vec4 cur;
void main() {
    gl_FrontColor = gl_Color;
    cur = texture2DRect(tex, gl_Vertex.xy);
    vec2 pos = cur.xy;
    gl_Position = gl_ModelViewProjectionMatrix * vec4(pos, 0., 1.);
}
This would transform the coordinates to the proper place according to the values in the position buffer. However, when I run it I get GL errors saying it can't be compiled; after some research, it seems that gl_ModelViewProjectionMatrix is deprecated.
What would be the proper way to do this now that the model view matrix is deprecated? I'm not trying to do anything fancy with perspective, I just need a plain orthogonal view of the texture.
thanks!
What version of GLSL are you using? I don't see any #version directive. Yes, I think gl_ModelViewProjectionMatrix really is deprecated; however, if you want to use it, maybe this could help. By the way, the varying qualifier is quite old too; I would rather use the in and out qualifiers, as it makes your shader code more 'readable'.
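For example, the varying vec4 cur pair from the question would be written like this in newer GLSL (a sketch; the rest of the shaders stays the same):
// vertex shader: 'out' replaces 'varying'
out vec4 cur;
// fragment shader: 'in' replaces 'varying'
in vec4 cur;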
The 'proper' way of doing this is to create your own matrices, model and view (use the glm library, for example), multiply them, and then pass the result as a uniform to your shader. A tutorial with an example can be found here.
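Since you only want a plain orthographic view, the matrix can even be written out by hand and uploaded with one call. A minimal sketch in WebGL-style JavaScript (the desktop glUniformMatrix4fv call is analogous; gl, program and the uMatrix uniform name are assumed for the example):
const [l, r, b, t, n, f] = [-1, 1, -1, 1, -1, 1]; // view volume bounds
// standard orthographic projection, flattened column-major as GL expects
const ortho = new Float32Array([
    2 / (r - l), 0, 0, 0,
    0, 2 / (t - b), 0, 0,
    0, 0, -2 / (f - n), 0,
    -(r + l) / (r - l), -(t + b) / (t - b), -(f + n) / (f - n), 1,
]);
gl.uniformMatrix4fv(gl.getUniformLocation(program, 'uMatrix'), false, ortho);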
Here is the vertex shader I used for displaying a texture (fullscreen quad):
#version 430
layout(location = 0) in vec2 vPosition;
layout(location = 1) in vec2 vUV;
out vec2 uv;
void main()
{
    gl_Position = vec4(vPosition, 1.0, 1.0);
    uv = vUV;
}
fragment shader:
#version 430
in vec2 uv;
out vec4 final_color;
uniform sampler2D tex;
void main()
{
    final_color = texture(tex, uv).rgba;
}
and here are my coordinates (mine are static, but you can change them and update the buffer; the shader can stay the same):
// Quad vertices - z coord omitted, because it will always be 1
float pos[] = {
    -1.0,  1.0,
     1.0,  1.0,
    -1.0, -1.0,
     1.0, -1.0
};
float uv[] = {
    0.0, 1.0,
    1.0, 1.0,
    0.0, 0.0,
    1.0, 0.0
};
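Those four vertices are ordered for a triangle-strip draw, so (assuming the buffers are bound) the draw call would be something like this, shown WebGL-style; the desktop equivalent is glDrawArrays(GL_TRIANGLE_STRIP, 0, 4):
gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4); // two triangles covering the quad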
Maybe you could try turning off the depth test before executing this shader: glDisable(GL_DEPTH_TEST);
I've set up an OpenGL environment with deferred shading following this tutorial, but I can't make the second shader output to my final buffer.
I can see that the first shader (the one that doesn't use lights) is working properly, because with gDEBugger I can see that the output buffers are correct, but the second shader really can't display anything. I've also tried making the second shader output a single color for the whole scene just to see if it was displaying something, but nothing is visible (the screen should be completely red, but it isn't).
The first-pass shader (the one I use to create the buffers of the GBuffer) is working, so I'm not adding its code or how I created and implemented my GBuffer, but if you need them I'll add them, just tell me.
I think the problem is when I tell OpenGL to output to FrameBuffer 0 (my screen).
This is how I enable OpenGL to write to FrameBuffer 0:
glEnable(GL_BLEND);
m_MotoreGrafico->glBlendEquation(GL_FUNC_ADD);
glBlendFunc(GL_ONE, GL_ONE);
// Enable writing to the final buffer
m_MotoreGrafico->glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
m_gBuffer.BindForReading();
glClear(GL_COLOR_BUFFER_BIT);
// Set up the shader matrices
SetUpOGLProjectionViewMatrix(1);
// Pass the GBuffer textures to the shader
pActiveShader->setUniform1i(_T("gPositionMap"), m_gBuffer.GetPositionTexture());
pActiveShader->setUniform1i(_T("gColorMap"), m_gBuffer.GetDiffuseTexture());
pActiveShader->setUniform1i(_T("gNormalMap"), m_gBuffer.GetNormalTexture());
// Pass the variables the shader needs
float dimensioneFinestra[2], posizioneCamera[3];
dimensioneFinestra[0] = m_nLarghezzaFinestra;
dimensioneFinestra[1] = m_nAltezzaFinestra;
m_MotoreGrafico->GetActiveCameraPosition(posizioneCamera);
pActiveShader->setUniform2f(_T("gScreenSize"), dimensioneFinestra);
pActiveShader->setUniform3f(_T("gCameraPos"), posizioneCamera);
pActiveShader->setUniform1i(_T("gUsaLuci"), 0);
// Draw the lights
float coloreLuce[3], posizioneLuce[3], direzioneLuce[3], vUpLuce[3], vRightLuce[3], intensita;
for(int i = 0; i < GetDocument()->m_RTL.GetNLights(); i++)
{
    CRTLuce* pRTLuce = GetDocument()->m_RTL.GetRTLightAt(i);
    ...
    m_MotoreGrafico->glBindVertexArray(pRTLuce->GetRTLuce()->GetVBO()->getVBAIndex());
    glDrawArrays(GL_TRIANGLES, 0, pRTLuce->GetRTLuce()->GetNVertPerShader());
}
The function m_gBuffer.BindForReading() looks like this (but I think it doesn't matter for my problem):
for (unsigned int i = 0; i < ARRAY_SIZE_IN_ELEMENTS(m_textures); i++)
{
    m_pMotoreGrafico->glActiveTexture(GL_TEXTURE0 + i);
    glBindTexture(GL_TEXTURE_2D, m_textures[GBUFFER_TEXTURE_TYPE_POSITION + i]);
}
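One thing worth keeping in mind with this pattern (not necessarily the bug here): a sampler uniform is set to the texture unit index, the i in GL_TEXTURE0 + i above, not to the texture object id. A minimal sketch in WebGL-style JavaScript (the desktop glActiveTexture/glUniform1i pair works the same way), with positionTexture as a hypothetical handle:
gl.activeTexture(gl.TEXTURE0 + 0);
gl.bindTexture(gl.TEXTURE_2D, positionTexture);                  // bind the texture to unit 0
gl.uniform1i(gl.getUniformLocation(program, 'gPositionMap'), 0); // 0 = unit index, not texture id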
So far my GBuffer is working (it creates the textures) and my first shader is also working (it's drawing the textures of my GBuffer).
The problem then is that I can't reset OpenGL to draw to my screen.
The first 4 textures are the ones created with the first-pass shader.
This is my back buffer (after the second-pass shader)
And this is my front buffer (after the second-pass shader)
This is my second-pass fragment shader code (it outputs only red)
out vec4 outputColor;
void main()
{
    outputColor = vec4(1.0, 0.0, 0.0, 1.0);
}
Does anyone have an idea of what I'm doing wrong?
Second-pass vertex shader code:
#version 330
uniform struct Matrici
{
    mat4 projectionMatrix;
    mat4 modelMatrix;
    mat4 viewMatrix;
} matrices;
layout (location = 0) in vec3 inPosition;
void main()
{
    vec4 vEyeSpacePosVertex = matrices.viewMatrix * matrices.modelMatrix * vec4(inPosition, 1.0);
    gl_Position = matrices.projectionMatrix * vEyeSpacePosVertex;
}
Second-pass fragment shader code:
#version 330
uniform struct MDLight
{
    vec3 vColor;
    vec3 vPosition;
    vec3 vDirection;
    float fAmbientIntensity;
    float fStrength;
    int bOn;
    float fConeCosine;
    float fAltezza;
    float fLarghezza;
    vec3 vUp;
    vec3 vRight;
} gLuce;
uniform float gSpecularIntensity;
uniform float gSpecularPower;
uniform sampler2D gPositionMap;
uniform sampler2D gColorMap;
uniform sampler2D gNormalMap;
uniform vec3 gCameraPos;
uniform vec2 gScreenSize;
uniform int gLightType;
uniform int gUsaLuci;
vec2 CalcTexCoord()
{
    return gl_FragCoord.xy / gScreenSize;
}
out vec4 outputColor;
void main()
{
    vec2 TexCoord = CalcTexCoord();
    vec4 Color = texture(gColorMap, TexCoord);
    outputColor = vec4(1.0, 0.0, 0.0, 1.0);
}
Let's say my shader program contains a vertex and a fragment shader.
How can I pass information straight to the fragment shader?
When I use glUniform... and specify the variable I want to address inside the fragment shader, it throws an error like this:
Using ShaderProgram: The fragment shader uses varying myVarHere, but previous shader does not write to it.
Since I attach the vertex shader first, I found out that I need to "pass the information through" the vertex shader using in and out.
I'm not sure if this is the way to go, so my question: is there a way to tell OpenGL to address the variable in a certain shader, or am I doing / understanding something horribly wrong?
VertexShader:
#version 130
precision highp float;
uniform mat4 projection;
uniform mat4 view;
uniform mat4 model;
in vec3 vertexData;
in vec3 normalData;
out vec3 normal;
void main(void)
{
    normal = (model * view * vec4(normalData, 0)).xyz;
    gl_Position = projection * model * view * vec4(vertexData, 1);
}
Fragment:
#version 130
precision highp float;
in vec3 light0; // <- this is what I want to fill with data
const vec3 ambient = vec3(0.0, 0.0, 0.0);
const vec3 lightColor = vec3(0.6, 0.0, 0.0);
in vec3 normal;
out vec4 out_frag_color;
void main(void)
{
    float diffuse = clamp(dot(normalize(light0), normalize(normal)), 0.5, 1.0);
    out_frag_color = vec4(ambient + diffuse * lightColor, 1.0);
}
It seems like you want light0 to be a uniform.
Just make it a uniform:
#version 130
precision highp float;
uniform vec3 light0;
...
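The uniform is then filled from the application rather than from another shader stage. A minimal sketch in WebGL-style JavaScript (desktop glGetUniformLocation/glUniform3f are analogous; the direction values are made up for illustration):
const loc = gl.getUniformLocation(program, 'light0');
gl.uniform3f(loc, 0.0, 1.0, 0.5); // hypothetical light direction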