Black Texture when retrieving FrameBuffer in LibGDX - OpenGL

In my program I first render a scene as seen by a camera, and then I want to retrieve that scene (the color buffer) into a Texture object so I can paint it onto a quad covering the whole screen. It may look pointless, but the idea is just to check that the texture works before using it in a less trivial pass (I'm trying to apply shadow mapping); sadly, it's not working as I'd like. The getColorBufferTexture() call seems to return a black texture, no matter what final colour I write to gl_FragColor in the fragment shader of the first pass. Rendering the first pass directly to the default color buffer displays exactly what it should, so there's no problem with that first render.
Here's the code:
First rendering,
FrameBuffer shadowBuffer = new FrameBuffer(Pixmap.Format.RGBA8888, 20, 20, true);

shadowBuffer.begin();
shadowShaderProgram.begin();
for (DisplayableObject obj : objects) {
    shadowShaderProgram.setUniformMatrix("u_modelViewProjectionMatrix", shadowCamera.getPVMatrix().mul(obj.getTMatrix()));
    shadowShaderProgram.setUniformMatrix("u_modelViewMatrix", shadowCamera.getVMatrix().mul(obj.getTMatrix()));
    obj.getTexture().bind();
    obj.getMesh().render(shadowShaderProgram, GL20.GL_TRIANGLES);
}
shadowShaderProgram.end();
shadowBuffer.end();

Texture shadowMap = shadowBuffer.getColorBufferTexture();
And then the second pass, which tries to paint that texture onto a fullscreen quad:
Mesh mesh = new Mesh(true, 4, 6,
        new VertexAttribute(VertexAttributes.Usage.Position, 4, "a_position"),
        new VertexAttribute(VertexAttributes.Usage.TextureCoordinates, 2, "a_texCoord0"));
mesh.setVertices(new float[] {
        // position, texCoord
        -1f, -1f, 0f, 1f,   0f, 0f,
         1f, -1f, 0f, 1f,   1f, 0f,
         1f,  1f, 0f, 1f,   1f, 1f,
        -1f,  1f, 0f, 1f,   0f, 1f
});
mesh.setIndices(new short[] { 0, 1, 2, 2, 3, 0 });

String vs = Gdx.files.internal("auxShaderVert.glsl").readString();
String fs = Gdx.files.internal("auxShaderFrag.glsl").readString();
ShaderProgram auxShader = new ShaderProgram(vs, fs);

auxShader.begin();
shadowMap.bind();
auxShader.setUniformi("shadowMap", 0);
mesh.render(auxShader, GL20.GL_TRIANGLES);
auxShader.end();
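For reference, LibGDX's Texture also has a bind(int unit) overload, so an equivalent way to write that same binding, selecting unit 0 explicitly rather than relying on whichever texture unit happens to be active, would be (just a sketch of the idea):
auxShader.begin();
shadowMap.bind(0);                      // bind to texture unit 0 explicitly
auxShader.setUniformi("shadowMap", 0);  // the sampler reads from unit 0
mesh.render(auxShader, GL20.GL_TRIANGLES);
auxShader.end();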
The shaders of this second rendering are, respectively:
VS,
attribute vec4 a_position;
attribute vec2 a_texCoord0;

varying vec2 v_texCoord;

void main() {
    v_texCoord = a_texCoord0;
    gl_Position = a_position;
}
FS,
uniform sampler2D shadowMap;

varying vec2 v_texCoord;

void main() {
    gl_FragColor = texture2D(shadowMap, v_texCoord);
}
I could not find much help in other posts and questions online about what's going wrong with the texture retrieval; I hope you can help me, I'm sure it must be something really stupid...
Thanks.

Related

Fragment shader not creating gradient like light in OpenGL GLSL

I am trying to understand how to manipulate my renderings with shaders. I haven't changed the projection matrix of the scene, and I draw a triangle with vertices {-0.5, -0.5}, {0.5, -0.5}, {0, 0.5}. I then pass a vec2 position of a "light" into a uniform of my fragment shader; I want it to shine onto my triangle from the top right of the triangle (lightPos = (0.5, 0.5)).
Here is a very bad drawing of where everything is located.
and this is what I aim to have in my triangle (kind of; it doesn't need to go from white to blue, it just needs to be brighter near the light and darker further away)
Here is the fragment shader:
#version 460 core
in vec3 oPos;
out vec4 fragColor;
uniform vec3 u_outputColor;
uniform vec2 u_lightPosition;
void main(){
    float intensity = 1 / length(oPos.xy - u_lightPosition);
    vec4 col = vec4(u_outputColor, 1.0f);
    fragColor = col * intensity;
}
Here is the basic code for compiling the shader (most of it is abstracted away, so it is fairly simple):
/* Test data for shader program. */
program.init("passthrough.vert", "passthrough.frag");
program.setUniformVec3("u_outputColor", 0.3f, 0.3f, 0.8f);
program.setUniformVec2("u_lightPosition", 0.5f, 0.5f);
GLfloat vertices[9] = { -0.5f, -0.5f, 0.0f,   0.0f, 0.5f, 0.0f,   0.5f, -0.5f, 0.0f };
Here is the vertex shader:
#version 460 core
layout (location = 0) in vec3 aPos;
out vec3 oPos;
void main(){
    gl_Position = vec4(aPos.x, aPos.y, aPos.z, 1.0);
}
Every single test I have run to see why I can't get this to work seems to show that a slight color change shifts the entire triangle to a different shade. All tests show a triangle of ONE color across the entire thing; no gradient at all. I want the triangle to be a gradient that is brighter near the light and darker further from it. This is driving me crazy: I have been stuck on such a simple thing for 3 hours now, and any code I write seems to modify all 3 vertices at once, as if they were in the exact same spot. I wrote the math out and I strongly feel this should work. Any help is very appreciated.
EDIT
The triangle after the solution fixed my issue:
Try this for your vertex shader:
#version 460 core
layout (location = 0) in vec3 aPos;
out vec3 oPos;
void main(){
    oPos.xyz = aPos.xyz; // ADD THIS LINE
    gl_Position = vec4(aPos.xyz, 1.0);
}
Your version never writes to oPos, so the fragment shader gets either (a) a random value or, in your case, (b) vec3(0,0,0). Since your color calculation is based on:
float intensity = 1 / length(oPos.xy - u_lightPosition);
This is basically the same as
float intensity = 1 / length(-1*u_lightPosition);
So the color only depends on the light position.
You can debug and verify this by setting your fragment color to oPos:
vec4 col = vec4(oPos.xy, oPos.z + 0.5, 1.0f);
If oPos was set correctly in the vertex shader, then this line in the fragment shader would show you an RGB ramp. If oPos is not set correctly, you'll see 50% blue.
Always check for errors and logs returned from OpenGL. It should have emitted a warning about this that would have sent you straight to the problem.
Also, note that your vertices have a z of 0; with no projection matrix that still sits inside the clip volume, so nothing gets clipped here, but it's worth keeping in mind once you add a real projection.
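Checking those logs can be as simple as the following (a generic sketch using plain GL calls through a Java binding for illustration; your program wrapper is abstracted away, so shaderId/programId stand for whatever IDs it created):
import static org.lwjgl.opengl.GL11.GL_FALSE;
import static org.lwjgl.opengl.GL20.*;

// Prints whatever the driver put in the compile/link logs (warnings included),
// and fails loudly if compilation or linking actually failed.
static void dumpGlslLogs(int shaderId, int programId) {
    String compileLog = glGetShaderInfoLog(shaderId);
    if (!compileLog.isEmpty()) System.err.println("compile log:\n" + compileLog);
    String linkLog = glGetProgramInfoLog(programId);
    if (!linkLog.isEmpty()) System.err.println("link log:\n" + linkLog);
    if (glGetShaderi(shaderId, GL_COMPILE_STATUS) == GL_FALSE
            || glGetProgrami(programId, GL_LINK_STATUS) == GL_FALSE) {
        throw new IllegalStateException("shader compile/link failed");
    }
}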

Fragment shader to process texture data and pass through

I'm trying to use an atomic counter in a fragment shader.
Basically I'm displaying just a textured quad and need to make calculations on the texture data, and display the unaltered texture.
I began by writing a simple pass-through fragment shader, but the displayed texture is a plain color. Either I'm missing something, or I do need a vertex shader as well; since I'm only processing the texture, one seemed unnecessary to me.
Fragment shader code:
#version 420

uniform vec2 texture_coordinate;
uniform sampler2D my_color_texture;

void main() {
    gl_FragColor = texture2D(my_color_texture, texture_coordinate);
}
Rendering pipeline - using SharpGL, but it's easily readable:
gl.Enable(OpenGL.GL_TEXTURE_2D);
String s = "my_color_texture"; //name of the texture variable in shader
gl.UseProgram(graycounter); // graycounter is a compiled/linked shader
int my_sampler_uniform_location = gl.GetUniformLocation(graycounter, s);
gl.ActiveTexture(OpenGL.GL_TEXTURE0);
gl.BindTexture(OpenGL.GL_TEXTURE_2D, ((dTexture)texture).texture); //texture id, working without shader on
gl.Uniform1ARB(my_sampler_uniform_location, 0);
gl.Begin(OpenGL.GL_QUADS); // show the rectangle
gl.Color(1.0f, 1.0f, 1.0f);
gl.TexCoord(0.0f, 0.0f); gl.Vertex(0.0f, 0.0f, 0.0f); // Bottom Left Of The Texture and Quad
gl.TexCoord(1.0f, 0.0f); gl.Vertex(w, 0.0f, 0.0f); // Bottom Right Of The Texture and Quad
gl.TexCoord(1.0f, 1.0f); gl.Vertex(w, h, 0.0f); // Top Right Of The Texture and Quad
gl.TexCoord(0.0f, 1.0f); gl.Vertex(0.0f, h, 0.0f); // Top Left Of The Texture and Quad
gl.End();
gl.UseProgram(0);
uniform vec2 texture_coordinate;
Uniform values do not change from fragment to fragment; that's why they're called "uniform". You probably wanted in rather than uniform.
Granted, since you're using legacy features, you probably need to hook into the gl_TexCoord[] array. That's what the legacy vertex processor (since you're not using a VS) will spit out. Texture coordinate 0 will map to gl_TexCoord[0].
I did finally figure it out; if someone has the same problem, here is the vertex shader (note there is no #version directive):
out vec4 texCoord;

void main(void)
{
    gl_Position = ftransform();
    texCoord = gl_TextureMatrix[0] * gl_MultiTexCoord0;
}
and the fragment shader, using #version 440 to support atomic counters:
#version 440

layout (binding = 0, offset = 0) uniform atomic_uint g1;
uniform sampler2D my_color_texture;

in vec4 texCoord;
out vec4 fragColour;

void main() {
    fragColour = texture2D(my_color_texture, texCoord.xy);
    if (abs(fragColour.r * 255 - 0) < 1) atomicCounterIncrement(g1);
}
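For completeness: the binding = 0 layout qualifier assumes an atomic counter buffer is attached to binding point 0 and read back after the draw. Here is a rough sketch of that setup with plain GL calls (shown through a Java GL binding for illustration, not the SharpGL wrapper used above):
import static org.lwjgl.opengl.GL15.*;
import static org.lwjgl.opengl.GL30.glBindBufferBase;
import static org.lwjgl.opengl.GL42.GL_ATOMIC_COUNTER_BUFFER;

class AtomicCounterSketch {
    int counterBuffer;

    // Call once at startup: create the counter buffer and attach it to
    // binding point 0, matching "binding = 0" in the fragment shader.
    void setup() {
        counterBuffer = glGenBuffers();
        glBindBuffer(GL_ATOMIC_COUNTER_BUFFER, counterBuffer);
        glBufferData(GL_ATOMIC_COUNTER_BUFFER, new int[] { 0 }, GL_DYNAMIC_DRAW);
        glBindBufferBase(GL_ATOMIC_COUNTER_BUFFER, 0, counterBuffer);
    }

    // Call after rendering the quad: read the count back and zero it for the next frame.
    int readAndReset() {
        int[] count = new int[1];
        glBindBuffer(GL_ATOMIC_COUNTER_BUFFER, counterBuffer);
        glGetBufferSubData(GL_ATOMIC_COUNTER_BUFFER, 0, count);
        glBufferSubData(GL_ATOMIC_COUNTER_BUFFER, 0, new int[] { 0 });
        return count[0];
    }
}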

How would I draw a line between 2 world-coordinate points using OpenGL?

I have a basic 3D demo set up drawing lines with OpenGL using VAOs and shaders.
// Example vertices
float[] verts = {
    1f, 1f, 1f,
    0f, 0f, 0f
};
// Loading to VAO etc. as standard and enabling shader here...
// Draw call
glBindVertexArray(vaoID);
glDrawArrays(GL_LINES, 0, 2);
Vertex shader is basic, no model matrix in use at the moment:
gl_Position = projectionMatrix * viewMatrix * vec4(position, 1.0f);
This outputs the line drawn properly in the 3D world, but the start/end points come from the vertex positions alone, which isn't very useful.
My question is: how can I specify the start and end locations in my world coordinates and draw a line between those points?
I would imagine I need to somehow transform the position of each of the two vertices by a different model matrix, but it's not at all clear how this would work, or what the original vertices should be for the line.
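To make the goal concrete, here is roughly the kind of call I have in mind (a sketch only: drawWorldLine, vaoID and vboID are made-up names, LWJGL-style static imports are assumed, and the body is just my guess at an approach, rewriting the VBO directly since the shader applies no model matrix and the uploaded positions are therefore already world coordinates):
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL15.*;
import static org.lwjgl.opengl.GL30.glBindVertexArray;

// Hypothetical helper: writes two world-space points into the line's VBO and draws it.
static void drawWorldLine(int vaoID, int vboID,
                          float ax, float ay, float az,
                          float bx, float by, float bz) {
    float[] endpoints = { ax, ay, az, bx, by, bz };
    glBindBuffer(GL_ARRAY_BUFFER, vboID);
    glBufferSubData(GL_ARRAY_BUFFER, 0, endpoints); // overwrite the two vertices in place
    glBindVertexArray(vaoID);
    glDrawArrays(GL_LINES, 0, 2);
}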

alpha not changing transparency of object using glsl shader

How come, when I manually change the alpha value in the array being passed to the shader, the result is the same for both 0.0f and 1.0f?
I was expecting the object to be drawn with some level of transparency, depending on alpha value.
I'm not using any textures. I always see my red object against a black background.
Accessing the GLSL variable from Java:
float[] color = { 1.0f, 0.0f, 0.0f, 1.0f };
int mColorHandle = gl2.glGetUniformLocation(shaderProgram, "vColor");
gl2.glUniform4fv(mColorHandle, 1, color, 0);
GLSL fragment shader:
#version 120

uniform vec4 vColor;

void main() {
    gl_FragColor = vColor;
    gl_FragColor.a = 0.0; // does not make object transparent
    // gl_FragColor.a = 1.0; // does not make object transparent
}
Needed to enable blending:
gl2.glEnable(GL.GL_BLEND);
gl2.glBlendFunc(GL.GL_SRC_ALPHA, GL.GL_ONE_MINUS_SRC_ALPHA);
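For context, a minimal sketch of where those calls sit relative to the draw (JOGL-style; drawObject is a placeholder for the existing draw code, and the 0.5f alpha is just an example value passed in through vColor instead of being hard-coded in the shader):
// enable blending before drawing anything that should be transparent
gl2.glEnable(GL.GL_BLEND);
gl2.glBlendFunc(GL.GL_SRC_ALPHA, GL.GL_ONE_MINUS_SRC_ALPHA);

gl2.glUseProgram(shaderProgram);
int mColorHandle = gl2.glGetUniformLocation(shaderProgram, "vColor");
gl2.glUniform4fv(mColorHandle, 1, new float[] { 1.0f, 0.0f, 0.0f, 0.5f }, 0); // half-transparent red
drawObject(gl2); // placeholder for the existing draw call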

Superimpose lights with shaders in LibGDX game

I'm making a turn-based game with Libgdx.
I'm trying to create a fog of war for battles by adding a mask on the map and a maplight on each cell in the fight zone.
To do this I have to superimpose lights, but I can't.
The result in game:
The Java render code:
//draw the light to the FBO
fbo.begin();
batch.setProjectionMatrix(cam.combined);
batch.setShader(defaultShader);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
batch.begin();
float lightSize = lightOscillate ? (4.75f + 0.25f * (float)Math.sin(zAngle) + 0.2f * MathUtils.random()) : 5.0f;
//Draw light 1
batch.draw(light, 0, 0, lightSize, lightSize);
//Draw light 2
batch.draw(light, 2, 2, lightSize, lightSize);
batch.end();
fbo.end();
//draw the actual scene
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
batch.setProjectionMatrix(cam.combined);
batch.setShader(finalShader);
batch.begin();
fbo.getColorBufferTexture().bind(1); //this is important! bind the FBO to the 2nd texture unit
light.bind(0); //we force the binding of a texture on the first texture unit to avoid artefacts
//this is because our default and ambient shaders don't use multitexturing...
//you can basically bind anything, it doesn't matter
tilemap.render(batch, dt);
batch.end();
The fragment shader code:
varying LOWP vec4 vColor;
varying vec2 vTexCoord;
//texture samplers
uniform sampler2D u_texture; //diffuse map
uniform sampler2D u_lightmap; //light map
//additional parameters for the shader
uniform vec2 resolution; //resolution of screen
uniform LOWP vec4 ambientColor; //ambient RGB, alpha channel is intensity
void main() {
    vec4 diffuseColor = texture2D(u_texture, vTexCoord);
    vec2 lighCoord = gl_FragCoord.xy / resolution.xy;
    vec4 light = texture2D(u_lightmap, lighCoord);
    vec3 ambient = ambientColor.rgb * ambientColor.a;
    vec3 intensity = ambient + light.rgb;
    vec3 finalColor = diffuseColor.rgb * intensity;
    gl_FragColor = vColor * light; //vec4(finalColor, diffuseColor.a);
}
I found the solution. The alpha blending had to be enabled.
Here is the correct render code:
public void render() {
    final float dt = Gdx.graphics.getRawDeltaTime();

    Gdx.gl.glClearColor(0f, 0f, 0f, 1f);
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);

    //additive blending while accumulating the lights, so overlapping lights add up
    batch.setBlendFunction(GL20.GL_SRC_ALPHA, GL20.GL_ONE);

    //draw the lights to the FBO
    fbo.begin();
    Gdx.gl.glClearColor(0f, 0f, 0f, 0f);
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    batch.setProjectionMatrix(cam.combined);
    batch.setShader(defaultShader);
    batch.begin();
    float lightSize = lightOscillate ? (4.75f + 0.25f * (float)Math.sin(zAngle) + 0.2f * MathUtils.random()) : 5.0f;
    //Draw light 1
    batch.draw(light, 0, 0, lightSize, lightSize);
    //Draw light 2
    batch.draw(light, 2, 2, lightSize, lightSize);
    batch.end();
    fbo.end();

    //back to standard alpha blending for the scene itself
    batch.setBlendFunction(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);

    //draw the actual scene
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    batch.setProjectionMatrix(cam.combined);
    batch.setShader(finalShader);
    batch.begin();
    fbo.getColorBufferTexture().bind(1); //this is important! bind the FBO to the 2nd texture unit
    light.bind(0); //we force the binding of a texture on the first texture unit to avoid artefacts
    //this is because our default and ambient shaders don't use multitexturing...
    //you can basically bind anything, it doesn't matter
    tilemap.render(batch, dt);
    batch.end();
}