I am working on a game project using LibGDX. Now I'm facing a problem I can't understand; let me explain what I've got before coming to the actual problem:
1. Static background C (tiled)
2. Moving background B (tiled)
3. Moving background A (tiled)
4. Characters, entities
5. Walls (tiled)
6. HUD
The render loop goes as follows:
1. Render layers 2 to 5 from the list above to a transparent FBO
2. Render the light map to another FBO (using box2dLights)
3. Blend both products together using a custom shader
4. Draw the static background
5. Draw the blended texture from step 3
6. Draw the HUD
The problem I'm facing is that background B is rendered as a white sprite (not a solid white block; it is more like rgba(1, 1, 1, x)). Background B is then blended correctly with the light map (render step 3), but I cannot get its real color, just white.
Background B should appear dark blue and textured; instead it shows up as a white sprite (blended with the light map). Behind background A, at the left of the picture (and also where the "SHADOW" text is), you can still see background B in a purplish tone, the result of blending with the light map, but note that it has no texture: it looks as if the texture were just plain white plus its alpha channel.
The render code:
@Override
public void render(float delta) {
    Gdx.gl.glClearColor(0f, 0f, 0f, 1f);
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);

    /* 1. Update physics and lights */
    if (worldFixedStep(delta)) {
        rayHandler.update();
    }
    updateCameras();
    cameraMatrixCopy.set(camera.combined);
    rayHandler.setCombinedMatrix(cameraMatrixCopy.scale(Globals.BOX_TO_WORLD, Globals.BOX_TO_WORLD, 1.0f),
            camera.position.x, camera.position.y,
            camera.viewportWidth * camera.zoom * Globals.BOX_TO_WORLD,
            camera.viewportHeight * camera.zoom * Globals.BOX_TO_WORLD);
    rayHandler.render();
    final Texture lightMap = rayHandler.getLightMapTexture();

    fbo.begin();
    {
        Gdx.gl.glClearColor(0f, 0f, 0f, 0f);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);

        /* 2. Draw the backgrounds */
        batch.enableBlending();
        batch.setBlendFunction(GL20.GL_ONE, GL20.GL_ONE_MINUS_SRC_ALPHA); // premultiplied-alpha blending
        batch.setProjectionMatrix(bgCamera2.combined); // background B
        batch.begin();
        bTileMapRenderer.setView(bgCamera2);
        bTileMapRenderer.renderTileLayer(tilesBg2Layer);
        batch.end();

        batch.setProjectionMatrix(bgCamera1.combined); // background A
        batch.begin();
        bTileMapRenderer.setView(bgCamera1);
        bTileMapRenderer.renderTileLayer(tilesBg1Layer);
        batch.end();

        batch.setBlendFunction(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA); // FIXME this is where to touch to avoid gray alpha'ed borders.
        batch.setProjectionMatrix(camera.combined);
        batch.begin();
        (...) // Draw everything else that needs to be blended with the light map
        batch.end();
    }
    fbo.end();

    /* 3. Render the static background (background C) */
    batch.setProjectionMatrix(bgCameraStatic.combined);
    batch.disableBlending();
    batch.begin();
    bTileMapRenderer.setView(bgCameraStatic);
    bTileMapRenderer.renderTileLayer(tilesBg3Layer);
    batch.end();

    /* 4. Blend the frame buffer's texture with the light map in a fancy way */
    Gdx.gl20.glActiveTexture(GL20.GL_TEXTURE0);
    fboRegion.getTexture().bind();
    Gdx.gl20.glActiveTexture(GL20.GL_TEXTURE1);
    lightMap.bind();
    Gdx.gl20.glEnable(GL20.GL_BLEND);
    Gdx.gl20.glBlendFunc(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);
    lightShader.begin();
    lightShader.setUniformf("ambient_color", bgColor[0], bgColor[1], bgColor[2]);
    lightShader.setUniformi("u_texture0", 0);
    lightShader.setUniformi("u_texture1", 1);
    fullScreenQuad.render(lightShader, GL20.GL_TRIANGLE_FAN, 0, 4);
    lightShader.end();
    Gdx.gl20.glDisable(GL20.GL_BLEND);

    hud.draw();
}
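(The lightShader source isn't shown in the post. For orientation only, a combining fragment shader that matches the uniforms bound above (u_texture0, u_texture1 and ambient_color) typically has this shape; this is a sketch, not the project's actual shader:)

#ifdef GL_ES
precision mediump float;
#endif
varying vec2 v_texCoords;
uniform sampler2D u_texture0; // scene FBO color
uniform sampler2D u_texture1; // box2dLights light map
uniform vec3 ambient_color;
void main() {
    vec4 scene = texture2D(u_texture0, v_texCoords);
    vec4 light = texture2D(u_texture1, v_texCoords);
    gl_FragColor = vec4(scene.rgb * (ambient_color + light.rgb), scene.a);
}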
I just can't understand why this background is drawn white while still keeping its alpha data. The image is a premultiplied-alpha texture, but again, I cannot see why that should affect the color rendering.
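(For reference: a premultiplied-alpha texture stores each texel as (r*a, g*a, b*a, a), so a half-transparent dark blue of rgba(0, 0, 0.5, 0.5) is stored as (0, 0, 0.25, 0.5); that convention is why the backgrounds are drawn with the GL_ONE / GL_ONE_MINUS_SRC_ALPHA blend function above.)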
Any help would be much appreciated.
Cheers
Related
I have two textures rendered in the same way. The green texture has the right transparency in the right places, but when I move the pink texture in front, it shows the background color where it should be transparent.
This is a snippet of the paintGL method that renders the textures:
void OpenGLWidget::paintGL()
{
    // ...
    for (int i = 0; i < lights.size(); i++)
    {
        glUseProgram(lights[i].program);
        setUniform3fv(program, "lightPosition", 1, &lights[i].position[0]);
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, lights[i].texture);
        lights[i].svg.setColor(toColor(lights[i].diffuse));
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, lights[i].svg.width(), lights[i].svg.height(), 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, lights[i].svg.toImage().constBits());
        glGenerateMipmap(GL_TEXTURE_2D);
        glBindVertexArray(lights[i].vertexArray);
        glDrawElements(GL_TRIANGLES, lights[i].indices.size(), GL_UNSIGNED_BYTE, nullptr);
    }
    update();
}
The toImage method of the svg class generates a new QImage object from the SVG file, so the texture contents should be updated on every frame.
Where am I going wrong? Thanks!
This probably happens because you have depth testing enabled. Even though parts of the texture are (partly or fully) transparent, OpenGL still writes to the depth buffer, so the pink light's quad appears to obscure the green light. It works the other way round, because the pink light is drawn first, so the green light hasn't been written to the depth buffer at that point.
The usual solution to this is to render transparent textures in back to front order.
You could also just write your fragment shader to discard fragments if they are transparent. But this results in artifacts if you have semi-transparent fragments, which you have, because of texture filtering and mipmaps.
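For illustration, a minimal fragment shader for the discard approach could look like this (the uniform names and the 0.01 cutoff are examples, not taken from the question's code):

varying vec2 v_texCoords;
uniform sampler2D u_texture;
void main() {
    vec4 color = texture2D(u_texture, v_texCoords);
    if (color.a < 0.01) discard; // fully transparent fragments no longer write depth
    gl_FragColor = color;
}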
Usually with Processing, you fill a rectangle with a given color like so:
fill(255, 0, 0); // Make it red
rect(0,0, 100, 100); // Make it square
However, this does not work here; the rectangle displays the shader instead. Somewhere earlier in the execution I call this:
PShader shader = loadShader(filePath); // A shader is loaded once upon startup
// In a draw() method
shader(shader);
rect(0, 0, screenWidth, screenHeight);
This draws a rectangle which covers the whole screen, and a nice dynamic background is displayed.
Why does the fill() call have no effect and why is the shader drawn in the rectangle instead? How can I keep the background shader and also display a red rectangle in Processing?
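One likely fix, sketched here under the assumption that Processing's standard resetShader() call applies (it restores the default pipeline after a custom shader has been set):

void draw() {
  shader(shader);                        // full-screen background via the custom shader
  rect(0, 0, screenWidth, screenHeight);
  resetShader();                         // back to Processing's default shaders
  fill(255, 0, 0);                       // fill() affects the default pipeline again
  rect(0, 0, 100, 100);                  // red square on top of the shader background
}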
I am trying to create a god's ray effect from scratch using libgdx and the OpenGL shading language. To do this I use a background image as the light source, then I apply another texture as a mask, setting the SpriteBatch color to full black.
background texture
mask texture
The mask is then rendered in full black over the background:
Color batchColor = batch.getColor();
fbo1.begin();
batch.begin();
batch.draw(textureBackground, 0, 0, w, h);
batch.setColor(Color.BLACK);
batch.draw(textureBar, 0, 0, w, h);
batch.setColor(batchColor);
batch.end();
fbo1.end();
Then the god's ray shader is applied:
Sprite rayEffect = new Sprite(fbo1.getColorBufferTexture());
rayEffect.flip(false, true);
fbo2.begin();
batch.setShader(shaderGodRay);
batch.begin();
rayEffect.draw(batch);
batch.end();
batch.setShader(null);
fbo2.end();
The rays are OK at this stage. Now I would like to blend the original mask color with the rendered rays to obtain the final image. But if I simply render the mask again on top, the rays are totally overlapped by the colored mask:
rayEffect = new Sprite(fbo2.getColorBufferTexture());
rayEffect.flip(false, true);
batch.begin();
rayEffect.draw(batch);
batch.draw(textureBar, 0, 0, w, h);
batch.end();
I think alpha blending should do the trick, but on my rendered ray image the opacity is full.
Does someone know how I can blend the two textures together to obtain the desired final result?
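One option worth trying, sketched below with the fbo2 and textureBar names from above (an assumption, not a verified fix): draw the colored mask first, then draw the ray texture over it with additive blending, so the rays brighten the mask instead of replacing it:

rayEffect = new Sprite(fbo2.getColorBufferTexture());
rayEffect.flip(false, true);
batch.begin();
batch.draw(textureBar, 0, 0, w, h);                      // colored mask first
batch.setBlendFunction(GL20.GL_SRC_ALPHA, GL20.GL_ONE);  // additive: rays only add light
rayEffect.draw(batch);
batch.setBlendFunction(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA); // restore the default
batch.end();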
I'm working on a game using DirectX 9. Here's what I'm trying to do:
After the scene is rendered, I want to render a few sprites on top of it: a black cover over the entire scene, plus a few sprites acting as masks that mark where the cover should have holes. So far I've tried messing with the blend mode, but with no luck. My code setting it up looks like this:
D3DD->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
D3DD->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
D3DD->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
I guess the best way would be to multiply each sprite's alpha, but according to http://msdn.microsoft.com/en-us/library/windows/desktop/bb172508%28v=vs.85%29.aspx no such mode is supported. Is there another way to do this?
Edit:
Following Nico Schertler's answer, here's the code I came up with:
LPDIRECT3DTEXTURE9 pRenderTexture;
LPDIRECT3DSURFACE9 pRenderSurface, pBackBuffer;

// create texture
D3DD->CreateTexture(1024, 1024, 1, D3DUSAGE_RENDERTARGET, D3DFMT_R5G6B5,
                    D3DPOOL_DEFAULT, &pRenderTexture, NULL);
pRenderTexture->GetSurfaceLevel(0, &pRenderSurface);

// store old render target - back buffer
D3DD->GetRenderTarget(0, &pBackBuffer);

// set new render target - texture
D3DD->SetRenderTarget(0, pRenderSurface);

// clear texture to opaque black
D3DD->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER, D3DCOLOR_XRGB(0, 0, 0), 32.0f, 0);

// set blending
D3DD->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_ZERO);
D3DD->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_ONE);
D3DD->SetRenderState(D3DRS_SRCBLENDALPHA, D3DBLEND_ZERO);
D3DD->SetRenderState(D3DRS_DESTBLENDALPHA, D3DBLEND_SRCALPHA);
D3DD->SetRenderState(D3DRS_SEPARATEALPHABLENDENABLE, TRUE);

// now I render hole sprites the usual way

// restore back buffer as render target
D3DD->SetRenderTarget(0, pBackBuffer);

// restore blending
D3DD->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
D3DD->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
D3DD->SetRenderState(D3DRS_SRCBLENDALPHA, D3DBLEND_SRCALPHA);
D3DD->SetRenderState(D3DRS_DESTBLENDALPHA, D3DBLEND_INVSRCALPHA);
D3DD->SetRenderState(D3DRS_SEPARATEALPHABLENDENABLE, FALSE);

// draw the mask texture over the scene
ulong color = ulong(-1); // 0xFFFFFFFF, opaque white
Vertex2D v[4];
v[0] = Vertex2D(0, 0, 0);
v[1] = Vertex2D(1023, 0, 0);
v[3] = Vertex2D(1023, 1023, 0);
v[2] = Vertex2D(0, 1023, 0);
D3DD->SetTexture(0, pRenderTexture);
D3DD->SetFVF(D3DFVF_XYZRHW | D3DFVF_DIFFUSE | D3DFVF_TEX1);
D3DD->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, v, sizeof(Vertex2D));
D3DD->SetTexture(0, NULL);

// release used resources
pRenderTexture->Release();
pRenderSurface->Release();
pBackBuffer->Release();
Unfortunately, the app crashes when restoring the old render target. Any advice?
You should first create the mask in a separate texture. Then you can add the holes as needed. Finally, draw the mask on the screen:
1. Initialize the texture.
2. Clear it to opaque black.
3. Draw each hole sprite, using the following blend states:
   D3DRS_SRCBLEND -> D3DBLEND_ZERO (the hole's color does not matter)
   D3DRS_DESTBLEND -> D3DBLEND_ONE (preserve the black color)
   D3DRS_SRCBLENDALPHA -> D3DBLEND_ZERO
   D3DRS_DESTBLENDALPHA -> D3DBLEND_SRCALPHA
   D3DRS_SEPARATEALPHABLENDENABLE -> TRUE
4. Restore default blending (src_alpha / inv_src_alpha).
5. Render the texture as a sprite to the back buffer.
The above blend states assume that the hole sprites are opaque where there should be a hole. The color is then calculated by:
blended color = 0 * hole sprite color + 1 * background color
which should always be black.
And the alpha channel is calculated by:
blended alpha = 0 * hole sprite alpha + (1 - hole sprite alpha) * background alpha
So where the hole sprite is opaque, the blended alpha becomes 0. Where it is transparent, the blended alpha is the previous value. Values in between are blended.
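For example, starting from the opaque black clear (background alpha = 1), a hole sprite texel with alpha = 0.25 gives blended alpha = 0 * 0.25 + (1 - 0.25) * 1 = 0.75: the cover keeps 75% of its opacity there, i.e. a quarter-strength hole.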
I'm making a 2D OpenGL game. When rendering a frame I first need to draw some computed quad geometry and then some textured sprites. When the body of my render method draws only the sprites, everything works fine. However, when I draw my geometric quads before the sprites, the sprite texture takes on the color of the last GL.Color3 call. How do I tell OpenGL (well, OpenTK) "OK, we are done drawing geometry, and it's time to move on to sprites"?
Here is what the render code looks like:
// Let's do some geometry
GL.Begin(BeginMode.Quads);
GL.Color3(_dashboardBGColor); // commenting this out makes my sprites look right
int shakeBuffer = 100;
GL.Vertex2(0 - shakeBuffer, _view.DashboardHeightPixels);
GL.Vertex2(_view.WidthPixelsCount + shakeBuffer, _view.DashboardHeightPixels);
GL.Vertex2(_view.WidthPixelsCount + shakeBuffer, 0 - shakeBuffer);
GL.Vertex2(0 - shakeBuffer, 0 - shakeBuffer);
GL.End();
// Let's do some sprites
GL.Begin(BeginMode.Quads);
GL.BindTexture(TextureTarget.Texture2D, _rockTextureId);
float baseX = 200;
float baseY = 200;
GL.TexCoord2(0, 0); GL.Vertex2(baseX, baseY);
GL.TexCoord2(1, 0); GL.Vertex2(baseX + _rockTextureWidth, baseY);
GL.TexCoord2(1, 1); GL.Vertex2(baseX + _rockTextureWidth, baseY - _rockTextureHeight);
GL.TexCoord2(0, 1); GL.Vertex2(baseX, baseY - _rockTextureHeight);
GL.End();
GL.Flush();
SwapBuffers();
The default texture environment mode is GL_MODULATE, which does exactly that: it multiplies the texture color by the current vertex color.
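Per channel, that is final color = texture color * vertex color, so with GL.Color3(_dashboardBGColor) still in effect from the quad above, the rock texture comes out tinted by the dashboard color.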
An easy solution is to set the vertex color to (1, 1, 1, 1) before drawing a textured primitive; in OpenTK that is GL.Color4(1f, 1f, 1f, 1f), or in raw OpenGL:
glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
Another solution is to change the texture environment mode to GL_REPLACE, which makes the texture color replace the vertex color and avoids the issue altogether:
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);