How to fix alpha of stb_image library - c++

I'm using stb_image.c to load image data for OpenGL.
However, I'm struggling with an issue that occurs when I try to load PNG files with an alpha channel.
The white area should be the transparent one, yet only a small part of it gets cleared.
Any ideas what is causing this behavior?
The Photoshop histogram:

You are doing the Tom Dalling OpenGL tutorial? ^^
I don't know if you are still searching for a solution to this but here I go anyways:
I actually had the same problem while working through that lesson, but it worked for me once I used glBlendFunc with sfactor GL_SRC_ALPHA and dfactor GL_ONE_MINUS_SRC_ALPHA. These are the parameters the OpenGL documentation for glBlendFunc seems to recommend, at least:
Transparency is best implemented using blend function(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) with primitives sorted from farthest to nearest. Note that this transparency calculation does not require the presence of alpha bitplanes in the frame buffer.
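In case it helps, here is a minimal sketch of how that fits together with stb_image; the filename and texture id are placeholders, and forcing 4 channels makes sure the PNG's alpha channel is actually uploaded:

// Placeholder file name; request 4 channels so the alpha channel reaches OpenGL.
int width, height, channels;
unsigned char* pixels = stbi_load("sprite.png", &width, &height, &channels, 4);

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
stbi_image_free(pixels);

// Blending setup recommended above.
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);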

Related

How to draw an OpenGL alphamap with filled foreground and background?

I'm drawing text from a texture atlas containing an alphamap (GL_ALPHA, GL_UNSIGNED_BYTE); it's produced by stb_truetype, if that's any help. Right now I'm drawing with GL_BLEND enabled and the texture with all the default settings. I can set the foreground colour with glColor3f, the background is transparent, and everything works fine.
The problem is, on some platforms drawing transparent text is too slow. And I don't actually need it; the background is always a solid colour. I am, in fact, filling it with glRectf just before drawing the texture.
Experiments have shown that disabling GL_BLEND fixes all my performance issues, but of course that gives me solid blocks of colour instead of text. Looking at the documentation for glTexEnvi, it seems to me that it ought to be possible to draw the texture without blending, getting it to just apply the alpha value to fixed source and destination colours. But I can't figure out how, the documentation is rather opaque, and just trying stuff isn't helping.
(Maybe I need to have GL_BLEND enabled but tell it somehow to use my background colour rather than reading from the framebuffer?)
Can anyone enlighten me?
I am using old-fashioned fixed-pipeline, pre-shader OpenGL, by the way, so I can't simply do all this in a fragment shader.
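For context, a rough sketch of the setup described above; the blend function is an assumption (the question only says GL_BLEND is enabled with default texture settings), and the coordinates, colours, and texture id are placeholders:

float x0 = 0, y0 = 0, x1 = 100, y1 = 20;       // placeholder glyph quad
float u0 = 0, v0 = 0, u1 = 1, v1 = 1;          // placeholder atlas UVs
GLuint atlasTexture = 0;                       // placeholder: the stb_truetype GL_ALPHA atlas

// Solid background, drawn just before the text as described.
glDisable(GL_TEXTURE_2D);
glColor3f(0.1f, 0.1f, 0.3f);                   // hypothetical background colour
glRectf(x0, y0, x1, y1);

// Transparent text on top: foreground colour from glColor3f, alpha from the texture.
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, atlasTexture);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);  // assumed blend function
glColor3f(1.0f, 1.0f, 1.0f);
glBegin(GL_QUADS);
glTexCoord2f(u0, v0); glVertex2f(x0, y0);
glTexCoord2f(u1, v0); glVertex2f(x1, y0);
glTexCoord2f(u1, v1); glVertex2f(x1, y1);
glTexCoord2f(u0, v1); glVertex2f(x0, y1);
glEnd();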

OpenGL : Blending & feedback effect

I'm struggling with a simple project. As an example/sandbox, I'm rendering a small oscillating rectangle in my output. I'm not using glClearColor(); instead, on every frame I draw a black rectangle before anything else, blending with glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
My goal is that, as I play with the alpha of this black rectangle, I get feedback from previous frames slowly fading, some kind of trail, and it's more or less working.
My main problem, though, is that the trail never really disappears, and the longer I try to make the trail, the worse it gets. Also, I have to crank the alpha quite a bit before seeing any trail at all, and I don't really understand why.
The default OpenGL framebuffer only uses 8 bits for each color component. You can increase this by rendering into a custom framebuffer backed by floating-point, 16-bit, or 32-bit components.
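A minimal sketch of such a higher-precision target, assuming half-float (GL_RGBA16F) colour attachments and framebuffer objects are available; the names and sizes are placeholders:

int width = 800, height = 600;                 // placeholder window size

GLuint colorTex, fbo;
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, width, height, 0, GL_RGBA, GL_FLOAT, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, colorTex, 0);

// Draw the fading black rectangle and the oscillating rectangle into this FBO,
// then draw colorTex to the default framebuffer each frame.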
I'm not sure whether skipping glClearColor is the proper way to implement a motion trail. It's possible that the last bit of alpha blending runs into a precision/rounding problem, where 0.9 * 0x01 rounds back to 0x01 for each RGBA octet, so dark values never decay to zero (although I would be surprised if you could actually see the difference, but who knows). If that's not the case, I would switch to a proper glClearColor and then create a trail of boxes similar to how you draw the leading box, with deterministic decay and resource freeing.

Model with transparency

I have a model with transparent quads for a beard. I cannot tell which triangles belong to the beard, because their color comes from the texture passed to the fragment shader. I have tried presorting the triangles back to front during export of the model, but this does not help. I then implemented MSAA and alpha to coverage, but that did not help either. My last attempt was to draw the model with the depth mask off, skipping any transparent data, so the color buffer would have non-clear color values to blend with, and then draw the model a second time with depth testing on, rendering only the alpha pieces.
Nothing I have tried so far has worked. What other techniques can I try to get the beard of the model to properly draw? I am looking for a way to handle this that doesn't use a bunch of extensions. I'd prefer techniques that can be handled with plain old OpenGL 4.
Here is an image of what I am dealing with.
This is what I got after I applied the selected answer.
What you're trying to do there is still a largely unsolved problem: order-independent transparency. MSAA is something entirely different, as is alpha to coverage.
So far the best working solution is to separate the model into an opaque and a hairy part. Draw the opaque parts of your scene first, then draw everything (semi-)translucent, ordered far to near in a second pass.
From the way your image looks, it seems the beard is rendered first, which is quite the opposite of what you actually want.
Simple way:
Enable depth write (depth mask), disable alpha-blending, draw model without the beard.
Disable depth write, enable alpha-blending, draw the beard. Make sure face culling is enabled.
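A sketch of that two-pass ordering; drawOpaqueParts() and drawBeard() are placeholders for however the model is actually submitted:

// Pass 1: opaque geometry, depth writes on, no blending.
glEnable(GL_DEPTH_TEST);
glDepthMask(GL_TRUE);
glDisable(GL_BLEND);
drawOpaqueParts();      // placeholder: everything except the beard

// Pass 2: the beard, depth test still on but depth writes off, blending on.
glDepthMask(GL_FALSE);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_CULL_FACE); // face culling, as suggested above
drawBeard();            // placeholder: the transparent beard quads

glDepthMask(GL_TRUE);   // restore state for the next frame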
Hard way:
Because order-independent transparency in renderers that use a z-buffer is an unsolved problem (as datenwolf said), you could try depth peeling. I believe the paper is available within the OpenGL SDK. Most likely it'll be slower than the "simple way", and there'll be a limit on the maximum number of overlapping transparent polygons. Also check the Wikipedia article on order-independent transparency.

Render Textures at same position with one as partial mask

With OpenGL, is there any way to render two textures at the same position and blend them together with alpha blending so that one appears on top of the other? I am trying to make it so that my back texture can be dynamic, while the secondary texture has a 'window' that shows the texture 'behind' it. I have done quite a bit of research and have tried several combinations of glDepthFunc, glBlendFunc, etc., but have not found any combination that works. I am guessing this is possible, but I just haven't found the trick.
It's been a while, but I'll try to help some.
First you have to have GL_BLEND enabled:
glEnable(GL_BLEND);
and then I usually follow that with
glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA);
But I usually use Delphi, so I'm not sure how closely this will help you.
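Putting that together, a rough sketch of drawing the two textures at the same position, back texture first and the 'window' texture on top; the texture ids and the drawQuad() helper are placeholders:

glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDepthMask(GL_FALSE);                        // or glDisable(GL_DEPTH_TEST), so the second quad at the same z isn't rejected

glBindTexture(GL_TEXTURE_2D, backTexture);    // placeholder: the dynamic back texture
drawQuad();                                   // placeholder: draws the shared quad

glBindTexture(GL_TEXTURE_2D, frontTexture);   // placeholder: the texture with the transparent 'window'
drawQuad();

glDepthMask(GL_TRUE);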

OpenGL Rendering Transparent .png with Random White Pixels

I am working on a game with a friend, and we are using OpenGL, GLUT, DevIL, and C++ to render everything. Simply put, most of the .pngs we are using render properly, but there are random pixels showing up as white.
These pixels fall into two categories. The first is pixels on the edge of the image, resulting from the anti-aliasing applied by Photoshop's stroke feature (which I am trying to fix). The second is more mysterious: when the enemy is standing still the texture looks fine, but as soon as it jumps, a random white line appears on top of it.
The line on top is of varying solidity (this shot is not the most solid)
It seems like a blending issue, but I am not that familiar with the way OpenGL handles transparency (our transparency code was learned from other questions on Stack Overflow, though I couldn't find anything on this particular issue). I am hoping something will fix both issues, but I am more worried about the second.
Our current setup code:
glEnable (GL_BLEND);
glBlendFunc (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_TEXTURE_2D);
glDisable(GL_DEPTH_TEST);
Transparent areas of a bitmap also have a color. If a pixel is 100% transparent, you usually can't see that color. Photoshop usually fills these areas with white.
If you are using minification or magnification filters other than GL_NEAREST, then you will get interpolation. If you interpolate between two pixels, where one is blue and opaque and the other is white and transparent, you will get something that is 50% transparent and light blue. You may also get the same problem with mipmaps, as interpolation is used there too. If you use mipmaps, one solution is to generate them yourself; that way, you can ignore the transparent areas when doing the interpolation. See some good explanations here: http://answers.unity3d.com/questions/10302/messy-alpha-problem-white-around-edges.html
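If the bleed is coming from filtering, the quickest thing to test is turning interpolation off entirely; a sketch, with the texture id as a placeholder (this trades the white halo for blocky scaling):

glBindTexture(GL_TEXTURE_2D, enemyTexture);   // placeholder texture id
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);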
Why are you using PNG files? You save some disk space, but you need to include complex libraries like DevIL. You don't save any space in the delivery of an application, as most tools that create delivery packages have very efficient compression. And you don't save any memory on the GPU, which may be the most critical resource.
This looks like an artifact in your source PNG. Are you sure there are no such light opaque pixels there?
The white line appearing on top could be UV interpolation bleeding in from a neighbouring texture in your texture atlas (or from padding, if you pad your NPOT textures to POT with white opaque pixels). That's why you usually need to pad textures with at least one duplicated edge pixel in every direction. That won't help with mipmaps, though, as Lars said; you might need to use custom mipmap generation or drop mipmaps altogether.
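As a sketch of that padding idea, assuming RGBA8 pixel data, here is one way to copy an image into a buffer one pixel larger on every side while duplicating the edge pixels (the function name is illustrative):

#include <algorithm>
#include <vector>

// Copies src (w x h, RGBA8) into a (w+2) x (h+2) buffer, repeating the border pixels.
std::vector<unsigned char> padWithEdgePixels(const unsigned char* src, int w, int h)
{
    const int pw = w + 2, ph = h + 2;
    std::vector<unsigned char> dst(pw * ph * 4);
    for (int y = 0; y < ph; ++y) {
        int sy = std::min(std::max(y - 1, 0), h - 1);     // clamp to the source row
        for (int x = 0; x < pw; ++x) {
            int sx = std::min(std::max(x - 1, 0), w - 1); // clamp to the source column
            for (int c = 0; c < 4; ++c)
                dst[(y * pw + x) * 4 + c] = src[(sy * w + sx) * 4 + c];
        }
    }
    return dst;
}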