How to render a texture with alpha?
I have a texture, and need to render it with different alpha values at different locations. Any way to do so? (My texture is GL_RGBA)
If it's not possible to change the alpha value on the fly, do I have to create different textures for different alpha levels?
First, make sure that your texture has an alpha channel. You mention you are loading an RGBA format, but it's always good to check the original file in an image editing program. Then make sure your texture is ready for rendering in OpenGL. A common mistake is forgetting to set up the texture's filtering mode through glTexParameter*. The minification filter defaults to a setting that requires mipmaps, so I find it easiest to start with:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
Secondly, you will need to set up OpenGL to be ready for blending. This involves a glEnable call with GL_BLEND and a glBlendFunc call. Most of the time, you will want the function call to be glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), as most other combinations of tokens will give you effects you are not after (see the glBlendFunc spec page for more info).
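For example, a minimal blending setup, done once before you draw your textured geometry, might look like this:
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);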
Finally, ensure you are sampling your texture at different points. If you are using immediate mode (you are using glVertex* to draw your scene), you will need to either use glTexGen* or manually specify texture coordinates using glTexCoord* before calls to glVertex*. If using array data to draw your scene, make sure you have enabled the texture coordinate array using glEnableClientState(GL_TEXTURE_COORD_ARRAY) and set it with glTexCoordPointer.
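In immediate mode, for instance, texturing a quad might look like this sketch (the coordinates are placeholder values):
glBegin(GL_QUADS);
glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);  /* lower left */
glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);  /* lower right */
glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);  /* upper right */
glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);  /* upper left */
glEnd();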
Your texture is GL_RGBA so it has a different alpha value for each texel.
If you want to change the alpha value used for render, I can think of the following methods:
Change the texture's alpha values directly (though you seem to suggest you don't want to do that).
Use glColor4f to change the alpha value of the vertices. It will multiply the texture values (with the default GL_MODULATE texture environment). You may need glEnable(GL_COLOR_MATERIAL) and/or glColorMaterial() if lighting is enabled.
Use a vertex shader to change the vertex alpha values. It will multiply the texture values.
Use a fragment shader to change the sampled texture values on the fly (see the sketch after this list).
Use two texture stages and multiply them. The second one will have the modified alpha values (see glActiveTexture() and friends).
Use a fragment shader and two (or more) texture stages. This is the coolest!
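As a sketch of the fragment shader approach, in old-style GLSL to match the fixed-function setup above, and assuming a uniform named alphaScale that you set from the application:
uniform sampler2D tex;
uniform float alphaScale;

void main()
{
    vec4 color = texture2D(tex, gl_TexCoord[0].st);
    gl_FragColor = vec4(color.rgb, color.a * alphaScale);  /* scale only the alpha */
}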
I'm trying to render a model in OpenGL. I'm on Day 4 of C++ and OpenGL (Yes, I have learned this quickly) and I'm at a bit of a stop with textures.
I'm having a bit of trouble making my texture alpha work. In this image, I have this character from Spiral Knights. As you can see on the top of his head, there are those white portions.
I've got blending enabled and my blend function set to glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
What I'm assuming here, and this is why I ask this question, is that the texture transparency is working, but the triangles behind the texture are still showing.
How do I make those triangles invisible but still show my texture?
Thanks.
There are two important things to be done when using blending:
You must sort primitives back to front and render them in that order (order-independent transparency in depth-buffer-based renderers is still an ongoing research topic).
When using textures to control the alpha channel, you must either write a shader that passes the texture's alpha values through to the resulting fragment color, or – if you're using the fixed function pipeline – use the GL_MODULATE texture env mode, or GL_DECAL with the primitive color alpha value set to 0, or GL_REPLACE, as shown below.
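With the fixed function pipeline, a minimal sketch of that setup might look like this (sorting the draw calls back to front is left to your application):
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
/* ... draw transparent primitives here, sorted back to front ... */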
I'm learning about framebuffers right now and I just don't understand what the color attachment does. I understand framebuffers in general.
What is the point of the second parameter in:
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, textureColorBuffer, 0);
Why doesn't anything draw to my frame buffer when I change it to COLOR_ATTACHMENT1?
How could I draw to the frame buffer by setting the texture to Color attachment 1?
Why would using multiple color attachments be useful?
Is it similar to the glActiveTexture(GL_TEXTUREi) concept of drawing multiple textures at once?
I'm just trying to understand more OpenGL. Thanks.
Yes, a framebuffer can have multiple color attachments, and the second parameter to glFramebufferTexture2D() controls which of them you're setting. The maximum number of supported color attachments can be queried with:
GLint maxAtt = 0;
glGetIntegerv(GL_MAX_COLOR_ATTACHMENTS, &maxAtt);
The minimum number supported by all implementations is 8.
To select which buffer(s) you want to render to, you use glDrawBuffer() or glDrawBuffers(). The only difference between these two calls is that the first one allows you to specify only one buffer, while the second one supports multiple buffers.
In the example you tried, you can render to GL_COLOR_ATTACHMENT1 by using either:
glDrawBuffer(GL_COLOR_ATTACHMENT1);
or:
GLenum bufs[1] = {GL_COLOR_ATTACHMENT1};
glDrawBuffers(1, bufs);
The default is GL_COLOR_ATTACHMENT0, which explains why you had success rendering to attachment 0 without ever using this call. The list of draw buffers is part of the FBO state.
To produce output for multiple color buffers, you define multiple outputs in the fragment shader.
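For example, a fragment shader that writes to two color attachments might look like this sketch (the output names are arbitrary):
#version 330 core
layout(location = 0) out vec4 outColor0;  // routed to glDrawBuffers entry 0
layout(location = 1) out vec4 outColor1;  // routed to glDrawBuffers entry 1

void main()
{
    outColor0 = vec4(1.0, 0.0, 0.0, 1.0);
    outColor1 = vec4(0.0, 1.0, 0.0, 1.0);
}
This would be paired with a glDrawBuffers() call listing both attachments.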
One popular use for multiple color buffers (often referred to by the acronym MRT = Multiple Render Targets) is deferred shading, where the interpolated vertex attributes used for the shading calculation are stored in a number of buffers in the initial rendering pass. The actual shading calculation is then performed only for the visible pixels in a second pass, using the attribute values from the buffers produced in the first pass. See for example http://ogldev.atspace.co.uk/www/tutorial35/tutorial35.html for a more complete explanation of how this works.
I have two 2D textures. The first, an MSAA texture, uses a target of GL_TEXTURE_2D_MULTISAMPLE. The second, a non-MSAA texture, uses a target of GL_TEXTURE_2D.
According to OpenGL's spec on ARB_texture_multisample, only GL_NEAREST is a valid filtering option when the MSAA texture is being drawn to.
In this case, both of these textures are attached to GL_COLOR_ATTACHMENT0 via their individual Framebuffer objects. Their resolutions are also the same size (to my knowledge this is necessary when blitting an MSAA to non-MSAA).
So, given the current constraints, if I blit the MSAA holding FBO to the non-MSAA holding FBO, do I still need to use GL_NEAREST as the filtering option, or is GL_LINEAR valid, since both textures have already been rendered to?
The filtering options only come into play when you sample from the textures. They play no role while you render to the texture.
When sampling from multisample textures, GL_NEAREST is indeed the only supported filter option. You also need to use a special sampler type (sampler2DMS) in the GLSL code, with corresponding sampling instructions.
I actually can't find anything in the spec saying that setting the filter to GL_LINEAR for multisample textures is an error. But the filter is not used at all. From the OpenGL 4.5 spec:
When a multisample texture is accessed in a shader, the access takes one vector of integers describing which texel to fetch and an integer corresponding to the sample numbers described in section 14.3.1 determining which sample within the texel to fetch. No standard sampling instructions are allowed on the multisample texture targets, and no filtering is performed by the fetch.
For blitting between multisample and non-multisample textures with glBlitFramebuffer(), the filter argument can be either GL_LINEAR or GL_NEAREST, but it is ignored in this case. From the 4.5 spec:
If the read framebuffer is multisampled (its effective value of SAMPLE_BUFFERS is one) and the draw framebuffer is not (its value of SAMPLE_BUFFERS is zero), the samples corresponding to each pixel location in the source are converted to a single sample before being written to the destination. filter is ignored.
This makes sense because there is a restriction in this case that the source and destination rectangle need to be the same size:
An INVALID_OPERATION error is generated if either the read or draw framebuffer is multisampled, and the dimensions of the source and destination rectangles provided to BlitFramebuffer are not identical.
Since the filter is only applied when the image is stretched, it does not matter in this case.
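For reference, such a resolve blit might look like this sketch (msaaFBO, resolveFBO, width, and height are placeholder names):
glBindFramebuffer(GL_READ_FRAMEBUFFER, msaaFBO);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, resolveFBO);
/* filter may be GL_NEAREST or GL_LINEAR; it is ignored for the multisample resolve */
glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                  GL_COLOR_BUFFER_BIT, GL_NEAREST);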
In order to implement "depth-peeling", I render my OpenGL scene into a series of framebuffers, each equipped with an RGBA color texture and a depth texture. This works fine if I don't care about anti-aliasing. If I do, then it seems the correct thing to do is enable GL_MULTISAMPLE and use a GL_TEXTURE_2D_MULTISAMPLE texture instead of GL_TEXTURE_2D. But I'm confused about which other calls need to be replaced.
In particular, how should I adapt my framebuffer construction to use glTexImage2DMultisample instead of glTexImage2D?
Do I need to change the calls to glFramebufferTexture2D beyond using GL_TEXTURE_2D_MULTISAMPLE instead of GL_TEXTURE_2D?
If I'm rendering both color and depth into textures, do I need to make a call to glRenderbufferStorageMultisample?
Finally, is there some glBlit* that I need to do in addition to setting up textures for the framebuffer to render into?
There are many related questions on this topic, but none of the solutions I found seem to point to a canonical tutorial or clear example putting all these together.
While I have only used multisampled FBO rendering with renderbuffers, not textures, the following is my understanding.
Do I need to change the calls to glFramebufferTexture2D beyond using GL_TEXTURE_2D_MULTISAMPLE instead of GL_TEXTURE_2D?
No, that's all you need. You create the texture with glTexImage2DMultisample(), and then attach it using GL_TEXTURE_2D_MULTISAMPLE as the 3rd argument to glFramebufferTexture2D(). The only constraint is that the level (5th argument) has to be 0.
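For example, with a sample count of 4 (fbo, width, and height are placeholder variables):
GLuint tex = 0;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D_MULTISAMPLE, tex);
glTexImage2DMultisample(GL_TEXTURE_2D_MULTISAMPLE, 4, GL_RGBA8, width, height, GL_FALSE);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D_MULTISAMPLE, tex, 0);  /* level must be 0 */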
If I'm rendering both color and depth into textures, do I need to make a call to glRenderbufferStorageMultisample?
Yes. If you attach a depth buffer to the same FBO, you need to use a multisampled renderbuffer, with the same number of samples as the color buffer. So you create your depth renderbuffer with glRenderbufferStorageMultisample(), passing in the same sample count you used for the color buffer.
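A sketch of that, using the same sample count of 4 as the color buffer above:
GLuint depthRb = 0;
glGenRenderbuffers(1, &depthRb);
glBindRenderbuffer(GL_RENDERBUFFER, depthRb);
glRenderbufferStorageMultisample(GL_RENDERBUFFER, 4, GL_DEPTH_COMPONENT24, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRb);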
Finally, is there some glBlit* that I need to do in addition to setting up textures for the framebuffer to render into?
Not for rendering into the framebuffer. Once you're done rendering, you have a couple of options:
You can downsample (resolve) the multisample texture to a regular texture, and then use the regular texture for your subsequent rendering. For resolving the multisample texture, you can use glBlitFramebuffer(), where the multisample texture is attached to the GL_READ_FRAMEBUFFER, and the regular texture to the GL_DRAW_FRAMEBUFFER.
You can use the multisample texture directly for your subsequent rendering. You will need to use the sampler2DMS type for the samplers in your shader code, with the corresponding sampling functions (see the sketch below).
For option 1, I don't really see a good reason to use a multisample texture. You might just as well use a multisample renderbuffer, which is slightly easier to use, and should be at least as efficient. For this, you create a renderbuffer for the color attachment, and allocate it with glRenderbufferStorageMultisample(), very much like what you need for the depth buffer.
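For option 2, the shader side might look like this sketch (msTex is an assumed sampler name; the third texelFetch argument selects the sample):
#version 330 core
uniform sampler2DMS msTex;
out vec4 fragColor;

void main()
{
    ivec2 coord = ivec2(gl_FragCoord.xy);   // integer texel coordinates
    fragColor = texelFetch(msTex, coord, 0); // fetch sample 0; loop over samples to average
}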
In a game I'm writing, I have a level, which is properly rendered to the on-screen render buffer provided to me by the OS. I can also render this to a framebuffer, then render this framebuffer onto the output render buffer.
To add a background, I want to render a different scene, an effect, or whatever to a second framebuffer, then have this "show through" wherever the framebuffer containing the level has no pixel set, i.e. the alpha value is 0. I think this is called alpha blending.
How would I go about doing this with OpenGL? I think glBlendFunc could be used to achieve this, but I am not sure how I can couple this with the framebuffer drawing routines to properly achieve the result I want.
glBlendFunc allows the application to blend (merge) the output of all your current draw operations (say, X) with the current "display" framebuffer (say, Y) that already exists, i.e.:
new display output = X (blend) Y
You can control the blend function through GL, as for example in this snippet:
glEnable(GL_BLEND);
glBlendEquation(GL_FUNC_ADD);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
Full usage is shown here:
https://github.com/prabindh/sgxperf/blob/master/sgxperf_test4.cpp
Note that the concepts of "showing through" and "blending" are a little different; you might just want to stick with "per-pixel alpha blending" terminology.
FBOs are just containers, not storage. What you need to do is attach a texture to each FBO and render your output to that texture. Once you have done this, you can use your output textures on a fullscreen quad and do whatever you want with your blending.
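As a sketch of that final composite pass (backgroundTex, levelTex, and drawFullscreenQuad are hypothetical names, not part of any API):
/* 1. Draw the background texture; no blending needed for the bottom layer. */
glDisable(GL_BLEND);
glBindTexture(GL_TEXTURE_2D, backgroundTex);
drawFullscreenQuad();  /* hypothetical helper drawing a textured fullscreen quad */

/* 2. Draw the level texture on top; its alpha controls what shows through. */
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBindTexture(GL_TEXTURE_2D, levelTex);
drawFullscreenQuad();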