Stencilling a render onto an unknown curved surface - opengl

We want to decal multiple irregular textures onto a curved surface (a mesh with xyz vertices and uv coordinates specified at each vertex). We are loading the mesh from a model file and have no a priori knowledge of the surface... all we know is that it will have a "reasonable" uv mapping. We want to select a few uv regions and apply textures to them. Each region is specified by a bounding poly in uv coordinates. We don't know the equivalent xyz poly in this case, or I think the answer would be simple.
We have this working for flat surfaces and also for simple cylindrical surfaces (which we approximate as a series of flat stripes, smoothed by averaging the normals). In both cases we know a unique mapping from uv to xyz, so we set up the stencil buffer to limit drawing to the desired uv region by drawing the equivalent xyz poly into the stencil buffer before binding a texture and drawing the real surface.
We also use rgba transparency within the textures when decaling them onto the surface. Typically each textured region is a small rotated rectangle, so we draw its four vertices to the stencil buffer, use the texture matrix to rotate the texture, and rely on the rgba transparency within the texture to ensure only the right part of it is applied. This all works nicely.
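For reference, a bare-bones version of our stencil pass looks roughly like this (simplified; drawRegionPolyXYZ, drawSurface and decalTex are placeholder names, and a context with a stencil buffer is assumed):
// Pass 1: write the decal region into the stencil buffer only.
glClear(GL_STENCIL_BUFFER_BIT);
glEnable(GL_STENCIL_TEST);
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
glDepthMask(GL_FALSE);
glStencilFunc(GL_ALWAYS, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
drawRegionPolyXYZ();                        // the xyz poly equivalent to the uv region
// Pass 2: draw the textured surface only where the stencil was set.
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glDepthMask(GL_TRUE);
glStencilFunc(GL_EQUAL, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
glBindTexture(GL_TEXTURE_2D, decalTex);     // texture matrix rotation set up elsewhere
drawSurface();
glDisable(GL_STENCIL_TEST);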
We would like to reuse this working code, but now apply the textures to an arbitrary curved surface/mesh. We are already loading and drawing these models, and we can apply textures to whole faces [i.e. uv goes from (0,0) to (1,1)]. Now we want to extend this and apply "placed" textures to regions of each surface.
We thought it might be possible to draw the uv poly to the stencil buffer directly, without even knowing the equivalent xyz poly... then all the existing code would work. Perhaps we could use a trick like a frame buffer object: do the initial draw of the stencil poly into that, then use it as the stencil during the "real" draw of the curved surface mesh. Would that be a good approach, or is there a better way?
Any advice or url links to relevant samples welcome...
PS: We have looked at these threads... sort of relevant, but not quite the same problem, I think:
Binding a stencil render buffer to a frame buffer in opengl
Visualizing the Stencil Buffer to a texture
I am currently looking at some working FBO setup/usage code I have for off-screen shadow mapping and trying to adapt it to this seemingly simpler situation. The part I'm unclear on is the sequence of GL setup calls needed. Here's an extract of the hardware-shadowing FBO setup with bits chopped out and ?? markers added... any help on the correct sequence would be appreciated.
glBindTexture(GL_TEXTURE_2D, tex);
// ?? presumably NOT the depth allocation used for shadows:
//   glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT32, shadowsize, shadowsize, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
// ?? but a more normal allocation appropriate to drawing RGBA textures
glBindTexture(GL_TEXTURE_2D, 0);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, m_Framebuffer);
// Attach everything; tell the FBO there will be a draw buffer, unlike the shadow texture draw
glDrawBuffer(GL_NONE);   // no color buffer dest... ?? or should this be GL_COLOR_ATTACHMENT0_EXT?
glReadBuffer(GL_NONE);   // no color buffer src  ... ?? is this call wrong / needed at all?
// ?? and presumably attach as color rather than depth:
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_TEXTURE_2D, tex, 0);
Note: tex and m_Framebuffer are ints holding a correctly allocated texture id and framebuffer id; I think that bit is OK. My main points of confusion are:
The shadow code does glBindTexture, glTexImage2D, then glBindTexture back to 0: is it correct to release the texture binding this early?
Are the glDrawBuffer and glReadBuffer calls required?
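For what it's worth, a typical EXT_framebuffer_object setup with an RGBA color attachment (as opposed to the depth-only shadow attachment) goes roughly as follows. This is a generic sketch rather than a tested answer; texsize is a placeholder, and tex / m_Framebuffer are assumed to be the ids mentioned above:
// Allocate an ordinary RGBA texture to render into.
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, texsize, texsize, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glBindTexture(GL_TEXTURE_2D, 0);  // releasing early is fine: attachment is by texture id, not by current binding
// Attach it as a color attachment and route drawing to it.
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, m_Framebuffer);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_2D, tex, 0);
glDrawBuffer(GL_COLOR_ATTACHMENT0_EXT);  // unlike the shadow case, we do want a color destination
// glReadBuffer only matters if you read pixels back from this FBO.
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);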

Related

Render to a layer of a texture array in OpenGL

I use OpenGL 3.2 to render shadow maps. For this, I construct a framebuffer that renders to a depth texture.
To attach the texture to the framebuffer, I use:
glFramebufferTexture2D( GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, shdw_texture, 0 );
This works great. After rendering the light view, my GLSL shader can sample the depth texture to solve visibility of light.
The problem I am trying to solve now, is to have many more shadow maps, let's say 50 of them. In my main render pass I don't want to be sampling from 50 different textures. I could use an atlas, but I wondered: could I pass all these shadow maps as slices from a 2D texture array?
So, somehow create a GL_TEXTURE_2D_ARRAY with a DEPTH format, and bind one layer of the array to the framebuffer?
Can framebuffers be backed for DEPTH by a texture array layer, instead of just a depth texture?
In general, you need to distinguish whether you want to create a layered framebuffer (see Layered Images) or whether you want to attach a single layer of a multilayered texture to a framebuffer.
Use glFramebufferTexture3D to attach a layer of a 3D texture (GL_TEXTURE_3D) to a framebuffer, or use glFramebufferTextureLayer to attach a single layer of a three-dimensional or array texture to the framebuffer. In either case the last argument specifies the layer of the texture.
Layered attachments can be attached with glFramebufferTexture. See Layered rendering.
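For the single-layer case, the sequence might look roughly like this (a sketch only; the sizes, shadowFBO and the light index i are placeholders):
// Allocate a depth-format array texture with one layer per shadow map.
GLuint shadowArray;
glGenTextures(1, &shadowArray);
glBindTexture(GL_TEXTURE_2D_ARRAY, shadowArray);
glTexImage3D(GL_TEXTURE_2D_ARRAY, 0, GL_DEPTH_COMPONENT24,
             1024, 1024, 50,               // width, height, layers
             0, GL_DEPTH_COMPONENT, GL_FLOAT, NULL);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
// Before rendering light i's shadow map, attach layer i as the depth attachment.
glBindFramebuffer(GL_FRAMEBUFFER, shadowFBO);
glFramebufferTextureLayer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, shadowArray, 0, i);
glDrawBuffer(GL_NONE);                     // depth-only pass
In the main pass the whole array can then be bound once and sampled as a sampler2DArray (or sampler2DArrayShadow), with the layer index supplied as the third texture coordinate.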

Why can't I render my depth map on a quad?

Some intro:
I'm currently trying to see how I can convert a depth map into a point cloud. In order to do this, I render a scene as usual and produce a depth map. From the depth map I try to recreate the scene as a point cloud from the given camera angle.
In order to do this I created a FBO so I can render my scene's depth map on a texture. The depth map is rendered on the texture successfully. I know it is done because I'm able to generate the point cloud from the depth texture using glGetTexImage and converting the data acquired.
The problem:
For presentation purposes, I want the depth map to be visible in a separate window. So I just created a simple shader to draw the depth map texture on a quad. However, instead of the depth texture, the quad shows a texture that was previously bound with glBindTexture. For example:
glUseProgram(simpleTextureViewerProgram);
glBindVertexArray(quadVAO);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D,randomTexture);
glBindTexture(GL_TEXTURE_2D, depthTexture);
glUniform1i(quadTextureSampler, 0);
glDrawArrays(GL_TRIANGLES, 0, 6);
The code above renders "randomTexture" on the quad instead of "depthTexture". As I said earlier, "depthTexture" is the one I use in glGetTexImage, so it does contain the depth map.
I may be wrong, but if I had to guess, the last glBindTexture call fails, and the problem is that "depthTexture" is not an RGB texture but a depth-component texture. Is this the reason? How can I draw my depth map on the quad then?
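One state combination worth checking (an assumption based on typical shadow-map setups, not something confirmed in the post): a GL_DEPTH_COMPONENT texture can only be sampled through an ordinary sampler2D if its compare mode is GL_NONE and its filtering does not require mipmaps it doesn't have. Roughly:
// Hypothetical check: make the depth texture sampleable like a plain texture.
glBindTexture(GL_TEXTURE_2D, depthTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_NONE);   // no shadow comparison
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);  // no mipmaps required
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
The quad shader then receives the depth value in the red channel of the sampled texel.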

LibGDX texture blending with OpenGL blending function

In libGDX, I'm trying to create a shaped texture: take a fully visible rectangular texture and mask it to obtain a shaped texture, as shown here:
Here I test it on a rectangle, but I will want to use it on any shape. I have looked into this tutorial and came up with the idea of first drawing the texture and then the mask, with this blending function:
batch.setBlendFunction(GL20.GL_ZERO, GL20.GL_SRC_ALPHA);
GL20.GL_ZERO - because I really don't want to paint any pixels from the mask
GL20.GL_SRC_ALPHA - from the original texture I want to keep only those pixels where the mask was visible (= white).
Crucial part of the test code:
batch0.enableBlending();
batch0.begin();
batch0.draw(original, 0, 0); //to see the original
batch0.draw(mask, width1, 0); //and the mask
batch0.draw(original, 0, height1); //base for the result
batch0.setBlendFunction(GL20.GL_ZERO, GL20.GL_SRC_ALPHA);
batch0.draw(mask, 0, height1); //draw mask on result
batch0.setBlendFunction(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);
batch0.end();
The center of the texture gets selected well, but instead of a transparent color around it, I see black:
Why is the result black and not transparent?
(Full code - Warning: very messy)
What you're trying to do looks like a pretty clever use of blending. But I believe the exact way you apply it is "broken by design". Let's walk through the steps:
You render your background with red and green squares.
You render an opaque texture on top of your background.
You erase parts of the texture you rendered in step 2 by applying a mask.
The problem is that for the parts you erase in step 3, the previous background does not come back. It can't: the background of the whole texture area was replaced in step 2, and once it's gone there's no way to bring it back.
Now the question is of course how you can fix this. There are two conventional approaches I can think of:
You can combine the texture and mask by rendering them into an off-screen framebuffer object (FBO). You perform steps 1 and 2 as you do now, but render into an FBO with a texture attachment. The texture you rendered into is then a texture with alpha values that reflect your mask, and you can use this texture to render into your default framebuffer with standard blending.
You can use a stencil buffer. Masking out parts of rendering is a primary application of stencil buffers, and using stencil would definitely be a very good solution for your use case. I won't elaborate on the details of how exactly to apply stencil buffers to your case in this answer. You should be able to find plenty of examples both online and in books, including in other answers on this site, if you search for "OpenGL stencil". For example this recent question deals with doing something similar using a stencil buffer: OpenGL stencil (Clip Entity).
So those would be the standard solutions. But inspired by the idea in your attempt, I think it's actually possible to get this to work with just blending. The approach that I came up with uses a slightly different sequence and different blend functions. I haven't tried this out, but I think it should work:
You render the background as before.
Render the mask. To prevent it from wiping out the background, disable writing to the color components of the framebuffer, and only write to the alpha component. This leaves the mask in the alpha component of the framebuffer.
Render the texture, using the alpha component from the framebuffer (DST_ALPHA) for blending.
You will need a framebuffer with an alpha component for this to work. Make sure that you request alpha bits for your framebuffer when setting up your context/surface.
The code sequence would look like this:
// Draw background.
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE);
glDisable(GL_BLEND);
// Draw mask.
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glEnable(GL_BLEND);
glBlendFunc(GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA);
// Draw texture.
A very late answer, but with the current version this is very easy. You simply draw the mask, set the blend function so that the source color is multiplied into the destination, and draw the original. You'll only see the original image where the mask is.
//create batch with blending
SpriteBatch maskBatch = new SpriteBatch();
maskBatch.enableBlending();
maskBatch.begin();
//draw the mask
maskBatch.draw(mask);
//store original blending and set correct blending
int src = maskBatch.getBlendSrcFunc();
int dst = maskBatch.getBlendDstFunc();
maskBatch.setBlendFunction(GL20.GL_ZERO, GL20.GL_SRC_COLOR);
//draw original
maskBatch.draw(original);
//reset blending
maskBatch.setBlendFunction(src, dst);
//end batch
maskBatch.end();
If you want more info on the blending options, check How to do blending in LibGDX

opengl - how to put texture on 3d irregular object

I have to create an animation where a Gatling gun shoots (it doesn't have to be complex, since it's just practice). I drew a basic version of my gun, which looks like this:
Don't mind the colors - I made them like that to be able to see where the edges of the particular parts of the gun are. Now I would like to make it look better by using some texture - camo or something like a metallic color - example 1 or example 2. I know how to load a texture and how to use it for 2D objects, but I have no idea whether it is possible to use one texture for my whole drawing, or whether I have to apply a texture to every part separately. This is my code that loads a texture from a BMP file and makes it available for use:
void initTexture(string fileName)
{
    loadBmp(fileName.c_str());
    textureId = 13;
    glBindTexture(GL_TEXTURE_2D, textureId);  // Tell OpenGL which texture to edit
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    // Map the image to the texture
    glTexImage2D(GL_TEXTURE_2D,                           // Always GL_TEXTURE_2D
                 0,                                       // 0 for now
                 GL_RGB,                                  // Format OpenGL uses for the image
                 tex.info.biWidth, tex.info.biHeight,     // Width and height
                 0,                                       // The border of the image
                 GL_RGB,                                  // GL_RGB, because pixels are stored in RGB format
                 GL_UNSIGNED_BYTE,                        // GL_UNSIGNED_BYTE, because pixels are stored
                                                          // as unsigned numbers
                 tex.px);                                 // The actual pixel data
}
loadBmp() is the function that loads the bitmap file. I tried searching the Internet and Stack Overflow, but all the examples were about cubes or spheres, which doesn't help me. How can I put a texture on this drawing?
Texture mapping requires you to (manually) assign texture coordinates to each vertex. There are some approaches to automatic texture coordinate generation, or to getting away without texture coordinates by giving each face an individual pixel (Disney Animation pioneered the latter method for their computer-animated films).
Since you didn't specify which program you used for modelling, I'll refer you to some tutorials on texture UV mapping for Blender:
http://wiki.blender.org/index.php/Doc:2.6/Manual/Textures/Mapping/UV/Unwrapping
Please don't tell me you did "code" your gun, because this is wrong, wrong, wrong!
Once you have the texture coordinates, you pass them to OpenGL as just another vertex attribute.
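In the legacy immediate-mode style your loader code suggests, that looks roughly like the following for a single textured quad of the model (the vertex positions and uv values are placeholders taken from whatever unwrap you create):
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, textureId);
glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex3f(x0, y0, z0);  // uv from the unwrap, then the position
    glTexCoord2f(1.0f, 0.0f); glVertex3f(x1, y1, z1);
    glTexCoord2f(1.0f, 1.0f); glVertex3f(x2, y2, z2);
    glTexCoord2f(0.0f, 1.0f); glVertex3f(x3, y3, z3);
glEnd();
With vertex arrays or VBOs the idea is the same: the uv pair rides along with each vertex as an extra attribute.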

Manual GL_REPEAT in GLSL

Currently I have a 2048 x 2048 pixel texture atlas with three 512 x 512 textures stored in it, and I am only applying one of them to the object. So I used the following code to map the texture coordinates (which run from zero to one) to the correct position on the atlas for that texture:
color = texture2D(tex_0, vec2(0.0, 1024.0/2048.0) + mod(texture_coordinate*vec2(40.0), vec2(1.0))*vec2(512.0/2048.0));
The problem is that when I apply this, there is a black border around the texture. I presume this is because OpenGL cannot filter correctly at that border and blends in texels from the neighbouring atlas entries.
So how do I get rid of the border?
Edit:
I have already tried moving the starting and ending boundaries inward, toward the center of the texture, and that didn't work.
Edit:
I found the source of the problem: the automatic mipmap generation is blending the textures in the texture atlas together. This means I have to write my own mipmapping function (as far as I can tell).
If anyone has any better ideas, please do post.
Instead of using a normal 2D texture as a texture atlas with a grid of textures, I used the GL_TEXTURE_2D_ARRAY functionality to create an array texture that mipmapped correctly and repeated correctly. That way the textures did not blend together at higher mipmap levels.
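A rough sketch of that kind of setup (sizes are from the question; texArray and the pixels0..pixels2 pointers are placeholders):
GLuint texArray;
glGenTextures(1, &texArray);
glBindTexture(GL_TEXTURE_2D_ARRAY, texArray);
// Allocate 3 layers of 512 x 512 RGBA.
glTexImage3D(GL_TEXTURE_2D_ARRAY, 0, GL_RGBA8, 512, 512, 3, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
// Upload each texture into its own layer.
glTexSubImage3D(GL_TEXTURE_2D_ARRAY, 0, 0, 0, 0, 512, 512, 1, GL_RGBA, GL_UNSIGNED_BYTE, pixels0);
glTexSubImage3D(GL_TEXTURE_2D_ARRAY, 0, 0, 0, 1, 512, 512, 1, GL_RGBA, GL_UNSIGNED_BYTE, pixels1);
glTexSubImage3D(GL_TEXTURE_2D_ARRAY, 0, 0, 0, 2, 512, 512, 1, GL_RGBA, GL_UNSIGNED_BYTE, pixels2);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glGenerateMipmap(GL_TEXTURE_2D_ARRAY);     // mipmaps are built per layer, so layers never bleed into each other
In the shader the atlas-offset arithmetic then goes away: the texture is declared as a sampler2DArray and sampled with the repeating uv in the first two coordinates and the layer index as the third.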