Opengl transparent cube faces - c++

I am drawing transparent cubes, which share vertices, in a grid (Windows 7, VC++, VS 2012). The problem is that when I rotate the scene I see strange visual artifacts in the planes where cubes touch each other. Is this effect caused by the shared faces alone, so that I just need to remove them? Or is there some other trick to it?
Does it have anything to do with the facing/orientation of the cube surfaces? By the way, I have tried a lot of suggestions already but can't make it work perfectly.

If you follow a consistent convention when defining triangle vertex order, this issue can be avoided. Typically you number vertices counter-clockwise: 0, 1, 2. Doing this tells the OpenGL state machine which side of the triangle is the front. Depending on your needs, there are GL states you can set to draw (or not draw) the front and/or back of such triangles... let us know how you get on.
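As a sketch of that convention (the function names here are mine, not from the question): with OpenGL's default `glFrontFace(GL_CCW)`, a triangle whose projected vertices 0 → 1 → 2 run counter-clockwise is front-facing, which can be checked with a signed-area test:

```cpp
// Signed area of the 2D triangle (p0, p1, p2): positive when the
// vertices wind counter-clockwise, negative when clockwise. OpenGL's
// default front face is counter-clockwise (glFrontFace(GL_CCW)).
double signedArea(double x0, double y0,
                  double x1, double y1,
                  double x2, double y2) {
    return 0.5 * ((x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0));
}

bool isFrontFacing(double x0, double y0,
                   double x1, double y1,
                   double x2, double y2) {
    return signedArea(x0, y0, x1, y1, x2, y2) > 0.0;
}
```

Once every face winds consistently, `glEnable(GL_CULL_FACE)` discards the back faces, and states such as `glPolygonMode` control how front and back faces are drawn.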

Related

Opengl : Can we still use GL_QUADS?

I am loading models from OBJ and COLLADA files in which individual faces of the mesh are not triangulated; when all the faces are perfect quads, they are exported as quads to save memory.
Now I know this won't work for most models, but for some meshes, say a cube, where some amount of duplication can be avoided along each face, I want to make it work.
I have two options: triangle strips or GL_QUADS.
The problem with triangle strips is that neighbouring faces are connected, so it is impossible to have any gap between them. Even for a simple cube the output looks correct from the outside, but when I go inside the cube, even with back-face culling enabled, I see stray triangles connecting the front and back of the cube; basically it's a jumbled mess.
With GL_QUADS everything works correctly, but the docs say that GL_QUADS is deprecated in 3.1 (even though I can still use it in GL 4.0), so my question is:
Can I continue using GL_QUADS for loading meshes from files without running into problems in the future, or how can I replace it with triangle strips without the connectivity issue?
One solution I found was to issue a draw call for every 4 vertices in a loop, but that is terrible for performance on a huge mesh.
Any suggestions?
Can we still use GL_QUADS?
Yes, if using a compatibility profile OpenGL context.
Can I continue using gl_quads for loading meshes from files without running into problems in the future...?
Platforms may choose not to implement the compatibility profile. Most desktop platforms do, and have done so for a very long time; it would surprise me if existing implementations decided to drop it.
...or how can I replace it with triangle strips without the whole connectivity issue?
One way to render disconnected quads (or disconnected triangle strips in general) is to insert degenerate vertices: repeat the last vertex of one strip and the first vertex of the next, so the strip "jumps" to where the next quad begins without rasterizing anything in between; see this question.
The simpler way is to generate a sequence of indices like [1 2 3, 1 3 4, 5 6 7, 5 7 8, ...] and render the mesh of quads using those indices.

OpenGL Random White Dots

I am trying to render some geometry on a white background. The problem is that random white dots appear inside the geometry. As I resize my window, the white dots switch places, appearing and disappearing randomly inside the geometry.
I have conducted extensive tests and have found that the dots only appear on the edges between two triangles. It seems like both triangles fail to render those pixels (as if the pixels aren't covered by either triangle), so the white background shows through. I should note that only a few pixels along those borders are white (not all of them). And it's not some kind of texture filtering issue, since the problem happens even if I render the polygon with a solid color (set directly inside the shader).
Really, it seems like some kind of coverage problem, where the OpenGL implementation fails to assign some boundary pixels to either of two adjacent triangles.
I am running this example in a 27'' iMac with a NVIDIA GeForce GTX 675MX. I'm going to test this same application on my MacBook with Intel Integrated Graphics Card.
Can someone shed some light on this topic?
Thanks @Damon. I solved the issue, and it wasn't that the vertices weren't exactly the same. The true problem was that (by design) some vertices needed to lie on the edge between two triangles, a T-junction, and this was breaking OpenGL's crack-free rasterization. The solution was to move each such vertex slightly inward (inside the triangles) and adjust the texture coordinates accordingly.
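For readers hitting the related, more common variant of this bug: crack-free rasterization along a shared edge is only guaranteed when the shared vertices are bit-identical, and mathematically equal expressions need not be. A tiny illustration (the helper is mine):

```cpp
// Exact (bitwise) equality of doubles. If two triangles compute a
// "shared" vertex position through different expressions, the results
// can disagree in the last bit; the rasterizer then sees two slightly
// different edges, and a one-pixel crack can appear between them.
bool bitIdentical(double a, double b) {
    return a == b;
}
```

For example, `0.1 + 0.2` is not bit-identical to `0.3` in IEEE-754 doubles, while `0.5 + 0.25` is exactly `0.75`.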
Many thanks!

OpenGL to DirectX translation - alpha blending

I'm trying to translate an OpenGL renderer into DirectX9. It mostly seems to work, but the two don't seem to agree on the settings for alpha blending. In OpenGL, I'm using:
glDepthFunc(GL_LEQUAL);
glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_BLEND);
and never actually setting the GL_DEST_ALPHA, so it's whatever the default is. This works fine. Translating to DirectX, I get:
device->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_INVSRCALPHA);
which should do about the same thing, but totally doesn't. The closest I can get is:
device->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
device->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_DESTALPHA);
which is almost right, but if the geometry overlaps itself, the alpha in front overrides the alpha in back, and makes the more distant faces invisible. For the record, the other potentially related render states I've got going on are:
device->SetRenderState(D3DRS_LIGHTING, FALSE);
device->SetRenderState(D3DRS_ZENABLE, TRUE);
device->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
device->SetTextureStageState(0, D3DTSS_ALPHAOP, D3DTOP_MODULATE);
At this point, I feel like I'm just changing states at random to see which combination gives the best results, but nothing is working as well as it did in OpenGL. Not sure what I'm missing here...
The alpha blending itself is performed correctly; otherwise, every particle would look strange. The reason why some parts of some particles are not drawn is that they are behind the transparent parts of other particles, which still wrote to the depth buffer.
To solve this problem you have two options:
Turn off ZWriteEnable for the particles. With that, every object drawn after a particle will appear in front of it. This can cause problems if you have objects that should actually be behind the particles but are drawn afterwards.
Enable alpha testing for the particles. Alpha testing is a technique that discards pixels below a certain alpha threshold entirely, so they are written neither to the render target nor to the Z-buffer.
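The corresponding D3D9 render states, as a sketch (the alpha-test threshold value is a placeholder, not from the question):

```cpp
// Option 1: keep depth testing, but stop particles from writing depth.
device->SetRenderState(D3DRS_ZENABLE, TRUE);
device->SetRenderState(D3DRS_ZWRITEENABLE, FALSE);

// Option 2: alpha testing - discard nearly transparent pixels entirely,
// so they touch neither the color buffer nor the Z-buffer.
device->SetRenderState(D3DRS_ALPHATESTENABLE, TRUE);
device->SetRenderState(D3DRS_ALPHAREF, 0x08);  // placeholder threshold
device->SetRenderState(D3DRS_ALPHAFUNC, D3DCMP_GREATEREQUAL);
```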
By the way, when rendering transparent objects it is almost always necessary to sort them to avoid Z-buffer issues; the above solutions only work for some special cases.
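A sketch of that sorting step (the `Object` type and its `viewDepth` member are illustrative, not from the original code): compute each transparent object's depth along the view direction and draw farthest-first, so blending composites them in the right order:

```cpp
#include <algorithm>
#include <vector>

// Hypothetical transparent object; viewDepth is its distance from the
// camera along the view direction.
struct Object {
    float viewDepth;
    // ... mesh, material, etc.
};

// Sort back-to-front: larger depth = farther away = drawn first.
void sortBackToFront(std::vector<Object>& objs) {
    std::sort(objs.begin(), objs.end(),
              [](const Object& a, const Object& b) {
                  return a.viewDepth > b.viewDepth;
              });
}
```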

beginner transparency / opaque in openGL

I am going through the NeHe tutorials for OpenGL; I am at lesson 8 (drawing a cube with blending). http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=08
I wanted to experiment and change half of the faces to be opaque, so that there is always a semi-transparent face opposite an opaque one, and then be able to rotate the cube...
I changed the code a little bit, the entire source is there : http://pastebin.com/uzfSk2wB
I changed a few things:
I enabled blending by default and set the blending function to glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
I changed the order of drawing the faces and the colors for each face.
I enabled depth testing.
I draw all the opaque faces.
I disable the depth test.
I draw all the transparent faces.
Now, it's hard to tell exactly what is going wrong, but it definitely does not look right: I cannot tell which faces are opaque compared to the transparent ones, and sometimes faces do not seem to get drawn when they should... etc.
It seems like calculating which face is in front of which would not be trivial (although I am sure it is possible); I am hoping there is a way that does not require doing that.
I am looking either for what is wrong in my code, or for whether this is the wrong way of doing it in the first place.
If you disable depth testing before drawing the transparent faces, they will be drawn with no regard for their correct z-ordering; it probably looks like the transparent faces are being drawn on top of all the other faces. Leave depth testing on.
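One common refinement of that advice, as a sketch against the fixed-function API the NeHe code uses (`drawOpaqueFaces`/`drawTransparentFaces` are hypothetical helpers): keep the depth test on for both passes, and disable only depth writes while blending the transparent faces, so opaque geometry still occludes them without transparent faces blocking each other:

```cpp
// Opaque pass: depth test and depth writes on, blending off.
glEnable(GL_DEPTH_TEST);
glDepthMask(GL_TRUE);
glDisable(GL_BLEND);
drawOpaqueFaces();           // hypothetical helper

// Transparent pass: keep the depth test, disable depth writes,
// enable blending, and ideally draw the faces back-to-front.
glDepthMask(GL_FALSE);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
drawTransparentFaces();      // hypothetical helper
glDepthMask(GL_TRUE);        // restore for the next frame
```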

OpenGL texturing via vertex alphas, how to avoid following diagonal lines?

http://img136.imageshack.us/img136/3508/texturefailz.png
This is my current program. I know it's terribly ugly; I found two random textures online ('lava' and 'paper') which don't even seem to tile, but that's not the problem at the moment.
I'm trying to figure out the first steps of an RPG. This is a top-down screenshot of a 10x10 heightmap (currently set to all 0s, so it's just a plane), and I texture it by making one pass per texture per quad; each vertex has an alpha value for each texture so that OpenGL blends them.
The problem is that the textures trend along diagonals. Even though I'm drawing with GL_QUADS, this is presumably because the quads are split into pairs of triangles, and the alpha values at the corners then have more weight along the hypotenuses... but I wasn't expecting that to matter at all. By drawing quads, I was hoping that even though they were split into triangles at some low level, the vertex alphas would make each texture radiate outward from its vertices in a circular gradient.
How can I fix this to make it look better? Do I need to scrap this and try a whole different approach? IS there a different approach for something like this? I'd love to hear alternatives as well.
Feel free to ask questions and I'll be here refreshing until I get a valid answer, so I'll comment as fast as I can.
Thanks!!
EDIT:
Here is the kind of thing I'd like to achieve. No, I'm obviously not one of the billions of noobs out there "trying to make an MMORPG"; I'm using it as an example because it's very much like what I want:
http://img300.imageshack.us/img300/5725/runescapehowdotheytile.png
How do you think this is done? Part of it must be vertex alphas like I'm doing, because of the smooth gradients... But maybe they have a list of different triangle configurations within a tile, and each tile stores which configuration it uses? So, for example, configuration 1 is a triangle in the top-left and one in the bottom-right, 2 is the top-right and bottom-left, 3 is a quad on the top and a quad on the bottom, etc.? Can you think of any other way I'm missing, or if you've got it all figured out, please share how they do it!
The diagonal artefacts are caused by having all of your quads split into triangles along the same diagonal. If you define points [0,1,2,3] for your quad, each quad is split into triangles [0,1,2] and [0,2,3]. Try drawing with GL_TRIANGLES and alternating your choice of diagonal. There are probably more efficient ways of doing this using GL_TRIANGLE_STRIP or GL_QUAD_STRIP.
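A sketch of that alternating-diagonal triangulation for a grid of quads (the function name and index layout are my own assumptions): cells where (x + y) is even split along one diagonal, the rest along the other, so the diagonals no longer line up in long runs:

```cpp
#include <cstdint>
#include <vector>

// Triangulate a (w x h)-cell grid, alternating the split diagonal in a
// checkerboard pattern. Vertex (x, y) is assumed to have index
// y * (w + 1) + x in the vertex buffer.
std::vector<uint32_t> triangulateGrid(uint32_t w, uint32_t h) {
    std::vector<uint32_t> idx;
    for (uint32_t y = 0; y < h; ++y)
        for (uint32_t x = 0; x < w; ++x) {
            uint32_t i0 = y * (w + 1) + x;  // top-left
            uint32_t i1 = i0 + 1;           // top-right
            uint32_t i2 = i0 + (w + 1);     // bottom-left
            uint32_t i3 = i2 + 1;           // bottom-right
            if ((x + y) % 2 == 0) {         // split along diagonal i0-i3
                idx.insert(idx.end(), {i0, i2, i3, i0, i3, i1});
            } else {                        // split along diagonal i1-i2
                idx.insert(idx.end(), {i0, i2, i1, i1, i2, i3});
            }
        }
    return idx;
}
```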
I think you are doing it right, but you should increase the resolution of your heightmap a lot to get finer tessellation!
For example, look at this heightmap renderer:
mdterrain
It shows the same artefacts at low resolution, but gets better as you increase the iterations.
I've never done this myself, but I've read several guides (which I can't find right now), and it seems pretty straightforward; it can even be optimized using shaders.
Create a master texture to control the mixing of four sub-textures. Use the r, g, b, a components of the master texture as the percentage mix of each sub-texture (lava, paper, etc.). You can easily paint a master texture using Paint.NET, Photoshop, or GIMP by painting into each color channel. You can compute the resulting texture beforehand using all five textures, or you can calculate the result on the fly with a fragment shader. I don't have a good example of either, but I think you can figure it out given how far you've come.
The end result will be "pixel-perfect" blending (depending on the textures' resolution and filtering) and will avoid the vertex blending issues.
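As a sketch of that mixing, done on the CPU here for illustration (a fragment shader would do the same per pixel; all names are mine): the master texture's four channels weight four sub-texture samples, with the weights assumed to sum to 1:

```cpp
#include <array>

// Per-pixel splat blend: the master texture's (r, g, b, a) channels
// are the mixing weights for four sub-texture samples (RGB each).
std::array<float, 3> splatBlend(
        const std::array<float, 4>& weights,
        const std::array<std::array<float, 3>, 4>& texels) {
    std::array<float, 3> out = {0.f, 0.f, 0.f};
    for (int t = 0; t < 4; ++t)            // for each sub-texture
        for (int c = 0; c < 3; ++c)        // for each color channel
            out[c] += weights[t] * texels[t][c];
    return out;
}
```

Because the weights come from a texture rather than from vertex alphas, the blend resolution is decoupled from the mesh tessellation, which is what avoids the diagonal artefacts.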