LibGDX - Gradient rectangle using 2x2 texture (or similar way?) - opengl

CMIIW: I've heard that libGDX's ShapeRenderer is slow and that it's better to use a Batch.
I tried using a Pixmap to produce a 2x2 texture and rely on linear blending:
public void rect(Batch batch, float x, float y, float w, float h, float rot,
                 Color c00, Color c01, Color c10, Color c11) {
    // one pixel per corner color
    Pixmap pm = new Pixmap(2, 2, Format.RGBA4444);
    pm.drawPixel(0, 0, Color.rgba8888(c00));
    pm.drawPixel(0, 1, Color.rgba8888(c01));
    pm.drawPixel(1, 0, Color.rgba8888(c10));
    pm.drawPixel(1, 1, Color.rgba8888(c11));
    Texture tx = new Texture(pm);
    tx.setFilter(Texture.TextureFilter.Linear, Texture.TextureFilter.Linear);
    // flush the batch so the texture can be disposed right after drawing
    batch.end();
    batch.begin();
    batch.draw(new TextureRegion(tx), x, y, w / 2f, h / 2f, w, h, 1f, 1f, rot, true);
    batch.end();
    batch.begin();
    tx.dispose();
    pm.dispose();
}
And it produces this:
It is not the effect I want.
If I could trim half a pixel from each side of the texture, I think it would look right.
I thought in order to do that I have to change the TextureRegion to this:
new TextureRegion(tx, 0.5f, 0.5f, 1f, 1f)
but this produces:
What is happening there?
Or is there a better way to efficiently draw a gradient rectangle?
EDIT:
Ouch! Thanks TenFour04, I tried with
new TextureRegion(tx, 0.25f, 0.25f, 0.75f, 0.75f) but got this instead:
Weirdly, I got exactly what I want with
new TextureRegion(tx, 0.13f, 0.13f, 0.87f, 0.87f):
It looks like some rounding problem? 0.126f still gives me that (seemingly), but 0.125f gives me something much closer to the very first image in the post.
@Pinkie Swirl: hmm, right. I wanted a method to draw gradient rectangles because I didn't want to make textures, but in the end I'm making them anyway... at least I can avoid creating those 2x2 textures on the fly.
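For what it's worth, the half-texel numbers can be derived rather than guessed: with linear filtering, an N x N texture only interpolates cleanly between texel centers, and those centers sit 0.5 / N in from each UV edge. A minimal sketch in plain Java (class and method names are just for illustration):

```java
// Sketch: texel-center UV region for an N x N texture sampled with
// GL_LINEAR. The gradient is "pure" only between texel centers, which
// sit half a texel (0.5 / N in UV units) in from each edge.
public class TexelCenters {
    static float[] centeredRegion(int size) {
        float inset = 0.5f / size;  // half a texel in UV units
        return new float[] { inset, inset, 1f - inset, 1f - inset };
    }

    public static void main(String[] args) {
        float[] uv = centeredRegion(2);          // the 2x2 gradient texture
        System.out.println(uv[0] + " " + uv[2]); // 0.25 0.75
    }
}
```

For N = 2 this gives exactly the 0.25 .. 0.75 region suggested above, so values like 0.13f would only look right by accident of rounding.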

Related

OpenGL - weird lines on polygon edges

In a minecraft-like game I'm making, I'm getting these weird lines on polygon edges:
I'm using a texture atlas, being clamped with GL_CLAMP_TO_EDGE.
I tried setting GL_TEXTURE_MIN_FILTER to GL_LINEAR, GL_NEAREST, and even using mipmaps, but it doesn't make a difference. I also tried insetting the texture coordinates by half a pixel and using 16x anisotropic filtering, with no success.
Any help?
Edit - The top face of the cubes is being rendered something like this:
glBegin(GL_QUADS);
    glTexCoord2f(0f, 0f);
    glVertex3f(x, y + 1f, z);
    glTexCoord2f(0f, 1 / 8f);
    glVertex3f(x, y + 1f, z + 1f);
    glTexCoord2f(1 / 8f, 1 / 8f);
    glVertex3f(x + 1f, y + 1f, z + 1f);
    glTexCoord2f(1 / 8f, 0f);
    glVertex3f(x + 1f, y + 1f, z);
glEnd();
This looks like the classical texture pixel <-> screen pixel fenceposting problem to me. See my answer to it here: https://stackoverflow.com/a/5879551/524368
Another issue might be that the corner coordinates of the cubes are not exactly the same. Floating point numbers have some intrinsic error, and if you lay the cubes out in a grid by repeatedly adding a floating point offset, the vertex positions can drift slightly, and due to round-off error you get depth fighting. Two things to solve this: first, if two cubes' faces touch, don't render those faces at all; second, use integer coordinates for laying out the cube grid and convert the vertices to floating point only when submitting to OpenGL, or don't convert at all.
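The half-pixel inset mentioned in the question can also be computed exactly rather than eyeballed. A minimal sketch in plain Java, where the 8 tiles per row matches the 1/8 texture coordinates above and the 128-pixel atlas size is an assumption for illustration:

```java
// Sketch: inset an atlas tile's UVs by half a texel so GL_LINEAR never
// blends in the neighboring tile. The 8-tile row matches the 1/8 coords
// in the question; the 128 px texture size is an assumed example value.
public class AtlasInset {
    // tile index -> [u0, u1] along one axis of an atlas with `tiles`
    // tiles across, on a texture `texSize` pixels wide
    static float[] tileRange(int tile, int tiles, int texSize) {
        float half = 0.5f / texSize;  // half a texel in UV units
        return new float[] { (float) tile / tiles + half,
                             (float) (tile + 1) / tiles - half };
    }

    public static void main(String[] args) {
        float[] r = tileRange(0, 8, 128);  // first tile of the row
        System.out.println(r[0] + " .. " + r[1]);  // 0.00390625 .. 0.12109375
    }
}
```

Note this only hides bleeding inside a tile; the fencepost alignment from the linked answer is still the first thing to check.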

Push/Pop matrix and individual object rotation around its own axis in OpenGL

This has been asked many, many times before, and I've read loads of posts and forums about it, but I just can't get one object to rotate around its own axis.
I have several objects drawn like this:
gl.glMatrixMode(GL2.GL_MODELVIEW);
gl.glLoadIdentity();
.....
gl.glPushMatrix();
gl.glRotatef(angle, 0.0f, 1.0f, 0.0f);
gl.glTranslatef(0.0f, 0.0f, 0.0f);
texture.bind(gl);
gl.glTexCoordPointer(2, GL2.GL_FLOAT, 0, textureRLt);
gl.glNormalPointer(GL2.GL_FLOAT, 0, normalRLt);
gl.glVertexPointer(3, GL2.GL_FLOAT, 0, vertexRLt);
gl.glDrawElements(GL2.GL_TRIANGLES, countI, GL2.GL_UNSIGNED_SHORT, indexRLt);
gl.glPopMatrix();
And this is drawn correctly, with all textures and normals applied.
I know that OpenGL executes commands in reverse order, so that's why glRotatef comes first. I also know that all rotations are about the origin, so I need to translate the object to that origin (not that I think I should have to, because "the pen" is already at the origin: I save the matrix before drawing every object and pop it afterwards). Is it something with glDrawElements? Something doesn't seem right.
Any help will be greatly appreciated. :)
Edit: I can see how the objects rotate around the main x axis, but I want them to rotate around their local x-axis.
"OpenGL executes commands reversely" means it multiplies each new transformation matrix from the right rather than the left. What does this mean?
Imagine transformation A and B:
y = Ax
transforms x by A and yields y
This is equivalent to:
// pseudo code:
glA()
glDraw(x)
Now, usually in programming you expect to get the transformations in the order you write them. So you might think that
glA()
glB()
glDraw(x)
would give you
y = BAx
but that is wrong. You actually get:
y = ABx
This means that B is applied to x first, and then A is applied to the result.
Put into English, take a look at this example:
glScalef(...)       // third, scale the whole thing
glTranslatef(...)   // second, translate it somewhere
glRotatef(...)      // first, rotate the object (of course,
                    // around its own axes, because the object is at origin)
glDrawElements(...) // draw everything at origin
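The right-multiplication rule is easy to verify outside OpenGL: java.awt.geom.AffineTransform concatenates transforms the same way, so a plain Java sketch can show that the last transform issued hits the vertex first:

```java
import java.awt.geom.AffineTransform;
import java.awt.geom.Point2D;

// Sketch: AffineTransform concatenates like the fixed-function matrix
// stack -- each call right-multiplies, so the LAST transform issued is
// applied to the point FIRST.
public class OrderDemo {
    public static void main(String[] args) {
        AffineTransform m = new AffineTransform();
        m.translate(5, 0);       // like glTranslatef(5, 0, 0)
        m.rotate(Math.PI / 2);   // like glRotatef(90, 0, 0, 1)
        Point2D p = m.transform(new Point2D.Double(1, 0), null);
        // the rotate happens first, about the origin: (1,0) -> (0,1),
        // then the translate moves the result: (0,1) -> (5,1)
        System.out.printf("%.1f %.1f%n", p.getX(), p.getY()); // 5.0 1.0
    }
}
```

Swapping the two calls would instead give (1, 5): translate first, then rotate the already-moved point about the origin, which is exactly the "orbits instead of spins" symptom.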
So, what you need to do is to write:
// When everything is drawn, move them to destination:
gl.glTranslatef(destination[0], destination[1], destination[2]);
// For every object, rotate it the way it should, then restore transformation matrix
// object: RLt
gl.glPushMatrix();
gl.glRotatef(angle, 0.0f, 1.0f, 0.0f);
texture.bind(gl);
gl.glTexCoordPointer(2, GL2.GL_FLOAT, 0, textureRLt);
gl.glNormalPointer(GL2.GL_FLOAT, 0, normalRLt);
gl.glVertexPointer(3, GL2.GL_FLOAT, 0, vertexRLt);
gl.glDrawElements(GL2.GL_TRIANGLES, countI, GL2.GL_UNSIGNED_SHORT, indexRLt);
gl.glPopMatrix();
// object: RLt2
gl.glPushMatrix();
gl.glRotatef(angle2, 0.0f, 1.0f, 0.0f);
texture.bind(gl);
gl.glTexCoordPointer(2, GL2.GL_FLOAT, 0, textureRLt2);
gl.glNormalPointer(GL2.GL_FLOAT, 0, normalRLt2);
gl.glVertexPointer(3, GL2.GL_FLOAT, 0, vertexRLt2);
gl.glDrawElements(GL2.GL_TRIANGLES, countI2, GL2.GL_UNSIGNED_SHORT, indexRLt2);
gl.glPopMatrix();
I am not sure whether you are updating the 'angle' variable periodically. If you are not, then there won't be any rotation. The pseudo code is below; you can use glutTimerFunc for a periodic update of the variable.
for (every opengl loop)
    angle += 5.0f
Satish

glColor not working, random color appearing

There's something wrong in my code somewhere but for any number of primitives that I draw, despite calling glClearColor and then picking a color with glColor3f, the colors that appear are completely random...
So in my Rendering class I cycle through all the objects and call their drawing methods, for primitives they would look like:
inline void PrimitiveDrawer::drawWireframePrism(Vector3 pos, float radius, Vector3 col) {
    glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
    glColor3f(col.x, col.y, col.z);
    glLineWidth(3);
    glBegin(GL_LINE_LOOP);
    ...
    glEnd();
}
But no matter what color I select, I always get different ones... The interesting thing is that all primitive lines I draw with this method assume the color of the models that they bound (they are meant to be bounding volumes for meshes)... Could it have to do with the model loaders I am using?
This is affecting every shape (beyond the ones around the models): every GL_LINE assumes the same colour (green, for some reason), including the glutBitmapCharacter text that I am trying to draw... That's the main thing that bothers me, as I'd like to pick the colour for the text drawing. Currently I am doing:
void renderBitmapString(float x, float y, void *font, char *string)
{
    char *c;
    glRasterPos2f(x, y);
    for (c = string; *c != '\0'; c++) {
        glutBitmapCharacter(font, *c);
    }
}

void drawText(char text[20], float x, float y)
{
    glPushMatrix();
    setOrthographicProjection();
    glLoadIdentity();
    glClearColor(0, 0, 0, 0);
    glColor4f(0, 0, 1, 1);
    renderBitmapString(x, y, (void *)font, text);
    resetPerspectiveProjection();
    glPopMatrix();
}
But the text comes up green instead of blue?
glClearColor has nothing to do with glColor. glClearColor sets the color that a call to glClear(GL_COLOR_BUFFER_BIT) uses to clear the framebuffer.
The colors leaking in from other drawn objects sound to me like you forgot to disable texturing. Add a glDisable(GL_TEXTURE_2D); after you're done drawing textured stuff.

Previous calls to GL.Color3 are making my texture use the wrong colors

Making a 2D OpenGL game. When rendering a frame I need to first draw some computed quad geometry and then draw some textured sprites. When the body of my render method only draws the sprites, everything works fine. However, when I try to draw my geometric quads before the sprites, the texture of the sprite takes on the color of the last GL.Color3 call. How do I tell OpenGL (well, OpenTK) "OK, we are done drawing geometry and it's time to move on to sprites"?
Here is what the render code looks like:
// Let's do some geometry
GL.Begin(BeginMode.Quads);
GL.Color3(_dashboardBGColor); // commenting this out makes my sprites look right
int shakeBuffer = 100;
GL.Vertex2(0 - shakeBuffer, _view.DashboardHeightPixels);
GL.Vertex2(_view.WidthPixelsCount + shakeBuffer, _view.DashboardHeightPixels);
GL.Vertex2(_view.WidthPixelsCount + shakeBuffer, 0 - shakeBuffer);
GL.Vertex2(0 - shakeBuffer, 0 - shakeBuffer);
GL.End();
// lets do some sprites
// (note: binding must happen outside Begin/End, it is ignored inside)
GL.BindTexture(TextureTarget.Texture2D, _rockTextureId);
GL.Begin(BeginMode.Quads);
float baseX = 200;
float baseY = 200;
GL.TexCoord2(0, 0); GL.Vertex2(baseX, baseY);
GL.TexCoord2(1, 0); GL.Vertex2(baseX + _rockTextureWidth, baseY);
GL.TexCoord2(1, 1); GL.Vertex2(baseX + _rockTextureWidth, baseY - _rockTextureHeight);
GL.TexCoord2(0, 1); GL.Vertex2(baseX, baseY - _rockTextureHeight);
GL.End();
GL.Flush();
SwapBuffers();
The default texture environment mode is GL_MODULATE, which multiplies the texture color with the current vertex color.
An easy solution is to set the vertex color to white before drawing a textured primitive:
glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
Another solution is to change the texture environment mode to GL_REPLACE, which makes the texture color replace the vertex color and doesn't have the issue:
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
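GL_MODULATE is nothing more than a per-channel multiply, which can be sketched in a few lines of plain Java (names are illustrative). It also explains the "everything comes out green" symptom from the previous question: a green vertex color zeroes every other channel of every texel:

```java
// Sketch: the arithmetic behind GL_MODULATE. Each output channel is
// the texture channel times the current vertex color channel.
public class Modulate {
    static float[] modulate(float[] tex, float[] vert) {
        float[] out = new float[4];
        for (int i = 0; i < 4; i++) {
            out[i] = tex[i] * vert[i];  // per-channel RGBA multiply
        }
        return out;
    }

    public static void main(String[] args) {
        float[] texel = { 0.8f, 0.6f, 0.4f, 1f };  // some texture color
        float[] white = { 1f, 1f, 1f, 1f };        // glColor4f(1,1,1,1)
        float[] green = { 0f, 1f, 0f, 1f };        // a leftover glColor3f
        System.out.println(modulate(texel, white)[0]); // 0.8, texel unchanged
        System.out.println(modulate(texel, green)[0]); // 0.0, red channel killed
    }
}
```

This is why resetting the color to (1, 1, 1, 1) is enough: multiplying by white is the identity.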

Trying to zoom in on an arbitrary rect within a screen-aligned quad

I've got a screen-aligned quad, and I'd like to zoom into an arbitrary rectangle within that quad, but I'm not getting my math right.
I think I've got the translate worked out, just not the scaling. Basically, my code is the following:
//
// render once zoomed in
glPushMatrix();
glTranslatef(offX, offY, 0);
glScalef(?wtf?, ?wtf?, 1.0f);
RenderQuad();
glPopMatrix();
//
// render PIP display
glPushMatrix();
glTranslatef(0.7f, 0.7f, 0);
glScalef(0.175f, 0.175f, 1.0f);
RenderQuad();
glPopMatrix();
Anyone have any tips? The user selects a rect area, and then those values are passed to my rendering object as [x, y, w, h], where those values are percentages of the viewport's width and height.
Given that your values are passed as [x, y, w, h], I think you want to translate so the selected corner lands at the origin, then scale by 1/w and 1/h to fill the screen. Remember that the transform issued last is applied to the vertices first, so the translate has to come after the scale in the code:
//
// render once zoomed in
glPushMatrix();
glScalef(1.0/w, 1.0/h, 1.0f);
glTranslatef(-x, -y, 0);
RenderQuad();
glPopMatrix();
Does this work?
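The intended mapping is p' = (p - x) / w per axis, which sends the selected rect exactly onto [0, 1]. A quick plain-Java sanity check (names are illustrative):

```java
// Sketch: verify the zoom mapping p' = (p - x) / w on one axis,
// i.e. translate the rect's edge to the origin, then stretch by 1/w.
public class ZoomCheck {
    static double zoom(double p, double x, double w) {
        return (p - x) / w;  // rect edge x -> 0, edge x + w -> 1
    }

    public static void main(String[] args) {
        double x = 0.2, w = 0.5;               // selected rect on one axis
        System.out.println(zoom(x, x, w));     // left edge -> 0.0
        System.out.println(zoom(x + w, x, w)); // right edge, approximately 1.0
    }
}
```

If the edges map to 0 and 1 the matrix is right; if instead you get p/w - x, the translate and scale were issued in the wrong order.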
When I've needed to do this, I've always just changed the parameters I passed to glOrtho, glFrustum, gluPerspective, or whatever (whichever I was using).
From your comment it looks like you want to draw the same quad (RenderQuad()) both as the full image and in PIP mode.
Assuming you have widthPIP and heightPIP and the startXY position of the PIP window, use widthPIP/totalWidth and heightPIP/totalHeight to scale the original quad and re-render it at startXY.
Is this what you are looking for?