I am trying to use the following code to draw a square-shaped point with OpenGL:
glPointSize(5.0f);
glBegin(GL_POINTS);
glVertex3f(1.0f, 1.0f, 1.0f);
glEnd();
However, the final result is a circle-shaped point.
For reference, please take a look at http://risknfun.com/compform/w1.html and see "Problem 4. A Grid". In the image displayed on the right side, the points are square-shaped.
It's partly up to the OpenGL implementation (i.e., it can vary with your graphics driver), but with a bit of luck you can turn this on or off with glEnable(GL_POINT_SMOOTH); or glDisable(GL_POINT_SMOOTH);. With point smoothing turned on you'll normally get round points; with it turned off you'll get square points.
You can also try to tell OpenGL not to spend time making GL_POINTS nice and round by calling:
glHint(GL_POINT_SMOOTH_HINT, GL_FASTEST);
But keep in mind that this is just a hint; the OpenGL driver ultimately has the last word.
I want to draw a 4-pointed star using GLUT and OpenGL in C++. Here is my code:
glBegin(GL_TRIANGLE_FAN);
glVertex3f( 0.0f, 6.0f, 0.0f);  // top tip
glVertex3f( 1.0f, 4.0f, 0.0f);  // inner point
glVertex3f( 3.0f, 3.0f, 0.0f);  // right tip
glVertex3f( 1.0f, 2.0f, 0.0f);  // inner point
glVertex3f( 0.0f, 0.0f, 0.0f);  // bottom tip
glVertex3f(-1.0f, 2.0f, 0.0f);  // inner point
glVertex3f(-3.0f, 3.0f, 0.0f);  // left tip
glVertex3f(-1.0f, 4.0f, 0.0f);  // inner point
glEnd();
The problem is that the shape goes directly from (0, 6) to (3, 3).
Can anyone help me fix this?
[screenshot]
I want something like this
[desired output]
The first point of a GL_TRIANGLE_FAN primitive is always held fixed (see Triangle primitives). Just start the GL_TRIANGLE_FAN primitive at one of the "inner" points:
glBegin(GL_TRIANGLE_FAN);
glVertex3f(1.0f,4.0f,0.0f);
glVertex3f(3.0f,3.0f,0.0f);
glVertex3f(1.0f,2.0f,0.0f);
glVertex3f(0.0f,0.0f,0.0f);
glVertex3f(-1.0f,2.0f,0.0f);
glVertex3f(-3.0f,3.0f,0.0f);
glVertex3f(-1.0f,4.0f,0.0f);
glVertex3f(0.0f,6.0f,0.0f);
glEnd();
You are creating a triangle fan but setting its central vertex (the initial one) at (0, 6, 0).
You probably want to change your geometry so that your central vertex is at the origin (for symmetry). It also works to move the first vertex down to the bottom, as @Rabbid76 shows.
When drawing a quad, it vanishes when rotation brings it to a position perpendicular to the screen. Ideally what I'd like to see is (b), but I get nothing.
Is there something wrong with my code? (Warning: old OpenGL code follows.)
void draw_rect(double vector[4][3], int rgb[3], double transp)
{
    GLint is_depth, is_blend, blend_src, blend_dst;
    glGetIntegerv(GL_DEPTH_WRITEMASK, &is_depth);
    glGetIntegerv(GL_BLEND, &is_blend);
    glGetIntegerv(GL_BLEND_SRC, &blend_src);
    glGetIntegerv(GL_BLEND_DST, &blend_dst);
    glEnable(GL_BLEND);
    glDepthMask(0);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    // code to set the color ...
    glBegin(GL_POLYGON);
    glVertex3dv(vector[0]);  // glVertex3dv is the call for double[3] arrays
    glVertex3dv(vector[1]);
    glVertex3dv(vector[2]);
    glVertex3dv(vector[3]);
    glEnd();
    if (!is_blend) { glDisable(GL_BLEND); }
    glDepthMask(is_depth);
    glBlendFunc(blend_src, blend_dst);
}
A quad (assuming its vertices are coplanar, as in this case) is by definition infinitely thin. It is correct behavior for it to be invisible when viewed edge-on, perpendicular to the camera.
The "correct" solution is to make a box rather than a single quad.
See Drawing cube 3D using Opengl for an example using a cube. You'll need to tweak the vertex positions to make the cube smaller along one dimension (probably Z), but it'll give you the effect that you're looking for.
Also, stop using the fixed function stuff (glVertex, etc.). It's been deprecated for years. Shaders aren't that difficult, and examples are easy to find via your favorite search engine.
Alternatively, try making it a line of some definite width when the quad is perpendicular to the screen.
I just started loading some .obj files and rendering them with OpenGL. When I render these meshes I get this result (see pictures).
I think it's some kind of depth problem, but I can't figure it out by myself.
These are the parameters for rendering:
// Dark blue background
glClearColor(0.0f, 0.0f, 0.4f, 0.0f);
// Enable depth test
glEnable( GL_DEPTH_TEST );
// Cull triangles whose normal is not towards the camera
glEnable(GL_CULL_FACE);
I used this tutorial code as a template: https://code.google.com/p/opengl-tutorial-org/source/browse/#hg%2Ftutorial08_basic_shading
The problem is simple: you are culling FRONT or BACK faces, and the object file stores its surfaces with either CCW (counter-clockwise) or CW (clockwise) vertex winding, i.e. written from left to right or right to left.
Your OpenGL code expects the opposite winding, so it hides exactly the surfaces you are looking at.
To check that this is the problem, just take out glEnable(GL_CULL_FACE); as this seems to be exactly what is producing it.
Additionally, you can use glCullFace(ENUM); where ENUM has to be GL_FRONT or GL_BACK.
If you still only see a partial mesh in both cases (GL_FRONT and GL_BACK), then either your code for interpreting the .obj has a problem, or the .obj does not use strictly consistent winding (a mix of CCW and CW).
I am actually unsure what you mean; however, glEnable(GL_CULL_FACE); followed by glCullFace(GL_BACK); will cull out, or remove, the back faces of the object. This greatly reduces rendering work, and it only makes a visible difference if you are inside or "behind" the object.
Also, have you tried glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); before your render code?
I have a problem when rendering cubes in OpenGL. I am drawing two cubes: one is a wire cube centered around the origin, while the other is offset from the origin and solid. I have mapped some keys to rotate the objects by some degrees with respect to the origin, so the whole scene can rotate around the origin.
The problem is that when the wire cube is supposed to be in front of the solid cube, it does not display correctly.
In the image above, the colored cube is supposed to be behind the wire cube, i.e. the green wire cube should be on top.
The cube also misbehaves after I rotate it a little around the x axis (the current horizontal line): it has missing faces and does not render correctly.
What am I doing wrong?
I have coded the following. Note that rotateX, rotateY and rotateZ are mapped to keys and are my global rotation variables.
//The Initialize function, called once:
void Init(){
    glEnable(GL_TEXTURE_2D);
    glShadeModel(GL_SMOOTH);                            // Enable Smooth Shading
    glClearColor(0.0f, 0.0f, 0.0f, 0.5f);               // Black Background
    glClearDepth(1.0f);                                 // Depth Buffer Setup
    glEnable(GL_DEPTH_TEST);                            // Enables Depth Testing
    glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);  // Really Nice Perspective Calculations
    glEnable(GL_LIGHTING);
}
void draw(){
    //The main draw function
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(45, 640/480.0, .5, 100);
    glMatrixMode(GL_MODELVIEW);  // select the modelview matrix
    glLoadIdentity();
    gluLookAt(0, 0, 5,
              0, 0, 0,
              0, 1, 0);
    glRotatef(rotateX, 1, 0, 0);
    glRotatef(rotateY, 0, 1, 0);
    glRotatef(rotateZ, 0, 0, 1);
    glPushMatrix();  // save the rotated view; matched by the glPopMatrix below
    drawScene();     // this just draws the main axis lines
    glutWireCube(1);
    glPopMatrix();
    glPushMatrix();
    glTranslatef(-2, 1, 0);
    drawNiceCube();
    glPopMatrix();
    glutSwapBuffers();
}
The code for drawNiceCube() just uses GL_QUADS, while glutWireCube is built into GLUT.
EDIT:
I have posted the full code at http://pastebin.com/p1kwPjEM, sorry if it is not well documented.
Did you also request a window with a depth buffer?
glutInitDisplayMode( ... | GLUT_DEPTH | ...);
Update:
Did you enable face culling somewhere?
glEnable(GL_CULL_FACE);
If so, faces wound clockwise may be getting culled, which would explain the missing faces.
10.090 How does face culling work? Why doesn't it use the surface normal?
OpenGL face culling calculates the signed area of the filled primitive in window coordinate space. The signed area is positive when the window coordinates are in a counter-clockwise order and negative when clockwise. An app can use glFrontFace() to specify the ordering, counter-clockwise or clockwise, to be interpreted as a front-facing or back-facing primitive. An application can specify culling either front or back faces by calling glCullFace(). Finally, face culling must be enabled with a call to glEnable(GL_CULL_FACE); .
OpenGL uses your primitive's window space projection to determine face culling for two reasons. To create interesting lighting effects, it's often desirable to specify normals that aren't orthogonal to the surface being approximated. If these normals were used for face culling, it might cause some primitives to be culled erroneously. Also, a dot-product culling scheme could require a matrix inversion, which isn't always possible (i.e., in the case where the matrix is singular), whereas the signed area in DC space is always defined.
However, some OpenGL implementations support the GL_EXT_cull_vertex extension. If this extension is present, an application may specify a homogeneous eye position in object space. Vertices are flagged as culled, based on the dot product of the current normal with a vector from the vertex to the eye. If all vertices of a primitive are culled, the primitive isn't rendered. In many circumstances, using this extension can improve performance.
(Quoted from the OpenGL FAQ, section 10.090.)
datenwolf solved my problem. I quote him:
"@JonathanSimbahan: Parts of your code are redundant, but something is missing: you forgot to call Init(); after creating your GLUT window, hence depth testing and all the other state never get enabled. I for one suggest you don't use Init at all and move its code into the drawing code, where it actually belongs."
I have a quad and I would like to use the gradient it produces as a texture for another polygon.
glPushMatrix();
glTranslatef(250, 250, 0);
glBegin(GL_POLYGON);  //Begin polygon coordinates
glColor3f(1.0f, 0.0f, 0.0f);  // color components are floats in [0,1]; 255 would just clamp to 1.0
glVertex2f(10, 0);
glVertex2f(100, 0);
glVertex2f(100, 100);
glVertex2f(50, 50);
glVertex2f(0, 100);
glEnd();  //End polygon coordinates
glPopMatrix();
glBegin(GL_QUADS);  //Begin quadrilateral coordinates
glVertex2f(0, 0);
glColor3f(0.0f, 1.0f, 0.0f);
glVertex2f(150, 0);
glVertex2f(150, 150);
glColor3f(1.0f, 0.0f, 0.0f);
glVertex2f(0, 150);
glEnd();  //End quadrilateral coordinates
My goal is to make the 5-vertex polygon have the gradient of the quad (maybe a texture is not the best bet).
Thanks
Keep it simple!
It is very simple to create a gradient texture in code, e.g.:
// gradient white -> black
GLubyte gradient[2*3] = { 255,255,255, 0,0,0 };
// upload as a 2-texel 1D RGB texture
glTexImage1D(GL_TEXTURE_1D, 0, GL_RGB, 2, 0, GL_RGB, GL_UNSIGNED_BYTE, gradient);
// setup texture parameters, draw your polygon etc.
The graphics hardware and/or the GL will create a sweet-looking gradient from color one to color two for you (remember: that's one of the basic advantages of hardware-accelerated polygon drawing; you don't have to do the interpolation work in software).
Your real problem is which texture coordinates to use on the 5-vertex polygon. But that was not your question... ;-)
To do that, you'd have to do a render-to-texture. While this is commonplace and supported by practically every board, it's typically used for quite elaborate effects (e.g. mirrors).
If it's really just a gradient, I'd try to create the gradient in an app like Paint.NET. If you really need to create it at run-time, use a pixel shader to implement render-to-texture. However, I'm afraid explaining pixel shaders in a few words is a bit tough; there are lots of tutorials on this on the net.
With the pixel shader, you gain a lot of control over the graphic card. This allows you to render your scene to a temporary buffer and then apply that buffer as a texture quite easily, plus a lot more functionality.