OpenGL: Drawing lines

If I want to draw pure lines with the GL_LINES primitive, using only two positions per line:
Should I also set glPolygonMode to GL_LINE? Or is setting the drawing primitive to GL_LINES enough?
EDIT: I use modern OpenGL techniques

It is unnecessary to change the polygon mode when rendering line primitives.
As the name suggests, glPolygonMode only controls the rasterization of polygons (triangle primitives); it has no effect on line and point primitives.

Related

How to enable both side lighting for GL_POINTS with vertex normals

How to enable both side lighting for GL_POINTS in opengl?
It seems glLightModeli(GL_LIGHT_MODEL_TWO_SIDE, GL_TRUE); works only for facets, not for GL_POINTS.
What I am doing is supplying both the vertices and the vertex normals, with lighting enabled. With GL_LIGHT_MODEL_TWO_SIDE disabled, GL_POINTS are lit according to the normal direction, but I cannot enable two-sided lighting for GL_POINTS.
Can this be done using OpenGL legacy functions? Or would I have to render both sides by negating all the normals?
Thanks in advance. Please do not comment to use modern OpenGL as that is not the answer to my question, but only a suggestion.
How to enable both side lighting for GL_POINTS in opengl?
You don't. Points and lines do not have sides. Only face primitives (triangles, quads, etc) have sides.
So if you want the lighting computation to reverse the normal if the normal is facing away or something, then yes, you will have to render your geometry twice.
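The two-pass approach suggested above can be sketched simply: draw once with the original normals, flip them all, and draw again. A minimal helper for the flip (the interleaved 3-float normal layout and the function name are assumptions for this sketch, not from the original posts):

```c
#include <stddef.h>

/* Negate every component of an interleaved array of 3-component
 * vertex normals, in preparation for the second rendering pass. */
void negate_normals(float *normals, size_t vertex_count)
{
    for (size_t i = 0; i < 3 * vertex_count; ++i)
        normals[i] = -normals[i];
}
```

You would upload the buffer, draw your points, call this, re-upload, and draw again, so each point is lit from both normal directions across the two passes.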

Opengl ES draw wireframe over GL_TRIANGLES correctly

I need to draw a wireframe around a cube. I have everything built, but I have a problem with depth testing/blending: whatever I do, either the GL_LINES overlap the GL_TRIANGLES when they shouldn't (they are behind them), or the GL_TRIANGLES overlap the GL_LINES (when the lines should be visible).
Gdx.gl.glBlendFunc(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);
Gdx.gl.glEnable(GL20.GL_DEPTH_TEST);
SquareMap.get().shader.getShader().begin();
SquareMap.get().shader.getShader().setUniformMatrix(u,camera.combined);
LineRenderer3D.get().render(SquareMap.get().shader,worldrenderer.getCamera());
TriangleRenderer3D.get().render(SquareMap.get().shader,worldrenderer.getCamera());
SquareMap.get().shader.getShader().end();
Also the wireframe is a little bigger than the cube.
The TriangleRenderer3D.get().render and LineRenderer3D.get().render methods just load the vertices and call glDrawArrays.
With the depth mask enabled, the cube's GL_TRIANGLES overlap the lines.
Do I need to enable something that I'm missing here?
It is worth mentioning that line primitives have different pixel coverage rules than triangles. A line must cross through a diamond-shaped pattern in the center of a pixel to be visible, whereas a triangle needs to cover the pixel's top-left corner. This documentation is for Direct3D, but it does a far better job describing these rules (which are the same in GL) than any OpenGL documentation I have come across.
As for fixing this problem, a small offset applied to all vertex positions in order to better align their centers is the most common approach. This is typically done by translating X and Y by 0.375 units.
Another Microsoft document explains this as well.
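That 0.375-unit translation can be sketched as a simple pass over the screen-space positions before drawing. The interleaved (x, y) vertex layout and the helper name are assumptions for this sketch, not from the question:

```c
#include <stddef.h>

/* Translate interleaved (x, y) screen-space positions by a small
 * offset (0.375 is the value suggested by the D3D rasterization
 * docs) so that line centers align better with the pixel grid. */
void offset_vertices(float *xy, size_t vertex_count, float offset)
{
    for (size_t i = 0; i < 2 * vertex_count; ++i)
        xy[i] += offset;
}
```

In a shader-based pipeline you would more likely fold this offset into the projection matrix or apply it in the vertex shader, but the arithmetic is the same.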
While some of the issues described in the first paragraph may be related to primitive coverage, the issue in the last paragraph is not.
That final issue can be addressed this way:
//
// Write wires wherever the line's depth is less than or equal to the triangles.
//
glDepthFunc(GL_LEQUAL);
TriangleRenderer3D.get().render(SquareMap.get().shader,worldrenderer.getCamera());
LineRenderer3D.get().render(SquareMap.get().shader,worldrenderer.getCamera());
By rendering the triangles first, and then drawing the lines only where they are in front of or at the same depth as the triangles (the default depth test, GL_LESS, discards the equal-depth case), you should get the behavior you want. Leave depth writes enabled.

OpenGL : drawing line using degenerate triangle

In my engine, I want to avoid having separate line and triangle types. I want to draw lines using a triangle where two verts are identical. But in OpenGL, this triangle won't be displayed, because it has zero area and therefore can't cover a pixel.
Internally, at the driver level, an opengl line is drawn using a degenerate triangle, and a different rasterization rule is used where it draws at least one pixel per scanline.
D3D had an option to make the rasterizer always draw the first pixel per scan line, effectively accomplishing what I want in D3D.
But how can I do this with opengl? I don't see any command that would allow you to change the rasterization rules.
Well, I did this exact thing: of the three vertices, I use the first two as the endpoints of the line, then call glPolygonMode(GL_FRONT_AND_BACK, GL_LINE) at the start of the line rendering and glPolygonMode(GL_FRONT_AND_BACK, GL_FILL) at the end.
Combine that with the appropriate enabling and disabling of face culling, and you've got yourself a perfectly good line renderer that still uses the triangle setup.

Drawing a circle: what to prefer for performance in OpenGL - Lines or a flat triangles?

I know two ways of drawing a circle: one that consists of GL_LINES, and one made of flat triangles (GL_TRIANGLES). In the second case we need to store more vertices and draw them every frame, but triangles are better in terms of GPU performance.
So what's better to use when you need to draw many circles on the screen? Lines or flat triangles?
It could be that lines are being drawn with the help of triangles. You can check it by switching fill mode to wireframe.

Quad texture stretching on OpenGL

So when drawing a rectangle on OpenGL, if you give the corners of the rectangle texture coordinates of (0,0), (1,0), (1,1) and (0, 1), you'll get the standard rectangle.
However, if you turn it into something that's not rectangular, you'll get a weird stretching effect. Just like the following:
I saw from the page below that this can be fixed, but the solution given only works for trapezoids. Also, I have to do this over many rectangles.
And so the question is: what is the proper, most efficient way to get the right "4D" texture coordinates for drawing stretched quads?
Implementations are allowed to decompose quads into two triangles and if you visualize this as two triangles you can immediately see why it interpolates texture coordinates the way it does. That texture mapping is correct ... for two independent triangles.
That diagonal seam coincides with the edge of two independently interpolated triangles.
Projective texturing can help as you already know, but ultimately the real problem here is simply interpolation across two triangles instead of a single quad. You will find that while modifying the Q coordinate may help with mapping a texture onto your quadrilateral, interpolating other attributes such as colors will still have serious issues.
If you have access to fragment shaders and instanced vertex arrays (probably rules out OpenGL ES), there is a full implementation of quadrilateral vertex attribute interpolation here. (You can modify the shader to work without "instanced arrays", but it will require either 4x as much data in your vertex array or a geometry shader).
Incidentally, texture coordinates in OpenGL are always "4D". It just happens that if you use something like glTexCoord2f (s, t) that r is assigned 0.0 and q is assigned 1.0. That behavior applies to all vertex attributes; vertex attributes are all 4D whether you explicitly define all 4 of the coordinates or not.
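The projective (q-coordinate) trick mentioned above boils down to premultiplying each vertex's (s, t) by a per-vertex q and submitting a 4D coordinate; perspective-correct interpolation then divides back by q in the fragment stage. Choosing q (for a trapezoid, typically the ratio of the parallel edge lengths) is up to the caller; the function name and buffer layout here are assumptions for this sketch:

```c
#include <stddef.h>

/* Build 4D (s*q, t*q, 0, q) texture coordinates from per-vertex
 * (s, t) pairs and per-vertex q values.  Interpolating these and
 * dividing by the interpolated q yields a projective mapping
 * across the quad instead of two independent triangles. */
void make_projective(const float *st, const float *q,
                     size_t vertex_count, float *stpq_out)
{
    for (size_t i = 0; i < vertex_count; ++i) {
        stpq_out[4 * i + 0] = st[2 * i + 0] * q[i]; /* s * q */
        stpq_out[4 * i + 1] = st[2 * i + 1] * q[i]; /* t * q */
        stpq_out[4 * i + 2] = 0.0f;                 /* r unused */
        stpq_out[4 * i + 3] = q[i];                 /* q */
    }
}
```

As the answer notes, this only fixes the texture coordinates; other attributes interpolated across the two triangles (colors, for instance) would need the same treatment or the full quadrilateral-interpolation approach linked above.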