What is the best way to interpolate colors across a polygon mesh where all of the polygons have the same normal but considerable color differences? Is using GLSL (with Gouraud or Phong shading) the right approach, or should I take this elsewhere (to the CPU side)? Or am I getting this completely wrong?
ps: I'm using OpenGL 4.0+
I would like to interpolate colors on a mesh like this.
Based on your picture, you need to do flat shading per triangle (face).
In your vertex buffer you can add a vertex color attribute per triangle. In fact it is going to be a color per vertex, so if you are using an interleaved vertex buffer array it may look as follows:
[
  vertex1.x, vertex1.y, vertex1.z, vertex1color.r, vertex1color.g, vertex1color.b,
  vertex2.x, vertex2.y, vertex2.z, vertex2color.r, vertex2color.g, vertex2color.b,
  vertex3.x, vertex3.y, vertex3.z, vertex3color.r, vertex3color.g, vertex3color.b,
  ... the same for the rest of the triangles ...
]
Then, in the vertex shader you set a varying for the output color, which is then used by the fragment shader to "paint" your fragments. Now, it's important that you declare that varying with the flat interpolation qualifier rather than the default smooth one, so that the colors aren't interpolated across the primitive.
Here is an example of how to do it. (If you are using GLSL 3.30 or later, substitute "varying" with in/out.)
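A minimal sketch in GLSL 3.30 (the attribute locations, attribute names, and the mvp uniform are illustrative, not part of your setup):

// vertex shader (GLSL 3.30) -- sketch
#version 330 core
layout(location = 0) in vec3 vertexPosition;  // x, y, z from the interleaved buffer
layout(location = 1) in vec3 vertexColor;     // r, g, b from the interleaved buffer
uniform mat4 mvp;                             // assumed model-view-projection matrix
flat out vec3 faceColor;                      // "flat": not interpolated across the triangle
void main()
{
    faceColor = vertexColor;
    gl_Position = mvp * vec4(vertexPosition, 1.0);
}

// fragment shader (GLSL 3.30) -- sketch
#version 330 core
flat in vec3 faceColor;                       // must be declared flat here as well
out vec4 fragColor;
void main()
{
    fragColor = vec4(faceColor, 1.0);
}

With flat, the whole triangle gets the color of the provoking vertex, so you can simply repeat the same color for all three vertices of a triangle.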
I have a mesh whose vertex positions are generated dynamically by the vertex shader. I've been using https://www.khronos.org/opengl/wiki/Calculating_a_Surface_Normal to calculate the surface normal for each primitive in the geometry shader, which seems to work fine.
Unfortunately, I'm planning on switching to an environment where using a geometry shader is not possible. I'm looking for alternative ways to calculate surface normals. I've considered:
Using compute shaders in two passes. One to generate the vertex positions, another (using the generated vertex positions) to calculate the surface normals, and then passing that data into the shader pipeline.
Using ARB_shader_image_load_store (or related) to write the vertex positions to a texture (in the vertex shader), which can then be read from the fragment shader. The fragment shader should be able to safely access the vertex positions (since it will only ever access the vertices used to invoke the fragment), and can then calculate the surface normal per fragment.
I believe both of these methods should work, but I'm wondering if there is a less complicated way of doing this, especially considering that this seems like a fairly common task. I'm also wondering if there are any problems with either of the ideas I've proposed, as I've had little experience with both compute shaders and image_load_store.
See Diffuse light with OpenGL GLSL. If you just want the face normals, you can use the partial derivative functions dFdx and dFdy. A basic fragment shader that calculates the normal vector (N) in the same space as the position:
#version 330 core

in vec3 position;   // position forwarded from the vertex shader

void main()
{
    // The screen-space derivatives of the interpolated position span the
    // triangle's plane, so their cross product is the face normal,
    // expressed in the same space as position.
    vec3 dx = dFdx(position);
    vec3 dy = dFdy(position);
    vec3 N = normalize(cross(dx, dy));
    // [...]
}
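For completeness, the position input above would come from a vertex shader along these lines (a sketch; the uniform names and the choice of world space are assumptions):

// vertex shader (sketch): forwards world-space position to the fragment shader
#version 330 core
layout(location = 0) in vec3 inPosition;
uniform mat4 model;           // assumed model matrix
uniform mat4 viewProjection;  // assumed view-projection matrix
out vec3 position;            // matches "in vec3 position" in the fragment shader
void main()
{
    vec4 worldPos = model * vec4(inPosition, 1.0);
    position = worldPos.xyz;
    gl_Position = viewProjection * worldPos;
}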
I have a shader which calculates diffuse lighting values. It consists of a vertex and a fragment shader that calculate the lighting intensity on a per-vertex basis. However, as expected, if I have a large triangle (GL_TRIANGLES) with a light positioned, say, just above the center of the triangle, the light that should illuminate it does not show up, because the lighting values are smoothly interpolated across the surface of the triangle from the per-vertex calculations.
So my question is this- how can a primitive be lit by a light source at a position other than at one of its vertices?
When you do lighting at the vertices, you compute the Phong lighting model per vertex and then merely interpolate the colors computed at the vertices across all the pixels in the primitive (i.e., Gouraud shading). If you were to compute the lighting color at each pixel instead, you'd get Phong shading. Given your scenario, if you move the lighting computations from the vertex shader to the fragment shader and interpolate the normal across the primitive, you should get much better results.
Concretely: make the vertex normal a varying variable, normalize it in the fragment shader, and then do your lighting computations there.
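A minimal sketch of what that looks like, with the normal passed as a varying and a simple Lambert diffuse term (the uniform and attribute names are illustrative, not your exact shader):

// vertex shader: pass normal and position instead of a lit color
#version 330 core
layout(location = 0) in vec3 inPosition;
layout(location = 1) in vec3 inNormal;
uniform mat4 mvp;
uniform mat4 model;
out vec3 fragNormal;
out vec3 fragPosition;
void main()
{
    fragNormal   = mat3(model) * inNormal;   // assumes no non-uniform scaling
    fragPosition = vec3(model * vec4(inPosition, 1.0));
    gl_Position  = mvp * vec4(inPosition, 1.0);
}

// fragment shader: do the lighting per pixel
#version 330 core
in vec3 fragNormal;
in vec3 fragPosition;
uniform vec3 lightPosition;   // assumed world-space light position
uniform vec3 diffuseColor;    // assumed material color
out vec4 fragColor;
void main()
{
    vec3 N = normalize(fragNormal);   // re-normalize the interpolated normal
    vec3 L = normalize(lightPosition - fragPosition);
    float diffuse = max(dot(N, L), 0.0);
    fragColor = vec4(diffuseColor * diffuse, 1.0);
}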
I have a situation where I need to do lighting. I don't have a vertex shader, so I can't interpolate normals into my fragment shader. I also have no way to pass in a normal map. Can I generate normals entirely in the fragment shader, based, for example, on the fragment coordinates? The geometry is always planar in my case.
And to extend on what I am trying to do:
I am using the NV_path_rendering extension, which allows rendering pure vector graphics on the GPU. The problem is that only the fragment stage is accessible via a shader, which basically means I can't use a vertex shader with NV_path objects.
Since your shapes are flat and NV_path_rendering requires the compatibility profile, you can pass the normal through one of the built-in varyings, gl_Color or gl_SecondaryColor.
The extension description says that there is some kind of interpolation:
Interpolation of per-vertex data (section 3.6.1). Path primitives have neither conventional vertices nor per-vertex data. Instead fragments generate interpolated per-fragment colors, texture coordinate sets, and fog coordinates as a linear function of object-space or eye-space path coordinate's or using the current color, texture coordinate set, or fog coordinate state directly.
http://developer.download.nvidia.com/assets/gamedev/files/GL_NV_path_rendering.txt
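As a rough sketch of that idea (this assumes the compatibility profile that NV_path_rendering requires; packing the signed normal into a color with a 0.5/2.0 bias is just one possible encoding):

// CPU side (sketch): encode the flat normal into the current color before drawing the path,
// e.g. glColor3f(0.5 * n.x + 0.5, 0.5 * n.y + 0.5, 0.5 * n.z + 0.5);

// fragment shader, compatibility profile
#version 120
void main()
{
    vec3 N = normalize(gl_Color.rgb * 2.0 - 1.0);  // decode the normal from gl_Color
    // ... use N for lighting, then write the final color ...
    gl_FragColor = vec4(N * 0.5 + 0.5, 1.0);       // placeholder: visualize the normal
}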
Here's a method which "sets the normal as the face normal", without knowing anything about vertex normals (as I understand it).
https://stackoverflow.com/a/17532576/738675
I have a three.js demo working here:
http://meetar.github.io/three.js-normal-map-0/index6.html
My implementation is getting vertex position data from the vertex shader, but it sounds like you're able to get that through other means.
How to transfer data from vertex shader to fragment shader without changes?
I need to tell the pixels at the vertices that they have this color. I can obtain this color only in the vertex shader.
You have to use a varying, because each fragment is "influenced" by more than one vertex (unless you are rendering GL_POINTS), so the value has to be interpolated across the line/polygon. Recent versions of GLSL allow you to specify flat interpolation, which doesn't interpolate the value across the primitive, ignoring the values from the other vertices.
I suspect, though, that what you want to do is to render only the pixels corresponding to the vertices in a different color, is that correct? In that case it's not so easy; you would probably want to render the filled polygons first, and then re-render the mesh as GL_POINTS. At that point, varying variables are not interpolated, because each fragment is influenced by a single vertex.
Here's a good tutorial on GLSL: NeHe GLSL tutorial
If you want to share data between vertex and fragment shaders, use one of the built-in variables, for example gl_Color.
If you want to pass the color computed by the vertex shader through to the fragment shader, you would create a fragment shader with the following line: gl_FragColor = gl_Color;
gl_Color will be automatically set for you from the colors written by the vertex shader. You write a color from the vertex shader by setting one of the built-in variables, like gl_FrontColor, or one of its peers such as gl_BackColor.
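Putting those two pieces together, a minimal legacy (compatibility-profile) pair could look like this sketch:

// vertex shader (legacy GLSL): forward the per-vertex color
void main()
{
    gl_FrontColor = gl_Color;   // gl_Color here is the per-vertex color attribute
    gl_Position   = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// fragment shader (legacy GLSL): gl_Color is now the interpolated front color
void main()
{
    gl_FragColor = gl_Color;
}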
I've read some tutorials regarding Cg, yet one thing is not quite clear to me.
What exactly is the difference between vertex and fragment shaders?
And for what situations is one better suited than the other?
A fragment shader is the same thing as a pixel shader.
One main difference is that a vertex shader can manipulate the attributes of vertices, which are the corner points of your polygons.
The fragment shader, on the other hand, takes care of how the pixels between the vertices look. Their values are interpolated between the defined vertices following specific rules.
For example: if you want your polygon to be completely red, you would define all vertices red. If you want specific effects like a gradient between the vertices, you have to handle that in the fragment shader.
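For instance, a simple gradient comes from the default (smooth) interpolation of a per-vertex color; a minimal sketch, assuming the positions are already in clip space and the attribute names are illustrative:

// vertex shader: one color per vertex
#version 330 core
layout(location = 0) in vec3 inPosition;   // assumed to be in clip space already
layout(location = 1) in vec3 inColor;
out vec3 color;                            // smooth (default) interpolation gives the gradient
void main()
{
    color = inColor;
    gl_Position = vec4(inPosition, 1.0);
}

// fragment shader: receives the interpolated color for every covered pixel
#version 330 core
in vec3 color;
out vec4 fragColor;
void main()
{
    fragColor = vec4(color, 1.0);
}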
Put another way:
The vertex shader is part of the early steps in the graphics pipeline, somewhere between model coordinate transformation and polygon clipping I think. At that point, nothing is really done yet.
However, the fragment/pixel shader is part of the rasterization step, where the image is calculated and the pixels between the vertices are filled in or "coloured".
Just read about the graphics pipeline here and everything will reveal itself:
http://en.wikipedia.org/wiki/Graphics_pipeline
The vertex shader runs on every vertex, while the fragment shader runs on every fragment (roughly, every pixel the primitive covers). The fragment shader is applied after the vertex shader.
Nvidia Cg Tutorial:
Vertex transformation is the first processing stage in the graphics hardware pipeline. Vertex transformation performs a sequence of math operations on each vertex. These operations include transforming the vertex position into a screen position for use by the rasterizer, generating texture coordinates for texturing, and lighting the vertex to determine its color.
The results of rasterization are a set of pixel locations as well as a set of fragments. There is no relationship between the number of vertices a primitive has and the number of fragments that are generated when it is rasterized. For example, a triangle made up of just three vertices could take up the entire screen, and therefore generate millions of fragments!
Earlier, we told you to think of a fragment as a pixel if you did not know precisely what a fragment was. At this point, however, the distinction between a fragment and a pixel becomes important. The term pixel is short for "picture element." A pixel represents the contents of the frame buffer at a specific location, such as the color, depth, and any other values associated with that location. A fragment is the state required potentially to update a particular pixel.
The term "fragment" is used because rasterization breaks up each geometric primitive, such as a triangle, into pixel-sized fragments for each pixel that the primitive covers. A fragment has an associated pixel location, a depth value, and a set of interpolated parameters such as a color, a secondary (specular) color, and one or more texture coordinate sets. These various interpolated parameters are derived from the transformed vertices that make up the particular geometric primitive used to generate the fragments. You can think of a fragment as a "potential pixel." If a fragment passes the various rasterization tests (in the raster operations stage, which is described shortly), the fragment updates a pixel in the frame buffer.
Vertex shaders and fragment shaders are both features of 3-D implementations that do not use fixed-pipeline rendering. In any 3-D rendering, vertex shaders are applied before fragment/pixel shaders.
The vertex shader operates on each vertex. If you have a fixed polygon mesh and you want to deform it in a shader, you have to implement it in the vertex shader; i.e., any physical change to the appearance of vertices can be done in vertex shaders.
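As a sketch of that kind of per-vertex deformation (the sine-wave displacement and the uniform names are purely illustrative):

// vertex shader: displace a flat mesh with a simple sine wave
#version 330 core
layout(location = 0) in vec3 inPosition;
uniform mat4 mvp;
uniform float time;   // assumed animation time in seconds
void main()
{
    vec3 p = inPosition;
    p.y += 0.1 * sin(4.0 * p.x + time);   // the physical change happens here, per vertex
    gl_Position = mvp * vec4(p, 1.0);
}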
The fragment shader takes the output from the vertex shader and associates colors, the depth value of a pixel, etc. After these operations the fragment is sent to the framebuffer for display on the screen.
Some operations, such as lighting calculations, can be performed in the vertex shader as well as the fragment shader, but the fragment shader generally provides better results.
In rendering images via 3D hardware you typically have a mesh (points, polygons, lines); these are defined by vertices. To manipulate vertices individually, typically for motion in a model or waves in an ocean, you can use vertex shaders. These vertices can have a static color or a color assigned from textures; to manipulate vertex colors you use fragment shaders. At the end of the pipeline, when the view goes to the screen, you can also use fragment shaders.