Shading a Sphere using Gouraud Shading - opengl

I am trying to shade a sphere. I calculated the normals to each vertex of the sphere, but I don't understand how the other pixels on the facets will be shaded. Any help on this ? I am using OpenGL 3+.

For Gouraud shading the lighting model is computed (as a color) for each vertex of the triangle and then linearly interpolated over the triangle pixels.
In OpenGL you simply compute the lighting model for each vertex in the vertex shader, write the result as a color into a varying (an `out` variable) read by the fragment shader, and the linear interpolation is then done automatically "for free".
If you want Phong shading instead, you pass the vertex normal to the fragment shader (it is linearly interpolated in the same way), re-normalize it there, and compute the lighting model per fragment using the interpolated normal.
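A minimal Gouraud-shading sketch of the above. The attribute and uniform names (aPosition, aNormal, uMVP, uNormalMatrix, uLightDir, uDiffuse) are assumptions; wire them up to match your own program:

```glsl
// Vertex shader: lighting model evaluated per vertex, result is a color.
#version 330 core
in vec3 aPosition;
in vec3 aNormal;
uniform mat4 uMVP;
uniform mat3 uNormalMatrix;   // inverse-transpose of the model matrix
uniform vec3 uLightDir;       // normalized light direction, same space as the normal
uniform vec3 uDiffuse;        // material diffuse color
out vec3 vColor;              // interpolated across the triangle "for free"

void main() {
    vec3 n = normalize(uNormalMatrix * aNormal);
    float ndl = max(dot(n, uLightDir), 0.0);   // simple Lambertian term
    vColor = uDiffuse * ndl;
    gl_Position = uMVP * vec4(aPosition, 1.0);
}
```

```glsl
// Fragment shader: just output the interpolated color.
#version 330 core
in vec3 vColor;
out vec4 fragColor;

void main() {
    fragColor = vec4(vColor, 1.0);
}
```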

Related

Is what I'm generating the UV coordinates?

I don't usually use a flat surface in OpenGL, but recently I've been making After Effects plugins, and the SDK has a template called Glator which passes a VBO containing the UVs. However, I have learned from reading ShaderToy fragment shaders that you can pass the resolution of the billboard to the fragment shader as a uniform and do this:
vec2 p = gl_FragCoord.st / resolution.xy;
You can generate a value which is the UV coordinate of the fragment on the flat surface. Am I right?
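Essentially yes, for a full-screen quad: gl_FragCoord is in window coordinates, so dividing by the viewport resolution gives a 0..1 value per axis. A small sketch (the uniform name `resolution` is an assumption):

```glsl
#version 330 core
uniform vec2 resolution;   // viewport size in pixels
out vec4 fragColor;

void main() {
    // gl_FragCoord.xy holds window-space pixel coordinates (centers at .5),
    // so dividing by the resolution yields a value in (0, 1) on each axis.
    vec2 uv = gl_FragCoord.xy / resolution;
    fragColor = vec4(uv, 0.0, 1.0);   // visualize: red = u, green = v
}
```

Note this is a screen-space UV; it only matches the surface's texture coordinates when the quad exactly fills the viewport. For arbitrary geometry you still need per-vertex UVs.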

Lighting Primitives From Positions That Aren't Processed By Shaders

I have a shader which calculates diffuse lighting values. It consists of a vertex and a fragment shader that calculate the lighting intensity on a per-vertex basis. However, as expected, if I have a large GL_TRIANGLES primitive with a light positioned, say, just above its center, the light that should illuminate it does not appear, because the lighting values are smoothly interpolated across the surface of the triangle from the per-vertex calculations.
So my question is this- how can a primitive be lit by a light source at a position other than at one of its vertices?
When you compute lighting at the vertices, you're evaluating the Phong lighting model per vertex and then merely interpolating the resulting colors across all the pixels in the primitive — that's Gouraud shading. If you instead compute the lighting color at each pixel, you get Phong shading. Given your scenario, move the lighting computation from the vertex shader to the fragment shader and interpolate the normal across the primitive instead.
That is, make the vertex normal a varying variable, normalize it in the fragment shader, and then do your lighting computations there; you should get much better results.
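A sketch of that per-fragment version with a point light. The names (uModel, uViewProj, uNormalMatrix, uLightPos, uDiffuse) are assumptions:

```glsl
// Vertex shader: pass the normal and world position through instead of a color.
#version 330 core
in vec3 aPosition;
in vec3 aNormal;
uniform mat4 uModel;
uniform mat4 uViewProj;
uniform mat3 uNormalMatrix;
out vec3 vNormal;
out vec3 vWorldPos;

void main() {
    vWorldPos = vec3(uModel * vec4(aPosition, 1.0));
    vNormal = uNormalMatrix * aNormal;
    gl_Position = uViewProj * vec4(vWorldPos, 1.0);
}
```

```glsl
// Fragment shader: evaluate the lighting model per pixel.
#version 330 core
in vec3 vNormal;
in vec3 vWorldPos;
uniform vec3 uLightPos;   // the light hovering above the triangle's center
uniform vec3 uDiffuse;
out vec4 fragColor;

void main() {
    vec3 n = normalize(vNormal);               // re-normalize after interpolation
    vec3 l = normalize(uLightPos - vWorldPos); // per-fragment light direction
    fragColor = vec4(uDiffuse * max(dot(n, l), 0.0), 1.0);
}
```

Because the light direction is now recomputed at every fragment, a light above the triangle's interior produces a visible hotspot even though no vertex sits underneath it.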

Calculate normals for plane inside fragment shader

I have a situation where I need to do light shading. I don't have a vertex shader, so I can't interpolate normals into my fragment shader, and I have no way to pass in a normal map. Can I generate normals entirely in the fragment shader, based for example on the fragment coordinates? The geometry is always planar in my case.
And to extend on what I am trying to do:
I am using the NV_path_rendering extension which allows rendering pure vector graphics on GPU. The problem is that only the fragment stage is accessible via shader which basically means - I can't use a vertex shader with NV_Path objects.
Since your shapes are flat and NV_path_rendering requires the compatibility profile, you can pass the normal through one of the built-in varyings, gl_Color or gl_SecondaryColor.
Extension description says that there is some kind of interpolation:
Interpolation of per-vertex data (section 3.6.1). Path primitives have neither conventional vertices nor per-vertex data. Instead fragments generate interpolated per-fragment colors, texture coordinate sets, and fog coordinates as a linear function of object-space or eye-space path coordinate's or using the current color, texture coordinate set, or fog coordinate state directly.
http://developer.download.nvidia.com/assets/gamedev/files/GL_NV_path_rendering.txt
Here's a method which "sets the normal as the face normal", without knowing anything about vertex normals (as I understand it).
https://stackoverflow.com/a/17532576/738675
I have a three.js demo working here:
http://meetar.github.io/three.js-normal-map-0/index6.html
My implementation is getting vertex position data from the vertex shader, but it sounds like you're able to get that through other means.
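The technique behind that demo can be sketched in a few lines of fragment-shader GLSL. It reconstructs the face normal from screen-space derivatives, so no vertex normals are needed at all; the only assumption is that you have some interpolated surface position available (here called `p`):

```glsl
// Face normal from screen-space derivatives (requires GLSL with dFdx/dFdy,
// i.e. any modern fragment shader). dFdx(p) and dFdy(p) are the changes of
// the surface position across one pixel horizontally and vertically; their
// cross product is perpendicular to the surface at this fragment.
vec3 faceNormal(vec3 p) {
    return normalize(cross(dFdx(p), dFdy(p)));
}
```

For truly planar geometry this yields the same constant normal at every fragment, which is exactly the flat-shading result you want.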

Color interpolation across a polygon mesh

What is the best way to interpolate colors across a polygon mesh where all of the polygons have the same normal and considerable color differences? Is Using GLSL (with gouraud or phong shading) the right approach or should I take this elsewhere (on cpu side)? Or do I get this completely wrong?
ps: I'm using OpenGL 4.0+
I would like to interpolate colors on a mesh like this.
Based on your picture you need flat shading per triangle (face).
In your vertex buffer you can add a vertex color attribute. In fact it is going to be a color per vertex, so if you are using an interleaved vertex buffer array it may look as follows:
[
vertex1.x , vertex1.y , vertex1.z ,vertex1color.r ,vertex1color.g ,vertex1color.b,
vertex2.x , vertex2.y , vertex2.z ,vertex2color.r ,vertex2color.g ,vertex2color.b,
vertex3.x , vertex3.y , vertex3.z ,vertex3color.r ,vertex3color.g ,vertex3color.b,
..... the same for the rest of triangles ...
]
Then, in the vertex shader you set a varying for the output color, which is then used by the fragment shader to "paint" your fragments. Now, it's important that you declare that varying with the flat interpolation qualifier rather than the default smooth one, so that the colors aren't interpolated across the primitive.
Here is an example of how to do it. (If you are using GLSL 3.30 or later, substitute "varying" with in / out.)
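A minimal GLSL 3.30 sketch of the flat-qualified pair (attribute/uniform names are assumptions):

```glsl
// Vertex shader: 'flat' disables interpolation, so the whole triangle
// takes the color of the provoking vertex (by default, the last one).
#version 330 core
in vec3 aPosition;
in vec3 aColor;        // the per-vertex color from the interleaved buffer
uniform mat4 uMVP;
flat out vec3 vColor;

void main() {
    vColor = aColor;
    gl_Position = uMVP * vec4(aPosition, 1.0);
}
```

```glsl
// Fragment shader: the qualifier must match the vertex-shader output.
#version 330 core
flat in vec3 vColor;
out vec4 fragColor;

void main() {
    fragColor = vec4(vColor, 1.0);
}
```

With this layout, giving all three vertices of a triangle the same color (or relying on the provoking vertex) produces one uniform color per face.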

GLSL: vertex shader to fragment shader without varying

How to transfer data from vertex shader to fragment shader without changes?
I need to tell the pixels at the vertices that they have a certain color, and I can obtain this color only in the vertex shader.
You have to use a varying, because each fragment is "influenced" by more than one vertex (unless you are rendering GL_POINTS), so the values have to be interpolated across the line/polygon. Recent versions of GLSL let you specify flat interpolation, which doesn't interpolate the value across the primitive and ignores the values from all but the provoking vertex.
I suspect, though, that what you want to do is to render only the pixels corresponding to the vertices in a different color; is that correct? In that case it's not so easy: you would probably want to render the filled polygons first, and then re-render the same geometry as GL_POINTS. At that point, varying variables are not interpolated, because each fragment is influenced by a single vertex.
Here's a good tutorial on GLSL: NeHe GLSL tutorial
If you want to share data between vertex and fragment shaders use one of the built in types, for example gl_Color
If you want to pass the color computed by the vertex shader through the fragment shader unchanged, you would create a fragment shader with the following line: gl_FragColor = gl_Color;
gl_Color will be automatically set for you from the colors written by the vertex shader. You write a color from the vertex shader by setting one of the built-in variables, like gl_FrontColor, or one of its peers: gl_BackColor etc.
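A minimal pass-through using those built-ins might look like this. Note that gl_FrontColor, gl_Color, gl_FragColor, and ftransform() are all compatibility-profile features, deprecated in the core profile:

```glsl
// Vertex shader (compatibility profile): forward the incoming vertex color.
void main() {
    gl_FrontColor = gl_Color;        // here gl_Color is the vertex attribute
    gl_Position = ftransform();      // fixed-function transform equivalent
}
```

```glsl
// Fragment shader: gl_Color here receives the (interpolated) gl_FrontColor.
void main() {
    gl_FragColor = gl_Color;
}
```

In a modern core-profile program you would replace this pair with explicit in/out variables, as described in the earlier answers.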