Drawing a smooth circle - OpenGL

I'm using OpenTK (C#) but OpenGL suggestions are welcome too.
I have a point list generated by iterating one degree per point around the center point, which means there are 361 points including the center point. The point list could be built differently with a different approach; that's fine. I can draw the circle with the simple vertex and fragment shaders below. How can I change the fragment and/or vertex shaders to get a smooth circle?
Vertex shader:
#version 330
in vec3 vPosition;
in vec4 vColor;
out vec4 color;
out vec4 fPosition;
uniform mat4 modelview;
void main()
{
fPosition = modelview * vec4(vPosition, 1.0);
gl_Position = fPosition;
color = vColor;
}
Fragment shader:
#version 330
in vec4 color;
in vec4 fPosition;
out vec4 outputColor;
void main()
{
outputColor = color;
}
C# code:
GL.DrawArrays(PrimitiveType.TriangleFan, 0, points.Length);

Hello, what do you actually see? Post a screenshot. Anyway, for smooth edges we have what's called anti-aliasing.
Use this line when creating your glControl to enable it:
glControl = new GLControl(new OpenTK.Graphics.GraphicsMode(32, 24, 0, 8));
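If you would rather handle it in the shaders, as the question asks, a common alternative is to draw a single quad (two triangles) covering the circle and compute the circle in the fragment shader, fading the edge with smoothstep. The sketch below only illustrates that idea and is not the original code: the localPos varying and the assumption that the quad spans (-1,-1)..(1,1) in circle-local units are mine, and alpha blending must be enabled for the fade to show.
Vertex shader sketch:
#version 330
in vec3 vPosition;
in vec4 vColor;
out vec4 color;
out vec2 localPos; // assumed: quad corner in circle-local units, circle edge at length 1.0
uniform mat4 modelview;
void main()
{
    localPos = vPosition.xy;
    gl_Position = modelview * vec4(vPosition, 1.0);
    color = vColor;
}
Fragment shader sketch:
#version 330
in vec4 color;
in vec2 localPos;
out vec4 outputColor;
void main()
{
    float dist = length(localPos);          // 1.0 is the circle edge
    float edge = fwidth(dist);              // roughly one pixel, in local units
    float alpha = 1.0 - smoothstep(1.0 - edge, 1.0, dist);
    outputColor = vec4(color.rgb, color.a * alpha);
}
This keeps the edge about one pixel soft at any zoom level, so the 361-point fan is no longer needed.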

Related

GLSL for sprite sheet

I have a shader that renders a simple texture on the screen. But consider that I have a sprite sheet (multiple frames of a sprite) and I want to render just a small portion of that sheet (one frame); how would I do that using GLSL?
I do not want to manipulate the vertex buffer since that would become extremely costly.
My vertex shader:
#version 330 core
layout (location = 0) in vec4 vertex; // <vec2 position, vec2 texCoords>
out vec2 TexCoords;
uniform mat4 model;
uniform mat4 projection;
void main()
{
TexCoords = vertex.zw;
gl_Position = projection * model * vec4(vertex.xy, 0.0, 1.0);
}
My fragment shader:
#version 330 core
in vec2 TexCoords;
out vec4 FragColor;
uniform sampler2D texture;
uniform vec4 color;
void main()
{
FragColor = texture(texture, TexCoords) * color;
}
Here is how I calculate where along the X axis the current animation frame is in the texture:
m_animationData.x = (m_animationData.width * m_currentFrame);
So if I have a 128x32 sprite sheet and each sprite is 32x32, that gives 4 frames of animation. Each frame would then have an X of:
frame 0: 0
frame 1: 32
frame 2: 64
frame 3: 96
All that is missing is how to tell the shader to render just that portion of the texture.
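One way to do this without touching the vertex buffer is to keep the quad's texture coordinates in the 0..1 range and remap them in the shader using a per-frame offset and scale passed as uniforms. This is only a sketch of that idea: the frameOffset and frameScale uniforms are illustrative names, not part of the original code, and they are expressed in normalized texture coordinates (so the pixel X values 0, 32, 64, 96 above would be divided by the sheet width, 128).
Vertex shader sketch:
#version 330 core
layout (location = 0) in vec4 vertex; // <vec2 position, vec2 texCoords>
out vec2 TexCoords;
uniform mat4 model;
uniform mat4 projection;
uniform vec2 frameOffset; // e.g. (32.0/128.0, 0.0) for frame 1
uniform vec2 frameScale;  // e.g. (32.0/128.0, 32.0/32.0) = (0.25, 1.0)
void main()
{
    // remap the quad's 0..1 texcoords into the frame's sub-rectangle
    TexCoords = frameOffset + vertex.zw * frameScale;
    gl_Position = projection * model * vec4(vertex.xy, 0.0, 1.0);
}
The fragment shader can stay exactly as it is; each frame change then only requires updating the frameOffset uniform.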

What should be the vertex and fragment shaders to draw a cube with texture in version 440 core

I am following "Interactive Computer Graphics" by Ed Angel, specifically the code for rotating cube with texture. The vertex shader for this (as in the book) is as follows:
#version 150
in vec4 vPosition;
in vec4 vColor;
in vec2 vTexCoord;
out vec4 color;
out vec2 texCoord;
uniform vec3 theta;
void main()
{
........
........
// code for rotation
........
color = vColor;
texCoord = vTexCoord;
gl_Position = rz * ry * rx * vPosition;
}
And the fragment shader code is:
#version 150
in vec4 color;
in vec2 texCoord;
out vec4 fColor;
uniform sampler2D texture;
void main()
{
fColor = color * texture2D( texture, texCoord );
}
The link to the complete code is here; just look at the code for example 8.
I am trying to implement this using #version 440 core. When I run this, I get only a black cube. The texture is not shown.
What change to the above code should I make to display the texture correctly?
This line in the cpp code:
glutInitContextVersion( 3, 2 );
is your problem. GLSL version 440 requires an OpenGL 4.4 context. Change it to:
glutInitContextVersion( 4, 4 );
And modernise your shader code according to:
GLSL Specification
And it should work fine.
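For reference, a sketch of what the modernised fragment shader might look like under #version 440 core; the main changes are the version directive, using texture() instead of the deprecated texture2D(), and renaming the sampler so it no longer shadows the built-in texture() function (the new name tex is mine, so the matching uniform lookup in the C++ code would need updating too):
#version 440 core
in vec4 color;
in vec2 texCoord;
out vec4 fColor;
uniform sampler2D tex;
void main()
{
    fColor = color * texture(tex, texCoord);
}
The vertex shader mostly just needs its version directive bumped to match.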

opengl glsl bug in which model goes invisible if i use glsl texture function with different parameters

I want to replicate a game. The goal of the game is to create a path between any 2 squares that have the same color.
Here is the game: www.mypuzzle.org/3d-logic-2
The cube has 6 faces. Each face has 3x3 squares.
The cube has different square types: empty squares (which reflect the environment), wall squares (which you can't color), and start/finish squares (which have a black square in the middle but are otherwise colored).
I'm close to finishing my project but I'm stuck on a bug. I used C++, SFML, OpenGL, and GLM.
The problem is in the shaders.
Vertex shader:
#version 330 core
layout (location = 0) in vec3 vPosition;
layout (location = 1) in vec3 vColor;
layout (location = 2) in vec2 vTexCoord;
layout (location = 3) in float vType;
layout (location = 4) in vec3 vNormal;
out vec3 Position;
out vec3 Color;
out vec2 TexCoord;
out float Type;
out vec3 Normal;
uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;
void main()
{
gl_Position = projection * view * model * vec4(vPosition, 1.0f);
Position = vec3(model * vec4(vPosition, 1.0f));
Color=vColor;
TexCoord=vTexCoord;
Type=vType;
Normal = mat3(transpose(inverse(model))) * vNormal;
}
Fragment shader:
#version 330 core
in vec3 Color;
in vec3 Normal;
in vec3 Position;
in vec2 TexCoord;
in float Type;
out vec4 color;
uniform samplerCube skyboxTexture;
uniform sampler2D faceTexture;
uniform sampler2D centerTexture;
void main()
{
color=vec4(0.0,0.0,0.0,1.0);
if(Type==0.0)
{
vec3 I = normalize(Position);
vec3 R = reflect(I, normalize(Normal));
if(texture(faceTexture, TexCoord)==vec4(1.0,1.0,1.0,1.0))
color=mix(texture(skyboxTexture, R),vec4(1.0,1.0,1.0,1.0),0.3);
}
else if(Type==1.0)
{
if(texture(centerTexture, TexCoord)==vec4(1.0,1.0,1.0,1.0))
color=vec4(Color,1.0);
}
else if(Type==-1.0)
{
color=vec4(0.0,0.0,0.0,1.0);
}
else if(Type==2.0)
{
if(texture(faceTexture, TexCoord)==vec4(1.0,1.0,1.0,1.0))
color=mix(vec4(Color,1.0),vec4(1.0,1.0,1.0,1.0),0.5);
}
}
/*
Type== 0 ---> blank square(reflects light)
Type== 1 ---> start/finish square
Type==-1 ---> wall square
Type== 2 ---> colored square that was once a black square
*/
In the fragment shader I draw the pixels of a square that has a certain type, so the shader only enters one of the four ifs for each square. The program works fine if I only use the GLSL texture function with the same texture. If I use this function twice with different textures, in two different ifs, my model goes invisible. Why is that happening?
https://postimg.org/image/lximpl0bz/
https://postimg.org/image/5dzvqz2r7/
The red square is of type 1. I modified the code in the Type==0 branch and then my model went invisible.
Texture samplers in OpenGL should only be accessed in (at least) dynamically uniform control flow. This basically means that all invocations of a shader execute the same code path. If this is not the case, then no automatic gradients are available, and mipmapping or anisotropic filtering will fail.
In your program this problem happens exactly when you try to use multiple textures. One solution might be not to use anything that requires gradients. There are also a number of other options, for example, patching all textures together in a texture atlas and just selecting the appropriate uv-coordinates in the shader or drawing each quad separately and providing the type through a uniform variable.
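Another common workaround for this particular shader (a sketch based on the question's code, not something quoted from the answer above) is to move the texture fetches themselves out of the branches: sample everything unconditionally, which keeps the sampling in uniform control flow with well-defined gradients, and only branch on the already-fetched values.
#version 330 core
in vec3 Color;
in vec3 Normal;
in vec3 Position;
in vec2 TexCoord;
in float Type;
out vec4 color;
uniform samplerCube skyboxTexture;
uniform sampler2D faceTexture;
uniform sampler2D centerTexture;
void main()
{
    // fetch everything up front, outside any non-uniform branch
    vec3 I = normalize(Position);
    vec3 R = reflect(I, normalize(Normal));
    vec4 faceTexel   = texture(faceTexture, TexCoord);
    vec4 centerTexel = texture(centerTexture, TexCoord);
    vec4 skyboxTexel = texture(skyboxTexture, R);
    color = vec4(0.0, 0.0, 0.0, 1.0); // default: wall square (Type == -1.0)
    if (Type == 0.0)
    {
        if (faceTexel == vec4(1.0))
            color = mix(skyboxTexel, vec4(1.0), 0.3);
    }
    else if (Type == 1.0)
    {
        if (centerTexel == vec4(1.0))
            color = vec4(Color, 1.0);
    }
    else if (Type == 2.0)
    {
        if (faceTexel == vec4(1.0))
            color = mix(vec4(Color, 1.0), vec4(1.0), 0.5);
    }
}
The price is that every fragment pays for all three fetches, which is usually acceptable for a scene this small.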

Simple GLSL Shader (Light) causes flickering

I'm trying to implement some basic lighting and shading following the tutorial over here and here.
Everything is more or less working but I get some kind of strange flickering on object surfaces due to the shading.
I have two images attached to show you guys how this problem looks.
I think the problem is related to the fact that I'm passing vertex coordinates from vertex shader to fragment shader to compute some lighting variables as stated in the above linked tutorials.
Here is some source code (with unrelated code stripped out).
Vertex Shader:
#version 150 core
in vec4 pos;
in vec4 in_col;
in vec2 in_uv;
in vec4 in_norm;
uniform mat4 model_view_projection;
out vec4 out_col;
out vec2 passed_uv;
out vec4 out_vert;
out vec4 out_norm;
void main(void) {
gl_Position = model_view_projection * pos;
out_col = in_col;
out_vert = pos;
out_norm = in_norm;
passed_uv = in_uv;
}
and Fragment Shader:
#version 150 core
uniform sampler2D tex;
uniform mat4 model_mat;
in vec4 in_col;
in vec2 passed_uv;
in vec4 vert_pos;
in vec4 in_norm;
out vec4 col;
void main(void) {
mat3 norm_mat = mat3(transpose(inverse(model_mat)));
vec3 norm = normalize(norm_mat * vec3(in_norm));
vec3 light_pos = vec3(0.0, 6.0, 0.0);
vec4 light_col = vec4(1.0, 0.8, 0.8, 1.0);
vec3 col_pos = vec3(model_mat * vert_pos);
vec3 s_to_f = light_pos - col_pos;
float brightness = dot(norm, normalize(s_to_f));
brightness = clamp(brightness, 0, 1);
gl_FragColor = out_col;
gl_FragColor = vec4(brightness * light_col.rgb * gl_FragColor.rgb, 1.0);
}
As I said earlier, I guess the problem has to do with the way the vertex position is passed to the fragment shader. If I change the position values to something static, no more flickering occurs.
I changed all other values to static ones, too. The result is the same: no flickering as long as I am not using the vertex position data passed from the vertex shader.
So, if there is someone out there with some GL-wisdom .. ;)
Any help would be appreciated.
Side note: I'm running all this on an Intel HD 4000, in case that provides further information.
Thanks in advance!
Ivan
The names of the out variables in the vertex shader and the in variables in the fragment shader need to match. You have this in the vertex shader:
out vec4 out_col;
out vec2 passed_uv;
out vec4 out_vert;
out vec4 out_norm;
and this in the fragment shader:
in vec4 in_col;
in vec2 passed_uv;
in vec4 vert_pos;
in vec4 in_norm;
These variables are associated by name, not by order. Except for passed_uv, the names do not match here. For example, you could use these declarations in the vertex shader:
out vec4 passed_col;
out vec2 passed_uv;
out vec4 passed_vert;
out vec4 passed_norm;
and these in the fragment shader:
in vec4 passed_col;
in vec2 passed_uv;
in vec4 passed_vert;
in vec4 passed_norm;
Based on the way I read the spec, your shader program should actually fail to link. At least in the GLSL 4.50 spec, in the table on page 43, it lists "Link-Time Error" for this situation. The rules seem somewhat ambiguous in earlier specs, though.
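Putting the fix together, here is a sketch of the question's fragment shader rewritten with the matched passed_* names (it also writes to the declared col output rather than gl_FragColor, since the shader already declares col but never uses it; the lighting logic itself is unchanged):
#version 150 core
uniform sampler2D tex;
uniform mat4 model_mat;
in vec4 passed_col;
in vec2 passed_uv;
in vec4 passed_vert;
in vec4 passed_norm;
out vec4 col;
void main(void) {
    mat3 norm_mat = mat3(transpose(inverse(model_mat)));
    vec3 norm = normalize(norm_mat * vec3(passed_norm));
    vec3 light_pos = vec3(0.0, 6.0, 0.0);
    vec4 light_col = vec4(1.0, 0.8, 0.8, 1.0);
    vec3 frag_pos = vec3(model_mat * passed_vert);
    vec3 s_to_f = light_pos - frag_pos;
    float brightness = clamp(dot(norm, normalize(s_to_f)), 0.0, 1.0);
    col = vec4(brightness * light_col.rgb * passed_col.rgb, 1.0);
}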

Why does my OpenGL Phong shader behave like a flat shader?

I've been learning OpenGL for the past couple of weeks and I've run into some trouble implementing a Phong shader. It appears to do no interpolation between vertices despite my use of the smooth qualifier. Am I missing something here? To give credit where credit is due, the code for the vertex and fragment shaders cribs heavily from the OpenGL SuperBible, Fifth Edition. I would highly recommend this book!
Vertex Shader:
#version 330
in vec4 vVertex;
in vec3 vNormal;
uniform mat4 mvpMatrix; // mvp = ModelViewProjection
uniform mat4 mvMatrix; // mv = ModelView
uniform mat3 normalMatrix;
uniform vec3 vLightPosition;
smooth out vec3 vVaryingNormal;
smooth out vec3 vVaryingLightDir;
void main(void) {
vVaryingNormal = normalMatrix * vNormal;
vec4 vPosition4 = mvMatrix * vVertex;
vec3 vPosition3 = vPosition4.xyz / vPosition4.w;
vVaryingLightDir = normalize(vLightPosition - vPosition3);
gl_Position = mvpMatrix * vVertex;
}
Fragment Shader:
#version 330
out vec4 vFragColor;
uniform vec4 ambientColor;
uniform vec4 diffuseColor;
uniform vec4 specularColor;
smooth in vec3 vVaryingNormal;
smooth in vec3 vVaryingLightDir;
void main(void) {
float diff = max(0.0, dot(normalize(vVaryingNormal), normalize(vVaryingLightDir)));
vFragColor = diff * diffuseColor;
vFragColor += ambientColor;
vec3 vReflection = normalize(reflect(-normalize(vVaryingLightDir),normalize(vVaryingNormal)));
float spec = max(0.0, dot(normalize(vVaryingNormal), vReflection));
if(diff != 0) {
float fSpec = pow(spec, 32.0);
vFragColor.rgb += vec3(fSpec, fSpec, fSpec);
}
}
This (public domain) image from Wikipedia shows exactly what sort of image I'm getting and what I'm aiming for -- I'm getting the "flat" image but I want the "Phong" image.
Any help would be greatly appreciated. Thank you!
edit: If it makes a difference, I'm using PyOpenGL 3.0.1 and Python 2.6.
edit2:
Solution
It turns out the problem was with my geometry; Kos was correct. For anyone else who's having this problem with Blender models, Kos pointed out that doing Edit->Faces->Set Smooth does the trick. I found that Wings 3D worked "out of the box."
As an addition to this answer, here is a simple geometry shader which will let you visualize your normals. Modify the accompanying vertex shader as needed based on your attribute locations and how you send your matrices.
But first, a picture of a giant bunny head from our friend the Stanford bunny as an example of the result!
Major warning: do note that I get away with transforming the normals with the modelview matrix instead of a proper normal matrix. This won't work correctly if your modelview contains non-uniform scaling. Also, the length of your normals won't be correct, but that matters little if you just want to check their direction.
Vertex shader:
#version 330
layout(location = 0) in vec4 position;
layout(location = 1) in vec4 normal;
layout(location = 2) in mat4 mv;
out Data
{
vec4 position;
vec4 normal;
vec4 color;
mat4 mvp;
} vdata;
uniform mat4 projection;
void main()
{
vdata.mvp = projection * mv;
vdata.position = position;
vdata.normal = normal;
}
Geometry shader:
#version 330
layout(triangles) in;
layout(line_strip, max_vertices = 6) out;
in Data
{
vec4 position;
vec4 normal;
vec4 color;
mat4 mvp;
} vdata[3];
out Data
{
vec4 color;
} gdata;
void main()
{
const vec4 green = vec4(0.0f, 1.0f, 0.0f, 1.0f);
const vec4 blue = vec4(0.0f, 0.0f, 1.0f, 1.0f);
for (int i = 0; i < 3; i++)
{
gl_Position = vdata[i].mvp * vdata[i].position;
gdata.color = green;
EmitVertex();
gl_Position = vdata[i].mvp * (vdata[i].position + vdata[i].normal);
gdata.color = blue;
EmitVertex();
EndPrimitive();
}
}
Fragment shader:
#version 330
in Data
{
vec4 color;
} gdata;
out vec4 outputColor;
void main()
{
outputColor = gdata.color;
}
Hmm... You're interpolating the normal as a varying variable, so the fragment shader should receive the correct per-pixel normal.
The only explanation I can think of for getting the result shown in your left image is that every fragment on a given face ultimately receives the same normal. You can confirm it with a fragment shader like:
void main() {
vFragColor = vec4(normalize(vVaryingNormal), 1.0);
}
If that's the case, the question remains: why? The vertex shader looks OK.
So maybe there's something wrong with your geometry? What data do you send to the shader? Are you sure you have correctly calculated per-vertex normals, and not just per-face normals?
The orange lines are normals of the diagonal face, the red lines are normals of the horizontal face.
If your data looks like the above image, then even with a correct shader you'll get flat shading. Make sure that you have correct per-vertex normals like on the lower image. (They are really simple to calculate for a sphere.)
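As a concrete illustration of that last point (a sketch of my own, not code from the answer): for a sphere centered at the object-space origin, the per-vertex normal is simply the normalized vertex position, so the vertex shader from the question could compute it directly instead of reading a vNormal attribute.
#version 330
in vec4 vVertex;
uniform mat4 mvpMatrix; // mvp = ModelViewProjection
uniform mat4 mvMatrix;  // mv = ModelView
uniform mat3 normalMatrix;
uniform vec3 vLightPosition;
smooth out vec3 vVaryingNormal;
smooth out vec3 vVaryingLightDir;
void main(void) {
    // for a sphere centered at the origin, the outward normal at a vertex
    // is just the normalized object-space position
    vVaryingNormal = normalMatrix * normalize(vVertex.xyz);
    vec4 vPosition4 = mvMatrix * vVertex;
    vec3 vPosition3 = vPosition4.xyz / vPosition4.w;
    vVaryingLightDir = normalize(vLightPosition - vPosition3);
    gl_Position = mvpMatrix * vVertex;
}
For an arbitrary Blender model, though, the per-vertex normals come from the exporter, which is exactly why Set Smooth fixes it.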