2D white grid not displaying above background - C++

I'm trying to create a 2D hollow grid on top of a purple background; however, whenever I create the grid, all that is displayed is a white window.
I created the 2D grid using GL_LINE, since I only want the borders of each cell to be colored white and not the inside of the grid, but that is not what happens.
#include <stdio.h>
#include <stdlib.h>
#include <ctime>
#include <cmath>
#include <string.h>
#include<GL/glut.h>
int gridX = 1000;
int gridY = 600;
void drawGrid();
void drawUnit(int, int);
void drawGrid() {
for (int x = 0; x < gridX; x++) {
for (int y = 0; y < gridY; y++) {
drawUnit(x, y);
}
}
}
void drawUnit(int x, int y) {
glLineWidth(1.0);
glColor3f(1.0,1.0,1.0);
glBegin(GL_LINE);//(x,y)(x+1,y)(x+1,y+1)(x,y+1)
glVertex2f(x,y);
glVertex2f(x+1, y);
glVertex2f(x + 1, y);
glVertex2f(x+1, y+1);
glVertex2f(x + 1, y + 1);
glVertex2f(x, y+1);
glVertex2f(x, y + 1);
glVertex2f(x, y);
glEnd();
}
void Display() {
glClear(GL_COLOR_BUFFER_BIT);
drawGrid();
glFlush();
}
void main(int argc, char** argr) {
glutInit(&argc, argr);
glutInitWindowSize(gridX, gridY);
drawGrid();
glutCreateWindow("OpenGL - 2D Template");
glutDisplayFunc(Display);
glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
glClearColor(120.0f / 255.0f, 92.0f / 255.0f, 166.0f / 255.0f, 0.0f);
gluOrtho2D(0.0, gridX, 0.0, gridY);
glutMainLoop();
}

GL_LINE is not an OpenGL primitive type. But GL_LINES is a line primitive type (see Line primitives):
glBegin(GL_LINE);   // what the code does - not a primitive type
glBegin(GL_LINES);  // the line primitive type
GL_LINE is a polygon rasterization mode (see glPolygonMode).
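For completeness, GL_LINE is used together with glPolygonMode, where it switches polygon rasterization to outlines. A minimal sketch of that use (not something the grid itself needs):

glPolygonMode(GL_FRONT_AND_BACK, GL_LINE); // rasterize polygons as outlines (wireframe)
glBegin(GL_QUADS);                         // this quad is drawn as its border only
glVertex2f(0.0f, 0.0f);
glVertex2f(10.0f, 0.0f);
glVertex2f(10.0f, 10.0f);
glVertex2f(0.0f, 10.0f);
glEnd();
glPolygonMode(GL_FRONT_AND_BACK, GL_FILL); // restore filled rasterization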
One cell in your grid is only 1 pixel in size, so the grid lines cover every pixel and the entire screen ends up white. Use a larger size for the cells. For instance:
void drawGrid()
{
    int size = 10;
    for (int x = 0; x < gridX; x += size)
    {
        for (int y = 0; y < gridY; y += size)
        {
            drawUnit(x, y, size);
        }
    }
}
void drawUnit(int x, int y, int size)
{
    glLineWidth(1.0);
    glColor3f(1.0, 1.0, 1.0);
    glBegin(GL_LINES);
    glVertex2f(x, y);
    glVertex2f(x + size, y);
    glVertex2f(x + size, y);
    glVertex2f(x + size, y + size);
    glVertex2f(x + size, y + size);
    glVertex2f(x, y + size);
    glVertex2f(x, y + size);
    glVertex2f(x, y);
    glEnd();
}
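As a side note, the same cell outline can also be drawn with GL_LINE_LOOP, which closes the loop automatically, so each cell only needs four vertices instead of eight. A possible variant of drawUnit:

void drawUnit(int x, int y, int size)
{
    glLineWidth(1.0);
    glColor3f(1.0, 1.0, 1.0);
    glBegin(GL_LINE_LOOP); // four corners; the closing edge back to the first corner is implicit
    glVertex2f(x, y);
    glVertex2f(x + size, y);
    glVertex2f(x + size, y + size);
    glVertex2f(x, y + size);
    glEnd();
}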

Related

draw an arbitrary line with OpenGL (i.e. no limit on axis range)

I want to draw a 2D curve with parameters defined by the user, but the range of the x and y axes is [-1, 1].
How can I draw the curve so that it is completely visible in the window? I used gluOrtho2D(-10.0, 10.0, -10.0, 10.0), but that doesn't seem like a good choice because the required range changes with the parameters.
For example, the curve is y = ax^3 + bx^2 + cx + d and the range of x is [1, 100].
My code is:
#include "pch.h"
#include<windows.h>
#include <gl/glut.h>
#include <iostream>
using namespace std;
double a, b, c, d;
void init(void)
{
glClear(GL_COLOR_BUFFER_BIT);
glMatrixMode(GL_PROJECTION);
gluOrtho2D(-10.0, 10.0, -10.0, 10.0);
}
double power(double x, int p) {
double y = 1.0;
for (int i = 0; i < p; ++i) {
y *= x;
}
return y;
}
void linesegment(void)
{
glClear(GL_COLOR_BUFFER_BIT);
glColor3f(1, 0, 0);
glPointSize(1);
glBegin(GL_POINTS);
for (int i = 1; i <= 10; ++i) {
double y = a * power(i, 3) + b * power(i, 2) + c * i + d;
glVertex2f(i, y);
}
glEnd();
glFlush();
}
int main(int argc, char**argv)
{
if (argc < 4) {
cout << "should input 4 numbers" << endl;
return 0;
}
a = atof(argv[1]);
b = atof(argv[2]);
c = atof(argv[3]);
d = atof(argv[4]);
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
glutInitWindowPosition(50, 100);
glutInitWindowSize(400, 300);
glutCreateWindow("AnExample");
init();
glutDisplayFunc(linesegment);
glutMainLoop();
return 0;
}
Setting the projection matrix is not a one-time-only operation; you can change it anytime you like. As a matter of fact, the way you do it in init is strongly discouraged: just set the projection parameters in your drawing function. Also, use standard library functions instead of rolling your own; there is no need to implement power yourself, just use the pow standard library function. Last but not least, use double buffering: it gives better performance and has better compatibility.
#include "pch.h"
#include <windows.h>
#include <gl/glut.h>
#include <iostream>
#include <cmath>
using namespace std;
double a, b, c, d;
double x_min, x_max, y_min, y_max; // <<<<---- fill these per your needs
void linesegment(void)
{
glClear(GL_COLOR_BUFFER_BIT);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(x_min, x_max, y_min, y_max, -1, 1);
glColor3f(1, 0, 0);
glPointSize(1);
glBegin(GL_POINTS);
for (int i = 1; i <= 10; ++i) {
double y = a * pow(i, 3) + b * pow(i, 2) + c * i + d;
glVertex2f(i, y);
}
glEnd();
glFlush();
}
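One way to fill in x_min, x_max, y_min and y_max is to scan the polynomial over the intended x range before drawing, so the projection always contains the curve. A sketch, assuming the x range [1, 100] from the question and a hypothetical helper named computeBounds:

void computeBounds()
{
    x_min = 1.0;   // example x range taken from the question
    x_max = 100.0;
    y_min = y_max = a * pow(x_min, 3) + b * pow(x_min, 2) + c * x_min + d;
    for (double x = x_min; x <= x_max; x += 1.0) {
        double y = a * pow(x, 3) + b * pow(x, 2) + c * x + d;
        if (y < y_min) y_min = y; // track the extremes of the curve
        if (y > y_max) y_max = y;
    }
}

Call it once after reading a, b, c and d (and again whenever they change). If you also switch to double buffering, the display mode becomes GLUT_DOUBLE | GLUT_RGB and the final glFlush() is replaced by glutSwapBuffers().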

OpenGL - Draw a Triangle wherever I click?

I have an OpenGL project with a screen size of 1000x800, and I want to be able to draw a triangle when I click the left mouse button.
I already have a mouse function set up that works fine:
struct point
{
int x;
int y;
};
std::vector <point> dots;
point OneDot;
void processMouse(int button, int state, int x, int y)
{
if ((button == GLUT_LEFT_BUTTON) && (state == GLUT_DOWN))
{
int yy;
yy = glutGet(GLUT_WINDOW_HEIGHT);
y = yy - y; /* In Glut, Y coordinate increases from top to bottom */
OneDot.x = x;
OneDot.y = y;
dots.push_back(OneDot);
}
}
void display(){
for (int i = 0; i < dots.size();i++){
glPointSize(10)
glBegin(GL_POINTS);
glVertex2i(dots[i].x, dots[i].y);
}
}
So in my display() function, how can I add some code that draws a triangle at the mouse location when I click?
UPDATE:
Here's my current drawCircle function:
void drawCircle(float cx, float cy, float r, float num_segments) {
// Sets variables for X, Y, Radius and Segments
glColor3f(1.0, 0.0, 0.0); // Red
glBegin(GL_POLYGON);
// To set 0 as origin point
for (int i = 0; i < num_segments; i++) {
float theta = 2.0f * 3.14 * i / num_segments;
float x = r * cosf(theta);
float y = r * sinf(theta);
glVertex2f(x + cx, y + cy);
}
glEnd();
}
Is there a way I could call this function to draw at the mouse location when I left-click?
In your code you're almost there; there are just two things to add. First, in your mouse event handler you have to set the flag that tells your window manager (in your case GLUT) to refresh the display; that function is glutPostRedisplay.
Then in your display function you have to push 3 vertices instead of 1 and change the primitive type to GL_TRIANGLES. Like this:
void processMouse(int button, int state, int x, int y)
{
    if ((button == GLUT_LEFT_BUTTON) && (state == GLUT_DOWN))
    {
        int yy;
        yy = glutGet(GLUT_WINDOW_HEIGHT);
        y = yy - y; /* In Glut, Y coordinate increases from top to bottom */
        OneDot.x = x;
        OneDot.y = y;
        dots.push_back(OneDot);
        glutPostRedisplay(); ///<<<<<<<<<<
    }
}
and
void display(){
    glBegin(GL_TRIANGLES);
    for (int i = 0; i < dots.size(); i++){
        glVertex2i(dots[i].x - 3, dots[i].y - 5);
        glVertex2i(dots[i].x + 3, dots[i].y - 5);
        glVertex2i(dots[i].x,     dots[i].y + 5);
    }
    glEnd();
}
or, if you want the clicks themselves to define the corners of the triangle(s), just
void display(){
    glBegin(GL_TRIANGLES);
    for (int i = 0; i < dots.size(); i++){
        glVertex2i(dots[i].x, dots[i].y);
    }
    glEnd();
}
You don't really have to check whether dots.size() >= 3, because with GL_TRIANGLES OpenGL simply ignores an incomplete triangle and draws nothing for it.
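Regarding the update: the drawCircle function from the question can be reused with exactly the same pattern, by looping over the recorded clicks in display and drawing one shape per click. A sketch, with a hypothetical radius of 10 and 20 segments:

void display() {
    glClear(GL_COLOR_BUFFER_BIT);
    for (int i = 0; i < dots.size(); i++) {
        // one small filled circle (or any other shape) centered on each click
        drawCircle(dots[i].x, dots[i].y, 10.0f, 20);
    }
    glFlush();
}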
I don't have much experience with the glBegin notation, but it looks like you are missing glEnd.
It should be something like this:
void display(){
    glPointSize(10);
    glBegin(GL_POINTS);
    for (int i = 0; i < dots.size(); i++){
        glVertex2i(dots[i].x, dots[i].y);
    }
    glEnd();
}

How to draw a Bezier curve with C++ in OpenGL using floating point values

I am trying to draw a Bezier curve in OpenGL using floating point values. I have tried many different code examples. My current code below runs, but does not show the curve on screen. The usual way to draw Bezier curves is with integer values, which means using the gluOrtho2D() function to set up an integer coordinate range. But I want to draw the curve using floating point values, such as an x range of (-1, 1) and a y range of (-1, 1).
For example, if the window is 500 wide, that width should map to the x range (-1, 1), and if it is 800 high, that should map to the y range (-1, 1).
I have already tried using integer values and it worked for me. My code using integer values is below:
#include <GL/glut.h>
#include <math.h>
#include <stdio.h>
#define CTRL_COUNT 100
int ctrlPointsCount;
int ctrlPointsX[CTRL_COUNT], ctrlPointsY[CTRL_COUNT];
int X1[3]={20,25,20}, Y1[3]={5,24,38}; //first point(x1[0],y1[0]) second(x1[1],y1[1]) third(x1[2],y1[2])
void myInit()
{
glClearColor(0.0,0.0,0.0,0.0);
glColor3f(1.0,0.0,0.0);
glPointSize(8.0);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluOrtho2D(0.0,128.0,0.0,96.0);
}
//p(t)=(1-t)^3*p0+3t(1-t)^2*p1+3t^2(1-t)p2+t^3p3
float getNextBezierPointX(float t)
{
float x=0.0;
for(int i=0; i<ctrlPointsCount; i++)
{
int c;
if(i==0 || i==ctrlPointsCount-1)
c = 1;
else
{
c = ctrlPointsCount-1;
}
x += c * pow(t, i) * pow(1-t, ctrlPointsCount-1-i) * ctrlPointsX[i];
}
return x;
}
float getNextBezierPointY(float t)
{
float y=0.0;
for(int i=0; i<ctrlPointsCount; i++)
{
int c;
if(i==0 || i==ctrlPointsCount-1)
c = 1;
else
{
c = ctrlPointsCount-1;
}
y += c * pow(t, i) * pow(1-t, ctrlPointsCount-1-i) * ctrlPointsY[i];
}
return y;
}
void drawline()
{
// draw control points using red color
for(int i=0; i < 3; i++)
{
glBegin(GL_POINTS);
glVertex2i(ctrlPointsX[i], ctrlPointsY[i]);
glEnd();
glFlush();
}
// draw bezier curve using control poitns by calculating next points using cubic bezier curve formula
float oldX=ctrlPointsX[0], oldY=ctrlPointsY[0];
for(double t = 0.0;t <= 1.0; t += 0.01) {
float x = getNextBezierPointX(t);
float y = getNextBezierPointY(t);
//glColor3f(1.0,t,1.0);
glColor3f(1.0,1.0,1.0);
glBegin(GL_LINES);
glVertex2f(oldX, oldY);
glVertex2f(x, y);
glEnd();
glFlush();
oldX = x;
oldY = y;
}
}
void myDisplay()
{
glClear(GL_COLOR_BUFFER_BIT);
glColor3f(1.0,0.0,0.0);
ctrlPointsCount=3;
for(int i=0;i<3;i++)
{
ctrlPointsX[i] = X1[i];
ctrlPointsY[i] = Y1[i];
}
drawline();
glFlush();
}
int main(int argc, char *argv[])
{
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_SINGLE|GLUT_RGB);
glutInitWindowSize(640,480);
glutInitWindowPosition(100,150);
glutCreateWindow("Bezier Curve");
glutDisplayFunc(myDisplay);
myInit();
glutMainLoop();
return 0;
}
But when I tried using floating point values, it did not work for me: the curve does not show up on screen. My code using floating point values is below:
#include <iostream>
#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <GL/glut.h>
using namespace std;
#define CTRL_COUNT 100
int ctrlPointsCount;
int ctrlPointsX[CTRL_COUNT], ctrlPointsY[CTRL_COUNT];
double X1[3] = { 0.26015037593985, 0.43609022556391, 0.6 }, Y1[3] = { 0.946875, 0.884375, 0.946875 };
//Initializes 3D rendering
void initRendering() {
glEnable(GL_DEPTH_TEST);
}
float getNextBezierPointX(float t)
{
float x = 0.0;
for (int i = 0; i<ctrlPointsCount; i++)
{
int c;
if (i == 0 || i == ctrlPointsCount - 1)
c = 1;
else
{
c = ctrlPointsCount - 1;
}
x += c * pow(t, i) * pow(1 - t, ctrlPointsCount - 1 - i) * ctrlPointsX[i];
}
return x;
}
float getNextBezierPointY(float t)
{
float y = 0.0;
for (int i = 0; i<ctrlPointsCount; i++)
{
int c;
if (i == 0 || i == ctrlPointsCount - 1)
c = 1;
else
{
c = ctrlPointsCount - 1;
}
y += c * pow(t, i) * pow(1 - t, ctrlPointsCount - 1 - i) * ctrlPointsY[i];
}
return y;
}
void drawline()
{
// draw control points using red color
for (int i = 0; i < 3; i++)
{
glBegin(GL_POINTS);
glVertex2i(ctrlPointsX[i], ctrlPointsY[i]);
glEnd();
glFlush();
}
// draw bezier curve using control poitns by calculating next points using cubic bezier curve formula
float oldX = ctrlPointsX[0], oldY = ctrlPointsY[0];
for (double t = 0.0; t <= 1.0; t += 0.01)
{
float x = getNextBezierPointX(t);
float y = getNextBezierPointY(t);
//glColor3f(1.0,t,1.0);
glColor3f(1.0, 1.0, 1.0);
glBegin(GL_LINES);
glVertex2f(oldX, oldY);
glVertex2f(x, y);
glEnd();
glFlush();
oldX = x;
oldY = y;
}
}
//Called when the window is resized
void handleResize(int w, int h) {
glViewport(0, 0, w, h);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(45.0, (double)w / (double)h, 1.0, 200.0);
}
float _angle = 0.0;
float _cameraAngle = 0.0;
float _ang_tri = 0.0;
//Draws the 3D scene
void drawScene() {
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity(); //Reset the drawing perspective
ctrlPointsCount = 3;
for (int i = 0; i<3; i++)
{
ctrlPointsX[i] = X1[i];
ctrlPointsY[i] = Y1[i];
}
drawline();
glutSwapBuffers();
}
void update(int value) {
_angle += 2.0f;
if (_angle > 360) {
_angle -= 360;
}
_ang_tri += 2.0f;
if (_ang_tri > 360) {
_ang_tri -= 360;
}
glutPostRedisplay(); //Tell GLUT that the display has changed
//Tell GLUT to call update again in 25 milliseconds
glutTimerFunc(25, update, 0);
}
int main(int argc, char** argv) {
//Initialize GLUT
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
glutInitWindowSize(1331, 641);
glutInitWindowPosition(0, 0);
//Create the window
glutCreateWindow("Our cg project");
initRendering();
//Set handler functions
glutDisplayFunc(drawScene);
glutReshapeFunc(handleResize);
glutTimerFunc(25, update, 0); //Add a timer
glClearColor(0.0, 0.7, 1.5,0.0);
glutMainLoop();
return 0;
}
The problem is here:
int ctrlPointsX[CTRL_COUNT], ctrlPointsY[CTRL_COUNT];
double X1[3] = { 0.26015037593985, 0.43609022556391, 0.6 }, Y1[3] = {0.946875, 0.884375, 0.946875 };
for (int i = 0; i<3; i++)
{
ctrlPointsX[i] = X1[i];
ctrlPointsY[i] = Y1[i];
}
ctrlPointsX and ctrlPointsY can only hold integer values. So when you do ctrlPointsX[i] = X1[i] and ctrlPointsY[i] = Y1[i], you are converting the doubles to integers, which truncates the fractional part. Since all of your values lie between 0 and 1, every control point ends up as 0.
You have to declare the control point arrays as type double too:
double ctrlPointsX[CTRL_COUNT], ctrlPointsY[CTRL_COUNT];
double X1[3] = { 0.26015037593985, 0.43609022556391, 0.6 }, Y1[3] = {0.946875, 0.884375, 0.946875 };
This should fix your problem.
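To see the truncation in isolation, assigning one of these values to an int simply discards the fractional part:

int xi = 0.26015037593985; // xi == 0: the implicit conversion truncates toward zero
int yi = 0.946875;         // yi == 0 as well
// so every control point collapses to (0, 0) and no curve is visible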

How does this C++ OpenGL circle function work?

I am reading an OpenGL book and it contains a function for drawing a circle, but I don't know how to call this function from my code, or what parameters to pass to it.
I am new to OpenGL and I am trying to figure it out.
Code
#include <stdlib.h>
#include <GL/glut.h>
#include <cmath>
void keyboard(unsigned char key, int x, int y);
void display(void);
void drawCircle(float cx, float cy, float r, int num_segments);
int main(int argc, char** argv)
{
glutInit(&argc, argv);
glutKeyboardFunc(&keyboard);
glutDisplayFunc(&display);
glutMainLoop();
return EXIT_SUCCESS;
}
void keyboard(unsigned char key, int x, int y)
{
switch (key)
{
case '\x1B':
exit(EXIT_SUCCESS);
break;
}
}
void display()
{
glClear(GL_COLOR_BUFFER_BIT);
//drawCircle(, , , );
glFlush();
}
void drawCircle(float cx, float cy, float r, int num_segments)
{
glBegin(GL_LINE_LOOP);
for (int i = 0; i < num_segments; i++)
{
float theta = i * (2.0f * PI / num_segments); // get the current angle
float x = r * cos(theta); // calculate the x component
float y = r * sin(theta); // calculate the y component
glVertex2f(x + cx, y + cy); // output vertex
}
glEnd();
}
You're missing window creation and setting the color of the circle you are drawing:
#include <stdlib.h>
#include <gl/glut.h>
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265359
#endif

void keyboard(unsigned char key, int x, int y);
void display(void);
void drawCircle(float cx, float cy, float r, int num_segments);

int main(int argc, char** argv)
{
    int width = 1280;
    int height = 720;

    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_SINGLE);
    glutInitWindowSize(width, height);
    glutCreateWindow("circle");
    glutKeyboardFunc(&keyboard);
    glutDisplayFunc(&display);
    glutMainLoop();
    return EXIT_SUCCESS;
}

void keyboard(unsigned char key, int x, int y)
{
    switch (key)
    {
    case '\x1B':
        exit(EXIT_SUCCESS);
        break;
    }
}

void display()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glMatrixMode(GL_PROJECTION); // set up the 2D projection freshly on every redraw
    glLoadIdentity();
    gluOrtho2D(0.0, 1280, 0.0, 720);
    glMatrixMode(GL_MODELVIEW);
    glColor3f(1.0, 1.0, 1.0); // circle color: white
    drawCircle(640, 360, 100, 200);
    glFlush();
}

void drawCircle(float cx, float cy, float r, int num_segments)
{
    glBegin(GL_LINE_LOOP);
    for (int i = 0; i < num_segments; i++)
    {
        float theta = i * (2.0f * M_PI / num_segments); // get the current angle
        float x = r * cos(theta); // calculate the x component
        float y = r * sin(theta); // calculate the y component
        glVertex2f(x + cx, y + cy); // output vertex
    }
    glEnd();
}
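If you want a filled disc rather than an outline, a common variant (an addition here, not from the book) uses GL_TRIANGLE_FAN with the circle's center as the first vertex:

void drawFilledCircle(float cx, float cy, float r, int num_segments)
{
    glBegin(GL_TRIANGLE_FAN);
    glVertex2f(cx, cy);                     // center of the fan
    for (int i = 0; i <= num_segments; i++) // <= so the last triangle closes the disc
    {
        float theta = i * (2.0f * M_PI / num_segments);
        glVertex2f(cx + r * cos(theta), cy + r * sin(theta));
    }
    glEnd();
}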

Drawing N-Pointed Star in OpenGL - C++

I've written code that reads n and draws an n-pointed star, just like this one:
(image: the star when n = 5, filled)
The problem is that whenever n = 7, 8, 16, 25, ... the star comes out wrong and looks like this:
(image: the star when n = 7, filled)
Here's my code:
#include <iostream>
#include <ctime>
#include <vector>
#include <glut.h>
using namespace std;
float starCenterX, starCenterY, starRadius;
int numPoints;
bool bDrawFill = false;
void DrawStar (float cx, float cy, float radius, int numPoints);
void DrawStarFilled (float cx, float cy, float radius, int numPoints);
float width, height; // global variables to store window width and height
// render text
void renderBitmapString (float x, float y, float z, void* font, const char* string)
{
const char *c;
glRasterPos3f (x, y,z);
for (c = string; *c != '\0'; c++)
glutBitmapCharacter (font, *c);
}
void init ()
{
glClearColor (1.0, 1.0, 1.0, 0.0); // set display-window color to white
}
void reshape (int width, int height)
{
::width = width;
::height = height;
glViewport (0, 0, width, height);
glMatrixMode (GL_PROJECTION); // set projection parameters
glLoadIdentity ();
gluOrtho2D (0.0, width, 0.0, height);
glMatrixMode (GL_MODELVIEW); // set projection parameters
glLoadIdentity ();
}
void display ()
{
glClear (GL_COLOR_BUFFER_BIT); // clear display window
glColor3f (0, 0, 1);
renderBitmapString (10, height - 20, 0, GLUT_BITMAP_TIMES_ROMAN_24, "Name : Saif Badran");
renderBitmapString (10, height - 50, 0, GLUT_BITMAP_TIMES_ROMAN_24, "ID : 0142852");
renderBitmapString (10, height - 80, 0, GLUT_BITMAP_TIMES_ROMAN_24, "Section : 2");
DrawStar(starCenterX,starCenterY,starRadius,numPoints);
if(bDrawFill)
DrawStarFilled(starCenterX,starCenterY,starRadius,numPoints);
glFlush (); // process all openGl routines as quickly as possible
}
void processNormalKeys (unsigned char key, int x, int y)
{
if(key=='w' || key=='W')
starCenterY+=4;
else if(key=='z' || key=='Z')
starCenterY-=4;
else if(key=='a' || key=='A')
starCenterX-=4;
else if(key=='d' || key=='D')
starCenterX+=4;
else if(key=='f' || key=='F')
bDrawFill = (bDrawFill==1?0:1);
}
void mouseClick (int button, int state, int x, int y)
{
if (button == GLUT_LEFT_BUTTON && state == GLUT_DOWN)
{
starCenterX = x;
starCenterY = height - y;
}
}
void activeMouseMotion (int x, int y)
{
starRadius = abs(starCenterX-x);
}
void main (int argc, char** argv)
{
cout<<"Enter number of points : ";
cin>>numPoints;
numPoints = (numPoints < 2) ? 2 : numPoints;
glutInit (&argc, argv); // initialize GLUT
glutInitDisplayMode (GLUT_SINGLE | GLUT_RGB); // set display mode
glutInitWindowPosition (20, 20); // set top left display window position
glutInitWindowSize (600, 600); // set display window width and height
glutCreateWindow ("Homework#2 : Star Drawing"); // create display window
init (); // execute initialization function
glutKeyboardFunc (processNormalKeys);
glutMouseFunc (mouseClick);
glutMotionFunc (activeMouseMotion);
glutReshapeFunc (reshape);
glutDisplayFunc (display); // send graphics to display window
glutIdleFunc (display);
glutMainLoop (); // dispaly everthing and wait
}
void DrawStar (float cx, float cy, float radius, int numPoints)
{
const float DegToRad = 3.14159 / 180;
glColor3f(1.0,0.0,0.0);
glBegin (GL_POINTS);
int count = 1;
for (int i = 0; i <= 360; i+=360/(numPoints*2)) {
float DegInRad = i * DegToRad;
if(count%2!=0)
glVertex2d (cx + cos (DegInRad) * radius, cy + sin (DegInRad) * radius);
else
glVertex2d ((cx + cos (DegInRad) * radius/2), (cy + sin (DegInRad) * radius/2));
count++;
}
glEnd();
glBegin (GL_LINE_LOOP);
count = 1;
for (int i = 0; i <= 360; i+=360/(numPoints*2)) {
float DegInRad = i * DegToRad;
if(count%2!=0)
glVertex2d (cx + cos (DegInRad) * radius, cy + sin (DegInRad) * radius);
else
glVertex2d ((cx + cos (DegInRad) * radius/2), (cy + sin (DegInRad) * radius/2));
count++;
}
glEnd();
}
void DrawStarFilled (float cx, float cy, float radius, int numPoints)
{
const float DegToRad = 3.14159 / 180;
glBegin (GL_TRIANGLE_FAN);
int count = 1;
glVertex2f(starCenterX, starCenterY);
for (int i = 0; i <= 360; i+=360/(numPoints*2)) {
float DegInRad = i * DegToRad;
if(count%2!=0)
glVertex2d (cx + cos (DegInRad) * radius, cy + sin (DegInRad) * radius);
else
glVertex2d ((cx + cos (DegInRad) * radius/2), (cy + sin (DegInRad) * radius/2));
count++;
}
glEnd();
}
The issue is in this line:
for (int i = 0; i <= 360; i+=360/(numPoints*2)) {
For numPoints = 5, i is incremented by 360/(2*5) = 36 at each step.
For numPoints = 7, i is incremented by 360/(2*7) = 25 (integer division truncates 25.714... to 25). So at each step 0.714... degrees are lost; accumulated over the whole loop that is 360 - 14 * 25 = 10 degrees, which is the gap visible in the output picture.
To solve this we could use a floating point variable for the loop counter and increment it by a floating point value obtained from a floating point division, for example with 360.0 as the numerator. (Strictly speaking 360.0 is a double; to store it as a single precision float it would have to be 360.0f.)
for (float i = 0; i <= 360; i+=360.0/(numPoints*2)) {
But doing so, we may run into trouble at the i <= 360 comparison: rounding errors from the floating point operations mean i could end up slightly smaller or larger than the "mathematical" value. So it is better to keep the integer counter for the loop and do the floating point operations on it inside the loop. This code part:
for (int i = 0; i <= 360; i+=360/(numPoints*2)) {
float DegInRad = i * DegToRad;
would then be changed to:
for (int i = 0; i <= numPoints*2; i++) {
float DegInRad = i * 360.0/(numPoints*2) * DegToRad;
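Putting that together, both loops in DrawStar (and the one in DrawStarFilled) would look roughly like the sketch below; the separate count variable is no longer needed because i itself counts the vertices:

for (int i = 0; i <= numPoints * 2; i++) {
    float DegInRad = i * 360.0 / (numPoints * 2) * DegToRad;
    if (i % 2 == 0)   // outer point of the star
        glVertex2d(cx + cos(DegInRad) * radius, cy + sin(DegInRad) * radius);
    else              // inner point, at half the radius
        glVertex2d(cx + cos(DegInRad) * radius / 2, cy + sin(DegInRad) * radius / 2);
}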