I'm using Visual Studio 2013 and OpenGL to create a simulation. I use keyboard input to change certain variables, and the update is shown in the window that was created. However, there is a small delay between pressing a key and seeing the changed output in the window.
I tried the Visual Studio Diagnostic Tools and saw that two functions were CPU-intensive. One was a user function that I created, and the other was 'display/main/__tmainCRTStartup/mainCRTStartup'. I'm assuming this is a GLUT function. Is this normal, or am I doing something wrong?
Any help would be appreciated.
void keyboard(unsigned char key, int x, int y)
{
    switch (key)
    {
    case 'r': case 'R':
        if (filling == 0)
        {
            glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
            filling = 1;
        }
        else
        {
            glPolygonMode(GL_FRONT_AND_BACK, GL_POINT);
            filling = 0;
        }
        break;
    case 27:
        exit(0);
        break;
    }
}
void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glRotatef(-90, 0.0, 1.0, 0.0);
    glRotatef(-90, 1.0, 0.0, 0.0);
    rotation_x = rotation_x + (rotation_x_increment - rotation_x) / 50;
    rotation_y = rotation_y + (rotation_y_increment - rotation_y) / 50;
    rotation_z = rotation_z + rotation_z_increment;
    if (rotation_x > 359) rotation_x = 0;
    if (rotation_y > 359) rotation_y = 0;
    if (rotation_z > 359) rotation_z = 0;
    if (rotation_x_increment > 359) rotation_x_increment = 0;
    if (rotation_y_increment > 359) rotation_y_increment = 0;
    if (rotation_z_increment > 359) rotation_z_increment = 0;
    glRotatef(rotation_x, 1.0, 0.0, 0.0);
    glRotatef(rotation_y, 0.0, 1.0, 0.0);
    glRotatef(rotation_z, 0.0, 0.0, 1.0);
    glTranslatef(x_translate, 0.0, 0.0);
    glTranslatef(0.0, y_translate, 0.0);
    glTranslatef(0, 0, z_translate);
    // ... (drawing code elided; the glBegin matching the glEnd below is not shown)
    glEnd();
    glutSwapBuffers();
    glFlush(); // This forces the execution of OpenGL commands
}
int main(int argc, char **argv)
{
    IntroDisplay();
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(screen_width, screen_height);
    glutInitWindowPosition(0, 0);
    glutCreateWindow("Ultrasonic Testing");
    glutDisplayFunc(display);
    glutIdleFunc(display);
    glutReshapeFunc(resize);
    glutKeyboardFunc(keyboard);
    glutSpecialFunc(keyboard_s);
    glutMouseFunc(mouse);
    glutMotionFunc(mouseMove);
    init();
    glutMainLoop();
    return 0;
}
I've copied the code to a new project and changed the display function to draw a triangle, and it runs very fast.
(I know this should be a comment, not an answer, but I don't have enough reputation to add comments.)
PS: you can change

glTranslatef(x_translate, 0.0, 0.0);
glTranslatef(0.0, y_translate, 0.0);
glTranslatef(0, 0, z_translate);

to

glTranslatef(x_translate, y_translate, z_translate);
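As a side note on the latency question above: registering glutIdleFunc(display) makes GLUT redraw continuously, burning CPU and potentially delaying input processing. A minimal sketch of an alternative, assuming nothing else drives redisplay; the 16 ms interval and the name redisplayTimer are illustrative choices, not from the original post:

```c
#include <GL/glut.h>

/* Redraw at roughly 60 Hz with a timer instead of spinning in an idle
 * callback. redisplayTimer is a hypothetical name for this sketch. */
void redisplayTimer(int value)
{
    glutPostRedisplay();                  /* schedule one redraw */
    glutTimerFunc(16, redisplayTimer, 0); /* re-arm the timer for the next frame */
}

/* In main(), replace glutIdleFunc(display); with:
 *     glutTimerFunc(16, redisplayTimer, 0);
 * Input callbacks such as keyboard() can also call glutPostRedisplay()
 * directly, so a keypress shows up on the very next frame. */
```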
I've been doing some OpenGL C++ training, but I have a logic problem: how can I update my OpenGL window over time?
It should draw text one, then wait 1-2 seconds, then draw text two, but right now both are drawn at the same time. Can anyone help or give a hint?
void text() {
    wait(1);       // wait() is a custom helper (not shown)
    Sleep(1000);
    std::string text_one;
    text_one = "Text 1";
    glColor3f(1, 01, 0);
    drawText(text_one.data(), text_one.size(), 050, 150);
    glutPostRedisplay();
    wait(1);
    std::string text_two;
    text_two = "Text 2";
    glColor3f(1, 0, 0);
    drawText(text_two.data(), text_two.size(), 250, 150);
}
and here is the main:
int main(int argc, char **argv) {
    // init GLUT and create window
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowPosition(100, 100);
    glutInitWindowSize(640, 640);
    glutCreateWindow("Test 001");
    // register callbacks
    glutDisplayFunc(renderScene);
    glutIdleFunc(text);
    // enter GLUT event processing cycle
    glutMainLoop();
    return 1;
}
You should render in the renderScene callback. It will be called automatically at your screen's refresh rate. If you want a delay, you need to implement it inside this callback (or in functions called from it).
So basically you need to re-render everything every 1/60 of a second.
If you want an easy delay, you can do something like this:
void renderScene() {
    time += deltaTime;      // time and deltaTime are globals you maintain
    RenderText1();
    if (time > delayTime)   // only draw the second text once the delay has passed
        RenderText2();
    glutSwapBuffers();
}
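The accumulated-time check can also be isolated from GLUT for clarity. A minimal sketch, assuming the elapsed time comes from glutGet(GLUT_ELAPSED_TIME); the function name and the threshold value are illustrative, not from the original post:

```c
/* Decide whether the second text should be drawn yet.
 * elapsed_ms: milliseconds since program start, e.g. glutGet(GLUT_ELAPSED_TIME)
 * delay_ms:   how long to wait before showing the second text */
static int show_second_text(int elapsed_ms, int delay_ms)
{
    return elapsed_ms > delay_ms;
}
```

In renderScene this would read: RenderText1(); if (show_second_text(glutGet(GLUT_ELAPSED_TIME), 1000)) RenderText2();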
For example, my "test.txt" file contains

2 3
2 4
2 5

which I display in my OpenGL/C++ program (I'm using a graph to represent it).
When I go to my "test.txt" file and change some values, I want to see the new graph with the new data without stopping and recompiling.
Is there any way to refresh it?
NOTE: glutPostRedisplay() is not working.
Thanks
void drawScene(void)
{
    readFile(inFile);
    glClear(GL_COLOR_BUFFER_BIT);
    glColor3f(0.0, 0.0, 0.0);
    histogram();
    linestrip();
    piechart2012();
    piechart2013();
    glFlush();
    //readFile(inFile);
}
int main(int argc, char **argv)
{
    glutInit(&argc, argv);           /* Initialise OpenGL */
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGBA);
    glutInitWindowSize(800, 600);
    glutInitWindowPosition(100, 100);
    printInteraction();
    glutCreateWindow("square.cpp");  /* Create the window */
    glutDisplayFunc(drawScene);      /* Register the "display" function */
    glutReshapeFunc(resize);
    glutKeyboardFunc(keyInput);
    setup();
    glutMainLoop();                  /* Enter the OpenGL main loop */
    return 0;
}
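Since drawScene above already re-reads the file on every redraw, one way to pick up changes to test.txt is to force periodic redraws with glutTimerFunc. A sketch, with the 1000 ms interval and the name refreshTimer as illustrative choices, not from the original post:

```c
#include <GL/glut.h>

/* refreshTimer is a hypothetical helper: every second it asks GLUT to call
 * the display callback again, which re-reads the data file. */
void refreshTimer(int value)
{
    glutPostRedisplay();                  /* drawScene() runs readFile(inFile) again */
    glutTimerFunc(1000, refreshTimer, 0); /* re-arm for the next refresh */
}

/* In main(), before glutMainLoop():
 *     glutTimerFunc(1000, refreshTimer, 0);
 * glutPostRedisplay() on its own does nothing unless something calls it;
 * from a timer callback it reliably schedules a redraw. */
```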
I was writing some code in C/C++ and I ran into an error.
#include "glut.h"
#include <random>
// Classes and structs //
struct GLPoint {
GLfloat x, y;
};
// Method(s) Declaration //
void drawDot(GLfloat, GLfloat);
void serpinski_render(void);
void myInti(void);
// Method(s) Implementation //
void drawDot(GLfloat x, GLfloat y){
    glBegin(GL_POINTS);
    glVertex2f(x, y); // was glVertex2i: use the float variant for GLfloat coordinates
    glEnd();
}
void serpinski_render(void)
{
    glClear(GL_COLOR_BUFFER_BIT); // clear anything displayed on the screen
    GLPoint T[3] = { { 10, 10 }, { 600, 10 }, { 300, 600 } }; // the three vertices of the parent triangle
    int index = rand() % 3; // choose a random index: 0, 1 or 2
    GLPoint point = T[index];
    drawDot(point.x, point.y);
    for (unsigned int i = 0; i < 5500; i++) // a loop that runs 5500 times (a very big number)
    {
        index = rand() % 3;
        point.x = (point.x + T[index].x) / 2;
        point.y = (point.y + T[index].y) / 2;
        drawDot(point.x, point.y);
    }
    glFlush();
}
void myInti(void)
{
    glClearColor(1, 1, 1, 0); // a white background
    glColor3f(0, 0, 0); // black points
    glPointSize(3); // 3 pixel point size
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluOrtho2D(0, 640, 0, 480);
}
// Main Method //
void main(int argc, char **argv)
{
    glutInit(&argc, argv); // initialize toolkit
    glutInitWindowPosition(100, 150);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutInitWindowSize(640, 480); // window size is 640 x 480
    glutDisplayFunc(serpinski_render);
    myInti();
    glutMainLoop();
}
I don't know if it will work fine, but this code should produce a Sierpinski triangle.
Every time I use a C++ standard library header (in this case <random>), I get this error in stdlib.h, which confuses me; I've never faced anything like it before:
Error 1 error C2381: 'exit' : redefinition; __declspec(noreturn) differs c:\program files (x86)\microsoft visual studio 12.0\vc\include\stdlib.h 376
There is an incompatibility between glut.h and Visual Studio: both glut.h and stdlib.h (pulled in through <random> in your case) declare exit(), and the declarations differ.
You can solve it by just declaring:
#include <random>
#include "glut.h"
instead of:
#include "glut.h"
#include <random>
Please read this description for further information and for another solution (the "Header (.h) files" section).
Your code will also possibly fail because it never creates a window. You can use glutCreateWindow to create one; for example, arrange your main like below:
void main(int argc, char **argv)
{
    glutInit(&argc, argv); // initialize toolkit
    glutInitWindowPosition(100, 150);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutInitWindowSize(640, 480); // window size is 640 x 480
    glutCreateWindow("A title");
    myInti();
    glutDisplayFunc(serpinski_render);
    glutMainLoop();
}
Please also read this information for glutCreateWindow function.
You probably have this code in glut.h:
# ifndef GLUT_BUILDING_LIB
extern _CRTIMP void __cdecl exit(int);
# endif
The glut.h header is quite old. This was probably a workaround for an old VC deficiency. Visual C now seems to have a declaration that conflicts with this one. The easy solution is to just delete these lines from the header, since there is a valid definition in stdlib.h.
By the way, all the glVertex, glBegin, glEnd, matrix stack, and many other OpenGL calls are deprecated in favor of shaders.
Perhaps there is also a newer/better glut available. I'd check that out.
I am trying to implement zoom in/zoom out using the mouse scroll wheel via glutMouseWheelFunc in OpenGL. I have implemented the code as below:
#include <GL/freeglut.h>
#include <stdio.h>  // for printf

// ztrans is assumed to be a global used by the rendering code
void mouseWheel(int button, int dir, int x, int y)
{
    printf("in mouse wheel\n");
    if (dir > 0)
    {
        // Zoom in
        ztrans = ztrans - 1.0;
        printf("scroll in = %0.3f\n", ztrans);
    }
    else
    {
        // Zoom out
        ztrans = ztrans + 1.0;
        printf("scroll out = %0.3f\n", ztrans);
    }
    glutPostRedisplay();
}
int main(int argc, char **argv)
{
    // general initializations
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowPosition(100, 100);
    glutInitWindowSize(800, 400);
    glutCreateWindow("Rotation");
    // register callbacks
    glutReshapeFunc(changeSize);
    glutDisplayFunc(renderScene);
    glutIdleFunc(renderScene);
    glutIgnoreKeyRepeat(1);
    glutMouseFunc(mouseButton);
    glutMotionFunc(mouseMove);
    glutMouseWheelFunc(mouseWheel); // register mouse wheel function
    glEnable(GL_DEPTH_TEST);
    glutMainLoop();
    return 0;
}
On executing, the registered callback function (mouseWheel) is never called. My system has freeglut3 installed.
Try using a static int inside the mouseWheel function, and then use it in renderScene, like this:

static int k;
static int ztrans;

void mouseWheel(int button, int dir, int x, int y)
{
    k = dir; // dir is +1 or -1 based on the direction of the wheel motion
    ztrans = ztrans + k;
}

This worked for me. Try it and report back, good luck.
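A related point worth checking: on X11, freeglut often delivers wheel events to the plain glutMouseFunc callback as button 3 (wheel up) and button 4 (wheel down), so glutMouseWheelFunc may never fire. A sketch of a workaround; the helper name is illustrative, not from the original posts:

```c
/* Map a GLUT mouse button to a wheel direction: +1 for wheel up,
 * -1 for wheel down, 0 for anything that is not a wheel event. */
static int wheel_dir_from_button(int button)
{
    if (button == 3) return  1; /* wheel scrolled up   */
    if (button == 4) return -1; /* wheel scrolled down */
    return 0;                   /* regular mouse button */
}

/* In mouseButton(int button, int state, int x, int y):
 *     if (state == GLUT_DOWN)
 *         ztrans -= wheel_dir_from_button(button);
 */
```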
How do you control the speed of an animation? My objects animate faster on another person's machine.
void idle(void){
    if (!wantPause){
        circleSpin = circleSpin + 2.0; // spin circles
        if (circleSpin > 360.0)
        {
            circleSpin = circleSpin - 360.0;
        }
        diamondSpin = diamondSpin - 4.0; // spin diamonds
        if (diamondSpin > 360.0)  // note: diamondSpin only decreases, so this never fires
        {
            diamondSpin = diamondSpin + 360.0;
        }
        ellipseScale = ellipseScale + 0.1; // scale ellipse
        if (ellipseScale > 30)
        {
            ellipseScale = 15;
        }
        glutPostRedisplay();
    }
}
void drawScene()
{
    ...
    glColor3f(1, 0, 0);
    glPushMatrix();
    glRotatef(circleSpin, 0, 0, 1);
    drawOuterCircles();
    glPopMatrix();
}
int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitWindowSize(400, 400);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutCreateWindow("Lesson 6");
    init();
    glutDisplayFunc(drawScene);
    glutKeyboardFunc(keyboard);
    glutReshapeFunc(handleResize);
    glutIdleFunc(idle);
    glutMainLoop();
    return 0;
}
This is the poor man's solution:

FPS = 60.0;
while (game_loop) {
    int t = getticks();
    if ((t - t_prev) > 1000/FPS) {
        process_animation_tick();
        t_prev = t;  // only reset the reference time when a tick was processed
    }
}
This is the better solution:

GAME_SPEED = ...
while (game_loop) {
    int t = getticks();
    process_animation((t - t_prev)*GAME_SPEED/1000.0);
    t_prev = t;
}
In the first one, each animation tick moves your objects by a fixed amount, but that is prone to errors if the framerate drops.
In the latter, you move the objects based on the time passed. E.g. if 20 ms pass, you rotate an object 12 degrees, and if 10 ms pass, you rotate it 6 degrees. In general, the animation is a function of the time passed.
Implementation of getticks() is up to you. For a start you could use glutGet(GLUT_ELAPSED_TIME).
In your case it would look something like:
int old_t;

void idle(void) {
    int t = glutGet(GLUT_ELAPSED_TIME);
    int passed = t - old_t;
    old_t = t;
    animate(passed);
    glutPostRedisplay();
}
void animate(int ms)
{
    if (!wantPause){
        circleSpin = circleSpin + ms*0.01; // spin circles
        if (circleSpin > 360.0)
        {
            circleSpin = circleSpin - 360.0;
        }
        diamondSpin = diamondSpin - ms*0.02; // spin diamonds
        if (diamondSpin < -360.0)  // diamondSpin decreases, so wrap at -360
        {
            diamondSpin = diamondSpin + 360.0;
        }
        ellipseScale = ellipseScale + ms*0.001; // scale ellipse
        if (ellipseScale > 30)
        {
            ellipseScale = 15;
        }
    }
}