GLUT does not get time with millisecond accuracy - C++

I've got a timer issue in GLUT.
glutGet(GLUT_ELAPSED_TIME) only returns the time with one-second accuracy (1000, 2000, 3000...)
and
glutTimerFunc(...) only works when its millis parameter is set to more than 1000.
I don't know exactly how GLUT measures time,
but I suspect something is wrong with my system timer settings.
How can I get the time with millisecond accuracy in OpenGL?

As already mentioned in the comments above, you could use more reliable C++ date and time utilities like the std::chrono library. Here is a simple example:
#include <iostream>
#include <chrono>
int main()
{
    const auto start = std::chrono::high_resolution_clock::now();
    // do something...
    const auto end = std::chrono::high_resolution_clock::now();
    std::cout << "Took " << std::chrono::duration_cast<std::chrono::milliseconds>(end - start).count() << " ms\n";
    return 0;
}

Related

How to get the difference between two time periods using the local system time [duplicate]

What's the best way to calculate a time difference in C++? I'm timing the execution speed of a program, so I'm interested in milliseconds. Better yet, seconds.milliseconds.
The accepted answer works, but needs to include ctime or time.h as noted in the comments.
See std::clock() function.
const clock_t begin_time = clock();
// do something
std::cout << float( clock () - begin_time ) / CLOCKS_PER_SEC;
If you want to measure the program's own execution time (CPU time, not time as experienced by the user), it is better to work in clock ticks, not seconds.
EDIT:
responsible header files - <ctime> or <time.h>
I added this answer to clarify that the accepted answer measures CPU time, which may not be the time you want. According to the reference, there are two notions: CPU time and wall-clock time. Wall-clock time is the actual elapsed time, regardless of other conditions such as the CPU being shared with other processes. For example, when I used multiple processors for a certain task, the CPU time was 18 s while the actual wall-clock time was 2 s.
To get the actual time you do,
#include <chrono>
auto t_start = std::chrono::high_resolution_clock::now();
// the work...
auto t_end = std::chrono::high_resolution_clock::now();
double elapsed_time_ms = std::chrono::duration<double, std::milli>(t_end-t_start).count();
If you are using C++11, here is a simple wrapper (see this gist):
#include <iostream>
#include <chrono>
class Timer
{
public:
    Timer() : beg_(clock_::now()) {}
    void reset() { beg_ = clock_::now(); }
    double elapsed() const {
        return std::chrono::duration_cast<second_>(clock_::now() - beg_).count();
    }

private:
    typedef std::chrono::high_resolution_clock clock_;
    typedef std::chrono::duration<double, std::ratio<1> > second_;
    std::chrono::time_point<clock_> beg_;
};
Or for C++03 on *nix:
#include <iostream>
#include <ctime>

class Timer
{
public:
    Timer() { clock_gettime(CLOCK_REALTIME, &beg_); }
    double elapsed() {
        clock_gettime(CLOCK_REALTIME, &end_);
        return end_.tv_sec - beg_.tv_sec +
            (end_.tv_nsec - beg_.tv_nsec) / 1000000000.;
    }
    void reset() { clock_gettime(CLOCK_REALTIME, &beg_); }

private:
    timespec beg_, end_;
};
Example of usage:
int main()
{
    Timer tmr;
    double t = tmr.elapsed();
    std::cout << t << std::endl;

    tmr.reset();
    t = tmr.elapsed();
    std::cout << t << std::endl;

    return 0;
}
I would seriously consider the use of Boost, particularly boost::posix_time::ptime and boost::posix_time::time_duration (at http://www.boost.org/doc/libs/1_38_0/doc/html/date_time/posix_time.html).
It's cross-platform, easy to use, and in my experience provides the highest level of time resolution an operating system provides. Possibly also very important; it provides some very nice IO operators.
To use it to calculate the difference in program execution (to microseconds; probably overkill), it would look something like this [browser written, not tested]:
ptime time_start(microsec_clock::local_time());
//... execution goes here ...
ptime time_end(microsec_clock::local_time());
time_duration duration(time_end - time_start);
cout << duration << '\n';
boost 1.46.0 and up includes the Chrono library:
thread_clock class provides access to the real thread wall-clock, i.e.
the real CPU-time clock of the calling thread. The thread relative
current time can be obtained by calling thread_clock::now()
#include <boost/chrono/thread_clock.hpp>
{
    ...
    using namespace boost::chrono;
    thread_clock::time_point start = thread_clock::now();
    ...
    thread_clock::time_point stop = thread_clock::now();
    std::cout << "duration: " << duration_cast<milliseconds>(stop - start).count() << " ms\n";
}
In Windows: use GetTickCount
// GetTickCount definition
#include <windows.h>
#include <iostream>
int main()
{
    DWORD dw1 = GetTickCount();
    // Do something
    DWORD dw2 = GetTickCount();
    std::cout << "Time difference is " << (dw2 - dw1) << " milliseconds" << std::endl;
}
You can also use clock_gettime. This method can be used to measure:
System wide real-time clock
System wide monotonic clock
Per Process CPU time
Per process Thread CPU time
Code is as follows:
#include <time.h>
#include <iostream>
int main(){
    timespec ts_beg, ts_end;
    clock_gettime(CLOCK_PROCESS_CPUTIME_ID, &ts_beg);
    // do something...
    clock_gettime(CLOCK_PROCESS_CPUTIME_ID, &ts_end);
    std::cout << (ts_end.tv_sec - ts_beg.tv_sec) + (ts_end.tv_nsec - ts_beg.tv_nsec) / 1e9 << " sec";
}
Just in case you are on Unix, you can use time to get the execution time:
$ g++ myprog.cpp -o myprog
$ time ./myprog
For me, the easiest way is:
#include <boost/timer.hpp>
boost::timer t;
double duration;
t.restart();
/* DO SOMETHING HERE... */
duration = t.elapsed();
t.restart();
/* DO OTHER STUFF HERE... */
duration = t.elapsed();
With this piece of code you don't have to do the classic end - start.
Enjoy your favorite approach.
Just a side note: if you're running on Windows, and you really, really need precision, you can use QueryPerformanceCounter. It gives you a tick count which, together with QueryPerformanceFrequency, can resolve to (potentially) nanoseconds.
Get the system time in milliseconds at the beginning, and again at the end, and subtract.
To get the number of milliseconds since 1970 in POSIX you would write:
#include <sys/time.h>

unsigned long long ms_since_1970()
{
    struct timeval tv;
    gettimeofday(&tv, NULL);
    return ((((unsigned long long)tv.tv_sec) * 1000) +
            (((unsigned long long)tv.tv_usec) / 1000));
}
To get the number of milliseconds since 1601 on Windows you would write:
unsigned long long ms_since_1601()
{
    SYSTEMTIME systime;
    FILETIME filetime;
    GetSystemTime(&systime);
    if (!SystemTimeToFileTime(&systime, &filetime))
        return 0;
    unsigned long long ns_since_1601;
    ULARGE_INTEGER* ptr = (ULARGE_INTEGER*)&ns_since_1601;
    // copy the FILETIME words into the ULARGE_INTEGER; this actually
    // fills the ns_since_1601 unsigned long long.
    ptr->u.LowPart = filetime.dwLowDateTime;
    ptr->u.HighPart = filetime.dwHighDateTime;
    // Compute the number of milliseconds since 1601; we have to
    // divide by 10,000, since the current value is the number of 100ns
    // intervals since 1601, not ms.
    return ns_since_1601 / 10000;
}
If you cared to normalize the Windows answer so that it also returned the number of milliseconds since 1970, then you would have to adjust your answer by 11644473600000 milliseconds. But that isn't necessary if all you care about is the elapsed time.
If you are using:
clock_t tstart = clock();
// ...do something...
clock_t tend = clock();
Then you will need the following to get time in seconds:
time = (tend - tstart) / (double) CLOCKS_PER_SEC;
This seems to work fine on an Intel Mac with OS X 10.7 (note that time() only gives one-second precision):
#include <time.h>
time_t start = time(NULL);
// Do your work
time_t end = time(NULL);
std::cout << "Execution Time: " << (double)(end - start) << " Seconds" << std::endl;

How to format steady_clock time into HH:MM:SS.milliseconds using <chrono> from the C++ standard library?

I'm currently making a small game for my C++ programming class, and the professor requires us to have a timer for the game. We haven't talked at all about how to use timers or any standard timer libraries, so we're on our own. I found the standard library and have managed to implement a simple timer for the game, but I can't figure out how to format its time into a more user-friendly form like HH:MM:SS.milliseconds. All I have is the raw time from when I started the steady_clock until I stopped it, and I can display that in seconds, milliseconds, minutes, whatever, but that doesn't look as good as I'd like. I found some solutions, but they are way too hard for me to even deconstruct and apply. Is there any simple way to do what I want?
Part of my code where I implement the timer:
// Initialize game timer using <chrono>
chrono::steady_clock::time_point start = chrono::steady_clock::now();
// While game is running (player alive and enemy robots left) update map
while (!GameEnd){
    show_maze(maze_map);
    cout << endl;
    playerMove(x, y, GameEnd, died, maze_map);
}
// Terminate game timer and calculate time elapsed
chrono::steady_clock::time_point end = chrono::steady_clock::now();
chrono::steady_clock::duration time_elapsed = end - start;
// Show last map state before either player died or no more robots left
show_maze(maze_map);
// Boo / congratulate player for his performance on the game
cout << "Game Over! You " << (died ? "died by hitting a fence/robot :(" : "won because all the robots died. Congratulations!") << endl;
cout << "Your game lasted for " << chrono::duration_cast<chrono::milliseconds>(time_elapsed).count() << " milliseconds.\n\n";
You can get the number of hours out of time_elapsed with:
auto h = chrono::duration_cast<chrono::hours>(time_elapsed);
Then you can subtract the number of hours so that time_elapsed only holds a duration less than an hour:
time_elapsed -= h;
Then you can get the number of minutes out of time_elapsed:
auto m = chrono::duration_cast<chrono::minutes>(time_elapsed);
and subtract the minutes out...
time_elapsed -= m;
Now time_elapsed holds a duration less than a minute. You can continue with this pattern down to whatever granularity you desire.
C++20 has a type called std::chrono::hh_mm_ss that does precisely this operation as a convenience:
chrono::hh_mm_ss hms{time_elapsed};
hh_mm_ss has hours(), minutes(), seconds() and subseconds() getters which return the respective chrono units. To the best of my knowledge, no vendor is shipping this yet, but you can get a preview of this part of C++20, which works with C++11/14/17 here.
#include "date/date.h"
#include <chrono>
#include <iostream>
int
main()
{
    using namespace std;
    chrono::steady_clock::time_point start = chrono::steady_clock::now();
    // ...
    chrono::steady_clock::time_point end = chrono::steady_clock::now();
    chrono::steady_clock::duration time_elapsed = end - start;
    cout << date::hh_mm_ss{chrono::duration_cast<chrono::milliseconds>(time_elapsed)} << '\n';
}
Output:
00:00:00.000

Chrono C++ to give realtime elapsed time

Here is a clear question:
Can you provide a simple example of chrono being called from main (or its own class) and used in another class? Or a link to an example?
and below is me fumbling around trying to explain my problem in more detail:
I have been working on this all day and keep ending up in the same place.
I am writing a program that is supposed to output the elapsed time after certain processes finish.
The problem I am having is that these processes are happening in different classes and I cannot get the clock to work properly.
I keep reverting back to having the clock in main but am really struggling to make everything mesh together. So maybe this is a simple question about working with classes, but there is something about it I am not understanding, and I don't know what it is.
Below are the 4 lines of this timer that I keep reverting back to and placing in my main function. It prints the clock the way I want, in the format x.xxxxxx:
auto clock_start = chrono::system_clock::now();
auto clock_now = chrono::system_clock::now();
float currentTime = float(chrono::duration_cast <chrono::microseconds> (clock_now - clock_start).count());
cout << "Elapsed Time: " << currentTime /1000000 << " S \n";
Eventually,
I have a queue of structs that I'm popping in a loop and then manipulating. They need a time-stamp when printed at the end of each loop iteration.
I just can't for the life of me get the timer to give the time elapsed (or even work) while in the loop.
is this possible? I have read many threads on chrono and something is just not clicking for me when I try using the timer in multiple classes/functions across my program.
EDIT***
So here is my current class in meta.h:
These are private members inside class Meta
typedef std::chrono::system_clock timer;
timer::time_point currentTime;
timer::time_point startTime;
timer::time_point clock_wait;
timer::time_point clock_check;
timer::time_point elapsed_time; // this is my issue
And then I start the time in meta.cpp
void Meta::startTimer()
{
    startTime = timer::now();
}
And here is the loop with some pieces missing so we can focus on the timer:
void Meta::displaySim()
{
    //auto clock_start = chrono::system_clock::now(); THIS IS WHAT I WAS DOING
    queue<sData> newFile;
    while (!MetaQ.empty())
    {
        temp = MetaQ.front();
        bool wait = true;
        float waitTime = float(temp.ncycle) / 1000;
        while (wait)
        {
            clock_wait = timer::now();
            clock_check = timer::now();
            elapsed_time = timer::duration_cast<chrono::milliseconds>(clock_check - clock_wait);
            if (elapsed_time.count() > waitTime)
                wait = false;
        }
        cout << "****" << waitTime << "*****" << endl;
        end_time = timer::now();
        // Below is the line that is giving me trouble now. I get an error when casting.
        // I don't know how to make duration_cast part of the timer declared in meta.h
        float EndTime = float(timer::duration_cast<chrono::milliseconds>(end_time - startTime).count());
        cout << fixed << EndTime / 1000000 << " - (" << temp.desc << ')' << temp.cycle << " - " << temp.ncycle << " ms\n";
        newFile.push(temp);
        MetaQ.pop();
    }
    MetaQ = newFile;
}
timer::time_point elapsed_time; // this is my issue
Just from the name elapsed_time, this doesn't sound like a time_point. It sounds like a duration. Do this to fix that problem:
timer::duration elapsed_time;
This looks suspicious:
float waitTime = float(temp.ncycle)/1000;
Typically a time duration should have type std::chrono::duration<some representation, some period>. And you don't want to apply conversion factors like 1/1000 manually. Let <chrono> handle all conversions.
elapsed_time = timer::duration_cast<chrono::milliseconds>(clock_check - clock_wait);
duration_cast is not a static member function of system_clock. duration_cast is a namespace scope function. Use it like this:
elapsed_time = chrono::duration_cast<chrono::milliseconds>(clock_check - clock_wait);
If waitTime had a duration type, the .count() would be unnecessary here:
if (elapsed_time.count() > waitTime)
// Below is the line that is giving me trouble now. I get an error when casting.
// I don't know how to make duration_cast part of the timer declared in meta.h
float EndTime = float(timer::duration_cast <chrono::milliseconds>(end_time - startTime).count());
Best practice is to stay within the <chrono> type system instead of escaping to scalars such as float. You can get integral milliseconds with this:
auto EndTime = chrono::duration_cast<chrono::milliseconds>(end_time - startTime);
If you really want EndTime to be float-based milliseconds, that is easy too:
using fmilliseconds = chrono::duration<float, std::milli>;
fmilliseconds EndTime = end_time - startTime;
For more details, here is a video tutorial for the <chrono> library: https://www.youtube.com/watch?v=P32hvk8b13M
If this answer doesn't address your question, distill your problem down into a complete minimal program that others can copy/paste into their compiler and try out. For example I could not give you concrete advice on waitTime because I have no idea what temp.ncycle is.
Finally, and this is optional, if you would like an easier way to stream out durations for debugging purposes, consider using my free, open source, header-only date/time library. It can be used like this:
#include "date/date.h"
#include <iostream>
#include <thread>
using timer = std::chrono::system_clock;
timer::time_point clock_wait;
timer::time_point clock_check;
timer::duration elapsed_time;
int
main()
{
    using namespace std::chrono_literals;
    clock_wait = timer::now();
    std::this_thread::sleep_for(25ms); // simulate work
    clock_check = timer::now();
    elapsed_time = clock_check - clock_wait;
    using date::operator<<;            // Needed to find the correct operator<<
    std::cout << elapsed_time << '\n'; // then just stream it
}
which just output for me:
25729µs
The compile-time units of the duration are automatically appended to the run-time value to make it easier to see what you have. This prevents you from accidentally appending the wrong units to your output.

Cleanest and simplest way to get elapsed time since execution in C++

What is the simplest and cleanest way to get the time since execution of the program (with milliseconds precision) in C++ ?
I am making a wave interference simulator to produce Lissajous curves in C++. It requires the time since execution of the program (with at least milliseconds precision) to function. I can't seem to find any clean and simple way to do it after a bit of research.
All <chrono> functions seem very confusing to me. Similar questions here on Stack Overflow seem to be either unrelated, confusing (to me), or inapplicable to my situation. I tried using functions from <time.h>, only to discover that they have precision up to seconds only.
I am running Windows 7 x64. The program need not be platform independent as it's for personal use.
Any help is greatly appreciated.
Thank you!
The new <chrono> functions take a little getting used to but they make things fairly easy when you get to know how they work.
Your problem could be solved like this for example:
#include <chrono>
#include <thread>
#include <iostream>
// for readability
using hr_clock = std::chrono::high_resolution_clock;
using hr_time_point = hr_clock::time_point;
using hr_duration = hr_clock::duration;
using milliseconds = std::chrono::milliseconds;
int main()
{
    // note the program start time
    hr_time_point prog_start = hr_clock::now();
    // do stuff
    std::this_thread::sleep_for(milliseconds(1000));
    // find the duration
    hr_duration d = hr_clock::now() - prog_start;
    // cast the duration to milliseconds
    milliseconds ms = std::chrono::duration_cast<milliseconds>(d);
    // print out the number of milliseconds
    std::cout << "time passed: " << ms.count() << " milliseconds.\n";
}
For convenience you could create a function to return the time since that function was last called:
milliseconds since_last_call()
{
    // retain time between calls (static)
    static hr_time_point previous = hr_clock::now();
    // get current time
    hr_time_point current = hr_clock::now();
    // get the time difference between now and the previous call to the function
    milliseconds ms = std::chrono::duration_cast<milliseconds>(current - previous);
    // store current time for next call
    previous = current;
    // return elapsed time in milliseconds
    return ms;
}

int main()
{
    since_last_call(); // initialize the function's internal static time_point
    // do stuff
    std::this_thread::sleep_for(milliseconds(1000));
    milliseconds ms = since_last_call();
    // print out the number of milliseconds
    std::cout << "time passed: " << ms.count() << " milliseconds.\n";
}

Precision time sleep using chrono

I want my application to sleep for precisely 2000 microseconds:
#include <iostream>
#include <chrono>
#include <thread>

int main()
{
    std::cout << "Hello waiter" << std::endl;
    std::chrono::microseconds dura( 2000 );
    auto start = std::chrono::system_clock::now();
    std::this_thread::sleep_for( dura );
    auto end = std::chrono::system_clock::now();
    auto elapsed = std::chrono::duration_cast<std::chrono::microseconds>(end - start);
    std::cout << "Waited for " << elapsed.count() << " microseconds" << std::endl;
}
This results in
Waited for 2620 microseconds
Where does this discrepancy come from? Is there a better (more precise) method available?
Thanks!
Quoted from cppreference (see sleep_for):
This function may block for longer than sleep_duration due to scheduling or resource contention delays.
I think that is the most likely explanation. The details will depend on your environment, especially your OS.
In general, I see no portable way to avoid it (non-portable options include increasing thread priorities or reducing the nice level).
Another, however less likely, reason for time differences are external clock adjustments (e.g., caused by a ntp daemon). Using a steady_clock is a portable insurance against clock adjustments.
Evidently, sleep_for is not precise at all. A working workaround for this issue is to spin in a while loop until the desired duration is reached. This makes the application "sleep" for (very nearly) 2000 microseconds, at the cost of keeping one core fully busy:
auto start = std::chrono::system_clock::now();
bool sleep = true;
while (sleep)
{
    auto now = std::chrono::system_clock::now();
    auto elapsed = std::chrono::duration_cast<std::chrono::microseconds>(now - start);
    if (elapsed.count() > 2000)
        sleep = false;
}