High accuracy timer in C++

I am working on high-accuracy timers in C++ on Windows. My application needs millisecond-level timer accuracy; errors of up to 5 ms can be tolerated. I have shared the code below, and the timer is very repeatable: 2 runs out of 20 had errors, but those errors are quite large, around 75 ms. What could be causing these errors? Whenever I run the program the error is the same size, around 75 ms. I usually choose a delay of 10 ms. Is it possible to obtain millisecond accuracy on Windows? Thank you in advance.
#include <string>
#include <chrono>
#include <iostream>
using namespace std;

// Get the time stamp
time_t getTimeStamp()
{
    std::chrono::time_point<std::chrono::system_clock, std::chrono::milliseconds> tp =
        std::chrono::time_point_cast<std::chrono::milliseconds>(std::chrono::system_clock::now());
    auto tmp = std::chrono::duration_cast<std::chrono::milliseconds>(tp.time_since_epoch());
    time_t timestamp = tmp.count();
    return timestamp;
}

// Get the time year-month-day hour-minute-second millisecond
std::string gettm(__int64 timestamp)
{
    __int64 milli = timestamp + (__int64)8 * 60 * 60 * 1000;
    auto mTime = std::chrono::milliseconds(milli);
    auto tp = std::chrono::time_point<std::chrono::system_clock, std::chrono::milliseconds>(mTime);
    auto tt = std::chrono::system_clock::to_time_t(tp);
    std::tm now;
    ::gmtime_s(&now, &tt);
    char res[64] = { 0 };
    sprintf_s(res, _countof(res), "%03d", static_cast<int>(milli % 100));
    return std::string(res);
}

int main(void)
{
    bool state = false;
    int delay;
    cout << "Delay (ms): " << endl;
    cin >> delay;

Again:
    string now = gettm(getTimeStamp());
    int now_int = stoi(now);
    int sss = now_int + delay;
    if (sss >= 100)
        sss = sss % 100;

    while (1)
    {
        string change = gettm(getTimeStamp());
        int now_change = stoi(change);
        if (now_change >= 100)
            now_change = now_change % 100;
        state = true;
        cout << "High O4 and now (ms): " << sss << " and change (ms): " << now_change << endl;
        if (now_change == sss)
        {
            state = false;
            cout << "Low O4" << endl;
            goto Again;
        }
    }
    return 0;
}
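One likely contributor to the occasional large error: the loop tests now_change == sss for exact equality against a value that wraps every 100 ms, and the loop body (including the flushed cout) can easily take longer than 1 ms, so the target millisecond can be skipped; the comparison then only matches again after the value wraps around, which shows up as a single large error of several tens of milliseconds. For comparison only (this is not the poster's code), a minimal sketch of the same delay loop built on std::chrono::steady_clock, which is monotonic and compares the full elapsed time against the target instead of matching a two-digit remainder:

#include <chrono>
#include <iostream>

int main()
{
    using namespace std::chrono;

    int delay;
    std::cout << "Delay (ms): " << std::endl;
    std::cin >> delay;

    auto start = steady_clock::now();
    while (true)
    {
        // Compare the full elapsed time against the target instead of an
        // exact match on a wrapped millisecond value, so an iteration that
        // takes longer than 1 ms cannot miss the deadline.
        auto elapsed = duration_cast<milliseconds>(steady_clock::now() - start);
        if (elapsed.count() >= delay)
        {
            std::cout << "Elapsed (ms): " << elapsed.count() << std::endl;
            start = steady_clock::now(); // begin the next interval
        }
    }
    return 0;
}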

Related

C++ CPU usage percentage spikes

I wrote the code below to get the current CPU usage on a Windows machine. However, I notice some unusual spikes. Nine times out of ten the percentage I'm getting is between 2 and 5, and then it suddenly spikes to over 95%, which of course doesn't agree with what the Task Manager is showing... Could someone please point out if there is anything I'm doing wrong?
#include <iostream>
#include <sstream>
#include <chrono>
#include <thread>
#include <vector>
#include <Windows.h>

float roundPercentage(const double& d) {
    std::ostringstream tmp;
    tmp << d;
    std::string str = tmp.str();
    return (float)(std::round(std::stof(str.substr(0, 5)) * 100) / 100);
}

int main() {
    while (true) {
        FILETIME idle_time, kernel_time, user_time;
        std::vector<FILETIME> last_values;
        GetSystemTimes(&idle_time, &kernel_time, &user_time);
        last_values.push_back(kernel_time);
        last_values.push_back(user_time);
        std::this_thread::sleep_for(std::chrono::milliseconds(500));
        GetSystemTimes(&idle_time, &kernel_time, &user_time);
        unsigned long long current_value = (kernel_time.dwLowDateTime + kernel_time.dwHighDateTime) + (user_time.dwLowDateTime + user_time.dwHighDateTime);
        unsigned long long last_value = (last_values[0].dwLowDateTime + last_values[0].dwHighDateTime) + (last_values[1].dwLowDateTime + last_values[1].dwHighDateTime);
        double d = (double)(last_value - current_value) / (double)(last_value + current_value) * 100.0;
        std::cout << "cpu usage: " << roundPercentage(d) << "%" << std::endl;
    }
    return 0;
}
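The spikes in this first version most likely come from treating dwLowDateTime and dwHighDateTime as two independent numbers and adding them: they are the low and high 32 bits of a single 64-bit tick count (100-ns units), so the sum is meaningless and jumps whenever the low part wraps. A minimal way to combine the halves, equivalent to the ULARGE_INTEGER approach in the working code below (FileTimeToUint64 is just an illustrative name):

#include <cstdint>
#include <Windows.h>

// Combine the two 32-bit halves of a FILETIME into one 64-bit value
// (100-nanosecond units).
static uint64_t FileTimeToUint64(const FILETIME& ft) {
    return (static_cast<uint64_t>(ft.dwHighDateTime) << 32) |
           static_cast<uint64_t>(ft.dwLowDateTime);
}

Note also that the kernel time reported by GetSystemTimes includes idle time, which is why the working version computes (kernel + user - idle) / (kernel + user) over the sampling interval.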
Based on comments I had a better look into FILETIME and ULARGE_INTEGER and the below code seems to work...
#include <iostream>
#include <chrono>
#include <thread>
#include <Windows.h>

uint64_t FromFileTime(const FILETIME& ft) {
    ULARGE_INTEGER uli = { 0 };
    uli.LowPart = ft.dwLowDateTime;
    uli.HighPart = ft.dwHighDateTime;
    return uli.QuadPart;
}

int main() {
    while (true) {
        FILETIME idle_time, kernel_time, user_time;
        GetSystemTimes(&idle_time, &kernel_time, &user_time);
        uint64_t k1 = FromFileTime(kernel_time);
        uint64_t u1 = FromFileTime(user_time);
        uint64_t i1 = FromFileTime(idle_time);
        std::this_thread::sleep_for(std::chrono::milliseconds(500));
        GetSystemTimes(&idle_time, &kernel_time, &user_time);
        uint64_t k2 = FromFileTime(kernel_time);
        uint64_t u2 = FromFileTime(user_time);
        uint64_t i2 = FromFileTime(idle_time);
        uint64_t ker = k2 - k1;
        uint64_t usr = u2 - u1;
        uint64_t idl = i2 - i1;
        uint64_t cpu = (ker + usr - idl) * 100 / (ker + usr);
        std::cout << "cpu usage: " << static_cast<int>(cpu) << "%" << std::endl;
    }
    return 0;
}

How to make a thread stop execution (e.g. std::this_thread::sleep_for) for an accurate interval

I am currently making a small Discord bot that can play music, to improve my skills. That's why I don't use any Discord library.
I want the music to be as smooth as possible, but when I play a piece of music, the audio produced is very choppy.
Here is my code:
concurrency::task<void> play(std::string id) {
    auto shared_token = std::make_shared<concurrency::cancellation_token*>(&p_token);
    auto shared_running = std::make_shared<bool*>(&running);
    return concurrency::create_task([this, id, shared_token] {
        audio* source = new audio(id); // create a s16le binary stream using FFMPEG
        speak(); // sending speak packet
        printf("creating opus encoder\n");
        const unsigned short FRAME_MILLIS = 20;
        const unsigned short FRAME_SIZE = 960;
        const unsigned short SAMPLE_RATE = 48000;
        const unsigned short CHANNELS = 2;
        const unsigned int BITRATE = 64000;
#define MAX_PACKET_SIZE FRAME_SIZE * 5
        int error;
        OpusEncoder* encoder = opus_encoder_create(SAMPLE_RATE, CHANNELS, OPUS_APPLICATION_AUDIO, &error);
        if (error < 0) {
            throw "failed to create opus encoder: " + std::string(opus_strerror(error));
        }
        error = opus_encoder_ctl(encoder, OPUS_SET_BITRATE(BITRATE));
        if (error < 0) {
            throw "failed to set bitrate for opus encoder: " + std::string(opus_strerror(error));
        }
        if (sodium_init() == -1) {
            throw "libsodium initialisation failed";
        }
        int num_opus_bytes;
        unsigned char* pcm_data = new unsigned char[FRAME_SIZE * CHANNELS * 2];
        opus_int16* in_data;
        std::vector<unsigned char> opus_data(MAX_PACKET_SIZE);

        class timer_event {
            bool is_set = false;
        public:
            bool get_is_set() { return is_set; };
            void set() { is_set = true; };
            void unset() { is_set = false; };
        };
        timer_event* run_timer = new timer_event();
        run_timer->set();

        // this is the send loop
        concurrency::create_task([run_timer, this, shared_token] {
            while (run_timer->get_is_set()) {
                speak();
                int i = 0;
                while (i < 15) {
                    utils::sleep(1000);
                    if (run_timer->get_is_set() == false) {
                        std::cout << "Stop sending speak packet due to turn off\n";
                        concurrency::cancel_current_task();
                        return;
                    }
                    if ((*shared_token)->is_canceled()) {
                        std::cout << "Stop sending speak packet due to cancel\n";
                        concurrency::cancel_current_task();
                        return;
                    }
                }
            }
        });

        std::deque<std::string>* buffer = new std::deque<std::string>();
        auto timer = concurrency::create_task([run_timer, this, buffer, FRAME_MILLIS, shared_token] {
            while (run_timer->get_is_set() || buffer->size() > 0) {
                utils::sleep(5 * FRAME_MILLIS); // std::this_thread::sleep_for
                int loop = 0;
                int sent = 0;
                auto start = boost::chrono::high_resolution_clock::now();
                while (buffer->size() > 0) {
                    if (udpclient.send(buffer->front()) != 0) { // send frame
                        // udpclient.send ~ winsock sendto
                        std::cout << "Stop sending voice data due to udp error\n";
                        return;
                    }
                    buffer->pop_front();
                    if ((*shared_token)->is_canceled()) {
                        std::cout << "Stop sending voice data due to cancel\n";
                        concurrency::cancel_current_task();
                    }
                    sent++; // count sent frames
                    // calculate next time point we should (in theory) send next frame and store in *delay*
                    long long next_time = (long long)(sent + 1) * (long long)(FRAME_MILLIS) * 1000;
                    auto now = boost::chrono::high_resolution_clock::now();
                    long long mcs_elapsed = (boost::chrono::duration_cast<boost::chrono::microseconds>(now - start)).count(); // elapsed time from start of loop
                    long long delay = std::max((long long)0, (next_time - mcs_elapsed));
                    // wait for next time point
                    boost::asio::deadline_timer timer(context_io);
                    timer.expires_from_now(boost::posix_time::microseconds(delay));
                    timer.wait();
                }
            }
        });

        unsigned short _sequence = 0;
        unsigned int _timestamp = 0;
        while (1) {
            if (buffer->size() >= 50) {
                utils::sleep(FRAME_MILLIS);
            }
            if (source->read((char*)pcm_data, FRAME_SIZE * CHANNELS * 2) != true)
                break;
            if ((*shared_token)->is_canceled()) {
                std::cout << "Stop encoding due to cancel\n";
                break;
            }
            in_data = (opus_int16*)pcm_data;
            num_opus_bytes = opus_encode(encoder, in_data, FRAME_SIZE, opus_data.data(), MAX_PACKET_SIZE);
            if (num_opus_bytes <= 0) {
                throw "failed to encode frame: " + std::string(opus_strerror(num_opus_bytes));
            }
            opus_data.resize(num_opus_bytes);

            std::vector<unsigned char> packet(12 + opus_data.size() + crypto_secretbox_MACBYTES);
            packet[0] = 0x80; // Type
            packet[1] = 0x78; // Version
            packet[2] = _sequence >> 8; // Sequence
            packet[3] = (unsigned char)_sequence;
            packet[4] = _timestamp >> 24; // Timestamp
            packet[5] = _timestamp >> 16;
            packet[6] = _timestamp >> 8;
            packet[7] = _timestamp;
            packet[8] = (unsigned char)(ssrc >> 24); // SSRC
            packet[9] = (unsigned char)(ssrc >> 16);
            packet[10] = (unsigned char)(ssrc >> 8);
            packet[11] = (unsigned char)ssrc;
            _sequence++;
            _timestamp += SAMPLE_RATE / 1000 * FRAME_MILLIS; // 48000 Hz / 1000 * 20 (ms)

            unsigned char nonce[crypto_secretbox_NONCEBYTES];
            memset(nonce, 0, crypto_secretbox_NONCEBYTES);
            for (int i = 0; i < 12; i++) {
                nonce[i] = packet[i];
            }
            crypto_secretbox_easy(packet.data() + 12, opus_data.data(), opus_data.size(), nonce, key.data());
            packet.resize(12 + opus_data.size() + crypto_secretbox_MACBYTES);

            std::string msg;
            msg.resize(packet.size(), '\0');
            for (unsigned int i = 0; i < packet.size(); i++) {
                msg[i] = packet[i];
            }
            buffer->push_back(msg);
        }

        run_timer->unset();
        timer.wait();
        unspeak();
        delete run_timer;
        delete buffer;
        opus_encoder_destroy(encoder);
        delete[] pcm_data;
    });
}
There are 3 possible causes:
I send packets late, so the server-side buffer runs out and the sound produced has some silence between every two packets. Maybe the timer is not accurate, so the sound is out of sync.
The encoding process is wrong, which somehow loses data.
Bad network (I have tested an open-source bot written in Java and it worked, so I can assume my network is good enough).
So I'm posting this question hoping someone who has experienced this situation can show me what is wrong and what I should do to correct it.
I figured out the problem myself. I want to post the solution here for anyone who needs it.
The problem is that the timer is unstable, so it usually sleeps longer than it should, which makes the music break up.
I changed it to an accurate sleep function which I found somewhere on the internet (I don't remember the source, sorry for that; if you know it, please credit it below).
Function source code:
#include <math.h>
#include <chrono>
#include <Windows.h>

static void timerSleep(double seconds) {
    using namespace std::chrono;
    static HANDLE timer = CreateWaitableTimer(NULL, FALSE, NULL);
    static double estimate = 5e-3;
    static double mean = 5e-3;
    static double m2 = 0;
    static int64_t count = 1;

    while (seconds - estimate > 1e-7) {
        double toWait = seconds - estimate;
        LARGE_INTEGER due;
        due.QuadPart = -int64_t(toWait * 1e7);
        auto start = high_resolution_clock::now();
        SetWaitableTimerEx(timer, &due, 0, NULL, NULL, NULL, 0);
        WaitForSingleObject(timer, INFINITE);
        auto end = high_resolution_clock::now();
        double observed = (end - start).count() / 1e9;
        seconds -= observed;

        ++count;
        double error = observed - toWait;
        double delta = error - mean;
        mean += delta / count;
        m2 += delta * (error - mean);
        double stddev = sqrt(m2 / (count - 1));
        estimate = mean + stddev;
    }

    // spin lock for the remainder
    auto start = high_resolution_clock::now();
    while ((high_resolution_clock::now() - start).count() / 1e9 < seconds);
}
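For context, a small usage sketch, assuming timerSleep from above is visible in the same translation unit; the function takes seconds, so a 20 ms frame interval is passed as 0.020:

#include <chrono>
#include <iostream>

int main() {
    using namespace std::chrono;
    const double FRAME_SECONDS = 0.020; // one 20 ms Opus frame per iteration
    for (int frame = 0; frame < 50; ++frame) {
        auto t0 = steady_clock::now();
        timerSleep(FRAME_SECONDS); // hybrid sleep: coarse waitable timer, then spin
        auto t1 = steady_clock::now();
        std::cout << "frame " << frame << " waited "
                  << duration_cast<microseconds>(t1 - t0).count() << " us\n";
    }
    return 0;
}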
Thank you for your support!

How to make a timer that counts down from 30 by 1 every second?

I want to make a timer that displays 30, 29, and so on, counting down every second, and then stops when there is an input. I know you can do this:
for (int i = 60; i > 0; i--)
{
    cout << i << endl;
    Sleep(1000);
}
This will output 60, 59 etc. But this doesn't allow for any input while the program is running. How do I make it so you can input things while the countdown is running?
Context
This is not a homework assignment. I am making a text adventure game and there is a section where an enemy rushes at you and you have 30 seconds to decide what you are going to do. I don't know how to make the timer allow the user to input things while it is running.
Your game runs at about 1 frame per second, so user input is a problem. Normally games have a higher frame rate, like this:
#include <Windows.h>
#include <iostream>

int main() {
    // Initialization
    ULARGE_INTEGER initialTime;
    ULARGE_INTEGER currentTime;
    FILETIME ft;
    GetSystemTimeAsFileTime(&ft);
    initialTime.LowPart = ft.dwLowDateTime;
    initialTime.HighPart = ft.dwHighDateTime;
    LONGLONG countdownStartTime = 300000000; // 30 seconds, expressed in 100-nanosecond units
    LONGLONG displayedNumber = 31; // Prevent 31 from being displayed

    // Game loop
    while (true) {
        GetSystemTimeAsFileTime(&ft); // FILETIME is in 100-nanosecond units
        currentTime.LowPart = ft.dwLowDateTime;
        currentTime.HighPart = ft.dwHighDateTime;

        //// Read Input ////
        bool stop = false;
        SHORT key = GetKeyState('S');
        if (key & 0x8000)
            stop = true;

        //// Game Logic ////
        LONGLONG elapsedTime = currentTime.QuadPart - initialTime.QuadPart;
        LONGLONG currentNumber_100ns = countdownStartTime - elapsedTime;
        if (currentNumber_100ns <= 0) {
            std::cout << "Boom!" << std::endl;
            break;
        }
        if (stop) {
            std::wcout << "Stopped" << std::endl;
            break;
        }

        //// Render ////
        LONGLONG currentNumber_s = currentNumber_100ns / 10000000 + 1;
        if (currentNumber_s != displayedNumber) {
            std::cout << currentNumber_s << std::endl;
            displayedNumber = currentNumber_s;
        }
    }
    system("pause");
}
If you're running this on Linux, you can use the classic select() call. When used in a while-loop, you can wait for input on one or more file descriptors, while also providing a timeout after which the select() call must return. Wrap it all in a loop and you'll have both your countdown and your handling of standard input.
https://linux.die.net/man/2/select
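A rough sketch of that select()-based approach (POSIX only; error handling omitted, and the countdown granularity is one second per tick):

#include <cstdio>
#include <sys/select.h>
#include <unistd.h>

int main() {
    for (int i = 30; i > 0; --i) {
        printf("%d\n", i);

        fd_set readfds;
        FD_ZERO(&readfds);
        FD_SET(STDIN_FILENO, &readfds);

        struct timeval timeout;
        timeout.tv_sec = 1;   // wait at most one second per tick
        timeout.tv_usec = 0;

        int ready = select(STDIN_FILENO + 1, &readfds, nullptr, nullptr, &timeout);
        if (ready > 0 && FD_ISSET(STDIN_FILENO, &readfds)) {
            char buf[256];
            ssize_t n = read(STDIN_FILENO, buf, sizeof buf); // consume the input line
            (void)n;
            printf("Input received, countdown stopped.\n");
            return 0;
        }
        // ready == 0: the second elapsed with no input, keep counting down
    }
    printf("Boom!\n");
    return 0;
}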

How does computer-driven ignition timing on gas engines work?

I have been dabbling with writing a C++ program that would control spark timing on a gas engine and have been running into some trouble. My code is very simple. It starts by creating a second thread that emulates the output signal of a Hall effect sensor that is triggered once per engine revolution. My main code processes the fake sensor output, recalculates the engine RPM, and then determines how long to wait for the crankshaft to rotate to the correct angle before sending spark to the engine. The problem I'm running into is that I am using a sleep function with millisecond resolution, and at higher RPMs I am losing a significant amount of data.
My question is: how are real automotive ECUs programmed to control spark accurately at high RPMs?
My code is as follows:
#include <iostream>
#include <Windows.h>
#include <process.h>
#include <fstream>
#include "GetTimeMs64.cpp"
using namespace std;

void HEEmulator(void *);

int HE_Sensor1;
int *sensor;
HANDLE handles[1];
bool run;
bool *areRun;

int main(void)
{
    int sentRpm = 4000;
    areRun = &run;
    sensor = &HE_Sensor1;
    *sensor = 1;
    run = TRUE;

    int rpm, advance, dwell, oHE_Sensor1, spark;
    oHE_Sensor1 = 1;
    advance = 20;
    uint64 rtime1, rtime2, intTime, curTime, sparkon, sparkoff;

    handles[0] = (HANDLE)_beginthread(HEEmulator, 0, &sentRpm);

    ofstream myfile;
    myfile.open("output.out");

    intTime = GetTimeMs64();
    rtime1 = intTime;
    rpm = 0;
    spark = 0;
    dwell = 10000;
    sparkoff = 0;

    while (run == TRUE)
    {
        rtime2 = GetTimeMs64();
        curTime = rtime2 - intTime;
        myfile << "Current Time = " << curTime << " ";
        myfile << "HE_Sensor1 = " << HE_Sensor1 << " ";
        myfile << "RPM = " << rpm << " ";
        myfile << "Spark = " << spark << " ";

        if (oHE_Sensor1 != HE_Sensor1)
        {
            if (HE_Sensor1 > 0)
            {
                rpm = (1 / (double)(rtime2 - rtime1)) * 60000;
                dwell = (1 - ((double)advance / 360)) * (rtime2 - rtime1);
                rtime1 = rtime2;
            }
            oHE_Sensor1 = HE_Sensor1;
        }

        if (rtime2 >= (rtime1 + dwell))
        {
            spark = 1;
            sparkoff = rtime2 + 2;
        }
        if (rtime2 >= sparkoff)
        {
            spark = 0;
        }
        myfile << "\n";
        Sleep(1);
    }
    myfile.close();
    return 0;
}

void HEEmulator(void *arg)
{
    int *rpmAd = (int*)arg;
    int rpm = *rpmAd;
    int milliseconds = (1 / (double)rpm) * 60000;

    for (int i = 0; i < 10; i++)
    {
        *sensor = 1;
        Sleep(milliseconds * 0.2);
        *sensor = 0;
        Sleep(milliseconds * 0.8);
    }
    *areRun = FALSE;
}
A desktop PC is not a real-time processing system.
When you use Sleep to pause a thread, you don't have any guarantees that it will wake up exactly after the specified amount of time has elapsed. The thread will be marked as ready to resume execution, but it may still have to wait for the OS to actually schedule it. From the documentation of the Sleep function:
Note that a ready thread is not guaranteed to run immediately. Consequently, the thread may not run until some time after the sleep interval elapses.
Also, the resolution of the system clock ticks is limited.
To more accurately simulate an ECU and the attached sensors, you should not use threads. Your simulation should not even depend on the passage of real time. Instead, use a single loop that updates the state of your simulation (both ECU and sensors) with each tick. This also means that your simulation should include the clock of the ECU.
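A minimal sketch of that idea (all names and numbers here are illustrative, not taken from the question): the simulation advances its own clock by a fixed step per iteration, and both the sensor model and the ECU logic are evaluated at every tick, so no sensor edge is ever missed regardless of how fast the simulated engine spins:

#include <cstdio>

int main() {
    // Tick-based ECU/sensor simulation: simulated time, not wall-clock time.
    const double TICK_US  = 10.0;                 // 10 microseconds of simulated time per tick
    const double RPM      = 4000.0;
    const double REV_US   = 60.0 * 1e6 / RPM;     // one revolution in microseconds
    const double ADVANCE  = 20.0;                 // spark advance in crank degrees

    double simTimeUs   = 0.0;
    double lastTrigger = 0.0;                     // time of the last Hall-sensor pulse
    bool   spark       = false;

    for (long tick = 0; tick < 30000; ++tick) {
        simTimeUs += TICK_US;

        // Sensor model: one pulse per revolution.
        if ((simTimeUs - lastTrigger) >= REV_US)
            lastTrigger += REV_US;

        // ECU model: fire the spark ADVANCE degrees before the next pulse.
        double angle = 360.0 * (simTimeUs - lastTrigger) / REV_US;
        bool newSpark = angle >= (360.0 - ADVANCE);
        if (newSpark && !spark)
            printf("spark at t=%.1f us (crank angle %.1f deg)\n", simTimeUs, angle);
        spark = newSpark;
    }
    return 0;
}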

How to insert a timer in the popen function in C++?

I have this function; the input is the variable "cmd", for example a dmesg command.
int i = 0;
char * bufferf;
bufferf = (char *) calloc(sizeof(char), 200000);
char buffer[1000][1280];
memset(buffer, 0, 1000 * 1280);
memset(bufferf, 0, strlen(bufferf));

FILE* pipe = popen(cmd, "r");
if (!pipe) {
    send(client_fd, "EXCEPTION", 9, 0);
}
while (!feof(pipe)) {
    if (fgets(buffer[i], 128, pipe) != NULL)
    {
        strcat(bufferf, buffer[i]);
    }
    i++;
}
pclose(pipe);
std::cout << bufferf;
send(client_fd, bufferf, strlen(bufferf), 0);
}
Well, my goal is to calculate the amount of time between the start and the end of the while statement, by accumulating, on each pass, a variable that counts the time that has passed.
For example, dmesg is ~700 lines of output, so the while loop runs 700 times and I have to add the elapsed time 700 times to calculate the total sum.
How can I do that?
I've tried difftime but it doesn't work very well.
Any other solutions?
Thank you.
You could make an extremely basic class that uses clock() to measure the time:
#include <ctime>

class Timer
{
private:
    clock_t _start, _duration;
public:
    Timer() : _start(0), _duration(0) { }
    void start() { _start = clock(); }
    void stop() { _duration = clock() - _start; }
    float getTime() { return (float)_duration / CLOCKS_PER_SEC; }
};
Obviously multiply by 1000 if you want to display the time in milliseconds.
And then just:
Timer t;
t.start();
// do something
t.stop();
cout << "Duration: " << t.getTime() << endl;
Also, take note of what sarnold said: the buffer is huge.
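One caveat worth keeping in mind: on POSIX systems clock() reports processor time consumed by the program, and the loop above spends most of its time blocked waiting on the pipe, so the reported duration can be much smaller than the wall-clock time. If wall-clock time is what you want, the same class can be written around std::chrono::steady_clock; a minimal sketch (WallTimer is just an illustrative name):

#include <chrono>

class WallTimer
{
private:
    std::chrono::steady_clock::time_point _start;
    std::chrono::steady_clock::duration _duration{};
public:
    void start() { _start = std::chrono::steady_clock::now(); }
    void stop()  { _duration = std::chrono::steady_clock::now() - _start; }
    // elapsed wall-clock time in seconds
    float getTime()
    {
        return std::chrono::duration_cast<std::chrono::duration<float>>(_duration).count();
    }
};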