I currently have code that can display minutes and seconds. The problem I'm facing is that when the minutes reach 60, the counter just keeps going to 61 and beyond. How can I add hours?
-(void)update:(ccTime)dt {
    totalTime += dt;
    currentTime = (int)totalTime;
    if (myTime < currentTime)
    {
        myTime = currentTime;
        [countUpTimer setString:[NSString stringWithFormat:@"%02d:%02d", myTime/60, myTime%60]];
    }
}
Use integer division and modulo so each field wraps correctly: myTime/3600 gives the hours, (myTime/60)%60 the minutes, and myTime%60 the seconds:
[countUpTimer setString:[NSString stringWithFormat:@"%02d:%02d:%02d", myTime/3600, (myTime/60)%60, myTime%60]];
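As a quick sanity check of that arithmetic, here is a standalone sketch (plain C++, independent of the cocos2d code above; %02d behaves the same in printf as in stringWithFormat:):

#include <cstdio>

int main() {
    int myTime = 3725; // 1 hour, 2 minutes, 5 seconds
    std::printf("%02d:%02d:%02d\n", myTime / 3600, (myTime / 60) % 60, myTime % 60);
    // prints 01:02:05
    return 0;
}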
I am trying to add days to a formatted date in C++, but without any success.
The date is passed as a SYSTEMTIME, and the days to add as a long.
In the following example I add the days to the date converted to a long, which is wrong; I'm only using it to illustrate the idea.
long FormatDate(SYSTEMTIME* cStartTime, long daysToAdd)
{
    char szToday[16];
    sprintf(szToday, "%04d%02d%02d", cStartTime->wYear, cStartTime->wMonth, cStartTime->wDay);
    long finalDate = atol(szToday) + daysToAdd; // e.g. if szToday is "20210601" and daysToAdd is 10, then finalDate is 20210611
    return finalDate;
}
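For example, this standalone check shows the failure mode of the numeric approach (the values are just illustrative):

#include <cstdio>

int main() {
    long today = 20210628;   // June 28, 2021 encoded as YYYYMMDD
    long wrong = today + 10; // 20210638: "day 38" of June, not a valid date
    std::printf("%ld\n", wrong);
    return 0;
}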
Thanks.
After some searching and debugging I am using the following code, and it's working.
Note that the hour, minute, second and millisecond fields of customDate must be set, otherwise it won't work.
In this scenario I'm adding seconds, so it can be more generic; when I need to add days I convert them first: daysToAdd * 24 * 60 * 60.
SYSTEMTIME AddSeconds(SYSTEMTIME s, INT64 seconds) {
    FILETIME f;
    SystemTimeToFileTime(&s, &f);
    // FILETIME counts 100-ns intervals; copy through ULARGE_INTEGER for the arithmetic
    ULARGE_INTEGER t;
    t.LowPart = f.dwLowDateTime;
    t.HighPart = f.dwHighDateTime;
    t.QuadPart += seconds * 10000000LL;
    f.dwLowDateTime = t.LowPart;
    f.dwHighDateTime = t.HighPart;
    FileTimeToSystemTime(&f, &s);
    return s;
}
void Func()
{
    INT64 daysToAdd = 15;
    SYSTEMTIME customDate;
    customDate.wYear = 2021;
    customDate.wMonth = 1;
    customDate.wDay = 1;
    customDate.wHour = 0;
    customDate.wMinute = 0;
    customDate.wSecond = 0;
    customDate.wMilliseconds = 0;
    INT64 secondsToAdd = daysToAdd * 24 * 60 * 60;
    SYSTEMTIME finalDate = AddSeconds(customDate, secondsToAdd);
}
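If Win32 isn't a hard requirement, the same day arithmetic can be sketched portably with the C time library (an alternative to the code above, not a replacement): mktime normalizes out-of-range fields, so bumping tm_mday is enough. As with SYSTEMTIME, every field of the tm must be set first:

#include <ctime>

std::tm AddDays(std::tm date, int daysToAdd)
{
    date.tm_mday += daysToAdd; // may run past the end of the month
    date.tm_isdst = -1;        // let mktime determine daylight saving
    std::mktime(&date);        // normalizes, e.g. Jan 32 becomes Feb 1
    return date;
}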
So I'm trying to create an energy meter device which reads power every minute and then sends it every 5 minutes through a LoRa server, using an MKR 1300 Arduino. The problem is that, as of now, the hardware is losing a few milliseconds on each delay, so the timestamps on the server drift, e.g.:
10:50:30
10:50:30
10:50:30
... 2 hours later
10:50:29
10:50:29
...
10:49:59
The code looks like this:
#include <MKRWAN.h>
#include "EmonLib.h"
LoRaModem modem;
String appEui = "1234567891011121";
String appKey = "ffffffffffffffffffffffffffffffff";
EnergyMonitor emon1;
EnergyMonitor emon2;
EnergyMonitor emon3;
double totalWatt;
int time_running;
int sending;
int totalKW;
int DELAY = 60000; // millis
void setup() {
    Serial.begin(115200);
    if (!modem.begin(EU868)) {
        Serial.println("Failed to start module");
        while (1) {}
    }
    Serial.print("Your module version is: ");
    Serial.println(modem.version());
    Serial.print("Your device EUI is: ");
    Serial.println(modem.deviceEUI());
    Serial.println("Connecting");
    int connected = modem.joinOTAA(appEui, appKey);
    if (!connected) {
        Serial.println("Something went wrong; are you indoor? Move near a window and retry");
        while (1) {}
    }
    Serial.println("Connected");
    modem.minPollInterval(60);
    analogReadResolution(9);
    emon1.current(1, 53);
    emon2.current(2, 53);
    emon3.current(3, 53);
    time_running = 0;
    randomSeed(analogRead(A4));
}
void loop() {
    unsigned long StartTime = millis();
    totalWatt = 0;
    unsigned long delay_send = 0;
    int sending = 0;
    double Irms1 = emon1.calcIrms(600);
    if (Irms1 < 0.3) Irms1 = 0;
    double Watt1 = Irms1 * 230;
    double Irms2 = emon2.calcIrms(600);
    if (Irms2 < 0.3) Irms2 = 0;
    double Watt2 = Irms2 * 230;
    double Irms3 = emon3.calcIrms(600);
    if (Irms3 < 0.3) Irms3 = 0;
    double Watt3 = Irms3 * 230;
    totalWatt = Watt1 + Watt2 + Watt3;
    totalKW = totalKW + totalWatt/1000;
    if (time_running == 5) { // use 15 for 15-minute intervals
        double IrmsTotal = Irms1 + Irms2 + Irms3;
        String msg = "{\"id\":\"avac_aud1\",\"kW\":"+String(totalKW)+", \"current\":"+String(IrmsTotal)+"}";
        int err;
        modem.beginPacket();
        modem.print(msg);
        err = modem.endPacket(true);
        if (err > 0) {
            // message sent correctly
            time_running = 0;
            totalKW = 0;
        } else {
            Serial.println("ERR");
            time_running = 0;
        }
    }
    time_running = time_running + 1;
    if ((millis() - StartTime) > DELAY) {
        delay(10);
        return;
    } else {
        delay(DELAY - (millis() - StartTime));
        return;
    }
}
I tried adding a variable ARD_DELAY (not shown above) that subtracts 7 to 8 milliseconds from that last delay, but apparently that only made it worse (it now loses 1 second every hour instead of every 2 hours), so today I'll try adding those 7 to 8 millis instead and see if that works. But I would really like to know why this is happening, because from what I can see in my code, the delay should always account for the processing time, including the time spent sending data.
The question is how precise your clock is in the first place...
Still, I personally would rather go with the following approach:
#define DELAY (5UL * 60UL * 1000UL) // or whatever is appropriate...

static unsigned long timestamp = millis();
if(millis() - timestamp > DELAY)
{
    // adding a fixed constant will prevent accumulating deviations over time
    timestamp += DELAY;
    // run the every-5-min task...
}
Edit: combined 1-min and 5-min task:
Variant 1:
#define DELAY_SHORT (1UL * 60UL * 1000UL)
#define DELAY_LONG (5UL * 60UL * 1000UL)

static unsigned long timestampS = millis();
static unsigned long timestampL = timestampS;
if(millis() - timestampS > DELAY_SHORT)
{
    timestampS += DELAY_SHORT;
    // run the every-1-min task...
}
if(millis() - timestampL > DELAY_LONG)
{
    timestampL += DELAY_LONG;
    // run the every-5-min task...
}
Variant 2:
#define DELAY_1M (1UL * 60UL * 1000UL)

static unsigned long timestamp = millis();
if(millis() - timestamp > DELAY_1M)
{
    // adding a fixed constant will prevent accumulating deviations over time
    timestamp += DELAY_1M;
    // run the every-1-min task...
    static unsigned int counter = 0;
    if(++counter == 5)
    {
        counter = 0;
        // run the every-5-min task...
    }
}
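Putting Variant 2 together as a complete, compilable sketch (task1min and task5min are placeholders standing in for your measurement and send code):

#define DELAY_1M (1UL * 60UL * 1000UL)

void task1min() { /* read power */ }
void task5min() { /* send over LoRa */ }

void setup() {}

void loop() {
    static unsigned long timestamp = millis();
    if (millis() - timestamp > DELAY_1M) {
        timestamp += DELAY_1M; // fixed step, so deviations don't accumulate
        task1min();
        static unsigned int counter = 0;
        if (++counter == 5) {
            counter = 0;
            task5min();
        }
    }
}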
Instead of trying to measure a start time and adding delay depending on that, you could keep track of the timing for your next cycle.
unsigned long next_cycle = DELAY;
...
void loop() {
    ...
    // unsigned math: assumes the work above takes less than DELAY
    delay( next_cycle - millis() );
    next_cycle += DELAY;
}
If you also want to adjust for any time the program spends on initialization or similar, you can set next_cycle = millis() + DELAY; before you enter your loop.
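For illustration, a minimal complete version of that idea with the initialization adjustment (doWork is a placeholder):

#define DELAY (1UL * 60UL * 1000UL)

unsigned long next_cycle;

void doWork() { /* measure and send */ }

void setup() {
    next_cycle = millis() + DELAY; // absorb time spent in initialization
}

void loop() {
    doWork();
    delay(next_cycle - millis()); // assumes doWork() takes less than DELAY
    next_cycle += DELAY;
}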
I need to take the current local time, including milliseconds, and pass it to an embedded device. This device has no idea about calendar time, but it has its own timer with 1 ms accuracy. When the device receives the current timestamp, it opens a log file and writes the timestamp at the beginning. From then on, it writes various messages to the log, each with the number of milliseconds elapsed since that initial time. Finally, the device's log file is uploaded to the host, where it must be parsed and all relative time intervals converted back to full calendar time. The first part of the host program looks like this:
#include <chrono>
#include <ctime>

struct timestamp
{
    int year; // e.g. 2018
    int month; // [1-12]
    int day; // [1-31]
    int hour; // [0-23]
    int minute; // [0-59]
    int sec; // [0-59]
    int ms; // [0-999]
};
timestamp time_point_to_timestamp(std::chrono::time_point<std::chrono::system_clock> tp)
{
    // split into whole seconds plus the sub-second remainder in milliseconds
    auto seconds = std::chrono::time_point_cast<std::chrono::seconds>(tp);
    auto fraction = tp - seconds;
    auto milliseconds = std::chrono::duration_cast<std::chrono::milliseconds>(fraction);
    time_t tt = std::chrono::system_clock::to_time_t(tp);
    tm* ptm = localtime(&tt);
    timestamp t;
    t.year = ptm->tm_year + 1900;
    t.month = ptm->tm_mon + 1;
    t.day = ptm->tm_mday;
    t.hour = ptm->tm_hour;
    t.minute = ptm->tm_min;
    t.sec = ptm->tm_sec;
    t.ms = static_cast<int>(milliseconds.count());
    return t;
}
void start()
{
    timestamp ts = time_point_to_timestamp(std::chrono::system_clock::now());
    // send ts to embedded device
    // ...
}
Now, when I get the log from the device back to the host, it looks like this:
2018 6 24 8 25 52 598 // start time ts
500 message 1 // ms elapsed from ts
2350 message 2 // ms elapsed from ts
...
I need to parse this file and convert every message, printing its full date and time. For example, 500 will be converted to:
2018 6 24 8 25 53 098
So, I need some way to convert timestamp to a C++ type that allows adding time intervals to it (time_point, duration?) and printing it in human-readable form. My compiler supports C++14.
I'd do this:
// assuming: using namespace std::chrono; (also for the snippets below)
int64_t to_epoch_ms(time_point<system_clock> tp)
{
    return duration_cast<milliseconds>(tp.time_since_epoch()).count();
}
Then pass the milliseconds since epoch to the device, where it can be logged as e.g. 1529819166927. Adding milliseconds is trivial and fast, whether you do it directly using the int64_t or by converting back to a time_point:
time_point<system_clock> from_epoch_ms(int64_t ms)
{
    // the time_point duration constructor is explicit, so name the type
    return time_point<system_clock>(milliseconds(ms));
}
auto tp1 = from_epoch_ms(ms + 123);
auto tp2 = from_epoch_ms(ms) + milliseconds(456);
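For the parsing side, here is a sketch of turning a log offset back into calendar time (print_log_time is a hypothetical helper; the log's first line is assumed to hold the start time as epoch milliseconds):

#include <cstdint>
#include <cstdio>
#include <ctime>

void print_log_time(int64_t start_epoch_ms, int64_t offset_ms)
{
    int64_t ms = start_epoch_ms + offset_ms;
    std::time_t tt = static_cast<std::time_t>(ms / 1000);
    std::tm* ptm = std::localtime(&tt);
    std::printf("%04d %d %d %d %d %d %03d\n",
                ptm->tm_year + 1900, ptm->tm_mon + 1, ptm->tm_mday,
                ptm->tm_hour, ptm->tm_min, ptm->tm_sec,
                static_cast<int>(ms % 1000));
}

// e.g. print_log_time(start_ms, 500) would print 2018 6 24 8 25 53 098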
I'm making an application that has a countdown timer. The problem is: when I press startTimer() it counts down, but when I go back to the previous page and then re-enter, the timer is no longer showing. I want to be able to switch to another page, or to another app, and have the timer keep running in the background until it finishes, the same way the clock app on iPhone or Android does.
startTimer() {
    this.isStart = true;
    // var timer = 60; //Second
    this.t = setInterval(() => {
        if (!this.isPause) {
            var hours;
            var minutes;
            var seconds;
            hours = Math.floor(this.time / 3600);
            minutes = Math.floor((this.time % 3600) / 60); // minutes left over after full hours
            seconds = Math.floor(this.time % 60);
            hours = hours < 10 ? "0" + hours : hours;
            minutes = minutes < 10 ? "0" + minutes : minutes;
            seconds = seconds < 10 ? "0" + seconds : seconds;
            this.output = hours + ":" + minutes + ":" + seconds;
            this.percent = this.time / this.time * 100; // note: always 100; this probably should divide remaining by total duration
            this.increment = 180 / 100;
            const progress = 'rotate(' + this.increment * this.percent + 'deg)';
            this.transform = progress;
            this.fixTransform = progress;
            this.time--;
            if (this.time === 0) {
                clearInterval(this.t);
            }
        }
    }, 1000);
    console.log('start');
}
The code works fine when .tv_nsec is replaced with .tv_sec, but I need more accuracy: seconds to at least two decimal places. I'm wondering if this may be an issue with the Pi's clock. The code will eventually be used to calculate BPM, but for now it's used to calculate the time between clicks.
gboolean tapTemp(GtkButton *button, gpointer user_data)
{
    //errorMsg = bmp;
    if(tapdown)
    {
        tapdown = false;
        clock_gettime(CLOCK_REALTIME, &beetTime);
        time_difference = beetTime.tv_nsec; // only the nanosecond part: wraps every second
        bpm = time_difference - start_time;
        errorMsg = bpm;
    }
    else
    {
        tapdown = true;
        clock_gettime(CLOCK_REALTIME, &beetTime);
        start_time = beetTime.tv_nsec;
        errorMsg2 = start_time;
    }
    return TRUE; // a gboolean handler must return a value
}
tv_nsec wraps back to zero every second. To get a continually increasing time, combine it with tv_sec, e.g. thistime = beetTime.tv_sec + 0.001*(beetTime.tv_nsec/1000000) to get the time to the nearest millisecond.
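For example, a small helper that measures the gap between two clock_gettime() readings this way (elapsed_ms is a hypothetical name; the snippet compiles as C or C++ on POSIX):

#include <time.h>

/* Elapsed milliseconds between two readings, combining tv_sec and
   tv_nsec so the result does not wrap back to zero every second. */
double elapsed_ms(const struct timespec *start, const struct timespec *end)
{
    return (end->tv_sec - start->tv_sec) * 1000.0
         + (end->tv_nsec - start->tv_nsec) / 1000000.0;
}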