algorithm for calculating a running clicks/second value (à la a speedometer) - C++

I'm trying to figure out how to produce a running calculation of clicks per second (e.g. an app with a window I click on and it gives me a speedometer-like value of the 'speed' of my clicks in clicks per second). For some reason the algorithm is eluding me.
It's easy if, at each second, I just report how many clicks happened in the last second. But where it gets tricky is if there was one click in second 1, then 0 clicks in seconds 2-9 and 1 click in second 10. Presumably that would be 0.2 clicks per second--although really only if that rate was kept up and averaged out over time. If the click in second 10 was followed by 0 clicks for 40 seconds, then it should be 0 clicks/second, not 0.04 clicks/second.
So clearly I need some kind of window within which I'm willing to presume the clicks are part of a pattern, or at least associated with the last ones. But it's just not making sense to me.
I'm using openFrameworks for this, so I have an update() function that is called more than once per second (say 30x/sec), and a mousePressed() function that lets me increment a variable to track the clicks. I can use difftime() and time() to track whether I just crossed into a new second, and then use fmod() to figure out if I just crossed some larger interval.
Any suggestions are appreciated.

I think you want to calculate the running average of the clicks per second. You would use a circular buffer of counters with a length of, say, 30 for a 30-second window. The average clicks per second is then the sum of the counters divided by 30.
An index points to the current counter; the index is incremented modulo 30 every second, and the counter at the new position is set to zero.
example:
const unsigned BUFFER_SIZE = 30;
unsigned counters[BUFFER_SIZE] = {0}; // zero-initialize so the first averages are correct
unsigned current = 0;
time_t last;

void init() {
    time(&last);
}

void update() {
    time_t now;
    time(&now);
    while (now - last >= 1) {
        ++last;
        current = (current + 1) % BUFFER_SIZE;
        counters[current] = 0;
    }
}

void mousePressed() {
    ++counters[current];
}

float average() {
    float sum = 0;
    for (unsigned i = 0; i < BUFFER_SIZE; ++i) {
        sum += counters[i];
    }
    return sum / BUFFER_SIZE;
}

This is pseudo code, but I think it will do what you are asking:
onUpdate() {
    if (currentTime() - lastClickTime > idleTimeout) {
        // reset the clickometer to zero
    } else {
        // calculate the speed
    }
}
onMouseClick() {
    lastClickTime = currentTime();
    // and whatever else needs to happen
}
Basically you are just tracking the time of the last click and making sure it happened within the idleTimeout, which you obviously have to define as some span of time.
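To tie the two ideas together, here is a compilable sketch of that pseudo-code (all names here are illustrative, and the actual speed computation is left as a stub, e.g. the circular-buffer average from the other answer):

```cpp
#include <chrono>

using Clock = std::chrono::steady_clock;

// Hypothetical "clickometer" state; not from the original post.
struct Clickometer {
    Clock::time_point lastClick{};          // time of the most recent click
    double clicksPerSec = 0.0;              // the displayed speed
    std::chrono::seconds idleTimeout{5};    // tune to taste

    void onMouseClick(Clock::time_point now) {
        lastClick = now;
        // ...also bump a click counter / circular buffer here...
    }

    void onUpdate(Clock::time_point now) {
        if (now - lastClick > idleTimeout) {
            clicksPerSec = 0.0;             // reset the clickometer to zero
        } else {
            // ...recompute the speed from the recent click history...
        }
    }
};
```

Passing the time in explicitly (rather than calling Clock::now() inside) keeps the struct easy to test.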

Related

Creating a custom millisecond-based chrono clock that increments counters, and resets every year?

I'm starting out with a basic task in C++, coming from Pascal and Lua. I picked this as it seems simple enough, and useful for the project I'm working on.
I need to create a millisecond timer that starts at 0 for everything; Year 0, month 0, week 0, day 0. It will also run at a faster timescale than real life; every second that passes is 20 seconds for this timer. Which should be easy? Just multiply each ms by 20 and save that.
As the timer progresses, increment various counters that track seconds/minutes/hours/days/weeks/months/seasons/years passed. And after a year has passed, reset the timer back to 0ms and continue incrementing.
I will also be stopping, saving, loading, and resuming. That bit I'll be able to handle, as I have access to a MySQL database to take care of serializing data. I'm just perplexed by std::chrono, and couldn't understand how to make my own arbitrary starting time and reference date.
Don't have any code yet, but here's the layout for the C++ file:
namespace FyTyGameTime
{
    //Number of timescale milliseconds for each realtime millisecond.
    uint32_t TimeScaleMsMult = 20;
    //Current iteration's total number of milliseconds
    uint32_t TimeScaleCounter = 0;
    //Total accumulated time
    uint32_t TimeScaleMinutes = 0;
    uint32_t TimeScaleHours = 0;
    uint32_t TimeScaleDays = 0;
    uint32_t TimeScaleWeeks = 0;
    uint32_t TimeScaleMonths = 0;
    uint32_t TimeScaleSeasons = 0;
    uint32_t TimeScaleYears = 0; //Number of years passed since timer started

    //Current iteration's counters.
    //Current start date is 01/01/0000 7am
    uint32_t TimeScaleCurrentSecond = 0;   //0-86399, resets every day
    uint32_t TimeScaleCurrentMinute = 420; //0-1439, resets every day
    uint32_t TimeScaleCurrentHour = 7;     //0-23, resets every day
    uint32_t TimeScaleCurrentDay = 0;      //0-6, resets every week
    uint32_t TimeScaleCurrentWeek = 0;     //0-3, resets every month
    uint32_t TimeScaleCurrentMonth = 0;    //0-11, resets every year
    uint32_t TimeScaleCurrentSeason = 0;   //0-3, resets every year
    uint32_t TimeScaleCurrentYear = 0;     //Starting year, incremented every year. !Does not reset!

    uint32_t RealTimeToTimescale(uint32_t iMilliseconds)
    {
        return uint32_t(iMilliseconds * TimeScaleMsMult);
    }
    bool SaveTimescale()
    {
        //
    }
    void LoadTimescale()
    {
        //
    }
    bool ResetTimescale()
    {
        //
    }
}
Any help and clarification on std::chrono is appreciated, thank you. C++ is quite a bit more terse than the languages I usually use and feel comfortable with.
edit: From what I've been able to read, std::chrono::steady_clock is the way to go. But it doesn't have any member functions to pause, resume, and to modify its values?
The time points of this clock cannot decrease as physical time moves forward and the time between ticks of this clock is constant. This clock is not related to wall clock time (for example, it can be time since last reboot), and is most suitable for measuring intervals.
The only member function it has is now()...
edit2: No, that is still based on some external timer, while I need to increment every ms. It seems like the three clocks provided by chrono are for duration checks, rather than actually being used to do something every time interval?
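For what it's worth, one common pattern (a sketch, not from the question; the names are illustrative) is to use steady_clock only for measuring real elapsed time, and to keep the pausing, scaling, and resetting as your own state layered on top. steady_clock itself never pauses, but your accumulator can:

```cpp
#include <chrono>

// Sketch: a pausable, scaled game timer driven by steady_clock.
// All names are illustrative, not from the original post.
class ScaledTimer {
    using Clock = std::chrono::steady_clock;
    Clock::time_point last_ = Clock::now();
    Clock::duration scaledElapsed_{0}; // accumulated *scaled* time
    bool paused_ = false;
    static constexpr int kScale = 20;  // 1 real second == 20 game seconds

public:
    // Call as often as you like, e.g. once per frame.
    void tick() {
        Clock::time_point now = Clock::now();
        if (!paused_) scaledElapsed_ += (now - last_) * kScale;
        last_ = now; // always advance, so resuming doesn't jump
    }

    void pause()  { tick(); paused_ = true; }
    void resume() { last_ = Clock::now(); paused_ = false; }

    long long scaledMilliseconds() const {
        return std::chrono::duration_cast<std::chrono::milliseconds>(
                   scaledElapsed_).count();
    }
    void reset() { scaledElapsed_ = Clock::duration{0}; } // e.g. after a game year
};
```

Each frame you would call tick(), read scaledMilliseconds(), and roll any newly passed milliseconds into your second/minute/hour counters; when a scaled year has elapsed, call reset().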

(Arduino) Is there a way to add a stopwatch to count how long the user has spent time using the program?

Our project is an MP3 player; we're all beginners at coding. The project has a sort of 'main menu' from which the user is able to choose options such as: show track info, shuffle playlist, etc. Now we're trying to add a feature that tells the user how long they've been logged in. Our stopwatch code:
unsigned long timeListened = 0;
static int sknt1 = 0;
static int sknt10 = 0;
static int min1 = 0;
static int min10 = 0;
static int hour1 = 0;
static int hour10 = 0;
static int hour100 = 0;

void loop() {
    timeListened = millis();
    if (timeListened >= 1000) {
        sknt1 = sknt1 + 1;
        timeListened = 0;
    }
    if (sknt1 == 10) {
        sknt10 = sknt10 + 1;
        sknt1 = 0;
    }
    if (sknt10 == 6) {
        min1 = min1 + 1;
        sknt10 = 0;
    }
    if (min1 == 10) {
        min10 = min10 + 1;
        min1 = 0;
    }
    if (min10 == 6) {
        hour1 = hour1 + 1;
        min10 = 0;
    }
    if (hour1 == 10) {
        hour10 = hour10 + 1;
        hour1 = 0;
    }
    Serial.print(hour100);
    Serial.print(hour10); Serial.print(hour1); Serial.print(":");
    Serial.print(min10); Serial.print(min1); Serial.print(":");
    Serial.print(sknt10); Serial.println(sknt1);
    delay(1000);
}
The issue is, we can't find the right spot to place this code in. The only results we've had are that either the stopwatch adds 1 second every time the user chooses the option from the menu, or the stopwatch time is just 000:00:00.
Is there a way to make arduino run the stopwatch in the background while the main program (playing music) is running ?
I have written some code that prints the elapsed time since the Arduino was turned on for another project. It is not efficient, but it works.
You could call this every second in your loop() in order to update it.
#define second (1000L)
#define minute (60L*1000L)
#define hour (60L*60L*1000L)

unsigned long testTime = millis();
unsigned long hh = testTime / hour;
unsigned long mm = (testTime % hour) / minute;
unsigned long ss = ((testTime % hour) % minute) / second;
Serial.print(hh); Serial.print(':');
Serial.print(mm); Serial.print(':');
Serial.print(ss); Serial.print('\n');
You are already using millis() in your code, which gives you the time in milliseconds elapsed since the program started.
So whenever you want to know how long the program has been running just call millis().
This will overflow after approximately 50 days but that's probably irrelevant in your use case.
The problem with your code is that you add 1 second every time millis() returns a value > 1000, which is always the case once your Arduino has been running for more than 1 second. This also has the effect that you only add 1 second each time you check, not every second.
There is also no need to delay your program to get an update every second.
Simply convert the return value of millis() to a time string. It gives you the milliseconds since program start, so calculate how many hours, minutes and seconds that is. I guess I don't have to tell you that 60000 ms make a minute...
If you want to know the time passed since a certain event, for example a user login, you store the time of that event and calculate the difference to the current time.
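Sketched in plain C++ so it can be checked off-board (on the Arduino you would pass millis() for the current count), the suggested pattern looks like this; the names Hms and elapsedSince are mine, not from the post:

```cpp
// Store the millisecond count at the event (e.g. login) and subtract it
// from the current count; then split the difference into h/m/s.
struct Hms { unsigned long hh, mm, ss; };

Hms elapsedSince(unsigned long eventMs, unsigned long nowMs) {
    unsigned long elapsed = nowMs - eventMs;  // ms since the event
    return { elapsed / 3600000UL,             // whole hours
             (elapsed / 60000UL) % 60UL,      // minutes within the hour
             (elapsed / 1000UL) % 60UL };     // seconds within the minute
}
```

On the board: call something like `loginTime = millis();` at login, then `elapsedSince(loginTime, millis())` whenever you want to print the session length; no delay(1000) or background counting is needed.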

Issue with STM32F769 PWM output when updated according to condition

I am using STM32F769 Disc board with Mbed online compiler. My task is to change the PWM frequency and duty ratio according to input.
I've created a simple algorithm for my needs, and the program is working well, but whenever the controller updates the PWM frequency there is strange behavior (overlapping output maybe; it's difficult for me to explain verbally): the frequency is changed instantly and the output is incorrect at that moment. For other controllers (like Arduino) this never happens; the controller updates the value after the current PWM period is over. But not in this case.
What can be wrong?
I thought of adding a small delay before the value is updated, but that will not work, as a different delay would be needed every time. I have attached the code and screenshots.
#include "mbed.h"

AnalogIn analog_value(A0);
PwmOut pulse(D11);

int main() {
    double meas_v = 0;
    double out_freq, out_duty, s_time;
    while (1) {
        meas_v = analog_value.read() * 3300;
        if (meas_v < 1) {
            out_freq = 50000;
            out_duty = 40;
        } else if (meas_v >= 1000) {
            out_freq = 100000;
            out_duty = 80;
        } else {
            out_freq = 50000 + (meas_v * 50);
            out_duty = 40 + (meas_v * 0.04);
        }
        pulse.period(1.0 / out_freq);
        pulse = out_duty / 100;
        s_time = 0.0001;
        wait(s_time);
    }
}
The output should be updated after the current period is completed, not instantly.
[Screenshot: the error I am getting]
The underlying HAL code probably resets the current count value of the timer when you set the new period. You'll have to read the current timer cnt value and wait for it to reach 0. You can set a new period when the timer cnt value reaches 0.
You need to update the value in the update interrupt, or use the timer DMA burst mode.

Determining if 5 seconds have passed

I'm trying to determine if five seconds have passed in a console application since the last time I checked. I think my logic is slightly off and I don't know how to resolve it.
My lastCheck variable is initially 0 when the program begins. It's responsible for holding the "old time".
LastCheck is updated by CheckSeconds(), which gives it a new "old time".
If LastCheck was equal to 1227323, and the now variable is currently equal to 1232323, then I would know 5000 milliseconds have passed. (In reality, the numbers are much greater than this.)
Else, I don't want anything to happen, I want to wait until these five seconds have actually passed.
BACKEND
#include <windows.h> // GetTickCount()
#include <vector>

inline std::vector<int> CheckSeconds(int previous, int timeinseconds)
{
    //check if a certain amount of seconds have passed.
    int now = GetTickCount();
    int timepassed = 0;
    std::vector<int> trueandnewtime;
    //if the current time minus the old time is greater than the interval
    //in milliseconds, then that much time has passed.
    if (now - previous > timeinseconds * 1000)
        timepassed = 1;
    trueandnewtime.push_back(timepassed);
    trueandnewtime.push_back(now);
    return trueandnewtime;
}
FRONTEND
storage = CheckSeconds(LastCheck, 5);
LastCheck = storage.at(1);
if (storage.at(0) == 1)
{
....blahblahblah.....
}
Anyone know what I'm doing wrong? I must have a logic error somewhere, or I'm being dumb.
Also worth noting, this code is in a while loop that runs constantly with a Sleep(60) in it. It's a console application at the moment.
Appreciate any assistance.
Fixed it by putting the Lastcheck set into the loop.
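For reference, the same check can be packaged with std::chrono::steady_clock instead of GetTickCount() (a sketch; the IntervalTimer name is illustrative). The key point is the same as the fix above: the stored time is only advanced when the interval actually fires, otherwise it never accumulates a full five seconds:

```cpp
#include <chrono>

// Fires at most once per interval; resets itself when it fires.
class IntervalTimer {
    using Clock = std::chrono::steady_clock;
    Clock::time_point last_;
    std::chrono::milliseconds interval_;

public:
    explicit IntervalTimer(std::chrono::milliseconds interval)
        : last_(Clock::now()), interval_(interval) {}

    bool expired() {
        Clock::time_point now = Clock::now();
        if (now - last_ >= interval_) {
            last_ = now;  // crucial: only update when it actually fired
            return true;
        }
        return false;
    }
};
```

In the console loop this becomes `IntervalTimer t(std::chrono::milliseconds(5000));` outside the loop and `if (t.expired()) { ... }` inside it, with no vector of return values needed.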

Calculating moving average in C++

I am trying to calculate the moving average of a signal. The signal value (a double) is updated at random times.
I am looking for an efficient way to calculate its time-weighted average over a time window, in real time. I could do it myself, but it is more challenging than I thought.
Most of the resources I've found on the internet calculate the moving average of a periodic signal, but mine updates at random times.
Does anyone know good resources for that?
Thanks
The trick is the following: you get updates at random times via void update(int time, float value). However, you also need to track when an update falls off the time window, so you set an "alarm" which is called at time + N and removes that update from ever being considered again in the computation.
If this happens in real time, you can request the operating system to call a method void drop_off_oldest_update(int time) at time + N.
If this is a simulation, you cannot get help from the operating system and you need to do it manually. In a simulation you would call methods with the time supplied as an argument (which does not correlate with real time). However, a reasonable assumption is that the calls are guaranteed to be such that the time arguments are increasing. In this case you need to maintain a sorted list of alarm time values, and for each update and read call you check if the time argument is greater than the head of the alarm list. While it is greater you do the alarm related processing (drop off the oldest update), remove the head and check again until all alarms prior to the given time are processed. Then do the update call.
I have so far assumed it is obvious what you would do for the actual computation, but I will elaborate just in case. I assume you have a method float read(int time) that you use to read the values. The goal is to make this call as efficient as possible. So you do not compute the moving average every time the read method is called. Instead you precompute the value as of the last update or the last alarm, and "tweak" this value by a couple of floating point operations to account for the passage of time since the last update (i.e., a constant number of operations, except for perhaps processing a list of piled-up alarms).
Hopefully this is clear -- this should be a quite simple algorithm and quite efficient.
Further optimization: one of the remaining problems is if a large number of updates happen within the time window, then there is a long time for which there are neither reads nor updates, and then a read or update comes along. In this case, the above algorithm will be inefficient in incrementally updating the value for each of the updates that is falling off. This is not necessary because we only care about the last update beyond the time window so if there is a way to efficiently drop off all older updates, it would help.
To do this, we can modify the algorithm to do a binary search of updates to find the most recent update before the time window. If there are relatively few updates that need to be "dropped" then one can incrementally update the value for each dropped update. But if there are many updates that need to be dropped then one can recompute the value from scratch after dropping off the old updates.
Appendix on Incremental Computation: I should clarify what I mean by incremental computation above in the sentence "tweak" this value by a couple of floating point operations to account for the passage of time since the last update. Initial non-incremental computation:
start with
sum = 0;
updates_in_window = /* set of all updates within window */;
prior_update' = /* most recent update prior to window with timestamp tweaked to window beginning */;
relevant_updates = /* union of prior_update' and updates_in_window */,
then iterate over relevant_updates in order of increasing time:
for each update EXCEPT last {
sum += update.value * time_to_next_update;
},
and finally
moving_average = (sum + last_update.value * time_since_last_update) / window_length;
Now if exactly one update falls off the window but no new updates arrive, adjust sum as:
sum -= prior_update'.value * time_to_next_update + first_update_in_last_window.value * time_from_first_update_to_new_window_beginning;
(note it is prior_update' which has its timestamp modified to start of last window beginning). And if exactly one update enters the window but no new updates fall off, adjust sum as:
sum += previously_most_recent_update.value * corresponding_time_to_next_update.
As should be obvious, this is a rough sketch but hopefully it shows how you can maintain the average such that it is O(1) operations per update on an amortized basis. But note further optimization in previous paragraph. Also note stability issues alluded to in an older answer, which means that floating point errors may accumulate over a large number of such incremental operations such that there is a divergence from the result of the full computation that is significant to the application.
If an approximation is OK and there's a minimum time between samples, you could try super-sampling. Have an array that represents evenly spaced time intervals that are shorter than the minimum, and at each time period store the latest sample that was received. The shorter the interval, the closer the average will be to the true value. The period should be no greater than half the minimum or there is a chance of missing a sample.
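A minimal sketch of that super-sampling idea (class and method names are mine; driving tick() at the chosen sub-interval is assumed to happen elsewhere, e.g. from a timer):

```cpp
#include <vector>
#include <cstddef>

// A ring of evenly spaced slots; each holds the most recent sample seen
// during that sub-interval. The mean of the slots approximates the
// time-weighted average, and improves as the sub-interval shrinks.
class SuperSampledAverage {
    std::vector<double> slots_;
    std::size_t pos_ = 0;
    double latest_ = 0.0;

public:
    explicit SuperSampledAverage(std::size_t slotCount)
        : slots_(slotCount, 0.0) {}

    // Call whenever a new sample arrives (at any time).
    void sample(double v) { latest_ = v; }

    // Call once per sub-interval tick: record the latest sample seen.
    void tick() {
        slots_[pos_] = latest_;
        pos_ = (pos_ + 1) % slots_.size();
    }

    double average() const {
        double sum = 0.0;
        for (double v : slots_) sum += v;
        return sum / static_cast<double>(slots_.size());
    }
};
```

Because each slot simply keeps the last value seen, a sample that arrives between ticks is carried forward, which is what gives the time weighting.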
#include <map>
#include <iostream>

// Sample - the type of a single sample
// Date - the type of a time notation
// DateDiff - the type of difference of two Dates
template <class Sample, class Date, class DateDiff = Date>
class TWMA {
private:
    typedef std::map<Date, Sample> qType;
    const DateDiff windowSize; // The time width of the sampling window
    qType samples;             // A set of sample/date pairs
    Sample average;            // The answer
public:
    // windowSize - The time width of the sampling window
    TWMA(const DateDiff& windowSize) : windowSize(windowSize), average(0) {}

    // Call this each time you receive a sample
    void Update(const Sample& sample, const Date& now) {
        // First throw away all old data
        Date then(now - windowSize);
        samples.erase(samples.begin(), samples.upper_bound(then));
        // Next add new data
        samples[now] = sample;
        // Compute average: note: this could move to Average(), depending upon
        // precise user requirements.
        Sample sum = Sample();
        for (typename qType::iterator it = samples.begin();
             it != samples.end();
             ++it) {
            DateDiff duration(it->first - then);
            sum += duration * it->second;
            then = it->first;
        }
        average = sum / windowSize;
    }

    // Call this when you need the answer.
    const Sample& Average() { return average; }
};

int main() {
    TWMA<double, int> samples(10);
    samples.Update(1, 1);
    std::cout << samples.Average() << "\n"; // 1
    samples.Update(1, 2);
    std::cout << samples.Average() << "\n"; // 1
    samples.Update(1, 3);
    std::cout << samples.Average() << "\n"; // 1
    samples.Update(10, 20);
    std::cout << samples.Average() << "\n"; // 10
    samples.Update(0, 25);
    std::cout << samples.Average() << "\n"; // 5
    samples.Update(0, 30);
    std::cout << samples.Average() << "\n"; // 0
}
Note: Apparently this is not the way to approach this. Leaving it here for reference on what is wrong with this approach. Check the comments.
UPDATED - based on Oli's comment... not sure about the instability that he is talking about though.
Use a sorted map of "arrival times" against values. Upon arrival of a value, add the arrival time to the sorted map along with its value and update the moving average.
Warning: this is pseudo-code:
SortedMapType< int, double > timeValueMap;

void onArrival(double value)
{
    timeValueMap.insert( (int)time(NULL), value);
}

//for example this runs every 10 seconds and the moving window is 120 seconds long
void recalcRunningAverage()
{
    // you know that the oldest thing in the list is
    // going to be 129.9999 seconds old
    int expireTime = (int)time(NULL) - 120;
    MapIterType i;
    for( i = timeValueMap.begin();
         i != timeValueMap.end() && i->first < expireTime;
         ++i )
    {
    }
    // NOW REMOVE PAIRS TO LEFT OF i
    // Below needs to apply your time-weighting to the remaining values
    runningTotal = calculateRunningTotal(timeValueMap);
    average = runningTotal/timeValueMap.size();
}
There... Not fully fleshed out but you get the idea.
Things to note:
As I said, the above is pseudo-code. You'll need to choose an appropriate map.
Don't remove the pairs as you iterate, as you will invalidate the iterator and will have to start again.
See Oli's comment below also.
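As a concrete illustration of the pseudo-code above: std::map keeps its keys sorted, so the expired entries form a prefix that can be erased in one call rather than removed during iteration. This sketch uses a plain average (a time-weighted variant would weight each value as in the other answers); the function name and signature are mine:

```cpp
#include <map>

// Drop entries older than the window, then average what remains.
// Keys are arrival times in seconds, values are the samples.
double recalcRunningAverage(std::map<int, double>& timeValueMap,
                            int now, int windowSeconds) {
    // Everything strictly older than (now - windowSeconds) falls off.
    timeValueMap.erase(timeValueMap.begin(),
                       timeValueMap.lower_bound(now - windowSeconds));
    if (timeValueMap.empty()) return 0.0;

    double total = 0.0;
    for (const auto& kv : timeValueMap) total += kv.second;
    return total / static_cast<double>(timeValueMap.size());
}
```

Using erase with a lower_bound range avoids the iterator-invalidation trap mentioned above, since no removal happens while you are walking the map yourself.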