Here is the SQL query I am running:
string theQuery = "UPDATE readings SET chng = 1, time = FROM_UNIXTIME(";
theQuery += boost::lexical_cast<string>(ss.time);
theQuery += ") WHERE id = 1;";
ss.time is a uint32_t that records the number of seconds since 1 Jan 1970. When I put the value 3586767203 (the time value on my device) inside the parentheses of FROM_UNIXTIME, it updates my time field to NULL. If I enter a smaller number, the time field updates fine.
Why is it updating to NULL if I am entering a valid time?
You've exceeded the range of FROM_UNIXTIME. 3586767203 is 'Sun, 29 Aug 2083 12:13:23 GMT', but the argument to FROM_UNIXTIME can't be larger than 2147483647, i.e. FROM_UNIXTIME(2147483647), which resolves to '2038-01-18 22:14:07' in my session time zone. The time is stored as a signed 32-bit count of seconds after the epoch (Jan 1, 1970), and 2^31 seconds after the epoch is 'Tue, 19 Jan 2038 03:14:08 GMT'; anything beyond that limit comes back as NULL.
See http://en.wikipedia.org/wiki/Year_2038_problem for an explanation of this problem.
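If you want to sanity-check a value before sending it to the database, here is a quick sketch in C++ (matching the query-building code above); the 2147483647 cutoff is the signed 32-bit limit described above:

#include <cstdint>
#include <cstdio>
#include <limits>

int main() {
    // Value from the question; anything above INT32_MAX seconds since the
    // epoch does not fit in a signed 32-bit Unix timestamp.
    const std::uint32_t deviceTime = 3586767203u;
    const std::int64_t limit = std::numeric_limits<std::int32_t>::max();  // 2147483647

    if (static_cast<std::int64_t>(deviceTime) > limit) {
        std::printf("%u is past 2038-01-19 03:14:07 UTC - outside the 32-bit range accepted here\n",
                    static_cast<unsigned>(deviceTime));
    } else {
        std::printf("%u fits in a signed 32-bit timestamp\n",
                    static_cast<unsigned>(deviceTime));
    }
    return 0;
}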
Is there a recommended Power BI DAX pattern for calculating monthly Days Sales Outstanding (a.k.a. DSO or Debtor Days) using the Countback method?
I have been searching for a while, and although many people ask about this, I cannot find a working solution. I think that is perhaps because nobody has set out the problem properly, so I am going to try to explain it as fully as possible.
DSO is a widely-used management accounting measure of the average number of days that it takes a business to collect payment for its credit sales. More background info on the metric here: https://www.investopedia.com/terms/d/dso.asp
There are various options for defining the calculation. I believe my requirement is known as the countback method. My data set is a fairly large star schema with a separate date dimension, but using the below simplified data set to generate a solution would totally point me in the right direction.
Input data set as follows:
Month No  Month  Days in Month  Debt Balance  Gross Income
1         Jan    31             1000          700
2         Feb    28             1100          500
3         Mar    31             900           400
4         Apr    30             950           600
5         May    31             1000          400
6         Jun    30             1100          550
7         Jul    31             900           700
8         Aug    31             950           500
9         Sep    30             1000          400
10        Oct    31             1100          600
11        Nov    30             900           400
12        Dec    31             950           550
The aim is to create a measure for debtor days: the number of days of average daily income per month that we need to count back to match the debt balance.
Starting with Dec as an example, in 3 steps:
1. Debt balance = 950, income = 550. Dec has 31 days, so we take all 31 days of income, reduce the debt balance to 400 (i.e. 950 - 550) and go back to the previous month.
2. Remaining Dec debt balance = 400. Nov income = 700. We don't need all of the daily income from Nov to match the rest of the Dec debt balance: 400/700 x 30 days in Nov = 17.14 days.
3. We have finished counting back days. 31 + 17.14 = 48.14 debtor days.
Nov has a higher balance, so we need one more step:
1. Debt balance = 1500, income = 700. Nov has 30 days, so we take all 30 days of income, reduce the debt balance to 800 (i.e. 1500 - 700) and go back to the previous month.
2. Remaining Nov debt balance = 800. Oct income = 600. Oct has 31 days, so we take all 31 days of income from Oct and reduce the Nov debt balance to 200 (i.e. 1500 - 700 - 600).
3. Remaining Nov debt balance = 200. Sep income = 400. We don't need all of the daily income from Sep to match the rest of the Nov debt balance: 200/400 x 30 days in Sep = 15 days.
4. We have finished counting back days. 30 + 31 + 15 = 76 debtor days.
Apr has a lower balance, so it can be resolved in one step:
1. Debt balance = 400, income = 600. Apr has 30 days. We don't need all of Apr's income, as income exceeds debt in this month: 400/600 x 30 = 20 debtor days.
The required solution for Debtor days in the simplified data set is therefore shown in the right-most "Debtor Days" column as follows:
Month No  Month  Days  Debt Balance  Gross Income  Debtor Days
1         Jan    31    1000          700
2         Feb    28    1100          500           54.57
3         Mar    31    900           400           59.00
4         Apr    30    400           600           20.00
5         May    31    600           400           41.00
6         Jun    30    800           550           49.38
7         Jul    31    900           700           41.91
8         Aug    31    950           500           50.93
9         Sep    30    1000          400           65.43
10        Oct    31    1100          600           67.20
11        Nov    30    1500          700           76.00
12        Dec    31    950           550           48.14
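To make the arithmetic easy to sanity-check, here is a rough sketch of the countback logic in plain C++ (not DAX), hard-coded with the values from the solution table above; the blank Jan cell corresponds to a balance that cannot be matched within the available months:

#include <cstdio>

int main() {
    // Sample data from the table above (month 1 = Jan ... month 12 = Dec).
    const int    days[12]   = {31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31};
    const double debt[12]   = {1000, 1100, 900, 400, 600, 800, 900, 950, 1000, 1100, 1500, 950};
    const double income[12] = {700, 500, 400, 600, 400, 550, 700, 500, 400, 600, 700, 550};

    for (int m = 0; m < 12; ++m) {
        double remaining = debt[m];  // debt balance still to be matched
        double dso = 0.0;            // debtor days accumulated so far
        bool resolved = false;
        for (int k = m; k >= 0; --k) {                   // count back month by month
            if (income[k] >= remaining) {
                dso += remaining / income[k] * days[k];  // pro-rate the final month
                resolved = true;
                break;
            }
            dso += days[k];                              // consume the whole month's income
            remaining -= income[k];
        }
        if (resolved)
            std::printf("Month %2d: %.2f debtor days\n", m + 1, dso);
        else
            std::printf("Month %2d: balance exceeds available income\n", m + 1);
    }
    return 0;
}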
I hope the above explains the required calculation sufficiently. Of course, it needs to be implemented as a measure rather than a calculated column, because in the real world it has to work with more complex scenarios where the user defines the filter context at runtime by filtering and slicing in Power BI.
If anyone can recommend a DAX calculation for Debtor Days, that would be great!
This works on a small example, but it may not perform well on a large model.
There is no easy way to do this; DAX isn't a procedural programming language and we cannot use loops, recursive statements, etc. We have many limitations.
We can only mimic this behaviour with a bulk/brute-force calculation (which is a resource-consuming task). The most interesting part is the variable _zz, where for each row we calculate three versions of the main table limited to 1/2/3 rows (as you can see, some values are hardcoded - I assume the result can be found in at most 3 iterations). You can investigate this if you want by adding a new table from this code:
FILTER(
    GENERATE(
        SELECTCOLUMNS(GENERATE(Sheet1, GENERATESERIES(1, 3, 1)),
            "MYK", [MonthYearKey], "MonthToCheck", [Value], "Debt", [Debt Balance]),
        VAR _tmp = TOPN([MonthToCheck], FILTER(ALL(Sheet1), Sheet1[MonthYearKey] <= [MYK]), Sheet1[MonthYearKey], DESC)
        RETURN ROW("IncomAgg", SUMX(_tmp, Sheet1[Gross Income]))
    ),
    [IncomAgg] >= [Debt])
Next, from that table variable I extract two pieces of information: how many months back we must go, and the backward running income over those months.
Full code (I use MonthYearKey for time navigation):
Mes =
VAR __currRowDebt = SELECTEDVALUE(Sheet1[Debt Balance])
VAR _zz =
    TOPN(1,
        FILTER(
            GENERATE(
                SELECTCOLUMNS(GENERATE(Sheet1, GENERATESERIES(1, 3, 1)),
                    "MYK", [MonthYearKey], "MonthToCheck", [Value], "Debt", [Debt Balance]),
                VAR _tmp = TOPN([MonthToCheck], FILTER(ALL(Sheet1), Sheet1[MonthYearKey] <= [MYK]), Sheet1[MonthYearKey], DESC)
                RETURN ROW("IncomAgg", SUMX(_tmp, Sheet1[Gross Income]))
            ),
            [IncomAgg] >= [Debt]),
        [MonthToCheck], ASC)
VAR __monthinscoop = SUMX(_zz, [MonthToCheck]) - 2
VAR __backwardrunningIncom = SUMX(_zz, [IncomAgg])
VAR _calc =
    CALCULATE(SUM(Sheet1[Days]),
        FILTER(ALL(Sheet1),
            Sheet1[MonthYearKey] <= SELECTEDVALUE(Sheet1[MonthYearKey])
                && Sheet1[MonthYearKey] >= SELECTEDVALUE(Sheet1[MonthYearKey]) - __monthinscoop))
VAR __twik =
    SWITCH(TRUE(),
        __monthinscoop < 0, -1,
        __monthinscoop = 0, 1,
        __monthinscoop = 1, 3,
        0)
VAR __GetRowValue =
    CALCULATE(SUM(Sheet1[Gross Income]),
        FILTER(ALL(Sheet1), Sheet1[MonthYearKey] = (SELECTEDVALUE(Sheet1[MonthYearKey]) + __monthinscoop - __twik)))
VAR __GetRowDays =
    CALCULATE(SUM(Sheet1[Days]),
        FILTER(ALL(Sheet1), Sheet1[MonthYearKey] = (SELECTEDVALUE(Sheet1[MonthYearKey]) + __monthinscoop - __twik)))
RETURN
    _calc + DIVIDE(__GetRowValue - (__backwardrunningIncom - __currRowDebt), __GetRowValue) * __GetRowDays
Let's say I develop a ticket application that runs on Windows. A given ticket has a validity of 3 hours. Now, if I print a ticket on 28 March 2020 (GMT+1, Germany) at 11 PM (23:00:00), it should be valid until 2 AM the next morning. (I manipulate my system time for testing.)
The problem is that on the 29th the DST change happens: at 2 AM the clock is set forward to 3 AM.
Due to DST the ticket ends up valid only until 1 AM (so technically only 2 hours), even though the actual time leap happens later that day.
Here is what I do:
for the current time I use struct tm myTime;
myTime.tm_mday; /* = 28 */
myTime.tm_hour; /* = 23 */
struct tm newTime;
newTime.tm_mday = myTime.tm_mday; /* also done for the remaining fields */
newTime.tm_hour = myTime.tm_hour + 3; /* = 26 */
No problem so far: on any other day the 26 hours would be normalized to 2 AM on the following day.
But if I call time_t result = _mktime64( &newTime ); in this specific case, the resulting timestamp (e.g. 1585440205) has mday = 29 and hour = 1 (when converted back).
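For reference, here is a minimal, self-contained version of what I describe, with the example values hard-coded (this assumes MSVC's _mktime64; the tm_isdst = -1 line is my own assumption for letting the CRT decide whether DST applies):

#include <ctime>
#include <cstdio>

int main() {
    struct tm newTime = {};            // all fields zeroed
    newTime.tm_year  = 2020 - 1900;    // years since 1900
    newTime.tm_mon   = 2;              // March (0-based)
    newTime.tm_mday  = 28;
    newTime.tm_hour  = 23 + 3;         // 23:00 plus 3 hours of validity -> "26 o'clock"
    newTime.tm_isdst = -1;             // let the CRT work out whether DST applies

    time_t result = _mktime64(&newTime);  // normalizes the fields in place
    std::printf("normalized: mday=%d hour=%d (time_t=%lld)\n",
                newTime.tm_mday, newTime.tm_hour,
                static_cast<long long>(result));
    return 0;
}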
Is there another option that calculates the time hour-precisely, so that my ticket validity doesn't lose an hour? (I assume _mktime64 recognizes the DST change and adjusts all times for that day, no matter whether they fall before or after the actual time change at 2 AM.)
The portal epochconverter.com converts timestamp 1531423084013 to the correct date of Thursday, July 12, 2018 3:18:04.013 PM GMT-04:00 DST. But in Python 2.7.12 I get the result below, which is wrong:
>>> timestamp=1531423084013
>>> time.ctime(timestamp).rsplit(' ', 1)[0]
'Wed Nov 12 00:06:53'
How do I make it correct?
1531423084013 is in milliseconds, not seconds.
As you can see on epochconverter.com, the time is 3:18:04.013, so the fractional seconds are .013; that site handles input in both seconds and milliseconds (it appears to assume milliseconds when the input has 13 digits instead of 10 for present-day timestamps).
But Python's time.ctime() only handles seconds, which is why you get a wrong answer when you pass a value in milliseconds (on my system it raises an out-of-range error).
So you must divide your millisecond value by 1000:
time.ctime(1531423084)
'Thu Jul 12 21:18:04 2018'
(My time zone is UTC+0200)
I am attempting to round times to the nearest 15 minute interval in Stata, so for instance Dec 31, 2017 23:58 would become Jan 01, 2018 00:00. I have time stored (based on my understanding of the documentation) as the number of milliseconds since the start of 1960. So I thought this would do it:
gen round = round(datetime, 60000*15)
However, this doesn't quite work. For instance, Nov 03, 2017 19:45:27 becomes Nov 03, 2017 19:46:01, when I think it should become 19:45:00. Does anyone know what I'm missing here?
Let's show a worked example illustrating my comment that you need to store datetime values as double rather than float. A float carries only about 7 significant digits, while millisecond datetimes are on the order of 1.8e12, so any float variable holding such a value is only accurate to within a minute or so.
. clear
. set obs 1
number of observations (_N) was 0, now 1
. gen double datetime = clock("Nov 03, 2017 19:45:27","MDYhms")
. gen round_f = round(datetime, 60000*15)
. gen double round_d = round(datetime, 60000*15)
. format datetime round_f round_d %tc
. list, clean noobs
datetime round_f round_d
03nov2017 19:45:27 03nov2017 19:46:01 03nov2017 19:45:00
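For intuition about why float fails here, a small sketch in C++ rather than Stata: a 4-byte float has a 24-bit mantissa, so near 1.8e12 milliseconds adjacent representable values are about 131 seconds apart. The constant below is 03nov2017 19:45:00 expressed as milliseconds since 01jan1960 (assuming I have converted it correctly):

#include <cstdio>

int main() {
    const double rounded_ms = 1825357500000.0;                // 03nov2017 19:45:00 in %tc units
    const float  as_float   = static_cast<float>(rounded_ms); // what a float variable can store
    std::printf("double : %.0f\n", rounded_ms);
    std::printf("float  : %.0f\n", static_cast<double>(as_float));
    std::printf("error  : %.0f ms\n", static_cast<double>(as_float) - rounded_ms);
    return 0;
}

An error of roughly a minute is the same kind of drift visible in the round_f column above.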
I have a scheduled task that needs to run three times a day, on each weekday. The task is set up in ColdFusion, and the schedule is in the crontime format. It should run at 11:30, 15:45 and 18:30 server time.
For some reason the task is occasionally running on weekends, which it should not do.
Here are the three strings for each of the days:
0 30 11 ? * 1-5
0 45 15 ? * 1-5
0 30 18 ? * 1-5
Can anyone point out to me why the task is sometimes running on weekends? Is there a mistake in my string?
The Coldfusion crontime documentation can be found here:
According to the documentation, 1 = Sunday:
Days-of-Week can be specified as values between 1 and 7 (1 = Sunday) or by using the strings SUN, MON, TUE, WED, THU, FRI and SAT.
So 1-5 means Sunday through Thursday, which is why the task also fires on Sundays. Try replacing 1-5 with MON-FRI (equivalently, 2-6)?
An example of a complete cron-expression is the string "0 0 12 ? * WED" - which means "every Wednesday at 12:00:00 pm".
Individual sub-expressions can contain ranges and/or lists. For example, the day of week field in the previous (which reads "WED") example could be replaced with "MON-FRI", "MON,WED,FRI", or even "MON-WED,SAT".
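If that reading of the documentation is right, the three schedules from the question would become:
0 30 11 ? * MON-FRI
0 45 15 ? * MON-FRI
0 30 18 ? * MON-FRI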