Informatica_date_sequencing - informatica

How do I create an automated workflow which takes dates as input in descending order?
On each run it should take a single date and sync all the records for that particular date.
The next run should take the next date.
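One common pattern for this is a control table (or file) of pending dates that the workflow consumes one date per run, newest first. A minimal Python sketch of that idea (the date values, the `pending_dates` queue, and the `sync_records_for` placeholder are all made up for illustration; in Informatica this would typically be a parameter file or control table driving the session):

```python
from datetime import date

# Hypothetical control list of dates still to be synced.
pending_dates = [date(2021, 3, 3), date(2021, 3, 2), date(2021, 3, 1)]

def sync_records_for(d):
    """Placeholder for the mapping that syncs all records for one date."""
    return f"synced {d.isoformat()}"

def run_once(pending):
    """Take the latest pending date, sync it, and remove it from the queue."""
    if not pending:
        return None
    d = max(pending)          # descending order: newest date first
    result = sync_records_for(d)
    pending.remove(d)
    return result

print(run_once(pending_dates))  # first run handles 2021-03-03
print(run_once(pending_dates))  # next run handles 2021-03-02
```

Each run picks off the most recent remaining date, so rerunning the workflow walks backwards through the dates until the queue is empty.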

Related

Does Power BI support Incremental refresh for expanded table?

I have made a table as follows (https://apexinsights.net/blog/convert-date-range-to-list):
In this scenario, suppose I configure incremental refresh on the Start Date column; will Power BI handle this correctly? I ask because, say the refresh is for the last 2 days or last 2 months, it will fetch the source rows and apply the transform to the partition. My concern is that I will have to put the date parameter filter on Start Date before any non-folding steps so that the query folds (alternatively, Power Query will auto-apply the date filter so that the query can fold).
So when it pulls the data based on Start Date and applies the transforms, I'm not able to think clearly about what kind of partitions it will create: whether they are for the Start Date or for the expanded date. Is query folding supported in this scenario?
This is quite a complicated scenario, where I would probably just avoid adding incremental refresh.
You would have to use the RangeStart/RangeEnd parameters twice in this query. Once that gets folded to the data source to retrieve ranges that overlap with the [RangeStart,RangeEnd) interval and a second time after expanding the ranges to filter out individual rows that fall outside [RangeStart,RangeEnd).
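A rough Python sketch of that double-filter logic (the range data is made up; the parameter names mirror Power BI's RangeStart/RangeEnd convention, and the first filter is the one that would need to fold to the source):

```python
from datetime import date, timedelta

# Stand-ins for Power BI's incremental-refresh parameters.
RangeStart = date(2021, 1, 1)
RangeEnd = date(2021, 2, 1)

# Source rows: (start, end) date ranges, as in the range-to-list table.
ranges = [
    (date(2020, 12, 30), date(2021, 1, 2)),
    (date(2021, 1, 15), date(2021, 1, 17)),
    (date(2021, 3, 1), date(2021, 3, 2)),
]

# First filter (foldable): keep ranges overlapping [RangeStart, RangeEnd).
overlapping = [(s, e) for (s, e) in ranges if s < RangeEnd and e >= RangeStart]

# Expand each surviving range to individual days, then filter a second
# time so only rows inside [RangeStart, RangeEnd) land in the partition.
expanded = [
    s + timedelta(days=i)
    for (s, e) in overlapping
    for i in range((e - s).days + 1)
]
partition_rows = [d for d in expanded if RangeStart <= d < RangeEnd]
```

The second filter matters because a range that merely overlaps the refresh window still expands to some days outside it, and those must not be written into the partition.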

PBI how to map a date field to another dynamic table

I have a main table with tasks and the date completed. I want to put the completion dates into quarter buckets. Instead of building complicated IF statements, I am thinking of having a separate dynamic table. Is it possible to map the date completed in table1 into the respective buckets when it falls within the start and end dates of the quarters? If yes, how?
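The lookup being described is a range join: for each completion date, find the quarter row whose start/end dates contain it. A minimal Python sketch with made-up quarter boundaries:

```python
from datetime import date

# Hypothetical quarter table: (label, start date, end date).
quarters = [
    ("Q1", date(2021, 1, 1), date(2021, 3, 31)),
    ("Q2", date(2021, 4, 1), date(2021, 6, 30)),
    ("Q3", date(2021, 7, 1), date(2021, 9, 30)),
    ("Q4", date(2021, 10, 1), date(2021, 12, 31)),
]

def quarter_bucket(completed):
    """Return the quarter whose [start, end] range contains the date."""
    for label, start, end in quarters:
        if start <= completed <= end:
            return label
    return None

print(quarter_bucket(date(2021, 5, 7)))  # Q2
```

In Power BI the equivalent is usually a calculated column on the task table that filters the quarter table on the same start/end condition, since relationships alone can't express a between-dates match.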

How to set current quarter in Superset?

I want to set the current quarter dynamically, e.g. [2021-01-01 ~ 2021-04-01).
Does Superset support this? If so, how do I configure it?
The Last vs Previous and date range control in general has been a source of confusion for my users.
Last Quarter just shows the last 3 months (presumably because it's a quarter of a year?).
It would be great to have options like Week to date, Month/Period to date, Quarter to date, etc...
Another issue is that each company may define their quarters/periods on different starting dates, depending on their fiscal calendar.
As a stop-gap, I've done the following.
Enriched the underlying dataset with additional columns like period_start_date and fiscal_quarter_start_date.
Created a fiscal_dates table that contains one row for every day over the years I need to query. The columns correlate with date columns in my other tables, like dob, fiscal_week_start_date, period_start_date, and fiscal_quarter_start_date. I created this table in Postgres using generate_series.
Created a new virtual dataset that contains the column period_start_date, showing the last 4 years of period start dates.
Used a value native filter to select from the list of dates.
Made the values sorted descending, with the default value set to "first item in list".
This allows the user to select all records that occur in the same quarter/period, with a default of the current quarter.
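A Python sketch of what that fiscal_dates table computes, assuming (hypothetically) a fiscal year starting on 1 February; the Postgres version would use generate_series plus the same quarter-start arithmetic:

```python
from datetime import date, timedelta

# Hypothetical fiscal calendar: fiscal year starts 1 February.
FISCAL_YEAR_START_MONTH = 2

def fiscal_quarter_start(d):
    """Start date of the fiscal quarter containing d."""
    # Months since the fiscal year began, wrapped into 0..11.
    offset = (d.month - FISCAL_YEAR_START_MONTH) % 12
    # Step back to the first month of the current fiscal quarter.
    quarter_first_month = (d.month - offset % 3 - 1) % 12 + 1
    year = d.year if quarter_first_month <= d.month else d.year - 1
    return date(year, quarter_first_month, 1)

def build_fiscal_dates(start, end):
    """One row per calendar day, tagged with its fiscal quarter start."""
    rows = []
    d = start
    while d <= end:
        rows.append({"dob": d, "fiscal_quarter_start_date": fiscal_quarter_start(d)})
        d += timedelta(days=1)
    return rows
```

With columns like these materialized per day, a native filter on fiscal_quarter_start_date selects every record in the same fiscal quarter without any date arithmetic at query time.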
The tentative apache/superset#17416 pull request should remedy this problem, i.e., for the QTD you would simply specify the START as datetrunc(datetime("now"), quarter) and leave the END undefined.

Power BI DAX Find Earliest match and perform operation

I'm working in Power BI, and I need to do this in DAX to avoid re-reading the 200K PDF files (8-hour refresh time).
I have a table with duplicated ID and Step values, each with a different timestamp. I need to find the earliest timestamp for each ID and subtract it from the timestamps of all rows with a matching ID. I can then use the resulting delta time value to filter the table.
I'm struggling because I need to compare one ID from the table to all of the other IDs looking for matches.
Example Data:
Final Data:
This post got me close, but in its IF statements the comparison is against "Yes" rather than against the ID in the current row: How to check for duplicates with an added condition
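The computation itself is small: a minimum timestamp per ID, then a per-row subtraction. A Python sketch with made-up rows to pin down the logic:

```python
from datetime import datetime

# Hypothetical rows: (ID, Step, timestamp), with duplicated IDs.
rows = [
    ("A", "Scan", datetime(2021, 1, 1, 8, 0)),
    ("A", "OCR",  datetime(2021, 1, 1, 9, 30)),
    ("B", "Scan", datetime(2021, 1, 2, 7, 0)),
    ("A", "QA",   datetime(2021, 1, 1, 12, 0)),
]

# Earliest timestamp for each ID.
earliest = {}
for id_, _, ts in rows:
    if id_ not in earliest or ts < earliest[id_]:
        earliest[id_] = ts

# Delta (hours) = each row's timestamp minus its ID's earliest timestamp.
deltas = [
    (id_, step, (ts - earliest[id_]).total_seconds() / 3600)
    for id_, step, ts in rows
]
```

In DAX, the per-ID minimum is typically a calculated column along the lines of `CALCULATE(MIN([Timestamp]), ALLEXCEPT(Table, Table[ID]))`, with the delta as a second column subtracting it from the row's timestamp.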

How to make a SAS DIS job loop over rows of a parameter table

I have a SAS DIS job which extracts and processes some timestamped data. The nature of the job is such that the data must be processed a bit at a time, month by month. I can use a time filter to ensure any given run stays within the required timeframe, but I then must manually change the table's parameters and rerun the job, month by month, until all the data is processed.
Since the timeframes extend back quite far, I'd like to automate this process as much as possible. Ideally I'd have a table which has the following form:
time_parameter_1    time_parameter_2
2JAN2010            1FEB2010
2FEB2010            1MAR2010
...                 ...
which could be part of an iterative job which continues to execute my processing job with the values of this table as time parameters until the table is exhausted.
From what I understand, the loop transformation in SAS DIS is designed to loop over tables, rather than rows of a table. Is the solution to put each date in a separate table, or is there a direct way to achieve this?
Much gratitude.
EDIT
So, with the help of Sushil's post, I have determined a solution. Firstly, it seems that SAS DIS requires the date parameters to be passed as text and then converted to the desired date format (at least, this is the only way I could get things to work).
The procedure is as follows:
In the grid view of the job to be looped over, right click and select Properties. Navigate to the Parameters tab and select New Group. Name the parameter in the General tab (let's use control_start_date) and in the Prompt Type and Values tab select Prompt type "Text". Press OK and add any other parameters using the same method (let's say control_end_date is another parameter).
Create a controlling job which will loop over the parameterized job. Import or create a table of parameters (dates) to loop over. These should be character representations of dates.
Connect the table of parameters to a Loop transformation, connect the parameterized job to the right end of the Loop transformation, and connect the right end of the parameterized job to a Loop End transformation.
Right click the Loop transformation and select Properties. Select the Parameter Mapping tab and properly map the control table date columns to the parameters of the parameterized job (control_start_date and control_end_date). In the Target Table Columns tab ensure that the parameter columns are mapped to the target table. Select OK.
In the parameterized job, create a User Written Code transformation. Create the columns start_date and end_date (format DATE9.) and populate the output work table using the following code:
DATA CONTROL_DATES;
    /* Convert the text parameters passed in by the Loop transformation into SAS dates */
    start_date = input(trim("&control_start_date"), DATE9.);
    end_date   = input(trim("&control_end_date"), DATE9.);
    format start_date end_date DATE9.;
RUN;
Connect the dates in the work table WORK.CONTROL_DATES to the logic of the job (possibly with a join) so that they serve as filters in the desired capacity. Save the parameterized job.
Now running the controlling job should be able to loop over the job using the specified date filters.
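Conceptually, the controlling job is just this loop: one invocation of the parameterized job per row of the control table, with the dates handed over as text. A Python sketch (the job body and parameter names are stand-ins for the SAS DIS pieces described above):

```python
# Control table rows: (control_start_date, control_end_date) as text,
# matching how SAS DIS passes the parameters before DATE9. conversion.
control_dates = [
    ("02JAN2010", "01FEB2010"),
    ("02FEB2010", "01MAR2010"),
]

def parameterized_job(control_start_date, control_end_date):
    """Stand-in for the parameterized SAS DIS job; it would convert the
    text parameters to dates and filter the data to that window."""
    return f"processed {control_start_date} through {control_end_date}"

# What the Loop / Loop End pair does: run the job once per control row.
results = [parameterized_job(s, e) for s, e in control_dates]
```

The Loop transformation's Parameter Mapping tab is what wires each control-table column to the corresponding job parameter on each iteration.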
A lot of this is described in the following PDF, but I'm not sure how long that link will survive and some of the issues I encountered were not addressed there.
Your understanding of the LOOP transformation is incorrect. You do not need a separate table per iteration to make your parameterized job flow loop. The table that has the time parameters can be the input to the Loop transformation, and the parameterized job can loop based on the control table (the input table of the Loop transformation).
Here is an example usage of the Loop transformation which is different from the one in the SAS DI Studio documentation and is relevant to your problem: PDF
Let me know if it helps!