I am currently looking to edit a timestamp in an SPK file using SAS Enterprise Miner. I would like to change the created date and time to the current time in SAS Enterprise Miner.
I think I'm on a severe wishful-thinking trip here; please confirm if I am, and if not, what would be the process to accomplish this?
Every day a team compiles an Excel file and sends it via email attachment. I have a Power Automate Flow that saves the attachment into a SharePoint space.
I can create a Power BI report and manually connect and load those files, but it seems the real "Power" would be not having to manually connect each new file (the filename contains the creation date, e.g. 'Daily DRR 3-16-22.xlsx') every day, ergo:
What steps (using the Power Platform) would I take to have my Power BI report automatically (dynamically) refresh using the last 5 files (days) in the SharePoint drive? Is that possible?
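To make the intent concrete, here is the selection logic I'm after, sketched in Python (the folder path is just a stand-in for the SharePoint library; in Power Query the equivalent would be a SharePoint.Files source sorted descending and trimmed to five rows):

```python
from datetime import datetime
from pathlib import Path

# Hypothetical local stand-in for the SharePoint folder.
folder = Path("Daily DRR")

def file_date(path: Path) -> datetime:
    # Filenames look like 'Daily DRR 3-16-22.xlsx'; parse the trailing date.
    stamp = path.stem.rsplit(" ", 1)[-1]
    return datetime.strptime(stamp, "%m-%d-%y")

# Keep only the five most recent daily files.
latest_five = sorted(folder.glob("Daily DRR *.xlsx"), key=file_date, reverse=True)[:5]
print([p.name for p in latest_five])
```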
Thanks in advance!
In Informatica 9.5.1, how do I update the timestamp? I'm not getting the correct timestamp. The issue occurs during workflow check-in and check-out.
I have an Excel sheet stored in a OneDrive for Business folder, which is updated continuously (approximately every minute). I am trying to show a live count of the number of entries in the table, as below, on a Power BI report:
From here I have tried two options:
1. Created a Power BI Desktop file which shows the total count on a single card, then published it to the Power BI Service as a report.
2. Imported the Excel file via "Get Data -> Files -> OneDrive - Business" on the Power BI Service:
I loaded in the data and then created a report as below:
However, when a change is made to the Excel file on OneDrive, the report data does not update automatically. Instead, the only way it updates is via the "Refresh now" option on the dataset in the Power BI Service:
Then, once that is refreshed, I have to manually refresh the data in the report window as well.
The connection between the report and the Excel file therefore works for manual updates, but for some reason it does not update automatically when I make a change. Are there any solutions available to get this to update automatically?
I saw this cool idea, but I can't seem to get it to work:
https://bigintsolutions.com/2019/03/29/refresh-power-bi-report-every-min-and-show-on-a-tv/
I have also read many threads, with some people having the same issue I am having:
https://community.powerbi.com/t5/Power-Query/Automatic-Refresh-not-working-when-connecting-to-SharePoint/td-p/546308
I know that there should be an update every hour for Power BI-OneDrive connections, but I was hoping there was a way to update live.
Goal: a Power BI Service report that updates automatically (say, every minute) when a change is made, to drive a live data feed on a TV screen.
Any help would be greatly appreciated!
Where are you sourcing your data? Updating each minute to an Excel file seems like something that won’t scale in the long term. This sounds like a better scenario for a real-time dataset that is in hybrid mode. Then you could build a dashboard over the data and it would automatically update as the data updates. I’ve used Power Automate to push data into a real-time dataset as well.
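If you go the push route, here is a minimal sketch of what the Power Automate step (or any script) does under the hood, assuming you've created a streaming dataset in the Power BI Service and copied its push URL (the URL and column names below are placeholders, not your actual schema):

```python
import requests

# Placeholder push URL from the dataset's "API Info" panel in the Power BI
# Service; the key in the query string authenticates the request.
PUSH_URL = "https://api.powerbi.com/beta/<workspace-id>/datasets/<dataset-id>/rows?key=<key>"

# One row per reading; column names must match the streaming dataset schema.
rows = [{"timestamp": "2022-03-16T09:00:00Z", "entry_count": 1234}]

# The push endpoint accepts a JSON array of row objects.
response = requests.post(PUSH_URL, json=rows, timeout=10)
response.raise_for_status()
```

Tiles built on that dataset update within seconds of each push, which is what you want for a TV display.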
Treb Gatte, Power BI MVP
Recently we migrated SharePoint 2013 sites to Office 365 using the Metalogix tool. In one of the sites, a Task list was migrated with all its content, but the date displayed in the Due Date column is one day ahead of the date on the source site.
Example: the due date is 02/03/2017 in the source, but 03/03/2017 in the target.
Have any other teams faced a similar issue?
What might be the fix?
You have migrated your SP 2013 sites to O365. Your SP 2013 farm uses your local timezone, while O365 uses its own timezone. When you migrate from SP 2013 to O365, the DateTime value (in your local timezone) is converted to the O365 timezone. That is why the date appears one day ahead.
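To illustrate the mechanism (the timezones below are examples, not your actual settings): a date-only field is stored as midnight in the site's timezone, which is persisted internally as UTC, so the calendar date can shift as soon as a different regional setting renders it:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# A date-only due date is really stored as local midnight in the site's zone.
due_local = datetime(2017, 3, 2, 0, 0, tzinfo=ZoneInfo("Australia/Sydney"))

# Internally the instant is kept as UTC: the calendar date has already moved.
due_utc = due_local.astimezone(timezone.utc)
print(due_utc)  # 2017-03-01 13:00:00+00:00

# Rendering the same instant under a different regional setting moves the
# visible date again -- this mismatch is the one-day drift described above.
print(due_utc.astimezone(ZoneInfo("Pacific/Auckland")).date())  # 2017-03-02
```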
We're experiencing a similar issue now, migrating from an SP 2007 on-prem farm to an SP 2013 on-prem farm. In one list, the dates are "1/01/1900", but the migration fails, with Metalogix reporting that "1899-12-31T15:00:00.0000000Z" is an invalid date for those exact fields.
Not sure why at the moment as both servers appear to be set in the same time zone with the same locale.
UPDATE: This is happening on our servers because the SP 2007 server is set to GMT time whereas the SP 2013 server is set to UTC. When the user cannot enter a time component, the default time stored in a date/time field is 12am, so during the migration daylight saving is taken into account on one server but not the other. Our workaround: we push those particular records one day ahead; when they are migrated they fall back to 01/01/1900 11:00pm, and since all we care about is the date, they come across correctly.
We have some ETL processes that read CSV files output from SAS programs. I'm in the process of upgrading one of these ETLs and was wondering if I could use SSIS to read directly from a SAS dataset.
Has anybody done this successfully?
See here
"You can use SAS Local Data Provider (can be downloaded separately and comes as part of SAS for Windows installation). "
Recently I moved data from a SAS environment as follows:
1. In the connection manager, choose _Native OLE DB\SAS Local Data Provider 9.3_.
2. Enter the file name (`\..\dev` before the table name) and click OK.
3. Drag an _OLE DB Source_ into the Data Flow.
4. Right-click the _OLE DB Source_ and choose _Show Advanced Editor_.
5. On the first tab (_Connection Managers_), choose the SAS connection manager you just created from the drop-down list.
6. On the _Component Properties_ tab, enter the table name in _OpenRowset_ and click OK.
7. If you have a `datetime` column, transform it using the Derived Column Transformation Editor (see the sketch after this list).
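For reference, a SAS `datetime` is a count of seconds since 1960-01-01, so the Derived Column expression is doing the equivalent of this conversion (the sample value is made up):

```python
from datetime import datetime, timedelta

# SAS datetimes are seconds elapsed since the SAS epoch, 1960-01-01.
SAS_EPOCH = datetime(1960, 1, 1)

def from_sas_datetime(seconds: float) -> datetime:
    return SAS_EPOCH + timedelta(seconds=seconds)

print(from_sas_datetime(1_962_000_000))  # a datetime in 2022, for illustration
```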
For loading SAS XPT or SAS7BDAT data files without having an instance of SAS to connect to via ODBC, we used the following:
1. A third-party tool (STATTransfer) to read the XPT file
2. The STATTransfer ODBC driver
3. Set up the connection in SSIS as an ODBC data source and load into the database for processing (see the sketch after this list).
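Once the DSN exists, anything that speaks ODBC can read the converted data. A minimal sketch (the DSN and table names are placeholders for whatever the STATTransfer driver exposes):

```python
import pyodbc  # pip install pyodbc

# Placeholder DSN configured for the STATTransfer ODBC driver.
conn = pyodbc.connect("DSN=StatTransferSAS", autocommit=True)
cursor = conn.cursor()

# Table name is whatever the driver exposes for the converted XPT file.
for row in cursor.execute("SELECT * FROM demo_dataset"):
    print(row)

conn.close()
```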
There are SAS data source SSIS extensions available (http://www.cozyroc.com offers a SAS Data connection), but they were outside our price range.