Simple Power BI Dataflow will not run in a specific Workspace

I created this simple view in my SQL Server database...
CREATE VIEW dbo.v_cur_date_time_user
AS
SELECT GETDATE() AS cur_date_time
     , USER AS cur_user;
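(For anyone reproducing this: the view can be sanity-checked outside Power BI with sqlcmd; the server and database names below are placeholders, not values from my environment.)
# query the view directly to confirm the database side is fast
sqlcmd -S <server-name> -d <database-name> -Q "SELECT * FROM dbo.v_cur_date_time_user"
If this returns instantly, the database side is not the bottleneck.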
I then created two Power BI Dataflows for this view in separate Workspaces.
The only difference between these two Dataflows is the Workspace in which they are housed.
Same Gateway
Same User
Same View
The Dataflow in Workspace A runs in less than 3 seconds.
The Dataflow in Workspace B never finishes.
There are no other Dataflows running in either Workspace.
The time of day for this test does not matter.
Both Workspaces are part of the same Premium Capacity.
This issue started yesterday, 9/26/2022. We noticed that Dataflows in Workspace B that normally take seconds still had not finished after 12+ hours.
I found this from Microsoft...
Known issue #165 - Long running, failed or stuck dataflow in Premium Gen2
I feel like this is something different, although some symptoms are the same.
Further testing shows that the test Dataflow works in all other workspaces except Workspace B.
How can I determine what the workspace-level issue with Workspace B is?
I appreciate your insights.

Related

How to migrate AAS/SSAS cube to Power BI premium?

We have a few cubes located in on-prem SSAS and in AAS (Azure Analysis Services). The reports connect to the cubes via live connection.
We are planning to migrate the cubes into a Power BI Premium workspace.
I want to ask: how do I migrate a cube from Analysis Services to Power BI Premium? Do I publish the model from a Visual Studio Analysis Services project into the Power BI Premium workspace? Or do I convert the Visual Studio Analysis Services project into a .pbix-based data model?
Hi, the easiest way is to migrate using Tabular Editor.
First, in Power BI, make sure the XMLA endpoint is enabled for read/write in the tenant settings.
Get the Analysis Services URL, click From DB, and paste the AAS URL.
Be mindful of the compatibility level; I recommend putting it into the 1565 range.
After this, deploy into the Premium workspace: get the workspace connection string from the workspace settings, paste it into the deployment dialog, pick the deployment settings, and deploy.
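(Not from the original answer, but as a hedged sketch: the workspace connection string is the XMLA endpoint, and the same deployment can be scripted with the Tabular Editor 2 command line. Workspace and model names are placeholders.)
# XMLA endpoint format for a Premium workspace
powerbi://api.powerbi.com/v1.0/myorg/<WorkspaceName>
# scripted deployment: -D deploys to the given server/database, -O allows overwriting an existing model
TabularEditor.exe Model.bim -D "powerbi://api.powerbi.com/v1.0/myorg/<WorkspaceName>" "<ModelName>" -O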
Deploying the model like #amelia suggested is a great way to migrate, and that answer was extremely well written. For AAS there is now a built-in migration process that backs up and restores the AAS model to Power BI. It then enables redirection so that existing Excel reports (and other client tools) are automatically redirected to Power BI.

Google Cloud Platform Dataflow is not Loading or Down

I am facing a problem where Dataflow does not load even after waiting 30 minutes. How do I complete that lab?
The task is: make a chart in Dataflow by running a query on BigQuery.
I found a second option.
We can also open Dataflow in another tab; restarting it from the new page helps it load correctly. Then create a new chart using the BigQuery option with a query. That completed the task.
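(As a hedged aside, not part of the original answer: the BigQuery step can be verified independently of the lab UI with the bq command-line tool; the project, dataset, and table names are placeholders.)
# run a standard-SQL query against BigQuery from Cloud Shell
bq query --use_legacy_sql=false 'SELECT COUNT(*) FROM `<project>.<dataset>.<table>`'
If the query succeeds here, the problem is with the Dataflow page itself, not the data.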

OLE DB or ODBC error in Microsoft PowerBI while importing data from Azure Synapse

I am trying to import a view (having 700M rows) from Azure Synapse into Microsoft Power BI. It works initially and imports around 70M rows successfully, but the connection drops after that and the following error appears.
Can someone help me here?
I presume this is a dedicated SQL pool in Synapse.
If that's the case, I would suggest checking the following, in this order:
Intermittent connection issues, if the query is taking some time to complete. You can test that with several tools, including the command prompt (see the sqlcmd sketch after this list).
Check service availability in the Azure Synapse resource, under the left menu "Diagnose and solve problems."
Optimise the query
Check your Azure Synapse firewall settings
Another test you can do is to temporarily change the view to return fewer records, to isolate the problem.
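A minimal sketch of both tests, assuming a dedicated SQL pool endpoint (all names and credentials are placeholders):
# test connectivity and latency from the command prompt
sqlcmd -S <workspace-name>.sql.azuresynapse.net -d <pool-name> -U <user> -Q "SELECT TOP 10 * FROM dbo.<view-name>"
# temporarily cap the row count to isolate the problem
sqlcmd -S <workspace-name>.sql.azuresynapse.net -d <pool-name> -U <user> -Q "ALTER VIEW dbo.<view-name> AS SELECT TOP 1000000 * FROM dbo.<source-table>"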

Google CloudSQL instance non-responsive, how to get support?

When it comes to databases, we want to leave managing them to the pros, which is why we went for a managed solution in the form of a Cloud SQL 2nd gen DB instance. Today the instance stopped responding. I clicked restart; it has been restarting for hours and is not responding. I have tried cloning the instance, which is also not responding.
I don't know what else to do. Our DB is crippled and the service that uses it is down. These things happen, fine.
The thing that shocked me is that I am unable to contact anybody to resolve this problem. I understand that I can pay for a support subscription, at $150/month and up. This confuses me, though: the GCloud console UI is not responding. Am I incorrect in assuming I should not have to pay for support for the core product to at least work?
This leads me to my main question, if I want to continue using Google Cloud products in production, do I NEED a support subscription?
The same happened to us yesterday. The Cloud SQL instance did not respond for an hour and a half (from 18:00 to 19:30 GMT+1).
We couldn't do anything. We tried to back up the instance to a bucket, but the command returned an error saying that another operation was in progress.
We are a small startup and we can't pay for a support plan, but when we signed up for the Cloud SQL service we thought this kind of situation wouldn't happen.
Honestly, after this I believe that Cloud SQL is not a good option if you do not also contract a Gold or Platinum support plan. It is frustrating that something fails and you cannot do anything, or even report the error.
Try the gcloud command-line tool in your active shell, instead of the console UI. Try exporting the data from your SQL instance to a Google Cloud Storage bucket using this command:
gcloud sql instances \
export <sql-instance-name> \
gs://<bucket-name>/backup.sql
The SQL instance's service account has read and write access to the Google Cloud Storage bucket by default.
Create a new SQL instance using this command:
gcloud sql instances \
create <new-sql-instance-name>
Now, add the data to the new SQL instance using this command:
gcloud sql instances \
import <new-sql-instance-name> \
gs://<bucket-name>/backup.sql
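(A hedged note: newer gcloud releases moved these operations to dedicated command groups, so if the commands above are rejected, the modern equivalents should be along these lines; the database name is a placeholder.)
# export one database to a bucket (current syntax)
gcloud sql export sql <sql-instance-name> gs://<bucket-name>/backup.sql --database=<database-name>
# import it into the new instance
gcloud sql import sql <new-sql-instance-name> gs://<bucket-name>/backup.sql --database=<database-name>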
You can get free or premium support here. You do not need a subscription to get help; it all depends on your needs and the level of urgency you estimate for eventual future problems.
If you have a recent backup of your database, you may consider re-creating it in another instance from that backup.
You may consider posting your issue in the Google Cloud SQL Product Issue Tracker. This way, it will enjoy much better visibility from developers and Google support, without attracting any extra costs.

How to run SAS using batch if I do not have it locally

Is there a way to run SAS in batch if I don't have sas.exe on my machine?
My computer has SAS EG, but the code is run on our company's servers.
Thanks
If you are asking whether it is possible to run SAS batch on your local machine without having SAS on your local machine, the answer is no.
If you are using EG to connect to a SAS server and you want to execute a batch job on the SAS server, that is possible (just not from within EG). For example, if you have terminal access to the SAS server via PuTTY or similar, you can do a batch submit (see the sketch below).
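A minimal sketch of such a batch submit, assuming a Unix-style SAS server (paths are placeholders):
# run the program in batch; the log and listing are written next to it
sas /home/<user>/myprogram.sas -log /home/<user>/myprogram.log -print /home/<user>/myprogram.lst
# or keep it running after the terminal session disconnects
nohup sas /home/<user>/myprogram.sas &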
Enterprise Guide is quite capable of scheduling jobs, whether or not you have a local SAS installation.
Wendy McHenry covers this well in Four Ways to Schedule SAS Tasks. Way 1 is probably the one you're familiar with ('batch'), but Ways 2 through 4 are all possible in server environments.
Way 2 is what I use, and it is specifically covered in Chris Hemedinger's post Doing More with SAS Enterprise Guide Automation. Since (I believe) EG 4.3, Enterprise Guide has had a "Schedule ..." option in the File menu, as well as a right-click "Schedule ..." option on a process flow. These create VBScript files that can be scheduled using your normal Windows scheduler, and they allow you to schedule a process flow or a project to run unattended, even if it needs to connect to a server.
You need to make sure you can connect to that server using the credentials you'll schedule the job to run under, of course, and that any network connections are created when you're not logged in interactively, but other than that it's quite simple to schedule the job. Then, once you've run it, it will save the project with the updated log and results tabs.
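As a sketch of that last step, the EG-generated VBScript can be registered with the Windows scheduler from a command prompt (the task name, time, and path are placeholders, not from the original post):
# register a daily 06:00 run of the EG-generated script
schtasks /create /tn "EG_NightlyFlow" /tr "cscript.exe \"C:\Jobs\MyProjectSchedule.vbs\"" /sc daily /st 06:00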
If your company uses the full suite of server products, I would definitely recommend seeing if you can get Way 3 to work (using SAS Management Console) - that is likely easier than doing it through EG. That's how SAS would expect you to schedule jobs in that kind of environment (and lets your SAS Administrator have better visibility on when the server will be more/less busy).