Can we publish Power BI files into a private cloud (other than Azure)? - powerbi

We have an in-house project which we are developing in our own cloud environment.
We want to use Power BI as the visualization tool. Can you please suggest whether we can publish Power BI files into our cloud environment, other than Azure?

This functionality is coming to some degree in SQL Server Reporting Services vNext.

Related

How to migrate AAS/SSAS cube to Power BI premium?

We have a few cubes located in on-prem SSAS and in AAS (Azure Analysis Services). The reports connect to the cubes via live connection.
We are planning to migrate the cubes into a Power BI Premium workspace.
I want to ask: how do I migrate the cubes from Analysis Services to Power BI Premium? Do I publish the model from the Visual Studio Analysis Services project into the Power BI Premium workspace? Or do I convert the Visual Studio Analysis Services project into a .pbix-based data model?
Hi, the easiest way is to migrate using Tabular Editor.
First, in Power BI, make sure the XMLA endpoint is enabled for read/write in the tenant settings.
Get the Analysis Services server URL, click "From DB" in Tabular Editor, and paste the AAS URL.
Be mindful of the compatibility level; I recommend setting it in the 1565 range.
After this, deploy into the Premium workspace:
Get the workspace connection string from the workspace settings.
Paste it into the deployment dialog.
Pick your deployment settings.
And deploy.
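If you would rather script this than click through the dialogs, Tabular Editor 2 also ships a command-line executable. Below is a rough Python sketch that builds such an invocation; the server URL, workspace name, model name, and the exact CLI switches are assumptions on my part, so check the Tabular Editor documentation for your version before running anything like this.

```python
import subprocess

# Assumed example values -- replace with your own server, workspace, and model names.
AAS_SOURCE = "asazure://westus.asazure.windows.net/myaasserver"
PBI_TARGET = "powerbi://api.powerbi.com/v1.0/myorg/MyPremiumWorkspace"
MODEL_NAME = "SalesModel"

def build_deploy_command(source, source_db, target, target_db):
    """Build a Tabular Editor 2 CLI invocation that reads a model from
    Analysis Services and deploys it to a Premium workspace XMLA endpoint.
    Assumed switches: -D = deploy target, -O = overwrite existing database,
    -C = deploy connections, -P = deploy partitions."""
    return [
        "TabularEditor.exe",
        source, source_db,        # load the model from the AAS database
        "-D", target, target_db,  # deploy to the Premium workspace
        "-O", "-C", "-P",
    ]

cmd = build_deploy_command(AAS_SOURCE, MODEL_NAME, PBI_TARGET, MODEL_NAME)
# subprocess.run(cmd, check=True)  # uncomment on a machine with Tabular Editor 2 installed
```

The same deployment can of course be done interactively through the Tabular Editor UI as described above; the CLI form is mainly useful if you want to repeat the migration for many cubes.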
Deploying the model like #amelia suggested is a great way to migrate, and the answer was extremely well written. For AAS there is also a new built-in migration process which backs up and restores the AAS model to Power BI. It then enables redirection, so that existing Excel reports (or other client tools) are automatically redirected to Power BI.

Connect Power BI to an external web API without using the desktop tool

Is it possible to connect Power BI web to an external web API without using Power BI Desktop, and create a report entirely online, without installing the desktop tool?
Currently you can only build a Dataset in the Power BI Service from flat files. Everything else requires using Power BI Desktop to build your Dataset.
Once you have a Dataset you can build reports in either Power BI Desktop or on the web.

How to show snapshot API response data in data-studio?

I need to design and display a Compute Engine snapshot report for different projects in the cloud in Data Studio. For this, I am trying to use the Google Compute Engine snapshots API below to retrieve the data.
https://compute.googleapis.com/compute/v1/projects/my-project/global/snapshots
The data may change every day depending on the snapshots created from the disks, so the report should display all the updated data.
Can this rest-api be called directly from Google data-studio?
Alternatively, what is the best/simplest way to display the response in data-studio?
You can use a Community Connector in Data Studio to directly pull the data from the API.
Currently, there is no way to connect GCP Compute Engine (GCE) resource data or use the REST API in Data Studio. The only GCP products available as data connections are the following:
BigQuery
Cloud Spanner
Cloud SQL for MySQL
Google Cloud Storage
MySQL
PostgreSQL
A possible way to design and display a Compute Engine snapshot report for different projects in Data Studio is to create a Google Apps Script (to call the snapshot REST API) that writes into a Google Sheet, and then connect that Sheet as a data source in Data Studio.
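If you prefer to run the collection step outside Apps Script, the same idea can be sketched in Python: call the snapshots endpoint with an OAuth access token and flatten the response into a CSV that can be pushed to a Google Sheet (and from there into Data Studio). The project ID, token handling, and chosen fields below are illustrative assumptions, not a definitive implementation.

```python
import csv
import json
import urllib.request

PROJECT = "my-project"  # assumed project ID
TOKEN = "replace-me"    # e.g. the output of `gcloud auth print-access-token`
URL = f"https://compute.googleapis.com/compute/v1/projects/{PROJECT}/global/snapshots"

def fetch_snapshots(url, token):
    """Call the Compute Engine snapshots endpoint and return the `items` list."""
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("items", [])

def write_report(snapshots, path="snapshots.csv"):
    """Flatten a few useful snapshot fields into a CSV that can be imported
    into a Google Sheet and used as a Data Studio data source."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["name", "diskSizeGb", "status", "creationTimestamp"])
        for s in snapshots:
            writer.writerow([s.get("name"), s.get("diskSizeGb"),
                             s.get("status"), s.get("creationTimestamp")])
```

Scheduling this daily (e.g. with cron or Cloud Scheduler) keeps the Sheet, and therefore the report, in sync with the snapshots created from the disks.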
Additionally, if you have any questions regarding Data Studio, I would suggest reviewing the following documents:
Data Studio Help Center
Data Studio Help Community
EDIT: My apologies, it seems that there is a way to show snapshot API response data in Data Studio, by using a Community Connector to directly pull the data from the API.

Similar product in AWS or GCP like Azure Data Factory?

I am totally new to the cloud in every way. I started some weeks ago with the Azure cloud, and we are setting up a project using many different Azure products.
At the moment we are thinking about setting up the project in a way that does not leave us locked in to Microsoft, so that we are able to switch to GCP or AWS. For most products we use I have found similar ones in the other clouds, but I wonder if there is something like Azure Data Factory in AWS or GCP? I could not find anything in my first Google research.
Thanks in advance for your help.
If you need a good comparison between different cloud (Azure, AWS, Google, Oracle, and Alibaba) use this site: http://comparecloud.in/
The site includes a comparison entry for your case with "Azure Data Factory".
You could use a mix of those products:
[Cloud Data Fusion](https://cloud.google.com/composer)
Cloud DataPrep: This is a version of Trifacta. Good for data cleaning.
If you need to orchestrate workflows/ETLs, Cloud Composer will do it for you. It is a managed Apache Airflow, which means it will handle complex dependencies.
If you just need to trigger a job on a daily basis, Cloud Scheduler is your friend.
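To make the Composer option concrete, here is a minimal sketch of what a daily Airflow DAG looks like; the DAG id, task names, and bash commands are placeholders, and the import paths assume Airflow 2.x as bundled with recent Composer versions.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# A minimal daily workflow: two placeholder steps with an explicit dependency.
with DAG(
    dag_id="daily_etl",                 # placeholder DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",         # equivalent of a daily Cloud Scheduler trigger
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extract")
    load = BashOperator(task_id="load", bash_command="echo load")
    extract >> load  # `load` only runs after `extract` succeeds
```

This is where Composer earns its keep over Cloud Scheduler: the `>>` dependency operator scales to arbitrarily complex task graphs, whereas Scheduler only fires independent jobs on a cron schedule.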
You can also check a cloud services mapping chart that compares equivalent services across providers.

Can we implement data lineage on queries run via Google BigQuery?

Could anyone help me with some pointers on how to implement data lineage on a DW-type solution built on Google BigQuery, using Google Cloud Storage as the source and Google Cloud Composer as the workflow manager to run a series of SQL queries?
If you have your data in Cloud Storage, you could use something like GoogleCloudStorageToBigQueryOperator to first load your data into BigQuery, then use BigQueryOperator to run your queries.
You can then see how your different DAGs, tasks, etc. are running in the Airflow web UI inside Composer.
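The two-operator pattern above can be sketched as a DAG. Note that operator names and import paths vary by Airflow version: GoogleCloudStorageToBigQueryOperator and BigQueryOperator are the older contrib names, and the sketch below uses their Airflow 2.x provider-package successors. Bucket, dataset, and table names are placeholder assumptions.

```python
from datetime import datetime

from airflow import DAG
# Airflow 2.x paths from the apache-airflow-providers-google package;
# older Airflow 1.10 installs use the contrib operator names instead.
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="gcs_to_bq",                 # placeholder names throughout
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load = GCSToBigQueryOperator(
        task_id="load_from_gcs",
        bucket="my-bucket",
        source_objects=["raw/sales_*.csv"],
        destination_project_dataset_table="my-project.staging.sales",
        write_disposition="WRITE_TRUNCATE",
    )
    transform = BigQueryInsertJobOperator(
        task_id="run_sql",
        configuration={"query": {
            "query": "SELECT * FROM `my-project.staging.sales`",  # your transformation SQL
            "useLegacySql": False,
        }},
    )
    load >> transform  # this task ordering is the coarse lineage shown in the Airflow UI
```

The DAG graph itself then acts as a coarse lineage record: each table-producing task and its upstream dependencies are visible in the Airflow web UI inside Composer.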