Can I run a Python script in Power BI on pre-existing data tables?

I see in the tutorials how to import data using Python. However, is it possible to manipulate an existing table or create a new one using Python? For example, I import data from SharePoint. I can't wrap SQL in Python because the databases are only accessible through the intranet, which Power BI is not part of. Therefore, I need to bring the data in using one of Power BI's connectors, but I'd like to manipulate the tables using Pandas. Is this possible?

You can use Python scripts to manipulate existing data. In the Power Query Editor ("Edit Queries"), go to the Transform tab and select "Run Python Script". A dialog box will open where you can write whatever Python script you need. If you run into any issues, you can refer to the following video:
https://www.youtube.com/watch?v=pF_JZk_ghCM
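For example, inside that dialog the current table is exposed to your script as a Pandas DataFrame named dataset, and any DataFrame left in scope afterwards can be selected as an output table. A minimal sketch (the column names here are hypothetical):

# 'dataset' is the pandas DataFrame Power BI passes into this step.
# Hypothetical columns: Amount, Category.
dataset = dataset[dataset["Amount"] > 0].copy()
dataset["AmountWithTax"] = dataset["Amount"] * 1.2

# Any other DataFrame left in scope becomes a second selectable output table.
summary = dataset.groupby("Category", as_index=False)["Amount"].sum()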
Hope this helps.

Related

Is there a way to build TMSL scripts programmatically, in the same way that XMLA tools (like Tabular Editor) create them from an existing tabular database?

I am wondering how it is possible to create a TMSL script from a tabular database (Power BI service or SSAS) using some programming language. These kinds of scripts are available in several tools, such as SSMS or Tabular Editor:
Example of the menu to create TMSL script in SSMS from an active DB
That produces, as an example, something like this:
Example: A TMSL script header for a role
So the question is whether, for example, a Python library or .NET wrapper could do that automatically. I am able to issue several DMV queries using pyadomd to extract the information piece by piece (with several lookups in other tables to resolve the IDs) and then recompose everything into a script, but an easier way to create it would be less error-prone and would save time.
Thanks in advance
Alexis
There is a .NET library called the Tabular Object Model (TOM), which can be used to read tabular model metadata and generate scripts using its JSON scripter.
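As a rough sketch, TOM can even be driven from Python through pythonnet; the connection string, database name, and assembly availability below are assumptions (TOM ships with SSMS and the AMO/TOM NuGet packages):

# Sketch: call the .NET Tabular Object Model from Python via pythonnet.
# Assumes the Microsoft.AnalysisServices.Tabular assembly is resolvable.
import clr
clr.AddReference("Microsoft.AnalysisServices.Tabular")
from Microsoft.AnalysisServices.Tabular import Server, JsonScripter

server = Server()
server.Connect("DataSource=localhost")          # or a Power BI XMLA endpoint
db = server.Databases.GetByName("MyTabularDb")  # hypothetical database name

# Script the whole database as a TMSL createOrReplace command
tmsl = JsonScripter.ScriptCreateOrReplace(db)
print(tmsl)
server.Disconnect()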

Append CSV Data to Apache Superset Dataset

Using the CSV upload in Apache Superset works as expected. I can use it to add data from a CSV to a database, e.g. Postgres. Now I want to append data from a different CSV to this table/dataset. But how?
The CSVs all have the same format, but there is a new one every day. In the end I want a dashboard that updates every day, taking the new data into account.
Generally, I agree with Ana that if you want to repeatedly upload new CSV data, you're better off operationalizing this into some type of process, pipeline, etc. that runs on a schedule.
But if you need to stick with the CSV upload route through the Superset UI, then you can set the Table Exists field to Append instead of Replace.
You can find a helpful GIF in the Preset docs: https://docs.preset.io/docs/tips-tricks#append-csv-to-a-database
You'll probably be better served by creating a simple process that loads the CSV into a table in the database and then querying that table in Superset.
Superset is a tool to visualize data. It allows uploading a CSV for quick-and-dirty, one-off charts, but if this is going to be a recurrent, structured, periodic load of data, it's better to use a proper integration tool to load it. There are zillions of ETL (Extract-Transform-Load) tools out there (or scripting programs to do it); ask whether your company is already using one, or choose whichever is simplest for you.
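For instance, a minimal sketch of such a scheduled load with pandas and SQLAlchemy (the connection string, file path, and table name are hypothetical):

# Minimal sketch: append today's CSV to the Postgres table Superset reads.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:password@localhost:5432/analytics")

df = pd.read_csv("/data/exports/metrics_2024-01-15.csv")  # today's file
df.to_sql("daily_metrics", engine, if_exists="append", index=False)

Run it from cron or any scheduler, and the Superset dataset will pick up the new rows on the next query.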

Moving views to another project in BigQuery

I would like to be able to move (copy and replace) views from one project to another.
Suppose we have a Dev GCP project, along with Integration and Production GCP projects. I would like to be able to move individual views or specific datasets only (not tables) from Dev to Int, then Int to Prod.
I know I can use the following Google Cloud Shell command for moving tables:
bq cp ProjectNumberDev.DatasetDev.TableDev ProjectNumberInt.DatasetInt.TableInt
However, this command only works with tables and not views. Is there a way to do this with views, or is a table insert / POST API script the only way?
Per the documentation:
Currently, there is no supported method for copying a view from one dataset to another. You must recreate the view in the target dataset.
You can copy the SQL query from the old view:
Issue the bq show command.
The --format flag can be used to control the output. If you are getting information about a view in a project other than your default project, add the project ID to the dataset in the following format: [PROJECT_ID]:[DATASET]. To write the view properties to a file, add > [PATH_TO_FILE] to the command.
bq show --format=prettyjson [PROJECT_ID]:[DATASET].[VIEW] > [PATH_TO_FILE]
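Equivalently, here is a minimal sketch with the google-cloud-bigquery Python client that reads the view's SQL and recreates it in the target project (the project, dataset, and view names are hypothetical):

# Sketch: recreate a BigQuery view in another project by copying its SQL.
from google.cloud import bigquery

client = bigquery.Client()

source = client.get_table("dev-project.dataset_dev.my_view")  # view metadata

target = bigquery.Table("int-project.dataset_int.my_view")
target.view_query = source.view_query                         # reuse the SQL
client.create_table(target, exists_ok=True)                   # no error if present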
In the meantime, if you can script all your views during development, you can use the CREATE VIEW statement in all environments.
See the Data Definition Language documentation for more.
You can apply the same approach to table creation, etc.
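A sketch of that DDL approach, parameterized by project (the names are again hypothetical):

# Sketch: create the same view in every environment with a DDL statement.
from google.cloud import bigquery

client = bigquery.Client()
for project in ("dev-project", "int-project", "prod-project"):
    ddl = f"""
    CREATE OR REPLACE VIEW `{project}.reporting.my_view` AS
    SELECT id, created_at FROM `{project}.raw.events`
    """
    client.query(ddl).result()  # wait for each DDL job to finish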
Moving views from one project to another is now possible, at least in the Cloud Console.
Currently, you can copy a view only by using the Cloud Console.
Here are the instructions from the GCP documentation.

How do you import data from an Excel spreadsheet into Oracle APEX?

I am completely new to databases and was wondering whether there is a way to directly upload the data in an Excel file to Oracle APEX. If not, what would be the best way to upload small datasets (.CSV extension, around 15 MB)?
APEX enables you to directly upload CSV files using the Data Load Wizard. You can find a lot of tutorials; here is just one.
You can also upload Excel files using the following methods:
The EXCEL2COLLECTIONS plugin.
Creating a procedure that translates the Excel data into strings. Tutorial here.
Using the AS_READ_XLSX package. Tutorial here.
Don't be shy about using Google; you will find many more options.
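If converting the spreadsheet to CSV first is acceptable, a minimal pandas sketch (the file and sheet names are hypothetical) produces a file the Data Load Wizard can ingest directly:

# Sketch: convert an Excel sheet to CSV for APEX's Data Load Wizard.
import pandas as pd

df = pd.read_excel("customers.xlsx", sheet_name="Sheet1")  # needs openpyxl
df.to_csv("customers.csv", index=False)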

Want to import data from an old database (MySQL) to a new database (MySQL) in Kettle

Hi, I have a scenario: I have a database dump which I want to import into my new Rails web application's database. I have used an ActiveRecord ETL gem, but now the requirement is to use Kettle ETL for importing the data. I have no idea about Kettle; can someone help me, or link me to tutorials I can follow to get this done?
Thanks in advance :)
Actually, it's very easy. Just use a Table Input step to read from the current MySQL table, do some transformation, and pipe it to a Table Output step. In the Table Output step, point it at the target MySQL connection, fill in the table name, and click the "Generate SQL" button to execute the generated SQL.
For example, you can see my article for a sample Excel-to-MySQL transformation; the operation can easily be adapted into a MySQL-to-MySQL migration. (The site is in Indonesian; you can translate it using Google Translate.)
http://pentaho.phi-integration.com/kettle/export-excel-ke-mysql
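For orientation, here is what that Table Input → Table Output flow amounts to, expressed as a Python sketch (the connection strings and table name are hypothetical):

# Sketch of the Kettle transformation: read rows from the old MySQL
# database, optionally transform them, and write them to the new one.
import pandas as pd
from sqlalchemy import create_engine

old_db = create_engine("mysql+pymysql://user:pw@old-host/legacy")
new_db = create_engine("mysql+pymysql://user:pw@new-host/rails_app")

df = pd.read_sql_table("customers", old_db)                       # Table Input
# ... any transformation steps go here ...
df.to_sql("customers", new_db, if_exists="append", index=False)   # Table Output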