If one had used
CREATE EXTERNAL FILE FORMAT ...
to create a file format on Azure SQL Data Warehouse, how would one reverse engineer the DDL?
From the latest version of SQL Server Management Studio: Object Explorer > yourDatabase > External File Formats > right-click the external file format you are interested in, then click "Script External File Format as..." > CREATE To > New Query Editor Window.
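If you'd rather reconstruct the DDL with a query, the same metadata is exposed through the sys.external_file_formats catalog view. A minimal sketch, assuming a format named MyFileFormat (a placeholder; the exact set of columns varies by version, so check the view on your instance):

SELECT name, format_type, field_terminator, string_delimiter,
       date_format, use_type_default, data_compression
FROM sys.external_file_formats
WHERE name = 'MyFileFormat';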
When connecting from Power BI to Azure Data Explorer (ADX) I can see all artifacts, yet when I select an external table I get an error.
Is there a way to connect and visualize external tables in Power BI?
I see a similar problem when using Tableau (with the Synapse Analytics connector), where only the internal tables are visible.
Are external tables queryable outside the ADX portal or API?
The Azure Data Explorer Power BI connector doesn't support external tables as part of the UI navigation at the moment.
However, you can connect to external tables by providing them explicitly:
= AzureDataExplorer.Contents(
    "<cluster>",
    "<database>",
    "external_table('<ExternalTableName>')",
    [MaxRows=null, MaxSize=null, NoTruncate=null, AdditionalSetStatements=null])
For example:
= AzureDataExplorer.Contents(
    "help",
    "Samples",
    "external_table('TaxiRides') | take 10",
    [MaxRows=null, MaxSize=null, NoTruncate=null, AdditionalSetStatements=null])
Is it possible to convert a .pbix file into a .bim file while preserving all the data connections and expressions / shared expressions?
I have a Power BI file that I exported as a .pbit file, loaded into Tabular Editor, saved as a .bim file, and then loaded into SSDT in Visual Studio 2015. My compatibility level is 1400.
The problem is that when converting from .pbix to .pbit, I lose the data connections and shared expressions. The data connections are saved as "mashup" connection strings inside the database, which reference back to the instance of Power BI Desktop I had open.
How can I make these data connections remain as Oracle or SQL Server connections?
You can import a .pbix file into Azure Analysis Services. At that point, it becomes a regular Tabular model that you can download as an SSDT project (including the Model.bim file). However, you'll have to pay for the Azure Analysis Services instance during this operation.
Other than that, I guess you could ask the author of Tabular Editor to provide this functionality.
I need to copy some tables from one dashDB database to a separate dashDB database. Normally I would export a CSV file from one and load it into the other using the web console; however, one table in particular has a CLOB column, so we will need to export to an IXF file plus LOB files and then import them. Unfortunately, I can't see any easy way to do this: it looks like CLPPlus can only export to the server that the database is on (which I don't have access to), and I can't see any way to get it to export the LOB files. Does anyone know how best to accomplish this?
If the CLOB values are actually smaller than 32 KB, you can cast them to VARCHAR as part of the SELECT statement that you provide to EXPORT.
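For example, assuming a hypothetical table mytable with an id column and a CLOB column named myclob, the cast could look like this (values longer than 32,000 bytes would be truncated):

EXPORT TO mytable.csv OF DEL
    SELECT id, VARCHAR(myclob, 32000) AS myclob FROM mytable;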
If you really need to export LOB files, you can write them to your user's home directory inside the dashDB instance and then use the /home REST API to download the files, e.g. with curl: https://developer.ibm.com/static/site-id/85/api/dashdb-analytics/
Another option is to export the table with the LOBs to a local machine and then import into another dashDB.
One way to export a dashDB table to a local client is to run the EXPORT command in a DB2 Command Line Processor (CLP) on your client machine. To do so, you need to install the IBM Data Server Runtime Client and then catalog your dashDB databases in the client, like this:
CATALOG TCPIP NODE mydash REMOTE dashdb-txn-small-yp-lon02-99.services.eu-gb.bluemix.net SERVER 50000;
CATALOG DATABASE bludb AS dash1 AT NODE mydash;
CONNECT TO dash1 USER <username> USING <password>;
Now, let's export the table called "mytable" so that the LOB column is written to a separate file:
export to mytable.del of del
lobfile mylobs
modified by lobsinfile
select * from mytable;
This export command produces the files mytable.del and mylobs.001.lob. The file mytable.del contains pointers into the file mylobs.001.lob that specify the offset and length of each value.
If the LOB data is too large to fit into a single file, then additional files mylobs.002.lob, mylobs.003.lob, etc. will be created.
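For illustration, a row of mytable.del that references one LOB value might look like the line below; the pointer is a LOB Location Specifier of the form filename.offset.length/ (the id value and the byte counts here are made up):

1,"mylobs.001.lob.0.1433/"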
Note that the exported data will be sent from dashDB to your local client in uncompressed form, which may take some time depending on the data volume.
If the .DEL and .LOB files reside on a client machine, such as your laptop or a local server, you can use the IMPORT command to ingest these files into a table with a LOB column. In the CLP you would first connect to the dashDB database that you want to load into.
Let's assume the original table has been exported to the files mytable.del and mylobs.001.lob, and that these files are now located on your client machine in the directory /mydata. Then this command will load the data and LOBs into the target table:
IMPORT FROM /mydata/mytable.del OF DEL
LOBS FROM /mydata
MODIFIED BY LOBSINFILE
INSERT INTO mytable2;
This IMPORT command can be run in a DB2 Command Line Processor on your client machine.
How can I export data from an Oracle database to a SQL Server database using Toad Data Point 3.8?
We have a remote Oracle database, and I need to export data from that DB to a SQL Server database. It's the same thing as the SQL Server Import/Export utility, but using Toad Data Point. I have read-only access to the remote Oracle DB, but I have owner access to the local SQL Server DB.
What I've tried:
I was looking at this Toad link, but I don't have a Schema Browser; I only have an "Object Explorer". I right-click on the table, but I don't see the option "Copy Data to Another Schema".
I then tried right-clicking the table and choosing "Export Wizard", but I don't see an option to export to another database. All the options export to a flat file (SQL script, CSV, tab-delimited, etc.). I would choose "SQL Script", but there's so much data that the script would be enormous.
Finally, I tried the "Data Export Wizard", and I don't see an option to export to a SQL Server database. So I'm stuck.
I ended up using the Import Wizard, from the Oracle DB to the SQL Server DB. Very simple.
We have some ETL processes that read CSV files output from SAS programs. I'm in the process of upgrading one of these ETLs and was wondering if I could use SSIS to read directly from the SAS dataset.
Has anybody done this successfully?
See here:
"You can use the SAS Local Data Provider (it can be downloaded separately and also comes as part of a SAS for Windows installation)."
Recently I moved data from a SAS environment:
In the connection manager, choose _Native OLE DB\SAS Local Data Provider 9.3_
Enter the file name (the path `\..\dev` before the table name), then click OK
Drag an _OLE DB Source_ into the Data Flow
Right-click the _OLE DB Source_ and choose _Show Advanced Editor_
In the first tab (_Connection Manager_), choose the SAS connection manager you just created from the drop-down list
In the _Component Properties_ tab, enter the table name in _OpenRowset_, then click OK
If you have a `datetime` type, transform it using the Derived Column Transformation Editor (see the sketch after this list)
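For context, SAS stores datetime values as the number of seconds since 1960-01-01. As a minimal sketch of doing the conversion after loading instead, assuming the raw value landed in a hypothetical numeric staging column named sas_datetime:

-- Convert a SAS datetime (seconds since 1960-01-01) to a SQL Server datetime.
-- Splitting into days and seconds avoids overflowing DATEADD's INT argument.
SELECT DATEADD(SECOND,
               CAST(CAST(sas_datetime AS BIGINT) % 86400 AS INT),
               DATEADD(DAY,
                       CAST(CAST(sas_datetime AS BIGINT) / 86400 AS INT),
                       '1960-01-01'))
FROM staging_table;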
For loading SAS XPT or SAS7BDAT data files without having an instance of SAS to connect to via ODBC, we used the following:
A third-party tool (STATTransfer) to read the XPT file
The STATTransfer ODBC driver
Set up the connection in SSIS as an ODBC data source and load into the database for processing.
There are SAS data source SSIS extensions available (e.g. http://www.cozyroc.com offers a SAS data connection), but they were outside our price range.