Synapse Analytics - Can't see the list of tables in the dedicated SQL pool - azure-sqldw

In the Synapse Analytics portal my account has these permissions at the Workspace scope:
Synapse Administrator
Synapse SQL Administrator
I can execute pipelines, but I can't see the list of objects (tables, views, ...) in the database.
In SSMS I can connect to the SQL pool and browse objects.
What am I missing?

You will experience this because the Azure Synapse SQL pool named sspocbi is paused. Make sure the SQL pool named sspocbi is online to see the tables.
For your understanding, I have two Azure Synapse SQL pools named chepra and SQLPOOL1, where the SQL pool named chepra is online and SQLPOOL1 is paused.
When you check the tables in the databases, the SQL pool which is online shows its tables, while the SQL pool which is paused doesn't show any tables.
Now I resumed the Azure Synapse SQL pool named SQLPOOL1.
A few minutes later, the tables were shown.
If the database is online, you can view the tables. Otherwise you will see the cross mark, which denotes that you cannot view the tables.

Thanks, it is a good answer, but in my case that was not the issue.
My Azure AD account is an admin of the Synapse workspace but was not a user in the pool; I did not create this pool.
By default, it seems that only the user who creates the pool is added automatically.
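For anyone hitting the same thing: a minimal sketch of what someone with db_owner rights could run inside the dedicated SQL pool to add the workspace admin as a database user (the account name is a placeholder):
-- Run while connected to the dedicated SQL pool itself, not master.
-- 'admin@contoso.com' is a placeholder Azure AD account.
CREATE USER [admin@contoso.com] FROM EXTERNAL PROVIDER;
-- db_datareader is enough to list and browse tables and views;
-- use a broader role if the account also needs to create objects.
EXEC sp_addrolemember 'db_datareader', 'admin@contoso.com';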

Related

How can I access the metadata DB of a GCP Composer Airflow server?

I have created a Composer environment in a GCP project. I want to access the metadata DB of Airflow, which runs in the background on Cloud SQL.
How can I access it?
I also want to create a table inside that metadata DB, which I will use to store some data queried by one of my Airflow DAGs. Is it OK to create a table inside that metadata DB, or is it only for the Airflow server's use?
You can access Airflow's internal DB via the UI using Data Profiling -> Ad Hoc Query.
There you can see all the tables with a SQL query like:
SHOW tables;
I wouldn't recommend creating a new table or manually inserting rows into existing tables, though.
You should also be able to access this DB in your DAGs' operators and sensors by using the airflow_db connection.
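For example, from Ad Hoc Query (or via that connection in a DAG) you could run something like the query below; dag_run and its columns are standard Airflow metadata objects, but treat this as a sketch to adapt to your Airflow version:
-- List the most recent DAG runs recorded in the metadata DB
SELECT dag_id, execution_date, state
FROM dag_run
ORDER BY execution_date DESC
LIMIT 10;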

ERROR - The instance of SQL Server you attempted to connect to does not support CTAIP

I have created a new Azure SQL Data Warehouse database on a new logical server from the backup of an Azure SQL Data Warehouse database on a different logical server (using the Azure portal). I created the LOGINs on the new master database for the users that would connect to the new Azure SQL Data Warehouse database. The users were restored to the new Azure SQL Data Warehouse database as expected, per:
SYS.DATABASE_PRINCIPALS
Now when I attempt to connect with those users, I receive an error:
Sqlcmd: Error: Microsoft ODBC Driver 11 for SQL Server : The instance of SQL Server you attempted to connect to does not support CTAIP..
We use SQL Server authentication, running the following on both the original and the new master:
CREATE LOGIN the_userID WITH PASSWORD = 'xxxxxxxxxxxxxxxxx';
GO
and the following pattern on the original ADW database:
CREATE USER [the_userID] FROM LOGIN [the_userID];
GO
Is there any solution other than dropping and re-creating the users in the new ADW database?
The CTAIP error is a rather poorly worded error message indicating that the login (in master) does not have a corresponding user in the DW.
In this case, you need to drop the existing user in the DW and re-create it for the login in master.
It doesn't work automagically (yet) because we track the association using security identifiers (SIDs), not names, and the new login in master has a new/unique SID. AAD logins and contained users (not currently supported in DW) don't have this problem.
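As a rough sketch (reusing the placeholder user name from the question), you can see the SID mismatch and then re-map the user like this:
-- In master on the new logical server: the new login's SID
SELECT name, sid FROM sys.sql_logins WHERE name = 'the_userID';
-- In the restored DW database: the restored user still carries the old SID
SELECT name, sid FROM sys.database_principals WHERE name = 'the_userID';
-- Drop and re-create the user so it binds to the new login's SID
-- (transfer any schemas or objects the user owns first)
DROP USER [the_userID];
CREATE USER [the_userID] FROM LOGIN [the_userID];
GO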

WSO2 API Manager - migration from H2 to MySQL: one script is missing

In wso2am/repository/conf/datasources/*.xml, I can see 5 datasources :
<name>WSO2_CARBON_DB</name>
<name>WSO2AM_DB</name>
<name>WSO2AM_STATS_DB</name>
<name>WSO2_MB_STORE_DB</name>
<name>WSO2_METRICS_DB</name>
But in wso2am/dbscripts, I can find only 4 scripts for 4 databases (no script for WSO2AM_STATS_DB).
Is WSO2AM_STATS_DB supposed to stay an H2 database in production, or should it point to an existing database?
You don't need to create the tables in WSO2AM_STATS_DB manually. Just creating the database is enough; the tables are created automatically by the Analytics scripts.
Table creation of the statistics database is handled by the Analytics scripts when you configure APIM Analytics, so you will create the statistics database in this step but will not specify a source script.
Ref: https://docs.wso2.com/display/CLUSTER44x/Clustering+API+Manager+2.0.0
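So on MySQL, something like the following should be all that is needed for the stats database (no source script is run against it); the user name and password are placeholders:
CREATE DATABASE WSO2AM_STATS_DB;
CREATE USER 'wso2user'@'%' IDENTIFIED BY 'changeme';
GRANT ALL PRIVILEGES ON WSO2AM_STATS_DB.* TO 'wso2user'@'%';
-- Point the WSO2AM_STATS_DB datasource at this database; the Analytics
-- scripts create the tables when APIM Analytics is configured.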

Configuring WSO2 STATS_DB

I have configured API Manager 2.0.0 & API Manager Analytics Pack to use MySQL databases.
For each server, there exists a WSO2AM_STATS_DB. I have given these differing names on my MySQL server. I have also pointed my datasources in master-datasources.xml (for APIM) and stats-datasources.xml (for Analytics) to the relevant databases.
I couldn't find any relevant schema (dbscripts) for these databases in their respective packs.
On running, the Analytics database is populated but the APIM database isn't, and an exception is thrown. The Analytics database gets not only the schema but also the invocation details of my API.
I am unable to get the stats on my dashboard, though.
Previously, I (unwittingly) configured the H2 repository stats database to be the same for both servers (due to the folder structure) and was able to get all the statistics on my dashboard in the Publisher.
Other configurations I have tried:
On the MySQL server, I pointed both to the same database (the Analytics one with the schema), but with no results on my dashboard (after waiting for a while).
Both datasources (WSO2AM_STATS_DB) in the two servers should point to the same database. There are no database scripts for this; the tables are created automatically.
By default, in both servers, the stats DB path looks like this (note the ../ part):
<url>jdbc:h2:../tmpStatDB/WSO2AM_STATS_DB;DB_CLOSE_ON_EXIT=FALSE;LOCK_TIMEOUT=60000;AUTO_SERVER=TRUE</url>
So if you extract both servers into the same directory, as mentioned in this doc, both datasources will point to the same database (inside tmpStatDB), like this:
/parent_dir
|__wso2am-2.0.0/
|__wso2am-analytics-2.0.0/
|__tmpStatDB/
So what happens here is: wso2am-analytics writes stats data to the shared database, then APIM reads it and shows the data on its dashboards.
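If you are on MySQL rather than the default H2 file, the equivalent is one shared statistics database that the WSO2AM_STATS_DB datasource of both servers points to; a rough sketch, with database name and credentials as placeholders:
-- One shared statistics database for both nodes; no source script needed
CREATE DATABASE shared_stats_db;
CREATE USER 'statsuser'@'%' IDENTIFIED BY 'changeme';
GRANT ALL PRIVILEGES ON shared_stats_db.* TO 'statsuser'@'%';
-- master-datasources.xml (APIM) and stats-datasources.xml (Analytics)
-- would then both use a URL like jdbc:mysql://<host>:3306/shared_stats_db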

Sitecore Analytics reports - Is this only for the Analytics database, or can I use the master database to generate reports?

I was trying to create a report from the master database in Analytics reports (Stimulsoft Report Designer).
As explained in the Reports Cookbook, I have created an "mrt" file (the report UI) and a report definition item in Engagement Analytics.
I have configured the datasource item as a query item
(/sitecore/system/Settings/Analytics/Reports SQL Queries/Visit Pages),
and it worked.
But then I tried a query using the master database, and in the SQL query item I specifically mentioned the database as 'testProjectMaster' to point to the master database. It did not work!
Then I figured out that the "/sitecore/system/Settings/Analytics/Reports SQL Queries/Visit Pages" item and the other query items do not specify the database, which means that by default Sitecore queries the Analytics database.
Is this a limitation in Sitecore? Can't we query the master database for reports? Are there any good resources to follow on creating reports?
I suggest taking the SQL from the Visit Pages report and running it in SQL Server Management Studio. There, you will be able to quickly see what's preventing your query from running. If I had to venture a guess, I would suspect that your SQL user account does not have db_datareader access to the master database.
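If it does turn out to be a permissions issue, a hedged sketch of the fix, assuming your Analytics connection string uses a login like 'sitecore_dms_user' (a placeholder):
-- Run against the master content database the report needs to read
USE testProjectMaster;
-- Skip CREATE USER if the login is already mapped to a user in this database
CREATE USER [sitecore_dms_user] FOR LOGIN [sitecore_dms_user];
EXEC sp_addrolemember 'db_datareader', 'sitecore_dms_user';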
The default SQL queries provided by Sitecore assume that the DMS is configured as the default database in the connection string. This, however, does not prevent you from querying other databases or doing cross-database joins like so:
SELECT TOP 100 * FROM Pages
INNER JOIN Sitecore_Master.dbo.Items AS MasterItems ON Pages.ItemId = MasterItems.ID
A word of caution: from my experience, this can really slow down your reports, as it does not take advantage of indexing, and creating indexed views doesn't work across multiple databases.