I am trying to connect AWS SCT to Teradata to migrate some tables to Redshift. However, while connecting to Teradata, I get the following error:
"The specified account does not have sufficient privileges for working with the following object(s):
Database 'DBC' : [SELECT]"
Here is a snapshot of the error (some connection details removed):
What permissions should I request from the Teradata admin so that I am able to access my required DB?
The user connecting to Teradata should have SELECT access on DBC so that SCT can pull the object metadata to be converted into Redshift DDL.
Per the documentation, you'll need these permissions:
SELECT ON DBC
SELECT ON SYSUDTLIB
SELECT ON SYSLIB
SELECT ON <source_database>
CREATE PROCEDURE ON <source_database>
In the preceding example, replace the <source_database> placeholder with the name of the source database.
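As a sketch, the Teradata admin could grant these with statements like the following (the user name sct_user and database name my_source_db are placeholders, not names from the original question):

```sql
-- Metadata access AWS SCT needs to read object definitions.
GRANT SELECT ON DBC TO sct_user;
GRANT SELECT ON SYSUDTLIB TO sct_user;
GRANT SELECT ON SYSLIB TO sct_user;

-- Access to the database being migrated.
GRANT SELECT ON my_source_db TO sct_user;
GRANT CREATE PROCEDURE ON my_source_db TO sct_user;
```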
I am running Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production in a Docker container.
I created a user with the CREATE SESSION and CREATE TABLE system privileges. The user also has an unlimited quota on the USERS tablespace.
CREATE USER airflow IDENTIFIED BY pass;
GRANT CREATE SESSION TO airflow;
GRANT CREATE TABLE TO airflow;
ALTER USER airflow QUOTA UNLIMITED ON USERS;
With that user I attempted to create a private temporary table with the following query:
CREATE PRIVATE TEMPORARY TABLE ora$ppt_temp1 (
name varchar2(7),
age int,
employed int
) ON COMMIT PRESERVE DEFINITION;
I am accessing the database on Python 3.9.13 using SQLAlchemy 1.3.24.
I get the following error:
sqlalchemy.exc.DatabaseError: (cx_Oracle.DatabaseError) ORA-00903: invalid table name
I also get ORA-00903 when running the query from DBeaver. I have checked private_temp_table_prefix and it is set to the default value of ORA$PTT_. I have read through the Oracle 19c documentation and several Stack Overflow questions and cannot see what I am missing here.
I suspect that there is some privilege I need to add or modify to make this work.
As stated, this was a typo in the table name: the default prefix is ORA$PTT_, but the statement used ora$ppt_temp1 (PPT instead of PTT).
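For reference, the statement with a name matching the default private_temp_table_prefix (ORA$PTT_) would be:

```sql
CREATE PRIVATE TEMPORARY TABLE ora$ptt_temp1 (
  name varchar2(7),
  age int,
  employed int
) ON COMMIT PRESERVE DEFINITION;
```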
We are trying to drop and recreate a table via an Informatica mapping using the Pre SQL option. Informatica throws an insufficient-privilege error even though we have granted privileges to the Informatica user. Is it possible to drop and create a table via Pre SQL, or is there another method to accomplish this?
I have used Simba ODBC driver to connect SQL server to Bigquery as linked server in SQL Server Management Studio.
I am not able to insert into BigQuery; I am only able to select data from it. I have checked 'AllowInProcess' and 'NonTransactedUpdate' too.
select * from openquery([GoogleBigQuery], 'select * from first.table2' )
The above select query is working.
Query:
insert into OPENQUERY([GoogleBigQuery], 'select * from first.table2') values (1,'c')
Error generated:
"The OLE DB provider "MSDASQL" for linked server "GoogleBigQuery"
could not INSERT INTO table "[MSDASQL]" because of column "id". The
user did not have permission to write to the column."
Query:
INSERT INTO [GoogleBigQuery].[midyear-byway-252503].[first].[table2] select * from Learning_SQL.dbo.demo
Error generated:
OLE DB provider "MSDASQL" for linked server "GoogleBigQuery" returned message "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
The OLE DB provider "MSDASQL" for linked server "GoogleBigQuery" could not INSERT INTO table "[GoogleBigQuery].[midyear-byway-252503].[first].[table2]" because of column "id". The user did not have permission to write to the column.
Was wondering if anyone has tried inserting into a dataset in BigQuery using Linked server.
This error is due to a known limitation: SQL Server's Linked Servers feature does not support INSERT, UPDATE, or DELETE calls to the external database being linked to unless the connection supports transactions.
Since BigQuery does not support explicit transactions, SQL Server will not allow INSERT, UPDATE, or DELETE calls to BigQuery.
If you would like to insert data into BigQuery, consider exporting the data to a file and loading that file into BigQuery.
The import file can be in Avro, CSV, JSON (newline-delimited only), ORC, or Parquet format.
For more information, refer to the BigQuery documentation on importing data.
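As a minimal sketch of that workaround, you could write the rows to a newline-delimited JSON file and then load it with the bq CLI (the row data and the first.table2 target are taken from the question; the file name is arbitrary):

```python
import json

# Rows that could not be INSERTed through the linked server.
rows = [{"id": 1, "name": "c"}]

# Write newline-delimited JSON, one of the formats BigQuery can load.
with open("table2.json", "w") as f:
    for row in rows:
        f.write(json.dumps(row) + "\n")

# The file can then be loaded with the bq CLI, e.g.:
#   bq load --source_format=NEWLINE_DELIMITED_JSON first.table2 table2.json
```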
I need to connect Power BI to SAS using an OLE DB connection (can't use ODBC nor the native connection). Here is the string from the build:
provider=sas.IOMProvider.9.45;data source="iom-name://SASApp - Logical Workspace Server";mode="ReadWrite|Share Deny None";sas cell cache size=10000;sas port=0;sas protocol=0;sas server type=1;sas metadata user id=dxru984;sas metadata password=XXXXX;sas metadata location=iom-bridge://lhwappb1.xxx-xx.xxx:8561
I also tried with this one:
Provider=sas.IOMProvider.9.45;Data Source=iom-name://SASApp - Logical Workspace Server;SAS Cell Cache Size=10000;SAS Port=0;SAS Protocol=0;SAS Server Type=1;SAS Metadata User ID=dxru984;SAS Metadata Password=xxxxxxx;SAS Metadata Location=iom-bridge://lhwappb1.xxx-xx.xxx:8561
The first string works perfectly in Excel but not in Power BI, which gives this error message:
OLE DB : Format of the initialization string does not conform to the
OLE DB specification
Any idea?
I managed to connect to SAS Federation Server data using the following connection string:
provider=sas.IOMProvider.9.45;data source=blablabla2.abc.pt;sas port=1234;sas protocol=2;sas metadata user id=******;sas metadata password=**********;sas metadata location=blablabla1.abc.pt:5678
Hope this helps,
Rita Dias
I have created a new Azure SQL Data Warehouse database on a new logical server from the backup of an Azure SQL Data Warehouse database on a different logical server (using the Azure portal). I created the LOGINs on the new MASTER database for the users that would connect to the new Azure SQL Data Warehouse database. The users were restored to the new Azure SQL Data Warehouse database as expected per:
SYS.DATABASE_PRINCIPALS
Now when I attempt to connect with those users, I receive an error:
Sqlcmd: Error: Microsoft ODBC Driver 11 for SQL Server : The instance of SQL Server you attempted to connect to does not support CTAIP..
We use SQL Server authentication, running the following on both the original and new MASTER:
CREATE LOGIN the_userID WITH PASSWORD = 'xxxxxxxxxxxxxxxxx';
GO
and the following pattern on the original ADW database:
CREATE USER [the_userID] FROM LOGIN [the_userID];
GO
Is there any solution other than dropping and re-creating the users in the new ADW database?
The CTAIP error is a rather poorly worded error message indicating that the login (in master) does not have a corresponding user in the DW.
In this case, you need to drop the existing user in the DW and re-create it for the login in master.
It doesn’t work automagically (yet) because we track the association using security identifiers (SIDs), not names, and the new login in master has a new/unique SID. AAD logins and contained users (not currently supported in DW) don’t have this problem.
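A minimal sketch of the fix, run against the new data warehouse database (the_userID is the placeholder name used in the question):

```sql
-- The restored user's SID points at the old server's login,
-- so drop it and re-create it from the login on the new server.
DROP USER [the_userID];
GO
CREATE USER [the_userID] FROM LOGIN [the_userID];
GO
```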