AWS Schema conversion tool - Connection error - amazon-web-services

I am trying to migrate my Oracle database to an AWS Redshift cluster, and I am following the steps documented on the AWS website.
When I try to create a project in the AWS Schema Conversion Tool and configure it for Oracle (running on my laptop), I am unable to get a successful connection. Here is the error:
Connection to 'jdbc:oracle:thin:@localhost:1521:orcl' wasn't
established. ERROR: code: 28009; message: ORA-28009: connection as SYS
should be as SYSDBA or SYSOPER
Please note that on the Schema Conversion Tool project page, I don't see a "Connect as SYSDBA" option. I tried some other sample users such as SCOTT, and I get "Insufficient privileges".
With the same configuration I am able to connect from a SQL client, SQLWorkbench.
I'd appreciate any help. Thanks

To connect with the Schema Conversion Tool you need a user with the CONNECT, SELECT_CATALOG_ROLE and SELECT ANY DICTIONARY privileges. Make sure these are granted correctly: either create a new user with these privileges or modify the rights of an existing user. SYS/SYSDBA will not help here.
As you are on localhost, with these permissions in place you should be able to connect.

You can also connect using SYS; in the user name field, provide the user name like this:
sys as sysdba
As per the AWS documentation, if Oracle is the source DB then the user should be created like this:
CREATE USER oracle_sct_user IDENTIFIED BY password;
GRANT CONNECT TO oracle_sct_user;
GRANT SELECT_CATALOG_ROLE TO oracle_sct_user;
GRANT SELECT ANY DICTIONARY TO oracle_sct_user;
AWS DMS also requires the Oracle user to have certain privileges; details can be found here:
https://docs.aws.amazon.com/dms/latest/sbs/CHAP_Oracle2PostgreSQL.Steps.ConfigureOracle.html
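If you want to sanity-check the new account outside of SCT, a quick connection test from Python with cx_Oracle is one option (just a sketch, assuming cx_Oracle is installed; the SID and port come from the question, and the password is a placeholder):
import cx_Oracle
# Connect as the dedicated SCT user rather than SYS; connecting as SYS would
# additionally require mode=cx_Oracle.SYSDBA.
dsn = cx_Oracle.makedsn("localhost", 1521, sid="orcl")
conn = cx_Oracle.connect(user="oracle_sct_user", password="password", dsn=dsn)
cur = conn.cursor()
# SELECT_CATALOG_ROLE / SELECT ANY DICTIONARY should make this dictionary query work.
cur.execute("SELECT COUNT(*) FROM dba_tables")
print(cur.fetchone())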

Related

Connecting App Engine to Cloud SQL Access Denied

I'm trying to get my Flask App Engine project to read data stored in a Cloud SQL MySQL 5.7 database. Something has gone wrong, as all I've gotten is pymysql.err.OperationalError. I've been following the instructions here: https://cloud.google.com/sql/docs/mysql/connect-app-engine.
The Cloud SQL Admin API is enabled.
According to the linked document:
App Engine uses a service account to authorize your connections to Cloud SQL. This service account must have the correct IAM permissions to successfully connect. Unless otherwise configured, the default service account is in the format service-PROJECT_NUMBER@gae-api-prod.google.com.iam.gserviceaccount.com.
The IAM page listing the permissions for my project doesn't contain a member in the above format. The "App Engine default service account" is of the format: my-project-name@appspot.gserviceaccount.com. This service account has Cloud SQL Client and Editor roles.
My queries are unsuccessful, and after each attempt I see the following in the Logs Viewer:
7183 [Note] Access denied for user 'www-data'@'cloudsqlproxy~xxx.xxx.xx.xx' (using password: YES)
(IP address redacted). This is somewhat confusing as 'www-data' isn't a user I specified in my code.
The code used to connect:
app.config['SQLALCHEMY_DATABASE_URI'] = 'mysql+pymysql://' + ':' + db_user + '@/' + db_name + '?unix_socket=/cloudsql/' + connection_name
Where have I gone wrong and how might I fix it?
This is a MySQL error raised when trying to connect to a database with the wrong credentials.
Please verify that the values you are using are the right ones.
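For reference, here is a minimal sketch of what that connection string usually looks like when both the username and the password are supplied (db_user, db_pass, db_name and connection_name are placeholders for your own values):
# Placeholders: the Cloud SQL user, its password, the database name and the
# instance connection name (project:region:instance).
db_user = "my-db-user"
db_pass = "my-db-password"
db_name = "my-database"
connection_name = "my-project:my-region:my-instance"
# 'app' is the Flask application object from the question.
app.config['SQLALCHEMY_DATABASE_URI'] = (
    'mysql+pymysql://' + db_user + ':' + db_pass + '@/' + db_name
    + '?unix_socket=/cloudsql/' + connection_name
)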
If you don't remember the database username and password, you can change the password in the console by following the next steps, which are also explained here:
Go to the Cloud SQL console
Select your database
Go to Users
Next to the user, click on the three dots
Select Change password
Type the new password and click OK
Most likely, you're building and testing your app locally, where you're supplying credentials for a user that has access to Cloud SQL. Upon deployment, unless you specify otherwise, a default service account will be assigned to the App Engine instance.
To fix this:
Head over to IAM & admin
Search for the app engine account --> ending with gae-api-prod.google.com.iam.gserviceaccount.com
Edit
Assign permission Cloud SQL Client
Let me know if this solves it for you!

"'POWERBI_ROLE' specified in the connect string is not granted to this user....."

I'm following the tutorial from here: https://community.snowflake.com/s/article/Amplifying-Outcomes-with-Snowflake
In PowerBI Desktop, I'm trying to "Get Data" and receive the following error: Details: "ODBC: ERROR [28000] Role 'POWERBI_ROLE' specified in the connect string is not granted to this user. Contact your local system administrator, or attempt to login with another role, e.g. PUBLIC.
ERROR [28000] Role 'POWERBI_ROLE' specified in the connect string is not granted to this user. Contact your local system administrator, or attempt to login with another role, e.g. PUBLIC."
In Snowflake, I've added the role to the user by using the query:
ALTER USER POWERBI_USER_ACCOUNT SET DEFAULT_ROLE=POWERBI_ROLE;
I've done this multiple times in Snowflake, and did not receive an error.
I've tried editing the ODBC connection in the "ODBC Data Source Administrator (64-bit)" Windows OS preference pane, and used the role PUBLIC as suggested, but I still receive the same error.
Any suggestions?
Also, does Snowflake provide technical support, or are users left to post in public forums? A bit confused.
You need to run the following to grant the role to the user:
GRANT ROLE POWERBI_ROLE TO USER POWERBI_USER_ACCOUNT;
Looks like the article is missing this step. Note that ALTER USER ... SET DEFAULT_ROLE only sets which role is used by default; it does not grant the role itself.
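If you want to confirm the grant took effect before retrying from Power BI, one quick check is to connect with that role using the Snowflake Python connector (a sketch, assuming snowflake-connector-python is installed; the account identifier and password are placeholders):
import snowflake.connector
# Placeholders: use your own account identifier and password.
conn = snowflake.connector.connect(
    account="my_account",
    user="POWERBI_USER_ACCOUNT",
    password="my_password",
    role="POWERBI_ROLE",  # same role string Power BI passes in the connection
)
cur = conn.cursor()
cur.execute("SELECT CURRENT_ROLE()")
print(cur.fetchone())  # should show POWERBI_ROLE once the grant is in place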
Snowflake does provide tech support; you need to work with your account rep to set it up.

BigQuery API Error: "The project ___ has not enabled BigQuery."

I'm trying to use bigrquery from an interactive R session for the first time. I've installed the R bigrquery package and successfully established a connection. I've authenticated following a redirect from the interactive RStudio session to the web browser for login, so oauth seems to be working correctly. However, when I try to run a query, such as DBI::dbReadTable(con, "A549_raw_merged_TADs")[1:6, ], I get the error Error: The project variant-annotation has not enabled BigQuery. [invalid].
I am sure I need to enable something else in the project on the Google side, but I'm not sure what I need to enable, or how to do that from the web console interface.
Is this documented anywhere, or can anyone here give me instructions?
Note: I have confirmed that the BigQuery API has Activation status of Enabled via the console at https://console.cloud.google.com/apis/api/bigquery-json.googleapis.com/overview, so although my question is the same as Error processing job: Project has not enabled BigQuery, the issue appears to be different.
Additional information: when I check the BigQuery API status via the console link above, there's a notification that "To use this API, you may need credentials. Click 'Create credentials' to get started."
So it may be that my assumption that oauth seems to be working is not correct.
You can enable the BigQuery API with this command: gcloud services enable bigquery.googleapis.com. But since you mention it is already enabled, you may be pointing to a different project.
Feel free to ping me on a new thread if you require assistance on the new issue.
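If you want to double-check which project your credentials actually resolve to, one option is a quick cross-check with the Python BigQuery client (just a diagnostic sketch, assuming google-cloud-bigquery is installed and the same credentials are used; the question itself uses bigrquery from R):
from google.cloud import bigquery
# The project name below is the one from the error message; adjust as needed.
client = bigquery.Client(project="variant-annotation")
# Listing datasets will fail with the same "has not enabled BigQuery" error
# if the API is disabled for the project these credentials are billed against.
for dataset in client.list_datasets():
    print(dataset.dataset_id)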

PermissionDenied: 403 IAM permission 'dialogflow.intents.list'

I'm trying to get the list of the intents in my Dialogflow agent using Dialogflow's V2 APIs but have been getting the following error:
PermissionDenied: 403 IAM permission 'dialogflow.intents.list' on 'projects/xxxx/agent' denied.
I adopted the following steps:
I created a new agent (with V2 APIs enabled) and a new service account for it.
I downloaded the JSON key and set my GOOGLE_APPLICATION_CREDENTIALS variable to its path.
Following is my code:
import os
import dialogflow_v2 as dialogflow
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/home/user/folder/service-account-key.json"
client = dialogflow.IntentsClient()
parent = client.project_agent_path('[PROJECT_ID]')
for element in client.list_intents(parent):
    pass
I have made various agents and service accounts and even changed the role from Admin to Client, but I can't figure out a solution. I tried the following solution but it didn't work:
Tried Solution: DialogFlow PermissionDenied: 403 IAM permission 'dialogflow.sessions.detectIntent'
There is no need to create a new agent. You can edit the existing agent's IAM.
In Dialogflow's console, go to settings ⚙ > under the general tab, you'll see the project ID section with a Google Cloud link to open the Google Cloud console > Open Google Cloud.
In Google Cloud, go to IAM & Admin > IAM, under the Members tab. Find the name of your agent's service account and then click on edit.
Give admin permissions to that account so it has permission to list intents.
The problem lies in the IAM section of GCP. Probably you are making a POST request with a role that does not have the necessary authorizations.
Look into your key.json file, which contains the field "client_email".
Proceed to the IAM page and assign the member with that email a role that has the necessary capabilities (e.g. Admin).
This solved my problem.
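If you are not sure which member to look for on the IAM page, the client_email can be read straight from the downloaded key file (a minimal sketch, using the key path from the question):
import json
with open("/home/user/folder/service-account-key.json") as f:
    key = json.load(f)
print(key["client_email"])  # this is the member to edit on the IAM page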
In Dialogflow's console, go to settings ⚙ > under the general tab, you'll see the project ID section with a Google Cloud link to open the Google Cloud console > Open Google Cloud.
(Optional) In the Cloud console, go to the menu icon > APIs & Services > Library. Select any APIs (if any) > Enable.
In Cloud Console > under the menu icon ☰ > APIs & Services > Credentials > Create Credentials > Service Account Key.
Under Create service account key, select New Service Account from the dropdown and enter a project name and for role choose Owner > Create.
JSON private key file will be downloaded to your local machine that you will need.
For Javascript:
In the index.js file you can do service account auth with JWT:
const {google} = require('googleapis'); // assumes the googleapis package is installed
const serviceAccount = {}; // Starts with {"type": "service_account",...
// Set up Google Calendar Service account credentials
const serviceAccountAuth = new google.auth.JWT({
  email: serviceAccount.client_email,
  key: serviceAccount.private_key,
  scopes: 'https://www.googleapis.com/auth/xxxxxxx'
});
For Python:
There's a Google Auth Python Library available via pip install google-auth and you can check out more here.
When you create the IntentsClient, you can pass the service account credentials explicitly:
from google.oauth2 import service_account
key_file_path = "/home/user/folder/service-account-key.json"
credentials = service_account.Credentials.from_service_account_file(key_file_path)
client = dialogflow.IntentsClient(credentials=credentials)
This error message is usually thrown when the application is not being authenticated correctly, due to reasons such as missing files, invalid credential paths, or incorrect environment variable assignments, among other causes. Keep in mind that when you set an environment variable value in a session, it is reset every time the session is dropped.
Based on this, I recommend you validate that the credential file and file path are being correctly assigned, and follow the Obtaining and providing service account credentials manually guide in order to explicitly specify your service account file directly in your code; this way, you can set it permanently and verify whether you are passing the service credentials correctly.
Example of passing the path to the service account key in code:
def explicit():
    from google.cloud import storage
    # Explicitly use service account credentials by specifying the private key file.
    storage_client = storage.Client.from_service_account_json('service_account.json')
    # Make an authenticated API request
    buckets = list(storage_client.list_buckets())
    print(buckets)
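Tying this back to the question, the same explicit-credentials pattern can be applied to the Dialogflow client and the intent listing (a sketch using the dialogflow_v2 package and key path from the question; '[PROJECT_ID]' is still a placeholder):
import dialogflow_v2 as dialogflow
from google.oauth2 import service_account
# Build credentials explicitly from the key file instead of relying on the
# GOOGLE_APPLICATION_CREDENTIALS environment variable.
credentials = service_account.Credentials.from_service_account_file(
    "/home/user/folder/service-account-key.json")
client = dialogflow.IntentsClient(credentials=credentials)
parent = client.project_agent_path('[PROJECT_ID]')
for intent in client.list_intents(parent):
    print(intent.display_name)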
Also try creating the project in the Dialogflow Console:
https://dialogflow.cloud.google.com/
You need to create the following as environment variables:
googleProjectID: "",
dialogFlowSessionID: "anything",
dialogFlowSessionLanguageCode: "en-US",
googleClientEmail: "",
googlePrivateKey:
I think you might have missed the Enable the API section in the documentation setup.
Here is that link:
https://cloud.google.com/dialogflow/cx/docs/quick/setup#api
After clicking the link, select the chatbot project you created and follow the instructions given there.
The permissions that I have given for that project are Owner and Editor.
After this, try the code in this link:
https://cloud.google.com/dialogflow/es/docs/quick/api#detect_intent
You should get a response from your chatbot
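For reference, the core of that detect-intent sample looks roughly like this (a sketch using the same dialogflow_v2 package as in the question; the project ID, session ID and query text are placeholders):
import dialogflow_v2 as dialogflow
session_client = dialogflow.SessionsClient()
session = session_client.session_path('[PROJECT_ID]', 'test-session')
text_input = dialogflow.types.TextInput(text="Hello", language_code="en-US")
query_input = dialogflow.types.QueryInput(text=text_input)
response = session_client.detect_intent(session=session, query_input=query_input)
print(response.query_result.fulfillment_text)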
Hope this helps!

End user Google Account not authorized

I have a django app running on appengine and would like to attach a cloud sql instance to it. I've set up everything as necessary, and would like to populate the database with a simple
manage.py syncdb
but the above call gives me a "google.storage.speckle.python.api.rdbms.DatabaseError: 0: End user Google Account not authorized" error
I've set up Cloud SQL to allow access from my App Engine application.
Maybe you are using the wrong OAuth2 credentials. Try deleting the credentials .dat file and logging in again.
https://developers.google.com/appengine/docs/python/cloud-sql/django#authentication