I'm evaluating AWS Database Migration Service (DMS). I'm attempting to move data from an Azure SQL database to a SQL Server 2016 database sitting on AWS RDS. I've successfully created the source endpoint and was able to connect when I clicked the Run Test button. However, when I entered the Target database connection details, I'm not able to connect when I click the Run Test button. The information and error message are below.
I am able to connect to this instance using SQL Server Management Studio with the credentials I'm using in the screenshot.
For timeout concerns, security groups are usually the culprit. Can you verify if the security group of your Target RDS instance allows ingress from the security group that the DMS Replication Instance belongs to?
See the attached screenshot:
See this article for more information: https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Security.Network.html
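If you prefer to add the rule from the CLI rather than the console, something like the following would do it. This is only a sketch: the security group IDs are placeholders, and it assumes SQL Server is listening on the default port 1433.
# allow the DMS replication instance's security group to reach the target RDS instance on 1433
aws ec2 authorize-security-group-ingress --group-id sg-TARGET-RDS --protocol tcp --port 1433 --source-group sg-DMS-REPLICATION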
We are attempting to connect to an Amazon Redshift Instance from Azure Data Factory as a linked service.
Steps Taken:
Provisioned a Self-Hosted Integration Runtime (Azure)
Created user access to the database within Redshift (AWS)
Whitelisted the IP addresses of the SHIR within the security group (AWS)
Built the linked service to Redshift using the login, server address, and database name (Azure)
From testing we know that this user login works with this database for other sources, and in general the process has worked for other technologies.
A screenshot of the error message received can be seen here.
Any suggestions would be greatly appreciated :)
To connect to Amazon Redshift from Azure, look at using the AWS SDK for .NET. You can use the Redshift Data API service client to write logic that performs CRUD operations on a Redshift cluster.
You can create a service client in .NET with this code:
var dataClient = new AmazonRedshiftDataAPIServiceClient(RegionEndpoint.USWest2);
Ref docs here:
https://docs.aws.amazon.com/sdkfornet/v3/apidocs/items/RedshiftDataAPIService/TRedshiftDataAPIServiceClient.html
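If you just want to confirm that the Data API, credentials, and network access work before writing any .NET code, the same API can also be exercised from the AWS CLI. This is a sketch only: the cluster identifier, database, and database user below are placeholders.
# submit a trivial query through the Redshift Data API
aws redshift-data execute-statement --cluster-identifier my-redshift-cluster --database dev --db-user awsuser --sql "select 1"
# then fetch the result using the statement Id returned above
aws redshift-data get-statement-result --id <statement-id>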
I have a simple DynamoDB table on AWS, and I can view its test data ('Items') from the DynamoDB console and also via a standalone client called Dynobase.
Now I want to create a simple webpage hosted on Lightsail that contains an HTML table to display the data.
I would like to connect to the DynamoDB using PHP then issue a query, tabulating the response.
Can someone point me at an example of how to do this? - The AWS documentation is quite confusing.
This is the code (link to the code) I am running on my Lightsail instance. I have added <?php at the top of the file and ?> at the bottom. I am testing the code via my web browser at xx.xx.xx.xx/MoviesCreateTable.php.
This is the error I am getting:
Unable to create table: Error executing "CreateTable" on "http://localhost:8000"; AWS HTTP error: cURL error 7: Failed to connect to localhost port 8000: Connection refused (see https://curl.haxx.se/libcurl/c/libcurl-errors.html) for http://localhost:8000
Many Thanks
Andy
It turned out that the tutorial here ([Link to the code]) was very useful.
The main change I made to enable the examples to work on my Lightsail instance was to remove the endpoint line (the one pointing at http://localhost:8000) from the SDK configuration, so the client talks to the real DynamoDB service.
Then I created a new IAM user and attached a policy to that user that granted Lightsail and DynamoDB access.
Next, using the AWS CLI on the Lightsail box, I configured it with the new user's credentials.
This worked for me.
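As a supplementary check (not part of the original steps), you can confirm from the Lightsail shell that the new credentials and region are picked up before touching the PHP code; the region and table name below are assumptions based on the Movies tutorial:
# list tables visible to the configured credentials
aws dynamodb list-tables --region us-east-1
# peek at a few items from the tutorial's Movies table
aws dynamodb scan --table-name Movies --region us-east-1 --max-items 5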
I have some trouble concerning the RDS / Managed AD connection:
I've set up the AWS Managed Microsoft AD and added some users.
Then, I've set up an MS-SQL Database in RDS.
Now, while accessing it via SQL Server Management Studio works flawlessly, I simply cannot add the AD users I've created.
I get the following error: The program cannot open the required dialog box because it cannot determine whether the computer named "Network Name Resource" is joined to a domain
Looking at the AD, I can see that the RDS instance is indeed missing.
How can that be? In the RDS console I can clearly see it being attached to the domain.
Have searched this issue for quite some time and hope someone can help me out here...
You must be signed into SSMS via a domain account with privileges to add/modify users' logins for that search box to work.
Furthermore, it is non-obvious, but you can confirm that your RDS instance is in the domain by using ADAC or ADUC (Active Directory Administrative Center or Active Directory Users and Computers) and looking under: AWS Reserved > RDS
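If the dialog keeps failing, one workaround (not mentioned above, and only a sketch with placeholder names) is to create the Windows login directly with T-SQL while connected as the RDS master user, for example via sqlcmd; replace the endpoint, credentials, and CORP\jdoe with your own:
# create the AD login without going through the SSMS search dialog
sqlcmd -S your-rds-endpoint.rds.amazonaws.com,1433 -U your_master_user -P 'your_password' -Q "CREATE LOGIN [CORP\jdoe] FROM WINDOWS;"
After that, the domain user should be able to sign in to the instance with Windows Authentication.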
I created a report in Power BI Desktop, connected to an AWS RDS database, and published it to the Power BI Web App, with an intent to refresh the dataset from the web app.
I tried doing so with both MySQL as well as SQL Server (on RDS). However, Power BI web does not let me refresh the dataset and instead wants me to install an on-premise Gateway.
I am not sure why this is a requirement, as my database is in the cloud and not on-premises, and is in a publicly accessible VPC.
Is it possible to refresh an AWS RDS dataset in the Power BI web app? How?
To refresh the Power BI visuals in the web application, the service needs a successful connection to the database. In the case of connecting to an AWS RDS instance, because the database server runs on an AWS virtual machine, it acts as an on-premises source from the Power BI service's point of view [6]. Therefore you would have to install the on-premises data gateway on a machine in AWS that has access to the RDS instance, ideally in the same VPC and following security best practices.
The steps to create an on-premises data gateway on AWS:
Create an EC2 Windows instance; see the AWS documentation [3] for more details. Refer to the Microsoft documentation [4] for the instance requirements based on the operating system. The following instance configuration has worked for me and can be different based on your requirements:
a. AMI Name: Windows_Server-2019-English-Full-Base-2021.10.13
b. Instance Type: t2.2xlarge
Check the required network ports you will need to allow in your instance's security group inbound and outbound rules (a sample CLI command for the RDP rule is shown after these steps). See the AWS documentation [5] for how to work with security groups.
Ensure that you add a “Key Pair” to the instance and have access to the PEM file. This will be needed to RDP into the machine.
Once the instance is created, RDP into it and install any required software.
For instance, in my case, to connect to an AWS PostgreSQL server I had to install Npgsql version 4.0.9. The latest version unfortunately didn't work, so be mindful of version compatibility.
Also, though not essential, I wanted to install the Chrome browser on the remote server as my preferred browser. To do so I ran the following command in PowerShell:
$Path = $env:TEMP; $Installer = "chrome_installer.exe"; Invoke-WebRequest "http://dl.google.com/chrome/install/375.126/chrome_installer.exe" -OutFile $Path$Installer; Start-Process -FilePath $Path$Installer -Args "/silent /install" -Verb RunAs -Wait; Remove-Item $Path$Installer
With the Remote Desktop open, install the Power BI on-Premises Gateway [4].
Now search for the "On-Premises Data Gateway" app on the remote server and register your account. Use the account that is connected to your Power BI web app. This could be either a role-based email address that you and Power BI have access to, or your own email address associated with the Power BI web app.
For details on creating a data source, see reference [6].
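Regarding the security-group step above, here is the kind of CLI command I mean for the inbound RDP rule; it is a sketch only, with a placeholder security group ID and admin IP address (the gateway itself mainly needs outbound connectivity, which the default security group egress rule typically allows):
# allow RDP to the gateway instance from your admin workstation only
aws ec2 authorize-security-group-ingress --group-id sg-GATEWAY-EC2 --protocol tcp --port 3389 --cidr 198.51.100.10/32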
Data Refresh Scheduling
Now, back in the Power BI web app, click the settings button > Manage gateways; on the left-hand side you should see the gateway show up. Add the credentials and test whether the connection is working. If you get a green tick, you should be able to schedule the data refresh on the web.
You can configure Power BI to refresh data on a schedule; see reference [7] for more details.
Troubleshooting
Pay attention to the error that is given when you try to connect to the database on the web. It would usually have a useful hint of what might be missing.
Reference links:
[3] Launch an instance using the Launch Instance Wizard - https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/launching-instance.html
[4] Install the Power BI on-premises data gateway - https://learn.microsoft.com/en-us/data-integration/gateway/service-gateway-install
[5] Work with security groups - https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/working-with-security-groups.html
[6] Connecting the Microsoft Power BI service to AWS data sources - https://docs.aws.amazon.com/en_us/whitepapers/latest/using-power-bi-with-aws-cloud/connecting-the-microsoft-power-bi-service-to-aws-data-sources.html
[7] Data refresh in Power BI - https://learn.microsoft.com/en-us/power-bi/connect-data/refresh-data
[8] On-premises data gateway overview - https://learn.microsoft.com/en-us/power-bi/connect-data/service-gateway-onprem
[9] On-premises data gateway installation on AWS for Power BI (author's own write-up) - https://www.smart5.co.uk/en_gb/article/25/on-premises-data-gateway-installation-aws-power-bi
I have a Cloud SQL database and a VM instance in the same Google Cloud project.
I want to connect to MySQL (on Google Cloud SQL) from the instance using the command line.
I am connecting using the command:
gcloud sql connect cloud-sql-name --user=username
It's giving:
ERROR: (gcloud.sql.connect) There was no instance found or you are not authorized to connect to it.
How do I make the connection?
You probably missed authorizing your instance's IP address to connect to the Cloud SQL database. This is done, for example, through the Cloud Console on the Cloud SQL Instances page.
To see the step-by-step guide for connecting from a Compute Engine instance to Cloud SQL (with the mysql client), check this docs page.
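The console step can also be done from the command line; this is only a sketch, with a placeholder IP address, and it only applies if you are connecting over the instance's public IP (note that this flag replaces the existing authorized-networks list):
# authorize your VM's external IP to reach the Cloud SQL instance
gcloud sql instances patch cloud-sql-name --authorized-networks=203.0.113.10/32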
Are you trying to connect to your Cloud SQL instance from your Compute Engine VM instance?
If this is the case, I suggest first granting your service account (the Compute Engine default service account or a new one) the corresponding IAM role/permission for Cloud SQL [1].
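If you prefer the command line over the console for this step, the binding can be added with gcloud; the project ID and service account email below are placeholders, and roles/cloudsql.client is the typical role for connecting:
# grant the Cloud SQL Client role to the service account
gcloud projects add-iam-policy-binding my-project-id --member="serviceAccount:my-sa@my-project-id.iam.gserviceaccount.com" --role="roles/cloudsql.client"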
Then, authenticate/activate [2] your service account from your VM instance's command line (it is recommended to generate a JSON key file instead of a P12 key file for your service account):
gcloud auth activate-service-account [ACCOUNT] --key-file=[KEY_FILE]
To generate a key file for your service account, go to the Products and Services menu > IAM & admin > Service accounts. Click the three-dotted button at the right of the corresponding service account and select "Create key".
Your key file should be stored/uploaded to your VM instance, so that it can be used to activate the service account.
It is possible that you will also need to grant your VM instance access to Cloud SQL within its Cloud API access scopes panel. Go to the Products and Services menu > Compute Engine > VM instances, select your VM instance, and edit it.
Be aware that you will need to stop your VM instance before editing Cloud API access scopes. Go to "Access scopes" > "Set access for each API". Enable and Save.
You will also have to enable the Cloud SQL Admin API. Go to the Products and Services menu > APIs and services, search for SQL Admin, and enable it (wait a few minutes).
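This can also be done from the command line inside the VM (assuming the gcloud SDK is set up there), for example:
# enable the Cloud SQL Admin API for the current project
gcloud services enable sqladmin.googleapis.com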
Start the VM instance and try your Cloud SQL tests (re-activate your service account if necessary).
Your Cloud SQL instance for MySQL creates the "root" user. Just make sure to use this as the username, or any other you have created (if any).
Take into account that since you would be connecting from a Compute Engine VM instance, it is possible that you will be asked for a MySQL client.
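For example, on a Debian/Ubuntu-based VM image you could install a MySQL client with something like the following (the package name varies by distribution, so treat this as an assumption):
# install a command-line MySQL client
sudo apt-get update && sudo apt-get install -y default-mysql-client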
[1] https://cloud.google.com/sql/docs/mysql/project-access-control
[2] https://cloud.google.com/sdk/gcloud/reference/auth/activate-service-account