AWS DMS Service and S3

I have a source system in SAP BW and want to migrate data into an AWS S3 bucket using AWS DMS (Database Migration Service). All the source data is in flat-file format (either .csv or .xls).
How do I connect to SAP BW from AWS DMS and extract data from the source?

I don't know if it's the same thing, but you can check this guide on how to migrate an SAP ASE database as a source:
https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Source.SAP.html
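The S3 side of such a migration is at least easy to script. Below is a minimal boto3 sketch that creates an S3 target endpoint for DMS; the bucket name, IAM role ARN, and endpoint identifier are placeholders, and it assumes you already have a replication instance and a DMS service-access role in place.

```python
import boto3

# Sketch only: create an S3 *target* endpoint for AWS DMS.
# Bucket name, role ARN, and endpoint identifier are placeholders.
dms = boto3.client("dms", region_name="us-east-1")

response = dms.create_endpoint(
    EndpointIdentifier="sap-bw-s3-target",    # hypothetical name
    EndpointType="target",
    EngineName="s3",
    S3Settings={
        "BucketName": "my-migration-bucket",  # your S3 bucket
        "ServiceAccessRoleArn": "arn:aws:iam::123456789012:role/dms-s3-access",
        "CsvDelimiter": ",",
        "CsvRowDelimiter": "\n",
    },
)
print(response["Endpoint"]["EndpointArn"])
```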

Related

Migrate data from Azure SQL Pool (formerly SQL DW) to AWS S3

I am trying to migrate data from Azure Dedicated SQL Pool (formerly SQL DW) into S3.
My initial approach was to do so using AWS DMS. However, DMS doesn't have an endpoint for Azure SQL Pool. I am also aware of other ways like sending data from SQL Pool into Azure Blob Storage and then migrating from Blob Storage to S3.
However, I am looking for some kind of 'direct transfer' by which I can pull data straight from the SQL pool into S3, or some way I can push from the SQL pool into S3.
Is there any way this can be done?
If not, which would be the most efficient way to migrate this data?
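For what it's worth, the kind of "direct transfer" described here can be approximated with a small script that queries the dedicated SQL pool over ODBC and writes the result straight to S3. This is only a sketch, not an answer from the thread; the server, database, credentials, table, and bucket names are all placeholders.

```python
import csv
import io

import boto3
import pyodbc

# Sketch of a "direct" pull: query the dedicated SQL pool over ODBC,
# then stream the rows to S3 as CSV. All connection details are placeholders.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myworkspace.sql.azuresynapse.net;"
    "DATABASE=mydwh;UID=loader;PWD=secret"
)
cursor = conn.cursor()
cursor.execute("SELECT * FROM dbo.sales")  # hypothetical table

buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow([column[0] for column in cursor.description])  # header row
writer.writerows(cursor.fetchall())

s3 = boto3.client("s3")
s3.put_object(
    Bucket="my-target-bucket",             # your S3 bucket
    Key="exports/sales.csv",
    Body=buffer.getvalue().encode("utf-8"),
)
```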

How to ingest my log files (which are generated in my EC2 instance) into the Azure cloud

I'm trying to ingest my log files, which are generated on the EC2 instance (API logs), into the Azure cloud. Which service do I need to select in Azure? Please help me find a solution for this.
If you can export these logs to an S3 bucket, then you can use Azure Data Factory to pull the logs from that bucket into Azure Blob Storage or a data lake.
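If you take that route, the export step from the EC2 instance to S3 can be a small boto3 script like the sketch below; the log directory and bucket name are placeholders. Azure Data Factory can then pick the files up from the bucket.

```python
import os

import boto3

# Sketch: push local API log files from the EC2 instance to an S3 bucket
# so Azure Data Factory can later pull them. Paths and bucket are placeholders.
LOG_DIR = "/var/log/myapp"       # hypothetical log directory
BUCKET = "my-log-export-bucket"  # your S3 bucket

s3 = boto3.client("s3")
for name in os.listdir(LOG_DIR):
    if name.endswith(".log"):
        s3.upload_file(os.path.join(LOG_DIR, name), BUCKET, f"api-logs/{name}")
```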

Can Snowflake be used as a source endpoint in AWS Database Migration Service?

I am trying to use AWS DMS (Database Migration Service) with Snowflake as a source database. Is there any way I can achieve this?
All I could see were options for IBM Db2, MySQL, SQL Server, Amazon Aurora, Oracle, SAP Sybase, etc., but not for Snowflake.
Can an ODBC string for Snowflake be put in as a source endpoint? Or is there any workaround?
Because DMS doesn't support Snowflake as a destination yet, I think you could use S3 as the target and then use either:
Snowflake bulk load to load data from S3: https://docs.snowflake.com/en/user-guide/data-load-s3-create-stage.html
Snowpipe for continuous loading.
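A minimal sketch of the bulk-load step, using the snowflake-connector-python package, is shown below. The account, credentials, stage, table, and bucket are placeholders; the statements follow the external stage / COPY INTO pattern described in the linked guide.

```python
import snowflake.connector

# Sketch: load CSV files that DMS wrote to S3 into a Snowflake table.
# Account, credentials, database objects, and bucket are placeholders.
conn = snowflake.connector.connect(
    account="my_account",
    user="loader",
    password="secret",
    warehouse="LOAD_WH",
    database="MY_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# External stage pointing at the S3 prefix that DMS writes to.
cur.execute("""
    CREATE STAGE IF NOT EXISTS dms_stage
    URL = 's3://my-dms-target-bucket/public/orders/'
    CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>')
""")

# Bulk load the staged CSV files.
cur.execute("""
    COPY INTO orders
    FROM @dms_stage
    FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"')
""")
```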

How to connect to and access an on-prem SFTP server in AWS Glue directly?

Is it possible to connect to an on-prem SFTP server directly in an AWS Glue job?
The SFTP server has restricted access in this case (IP whitelisting).
Thanks
AWS Glue doesn't support connectivity to SFTP servers natively. You might have to use third-party drivers or JARs for this kind of support. I have added a blog link below that might suit your needs:
Connect to SFTP Data in AWS Glue Jobs Using JDBC
Alternative approach:
Since you mentioned "The SFTP server has restricted access" If there is no security concern to copy the data over to S3, below would be the possible solution for file based kind of source systems.
Setup a scheduler or a process to transfer these files into S3
buckets. Then AWS Glue crawler can be used to create metadata and AWS
Glue jobs for ETL.
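A rough sketch of such a transfer process, using paramiko for SFTP and boto3 for S3; the host, credentials, paths, and bucket are placeholders. In practice you would run this on a schedule from a machine or network that the SFTP server's IP whitelist allows.

```python
import io
import stat

import boto3
import paramiko

# Sketch: copy files from a restricted SFTP server into S3 so that
# Glue crawlers/jobs can work on them. All connection details are placeholders.
HOST, PORT, USER, PASSWORD = "sftp.example.com", 22, "etl", "secret"
REMOTE_DIR, BUCKET, PREFIX = "/outbound", "my-landing-bucket", "sftp-drop/"

transport = paramiko.Transport((HOST, PORT))
transport.connect(username=USER, password=PASSWORD)
sftp = paramiko.SFTPClient.from_transport(transport)
s3 = boto3.client("s3")

for entry in sftp.listdir_attr(REMOTE_DIR):
    if stat.S_ISREG(entry.st_mode):  # skip directories
        buf = io.BytesIO()
        sftp.getfo(f"{REMOTE_DIR}/{entry.filename}", buf)
        buf.seek(0)
        s3.upload_fileobj(buf, BUCKET, PREFIX + entry.filename)

sftp.close()
transport.close()
```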

Azure Data Factory to SQL Server in AWS

I am new to Azure Data Factory (ADF) and would like to know whether it is technically possible to use ADF to copy data from a source in an AWS (not Azure) environment and put it into a sink in another AWS environment. I am aware that we need an Integration Runtime (IR) to connect to the source. Can we achieve copying to AWS as well using the IR?
According to this document
Data stores with * can be on-premises or on Azure IaaS, and require you to install Data Management Gateway on an on-premises/Azure IaaS machine.
But this does not say whether we can or cannot transfer to an AWS environment.
You are referencing the ADF V1 documentation. You should reference the ADF V2 documentation instead, as ADF V2 supports more data stores.
Currently, ADF V2 supports Amazon Marketplace Web Service as a source, but not as a sink. However, you could take a look at the generic ODBC connector if you have an ODBC driver for your SQL Server on AWS.