Can The AWS CLI Copy From S3 To EC2? - amazon-web-services

I'm familiar with running the AWS CLI command to copy from a folder to S3 or from one S3 bucket to another S3 bucket:
aws s3 cp ./someFile.txt s3://bucket/someFile.txt
aws s3 cp s3://bucketSource/someFile.txt s3://bucketDestination/someFile.txt
But is it possible to copy files from S3 to an EC2 instance when you're not on that instance? Something like:
aws s3 cp s3://bucket/folder/ ec2-user@1.2.3.4:8080/some/folder/
I'm trying to run this from Jenkins which is why I can't simply run the command on the EC2 like this:
aws s3 cp s3://bucket/folder/ ./my/destination/folder/on/the/ec2
Update:
I don't think this is possible so I'm going to look into using https://docs.aws.amazon.com/cli/latest/reference/ssm/send-command.html

No.
The AWS CLI calls the AWS API. The APIs for Amazon S3 do not have the ability to interact with the operating system on an Amazon EC2 instance.
Your idea of using AWS Systems Manager is a good idea. You can send the command to the instance itself, and the instance can then upload/download objects to Amazon S3.

Since you have SSH access, you could also just run
ssh -p 8080 ec2-user@1.2.3.4 "aws s3 cp s3://bucket/folder/ ./my/destination/folder/on/the/ec2 --recursive"
... to run the command on the EC2 instance directly.
It's not as efficient as using send-command (because ssh will necessarily pipe the output of that command to your local terminal) but, if you're not transferring millions of files, the tradeoff in simplicity may be acceptable for you.
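For example, from a Jenkins shell step this could look like the following (the key path and SSH port are assumptions; adjust them to your environment):
ssh -i /var/lib/jenkins/.ssh/ec2-key.pem -p 8080 ec2-user@1.2.3.4 "aws s3 cp s3://bucket/folder/ ./my/destination/folder/on/the/ec2 --recursive"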

Using AWS Systems Manager send-command:
# Copy a file from an S3 bucket to an EC2 instance:
Instance_Id='i-0123456xxx'
aws ssm send-command --document-name "AWS-RunShellScript" --document-version "\$DEFAULT" --targets "Key=InstanceIds,Values=$Instance_Id" --parameters '{"commands":["aws s3 cp s3://s3-bucket/output-path/file-name /dirName/"]}' --timeout-seconds 600 --max-concurrency "50" --max-errors "0" --region REGION_NAME
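To verify that the copy actually ran, you can check the command's status afterwards (a minimal sketch; take the CommandId from the JSON that send-command prints, and REGION_NAME is the same placeholder as above):
aws ssm list-commands --command-id "<CommandId from the send-command output>" --region REGION_NAME --query "Commands[0].Status" --output text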

Related

How to copy files from AWS S3 to local machine?

How do I copy only the files that were newly updated in an S3 bucket to the local machine using the AWS CLI?
Can we compare the logs and copy only what changed?
You can use either the aws s3 cp command, or if you want to only synchronise new files you can use the aws s3 sync command.
The syntax is below
aws s3 cp s3://mybucket . --recursive
The documentation is available below:
aws s3 cp
aws s3 sync
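For the sync case, a minimal example (the bucket name is a placeholder) that copies only new or changed files into a local folder would be:
aws s3 sync s3://mybucket ./local-folder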

aws send command not copying file from s3 bucket to windows server ec2 instance

Is there a way I can copy a file from my S3 bucket to a Windows EC2 instance?
I have tried the following using send-command. It returns success, but the file is not being copied. I need help.
sh """
aws ssm send-command --instance-ids ${Instance_Id} --document-name "AWS-RunPowerShellScript" --parameters '{"commands":["Read-S3Object -BucketName s3://{bucket-name} file.pfx -File file.pfx"]}' --timeout-seconds 600 --max-concurrency "50" --max-errors "0" --region eu-west-1
"""
I believe the command you pasted is wrong, or it may have been copied incorrectly.
Since you are running the AWS CLI and sending a PowerShell command to be run within the instance, the two documents below are worth referring to:
Send-command CLI: https://docs.aws.amazon.com/cli/latest/reference/ssm/send-command.html
Read-S3Object CmdLet: https://docs.aws.amazon.com/powershell/latest/reference/items/Read-S3Object.html
SSM returning success only means that it was able to invoke the underlying plugin (in this case runPowerShellScript), regardless of whether the script itself succeeded. To investigate why the file was not copied, start by checking the output of the SSM command.
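For example, assuming you still have the CommandId from the send-command response, get-command-invocation returns the invocation status along with the script's stdout and stderr (the IDs are placeholders):
aws ssm get-command-invocation --command-id "<CommandId>" --instance-id "<Instance_Id>" --region eu-west-1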
Having said that, below is a working syntax of file copy from s3 object using runPowerShellScript:
aws ssm send-command --instance-ids $instance --document-name "AWS-RunPowerShellScript" --parameters commands=["Read-S3Object -BucketName $bucket -key get-param.reg -File c:\programdata\get-param.reg"]
SSM also provides a way to download s3 object with its own plugin aws:downloadContent
https://docs.aws.amazon.com/systems-manager/latest/userguide/ssm-plugins.html#aws-downloadContent
This would require you to create a custom document (you should find example in the above doc) and just run that document to get the s3 object into windows/linux instance.
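A minimal sketch of such a custom document (the document name, S3 URL, and destination path are placeholders; check the plugin documentation above for the exact input formats): save the JSON, register it with create-document, then run it with send-command.
cat > download-from-s3.json <<'EOF'
{
  "schemaVersion": "2.2",
  "description": "Download a file from S3 using aws:downloadContent",
  "mainSteps": [
    {
      "action": "aws:downloadContent",
      "name": "downloadFile",
      "inputs": {
        "sourceType": "S3",
        "sourceInfo": "{\"path\": \"https://s3.eu-west-1.amazonaws.com/bucket-name/file.pfx\"}",
        "destinationPath": "C:\\temp"
      }
    }
  ]
}
EOF
aws ssm create-document --name "DownloadFileFromS3" --document-type "Command" --content file://download-from-s3.json
aws ssm send-command --document-name "DownloadFileFromS3" --instance-ids "<Instance_Id>" --region eu-west-1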
I hope this helps.
Here is how I would accomplish what you are attempting:
Instead of the AWS-RunPowerShellScript SSM document, use the AWS-RunRemoteScript document.
This document lets you run a script stored in S3 on the EC2 instance, and inside that script you can download the files you are looking for from the S3 bucket using the aws s3api CLI.
It would look something like this:
aws ssm send-command --document-name "AWS-RunRemoteScript" --document-version "1" --instance-ids $instance --parameters '{"sourceType":["S3"],"sourceInfo":["{\"path\":\"[url to script that is stored in s3]\"}"],"commandLine":[".\\[name of script]"],"workingDirectory":[""],"executionTimeout":["3600"]}' --timeout-seconds 600 --max-concurrency "50" --max-errors "0"
The powershell script that you upload to s3 will look something like this:
aws s3api get-object --bucket [bucket name here] --key [s3 path (not url)] [path to where you want it downloaded]
To make this work, you need to make sure that the EC2 instance has permission to read from your S3 bucket. You can do this by attaching an S3 access policy (for example, AmazonS3ReadOnlyAccess or AmazonS3FullAccess) to the instance's IAM role in IAM.
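For example, if the instance's IAM role were named ec2-s3-access (a placeholder), attaching the AWS-managed read-only policy from the CLI would look like this:
aws iam attach-role-policy --role-name ec2-s3-access --policy-arn arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess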

Not able to get data from Amazon S3 to EC2 for Training

I'm new to cloud infrastructure for deep learning, and this is my first time using AWS for it. I don't know how to access my data from a launched EC2 instance.
My data is stored in an S3 bucket, but I can't find a way to get it onto the instance and start training.
Log in to that EC2 instance via SSH.
Install the AWS CLI if it is not already there.
Configure credentials: either attach an IAM role that grants the instance access to the S3 bucket,
or add an AWS access key and secret key (see the aws configure example after the documentation link below).
Copy files from S3 to the instance:
aws s3 cp s3://mybucket/test.txt test2.txt
Copy files from the instance to S3:
aws s3 cp test.txt s3://mybucket/test2.txt
https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html#examples
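If the instance has no IAM role and you have to fall back to access keys, a minimal sketch of setting them with aws configure (the key values and region are placeholders) is:
aws configure set aws_access_key_id AKIAXXXXXXXXXXXXXXXX
aws configure set aws_secret_access_key xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
aws configure set default.region us-east-1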

Unable to copy from S3 to Ec2 instance

I am trying to copy a file from S3 to an EC2 instance; here is the strange behavior.
The following command runs perfectly fine and shows me the contents of S3 that I want to access:
$aws s3 ls
2016-05-05 07:40:57 folder1
2016-05-07 15:04:42 my-folder
Then I issue the following command (also successful):
$ aws s3 ls s3://my-folder
2016-05-07 16:44:50 6007 myfile.txt
But when I try to copy this file, I receive the following error:
$aws s3 cp s3://my-folder/myfile.txt ./
A region must be specified --region or specifying the region in a
configuration file or as an environment variable. Alternately, an
endpoint can be specified with --endpoint-url
I simply want to copy txt file from s3 to ec2 instance.
At least, how should I modify the above command to copy the contents? I am not sure about the region, because if I visit S3 from the web it says
"S3 does not require region selection"
What on earth is happening?
Most likely something is not working right; you should not be able to list the bucket if its region is not set up as the default in aws configure.
Therefore from my experience with S3 if this works:
aws s3 ls s3://my-folder
then this should work as well:
aws s3 cp s3://my-folder/myfile.txt ./
However, if it's asking you for a region, then you need to provide it.
Try this to get the bucket region:
aws s3api get-bucket-location --bucket BUCKET
And then this to copy the file:
aws s3 cp --region <your_buckets_region> s3://my-folder/myfile.txt ./
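Putting the two together, a small sketch that looks up the region and then copies (reusing the bucket and file names from the question):
region=$(aws s3api get-bucket-location --bucket my-folder --query LocationConstraint --output text)
# Buckets in us-east-1 return a null LocationConstraint, which prints as "None"
if [ "$region" = "None" ]; then region=us-east-1; fi
aws s3 cp --region "$region" s3://my-folder/myfile.txt ./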
If I visit S3 from web it says
"S3 does not require region selection"
S3 and bucket regions can be very confusing, especially with that message; in my opinion it is the most misleading piece of information when it comes to S3 regions. Every bucket has a specific region (the default is us-east-1), unless you have enabled cross-region replication.
You can choose a region to optimize latency, minimize costs, or
address regulatory requirements. Objects stored in a region never
leave that region unless you explicitly transfer them to another
region. For more information about regions, see Accessing a Bucket in
the Amazon Simple Storage Service Developer Guide.
How about
aws s3 cp s3://my-folder/myfile.txt .
# or
aws s3 cp s3://my-folder/myfile.txt myfile.txt
I suspect the problem has something to do with the local path parser.
aws cli s3 fileformat parser
It is kind of strange, because the AWS CLI reads the credential and region config.
The fix is to specify the region, as shown below; the get-bucket-location command above shows how to find the bucket's region if you can't get it from the CLI configuration.
aws s3 cp s3://xxxxyyyyy/2008-Nissan-Sentra.pdf myfile.pdf --region us-west-2

How to transfer data from Amazon S3 to Amazon EC2

I am using an EC2 instance and have enabled access logging for my Elastic Load Balancer. The logs are stored in Amazon S3, and I want to use that data as a dataset for Elasticsearch, which runs on my EC2 instance. Is there a way I can transfer the data to my EC2 instance, or access the data directly from S3 for use by Elasticsearch?
The AWS Command Line Interface (CLI) has commands that make it easy to copy to/from Amazon S3.
You can run them on your EC2 instance to download data from Amazon S3.
aws s3 cp s3://bucket/path/file .
aws s3 cp s3://bucket/path . --recursive
aws s3 sync s3://bucket/path .
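For ELB access logs specifically, a hedged example (the bucket name, account ID, region, and date prefix are placeholders following the standard ELB log layout) might be:
aws s3 sync s3://my-elb-logs-bucket/AWSLogs/123456789012/elasticloadbalancing/us-east-1/2016/05/07/ ./elb-logs/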