AWS Sync S3 with EC2 Clarification Question

If I sync an S3 bucket to an EC2 instance, when a new file is added to S3 is it automatically added to my EC2 instance, or does it only work the other way around (EC2 file --> added to S3 bucket)?
I need any new files added to my S3 bucket to be added to my EC2 instance, i.e. S3 bucket --> EC2.

No, a sync is a one-time operation. You would need to run it again each time files change, whether you are copying them to or from S3.
Depending on the OS on your EC2 instance, there are ways to 'mount' an S3 bucket as a folder on the instance.
Googling 'mount s3 bucket to ec2' will get you started if that is what you want to do.
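If a periodic copy is good enough, another option is simply to re-run the sync on a schedule. A minimal sketch, assuming my-bucket and /data are placeholders for your own bucket and local path (and that the aws binary is on cron's PATH):
*/5 * * * * aws s3 sync s3://my-bucket /data
This crontab entry re-runs the sync every 5 minutes; since aws s3 sync only transfers files that are new or changed, re-running it frequently is cheap when the bucket is mostly static.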

Related

How to copy AWS S3 files to a Lightsail bucket?

How do we cp S3 files to a Lightsail bucket?
I want to copy or move all of my S3 files to a Lightsail bucket, because Lightsail buckets are much cheaper than S3.
Here is the solution I used:
aws configure
First, configure credentials for the S3 download: enter your AWS access key ID and secret access key for S3.
aws s3 cp --recursive s3://<bucketname>/ ./s3
This downloads all the S3 files to a local s3 folder.
aws configure
Now re-run aws configure and enter the Lightsail bucket's access key ID and secret, just as you did for S3.
aws s3 cp --recursive ./s3 s3://<bucketname>/
Then all files are copied.
It's easy as long as you split it into these steps.
But I was originally trying to copy from S3 to the Lightsail bucket directly.
That is very complicated, because the IAM role can't be shared; I think Lightsail is a service isolated from S3 and EC2, which is why it's cheap and simple.
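As a side note, re-running aws configure overwrites your default credentials each time. A lighter-weight variant, assuming a profile name lightsail that you pick yourself, is to keep both credential sets as named profiles:
aws configure --profile lightsail
Enter the Lightsail bucket's access key ID and secret once.
aws s3 cp --recursive s3://<bucketname>/ ./s3
This uses the default profile to download from S3.
aws s3 cp --recursive ./s3 s3://<bucketname>/ --profile lightsail
This uses the lightsail profile to upload to the Lightsail bucket, so you never have to re-enter keys when switching direction.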

Question on using the AWS CLI to deploy code to an EC2 instance

I am looking at using Jenkins to deploy a war file to an EC2 instance. I have set up something similar before, creating an EC2 instance, an S3 bucket and a CodeDeploy application. The way that worked was:
1) Zip up and upload the war/jar into an S3 bucket.
2) Use the AWS createDeployment step to deploy the zip file from the S3 bucket to the EC2 instance. This also involved creating an appspec.yml and scripts to set up the environment.
But I have been told there is another way that does not require setting up CodeDeploy.
I have created an EC2 instance and set up a Docker container inside it, with all the environment settings.
What I would like to do is load my zip file onto the EC2 instance, so that I don't need an AWS CodeDeploy application.
Is this correct? Is there an AWS CLI command to simply load a zip file onto the EC2 instance?
Thank you for any help.
You can copy it from an S3 bucket.
To copy files from an S3 bucket to an EC2 instance:
Create an IAM role with S3 read access (or admin access)
Attach the IAM role to the EC2 instance
Install the AWS CLI on the EC2 instance
Run the aws s3 cp command to copy the files from S3 to EC2
To copy the files from S3 to EC2, keep the source as the bucket URL and the destination as your local directory or filename:
aws s3 cp s3://<S3BucketName> <Fully Qualified Local filename/Directory>
Here the source is the S3 bucket URL and the destination is a local file name or directory name.
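For the first step, here is a minimal read-only policy sketch, assuming a role named ec2-s3-read already exists and my-bucket stands in for your bucket (both names are placeholders of my own, not anything from the question):
aws iam put-role-policy --role-name ec2-s3-read --policy-name s3-read \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": ["arn:aws:s3:::my-bucket", "arn:aws:s3:::my-bucket/*"]
    }]
  }'
The role also needs an EC2 trust policy and an instance profile before it can be attached to the instance.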

How to download from a Requester Pays AWS S3 bucket?

How can I go about downloading from this S3 bucket:
aws s3 cp --request-payer requester s3://adatascientist/parler/v1/post.ndjson.zip
I am not very technically minded. Do I need particular software or a programme to do this? I have already created my AWS account; I am just not sure of the next steps.
The website is here for any extra information: https://parler.adatascienti.st/research
Assuming that you have installed the AWS Command-Line Interface (CLI), then:
Use aws configure and provide your Access Key and Secret Key
Use aws s3 cp --request-payer requester s3://adatascientist/parler/v1/post.ndjson.zip . to copy the file to the current local directory
There is no Data Transfer charge if you do this from an Amazon EC2 instance in the same Region. If you download to your own computer, you will be charged for Data Transfer (9c/GB).
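If you want to check what is in the bucket before copying, listing should also work with the same Requester Pays flag (the prefix here is taken from the question):
aws s3 ls --request-payer requester s3://adatascientist/parler/v1/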

Run AWS CLI from local without storing credentials in local

How can I run the AWS CLI to download S3 bucket data without storing AWS credentials on my local machine?
Please note that the S3 bucket is not a public bucket.
Not sure what your goal is, but you can use environment variables, which are only exported for the current session/AWS CLI run.
To prevent the export from being written to your bash history (assuming you are using Linux), put a space in front of the command.
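A minimal sketch; the key values are placeholders for your own keys, and the leading space only keeps a line out of history when HISTCONTROL includes ignorespace (the default on many distributions):
 export AWS_ACCESS_KEY_ID=AKIA...
 export AWS_SECRET_ACCESS_KEY=...
aws s3 cp s3://<bucketname>/<key> .
The CLI picks the keys up from the environment; unset the variables or close the shell when you are done.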
You can start an EC2 instance and give that instance a role that allows it to read from your S3 bucket.
Once started, connect to the EC2 instance using ssh and initiate your S3 transfer using aws s3 cp ... or aws s3 sync ...

Not able to get data from Amazon S3 to EC2 for Training

I'm new to cloud infrastructure for deep learning and am trying to use AWS for deep learning for the first time, and I don't know how to access my data from the launched EC2 instance.
My data is stored in an S3 bucket, but I'm not able to find a way to get it onto the instance and start training.
Log in to that EC2 instance via ssh.
Install the AWS CLI if it's not there.
Configure credentials: either add a permission (IAM role) for the EC2 instance to use the S3 bucket, or otherwise add your AWS secret and access key.
Get files from S3 to your local system:
aws s3 cp s3://mybucket/test.txt test2.txt
Get files from local to S3:
aws s3 cp test.txt s3://mybucket/test2.txt
https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html#examples
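For a training dataset with many files, a recursive sync is usually more convenient than copying files one by one. A sketch, where s3://mybucket/dataset/ and ~/data are placeholders for your own bucket prefix and local path:
aws s3 sync s3://mybucket/dataset/ ~/data
This copies the whole prefix and skips files that are already present locally; afterwards your training script can read ~/data as a normal local directory.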