Uploading file to AWS from local machine - amazon-web-services

How do I use the scp command to upload a file to an AWS server?
I have a .pem file in /Downloads on my local machine.
I am trying to copy a file to the /images folder on the AWS server.
What command can I use?
Thanks,

You can use plain scp:
scp -i ~/Downloads/file.pem local_image_file user@ec2_elastic_ip:/home/user/images/
You need to assign an Elastic IP to the EC2 instance, open port 22 to your local machine's IP in the instance's security group, and use the right user (it can be ec2-user, admin, or ubuntu; check the AMI documentation).
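For example, opening port 22 from the AWS CLI might look something like this (the security group ID and source IP are placeholders for your own values):
aws ec2 authorize-security-group-ingress --group-id <your-sg-id> --protocol tcp --port 22 --cidr <your-local-ip>/32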

Diego's answer works. However, if you don't know your Elastic IP, you can simply scp to the public IP with the following command (check the order of the arguments):
scp -i path-to-your-identifier.pem file-to-be-copied ubuntu@public-IP:/required-path
Just for reference: here ubuntu is your AWS user, and public-IP looks something like 54.2xx.xxx.xxx, e.g. 54.200.100.100.
(If the order is messed up, with the filename before the identifier, you'll get a Permission denied (publickey). lost connection error.)
Also, keep in mind the permissions of the .pem file: it should be 400 or 600, not readable by everyone.
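For example, to tighten the key's permissions before connecting:
chmod 400 path-to-your-identifier.pem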
Hope it helps!

There are a number of ways to achieve what you want:
use s3cmd http://s3tools.org/s3cmd (see the sketch below)
or use Cyberduck http://cyberduck.ch/
or write a tool using the Amazon Java API
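A minimal s3cmd sketch, assuming your files are headed for an S3 bucket (the bucket name is a placeholder):
s3cmd put local_image_file s3://your-bucket/images/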

You can try the kitten utility, which is a wrapper around boto3. You can easily upload/download files and run commands on an EC2 server, or on multiple servers at once for that matter.
kitten put -i ~/.ssh/key.pem cat.jpg /tmp [SERVER NAME] [SERVER IP]
where SERVER NAME is the user, e.g. ubuntu or ec2-user.
This will upload the cat.jpg file to the /tmp directory of the server.

This is the correct way to upload from local to remote:
scp -i "zeus_aws.pem" ~/Downloads/retail_market_db-07-01-2021.sql ubuntu@ec2-3-108-200-27.us-east-2.compute.amazonaws.com:/var/www/
Using the public DNS name could be a better approach.

Another alternative to scp is rsync.
Some of the benefits of rsync:
faster - it uploads only the deltas
it can compress the transfer
you can exclude some files from the upload
it can resume interrupted transfers
you can limit the transfer bandwidth
The rsync command:
rsync -ravze "ssh -i /home/your-user/your-key.pem" --exclude '.env' --exclude '.git/' /var/www/your-folder-to-upload/* ubuntu@xx.xxx.xxx.xxx:/var/www/your-remote-folder
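A sketch of the resume and bandwidth-limit options mentioned above (the 5000 KB/s limit is just an example value):
rsync -ravz --partial --bwlimit=5000 -e "ssh -i /home/your-user/your-key.pem" /var/www/your-folder-to-upload/ ubuntu@xx.xxx.xxx.xxx:/var/www/your-remote-folder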
Now, in case you find this syntax a little bit verbose, you can use aws-upload, which does all of the above with tab completion.

Related

Copying files from remote (non GCE) server to google cloud platform bucket over SSH

How do I copy files from a remote server to a Google Cloud Storage bucket over SSH? This question outlines how to copy files from a Google Compute Engine instance to Google Cloud Storage, which is not what I need.
The files are too large to download to my local machine before uploading them. Is it possible to do something like this:
ssh -p 2455 skiventist@myremoteserver | gsutil cp -r /path/to/remote_folder gs://mybucket
Error:
CommandException: No URLs matched: /path/to/remote_folder
It doesn't matter if it's GCE or another server; they are both remote servers. You need to:
1- SSH into your remote server.
2- Make sure that you have gsutil installed and that you are authorized to manage the bucket.
3- Copy the files from that remote server to the bucket:
gsutil cp -r /path/to/remote_folder gs://mybucket
You can also use the -m flag, which performs a parallel multi-threaded/multi-processing copy and makes the transfer even faster.
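For example:
gsutil -m cp -r /path/to/remote_folder gs://mybucket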
Check this doc for more info.

How to AWS s3 sync from remote server to s3 bucket?

I am a newbie with AWS and servers in general. Hope someone can help me :)
What I want to do:
I have some image files on a remote server and I would like to copy them to an AWS S3 bucket.
I can only ssh into this remote server after I ssh into an intermediate server,
so it looks like
me (local) <-----> server1 <-----> server2 (where the image files are)
What I did:
* I first ssh'd into server1.
* I installed aws-cli on that server and configured it.
* Then I ran aws s3 sync <username-I-use-for-server2>@<server2-IP>:/images/pc s3://<s3 bucket name>
Problem:
What I expected was to get a password prompt from server2 and then sync the files in /images/pc to the S3 bucket,
but I got an error saying The user-provided path <username-I-use-for-server2>@<server2-IP>:/images/pc does not exist.
Can anyone tell me what I am doing wrong?
I've checked that I can ssh into server2 and that there are files inside /images/pc as well.
Any help would be appreciated!!
The path is expected to be a local path on the current server.
The available combinations are <LocalPath> <S3Uri> or <S3Uri> <LocalPath> or <S3Uri> <S3Uri>.
You will need to run this command against a local path on the server you're trying to copy the objects from.
Remember that all this command is actually doing is calling the PutObject API (aws s3api put-object) under the hood, along with some management to perform operations such as recursion on your local disk. It does not handle any SSH interactions.
If you wanted to share any assets between servers, you would need to mount them between the servers (for example using an NFS mount) before you are able to sync them up; otherwise, run the command on the other server.
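A minimal sketch of the second option: install and configure aws-cli on server2 itself, then run the sync against the local path there:
aws s3 sync /images/pc s3://<s3 bucket name>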

Cannot copy file from local machine to AWS ec2 instance

In my Amazon EC2 instance, my application is located at /usr/local/apps/[my_app]. My problem is that I cannot transfer a file from my local machine to the remote server. Suppose I have a file named "dog.txt" in my local directory. Over an sftp connection with my Amazon EC2 instance, I can put dog.txt successfully; however, when I jump into the /usr folder, I cannot put dog.txt anymore. It says:
Uploading dog.txt to /usr/dog.txt
remote open("/usr/dog.txt"): Permission denied
Please help
It has nothing to do with AWS or EC2; it is a simple UNIX permissions concept. To work around it in a simple way, just sftp all your files to /tmp on the remote server first.
To solve your problem directly, execute this command:
$ chmod -R 777 /usr/local/apps/[my_app]
After this command is executed, you can copy files there directly from your local machine or from /tmp.
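A sketch of the /tmp route (the key file and host are placeholders; sudo assumes your user has the privilege to move files under /usr):
sftp -i yourkey.pem user@your-ec2-host
sftp> put dog.txt /tmp/dog.txt
sftp> exit
Then, in an ssh session on the server:
sudo mv /tmp/dog.txt /usr/local/apps/[my_app]/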

How to download my application file from AWS

I have my app stored on AWS. How can I download all those files from AWS to my local system?
I know how to upload a file to AWS using the command-line scp, but I don't know how to download from AWS.
You can use scp for downloading too.
Upload: scp your-local-file yourId@yourAWShost:/directory/
Download: scp yourID@yourAWShost:/directory/filename /your/local/host/directory/
By the way, if you're using a ".pem" key with it, you need to add the "-i" option too.
For example (download from yourAWShost):
scp -i yourkey.pem yourID@yourAWShost:/directory/filename /your/local/host/directory/
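To pull down a whole directory, such as an entire app, scp's -r flag recurses:
scp -i yourkey.pem -r yourID@yourAWShost:/directory/ /your/local/host/directory/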

Transferring Files between two EC2 Instances in the same region

I have 2 EC2 instances running Ubuntu 14.04 and I need to figure out how to transfer files from one to another. I read Amazon's FAQs, which say that I can do this without incurring any additional costs if I use the private IP, but I am not sure how to transfer the files using it.
Right now I use scp to do this:
scp -i ~/Path-To-Key-File/AAA.pem /path/file ec2-user@<Elastic IP>:/path/file
I tried replacing the elastic IP with private IP but it doesn't work. Am I doing something wrong here?
Actually, I figured it out. I just needed to replace the Elastic IP with the private IP and configure the security groups properly to allow the instances to communicate!
Transferring from Machine A to Machine B
I am running this code on machine A
scp -i ~/Path-To-Key-File/AAA.pem /path/file ec2-user@<Private IP of Machine B>:/path/file
For the security groups, I had to allow the SSH protocol over the private IP (from Machine B)!!
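A sketch of that rule with the AWS CLI (the group IDs are placeholders; this allows inbound SSH to Machine B from Machine A's security group):
aws ec2 authorize-security-group-ingress --group-id <machine-B-sg-id> --protocol tcp --port 22 --source-group <machine-A-sg-id>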
Assuming both of your instances are EC2 Linux instances:
Suppose you want to transfer a file from the second instance (ec2-2) to the first instance (ec2-1). The command, run on ec2-1, should be:
scp -i /Path-To-Key-File-for-ec2-2/key.pem ec2-user@Elastic-IP-of-ec2-2:/path/filename your/local-path-on-ec2-1/filename
You can find a corresponding discussion here.
Hope this helps!!
This question asks about authentication with the .pem file, but access without that key could be helpful in some cases. Here, you will authorize another machine instead.
Say you would like to ssh or scp from machine-1 to machine-2.
On machine-1:
Check if there is a public key file (id_rsa.pub) in USER_HOME/.ssh/. If not, generate it with the ssh-keygen -t rsa command.
On machine-2:
Uncomment PubkeyAuthentication yes in /etc/ssh/sshd_config.
Open the file USER_HOME/.ssh/authorized_keys and append the contents of the id_rsa.pub file from machine-1.
Now you can copy it with scp as following:
scp username_machine1@ip_machine1:/file/to/copy /destination/path
You are done. Enjoy!!!
For detailed information please check here.
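If ssh-copy-id is available on machine-1, it automates the append step above (user and host are placeholders matching the example):
ssh-copy-id -i ~/.ssh/id_rsa.pub username_machine2@ip_machine2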
scp -i /home/centos/b1.pem centos@ip:/etc/httpd/conf/httpd.conf httpd.conf.j2
Copy data from local to EC2, or from one EC2 instance to another (if you are inside the source EC2):
scp -ri <key file path> <data file location> <Public DNS (IPv4)>:~/
Example:
scp -ri practical.pem serverdata1.tar ubuntu@ec2-xx-xxx-xxx-xxx.ap-southeast-1.compute.amazonaws.com:~/