Cannot copy file from local machine to AWS EC2 instance

In my Amazon EC2 instance, my application is located at /usr/local/apps/[my_app]. My problem is that I cannot transfer files from my local machine to the remote server. Suppose I have a file named "dog.txt" in my local directory. Over an SFTP connection to my Amazon EC2 instance, I can put dog.txt successfully; however, when I change into the /usr folder, I can no longer put dog.txt. It says:
Uploading dog.txt to /usr/dog.txt
remote open("/usr/dog.txt"): Permission denied
Please help.

This has nothing to do with AWS or EC2; it is a basic UNIX permissions issue. The simplest workaround is to sftp all your files to /tmp on the remote server first, then move them into place with sudo.
To make the application directory itself writable, execute this command:
$ chmod -R 777 /usr/local/apps/[my_app]
After this command is executed, you can copy files there directly from your local machine or from /tmp. Note, however, that 777 makes the directory writable by every user on the system.
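Since chmod 777 grants write access to everyone on the system, a safer pattern is to give ownership of the app directory to your SSH user and use 755/644 permissions. A minimal local sketch (with /tmp/demo_app standing in for /usr/local/apps/[my_app], and the chown line shown as a comment since it needs sudo on the real server):

```shell
APP_DIR=/tmp/demo_app
mkdir -p "$APP_DIR"
# On EC2 you would run: sudo chown -R ec2-user /usr/local/apps/[my_app]
chmod 755 "$APP_DIR"              # owner can write; everyone else read/traverse only
touch "$APP_DIR/dog.txt"
chmod 644 "$APP_DIR/dog.txt"      # owner read/write; everyone else read-only
stat -c %a "$APP_DIR"             # prints 755
```

With ownership set to your SFTP user, uploads into the directory work without opening it to all users.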

Related

How to copy file from EC2 to local machine?

How do I copy a file or folder from an EC2 instance? I want to download a file from my server but I have no idea how to do it.
P.S. I know how to copy within EC2.
You can use scp to securely copy a file from your EC2 instance to your local machine. You will need three things:
Your ec2key.pem key -- You created this when you created the EC2 instance
Your EC2 username and IP -- You can find this in the EC2 Console ('Connect to Instance' button)
Path to your file
On your local machine, open up your command line, and type:
scp -i ec2key.pem username@xx.xxx.xx.xxx:/path/to/file .
Note that the period at the end signifies that the file is to be saved 'here'.

Copying files from remote (non GCE) server to google cloud platform bucket over SSH

How to copy files from a remote server to google cloud storage bucket over SSH? This question outlines how to copy files from a google compute engine instance to google cloud storage which is not what I need.
The files are too large to download to my local machine before uploading them. Is it possible to do something like this:
ssh -p 2455 skiventist@myremoteserver | gsutil cp -r /path/to/remote_folder gs://mybucket
Error:
CommandException: No URLs matched: /path/to/remote_folder
It doesn't matter whether it's GCE or another server; they are both remote servers. You need to:
1- SSH into your remote server
2- Make sure gsutil is installed there and that you are authorized to manage the bucket
3- Copy the files from that remote server:
gsutil cp -r /path/to/remote_folder gs://mybucket
You can also use the -m flag, which performs the transfer with parallel multi-threading/multi-processing and makes it even faster:
gsutil -m cp -r /path/to/remote_folder gs://mybucket
Check the gsutil cp documentation for more info.

How to AWS s3 sync from remote server to s3 bucket?

I am a newbie on aws and server in general. Hope someone can help me :)
What I want to do:
I have some image files on remote server and I would like to copy them to aws s3 bucket.
I can ssh this remote server once I ssh intermediate server.
so it looks like
me (local) <-----> server1 <-----> server2 (where the image files are)
What I did:
*I first SSH into server1.
*I installed aws-cli on that server and ran aws configure.
*Then I ran aws s3 sync <username-I-use-for-server2>@<server2-IP>:/images/pc s3://<s3 bucket name>
Problem:
What I expected was to get a password prompt from server2 and then have the files in /images/pc synced to the s3 bucket,
but I got an error saying The user-provided path <username-I-use-for-server2>@<server2-IP>:/images/pc does not exist.
Can anyone tell me what I am doing wrong?
I've checked that I can SSH into server2 and that there are files inside /images/pc.
Any help would be appreciated!!
The path is expected to be a local path on the current server.
The available combinations are <LocalPath> <S3Uri> or <S3Uri> <LocalPath> or <S3Uri> <S3Uri>.
You will need to run this command against a local path on the server you're trying to copy the objects from.
Remember that all this command is actually doing under the hood is calling aws s3api put-object, along with some management such as recursing through your local disk. It does not handle any SSH interactions.
If you wanted to share any assets between servers you would need to mount them between servers (for example using an NFS mount) before you are able to sync them up, otherwise run the command on the other server.
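In your setup, the simplest fix is the two-step approach: first copy the files from server2 to server1 over SSH, then sync the now-local directory to S3. A sketch run on server1 (the user, IP, paths, and bucket name below are placeholders; the commands are echoed rather than executed, since they need real hosts and credentials):

```shell
SERVER2=user2@203.0.113.7          # placeholder user/IP for server2
BUCKET=s3://my-bucket              # placeholder bucket name
# Step 1: pull the images from server2 onto this server (prompts for auth)
echo "scp -r ${SERVER2}:/images/pc /tmp/pc"
# Step 2: sync the local copy up to S3
echo "aws s3 sync /tmp/pc ${BUCKET}/pc"
```

The key point is that the first argument to aws s3 sync must be a path on the machine where the command runs, never a user@host:path SSH-style path.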

How can I transfer a file or files from a bucket to the file system of a GCE instance?

The documentation says to download it, but when I try downloading, it downloads to my local workstation. Is there a way to transfer it entirely within the cloud?
To download a file from a GCS bucket to an instance:
SSH to the instance (using the gcloud compute ssh command or clicking the SSH button on the instances page in console.cloud.google.com).
Issue the command gsutil cp gs://$BUCKET/$file .

Uploading file to AWS from local machine

How do I use the scp command to upload a file to an AWS server?
I have a .pem file in /Downloads on my local machine.
I am trying to copy a file to the /images folder on the AWS server.
What command can I use?
Thanks,
You can use plain scp:
scp -i ~/Downloads/file.pem local_image_file user@ec2_elastic_ip:/home/user/images/
You need to assign an Elastic IP to the EC2 instance, open port 22 to your local machine's IP in the EC2 instance's security group, and use the right user: it can be ec2-user, admin, or ubuntu (look at the AMI documentation).
Diego's answer works. However, if you don't have an Elastic IP, you can simply scp using the following command (note the order of arguments):
scp -i path-to-your-identifier.pem file-to-be-copied ubuntu@public-IP:/required-path
Just for reference: here ubuntu is your AWS user and public-IP is something like 54.2xx.xxx.xxx, e.g. 54.200.100.100.
(If the order is messed up, with the filename before the identifier, you'll get a Permission denied (publickey). lost connection error.)
Also, keep in mind the permissions of the .pem file: they should be 400 or 600, not readable by everyone.
Hope it helps!
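The key-permission point above can be checked locally. A small sketch using a throwaway file in place of a real .pem (ssh/scp refuse keys that are readable by other users):

```shell
KEY=$(mktemp /tmp/demo-key.XXXXXX)   # stand-in for your real .pem file
chmod 400 "$KEY"                     # read-only for the owner, nothing for others
stat -c %a "$KEY"                    # prints 400
```

If your key is too open, ssh aborts with a "WARNING: UNPROTECTED PRIVATE KEY FILE!" message, so fixing the mode is worth doing before debugging anything else.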
There are a number of ways to achieve what you want:
use s3cmd http://s3tools.org/s3cmd
or use Cyberduck http://cyberduck.ch/
or write a tool using the Amazon Java API
You can try the kitten utility, which is a wrapper around boto3. You can easily upload/download files and run commands on an EC2 server, or on multiple servers at once for that matter.
kitten put -i ~/.ssh/key.pem cat.jpg /tmp [SERVER NAME][SERVER IP]
where the server name is e.g. ubuntu or ec2-user, etc.
This will upload the cat.jpg file to the /tmp directory of the server.
This is the correct way to upload from local to remote:
scp -i "zeus_aws.pem" ~/Downloads/retail_market_db-07-01-2021.sql ubuntu@ec2-3-108-200-27.us-east-2.compute.amazonaws.com:/var/www/
A better approach could be rsync, an alternative to scp.
Some of the benefits of rsync:
faster - it uploads only the deltas
it can compress data in transit
you can exclude some files from the upload
it can resume interrupted transfers
it can limit the transfer bandwidth
The rsync command:
rsync -ravze "ssh -i /home/your-user/your-key.pem" --exclude '.env' --exclude '.git/' /var/www/your-folder-to-upload/* ubuntu@xx.xxx.xxx.xxx:/var/www/your-remote-folder
Now, in case you find this syntax a little verbose, you can use aws-upload, which does all of the above with tab completion.