Upload file from my laptop to EC2 instance - amazon-web-services

I am trying to upload a file from my laptop to an EC2 instance.
I am trying with this:
$ scp -i ec2_instance_key.pem ~/WebstormProjects/RESTAPI/config.js ec2-user@ec2-xxx.eu-west-x.compute.amazonaws.com:~/data/
When I launch the command, the terminal responds with:
scp: /home/ec2-user/data/: Is a directory
But when I run this in the EC2 terminal:
$ cd /home/ec2-user/data
it responds with "no such file or directory".
And the file gets copied onto my laptop again, in the same path as ec2_instance_key.pem.
What is the problem?
Thank you very much.

The probable cause is:
There is no directory named data in the home directory of ec2-user,
OR a file named data was mistakenly created in the home directory during a previous scp.
Solution:
Check whether a file named data exists in the home directory of ec2-user.
If it exists, move the file ~/data to some other name.
Create a directory named data in the home directory of ec2-user.
Give proper access permissions to the newly created data directory (chmod 755 ~/data; a directory needs the execute bit to be entered, so 644 is not enough).
Try uploading the file again using the following command:
scp -i ec2_instance_key.pem ~/WebstormProjects/RESTAPI/config.js ec2-user@ec2-xxx.eu-west-x.compute.amazonaws.com:~/data
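The steps above can be sketched as a short shell script. It uses a mktemp scratch directory in place of /home/ec2-user, so the paths are illustrative; run the equivalent commands on the EC2 host itself:

```shell
#!/bin/bash
# Illustrative sketch of the checks above; a mktemp scratch directory
# stands in for /home/ec2-user.
HOME_DIR="$(mktemp -d)"

# If a plain file named "data" is in the way, move it aside.
if [ -f "$HOME_DIR/data" ]; then
  mv "$HOME_DIR/data" "$HOME_DIR/data.bak"
fi

# Create the target directory. A directory needs the execute bit to be
# entered, so use mode 755 rather than 644.
mkdir -p "$HOME_DIR/data"
chmod 755 "$HOME_DIR/data"

ls -ld "$HOME_DIR/data"
```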

Related

SCP won't find file even though it exists

I'm trying to SCP some files from a remote server. The command I'm using is:
scp -r -i ~/path/to/key ubuntu@address:/home/ubuntu/analysis .
I receive an error:
scp: /home/ubuntu/analysis: No such file or directory.
If I scp another file in my home directory, it does work, e.g.
scp -r -i ~/path/to/key ubuntu@address:/home/ubuntu/.viminfo .
If I create a new file, e.g. with touch new_file.txt, I also cannot download that file.
The permissions and owners of .viminfo and the analysis directory are standard.
Why isn't SCP working? I have been able to download files from this server before, but something has changed.
Quite confusing - any advice would be appreciated!
Thanks!

cannot open path of the current working directory: Permission denied when trying to run rsync command twice in gcloud

I am trying to copy the data from the file stores to the google cloud bucket.
This is the command I am using:
gsutil rsync -r /fileserver/demo/dir1 gs://corp-bucket
corp-bucket: Name of my bucket
/fileserver/demo/dir1: Mount point directory (This directory contain the data of the file store)
This command works fine the first time: it copies the data of the directory /fileserver/demo/dir1 to the cloud bucket. But when I delete the data from the cloud bucket and run the same command again without any changes, I get this error:
cannot open path of the current working directory: Permission denied
NOTE: If I make even a small change to a file in /fileserver/demo/dir1 and run the above command, it works fine again. My question is why it does not work without any changes, and is there any way to copy the files without making changes.
Thanks.
You may be hitting limitation #2 of gsutil rsync: "The gsutil rsync command considers only the live object version in the source and destination buckets". You can exclude /dir1... with the -x pattern and still let rsync do the clean-up work as part of the regular sync process.
Another way to copy those files is to use gsutil cp with the -r option to copy recursively, instead of rsync.

Trying to copy folder into AWS EC2 instance: Getting no directory found?

So I am trying to SSH into an EC2 instance and copy a folder from my desktop into the instance.
Command typed: scp -i -r prac1.pem SocialTrends ubuntu@[ec2-54-1....amazonaws.com]:socialtrendsApp/app
Error: Warning: Identity file -r not accessible: No such file or directory
I am typing this command from the ~/SocialTrends directory, which is what I am trying to copy. This folder contains the code files and the prac1.pem file.
What am I doing wrong please help?!
You have placed the command-line option -r between -i and prac1.pem, so scp reads -r as the identity file. Move it so it comes after prac1.pem.
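A quick way to see why the order matters: an option like -i consumes the very next word as its argument. The sketch below simulates that parsing with the shell's getopts builtin; parse_flags is a made-up stand-in for illustration, not part of scp itself:

```shell
#!/bin/bash
# Simulate scp-style option parsing to show why "-i -r prac1.pem" fails:
# -i takes the next word as its argument, whatever that word is.
parse_flags() {
  local OPTIND=1 opt identity="" recursive="no"
  while getopts "i:r" opt; do
    case "$opt" in
      i) identity="$OPTARG" ;;
      r) recursive="yes" ;;
    esac
  done
  echo "identity=$identity recursive=$recursive"
}

# Wrong order: -i consumes "-r", so "-r" becomes the identity file
parse_flags -i -r prac1.pem        # → identity=-r recursive=no

# Correct order: -i gets the key file and -r enables recursion
parse_flags -i prac1.pem -r        # → identity=prac1.pem recursive=yes
```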

Downloading folder from GCS to local directory

I'm trying to download a folder from my Cloud Storage bucket to a local directory using the command gsutil cp -r gs://bucket/my_folder . but it is showing OSError: Access is denied. Any idea how to get around this problem?
I can reproduce this error if I do not have permissions to create LOCAL_DEST_DIR on my local machine.
$ gsutil cp -r gs://BUCKET_NAME/DIR_IN_BUCKET LOCAL_DEST_DIR
Copying gs://BUCKET_NAME/DIR_IN_BUCKET/FILE...
OSError: Permission denied.
Please check you have permissions to create a file/directory in your current working directory.
You can run touch test-file.text to verify if you're able to create files in the current directory.
If you're on linux/*nix/mac, usually you will have full permissions to create files and directories in your $HOME directory, so you can try running the gsutil command in that directory.
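The touch test suggested above can be wrapped in a tiny script to run before invoking gsutil; check_writable is a hypothetical helper name and the probe filename is arbitrary:

```shell
#!/bin/bash
# Check whether a directory is writable by creating and removing a
# probe file, as suggested above.
check_writable() {
  local dir="$1" probe
  probe="$dir/.gsutil-write-probe"
  if touch "$probe" 2>/dev/null; then
    rm -f "$probe"
    echo "writable"
  else
    echo "not writable"
  fi
}

check_writable "$PWD"
```

If it prints "not writable", cd to a directory you own (e.g. $HOME) and run the gsutil command from there.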

EC2 - chmod: cannot access ‘mypemfile.pem’: No such file or directory

I downloaded the pem file while launching a t2.small instance. When I try to connect via ssh, it says there is no such file in the directory. But I am sure that the pem file is in the directory.
$ ls
mypemfile.pem
$ chmod 400 mypemfile.pem
chmod: cannot access ‘mypemfile.pem’: No such file or directory
$ ssh -i "mypemfile.pem" root@x.x.x.xx
Warning: Identity file mypemfile.pem not accessible: No such file or directory.
Permission denied (publickey).
How can I track down this issue? And are there any solutions?
Note: I created instance from AMI image shared by another account.
For sure the problem is that there is no "mypemfile.pem" file.
Recheck that the file is available; if it is, try renaming it or making a copy of it, and try again with the newly created file.
Hope it helps.
I figured it out on Mac. When you created the private key, the Mac saved it as whatever.pem.txt, so to connect to the AWS instance just add the .txt extension to whatever the AWS instructions tell you to use. For example:
chmod 400 yourfile.pem.txt
ssh -i "yourfile.pem.txt" ubuntu@ecX-XX-XX-XXX-XXX.compute-1.amazonaws.com
This is for Mac users.
Best,
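Instead of typing the .txt extension every time, you can rename the key once. This sketch uses a dummy file created with touch in a scratch directory, so the key name is illustrative; substitute your real key:

```shell
#!/bin/bash
# Demonstrate renaming a key that the browser saved with a hidden .txt
# extension; a dummy file stands in for the real key.
workdir="$(mktemp -d)"
touch "$workdir/mypemfile.pem.txt"   # what actually landed on disk

# Strip the extra extension, then restrict permissions as ssh requires.
mv "$workdir/mypemfile.pem.txt" "$workdir/mypemfile.pem"
chmod 400 "$workdir/mypemfile.pem"

ls -l "$workdir"
```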