Amazon Web Services: upload a WAR - amazon-web-services

I want to deploy x.war to my Tomcat on AWS, but I can't manage to upload the file.
I ran this in my console:
scp -i x.pem /home/user/folder/x.war ubuntu@52.16.43.22/folder
but scp told me it is not a directory. The directory /folder does exist on my AWS instance.

You forgot the : between the server and the remote file path:
scp -i x.pem /home/user/folder/x.war ubuntu@52.16.43.22:/folder
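A quick way to sanity-check the argument shape before touching a server is to assemble the command from its parts (paths and host here are the question's own placeholders) and print it:

```shell
# scp remote targets take the form user@host:path - the colon separates
# the host from the remote path. Without it, scp treats the whole
# "ubuntu@52.16.43.22/folder" string as a local filename, hence the
# "not a directory" error.
KEY=x.pem
SRC=/home/user/folder/x.war
DEST="ubuntu@52.16.43.22:/folder"
echo scp -i "$KEY" "$SRC" "$DEST"
```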

Related

Cannot copy file from local machine to AWS EC2 instance

On my Amazon EC2 instance, my application is located at /usr/local/apps/[my_app]. My problem is that I cannot transfer files from my local machine to the remote server. Suppose I have a file named "dog.txt" in my local directory. Over an sftp connection to my EC2 instance I can put dog.txt successfully; however, when I change into the /usr folder, I can no longer put dog.txt. It says:
Uploading dog.txt to /usr/dog.txt
remote open("/usr/dog.txt"): Permission denied
Please help.
This has nothing to do with AWS or EC2; it is basic UNIX file permissions. The simplest workaround is to sftp all your files to /tmp on the remote server first.
To solve the problem directly, execute this command:
$ chmod 777 -R /usr/local/apps/[my_app]
After it runs, you can copy files directly from your local machine or from /tmp.
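The permission mechanics can be reproduced locally. This sketch uses a made-up stand-in for /usr/local/apps/[my_app]; note that 777 makes the tree world-writable, and on a real server a narrower fix (e.g. chown-ing the app directory to your login user) is usually preferable:

```shell
# Create a local stand-in for the app directory and open it up with
# the recursive chmod the answer suggests, then read back the mode.
mkdir -p apps/my_app
chmod -R 777 apps/my_app
stat -c '%a' apps/my_app   # prints the octal mode: 777
rm -rf apps                # clean up the demo directory
```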

How to download my application file from AWS

I have my app stored on AWS. How can I download all of those files from AWS to my local system?
I know how to upload a file to AWS with scp on the command line, but I don't know how to download from AWS.
You can use scp for downloading too.
Upload: scp your-local-file yourId@yourAWShost:/directory/
Download: scp yourID@yourAWShost:/directory/filename /your/local/host/directory/
By the way, if you're using a ".pem" key, you need to add the "-i" option too.
For example (download from yourAWShost):
scp -i yourkey.pem yourID@yourAWShost:/directory/filename /your/local/host/directory/

How do I configure AMS admin permissions on EC2 AWS?

I created a fresh Adobe Media Server 5 (aka Flash Media Server) instance on Amazon EC2 AWS, I connected over SSH and changed the default password, everything looks ok. But the amsadmin account doesn't seem to have enough access to administrate the instance. I can read most of the files and folders (access denied for /lib; /lost+found and /root).
I can connect ok
ssh -i key.pem amsadmin@public-dns.amazonaws.com
I can upload to the media folder
scp -i key.pem test.f4v amsadmin@public-dns.amazonaws.com:/mnt/applications/vod/media
I can't upload content to the webroot; for example, adding a crossdomain.xml file to the web root fails:
scp -i key.pem crossdomain.xml amsadmin@public-dns.amazonaws.com:/mnt/webroot/
I can't create a directory; I want to set up S3 to store content, and step 5 says to:
Create a directory on the instance to mount the S3 bucket, for example, /mnt/s3fs.
But under the amsadmin account I get an access denied error, is this expected?
How can I get access to these restricted areas?
Update:
I tried executing the commands with sudo; even after entering the correct password for amsadmin, I got "Sorry, try again".
Do I need to setup a root account?
Figured it out: I needed to run the commands using sudo sh after connecting to the instance via SSH.

Deploy Openfire XMPP in AWS Elastic Beanstalk

I have installed Openfire on my Windows system and it works fine. Now I want to deploy the Openfire XMPP server on AWS Elastic Beanstalk. I have been searching the web with no solutions.
How do I deploy Openfire XMPP in AWS Elastic Beanstalk?
You can create a custom AMI with Openfire installed, then create your Beanstalk environment from that AMI.
You can also install it in your existing Beanstalk environment:
Log in to your AWS environment/instance using PuTTY. To log in you need the .pem file converted to .ppk.
Using wget, download Openfire from its site (a tar.gz file).
Extract the downloaded tar into /usr/local/, e.g. tar -zxvf openfire.tar.gz
cd /usr/local/openfire/bin/ (go into the bin folder)
sh openfire start (start Openfire)
This will start the server.
Check its status using sh openfire status from the bin folder.
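The download-and-extract steps above can be rehearsed locally. This sketch substitutes a dummy archive for the real tar.gz that wget would fetch from the Openfire site:

```shell
# Build a dummy openfire-style archive, then extract it the same way
# (tar -zx...) the steps describe, into a scratch directory.
mkdir -p openfire/bin
echo 'echo "openfire started"' > openfire/bin/openfire
tar -zcf openfire.tar.gz openfire
mkdir -p /tmp/extract_demo
tar -zxf openfire.tar.gz -C /tmp/extract_demo
ls /tmp/extract_demo/openfire/bin   # the extracted startup script
rm -rf openfire openfire.tar.gz /tmp/extract_demo   # clean up
```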

Uploading file to AWS from local machine

How do I use the scp command to upload a file to an AWS server?
I have the .pem file in /Downloads on my local machine.
I am trying to copy a file to the /images folder on the AWS server.
What command can I use?
Thanks,
You can use plain scp:
scp -i ~/Downloads/file.pem local_image_file user@ec2_elastic_ip:/home/user/images/
You need to assign an Elastic IP to the EC2 instance, open port 22 to your local machine's IP in the instance's security group, and use the right user (it can be ec2-user, admin, or ubuntu; check the AMI documentation).
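The security-group step can be done from the AWS CLI as well as the console. This is a sketch only: the group ID and CIDR below are hypothetical placeholders you would replace with your own values.

```shell
# Allow inbound SSH (TCP 22) from one specific local machine IP.
# sg-0123456789abcdef0 and 203.0.113.10/32 are made-up placeholders.
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp \
  --port 22 \
  --cidr 203.0.113.10/32
```

Restricting the CIDR to a single /32 address, rather than 0.0.0.0/0, keeps the SSH port closed to everyone but you.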
Diego's answer works. However, if you don't know your Elastic IP, you can simply scp using the following command (check the order of the arguments):
scp -i path-to-your-identifier.pem file-to-be-copied ubuntu@public-IP:/required-path
Just for reference: here ubuntu is your AWS user and public-IP is something like 54.2xx.xxx.xxx, e.g. 54.200.100.100.
(If the order is messed up, with the filename before the identifier, you'll get a "Permission denied (publickey). lost connection" error.)
Also, keep in mind the permissions of the .pem file: it should be 400 or 600, not readable by everyone.
Hope it helps!
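The .pem permission advice can be verified locally; the key file here is an empty stand-in:

```shell
# ssh/scp refuse private keys that other users can read, so lock the
# file down to owner read-only (mode 400).
touch demo-key.pem
chmod 400 demo-key.pem
stat -c '%a' demo-key.pem   # prints 400
rm -f demo-key.pem          # clean up the stand-in key
```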
There are a number of ways to achieve what you want:
use s3cmd http://s3tools.org/s3cmd
or use Cyberduck http://cyberduck.ch/
or write a tool using the Amazon Java API
You can try the kitten utility, which is a wrapper around boto3. You can easily upload/download files and run commands on an EC2 server, or on multiple servers at once for that matter.
kitten put -i ~/.ssh/key.pem cat.jpg /tmp [SERVER NAME][SERVER IP]
where the server name is e.g. ubuntu or ec2-user.
This will upload the cat.jpg file to the /tmp directory of the server.
This is the correct way to upload from local to remote:
scp -i "zeus_aws.pem" ~/Downloads/retail_market_db-07-01-2021.sql ubuntu@ec2-3-108-200-27.us-east-2.compute.amazonaws.com:/var/www/
Another alternative to scp is rsync.
Some of the benefits of rsync:
faster - it uploads only the deltas
can compress data in transit
can exclude some files from the upload
can resume interrupted transfers
can limit the transfer bandwidth
The rsync command:
rsync -ravze "ssh -i /home/your-user/your-key.pem" --exclude '.env' --exclude '.git/' /var/www/your-folder-to-upload/* ubuntu@xx.xxx.xxx.xxx:/var/www/your-remote-folder
Now, in case you find this syntax a little verbose, you can use aws-upload, which does all of the above with tab completion.
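The --exclude behavior can be demonstrated locally by dropping the ssh transport and syncing between two made-up directories:

```shell
# Sync a folder with rsync's archive/compress flags; .env is skipped,
# everything else is copied.
mkdir -p site dst
echo 'SECRET=1' > site/.env
echo '<html></html>' > site/index.html
rsync -raz --exclude '.env' site/ dst/
ls -A dst      # index.html only; .env was excluded
rm -rf site dst   # clean up the demo directories
```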