How to Upload files to AWS from Ubuntu 15.04 - amazon-web-services

I'm doing a project and have to access AWS in order to maintain files and other things.
I've connected to my AWS instance and now want to upload some files from my own computer; on the instance I can only use the terminal.
My question is: how can I upload those files or directories to my AWS instance from my computer? Or is there another method, such as first uploading the files to a third-party service like GitHub and then downloading or cloning them from there?
Please suggest an easy way of uploading from my own computer, or present me with an alternative.

Pushing your code to GitHub and pulling from GitHub when you SSH into an AWS instance is a good way to deploy.
https://gist.github.com/oodavid/1809044
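A minimal sketch of that push/pull flow, assuming the project is already a git repository on GitHub, that git is installed on the instance, and that the instance has access to the repository (the branch name and paths below are placeholders):

# On your local machine: commit and push your changes
git add -A
git commit -m "deploy latest changes"
git push origin master

# On the EC2 instance (after ssh-ing in): pull the latest code
cd /path/to/your/app
git pull origin master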

Related

How can I transfer my project from a GitHub repository to an AWS server? Which type of AWS service (EC2 or S3) can I use to do so?

I have already created an EC2 instance and connected to it over SSH.
How can I transfer my project from my GitHub repository to the AWS server, and which type of AWS service (EC2 or S3) can I use to do so?
In the GitHub web UI, find the big green "Code" button
and click on Code --> Download ZIP.
Use scp to transfer it to your EC2 instance (a sketch follows below).
Or use the AWS web console to upload the .zip to an S3 bucket.
Or install the AWS CLI and use
$ aws s3 cp myproject.zip s3://mybucket/
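A hedged sketch of the scp step mentioned above, with placeholder key path, file name and hostname (substitute your own; the default login user is often "ubuntu" or "ec2-user" depending on the AMI):

# copy the zip from your local machine to the instance's home directory
scp -i /path/to/key.pem myproject.zip ubuntu@ec2-xx-xx-xxx-xxx.compute.amazonaws.com:~/

# then, on the instance, unpack it
unzip myproject.zip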
I would say use the git clone command for your project. If you're unable to create a key, you can use git clone over HTTPS instead, which works the same way but requires you to enter your password for every pull or push.
Regarding the instance: if you're deploying an application you should use an EC2 instance, whereas S3 is object storage, more often used for storing files such as images. To run actual code you should use EC2.
The best way to do this is by using the git clone command.
Log in to your EC2 instance via SSH.
ssh -i "/path/to/pem/file.pem" remote-user@ec2-xx-xx-xxx-xxx.us-xxxx-x.compute.amazonaws.com
After logging in, run the following:
git clone <link to repo>
Please note that the link to the repository is available once you click the Code button on your repo page.
The other way of doing this, i.e., downloading the repo and then uploading it to your EC2 instance via scp, is inefficient. Secure copy (scp) is only useful if you want to copy a project that already resides on your local machine; otherwise there is no point in first downloading and then uploading the whole project.
Also, I would not recommend putting your code base on S3, as it is not made for this purpose. It is better if your project resides on EC2. If you want your environment to persist when you stop and start your instance again, use an EBS volume.

Icecast server with AWS S3 files

I'm currently running an Icecast server for streaming audio on an EC2 instance.
Currently all my .mp3 files are stored on the EC2 instance, and I want to move them to AWS S3 for storage. So far I've only been able to find scripts that will update the playlist but won't make the server request external sources.
Is it possible to set up this architecture? Any help would be appreciated.
How about mounting the S3 bucket as a directory and just using that?
https://github.com/s3fs-fuse/s3fs-fuse / https://github.com/s3fs-fuse/s3fs-fuse/wiki/Fuse-Over-Amazon
https://github.com/russross/s3fslite
Since you only read the files, this should work without major issues.
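A rough sketch of mounting a bucket with s3fs-fuse on Ubuntu, assuming a hypothetical bucket name "my-audio-bucket" and AWS credentials already stored in ~/.passwd-s3fs (format ACCESS_KEY_ID:SECRET_ACCESS_KEY, chmod 600):

# install s3fs and mount the bucket read-only at ~/s3-audio
sudo apt-get install s3fs
mkdir -p ~/s3-audio
s3fs my-audio-bucket ~/s3-audio -o passwd_file=${HOME}/.passwd-s3fs -o ro

Icecast (or your playlist scripts) can then read the .mp3 files from ~/s3-audio as if they were local, though latency will be higher than local disk.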

Is it possible to copy files from amazon aws s3 directly to remote server?

I receive some large data to process and I would like to copy the files to my remote GPU server for processing.
The data consists of 8000 files of about 9 GB each, which is quite large.
Is it possible to copy the files from AWS directly to the remote server (accessed over SSH)?
I have googled this and did not find anyone asking the same question.
If anyone could kindly provide a guide or example URL I would appreciate it a lot.
Thanks.
I assume your files are residing in S3.
If that is the case, then you can simply install the AWS CLI on your remote machine and use the aws s3 cp command.
See the AWS CLI documentation for aws s3 cp for more details.
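A minimal sketch, assuming the files live under a hypothetical prefix s3://mybucket/data/ and that the AWS CLI has been installed and configured (aws configure) on the remote GPU server:

# copy a single object
aws s3 cp s3://mybucket/data/file_0001.bin /data/incoming/

# or recursively copy the whole prefix (sync skips files that already exist locally)
aws s3 sync s3://mybucket/data/ /data/incoming/

Because the download runs directly on the remote server, the 8000 files x 9 GB never have to pass through your own machine.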

AWS EB and S3: DL and manipulate archive content from the cloud

I am developing a web app which will be hosted in AWS EB. So far I am done with the entire back-end and am currently working on the build for deployment.
One thing bothers me, however: my back-end accesses data from an AWS S3 bucket, downloads the files locally (!), works its magic on them, and uploads the result to a different bucket.
Since the data is currently downloaded to a local folder on my computer, which won't be available once deployed, how can or should I modify this to make it run on the AWS EB instance? I absolutely need to download the files (archives) to extract and modify them. How can I achieve this in the cloud? Or am I looking at it the wrong way?
Any help would be appreciated; please be gentle, I'm kind of new to the entire EE world...
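One possible shape of that workflow when it runs on the EB instance itself rather than on your computer, sketched here with the AWS CLI and hypothetical bucket names (your back-end would do the equivalent through its SDK); the key change is downloading into a temporary directory on the instance instead of a local folder:

# work in a throwaway directory on the instance
WORKDIR=$(mktemp -d)
aws s3 cp s3://my-input-bucket/archive.zip "$WORKDIR/"
unzip "$WORKDIR/archive.zip" -d "$WORKDIR/extracted"
# ... modify the extracted files here ...
(cd "$WORKDIR" && zip -r result.zip extracted)
aws s3 cp "$WORKDIR/result.zip" s3://my-output-bucket/
rm -rf "$WORKDIR"   # free the instance's disk space again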

Easier way to update files on Amazon EC2 instance?

Major newbie when it comes to Amazon EC2 servers, and web development in general.
At the moment I have a web app that is hosted on Parse. Everything is done on the client side in the browser, and I want to change it to a client-server model by writing a server in Node.js.
I've looked into Amazon EC2, and I've set up an instance and it looks good. My question, however, is:
Is there an easier way to update files on the instance? At the moment I'm pushing all the files from my computer to a GitHub repo, then pulling them onto the instance, which seems very long-winded. When using Parse, all I needed to type was 'parse deploy' into the command line to update and deploy my application; is there something like this for Amazon EC2?
Thank you
I typically install or enable FTP on my EC2 instances and then just use the FTP client of my choice to update files.
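A rough sketch of that FTP setup on an Ubuntu instance, assuming vsftpd as the server and that the relevant ports (21 plus a passive range) are opened in the instance's security group:

# install and enable an FTP server on the instance
sudo apt-get update
sudo apt-get install vsftpd
sudo nano /etc/vsftpd.conf      # set write_enable=YES so uploads are allowed
sudo systemctl restart vsftpd

A simpler alternative is to skip FTP entirely and point an SFTP client at the instance's existing SSH key, since SFTP runs over the SSH port that is already open.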