Dynamic usage of AWS S3

I am trying to explore AWS S3, and I found out that we can store data and get a URL for a file which can be used on a website. But my intention is to store files on S3 and have users of my website post and retrieve files to/from S3 without my intervention. I am planning to have my server and JSP/Servlet pages on EC2, where Tomcat (and a MySQL server) will be running.
Is this possible, and if yes, how can I achieve it?
Thanks,
SD

Yes, it's possible. A full answer to this question would be tantamount to a consulting gig, but here are some resources that should get you started:
The S3 API
Elastic Beanstalk for your webtier
Amazon RDS for MySQL

Accessing Amazon S3 via FTP?

I have done a number of searches and can't seem to work out whether this is doable at all.
I have a data logger that has an FTP-push function. The FTP-push function has the following settings:
FTP server
Port
Upload directory
User name
Password
In general, I understand that a FileZilla client (I have a Pro edition) is able to drop files into my AWS S3 bucket, and I have done this successfully from my local PC.
Is it possible to remove the FileZilla client requirement and input my S3 information directly into my data logger? Something like the diagram below:
Data logger ----FTP----> S3 bucket
If not, what would be the most sensible method to have my data logger's JSON files drop into AWS S3 via FTP?
Frankly, you'd be better off with:
Logging to local files
Using a schedule to copy the log files to Amazon S3 with the aws s3 sync command
The schedule could be triggered by cron (Linux) or a Scheduled Task (Windows); a sample cron entry is sketched below.
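For example (the log directory and bucket name here are placeholders you would adjust to your own setup), a crontab entry that pushes the logger's JSON files to S3 every five minutes could look like this:

    # Placeholder paths/bucket: sync the data logger's output folder to S3 every 5 minutes
    */5 * * * * aws s3 sync /var/log/datalogger s3://your-log-bucket/logs/ --exclude "*" --include "*.json"

This assumes the AWS CLI is installed on the machine and configured with credentials that are allowed to write to the bucket.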
Amazon recently added FTP support to AWS Transfer. This provides an integration with Amazon S3 via FTP without setting up any additional infrastructure; however, you should review the pricing before committing to it.
As an alternative, you could create an intermediary server that syncs between itself and Amazon S3 using the aws s3 sync CLI command.

What AWS instance should I use?

I have a single HTML landing page and I expect around 50,000 to 100,000 visitors per day
(no server-side code),
only HTML and a little bit of JavaScript.
So what AWS instance type should I use so my webpage will not crash? Right now I am on the free tier: a t2.micro running Windows Server 2016. Do I need to upgrade, or is this good enough?
Thanks.
Using AWS S3 Only
For static page hosting you can use AWS S3. You need to create an S3 bucket and enable static website hosting. For more details, refer to Example Walkthroughs - Hosting Websites on Amazon S3.
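As a rough sketch (the bucket name and region are placeholders, and this assumes the AWS SDK for JavaScript v2 with credentials already configured in the environment), enabling static website hosting on an existing bucket looks roughly like this:

    // Sketch: enable static website hosting on an existing, already-created bucket.
    // "my-landing-page-bucket" and the region are placeholders.
    const AWS = require('aws-sdk');
    const s3 = new AWS.S3({ region: 'us-east-1' });

    s3.putBucketWebsite({
      Bucket: 'my-landing-page-bucket',
      WebsiteConfiguration: {
        IndexDocument: { Suffix: 'index.html' },
        ErrorDocument: { Key: 'error.html' }
      }
    }, (err) => {
      if (err) console.error('Failed to enable website hosting:', err);
      else console.log('Static website hosting enabled');
    });

You would still need to upload index.html and make the objects publicly readable (for example with a bucket policy); the walkthrough linked above covers both steps, and you can equally do all of this from the S3 console instead of code.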
Using AWS S3 & CloudFront
Since you are expecting more traffic, you can reduce the cost and improve performance by using AWS CloudFront, which caches the content at the edge locations of the content delivery network. You can also set up free SSL certificates issued by AWS Certificate Manager if you use CloudFront.
If there is no backend code, then you can do it using just S3 and CloudFront.

Can I use AWS S3 for hosting, and EC2 to process my form submits?

So I am thinking of migrating my website to Amazon S3 since it's super cheap and fast; however, I use PHP and AJAX to submit my contact forms. Would it be possible to host my site using AWS S3 and then send all HTTP POSTs to an EC2 instance?
Yes, this is very well possible. However, if you're running an EC2 instance anyway and your traffic is not enormous, you might as well serve your static files from your EC2 instance.
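For illustration (the endpoint URL and form id are placeholders), the static page hosted on S3 would submit the form with an AJAX call to the PHP endpoint running on EC2; since the page and the API then live on different origins, the EC2 side has to return the appropriate CORS headers:

    // Sketch: submit the contact form from the S3-hosted page to a PHP endpoint on EC2.
    // The URL and form id are placeholders; the PHP side must send CORS headers
    // (e.g. Access-Control-Allow-Origin) because the origins differ.
    document.querySelector('#contact-form').addEventListener('submit', async (event) => {
      event.preventDefault();
      const response = await fetch('https://api.example.com/contact.php', {
        method: 'POST',
        body: new FormData(event.target)   // name, email, message, etc.
      });
      if (!response.ok) {
        console.error('Form submission failed:', response.status);
      }
    });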
It is not possible to host a PHP site on AWS S3; only static content like images, CSS, or JS can be put there.
For dynamic content you have to make use of an EC2 instance.
https://forums.aws.amazon.com/message.jspa?messageID=453142
Correct Usage of Amazon Web Services S3 for Server-Side Scripting

AWS service for website

I have a website which I want to host on AWS. My website stores its data in a back-end RDBMS/MongoDB and uses PHP/JavaScript/Python, etc.
My website will be receiving data from users, and I will be using that data for analysis. I don't want to do any installation myself.
Which is best for my requirement: AWS S3 or AWS EC2?
S3 is just file storage; you can't run dynamic applications (PHP/Python/etc.) or databases on S3. You probably need to run your application on EC2, your database on either EC2 or RDS, and store your application's static files on S3.

File upload API on EC2 with ELB and S3

I am developing an app server with Node.js and AWS.
I am setting up the server environment with an ELB and EC2 instances.
I am using the ELB as a load balancer and have attached several app-server EC2 instances to it.
One more EC2 instance is used for MongoDB.
My question is about requests that include a file upload.
I think uploaded files should not stay on the app servers (EC2 instances), so I will try to save uploaded files in S3 and allow the app servers to access them.
The rough solution is that when an app server accepts a file from a client, it moves the file to S3 and deletes it from the app server.
But that causes some performance loss, and it doesn't feel like a clean way to do it.
Is this the best way, or is there another way to solve it?
I think it's best to upload the file to S3.
But the file is uploaded together with other data (for example, a profile upload - name: String, age: Number, profileImage: File).
I need to process the other data on the app server, so the client should not upload to S3 directly.
Is there any better idea?
Please save me.
P.S.: Please let me know if you cannot understand my wording, because I am not a native speaker. If so, I will do my best to add some explanation!
You can directly upload to S3 using temporary credentials that allow the end user to write to your bucket.
There is a good article with detailed code for doing exactly what you are trying to do with Node.js here.
Answers that refer to external links are frowned upon on SO, so in a nutshell:
include the aws sdk in your application
provide it with appropriate credentials
use those credentials to generate a signed URL with a short lifespan
provide the end user with the signed URL, which they can then use to upload, preferably asynchronously with progress feedback
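A minimal sketch of the first three steps with the AWS SDK for JavaScript v2 (the bucket name, key scheme and expiry are placeholders; credentials are assumed to come from the environment or an instance role):

    // Sketch: the app server generates a short-lived signed PUT URL; the client
    // then uploads the file straight to S3 with an HTTP PUT to that URL.
    const AWS = require('aws-sdk');
    const s3 = new AWS.S3({ region: 'us-east-1', signatureVersion: 'v4' });

    function getUploadUrl(userId, fileName, contentType) {
      return s3.getSignedUrlPromise('putObject', {
        Bucket: 'my-upload-bucket',              // placeholder bucket name
        Key: `profiles/${userId}/${fileName}`,   // placeholder key scheme
        ContentType: contentType,
        Expires: 300                             // URL valid for 5 minutes
      });
    }

    // Example: the app server handles the other profile fields (name, age) itself,
    // hands the signed URL to the browser, and the browser PUTs profileImage to S3.
    getUploadUrl('user-123', 'avatar.png', 'image/png')
      .then(url => console.log('Client should PUT the file to:', url));

The client then uploads with a plain HTTP PUT to that URL and only sends the resulting object key back to your app server along with the other form fields, so the large file never passes through the app server at all.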