Updating an S3 website with zero downtime

What would be the best approach to updating a static website hosted in an S3 bucket so that there is no downtime? The updates will be done by the marketing teams of a company, who have zero knowledge of CLI commands or of how to move around in the console. Is there a way to achieve this without them having to learn the console?
Edit
The website is a collection of static HTML pages and will be updated using an HTML editor. Once edited, the marketing team will upload each updated file to the S3 bucket. There are no more than 10 such files, including HTML and images. The site is currently hosted on a shared server, and we now want to move it to an S3 bucket capable of hosting simple web pages. The preference is not to provision console access for these users, as they are comfortable only with a WYSIWYG HTML editor and an FTP client for uploads. The editors don't know HTML and the site doesn't use JavaScript. I am thinking of writing a batch script to manage the uploads, so all the CLI complexity is kept away from them and they only work on the HTML in the editor. Looking for the simplest approach to achieve this.

You can set up a CI/CD pipeline as described here:
Update static website in an automated way
That pipeline generally uses a code commit as the trigger. It depends on what your marketing teams are doing with the content and how they are updating it; if the content they edit is already hosted on AWS, you can change the trigger to fire on S3 updates instead. The right solution depends on your use case, and it may require some development work on your side to make things simpler for the marketing teams.
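If a full pipeline feels like overkill for ~10 files, the "wrap the upload in a script" idea from the question also works and keeps the CLI away from the editors. A rough sketch with boto3, where the bucket name and local folder are placeholders:

```python
# Rough sketch: upload every file in a local folder to the website bucket with
# a sensible Content-Type, so the marketing team never touches the console or
# the CLI. Bucket name and folder are placeholders.
import mimetypes
from pathlib import Path

import boto3

BUCKET = "my-company-website"   # placeholder bucket name
LOCAL_DIR = Path("site")        # folder the HTML editor saves into

s3 = boto3.client("s3")

for path in LOCAL_DIR.rglob("*"):
    if not path.is_file():
        continue
    key = path.relative_to(LOCAL_DIR).as_posix()
    content_type, _ = mimetypes.guess_type(path.name)
    s3.upload_file(
        str(path),
        BUCKET,
        key,
        ExtraArgs={"ContentType": content_type or "binary/octet-stream"},
    )
    print(f"uploaded {key}")
```

Because each PUT replaces the whole object atomically, visitors see either the old file or the new one, never a half-written page.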

I'm unsure what you are asking here, because to "update" a static website you surely need at least some basic knowledge of web development.
It's important to define what exactly you mean by update, because updating a website and updating a bucket are two completely different things.
Also, S3 offers eventual consistency for overwrite PUTs, so an updated object may briefly keep serving the old version; it isn't downtime as such, but changes may not be visible immediately.
The easiest way to update an S3 bucket is via the console, not the CLI. The console is fairly user friendly and shouldn't take long to get used to.

Related

Possible to alias a download link (to file in AWS S3 bucket): ".../Foo_v1.0.13" <-- ".../FooLatest"?

This seems like something that happens often enough that there might already be some provision for doing it that I'm not aware of...
Users of our app download our installer package via a link on our site that points to the file hosted in an S3 bucket on AWS. Once installed, our app uses the same (hard-coded) URL to download and install updates when they become available.
One downside to this is that it requires that the download URL be static, and thus it can't contain any version information.
If we were hosting the download on our own (configurable) web server, I'd know how to set up a redirect from https://.../Foo_Latest to https://.../Foo_v1.0.13, and we could manage the alias in that one place.
But since we upload new releases to an S3 bucket on AWS, I'm wondering whether there's some existing capability on AWS to alias the URL. This seems like a common enough use case that a solution might already exist.
Of course, we could just have the static URL point to a server we control and do the redirect there to the AWS URL. But that feels like it would somewhat defeat the high-availability and high-bandwidth benefits of using AWS...
Am I missing anything?
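One S3-native option worth checking here: S3 supports object-level website redirects via the x-amz-website-redirect-location metadata, so a stable key (e.g. FooLatest) can redirect to the current versioned object. Note this only takes effect when the object is requested through the bucket's static-website endpoint (or a CloudFront distribution in front of it), not the plain REST URL. A minimal boto3 sketch, with bucket, key, and version as placeholders:

```python
# Sketch: point a stable "alias" key at the current release using an S3
# website redirect. Bucket name, key, and target version are placeholders.
import boto3

s3 = boto3.client("s3")

s3.put_object(
    Bucket="my-downloads-bucket",
    Key="FooLatest",
    WebsiteRedirectLocation="/Foo_v1.0.13",  # relative key or absolute URL
)
```

Re-running this on each release (or doing the equivalent copy_object) keeps the alias managed in one place, much like a redirect rule on your own web server would.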

Django: use templates from a separate AWS S3 bucket

I have a Django server running in an Elastic Beanstalk environment. I would like it to render HTML templates pulled from a separate AWS S3 bucket.
I am using the django-storages library, which lets me serve static and media files from the bucket, but I can't figure out how to get it to render templates.
The reasoning for doing it like this is that once my site is running, I would like to be able to add these HTML templates without having to redeploy the entire site.
Thank you
To the best of my knowledge, django-storages is responsible for managing static assets and media files; it doesn't mount the S3 bucket on the file system. What you might be looking for is something like S3Fuse, which mounts the bucket on the file system and lets you update the templates and have them sync. Even then it might not be the best solution, because Django may cache the templates in memory and not pick up the changes.
I believe what you're really looking for is a continuous delivery pipeline; that way you won't have to worry about hosting the templates separately.
Good question though.
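If you do want Django itself to read templates straight from the bucket rather than syncing them to disk, one option is a small custom template loader that fetches the template source from S3. A rough sketch with boto3, where the bucket name is a placeholder (this is not part of django-storages):

```python
# Rough sketch of a custom Django template loader that pulls template source
# from an S3 bucket via boto3. The bucket name is a placeholder and error
# handling is minimal.
import boto3
from botocore.exceptions import ClientError
from django.template import Origin, TemplateDoesNotExist
from django.template.loaders.base import Loader


class S3TemplateLoader(Loader):
    def __init__(self, engine, bucket_name):
        super().__init__(engine)
        self.bucket_name = bucket_name
        self.s3 = boto3.client("s3")

    def get_template_sources(self, template_name):
        # Map the template name directly to an object key in the bucket.
        yield Origin(name=template_name, template_name=template_name, loader=self)

    def get_contents(self, origin):
        try:
            obj = self.s3.get_object(Bucket=self.bucket_name, Key=origin.template_name)
        except ClientError:
            raise TemplateDoesNotExist(origin)
        return obj["Body"].read().decode("utf-8")
```

You would register it in TEMPLATES['OPTIONS']['loaders'] as something like ('myproject.s3_templates.S3TemplateLoader', 'my-template-bucket'); leave out the cached loader wrapper if you want edits in the bucket to show up without a restart.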

Stuck in transition between BlueHost and AWS

I can't seem to figure out how to query database information from MySQL through a PHP file uploaded to a server, and return it to a Xamarin application after something like an HTTP POST call.
I am relatively new to database development. I used Bluehost for about a year to store PHP files and query my app's data, yet I can't seem to do the same with AWS. I have a relatively decent idea of what S3, EC2, and RDS each do for you, but none of them seems to do what I want.
In essence, I want someone to be able to click a button in my Xamarin app that creates an event for a club. Clicking that button should make an HTTP POST request to an AWS-hosted site containing a PHP file, and that PHP file should read or update the contents of a MySQL database. The only place that seems to let me upload files is S3, but S3 doesn't appear to let me run PHP files directly and return the JSON I want.
Am I completely off about why you would use AWS at all, am I close but not using the right tools, or is it something else? Please give a fairly extensive description of everything I would need to do, as I have surprisingly struggled to find anything at all on this topic.
So after some digging, this video here walks through it a bit, using PhpStorm as the way to connect to the database and add web pages or API PHP files. It's pretty long but very helpful.
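For what it's worth, the server-side piece described in the question is just an HTTP endpoint running on something that can execute code (EC2 or Elastic Beanstalk, not S3) and talking to the MySQL database on RDS. A rough sketch of that shape, shown here in Python/Flask with placeholder credentials (a PHP file on EC2 would be structurally the same):

```python
# Rough sketch: an HTTP endpoint that inserts an "event" row into a MySQL
# database on RDS and returns JSON. Host, credentials, database, and table
# names are placeholders.
import pymysql
from flask import Flask, jsonify, request

app = Flask(__name__)


def get_connection():
    return pymysql.connect(
        host="my-db.xxxxxxxx.us-east-1.rds.amazonaws.com",  # placeholder RDS endpoint
        user="appuser",
        password="change-me",
        database="clubs",
        cursorclass=pymysql.cursors.DictCursor,
    )


@app.route("/events", methods=["POST"])
def create_event():
    data = request.get_json()
    conn = get_connection()
    try:
        with conn.cursor() as cur:
            cur.execute(
                "INSERT INTO events (club_id, title) VALUES (%s, %s)",
                (data["club_id"], data["title"]),
            )
        conn.commit()
    finally:
        conn.close()
    return jsonify({"status": "ok"})
```

The Xamarin app then just POSTs JSON to that URL; the endpoint has to live on EC2 or Elastic Beanstalk, since S3 only stores files and cannot execute them.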

Is it possible to add a file to Google Cloud Storage from a url?

On my PHP server I have a list of URLs that point to large files (not stored locally). These files can be hundreds of MB, so I'm looking for the best way to add them to GCS without first saving them to my server. Is this possible, or will I have to save each one and then upload it to GCS?
Edit
I forgot to mention that the list of URLs is managed programmatically and changes often, so any solution needs to work without manual interaction.
If your URLs are publicly reachable, you may be interested in the Storage Transfer Service provided by Google Cloud Storage. You can provide a TSV file with a list of URLs from which your files will be transferred to the bucket of your choice. Have a look at the service docs here for further details.
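For reference, the URL list that the Transfer Service consumes is a plain TSV file that you can generate from your programmatically managed list; a rough sketch (the sizes and MD5 values below are placeholders, and the exact column requirements are spelled out in the Transfer Service docs):

```python
# Rough sketch: write the URL-list TSV file consumed by the Storage Transfer
# Service. Sizes (bytes) and base64 MD5 values are placeholders; check the
# Transfer Service docs for the exact requirements.
urls = [
    ("https://example.com/files/big-file-1.zip", 104857600, "PLACEHOLDER_BASE64_MD5=="),
    ("https://example.com/files/big-file-2.zip", 52428800, "PLACEHOLDER_BASE64_MD5=="),
]

with open("transfer-list.tsv", "w") as f:
    f.write("TsvHttpData-1.0\n")
    for url, size_bytes, md5_base64 in urls:
        f.write(f"{url}\t{size_bytes}\t{md5_base64}\n")
```

The generated list itself needs to be hosted somewhere the Transfer Service can reach (for example a public URL) when you create the transfer job.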

Setting up a CDN with Wagtail CMS

I am looking into possibly setting up a CDN to use with my Wagtail sites. I am thinking that this will be a more efficient way to manage media uploads during stage/production pushes, since right now the media folder has to be manually copied from server to server on deploy. If all of the images were being accessed from a CDN then this wouldn't be an issue.
This would be my first time using a CDN so I'm looking for advice. There is lots of info on using a CDN with WordPress, but not a lot of documentation on setting one up with Wagtail/Django. I have the following questions about it:
Does anyone have any suggestions on the best way to implement the CDN with Wagtail?
How does it handle the uploads that the user submits through the CMS? Most of the images will be uploaded as part of the static files, but how does it work when the user uploads a photo as part of a post?
Which CDN companies have you had the best/worst experiences with? The sites I am planning to use this for are professional/business, but not e-commerce.
Also, if there is a more efficient way to handle the transfer of media uploads from one environment to another than using a CDN, I'd love to hear your suggestions for that too. As of right now I've had to copy the media folder over after doing the deploy, and I will have to do this every time I make a change to the site.
Thanks in advance for your assistance.
The following resources can be helpful for the setup you need with Wagtail (I can provide some more details later today):
Frontend cache invalidator for pages (so not only for static and media files)
Link: http://docs.wagtail.io/en/latest/reference/contrib/frontendcache.html#frontendcache-aws-cloudfront
Storing media files in Amazon Web Services S3 buckets
This is a better solution than copying media files from server to server. In this case Amazon Web Services CloudFront (CDN) would be a perfect choice (see the settings sketch further down).
Link: https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#amazon-s3
More info CloudFront: https://aws.amazon.com/cloudfront/
Static file cache invalidation with Django WhiteNoise
This can be relevant for clearing the cache on a new deployment (the static files get unique filenames, so the CDN will fetch fresh copies from its origin after the deployment).
Link: http://whitenoise.evans.io/en/stable/django.html
CloudFront from AWS would be my personal choice of CDN. Besides the awesome resources and services AWS has to offer, CloudFront is simple to set up and is one of the best CDNs out there.
Finally, a CDN for serving static and media files has nothing to do with Wagtail specifically. There are some nice apps available for Django itself (see the list above), but you are free to choose another CDN solution (like Cloudflare).
So setting up an AWS S3 bucket for each environment (tst/acc/stg/prd), using it for your media uploads (so the files aren't on the server anymore), and putting a CloudFront distribution in front of those buckets would be a proper solution for your problem.
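For the media-uploads piece specifically, the django-storages side is only a few settings; a rough sketch, where the bucket name and CloudFront domain are placeholders:

```python
# settings.py - rough sketch: store Wagtail media uploads in S3 and serve them
# through a CloudFront distribution using django-storages. Bucket name and
# CloudFront domain are placeholders.
INSTALLED_APPS += ["storages"]

DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
AWS_STORAGE_BUCKET_NAME = "my-wagtail-media-prd"
AWS_S3_CUSTOM_DOMAIN = "dxxxxxxxxxxxxx.cloudfront.net"  # CloudFront in front of the bucket
AWS_QUERYSTRING_AUTH = False  # plain, cacheable URLs instead of signed ones

MEDIA_URL = "https://%s/" % AWS_S3_CUSTOM_DOMAIN
```

With that in place, images uploaded through the Wagtail admin land in the bucket for that environment and are served from the CloudFront domain, so nothing has to be copied between servers on deploy.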
Best regards,
Rob Moorman