Site not deployed on AWS S3

I have followed all the instructions for deploying a React app to AWS S3. I uploaded all files and folders to the bucket, but when I click on the site endpoint under Static Website Hosting, it shows a blank page. What may be the cause? Please see the following images for reference:
All files and folders uploaded
Static Website Hosting in properties also updated
Public access also enabled
Bucket Policy also updated
Blank Page
That is everything required to deploy a React app on AWS S3. Any suggestions?

Fetching index.html works fine, but the various JavaScript files are all yielding 404.
The problem appears to be that your HTML is fetching resources as:
/WeatherApp/static/js/main.057efd26.chunk.js
but in the origin they are actually at:
/static/js/main.057efd26.chunk.js
Fix your index.html to refer to static resources rooted at /static/, not /WeatherApp/static/.

In the package.json file, change homepage to "homepage": "./", then run npm run build and re-upload the resulting build files.
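For reference, the setting sits at the top level of package.json; the name, version, and scripts below are placeholders, and only the homepage field matters here:

```json
{
  "name": "weather-app",
  "version": "0.1.0",
  "homepage": "./",
  "scripts": {
    "build": "react-scripts build"
  }
}
```

With "homepage": "./", the build emits relative asset paths, so the bundle works whether the site is served from the bucket root or a subfolder.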

Related

Django admin fails to use pre-signed URL for some static files in an AWS bucket

I am serving static files from an AWS bucket for my Django 4.0-powered site. My own apps work perfectly, but the admin app tries to use unsigned URLs for some of the static files.
More specifically, fonts.css and widgets.css under /admin/css/ are requested with unsigned URLs (resulting in 403 Forbidden), whereas all other CSS files are requested with presigned URLs and served just fine. As for images, only icon-addlink.svg is requested with an unsigned URL (403 Forbidden); all other images under /admin/img/ are requested and served as expected.
The image below (from the browser's Inspect feature) shows how Django tries to fetch some of the files with no query string at all. What could be causing this, and how can I get around it?
I have checked that the problematic CSS and image files are properly copied to the static files folder by the collectstatic command, and they do exist in the AWS bucket. This issue has persisted since upgrading from Django 3.x to 4.0.

S3 Static site downloads index.html after uploading files

I have a static site that I serve from S3, tidbitstatistics.com.
I wrote a script using boto3 to replace the files with new ones, and since then my site doesn't open; instead, it downloads the index.html file.
From what I can tell, I didn't change any settings. The site was working fine before I re-uploaded the files. Since then, I deleted all the files and re-uploaded them manually, but I am still running into the same error.
I thought this might have to do with the file types, but they had the correct text/html content type when re-uploading manually, and I am adjusting my script to specify content types by calling put_object instead of upload_file with boto3.
Static site hosting is turned on for that bucket, and public read permissions are set. I'm just not sure why S3 suddenly won't serve my static site.
I followed the answer here, but I don't see a Content-Disposition property.
Any help would be appreciated - web development is not my strong suit!
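The put_object adjustment described above can be sketched as follows. This is a hypothetical script, not the asker's actual one: the bucket and directory names are placeholders, and the likely root cause it addresses is that objects uploaded without an explicit Content-Type default to binary/octet-stream, which makes browsers download index.html instead of rendering it.

```python
import mimetypes
import os


def guess_content_type(path):
    """Map a filename to a MIME type, falling back to S3's default."""
    ctype, _ = mimetypes.guess_type(path)
    return ctype or "binary/octet-stream"


def upload_site(local_dir, bucket):
    """Upload every file under local_dir, setting Content-Type explicitly."""
    import boto3  # imported here so the helper above has no boto3 dependency

    s3 = boto3.client("s3")
    for root, _, files in os.walk(local_dir):
        for name in files:
            path = os.path.join(root, name)
            # Build the object key relative to the site root, with "/" separators.
            key = os.path.relpath(path, local_dir).replace(os.sep, "/")
            with open(path, "rb") as f:
                s3.put_object(
                    Bucket=bucket,
                    Key=key,
                    Body=f.read(),
                    ContentType=guess_content_type(name),
                )


# upload_site("build", "tidbitstatistics.com")
```

After re-uploading this way, index.html should carry Content-Type: text/html and render in the browser rather than download.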

Host multiple websites on a single domain, folder-wise, on AWS S3

I want to have one S3 URL with an index.html file and two folders in the bucket, folder1 and folder2.
Each folder contains the build files of the respective project. I want to click the folder1 link and go to the respective website hosted on AWS S3:
Folder1 link -> Project 1
Folder2 link -> Project 2, and so on.
I have tried creating a bucket on S3 and uploading the builds to the two folders, one per project. I have one index.html with the two links, and an index.html file at each project level as well.
I am not able to access the projects by clicking on the respective folders.
Can anyone suggest something for this?
Thanks in advance.
If the homepage is being served correctly by S3 but the other pages are not, check the following:
Do the objects exist in the S3 bucket? Check the filenames.
Do the objects have public read access (preferably granted through the bucket policy)?
Does the href in your HTML point to a valid path? Be careful of relative vs. absolute paths.
Also check out these troubleshooting pages:
How can I troubleshoot the 404 "NoSuchKey" error from Amazon S3?
How do I troubleshoot 403 Access Denied errors from Amazon S3?
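For the public-read check above, a minimal bucket policy granting anonymous s3:GetObject looks like this (example-bucket is a placeholder for your bucket name):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::example-bucket/*"
    }
  ]
}
```

Note the /* in the Resource ARN: the policy must apply to the objects in the bucket, not just the bucket itself, and the bucket's Block Public Access settings must also allow policies like this.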

fullpage.js plugin not working net::ERR_ABORTED 403 (Forbidden)

I am using the fullpage.js plugin for a website I am building, as it allows me to display each page as a slide that locks in on scroll. Here is the plugin's site if you are interested in taking a look: https://alvarotrigo.com/fullPage/
Locally the page loads fine and works and looks exactly how I would like it to. However, once I upload my files to AWS I get a whole list of console errors, and the root cause seems to be that the fullpage.js files are forbidden. I have tried using a CDN to pull in the files, but this has not been successful. Any recommendations?
I had not set the right permissions in AWS for the folder containing the files; I moved the files into the root folder and it works.

Whole new project inside a Jekyll project

I have some website projects that I want to showcase, each with its own index.html file.
I want to make a route in my Jekyll website so that if I go to asdf.com/project1/ I simply see project1.
Is this possible to do?
My project is deployed on AWS S3, hosted behind CloudFront. Should I upload to the bucket and set up some kind of routing?
How would I go about doing this?
Thanks.
If you want S3 to show a specific HTML page when navigating to asdf.com/project1/, the easiest way to achieve this is to create an object with the corresponding key project1/index.html in that bucket.
Source (bottom of page): http://docs.aws.amazon.com/AmazonS3/latest/dev/IndexDocumentSupport.html
You have two options: put those pages inside Jekyll as "projectname/index.html", or as an HTML file without an extension.
If you don't want Jekyll to process them and just want the HTML used as-is, do not add front matter; Jekyll will simply copy the files to the output folder.
Then you may need to set the right content type when uploading the website to the S3 bucket.
You may find it helpful to put that into a script: https://simpleit.rocks/having-pretty-urls-in-a-jekyll-website-hosted-in-amazon-s3/