Uploaded a zip file of static website contents to S3 but website shows Error 404

I successfully deployed a package to S3 via Octopus, but I get a 404 error instead of a website.
According to this article, the zip file should at least contain index.html.
The zip that was uploaded already contains the files created by the npm run build command.
Kindly advise. Thanks a lot!
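If the zip was uploaded as a single object, note that S3 will not extract it: the website endpoint can only serve objects exactly as they are stored, so index.html must exist as its own object at the bucket root. A minimal sketch of uploading the build output directly, assuming a hypothetical bucket name and a local build folder:

# S3 serves objects as-is and never unpacks archives, so upload the
# individual build files rather than the zip (bucket name is a placeholder):
aws s3 sync ./build s3://my-website-bucket/ --delete
# Point static website hosting at index.html:
aws s3 website s3://my-website-bucket/ --index-document index.html --error-document index.html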

Related

CloudFront Error: This XML file does not appear to have any style information associated with it - I'm using Vue and Vite

I have a Vue app in an S3 bucket on AWS, and with my GitHub workflow I run npm run build to create the dist folder and copy it into the S3 bucket, so the compiled output is ready for production. I also have CloudFront configured with that S3 bucket, and the index.html works well.
I have a router with a /home route, and on localhost the page works well.
But at the CloudFront URL the page returns the error quoted in the title.
How can I solve this?
This is my GitHub Action that copies the contents into the S3 bucket:
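The usual cause is that CloudFront asks S3 for /home, which does not exist as an object, so S3 returns its XML error document. A common fix is a CloudFront custom error response that rewrites 403/404 to /index.html with a 200 status, letting the Vue router resolve the path client-side. A sketch with the AWS CLI, assuming a hypothetical distribution ID and ETag value:

# Fetch the current config; the output wraps DistributionConfig in an ETag envelope
aws cloudfront get-distribution-config --id E1234EXAMPLE > dist.json
# Edit dist.json so only the DistributionConfig object remains, add a
# CustomErrorResponses item (ErrorCode 403, ResponsePagePath /index.html,
# ResponseCode 200), then push it back with the ETag from the first call:
aws cloudfront update-distribution --id E1234EXAMPLE \
  --distribution-config file://dist.json --if-match E2QWRUHAPOMQZL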

How to give the local zip path in AWS CloudFormation YAML CodeUri?

I have exported a Lambda's YAML using its export function via Download AWS SAM file.
I have also downloaded the code zip file via Download deployment package.
In the YAML file we need to give the CodeUri.
In the downloaded YAML it is . (a dot), as shown in the picture below.
So when I upload it to AWS CloudFormation it says:
'CodeUri' is not a valid S3 Uri of the form 's3://bucket/key' with
optional versionId query parameter.
I need to know whether there is a way to give the zip file in the CodeUri from a local file path rather than uploading it to S3.
I have tried with the name of the zip file I downloaded as well, and I still get the same error.
You have to run the package command first; CloudFormation itself only accepts an S3 URI in CodeUri. It may not work with the zip itself, so you may try with the unpacked source code.
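For reference, a minimal sketch of that flow, assuming the template is named template.yaml and using a placeholder bucket; aws cloudformation package uploads the local code to S3 and rewrites CodeUri into a valid s3:// URI:

# Upload the local code referenced by CodeUri and emit a packaged template
aws cloudformation package \
  --template-file template.yaml \
  --s3-bucket my-deployment-bucket \
  --output-template-file packaged.yaml
# Deploy the packaged template, which now contains s3:// CodeUri values
aws cloudformation deploy \
  --template-file packaged.yaml \
  --stack-name my-stack \
  --capabilities CAPABILITY_IAM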

Downloading S3 bucket to local directory but files not copying?

There are many, many examples of how to download a directory of files from an S3 bucket to a local directory.
aws s3 cp s3://<bucket>/<directory> /<path>/<to>/<local>/ --recursive
However, when I run this command from the AWS CLI I've connected to, I see confirmation in the terminal like:
download: s3://mybucket/myfolder/data1.json to /my/local/dir/data1.json
download: s3://mybucket/myfolder/data2.json to /my/local/dir/data2.json
download: s3://mybucket/myfolder/data3.json to /my/local/dir/data3.json
...
But then I check /my/local/dir for the files, and my directory is empty. I've tried using the sync command instead, and I've tried copying just a single file; nothing seems to work right now. In the past I successfully ran this command and downloaded the files as expected.
Why are my files not being copied now, despite seeing no errors?
For testing, you can go to your /my/local/dir folder and execute the following command:
aws s3 sync s3://mybucket/myfolder .
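Building on that suggestion, a quick way to confirm where the files actually land, using the same paths as in the question:

cd /my/local/dir
aws s3 sync s3://mybucket/myfolder .
# Confirm the working directory and list what arrived
pwd
ls -la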

CodePipeline not saving all files in source artifacts

I've set up a new pipeline in AWS CodePipeline and connected it to my GitHub account. I'm getting build errors in CodeBuild because a folder in my GitHub repository, static/css/, is missing (I'm using CodeBuild to run a gatsby build).
This is not a folder generated in the build process - this folder and its files exist in a clean repo. I've also checked that the branch is correct (master).
When I inspect the zip file in the SourceArtifacts folder in my S3 bucket, this folder is not there.
Any ideas why CodePipeline is not retrieving, or at least keeping, this subfolder and its contents?
Go to your GitHub repo, select the green "Clone or download" button, and download the zip file. This is essentially what CodePipeline does to get your GitHub source. Now inspect the files in the zip and confirm whether the static directory is there. If it is not, you need to fix that and get the files into GitHub.
It turned out that the missing folder was listed with an export-ignore attribute in the .gitattributes file. After removing this attribute, the static/css folder got zipped up with everything else.
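To check whether the same issue applies to your repo, you can inspect .gitattributes and reproduce the archive locally; git archive honors export-ignore just like GitHub's ZIP download does. A sketch (the output path is a placeholder):

# Look for export-ignore entries; the offending line might look like: static/ export-ignore
grep -n "export-ignore" .gitattributes
# Reproduce what the ZIP download (and hence CodePipeline) would contain:
git archive --format=zip HEAD -o /tmp/repo.zip
unzip -l /tmp/repo.zip | grep static/css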

Akeeba Kickstart with Joomla 2.5.13 throws 403 Error

Every time I start to extract my Akeeba Backup, I get a 403 error and an Uncaught ReferenceError:
POST http://maxnathaniel.com/joomla/kickstart.php 403 (Forbidden)
kickstart.php:1119 Uncaught ReferenceError: Response is not defined
Versions
Joomla - 2.5.13
Akeeba Kickstart - 4.0.0
File Permissions
The Joomla subdirectory is set to 755
Files within the joomla folder are 644
Included in my /public_html/joomla are en-GB.kickstart.ini, index.html (a sample index file), kickstart.php and my backup file in .jpa format (file size is 1GB).
Appreciate any help, thank you!
Instead of using Akeeba Kickstart, I used Akeeba eXtract Wizard: I unpacked the .jpa file on my local machine and uploaded the extracted files via FTP to my server. It worked without any problems.