I want to list the object URLs of all files in my public bucket. I've read all the documentation about s3 and s3api, but I couldn't find what I was looking for.
I don't want to generate pre-signed URLs; I want to list the existing URLs of all the files in a folder.
I'm sorry I can't add any code, but I have a very specific problem and can't find a starting point.
I'd appreciate it if someone could help.
Solution:
https://github.com/cagdasdemirer/AWS-S3-URL-Listing
I don't think there's a direct way to get the URLs of all the S3 files in a bucket, but this is how you can do it:
List all the files in S3 using:
aws s3 ls s3://bucket-name/folder-name
Build a URL like this:
http://s3.amazonaws.com/bucket/key (for path-style URL), or
http://bucket.s3.amazonaws.com/key (for virtual-hosted style URL)
Store the resulting URLs in a list or file.
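The steps above can be sketched in Python. This is only a sketch: the bucket name and keys below are placeholders, and in practice the keys would come from `aws s3 ls s3://bucket-name/folder-name --recursive` or boto3's `list_objects_v2`.

```python
from urllib.parse import quote

def object_url(bucket: str, key: str, virtual_hosted: bool = True) -> str:
    """Build a public S3 object URL from a bucket name and key.

    The key is percent-encoded, but "/" is kept so the folder
    structure stays visible in the URL.
    """
    encoded = quote(key, safe="/")
    if virtual_hosted:
        return f"https://{bucket}.s3.amazonaws.com/{encoded}"
    return f"https://s3.amazonaws.com/{bucket}/{encoded}"

# Hypothetical keys; in practice, collect them from the listing step.
keys = ["folder-name/a.png", "folder-name/b report.pdf"]
for k in keys:
    print(object_url("bucket-name", k))
```

Note that this only produces a working link if the objects are publicly readable; otherwise you are back to pre-signed URLs.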
This SO link may help you: https://stackoverflow.com/a/44401684/7541412
Let me know if it helped.
Thanks
Related
There is a specific use case I am trying to solve. Using boto3, I want to create a folder first and then generate presigned POST URLs for that specific folder. But so far I have found that I can't create a new, empty folder (I have to upload some file to that directory first), and that I can't generate a presigned POST URL for a whole folder (I have to do it one file at a time). Is my conclusion right, or am I going wrong somewhere?
So I am able to view the object URL of the files in the S3 bucket, but I'd also like to view the URL of the folder uploaded in that bucket.
Is that even possible?
Let me know, please...
Did you try getting the URL for a particular item in the folder, and then truncating the URL at the desired folder?
Example:
https://bucket-name.s3.amazonaws.com/folder/object
https://bucket-name.s3.amazonaws.com/folder/
I'm not really sure if that's what you're after...
On the other hand, if you're using the console, you can click on the folder's Properties and view its S3 URI as well.
I want to have one S3 URL with an index.html file and two folders in the bucket, folder1 and folder2.
Each folder contains the build files of its respective project. I want to click the folder1 link and go to the respective website hosted on AWS S3.
Folder1 link -> Project 1
Folder2 link -> Project 2, and so on.
I have tried creating a bucket on S3 and uploading each project's build to its own folder. I have one index.html with two links.
I also have an index.html file at each project level.
But I am not able to access the projects by clicking on the respective folder.
Can anyone suggest something for this?
Thanks in advance.
If the homepage is being served correctly by S3 but the other pages are not (which can happen for a number of reasons), then check the following:
Do the objects exist in the S3 bucket? Check the filenames.
Do the objects have public read access (preferably granted through the bucket policy)?
Does the href in your HTML go to a valid path? Be careful with relative vs. absolute paths.
Also check out these troubleshooting pages:
How can I troubleshoot the 404 "NoSuchKey" error from Amazon S3?
How do I troubleshoot 403 Access Denied errors from Amazon S3?
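For the href check above, a minimal sketch of what the landing page could look like. This assumes the bucket has static website hosting enabled with index.html as the index document, so a relative link with a trailing slash resolves to folder1/index.html; the folder names come from the question, the link text is made up.

```html
<!-- Hypothetical index.html at the bucket root -->
<!DOCTYPE html>
<html>
  <body>
    <ul>
      <!-- Trailing slash: served as folder1/index.html
           by the bucket's index-document support -->
      <li><a href="folder1/">Project 1</a></li>
      <li><a href="folder2/">Project 2</a></li>
    </ul>
  </body>
</html>
```

Note this only works through the S3 website endpoint, not the plain REST endpoint for the bucket.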
I have some website projects with their own index.html files that I want to showcase.
I want to make a route in my Jekyll website so that if I go to asdf.com/project1/ I simply see my project1.
Is this possible to do?
My project is deployed in an Amazon AWS S3 bucket and served through CloudFront. Should I upload to the bucket and set up some kind of routing?
How would I go about doing this?
Thanks.
If you want S3 to show a specific HTML page when navigating to asdf.com/project1/, the easiest way to achieve this is by creating a file with the corresponding key project1/index.html in that bucket.
Source here (bottom of page): http://docs.aws.amazon.com/AmazonS3/latest/dev/IndexDocumentSupport.html
You have two options: put those pages inside Jekyll as "projectname/index.html", or as HTML files without an extension.
If you don't want Jekyll to process them, don't add front matter to them; Jekyll will then just copy them to the output folder.
You may then need to set the right content type when uploading the website to the S3 bucket.
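The content-type step matters because S3 serves each object with whatever Content-Type it was uploaded with; some tools default to binary/octet-stream, which makes browsers download the page instead of rendering it, and extensionless HTML files always need the type set explicitly. A small stdlib sketch of guessing the type before upload (the actual upload would pass this to something like boto3's put_object via ContentType):

```python
import mimetypes

def guess_content_type(key: str, default: str = "binary/octet-stream") -> str:
    """Guess the Content-Type to set when uploading `key` to S3.

    Falls back to `default` for extensionless keys, which is
    exactly the case where you must override it by hand.
    """
    ctype, _ = mimetypes.guess_type(key)
    return ctype or default

print(guess_content_type("project1/index.html"))  # text/html
print(guess_content_type("assets/site.css"))      # text/css
print(guess_content_type("project1/about"))       # binary/octet-stream
```

For the extensionless "pretty URL" files, you would set `text/html` explicitly instead of trusting the guess.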
You may find helpful to put that into a script: https://simpleit.rocks/having-pretty-urls-in-a-jekyll-website-hosted-in-amazon-s3/
I've created a static website (using Hugo) with links that look like http://www.example.com/post/my-post/. I uploaded the site to AWS (an S3 bucket behind CloudFront). When I point my browser at that URL, I get an AccessDenied error, but if I tack on "index.html", all is well.
I suspect there's some really obvious bit of config that I'm missing. Apologies if this is a stupid question, but I couldn't think of an effective Google search to find the answer.