I have a URL (in the form https://s3.amazonaws.com/...) pointing to a file in S3, and I want it to be downloadable by a user, so I set the permission from the S3 dashboard in the AWS console. But I learned that the permission is reset whenever the file is re-written (the filename remains the same).
Is there a way to automatically set the permission right after the file is created? I looked at the boto library but couldn't figure it out. Thanks in advance!
This is a very common operation.
With the Boto library, you can set an ACL. Assuming you have a Key:
key.set_acl('public-read')
If you don't have a Key, you'll need to have a Bucket:
bucket.set_acl('public-read', 'path/to/key')
You can also use non-canned ACLs; the documentation links through to those.
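To address the original question of re-applying the permission right after each write, a minimal boto (v2) sketch might look like this; the bucket name, key, and local filename are placeholders:
import boto

conn = boto.connect_s3()
bucket = conn.get_bucket('bucketname')

# Re-write the key, then immediately re-apply the ACL, since
# re-writing the object resets its permissions.
key = bucket.new_key('path/to/key')
key.set_contents_from_filename('localfile.txt')
key.set_acl('public-read')
Alternatively, set_contents_from_filename accepts a policy argument (eg policy='public-read'), which applies the canned ACL in the same call as the write.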
In boto3, you can also set an ACL.
Bucket syntax:
s3client.put_bucket_acl(ACL='public-read', Bucket='bucketname')
Key syntax:
s3client.put_object_acl(ACL='public-read', Bucket='bucketname', Key='path/to/key')
Non-canned ACLs are a little easier in boto3.
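And to cover the original question of applying the permission right after each write, boto3 lets you pass a canned ACL at upload time. A minimal sketch, with placeholder names:
import boto3

s3client = boto3.client('s3')

# Upload the object and apply the canned ACL in the same call,
# so every re-write carries the permission with it.
with open('localfile.txt', 'rb') as f:
    s3client.put_object(
        ACL='public-read',
        Bucket='bucketname',
        Key='path/to/key',
        Body=f,
    )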
Related
What permissions do I set in a policy to allow a user to see a single bucket on the root S3 page in the console (https://s3.console.aws.amazon.com/s3/buckets)?
I keep trying different things, but either they see all the buckets or none of them. I gave them permissions to manage the bucket, and if they put the bucket URL into their browser they can access it fine and upload stuff. But if they go to the root S3 page, it doesn't list any buckets.
It is not possible to control which buckets a user can see listed in the S3 Management Console.
If a user has permission to use the ListBuckets() command, then they will be able to see a listing of ALL buckets in that AWS Account.
However, there is a cheat...
You can give a user permission to 'use' a specific Amazon S3 bucket (eg GetObject, PutObject, ListObjects) while not giving them permission to list the buckets. They will not be able to use the S3 Management Console to navigate to the bucket, but you can give them a URL that takes them directly to the bucket in the console, eg:
https://s3.console.aws.amazon.com/s3/buckets/BUCKET-NAME
This will let them see and use the bucket in the S3 Management Console, but they won't be able to see the names of any other buckets and they won't be able to navigate to their bucket via the 'root s3 page' that you mention. Instead, they will need to use that URL.
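As a sketch of that cheat, an identity-based policy along these lines grants use of one bucket while deliberately omitting s3:ListAllMyBuckets; the user name, policy name, and BUCKET-NAME are placeholders:
import json
import boto3

iam = boto3.client('iam')

# Allow listing and using one bucket only; without
# s3:ListAllMyBuckets, the console's root bucket list stays empty.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": "arn:aws:s3:::BUCKET-NAME",
        },
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::BUCKET-NAME/*",
        },
    ],
}

iam.put_user_policy(
    UserName='some-user',
    PolicyName='single-bucket-access',
    PolicyDocument=json.dumps(policy),
)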
I'm an AWS newbie trying to use the Textract API, their OCR service.
As far as I understand, I need to upload files to an S3 bucket and then run Textract on them.
I have the bucket set up with the file inside it, and the permissions are in place. But when I run my code, it fails.
import boto3
import trp
# Document
s3BucketName = "textract-console-us-east-1-057eddde-3f44-45c5-9208-fec27f9f6420"
documentName = "ok0001_prioridade01_x45f3.pdf"
# Amazon Textract client
textract = boto3.client(
    'textract',
    region_name="us-east-1",
    aws_access_key_id="xxxxxx",
    aws_secret_access_key="xxxxxxxxx",
)

# Call Amazon Textract
response = textract.analyze_document(
    Document={
        'S3Object': {
            'Bucket': s3BucketName,
            'Name': documentName,
        }
    },
    FeatureTypes=["TABLES"],
)
Here is the error I get:
botocore.errorfactory.InvalidS3ObjectException: An error occurred (InvalidS3ObjectException) when calling the AnalyzeDocument operation: Unable to get object metadata from S3. Check object key, region and/or access permissions.
What am I missing? How could I solve that?
You are missing an S3 access policy; adding the AmazonS3ReadOnlyAccess managed policy is the quick solution for your needs.
A better practice is to apply the least-privilege principle and grant access only as needed. So I'd advise you to create a specific policy that grants access to your S3 bucket textract-console-us-east-1-057eddde-3f44-45c5-9208-fec27f9f6420 only, and only in the us-east-1 region.
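A sketch of such a least-privilege policy, attached to the user whose keys the code uses; the user name and policy name are assumptions:
import json
import boto3

iam = boto3.client('iam')

# Read-only access to the single Textract input bucket, instead of
# the broad AmazonS3ReadOnlyAccess managed policy.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject"],
        "Resource": "arn:aws:s3:::textract-console-us-east-1-057eddde-3f44-45c5-9208-fec27f9f6420/*",
    }],
}

iam.put_user_policy(
    UserName='textract-user',
    PolicyName='textract-s3-read',
    PolicyDocument=json.dumps(policy),
)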
Amazon Textract currently supports PNG, JPEG, and PDF formats. Looks like you are using PDF.
Once you have a valid format, you can use the Python S3 API to read the object's data from S3 and pass the resulting byte array to the analyze_document method. For a full example of how to use the AWS SDK for Python (Boto3) with Amazon Textract to detect text, form, and table elements in document images, see:
https://github.com/awsdocs/aws-doc-sdk-examples/blob/master/python/example_code/textract/textract_wrapper.py
Try following that code example to see if your issue is resolved.
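A minimal sketch of that byte-array approach, assuming a single-page image object; the bucket and key names are placeholders:
import boto3

# Read the object's bytes from S3, then pass them to Textract
# directly instead of referencing the S3 object.
s3 = boto3.client('s3', region_name='us-east-1')
obj = s3.get_object(Bucket='bucketname', Key='page1.png')
image_bytes = obj['Body'].read()

textract = boto3.client('textract', region_name='us-east-1')
response = textract.analyze_document(
    Document={'Bytes': image_bytes},
    FeatureTypes=['TABLES'],
)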
"Could you provide some clearance on the params to use"
I just ran the Java V2 example and it works perfectly. In this example, I am using a PNG file located in a specific Amazon S3 bucket.
Here are the parameters that you need, as set in the Java example. Make sure that when implementing this in Python, you set the same parameters.
I'm trying to access a bucket via a cross-account reference. The connection is established, but the put/list permissions are granted only on a specific directory (folder), i.e. bucketname/folder_name/*.
import boto3

s3 = boto3.client('s3')
s3.upload_file(
    "filename.csv",
    "bucketname",
    "folder_name/file.csv",
    ExtraArgs={'ACL': 'bucket-owner-full-control'},
)
I'm not sure how to allow the same via code; it throws Access Denied on both list and put. Nothing is wrong with the permissions as such; I have verified access via the AWS CLI and it works.
Let me know if I'm missing something here, thanks!
There was an issue with the assumed role. Following the documentation here, https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-api.html, along with the code mentioned above, resolved it.
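For reference, a minimal sketch of that switch-role flow in boto3; the role ARN, session name, and file/bucket names are placeholders:
import boto3

# Assume the cross-account role, then build an S3 client from the
# temporary credentials before uploading into the shared folder.
sts = boto3.client('sts')
assumed = sts.assume_role(
    RoleArn='arn:aws:iam::123456789012:role/cross-account-role',
    RoleSessionName='upload-session',
)
creds = assumed['Credentials']

s3 = boto3.client(
    's3',
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken'],
)
s3.upload_file(
    "filename.csv", "bucketname", "folder_name/file.csv",
    ExtraArgs={'ACL': 'bucket-owner-full-control'},
)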
I have been provided with the access key and secret key for an Amazon S3 container. No more details were given other than to drop some files into a specific folder.
I downloaded the AWS CLI and the AWS SDK. So far, there seems to be no way for me to check the bucket name or list the folders where I'm supposed to drop my files; every single command seems to require knowledge of a bucket name.
Trying to list with aws s3 ls gives me the error:
An error occurred (AccessDenied) when calling the ListBuckets operation: Access Denied
Is there a way to list the content of my current location? (I'm guessing the credentials I was given are linked directly to a bucket.) I'd like to see at least the folders where I'm supposed to drop my files, but the SDK client for the console app I'm building always seems to require a bucket name.
Was I provided incomplete info or limited rights?
Do you know the bucket name or not? If you don't, and you don't have permission for ListAllMyBuckets and GetBucketLocation on * and ListBucket on the bucket in question, then you can't get the bucket name. That's how it is supposed to work. If you know the bucket, then you can run aws s3 ls s3://bucket-name/ to list the objects in the bucket.
Note that S3 buckets don't have the concept of a 'folder'; it's user-interface 'sugar' to make them look like folders and files. Internally, there are only keys and objects.
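If you do know the bucket name, a minimal boto3 sketch can list the keys under a prefix ('folder'); the bucket and prefix names are placeholders:
import boto3

# 'Folders' are just key prefixes, so list the keys under the prefix.
s3 = boto3.client('s3')
resp = s3.list_objects_v2(Bucket='bucket-name', Prefix='drop-folder/')
for obj in resp.get('Contents', []):
    print(obj['Key'])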
Looks like it was just not possible without enhanced rights or the actual bucket name. I was able to procure both later on from the client and complete the task. Thanks for the comments.
I want to simply check the permissions that I have on buckets/folders/files in AWS S3. Something like:
ls -l
It sounds like it should be pretty easy, but I cannot find any information on the subject. I just want to know whether I have read access to some content, or whether I can load a file locally, without actually trying to load the data and having an "Error Code: 403 Forbidden" thrown at me.
Note: I am using databricks and want to check the permission from there.
Thanks!
You can check the permissions using the command:
aws s3api get-object-acl --bucket my-bucket --key index.html
The ACL for each object can vary across your bucket.
More documentation at,
https://docs.aws.amazon.com/cli/latest/reference/s3api/get-object-acl.html
Hope it helps.
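Since you are working from Databricks, the boto3 equivalent of that command may be more convenient; the bucket and key names are placeholders:
import boto3

# Fetch the object's ACL and print each grantee and permission.
s3 = boto3.client('s3')
acl = s3.get_object_acl(Bucket='my-bucket', Key='index.html')
for grant in acl['Grants']:
    print(grant['Grantee'], grant['Permission'])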
There are several different ways to grant access to objects in Amazon S3.
Permissions can be granted on a whole bucket, or a path within a bucket, via a Bucket Policy.
Permissions can also be granted to an IAM User or Role, giving that specific user/role permissions similar to a bucket policy.
Then there are permissions on the object itself, such as making it publicly readable.
So, there is no simple way to ask "what are the permissions on this particular object?", because the answer depends on who you are. Also, policies can restrict by IP address and time of day, so there isn't always one answer.
You could use the IAM Policy Simulator to test whether a certain call (eg PutObject or GetObject) would work for a given user.
Some commands in the AWS Command-Line Interface (CLI) come with a --dryrun option that will simply test whether the command would have worked, without actually executing the command.
Or, sometimes it is just easiest to try to access the object and see what happens!
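As a sketch of that last approach, a HEAD request checks read access without downloading the data; the bucket and key names are placeholders:
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')
try:
    # HEAD fetches only metadata, so no object data is transferred.
    s3.head_object(Bucket='my-bucket', Key='index.html')
    print("Read access looks OK")
except ClientError as e:
    # Lack of permission surfaces here as an error response (eg 403).
    print("Access check failed:", e.response['Error']['Code'])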