boto3 getting error when trying to list buckets - amazon-web-services

I'm using
>>> s3 = session.client(service_name='s3',
... aws_access_key_id='access_key_id_goes_here',
... aws_secret_access_key='secret_key_goes_here',
... endpoint_url='endpoint_url_goes_here')
>>> s3.list_buckets()
to list my existing buckets, but I get the error botocore.exceptions.ClientError: An error occurred () when calling the ListBuckets operation. I'm not sure how to proceed from there.

Are you using boto3?
Here is some sample code. There are two ways to use boto:
The 'client' method that maps to AWS API calls, or
The 'resource' method that is more Pythonic
boto3 will automatically retrieve your user credentials from a configuration file, so there is no need to put credentials in the code. You can create the configuration file with the AWS CLI aws configure command.
import boto3

# Using the 'client' method
s3_client = boto3.client('s3')
response = s3_client.list_buckets()
for bucket in response['Buckets']:
    print(bucket['Name'])

# Or, using the 'resource' method
s3_resource = boto3.resource('s3')
for bucket in s3_resource.buckets.all():
    print(bucket.name)
If you are using an S3-compatible service, you can add an endpoint_url parameter to the client() and resource() calls.
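For example, a minimal sketch for an S3-compatible service; the endpoint URL is a placeholder:
import boto3

# Placeholder endpoint; substitute your S3-compatible provider's URL
s3_client = boto3.client('s3', endpoint_url='https://s3.example-compatible-service.com')

response = s3_client.list_buckets()
for bucket in response['Buckets']:
    print(bucket['Name'])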

Related

failed to download files from AWS S3

Scenario:
Submit an Athena query with boto3, with the output written to S3
Download the result from S3
Error: An error occurred (404) when calling the HeadObject operation: Not Found
It's odd that the file exists in S3 and I can copy it down with the aws s3 cp command, but I cannot download it with boto3, and head-object fails.
aws s3api head-object --bucket dsp-smaato-sink-prod --key /athena_query_results/c96bdc09-d545-4ee3-bc66-be3be928e3f2.csv
That command does work. I've checked the account policies, and the account has been granted the admin policy.
# snippets
import os
import logging
from urllib.parse import urlparse

import boto3

logger = logging.getLogger(__name__)

def s3_donwload(url, target=None):
    # s3 = boto3.resource('s3')
    # client = s3.meta.client
    # 'constant' is the project's own settings module
    client = boto3.client("s3", region_name=constant.AWS_REGION,
                          endpoint_url='https://s3.ap-southeast-1.amazonaws.com')
    s3_file = urlparse(url)
    if target:
        target = os.path.abspath(target)
    else:
        target = os.path.abspath(os.path.basename(s3_file.path))
    logger.info(f"download {url} to {target}...")
    client.download_file(s3_file.netloc, s3_file.path, target)
    logger.info(f"download {url} to {target} done!")
Take a look at the value of s3_file.path -- does it start with a slash? If so, it needs to change because Amazon S3 keys do not start with a slash.
I suggest that you print the content of netloc, path and target to see what values it is actually passing.
It's a bit strange to use os.path with an S3 URL, so it might need some tweaking.
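As a hedged sketch of that check, stripping any leading slash from the parsed path before calling download_file (the URL below reuses the bucket and key from the question):
from urllib.parse import urlparse

url = "s3://dsp-smaato-sink-prod/athena_query_results/c96bdc09-d545-4ee3-bc66-be3be928e3f2.csv"
s3_file = urlparse(url)

# urlparse() keeps the leading slash on the path, but S3 keys must not start with one
key = s3_file.path.lstrip('/')
print(s3_file.netloc, key)  # bucket name and corrected key
# client.download_file(s3_file.netloc, key, target)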

AWS boto3 how to get the metadata from the key?

I am trying to get the metadata value of a file uploaded to an S3 bucket.
# I have to specifically use boto3.resource('s3') because of other API calls in the project.
I have the below data available under the metadata field:
# metadata
Key=Content-Type
Value=application/json
Below is the code:
import boto3

bucket = 'mybucket'
key = 'L1/input/file.json'
s3_resource = boto3.resource('s3')
object = s3_resource.Object(bucket, key)
metadata = object.metadata
but I am getting the below error:
[ERROR] ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden
Can anyone help me with this?
Be careful of your syntax. This line:
s3_client=boto3.resource('s3')
is returning a resource, not a client.
Therefore, this line is failing:
obj = s3_client.head_object(bucket,key)
because head_object() is not an operation that can be performed on a resource.
Instead, use:
s3_resource = boto3.resource('s3')
object = s3_resource.Object('bucket_name','key')
metadata = object.metadata
This will return a dictionary of the object's metadata.
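As a small usage sketch (the bucket and key are taken from the question): user-defined metadata appears in object.metadata, while a system header such as Content-Type is exposed through its own attribute:
import boto3

s3_resource = boto3.resource('s3')
obj = s3_resource.Object('mybucket', 'L1/input/file.json')

print(obj.metadata)      # dict of user-defined (x-amz-meta-*) metadata
print(obj.content_type)  # system metadata such as Content-Type has its own attribute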

Getting 'ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden' while doing a cross-account copy of a file using boto3

Copying a file from an S3 bucket in one AWS account to an S3 bucket in another account.
The required roles/policies for this task were created by the IAM team, which is out of my scope.
This Lambda is going to run in the destination account, and it has to copy the object from the source bucket.
While running the Lambda, I get the below error:
ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden
I was wondering whether there is something to be fixed in my code, or whether it is just a permission issue.
Here is my Lambda:
import boto3

ID_list = ['123456789101']

def lambda_handler(event, context):
    for entry in ID_list:
        sts_client = boto3.client('sts')
        assumed_role_object = sts_client.assume_role(
            RoleArn="arn:aws:iam::" + entry[0] + ":role/requiredrole",
            RoleSessionName="SampleSession"
        )
        credentials = assumed_role_object['Credentials']
        s3_resource = boto3.resource(
            's3',
            aws_access_key_id=credentials['AccessKeyId'],
            aws_secret_access_key=credentials['SecretAccessKey'],
            aws_session_token=credentials['SessionToken'],
        )

        ## performing copy from one bucket to the other
        s3 = boto3.resource('s3')
        source = {'Bucket': 'my-bucket' + entry[0], 'Key': 'test.csv'}  ## source bucket, file details
        dest_bucket = s3.Bucket('dest-bucket')  # bucket in destination account
        dest_bucket.copy(source, 'test1.csv')
I think it's a permission issue: you cannot access either the source bucket or the destination bucket.
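If the assumed role is the one that can read the source bucket, one hedged adjustment (a sketch, not confirmed as the fix here) is to perform the copy with those assumed-role credentials by passing them as SourceClient, instead of creating a second default resource:
import boto3

# Sketch only: assumes the assumed role can read the source bucket and the
# Lambda's own role can write to the destination bucket.
sts_client = boto3.client('sts')
credentials = sts_client.assume_role(
    RoleArn="arn:aws:iam::123456789101:role/requiredrole",
    RoleSessionName="SampleSession"
)['Credentials']

source_client = boto3.client(
    's3',
    aws_access_key_id=credentials['AccessKeyId'],
    aws_secret_access_key=credentials['SecretAccessKey'],
    aws_session_token=credentials['SessionToken'],
)

s3 = boto3.resource('s3')  # destination-account (Lambda role) credentials
source = {'Bucket': 'my-bucket123456789101', 'Key': 'test.csv'}
# SourceClient is used for the HeadObject/GetObject calls against the source bucket
s3.Bucket('dest-bucket').copy(source, 'test1.csv', SourceClient=source_client)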

How to add encryption to boto3.s3.transfer.TransferConfig for s3 file upload

I am trying to upload a file to S3 using the boto3 upload_file method. This is pretty straightforward until server-side encryption is needed. In the past I have used put_object to achieve this, like so:
import boto3

s3 = boto3.resource('s3')
s3.Bucket(bucket).put_object(Key=object_name,
                             Body=data,
                             ServerSideEncryption='aws:kms',
                             SSEKMSKeyId='alias/aws/s3')
I now want to upload files directly to S3 using the upload_file method. I can't find how to add server-side encryption to the upload_file method. It can take a TransferConfig, but I do not see any arguments there that set the encryption, although I do see them in S3Transfer.
I am looking for something like this:
import boto3

s3 = boto3.resource('s3')
tc = boto3.s3.transfer.TransferConfig(ServerSideEncryption='aws:kms',
                                      SSEKMSKeyId='alias/aws/s3')
s3.upload_file(file_name,
               bucket,
               object_name,
               Config=tc)
boto3 documentation: upload_file, TransferConfig
I was able to come up with two solutions with jarmod's help.
Using boto3.s3.transfer.S3Transfer
import boto3

client = boto3.client('s3', 'us-west-2')
transfer = boto3.s3.transfer.S3Transfer(client=client)
transfer.upload_file(file_name,
                     bucket,
                     key_name,
                     extra_args={'ServerSideEncryption': 'aws:kms',
                                 'SSEKMSKeyId': 'alias/aws/s3'})
Using s3.meta.client
import boto3

s3 = boto3.resource('s3')
s3.meta.client.upload_file(file_name,
                           bucket, key_name,
                           ExtraArgs={'ServerSideEncryption': 'aws:kms',
                                      'SSEKMSKeyId': 'alias/aws/s3'})
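To confirm the encryption was applied, a quick check with head_object can help; a minimal sketch, assuming placeholder bucket and key names:
import boto3

s3_client = boto3.client('s3')
# 'my-bucket' and 'my-key' are placeholders for the values used in the upload above
response = s3_client.head_object(Bucket='my-bucket', Key='my-key')
print(response.get('ServerSideEncryption'))  # expect 'aws:kms'
print(response.get('SSEKMSKeyId'))           # ARN of the KMS key that was used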
You do not need to pass SSEKMSKeyId in the boto3 API if you want to use the S3-managed KMS key; by default, it uses the aws/s3 KMS key.
import boto3

s3 = boto3.client('s3')
content = '64.242.88.10 - - [07/Mar/2004:16:06:51 -0800] "GET /twiki/bin/rdiff/TWiki/NewUserTemplate?rev1=1.3&rev2=1.2 HTTP/1.1" 200 4523'
s3.put_object(Bucket=testbucket, Key='ex31/input.log', Body=content,
              ServerSideEncryption='aws:kms')
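The same default-key behaviour should carry over to upload_file via ExtraArgs; a small sketch, assuming placeholder file and bucket names:
import boto3

s3_client = boto3.client('s3')
# 'input.log' and 'my-bucket' are placeholder names for this sketch
s3_client.upload_file('input.log', 'my-bucket', 'ex31/input.log',
                      ExtraArgs={'ServerSideEncryption': 'aws:kms'})
# Omitting SSEKMSKeyId means the default aws/s3 KMS key is used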

Errno 11004 getaddrinfo failed error in connecting to Amazon S3 bucket

I am trying to use the boto (ver 2.43.0) library in Python to connect to S3, but I keep getting socket.gaierror: [Errno 11004] when I try to do this:
from boto.s3.connection import S3Connection
access_key = 'accesskey_here'
secret_key = 'secretkey_here'
conn = S3Connection(access_key, secret_key)
mybucket = conn.get_bucket('s3://diap.prod.us-east-1.mybucket/')
print("success!")
I can connect to and access folders in mybucket using AWS CLI by using a command like this in Windows:
> aws s3 ls s3://diap.prod.us-east-1.mybucket/
<list of folders in mybucket will be here>
or using software like CloudBerry or S3Browser.
Is there something I am doing wrong here when accessing the S3 bucket and folders?
get_bucket() expects a bucket name.
get_bucket(bucket_name, validate=True, headers=None)
Try:
mybucket = conn.get_bucket('mybucket')
If it doesn't work, show the full stack trace.
[Update]: There is a bug in the boto library for bucket names that contain dots. Update your boto config:
[s3]
calling_format = boto.s3.connection.OrdinaryCallingFormat
Or
from boto.s3.connection import S3Connection, OrdinaryCallingFormat
conn = S3Connection(access_key, secret_key, calling_format=OrdinaryCallingFormat())
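Putting the two pieces together, a hedged sketch of the full sequence (placeholder credentials; the bucket name from the question is assumed):
from boto.s3.connection import S3Connection, OrdinaryCallingFormat

# The bucket name is passed bare, without the s3:// prefix
conn = S3Connection('accesskey_here', 'secretkey_here',
                    calling_format=OrdinaryCallingFormat())
mybucket = conn.get_bucket('diap.prod.us-east-1.mybucket')
for key in mybucket.list():
    print(key.name)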