Upload files to s3 bucket using python gives Access Denied - amazon-web-services

I created a Python script that should upload a file from my EC2 instance to an S3 bucket:
import boto3
s3 = boto3.resource('s3')
data = open('backupFile.txt', 'rb')
s3.Bucket('mlsd').put_object(Key='backupFile.txt', Body=data)
I went to AWS account details and got the credentials.
I executed aws configure to set credentials on my EC2.
Here is the output of the credentials using aws configure list:
I went to .aws/credentials and pasted the access_key_id, secret_access_key, and token.
I ensured that the token is not expired.
When I ran the script, I got the following output:
Not sure what the problem is.

Boto3 looks for your credentials in a number of possible locations, as described here, so it should find your access_key_id and secret_access_key.
Make sure the user whose access_key_id you use has access to the S3 bucket.
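If you are not sure which identity boto3 is actually resolving from your credentials file, a quick sanity check (not part of the original answer; it only relies on the default credential chain) is to ask STS who you are, then confirm that principal's IAM policy allows s3:PutObject on the bucket:
import boto3

# Print the ARN of the user/role behind the resolved credentials;
# this is the principal that needs s3:PutObject on the target bucket.
sts = boto3.client('sts')
print(sts.get_caller_identity()['Arn'])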
I tried this code example and it works:
import logging
import boto3
from botocore.exceptions import ClientError

def upload_file(file_name, bucket, object_name=None):
    """Upload a file to an S3 bucket

    :param file_name: File to upload
    :param bucket: Bucket to upload to
    :param object_name: S3 object name. If not specified then file_name is used
    :return: True if file was uploaded, else False
    """
    # If S3 object_name was not specified, use file_name
    if object_name is None:
        object_name = file_name

    # Upload the file
    s3_client = boto3.client('s3')
    try:
        response = s3_client.upload_file(file_name, bucket, object_name)
    except ClientError as e:
        logging.error(e)
        return False
    return True
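As a quick check against the original question, the function can be exercised with the same names used there (the mlsd bucket and backupFile.txt file):
if upload_file('backupFile.txt', 'mlsd'):
    print('upload succeeded')
else:
    print('upload failed; see the logged ClientError for the reason (often AccessDenied)')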

Related

How to create Pre-signed URL for specific version of file in AWS S3

How to create a Pre-signed URL for the specific version of a file in AWS S3?
If the bucket has versioning enabled and a file has more than one version, you may want to create a pre-signed URL for a specific version of that file.
You just need to pass the version_id along with the key to create a pre-signed URL for the specific version of the file.
Python Example:
import logging
import os
import boto3
from botocore.exceptions import ClientError

def get_pre_signed_url(bucket, file_name):
    try:
        s3_client = boto3.client('s3',
                                 aws_access_key_id=os.environ.get("aws_access_key_id"),
                                 aws_secret_access_key=os.environ.get("aws_secret_access_key"),
                                 region_name=os.environ.get("region_name"))
        response = s3_client.generate_presigned_post(Bucket=bucket,
                                                     Key=os.environ.get('folder_location') + file_name,
                                                     ExpiresIn=300)
    except ClientError as e:
        logging.error(e)
        return None
    return response
Filename is {fileName}?versionId={versionId}
Check this repo for more information.
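If you would rather pre-sign a GET with generate_presigned_url, a minimal sketch looks like the following; the bucket, key, and version id are placeholders, and the VersionId is simply forwarded to the underlying GetObject call:
import boto3

s3_client = boto3.client('s3')

# Pre-sign a GET for one specific version of the object
url = s3_client.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'my-bucket', 'Key': 'path/to/file.txt', 'VersionId': 'example-version-id'},
    ExpiresIn=300,
)
print(url)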

How to upload a list of files from the web directly to s3 bucket

I have a file with URLs in my S3 bucket. I would like to use a Python Lambda function to upload the files at those URLs to the S3 bucket.
For example, the file I uploaded to S3 contains:
http://...
http://...
Each line corresponds to a file to be uploaded into S3.
Here is the code:
import json
import urllib.parse
import boto3
import requests
import os
from gzip import GzipFile
from io import TextIOWrapper

print('Loading functions')

s3 = boto3.client('s3')

def get_file_seqs(response):
    try:
        size = response['ResponseMetadata']['HTTPHeaders']['content-length']
        print("[+] Size retrieved")
        return size
    except:
        print("[-] Size can not be retrieved")

def lambda_handler(event, context):
    # Defining bucket objects
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')

    # get file from s3
    print('[+] Getting file from S3 bucket')
    response = s3.get_object(Bucket=bucket, Key=key)

    try:
        # checking file size
        print('[+] Checking file size')
        file_size = get_file_seqs(response)
        if file_size == 0:
            print('File size is equal to 0')
            return False
        else:
            # create new directories
            print('[+] Creating new directories')
            bucket_name = "triggersnextflow"
            directories = ['backups/sample/', 'backups/control/']
            # loop to create new dirs
            for dirs in directories:
                s3.put_object(Bucket=bucket_name, Key=dirs, Body='')
            # NOW I WOULD LIKE TO DOWNLOAD THE FILES FROM THE URLS INSIDE S3 OBJECT
            # return true
            return True
    except Exception as e:
        print(e)
        print('Error getting object {} from bucket {}. Make sure they exist and your bucket is in the same region as this function.'.format(key, bucket))
        raise e
Download an S3 object to a file:
import boto3
s3 = boto3.resource('s3')
s3.meta.client.download_file('mybucket', 'hello.txt', '/tmp/hello.txt')
You will find a great resource of information here:
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Client.download_file
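The part the question is still missing (downloading each URL listed in the file and putting the result back into S3) is not covered by the answer above. A rough sketch, assuming the list object is plain text with one URL per line and using a made-up key prefix for the uploads:
import boto3
import requests

s3 = boto3.client('s3')

def upload_urls(bucket, list_key, prefix='downloads/'):
    # Read the list file from S3 and keep the non-empty lines as URLs
    body = s3.get_object(Bucket=bucket, Key=list_key)['Body'].read().decode('utf-8')
    for url in (line.strip() for line in body.splitlines()):
        if not url:
            continue
        file_name = url.rstrip('/').split('/')[-1] or 'index.html'
        # Stream the HTTP response straight into S3 so large files are not buffered in memory
        with requests.get(url, stream=True) as resp:
            resp.raise_for_status()
            resp.raw.decode_content = True
            s3.upload_fileobj(resp.raw, bucket, prefix + file_name)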

S3 boto3 corrupts file

I have the following function:
def upload_file(file_name, bucket, object_name=None):
    """Upload a file to an S3 bucket -> from aws docs

    :param file_name: File to upload
    :param bucket: Bucket to upload to
    :param object_name: S3 object name. If not specified then file_name is used
    :return: True if file was uploaded, else False
    """
    # If S3 object_name was not specified, use file_name
    if object_name is None:
        object_name = file_name

    # Upload the file
    s3_client = boto3.client('s3')
    try:
        response = s3_client.upload_file(file_name, bucket, object_name)
    except ClientError as e:
        logging.error(e)
        return False
    return True
I am trying to upload an HTML file to an S3 bucket acting as a web server. When I manually upload the HTML file to S3, it works as expected and displays the page when I navigate to the S3 bucket's URL.
If I programmatically upload the file using the above function, the HTML file is no longer hosted, and my browser attempts to download an XZ file.
Am I missing a parameter or something?
Courtesy of #jarmod, I learned I was setting an incorrect content-type.
Here is the updated function to upload an HTML file as text/html.
def upload_file(file_name, bucket, object_name=None):
    """Upload a file to an S3 bucket -> from aws docs

    :param file_name: File to upload
    :param bucket: Bucket to upload to
    :param object_name: S3 object name. If not specified then file_name is used
    :return: True if file was uploaded, else False
    """
    # If S3 object_name was not specified, use file_name
    if object_name is None:
        object_name = file_name

    # Upload the file
    s3_client = boto3.client('s3')
    try:
        response = s3_client.upload_file(file_name, bucket, object_name, ExtraArgs={'ContentType': "text/html"})
    except ClientError as e:
        logging.error(e)
        return False
    return True
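If the bucket serves more than just HTML, one way to avoid hard-coding the content type is to guess it from the file name. This is a sketch building on the answer above, not part of it; the fallback type mirrors S3's own default:
import logging
import mimetypes
import boto3
from botocore.exceptions import ClientError

def upload_file_with_type(file_name, bucket, object_name=None):
    # Guess the MIME type from the file extension; fall back to S3's default binary type
    content_type = mimetypes.guess_type(file_name)[0] or 'binary/octet-stream'
    s3_client = boto3.client('s3')
    try:
        s3_client.upload_file(file_name, bucket, object_name or file_name,
                              ExtraArgs={'ContentType': content_type})
    except ClientError as e:
        logging.error(e)
        return False
    return True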

How to save files to ec2 instance through flask like s3?

I have developed a web app in which files need to be uploaded from a local PC to an AWS EC2 instance via a Flask call, and a machine learning model is then run on them in the back end. But I could not find any related resources on how to do that.
Can we upload to AWS S3 instead and link the EC2 instance's EBS and S3?
Any help would be appreciated!
Use boto3 to upload files to S3.
In Flask, create an endpoint that will take the uploaded file and push it to S3.
import logging
import boto3
from botocore.exceptions import ClientError

def upload_file(file_name, bucket, object_name=None):
    """Upload a file to an S3 bucket

    :param file_name: File to upload
    :param bucket: Bucket to upload to
    :param object_name: S3 object name. If not specified then file_name is used
    :return: True if file was uploaded, else False
    """
    # If S3 object_name was not specified, use file_name
    if object_name is None:
        object_name = file_name

    # Upload the file
    s3_client = boto3.client('s3')
    try:
        response = s3_client.upload_file(file_name, bucket, object_name)
    except ClientError as e:
        logging.error(e)
        return False
    return True
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-uploading-files.html
Read this: https://www.javatpoint.com/flask-file-uploading
Upload the file to a tmp directory and then upload it to S3.
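A minimal sketch of such a Flask endpoint, assuming a placeholder bucket name and a form field called 'file'; it streams the upload straight to S3 with upload_fileobj instead of saving it to a tmp directory first:
import boto3
from flask import Flask, request
from werkzeug.utils import secure_filename

app = Flask(__name__)
s3_client = boto3.client('s3')
BUCKET = 'my-upload-bucket'  # placeholder bucket name

@app.route('/upload', methods=['POST'])
def upload():
    # The client sends the file as multipart/form-data under the field name 'file'
    uploaded = request.files['file']
    key = secure_filename(uploaded.filename)
    # Hand the file-like object to boto3 directly; no local copy is written
    s3_client.upload_fileobj(uploaded, BUCKET, key)
    return {'uploaded': key}, 201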

How to configure authorization mechanism inline with boto3

I am using boto3 in AWS Lambda to fetch an object in S3 located in the Frankfurt region.
Signature version 4 is necessary, otherwise the following error is returned:
"errorMessage": "An error occurred (InvalidRequest) when calling
the GetObject operation: The authorization mechanism you have
provided is not supported. Please use AWS4-HMAC-SHA256."
I realized there are ways to configure signature_version: http://boto3.readthedocs.org/en/latest/guide/configuration.html
But since I am using AWS Lambda, I do not have access to the underlying configuration profiles.
The code of my AWS Lambda function:
from __future__ import print_function

import boto3

def lambda_handler(event, context):
    input_file_bucket = event["Records"][0]["s3"]["bucket"]["name"]
    input_file_key = event["Records"][0]["s3"]["object"]["key"]
    input_file_name = input_file_bucket + "/" + input_file_key

    s3 = boto3.resource("s3")
    obj = s3.Object(bucket_name=input_file_bucket, key=input_file_key)
    response = obj.get()
    return event  # echo first key values
Is it possible to configure signature_version within this code, for example by using a Session? Or is there any other workaround?
Instead of using the default session, try using a custom session and Config from boto3.session:
import boto3
import boto3.session
session = boto3.session.Session(region_name='eu-central-1')
s3client = session.client('s3', config= boto3.session.Config(signature_version='s3v4'))
s3client.get_object(Bucket='<Bkt-Name>', Key='S3-Object-Key')
I tried the session approach, but I had issues. This method worked better for me; your mileage may vary:
s3 = boto3.resource('s3', config=Config(signature_version='s3v4'))
You will need to import Config from botocore.client in order to make this work. See below for a functional method to test a bucket (list objects). This assumes you are running it from an environment where your authentication is managed, such as Amazon EC2 or Lambda with an IAM role:
import boto3
from botocore.client import Config
from botocore.exceptions import ClientError

def test_bucket(bucket):
    print('testing bucket: ' + bucket)
    try:
        s3 = boto3.resource('s3', config=Config(signature_version='s3v4'))
        b = s3.Bucket(bucket)
        objects = b.objects.all()
        for obj in objects:
            print(obj.key)
        print('bucket test SUCCESS')
    except ClientError as e:
        print('Client Error')
        print(e)
        print('bucket test FAIL')
To test it, simply call the method with a bucket name. Your role will have to grant proper permissions.
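For example (the bucket name here is just a placeholder):
test_bucket('my-example-bucket')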
Using a resource worked for me.
from botocore.client import Config
import boto3
s3 = boto3.resource("s3", config=Config(signature_version="s3v4"))
return s3.meta.client.generate_presigned_url(
    "get_object", Params={"Bucket": AIRFLOW_BUCKET, "Key": key}, ExpiresIn=expTime
)