How to make AWS credentials accessible to boto in a Django environment?

I'm unable to create an S3 connection in my Django environment via the django-s3-storage backend (I'm getting a 403 response from S3).
Boto doesn't seem to be able to pick up the settings, and I suspect this is the cause (the traceback isn't helping much). As a diagnosis, in manage.py shell:
import boto
boto.connect_s3()
>>> boto.exception.NoAuthHandlerFound: No handler was ready to authenticate. 1 handlers were checked. ['HmacAuthV1Handler'] Check your credentials
from django.conf import settings
boto.connect_s3(settings.AWS_ACCESS_KEY_ID, settings.AWS_SECRET_ACCESS_KEY)
>>> S3Connection:s3.amazonaws.com
The docs (and other posts) indicate that these settings should work:
MEDIAFILES_LOCATION = 'media'
AWS_S3_CUSTOM_DOMAIN = 'my-bucket.s3-website-eu-west-1.amazonaws.com'
AWS_S3_HOST = 's3-website-eu-west-1.amazonaws.com'
MEDIA_URL = "https://%s/%s/" % (AWS_S3_CUSTOM_DOMAIN, MEDIAFILES_LOCATION)
DEFAULT_FILE_STORAGE = 'django_s3_storage.storage.StaticS3Storage'
#S3 settings from https://github.com/etianen/django-s3-storage
AWS_ACCESS_KEY_ID = "xxx"
AWS_SECRET_ACCESS_KEY = "yyy"
AWS_S3_BUCKET_NAME = "my-bucket"
AWS_S3_CALLING_FORMAT = "boto.s3.connection.OrdinaryCallingFormat"
# Make user uploaded files public
AWS_S3_BUCKET_AUTH = False
AWS_S3_MAX_AGE_SECONDS = 60*60*24*365 # 1 year
AWS_S3_GZIP = True
Why isn't boto able to connect?

The django-s3-storage backend uses its own S3Storage class. That class knows about Django's settings.py and will use settings.AWS_ACCESS_KEY_ID and settings.AWS_SECRET_ACCESS_KEY if they're configured.
Boto by itself has no knowledge of the Django settings file, so it doesn't use anything you configure there. That's why you have to pass the key and secret to boto explicitly when establishing an S3 connection yourself.
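Outside of the storage backend, you can also make plain boto work by exporting the standard environment variables, which boto's auth handlers check automatically. A minimal sketch (the key values are placeholders):

```python
import os

# Placeholder values: boto's HmacAuthV1Handler picks these environment
# variables up automatically when no explicit credentials are passed.
os.environ["AWS_ACCESS_KEY_ID"] = "xxx"
os.environ["AWS_SECRET_ACCESS_KEY"] = "yyy"

# After this, a bare connect_s3() no longer raises NoAuthHandlerFound:
# import boto
# conn = boto.connect_s3()
```

This is the same credential chain the NoAuthHandlerFound error above refers to: it checks explicit arguments first, then the environment.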

django storages AWS S3 SigVer4: SignatureDoesNotMatch

My configuration (very basic):
settings.py
AWS_S3_REGION_NAME = 'eu-west-3'
AWS_S3_FILE_OVERWRITE = False
# S3_USE_SIGV4 = True # if used, nothing changes
# AWS_S3_SIGNATURE_VERSION = "s3v4" # if used, nothing changes
AWS_ACCESS_KEY_ID = "xxx"
AWS_SECRET_ACCESS_KEY = "xxx"
AWS_STORAGE_BUCKET_NAME = 'xxx'
# AWS_S3_CUSTOM_DOMAIN = f'{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws.com' # if used, no pre-signed urls
AWS_DEFAULT_ACL = 'private'
AWS_S3_OBJECT_PARAMETERS = {'CacheControl': 'max-age=86400'}
AWS_LOCATION = 'xxx'
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
INSTALLED_APPS = [
    ...,
    'storages',
]
models.py
class ProcessStep(models.Model):
    icon = models.FileField(upload_to="photos/process_icons/")
What I get:
Pre-signed url is generated (both in icon.url and automatically on admin page)
Pre-signed url response status code = 403 (Forbidden)
If opened, a SignatureDoesNotMatch error appears, with the text: The request signature we calculated does not match the signature you provided. Check your key and signing method.
Tried:
changing access keys (both root and IAM)
changing bucket region
creating separate storage object for icon field (same error SignatureDoesNotMatch)
changing django-storages package version (currently using the latest 1.11.1)
Opinion:
boto3's client generate_presigned_url returns a URL with an invalid signature
Questions:
What should I do?
Why do I get the error?
Patience is a virtue!
One might need to wait up to a day for everything to start working.

How to sync the upload progress bar with upload on s3 bucket using Django Rest Framework

I am working on a REST API (using Django Rest Framework). I am trying to upload a video by sending a post request to the endpoint I made.
Issue
The video does upload to the S3 bucket, but the upload progress shows 100% within a couple of seconds, however large the file I upload is.
Why is this happening, and how can I solve it?
PS: Previously I was uploading to local storage, and the upload progress worked fine.
I am using React.
First of all, make sure you've installed these libraries: boto3==1.14.53, botocore==1.17.53, s3transfer==0.3.3, django-storages==1.10
settings.py :
INSTALLED_APPS = [
    'storages',
]
AWS_ACCESS_KEY_ID = 'your-key-id'
AWS_SECRET_ACCESS_KEY = 'your-secret-key'
AWS_STORAGE_BUCKET_NAME = 'your-bucket-name'
AWS_S3_CUSTOM_DOMAIN = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME
AWS_S3_OBJECT_PARAMETERS = {
    'CacheControl': 'max-age=86400',
}
DEFAULT_FILE_STORAGE = 'your_project-name.storage_backends.MediaStorage'
MEDIA_URL = "https://%s/" % AWS_S3_CUSTOM_DOMAIN
#File upload setting
BASE_URL = 'http://example.com'
FILE_UPLOAD_PERMISSIONS = 0o640
DATA_UPLOAD_MAX_MEMORY_SIZE = 500024288000
Then create a storage_backends.py file inside your project folder, next to settings.py.
storage_backends.py:
import os
from tempfile import SpooledTemporaryFile
from storages.backends.s3boto3 import S3Boto3Storage
class MediaStorage(S3Boto3Storage):
    bucket_name = 'your-bucket-name'
    file_overwrite = False

    def _save(self, name, content):
        """
        We create a clone of the content file, because when it is passed
        to boto3 it wrongly closes the file upon upload, whereas the
        storage backend expects it to still be open.
        """
        # Seek our content back to the start
        content.seek(0, os.SEEK_SET)
        # Create a temporary file that will write to disk after a specified
        # size. This file will be automatically deleted when closed by
        # boto3 or after exiting the `with` statement if boto3 is fixed.
        with SpooledTemporaryFile() as content_autoclose:
            # Write our original content into our copy that will be closed by boto3
            content_autoclose.write(content.read())
            # Upload the object, which will auto-close the content_autoclose instance
            return super(MediaStorage, self)._save(name, content_autoclose)
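The snippet above fixes the server-side upload itself; to actually observe upload progress on the server, boto3's transfer methods accept a Callback hook that is invoked as bytes go out. A sketch (the client s3, bucket name, key, and total_size are assumptions, not part of the original answer):

```python
import threading

class ProgressTracker:
    """Callable suitable for boto3's Callback parameter; boto3 invokes it
    with the number of bytes transferred since the previous call."""
    def __init__(self, total_size):
        self.total_size = total_size
        self.seen = 0
        self._lock = threading.Lock()  # boto3 may call from worker threads

    def __call__(self, bytes_transferred):
        with self._lock:
            self.seen += bytes_transferred
            pct = 100.0 * self.seen / self.total_size
            print("%.1f%% uploaded" % pct)

# Usage sketch (s3 = boto3.client("s3"); fileobj open in binary mode):
# s3.upload_fileobj(fileobj, "your-bucket-name", "videos/clip.mp4",
#                   Callback=ProgressTracker(total_size))
```

Note this reports server-to-S3 progress; the browser-side bar the question asks about tracks the client-to-server leg, which completes as soon as Django has received the request body.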

Why can't I get images from my S3 bucket when all public access blocked? 403 Forbidden But static files load fine

My Django website allows users to upload photos. When the S3 bucket is public, the media content loads fine on the site. However, once I block all public access, the content no longer loads and a 403 Forbidden error is shown. In my code I have added the values needed to allow for authenticated requests. There are no bucket policies or CORS configurations. I have tried several suggestions from blogs and tutorials, but nothing seems to work.
I have a user with programmatic access and set the secret variables in the Heroku environment, and I still get a 403 error.
I have tried hard-coding the secret variables, with no success.
settings.py
AWS_ACCESS_KEY_ID = os.environ.get('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY')
AWS_STORAGE_BUCKET_NAME = os.environ.get('AWS_STORAGE_BUCKET_NAME')
AWS_S3_CUSTOM_DOMAIN = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME
AWS_S3_OBJECT_PARAMETERS = {
    'CacheControl': 'max-age=86400',
}
MEDIA_LOCATION = 'MediaStorage'
MEDIA_URL = "https://%s/%s/" % (AWS_S3_CUSTOM_DOMAIN, MEDIA_LOCATION)
MEDIA_FILE_STORAGE = 'MediaStorage'
STATIC_ROOT = os.path.join(BASE_DIR, "static")
STATIC_LOCATION = 'static'
STATICFILES_LOCATION = 'StaticStorage'
STATIC_URL = "https://%s/%s/" % (AWS_S3_CUSTOM_DOMAIN, STATIC_LOCATION)
django_heroku.settings(locals())
storage_backends.py
from storages.backends.s3boto3 import S3Boto3Storage
from django.conf import settings
class StaticStorage(S3Boto3Storage):
    location = settings.STATICFILES_LOCATION

class MediaStorage(S3Boto3Storage):
    location = settings.MEDIA_FILE_STORAGE
    file_overwrite = False
    custom_domain = False
All static files load fine but media files do not load at all.
I expect the files to load when displayed on the web page; however, I just get a 403 Forbidden error.
Have you set this in your settings.py?
AWS_DEFAULT_ACL = 'public-read'
Take a look at this example, a public project for my state in Brazil:
Project
So after many iterations, it came down to a single missing line in the storage_backends.py file:
custom_domain = False
I have updated the original post to show the corrected storage_backends.py file.
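For context, a toy sketch of why that line matters (a hypothetical helper, not django-storages code): with a custom domain set, the backend emits a plain URL with no auth query string, which a bucket with public access blocked rejects with 403; with custom_domain = False, it falls back to a presigned URL on the default endpoint.

```python
# Hypothetical illustration of the two URL styles; the real URL
# generation is done by django-storages / boto3.
def object_url(bucket, key, custom_domain=None, signed_query=None):
    if custom_domain:
        # Plain URL, no signature: a private bucket answers 403 Forbidden.
        return "https://%s/%s" % (custom_domain, key)
    # Default endpoint plus a presigned query string: request is allowed.
    return "https://%s.s3.amazonaws.com/%s?%s" % (bucket, key, signed_query)

print(object_url("my-bucket", "photos/a.jpg",
                 custom_domain="my-bucket.s3.amazonaws.com"))
print(object_url("my-bucket", "photos/a.jpg",
                 signed_query="X-Amz-Signature=..."))
```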

ClientError: The AWS Access Key Id you provided does not exist in our records

I am running a Django website on an Ubuntu EC2 instance, with S3 hosting my static and media files. When I try to upload an image file to a model, I receive the following:
"An error occurred (InvalidAccessKeyId) when calling the PutObject operation: The AWS Access Key Id you provided does not exist in our records."
I have an IAM user with AmazonS3FullAccess, and I copied and pasted the correct access key into my settings.py file. I am able to run collectstatic and access the buckets through the terminal, and my static files are even served correctly on my website, but no user can upload images or files. I've run and rerun aws configure, but still no luck.
Here is the relevant code from my utils.py and settings.py:
utils.py
from storages.backends.s3boto3 import S3Boto3Storage
class StaticStorage(S3Boto3Storage):
    location = 'static'

class MediaStorage(S3Boto3Storage):
    location = 'media'
settings.py
AWS_ACCESS_KEY_ID = 'access_key'
AWS_SECRET_ACCESS_KEY = 'secret_key'
S3_REGION_NAME ='aws_region'
DEFAULT_FILE_STORAGE = 'my-project.utils.MediaStorage'
AWS_STORAGE_BUCKET_NAME = 'my-project-bucket'
STATICFILES_STORAGE = 'my-project.utils.StaticStorage'
AWS_S3_CUSTOM_DOMAIN = '//%s.s3-aws_region.amazonaws.com' % AWS_STORAGE_BUCKET_NAME
MEDIA_URL = AWS_S3_CUSTOM_DOMAIN + 'media/'
STATIC_URL = AWS_S3_CUSTOM_DOMAIN + 'static/'
MEDIA_ROOT = MEDIA_URL
ADMIN_MEDIA_PREFIX = STATIC_URL + 'admin/'
My ~/.aws/config and credentials files also contain the same copied-and-pasted ID and keys.
Any and all help would be greatly appreciated

Media uploads django mezzanine giving HTTP Error 500

I'm trying to set up Django and the CMS package Mezzanine to use Amazon S3 storage via django-storages and django-s3-folder-storage. All works well in the sense that I can run collectstatic without issue and upload my files to Amazon, and images are served correctly, as are CSS/JS.
However, if a user tries to upload an image to the media library or as a featured image for a blog post, I simply get HTTP Error in the browser and POST /admin/media-library/upload_file/ HTTP/1.1" 500 146580 in the console. This is with DEBUG=True set, so I'm surprised there isn't more of a trace/feedback.
I'm not sure how to fix this or even how to begin debugging why it occurs. Can anyone help?
My relevant settings.py are:
AWS_STORAGE_BUCKET_NAME = 'my_bucket'
AWS_SECRET_ACCESS_KEY = 'my_key'
AWS_ACCESS_KEY_ID = 'my_id'
AWS_PRELOAD_METADATA = True
AWS_QUERYSTRING_AUTH = False
AWS_S3_SECURE_URLS = False
AWS_S3_ENCRYPTION = False
from boto.s3.connection import ProtocolIndependentOrdinaryCallingFormat
AWS_S3_CALLING_FORMAT = ProtocolIndependentOrdinaryCallingFormat()
DEFAULT_FILE_STORAGE = 's3_folder_storage.s3.DefaultStorage'
DEFAULT_S3_PATH = "media"
MEDIA_ROOT = ''
MEDIA_URL = ''
STATICFILES_STORAGE = 's3_folder_storage.s3.StaticStorage'
STATIC_S3_PATH = "static"
STATIC_ROOT = "/%s/" % STATIC_S3_PATH
STATIC_URL = '//s3.amazonaws.com/%s/static/' % AWS_STORAGE_BUCKET_NAME
ADMIN_MEDIA_PREFIX = STATIC_URL + 'admin/'
I had the same problem. For me, it turned out that in my setup I needed to change the permissions on this directory (and all its sub-directories) so it is owned by the user specified in my nginx.conf:
/var/lib/nginx
To correct this I did:
sudo chown -R <myuser>:<myuser> /var/lib/nginx
where myuser is the nginx user specified in nginx.conf.
I think the images you upload are placed in a temporary folder there before reaching Amazon S3, and that folder didn't have the correct permissions.
What got me on the right track was changing debug to DEBUG = True; then, when I got the 500 error, I saw this in my error logs:
[crit] 26489#0: *374 open() "/var/lib/nginx/tmp/client_body/0000000050" failed (13: Permission denied)
Also have a look at this:
nginx 500 error, permission denied for tmp folder
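On the Django side there is a related knob worth knowing about (a settings sketch, not part of the original fix): Django itself streams uploads larger than FILE_UPLOAD_MAX_MEMORY_SIZE to a temporary directory, and pointing that at a directory the app user can write to avoids a similar permission failure at the application layer.

```python
import tempfile

# settings.py sketch. Uploads above the memory threshold are written to
# FILE_UPLOAD_TEMP_DIR before your storage backend sees them.
FILE_UPLOAD_MAX_MEMORY_SIZE = 2621440          # 2.5 MB, Django's default
FILE_UPLOAD_TEMP_DIR = tempfile.gettempdir()   # e.g. /tmp
```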