Django collectstatic recollects module static files when deployed to Heroku - django

My issue is that every time I deploy my code to Heroku, collectstatic re-copies all the static files that come from installed modules, even if they haven't changed. Because roughly 400 files are copied on every deployment, my free Amazon S3 bucket's copy limit is reached after just a few deploys.
The issue must be with some Heroku setting, because when I run manage.py collectstatic from my IDE it does not re-copy the files, even with the S3 bucket set as the default storage.
I have DISABLE_COLLECTSTATIC=1 in my Heroku config vars, and I also set it via the Heroku CLI to be doubly sure.
I have no idea why it is doing this. There's nothing useful in the deployment logs either. Any help would be appreciated.
This is the log:
Successfully compiled 1 referred SASS/SCSS files.
Debug mode is off.
386 static files copied, 106 unmodified.
The 106 unmodified files are the Bootstrap SASS files plus a few of my own; the 386 copied files are from django admin and django-countries. Despite not changing, they are always copied, and no settings change I have tried fixes it.
Relevant parts of my settings.py:
STATICFILES_FINDERS = [
    # Default finders
    'django.contrib.staticfiles.finders.FileSystemFinder',
    'django.contrib.staticfiles.finders.AppDirectoriesFinder',
    # finders for sass files
    'sass_processor.finders.CssFinder',
]
STATIC_URL = '/static/'
STATIC_ROOT = os.path.join(BASE_DIR, 'staticfiles')
STATICFILES_DIRS = (os.path.join(BASE_DIR, 'static/'),)
MEDIA_URL = '/media/'
MEDIA_ROOT = os.path.join(BASE_DIR, 'media')
# Settings for AWS bucket
AWS_S3_OBJECT_PARAMETERS = {
    'Expires': 'Thu, 31 Dec 2099 20:00:00 GMT',
    'CacheControl': 'max-age=94608000',
}
AWS_STORAGE_BUCKET_NAME = os.environ.get('AWS_STORAGE_BUCKET_NAME')
AWS_S3_REGION_NAME = 'eu-west-3'
AWS_ACCESS_KEY_ID = os.environ.get('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY')
AWS_S3_CUSTOM_DOMAIN = f'{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws.com'
AWS_DEFAULT_ACL = 'public-read'
AWS_S3_OBJECT_PARAMETERS = {'CacheControl': 'max-age=86400'}
STATICFILES_LOCATION = 'static'
STATICFILES_STORAGE = 'custom_storages.StaticStorage'
STATIC_URL = f'https://{AWS_S3_CUSTOM_DOMAIN}/{STATICFILES_LOCATION}/'

So I figured it out after some work.
Despite having used DISABLE_COLLECTSTATIC=1 in other Django projects without any trouble, in this case it turned out to be the problem. Once I accepted that it had to be a Heroku setting, I realised it was the only config var I had that touches static collection at all.
Removing the setting from Heroku resolved the issue, and everything now copies correctly.
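For reference, the config var can be removed either from the Heroku dashboard or with the standard Heroku CLI (assuming the CLI is installed and logged in to the app):
heroku config:unset DISABLE_COLLECTSTATIC
On the next push the Python buildpack then runs python manage.py collectstatic --noinput itself during the build.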

Related

django's collectstatic collects into unexpected directory

I want to upload my static files to Amazon S3 storage, but I can't stop Django from just putting them in a staticfiles directory in the project root. I have boto3 in my requirements.txt and have set
STATICFILES_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
I have successfully set up S3 for media files in the same bucket (under a different directory).
collectstatic also seems to disregard the STATIC_ROOT variable in settings.py when I comment out the S3 settings. When running python3 manage.py collectstatic I expect it to gather the static files into STATIC_ROOT, as described in the docs here https://docs.djangoproject.com/en/2.1/ref/settings/ (under STATIC_ROOT). But even if I give STATIC_ROOT a different value, collectstatic always collects into a directory in the project root called staticfiles.
STATIC_ROOT = os.path.join(BASE_DIR, 'this_is_not_used_by_collectstatic')
The rest of my S3 settings:
# Amazon S3
AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_ACCESS_KEY']
AWS_STORAGE_BUCKET_NAME = 'my_bucket_name'
AWS_S3_REGION_NAME = 'eu-north-1'
AWS_S3_CUSTOM_DOMAIN = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME
AWS_S3_OBJECT_PARAMETERS = {
    'CacheControl': 'max-age=86400',
}
AWS_DEFAULT_ACL = None
AWS_LOCATION = 'static'
STATIC_URL = 'https://%s/%s/' % (AWS_S3_CUSTOM_DOMAIN, AWS_LOCATION)
STATICFILES_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
# media
DEFAULT_FILE_STORAGE = 'korvir.storage_backends.MediaStorage'
If you're using django-heroku and you have the following in your code, as per the instructions:
django_heroku.settings(locals())
... then it will overwrite your STATIC_ROOT setting. If you want to preserve your own STATIC_ROOT setting, then make sure you set STATIC_ROOT after calling django_heroku.settings.
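A minimal sketch of that ordering, assuming the usual django-heroku setup at the bottom of settings.py (the STATIC_ROOT path just mirrors the one from the question):
import os
import django_heroku

BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

# ... the S3 / static settings from above ...

django_heroku.settings(locals())

# Re-assign STATIC_ROOT *after* the call so django-heroku's default doesn't overwrite it
STATIC_ROOT = os.path.join(BASE_DIR, 'this_is_not_used_by_collectstatic')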

Heroku/Django/S3: Static files not serving from S3, even though media is working

I configured my Heroku/Django app to serve media files from S3 using this tutorial a while back. I only cared about media files on S3 at the time, so I didn't originally use a custom storage and stored the files in the root of my S3 bucket, but that part works.
However, now I want to store my static files on S3 as well, because I have a few static videos that are larger than I want to serve from my dyno. But when I configure the static files according to the tutorial, they are still served from the dyno. Here are my settings:
...
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'storages',
    ...
]
...
# Django storages configuration
AWS_STORAGE_BUCKET_NAME = os.environ.get('S3_BUCKET')
AWS_S3_CUSTOM_DOMAIN = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME
AWS_AUTO_CREATE_BUCKET = False
AWS_ACCESS_KEY_ID = os.environ.get('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY')
AWS_S3_ENCRYPTION = True
AWS_S3_OBJECT_PARAMETERS = {
    'CacheControl': 'max-age=86400',
}
AWS_LOCATION = 'static'
# Static file storage
STATICFILES_STORAGE = 'core.custom_storages.StaticStorage'
STATICFILES_LOCATION = 'static'
STATIC_ROOT = os.path.join(BASE_DIR, 'static')
STATIC_URL = 'https://{}/{}/'.format(AWS_S3_CUSTOM_DOMAIN, STATICFILES_LOCATION)
STATICFILES_FINDERS = (
    "django.contrib.staticfiles.finders.FileSystemFinder",
    "django.contrib.staticfiles.finders.AppDirectoriesFinder",
)
# Media file storage
DEFAULT_FILE_STORAGE = 'core.custom_storages.MediaStorage'
MEDIA_URL = '/media/'
MEDIA_ROOT = os.path.join(os.path.dirname(BASE_DIR), 'media')
...
And here's custom_storages.py:
$ cat core/custom_storages.py
# custom_storages.py
from django.conf import settings
from storages.backends.s3boto3 import S3Boto3Storage
class StaticStorage(S3Boto3Storage):
    location = 'static'

class MediaStorage(S3Boto3Storage):
    location = ''
Again, to be clear, media storage is working and serves from the S3 bucket correctly, so it's really puzzling to me that the static files aren't.
EDIT: When I push to Heroku, I'm seeing this in the logs, which indicates to me that I'm not even copying to S3:
remote: -----> $ python manage.py collectstatic --noinput
remote: 2090 static files copied to '/tmp/build_a69320678067fe5b2ce29ed74018de75/core/staticfiles', 2188 post-processed.
I also experienced this problem.
When I ran my site on localhost the images and static content loaded correctly from my S3 bucket; however, it did not work from Heroku.
I had import django_heroku in my settings file - this did not fix the problem by itself.
The solution:
I needed to add the staticfiles=False argument to django_heroku.settings(locals()) in my settings.py file, like below:
django_heroku.settings(locals(), staticfiles=False)
Then run python manage.py collectstatic from the Heroku shell.
See https://stackoverflow.com/a/57049235/11652661 if you are using import django_heroku in your settings.py.
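For context, a rough sketch of where that call usually sits, assuming the S3 static settings are already defined earlier in settings.py:
import django_heroku

# ... AWS_* and STATICFILES_STORAGE settings defined above ...

# Let django-heroku configure the database, logging, etc.,
# but leave the static files settings untouched:
django_heroku.settings(locals(), staticfiles=False)
After deploying, the files can then be collected once from a one-off dyno with heroku run python manage.py collectstatic.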

Django. Alternate between local & remote staticfiles

After collecting my static files and storing them in an Amazon bucket (AWS S3), the project still uses the static files stored online when I run it locally. This is a problem because when I want to change a CSS file, for example, I have to run collectstatic or manually upload the file to Amazon. I tried adding a new settings variable, "LOCAL_STATICFILES", like this:
settings.py
LOCAL_STATICFILES = False

if not LOCAL_STATICFILES:
    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
    AWS_ACCESS_KEY_ID = os.environ['AWSAccessKeyId']
    AWS_SECRET_ACCESS_KEY = os.environ['AWSSecretKey']
    AWS_STORAGE_BUCKET_NAME = os.environ['AWS_STORAGE_BUCKET_NAME']
    S3_URL = 'http://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME
    STATIC_URL = S3_URL
    STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'

if LOCAL_STATICFILES:
    STATIC_URL = '/static/'
    STATIC_ROOT = '/'
But when I set LOCAL_STATICFILES to True and run the development server, Django can't find them.
The project's folders look like this:
project
app
app
static
css
js
img
templates
What am I doing wrong?
First of all, make sure you have a way to tell whether or not you are in an environment that supports the Amazon bucket configuration. Usually this will be your production environment, where the Amazon bucket settings are already configured.
So you would end up with something like:
BASE_DIR = os.path.dirname(os.path.dirname(__file__))
# this is the base project path

if 'AWSAccessKeyId' in os.environ:
    # assume the presence of this key will determine whether
    # we are, or not, in the bucket-supporting environment
    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
    AWS_ACCESS_KEY_ID = os.environ['AWSAccessKeyId']
    AWS_SECRET_ACCESS_KEY = os.environ['AWSSecretKey']
    AWS_STORAGE_BUCKET_NAME = os.environ['AWS_STORAGE_BUCKET_NAME']
    STATIC_URL = 'http://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME
    # the static url will be the remote bucket
    STATIC_ROOT = None
    # STATIC_ROOT doesn't matter since you will not invoke
    # `manage.py collectstatic` from this environment. You
    # can safely leave it as None, or use something like:
    # os.path.join(BASE_DIR, 'static')
else:
    STATIC_URL = '/static/'
    # the static files url will point to your local,
    # development, server
    STATIC_ROOT = os.path.join(BASE_DIR, 'static')
    # here, STATIC_ROOT matters, since you will invoke
    # `manage.py collectstatic` from your local environment.

Django collectstatic not pushing to AWS S3

I can't push my local files to Amazon S3 using Django's collectstatic.
My settings.py, all the settings that may be relevant:
BASE_DIR = os.path.dirname(os.path.dirname(__file__))
INSTALLED_APPS += ('storages',)
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = os.environ['AWSAccessKeyId']
AWS_SECRET_ACCESS_KEY = os.environ['AWSSecretKey']
AWS_STORAGE_BUCKET_NAME = os.environ['AWS_STORAGE_BUCKET_NAME']
S3_URL = 'http://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME
STATIC_URL = S3_URL
STATICFILES_DIRS = (
    os.path.join(BASE_DIR, "static"),
)
STATIC_ROOT = 'staticfiles'
If I run collectstatic with these settings, I get:
0 static files copied, 139 unmodified.
If I change the STATIC_ROOT to "/" (this is how I'd like it to be, given the folder distribution in the Amazon Bucket), I get the error:
OSError: [Errno 13] Permission denied: '/css'
Please help, I've run out of ideas. I've double-checked the credentials, and I've also tried running it from Heroku:
heroku run python manage.py collectstatic
Same errors.
According to the docs, you'll need to add
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
to your settings file.
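That is because DEFAULT_FILE_STORAGE only controls media/FileField uploads; collectstatic writes through STATICFILES_STORAGE, so without it the command only compares against the local STATIC_ROOT (hence "0 static files copied, 139 unmodified"). A minimal sketch of the static-related part, reusing the names from the question's settings:
# The local STATIC_ROOT can stay; once the backend below is set,
# collectstatic uploads to the bucket instead of that directory.
STATIC_ROOT = 'staticfiles'
STATIC_URL = S3_URL  # the bucket URL built above from AWS_STORAGE_BUCKET_NAME
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'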

Configuring django-compressor with remote storage (django-storage - amazon s3)

My scenario
I'm using django-storage to have the files served via Amazon S3.
This means that when I run ./manage.py collectstatic, the files are saved to my bucket at Amazon and not to the local file system.
To compress the files I do: "./manage.py compress"
Which gives this error:
Error: An error occured during rendering: [Errno 2] No such file or directory: u'/home/user/project/static/less/bootstrap.less'
Since the file isn't on my local filesystem.
"Due to the way Django Compressor processes files, it requires the files to be processed (in the {% compress %} block) to be available in a local file system cache."
http://django_compressor.readthedocs.org/en/latest/remote-storages/
Question
How do I make django-compress work with django-storage (amazon s3)?
What I've tried to do so far
Make collectstatic save the files both locally and on S3. Since this is mentioned in the documentation on the django-compressor page, there should be a good way to do it. How?
Configuration
STATIC_URL = 'http://mybucket.s3-website-eu-west-1.amazonaws.com/'
STATIC_ROOT = os.path.join(PROJECT_DIR,"static/")
STATICFILES_FINDERS = (
    'django.contrib.staticfiles.finders.FileSystemFinder',
    'django.contrib.staticfiles.finders.AppDirectoriesFinder',
    'compressor.finders.CompressorFinder',
)
STATICFILES_STORAGE = DEFAULT_FILE_STORAGE = 'index.storage.CachedS3BotoStorage' #defined as it is in the documentation
AWS_ACCESS_KEY_ID = "xxx"
AWS_SECRET_ACCESS_KEY = "xxx"
AWS_STORAGE_BUCKET_NAME = "xxxx"
COMPRESS_URL = STATIC_URL
COMPRESS_OFFLINE = True
COMPRESS_PRECOMPILERS = (
    ('text/less', 'lessc {infile} {outfile}'),
)
I think the only setting you are missing is COMPRESS_ROOT.
I have django-compressor working very nicely with S3. Here's my configuration:
DEFAULT_FILE_STORAGE = 'g2k_utils.s3storage.S3BotoStorage'
STATICFILES_STORAGE = DEFAULT_FILE_STORAGE
COMPRESS_ROOT = '/home/user/website/static/' # Where my SCSS, JS files are stored
COMPRESS_STORAGE = DEFAULT_FILE_STORAGE
COMPRESS_OFFLINE = True
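For the "save locally and on S3" part of the question: index.storage.CachedS3BotoStorage is typically a small storage subclass that writes each collected file to a local compressor cache as well as to the bucket. A rough sketch, adapted from the approach described on the django-compressor remote-storages page linked in the question (the module path and the old s3boto backend follow the question's setup and may need adapting):
# index/storage.py -- sketch only, not a drop-in implementation
from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage

class CachedS3BotoStorage(S3BotoStorage):
    """S3 storage backend that also keeps a local copy for django-compressor."""
    def __init__(self, *args, **kwargs):
        super(CachedS3BotoStorage, self).__init__(*args, **kwargs)
        # local storage that {% compress %} reads from during offline compression
        self.local_storage = get_storage_class(
            "compressor.storage.CompressorFileStorage")()

    def save(self, name, content):
        # write the file locally first, then hand the saved copy to S3
        self.local_storage._save(name, content)
        super(CachedS3BotoStorage, self).save(name, self.local_storage._open(name))
        return name
Pointing STATICFILES_STORAGE (and COMPRESS_STORAGE) at this class, as in the configurations above, is what makes collectstatic populate both locations.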