Force django_compressor to recompile css/less files - django

We let users upload their own custom css/less to our django app.
All our css/less files are compressed by django_compressor "on-the-fly".
When the app is initially deployed, all css files are collected into the collectstatic directory.
When users upload custom css styles, they replace one of the less files in that directory.
The problem is that the changes only become visible once Apache is reloaded, because only then does django-compressor generate a new css file.
Is there a way to force django-compressor to regenerate its compiled and cached files? I would not feel comfortable triggering a sudo service apache2 reload from within the Django application.

I can come up with two possible solutions, though I don't particularly like either of them.
You can call compress (doc) from within incron or cron:
python manage.py compress
Or you can set a very low COMPRESS_REBUILD_TIMEOUT. (doc)
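For reference, a minimal sketch of both options (the timeout value and the paths are placeholders; COMPRESS_REBUILD_TIMEOUT is given in seconds):
# settings.py -- treat compressed files as stale after 60 seconds
COMPRESS_REBUILD_TIMEOUT = 60

# crontab -- or recompress every minute (adjust the paths to your project)
* * * * * /path/to/venv/bin/python /path/to/project/manage.py compress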
BTW, you have the user styles as a separate bundle, right?

I used a different approach. I am now using offline compression, which is faster and better for multi-server deployments anyway.
I give the user an interface to change certain css and less values. I save those values in a database table, so that the user can edit them easily.
To make the new css/less values available to the frontend (the compiled css files), I write the values the user entered to a less file on disk and re-run the python manage.py compress command.
This way the compiled compressor files are regenerated, and if the user entered invalid less code that would lead to compile errors, the compressor aborts and the old css files are kept.
Here's my save() method:
import os

from django.conf import settings
from django.core import management
from django.template.loader import render_to_string

def save(self, *args, **kwargs):
    # write the less file to the file system
    # every time the model is saved
    try:
        file_location = os.path.join(settings.STATIC_ROOT, 'styles', 'less', 'custom_styles.less')
        with open(file_location, 'w') as f:
            f.write(render_to_string('custom_styles/custom_stylesheet_tmpl.txt',
                                     {'platform_customizations': self}))
    except IOError:
        # TODO show error message to user in admin backend and via messaging system
        raise
    # re-run the offline compress command in prod mode
    management.call_command('compress')
    super(PlatformCustomizations, self).save(*args, **kwargs)

Related

Unit Test: How To Mock MEDIA_ROOT But Still Access A File That Is Stored In My Normal MEDIA Folder

I'm testing one of my webpage's POST functions. If request.method == 'POST', it generates a pdf and attaches it to the user. Every time I ran tests I was generating pdf files, which built up in my MEDIA folder and had to be deleted manually. I decided to look for ways to prevent this from happening, and the best way seems to be to override my MEDIA_ROOT with a temporary directory.
I am getting the following error message
FileNotFoundError: [Errno 2] No such file or directory: 'C:\\Users\\Acer\\AppData\\Local\\Temp\\profile_pics\\default.jpg'
I am sure this is because I must create and log in a user prior to running the test (I don't want to do this in setUp, as I must generate different types of users for different tests). Every time a user is created, their profile picture is set to profile_pics\\default.jpg.
My understanding is that the mocked MEDIA_ROOT is empty and doesn't contain the default profile pic, so the error is thrown.
My question is how can I circumvent this (ideally by mocking the default.jpg image)? I found other solutions besides mocking the MEDIA_ROOT folder, but they seem quite hacky, and I'd prefer to do it properly and hopefully learn from this.
tests.py
from django.test import override_settings
import tempfile

@override_settings(MEDIA_ROOT=tempfile.gettempdir())
def test_redirects_after_POST(self):
    user = User.objects.create_superuser('username')
    self.client.force_login(user)
    response = self.client.post(
        reverse('final_question'),
        data={'answer': 'this has me totally stumped'}
    )
    self.assertRedirects(response, reverse('page_2'))
Thank you.
The problem:
PDF files are building up in your MEDIA_ROOT folder because there is nothing automatically deleting them.
Recommended solution:
I actually ran into this problem a few months ago myself, and the solution I found was the django-cleanup tool. It's very easy to install and integrate into your app (simply add it to your INSTALLED_APPS in settings.py).
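For reference, the integration really is a one-line settings change (per the django-cleanup docs, the app should be placed at the end of INSTALLED_APPS):
# settings.py
INSTALLED_APPS = [
    # ... your other apps ...
    'django_cleanup.apps.CleanupConfig',  # placed last, per its docs
]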
Note: This is assuming that your PDF files are in a FileField of one of your models. If that's not the case, then this solution won't work for you.

How can I upload and download files with graphene-django?

I'm currently using graphene-django v2.0 and I have absolutely no clue how I can upload and download files like images. Does anyone have an example of a query where you can download an image, and a mutation where you can upload one?
UPLOADS
You don't need to invent your own frontend code to add a file upload to a mutation -- there are existing packages that do this already. For example, apollo-upload-client if you are using Apollo.
To receive an uploaded file on the backend, the files are going to be available in the dictionary request.FILES. So any mutation handling a file upload needs to iterate over info.context.FILES.items() to get and save the file data. The specifics of this code are going to depend on the ultimate destination of the saved file.
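As a rough illustration, a mutation might pull the files out of info.context.FILES like this (a minimal sketch, assuming graphene-django v2; UploadedDocument and its file field are hypothetical placeholders):
import graphene

class UploadFiles(graphene.Mutation):
    ok = graphene.Boolean()

    def mutate(self, info):
        # info.context is the Django request, so uploaded files live in FILES
        for field_name, uploaded_file in info.context.FILES.items():
            # hypothetical model with a FileField; save the upload into it
            doc = UploadedDocument(name=field_name)
            doc.file.save(uploaded_file.name, uploaded_file)
        return UploadFiles(ok=True)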
(UPDATE) However, if possible I would recommend not using graphene-django to upload files because it adds a large amount of complexity on both the backend and the frontend. My team ultimately scrapped our working graphene-django file upload code and replaced it with a standard Django file upload.
DOWNLOADS
For downloads, I would recommend not using graphQL for the actual download. Instead, create a Django function view that returns an HttpResponse or FileResponse and sets the Content-Disposition header. Something like
from django.http import HttpResponse

def download(request):
    # ... do stuff to get file contents, file name and mime_type
    response = HttpResponse(file_contents, content_type=mime_type)
    response['Content-Disposition'] = 'attachment; filename="{}"'.format(file_name)
    return response
Then add this download path to your urls.py and to a graphQL query response. So graphQL would be used to get the download path, but actually downloading the file would be a regular Django page.
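The URL wiring might look like this (the route name and the view import are assumptions for the sketch):
# urls.py
from django.urls import path
from .views import download

urlpatterns = [
    path('download/', download, name='download'),
]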

Serve static HTML in Django

I’m pretty new to Django so forgive me if this is something I shouldn’t even consider, but I’d like my app to be able to link to a large number of static HTML pages (enough that creating URL paths to each one would be unreasonable) and there would constantly be more being uploaded (by me via FTP).
I’ve got this working on the development server by adding the path to those HTML files to my STATICFILES_DIRS [], but that doesn’t seem to work when I push my code to production.
I tried setting up a STATIC_ROOT and running collectstatic but that didn’t help, and I’m also worried that even if I got that working, I would have to run collectstatic each time I uploaded new files.
So my question is, is this even reasonable? Should I just avoid hosting these static HTML files alongside my project and leave them where they are now, on a separate server under a separate domain?
The only reason I wanted to host them together initially is because, along with the static HTML files, there is an SQLite database that my Django app displays data from (this is the whole purpose of my app). However, I could just implement another method of getting that SQLite file, such as using ftplib. Note that I don't need that database to connect to Django at all - I just need to read data from it.
You don't need to write urls for every page. You can "capture" the requested page name from the url and render the page according to its value.
# urls.py
url(r'^page/(?P<page_name>\w+)/$', my_view)

# views.py
import os

from django.http import HttpResponse, Http404

FTP_UPLOAD_DIR = '/path/to/directory/where/you/upload/files/'

def my_view(request, page_name):
    # check if the requested page exists; the \w+ pattern in the url also
    # keeps page_name to word characters, so it cannot escape the directory
    if os.path.exists(FTP_UPLOAD_DIR + page_name):
        # if yes, then serve the page
        with open(FTP_UPLOAD_DIR + page_name) as f:
            response = HttpResponse(f.read())
        return response
    else:
        raise Http404
Above, we are reading the file directly from the upload folder, so there's no need for you to run collectstatic.

Django file processing app

I am looking to write a Django app that is a utility.
I would like the user to be presented with a form where they can upload 4 or 5 files.
They would then be able to hit a process button, and the 5 files would be processed, combined or zipped, and downloaded as one file (result_file).
After this, the 5 files would be deleted, as well as the result_file.
I would prefer to avoid having a database on this app, as it never really stores any information.
Could all this be done in a view? And how would you approach it?
This is what I would try:
Create the script that does the file processing work.
You can place the script in a newly created folder inside the django app folder and just import it in your views.py.
You can create an empty file, "__init__.py", in the aforementioned new folder to make it a package, so it can be imported in views.py.
In the same views.py, there could be another function that handles the download, as sketched below.
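A minimal sketch of how the whole flow could live in a single view with no database (all names here are placeholders, and the upload form is assumed to have multiple file fields):
import io
import zipfile

from django.http import HttpResponse
from django.shortcuts import render

def process_files(request):
    # hypothetical view: combine all uploaded files into one zip, in memory
    if request.method == 'POST':
        buffer = io.BytesIO()
        with zipfile.ZipFile(buffer, 'w') as zf:
            for uploaded_file in request.FILES.values():
                zf.writestr(uploaded_file.name, uploaded_file.read())
        # nothing was written to disk, so there is nothing to delete afterwards
        response = HttpResponse(buffer.getvalue(), content_type='application/zip')
        response['Content-Disposition'] = 'attachment; filename="result_file.zip"'
        return response
    return render(request, 'upload_form.html')  # hypothetical template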

How to upload file in django google app engine?

I am new to Google App Engine. I have deployed a pure Django application on Google App Engine, and it works fine. But I wrote the Django file upload functionality shown below,
def forhandler(list, project_id, task_id, request):
    for i in list:
        filename = i
        extension = filename.content_type
        print i
        newdoc = FileUpload(project_id=project_id, task_id=task_id,
                            file=filename, filetype=extension)
        newdoc.save()
When I run ./manage.py runserver, the above functionality works and uploads files correctly. But when I use Google App Engine, i.e. dev_appserver.py my_project, everything else is fine, yet uploading a file with the code above gives the error shown below:
Exception Value: [Errno 30] Read-only file system: u'/home/nyros/Desktop/projectstat/projectstat/media/documents/2013/05/24/1354676051_chasm_fishing_w1.jpeg'
How do I upload files using Django on Google App Engine? Thanks.
The problem here is that App Engine uses a read-only filesystem, and the default Django file upload mechanism wants to store files on disk. (Regardless of whether you're using App Engine or not, storing images in an ordinary database is a bad idea.)
Instead, you should use AppEngine's Blobstore API to save the image. This is special storage App Engine provides for storing large data uploaded by users.
The good news is there's a plugin that takes care of all of this for you: http://www.allbuttonspressed.com/projects/django-filetransfers
Just follow the instructions there and you should be in business.
(Note: I've only tried this with django-nonrel. I've never tried it with vanilla Django.)
The best way to upload files on Google App Engine with Python is to use a BlobstoreUploadHandler.
from google.appengine.ext.webapp import blobstore_handlers

class Upload(blobstore_handlers.BlobstoreUploadHandler):
    def post(self):
        for upload in self.get_uploads():
            try:
                img = Image()
                img.primary_image = upload.key()  # store the blob key on the model
                img.put()
            except:
                pass
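The handler only receives files that were POSTed to a URL generated by the Blobstore, so the surrounding wiring might look roughly like this (the route and the webapp2 app are assumptions for the sketch):
from google.appengine.ext import blobstore
import webapp2

# the upload form on the page must POST to a blobstore-generated URL,
# which stores the file and then hands the request to the handler above
upload_url = blobstore.create_upload_url('/upload')

app = webapp2.WSGIApplication([
    ('/upload', Upload),
])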