*.tar.gz file download using Apache2 webserver - Django

I would like to implement a feature that lets users upload and download files.
But when a user uploads a *.tar.gz file and then downloads it, the file seems to have been changed: its md5sum and its size are different.
The production environment uses the Apache webserver; the web framework is Django.
With Django's development webserver, uploading/downloading a *.tar.gz file seems fine (the file is not changed).
Is this related to the Apache server? Any suggestions?
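One way to narrow this down (a debugging sketch, not from the original post): compute the md5 of the server-side copy right after upload, and again of the downloaded copy. If the server copy matches the original but the download does not, the file is being transformed by Apache on the way out; a common culprit is mod_deflate re-compressing the response, or an AddEncoding directive for .gz files causing the browser to transparently decompress it. `md5_of` is a hypothetical helper:

```python
import hashlib

def md5_of(path, chunk_size=1 << 16):
    """Compute the md5 of a file, reading it in binary mode in chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Comparing `md5_of("/path/on/server/file.tar.gz")` against the md5sum of the downloaded copy tells you on which hop the bytes changed.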

Related

how to serve files from /tmp directory in django

I want to serve downloadable files located in the /tmp directory via Django. However, I don't want these files copied or moved to the static_dir; the whole point of having the files in /tmp is that a cron job will delete them every night.
I realize I could just set /tmp as my static_dir in the Django settings, but I already have a static_dir set that I'm using in my project.
My question is: what is the easiest way to let an end user download files located in the /tmp directory?
If I hit this url:
http://localhost:8000/api/v2/tmp/testfile.zip
I would expect to download /tmp/testfile.zip
Can this be done with just a single entry in urls.py, or am I going to have to create a new view?

How to get the .env file into the remote host server

Quick question: how do I get the .env conf file onto the server?
I have three settings files in my project: base, development, production. I think testing and staging would be overkill.
It is encouraged to put .env in the .gitignore file so it won't be included in version control. On Heroku it is easy to set environment variables in the settings tab, but what about other hosting services without such a feature? Since I push to my repository and pull from the server, how am I supposed to get the API keys stored in .env, given that Git ignores the file? Should I create it on the server? Or is there a standard way this is done? Newbie question, I know.
You are correct: you should put your .env file in .gitignore and create the .env file on the server manually, either by connecting to the server (SSH) and creating the file there, or by transferring it with one of the secure transfer protocols (SCP, SFTP).
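For completeness (an assumption about the setup, not part of the answer above): once the file is on the server, the settings module still has to read it. Libraries such as python-dotenv do exactly this; a minimal stdlib stand-in might look like the following, where `load_env` is a hypothetical helper that parses KEY=VALUE lines:

```python
import os

def load_env(path=".env"):
    """Minimal .env loader: KEY=VALUE per line, blank lines and '#' comments skipped.
    Existing environment variables are not overwritten (setdefault)."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

The settings files can then call `load_env()` early and read keys with `os.environ["..."]`.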

File uploads in Heroku deployment with Django

So I was finally able to set up local + prod test project I'm working on.
# wsgi.py
from django.core.wsgi import get_wsgi_application
from dj_static import Cling, MediaCling
from whitenoise.django import DjangoWhiteNoise

application = Cling(MediaCling(get_wsgi_application()))
application = DjangoWhiteNoise(application)
I set up static files using whitenoise (without any problems) and media (file uploads) using dj_static and Postgres for local + prod. Everything works fine at first... static files, file uploads.
But after the Heroku dynos restart, I lose all the file uploads. My question is: since I'm serving the media files from the Django app instead of something like S3, does the dyno restart wipe those out too?
PS: I'm aware I can do this with AWS etc., but I just want to know if that's the reason I'm losing all the uploads.
Since I'm serving the media files from the Django app instead of something like S3, does the dyno restart wipe all that out too?
Yes, that's right. According to the Heroku docs:
Each dyno gets its own ephemeral filesystem, with a fresh copy of the most recently deployed code.
See also this answer and this answer.
Conclusion: for media files (the uploaded ones), you must use an external storage service (like S3 or similar). WhiteNoise is just for static files; see here for why WhiteNoise is not suitable for serving user-uploaded (media) files.
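A sketch of what pointing media storage at S3 can look like with django-storages (assumed installed along with boto3; the bucket name and environment-variable names below are placeholders, not from the original answer):

```python
# settings/production.py -- sketch, assumes django-storages + boto3 are installed
import os

INSTALLED_APPS += ["storages"]  # appended to the existing INSTALLED_APPS list

# Route default (media) file storage to S3 instead of the dyno's ephemeral disk.
DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
AWS_STORAGE_BUCKET_NAME = "my-uploads-bucket"  # placeholder bucket name
AWS_ACCESS_KEY_ID = os.environ["AWS_ACCESS_KEY_ID"]
AWS_SECRET_ACCESS_KEY = os.environ["AWS_SECRET_ACCESS_KEY"]
```

With this in place, `FileField` uploads go to the bucket and survive dyno restarts; static files can stay on WhiteNoise.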

django mkdir permission in apache

I have a Django app containing a model with a file upload field. The upload field takes the targeted file and uploads a copy either into an existing directory in the media root or, if the directory hasn't been created yet, it creates the directory and drops the file inside it.
The app works beautifully in dev, using the built-in Django server, but when I move it to a production server (my OS X machine running an Apache2 instance with mod_wsgi) I get "[Error 13] Permission denied" thrown from the mkdir function in Django's storage.py whenever I try to upload a file. I strongly suspect there is permission configuration that needs to be added to my Apache httpd.conf; I don't know why else the Django server would have no trouble with the code while Apache chokes. Does anyone know?
Permissions issues are described in mod_wsgi documentation at:
http://code.google.com/p/modwsgi/wiki/ApplicationIssues#Access_Rights_Of_Apache_User
I guess sometimes an error message is exactly what it says it is. In this case, "[Error 13] Permission denied" was being thrown because Apache didn't have write access to the directories the Django app was attempting to upload to. I simply navigated to the directories I had set as file upload directories and gave write permission on them system-wide. This probably wasn't the most secure solution, but it was the most practical: it works, and I don't know how to explicitly grant write permissions to Apache2 without opening the directory system-wide.
Also, I didn't post the question on Server Fault because I didn't know whether it was a Django config issue or an Apache issue.
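As a somewhat tighter alternative to system-wide write permission (an aside, not part of the original answer): the media tree can be chowned to the user Apache runs as, and Django's upload-permission settings control the modes of the files and directories its storage layer creates. The values below are examples:

```python
# settings.py -- upload permission settings (values are examples)
# First, outside Django, give the Apache user ownership of the media tree, e.g.:
#   chown -R www-data:www-data /path/to/media
# (the Apache user name varies by platform: www-data, apache, _www on OS X, ...)

FILE_UPLOAD_PERMISSIONS = 0o644            # mode applied to uploaded files
FILE_UPLOAD_DIRECTORY_PERMISSIONS = 0o755  # mode applied to directories storage.py creates
```

This keeps the directories writable only by the Apache user rather than by everyone.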

Django and live server, on rewrite, the old sources files are still used

I'm using Django 1.3 on an Apache server with mod_wsgi (daemon mode), and Nginx for serving static files. The database is on a separate server. The WSGI daemon runs 2 threads with a maximum of 100 requests.
I get into trouble when I overwrite old .py files (not .pyc). I'm also overwriting the .wsgi config file (http://code.google.com/p/modwsgi/wiki/ReloadingSourceCode). Sometimes some requests use the old code, and therefore an error is generated (HTTP 500). Is there a server-side cache that needs to be emptied?
Could this be caused by the .pyc files? Do I need to restart the Apache server or the WSGI daemon?
If you remove the .pyc files and touch your WSGI file, the WSGI daemon should reload when it gets a chance, and you should be good.
On occasion I have had to restart Apache in order for my changes to take effect.
Set up ownership/permissions so that the user the code runs as under Apache cannot change code files or create .pyc files. The user the application runs as should only be able to write to the data or upload directories it really needs, which is safer anyway.
The most reliable deployment method is to install the new version into a completely new directory hierarchy, with the WSGI script file outside that tree, and then replace the WSGI script file with a new one referring to the new directory. When doing this, the WSGI script file should not be edited in place; instead, move a new file into place so the filesystem does an atomic replace of the whole file and there is no risk of an in-flight edit being picked up.
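The atomic-replace step described above can be sketched in Python (hypothetical helper name; in practice a plain `mv new.wsgi app.wsgi` achieves the same thing, since rename is atomic on POSIX when source and target are on the same filesystem):

```python
import os

def atomic_install(new_script_path, live_script_path):
    """Stage the new WSGI script next to the live one, then rename over it.
    os.replace is atomic on POSIX for same-filesystem paths, so mod_wsgi ever
    only sees the complete old script or the complete new one, never a partial file."""
    staged = live_script_path + ".new"
    with open(new_script_path, "rb") as src, open(staged, "wb") as dst:
        dst.write(src.read())
    os.replace(staged, live_script_path)
```

The changed modification time on the script then triggers mod_wsgi's daemon-mode reload on the next request.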