nginx/Django file download permission

I have a Django website running on Ubuntu with nginx, where one user can upload an image and another user can download it.
The problem: when I upload an image from the frontend, another user can view the image but can't download the original file; when I upload the image from the backend, it is downloadable.
Right now I have to change the file permissions by hand every time to make an image downloadable.
nginx.conf:
user www-data;
worker_processes auto;
pid /run/nginx.pid;
include /etc/nginx/modules-enabled/*.conf;

events {
    worker_connections 768;
    # multi_accept on;
}

http {
    ##
    # Basic Settings
    ##
    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;
    keepalive_timeout 65;
    types_hash_max_size 2048;
    client_max_body_size 100M;
    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    ##
    # SSL Settings
    ##
    ssl_protocols TLSv1 TLSv1.1 TLSv1.2; # Dropping SSLv3, ref: POODLE
    ssl_prefer_server_ciphers on;

    ##
    # Logging Settings
    ##
    access_log /var/log/nginx/access.log;
    error_log /var/log/nginx/error.log;

    ##
    # Gzip Settings
    ##
    gzip on;

    ##
    # Virtual Host Configs
    ##
    include /etc/nginx/conf.d/*.conf;
    include /etc/nginx/sites-enabled/*;
}
The default file in sites-enabled:
server {
    listen 80;
    server_name 159.65.156.40;

    location = /favicon.ico { access_log off; log_not_found off; }

    location /static {
        root /home/tboss/liveimage;
        client_max_body_size 100M;
    }

    location /media/ {
        root /home/tboss/liveimage;
    }

    location / {
        include proxy_params;
        proxy_pass http://unix:/home/tboss/liveimage/liveimage.sock;
    }
}
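This symptom is usually a file-permission issue rather than an nginx one: when an upload is large enough that Django spools it through a temporary file, the file lands in MEDIA_ROOT with the temp file's restrictive 0600 mode, which nginx's www-data worker cannot read. A minimal sketch of the usual settings.py fix, assuming the default upload handlers:

settings.py:

    # Force a predictable mode on uploaded files so the nginx worker
    # (www-data) can read them; by default, uploads spooled through a
    # temporary file keep that file's 0600 mode.
    FILE_UPLOAD_PERMISSIONS = 0o644
    FILE_UPLOAD_DIRECTORY_PERMISSIONS = 0o755

Note that this only affects new uploads; files already in the media directory still need a one-time chmod (e.g. chmod -R a+rX on the media folder).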

Related

Debugging Django on a deployed remote server

I am running Django with Nginx and Gunicorn on a remote server.
Certain interactions from my web browser cause the web server to respond with a "502 Bad Gateway nginx/1.10.3 (Ubuntu)" error after certain POST operations to the Django server. The error happens repeatably, after exactly 30 seconds, which makes me think it's some kind of timeout with Nginx.
When I run the Django server locally, everything runs fine. But I don't think this is a problem with Nginx; I think it's a problem with Django on the remote system.
Can anybody provide guidance on how to see what is going on with Django on the remote machine, or how to debug this problem further?
user www-data;
worker_processes auto;
pid /run/nginx.pid;

events {
    worker_connections 768;
    # multi_accept on;
}

http {
    ##
    # Basic Settings
    ##
    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;
    keepalive_timeout 65;
    types_hash_max_size 2048;
    # server_tokens off;
    # server_names_hash_bucket_size 64;
    # server_name_in_redirect off;
    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    ##
    # SSL Settings
    ##
    ssl_protocols TLSv1 TLSv1.1 TLSv1.2; # Dropping SSLv3, ref: POODLE
    ssl_prefer_server_ciphers on;

    ##
    # Logging Settings
    ##
    access_log /var/log/nginx/access.log;
    error_log /var/log/nginx/error.log;
    ##
    # Time Out Settings
    ##
    proxy_read_timeout 300;
    proxy_connect_timeout 300;
    proxy_send_timeout 300;
    send_timeout 300;

    ##
    # Gzip Settings
    ##
    gzip on;
    gzip_disable "msie6";
    # gzip_vary on;
    # gzip_proxied any;
    # gzip_comp_level 6;
    # gzip_buffers 16 8k;
    # gzip_http_version 1.1;
    # gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript;

    ##
    # Virtual Host Configs
    ##
    include /etc/nginx/conf.d/*.conf;
    include /etc/nginx/sites-enabled/*;
}

#mail {
#    # See sample authentication script at:
#    # http://wiki.nginx.org/ImapAuthenticateWithApachePhpScript
#
#    # auth_http localhost/auth.php;
#    # pop3_capabilities "TOP" "USER";
#    # imap_capabilities "IMAP4rev1" "UIDPLUS";
#
#    server {
#        listen localhost:110;
#        protocol pop3;
#        proxy on;
#    }
#}
Contents of /etc/nginx/sites-enabled (I have replaced my IP address with xxx.xxx.xxx.xxx and my server name with "myservername"):
server {
    server_name xxx.xxx.xxx.xxx backend.myservername.com www.backend.myservername.com;

    location = /favicon.ico { access_log off; log_not_found off; }

    location /static/ {
        root /home/django/my_django_project;
    }

    location / {
        include proxy_params;
        proxy_pass http://unix:/home/django/my_django_project/django_subfolder/django_subfolder.sock;
    }

    listen 443 ssl; # managed by Certbot
    ssl_certificate /etc/letsencrypt/live/backend.myservername.com/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/backend.myservername.com/privkey.pem; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}

server {
    if ($host = backend.myservername.com) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    listen 80;
    server_name xxx.xxx.xxx.xxx backend.myservername.com www.backend.myservername.com;
    return 404; # managed by Certbot
}
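A 502 that fires repeatably after exactly 30 seconds points at Gunicorn rather than nginx: Gunicorn's default worker timeout is 30 seconds, and when the master kills a worker mid-request, nginx reports 502 Bad Gateway (the nginx proxy timeouts above are already 300 s). A minimal sketch of a Gunicorn config raising the limit; the file name and values are assumptions, and Gunicorn would be started with -c gunicorn.conf.py:

gunicorn.conf.py:

    # Gunicorn reads this file as a plain Python module.
    # timeout: seconds a worker may stay silent before the master kills it;
    # the default of 30 matches the observed 502-after-30-seconds.
    timeout = 300
    graceful_timeout = 30
    workers = 3

To confirm the diagnosis on the remote machine, check Gunicorn's error log: a killed worker is logged as "[CRITICAL] WORKER TIMEOUT", which also tells you which request is slow enough to need profiling.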

AWS EC2 ubuntu+nginx+uwsgi+flask deep learning API: 504 Gateway Time-out

I deployed a deep-learning model API on an EC2 Ubuntu server, and I am using the following command to send my JSON payload:
curl http://ccc/api/ -d '{"image_link":["https://xx/8/52i_b403bb15-0637-4a17-be09-476168ff9a73"], "bb":"100"}' -H 'Content-Type: application/json'
The model takes about 5 minutes to complete a prediction. If I predict only some labels (1 label) rather than all labels (10 labels), the response is OK. If I try to predict all the labels, this error comes out:
<html>
<head><title>504 Gateway Time-out</title></head>
<body bgcolor="white">
<center><h1>504 Gateway Time-out</h1></center>
<hr><center>nginx/1.14.0 (Ubuntu)</center>
</body>
</html>
And my nginx.conf:
user www-data;
worker_processes auto;
pid /run/nginx.pid;
include /etc/nginx/modules-enabled/*.conf;

events {
    worker_connections 768;
    # multi_accept on;
}

http {
    ##
    # Basic Settings
    ##
    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;
    keepalive_timeout 900;
    types_hash_max_size 2048;
    # server_tokens off;
    # server_names_hash_bucket_size 64;
    # server_name_in_redirect off;
    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    ##
    # SSL Settings
    ##
    ssl_protocols TLSv1 TLSv1.1 TLSv1.2; # Dropping SSLv3, ref: POODLE
    ssl_prefer_server_ciphers on;

    ##
    # Logging Settings
    ##
    access_log /var/log/nginx/access.log;
    error_log /var/log/nginx/error.log;
And I also set the default file in sites-enabled:
server {
    listen 80;               # listen port
    server_name xxx;         # AWS IP or domain name
    charset utf-8;
    client_max_body_size 75M;
    fastcgi_read_timeout 1200;

    location / {
        include uwsgi_params;                           # import uwsgi params
        uwsgi_pass 127.0.0.1:8000;
        uwsgi_param UWSGI_PYTHON /usr/bin/python3;      # Python environment
        uwsgi_param UWSGI_CHDIR /home/ubuntu/xxxx/src;  # project dir
        uwsgi_param UWSGI_SCRIPT app:app;               # main app
    }
}
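One detail worth checking: fastcgi_read_timeout only applies to fastcgi_pass, but this server block proxies with uwsgi_pass, where the relevant directive is uwsgi_read_timeout (default 60 s), which would explain a 504 on a 5-minute prediction. A sketch of the location block with the timeout that actually applies; the 600 s values are assumptions sized to the model:

    location / {
        include uwsgi_params;
        uwsgi_pass 127.0.0.1:8000;
        # The uwsgi module consults uwsgi_read_timeout;
        # fastcgi_read_timeout is silently ignored here.
        uwsgi_read_timeout 600;
        uwsgi_send_timeout 600;
    }

If uWSGI itself is configured with a harakiri limit, that needs raising as well, or the worker will be recycled before nginx ever times out.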

Django: uploading is too slow

I have a Django app running on Compute Engine (GCE), and my problem is that every time I upload a not-so-large file, like 1 MB, the upload takes more than a minute. Is this normal, or do I have a misconfiguration somewhere, maybe in Nginx?
I tested uploading two image files (1.2 MB and 2.4 MB) locally and on the production site.
1.2 MB
  Local: 2-3 seconds
  Live: 50-60 seconds
2.4 MB
  Local: 5-6 seconds
  Live: 1.5-2 minutes
Python 2.7, Django 1.8 with nginx and uWSGI
Compute Engine instance: n1-standard-2 (2 vCPUs, 7.5 GB memory)
nginx config
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://www.example.com$request_uri;
}

server {
    listen 443 ssl;
    server_name example.com www.example.com;

    ssl_certificate /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;
    ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
    ssl_prefer_server_ciphers on;
    ssl_ciphers 'ECDHE-RSA-AES256-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-DSS-AES128-GCM-SHA256:kEDH+AESGCM:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES256-SHA:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-EDH-AES256-SHA384:ECDHE-RSA-AES256-SHA:ECDHE-ECDSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-DSS-AES128-SHA256:DHE-RSA-AES256-SHA256:DHE-DSS-AES256-SHA:RSA-RSA-AES256-SHA:AES128-GCM-AES128:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:AES:CAMELLIA:DES-CBC3-SHA:!aNULL:!eNULL:!EXPORT:!DES:!RC4:!MD5:!PSK:!aECDH:!EDH-DSS-DES-CBC3-SHA:!EDH-RSA-DES-CBC3-SHA:!KRB5-DES-CBC3-SHA';

    location / {
        include uwsgi_params;
        uwsgi_pass unix:/sock/path/application.sock;
        uwsgi_read_timeout 600;
    }

    location = /favicon.ico { access_log off; log_not_found off; }

    location /static/ {
        alias /project/static_path/static/;
    }

    location /media {
        alias /project/static_path/static/uploads;
    }

    location ~ /.well-known {
        allow all;
    }
}
uwsgi config
[uwsgi]
project_path = /project/path/
chdir = %(project_path)
home = /environment/path/
module = root.wsgi:application
master = true
processes = 5
socket = %(project_path)application.sock
chmod-socket = 666
vacuum = true
uid = current_user
gid = www-data
die-on-term = true
logto = /var/log/uwsgi/app/%n.log
EDIT
nginx.conf
user www-data;
worker_processes 4;
pid /run/nginx.pid;

events {
    worker_connections 768;
    # multi_accept on;
}

http {
    client_max_body_size 20M;
    client_header_timeout 3m;
    client_body_timeout 3m;
    send_timeout 3m;
    client_header_buffer_size 1k;
    large_client_header_buffers 4 4k;

    ##
    # Basic Settings
    ##
    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;
    keepalive_timeout 65;
    types_hash_max_size 2048;
    # server_tokens off;
    # server_names_hash_bucket_size 64;
    # server_name_in_redirect off;
    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    ##
    # Gzip Settings
    ##
    gzip on;
    gzip_disable "msie6";

    include /etc/nginx/conf.d/*.conf;
    include /etc/nginx/sites-enabled/*;
}
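Before tuning nginx further, it is worth separating network time from application time: 1.2 MB in 50-60 s is about 200 kbit/s, which is consistent with a slow client uplink rather than a server misconfiguration. A quick check with curl's timing variables (the /upload/ URL and test file are placeholders for a real endpoint):

    curl -o /dev/null -s \
         -w 'connect=%{time_connect}s start-transfer=%{time_starttransfer}s total=%{time_total}s\n' \
         -F 'file=@test-1.2mb.jpg' https://www.example.com/upload/

If most of the time passes before the response starts, it covers both the upload itself and server-side processing; uploading the same file to the instance with scp gives a raw-bandwidth baseline, and if scp is just as slow, nginx and uWSGI are not to blame.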
Thanks!

nginx generates predictable session IDs?

I deployed my Django app with nginx and uWSGI, and I have a security finding: the remote web server generates predictable session IDs.
Ports: tcp/80
Sending several requests gives us the following session IDs:
SERVERID=locationserverfarm1|Vv4q4|Vv4q4
SERVERID=locationserverfarm2|Vv4q4|Vv4q4
SERVERID=locationserverfarm3|Vv4q4|Vv4q4
SERVERID=locationserverfarm2|Vv4q4|Vv4q4
SERVERID=locationserverfarm1|Vv4q4|Vv4q4
How can I configure it to generate random session IDs?
Please help me. Thank you.
This is my nginx setting:
nginx.conf
user www-data;
worker_processes 1;
pid /run/nginx.pid;

events {
    worker_connections 3000;
}

http {
    ##
    # Basic Settings
    ##
    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;
    keepalive_timeout 65;
    types_hash_max_size 2048;
    # server_tokens off;
    # server_names_hash_bucket_size 64;
    # server_name_in_redirect off;
    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    ##
    # SSL Settings
    ##
    ssl_protocols TLSv1 TLSv1.1 TLSv1.2; # Dropping SSLv3, ref: POODLE
    ssl_prefer_server_ciphers on;

    ##
    # Logging Settings
    ##
    access_log /var/log/nginx/access.log;
    error_log /var/log/nginx/error.log;

    ##
    # Gzip Settings
    ##
    gzip on;
    gzip_disable "msie6";
    # gzip_vary on;
    # gzip_proxied any;
    # gzip_comp_level 6;
    # gzip_buffers 16 8k;
    # gzip_http_version 1.1;
    # gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript;

    ##
    # Virtual Host Configs
    ##
    include /etc/nginx/conf.d/*.conf;
    include /etc/nginx/sites-enabled/*;
}
myweb.conf
upstream django {
    server 127.0.0.1:8001;
}

server {
    listen 80;
    server_name 127.0.0.1;
    charset utf-8;

    # max upload size
    client_max_body_size 75M; # adjust to taste

    # security
    add_header X-Frame-Options "DENY";
    add_header X-Content-Type-Options "nosniff";
    add_header X-XSS-Protection "1; mode=block";

    location /static {
        alias /usr/share/nginx/ENV/mysite/mysite/staticfiles;
    }

    location / {
        uwsgi_pass django;
        include /etc/nginx/uwsgi_params; # the uwsgi_params file you installed
    }
}
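Nothing in this nginx config (and nothing in Django) sets a SERVERID cookie; the name|value pattern above looks like a load balancer's server-affinity cookie (HAProxy's cookie-based persistence uses exactly this shape), which security scanners often misreport as a session ID. If that reading is right, the cookie has to be changed on the load balancer in front of nginx, not here. Django's own session keys are already cryptographically random, which a short sketch can confirm (assumes the default database session backend, run inside python manage.py shell):

    # Print a few freshly created session keys; each should be an
    # unrelated random 32-character string, unlike the SERVERID values.
    from django.contrib.sessions.backends.db import SessionStore

    for _ in range(3):
        s = SessionStore()
        s.create()            # allocates and saves a new random session key
        print(s.session_key)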

Configure nginx: my browser always displays the IP address instead of the domain name

With my web browser, when I go to my_domain.com I am redirected to XX.XX.XX.XX (the IP), and the address displayed in my browser remains XX.XX.XX.XX.
I want the site served from XX.XX.XX.XX but with my_domain.com displayed in the address bar (the same behaviour as any other website).
My configuration file: /etc/nginx/nginx.conf
user www-data;
worker_processes 4;
pid /var/run/nginx.pid;

events {
    worker_connections 768;
}

http {
    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;
    keepalive_timeout 65;
    types_hash_max_size 2048;
    client_max_body_size 24M;
    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    access_log /var/log/nginx/access.log;
    error_log /var/log/nginx/error.log;

    gzip on;
    gzip_disable "msie6";

    passenger_root /usr/lib/ruby/vendor_ruby/phusion_passenger/locations.ini;
    passenger_ruby /home/deploy/.rvm/gems/ruby-1.9.3-p547/wrappers/ruby;

    include /etc/nginx/conf.d/*.conf;
    include /etc/nginx/sites-enabled/*;
}
And this is my other configuration file, /etc/nginx/sites-enabled/default:
server {
    listen 80;
    server_name my_domain.com;
    add_header X-Frame-Options "SAMEORIGIN";

    location / {
        root /home/deploy/www/comint/current/public;
        passenger_enabled on;
        rails_env development;
    }

    location /doc/ {
        alias /usr/share/doc/;
        autoindex on;
        allow 127.0.0.1;
        allow ::1;
        deny all;
    }
}
When I try to perform a redirect, I get a redirect loop error.
I have tried many configurations, but it seems I have misunderstood something.
Can someone explain what the problem might be?
Thank you.
PS: I'm a Rails developer, but this is my first web server configuration.
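When the address bar ends up showing the bare IP, the redirect usually happens before nginx is involved: registrar-level "web forwarding" answers for my_domain.com with a redirect to http://XX.XX.XX.XX instead of a plain DNS A record pointing at the server. With a proper A record in place, nginx only needs to answer for the domain, and a catch-all can push bare-IP visitors back to the canonical name. A sketch of that catch-all (whether you want this redirect at all is an assumption):

    # Catch requests that arrive by bare IP or any unknown Host header
    # and send them to the canonical name.
    server {
        listen 80 default_server;
        server_name _;
        return 301 http://my_domain.com$request_uri;
    }

The redirect loop appears if a rule like this also matches my_domain.com itself; keeping the existing server_name my_domain.com block alongside the catch-all ensures the domain is served directly while only bare-IP requests are redirected.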