I have created a Laravel 5.4 project and made it live on an AWS server. The issue I face is that I have to give 777 permissions to the storage folder very frequently, and because of this the site does not work properly. Can anyone help me figure out what the issue could be? I have already given 777 permissions to the storage folder, but somehow the permissions change and the site stops because it cannot write to the log file. Thanks in advance.
Giving 777 permissions means you have opened access to ANYONE in the world who can reach your server: they get full read/write permissions on your storage.
Instead, you need to give your web server permission to access the directories and files, which you can do in the following way.
Here www-xxx stands for your web server user (for example www-data on Ubuntu):
sudo chown -R www-xxx:www-xxx /path/to/your/laravel/root/directory
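If you are not sure which user your web server actually runs as, a quick way to check on a typical Apache or NGINX box (just a sketch; the exact process name depends on your setup):
ps aux | egrep '(apache|httpd|nginx)' | grep -v grep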
Now, in order to grant the storage-level permissions to your web server, execute the commands below (replace www-data with your web server user if it is different):
sudo chgrp -R www-data storage bootstrap/cache
sudo chmod -R ug+rwx storage bootstrap/cache
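Since the permissions apparently keep reverting when new files are created (for example by artisan commands or cron jobs running as a different user), one common way to make the group access stick is the setgid bit plus default ACLs. A rough sketch, assuming the web server user is www-data and the project lives in /var/www/laravel (adjust both to your setup; setfacl needs the acl package):
# new directories created under storage/ keep the parent directory's group
sudo find /var/www/laravel/storage /var/www/laravel/bootstrap/cache -type d -exec chmod g+s {} \;
# give the web server group read/write on existing content ...
sudo setfacl -R -m g:www-data:rwX /var/www/laravel/storage /var/www/laravel/bootstrap/cache
# ... and make that the default for anything created there in the future
sudo setfacl -R -d -m g:www-data:rwX /var/www/laravel/storage /var/www/laravel/bootstrap/cache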
While submitting a form that has a media (file) input, it shows:
[Errno 13] Permission denied: '/home/ubuntu/django/media/pictures'
I have searched on Google, but everyone only talks about giving permissions for static files, not media files. Can anyone please tell me which permissions I have to give it, and with which chmod number?
You're having permissions issues. To fix them, you need to give the Apache process access to the folder and its contents.
This can be done through the following steps:
Change the permissions so the owner and group can read and write (directories additionally need the execute bit so they can be entered):
chmod -R 664 /home/ubuntu/django/media/pictures
find /home/ubuntu/django/media/pictures -type d -exec chmod 775 {} \;
Give the group Apache runs under (www-data) ownership of the folder and its content:
sudo chown -R :www-data /home/ubuntu/django/media/pictures
Restart the Apache service
sudo service apache2 restart
If you want to ensure Django behaves as it should, you can also add the following to your settings.py
FILE_UPLOAD_DIRECTORY_PERMISSIONS = 0o755
FILE_UPLOAD_PERMISSIONS = 0o644
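As a quick sanity check after the steps above (assuming the same path and the default www-data user), you can verify that Apache's user can actually write to the folder:
sudo -u www-data test -w /home/ubuntu/django/media/pictures && echo "writable" || echo "NOT writable"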
I have a Django app running on a Linux server under NGINX. The "user" for the Django app is www-data. In this app, I try to connect to AWS IOT, and to do that I believe that the AWS boto3 package tries to find the AWS credentials here: ~/.aws/credentials. The problem is that for the user www-data there is no such path! When I login to the server (using my real username), and I try to run a script that connects to AWS, it connects just fine. Let's say my username is "joe". There is indeed a file /home/joe/.aws/credentials that contains the correct credentials. This is why the script works fine when I run as user "joe". But when the Django app is running, it doesn't work because there is no login user www-data, ie there is no file /home/www-data/.aws/credentials.
I understand that AWS boto3 lets us set an environment variable to specify a non-standard path to the credentials file. This env variable is AWS_SHARED_CREDENTIALS_FILE, and there is also an AWS_CONFIG_FILE.
However, I don't know how to set an environment variable in Django for user www-data so that boto3 can now use that environment variable to specify the AWS credentials path.
Anyone know how to do this? Note that this is a production environment so I can't use any local server tricks/hacks.
If you are running your Django app on an EC2 instance, the best practice is to associate an IAM role with the instance.
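With a role attached, boto3 picks up temporary credentials from the instance metadata service automatically, so www-data does not need a ~/.aws/credentials file at all. A quick way to check from the instance (assuming the AWS CLI is installed and IMDSv1 is still enabled; boto3 uses the same credential chain):
# shows the name of the role attached to the instance (404/empty if there is none)
curl http://169.254.169.254/latest/meta-data/iam/security-credentials/
# should print the account and ARN that the role credentials resolve to
aws sts get-caller-identity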
I had exactly the same issue, but in a Docker container and with Apache instead of NGINX. For the container, we can do the following:
Edit /etc/apache2/envvars where environment settings for apache live:
echo "export AWS_SHARED_CREDENTIALS_FILE=/root/.aws/credentials" >> /etc/apache2/envvars
change ownership of aws credentials file parent directory:
chown -R www-data:www-data /root
Restart Apache:
service apache2 restart
Note that here the ownership of root's home directory was changed. This is because AWS needs certain file permissions and ownership on the credentials file (which means only www-data will be able to use this login from now on). Maybe it would be better practice (especially if you're running on an actual machine and not a container) to copy the credentials file to a new location and follow the same steps:
mkdir -p /home/joe/foo/.aws/
cp /home/joe/.aws/credentials /home/joe/foo/.aws/credentials
sudo chown -R www-data:www-data /home/joe/foo/
echo "export AWS_SHARED_CREDENTIALS_FILE=/home/joe/foo/.aws/credentials" | sudo tee -a /etc/apache2/envvars
sudo service apache2 restart
I don't know how well this maps onto an NGINX setup, but I hope it helps a bit.
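For what it's worth, if the Django app behind NGINX is served by an application server managed by systemd (a hypothetical gunicorn.service here; adjust the unit name and the path), the equivalent environment variable could be set with a drop-in instead of /etc/apache2/envvars:
sudo mkdir -p /etc/systemd/system/gunicorn.service.d
printf '[Service]\nEnvironment="AWS_SHARED_CREDENTIALS_FILE=/home/joe/foo/.aws/credentials"\n' | sudo tee /etc/systemd/system/gunicorn.service.d/aws.conf
sudo systemctl daemon-reload
sudo systemctl restart gunicorn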
How can I automatically give 777 permissions to folders/files that get created automatically on Amazon Linux?
Every time my Laravel application creates a new file in the temp folder, it is not able to execute/use it.
How can I set things up so that all new files automatically get 777 permissions?
Every time I have to change the permissions manually to make it work.
Run
chmod -R 777 /path/to/folder
with SSH on your server. Note, however, that chmod only changes files that already exist; newly created files do not inherit the folder's mode bits. Their permissions are determined by the umask of the process that creates them (or by default ACLs, as sketched below).
Be careful: giving everyone full permissions on a folder is generally a bad idea and can turn into a security issue. See this article.
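If the real goal is just for the web server to be able to use the files Laravel drops into the temp folder, a less drastic sketch is default ACLs, assuming the web server runs as apache (the default on Amazon Linux; replace the user and path with yours):
# give the web server group read/write on what is already there
sudo setfacl -R -m g:apache:rwX /path/to/folder
# default ACL: anything created in this folder later gets the same group access automatically
sudo setfacl -R -d -m g:apache:rwX /path/to/folder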
I managed to configure my website on a Linux EC2 instance with Drupal, but I don't know where I need to modify the files on the server. I already have a fully functional website on my localhost and would like to upload it to my EC2 instance.
Can I upload my site somewhere in Drupal? I also tried without Drupal: I installed Apache and everything, but I can't add files to the /var/www/ folder because I don't have the necessary permissions.
Can you please give me some suggestions or tutorials that might help me?
You can change file permissions using a terminal command.
As super user use:
chmod -R 777 /var/www/
The -R makes it recursive. For security reasons it's not good practice to give everyone access to the /var/www folder; really consider whether you want your filesystem to be that accessible.
My suggestion is to make a subfolder and temporarily give it full access while you migrate your site from your local server:
chmod -R 777 /var/www/folder_for_your_drupal_site
After you are done with your local site migration, you should change the permissions back to the Drupal defaults (pay attention to the settings.php file), for example as sketched below.
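A rough sketch of the usual defaults, assuming the site lives in /var/www/folder_for_your_drupal_site, your login user is your_user, and Apache runs as www-data (all three are placeholders to adjust):
cd /var/www/folder_for_your_drupal_site
sudo chown -R your_user:www-data .            # code owned by you, readable by the web server group
sudo find . -type d -exec chmod 755 {} \;     # directories: rwxr-xr-x
sudo find . -type f -exec chmod 644 {} \;     # files: rw-r--r--
sudo chmod 444 sites/default/settings.php     # settings.php should end up read-only
sudo chmod -R ug+rwX sites/default/files      # only the public files directory stays writable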
For further info related to migrating a local Drupal site, check my answer here.
Hope this helps.
My Django app is built on a VM Ubuntu instance via a Fabric script run from my local dev machine as root with sudo. The Fabric script sets up a folder in:
/var/log/FOLDERNAME
and the app is set to log all log data into it.
However, after each build, even though the right permissions (group & folder) exist on the folder (ls -all confirms it), the log files have trouble getting generated unless I SSH to the box after each Fabric build and manually type in:
sudo chmod 777 /var/log/FOLDERNAME -Rf
... then everything works fine.
Can anyone please shed some light and/or point me in the right direction to solve this?
Cheers!
Use put with mode to set up your log file with the right permissions:
put('yourlogfile', 'yourlogfile', mode=0o755)
A side note: using chmod 777 is generally not a good idea. If your VM is running Ubuntu, your Apache runs by default as www-data. A chown to www-data and read/write permissions for this user/group should be enough.
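Concretely, instead of the 777 workaround, the Fabric build could run something along these lines on the box (a sketch assuming the processes writing the logs run as www-data and the folder is /var/log/FOLDERNAME):
sudo chown -R www-data:www-data /var/log/FOLDERNAME
sudo chmod -R u+rwX,g+rwX /var/log/FOLDERNAME
# keep the www-data group on anything created inside later
sudo find /var/log/FOLDERNAME -type d -exec chmod g+s {} \;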