I'm using Amazon Linux on EC2 and I just added the user using sudo adduser john.
How can I give the user permissions to only ONE folder and all the subfolders in it? For example, they should only have access to /var/www/html and everything in it. How can I do that?
You will have to create a jailed user which will have access to that folder.
Please follow the steps in this tutorial:
http://www.cyberciti.biz/faq/debian-ubuntu-restricting-ssh-user-session-to-a-directory-chrooted-jail/
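For reference, the core of that setup is a Match block in /etc/ssh/sshd_config; a minimal sketch, assuming SFTP-only access to the folder from the question is enough (note that sshd requires the chroot directory and its parent directories to be owned by root and not writable by group or other):
Match User john
    ChrootDirectory /var/www/html
    ForceCommand internal-sftp
    AllowTcpForwarding no
Restart sshd after the change; the tutorial above covers giving john shell access inside the jail if you need more than SFTP.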
The chown command allows you to change the user and/or group ownership of a given file, directory, or symbolic link.
In Linux, all files are associated with an owner and a group and assigned with permission access rights for the file owner, the group members, and others.
You can use
sudo chown -R john folder_name
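Applied to the folder from the question, and with a quick check of the result, that would look something like:
sudo chown -R john /var/www/html
ls -ld /var/www/html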
I have created a Laravel project in Laravel 5.4 and made it live on an AWS server. The issue I face is that I have to give 777 permissions to the storage folder very frequently, and because of this the site does not work properly. I have already given 777 permissions to the storage folder, but somehow the permissions change and the site stops because it cannot write to the log file. Can anyone help me figure out what the issue could be? Thanks in advance.
Giving 777 permissions means opening access to ANYONE on the system, who can then read and write everything in your storage folder.
Instead, you need to give your web server user access to the directories and files, which you can do in the following way
(www-xxx below stands for your web server user, e.g. www-data on Debian/Ubuntu or apache on Amazon Linux):
sudo chown -R www-xxx:www-xxx /path/to/your/laravel/root/directory
Now, in order to grant the storage-level permissions to your web server, run the commands below (again, replace www-data with your web server user if it differs):
sudo chgrp -R www-data storage bootstrap/cache
sudo chmod -R ug+rwx storage bootstrap/cache
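If you are not sure which user your web server runs as (on Amazon Linux, Apache usually runs as apache rather than www-data; this is worth verifying for your setup), you can check the running processes:
ps aux | egrep '(apache|httpd|nginx)' | grep -v grep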
I have a hosted server on Google Cloud Platform (GCP), and I am trying to overwrite some files.
I was able to make a connection through WinSCP, and I'm able to find the directory of the files I need to overwrite; however, all files are read-only.
How can I manage the permissions to give myself add/change permissions?
I agree this seems to be related to permissions on the files. I am not able to comment, so I wanted to add that if you want to avoid changing the ownership of the directory and files, you can always set up a group as the owner.
Details can be found in this discussion
Summarizing:
# groupadd mygroup
# useradd -G mygroup user1
# chown -R :mygroup /path/folder
# chmod -R g+rw /path/folder
Creates a new group ‘mygroup’
Creates user user1 with mygroup as a supplementary group (for an existing user, use usermod -aG mygroup user1 instead)
Recursively gives group ownership of the contents of /path/folder to mygroup
Recursively grants group read & write permission on the contents of /path/folder
This will effectively allow you to manage users in mygroup with the appropriate permissions and access.
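Two small additions, not from the linked discussion but generally useful here: group membership only takes effect on the user's next login (or after newgrp mygroup), and the setgid bit on the folder makes newly created files inherit the group:
# confirm user1 is in mygroup (after a fresh login)
id user1
# make files created under /path/folder inherit mygroup
chmod g+s /path/folder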
You need to be the owner of the file in order to be able to make changes. For example, if root is the owner of the file, you won't be able to change it (since GCP doesn't allow root access through FTP).
What you should do is make yourself (the user logged in through WinSCP) the owner of the file using the command line, then make your changes to the file. Be careful to give ownership back to the old owner afterwards.
For example, using CentOS and WinSCP, you would do this:
Login to your server with WinSCP
Login to your server through putty or any other command line client
in putty: sudo chown YOUR_USER /complete/URL/file/in/your/server.XYZ
make whatever changes you need to make to your file
in putty: sudo chown OLD_USER /complete/URL/file/in/your/server.XYZ
YOUR_USER is the user you are logged in on WinSCP.
OLD_USER can be apache, root or whatever
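If you are not sure who OLD_USER is, record the current owner and group before taking ownership, for example:
stat -c '%U:%G' /complete/URL/file/in/your/server.XYZ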
If you want to upload a new file, you must take ownership of the folder instead. To do that, do not specify the file in the chown command, for instance:
sudo chown YOUR_USER /complete/URL/folder/
Once you finish, give back ownership to OLD_USER.
This can be a pain, but it is the only way I found to edit my files on my GCP server...
Hope this helps.
How do I automatically give 777 permissions to folders/files that get created automatically on Amazon Linux?
Every time my Laravel application creates a new file in the temp folder, the file is not executable.
How can I set things up so that all new files automatically get 777 permissions?
At the moment I have to change the permissions manually every time to make the files executable.
Run
chmod -R 777 /path/to/folder
over SSH on your server to open up the existing files. Note that newly created files do not inherit these permissions automatically; they get their mode from the creating process's umask, so you may also need to adjust the umask of the process that creates them.
Be careful: giving all permissions to a folder is generally a bad idea and can turn into a security issue. See this article.
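If the real goal is just to let the web server create and modify files in that folder, a less open alternative (assuming Apache runs as apache, which is typical on Amazon Linux; adjust the user to your setup) is an ACL with a default entry, so files created later inherit the access:
# give the web server access to what is already there
sudo setfacl -R -m u:apache:rwx /path/to/folder
# default entry on the folder so newly created files inherit the access
sudo setfacl -m d:u:apache:rwx /path/to/folder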
I managed to configure my website on a Linux ec2 instance with Drupal. But I don't know where I need to modify the files of the server. I already have a fully functional website on my local host and would like to upload it in my ec2 instance.
Can I upload my site somewhere in Drupal? I also tried without Drupal: I installed Apache and everything, but I can't add files to the /var/www/ folder because I don't have the necessary permissions.
Can you please give me some suggestions or tutorials that might help me?
You can change file permissions using a terminal command.
As super user use:
chmod -R 777 /var/www/
The -R makes it recursive. For security reasons it's not good practice to give everyone access to the /var/www folder. Really consider whether you want your filesystem to be so accessible.
My suggestion is to make a subfolder and temporarily give it full access while you migrate your site from your local server.
chmod -R 777 /var/www/folder_for_your_drupal_site
After you are done with the migration of your local site, you should change the permissions back to Drupal's defaults (pay particular attention to the settings.php file).
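As a rough guide (these are the commonly recommended values, not taken from the question; check the Drupal documentation for your version and substitute your actual site folder):
chmod 755 /var/www/folder_for_your_drupal_site/sites/default
chmod 444 /var/www/folder_for_your_drupal_site/sites/default/settings.php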
For further info related with migration of local Drupal site check my answer here.
Hope this helps.
I am trying to run Lynx under apache user via sudo, but it seems that lynx tries to access my home directory:
$ sudo -u apache lynx
/home/ssmirnov/: No such directory
I have such permissions on my home directory: drwx------
Can you advise me on how to run Lynx as another user?
You might try using sudo's -H option. It sets $HOME to the home directory of the user you're trying to run as. Perhaps lynx is looking for a file there, I dunno. (It doesn't seem to have a problem on my machine... but eh.)
-i might work as well; it basically sets the environment up as if the user had logged in, including cd'ing to their home directory. Note, that means starting the shell specified for that user, running login scripts, and all that. If the user's not allowed to log in, this will likely fail.
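For example (the URL is only a placeholder):
sudo -H -u apache lynx http://example.com/
# or, with apache's full login environment (will fail if apache's shell is /sbin/nologin):
sudo -i -u apache lynx http://example.com/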
If you want to run it from your home directory, for example to download something to that location, of course you'll have to grant access to apache somehow. This can be done on ext* filesystems on most modern Linux systems (without granting everyone access) by saying something like setfacl -m u:apache:rwx $HOME. In a pinch, you could temporarily put apache in your group and grant group rwx permissions on your homedir... but unless this is your home machine, I wouldn't do that.