EC2 FTP User Directory change - amazon-web-services

Our EC2 instance has an FTP user that was successfully set up through the vsftpd program with an original home directory of /home/user/. I followed the instructions in this Stack Overflow post and had the user's shell set to /bin/false.
What I'm looking to do is restrict the FTP user's login to a particular directory, a folder under the html directory: /var/www/html/website.com/userfolder
What I've done:
Added the user to a group ftpgroup
Authorized access and ownership of the new directory to user:ftponly
Changed the user's home directory in /etc/passwd
Added .ssh/authorized_keys with the user's key in the new directory
Changed ChrootDirectory in /etc/ssh/sshd_config to the new directory (see the sketch after this list)
Changed permissions on the directory with chmod -R 775 and ownership to user:ftpgroup
Mounted /var/www/html/website.com/userfolder
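For reference, a ChrootDirectory setup for an SFTP-only user is normally paired with a Match block roughly like the one below in /etc/ssh/sshd_config; the user name and path are the ones from this question, and the exact block is a sketch rather than the real config:
Match User user
    ChrootDirectory /var/www/html/website.com/userfolder
    ForceCommand internal-sftp
    # sshd requires the chroot directory and all of its parent directories to be
    # owned by root and not group- or world-writable, or it will reject the session.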
Before these changes I was able to access the server via FTP; now, upon attempting access, I receive the following errors from the FileZilla client:
Error: Disconnected: No supported authentication methods available
(server sent: publickey) ... Status: Connection attempt failed with
"ETIMEDOUT - Connection attempt timed out"
As it was working before, I'm thinking it might have something to do with permissions; I'm just unsure of what else to change.
Thanks for any insight.

This worked for me.
After creating the user with vsftpd, the user now has access to the directory via FileZilla.
I then added a link from the /home/{user} directory to the /var/www/html/{user} directory.
The user can upload files to the home directory and view them from the html directory.
This is a simple hack. Let me know if this solves your problem.
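Reading that as a symlink in the html tree pointing at the user's home directory, the command would be something like the following (paths are illustrative; swap the two arguments if the link is meant to go the other way):
ln -s /home/{user} /var/www/html/{user}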

Related

Permission Denied / Operation not permitted

I was having an issue while experimenting with my Samsung Android 6.0 device. I downloaded Termux (a Linux terminal for Android which has all the usual Linux terminal functionality), and I converted my phone into an SSH server using the SSH Client app from the Play Store.
The main problem is that when I try to generate an SSH key pair using the ssh-keygen command, it works, but it stores the key in data/data/com.termux/files/home/.ssh, while I want it in sdcard/key/ (I have created a directory named key). When I try to generate a key pair in sdcard/key it shows permission denied. When I try to copy or move the key from data/data/com.termux/files/home/.ssh/ using cp data/data/com.termux/files/home/.ssh/key_pair sdcard/key, it also shows permission denied, so I tried to create another folder in sdcard/ and give it read-write-execute permissions using chmod 777 dir_name, which shows operation not permitted.
I did a bit of research on this topic on Google and found that I have to root my device, so I tried to root it using KingoRoot, iRoot, and adb, but they reported that my device is too strong to be rooted, even though I gave those apps every permission and there is no antivirus or built-in security blocking them on my device. Then I thought to place the key directly in sdcard/key, but I learned that I cannot move anything out of data/data/com.termux/files/home/.ssh.
And when I tried to move files via the SSH server, or upload files to my SSH server (ssh admin@192.168.1.5 -p 2222) using the scp command, it showed "file not found", but the file is in the correct directory at the correct path.
Please also answer the following questions I have regarding the issues above: 1) how to move files from localhost to the SSH server, and 2) how to root a device if it is "too strong to be rooted" (any method).
Thanks !
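For reference on the first question, scp takes the local path first and the remote user@host:path second, and uses an uppercase -P for the port, so a copy to the phone's SSH server would look roughly like this (the destination directory is illustrative):
scp -P 2222 /path/to/local_file admin@192.168.1.5:/sdcard/key/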

How to create folder on Elastic Beanstalk server to install LetsEncrypt SSL certificate with AcmePHP

I have a site running on an Elastic Beanstalk single instance server and want to add automated SSL certificate generation from LetsEncrypt using the AcmePHP library.
The library tries to store the certificates in ~/.acmephp, which the server responds to with an error
Failed to create "/home/webapp/.acmephp": mkdir(): Permission denied.
The AcmePHP library doesn't have a built-in option to change the path, and rather than fork and recompile the script, I'd like to be able to store the files in the default directory.
Does anyone know how I can give the app permission to create this directory, outside of the web root, or how I can make the server create it automatically and have it be available to the app?
It looks like, since it's being run by the webapp user, when AcmePHP tries to store the certificate under that user's home directory it fails because the directory doesn't exist (AFAIK the webapp user only runs httpd and definitely doesn't have a home directory).
A very dirty workaround could be creating that folder manually through a file in the .ebextensions folder of your project. The file would be .ebextensions/create_home.config and it would contain something like this:
files:
  "/tmp/create-home.sh":
    mode: "000755"
    content: |
      #!/usr/bin/env bash
      mkdir -p /home/webapp
      chown webapp:webapp -R /home/webapp

commands:
  01_create:
    command: "/tmp/create-home.sh"
That script is run by the root user, and it then changes ownership of the /home/webapp folder to the webapp user and group respectively. Hope it helps.
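After the next deploy you can check that it worked; assuming SSH access to the instance, listing the directory should show it owned by webapp:
ls -ld /home/webapp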

I can't connect VM on GCP as root

I can't connect to a VM on GCP as root using the browser SSH.
Has anyone had the same problem?
The following message is displayed:
You can drastically improve your key transfer times by migrating to OS Login.
It might be caused by having set a password...
By default, you will log in as your GCP user. Now, to log in as root, please run the following command once the browser SSH works.
sudo -s
If you cannot log in with the browser SSH at all, then I suspect a permission issue with that particular user.
The above is the recommended way of doing things; however, if logging in directly as root is absolutely needed, please follow the steps below:
As root, edit the sshd_config file in /etc/ssh/sshd_config:
nano /etc/ssh/sshd_config
Make sure PermitRootLogin is set to "yes" (see the snippet after these steps) and save the /etc/ssh/sshd_config file.
Restart the SSH server:
service sshd restart
Change the username to root by clicking on the wheel in the top right corner and selecting "Change Linux Username".
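For clarity, the relevant directive in /etc/ssh/sshd_config, shown in isolation, is just:
PermitRootLogin yes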

Google Cloud permissions

I have a hosted server on Google Cloud Platform (GCP), and I am trying to overwrite some files.
I was able to make a connection through WinSCP and find the directory of the files I need to overwrite; however, all the files are read-only.
How can I manage the permissions to give myself add/change permissions?
I agree this seems to be related to permissions on the files. I'm not able to comment, but wanted to add that if you want to avoid changing the ownership of the directory and files, you can always set up a group as the owner.
Details can be found in this discussion.
Summarizing:
# groupadd mygroup
# useradd -G mygroup user1
# chown -R :mygroup /path/folder
# chmod -R g+rw /path/folder
Creates the new group mygroup
Creates user user1 with mygroup as a supplementary group (for an existing user, usermod -aG mygroup user1 does the same)
Recursively assigns group ownership of the contents of /path/folder to mygroup
Recursively grants the group read & write permission on the contents of /path/folder
This will effectively allow you to manage users in mygroup with the appropriate permissions and access.
You need to be the owner of the file in order to be able to make changes. For example, if root is the owner of the file, you won't be able to change it (since GCP doesn't allow root access through FTP).
What you should do is make yourself (the user logged in through WinSCP) the owner of the file using the command line, and then make your changes to the file. Be careful to make the old owner the owner of the file again afterwards.
For example, using CentOS and WinSCP, you would do this:
Log in to your server with WinSCP
Log in to your server through PuTTY or any other command-line client
In PuTTY: sudo chown YOUR_USER /complete/URL/file/in/your/server.XYZ
Make whatever changes you need to your file
In PuTTY: sudo chown OLD_USER /complete/URL/file/in/your/server.XYZ
YOUR_USER is the user you are logged in as on WinSCP.
OLD_USER can be apache, root, or whatever the previous owner was.
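As a concrete (hypothetical) instance of those placeholders, suppose you log into WinSCP as jdoe and the file is currently owned by apache:
sudo chown jdoe /var/www/html/index.php    # take ownership so WinSCP can write the file
# ... overwrite or edit the file from WinSCP ...
sudo chown apache /var/www/html/index.php  # give ownership back to the original owner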
If you want to upload a new file, you must take ownership of the folder. To do that, do not specify the file in the chown command; for instance:
sudo chown YOUR_USER /complete/URL/folder/
Once you finish, give back ownership to OLD_USER.
This can be a pain, but it is the only way I found to edit files on my GCP server...
Hope this helps.

Not able to upload files to AWS server root folder, getting 'file permission error'

I'm trying to upload a Google webmaster file to the AWS root folder with FileZilla. My root folder permission is set to 755, but I'm still getting a file permission error.
You are not authorized to access this directory. Ask your administrator/Root user to run this command.
sudo chown -R USER_NAME:USER_NAME FOLDER_PATH
USER_NAME: Your username
FOLDER_PATH: Folder path
I also faced the same issue earlier and sorted it like this.
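As a hedged illustration (the username and web root depend on your AMI and setup), on a typical Amazon Linux instance serving from /var/www/html this would be:
sudo chown -R ec2-user:ec2-user /var/www/html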