ssh with password into AWS EC2 instance - amazon-web-services

I'm trying to set up an AWS instance for about 90 people where they can sftp in and drop files, but it appears that AWS blocks password logins for ssh and only allows key-based ssh logins by default.
Is there a way to change this behavior in AWS and allow normal ssh, and therefore sftp? It would be great if I didn't have to manage ssh keys on top of user names and passwords.

You can enable password authentication following helloV's advice; however, I would advise against this.
Why would you not just use SSH keys?
Each of these 90 people should already have their own private key that they use daily; ask each person who needs access to this machine for their public key.
Take each developer's public key and add it, one per line, to a file on the EC2 host at
/home/<user>/.ssh/authorized_keys
From here, each user will be able to login using
ssh -i /path/to/private/key <user>@ec2.host.com
sftp -oIdentityFile=~/.ssh/keyfile <user>@ec2.host.com
This is much more secure in that you will not need a single password shared between 90+ developers, a password that will at some point be written on a sticky note and lost, or that someone will change and lock out the other 89 developers.
This also has the advantage that if one of those 90 developers leaves the company, all you need to do is remove their public key from _authorized_keys_ and that person loses access to the machine; you will not need to change a shared password for all users.
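If you are doing this for ~90 accounts, a small provisioning loop keeps it manageable. The following is only a minimal sketch, assuming a Debian/Ubuntu-style adduser and that you have collected each person's public key locally as keys/<username>.pub (the directory layout and usernames are illustrative):
#!/bin/bash
# Sketch: create a login per collected public key and install it with safe permissions.
# Assumes keys/<username>.pub files exist locally; adjust paths to your environment.
for keyfile in keys/*.pub; do
  user=$(basename "$keyfile" .pub)
  sudo adduser --disabled-password --gecos "" "$user"
  sudo mkdir -p "/home/$user/.ssh"
  sudo cp "$keyfile" "/home/$user/.ssh/authorized_keys"
  sudo chown -R "$user:$user" "/home/$user/.ssh"
  sudo chmod 700 "/home/$user/.ssh"
  sudo chmod 600 "/home/$user/.ssh/authorized_keys"
done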

First, open a few terminal sessions to the EC2 machine so that, if modifying the SSH config fails, you still have a way to restore the old values.
Edit /etc/ssh/sshd_config (after saving a backup file to sshd_config.bak) and change the following lines to:
# Change to no to disable tunnelled clear text passwords
PasswordAuthentication yes
In addition, if you want to disable key based authentication:
PubkeyAuthentication no
Save the file and restart the SSH daemon:
sudo service ssh restart
Create a new user and set a password (or use an existing user whose password is already set), then open a new SSH session; you should be able to log in using the password. If this didn't work, use any of the other open terminal sessions to restore the old sshd_config, restart the SSH service, and continue debugging.
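Put together, the sequence looks roughly like this on an Ubuntu host (a sketch only: the sed expression is just one way to flip the setting, and sftpuser is an example name; you can equally edit the file by hand):
sudo cp /etc/ssh/sshd_config /etc/ssh/sshd_config.bak
sudo sed -i 's/^#\?PasswordAuthentication .*/PasswordAuthentication yes/' /etc/ssh/sshd_config
sudo sshd -t                   # sanity-check the config before restarting
sudo service ssh restart
sudo adduser sftpuser          # example user; prompts for a password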

The critical part is to modify /etc/ssh/sshd_config on the server and restart the SSH server (usually "service sshd restart" - note it is "sshd", not "ssh" at least on Red Hat).
PasswordAuthentication yes
You may also want to run
sshd -t
before you run "service sshd restart", to make sure that you don't accidentally lock yourself out by a typo in the sshd_config file.
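For example, you can chain the check and the restart so the restart only happens if the config parses cleanly (a sketch for a Red Hat-style host):
sudo sshd -t && sudo service sshd restart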
Using public keys for authentication probably makes more sense, depending on your scenario.
If you use public key authentication and have anything like 50 people, make sure you also manage the keys properly. See https://www.ssh.com/iam/ssh-key-management or NIST IR 7966 for more information. Properly terminating keys for people who leave is also required by most compliance regulations if you happen to be in health care, finance, government, critical infrastructure, or anything else that is regulated. Shared passwords are expressly forbidden by most regulations and are generally bad security practice.
You might also want to look at CryptoAuditor https://www.ssh.com/products/cryptoauditor/, which can enforce file transfer policies (direction etc), make sure you don't accidentally give people shell access if not needed, and it can record who did what (for both file transfers and shell access).

Related

Troubleshooting ssh login failure for AWS EC2 using powershell

Short problem description: I had created two new AWS EC2 server instances, one running Red Hat Linux and one running Ubuntu. I was trying to log in to them over SSH via PuTTY but was getting errors (detailed further in the question). This is just a short description.
Detailed problem description: I generated a new .pem file after creating a free-tier Amazon EC2 (Elastic Compute Cloud) instance running Ubuntu 20. Then I used the PuTTYgen tool on my Windows 10 PC. When trying to log in through PuTTY it gave me the error shown below, so I was not able to log in to the AWS machines. I investigated the issue further, and all the troubleshooting steps are described in detail in this self-answered question (other suggestions/solutions are welcome).
Error's description:
"Server refused our key" On terminal AND "NO supported authentication method available(server sent:public ke;gssapi-with-mic)"
Initial troubleshooting:
My very first thought was that it could be Control-M (^M) characters in the .pem file or in the private key. But when I checked, the key looked fine.
Then I thought it could be an inbound-rule issue, i.e. that SSH to the AWS server was not allowed, but I was wrong: when I checked the server's configuration it was fine, and port 22 was open for SSH.
Then I checked whether the .pem file had the proper permissions; it had read permission for all users (everyone who had access to it).
Obviously none of the above worked; enough talk, so on to the actual solution :)
ACTUAL FIX with complete steps:
Since I am using Windows 10 on my PC, I decided to try SSH from another tool, so I tried SSH from PowerShell as follows.
PS E:\test> ssh -i "aws_key_generator.pem" ubuntu@singh_server_test.compute.amazonaws.com
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@         WARNING: UNPROTECTED PRIVATE KEY FILE!          @
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
Permissions for 'aws_key_generator.pem' are too open.
It is required that your private key files are NOT accessible by others.
This private key will be ignored.
Load key "aws_key_generator.pem": bad permissions
The above gave me a lot of confidence and showed me what to do next. I immediately checked the file's permissions again (READ, as verified in my initial troubleshooting above). I then saw that other groups were present in the permissions, so I removed inheritance by going to "Right click on .pem file" --> "Properties" --> "Security" --> "Advanced" and disabling "Inheritance" there.
Then I kept only the "admin" group, with READ-ONLY permission, and saved the file's properties.
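The same tightening can be done from PowerShell instead of the Properties dialog. A rough sketch using the file name from this question (the icacls flags below are the usual ones for stripping inheritance and granting a single read-only entry; verify them on your system):
icacls.exe .\aws_key_generator.pem /inheritance:r
icacls.exe .\aws_key_generator.pem /grant:r "$($env:USERNAME):(R)"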
Again I tried to SSH from PuTTY (using the private key file generated by PuTTYgen, as mentioned in the problem statement), but it failed.
But when I tried the PowerShell command (which used the .pem file), the login now succeeded.
Then I switched to root (su), went to the /home/ubuntu/.ssh directory, checked the authorized_keys file, and could not see the PUBLIC key inside it.
So I added the public key generated by PuTTYgen to "authorized_keys" and saved it.
Again I tried to log in to the server via PuTTY, passing the private key to it, and this time I was able to log in successfully. BINGO :)
I fixed the issue with this workaround and thought I'd share it with everyone here; other solutions are welcome too. Cheers and happy learning.
It appears that you are logging into an Ubuntu server.
For that, the username should be ubuntu, not ec2-user.

Working Private Key Pair Stopped Working

After several days of working on a Flask web app, I tried using Putty and WinSCP to connect to my server and got the following error message:
Disconnected: No supported authentication methods available (server sent: public key).
I then launched a new EC2 instance; after getting a basic web app up and running, logging out of Putty, and trying to get back in, the same thing happened.
A perfectly good key pair that had been working fine suddenly stops working.
Any thoughts on what the problem might be?
Based on your description regarding permission changes I think @sqlbot has the right answer here. If you've recursively changed permissions or ownership within a user directory (chmod -R 777 /home/someuser or something similar) then you may have accidentally changed the permissions of /home/someuser/.ssh/authorized_keys, after which the public keys associated with this user will no longer be honored.
These are the correct permissions for .ssh and .ssh/authorized_keys:
chmod 0700 ~/.ssh
chmod 0600 ~/.ssh/authorized_keys
Unfortunately if you can't access the instance you won't be able to reset them at this point.

Fabric run not working with password

I want to copy a file from remote1 host to the remote2 host using fabric.
I am trying to pass the password for remote2 to the fabric run command, but it's still prompting for the remote2 password.
Anything wrong with my code?
run('echo "pass123" | scp my.tar root#1.2.3.4:')
run('"pass123" | scp my.tar root#1.2.3.4:')
Edit:
In our prod and staging environments, key auth is not supported.
The OpenSSH utilities, including scp, don't accept passwords on the command line or standard input. When they read a password, they explicitly open the process's TTY and read from that.
There are basically four approaches available to you:
Use key-based authentication instead of passwords.
Use a program like sshpass or expect to feed the password to scp through a PTY.
Download the OpenSSH source code and modify the software to work the way you want.
Find a way to transfer these files which doesn't involve using the OpenSSH clients software.
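If option 2 works for your environment, the usual shape is to put sshpass in front of scp. A minimal sketch (it assumes sshpass is installed on the host running the command, and note that a password on the command line is visible to other local users via the process list):
sshpass -p 'pass123' scp my.tar root@1.2.3.4:
Inside Fabric, that whole string would be the argument to run().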

AWS - Disconnected : No supported authentication methods available (server sent :publickey)

SSH to my AWS server just broke for both Putty and Filezilla. I'm making some effort for this post to be a comprehensive troubleshooting list, so if you share links to other stack overflow pages, I'll edit them into the question.
Disconnected : No supported authentication methods available (server sent :publickey)
The error is familiar from when I set up the connection almost a year ago. If you're setting up AWS SSH for the first time, these address the most common problems:
Wrong username: Disconnected : No supported authentication methods available (server sent :publickey)
Incorrect .ppk file: Unable to connect to amazon server using putty
However, the only thing I could think that would impact a previously working system is:
Wrong IP: Restarting an AWS instance (or creating an image) is not guaranteed to keep the same IP address. This would obviously have to be updated in putty.
What other possibilities are there?
Solution to this one (per the accepted post below) is that for AWS EC2 all three of these need proper permissions (777 is not OK for any of them). Here's one example that works:
/home/ec2-user/ - 700
/home/ec2-user/.ssh/ - 700
/home/ec2-user/.ssh/authorized_keys - 600
/var/log/secure will tell you which one is throwing an error, consult this video tutorial to get access if you're completely locked out:
http://d2930476l2fsmh.cloudfront.net/LostKeypairRecoveryOfLinuxInstance.mp4
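Once you regain shell access, checking the log and repairing the permissions looks roughly like this (paths assume the ec2-user account; on Debian/Ubuntu the log is /var/log/auth.log rather than /var/log/secure):
sudo tail -n 50 /var/log/secure    # look for "Authentication refused: bad ownership or modes"
sudo chown -R ec2-user:ec2-user /home/ec2-user/.ssh
sudo chmod 700 /home/ec2-user /home/ec2-user/.ssh
sudo chmod 600 /home/ec2-user/.ssh/authorized_keys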
I had the same problem, caused by an accidental mistake. I'll share it here in case someone has made the same mistake.
Basic steps, as others described.
Download putty and puttygen, or the putty package and install it.
Get the .pem file from your AWS EC2 instance.
Use PuTTYgen to convert the .pem file so that you'll have a private key --- the mistake happened here. I chose the "Conversions" tab in PuTTYgen and loaded my .pem file. After loading the .pem file, DO NOT hit "Generate"; instead, go directly to "Save private key". That's the key you need. If you click Generate, you'll get a totally different key pair.
In putty, use ec2-user@your.public.dns.that.you.get.from.aws.ec2.instance, and load the private key at SSH/Auth
Good luck!
Comprehensive answer is here: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/putty.html
Your problem may be caused by an incorrect login name, which varies depending on the AMI.
Use the following logins on the following AMIs:
ubuntu or root on ubuntu AMIs
ec2-user on Amazon Linux AMI
centos on Centos AMI
debian or root on Debian AMIs
ec2-user or fedora on Fedora
ec2-user or root on: RHEL AMI, SUSE AMI, other ones.
If you are using OS:
Windows - get PEM key from AWS website and generate PPK file using PuttyGen. Then use Putty to use the PPK (select it using left-column: Connection->SSH->Auth: Private key for authorization)
Linux - run: ssh -i your-ssh-key.pem login@IP-or-DNS
Good luck.
There is another cause that would impact a previously working system. I re-created my instances (using AWS OpsWorks) to use Amazon Linux instead of Ubuntu, and received this error after doing so. Switching to use "ec2-user" as the username instead of "ubuntu" resolved the issue for me.
I got this error when I upgraded from Ubuntu 20 to Ubuntu 22.
The solution was simply to upgrade my PuTTY to the latest version.
You will also receive
Disconnected : No supported authentication methods available (server sent :publickey)
when you have a correct Linux user but you haven't created the file .ssh/authorized_keys and saved the public key as indicated in Managing User Accounts on Your Linux Instance
PuTTY does not natively support the private key format (.pem) generated by Amazon EC2. PuTTY has a tool named PuTTYgen, which can convert keys to the required PuTTY format (.ppk). You must convert your private key into this format (.ppk) before attempting to connect to your instance using PuTTY.
The steps how to perform this are described here: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/putty.html
This solved the problem.
For me this error appeared immediately after I changed the user's home directory with
sudo usermod -d /var/www/html username
It can also happen because of lack of proper permission to authorized_key file in ~/.ssh. Make sure the permission of this file is 0600 and permission of ~/.ssh is 700.
Check the version of Ubuntu.
If it's 22 then you need to update the putty too.
Updating putty version solved my issue.
Don't generate a private key.
Your problem is that, after opening PuTTYgen and loading file-from-aws.pem, you clicked Generate. This is wrong; just click the "Save private key" button.
In most cases, you get the "no supported authentication method" error when using the wrong username to log in. But I found something else, in case you are still struggling with the connection issue and have tried all the options above.
I created a couple of Linux VMs and tried to reproduce the connection issue. One thing I found is that when AWS asks you to name your key pair, DO NOT use a blank space (" ") or a dot (".") in the key pair name, even though AWS allows you to do so.
For example, when I named the key pair "AWS.FREE.LINUX", the connection was always refused. When I named it "AWS_FREE_LINUX", everything worked fine.
Hope this helps a little bit.
The login depends on the AMI you created the instance from. Use the value on the left-hand side as the username when logging in.
ubuntu- ubuntu AMIs
ec2-user- Amazon Linux AMI
centos- Centos AMI
debian or root- Debian AMIs
ec2-user or fedora- Fedora
If you use PuTTY to connect to your instance and get either of the following errors, Error: Server refused our key or Error: No supported authentication methods available, verify that you are connecting with the appropriate user name for your AMI. Type the user name in User name in the PuTTY Configuration window.
The appropriate user names are as follows:
For Amazon Linux 2 or the Amazon Linux AMI, the user name is ec2-user.
For a CentOS AMI, the user name is centos or ec2-user.
For a Debian AMI, the user name is admin.
For a Fedora AMI, the user name is fedora or ec2-user.
For a RHEL AMI, the user name is ec2-user or root.
For a SUSE AMI, the user name is ec2-user or root.
For an Ubuntu AMI, the user name is ubuntu.
For an Oracle AMI, the user name is ec2-user.
For a Bitnami AMI, the user name is bitnami.
Otherwise, check with the AMI provider.
In my case the problem was that the ppk file was placed in %USERPROFILE%\Downloads instead of the %USERPROFILE%\.ssh folder.
After I moved the file, the problem disappeared.
This: "Disconnected: No supported authentication methods available (server sent: publickey)" happened to me after I turned on Microsoft One Drive backup and sync for my files including the directory where I saved my ssh key. In my case the solution is simple: just go to Putty => SSH => Auth and just (re)browse again to where my same key is located and saved, then it worked. It looks backup and sync software such as Microsoft One Drive (and may be the same with Google Drive), affect the way Putty sees and identifies directories if the key directory is specified and then later some time installing or turning on backup and sync including that directory.
In my case the problem was with the hostname/public DNS. I associated an Elastic IP with my instance and then my DNS changed. I was trying to connect with the old DNS; changing to the new one solved the problem. You can check the details by going to your instance and then clicking View Details.
This happened to me because, after loading the .pem file into PuTTYgen, I pressed the Generate button and then pressed "Save private key". There is no need to press the Generate button. Just Load and press "Save private key".
After an hour grinding through other suggestions and scanning every character too many times, I realized I'd switched to a different user for another reason and created .ssh and .ssh/authorized_keys as that other user instead of as my new user. The owner-only 600 and 700 permissions then prevented the necessary access when attempting to connect as the new user.
Simple fix below - hope sharing the story saves someone a few minutes.
$ chown mynewuser:mynewuser .ssh
$ chown mynewuser:mynewuser .ssh/authorized_keys
Based on multiple instances, if the key file and username are correct, this seems to occur when changing certain directory permissions associated with the root user.
A similar issue happened to me today. I had also searched a lot about this, with no help. I made two changes and it started working properly.
I visited the Amazon documentation, which says to verify that there is a rule that allows traffic from your computer to port 22 (SSH) and, if not present, to create it. I edited the Security Group and added an SSH rule for my IP. This helped.
In my case, in the Putty profile, I also had to authorize again with the .ppk file. I don't know why it asked again, without any changes having been made.
Hope it will help you.
I had the same problem; I used the public DNS instead of the public IP, and it is resolved now.
For me, I just had to tell FileZilla where the private keys were:
Select Edit > Settings from the main menu
In the Settings dialog box, go to Connection > SFTP
Click the "Add key file..." button
Navigate to and then select the desired PEM file(s)
I am using OpsWorks and wanted to register an existing Linux instance from my Windows machine using the AWS CLI.
The first problem was that I had to use my PuTTY-generated .ppk file.
The second problem was that I needed to quote the absolute path to that .ppk file, like this:
aws opsworks register --infrastructure-class ec2 --ssh-username
ec2-user --ssh-private-key "C:\key.ppk"
In order to get this to work with my AWS Linux instance, I had to change the "Type of key to generate" in PuTTYgen from "RSA" to "EdDSA" before clicking "Save private key".
I hope this helps someone else out.
I was using FileZilla to connect, but it gave the same error:
FATAL ERROR: No supported authentication methods available (server sent: publickey)
The solution is simple: I was using an older version of FileZilla, and after updating it, it connected successfully.
You can try to update Putty and WinSCP to solve this.
Solution:
https://www.youtube.com/watch?v=FhPq1EAh8BM
While trying to connect to a SiteGround server via Putty I had the same problem. Their instructions are pretty thorough, and must work for some people, but didn't work for me.
They recommend running pageant.exe, which runs in the background. You register your key(s) with Pageant, and it's supposed to let Putty know about the keys when it tries to connect.
In a couple of places I found suggestions to specify the key directly in the Putty session definition: Putty Configuration > Connection > SSH > Auth > "Private key file for authentication", then browse to your key file in .ppk format.
Doing this without running Pageant resolved the problem for me.
During an SSH session my connection broke, and since then I cannot SSH into my server. I started a new instance and I am able to SSH into the new instance (with the same key).
I mounted the old volume on the new machine and checked .ssh/authorized_keys, and I couldn't find any problem with its permissions or content.

Connect to Amazon EC2 file directory using Filezilla and SFTP

I have created an AWS EC2 Instance and I want to be able to upload files to the server directory using FileZilla in the simplest and most straightforward fashion possible.
I've created a video tutorial for this. Just check:
Connect to Amazon EC2 file directory using FileZilla and SFTP, Video Tutorial
Summary of above video tutorial:
Edit (Preferences) > Settings > Connection > SFTP, click "Add key file"
Browse to the location of your .pem file and select it.
A message box will appear asking your permission to convert the file into ppk format. Click Yes, then give the file a name and store it somewhere.
If the new file is shown in the list of Keyfiles, then continue to the next step. If not, then click "Add keyfile..." and select the converted file.
File > Site Manager Add a new site with the following parameters:
Host: Your public DNS name of your EC2 instance, or the public IP address of the server.
Protocol: SFTP
Logon Type: Normal
User: From the docs: "For Amazon Linux, the default user name is ec2-user. For RHEL5, the user name is often root but might be ec2-user. For Ubuntu, the user name is ubuntu. For SUSE Linux, the user name is root. For Debian, the user name is admin. Otherwise, check with your AMI provider."
Press Connect Button - If saving of passwords has been disabled, you will be prompted that the logon type will be changed to 'Ask for password'. Say 'OK' and when connecting, at the password prompt push 'OK' without entering a password to proceed past the dialog.
Note: FileZilla automatically figures out which key to use. You do not need to specify the key after importing it as described above.
If you use Cyberduck follow this.
Check this post if you have any permission issues.
This is very simple if you use your .pem file (I am using macOS; Windows users can follow the same steps).
Just download your FileZilla (I'm using macOS - and downloaded the free version, that's good enough)
Open Site Manager in FileZilla (⌘S) -> New Site
Put your hostname in the Host field.
Example: eca-**-**-**-111.ap-southwest-9.compute.amazonaws.com
Select Protocol as SFTP - SSH File Transfer Protocol
Select Logon type as Key File
Put your user name in the User field : for me it's ubuntu (find your ssh user)
Note:
OS vs Username
Amazon - ec2-user
Centos - centos
Debian - admin or root
Fedora - ec2-user
RHEL - ec2-user or root
SUSE - ec2-user or root
Ubuntu - ubuntu or root
For the Key file field, browse your pem file: and click Connect
That's all :) have fun!
Note:
(Remember to allow SSH connections from your IP address in EC2; if not, you will get a connection error message!)
Note: Allowing your IP to connect to your AWS instance via SFTP:
EC2 -> Security Groups -> Inbound rules -> Edit -> Add Rule:
SSH | TCP | 22 | My IP (it picks up your IP automatically) | a name for the rule -> Save
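The same rule can also be added from the AWS CLI. A sketch with a placeholder security-group ID and a placeholder /32 for your own public IP:
aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp --port 22 \
    --cidr 203.0.113.10/32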
If anyone is following all the steps and having no success, make sure that you are using the correct user. I was attempting to use "ec2-user" but I needed to use "ubuntu."
Make sure you use port 22. Filezilla will default to port 21 for SFTP.
Just one minor note to the well explained accepted answer of
Yasitha Chinthaka:
Note: FileZilla automatically figures out which key to use. You do not
need to specify the key after importing it as described above.
In my case I already had other 5 ppks from other instances that I was using in the past (with the ppk of the new instance being at the bottom of that list). I added the new ppk of my new instance, and it wouldn't let me connect to it. The error message: too many tries / attempts.
After I deleted the unused ppks, I was finally able to login to the instance.
So no, Filezilla is not that smart ;-)
FileZilla did not work for me, I kept getting this error:
Disconnected: No supported authentication methods available (server sent: publickey)
What did work was the sftp command.
Connect with the EC2 Instance with
sftp -i "path/to/key.pem" ec2-user#ec2-54-212-34-84.us-west-2.compute.amazonaws.com
Downloading files / dirs
To download path/to/source/file.txt and path/to/source/dir:
lcd ~/Desktop
cd path/to/source
get file.txt
get -r dir
Uploading files / dirs
To upload localpath/to/source/file.txt and ~/localpath/to/source/dir to remotepath/to/dest:
lcd localpath/to/source
cd remotepath/to/dest
put file.txt
put -r dir
You can use any FTP client. I use WinSCP and it works just fine. In all these clients you can specify the SSH key.
Old question, but what I've found is that all you need to do is add the ppk file:
Settings -> Connection -> SFTP -> Add key file
User name and the host is same as what you would provide when using putty which is mentioned in http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-connect-to-instance-linux.html
Might help someone.
In my case, FileZilla sends the AWS ppk file to every other FTP server I try to connect to securely.
That's crazy. There is a workaround, described below, but it's ugly.
It does not behave well, as @Lucio M pointed out.
From this discussion: https://forum.filezilla-project.org/viewtopic.php?t=30605
n0lqu:
Agreed. However, given I can't control the operation of the server, is there any way to specify within FileZilla that a site should authenticate with a password rather than key, or vice-versa? Or tell it to try password first, then key only if password fails? It appears to me it's trying key first, and then not getting a chance to try password.
botg(Filezilla admin) replied:
There's no such option.
n0lqu:
Could such an option be added, or are there any good workarounds anyone can recommend? Right now, the only workaround I know is to delete the key from general preferences, add it back only when connecting to the specific site that requires it, then deleting it again when done so it doesn't mess up other sites.
botg:
Right now you could have two FileZilla instances with separate config dirs (e. g. one installed and one portable).
timboskratch:
I just had this same issue today and managed to resolve it by changing the "logon type" of the connection using a password in the site manager. Instead of "Normal" I could select either "Interactive" or "Ask for Password" (not really sure what the difference is) and then when I tried to connect to the site again it gave me a prompt to enter my password and then connected successfully.
It's not ideal as it means you have to remember and re-type your password every time you connect, but better than having to install 2 instances of FileZilla.
I totally agree that it would be very useful in the Site Manager to have full options of how you would like FileZilla to connect to each site which is set up (whether to use a password, key, etc.)
Hope this is helpful!
Tim
Also see:
https://forum.filezilla-project.org/viewtopic.php?t=34676
So, it seems:
For multiple FTP sites with keys / passwords, use multiple FileZilla installs,
OR
use the same ppk key for all servers.
I wish there were a way to tell FileZilla which ppk is for which site in Site Manager.
The simplest and most straightforward option is to create an FTP login.
Here is a short and easy-to-understand tutorial, on Stack Overflow itself, on how to set things up in two minutes:
Setting up FTP on Amazon Cloud Server
First of all Filezilla is an FTP/SFTP client/server. We will need to use the client for this purpose.
1) Download the client from the URL: https://filezilla-project.org/
2) Go to AWS management console and then EC2. Select the instance that you want to access and then copy the DNS or IP address of the instance and then paste it in Filezilla host name.
Follow Image: Amazon Instance Access via FileZilla
3) Then, enter the username for the instance that you have created. For an Amazon AMI it will be ec2-user; for other operating systems it will be different. Then, enter the password and the port, which will be 21 or 22.
4) Then, it will ask for the key, which is in .pem format; just select the .pem file and it will confirm authentication. Click Yes and you are done.
Note: In your EC2 security group, allow port 21 or 22, whichever is required for FTP/SFTP access.
All you have to do is:
1. Open Site Manager in FileZilla.
2. Add a new site.
3. Enter the host address, and the port if it is not the default.
4. Protocol: SFTP.
5. Logon type: Key file.
6. Enter the username.
7. Choose the key file, but beware: on Windows the file explorer looks for .ppk files by default, so choose "All files" in the dropdown, then choose your .pem file, and you are good to go.
Since you have added and configured the site, the next time you want to connect just choose your saved site and connect. That is it.
If you are comfortable using the command line and use Git Bash to SSH into a remote server (Ubuntu, etc.) from your PC, you can use sftp as below; it always works great and seems very fast. The .pem file can be downloaded from AWS EC2 or Lightsail, or any other server. In the command below, replace the path/name of the .pem file, and replace the IP address with that of your remote server (say, a remote Unix or Linux/Ubuntu server).
$ sftp -i /c/Users/pat/Downloads/LightsailDefaultKey-us-east-1-2.pem ubuntu@123.45.6789
Additional commands for actual upload/download using sftp
Go to remote folder
sftp> pwd
sftp> cd /home/ubuntu/mymedia
Go to local folder
sftp> lpwd
Local working directory: /
sftp> lcd /c/Users/pat/Desktop/Camtasia
To upload local files to server
sftp> put *
Or use get if you need to download to your local PC
sftp> get *
Note: this is similar to the ssh to connect to remote using pem file.
$ ssh -i /c/Users/pat/Downloads/LightsailDefaultKey-us-east-1-2.pem ubuntu@123.45.6789
Thanks!
For Ubuntu users, there is another way of making and setting up a key without PuttyGen. I assume that you can connect to the server via terminal and want to set up FileZilla for file transferring.
In terminal execute the command:
ssh-keygen
It will prompt you to enter the destination path for the private and public key files; by default it is
/home/user/.ssh/id_rsa and /home/user/.ssh/id_rsa.pub
you can leave it as is. Then, open the public key and copy its content:
cat .ssh/id_rsa.pub
In the terminal, on the server-side, open the .ssh/authorized_keys and paste the copied line there:
nano .ssh/authorized_keys
You will probably see some existing lines there, like "ssh-rsa AAAAB3Nz...". That's okay; simply start a new line and paste in the line you copied before.
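If you already have working access to the server from that terminal, the manual copy-and-paste can be replaced with ssh-copy-id, which appends the key and sets sane permissions for you (the username and host below are placeholders):
ssh-copy-id -i ~/.ssh/id_rsa.pub ubuntu@your-ec2-public-dns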
https://www.cloudjojo.com/how-to-connect-ec2-machine-with-ftp/
First you have to install an FTP server on your EC2 machine, such as vsftpd.
Configure the vsftpd config file to allow writes and open the ports.
Create a user for the FTP client.
Connect with an FTP client such as FileZilla.
Make sure you open port 21 in the AWS security group.
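A rough sketch of those steps on an Ubuntu instance (the package name and config path are the usual vsftpd defaults; the passive-port range and the example user are assumptions, and the same ports must also be opened in the security group):
sudo apt-get install -y vsftpd
sudo sed -i 's/^#\?write_enable=.*/write_enable=YES/' /etc/vsftpd.conf   # allow uploads
echo -e "pasv_min_port=40000\npasv_max_port=40100" | sudo tee -a /etc/vsftpd.conf
sudo systemctl restart vsftpd
sudo adduser ftpclient   # user the FTP client will log in as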