Can't connect to new Amazon AWS CentOS instance - amazon-web-services

I cannot connect to my AWS instance. I've searched the other related topics, but they haven't given me the answer.
I have created a CentOS AMI and gone through all the steps. At the end I imported a key pair.
The instance is created and everything seems fine, except that I cannot connect to the server.
VIA PUTTY
PuTTY says "login as:", followed by a password request, which of course I don't have. I don't know how to connect using a PEM file; there does not seem to be a way to connect with a PEM file there.
VIA BITVISE
I've created the key pair in Bitvise. When I connect I've tried the usernames ec2-user and root, and set the initial method to publickey+slot2+password (and without password).
It ALWAYS asks me for a password, and when I enter (or don't enter) the passphrase I created with the key, it fails EVERY TIME to connect.
Driving me crazy, WTH Amazon! Just create a user/password for us to connect with! This key pair thing is just crazy.

You cannot log in with the .pem file directly; you have to convert the .pem file to .ppk using PuTTYgen. Then you give PuTTY your hostname and this newly generated .ppk file, and it will let you log in. Use ec2-user as the username.
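If you prefer the command line, a minimal sketch of that conversion using the puttygen tool from the putty-tools package (file names are placeholders); on Windows the same conversion is done in the PuTTYgen GUI via Load and then Save private key:
# Convert the AWS .pem private key to PuTTY's .ppk format
puttygen my_key.pem -o my_key.ppk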

The default user for the CentOS image is centos and NOT ec2-user.

Contrary to other answers, .pem works fine from the macOS terminal! It is possible to sign in to CentOS 7.8 on AWS using ssh -i ~/.ssh/my_key.pem centos@<public_ip>. All it took was to change the user to centos.
Converting to .ppk just produced an error, as shown below:
ssh -i ~/.ssh/my_key.ppk centos@<public_ip>
Load key "/Users/ilam/.ssh/my_key.ppk": Permission denied
centos@3.87.28.200: Permission denied (publickey,gssapi-keyex,gssapi-with-mic).
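As an aside, a "Load key ... Permission denied" message means the client could not read the key file itself, and OpenSSH also refuses keys whose permissions are too open. A minimal sketch of the usual fix (the path is a placeholder):
# Make the key readable by the owner only, then retry
chmod 400 ~/.ssh/my_key.pem
ssh -i ~/.ssh/my_key.pem centos@<public_ip>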

centos@ec2--36--160.us-**-2.compute.amazonaws.com and your .ppk for auth will get you signed in with PuTTY.

Related

Problem creating flock connection using datastream

Good morning everyone.
I've been trying for a few days now to configure the connection to GCP Datastream, without success. I can connect to the database with the provided username and password using DBeaver, but when I try to create the connection to Datastream in GCP I get the following error:
(1045 "Access denied for user 'username'@'XX.XX.XX.XX' (using password: YES)")
The IP shown is one of the IPs that Google says to allow through the firewall, and the username and password are the same ones that already work in DBeaver.
The database's firewall rule is public, that is, it is accessible from any host, which raises even more doubts without pointing to a proper solution.
Thanks in advance.
I hope to be able to connect, create the Datastream, and thus replace an ETL.
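One way to narrow a 1045 error down is to test the same credentials with the mysql client from a host other than the DBeaver machine; if that also fails, the MySQL user itself is probably not granted access from external hosts. A minimal sketch (host, user, and password are placeholders, and the grants shown are the ones Datastream's MySQL documentation typically calls for, worth verifying against the current docs):
# Test the credentials directly; error 1045 here points at the user's grants,
# not at Datastream itself
mysql --host=<database_host> --user=username --password
# If so, an admin can re-create the user to allow external hosts, e.g.:
# mysql> CREATE USER 'username'@'%' IDENTIFIED BY '<password>';
# mysql> GRANT REPLICATION SLAVE, REPLICATION CLIENT, SELECT ON *.* TO 'username'@'%';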

Exporting certificates from AWS to Azure Key Vault

I have been following this sample from AWS. I have followed the guide, downloaded the certificates, converted them into .pfx, and tested them in local code similar to the sample. The code works fine and the intended connection can be established. However, when I try to upload the certificate to Azure Key Vault, I get this error:
The specified X.509 certificate content is invalid. Error: one or more x.509 properties are invalid.
I have Googled and tested quite a few different openssl commands, and also tried to upload through PowerShell (just in case), but it's always the same error. I am a bit clueless as to why this (as far as I know) official AWS example does not just work, and why I cannot simply export certificates from one big cloud company to another.
Does anyone have any guesses as to what is going wrong and what properties may be invalid, or how I can find out?
PS: My Azure code gets all its configuration from Key Vault secrets (connection strings, etc.). I figured Key Vault could similarly be used to store certificates that my code can retrieve and use, instead of storing the certificate file in the project. Do let me know if I have misunderstood what kind of certificates should be saved in the Key Vault.
Please check whether the following can be worked around:
Note that Key Vault requires/accepts only a PEM or PFX file along with a private key. If you have a private key stored separately in a different format, you need to combine the key with the certificate, and the key must not be encrypted. Some certificate authorities (CAs) provide certificates in other formats; therefore, before you import the certificate, make sure that it's in either PEM or PFX file format and that it uses either RSA or elliptic-curve cryptography (ECC) keys. See creating a certificate with a CA not partnered with Key Vault.
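A minimal sketch of combining a certificate and its unencrypted private key into a PFX with openssl (all file names are placeholders; the chain file is optional):
# Bundle certificate, chain, and unencrypted private key into one PFX;
# openssl prompts for an export password
openssl pkcs12 -export -out bundle.pfx -inkey private.key -in certificate.pem -certfile chain.pem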
Some factors to check:
Check whether the PFX has expired, or whether the PFX password is incorrect or has an invalid format.
Try re-importing the cert from a PFX file with the --password parameter if it is password protected.
Make sure access policies (create, get, set, delete, list) exist for the user that is trying to create the certificate, and also for the application.
The name of the secret or certificate you are uploading must be unique and must not match any secret created previously. The content type must be application/x-pkcs12 for a PFX file.
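For the --password point above, a minimal sketch using the Azure CLI (vault, certificate, and file names are placeholders):
# Import a password-protected PFX into Key Vault
az keyvault certificate import --vault-name my-vault --name my-cert --file bundle.pfx --password <pfx-password>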
Also check the similar case in the SO reference.
References:
Understand X.509 public key certificates | Microsoft Docs / About Azure Key Vault certificates
"The parameter KeyVault Certificate has an invalid value" - Microsoft Q&A

How can I log into DigitalOcean droplets using username and password? I want to deploy a MERN app

I'm used to using PuTTY to generate an SSH key and logging in with my private key and passphrase. Now I am working on a project where the owner generated the droplet username and password themselves.
I'm very confused now; I don't know how to log in to the droplet using the password given to me.
What I have tried:
I have tried running ssh root@<password> but it didn't work.
This is the error I got:
ssh: Could not resolve hostname <password>: Name or service not known
Your effort will be greatly appreciated. Thanks
Try ssh <username>@<server> (for example ssh root@203.0.113.0). You should then be prompted for the password. Enter the password (nothing will show while you type), then press Enter.

Unable to log in to my AWS server

I was logging in to my AWS server (ubuntu@54.564.564.1) with my .pem key. Then I created a user on the Ubuntu instance and opened /etc/ssh/sshd_config. In this file I added the following text: "allow user username". I reloaded the file and logged out, but now I am unable to log in to my server either as the new user (username@54.564.56.1) or the old user (ubu..@54.564.564.1).
Try ssh'ing without the ubuntu@ part. Just use the IP address, because the first part specifies the username.
Ex:
If I ssh into my Raspberry Pi with pi@192.168.1.12, it just asks me for my password, not my username, and I log in as the user pi.
Hope this helped.
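For reference, the sshd_config directive for restricting logins is AllowUsers; "allow user username", as in the question, is not valid sshd_config syntax and can leave the daemon unable to accept any login. A minimal sketch of the intended line (usernames are placeholders):
# /etc/ssh/sshd_config -- permit both the original and the new user
AllowUsers ubuntu newuser
# Then reload the daemon:
sudo systemctl reload ssh    # the unit is named sshd on some distros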

AWS prompts for password even though I have a key pair

I am trying to log into AWS after setting it up for the first time. I have been around the forums but cannot find someone with the same problem. Basically, after I SSH in with my key pair file I get hit with a password request. This happens after I do:
chmod 400 x.pem
ssh -i x.pem ubuntu@ec2-52-25-41-126.us-west-2.compute.amazonaws.com
The forums recommend editing my sshd_config file and setting:
PasswordAuthentication yes
However, how can I get to my sshd_config file when I cannot get in in the first place?
Also... I logged into the AWS GUI but cannot find anywhere to modify this password.
So... would anyone know:
1) how to modify sshd_config when I cannot log in via SSH?
2) what the default password might be?
Thanks for your help, Community.
Keep in mind that the AWS system and your actual instance are two separate systems. They don't talk to each other (this is for a very good reason, once you understand the system design).
If you want to change your instance authentication, you need to do so from within the instance. AWS does not have access to do this for you.
If you are having issues connecting to your instance via SSH, you need to post information about what error you are getting.
It's not recommended that you use password authentication; it's generally less secure than using a private key.
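To gather that error information, a verbose run usually shows which keys are offered and which auth methods the server will accept (the host is the one from the question):
# -v prints each key the client offers and the server's replies;
# falling through to "password" means the server rejected the public key
ssh -v -i x.pem ubuntu@ec2-52-25-41-126.us-west-2.compute.amazonaws.com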