How to create ansible master on GCP VM instance? - google-cloud-platform

I tried to set up an Ansible master on instance-1 in GCP, and I have an application on instance-2. When I try to connect from instance-1 to instance-2 I always get the error message Permission denied (publickey).
I added instance-2's SSH public key to authorized_keys and set PasswordAuthentication to yes in sshd_config. In my Ansible inventory file I have this:
sba ansible_host=35.**.57.** ansible_user=admin
ansible_ssh_private_key_file=/home/admin/.ssh/app_gcp
Also, I can't run sudo ssh-copy-id 35.111.57.111

The problem was a wrong ansible_ssh_private_key_file: it should point to the id_rsa of the machine Ansible runs on (the control node), not the target's key.
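A minimal sketch of the corrected inventory entry, using the masked IP from the question; the id_rsa path is an assumption for where the control node's own key lives:

```shell
# Write a corrected inventory entry: the key file must be the private key
# on the Ansible control node (instance-1), whose public half is in
# instance-2's authorized_keys.
cat > hosts.ini <<'EOF'
sba ansible_host=35.**.57.** ansible_user=admin ansible_ssh_private_key_file=/home/admin/.ssh/id_rsa
EOF

# Sanity-check what Ansible will read:
grep -o 'ansible_ssh_private_key_file=[^ ]*' hosts.ini
```

You can verify the same connection manually with ssh -i /home/admin/.ssh/id_rsa admin@HOST before running Ansible.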

Related

I can't open my VM by SSH

I acquired a solution for Ubuntu Trusty on the Marketplace and created my VM; I only changed the name. When I try to open it by SSH I can't get a connection. This is the error:
"Could not connect to the VM on port 22. Learn about the possible causes of this problem."
A year ago I created a VM and had no trouble; now I just can't connect and I don't know why.
If gcloud denies an SSH connection after a change in the setup, you must update the SSH keys in your metadata:
sudo gcloud compute config-ssh
If the internal firewall of the system blocks port 22, you can try the following steps to fix it:
Shut down the instance
Click 'Edit' for the instance, under 'Custom Metadata' section, add 'startup-script' key, with value:
#! /bin/bash
sudo ufw allow 22
Click 'Save'
Start the instance again, and SSH into it
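The console steps above can also be done from the command line; a sketch, assuming the instance is named instance-1 (substitute your own). The gcloud call is shown commented out since it needs authenticated project access:

```shell
# Write the startup script that opens port 22 in ufw on the next boot.
cat > startup.sh <<'EOF'
#! /bin/bash
sudo ufw allow 22
EOF

# Attach it as instance metadata (requires gcloud auth):
# gcloud compute instances add-metadata instance-1 \
#     --metadata-from-file startup-script=startup.sh
cat startup.sh
```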

Unable to access instance Permission denied (publickey) after installing vsftpd

I am unable to access my EC2 instance over SSH. The last thing I did was create an AMI in the AWS console and then install vsftpd:
sudo apt install vsftpd
That's it.
Now when I try to access the instance over SSH I get the following error:
ssh -i sshkey.pem ubuntu@ec2-00-000-000-000.eu-west-3.compute.amazonaws.com
ubuntu@ec2-00-000-000-000.eu-west-3.compute.amazonaws.com: Permission denied (publickey).
How can I fix this?
Installing vsftpd by itself should not affect your SSH access.
The only causes I can think of:
you regenerated the SSH keys for the ubuntu user;
you stopped and started your EC2 server without an Elastic IP attached, and it now has a different public IP;
your SSH key file has the wrong permissions (it should be 600).
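The last point is easy to verify locally; a sketch using a throwaway file in place of your real .pem:

```shell
# SSH refuses private keys that are group- or world-readable.
touch demo_key.pem
chmod 600 demo_key.pem
stat -c '%a' demo_key.pem    # prints 600 on Linux; on macOS use: stat -f '%Lp' demo_key.pem
```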

SSH back to AWS Lightsail after enabling UFW

I enabled the UFW service without allowing SSH access first, then logged off.
I am now unable to ssh back to the instance.
The steps I have already taken:
Made a snapshot and created a new instance from it
You could create a snapshot, use it to create another new instance, and add sudo service ufw stop to the launch script.
The general point: execute a command (disabling the firewall) while the new instance is being created. I was able to do it with the AWS CLI:
aws lightsail create-instances-from-snapshot --region eu-central-1 --instance-snapshot-name TEST_SSH_FIX-1591429003 --instance-names TEST_SSH_FIX_2 --availability-zone eu-central-1a --user-data 'sudo service ufw stop' --bundle-id nano_2_0
It works.

SSH issues on GCP VM migrated from AWS

I have migrated an EC2 instance (Amazon Linux) to Google Cloud (Ubuntu 18.04) using CloudEndure.
But I am not able to SSH into the Google Cloud VM, and I don't have the EC2 instance anymore. How can I access the Google Cloud VM? Error message:
ERROR: (gcloud.beta.compute.ssh) [/usr/bin/ssh] exited with return code [255]
Using the gcloud tool you can configure your SSH:
gcloud compute config-ssh
For more details on config-ssh, see the gcloud documentation.
If gcloud compute config-ssh doesn't work, check the firewall rules for your machine: find the VPC it's in and make sure port 22 is open; it may be blocked.
If you're not sure whether SSH can come through, create a rule for it.
A very similar issue was discussed in this topic on StackOverflow, which might help you.
To be absolutely sure SSH traffic is allowed on the VM itself, you can set up a startup script: edit the VM in question, find the "Custom Metadata" section, click "Add Item", then type startup-script as the key and sudo ufw allow ssh in the "value" field.
With SSH traffic enabled both in the GCP firewall and on the VM itself, you should be able to log in.
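A sketch of such a firewall rule; the rule name allow-ssh and the default network are assumptions for your setup, and the command is printed rather than executed here since it needs gcloud credentials:

```shell
# Compose a VPC firewall rule admitting TCP 22 from anywhere (shown, not run):
cmd='gcloud compute firewall-rules create allow-ssh --network=default --allow=tcp:22 --source-ranges=0.0.0.0/0'
echo "$cmd"
```

Restricting --source-ranges to your own IP instead of 0.0.0.0/0 is the safer choice where practical.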

AWS ssh into instance giving Permission denied (publickey)

I am trying to ssh into a new EC2 instance. I have followed the instructions, and when I attempt to ssh I get Permission denied (publickey).
Below is an image of everything I did in the console according to the instructions. Everything seems to go fine until I enter yes; then it fails. I have followed the instructions twice and get the same result. I also do not have the AWS CLI tools, as I believe they are optional.
By default, ssh will use your local username to connect to the instance if you have not specified Host, User and IdentityFile in your ~/.ssh/config.
Since you call ssh with the .pem specified on the command line, you also have to specify the remote user name (which is ec2-user for Amazon Linux instances and ubuntu for AWS Ubuntu instances).
Try this command line:
ssh -i sub_api.pem ec2-user@ec2-54....
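The ~/.ssh/config route mentioned above looks like this; a sketch in which the alias my-ec2 and the hostname are placeholders, since the real hostname is elided in the question:

```shell
# A per-host entry so a plain "ssh my-ec2" picks the right user and key.
# Normally this goes in ~/.ssh/config; a scratch file is used here.
cat > ssh_config_example <<'EOF'
Host my-ec2
    HostName ec2-54-XX-XX-XX.compute.amazonaws.com
    User ec2-user
    IdentityFile ~/.ssh/sub_api.pem
EOF
cat ssh_config_example
```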