Unable to connect (SSH) to instance with Elastic IP

I'm having issues connecting to the (only) EC2 instance in my VPC. The VPC has public and private subnets, a NAT gateway, an internet gateway and a bunch of security groups.
After trying nearly everything, I'm now at the stage of adding an Elastic IP that points to my EC2 instance, without any luck getting in. I've added an SSH (port 22) rule from 0.0.0.0/0 to ALL my security groups, just to try to connect, but nothing is working.
The command I'm trying to ssh with is this:
ssh -i "path-to-my-key.pem" ec2-user@<public-dns>.eu-west-3.compute.amazonaws.com
and the result is:
ssh: connect to host <public-dns>.eu-west-3.compute.amazonaws.com port 22: Operation timed out
VPC DNS Hostnames is set to true
VPC DNS Resolution is set to true
The instance has a Public DNS (IPv4) name, which is the one I'm trying to connect to

Related

Can't access the EC2 instance remotely

I'm trying to connect to my friend's MySQL database remotely, but I'm getting a connection timeout error. I tried to ssh in, with the same result.
Then I checked the instance. It has a public IP. Ports 3306 and 22 are also allowed in the security group, and rule 100 in the subnet's NACL allows all sources.
What am I missing? Is there any other way that traffic could be blocked? Can anyone help me? I'm still a beginner.
When an SSH connection times out, it is normally an indication that network traffic is not reaching the Amazon EC2 instance.
Things to check:
The instance is running Linux
The instance is launched in a public subnet, which is defined as having a Route Table entry that points to an Internet Gateway
The instance has a public IP address, which you are using for the connection
The Network Access Control Lists (NACLs) are set to their default "Allow All" values
A Security Group associated with the instance that permits inbound access on port 22 (SSH) either from your IP address, or from the Internet (0.0.0.0/0)
Your corporate network permits an outbound SSH connection (try alternate networks, e.g. home vs work vs tethered to your phone)
See also: Troubleshooting connecting to your instance - Amazon Elastic Compute Cloud
Based on your descriptions, I would suggest checking whether the instance was launched in a public subnet.
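If it helps, those checks can also be run with the AWS CLI; a rough sketch (the instance and subnet IDs below are placeholders):

# Confirm the instance's subnet and that it actually has a public IP
aws ec2 describe-instances --instance-ids i-0abc123example \
    --query 'Reservations[].Instances[].[SubnetId,PublicIpAddress]'

# List the routes for that subnet; a public subnet should show a
# 0.0.0.0/0 route whose GatewayId starts with "igw-"
aws ec2 describe-route-tables \
    --filters Name=association.subnet-id,Values=subnet-0abc123example \
    --query 'RouteTables[].Routes'
# (An empty result means the subnet has no explicit association and
# falls back to the VPC's main route table.)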
I found the reason. That instance was deployed in a private subnet and didn't have inbound access.
Solution:
I deployed a bastion host in a public subnet and used SSH agent forwarding to access the instance through the bastion host.
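In case it is useful to others, a minimal sketch of that setup (the host addresses and key path are placeholders):

# On your local machine, load the key into the agent and connect to
# the bastion with agent forwarding enabled
ssh-add path-to-my-key.pem
ssh -A ec2-user@<bastion-public-ip>
# From the bastion, hop to the private instance; the private key
# never leaves your machine
ssh ec2-user@<private-instance-ip>

# Or do both hops in one command with ProxyJump:
ssh -J ec2-user@<bastion-public-ip> ec2-user@<private-instance-ip>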

Not able to ssh/http into EC2 instance

I am at my wits end with this, please help.
I am creating EC2 instances in my default public VPC, yet I am not able to ssh or http to my instance or the web server running on the machine. I checked the following:
The SG has inbound SSH, HTTP and HTTPS allowed from 0.0.0.0/0 and assigned to my instance
the default VPC has route tables with 0.0.0.0/0 pointing to the IGW
the NACLs are configured to allow all traffic; I also tried manually updating them to allow only HTTP, HTTPS and SSH
the key I use has been given the right permissions by running chmod 400 filename
Despite all this, I am not able to connect to the EC2 instance. The AMI is Amazon Linux 2.
When I try to ssh, I get a connection timeout error after a while. Initially I thought it was my office network, but I am also getting the same result from my home network with no firewalls in place.
To allow an SSH connection, you will need:
An Amazon EC2 instance running Linux launched in a public subnet (defined as having a Route Table that directs 0.0.0.0/0 to an Internet Gateway)
A Security Group permitting Inbound access on port 22 (Outbound configuration is irrelevant)
Network ACLs left at their default settings of Allow All in both directions
A Public IP address associated with the instance
From your description, I would say that the problem is probably with the outbound NACLs. Return traffic from an SSH session goes back to the ephemeral source port chosen by the initiating client, which is not port 22, so an outbound rule that only allows port 22 will drop it. In general, only change the NACLs if you have a specific reason, such as creating a DMZ. I recommend you reset the NACL rules to Allow All traffic in both directions.
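If you want to do that from the AWS CLI, something like the following should work, assuming a rule number 100 already exists (the subnet and ACL IDs are placeholders; use create-network-acl-entry instead if the rule is missing):

# Inspect the NACL attached to the instance's subnet
aws ec2 describe-network-acls \
    --filters Name=association.subnet-id,Values=subnet-0abc123example

# Restore allow-all rules for all protocols in both directions
aws ec2 replace-network-acl-entry --network-acl-id acl-0abc123example \
    --ingress --rule-number 100 --protocol=-1 --rule-action allow \
    --cidr-block 0.0.0.0/0
aws ec2 replace-network-acl-entry --network-acl-id acl-0abc123example \
    --egress --rule-number 100 --protocol=-1 --rule-action allow \
    --cidr-block 0.0.0.0/0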

AWS Connection to EC2 timed out over SSH

I have tried to connect to EC2 using SSH, but get ssh: connect to host XXXXXXXXX port 22: Connection timed out
Note: XXXXXXXXX is user@IP
I have also checked the security groups. Inbound rules allow SSH:
SSH TCP 22 0.0.0.0/0 -
SSH TCP 22 ::/0 -
The first time, I was able to log in using SSH. After that, I installed a LAMP stack on the EC2 instance. I think I forgot to add ssh to the ufw rules.
I'm not able to connect using the browser-based SSH connection in AWS, and the Session Manager connection method shows errors.
How can I connect using SSH or some other method, so that I can allow SSH in the ufw rules?
This indicates that you cannot reach the host.
From your question I can see you have validated security group access, however there are some other steps you should take to investigate this:
Is the IP address a public IP? If so, ensure that the instance's subnet has a route table with an Internet Gateway associated with it.
Is the IP address a private IP? Do you have a VPN or Direct Connect connection to the instance? If not, you would need to either set one up or use a bastion host. If you do, ensure that the route tables reference your on-premises network range.
If you're not using the default NACLs for your subnet, check that both the port ranges from your security group and the ephemeral port ranges are allowed.
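One way to eyeball those NACL rules with the AWS CLI (the subnet ID is a placeholder):

# Dump every rule on the NACL covering the instance's subnet; look
# for an outbound allow on TCP 1024-65535 (the ephemeral return ports)
aws ec2 describe-network-acls \
    --filters Name=association.subnet-id,Values=subnet-0abc123example \
    --query 'NetworkAcls[].Entries'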

Unable to SSH to my EC2 instance despite adding my IP in the security group route table

I have tried everything I could.
Deleted the previous EC2 instances
Used a new key pair
Used PuTTY to connect with the new key pair
Used the Chrome extension Secure Shell App to connect to the EC2 instance with the new key pair
I added my IP address to my security group's inbound rules, but I am still not able to access the EC2 instance.
Attached are the images of my issues.
Cause of the problem:
The port number for SSH is 22.
However, the screenshot for the ssh error shows that the connection is being attempted on port 80.
Suggested fix:
The problem can be fixed by specifying the port number as '22' in the SSH client connection settings.
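For example, with a command-line client (the key path and address are placeholders; port 22 is also the default when -p is omitted):

ssh -p 22 -i "path-to-my-key.pem" ec2-user@<instance-public-ip>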
To access the EC2 instance via SSH, check:
The instance has been launched in a public subnet (defined as having a Route Table that routes traffic to an Internet Gateway)
The Security Group should be permitting inbound traffic on port 22 from your IP address (or a wider range, such as 0.0.0.0/0)
Don't change the NACLs from default
Make sure the instance is running Linux
For EC2 Instance Connect, make sure it is using Amazon Linux 2 or Ubuntu 16.04 or later (see the sketch after this list)
Make sure you are connecting to the public IP address of the instance (based on your pictures, you are doing this)
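As a rough illustration of what EC2 Instance Connect does (the instance ID, Availability Zone and key names are placeholders), you can push a temporary public key yourself and then ssh in within the 60-second window:

# Push a temporary public key to the instance
aws ec2-instance-connect send-ssh-public-key \
    --instance-id i-0abc123example \
    --availability-zone eu-west-3a \
    --instance-os-user ec2-user \
    --ssh-public-key file://my_temp_key.pub

# Connect with the matching private key before the window closes
ssh -i my_temp_key ec2-user@<instance-public-ip>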
Simple hint: If the connection takes a long time to fail (or hangs), then there is no network connectivity to the instance. Check Security Groups and VPC configurations. If an error comes back immediately, then network connectivity is okay and the connection is simply being refused by the instance.
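A quick way to tell the two cases apart from your machine (the address is a placeholder):

# A timeout here means traffic is being dropped (Security Group,
# NACL or routing); "Connection refused" means the network path is
# fine but nothing is listening on port 22
nc -vz -w 5 <instance-public-ip> 22

# Cap ssh's wait so a silent drop fails fast instead of hanging
ssh -o ConnectTimeout=5 -i "path-to-my-key.pem" ec2-user@<instance-public-ip>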

AWS EC2 instance unable to SSH to remote server

I have 2 EC2 Ubuntu instances: Instance-A and Instance-B. On both, ufw shows inactive, and they are in the same subnet of a VPC. Both security groups allow all inbound and outbound traffic from anywhere, and they have identical ssh_config files.
From the command line of Instance-B, I can ssh to any of my SSH servers, whether they are in the same VPC or are non-AWS servers.
However, from the command line of Instance-A, I can only ssh to Instance-A and Instance-B using their private IPs. I cannot ssh to either (even Instance-A itself) using their public IPs, nor can I log in to any non-AWS server. The error is 'connection timeout'.
How can I make Instance-A's ssh client work?
[Added facts]
On Instance-A, I can ping google.com, A's public IP and B's public IP.
The ssh client used to work well on Instance-A. I don't know what has changed.
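A couple of diagnostics that could narrow this down from Instance-A (the address is a placeholder):

# Does a raw TCP handshake to the public IP get anywhere, or is it
# dropped before ssh even starts?
nc -vz -w 5 <instance-B-public-ip> 22

# Verbose output shows exactly where the ssh attempt stalls
ssh -vvv ec2-user@<instance-B-public-ip>

# Check whether a local firewall rule was added outside of ufw
sudo iptables -L OUTPUT -n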