AWS EMR connect to Master Node: Connection Refused - amazon-web-services

I have set up a new EMR cluster on AWS. I've copied the SSH connection command for the master node.
ssh -vvv -i ~/.ssh/xxxx.pem hadoop@ec2-xx-xxx-xxx-xx.us-east-2.compute.amazonaws.com
I have the pem file in the correct location. I have changed permissions on the file with
chmod 400 ~/.ssh/xxxx.pem
I still get the following error:
debug1: Connecting to ec2-18-219-186-80.us-east-2.compute.amazonaws.com port 22.
Any help much appreciated.

I figured it out. You have to add an SSH rule to the security group of the machine you wish to SSH into. Go to EC2, then to Security Groups, select the security group attached to the master node, and add an inbound rule for SSH.
From Amazon:
Security groups are restrictive by default. They reject all traffic. You can add a rule to allow traffic on a particular port to your custom or additional security groups. If there is more than one rule for a specific port in two security groups that apply to the same instances, the most permissive rule is applied. For example, if you have a rule that allows SSH access via TCP port 22 from IP address 203.0.113.1 and another rule that allows access to TCP port 22 from any IP address (0.0.0.0/0), the rule allowing access by any IP address takes precedence.
https://docs.aws.amazon.com/emr/latest/ManagementGuide/emr-additional-sec-groups.html
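If you prefer the command line, here is a minimal sketch of adding that rule with the AWS CLI; the security group ID and source CIDR below are placeholders for the master node's security group and your own IP:
# allow inbound SSH (TCP 22) from a single IP; sg-0123456789abcdef0 and 203.0.113.1/32 are placeholders
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp \
  --port 22 \
  --cidr 203.0.113.1/32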

Related

I can SSH to one instance via bastion, but not to a second instance

I am supposed to add the SSH key and then SSH to the AWS instance through a jumphost, in two operations.
So, first adding the key
ssh-add ~/.ssh/<key-file>.pem
Then ssh to jumphost
ssh -A ec2-user@jumphost
And then from jumphost to instance
ssh ec2-user@<private IP>
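As a side note: with the key already loaded in the agent, the same two hops can be collapsed into one command using OpenSSH's -J (ProxyJump) option, available in OpenSSH 7.3+. A minimal sketch, with placeholder host names:
# one-step jump through the bastion; requires OpenSSH 7.3 or newer
ssh -J ec2-user@jumphost ec2-user@<private IP>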
This works for one instance but does not work for another instance: I cannot SSH to that instance from the jumphost.
What instance setting can prevent me from doing SSH?
The output of ssh -v ec2-user@<private IP> is:
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: /etc/ssh/ssh_config line 58: Applying options for *
debug1: Connecting to <private ip>[<private ip>] port 22
There are several possible causes of this issue.
Check whether port 22 is open in the security group of the EC2 instance you are connecting to from the jump host / bastion. If it's not open, add the rule. You can verify it by telnetting to the destination server on port 22 from the jump host / bastion: telnet destinationip 22
If that doesn't work, check whether the EC2 instance you are connecting to is in a running state with 2/2 status checks passed; this confirms that both the network and the operating system are functional (see the sketch after these suggestions).
If that doesn't work, check whether your network access control lists (NACLs) allow the inbound and outbound traffic. The default settings allow all traffic.
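For the status checks, a quick way to confirm them from the command line, assuming the AWS CLI is configured (the instance ID below is a placeholder):
# shows the system and instance status checks; i-0123456789abcdef0 is a placeholder
aws ec2 describe-instance-status --instance-ids i-0123456789abcdef0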
Make sure the firewall is set up correctly! That is mandatory in the AWS console when you set up your new instance. My guess is that a wrong default was selected.
The cause was pointed out by the user LogicIO. Somehow, one instance was moved to a different VPC, so the jumphost (bastion) and the instance ended up in different VPCs and the connection stopped working. I needed a different jumphost.

EC2 is not responding to SSH connections

Another bad day. I have all the configuration in place for my EC2 instance.
Until yesterday I was able to connect to it via SSH on my Mac, but I don't know why it's not connecting now.
The configuration is as below:
Security group: (screenshot)
I'm using the steps below as usual, and I'm in the same directory where mypleaks-inst.pem is kept.
My guess: the security group that was applied was "launch-wizard-2", which by default sets exclusion rules. You need to associate that EC2 instance with one of the two security groups listed in your second screenshot to allow TCP connections on port 22 from the inbound IP range. Or you could modify launch-wizard-2 to incorporate the relevant rules to allow an SSH connection.
If you're sure nothing was changed on the AWS side, then perhaps your SSH service is down temporarily or permanently (was the server overloaded? That's easy to do with a t2.small).
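If you can still reach the box some other way (for example via the EC2 serial console or SSM Session Manager), a minimal sketch for checking the SSH daemon on a systemd-based AMI:
# check whether sshd is running, and restart it if it has died
sudo systemctl status sshd
sudo systemctl restart sshd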
Otherwise, check NACLs and routing.

Cannot reach amazon EC2 instance from local using SSH

I cannot SSH into my Amazon EC2 instance. This seems to be a very common problem, but I have tried everything suggested in all the available documents. Does anyone have any idea what is missing from the steps below?
Created new EC2 instance and downloaded the .pem file
Created a new inbound rule in the EC2 instance's security group to allow my local IP
Created a new inbound rule in EC2 instance Network ACL to allow my local IP
Created a new outbound rule in EC2 instance Network ACL to reach my local IP
Made sure the VPC route table has a route to the internet gateway
Made sure the EC2 instance is attached to the correct security group
After all this, when I try to SSH from my local machine I get "connection timed out". Is there anything else I have to do?
I have also disabled the firewall and tested, just in case.
ssh -vvv -i key.pem ec2-user@********
OpenSSH_7.1p2, OpenSSL 1.0.1g 7 Apr 2014
debug1: Reading configuration data /etc/ssh_config
debug2: ssh_connect: needpriv 0
debug1: Connecting to ec2-instance [********] port 22.
debug1: connect to address ****** port 22: Connection timed out
ssh: connect to host ******* port 22: Connection timed out
A timeout is normally an indication that there is no network connectivity. This is almost always related to the security group. If the error returns immediately, it means that the instance received, but rejected, the connection. If it takes a long time to respond, it typically means that the request is not reaching the instance.
Some suggestions:
Confirm that the instance has been assigned a public IP address (see the sketch after this list)
Confirm that the security group is permitting Inbound Port 22 (SSH) for your IP address. In fact, for testing purposes, temporarily allow it from 0.0.0.0/0 just to eliminate one potential cause of the problem.
Remove your NACL entries. The default NACL entries permit ALL inbound and outbound access. Your changes might have negatively impacted this.
Confirm that it is a Linux instance (Windows does not support SSH by default)
Confirm the instance is in the Public Subnet (the one with your route to the Internet Gateway)
It is possible that your network is rejecting the connection (eg due to firewall rules). Try using a different computer and/or a different network (eg home vs office) to eliminate this as a cause.
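For the first suggestion, one way to confirm the public IP assignment from the AWS CLI (the instance ID below is a placeholder):
# prints the instance's public IP, or nothing if none is assigned
aws ec2 describe-instances \
  --instance-ids i-0123456789abcdef0 \
  --query 'Reservations[].Instances[].PublicIpAddress' \
  --output text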

Accessing ElasticSearch on EC2 instance from outside the cloud

I am trying to access Elasticsearch on a running EC2 instance from outside the cloud. I currently have SSH/HTTP/HTTPS open to the public for inbound traffic, as well as everything open for outbound traffic. I set up a public IP for my EC2 instance as well.
By default Elasticsearch is on port 9200. I'm not sure if I configured my elasticsearch.yml file correctly, but it basically has the default configuration; I only changed cluster.name to something else.
When I type my public IP with port 9200 into my local browser, or locally do a telnet {public-ip} 9200, there is no response. When I SSH into my EC2 instance, I can run curl localhost:9200 and I get the correct response from Elasticsearch.
How can I connect to Elasticsearch running on my EC2 instance from outside the cloud?
I added a custom rule to my security group for inbound traffic that includes port 9200 and is open to 0.0.0.0/0, and I still cannot access this EC2 instance.
Potential issues to check are a wrong bind address and the instance's operating-system firewall.
Check where Elasticsearch is binding: if it is binding to 127.0.0.1, you won't be able to reach it from the outside.
Check the binding by running this in a shell on the Elasticsearch EC2 instance:
sudo netstat -lptun | grep 9200
If it shows 127.0.0.1:9200, there is a misconfiguration; if it shows
*:9200 or :::9200, it is correct.
If it shows 127.0.0.1, you should modify the Elasticsearch parameter network.bind_host as described in https://www.elastic.co/guide/en/elasticsearch/reference/1.4/modules-network.html
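For example, a minimal sketch of that change, assuming a package install with the config at /etc/elasticsearch/elasticsearch.yml; keep the security group locked down if you do this, since it exposes Elasticsearch to whatever the security group allows:
# bind Elasticsearch to all interfaces, then restart it
echo 'network.bind_host: 0.0.0.0' | sudo tee -a /etc/elasticsearch/elasticsearch.yml
sudo service elasticsearch restart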
Additionally, HTTP/HTTPS and SSH are usually allowed by the default operating-system firewall, whereas Elasticsearch's port 9200 is not. This is usually the case for RHEL and CentOS. You can temporarily disable iptables and check whether the connection works.
To disable iptables run:
sudo iptables -F
If the connection works after disabling iptables, you should configure iptables to allow connections on 9200.
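A minimal sketch of allowing 9200 rather than leaving the firewall flushed (persistence of the rule varies by distribution):
# insert an ACCEPT rule for TCP 9200 ahead of any REJECT/DROP rules
sudo iptables -I INPUT -p tcp --dport 9200 -j ACCEPT
# on RHEL/CentOS 6 this saves the rules so they survive a reboot
sudo service iptables save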
I hope this helps.
G.
It is a mess around security groups.
You can add or remove rules for a security group (also referred to as authorizing or revoking inbound or outbound access).
You should use a security group that allows port 9200 when launching your instance.
Establish an SSH tunnel from your desktop to EC2, then simply use your browser; follow the steps given in https://www.jeremydaly.com/access-aws-vpc-based-elasticsearch-cluster-locally/
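A minimal local port-forwarding sketch in the same spirit; the key path and host are placeholders:
# forward local port 9200 to Elasticsearch on the instance, then browse http://localhost:9200
ssh -i ~/.ssh/key.pem -N -L 9200:localhost:9200 ec2-user@<ec2-public-dns>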

Different ssh port than 22 on ec2

I recently changed the Port field in the sshd_config file to a custom port on my Amazon EC2 instance.
However, it doesn't respond to ssh -p1234 user@domain.com because of security groups.
I have the SSH port (22) open in the security group, but I need to make it 1234. In the AWS console, when I changed the SSH source to 0.0.0.0/1234, it says it is unable to find the group.
So, how do I set up the security group correctly?
You appear to have confused the port with the CIDR address.
The CIDR annotation describes the range of addresses which are permitted to establish connections on a port - if you want anyone to be able to access the port from any address, use 0.0.0.0/0.
You then specify the port separately, which in your case is 1234.
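Putting the two pieces together, a rough sketch (the security group ID is a placeholder, and the restart command assumes a systemd-based AMI):
# 1. on the instance: set "Port 1234" in /etc/ssh/sshd_config, then restart the daemon
sudo systemctl restart sshd
# 2. in the security group: open TCP 1234, here to the world; a /32 with your own IP is safer
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp \
  --port 1234 \
  --cidr 0.0.0.0/0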
Please note that changing your SSH port is not considered a best practice. Network scanners will find your "hidden" SSH port in no time. It is actually worse than having SSH on 22, since it gives you a false sense of security.
Consider restricting the actual scope of this service to your home / office IP addresses (as instructed by AWS Trusted Advisor)
You can even utilize Dome9 to have this port normally closed and only opened on demand.
(disclaimer: I'm a proud Dome9'er)