Boto3 Instance Not Allowing SSH

I have created an instance using the Boto3 interface, with connection arguments including my key_id and access_key, and a new security group authorized with the following:
security_group.authorize_ingress(IpProtocol="tcp", CidrIp="0.0.0.0/0", FromPort=22, ToPort=22)
I create the instance with
instance = ec2.create_instances(ImageId='ami-5b41123e', KeyName='test_pair57', InstanceType="t2.micro", MinCount=1, MaxCount=1)
I set the program to wait in a while loop until it finds the instance state is running. However, I still can't SSH into the public IP address it then prints out. The connection always times out. I have tried specifying the port, but that does not change it.
Do I need to pass my new keypair's fingerprint somewhere or is there something else I'm missing?

My security settings were incorrect. Checking them in the AWS web console let me verify them and showed me the issue: I needed to attach my newly created security group to the instance, like so:
instance = ec2.create_instances(ImageId=image.id, KeyName='test_pair' + str(rand), InstanceType="t2.micro", MinCount=1, MaxCount=1, SecurityGroupIds=[security_group.id])
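For reference, here is a minimal end-to-end sketch of the flow described above: create the security group, open port 22, launch the instance with that group attached, wait until it is running, and print the public IP. The group name, key name, and AMI ID are placeholders, not values from the original question.

import boto3

ec2 = boto3.resource('ec2')  # assumes credentials/region come from your environment

# Create a security group that allows inbound SSH from anywhere
security_group = ec2.create_security_group(
    GroupName='ssh-test-group',            # placeholder name
    Description='Allow SSH for testing')
security_group.authorize_ingress(
    IpProtocol='tcp', CidrIp='0.0.0.0/0', FromPort=22, ToPort=22)

# Launch the instance with the key pair and the security group attached
instances = ec2.create_instances(
    ImageId='ami-xxxxxxxx',                # placeholder AMI ID
    KeyName='test_pair57',
    InstanceType='t2.micro',
    MinCount=1, MaxCount=1,
    SecurityGroupIds=[security_group.id])

instance = instances[0]
instance.wait_until_running()              # replaces the manual while loop
instance.reload()                          # refresh attributes to pick up the public IP
print(instance.public_ip_address)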

Related

Can't Connect to Public RDS Instance in Python

I created an RDS instance (Postgres, free tier, in default VPC) and selected public access. Now, I am trying to connect to it in Python like this:
import psycopg2 as ps

# define credentials
credentials = {'POSTGRES_ADDRESS': '',   # change to your endpoint
               'POSTGRES_PORT': '',      # change to your port
               'POSTGRES_USERNAME': '',  # change to your username
               'POSTGRES_PASSWORD': '',  # change to your password
               'POSTGRES_DBNAME': ''}    # change to your db name

# create connection and cursor
conn = ps.connect(host=credentials['POSTGRES_ADDRESS'],
                  database=credentials['POSTGRES_DBNAME'],
                  user=credentials['POSTGRES_USERNAME'],
                  password=credentials['POSTGRES_PASSWORD'],
                  port=credentials['POSTGRES_PORT'])
cur = conn.cursor()
Here are the security group's inbound rules (screenshot omitted).
However, this times out. What am I doing wrong? I believe I put in all the credentials above, so maybe it is a problem with the AWS side of things? Please let me know if you need any more info.
Thanks!
You will need to allow inbound access on the Security Group:
Port 5432
Source: Your IP address (for good security)
It is best to create a new security group with these settings, then modify the database to use this new security group. That way, it will not impact other systems, and future changes will not impact this security group.
See also: Change security group on AWS RDS Database Instance
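For illustration, a hedged boto3 sketch of that approach: create a dedicated security group that allows port 5432 from your own IP and attach it to the DB instance. The group name, CIDR, and DB instance identifier are placeholders.

import boto3

ec2 = boto3.client('ec2')
rds = boto3.client('rds')

# Create a dedicated security group for the database (default VPC assumed)
sg = ec2.create_security_group(
    GroupName='rds-postgres-access',           # placeholder name
    Description='Allow PostgreSQL from my IP')
ec2.authorize_security_group_ingress(
    GroupId=sg['GroupId'],
    IpProtocol='tcp',
    FromPort=5432,
    ToPort=5432,
    CidrIp='203.0.113.10/32')                  # replace with your own IP as a /32

# Attach the new security group to the existing RDS instance
rds.modify_db_instance(
    DBInstanceIdentifier='my-postgres-db',     # placeholder identifier
    VpcSecurityGroupIds=[sg['GroupId']],
    ApplyImmediately=True)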
Based on the comments, the issue turned out to be a missing default database in the RDS instance.
Recreating the RDS instance with a default database solved the problem.
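If you recreate the instance with boto3 rather than the console, the initial database can be specified with the DBName parameter. A minimal, hedged sketch (all identifiers and credentials are placeholders):

import boto3

rds = boto3.client('rds')

# Create a new Postgres instance with an initial (default) database
rds.create_db_instance(
    DBInstanceIdentifier='my-postgres-db',   # placeholder identifier
    Engine='postgres',
    DBInstanceClass='db.t3.micro',
    AllocatedStorage=20,
    MasterUsername='postgres',
    MasterUserPassword='change-me',          # placeholder password
    DBName='mydatabase',                     # initial database created at launch
    PubliclyAccessible=True)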

Cannot remote (rdp) into EC2 started from aws lambda using boto3::run_instances

When I launch an EC2 instance from a particular AMI via the web console, it works just fine and I can RDP into it with no problems.
But when I launch another (identical) instance via an AWS Lambda, I cannot RDP into the instance.
Details
Here is the lambda used to launch the instance
import boto3

REGION = 'ap-southeast-2'
AMI = 'ami-08e9ad7d527e4e95c'
INSTANCE_TYPE = 't2.small'

def lambda_handler(event, context):
    EC2 = boto3.client('ec2', region_name=REGION)
    init_script = """<powershell>
powershell "C:\\Users\\Administrator\\Desktop\\ScriptToRunDaily.ps1"
aws ec2 terminate-instances --instance-ids 'curl http://169.254.169.254/latest/meta-data/instance-id'
</powershell>"""
    instance = EC2.run_instances(
        ImageId=AMI,
        InstanceType=INSTANCE_TYPE,
        MinCount=1,
        MaxCount=1,
        InstanceInitiatedShutdownBehavior='terminate',
        UserData=init_script
    )
I can see the instance start up in the AWS console. Everything looks normal until I try to remote in: a prompt saying 'Initiating remote session' appears for ~15 seconds and then it returns
We couldn't connect to the remote PC. Make sure the PC is turned on and connected to the network, and that remote access is enabled.
Error code: 0x204
Note
When I try to connect to the instance through the AWS console, it lets me download an RDP file; however, it doesn't display the 'Get Password' option as it does when I start the exact same AMI through the console (as opposed to via a Lambda).
I suspect I may need to associate the instance with a keypair at launch?
Also note
Before creating this particular AMI, I logged in and changed the password, so I really have no need to generate one using the .pem file.
It turns out I needed to add SecurityGroupIds.
Note that it takes an array of up to 5 values rather than a single value, so it's specified like ['first', 'second', 'etc'] rather than just 'first'; hence the square brackets around ['launch-wizard-29'] below.
I also specified a key.
The following is what worked for me:
import boto3

REGION = 'ap-southeast-2'
AMI = 'ami-08e9ad7d527e4e95c'
INSTANCE_TYPE = 't2.small'

def lambda_handler(event, context):
    EC2 = boto3.client('ec2', region_name=REGION)
    init_script = """<powershell>
powershell "C:\\Users\\Administrator\\Desktop\\ScriptToRunDaily.ps1"
aws ec2 terminate-instances --instance-ids 'curl http://169.254.169.254/latest/meta-data/instance-id'
</powershell>"""
    instance = EC2.run_instances(
        ImageId=AMI,
        InstanceType=INSTANCE_TYPE,
        MinCount=1,
        MaxCount=1,
        InstanceInitiatedShutdownBehavior='terminate',
        UserData=init_script,
        KeyName='aws',  # Name of a key - I used a key (i.e. pem file) that I used for other instances
        SecurityGroupIds=['launch-wizard-29']  # I copied this from another (running) instance
    )
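Strictly speaking, SecurityGroupIds is documented to take security group IDs (sg-...). If you only know the group's name, here is a hedged sketch of looking the ID up first (the group name is just the one from the answer above):

import boto3

ec2 = boto3.client('ec2', region_name='ap-southeast-2')

# Look up the security group ID from its name
resp = ec2.describe_security_groups(
    Filters=[{'Name': 'group-name', 'Values': ['launch-wizard-29']}])
sg_id = resp['SecurityGroups'][0]['GroupId']   # e.g. 'sg-0123456789abcdef0' (placeholder)

# Then pass the ID to run_instances:
# EC2.run_instances(..., SecurityGroupIds=[sg_id])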

Unable to SSH in EC2 instance

I am unable to SSH into my AWS EC2 instance; it shows the error Permission denied (publickey). I have tried generating the public key from my .pem key pair
# ssh-keygen -y -f /path/to/keypair.pem
and added it under Instance Settings > View/Change User Data:
#cloud-config
#ssh_deletekeys: false
#ssh_authorized_keys:
# - ssh-rsa ENTER YOUR PUBLIC KEY HERE ...
#cloud_final_modules:
# - [ssh, always]
But it didn't work and I also tried this tutorial
https://aws.amazon.com/premiumsupport/knowledge-center/recover-access-lost-key-pair/
Still it is not working.
What you are trying to do won't work, and neither will the answer @Rajeev provided. If it were a file permission error, you would see a message indicating that the .pem file had insecure read permissions. What I suspect you'll need to do is kill off that instance and relaunch with a new key. That will be easier than troubleshooting it, and you can make a snapshot to launch a new instance from if you have already done a lot of configuration on it.
AWS best practices state instances should be treated as ephemeral. Follow those guidelines and you'll save yourself a lot of hassle in the long run.
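For illustration, a hedged boto3 sketch of that recovery path: image the existing instance, then launch a replacement from the image with a fresh key pair. The instance ID, image name, and key name are placeholders.

import boto3

ec2 = boto3.client('ec2')

# Create an AMI from the instance you can no longer reach
image = ec2.create_image(
    InstanceId='i-0123456789abcdef0',     # placeholder instance ID
    Name='recovery-image')                # placeholder image name
ec2.get_waiter('image_available').wait(ImageIds=[image['ImageId']])

# Create a fresh key pair and save the private key locally
key = ec2.create_key_pair(KeyName='recovery-key')  # placeholder key name
with open('recovery-key.pem', 'w') as f:
    f.write(key['KeyMaterial'])

# Launch a replacement instance from the image with the new key
ec2.run_instances(
    ImageId=image['ImageId'],
    InstanceType='t2.micro',
    KeyName='recovery-key',
    MinCount=1, MaxCount=1)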

How can I create an AMI that uses password authentication?

My goal is to spin up an EC2 instance, make an AMI from that instance, and then use the custom AMI to spin up a new instance that uses password authentication instead of keys.
I changed /etc/ssh/sshd_config to have PasswordAuthentication yes before creating the AMI, but I still get "Permission denied (publickey)" when trying to SSH to the new instance created from the custom AMI.
I followed these steps and they work on the first instance, but I still get the error when logging into the instance created from the custom AMI.
AWS support fixed my problem; here is their answer:
Make changes to the following lines in the /etc/cloud/cloud.cfg file on the original instance (launched in the steps above) as follows.
Default settings:
ssh_pwauth: 0
lock_passwd: true
Settings after modification:
ssh_pwauth: 1
lock_passwd: false
Then restart the SSH daemon:
service sshd restart
REFERENCES
https://cloudinit.readthedocs.io/en/latest/topics/examples.html#including-users-and-groups
https://cloudinit.readthedocs.io/en/latest/topics/modules.html
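As an alternative to editing /etc/cloud/cloud.cfg on the source instance, the same cloud-init setting can be supplied as user data when the new instance is launched from the custom AMI. A hedged boto3 sketch, assuming the AMI's cloud-init still processes user data on first boot (the AMI ID and instance type are placeholders):

import boto3

ec2 = boto3.client('ec2')

# cloud-config that enables SSH password authentication at first boot
user_data = """#cloud-config
ssh_pwauth: true
"""

ec2.run_instances(
    ImageId='ami-xxxxxxxx',      # placeholder: your custom AMI
    InstanceType='t2.micro',
    MinCount=1, MaxCount=1,
    UserData=user_data)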

Conn Configuration for AWS Lambda Python RDS Postgres IAM Authentication

Recently, AWS added the possibility to access RDS instances with IAM users and roles. I am confused about how to configure a Python connection, since I would not be using the database's own authentication credentials with psycopg2.
Now I am connecting like this:
conn = psycopg2.connect("dbname='%s' user='%s' host='%s' password='%s'" % (db_name, db_user, db_host, db_pass))
I have no idea how to use IAM credentials to connect from my Lambda function with IAM auth.
Please help.
First, you need to create an IAM policy and a DB user as described here:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/UsingWithRDS.IAMDBAuth.html
Then you need to create an IAM role for your Lambda function and attach the IAM policy created above to it. Your Lambda function will need to execute with this role to be able to generate a temporary DB password for the DB user.
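For illustration, a hedged boto3 sketch of that wiring: create the role with a Lambda trust policy and attach the policy that allows rds-db:connect. The role name and policy ARN are placeholders; many people do this step in the console or with infrastructure-as-code instead.

import json
import boto3

iam = boto3.client('iam')

# Trust policy that lets Lambda assume the role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole"
    }]
}

role = iam.create_role(
    RoleName='lambda-rds-iam-auth',                  # placeholder role name
    AssumeRolePolicyDocument=json.dumps(trust_policy))

# Attach the policy that allows rds-db:connect for your DB user
iam.attach_role_policy(
    RoleName='lambda-rds-iam-auth',
    PolicyArn='arn:aws:iam::123456789012:policy/rds-iam-connect')  # placeholder ARN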
Finally, you can generate a temporary password for your DB user (created above) within your Lambda using a code snippet like this:
from urllib.parse import quote_plus
import boto3

def get_password(rds_hostname, db_user, aws_region=None, url_encoded=True):
    if (not aws_region):
        aws_region = boto3.session.Session().region_name
    if (not aws_region):
        raise Exception("Error: no aws_region given and the default region is not set!")

    rds_port = 5432
    if (":" in rds_hostname):
        split_hostname = rds_hostname.split(":")
        rds_hostname = split_hostname[0]
        rds_port = int(split_hostname[1])

    rds_client = boto3.client("rds")
    password = rds_client.generate_db_auth_token(Region=aws_region,
                                                 DBHostname=rds_hostname,
                                                 Port=rds_port,
                                                 DBUsername=db_user)
    if url_encoded:
        return quote_plus(password)
    else:
        return password
Do not cache the password between invocations. Get a new password on every run, since the password has limited time validity and your Lambda container might not be recycled before it expires...
Finally, create the DB connection string for whatever Python package you use (I would suggest some pure-Python implementation, such as pg8000) from your RDS hostname, port, username, and the temporary password obtained with the function above (<user>:<password>@<hostname>:<port>/<db_name>).
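For example, a hedged sketch of using the token with pg8000; the endpoint, user, and database name are placeholders, and ssl_context is included because IAM authentication requires SSL:

import ssl
import pg8000

host = 'mydb.xxxxxxxx.eu-west-1.rds.amazonaws.com'   # placeholder RDS endpoint
user = 'iam_db_user'                                 # placeholder DB user created for IAM auth

# The token is used directly as the password, so skip URL-encoding here
token = get_password(host, user, url_encoded=False)

# SSL is mandatory for IAM authentication; ideally load the RDS CA bundle
ssl_ctx = ssl.create_default_context(cafile='rds-combined-ca-bundle.pem')

conn = pg8000.connect(host=host,
                      port=5432,
                      database='mydatabase',         # placeholder DB name
                      user=user,
                      password=token,
                      ssl_context=ssl_ctx)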
Connecting to the RDS might be a bit tricky. If you don't know how to set up VPCs properly, I would suggest you run your Lambda outside of a VPC and connect to the RDS over a public IP.
Additionally, you will probably need to enforce SSL connection and possibly include the RDS CA file in your Lambda deployment package. The exact way how to do this depends on what you use to connect (I could only describe how to do this with pymysql and sqlalchemy).
Each of these steps could be described in a tutorial of its own, but knowing about them should be enough to get you started.
Good luck!