Getting wrong number of instances with boto3 describe_instances - amazon-web-services

I have a Lambda function that describes the instances running in an AWS account, but after I scaled out instances with Auto Scaling, the Lambda function returns the wrong number of instances.
To check this, I used the same logic in a CLI command, and the CLI command gives me the correct number of instances.
CLI command:
aws ec2 describe-instances --filters Name=instance-state-name,Values=running --query 'Reservations[].Instances[].{Instance:InstanceId}' --output json --region eu-west-1
After that I created a Python script and ran it on the server, but it gives the same (wrong) answer as the Lambda function.
Lambda function code:
client = boto3.client('ec2', region_name='eu-west-1')
response_dict = client.describe_instances(Filters=[
    {
        'Name': 'instance-state-name',
        'Values': ['running']
    }
])
instances_list = response_dict['Reservations']
result_dict = {}
for instance_dict in instances_list:
    instanceDetailsresult_dict = {}
    instance_role = None
    instance = instance_dict['Instances'][0]
    instanceId = instance['InstanceId']
    print(instanceId)
Note: this is just a code snippet; all libraries are included.

instance_dict['Instances'] is a list, and [0] only takes the first instance from each reservation. Instances launched together (for example by Auto Scaling) can share a single reservation, so you miss the rest. Iterate over that list instead.
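For example, a minimal sketch of the corrected loop, reusing the response_dict from the snippet above:

for reservation in response_dict['Reservations']:
    # A reservation can contain several instances, so walk all of them.
    for instance in reservation['Instances']:
        print(instance['InstanceId'])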

Related

`aws secretsmanager list-secrets` command to return secrets and filter them by tag

How do I call the aws secretsmanager list-secrets command and filter secrets by their tags? I don't see examples of this here: https://docs.aws.amazon.com/cli/latest/reference/secretsmanager/list-secrets.html
Also, Amazon's docs seem to be wrong: the page says --max-items but it should really be --max-results, and the page doesn't mention how to filter at all.
[Original: December 2019]
You can use jq, for example:
aws secretsmanager list-secrets \
| jq '.SecretList[] | select((.Tags[]|select(.Key=="Name")|.Value) | test("^Production$|^Staging$"))'
You can also use the awscli's in-built query option, for example:
aws secretsmanager list-secrets \
--query "SecretList[?Tags[?Key=='Name' && Value=='Production']]"
You can use boolean tests with the awscli's in-built query option, for example:
aws secretsmanager list-secrets \
--query "SecretList[?Tags[?Key=='Name' && (Value=='Production' || Value='Staging')]]"
Here's an outline of a solution using Python and boto3:
from functools import partial

import boto3


def filter_tags(key, values, secret):
    for tag in secret['Tags']:
        if tag['Key'] == key and tag['Value'] in values:
            return True
    return False


sm = boto3.client('secretsmanager')
paginator = sm.get_paginator('list_secrets')
secrets_list_iterator = paginator.paginate()

filter_production = partial(filter_tags, 'Name', ['Production', 'Staging'])

for secrets in secrets_list_iterator:
    for s in filter(filter_production, secrets['SecretList']):
        print(s['Name'], s['Tags'])
[Updated: January 2021]
The aws secretsmanager list-secrets command now supports filtering via the --filters option. But I recommend that you do NOT use it unless you understand how it actually works (see below) and would benefit from its particular implementation.
Here's an example of how to use it to filter on secrets with a name that begins with Production:
aws secretsmanager list-secrets \
--filters Key=name,Values=Production
Note that you cannot do an exact match with the --filters option, just a 'begins with' match, so be cautious when using it. If you have secrets named Production and Production-Old, both will be returned. That may not be what you want, so in that case use the original client-side queries described above (see also the boto3 sketch further below).
Here's an example of how to use it to filter on secrets with a name that begins with Production or Staging:
aws secretsmanager list-secrets \
--filters Key=name,Values=Production,Staging
Here's an example of how to use it to filter on secrets with a tag key that begins with stage or a tag value that begins with dev:
aws secretsmanager list-secrets \
--filters Key=tag-key,Values=stage Key=tag-value,Values=dev
Note: the --filters option implements logical OR, not logical AND.
Here's a boto3 example, filtering on tag keys that begin with Name or tag values that begin with Production or Staging:
import boto3

sm = boto3.client('secretsmanager')
res = sm.list_secrets(Filters=[
    {'Key': 'tag-key', 'Values': ['Name']},
    {'Key': 'tag-value', 'Values': ['Production', 'Staging']},
])
for secret in res['SecretList']:
    print(secret['Name'], secret['Tags'])
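As noted above, --filters only does a 'begins with' match. If you need an exact name match, one option is to combine the server-side filter with a client-side check; here is a minimal boto3 sketch of that idea (the name Production is just an example):

import boto3

sm = boto3.client('secretsmanager')
# The server-side filter narrows the results to names starting with 'Production' ...
res = sm.list_secrets(Filters=[{'Key': 'name', 'Values': ['Production']}])
# ... and the exact-name check is done client-side.
for secret in res['SecretList']:
    if secret['Name'] == 'Production':
        print(secret['Name'])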

create a Powershell code that works with AWS: to list EC2 Key Pairs that are not in use by instances

I am looking to create a Powershell code that works with AWS: to list EC2 Key Pairs that are not in use by instances.
aws ec2 --profile default describe-key-pairs --query KeyPairs[].[KeyName] --output text \
| xargs -I {} aws ec2 --profile default describe-instances --filters Name=key-name,Values={} --query Reservations[].Instances[].[KeyName,InstanceId] --output text \
| uniq
This code doesn't work in PowerShell.
You need to work with the AWS PowerShell module.
The trick is to work with two cmdlets: Get-EC2Instance and Get-EC2KeyPair. First, get all the key names that are currently in use by instances.
Then get all key pairs and filter them with a basic foreach and if statement.
Take a look at the following code snippet:
Import-Module AWSPowerShell

$keysInUse = @()
$keysNotInUse = @()

# Set AWS credentials
Set-AWSCredential -AccessKey "AccessKey" -SecretKey "SecretKey"

# Get the EC2 key name from each instance
$allInstancesKeys = (Get-EC2Instance -Region "YourRegion").Instances.KeyName

# Get all key pairs in the region and check whether an instance uses each key
Get-EC2KeyPair -Region "YourRegion" | % {
    if ($_.KeyName -notin $allInstancesKeys)
    {
        $keysNotInUse += $_.KeyName
    }
    else
    {
        $keysInUse += $_.KeyName
    }
}

Write-Output "Keys not in use: $($keysNotInUse -join ',')`nKeys in use: $($keysInUse -join ',')"
How to create a new AccessKey and SecretKey: Managing Access Keys for Your AWS Account.
AWSPowerShell module installation.
More about the Get-EC2KeyPair cmdlet. From the docs: "Describes the specified key pairs or all of your key pairs."
More about the Get-EC2Instance cmdlet. From the docs: "Returns information about instances that you own."
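If you ever need the same set-difference logic outside PowerShell, here is a rough boto3 sketch of the approach (the region is a placeholder):

import boto3

ec2 = boto3.client('ec2', region_name='eu-west-1')

# Key names currently attached to instances (paginated, in case there are many).
keys_in_use = set()
for page in ec2.get_paginator('describe_instances').paginate():
    for reservation in page['Reservations']:
        for instance in reservation['Instances']:
            if 'KeyName' in instance:
                keys_in_use.add(instance['KeyName'])

# All key pairs in the region, minus the ones in use.
all_keys = {kp['KeyName'] for kp in ec2.describe_key_pairs()['KeyPairs']}
print('Keys not in use:', ', '.join(sorted(all_keys - keys_in_use)))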

AWS CLI - Get all CloudFormation stacks that have a name that starts with string

What query should I use to get all CloudFormation stacks that start with a specific string?
I have tried the following query, but it always returns an empty array:
aws cloudformation describe-stacks --no-paginate --query "Stacks[?StackName!='null']|[?starts_with(StackName,'HD-') == 'true']"
All the stacks in our account start with "HD-", so this should return the same as
aws cloudformation describe-stacks --no-paginate
But it returns
[]
This command works fine:
aws cloudformation describe-stacks --no-paginate --query \
'Stacks[?StackName!=`null`]|[?contains(StackName, `Release`) == `true`].StackName'
It looks like you need to use backticks (`) rather than single quotes (') for literals inside the query.
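If you would rather avoid the JMESPath quoting rules entirely, here is a minimal boto3 sketch that does the same prefix check client-side (using the 'HD-' prefix from the question):

import boto3

cfn = boto3.client('cloudformation')
paginator = cfn.get_paginator('describe_stacks')

# Filter client-side: keep only stacks whose name starts with 'HD-'.
for page in paginator.paginate():
    for stack in page['Stacks']:
        if stack['StackName'].startswith('HD-'):
            print(stack['StackName'])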

AWS CLI list-objects by specific date

I'm trying to make a Groovy script that lists the objects on AWS S3 that have been uploaded in the past three days. I installed the AWS CLI on the agent that the script runs on. The command I found that lists the objects by date is the following:
def cmd = "aws s3api list-objects --bucket (name of bucket) --query \"Contents[?LastModified>= '2018-10-16'].{Key: Key, LastModified: LastModified }\""
When I run this command on the agent directly from a PuTTY session, it runs fine and lists the objects correctly. But when I try to execute the same command from the Groovy script, I get the following error:
Bad value for --query "Contents[?LastModified: Bad jmespath expression: Unclosed " delimiter:
"Contents[?LastModified
^
I tried replacing the first and last quotation marks with single quotes, but that did not work. I tried doing the same with the quotation marks before Contents and after LastModified, and I also tried assigning Contents[?LastModified>= '2018-10-16'].{Key: Key, LastModified: LastModified } to a string variable and passing its value after --query, but neither worked.
Try the following:
def date = new Date().format('yyyy-MM-dd')
def cmd = ['aws', 's3api', 'list-objects', '--bucket', 'Bucket-Name', '--query', "Contents[?LastModified>='${date}'].{Key: Key , LastModified: LastModified}"]
Remember to always pass the command as a list, not a string. When you pass a list to execute(), each element is handed to the process as a single argument, so the query expression is not split on whitespace the way it is when you execute a plain string.

Getting a list of instances in an EC2 auto scale group?

Is there a utility or script available to retrieve a list of all instances from AWS EC2 auto scale group?
I need a dynamically generated list of production instance to hook into our deploy process. Is there an existing tool or is this something I am going to have to script?
Here is a bash command that will give you the list of IP addresses of your instances in an AutoScaling group.
for ID in $(aws autoscaling describe-auto-scaling-instances --region us-east-1 --query AutoScalingInstances[].InstanceId --output text);
do
  aws ec2 describe-instances --instance-ids $ID --region us-east-1 --query Reservations[].Instances[].PublicIpAddress --output text
done
(you might want to adjust the region and to filter per AutoScaling group if you have several of them)
From a higher-level point of view, I would question the need to connect to individual instances in an Auto Scaling group. The dynamic nature of Auto Scaling should encourage you to fully automate your deployment and admin processes. To quote an AWS customer: "If you need to ssh to your instance, change your deployment process."
--Seb
The describe-auto-scaling-groups command from the AWS Command Line Interface looks like what you're looking for.
Edit: Once you have the instance IDs, you can use the describe-instances command to fetch additional details, including the public DNS names and IP addresses.
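For example, a rough boto3 sketch of that two-step flow (the group name 'my-asg' and the region are placeholders):

import boto3

asg = boto3.client('autoscaling', region_name='us-east-1')
ec2 = boto3.client('ec2', region_name='us-east-1')

# Step 1: get the instance IDs attached to the Auto Scaling group.
group = asg.describe_auto_scaling_groups(
    AutoScalingGroupNames=['my-asg'])['AutoScalingGroups'][0]
instance_ids = [i['InstanceId'] for i in group['Instances']]

# Step 2: look up those instances for their DNS names and IP addresses.
if instance_ids:
    reservations = ec2.describe_instances(InstanceIds=instance_ids)['Reservations']
    for r in reservations:
        for i in r['Instances']:
            print(i['InstanceId'], i.get('PublicDnsName'), i.get('PublicIpAddress'))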
You can use the describe-auto-scaling-instances cli command, and query for your autoscale group name.
Example:
aws autoscaling describe-auto-scaling-instances --region us-east-1
--query 'AutoScalingInstances[?AutoScalingGroupName==`YOUR_ASG`]' --output text
Hope that helps
You can also use the command below to fetch the private IP addresses without any jq/awk/sed/cut:
$ aws autoscaling describe-auto-scaling-instances --region us-east-1 --output text \
  --query "AutoScalingInstances[?AutoScalingGroupName=='ASG-GROUP-NAME'].InstanceId" \
  | xargs -n1 aws ec2 describe-instances --region us-east-1 \
  --query "Reservations[].Instances[].PrivateIpAddress" --output text --instance-ids
I actually ended up writing a script in Python because I feel more comfortable in Python than Bash:
#!/usr/bin/env python
"""
ec2-autoscale-instance.py

Read Autoscale DNS from AWS

Sample config file,
{
    "access_key": "key",
    "secret_key": "key",
    "group_name": "groupName"
}
"""
from __future__ import print_function

import argparse

import boto.ec2.autoscale

try:
    import simplejson as json
except ImportError:
    import json

CONFIG_ACCESS_KEY = 'access_key'
CONFIG_SECRET_KEY = 'secret_key'
CONFIG_GROUP_NAME = 'group_name'


def main():
    arg_parser = argparse.ArgumentParser(
        description='Read Autoscale DNS names from AWS')
    arg_parser.add_argument('-c', dest='config_file',
                            help='JSON configuration file containing ' +
                                 'access_key, secret_key, and group_name')
    args = arg_parser.parse_args()

    config = json.loads(open(args.config_file).read())
    access_key = config[CONFIG_ACCESS_KEY]
    secret_key = config[CONFIG_SECRET_KEY]
    group_name = config[CONFIG_GROUP_NAME]

    ec2_conn = boto.connect_ec2(access_key, secret_key)
    as_conn = boto.connect_autoscale(access_key, secret_key)

    try:
        group = as_conn.get_all_groups([group_name])[0]
        instances_ids = [i.instance_id for i in group.instances]
        reservations = ec2_conn.get_all_reservations(instances_ids)
        instances = [i for r in reservations for i in r.instances]
        dns_names = [i.public_dns_name for i in instances]
        print('\n'.join(dns_names))
    finally:
        ec2_conn.close()
        as_conn.close()


if __name__ == '__main__':
    main()
The answer at https://stackoverflow.com/a/12592543/20774 was helpful in developing this script.
Use the snippet below to pick out ASGs with specific tags and list their instances' details.
#!/usr/bin/python
import boto3

ec2 = boto3.resource('ec2', region_name='us-west-2')


def get_instances():
    client = boto3.client('autoscaling', region_name='us-west-2')
    paginator = client.get_paginator('describe_auto_scaling_groups')
    groups = paginator.paginate(PaginationConfig={'PageSize': 100})
    # Keep only the ASGs that carry the tag Application=CCP
    filtered_asgs = groups.search(
        'AutoScalingGroups[] | [?contains(Tags[?Key==`{}`].Value, `{}`)]'.format('Application', 'CCP'))
    for asg in filtered_asgs:
        print(asg['AutoScalingGroupName'])
        instance_ids = [i['InstanceId'] for i in asg['Instances']]
        if not instance_ids:
            continue
        # Look up the instances that belong to this group
        running_instances = ec2.instances.filter(InstanceIds=instance_ids)
        for instance in running_instances:
            print(instance.private_ip_address)


if __name__ == '__main__':
    get_instances()
For Ruby, using the aws-sdk gem v2.
First create an EC2 resource object like this:
ec2 = Aws::EC2::Resource.new(
  region: 'region',
  credentials: Aws::Credentials.new('IAM_KEY', 'IAM_SECRET')
)

instances = []
ec2.instances.each do |i|
  p "instance id---", i.id
  instances << i.id
end
This will fetch all the instance IDs in the given region; you can narrow the results with more filters, such as ip_address.