Finding IAM SA Keys Older than 89 Days in Google Cloud - google-cloud-platform

I'm trying to write a script that will hunt down all IAM service account keys that have existed for longer than 89 days, to meet our security requirements, but I'm tying myself into knots.
My best attempt so far:
gcloud iam service-accounts keys list --quiet --managed-by=user --iam-account $SERVICE_ACCOUNT_EMAIL --filter='createTime<=P89D' --format='csv[no-heading](KEY_ID)'
But this appears to catch all of the keys. I'm struggling to get my head around Google's filter configurations. Any pointers gladly accepted.

The underlying REST method is projects.serviceAccounts.keys.list and the result is a ServiceAccountKey, which contains validBeforeTime and validAfterTime, both strings in protobuf Timestamp format.
So, I think this needs to be either a string comparison of dates (!) or a comparison of durations (but I'm unfamiliar with the duration format and how to compare them).
You can convert the validAfterTime to a duration, i.e. --filter=validAfterTime.duration() (see duration) and then compare, but then you're comparing Durations rather than timestamps (!).
Or construct a date that's within your scope and compare as-is. The following is hacky, please proceed with caution:
PROJECT=...
ACCOUNT=...
PAST=$(date -d "-90 days" +%Y-%m-%dT%H:%M:%SZ)
EMAIL="${ACCOUNT}@${PROJECT}.iam.gserviceaccount.com"
gcloud iam service-accounts keys list \
--iam-account=${EMAIL} \
--project=${PROJECT} \
--filter="validAfterTime<${PAST}"
I think there's a better way to do this!

I've ended up using the above method with ISO dates generated in a script, and it seems to be working now. It feels like the kind of thing that should be handled nicely by the filters, but getting that working was taking more time than a bash date command.
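For anyone wanting the same thing, a sketch of that approach extended to every service account in a project might look like this (the project ID is a placeholder, GNU date is assumed, and the key ID is taken as the last segment of the key's resource name):
#!/usr/bin/env bash
# Sketch: list user-managed keys older than 90 days for every
# service account in a project.
set -euo pipefail

PROJECT="my-project"   # placeholder project ID
PAST=$(date -d "-90 days" +%Y-%m-%dT%H:%M:%SZ)

for EMAIL in $(gcloud iam service-accounts list \
    --project="${PROJECT}" \
    --format="value(email)"); do
  gcloud iam service-accounts keys list \
      --iam-account="${EMAIL}" \
      --project="${PROJECT}" \
      --managed-by=user \
      --filter="validAfterTime<${PAST}" \
      --format="csv[no-heading](name)" |
  while read -r KEY; do
    echo "${EMAIL},${KEY##*/}"   # key ID is the last path segment
  done
done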

Related

How can I check for GCP projects not in a VPC Service Control Perimeter using bash?

I am looking for a way to use a bash script with gcloud to:
Generate a list of all current projects in the org
Check each project to see if it is in a VPC Service Controls perimeter, and list the perimeter name.
Identify projects that are not in a VPC Service Controls perimeter.
I've had no luck finding a way to script this. I'd like to be able to easily generate this list and identify projects that are not in a VPC SC perimeter. Thanks!
I don't use service perimeters, so it's challenging to write and test a solution, but here are some pointers.
1. Projects
ServicePerimeterConfig resources are of the form projects/{project_number}.
So, when you enumerate the projects, you'll want to use the projectNumber:
gcloud projects list \
--format="value(projectId,projectNumber)"
Consider putting these into an associative array keyed on projectNumber so that you can return the more useful projectId.
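A minimal sketch of that lookup (assumes bash 4+ for associative arrays):
declare -A PROJECTS
while read -r PROJECT_ID PROJECT_NUMBER; do
  # key: projectNumber, value: the more useful projectId
  PROJECTS["${PROJECT_NUMBER}"]="${PROJECT_ID}"
done < <(gcloud projects list --format="value(projectId,projectNumber)")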
2. Service Perimeters
gcloud access-context-manager perimeters list \
--format=...
The documentation is unclear. --format is a global gcloud flag and should support value, json and yaml.
servicePerimeters is a little gnarly (deep) but you probably want a second associative array keyed on projectNumber (again) with the name or title as the value.
You should be able to use scope("project") in the format string to extract the project number.
It's possible that you can map the servicePerimeters using gcloud --format (and transforms) alone, but it may be easier to pipe --format=json into something like jq and munge it there (see the sketch after the notes below).
Can one project be in multiple perimeters?
Can a perimeter include a project that no longer exists?
Note that a servicePerimeter includes both status and spec lists of projects.
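Following the jq route, a sketch that builds the second map straight from perimeter membership (it assumes a default access policy is configured, otherwise add --policy, and that the spec/status resources are projects/{project_number} entries):
declare -A PERIMETERS
while IFS=$'\t' read -r NUMBER TITLE; do
  PERIMETERS["${NUMBER}"]="${TITLE}"
done < <(gcloud access-context-manager perimeters list --format=json |
  jq -r '.[] | .title as $t
    | ((.status.resources // []) + (.spec.resources // []))[]
    | sub("^projects/"; "") + "\t" + $t')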
3. In|Not-In
Array #1 contains all the projects; Array #2 contains the projects that are in a service perimeter.
So, you could iterate over #1 and if it's in Array #2 put it in the "in" list otherwise put it in the "out" list.
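Putting the two arrays together, a sketch of that final pass:
IN=()
OUT=()
for NUMBER in "${!PROJECTS[@]}"; do
  if [[ -n "${PERIMETERS[${NUMBER}]:-}" ]]; then
    IN+=("${PROJECTS[${NUMBER}]} (${PERIMETERS[${NUMBER}]})")
  else
    OUT+=("${PROJECTS[${NUMBER}]}")
  fi
done
printf 'In a perimeter:\n'
printf '  %s\n' "${IN[@]}"
printf 'Not in a perimeter:\n'
printf '  %s\n' "${OUT[@]}"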

GCP, is there a way to find which Asset-type can be labelled and which are not?

I need to find out which resources (asset types) in an entire GCP organization can be labelled.
In short, I do not want resources that don't have a Label column in the schema. Is there a way to find the columns of every asset type, or any other way to extract only the resources that have a Label column/attribute?
gcloud asset search-all-resources --scope=organizations/Org-ID \
--filter=-labels:* --format='csv(name, assetType, labels)' --sort-by=name > notLabels.csv
I use this command to get the resources, but it also returns the resources that can't be labelled.
You can find the list of services that support labels in GCP in this documentation.
You can filter on labels with the following format, for example:
gcloud asset search-all-resources --filter labels.env:*
The above command lists the resources that have env as a label key, with any value.
gcloud asset search-all-resources --filter=-labels.*
The second sample command above lists the resources with no label values, by adding - before the labels parameter.
You can find more information on using filter searches using labels here.
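If the goal is a list of asset types that actually carry labels somewhere in the org, one rough approach (a sketch building on the commands above; Org-ID is a placeholder) is to search for labelled resources and de-duplicate the asset types:
gcloud asset search-all-resources \
  --scope=organizations/Org-ID \
  --filter=labels:* \
  --format='value(assetType)' | sort -u
Note this only reports asset types that happen to have at least one labelled resource in your org, not every asset type that supports labels, so the documentation list is still the authoritative source.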

how to get a list of dynamo db tables in aws that have a creationDate less than a certain value

I wish to use an AWS CLI command that will return a list of DynamoDB tables in my AWS account where the CreationDateTime is less than a certain value. CreationDateTime is a property that is shown with the describe-table command, but the list-tables command only returns the names of the tables. Is there any way I could use a query to filter the names returned by list-tables according to their corresponding CreationDateTime?
As noted, the answer is simply no, the AWS CLI can not do what you want in one query.
You need to resort to shell scripting to chain the necessary aws commands, or,
if you are willing to give up the hard requirement that it must be the AWS CLI, an alternative solution (and probably faster than forking off aws over and over) is to use a 5 line python script to get the result you want. You can still run this from the command-line, but it won't be the AWS CLI specifically.
Just perform the time arithmetic that suits your specific selection criteria.
#!/usr/bin/env python3
import boto3
from datetime import datetime, timedelta, timezone

ddb = boto3.resource('dynamodb')
cutoff = datetime.now(timezone.utc) - timedelta(days=90)

# tables created in the last 90 days, oldest first
for t in sorted((t for t in ddb.tables.all() if t.creation_date_time > cutoff),
                key=lambda t: t.creation_date_time):
    print(f"{t.name}, {t.creation_date_time}")
No. As you noted, ListTables only lists the names of tables, and there is no request that provides additional information for each table, let alone one that filters on such information. You'll need to use ListTables and then DescribeTable on each of those tables. You can run all of those DescribeTable requests in parallel to make the whole thing much faster (although there is a practical snag: to do it in parallel, you'll need to have opened a bunch of connections to the server).
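For completeness, the chained-CLI approach mentioned above might look something like this (a sketch; it assumes AWS CLI v2, which prints CreationDateTime as ISO 8601 by default so a plain string comparison works, and GNU date for the cutoff):
CUTOFF=$(date -u -d "-90 days" +%Y-%m-%dT%H:%M:%S)
for TABLE in $(aws dynamodb list-tables --query 'TableNames[]' --output text); do
  CREATED=$(aws dynamodb describe-table --table-name "$TABLE" \
    --query 'Table.CreationDateTime' --output text)
  # keep tables created before the cutoff
  if [[ "$CREATED" < "$CUTOFF" ]]; then
    echo "$TABLE  $CREATED"
  fi
done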

Move data from AWS Elasticsearch to S3

I have an application pumping logs to an AWS OpenSearch (formerly Elasticsearch) cluster. I want to move old logs to S3 to save cost and still be able to read the logs (occasionally).
One approach I can think of is writing a cron job that reads the old indices, writes them (in text format) to S3, and deletes the indices. This also requires keeping day-wise indices. Is there a more efficient/better way?
You can use the manual snapshot approach to back up your indices to S3: https://docs.aws.amazon.com/opensearch-service/latest/developerguide/managedomains-snapshots.html
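Once a snapshot repository is registered (per the linked docs), snapshotting an old index and then dropping it from the cluster might look roughly like this (a sketch; the domain endpoint, repository name, and index are placeholders, and requests may need to be SigV4-signed depending on the domain's access policy):
ES="https://my-domain.us-east-1.es.amazonaws.com"   # placeholder endpoint
INDEX="logs-2021-01-01"                             # placeholder day-wise index

# snapshot just this index into the registered "s3-logs" repository
curl -XPUT "${ES}/_snapshot/s3-logs/${INDEX}?wait_for_completion=true" \
  -H 'Content-Type: application/json' \
  -d "{\"indices\": \"${INDEX}\", \"include_global_state\": false}"

# once the snapshot succeeds, delete the index to free cluster storage
curl -XDELETE "${ES}/${INDEX}"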
Another option, suggested toward the end of the first link, is to use a tool named Curator within Lambda to handle the index rotation:
https://docs.aws.amazon.com/opensearch-service/latest/developerguide/curator.html
Depending on your use case, UltraWarm could be the best approach if you want those logs to remain searchable later on without the manual restores that would be required with the first two options I have listed:
https://aws.amazon.com/blogs/aws/general-availability-of-ultrawarm-for-amazon-elasticsearch-service/
There is also a tool called elasticdump:
# Export ES data to S3 (using s3urls)
elasticdump \
--s3AccessKeyId "${access_key_id}" \
--s3SecretAccessKey "${access_key_secret}" \
--input=http://production.es.com:9200/my_index \
--output "s3://${bucket_name}/${file_name}.json"

List private AMIs sorted by time with aws cli

Is there a way to get describe-images to return list of AMIs sorted by time?
Right now it seems to sort them randomly (whereas the AWS Console shows me sorted by time each time I log in -- probably because I clicked the time column to sort it by that once and this preference is being persisted).
I've searched the doc for sorting (i.e., "sort", "order", ..) and cannot seem to find it: https://docs.aws.amazon.com/cli/latest/reference/ec2/describe-images.html
Thanks
The sorting feature is generic to the CLI, not specific to the describe-images command.
You can accomplish this with a JMESPath query via the --query option.
https://docs.aws.amazon.com/cli/latest/userguide/controlling-output.html
http://jmespath.org/specification.html#func-sort-by
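For example, a sketch that lists your private AMIs sorted by creation time (the selected output fields are just an illustration):
# sort privately owned AMIs by CreationDate using a JMESPath sort_by expression
aws ec2 describe-images \
  --owners self \
  --query 'sort_by(Images, &CreationDate)[].{Id: ImageId, Created: CreationDate, Name: Name}' \
  --output table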