We have a multi-account setup in AWS. Apps A, B, and C are each deployed into their own account. I want to export CloudWatch metrics (only metrics, not logs) into a central monitoring account so that I can monitor everything and keep tabs on it from one place.
I have tried googling but couldn't find anything. Please help.
AWS recently announced a feature that enables cross-account access to CloudWatch metrics. Step-by-step instructions can be seen at the following URL:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/Cross-Account-Cross-Region.html
Centralized logging is an important requirement for purposes such as auditing, compliance, etc. It is also a recommendation from AWS to aggregate your logs in a separate account. But it requires a bit of work and an understanding of how it all fits together. Here is an official AWS blog post giving you a step-by-step approach:
AWS Centralized Logging blog
You can use the CloudWatch agent configuration file (https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch-Agent-Configuration-File-Details.html):
its credentials parameter can be supplied with the cross-account role ARN.
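As a sketch, the cross-account role goes into the agent-level credentials block of the agent config file. The role ARN, namespace, and collected metrics below are placeholders; verify the exact schema against the documentation linked above:

```json
{
  "agent": {
    "region": "us-east-1",
    "credentials": {
      "role_arn": "arn:aws:iam::111111111111:role/monitoring-account-cloudwatch-role"
    }
  },
  "metrics": {
    "namespace": "AppA",
    "metrics_collected": {
      "cpu": {
        "measurement": ["cpu_usage_idle"]
      }
    }
  }
}
```

With this in place, the agent assumes the given role and publishes the collected metrics into the monitoring account rather than the local one.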
I wanted to know if there is a way to track alerts or audit anything that happens within an AWS account, like who changed what and why. I did find this https://docs.aws.amazon.com/opensearch-service/latest/developerguide/audit-logs.html, where a command line is used to enable audit logs on an existing domain: `aws opensearch update-domain-config --domain-name my-domain --log-publishing-options "AUDIT_LOGS={CloudWatchLogsLogGroupArn=arn:aws:logs:us-east-1:123456789012:log-group:my-log-group,Enabled=true}"`. But this is in regard to Amazon OpenSearch Service, which I believe is only free for 12 months if you haven't used it already; the same applies to AWS Audit Manager. I am aware there are services that can do this for a fee, but I wanted to know if there are any free options.
From the AWS documentation:
With AWS CloudTrail, you can monitor your AWS deployments in the cloud by getting a history of AWS API calls for your account, including API calls made by using the AWS Management Console, the AWS SDKs, the command line tools, and higher-level AWS services. You can also identify which users and accounts called AWS APIs for services that support CloudTrail, the source IP address from which the calls were made, and when the calls occurred. You can integrate CloudTrail into applications using the API, automate trail creation for your organization, check the status of your trails, and control how administrators turn CloudTrail logging on and off.
AWS Config provides a detailed view of the resources associated with your AWS account, including how they are configured, how they are related to one another, and how the configurations and their relationships have changed over time.
Basically, AWS CloudTrail keeps a log of API calls (requests asking AWS to do or change something), while AWS Config tracks how individual resource configurations have changed over time (for a limited range of resources, such as a Security Group rule being changed).
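As a minimal sketch of the CloudTrail side (event history for the last 90 days of management events is free), you can look up who made which API calls with boto3. The username below is a placeholder, and credentials are assumed to be available (instance role, `~/.aws`, etc.):

```python
from datetime import datetime, timedelta, timezone

def lookup_params(username, hours=24):
    """Build the LookupEvents request for API calls made by one user."""
    now = datetime.now(timezone.utc)
    return {
        "LookupAttributes": [
            {"AttributeKey": "Username", "AttributeValue": username}
        ],
        "StartTime": now - timedelta(hours=hours),
        "EndTime": now,
    }

def who_changed_what(username):
    import boto3  # imported lazily so the helper above stays dependency-free
    client = boto3.client("cloudtrail")
    # lookup_events is paginated; walk every page of matching events
    for page in client.get_paginator("lookup_events").paginate(**lookup_params(username)):
        for event in page["Events"]:
            print(event["EventTime"], event["EventName"], event.get("Username"))
```

Calling `who_changed_what("alice")` prints a timestamped list of that user's recent API calls, which covers the "who changed what" part of the question without any paid service.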
Background
I have a Google Cloud project running N applications. Each application has its own exclusive IAM service account (N service accounts in total) with minimal permissions.
Scenario
Let's imagine that one of the service accounts is leaked. An attacker will try to take advantage of these credentials. Because they don't know exactly which permissions this account has, they will try making calls and see which of them work.
Question
I want to "listen" to the audit logs. Once I see a log entry of the "access denied" kind, I will know that something is wrong with this service account.
Is it possible to write all those access-denied incidents to Google Cloud Stackdriver?
How would you recommend implementing it?
Is it possible to write all those access-denied incidents to Google Cloud Stackdriver?
Most, but not all, Google Cloud services support this. Note, however, that successful access will be logged as well.
You will need to enable Data Access Audit Logs.
This could generate a massive amount of logging information.
Audit log configuration at the organization and folder level is only available via the API, not the console.
Review pricing before enabling data access audit logs.
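For reference, Data Access audit logs are enabled through the auditConfigs section of the project's IAM policy (for example by editing the policy file and applying it with `gcloud projects set-iam-policy`). A minimal fragment, assuming you want read and write data access logged for all services:

```json
{
  "auditConfigs": [
    {
      "service": "allServices",
      "auditLogConfigs": [
        { "logType": "DATA_READ" },
        { "logType": "DATA_WRITE" }
      ]
    }
  ]
}
```

Narrowing `"allServices"` to specific services (e.g. `"storage.googleapis.com"`) is the usual way to keep volume and cost under control.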
How would you recommend implementing it?
This part of the question is not a great fit for Stack Overflow, as it seeks recommendations and opinions. In general, you would export your logs to Google Cloud Pub/Sub and process them with a service such as Cloud Functions, Cloud Run, etc. There are also commercial services such as Datadog designed for this type of monitoring. Exporting logs to Google BigQuery is another popular option.
Read this article published on Datadog's website about data access audit logging and their services. I am not recommending their service, just providing a link to more information; the article itself is very good.
Best practices for monitoring GCP audit logs
To understand the fields that you need to process, read this document:
AuthorizationInfo
This link will also help:
Understanding audit logs
Here is one way to go about it:
Create a new Cloud Pub/Sub topic.
Create a new log routing sink with the Pub/Sub topic from the previous step as its destination. Set a filter like `protoPayload.authenticationInfo.principalEmail="<service-account-name>@<project-name>.iam.gserviceaccount.com" AND protoPayload.authorizationInfo.granted="false"` so that it only matches unsuccessful authorization actions for your service account.
Create a Cloud Function that is triggered when a new message is published to the Pub/Sub topic; this function can do whatever you desire, like send an email, page you, or anything else you can come up with in code.
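A minimal sketch of that last step, assuming the Pub/Sub-triggered Cloud Functions (1st gen) Python signature, where `event["data"]` carries the base64-encoded LogEntry exported by the sink. The alerting action is left as a placeholder `print`:

```python
import base64
import json

def on_denied_access(event, context):
    """Triggered by a Pub/Sub message published by the log routing sink."""
    entry = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    proto = entry.get("protoPayload", {})
    who = proto.get("authenticationInfo", {}).get("principalEmail", "unknown")
    what = proto.get("methodName", "unknown")
    # The sink filter already restricted this to granted="false" entries,
    # so anything arriving here is a denied call worth alerting on.
    alert = f"Access denied: {who} attempted {what}"
    print(alert)  # replace with email/pager integration of your choice
    return alert
```

The function body is plain JSON parsing, so you can unit-test it locally with a synthetic event before deploying.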
I'm running my Python web app on an EC2, and I want it to report some custom (app-level) metrics to CloudWatch.
Sample metrics are (uplink) request duration and similar.
From what I understand, I have to use either boto3 or the AWS CLI to do that. However:
My app doesn't use boto3 for anything else, so it seems like overkill to pull it in just for reporting metrics.
I have to be authenticated; unlike with Lambda, merely running inside an EC2 instance does not mean I'm automatically authenticated.
What's the best practice here? My app doesn't have to run on EC2 (it can run on GCP, Azure, or a custom server), so I really don't want to import boto3 into the code.
You can attach an IAM role to the instance and access AWS services without hard-coded credentials.
It all depends on your metrics. In some cases you may not need CloudWatch Metrics at all; X-Ray can be a better fit.
The fact that your app doesn't already use boto3 doesn't mean you have to avoid it; it is the easiest way to call the AWS API.
If you are running on GCP or Azure, use their native monitoring services instead.
If you need to collect custom metrics from anywhere, use a dedicated metrics service or implement your own API.
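If you do end up accepting boto3, publishing a custom metric is short. A minimal sketch (the namespace, metric, and dimension names are made up); with an instance role attached, no explicit credentials are needed:

```python
def duration_metric(app_name, seconds):
    """Build the PutMetricData payload for one request-duration sample."""
    return {
        "Namespace": "MyApp",  # made-up namespace, pick your own
        "MetricData": [
            {
                "MetricName": "RequestDuration",
                "Dimensions": [{"Name": "App", "Value": app_name}],
                "Value": seconds,
                "Unit": "Seconds",
            }
        ],
    }

def publish(app_name, seconds):
    import boto3  # lazy import: only needed when actually publishing
    boto3.client("cloudwatch").put_metric_data(**duration_metric(app_name, seconds))
```

Keeping the payload construction separate from the boto3 call makes it easy to swap the transport later (e.g. for GCP or Azure monitoring) without touching the metric definitions.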
You can look into CloudWatch Embedded Metrics Format (EMF).
You would need to install and configure the CloudWatch agent on your EC2 instance, and then you can use a Python EMF library to publish metrics.
With this approach:
Your application is not calling CloudWatch APIs directly. CloudWatch agent does the publishing.
You get custom metrics and EMF log entries in CloudWatch Logs, which can then be used with CloudWatch Logs Insights and Contributor Insights.
But you still need to give the CloudWatch agent a way to authenticate against the CloudWatch APIs. On EC2 instances, this is done via the role your instance assumes.
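An EMF entry is just a structured JSON log line, so the application side needs nothing beyond the standard library. A minimal sketch (namespace, metric, and dimension names are placeholders) of a line that the agent or log pipeline turns into a real CloudWatch metric:

```python
import json
import time

def emf_record(namespace, metric_name, value, unit="Milliseconds"):
    """Build a CloudWatch Embedded Metric Format log line."""
    return json.dumps({
        "_aws": {
            "Timestamp": int(time.time() * 1000),  # epoch millis
            "CloudWatchMetrics": [{
                "Namespace": namespace,
                "Dimensions": [["Service"]],
                "Metrics": [{"Name": metric_name, "Unit": unit}],
            }],
        },
        "Service": "my-web-app",   # placeholder dimension value
        metric_name: value,
    })

# Writing this line to the log the agent watches is all the app has to do:
print(emf_record("MyApp", "RequestDuration", 123.4))
```

This is the format the official EMF libraries emit under the hood; using a library spares you from keeping the `_aws` metadata in sync by hand.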
I am in the process of setting up the Prometheus CloudWatch exporter, which can be used to monitor various AWS accounts. I have seen a lot of documentation about configuring a single IAM user (non-programmatic) for cross-account access, but I am unable to find steps for configuring a programmatic user to access CloudWatch across the various accounts.
It would be great if someone could give me some pointers on this.
You cannot really share a single IAM user across accounts. What you actually need is to create, in each target account, an IAM role that can be assumed from a different account.
Below link would be helpful in defining a role that can be assumed by a different account:
https://docs.aws.amazon.com/IAM/latest/UserGuide/tutorial_cross-account-with-roles.html?shortFooter=true#tutorial_cross-account-with-roles-2
To utilise the CloudWatch exporter, you need to allow the exporter to assume the role in a similar way. You can also create these roles quite easily in code. Comment with further details on what you need and I'll help out.
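As far as I remember the exporter's configuration format, it accepts a `role_arn` to assume for the whole config (check the exporter's README to confirm; the ARN and the metric below are placeholders), so you can run one exporter instance per target-account role:

```yaml
region: us-east-1
role_arn: arn:aws:iam::222222222222:role/cloudwatch-exporter-readonly
metrics:
  - aws_namespace: AWS/EC2
    aws_metric_name: CPUUtilization
    aws_dimensions: [InstanceId]
    aws_statistics: [Average]
```

The exporter's own credentials (instance role, environment, etc.) only need `sts:AssumeRole` on that ARN; the CloudWatch read permissions live on the role in the target account.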
I want to add dashboards (including their metrics) from AWS account A into AWS account B in CloudWatch. Is it possible to do that? If so, what kind of permissions do I need to add dashboards from account A?
CloudWatch now supports cross-account, cross-region dashboards. See the documentation: https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/Cross-Account-Cross-Region.html#enable-cross-account-cross-Region
EDIT: This is now a supported feature in the CloudWatch console; see the official documentation.
At the time this was written, CloudWatch did not support this, but you could use the CloudWatch GetMetricWidgetImage API to snapshot the data, put the graph image in a shared S3 bucket, and display that instead.
See an example here
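A minimal sketch of that snapshot approach with boto3, run from account A (the metric, instance ID, and bucket name are placeholders):

```python
import json

def widget_definition(instance_id):
    """Define the graph to snapshot, as GetMetricWidgetImage expects."""
    return json.dumps({
        "metrics": [["AWS/EC2", "CPUUtilization", "InstanceId", instance_id]],
        "width": 600,
        "height": 400,
        "start": "-PT3H",  # last 3 hours
    })

def snapshot_to_s3(instance_id, bucket, key):
    import boto3  # lazy import so widget_definition stays dependency-free
    image = boto3.client("cloudwatch").get_metric_widget_image(
        MetricWidget=widget_definition(instance_id)
    )["MetricWidgetImage"]
    # The shared bucket must be readable from account B, where the
    # dashboard page embeds the PNG instead of a live widget.
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=image,
                                  ContentType="image/png")
```

Running `snapshot_to_s3` on a schedule (e.g. from a cron job or Lambda) keeps the shared image reasonably fresh, at the cost of it being a static picture rather than an interactive graph.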