When an AWS Lambda function runs, a single CloudWatch log group is populated. Is there a way to populate two different CloudWatch log groups from a single Lambda function? I searched for this but couldn't find an answer. Let me know if it is possible.
It is not possible to specify two log groups for a single Lambda function.
If you need the logs in two CloudWatch log groups, you have to get creative: create a subscription filter on the first log group that streams the logs to a second Lambda function, which receives the log events as its payload and can then write them to another log group.
More info about CloudWatch subscriptions: https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/SubscriptionFilters.html
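Roughly, that second Lambda could look like the minimal sketch below. The target group and stream names are assumptions, and the target log group must already exist:

```python
import base64
import gzip
import json

import boto3

logs = boto3.client("logs")

TARGET_LOG_GROUP = "/custom/my-second-log-group"   # assumed name, create it beforehand
TARGET_LOG_STREAM = "forwarded-from-lambda"        # assumed stream name


def handler(event, context):
    # CloudWatch Logs delivers the subscription payload base64-encoded and gzipped.
    payload = json.loads(
        gzip.decompress(base64.b64decode(event["awslogs"]["data"]))
    )

    # Make sure the destination stream exists; ignore "already exists".
    try:
        logs.create_log_stream(
            logGroupName=TARGET_LOG_GROUP, logStreamName=TARGET_LOG_STREAM
        )
    except logs.exceptions.ResourceAlreadyExistsException:
        pass

    # Re-publish the original log events into the second log group.
    logs.put_log_events(
        logGroupName=TARGET_LOG_GROUP,
        logStreamName=TARGET_LOG_STREAM,
        logEvents=[
            {"timestamp": e["timestamp"], "message": e["message"]}
            for e in payload["logEvents"]
        ],
    )
```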
I don't have much code to show.
I want to process all CloudWatch logs generated in the last day using Lambda.
I want to execute the Lambda at 6 AM to extract some information from the CloudWatch logs generated the previous day and put it in a table.
Instead of your 6 AM idea, you could also use a CloudWatch Logs subscription filter and trigger a Lambda function to process and store the log entries, as described in the step-by-step example Example 2: Subscription filters with AWS Lambda.
Or, even easier and without duplicating the data to a database: Analyzing log data with CloudWatch Logs Insights.
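If you do stick with the scheduled 6 AM run, a rough sketch of a Lambda (triggered, say, by an EventBridge cron rule) that runs a Logs Insights query over the previous day could look like this. The log group name and query string are assumptions:

```python
import time
from datetime import datetime, timedelta, timezone

import boto3

logs = boto3.client("logs")

LOG_GROUP = "/aws/lambda/my-function"   # assumed log group


def handler(event, context):
    # Previous calendar day, in UTC.
    end = datetime.now(timezone.utc).replace(hour=0, minute=0, second=0, microsecond=0)
    start = end - timedelta(days=1)

    query_id = logs.start_query(
        logGroupName=LOG_GROUP,
        startTime=int(start.timestamp()),
        endTime=int(end.timestamp()),
        queryString="fields @timestamp, @message | filter @message like /ERROR/",
    )["queryId"]

    # Poll until the query finishes, then process the rows.
    while True:
        result = logs.get_query_results(queryId=query_id)
        if result["status"] in ("Complete", "Failed", "Cancelled"):
            break
        time.sleep(1)

    for row in result.get("results", []):
        # Each row is a list of {"field": ..., "value": ...} dicts;
        # write the extracted values to your table here.
        print({cell["field"]: cell["value"] for cell in row})
```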
I assume that you have a lot of Lambda functions and each has a CloudWatch Log Group.
You can try a CloudWatch log group subscription filter; with this feature you can stream your logs to any supported destination, such as Lambda.
Before that, prepare a Lambda function that extracts the data you need and puts it into a DynamoDB table (see the sketch after the references below).
References:
https://docs.aws.amazon.com/lambda/latest/dg/with-ddb-example.html
https://www.geeksforgeeks.org/aws-dynamodb-insert-data-using-aws-lambda/
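For illustration, a minimal sketch of such a Lambda, assuming a DynamoDB table named log-extracts with a string partition key id (both names are made up):

```python
import base64
import gzip
import json

import boto3

table = boto3.resource("dynamodb").Table("log-extracts")  # assumed table name


def handler(event, context):
    # Decode the CloudWatch Logs subscription payload (base64 + gzip).
    payload = json.loads(
        gzip.decompress(base64.b64decode(event["awslogs"]["data"]))
    )

    for log_event in payload["logEvents"]:
        # Extract whatever you need from the message, then store it.
        table.put_item(
            Item={
                "id": log_event["id"],
                "timestamp": log_event["timestamp"],
                "logGroup": payload["logGroup"],
                "message": log_event["message"],
            }
        )
```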
I am using Elasticsearch to get logs from a CloudWatch log group by subscribing a Lambda to the log group. Whenever a log event is pushed to the log group, my Lambda is triggered and saves the log to Elasticsearch. I can then search the logs via a Kibana dashboard.
I'd like to put the metrics data into Elasticsearch as well, but I couldn't find a way to subscribe to metrics data.
You can use the AWS module in Metricbeat from the Elastic Beats family. Note that pulling metrics from CloudWatch results in chargeable API calls, so you should carefully consider the scraping frequency.
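This isn't the Metricbeat configuration itself, but to illustrate why the polling frequency matters, below is roughly the kind of GetMetricData call that pulling CloudWatch metrics involves; each call is billable. The function name and metric are assumptions:

```python
from datetime import datetime, timedelta, timezone

import boto3

cloudwatch = boto3.client("cloudwatch")

end = datetime.now(timezone.utc)
response = cloudwatch.get_metric_data(
    MetricDataQueries=[
        {
            "Id": "invocations",
            "MetricStat": {
                "Metric": {
                    "Namespace": "AWS/Lambda",
                    "MetricName": "Invocations",
                    "Dimensions": [
                        {"Name": "FunctionName", "Value": "my-function"}  # assumed
                    ],
                },
                "Period": 300,   # 5-minute resolution
                "Stat": "Sum",
            },
        }
    ],
    StartTime=end - timedelta(hours=1),
    EndTime=end,
)
print(response["MetricDataResults"][0]["Values"])
```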
I want to send log data from my EC2 instance (Ubuntu) to AWS EventBridge, where I can then route it to multiple endpoints. For example, if someone performs a root-user operation on the server, this is written to /var/log/auth.log; I would like this change in the log to be sent to EventBridge, where it can then be routed to other locations. How can I achieve this?
cheers
N.B.
I have tried using the CloudWatch agent, but I can't figure out how to get the logs to EventBridge once they're in a log group, so if there is a way to do this, that would also work.
I can't figure out how to get the logs to EventBridge once they're in a log group, so if there is a way to do this, that would also work.
Once your CloudWatch agent writes the relevant logs to CloudWatch Logs, you can set up a subscription filter on your log group.
The filter would stream logs of interest (e.g. those that contain ssh) into a lambda function. How to set it up is shown in:
Example 2: Subscription Filters with AWS Lambda
The Lambda, using the EventBridge events API (e.g. the events client in boto3), could process the log events, filter out messages, construct events, and publish them to EventBridge.
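A rough sketch of such a Lambda, assuming the default event bus; the Source and DetailType strings are placeholders:

```python
import base64
import gzip
import json

import boto3

events = boto3.client("events")


def handler(event, context):
    # Decode the CloudWatch Logs subscription payload (base64 + gzip).
    payload = json.loads(
        gzip.decompress(base64.b64decode(event["awslogs"]["data"]))
    )

    # Turn each matching log line into an EventBridge event.
    entries = [
        {
            "Source": "custom.auth-log",     # assumed source name
            "DetailType": "auth-log-entry",  # assumed detail type
            "Detail": json.dumps(
                {"logGroup": payload["logGroup"], "message": e["message"]}
            ),
        }
        for e in payload["logEvents"]
    ]

    # PutEvents accepts at most 10 entries per call.
    for i in range(0, len(entries), 10):
        events.put_events(Entries=entries[i : i + 10])
```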
AWS CloudWatch has log groups and log streams. A log group seems reasonable to me: each product (e.g. each Lambda function, each SageMaker endpoint) has its own log group.
But then there are log streams. When does AWS CloudWatch create new log streams? Can I search all log streams of a log group?
From the AWS CloudWatch documentation you can see that a log stream is created each time the logs come from a different event source. In the case of Lambda, it's one stream per Lambda container, where each container might process multiple events.
A log stream is a sequence of log events that share the same source. Each separate source of logs into CloudWatch Logs makes up a separate log stream.
Yes, you can search all log streams of a log group using the CloudWatch Logs API. The FilterLogEvents action allows you to search through a log group.
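For example, with boto3, FilterLogEvents searches across every stream in the group unless you restrict it; the log group name and filter pattern below are assumptions:

```python
import boto3

logs = boto3.client("logs")

# Paginate through all matching events across all streams in the group.
paginator = logs.get_paginator("filter_log_events")
for page in paginator.paginate(
    logGroupName="/aws/lambda/my-function",   # assumed log group
    filterPattern="ERROR",                    # optional CloudWatch filter pattern
):
    for event in page["events"]:
        print(event["logStreamName"], event["message"])
```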
The AWS documentation indicates that multiple log event records are provided to Lambda when streaming logs from CloudWatch.
logEvents
The actual log data, represented as an array of log event records. The "id" property is a unique identifier for every log event.
How does CloudWatch group these logs?
By time? By count? It looks random from my perspective.
Currently you get one Lambda invocation for every PutLogEvents batch that CloudWatch Logs receives against that log group. However, you should probably not rely on that, because AWS could always change the behavior (for example, by batching more events together).
You can observe this behavior by running the CWL -> Lambda example in the AWS docs.
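One way to observe the batching yourself is to have the subscribed Lambda report how many log events arrive per invocation and the time span they cover, for example:

```python
import base64
import gzip
import json


def handler(event, context):
    # Decode the CloudWatch Logs subscription payload (base64 + gzip).
    payload = json.loads(
        gzip.decompress(base64.b64decode(event["awslogs"]["data"]))
    )
    log_events = payload["logEvents"]
    if not log_events:
        return

    # Timestamps are epoch milliseconds.
    span_ms = log_events[-1]["timestamp"] - log_events[0]["timestamp"]
    print(
        f"group={payload['logGroup']} stream={payload['logStream']} "
        f"count={len(log_events)} span_ms={span_ms}"
    )
```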
Some AWS services let you configure the log interval, such as Elastic Load Balancing, where you can choose between five- and sixty-minute log intervals. You may not see a specific increment or parameter in the docs because it is configurable per service.