Is there a way of saving logs generated by a crawler in a specific, newly created CloudWatch log group?
I want to use the log entry that marks the end of crawling as a trigger for a Lambda function.
Many thanks in advance!
You can use the AWS CloudWatch Logs API to upload logs yourself. Use CreateLogGroup and CreateLogStream to create your log group and stream, and then use PutLogEvents to upload your log events.
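For example, with boto3 this could look roughly like the following sketch; the group and stream names are placeholders:

```python
import time
import boto3

logs = boto3.client("logs")

GROUP = "/my-crawler/runs"      # placeholder log group name
STREAM = "crawl-2024-01-01"     # placeholder log stream name

# Create the log group and stream. In real code, catch
# ResourceAlreadyExistsException if they may already exist.
logs.create_log_group(logGroupName=GROUP)
logs.create_log_stream(logGroupName=GROUP, logStreamName=STREAM)

# Upload a log event; timestamps are in milliseconds since the epoch.
logs.put_log_events(
    logGroupName=GROUP,
    logStreamName=STREAM,
    logEvents=[{"timestamp": int(time.time() * 1000), "message": "finished crawling"}],
)
```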
There are other options that might be more suitable for triggering a Lambda function though, depending on your exact use case, such as uploading the collected log to S3 and having the upload trigger the function, or even invoking the Lambda function directly.
I don't have much code to show.
I want to use Lambda to process all CloudWatch logs generated in the last day.
I want to run the Lambda function at 6 AM to extract some information from the CloudWatch logs generated the previous day and put it in a table.
Instead of your 6 AM idea, you could also use a CloudWatch Logs subscription filter to trigger a Lambda function that processes and stores the log entries, as described in the step-by-step example Example 2: Subscription filters with AWS Lambda.
Or, even easier and without duplicating the data into a database: Analyzing log data with CloudWatch Logs Insights.
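As a rough sketch of the Logs Insights route, a Lambda function scheduled for 6 AM could run a query over the previous day with boto3; the log group name and query string below are placeholders:

```python
import time
from datetime import datetime, timedelta, timezone

import boto3

logs = boto3.client("logs")

def handler(event, context):
    # Query window: the previous calendar day (UTC).
    end = datetime.now(timezone.utc).replace(hour=0, minute=0, second=0, microsecond=0)
    start = end - timedelta(days=1)

    query = logs.start_query(
        logGroupName="/aws/lambda/my-function",  # placeholder log group
        startTime=int(start.timestamp()),
        endTime=int(end.timestamp()),
        queryString="fields @timestamp, @message | filter @message like /ERROR/ | limit 100",
    )

    # Poll until the query finishes, then return the matching rows.
    while True:
        result = logs.get_query_results(queryId=query["queryId"])
        if result["status"] in ("Complete", "Failed", "Cancelled"):
            return result["results"]
        time.sleep(1)
```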
I assume that you have a lot of Lambda functions and each has a CloudWatch Log Group.
You can try a CloudWatch Log Group subscription filter; with this feature you can stream your logs to any supported destination, such as Lambda.
Before that, you should prepare a Lambda function that can put the extracted data into a DynamoDB table (a rough sketch follows the references below).
References:
https://docs.aws.amazon.com/lambda/latest/dg/with-ddb-example.html
https://www.geeksforgeeks.org/aws-dynamodb-insert-data-using-aws-lambda/
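As a minimal sketch of such a function, assuming a hypothetical DynamoDB table named log-extracts with an id partition key; CloudWatch Logs delivers the subscription payload gzip-compressed and base64-encoded:

```python
import base64
import gzip
import json
import uuid

import boto3

table = boto3.resource("dynamodb").Table("log-extracts")  # placeholder table name

def handler(event, context):
    # Subscription filter payloads arrive gzip-compressed and base64-encoded.
    payload = gzip.decompress(base64.b64decode(event["awslogs"]["data"]))
    data = json.loads(payload)

    for log_event in data["logEvents"]:
        # Extract whatever you need from the message and store it.
        table.put_item(Item={
            "id": str(uuid.uuid4()),
            "logGroup": data["logGroup"],
            "logStream": data["logStream"],
            "timestamp": log_event["timestamp"],
            "message": log_event["message"],
        })
```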
I have a CloudWatch Log Group that continuously receives logging information from my AWS services.
I want to extract some of the logging information from this log group and store that data in S3 in some format (CSV, Parquet).
I will then use Athena to query this logging data.
I want some sort of automatic mechanism to send these logs continuously to S3.
Can anyone suggest a solution for this?
It looks like Athena is able to query CloudWatch directly, as shown here. I'm not sure how performant this is or how costly it turns out to be.
The other option is to configure CloudWatch Logs to send the data to Kinesis Data Firehose via a subscription filter, which then delivers it to S3.
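A rough sketch of the Firehose option with boto3 follows; the names and ARNs are placeholders, and the role must allow CloudWatch Logs to put records into the delivery stream:

```python
import boto3

logs = boto3.client("logs")

# Subscribe the log group to an existing Kinesis Data Firehose delivery stream
# that is configured to deliver to your S3 bucket.
logs.put_subscription_filter(
    logGroupName="/my/application/logs",  # placeholder log group
    filterName="to-firehose",
    filterPattern="",                     # empty pattern = forward all events
    destinationArn="arn:aws:firehose:us-east-1:123456789012:deliverystream/logs-to-s3",  # placeholder
    roleArn="arn:aws:iam::123456789012:role/CWLtoFirehoseRole",                          # placeholder
)
```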
I want to send log data from my EC2 instance (Ubuntu) to AWS EventBridge, where I can then send it on to multiple endpoints. For example, if someone performs a root user operation on the server, this is written to /var/log/auth.log; I would then like this change in the log to be sent to EventBridge, where it can be routed to other locations. How can I achieve this?
cheers
N.B.
I have tried using the CloudWatch agent, but I can't figure out how to get the logs to EventBridge once they're in a log group, so if there is a way to do this that would also work.
Once your CloudWatch agent writes the relevant logs to CloudWatch Logs, you can set up a subscription filter on your log group.
The filter would stream the logs of interest (e.g. those that contain ssh) into a Lambda function. How to set it up is shown in:
Example 2: Subscription Filters with AWS Lambda
The Lambda function, using the EventBridge events API (e.g. via boto3), could process the log data, filter the messages, construct events, and publish them to the event bus.
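A minimal sketch of such a function, assuming a hypothetical custom event bus named server-logs:

```python
import base64
import gzip
import json

import boto3

events = boto3.client("events")

def handler(event, context):
    # Decode the CloudWatch Logs subscription filter payload.
    payload = gzip.decompress(base64.b64decode(event["awslogs"]["data"]))
    data = json.loads(payload)

    # Keep only the messages of interest, e.g. root user operations.
    interesting = [e for e in data["logEvents"] if "root" in e["message"] or "sudo" in e["message"]]
    if not interesting:
        return

    # Publish one EventBridge event per matching log line
    # (put_events accepts at most 10 entries per call).
    events.put_events(Entries=[
        {
            "Source": "custom.auth-log",    # placeholder source
            "DetailType": "AuthLogEntry",
            "Detail": json.dumps({"logStream": data["logStream"], "message": e["message"]}),
            "EventBusName": "server-logs",  # placeholder bus name
        }
        for e in interesting[:10]
    ])
```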
I want to create a CloudWatch rule that is triggered upon creation of a log event. For that reason I selected the CloudWatch Logs service as the event pattern, but when I try to generate some CloudWatch logs the rule is not triggered. I cannot find any example of using aws.logs as a source for an event, hence my question whether I'm doing something wrong.
This is because the only events available for CloudWatch Logs are AWS API Call via CloudTrail events. CloudWatch Logs does not generate CloudWatch Events on receiving new log entries.
For the Logs API call events to work, you need to set up a CloudTrail trail.
However, if you want to trigger your Lambda function based on log entries, I recommend using subscription filters for Lambda:
You can use subscriptions to get access to a real-time feed of log events from CloudWatch Logs and have it delivered to other services such as an Amazon Kinesis stream, an Amazon Kinesis Data Firehose stream, or AWS Lambda for custom processing, analysis, or loading to other systems.
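A rough sketch of wiring up such a subscription filter with boto3; the function ARN, log group, account, and region are placeholders, and CloudWatch Logs needs permission to invoke the function:

```python
import boto3

logs = boto3.client("logs")
lam = boto3.client("lambda")

LOG_GROUP = "/my/app/logs"  # placeholder
FUNCTION_ARN = "arn:aws:lambda:us-east-1:123456789012:function:process-logs"  # placeholder

# Allow CloudWatch Logs to invoke the Lambda function.
lam.add_permission(
    FunctionName=FUNCTION_ARN,
    StatementId="cloudwatch-logs-invoke",
    Action="lambda:InvokeFunction",
    Principal="logs.amazonaws.com",
    SourceArn=f"arn:aws:logs:us-east-1:123456789012:log-group:{LOG_GROUP}:*",  # placeholder
)

# Stream matching log events to the function.
logs.put_subscription_filter(
    logGroupName=LOG_GROUP,
    filterName="to-lambda",
    filterPattern="ERROR",  # only events containing "ERROR"; use "" for all events
    destinationArn=FUNCTION_ARN,
)
```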
There is an AWS CloudWatch Logs log group, inside which there are several log streams. As far as I understand it, each stream is a log coming from a separate server or container.
(screenshot: CloudWatch log streams)
I subscribed the whole log group to Kinesis Firehose to deliver the logs to an S3 bucket. But inside Kinesis Firehose, all the logs are merged together. How can I get these logs into S3 so that each stream has its own directory?
I found a solution:
1) I modified every log record in Kinesis Firehose using a Lambda function, adding an identifier to the end of each log line. It then looks like this:
(screenshot: modified logs)
2) I created a Lambda function with a trigger that fires every time logs are written to the S3 bucket. In this function, I distribute the logs into the folders I need based on the identifier I added to the logs earlier. I won't share the code of this Lambda function; I've described the general approach and I think those who need it can figure it out. (A rough sketch of the step 1) transformation follows below.)
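For illustration, a rough sketch of the transformation Lambda from step 1), assuming the appended identifier is the log stream name, could look like this:

```python
import base64
import gzip
import json

def handler(event, context):
    output = []
    for record in event["records"]:
        # Each Firehose record from a CloudWatch Logs subscription is a
        # gzip-compressed JSON payload containing the stream name and events.
        payload = json.loads(gzip.decompress(base64.b64decode(record["data"])))

        # Drop CloudWatch Logs control messages.
        if payload.get("messageType") == "CONTROL_MESSAGE":
            output.append({"recordId": record["recordId"], "result": "Dropped", "data": record["data"]})
            continue

        # Append the log stream name as an identifier to every log line.
        lines = [
            f'{e["message"]} [stream={payload["logStream"]}]\n'
            for e in payload.get("logEvents", [])
        ]

        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode("".join(lines).encode()).decode(),
        })

    return {"records": output}
```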