I am new to AWS and I want to move my real-time CloudWatch log streams into a Redshift table. My idea is to first send the CloudWatch log streams to a Lambda function and from there into Redshift. If you have any idea how to get those CloudWatch logs into a Lambda function, please let me know. I don't have any idea how to do it; I went through a lot of docs but couldn't find a proper Lambda function that reads stream logs from CloudWatch.
You should add a subscription filter to your CloudWatch Logs that streams them to Kinesis/Firehose.
Check https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/SubscriptionFilters.html
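If you go that route, a minimal sketch of attaching the subscription filter with boto3 might look like this. The log group name, delivery stream ARN, and role ARN are placeholders you would replace with your own:

```python
import boto3

logs = boto3.client("logs")

# Attach a subscription filter that streams every log event from the
# group to a Kinesis Data Firehose delivery stream. The role must grant
# CloudWatch Logs permission to call firehose:PutRecord.
logs.put_subscription_filter(
    logGroupName="/aws/lambda/my-function",  # placeholder
    filterName="to-firehose",
    filterPattern="",  # empty pattern = match every log event
    destinationArn="arn:aws:firehose:us-east-1:123456789012:deliverystream/my-stream",  # placeholder
    roleArn="arn:aws:iam::123456789012:role/CWLtoFirehoseRole",  # placeholder
)
```

From Firehose you can then configure Redshift as the delivery destination, which avoids writing the loading logic yourself.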
I don't have much code to show.
I want to process all CloudWatch logs generated in the last day using Lambda.
I want to execute the Lambda at 6 AM to extract some information from the CloudWatch logs generated the previous day and put it in a table.
Instead of your 6 AM idea, you could also use a CloudWatch Logs subscription filter to trigger a Lambda function that processes and stores the log entries, as described step by step in Example 2: Subscription filters with AWS Lambda.
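For reference, a CloudWatch Logs subscription delivers events to Lambda as base64-encoded, gzip-compressed JSON. A minimal handler sketch to unpack them (the storage step is up to you):

```python
import base64
import gzip
import json

def lambda_handler(event, context):
    # Subscription payloads arrive under event["awslogs"]["data"],
    # base64-encoded and gzip-compressed.
    payload = base64.b64decode(event["awslogs"]["data"])
    data = json.loads(gzip.decompress(payload))

    for log_event in data["logEvents"]:
        # Each entry has an id, a timestamp (ms since epoch), and the raw message.
        print(data["logGroup"], data["logStream"], log_event["message"])

    return {"processed": len(data["logEvents"])}
```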
Or, even easier and without duplicating the data into a database: Analyzing log data with CloudWatch Logs Insights.
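For example, an Insights query over the previous day can be run from the console or via boto3. A sketch, assuming your log group name (a placeholder here):

```python
import time
from datetime import datetime, timedelta, timezone

import boto3

logs = boto3.client("logs")

end = datetime.now(timezone.utc)
start = end - timedelta(days=1)

# Run an Insights query over the last 24 hours.
query = logs.start_query(
    logGroupName="/aws/lambda/my-function",  # placeholder
    startTime=int(start.timestamp()),
    endTime=int(end.timestamp()),
    queryString="fields @timestamp, @message | filter @message like /ERROR/ | sort @timestamp desc",
)

# Poll until the query finishes, then print the matched rows.
while True:
    result = logs.get_query_results(queryId=query["queryId"])
    if result["status"] in ("Complete", "Failed", "Cancelled"):
        break
    time.sleep(1)

for row in result.get("results", []):
    print(row)
```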
I assume that you have a lot of Lambda functions, each with its own CloudWatch log group.
You can try a CloudWatch log group subscription filter; with this feature you can stream your logs to any supported destination, such as Lambda.
Before that, you should prepare a Lambda function that puts your extracted data into a DynamoDB table.
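A minimal sketch of such a function, assuming a table named extracted-logs with id as its partition key (both names are placeholders):

```python
import base64
import gzip
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("extracted-logs")  # placeholder table name

def lambda_handler(event, context):
    # Decode the CloudWatch Logs subscription payload (base64 + gzip).
    payload = base64.b64decode(event["awslogs"]["data"])
    data = json.loads(gzip.decompress(payload))

    # Write one item per log event; adjust the item shape to whatever
    # your extraction logic actually produces.
    for log_event in data["logEvents"]:
        table.put_item(Item={
            "id": log_event["id"],
            "timestamp": log_event["timestamp"],
            "message": log_event["message"],
            "logGroup": data["logGroup"],
        })
```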
References:
https://docs.aws.amazon.com/lambda/latest/dg/with-ddb-example.html
https://www.geeksforgeeks.org/aws-dynamodb-insert-data-using-aws-lambda/
I have a serverless architecture with a few AWS Lambdas up and running, sending their logs to CloudWatch right now.
Question: Is there any option to avoid sending logs to CloudWatch and redirect them to another tool?
Example: catch all logs from stdout, avoid sending them to CloudWatch (so I don't have to pay for CloudWatch storage), and send all these logs to an external tool such as New Relic, Splunk, etc.?
Thank you very much for your help!
You can do this trick by removing the CloudWatch Logs permissions from your Lambda's execution role.
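For example, if the role uses the AWSLambdaBasicExecutionRole managed policy (which grants the logs permissions Lambda needs to write to CloudWatch), detaching it stops log delivery. A sketch, assuming your role name:

```python
import boto3

iam = boto3.client("iam")

# Detaching the basic execution policy removes logs:CreateLogGroup,
# logs:CreateLogStream, and logs:PutLogEvents, so the Lambda runtime
# can no longer write to CloudWatch Logs. Role name is a placeholder.
iam.detach_role_policy(
    RoleName="my-lambda-role",
    PolicyArn="arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole",
)
```

Note that the runtime will still try to emit logs; they are simply dropped, so you'd forward them to the external tool from inside your handler (or via a vendor extension/agent).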
There is a CloudWatch Logs log group that contains several streams. As far as I understand, each stream is the log coming from a separate server or container.
[screenshot: CloudWatch Log streams]
I send the whole log group through Kinesis Firehose to deliver it to an S3 bucket. But inside Kinesis Firehose all the logs are merged into one. How can I get these logs into S3 so that each stream has its own directory?
I found a solution:
1) I modified every log record in Kinesis Firehose using a Lambda function, appending an identifier to the end of each log line (see the sketch after this list). It then looks like this:
[screenshot: modified logs]
2) I created a Lambda function with a trigger that fires every time logs are written to the S3 bucket. In this function I distribute the logs into the folders I need, based on the information I added to the logs earlier. I won't give the code of this Lambda function; I've described the general approach and I think those who need it can figure it out.
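A minimal sketch of the kind of transformation Lambda described in step 1 (not the author's exact code, which they chose not to share): a Firehose data-transformation function that unpacks each CloudWatch Logs record and tags every line with its log stream name.

```python
import base64
import gzip
import json

def lambda_handler(event, context):
    output = []
    for record in event["records"]:
        # Records forwarded from a CloudWatch Logs subscription are
        # base64-encoded, gzip-compressed JSON envelopes.
        envelope = json.loads(gzip.decompress(base64.b64decode(record["data"])))

        # Skip control messages; only DATA_MESSAGE carries log events.
        if envelope.get("messageType") != "DATA_MESSAGE":
            output.append({
                "recordId": record["recordId"],
                "result": "Dropped",
                "data": record["data"],
            })
            continue

        # Append the log stream name to each line so a later step can
        # route lines into per-stream S3 prefixes.
        lines = [
            f'{e["message"]} [logStream={envelope["logStream"]}]'
            for e in envelope["logEvents"]
        ]
        data = ("\n".join(lines) + "\n").encode()
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(data).decode(),
        })
    return {"records": output}
```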
I just created a new AWS Kinesis stream and have it set up to receive data that I am pulling in a Lambda function. There is some data in my CloudWatch logs that came in before I had my stream set up. Is there a way to manually send this to Kinesis?
There are ways, as shown in this answer, to export CloudWatch logs as JSON/txt/etc., after which you could write a script that iterates through those logs and puts them into your Kinesis stream.
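A sketch of such a script using boto3, assuming your log group and stream names (both placeholders): filter_log_events pages through the stored events, and put_record pushes each one into Kinesis.

```python
import json

import boto3

logs = boto3.client("logs")
kinesis = boto3.client("kinesis")

# Page through all stored events in the log group.
paginator = logs.get_paginator("filter_log_events")
for page in paginator.paginate(logGroupName="/aws/lambda/my-function"):
    for event in page["events"]:
        # Replay each historical log event into the Kinesis stream,
        # partitioned by the originating log stream.
        kinesis.put_record(
            StreamName="my-stream",
            Data=json.dumps(event).encode(),
            PartitionKey=event["logStreamName"],
        )
```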
Is there a way of saving the logs generated by a crawler in a specific, newly created CloudWatch log group?
I want to use the finished-crawl log as a trigger for a Lambda function.
Many thanks in advance!
You can use the AWS CloudWatch Logs API to upload logs. Use CreateLogGroup and CreateLogStream to create your log stream, and then use PutLogEvents to upload your log.
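A sketch of that API flow with boto3 (group and stream names are placeholders; timestamps are milliseconds since the epoch):

```python
import time

import boto3

logs = boto3.client("logs")

# Create the group and stream once; both calls raise
# ResourceAlreadyExistsException on subsequent runs.
logs.create_log_group(logGroupName="/crawler/my-crawler")
logs.create_log_stream(
    logGroupName="/crawler/my-crawler",
    logStreamName="run-2024-01-01",
)

# Upload the collected log lines.
logs.put_log_events(
    logGroupName="/crawler/my-crawler",
    logStreamName="run-2024-01-01",
    logEvents=[
        {"timestamp": int(time.time() * 1000), "message": "crawl finished"},
    ],
)
```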
There are other options that might be more suitable for triggering a Lambda function, though, depending on your exact use case, such as uploading the collected log to S3 and having the upload trigger the function, or invoking the Lambda function directly.