rdsadmin user logs in CloudWatch from Aurora Serverless DB Cluster - amazon-web-services

I created an Aurora Serverless DB cluster (MySQL 5.10) on AWS,
then enabled only the slow query log by setting these keys in the cluster parameter group:
slow_query_log: 1
long_query_time: 0.5
log_output: file
But when I looked for the logs in CloudWatch, I found:
my MySQL user's logs (which was expected),
but also a great many rdsadmin logs, and these do not even match the criterion (long_query_time: 0.5).
Please help me find out: is there a way to exclude the rdsadmin logs from CloudWatch?

I could not find a clear answer on the AWS site explaining which parameters are supported by Aurora Serverless v1. After some quick tests, I can't make the long_query_time parameter work with Serverless. Maybe the cluster needs to be restarted, or the parameter group needs to be recreated? https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/aurora-serverless.how-it-works.html#aurora-serverless.parameter-groups
My solution is to enable the slow_query_log (which works) and stream the log to CloudWatch Logs.
Then I query the log with CloudWatch Logs Insights, using the following query (assuming you have selected the slow_query log stream):
parse @message "Query_time: *  Lock_time: * Rows_sent: * Rows_examined: *\n*" as Query_time,Lock_time,Rows_sent,Rows_examined,q
| filter Query_time > 0.250
It will display all queries slower than 250 ms.
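As for the rdsadmin entries, which apparently can't be suppressed at the source, a possible workaround (an assumption on my part — it relies on each slow-log event carrying its User@Host header in @message, which is the standard MySQL slow-log format) is to exclude them in the same Insights query:

```
parse @message "Query_time: *  Lock_time: * Rows_sent: * Rows_examined: *\n*" as Query_time,Lock_time,Rows_sent,Rows_examined,q
| filter @message not like /rdsadmin/
| filter Query_time > 0.250
```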

Related

Workaround for 2 subscription-filter limit in AWS Cloudwatch Logs

I have several Lambda functions deployed on AWS whose errors I want to monitor directly, in order to update a PostgreSQL table.
I have created a Lambda to parse the streamed log data and update the DB. I want to set up subscription filters between this Lambda and my other functions' logs.
There are 6 log groups I want to monitor, and the AWS Console limits the subscription filters to 2 per log group.
Is there a workaround, or a better way to implement this kind of monitoring?
Thanks
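For reference, the parsing step such a Lambda has to do is mostly mechanical: CloudWatch Logs delivers subscription-filter events as base64-encoded, gzip-compressed JSON under the "awslogs.data" key. A minimal sketch of the decoding (the class and method names are mine, not from any SDK):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import java.util.zip.GZIPInputStream;

// Recovers the JSON document that CloudWatch Logs ships to a Lambda
// through a subscription filter: the "awslogs.data" value is base64 text
// wrapping a gzip stream. Once decoded, the JSON contains a "logEvents"
// array whose messages can be parsed and written to PostgreSQL.
public class SubscriptionPayload {
    public static String decode(String base64Gzip) throws IOException {
        byte[] compressed = Base64.getDecoder().decode(base64Gzip);
        try (GZIPInputStream gz =
                     new GZIPInputStream(new ByteArrayInputStream(compressed));
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            byte[] buf = new byte[4096];
            int n;
            while ((n = gz.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return out.toString(StandardCharsets.UTF_8.name());
        }
    }
}
```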

Pushing data to AWS Elasticsearch from CloudWatch logs without a schema

Our setup is this: AWS services produce and publish logs to CloudWatch. From there we use the standard Lambda function to publish the logs to AWS Elasticsearch.
The Lambda function pushes the logs to ES using the index name format cloudwatch-logs-<date>. This creates a new index every day.
We have an issue with the mapping of the data. For example, when a service (e.g. Aurora DB) publishes its first set of logs and the CPU field's value is 0, ES maps it as a long. When that same service publishes a second set of logs with CPU set to 10.5, ES rejects that set of data with the error mapper cannot change type [long] to [float].
We have a lot of services publishing logs with a lot of data sets. Is the best way to resolve this for the Lambda to push the logs with the format cloudwatch-logs, so that only one index is created, and then manually fix the mapping issue for that index? Or is there a better way to resolve this?
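One common approach (a sketch only — the template name and field name are assumptions, and depending on your Elasticsearch version the mappings may need an extra document-type level) is an index template that pins the ambiguous numeric fields to a floating-point type, so every daily cloudwatch-logs-<date> index is created with the right mapping before the first document decides it:

```json
PUT _template/cloudwatch-logs
{
  "index_patterns": ["cloudwatch-logs-*"],
  "mappings": {
    "properties": {
      "CPU": { "type": "double" }
    }
  }
}
```

With the template in place, a first document with CPU 0 no longer locks the field to long, so a later 10.5 is accepted.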

CloudWatch logs streaming to Elasticsearch AWS

So, I created the CloudWatch index and am streaming the CloudWatch logs to Elasticsearch, and I am seeing data. However, I only see the current date's data; I don't see older logs in Elasticsearch that are in the same log group in CloudWatch. I changed the date filter in Elasticsearch, but don't see any change. Any idea why?
The index name created is cwl-2018.03.20.
That's the expected behavior. The streaming of logs from CloudWatch to Elasticsearch relies on a feature called subscription filters, which only forwards new data to the destination.

Using DynamoDB to replace logfiles

We are hosting our services on AWS Elastic Beanstalk managed instances. That is forcing us to move away from file-based logging to database-based logging.
Is DynamoDB a good choice for replacing file-based logging? If so, what should the primary key be? I thought of using the timestamp, but multiple messages may be logged by the same service within the same timestamp, so that might not be reliable.
Any advice would be appreciated.
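On the primary-key question specifically: a timestamp alone is indeed not unique, but a composite sort key sidesteps the collision. A sketch of one scheme (the table layout — partition key "service", sort key "logId" — is hypothetical, not an AWS recommendation):

```java
import java.time.Instant;
import java.util.UUID;

// Builds a DynamoDB sort-key value for a log item. Prefixing with the
// ISO-8601 timestamp keeps items in time order under range queries;
// the random UUID suffix keeps keys unique even when one service logs
// several messages in the same millisecond.
public class LogKey {
    public static String sortKey(Instant timestamp) {
        return timestamp.toString() + "#" + UUID.randomUUID();
    }
}
```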
Don't use DynamoDB to store logs. You'll be paying for throughput and space needlessly.
Amazon CloudWatch has built-in logging capabilities.
http://docs.aws.amazon.com/AmazonCloudWatch/latest/DeveloperGuide/WhatIsCloudWatchLogs.html
Another alternative is a dedicated logging service such as Loggly which is cloud-based and can receive logs in many common formats, plus they have an API to send custom logs. In the web-based console, you can search and filter through the logs.
As an alternative, why don't you use CloudWatch? I ended up writing a whole app to consolidate logs across EC2 instances in a Beanstalk app; then last year AWS opened up CloudWatch Logs as a service, so I junked my stuff. You tell CloudWatch where your logs are on the instance, give it a log group and stream name, and all your logs are consolidated in one spot, in CloudWatch. You can also run alarms off them using the standard AWS setup. It's pretty slick, and easy: you don't have to write a front end to do lookups, it's already there.
I don't know what you're using for logging. We are a Node.js shop; we used Winston for logging, and there is a nice npm module that works with Winston to log automatically, called winston-cloudwatch.

How to fetch logs (AWS VPC logs) from AWS which are seen on CloudWatch?

How do I fetch logs (AWS VPC logs) from AWS that are visible in CloudWatch? I am confused about which API to use. The CloudWatch API is about fetching metrics, not about getting the log events.
Could someone help me with a Java example to fetch logs into a file? I want to append the logs to a file. I have my own logging infrastructure, for which I am using Logstash-StatsD-Graphite.
You need to use the AWSLogs client, in the package com.amazonaws.services.logs : http://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/logs/AWSLogs.html
You have the GetLogEventsRequest object to perform the request, and everything you need to paginate. You'll get a list of OutputLogEvent with timestamps and messages (and as far as I know, each message should be a VPC flow record).
The full API doc is here: http://docs.aws.amazon.com/AmazonCloudWatchLogs/latest/APIReference/Welcome.html
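A minimal sketch along those lines, using the v1 Java SDK (the log group name, stream name, and output file are placeholders; credentials come from the default provider chain):

```java
import com.amazonaws.services.logs.AWSLogs;
import com.amazonaws.services.logs.AWSLogsClientBuilder;
import com.amazonaws.services.logs.model.GetLogEventsRequest;
import com.amazonaws.services.logs.model.GetLogEventsResult;
import com.amazonaws.services.logs.model.OutputLogEvent;

import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

// Pages through one log stream with GetLogEvents and appends each
// event's timestamp and message to a local file.
public class FetchVpcLogs {
    public static void main(String[] args) throws IOException {
        AWSLogs logs = AWSLogsClientBuilder.defaultClient();
        GetLogEventsRequest request = new GetLogEventsRequest()
                .withLogGroupName("my-vpc-flow-logs")   // placeholder
                .withLogStreamName("eni-all")           // placeholder
                .withStartFromHead(true);
        String lastToken = null;
        try (PrintWriter out = new PrintWriter(new FileWriter("vpc-logs.txt", true))) {
            while (true) {
                GetLogEventsResult result = logs.getLogEvents(request);
                for (OutputLogEvent event : result.getEvents()) {
                    out.println(event.getTimestamp() + " " + event.getMessage());
                }
                String token = result.getNextForwardToken();
                // GetLogEvents signals the end of the stream by returning
                // the same forward token as the previous page.
                if (token == null || token.equals(lastToken)) {
                    break;
                }
                lastToken = token;
                request.setNextToken(token);
            }
        }
    }
}
```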
Hope this gets you started.