Is there a logging service in AWS for debug information?

I'm trying out AWS. I created an app that runs on an EC2 instance. I want to send debug/diagnostic logs to stdout or syslog and have some way to easily collect and read them.
Currently I use Stackdriver Logging: I install the google-fluentd agent on the EC2 instance, and it picks up the syslog and sends it to Stackdriver. I'm wondering whether there is a similar offering in AWS, so that I don't need to create a GCP project just for reading logs?
Thanks!

AWS lets you dump all your logs to CloudWatch Logs, where you can store and search them; see the corresponding AWS documentation, which explains how to set up the EC2 machine to ship its logs.

You can install the AWS CloudWatch agent on your EC2 instance. The agent lets you ship custom log files to CloudWatch, where you can analyze them. You can also ship system and application logs through the agent. There is a blog post explaining how to do this on a Windows machine not hosted in AWS; it's pretty much the same approach for an EC2 instance.

You can use AWS CloudWatch Logs to monitor, store, and access your log files from Amazon Elastic Compute Cloud (Amazon EC2) instances, AWS CloudTrail, Route 53, and other sources. You can then retrieve the associated log data from CloudWatch Logs.
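For completeness: you don't strictly need the agent. The same CloudWatch Logs API the agent calls is available from boto3. A minimal sketch, assuming made-up log group and stream names:

```python
# Minimal sketch of writing a log line to CloudWatch Logs directly with
# boto3, as an alternative to running the CloudWatch agent.
import time
import boto3

logs = boto3.client("logs", region_name="us-east-1")

group, stream = "/myapp/debug", "instance-1"  # hypothetical names
for create, kwargs in [
    (logs.create_log_group, {"logGroupName": group}),
    (logs.create_log_stream, {"logGroupName": group, "logStreamName": stream}),
]:
    try:
        create(**kwargs)
    except logs.exceptions.ResourceAlreadyExistsException:
        pass  # already created on a previous run

# CloudWatch Logs expects millisecond timestamps.
logs.put_log_events(
    logGroupName=group,
    logStreamName=stream,
    logEvents=[{"timestamp": int(time.time() * 1000), "message": "debug: hello"}],
)
```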

Related

How to audit changes to the AWS account

I wanted to know if there is a way to track alerts or audit anything that happens in the AWS account, like who changed what and why. I did find this: https://docs.aws.amazon.com/opensearch-service/latest/developerguide/audit-logs.html where they use a command line to enable audit logs on an existing domain: aws opensearch update-domain-config --domain-name my-domain --log-publishing-options "AUDIT_LOGS={CloudWatchLogsLogGroupArn=arn:aws:logs:us-east-1:123456789012:log-group:my-log-group,Enabled=true}" but this is in regard to Amazon OpenSearch Service, which I believe is only free for 12 months if you haven't used it already. I am aware there are services that can do this, such as AWS Audit Manager, but they require a fee, and I wanted to know if there are any free options.
From the AWS documentation:
With AWS CloudTrail, you can monitor your AWS deployments in the cloud by getting a history of AWS API calls for your account, including API calls made by using the AWS Management Console, the AWS SDKs, the command line tools, and higher-level AWS services. You can also identify which users and accounts called AWS APIs for services that support CloudTrail, the source IP address from which the calls were made, and when the calls occurred. You can integrate CloudTrail into applications using the API, automate trail creation for your organization, check the status of your trails, and control how administrators turn CloudTrail logging on and off.
AWS Config provides a detailed view of the resources associated with your AWS account, including how they are configured, how they are related to one another, and how the configurations and their relationships have changed over time.
Basically, AWS CloudTrail keeps a log of API calls (requests to AWS to do/change stuff), while AWS Config tracks how individual resource configurations have changed over time (for a limited range of resources, such as a Security Group rule being changed).
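To illustrate the CloudTrail side: the free event history can be queried with the LookupEvents API, for example via boto3. A sketch of a "who changed what" query (the event name below is just one example):

```python
# Hedged sketch: query CloudTrail event history for recent security group
# changes and print who made each call. Uses only the built-in history,
# no paid trail required.
from datetime import datetime, timedelta, timezone
import boto3

cloudtrail = boto3.client("cloudtrail", region_name="us-east-1")

resp = cloudtrail.lookup_events(
    LookupAttributes=[
        {"AttributeKey": "EventName", "AttributeValue": "AuthorizeSecurityGroupIngress"}
    ],
    StartTime=datetime.now(timezone.utc) - timedelta(days=7),
    EndTime=datetime.now(timezone.utc),
)
for event in resp["Events"]:
    print(event["EventTime"], event.get("Username"), event["EventName"])
```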

AWS CloudWatch custom metrics best practice

I'm running my Python web app on an EC2, and I want it to report some custom (app-level) metrics to CloudWatch.
Sample metrics are (uplink) request duration and similar.
From what I understand, I have to either use boto3 or the AWS CLI in order to do that. However:
My app doesn't use boto3 for its core functionality, so it seems like overkill to pull it in just for reporting metrics.
I have to be authenticated; unlike with Lambda, merely running inside an EC2 instance does not mean I'm automatically authenticated.
What's the best practice here? My app doesn't have to run on EC2 (it could run on GCP, Azure, or a custom server), so I really don't want to import boto3 into the code.
You can attach an IAM role to the instance and access AWS services without managing credentials yourself.
It all depends on your metric. In some cases you may not need CloudWatch Metrics at all; X-Ray may be a better fit.
The fact that your app doesn't already use boto3 doesn't mean you must avoid it; it's the easiest way to call the AWS APIs (see the sketch below).
If you are using GCP or Azure, use their native monitoring services.
If you need to collect custom metrics from anywhere, use a third-party metrics service or implement your own API.
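If you do end up calling CloudWatch directly, publishing a custom metric is a single PutMetricData call. A sketch with boto3 (the namespace and dimension are placeholders):

```python
# Minimal sketch of publishing one custom metric data point. With an IAM
# instance role attached to the EC2 instance, boto3 picks up credentials
# automatically; no keys need to be embedded in the app.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

cloudwatch.put_metric_data(
    Namespace="MyWebApp",  # hypothetical namespace
    MetricData=[
        {
            "MetricName": "RequestDuration",
            "Dimensions": [{"Name": "Endpoint", "Value": "/uplink"}],
            "Value": 0.42,
            "Unit": "Seconds",
        }
    ],
)
```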
You can look into the CloudWatch Embedded Metric Format (EMF).
You would need to install and configure the CloudWatch agent on your EC2 instance, and then you can use the Python EMF library to publish metrics.
With this approach:
Your application does not call the CloudWatch APIs directly; the CloudWatch agent does the publishing.
You get custom metrics and EMF log entries in CloudWatch Logs, which can then be used with CloudWatch Logs Insights and Contributor Insights.
But you still need to give the CloudWatch agent a way to authenticate against the CloudWatch APIs. On EC2 instances this is done via the IAM role your instance assumes. An example of what the EMF log format looks like follows.
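As an illustration, with EMF the application just writes a structured JSON log line, and the agent (or CloudWatch Logs itself) turns it into a metric. A hand-rolled sketch of the format using only the standard library (the namespace and metric names are made up):

```python
# Sketch: emit one EMF-formatted log line to stdout. When this line reaches
# CloudWatch Logs (e.g. via the CloudWatch agent), CloudWatch extracts a
# "RequestDuration" metric from it automatically; no API call from the app.
import json
import time

emf_record = {
    "_aws": {
        "Timestamp": int(time.time() * 1000),  # milliseconds since epoch
        "CloudWatchMetrics": [
            {
                "Namespace": "MyWebApp",  # hypothetical namespace
                "Dimensions": [["Endpoint"]],
                "Metrics": [{"Name": "RequestDuration", "Unit": "Milliseconds"}],
            }
        ],
    },
    "Endpoint": "/uplink",   # dimension value
    "RequestDuration": 120,  # metric value
    "RequestId": "abc-123",  # extra properties become searchable log fields
}
print(json.dumps(emf_record))
```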

Elastic Beanstalk deploy hooks: getting user's username

Is there a way to get the username of the AWS user who initiated EB deployment from within a deploy hook?
I believe that's possible with the AWS CLI by applying filters to CloudTrail logs, but it has a few limitations.
For one, CloudTrail is not real-time: events typically appear in the logs roughly 15 minutes after the API call, which would ultimately make for a very slow deployment.
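A hedged sketch of that approach inside a deploy hook, keeping the caveat above in mind (UpdateEnvironment is the Elastic Beanstalk API call behind a deployment, but the event may not be visible yet at deploy time):

```python
# Sketch for a deploy hook: look up the most recent UpdateEnvironment call
# in CloudTrail and take its Username. Because CloudTrail delivery lags by
# several minutes, this may return an older deployment's user, or nothing.
import boto3

cloudtrail = boto3.client("cloudtrail")

resp = cloudtrail.lookup_events(
    LookupAttributes=[
        {"AttributeKey": "EventName", "AttributeValue": "UpdateEnvironment"}
    ],
    MaxResults=1,  # events are returned newest first
)
events = resp.get("Events", [])
print(events[0].get("Username", "unknown") if events else "no event found yet")
```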

How can I set retention policy for logs exported from Elastic beanstalk into a bucket?

I am using Elastic Beanstalk to serve our app. I have enabled the "Enable log file rotation to Amazon S3" option, and I can see logs being written to a bucket. It looks like the logs go to the default Elastic Beanstalk bucket.
How can I tell the app to write logs to a different bucket?
I want to make sure we set a retention policy (say 10 days) for the logs in that bucket so that the bucket stays groomed.
Thank you in advance for your help. Much appreciated.
It is not possible to configure the Elastic Beanstalk app to publish logs to a different bucket; log rotation to S3 can only be turned on or off.
To delete logs after 10 days, you can add a lifecycle rule to your bucket that will delete the logs after 10 days: http://docs.aws.amazon.com/AmazonS3/latest/UG/lifecycle-configuration-bucket-no-versioning.html
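The linked page shows the console steps; the same 10-day expiration rule can also be applied with boto3, for example. A sketch (the bucket name is a placeholder, and you should verify the log prefix against your own bucket):

```python
# Sketch: expire objects under the Elastic Beanstalk log prefix after 10
# days. The bucket name is hypothetical; the prefix below is where Elastic
# Beanstalk typically publishes rotated logs, but confirm it in your bucket.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="elasticbeanstalk-us-east-1-123456789012",  # hypothetical
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-eb-logs-after-10-days",
                "Filter": {"Prefix": "resources/environments/logs/"},
                "Status": "Enabled",
                "Expiration": {"Days": 10},
            }
        ]
    },
)
```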
However, it may also be possible to configure your Elastic Beanstalk application to publish logs to CloudWatch Logs:
http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/using-features.managing.cw.html
The docs show a console screenshot, but I don't see anywhere in the console to set this. The docs do, however, reference settings you can change via configuration files, the CLI, or the SDK.
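For reference, the setting the docs describe lives in the aws:elasticbeanstalk:cloudwatch:logs option namespace. A sketch of flipping it on via the SDK (the environment name is a placeholder):

```python
# Sketch: enable instance log streaming to CloudWatch Logs for an existing
# environment via the SDK, with a 10-day retention on the created log groups.
import boto3

eb = boto3.client("elasticbeanstalk")

eb.update_environment(
    EnvironmentName="my-env",  # hypothetical environment name
    OptionSettings=[
        {"Namespace": "aws:elasticbeanstalk:cloudwatch:logs",
         "OptionName": "StreamLogs", "Value": "true"},
        {"Namespace": "aws:elasticbeanstalk:cloudwatch:logs",
         "OptionName": "RetentionInDays", "Value": "10"},
    ],
)
```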

Use AWS CloudTrail to collect application logs

Is it possible to use CloudTrail to receive custom logs like application logs, access logs, and security logs?
And how long does CloudTrail keep the logs?
You might be thinking of CloudWatch Logs, which does capture, search, and groom custom logs from EC2 instances. The retention rules are configurable.
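For example, retention on a CloudWatch Logs log group is a one-call setting (the group name below is a placeholder):

```python
# Sketch: keep events in a log group for 14 days, after which CloudWatch
# Logs deletes them automatically. retentionInDays accepts a fixed set of
# values (1, 7, 14, 30, ...).
import boto3

logs = boto3.client("logs")
logs.put_retention_policy(logGroupName="/myapp/debug", retentionInDays=14)
```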
No. CloudTrail is for AWS API activity only. Its event history covers roughly the last 90 days of management API activity (create, modify, and delete calls, among others) for supported services. You can optionally deliver the logs to an S3 bucket to keep API activity history for longer.
You could configure VPC Flow Logs, CloudTrail logs, and AWS Config to deliver to CloudWatch. You can also set up an S3 bucket with lifecycle policies enabled to retain logs indefinitely. Refer to this.