AWS Elastic Beanstalk Application Console Logs Publish to CloudWatch

I have deployed a .NET 6 application to AWS Elastic Beanstalk (Windows Server). The application is configured to write application-level logs to the console. At the moment, these console logs are not published to CloudWatch by default.
Is there a way these console logs can be published to a CloudWatch log group? For example, Lambdas automatically send console logs to a CloudWatch stream by default; something similar would be ideal.
Thanks

Follow these steps to stream instance logs to CloudWatch Logs:
Open the Elastic Beanstalk console, and in the Regions list, select your AWS Region.
In the navigation pane, choose Environments, and then choose the name of your environment from the list.
In the navigation pane, choose Configuration.
In the Software configuration category, choose Edit.
Under Instance log streaming to CloudWatch Logs:
Enable Log streaming.
Set Retention to the number of days to save the logs.
Select the Lifecycle setting that determines whether the logs are saved after the environment is terminated.
Choose Apply.
After you enable log streaming, you can return to the Software configuration category and find the Log Groups link. Click this link to see your logs in the CloudWatch console.
Note: before enabling this, make sure the instance profile has the proper permissions for the CloudWatch agent.
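If you prefer to turn this on outside the console (for example from a deployment script), here is a minimal boto3 sketch. The environment name and retention value are placeholders; the namespace and option names are the Elastic Beanstalk settings behind the console section described above.

```python
# Sketch: enable instance log streaming to CloudWatch Logs via the
# Elastic Beanstalk API instead of the console. "my-env" is a placeholder.
import boto3

eb = boto3.client("elasticbeanstalk")

eb.update_environment(
    EnvironmentName="my-env",  # assumption: replace with your environment name
    OptionSettings=[
        # These are the settings behind "Instance log streaming to CloudWatch Logs".
        {"Namespace": "aws:elasticbeanstalk:cloudwatch:logs",
         "OptionName": "StreamLogs", "Value": "true"},
        {"Namespace": "aws:elasticbeanstalk:cloudwatch:logs",
         "OptionName": "RetentionInDays", "Value": "7"},
        {"Namespace": "aws:elasticbeanstalk:cloudwatch:logs",
         "OptionName": "DeleteOnTerminate", "Value": "false"},
    ],
)
```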
Update (based on comments)
.NET on Linux streams the instance log files below out of the box; on Windows Server you need to implement a custom solution for console output.
/var/log/eb-engine.log, /var/log/eb-hooks.log, /var/log/web.stdout.log, /var/log/nginx/access.log, /var/log/nginx/error.log
An alternative approach is described here: https://aws.plainenglish.io/how-to-setup-aws-elasticbeanstalk-to-stream-your-custom-application-logs-to-cloudwatch-d5c877eaa242

Related

How to write Windows Computer name (of Windows EC2) as a separate field on CloudWatch Log Group using CloudWatch agent config?

Windows Logs: 1058,windows-computer-name.prod.domain.com
I see the above line in a CloudWatch log group.
It is generated by the CloudWatch agent running on a Windows EC2 instance.
Question:
Is there a way (by changing the CloudWatch agent config file) to write the Windows computer name to the log group as a separate field?
(This would help me query that field to check whether logs from a certain list of computer names are being forwarded to the log group or not.)
You can push custom metrics through the CloudWatch agent, which then become available for further processing. You have to use the agent's StatsD listener for this purpose. How to push custom metrics to CloudWatch with StatsD is shown here:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch-Agent-custom-metrics-statsd.html
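If you don't want to go through the StatsD listener, a rough sketch of the same idea using a direct boto3 put_metric_data call is below. The namespace and metric name are made up for illustration; the point is that the computer name rides along as a dimension you can filter on.

```python
# Sketch: publish a custom metric with the Windows computer name attached as a
# dimension, using the CloudWatch API directly (an alternative to the agent's
# StatsD listener). Namespace and metric name are illustrative placeholders.
import platform
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_data(
    Namespace="Custom/WindowsLogs",          # assumption: any namespace you choose
    MetricData=[{
        "MetricName": "ForwardedLogEvents",  # assumption: illustrative metric name
        "Dimensions": [
            {"Name": "ComputerName", "Value": platform.node()},  # the machine's name
        ],
        "Value": 1,
        "Unit": "Count",
    }],
)
```

You can then group or filter on the ComputerName dimension to see which machines are actually reporting.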

Stream stdout logs from Go EC2 Environment on AWS

I have a Go application running on an EC2 instance. Log streaming is enabled and I am able to view the default eb-activity.log, nginx/access.log and nginx/error.log in the CloudWatch console.
However, the application is logging to stdout and I cannot see these logs within CloudWatch. I know they're stored because I can download the full logging bundle and see them there.
The question is: how can I get these logs into CloudWatch? I have a .NET application and its stdout logs appear in CloudWatch fine.

Log print statements from script running on EC2 server?

I have a Python script that runs on an EC2 server. What is the easiest way for me to see the print statements from that script? I tried viewing the system log but I don't see anything there, and I can't find anything in CloudWatch. Thanks!
Standard output from arbitrary applications running on EC2 doesn't appear in CloudWatch Logs.
You can install the CloudWatch Logs agent, configure it to collect logs from given locations, and then configure your app to log to one of those locations, as in the sketch below.
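For the print-statement case specifically, here is a minimal sketch of that approach. The log path is an assumption and has to match whatever file_path you put in the agent configuration.

```python
# Sketch: write the script's output to a file that the CloudWatch agent is
# configured to collect. The path below is an assumption; it must match the
# file_path in your agent configuration.
import logging

logging.basicConfig(
    filename="/var/log/myscript/app.log",   # assumed path, matches agent config
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)

logging.info("script started")      # appears in CloudWatch once the agent ships the file
print("still visible on stdout")    # a plain print is NOT collected unless redirected to the file
```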
It is possible to send logs from an application running on EC2 to CloudWatch directly. To do that, follow these steps:
Create an IAM role with the relevant permissions and attach it to the Linux instance.
Install the CloudWatch agent on the instance.
Prepare the agent configuration file on the instance.
Start the CloudWatch agent service on the instance.
Monitor the logs using the CloudWatch web console.
For reference:
http://medium.com/tensult/to-send-linux-logs-to-aws-cloudwatch-17b3ea5f4863
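If installing the agent feels like overkill for a single script, a hedged alternative is to push log events straight to CloudWatch Logs with boto3. The group and stream names below are placeholders, and the instance role still needs CloudWatch Logs write permissions (CreateLogGroup, CreateLogStream, PutLogEvents).

```python
# Sketch: send log lines to CloudWatch Logs directly from the script,
# without the agent. Group/stream names are placeholders.
import time
import boto3

logs = boto3.client("logs")
group, stream = "/my-scripts/example", "run-1"   # assumed names

try:
    logs.create_log_group(logGroupName=group)
except logs.exceptions.ResourceAlreadyExistsException:
    pass
try:
    logs.create_log_stream(logGroupName=group, logStreamName=stream)
except logs.exceptions.ResourceAlreadyExistsException:
    pass

logs.put_log_events(
    logGroupName=group,
    logStreamName=stream,
    logEvents=[{"timestamp": int(time.time() * 1000), "message": "hello from EC2"}],
)
```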

GCP, RabbitMQ click-to-deploy service, how to disable Stackdriver metrics exporter

I've created a RabbitMQ Kubernetes cluster using Google's one-click deploy. I checked "Enable Stackdriver Metrics Exporter" and created the cluster. My problem is that Google is charging for every custom metric created.
I need to disable Stackdriver Metrics Exporter.
Has anyone had the same issue and disabled this exporter? If so, how can I disable it without destroying the cluster?
If nothing other than RabbitMQ is running on this Kubernetes cluster, you can disable the "Stackdriver Kubernetes Engine Monitoring" feature of the cluster.
In the Cloud Console, go to the Kubernetes Engine > Kubernetes clusters page:
Click your cluster.
Click Edit for the cluster you want to change.
Set the “Stackdriver Kubernetes Engine Monitoring” drop-down value to Disabled.
Click Save.
The Logs ingestion page in the Logs Viewer tracks the volume of logs in your project. The Logs Viewer also gives you tools to disable all logs ingestion or exclude (discard) log entries you're not interested in, so that you can minimize any charges for logs over your monthly allotment.
Go to Logs exports, and follow this topic to manage "Logs Exports".

Getting Cloudwatch EC2 server health monitoring into ElasticSearch

I have an AWS account, and have several EC2 servers and an ElasticSearch domain set up to take the syslogs from these servers. However, in CloudWatch, and when investigating a specific server instance in the EC2 control panel, I see specific metrics and graphs for things like CPU, memory load, storage use, etc. Is there some way I can pipe this information into my ElasticSearch as well?
Set up Logstash and use this plugin https://github.com/EagerELK/logstash-input-cloudwatch
Or go the other way and use the AWS Logs agent to put your syslogs into CloudWatch and stop using ElasticSearch.
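If you end up scripting the transfer yourself instead of using the Logstash plugin, the core of the job is just pulling datapoints from the CloudWatch API and indexing them. Here is a minimal boto3 sketch of the pulling half; the instance ID is a placeholder.

```python
# Sketch: pull EC2 CPU utilization from CloudWatch so it can be indexed into
# ElasticSearch by whatever pipeline you already use. Instance ID is a placeholder.
from datetime import datetime, timedelta
import boto3

cloudwatch = boto3.client("cloudwatch")

resp = cloudwatch.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder
    StartTime=datetime.utcnow() - timedelta(hours=1),
    EndTime=datetime.utcnow(),
    Period=300,
    Statistics=["Average"],
)

for point in sorted(resp["Datapoints"], key=lambda p: p["Timestamp"]):
    # Each datapoint could be sent to ElasticSearch as one document.
    print(point["Timestamp"], point["Average"])
```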