AWS MWAA remote logging not visible on Airflow UI

AWS MWAA is configured for remote logging to a given S3 path.
While a task is running (particularly a long, time-consuming one) in any DAG, we cannot see its logs either in the Airflow UI or at the S3 path.
Whenever a task ends (success or failure), we are able to see the logs in both places.
Is there any way to see logs while a task is still running when remote logging is enabled?
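For context, remote task logging in open-source Airflow is driven by configuration options along the following lines; in MWAA these are managed for you when you enable task logging for the environment, and the bucket path below is only a placeholder. The S3 task handler uploads the log file when the task finishes, which would explain the behaviour described above.

    [logging]
    # bucket path below is a placeholder
    remote_logging = True
    remote_base_log_folder = s3://my-mwaa-bucket/dag-logs
    remote_log_conn_id = aws_default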

Related

AWS Elastic Beanstalk logs? Access more detailed logs?

I'm currently deploying a couple of apps with Elastic Beanstalk and have some open questions. One thing that bugs me about EB is the logs. I can run eb logs or request the logs from the GUI, but the result is kind of confusing to me, since I can't find a way to access the normal stdout of a running process. I'm running a Django app, and the logs seem to show only entries explicitly logged at Warning priority.
In Django, there are a lot of errors that seem to slip through the log system (e.g. failed migrations, failed custom commands, etc.).
Is there any way to access more detailed logs, or to access the stdout of my main process? It would also be OK if they streamed to my terminal, or if I had to SSH into the machine.
I suggest using the CLI to enable CloudWatch Logs with eb logs --cloudwatch-logs enable --cloudwatch-log-source all. This will let you see the streaming output of web.stdout.log along with all of your other logs individually.
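A minimal sketch of the commands, assuming the EB CLI is already initialized for your environment:

    # ship all instance logs (including web.stdout.log) to CloudWatch Logs
    eb logs --cloudwatch-logs enable --cloudwatch-log-source all

    # then tail the streams from your terminal
    eb logs --stream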

Stream stdout logs from Go EC2 Environment on AWS

I have a Go application running in an EC2 instance. Log streaming is enabled, and I am able to view the default eb-activity.log, nginx/access.log, and nginx/error.log in the CloudWatch console.
However, the application logs to stdout, and I cannot see those logs in CloudWatch. I know they're stored, because I can download the full logging bundle and see them there.
The question is: how can I get these logs into CloudWatch? I have a .NET application whose stdout logs appear in CloudWatch fine.

Log print statements from script running on EC2 server?

I have a Python script that runs on an EC2 server. What is the easiest way for me to see the print statements from that script? I tried viewing the system log, but I don't see anything there, and I can't find anything in CloudWatch. Thanks!
Standard output from arbitrary applications running on EC2 doesn't appear in CloudWatch Logs.
You can install the CloudWatch Logs Agent, configure it to collect logs from given locations, and then configure your app to log to one of those locations.
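A minimal sketch of the legacy CloudWatch Logs agent configuration (/etc/awslogs/awslogs.conf), assuming the script writes to a hypothetical /var/log/myscript.log; the group and stream names are placeholders:

    [/var/log/myscript.log]
    file = /var/log/myscript.log
    log_group_name = my-script-logs
    log_stream_name = {instance_id}

You would then redirect the script's output into that file, e.g. python myscript.py >> /var/log/myscript.log 2>&1.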
It is possible to send the logs of an application running on EC2 directly to CloudWatch. To do that, you need the following steps (steps 3 and 4 are sketched below the reference link):
Create an IAM role with the relevant permissions and attach it to the Linux instance.
Install the CloudWatch agent on the instance.
Prepare the configuration file on the instance.
Start the CloudWatch agent service on the instance.
Monitor the logs using the CloudWatch web console.
For reference:
http://medium.com/tensult/to-send-linux-logs-to-aws-cloudwatch-17b3ea5f4863
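A minimal sketch of steps 3 and 4 for the unified CloudWatch agent; the file path, log group, and stream name are placeholder assumptions. The configuration file (e.g. /opt/aws/amazon-cloudwatch-agent/etc/amazon-cloudwatch-agent.json):

    {
      "logs": {
        "logs_collected": {
          "files": {
            "collect_list": [
              {
                "file_path": "/var/log/myapp.log",
                "log_group_name": "myapp-logs",
                "log_stream_name": "{instance_id}"
              }
            ]
          }
        }
      }
    }

Then load that configuration and start the agent:

    sudo /opt/aws/amazon-cloudwatch-agent/bin/amazon-cloudwatch-agent-ctl \
        -a fetch-config -m ec2 \
        -c file:/opt/aws/amazon-cloudwatch-agent/etc/amazon-cloudwatch-agent.json -s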

AWS ECS container logs design pattern

I have a classic Scala app; it produces three different logs at these locations:
/var/log/myapp/log1/mylog.log
/var/log/myapp/log2/another.log
/var/log/myapp/log3/anotherone.log
I containerized the app and it is working fine; I can get those logs with a Docker volume mount.
Now the app/container will be deployed to AWS ECS with an Auto Scaling group; in this case, multiple containers may run on a single ECS host.
I would like to use CloudWatch to monitor my application logs.
One solution could be to put the AWS logs agent inside my application container.
Is there a better way to get those application logs from the containers to CloudWatch Logs?
Help is very much appreciated.
When using Docker, the recommended approach is not to log to files, but to send logs to stdout and stderr. Doing so prevents the logs from being written to the container's filesystem and (depending on the logging driver in use) allows you to view the logs using the docker logs / docker container logs subcommands.
Many applications have a configuration option to log to stdout/stderr, but if that's not an option, you can create a symlink to redirect output; for example, the official NGINX image on Docker Hub uses this approach.
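A minimal sketch of that symlink trick in a Dockerfile, adapted to the log paths from the question (this assumes the app keeps writing to those exact files and doesn't rotate them):

    # forward the app's log files to the container's stdout/stderr
    RUN ln -sf /dev/stdout /var/log/myapp/log1/mylog.log \
        && ln -sf /dev/stdout /var/log/myapp/log2/another.log \
        && ln -sf /dev/stderr /var/log/myapp/log3/anotherone.log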
Docker supports logging drivers, which allow you to send logs to (among others) AWS CloudWatch. After you have modified your image to log to stdout/stderr, you can configure the awslogs logging driver.
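With ECS, that driver is set per container in the task definition; a minimal sketch (the group name, region, and stream prefix are placeholders):

    "logConfiguration": {
        "logDriver": "awslogs",
        "options": {
            "awslogs-group": "/ecs/myapp",
            "awslogs-region": "us-east-1",
            "awslogs-stream-prefix": "myapp"
        }
    }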
More information about logging in Docker can be found in the "logging" section of the documentation.
You don't need a log agent if you can change the code.
You can publish custom metric data directly to CloudWatch, as described on this page: https://docs.aws.amazon.com/sdk-for-java/v1/developer-guide/examples-cloudwatch-publish-custom-metrics.html
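A minimal sketch with the AWS SDK for Java v1, in the spirit of the linked examples; the namespace and metric name are made-up placeholders, and credentials/region come from the default provider chain:

    import com.amazonaws.services.cloudwatch.AmazonCloudWatch;
    import com.amazonaws.services.cloudwatch.AmazonCloudWatchClientBuilder;
    import com.amazonaws.services.cloudwatch.model.MetricDatum;
    import com.amazonaws.services.cloudwatch.model.PutMetricDataRequest;
    import com.amazonaws.services.cloudwatch.model.StandardUnit;

    public class PublishMetric {
        public static void main(String[] args) {
            AmazonCloudWatch cw = AmazonCloudWatchClientBuilder.defaultClient();

            // a single hypothetical data point
            MetricDatum datum = new MetricDatum()
                    .withMetricName("ProcessedRecords")
                    .withUnit(StandardUnit.Count)
                    .withValue(42.0);

            // "MyApp/Custom" is a placeholder namespace
            PutMetricDataRequest request = new PutMetricDataRequest()
                    .withNamespace("MyApp/Custom")
                    .withMetricData(datum);

            cw.putMetricData(request);
        }
    }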

How to log on Amazon Elastic Beanstalk with a spring boot application

I created two applications with Spring Boot and deployed them on Amazon Elastic Beanstalk. One is deployed in a Java environment, the other in a Tomcat environment.
Tomcat has its catalina.out log, where I can find the logs written by my Spring application with log4j. The Java application has a web-1.log, but it is rolled every hour, and I can only find the last 5 logs.
Is there a better way to log, or to store the old logs (maybe on S3), or to change the retention policy?
You can enable log rotation to S3. You also have the ELK stack option, but that requires effort.
If you want a more AWS-native solution, you can use CloudWatch. For example, you can set up your logger with a custom appender that sends logs to CloudWatch.
CloudWatch also gives you a friendlier way to check your logs.
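For the rotation-to-S3 option, a minimal sketch with the AWS CLI (the environment name is a placeholder); this sets the option that makes Elastic Beanstalk publish rotated logs to its S3 bucket:

    aws elasticbeanstalk update-environment \
        --environment-name my-env \
        --option-settings Namespace=aws:elasticbeanstalk:hostmanager,OptionName=LogPublicationControl,Value=true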