AWS CloudWatch log is broken - amazon-web-services

I have a NestJS application running on ECS (AWS). My CloudWatch log is broken: each line of the JSON output appears as its own log entry.
Image below:
Can anybody suggest an idea?
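A likely cause is that the logger pretty-prints JSON across several lines and the awslogs driver ships each physical line as a separate CloudWatch event. Two fixes: make the NestJS logger emit single-line JSON, or join the lines at the driver. Below is a minimal sketch of the second option using boto3; the family, image, and log group names are placeholders, and `awslogs-multiline-pattern` should be checked against the options supported by your launch type.

```python
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

# Sketch: re-register the task definition so the awslogs driver treats a line
# starting with "{" as the beginning of a new event and appends the remaining
# lines of a pretty-printed JSON object to it.
ecs.register_task_definition(
    family="nestjs-app",  # placeholder family name
    containerDefinitions=[
        {
            "name": "nestjs-app",
            "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/nestjs-app:latest",
            "memory": 512,
            "logConfiguration": {
                "logDriver": "awslogs",
                "options": {
                    "awslogs-group": "/ecs/nestjs-app",
                    "awslogs-region": "us-east-1",
                    "awslogs-stream-prefix": "ecs",
                    "awslogs-multiline-pattern": "^\\{",
                },
            },
        }
    ],
)
```

The simpler alternative is usually to configure the NestJS logger (e.g. pino) to write each record as one compact JSON line, in which case no multiline pattern is needed.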

Related

Sending application logs to Datadog using FluentBit

I am trying to send the logs of an application deployed on ECS Fargate to Datadog using Fluent Bit, following the steps at https://docs.datadoghq.com/integrations/ecs_fargate/?tab=fluentbitandfirelens#
However, "ECS Deployment on service X" is the only log I'm seeing in Datadog. I believe it is the ECS metadata log, which I have set to true.
I have configured Fluent Bit to send its own logs to CloudWatch, and the last line I see there is "stream processor started", with no errors.
So I'm guessing everything is set up correctly. Am I missing something?
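For reference, a minimal sketch of the two container definitions the FireLens/Datadog guide describes, written as the Python dicts you would pass to boto3's `register_task_definition`. The image tags, service/source names, and API key handling are placeholders; check the option names against the Datadog FireLens documentation for your setup.

```python
# Fluent Bit sidecar acting as the FireLens log router.
log_router = {
    "name": "log_router",
    "image": "amazon/aws-for-fluent-bit:stable",
    "essential": True,
    "firelensConfiguration": {
        "type": "fluentbit",
        "options": {"enable-ecs-log-metadata": "true"},
    },
}

# Application container: its stdout is routed through Fluent Bit to Datadog.
app_container = {
    "name": "app",
    "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/app:latest",  # placeholder image
    "essential": True,
    "logConfiguration": {
        "logDriver": "awsfirelens",
        "options": {
            "Name": "datadog",
            "apikey": "<DD_API_KEY>",                   # use Secrets Manager in practice
            "Host": "http-intake.logs.datadoghq.com",   # adjust for your Datadog site
            "TLS": "on",
            "provider": "ecs",
            "dd_service": "my-service",  # placeholder
            "dd_source": "my-app",       # placeholder
            "dd_tags": "env:staging",    # placeholder
        },
    },
}
```

If only the deployment/metadata line reaches Datadog, the usual culprits are an application container whose logConfiguration still points at awslogs (so stdout never passes through Fluent Bit), or an intake Host/API key that does not match your Datadog site.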

Trigger Cloud Run from the event: GCS object create, using Eventarc API

I am trying to build a Python Cloud Run service that is triggered whenever a file is uploaded to a Google Cloud Storage bucket. However, looking at the logs, the service is not being triggered, even though I have already created an Eventarc trigger for it. I cannot find any entries in the Cloud Run service logs, but the trigger tab shows an Eventarc trigger associated with the service.
[![Cloud Run Trigger Image][1]][1]
[![Cloud Run Logs][2]][2]
Any ideas or links that can help me here?
[1]: https://i.stack.imgur.com/ijjh2.png
[2]: https://i.stack.imgur.com/QhFhk.png
In your logs, the line
booting worker with pid: 4
indicates that your Cloud Run instance did indeed get triggered, but it might have failed to boot, since there is no further log output.
To debug, deploy a demo Cloud Run service that just logs the incoming message. That makes it easier to see whether it has been triggered (and with what payload).
There is an easy tutorial from Google along these lines.
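A minimal sketch of such a demo service, assuming the usual Flask-on-Cloud-Run pattern; it just dumps the CloudEvents headers and body that Eventarc POSTs to the service:

```python
import json
import os

from flask import Flask, request

app = Flask(__name__)


@app.route("/", methods=["POST"])
def receive_event():
    # Eventarc delivers the GCS event as an HTTP POST with CloudEvents (ce-*) headers.
    headers = {k: v for k, v in request.headers if k.lower().startswith("ce-")}
    body = request.get_json(silent=True)
    print("CloudEvent headers:", json.dumps(headers))
    print("CloudEvent payload:", json.dumps(body))
    return ("", 204)


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```

If this service never logs anything either, the problem is on the trigger side (event filters, bucket/region mismatch, or the trigger's service account missing the run.invoker role); if it does log a payload, the issue is inside your original handler.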

Stream stdout logs from Go EC2 Environment on AWS

I have a Go application running in an EC2 instance. Log streaming is enabled and I am able to view the default eb-activity.log, nginx/access.log and nginx/error.log in the CloudWatch console.
However, the application logs to stdout, and I cannot see those logs in CloudWatch. I know they are stored, because I can download the full logging bundle and see them there.
The question is: how can I get these logs into CloudWatch? I have a .NET application whose stdout logs appear in CloudWatch fine.
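One approach (a sketch, not the only way) is to point the unified CloudWatch agent at the file your platform writes stdout to. The path below assumes an Amazon Linux 2 Elastic Beanstalk platform, where the web process's stdout typically lands in /var/log/web.stdout.log; confirm the actual path in your logging bundle. The log group name is a placeholder.

```python
import json

# Sketch: generate a unified CloudWatch agent config that tails the file
# the platform writes the application's stdout to.
agent_config = {
    "logs": {
        "logs_collected": {
            "files": {
                "collect_list": [
                    {
                        "file_path": "/var/log/web.stdout.log",  # assumed path; check your log bundle
                        "log_group_name": "/myapp/stdout",       # placeholder log group
                        "log_stream_name": "{instance_id}",
                    }
                ]
            }
        }
    }
}

with open("/opt/aws/amazon-cloudwatch-agent/etc/amazon-cloudwatch-agent.json", "w") as f:
    json.dump(agent_config, f, indent=2)
```

After writing the config, restart the agent (for example with amazon-cloudwatch-agent-ctl -a fetch-config) so the new collect_list entry takes effect; on Elastic Beanstalk you would typically ship both the file and the restart command via an .ebextensions config.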

Configure CloudWatch log to a running instance programmatically

I just configured CloudWatch for a running EC2 instance (Windows) manually, using the steps given in the AWS CloudWatch documentation, and it worked: the CloudWatch log groups are created and the logs are being delivered correctly.
Now my question is: is there any way to do the CloudWatch configuration through code (programmatically, e.g. using Java), a script, or PowerShell?
If yes, kindly share some samples.
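One programmatic route is to drive the CloudWatch agent through SSM Run Command: put the agent configuration in Parameter Store, then ask the instance to apply it. A sketch with boto3 (the equivalent PutParameter/SendCommand calls exist in the AWS SDK for Java and AWS Tools for PowerShell); the parameter name, instance ID, and log group are placeholders, and it assumes the instance runs the SSM agent with an instance profile that permits SSM and CloudWatch access.

```python
import json

import boto3

ssm = boto3.client("ssm", region_name="us-east-1")

# Placeholder agent configuration: ship the Windows System event log to CloudWatch.
agent_config = {
    "logs": {
        "logs_collected": {
            "windows_events": {
                "collect_list": [
                    {
                        "event_name": "System",
                        "event_levels": ["ERROR", "WARNING"],
                        "log_group_name": "/ec2/windows/system",  # placeholder group
                    }
                ]
            }
        }
    }
}

# 1. Store the configuration in Parameter Store.
ssm.put_parameter(
    Name="AmazonCloudWatch-windows-config",  # placeholder parameter name
    Type="String",
    Overwrite=True,
    Value=json.dumps(agent_config),
)

# 2. Ask the CloudWatch agent on the instance to fetch and apply it.
ssm.send_command(
    InstanceIds=["i-0123456789abcdef0"],  # placeholder instance ID
    DocumentName="AmazonCloudWatch-ManageAgent",
    Parameters={
        "action": ["configure"],
        "mode": ["ec2"],
        "optionalConfigurationSource": ["ssm"],
        "optionalConfigurationLocation": ["AmazonCloudWatch-windows-config"],
        "optionalRestart": ["yes"],
    },
)
```

Verify the AmazonCloudWatch-ManageAgent document's parameter names in the Systems Manager console before relying on them.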

Exit code non-zero and unable to see output logs

How do I view the stdout/stderr output logs for Cloud ML? I've tried using gcloud beta logging read and also gcloud beta ml jobs stream-logs, and nothing... all I see are the INFO-level logs generated by the system, e.g. "Tearing down TensorFlow".
Also, in the case where I have an error saying the Docker container exited with a non-zero code, it links me to a GUI page that shows the same output as gcloud beta ml jobs stream-logs. Nothing shows me the actual console output that my job produced...
Help please??
It may be the case that the Cloud ML service account does not have permissions to write to your project's StackDriver Logs, or the Logging API is not enabled on your project.
First check whether the Stackdriver Logging API is enabled for the project by going to the API manager: https://console.cloud.google.com/apis/api/logging.googleapis.com/overview?project=[YOUR-PROJECT-ID]
Then, the Cloud ML service account should have been automatically added as an Editor to the project, which allows it to write to the project's logs; if you have changed your project permissions, it may have lost that role. In that case, check that you have manually given the Cloud ML service account the Logs Writer permission.
If you are unsure of the service account used by Cloud ML, this page has instructions on how to find it: https://cloud.google.com/ml/docs/how-tos/using-external-buckets
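Once the API is enabled and the service account has the Logs Writer role, one quick sanity check is to query the job's logs directly with the google-cloud-logging client. The filter below assumes training jobs log under resource.type="ml_job" with the job name in resource.labels.job_id; adjust it to whatever the Logs Viewer shows for your job. The project and job IDs are placeholders.

```python
from google.cloud import logging as cloud_logging

client = cloud_logging.Client(project="my-project")  # placeholder project ID

# Assumed filter: Cloud ML training jobs log under resource.type="ml_job"
# with the job name in resource.labels.job_id.
log_filter = (
    'resource.type="ml_job" '
    'AND resource.labels.job_id="my_training_job"'  # placeholder job ID
)

for entry in client.list_entries(filter_=log_filter, order_by=cloud_logging.DESCENDING):
    # entry.payload holds the text/struct the container wrote to stdout/stderr.
    print(entry.timestamp, entry.severity, entry.payload)
```

If the entries show up here but not in the job's GUI page, the logging side is fine and the issue is the filter that page applies; if nothing shows up at all, revisit the API enablement and permissions above.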