I use Stackdriver sinks to BigQuery a lot. They help with retaining the data and keeping it in a conveniently queryable form.
I am searching for the equivalent on AWS using CloudWatch, but it seems only S3 is integrated. Are there any workarounds, or should I code it myself?
This can be done with a workaround. It needs only one code component that reads from S3 and posts to Stackdriver; from there, the GCP sinks can pick the logs up.
Create a CloudWatch sink to an S3 bucket.
Configure the S3 bucket to send SNS messages on object creation.
Have a process listen to those SNS notifications and copy the contents of the new objects into Stackdriver.
Since the delivery guarantees of SNS, and of CloudWatch to S3, are already in place, a Lambda function triggered by the SNS notification messages makes the solution fairly seamless. A minimal sketch of such a function is shown below.
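A rough sketch of that Lambda in Python, assuming the function has S3 read access and GCP credentials for Cloud Logging (Stackdriver); the log name and handler wiring are illustrative, not a prescribed setup:

```python
# Sketch: SNS-triggered Lambda that copies new S3 log objects into Stackdriver.
import json
import boto3
from google.cloud import logging as gcp_logging  # pip package: google-cloud-logging

s3 = boto3.client("s3")
gcp_client = gcp_logging.Client()                 # relies on GOOGLE_APPLICATION_CREDENTIALS
logger = gcp_client.logger("cloudwatch-export")   # hypothetical log name

def handler(event, context):
    # SNS wraps the original S3 event notification in the message body.
    for record in event["Records"]:
        s3_event = json.loads(record["Sns"]["Message"])
        for s3_record in s3_event.get("Records", []):
            bucket = s3_record["s3"]["bucket"]["name"]
            key = s3_record["s3"]["object"]["key"]
            # Note: CloudWatch Logs exports to S3 may be gzip-compressed;
            # decompress first if that is how your sink is configured.
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
            # Forward each line to Stackdriver; a GCP sink can route it onward,
            # e.g. to BigQuery.
            for line in body.splitlines():
                logger.log_text(line)
```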
Related
I am looking for a way to trigger my GitLab CI pipeline whenever an object (with a specific name) is added to an S3 bucket.
I am new to AWS Lambda, so can someone please help?
You can use S3 event notifications and deliver these events to other AWS services, including AWS Lambda functions or SQS queues.
Leveraging these event notifications, you could send the event directly to a Lambda function that subscribes to object-created (PutObject) events, parses the event content to determine whether an object with the specific name was created, and then calls GitLab's pipeline trigger API to start the pipeline. A sketch of such a handler is shown below.
Architecture overview: S3 object-created event notification → Lambda function → GitLab pipeline trigger API.
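As a hedged sketch of the Lambda handler (Python), assuming the bucket notification targets the function directly; the GitLab host, project ID, trigger token variable, branch, and the watched object key are all placeholders:

```python
# Sketch: S3-triggered Lambda that starts a GitLab pipeline via the trigger API.
import os
import urllib.parse
import urllib.request

GITLAB_TRIGGER_URL = "https://gitlab.example.com/api/v4/projects/1234/trigger/pipeline"
TRIGGER_TOKEN = os.environ["GITLAB_TRIGGER_TOKEN"]   # assumed to be set on the function
WATCHED_KEY = "report.csv"                           # the "specific name" to react to

def handler(event, context):
    for record in event["Records"]:
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        if key != WATCHED_KEY:
            continue
        # GitLab's pipeline trigger endpoint takes the trigger token and a ref.
        data = urllib.parse.urlencode({"token": TRIGGER_TOKEN, "ref": "main"}).encode()
        req = urllib.request.Request(GITLAB_TRIGGER_URL, data=data, method="POST")
        with urllib.request.urlopen(req) as resp:
            print("GitLab responded:", resp.status)
```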
I have a Node.js backend that sends images to a secondary API for transformations, and those images then appear in an S3 bucket. The problem is that the secondary API doesn't inform my API when a file is created in the bucket.
Is there some sort of long polling available in S3? Spamming GET requests doesn't feel right (and will also get expensive).
I'm considering adding a trigger on new files in S3 that invokes a Lambda, which puts a message into some sort of pub/sub message broker that I could then subscribe to, but this seems a bit too complicated?
From the S3 notification docs you can be notified via:
Amazon Simple Notification Service (Amazon SNS) topic
Amazon Simple Queue Service (Amazon SQS) queue
AWS Lambda
The relative benefits of each one are up to you, but don't poll S3 for changes. Use one of these to be notified of the changes. You can decide to get notifications for just new objects or for deleted objects. The SQS option in particular gives you exactly the long polling you were after, as sketched below.
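A minimal sketch of the SQS option in Python (the same pattern works from Node.js), assuming the bucket's ObjectCreated events are delivered to an SQS queue; the queue URL is a placeholder:

```python
# Sketch: consume S3 ObjectCreated events from an SQS queue using long polling.
import json
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/new-images"  # placeholder

def wait_for_new_objects():
    while True:
        # WaitTimeSeconds=20 enables SQS long polling, so this is cheap and quiet.
        resp = sqs.receive_message(QueueUrl=QUEUE_URL,
                                   MaxNumberOfMessages=10,
                                   WaitTimeSeconds=20)
        for msg in resp.get("Messages", []):
            body = json.loads(msg["Body"])
            for record in body.get("Records", []):
                key = record["s3"]["object"]["key"]
                print("new object:", key)   # hand off to your image-handling code
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```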
I created a custom app that automatically uploads logs to S3.
Is there a way to push those logs to CloudWatch from S3 for analysis and alerting?
I'm aware that I can use the CloudWatch agent to push directly to CloudWatch from the app, but there are complications involved in that option.
Thank you!
You could probably use CloudWatch Events to listen for S3 changes. I'm not sure whether you can get the data from the S3 file itself, or just a trigger saying that a new log has been added.
You could also use S3 event notifications (https://docs.aws.amazon.com/AmazonS3/latest/dev/NotificationHowTo.html) connected either to a Lambda or to SQS, and from there write the logs to CloudWatch (similar to what was suggested by #marcin).
A better solution, though a bit beyond the scope of the question, would be to send your logs through Kinesis Firehose and deliver them to both CloudWatch and S3 from there.
I'm not aware of any out-of-the-box mechanism for that provided by AWS, but I think it would be relatively easy to develop.
Namely, you can create an S3 notification for the PUT of each new log file from your app. The event would trigger a Lambda function. The function would get the file and, using the AWS SDK (e.g. boto3's put_log_events), send the log events to CloudWatch Logs. A rough sketch is below.
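A hedged sketch of that Lambda (Python/boto3), assuming the log files are plain text with one event per line; the log group name is a placeholder and must already exist:

```python
# Sketch: S3-triggered Lambda that forwards a new log file to CloudWatch Logs.
import time
import boto3

s3 = boto3.client("s3")
logs = boto3.client("logs")
LOG_GROUP = "/custom-app/s3-logs"   # placeholder; create the group beforehand

def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

        # One log stream per S3 object, so the first put_log_events call
        # needs no sequence token.
        stream = key.replace("/", "-")
        logs.create_log_stream(logGroupName=LOG_GROUP, logStreamName=stream)

        now_ms = int(time.time() * 1000)
        events = [{"timestamp": now_ms, "message": line}
                  for line in body.splitlines() if line]
        if events:
            logs.put_log_events(logGroupName=LOG_GROUP,
                                logStreamName=stream,
                                logEvents=events)
```

Re-uploads of the same key would need a try/except around create_log_stream, and very large files would have to be split to respect the put_log_events batch limits.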
I want to collect failure logs for AWS Data Pipeline actions. The standard failure handling provided by AWS when Data Pipeline actions (steps) fail is to send a notification to an SNS topic. I want to store these failures in a log store instead (ideally, in S3).
Is there any way to configure my pipeline to log these failure notifications instead of pushing a notification to SNS? I want to send Data Pipeline failure logs to S3.
According to the documentation, it appears that SNS is the only supported destination for failure notifications.
I am hopeful that there is a way, because it is a very common pattern in AWS to be able to push logs to S3 buckets.
I wonder: would I be able to find failure logs in the pipelineLogUri S3 bucket that is configured for each Data Pipeline step? (If I can filter these granular logs for the fatal errors, that would be acceptable to me.)
I decided to subscribe a Lambda function to the SNS notifications and push them to S3 from there. Not ideal (because it adds additional complexity to our system), but it works. A rough sketch of the function is below.
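A rough sketch of that Lambda in Python/boto3; the bucket name and key scheme are placeholders:

```python
# Sketch: SNS-subscribed Lambda that archives failure notifications to S3.
import json
import time
import boto3

s3 = boto3.client("s3")
BUCKET = "my-datapipeline-failure-logs"   # placeholder

def handler(event, context):
    for record in event["Records"]:
        message = record["Sns"]["Message"]
        # One object per notification, keyed by timestamp.
        key = f"failures/{int(time.time() * 1000)}.json"
        s3.put_object(Bucket=BUCKET, Key=key,
                      Body=json.dumps({"message": message}).encode("utf-8"))
```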
I created an S3 bucket with event notifications enabled. When a new object is uploaded to the bucket, SNS triggers an event to a Lambda that extracts the content from the file.
Is there any way to customize the SNS event into a more generic format?
There is no out-of-the-box way to customize the notification format (without involving a middleman like Lambda). I would suggest using AWS Lambda as the direct message destination instead of letting the S3 notification flow through SNS. This AWS documentation will be useful for you: https://docs.aws.amazon.com/AmazonS3/latest/dev/notification-walkthrough-2.html
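As a minimal sketch (Python/boto3) of what the handler sees when the Lambda is the direct S3 event destination, i.e. the raw S3 event rather than an SNS envelope; the extraction step is left as a placeholder:

```python
# Sketch: Lambda as the direct S3 event destination, reading the new object.
import urllib.parse
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        content = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        # ... extract whatever you need from `content` and reshape it into
        # your own generic format here ...
        print(f"processed s3://{bucket}/{key} ({len(content)} bytes)")
```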