Trigger a Lambda function from multiple S3 file uploads

I am new to some AWS services.
Our source team uploads 2 files to an S3 bucket at some interval. They run one pipeline and upload a file to S3, then they run another process and upload another file to S3. Both processes run in parallel, so the files are not uploaded in a specific order.
We need to trigger our Lambda function when both files have been uploaded to S3.
We tried triggering the Lambda from S3 events and SNS, but it triggers the Lambda twice because of the 2 S3 events.
What is the best approach to handle this? Any suggestion would be appreciated.

Related

S3 Upload Trigger (Multiple Files)

I have a foo lambda that executes some code by reading some files.
I only want to run the lambda after I upload the 10 required files, which is the tricky part:

1. 10 files are uploaded to an S3 bucket via a Bitbucket pipeline
2. ??? (need to wait for all new CSVs to be uploaded)
3. Execute the foo lambda

If I use an S3 upload trigger it will not work, because it will call the lambda 10 times, once for each file upload...
The 10 files already exist in the S3 bucket; I just replace them.
Any ideas how to run the foo lambda only ONCE after the 10 files are uploaded?
The AWS Lambda function will be triggered for every object created in the Amazon S3 bucket.
There is no capability to ask for the Lambda function to run only after 10 files are uploaded.
You will need to add custom code to the Lambda function to determine whether it is 'ready' to trigger your work (and the definition of 'ready' is up to you!).
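As one concrete illustration of such a 'ready' check: since the 10 files already exist and are only replaced, the existence of a key proves nothing, so one option is to treat the batch as ready once every required object was modified recently. A minimal sketch in Python, assuming hypothetical bucket, prefix, and file names, and that the function is wired to the bucket's ObjectCreated events:

```python
import boto3
from datetime import datetime, timedelta, timezone

s3 = boto3.client("s3")

# Hypothetical names -- substitute your own bucket, prefix, and file list.
BUCKET = "my-bucket"
PREFIX = "input/"
REQUIRED_KEYS = {f"{PREFIX}file_{i}.csv" for i in range(1, 11)}
FRESHNESS = timedelta(minutes=30)  # how recent a (re)upload must be to count

def handler(event, context):
    cutoff = datetime.now(timezone.utc) - FRESHNESS
    resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)

    # Keep only objects written recently; the files are merely replaced,
    # so LastModified, not existence, tells us what has been re-uploaded.
    recent = {
        obj["Key"]
        for obj in resp.get("Contents", [])
        if obj["LastModified"] >= cutoff
    }

    if REQUIRED_KEYS <= recent:
        run_foo()  # the 10th fresh file just landed -> do the real work
    else:
        print("Not ready, still waiting for:", REQUIRED_KEYS - recent)

def run_foo():
    ...  # the actual processing
```

If S3 delivers duplicate events, run_foo could still fire more than once, so an idempotency guard (for example, a conditional write to DynamoDB) is worth adding.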

Loading files from an AWS S3 bucket into Snowflake tables

I want to copy files from an S3 bucket into Snowflake. To do this I'm using a Lambda function. In the S3 bucket I have folders, and every folder contains many CSV files. These CSV files can be small or huge. I have created a Lambda function that loads these files into Snowflake. The problem is that a Lambda function can only run for 15 minutes, which is not enough to load all the files into Snowflake. Can you help me with this problem? I have one solution for this: execute the lambda with only one file, not with all files.
As you said, the maximum execution time for a Lambda function is 15 minutes, and it is not a good idea to load all the files into memory, because you will have high execution-time costs and high memory usage.
But if you really want to use Lambdas and you are dealing with files over 1 GB, perhaps you should consider AWS Athena, or optimize your AWS Lambda function to read the file using a stream instead of loading the whole file into memory.
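To illustrate the streaming idea, here is a minimal sketch, assuming the function is invoked by the S3 event for a single CSV and that process_row stands in for your own per-row logic:

```python
import boto3
from urllib.parse import unquote_plus

s3 = boto3.client("s3")

def handler(event, context):
    # Bucket and key come from the S3 event that invoked the function.
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = unquote_plus(record["s3"]["object"]["key"])

    obj = s3.get_object(Bucket=bucket, Key=key)

    # obj["Body"] is a streaming response; iter_lines() yields one row at
    # a time, so the whole CSV never sits in Lambda memory at once.
    for line in obj["Body"].iter_lines():
        process_row(line.decode("utf-8"))

def process_row(row):
    ...  # e.g. batch rows into a load towards Snowflake
```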
Another option may be to create an SQS message when the file lands on S3 and have an EC2 instance poll the queue and process it as necessary. For more information check here: Running Cost-effective queue workers with Amazon SQS and Amazon EC2 Spot Instances.
The best option would be to automate Snowpipe with AWS Lambda; for this, check the Snowpipe docs: Automating Snowpipe with AWS Lambda.
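With that approach the Lambda only tells Snowpipe which file landed and Snowflake does the actual load, so the 15-minute limit stops mattering. A rough sketch using the snowflake-ingest SDK, with hypothetical account, pipe, and key names:

```python
# Requires the snowflake-ingest package bundled with the Lambda.
from snowflake.ingest import SimpleIngestManager, StagedFile

PRIVATE_KEY_PEM = "..."  # hypothetical: load your key-pair auth key securely

# Hypothetical account and pipe names.
ingest_manager = SimpleIngestManager(
    account="myaccount",
    host="myaccount.snowflakecomputing.com",
    user="INGEST_USER",
    pipe="MYDB.MYSCHEMA.MYPIPE",
    private_key=PRIVATE_KEY_PEM,
)

def handler(event, context):
    # One S3 event per file: hand just that file's path to Snowpipe and return.
    key = event["Records"][0]["s3"]["object"]["key"]
    resp = ingest_manager.ingest_files([StagedFile(key, None)])
    print(resp["responseCode"])  # 'SUCCESS' means the file was queued
```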

How to Process Multiple Related Files uploaded to s3 as one group using aws Lambda

I have multiple related files being uploaded to an S3 bucket as a group, which I want to process using AWS Lambda. For example, inventory.txt, orders.txt, and order_details.txt are received in one folder in the S3 bucket. These are part of one batch. Someone else will send the same files to another folder in the same bucket.
I want to process these files (cleanse, combine, etc.) at the same time (so 3 files at the same time) as a batch.
I have dabbled with Lambda on the S3 create-object event level, but it gets triggered for each file being uploaded. I want the lambda to trigger once for the 3 files (and for the additional 3 files in another directory, if applicable).
After the upload process completes, make it create a trigger file (a dummy file). For example, your process uploads orders.txt and inventory.txt to s3://my-bucket/today_date/.
Make it create s3://my-bucket/today_date/today_date.complete after the copy is complete.
Add an S3 event trigger to your Lambda function to execute when a .complete file is uploaded to S3, then process the rest of the files using the Lambda and delete the .complete file. The process repeats for the next day.
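A minimal sketch of that Lambda, assuming the trigger is configured with a suffix filter of ".complete" and that process_batch stands in for your cleanse/combine logic:

```python
import boto3
from urllib.parse import unquote_plus

s3 = boto3.client("s3")

def handler(event, context):
    # The suffix filter means this fires once per batch, not once per file.
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    marker_key = unquote_plus(record["s3"]["object"]["key"])

    # Everything in the same "folder" as the marker is part of the batch.
    prefix = marker_key.rsplit("/", 1)[0] + "/"
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    batch = [o["Key"] for o in resp.get("Contents", []) if o["Key"] != marker_key]

    process_batch(bucket, batch)  # cleanse/combine the three files together

    # Remove the marker so the next batch can recreate it.
    s3.delete_object(Bucket=bucket, Key=marker_key)

def process_batch(bucket, keys):
    ...
```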

How to automate AWS Elastic Transcoder Jobs for S3 buckets?

What I want: to add watermarks to all video files that are uploaded to the S3 bucket (mov, mp4, etc.), then overwrite the file under its same name with the newly transcoded file that has the watermark on it.
I was able to do this manually by creating a pipeline and job with Elastic Transcoder, but that is manual. I want this done the moment a file is uploaded to the bucket: overwrite the file with the new file and boom.
One, this should be a feature already, and I'm not sure why it isn't.
And two, how can I have this done automatically? Any advice? I know it's possible, I'm just not sure exactly where to start.
You need an S3 bucket and a Lambda function along with your transcoder pipeline.
Elastic Transcoder is the backbone of the process.
To automate transcoding, create a Lambda function which gets triggered by an S3 event.
A more detailed explanation is here.
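A minimal sketch of such a Lambda, assuming a hypothetical pipeline ID, a preset that defines a watermark slot, and a watermark image already in the pipeline's input bucket:

```python
import boto3
from urllib.parse import unquote_plus

transcoder = boto3.client("elastictranscoder")

# Hypothetical IDs -- use your own pipeline, preset, and watermark settings.
PIPELINE_ID = "1111111111111-abcde1"
PRESET_ID = "1351620000001-000010"
WATERMARK_KEY = "branding/watermark.png"

def handler(event, context):
    # Fired by the S3 ObjectCreated event for each uploaded video.
    key = unquote_plus(event["Records"][0]["s3"]["object"]["key"])

    transcoder.create_job(
        PipelineId=PIPELINE_ID,
        Input={"Key": key},
        Outputs=[{
            # The pipeline writes to its own output bucket; writing back to
            # the same bucket/key that triggered this Lambda would loop.
            "Key": key,
            "PresetId": PRESET_ID,
            "Watermarks": [{
                "PresetWatermarkId": "BottomRight",  # ID defined in the preset
                "InputKey": WATERMARK_KEY,
            }],
        }],
    )
```

Note that if you truly overwrite the original object in the same bucket, the new upload fires the trigger again, so a separate output bucket (or a prefix filter on the trigger) is the safer design.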

AWS Lambda and Multipart Upload to/from S3

Using Lambda to move files from S3 to our Redshift.
The data is placed in the S3 using an UNLOAD command directly from the data provider's Redshift. It comes in 10 different parts that, due to running in parallel, sometimes complete at different times.
I want the Lambda trigger to wait until all the data is completely uploaded before firing the trigger to import the data to my Redshift.
There is an event option in Lambda called "Complete Multipart Upload." Does the UNLOAD function count as a multipart upload within Lambda? Or would the simple "POST" event not fire until all the parts are completely uploaded by the provider?
There is no explicit documentation confirming that Redshift's UNLOAD command counts as a Multipart upload, or any confirming that the trigger will not fire until the data provider's entire upload is complete.
For Amazon S3, a multi-part upload is a single file, uploaded to S3 in multiple parts. When all parts have been uploaded, the client calls CompleteMultipartUpload. Only after the client calls CompleteMultipartUpload will the file appear in S3.
And only after the file is complete will the Lambda function be triggered. You will not get a Lambda trigger for each part.
If your UNLOAD operation is generating multiple objects/files in S3, then it is NOT an S3 "multi-part upload".
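Since Redshift's UNLOAD can write a manifest file after all the data files are complete (the MANIFEST option), one way to get a single trigger per batch is to filter the S3 notification on that manifest object, much like the .complete marker approach above. A sketch of configuring such a filter with boto3, assuming hypothetical bucket and function names (the Lambda must already permit S3 to invoke it):

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical names -- substitute your own bucket and function ARN.
BUCKET = "my-unload-bucket"
LAMBDA_ARN = "arn:aws:lambda:us-east-1:123456789012:function:load-to-redshift"

# Fire only when the UNLOAD ... MANIFEST file appears, i.e. after all
# data parts have been written, instead of once per data file.
s3.put_bucket_notification_configuration(
    Bucket=BUCKET,
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [{
            "LambdaFunctionArn": LAMBDA_ARN,
            "Events": ["s3:ObjectCreated:*"],
            "Filter": {
                "Key": {
                    "FilterRules": [{"Name": "suffix", "Value": "manifest"}]
                }
            },
        }]
    },
)
```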