My ClaudiaJS API is working fine, but I cannot store image files in an S3 bucket.
The problem is that the Lambda function has no permission to write to S3.
Question: How can I tell ClaudiaJS to include an S3 write permission in the Lambda function's permissions?
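If I'm reading the ClaudiaJS docs right, claudia create accepts a --policies option that attaches any IAM policy files found in a directory to the function's execution role. A minimal sketch of such a policy file (the bucket name is a placeholder):

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": ["s3:PutObject"],
          "Resource": "arn:aws:s3:::my-image-bucket/*"
        }
      ]
    }

Saved as, say, policies/s3-write.json, it would be picked up with claudia create ... --policies policies.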
Related
I have an S3 bucket and I want to upload a file from a Lambda that is invoked through an API call, but I don't know the correct Block Public Access configuration for the bucket. Right now none of the four options are checked and it works, but I'm not sure about it from a security point of view.
What is the correct configuration of the bucket policy, IAM role, and Block Public Access settings for uploading a file to S3 via an HTTP call?
Thanks!
The appropriate configuration is:
Lambda function IAM role: permit s3:PutObject to s3://mybucket/myprefix/*
Lambda function: use the AWS SDK to call PutObject on the bucket
The S3 bucket policy and S3 block public access settings are largely orthogonal to the Lambda requirement here. In the typical case, you should use standard best practices: do not have an S3 bucket policy at all (unless you specifically need it) and enable Block Public Access at the account level (unless you specifically need buckets to be public).
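For reference, a minimal sketch of the upload from inside the handler, assuming Node.js and the hypothetical bucket/prefix names from above:

    // Minimal sketch (Node.js, AWS SDK v2); bucket and prefix are assumptions.
    const AWS = require('aws-sdk');
    const s3 = new AWS.S3();

    exports.handler = async (event) => {
      // Write the request body to s3://mybucket/myprefix/<timestamp>
      await s3.putObject({
        Bucket: 'mybucket',
        Key: `myprefix/${Date.now()}`,
        Body: event.body
      }).promise();
      return { statusCode: 200, body: 'stored' };
    };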
There is an S3 bucket owned by a different AWS account that contains a list of files. I need to copy those files to my own S3 bucket. I would like to do two things to accomplish this:
Add an S3 bucket event in the other account that triggers a Lambda to copy the files into my AWS account.
Give my Lambda permission (possibly through an assumed role) to copy the files.
What are the steps I must perform to achieve 1 and 2?
The base requirement of copying files is straightforward:
Create an event on the source S3 bucket that triggers a Lambda function
The Lambda function copies the object to the other bucket
The complicating factor is the need for cross-account copying.
Two scenarios are possible:
Option 1 ("Pull"): Bucket in Account-A triggers Lambda in Account-B. This can be done with Resource-Based Policies for AWS Lambda (Lambda Function Policies) - AWS Lambda. You'll need to configure the trigger via the command-line, not the management console. Then, a Bucket policy on the bucket in Account-A needs to allow GetObject access by the IAM Role used by the Lambda function in Account-B.
Option 2 ("Push"): Bucket in Account-A triggers Lambda in Account-A (same account). The Bucket policy on the bucket in Account-B needs to allow PutObject access by the IAM Role used by the Lambda function in Account-A. Make sure it saves the object with an ACL of bucket-owner-full-control so that Account-B 'owns' the copied object.
If possible, I would recommend the Push option because everything is in one account (aside from the Bucket Policy).
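A minimal sketch of the copy function for the Push option, assuming Node.js and a hypothetical DEST_BUCKET environment variable; the source bucket and key come from the S3 event record:

    // Minimal sketch (Node.js, AWS SDK v2). DEST_BUCKET is an assumption.
    const AWS = require('aws-sdk');
    const s3 = new AWS.S3();

    exports.handler = async (event) => {
      for (const record of event.Records) {
        const srcBucket = record.s3.bucket.name;
        // Object keys in S3 events are URL-encoded.
        const srcKey = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));
        await s3.copyObject({
          CopySource: `${srcBucket}/${srcKey}`, // re-encode keys with special characters
          Bucket: process.env.DEST_BUCKET,
          Key: srcKey,
          // Let the destination account own the copied object.
          ACL: 'bucket-owner-full-control'
        }).promise();
      }
    };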
There is an easier way to do it without Lambda. AWS lets you set up replication on an S3 bucket (including cross-region and cross-account). Once replication is set up, all new objects are copied to the destination bucket automatically. For existing objects, use the AWS CLI to copy each object onto itself in the source bucket so that it gets replicated to the target bucket. Once all existing objects have been copied, you can turn replication off if you don't wish future objects to be replicated. Here AWS does the heavy lifting for you :) https://docs.aws.amazon.com/AmazonS3/latest/dev/crr.html
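For the existing-objects step, something like the following should work; S3 rejects a copy of an object onto itself unless something changes, hence the metadata directive (the bucket name is a placeholder):

    aws s3 cp s3://mybucket s3://mybucket --recursive \
        --metadata-directive REPLACE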
There are a few ways to achieve this.
You could use an SNS notification and cross-account IAM to trigger the Lambda. Read this: cross-account-s3-data-copy-using-lambda-function, which explains pretty well what you are trying to achieve.
Another approach is to deploy the Lambda and all the required resources in the account that holds the files. You would need to create an S3 notification that triggers a Lambda which copies the files to your account, or have a CloudWatch schedule (a bit like a cron job) that triggers the Lambda.
In this case the Lambda and the trigger would have to exist in the account that holds the files.
In both scenarios, the minimal IAM permissions the Lambda needs are: reading from and writing to the S3 buckets, using STS to assume the role, and CloudWatch Logs permissions so the Lambda can write its logs.
The rest of the required IAM permissions will depend on the approach you take.
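As a starting point, a minimal execution-role policy along those lines might look like this (bucket names, account ID, and role name are placeholders):

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": ["s3:GetObject"],
          "Resource": "arn:aws:s3:::source-bucket/*"
        },
        {
          "Effect": "Allow",
          "Action": ["s3:PutObject"],
          "Resource": "arn:aws:s3:::destination-bucket/*"
        },
        {
          "Effect": "Allow",
          "Action": ["sts:AssumeRole"],
          "Resource": "arn:aws:iam::123456789012:role/cross-account-role"
        },
        {
          "Effect": "Allow",
          "Action": ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"],
          "Resource": "*"
        }
      ]
    }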
I am using Athena from Lambda, and I just noticed that the query execution result is a success but there is nothing in the S3 bucket. The S3 bucket is public and has permission to put objects, so I can't find any problem with the bucket. I wonder if there is a solution for this.
Providing S3 write permission to the Lambda execution role may help.
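For example, the execution role could be granted access to the query results location, assuming a hypothetical results bucket name:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": ["s3:PutObject", "s3:GetObject", "s3:GetBucketLocation", "s3:ListBucket"],
          "Resource": [
            "arn:aws:s3:::athena-results-bucket",
            "arn:aws:s3:::athena-results-bucket/*"
          ]
        }
      ]
    }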
I am trying to deploy a Lambda function to AWS from S3.
My organization currently does not allow me to upload files to the root of an S3 bucket, only to a folder (i.e., s3://application-code-bucket/Application1/).
Is there any way to deploy the Lambda function code through S3 from a directory other than the bucket root? I checked the documentation for Lambda's CreateFunction API call and could not find anything obvious.
You need to zip your Lambda package and upload it to S3, in any folder. You can then provide an HTTPS S3 URL of the file when uploading to the Lambda function. The S3 bucket needs to be in the same region as the Lambda function.
Make sure you zip from inside the folder, i.e., when the package is unzipped, the files should be extracted in the same directory as the unzip command and should not create a new directory for the contents.
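For example, with the AWS CLI the folder simply becomes part of the S3 key; the function name, runtime, handler, and role below are placeholders:

    aws lambda create-function \
        --function-name my-function \
        --runtime nodejs18.x \
        --handler index.handler \
        --role arn:aws:iam::123456789012:role/my-lambda-role \
        --code S3Bucket=application-code-bucket,S3Key=Application1/function.zip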
I have this old script of mine that I used to automate Lambda deployments.
It needs a bit of refactoring, but it's still usable.
It takes as input the Lambda name and the path of a zip file located locally on your PC.
It uploads the zip to S3 and publishes it to AWS Lambda.
You need to set AWS credentials with an IAM role that allows:
S3 upload permission
AWS Lambda update permission
You need to modify the bucket name and the path you want your zip to be uploaded to (lines 36-37).
That's it.
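The script itself isn't reproduced here, but a minimal sketch of the same idea in Node.js (not the original script; the bucket name and key are the parts you would modify) could look like:

    // Minimal sketch, not the original script: upload a zip to S3,
    // then point the Lambda function's code at it.
    const AWS = require('aws-sdk');
    const fs = require('fs');

    const [, , functionName, zipPath] = process.argv;
    const bucket = 'my-deploy-bucket';          // <-- modify
    const key = `lambda/${functionName}.zip`;   // <-- modify

    const s3 = new AWS.S3();
    const lambda = new AWS.Lambda();

    (async () => {
      await s3.putObject({ Bucket: bucket, Key: key, Body: fs.readFileSync(zipPath) }).promise();
      await lambda.updateFunctionCode({ FunctionName: functionName, S3Bucket: bucket, S3Key: key, Publish: true }).promise();
      console.log(`Deployed ${functionName} from ${zipPath}`);
    })();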
Is it possible to configure an S3 bucket to run a Lambda function created in a different account? Basically, what I'm trying to accomplish is that when new items are added to the S3 bucket, I want to run a Lambda function in another account.
You can do this by providing the full Lambda function ARN to your S3 bucket, for example inside your bucket's event notification settings in the AWS Console.
This article will help you configure the correct IAM for cross-account invocation. Also take a look at the AWS Lambda Permissions Model. Note that, as far as I know, the bucket and the Lambda function have to be in the same region!
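Concretely, the resource-based policy side of that setup can be added from the CLI with something like this (the function name, bucket, and account ID are placeholders):

    aws lambda add-permission \
        --function-name my-function \
        --statement-id s3-cross-account-invoke \
        --action lambda:InvokeFunction \
        --principal s3.amazonaws.com \
        --source-arn arn:aws:s3:::bucket-in-other-account \
        --source-account 111122223333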