I'm trying to provide my Lambda function with the S3FullAccessPolicy policy. Note the target bucket is not configured within the template.yaml - it already exists. Considering the syntax examples from this documentation, I have three options:
1. An AWS managed policy name:
Policies:
  - S3FullAccessPolicy
2. An AWS SAM policy template (the docs use SQSPollerPolicy as the example):
Policies:
  - S3FullAccessPolicy:
      BucketName: abc-bucket-name
3. An inline policy document:
Policies:
  - Statement:
      ...
Trying #1, I get an error that seems to suggest I need to provide an ARN. If that's the case, where would I provide it? The error:
1 validation error detected: Value 'S3FullAccessPolicy' at 'policyArn' failed to satisfy constraint:
Member must have length greater than or equal to 20
For #2 I provide the bucket name, but it says the policy is 'invalid'. I've tried adding quotes and replacing the name with an ARN, but no luck.
And for #3, I can find the code for the policy here, but it's in YAML, so I wonder if that's even what I'm supposed to be using.
What am I missing here? I'm open to using any one of these options, but right now I'm 0/3.
The full Lambda function:
testFunction:
  Type: AWS::Serverless::Function
  Properties:
    CodeUri: lambda/testFunction/
    Handler: app.lambda_handler
    Runtime: python3.8
    Timeout: 900
    Policies:
      - S3FullAccessPolicy
I used the template below without any issues. (Note that S3FullAccessPolicy is a SAM policy template, not an AWS managed policy, so it takes a BucketName parameter as in your option #2.)
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: ./hello_world/
      Handler: app.lambda_handler
      Runtime: python3.8
      Tracing: Active
      Policies:
        - S3FullAccessPolicy:
            BucketName: existingbucketname # bucket name without ARN
I ran it using the command below and it deployed successfully.
sam deploy --stack-name sample-stack --s3-bucket bucket-to-deploy --capabilities CAPABILITY_IAM
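If you'd rather use an inline policy document (your option #3) against the existing bucket, a minimal sketch like the following should also work; the bucket name is a placeholder and the broad s3:* is only for illustration:
Policies:
  - Version: '2012-10-17'
    Statement:
      - Effect: Allow
        Action: 's3:*'                           # broad for illustration; scope down as needed
        Resource:
          - 'arn:aws:s3:::existingbucketname'    # placeholder bucket name
          - 'arn:aws:s3:::existingbucketname/*'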
After deploying a Lambda through SAM, I was getting a 403 inside the Lambda when attempting to download from S3. I checked the Lambda's role in the IAM Management Console and saw that the role only had AWSLambdaBasicExecutionRole. However, it should also have the S3 read/write policies from the SAM template.
Here is a snippet from my SAM template (with some things renamed):
MyFunction:
  Type: AWS::Serverless::Function
  Properties:
    PackageType: Image
  Metadata:
    DockerTag: my-tag
    DockerContext: ./src/stuff
    Dockerfile: Dockerfile
    Policies:
      - Version: '2012-10-17'
      - S3ReadPolicy:
          BucketName: !Ref MyBucket
      - S3WritePolicy:
          BucketName: !Ref MyBucket
Shouldn't the S3ReadPolicy and S3WritePolicy be a part of the Lambda's Role?
Is there something I'm missing?
I know I could manually add the policies needed, but obviously I want as much as possible to happen automatically via SAM.
Other details: I'm not sure if it matters, but for additional context, the Lambda is part of a Step Functions state machine. I'm using the boto3 library to download from S3, and I get a {'Code': '403', 'Message': 'Forbidden'} error from boto3.
(Answering my own question.) The issue was that the Policies section needed to be under Properties.
MyFunction:
  Type: AWS::Serverless::Function
  Properties:
    PackageType: Image
    Policies:
      - Version: '2012-10-17'
      - S3ReadPolicy:
          BucketName: !Ref MyBucket
      - S3WritePolicy:
          BucketName: !Ref MyBucket
  Metadata:
    DockerTag: my-tag
    DockerContext: ./src/stuff
    Dockerfile: Dockerfile
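If you want to double-check that the policies actually landed on the role, one way (the role name below is a placeholder; SAM generates it from the function's logical ID) is to list the role's inline policies, which is where the SAM policy templates end up:
aws iam list-role-policies --role-name <stack-name>-MyFunctionRole-XXXXXXXX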
I'm new to SAM templates. I have the following snippet of my SAM template, where I pass the bucket name as a parameter from outside of this SAM YAML file:
SAM template:
MyLambda:
  Type: AWS::Serverless::Function
  Properties:
    CodeUri: ./functions/test/dist/
    Handler: index.lambdaHandler
    Runtime: nodejs12.x
    Events:
      S3PutObjectEvent:
        Type: S3
        Properties:
          Bucket: !Ref S3BucketName
          Events: s3:ObjectCreated:*
Parameter.YAML:
DeploymentEnvironment:
  default:
    S3BucketName: my-awesome-s3-bucket
Now, I do not create any S3 bucket using the SAM template or infrastructure as code (IaC). Bucket creation is done by the Lambda code itself, hence there is no AWS::S3::Bucket resource declared in my SAM template.
When I run sam validate to validate the SAM template, I get this error:
/template.yaml' was invalid SAM Template.
Error: [InvalidResourceException('MyLambda', 'Event with id [S3PutObjectEvent] is invalid. S3 events must reference an S3 bucket in the same template.')] ('MyLambda', 'Event with id [S3PutObjectEvent] is invalid. S3 events must reference an S3 bucket in the same template.')
I've read various forums and tried hard to get this solved, but I'm not sure whether the bucket name can be passed in from outside of the SAM template at all.
Is there a workaround? This is a critical issue for me. Appreciate your help on this. Thanks.
Bucket creation is done by Lambda code itself
I'd recommend against this pattern, as your Lambda event source won't get created if the bucket doesn't already exist.
Try creating the bucket in your SAM template, and pass the bucket name to your function as an environment variable.
Optionally, you can give the bucket a different name per environment (addressing the comment) using Parameters.
Parameters:
  Env:
    Type: String
    AllowedValues:
      - dev
      - qa
      - prod
    Default: dev
Resources:
  MyBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Sub 'my-unique-bucket-name-${Env}' # bucket names must be lowercase
  MyLambda:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: ./functions/test/dist/
      Handler: index.lambdaHandler
      Runtime: nodejs12.x
      Environment:
        Variables:
          BUCKET_NAME: !Ref MyBucket # passed to the Lambda as an environment variable
      Events:
        S3PutObjectEvent:
          Type: S3
          Properties:
            Bucket: !Ref MyBucket
            Events: s3:ObjectCreated:*
And read the bucket name in your function:
const bucket = process.env.BUCKET_NAME
I want to create a deployment script for some Lambda functions using AWS SAM. Two of those functions will be deployed into one account (account A) but will be triggered by an S3 object-creation event in a second account (account B). From what I know, the only way to do this is by adding a resource-based policy to my Lambda, but I don't know how to do that in AWS SAM. My current YAML file looks like this.
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: >
  deploy-test-s3-triggered-lambda
Parameters:
  AppBucketName:
    Type: String
    Description: "REQUIRED: Unique S3 bucket name to use for the app."
Resources:
  S3TriggeredLambda:
    Type: AWS::Serverless::Function
    Properties:
      Role: arn:aws:iam::************:role/lambda-s3-role
      Handler: src/handlers/s3-triggered-lambda.invokeAPI
      CodeUri: src/handlers/s3-triggered-lambda.js.zip
      Runtime: nodejs10.x
      MemorySize: 128
      Timeout: 60
      Policies:
        S3ReadPolicy:
          BucketName: !Ref AppBucketName
      Events:
        S3NewObjectEvent:
          Type: S3
          Properties:
            Bucket: !Ref AppBucket
            Events: s3:ObjectCreated:*
  AppBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Ref AppBucketName
What do I need to add to this YAML file in order to attach a resource-based policy that allows cross-account access to my Lambda function?
This can be achieved with the help of AWS::Lambda::Permission, which is available in the CDK as aws_cdk.aws_lambda.CfnPermission.
For example, to allow your Lambda to be called from a role in another account, add the following to your CDK app:
from aws_cdk import aws_lambda

aws_lambda.CfnPermission(
    scope,
    "CrossAccountInvocationPermission",
    action="lambda:InvokeFunction",
    function_name="FunctionName",
    principal="arn:aws:iam::111111111111:role/rolename",
)
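If you want to stay in SAM/CloudFormation rather than CDK, a rough equivalent is a plain AWS::Lambda::Permission resource added under Resources in the template above; this is only a sketch, and the account ID and bucket ARN are placeholders for account B:
  S3CrossAccountInvokePermission:
    Type: AWS::Lambda::Permission
    Properties:
      FunctionName: !GetAtt S3TriggeredLambda.Arn
      Action: lambda:InvokeFunction
      Principal: s3.amazonaws.com
      SourceAccount: '111111111111'                 # placeholder: account B, which owns the bucket
      SourceArn: arn:aws:s3:::bucket-in-account-b   # placeholder: bucket ARN in account B
Note that this only grants S3 in account B permission to invoke the function; the bucket's event notification itself still has to be configured in account B.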
If your bucket and your Lambda function exist in separate accounts, I don't know if it's possible to modify both of them from SAM / a single CloudFormation template.
I don't think a cross-account S3 event is possible with SAM; you may need to go back to plain CloudFormation.
I've been following the AWS SAM documentation for hands-on learning and have reached this particular section: https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-property-function-s3.html
The above page explains how to map S3 events to a resource, and I've done something similar on my local machine. My local template is below.
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  CreateThumbnail:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: hello_world/
      Handler: app.lambda_handler
      Runtime: python3.6
      Timeout: 60
      Policies: AWSLambdaExecute
      Events:
        CreateThumbnailEvent:
          Type: S3
          Properties:
            Bucket: !Ref testBucket
            Events: s3:ObjectCreated:*
  testBucket:
    Type: AWS::S3::Bucket
Now, when I build and deploy the above SAM template, a new S3 bucket is created, but the 'object created' event does not appear to be mapped to the Lambda function.
I haven't found any articles or blog posts on this (probably I'm making a silly mistake here).
Please help.
Thanks.
I realized that, for some reason, the AWS Lambda console is not showing the trigger, but the mapping is successful. I've also validated it by uploading a sample image to the bucket, which triggered the Lambda.
The event mapping can be seen in the properties section of the S3 bucket created.
S3 test bucket created -> Properties -> Events
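You can also confirm the mapping from the CLI (the bucket name is a placeholder):
aws s3api get-bucket-notification-configuration --bucket <your-bucket-name>
It should return a LambdaFunctionConfigurations entry pointing at the function's ARN.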
You can add an AWS::Lambda::Permission to your Resources, as described in this AWS SAM GitHub issue:
LambdaInvokePermission:
  Type: 'AWS::Lambda::Permission'
  Properties:
    FunctionName: !GetAtt MyFunction.Arn
    Action: 'lambda:InvokeFunction'
    Principal: 's3.amazonaws.com'
    SourceAccount: !Sub ${AWS::AccountId}
    SourceArn: !GetAtt MyBucket.Arn
First create a bucket where you plan to save the packaged code:
aws s3 mb s3://yourbucketname
After sam build, run:
sam package --template-file template.yaml --s3-bucket <yourbucketname>
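To then deploy the packaged artifact, something like the following should work, assuming you also passed --output-template-file packaged.yaml to the package command above (the stack name is a placeholder):
sam deploy --template-file packaged.yaml --stack-name my-stack --capabilities CAPABILITY_IAM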
I have the following AWS SAM file (extract shown) for a Lambda function. The problem is that I'm trying to specify multiple policies, and this does not work; I get an error.
Resources:
  Get:
    Type: AWS::Serverless::Function
    Properties:
      FunctionName: fnStores
      Handler: handler.get
      Runtime: nodejs6.10
      Policies:
        -AmazonDynamoDBReadOnlyAccess
        -AmazonS3ReadOnlyAccess
This is the error I get:
"ARN -AmazonDynamoDBReadOnlyAccess -AmazonS3ReadOnlyAccess is not valid.
On a side note, is it possible to create a custom policy that combines the above two and then use that? If so, please provide an example.
The YAML list isn't valid. You need a space between the - and the policy names.
Try
Resources:
  Get:
    Type: AWS::Serverless::Function
    Properties:
      FunctionName: fnStores
      Handler: handler.get
      Runtime: nodejs6.10
      Policies:
        - AmazonDynamoDBReadOnlyAccess
        - AmazonS3ReadOnlyAccess
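As for the side note about a single custom policy that combines the two: instead of the managed policy names, you can put an inline policy document in the same Policies list. A minimal sketch, with placeholder table and bucket ARNs and trimmed action lists:
      Policies:
        - Version: '2012-10-17'
          Statement:
            - Effect: Allow
              Action:
                - dynamodb:GetItem
                - dynamodb:Query
                - dynamodb:Scan
              Resource: arn:aws:dynamodb:*:*:table/my-table   # placeholder table ARN
            - Effect: Allow
              Action:
                - s3:GetObject
                - s3:ListBucket
              Resource:
                - arn:aws:s3:::my-bucket                      # placeholder bucket
                - arn:aws:s3:::my-bucket/*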