AWS Cloud9: use a Lambda function - amazon-web-services

The AWS Cloud9 UI has changed a lot, and I cannot find how to create a Lambda function from it. What I have done is create a Lambda function and use the upload-to-Lambda option on the folder, but it doesn't show anything. How do I do it?

Related

Can we import our locally created lambda function to AWS console?

I have been working on AWS Lambda. For some reason, I had to create a Lambda function locally. I just want to know: can we import a local Lambda function to the AWS Lambda console? If yes, please elaborate on how I can achieve this.
It is pretty easy:
Write your Lambda function with a lambda_handler
Create a requirements.txt
Install all the requirements into the same folder
Package it (zip file)
Go to AWS Lambda --> Create function --> Fill in all details --> Create function
Under the code section: Upload from --> .zip file --> (select your file) --> upload --> save
Modify the handler according to your function name.
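As a minimal sketch of the first step, a handler module might look like this (the file name my_function.py and the event keys are illustrative; the console's handler setting would then be my_function.lambda_handler):

```python
# my_function.py -- a minimal Lambda handler (names are illustrative)
import json


def lambda_handler(event, context):
    # Lambda passes the triggering event as a dict and a context object;
    # returning a statusCode/body dict matches the common proxy shape.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

Packaging would then be `pip install -r requirements.txt -t .` inside the folder, followed by zipping the folder's contents (not the folder itself), as the steps above describe.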
Yes, you can. Assume you create the Lambda function using the Java runtime API (com.amazonaws.services.lambda.runtime.RequestHandler) and you create a fat JAR that contains all of the required dependencies. You can deploy the Lambda function (the JAR) using the AWS Management Console, as described in this AWS tutorial.

Add AWS Lambda function trigger to existing function using Terraform

Currently I have a Lambda function that is created in a different Terraform module. I need to create a CloudWatch Logs trigger for that Lambda function from a separate repository. So far, I don't see any Terraform resources (that I know of) to do this. I have also looked into using Boto3 in local-exec through Terraform. This doesn't look possible either. Are there any ways that I am missing to accomplish this using Terraform, the AWS CLI, or Python?
Thanks
You need to define an aws_cloudwatch_log_subscription_filter with the Lambda function's ARN as the destination value. You could pass the Lambda function's ARN into the CloudWatch module, or you could have it look up the function by name to get the ARN. You'll probably also need to create an aws_lambda_permission resource to give CloudWatch permission to invoke the Lambda function.
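A sketch of that answer in Terraform, under stated assumptions: the log group name, function name, and empty filter pattern are placeholders, and the log group is declared here only so the subscription has something to attach to (in the question's setup it would already exist elsewhere).

```hcl
# Sketch only -- names and the filter pattern are placeholders.
data "aws_lambda_function" "target" {
  function_name = "my-log-processor"
}

resource "aws_cloudwatch_log_group" "app" {
  name = "/my-app/application-logs"
}

# Allow CloudWatch Logs to invoke the function.
resource "aws_lambda_permission" "allow_cloudwatch" {
  statement_id  = "AllowExecutionFromCloudWatchLogs"
  action        = "lambda:InvokeFunction"
  function_name = data.aws_lambda_function.target.function_name
  principal     = "logs.amazonaws.com"
  source_arn    = "${aws_cloudwatch_log_group.app.arn}:*"
}

# Send matching log events to the Lambda function.
resource "aws_cloudwatch_log_subscription_filter" "to_lambda" {
  name            = "send-to-lambda"
  log_group_name  = aws_cloudwatch_log_group.app.name
  filter_pattern  = ""
  destination_arn = data.aws_lambda_function.target.arn
  depends_on      = [aws_lambda_permission.allow_cloudwatch]
}
```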

AWS Multiple Lambda Functions in one zip

I'm using CloudFormation to create lambda resources and I have several python scripts as well as one js script in a lambda-resources folder. Would it be okay to pass the same file location for every lambda function and just specify unique handlers? When my lambda function is created it looks like it only creates one lambda function.
Yes, this is definitely one way to accomplish what you're looking to do.
You'll need to create a zipped version of your lambda-resources folder and upload it via the Lambda service or even to S3, then reference it as the file location for each Lambda function.
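For illustration, a CloudFormation sketch of that layout (the bucket name, S3 key, runtimes, and the LambdaExecutionRole resource are assumptions defined elsewhere in the template; the point is the shared Code location with a different Handler per function):

```yaml
# Sketch only -- bucket, key, runtimes, and role are placeholders.
Resources:
  FirstFunction:
    Type: AWS::Lambda::Function
    Properties:
      Runtime: python3.12
      Handler: first_script.handler        # unique handler per function
      Role: !GetAtt LambdaExecutionRole.Arn
      Code:
        S3Bucket: my-deploy-bucket         # same zip for every function
        S3Key: lambda-resources.zip
  SecondFunction:
    Type: AWS::Lambda::Function
    Properties:
      Runtime: nodejs20.x
      Handler: other_script.handler
      Role: !GetAtt LambdaExecutionRole.Arn
      Code:
        S3Bucket: my-deploy-bucket
        S3Key: lambda-resources.zip
```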

How do I add a Lambda Function with an S3 Trigger in CloudFormation?

I've been working with CloudFormation YAML for a while and have found it to be comprehensive - until now. I'm struggling to use SAM/CloudFormation to create a Lambda function that is triggered whenever an object is added to an existing S3 bucket.
All of the examples I've seen thus far seem to require that you create the bucket in the same CloudFormation script as the Lambda function. This doesn't work for me, because we have a design goal of being able to use CloudFormation to redeploy our entire stack to different regions or AWS accounts and quickly stand up our application. S3 bucket names must be globally unique, so if I create the bucket in CloudFormation, the script will break when I try to deploy it to a different region/account. I could probably get around this by creating buckets with the account name/region in the name, but that's just not desirable from a bucket-sprawl perspective.
So, does anyone have a solution for creating a Lambda function in CloudFormation that is triggered by objects being written to an existing S3 bucket?
Thanks!
This is impossible, according to the SAM team; it is something the underlying CloudFormation service can't do.
There is a possible workaround, if you implement a Custom resource which would trigger a separate Lambda function to modify the existing bucket and link it to the Lambda function that you want to deploy.
As "implement a Custom Resource" isn't very specific: here is an AWS GitHub repo with scaffold code to help write it, and then you declare something like the following in your template (where LambdaToBucket is the custom function you wrote). I've found that you need to configure two things in that function: one is a bucket notification configuration on the bucket (telling S3 to notify Lambda about changes), the other is a Lambda permission on the function (allowing invocations from S3).
Resources:
  JoinLambdaToBucket:
    Type: Custom::JoinLambdaToExistingBucket
    Properties:
      ServiceToken: !GetAtt LambdaToBucket.Arn
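Inside the custom resource's Lambda function, the first of those two pieces is a call to S3's put_bucket_notification_configuration. As a sketch of that step, a helper that builds the notification payload (the function name and the optional suffix filter are illustrative; the boto3 call, the add_permission call, and the CloudFormation response signaling are omitted):

```python
# Builds the NotificationConfiguration dict a custom-resource handler would
# pass to s3.put_bucket_notification_configuration (names are illustrative).
def build_notification_config(function_arn, suffix=""):
    config = {
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": function_arn,
                # Fire on any object creation (put, post, copy, multipart).
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    }
    if suffix:
        # Optionally restrict notifications to keys with a given suffix.
        config["LambdaFunctionConfigurations"][0]["Filter"] = {
            "Key": {"FilterRules": [{"Name": "suffix", "Value": suffix}]}
        }
    return config
```

The handler would pass this dict as NotificationConfiguration for the existing bucket, after first granting s3.amazonaws.com permission to invoke the target function, then signal success or failure back to CloudFormation.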

How can I create a Scheduled Event for a lambda function using the AWS CLI?

The AWS CLI does not have an option to schedule a lambda function. This is possible via the AWS console right now.
Any ideas on how I can do this?
aws lambda create-event-source-mapping # does not support scheduling events
It is not possible to use the API to create scheduled event sources for AWS Lambda at this time, which means it is not possible to use the AWS CLI to create the schedule. It is also not possible to schedule an AWS Lambda function with CloudFormation.
Unfortunately, using the GUI is the only option until AWS releases an API.
We use Lambda to create print-ready files: http://blog.peecho.com/blog/using-aws-lambda-functions-to-create-print-ready-files