OK, I am new to AWS. I am building a dashboard that will also be a UI to trigger Lambda functions. I see that there's a native dashboarding service called QuickSight. Does it allow triggering Lambda functions from the dashboard?
I am thinking of falling back to Flask or Django. Does anyone have a better suggestion? Thank you so much in advance.
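For context, if I do end up building my own backend, this is roughly the kind of thing I imagine writing - a sketch using the AWS SDK for JavaScript v3, where "my-report-function" is just a made-up name:

    // Sketch: invoke a Lambda from a custom dashboard backend when a button is clicked.
    import { LambdaClient, InvokeCommand } from "@aws-sdk/client-lambda";

    const lambda = new LambdaClient({ region: "us-east-1" });

    export async function triggerReport(params: Record<string, unknown>): Promise<string> {
      const response = await lambda.send(new InvokeCommand({
        FunctionName: "my-report-function",          // made-up function name
        Payload: Buffer.from(JSON.stringify(params)),
      }));
      // The function's JSON response comes back as bytes
      return Buffer.from(response.Payload ?? []).toString();
    }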
I'm currently working on creating a bot with Amazon Lex. Reading the AWS tutorials, I saw that they were outdated. This wasn't a problem until I needed to hook an AWS Lambda function to my intent fulfillment. I discovered that I need to hook the Lambda function to a bot alias (which I created, but I have not discovered how to attach the function to it) to be able to test the bot. I'm stuck on this problem. I would appreciate it if someone who has already built a bot could explain to me how to hook the Lambda function to the bot intent.
I am going to assume that you're working with Amazon Lex V2. A lot of people run into this issue of associating a Lambda function with a Lex bot.
AWS have chosen a pretty weird way to configure this.
Please take a look at this answer that I had previously written on how to link a Lambda function to your Lex bot: https://stackoverflow.com/a/73621837/8880593
Add this permission to your Lambda function so the Lex bot is allowed to invoke it. In CDK (TypeScript) that looks roughly like the following (the construct name fulfillmentFn is a placeholder for your own function):

    // Resource-based permission on the fulfillment Lambda; AnyPrincipal is deliberately broad for testing
    import { aws_iam as iam } from "aws-cdk-lib";

    fulfillmentFn.addPermission("AllowLexInvoke", {
      action: "lambda:InvokeFunction",
      principal: new iam.AnyPrincipal(),
    });

If it works, you can limit the principal later on (for example to the Lex service principal).
I have a serverless infrastructure that has a front-end web app. In this front-end, users can select specific times of day for taskX to occur.
I know how to set up (and have set up) events that occur on a recurring schedule with a manually created, cron-based trigger (via the Serverless Framework). It's my understanding that I could use a cron expression to trigger at specific times, as explained here: How to trigger a Lambda function at specific time in AWS? and here: https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-run-lambda-schedule.html ...
...however, I don't know how I would programmatically create (and also remove) these events using the AWS SDK. (Also noting I might have thousands of said events - perhaps EventBridge isn't the right tool?)
Suggestions on how to do this?
If I understood your question correctly, you need to set up a one-time schedule. Recently (10-Nov-2022) AWS launched a new service called EventBridge Scheduler, and it supports one-time schedules. I already added an answer here, please have a look. If not, comment below - I think I can help you.
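To give a rough idea, creating (and later removing) a one-time schedule that invokes a Lambda looks something like this with the AWS SDK for JavaScript v3 (@aws-sdk/client-scheduler); all names, ARNs, and the role below are placeholders you would build from your own resources:

    // Sketch: one-time EventBridge Scheduler schedule targeting a Lambda.
    import { SchedulerClient, CreateScheduleCommand, DeleteScheduleCommand } from "@aws-sdk/client-scheduler";

    const scheduler = new SchedulerClient({ region: "us-east-1" });

    // Create a schedule that fires once at the given time
    await scheduler.send(new CreateScheduleCommand({
      Name: "taskX-user123-20221201T0900",
      ScheduleExpression: "at(2022-12-01T09:00:00)",   // one-time "at(...)" expression
      FlexibleTimeWindow: { Mode: "OFF" },
      Target: {
        Arn: "arn:aws:lambda:us-east-1:123456789012:function:taskX",
        RoleArn: "arn:aws:iam::123456789012:role/scheduler-invoke-taskX", // role Scheduler assumes to invoke the target
        Input: JSON.stringify({ userId: "123" }),
      },
    }));

    // Removing it later is a single call by name
    await scheduler.send(new DeleteScheduleCommand({ Name: "taskX-user123-20221201T0900" }));

Scheduler is also built for a very large number of schedules, so thousands of per-user events should be fine, whereas plain EventBridge rules have a much lower default per-bus limit.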
For example, say I build a workflow that uses 10 Lambda functions that trigger each other and are triggered by a DynamoDB table and an S3 bucket.
Is there any AWS tool that tracks how these triggers tie together, so I can easily visualize the whole workflow I've created?
Bang on - a few months ago, I too was in a similar situation with my distributed architecture running on AWS.
So far, I have found the following options as possibilities. I'm still figuring out which is more suitable, but I hope this information helps you.
1. AWS-native option :: Engineer your Lambda code to emit CloudWatch custom metrics for any important events from within the code. Later, you can use a CloudWatch dashboard to visualize them (see the sketch after this list).
2. Non-AWS options :: There are several of them, but all of them require you to instrument your code with their respective libraries/packages to transmit the needed information. Some of them support async invocations, so they shouldn't keep your main Lambdas waiting while traces are sent.
IOPipe
Epsagon
3. Mix of AWS & non-AWS :: This is a more traditional approach to our problem. You log events to CloudWatch Logs (like Lambda does out of the box), then "ingest" these logs into popular log management and analysis SaaS tooling that makes sense of them via "pattern matching" and other proprietary techniques.
Splunk Cloud
Datadog
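To illustrate option 1, here is a rough sketch of emitting a custom metric from inside a Lambda handler with the AWS SDK for JavaScript v3 (the namespace, metric name, and dimension are just examples):

    // Sketch: publish a custom CloudWatch metric from a Lambda handler so a dashboard can chart it later.
    import { CloudWatchClient, PutMetricDataCommand } from "@aws-sdk/client-cloudwatch";

    const cloudwatch = new CloudWatchClient({});

    export async function handler(event: unknown) {
      // ...do the actual work of this Lambda...

      // Emit an event marker; a CloudWatch dashboard or alarm can visualize/aggregate these later
      await cloudwatch.send(new PutMetricDataCommand({
        Namespace: "MyWorkflow",
        MetricData: [{
          MetricName: "OrderProcessed",
          Dimensions: [{ Name: "Stage", Value: "validate" }],
          Value: 1,
          Unit: "Count",
          Timestamp: new Date(),
        }],
      }));
    }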
All the best! Keep me posted on how it goes.
cheers,
ram
If you use CloudFormation you can visualize the resource relations with CloudFormation Designer. However, if you don't have the resources in a CloudFormation stack, you can create one from all the existing resources.
In AWS it is possible to use CloudWatch to trigger callback Lambda functions on events.
Is it possible in GCE to automatically tag servers with the user who created them, based on the activity logs? Google Cloud Functions seem to only be able to run a non-public callback based on GCS events.
How would I do this?
As a matter of fact, there are four types of triggers for Google Cloud Functions, but none of them is useful in this case.
There is a way to automatically do so, though.
You can create an application in App Engine that sets up Stackdriver Logging using a client library (for example, the Python one).
Then you can schedule a task using a cron job which triggers the application. You can use the client library to review the logs, search for compute.instance.insert (GCE instance creation) and the "actor" or "user", and finally add a label to the existing resource.
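As an illustration of the log-scanning step (shown here in TypeScript with the Node.js client library rather than Python; the project ID is a placeholder and the payload field paths assume the Admin Activity audit-log format):

    // Sketch: find recent instance-creation audit-log entries and the user who made them.
    import { Logging } from "@google-cloud/logging";

    const logging = new Logging({ projectId: "my-project" }); // placeholder project

    const [entries] = await logging.getEntries({
      // Admin Activity audit logs record instance creation under compute.instances.insert
      filter: 'logName:"cloudaudit.googleapis.com%2Factivity" AND protoPayload.methodName:"compute.instances.insert"',
      pageSize: 50,
    });

    for (const entry of entries) {
      const payload = entry.data as any;           // the entry's audit-log protoPayload
      const user = payload?.authenticationInfo?.principalEmail;
      const instance = payload?.resourceName;
      console.log(`${instance} was created by ${user}`);
      // ...then call the Compute Engine API to set a label such as created-by on that instance
    }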
I think this might be a very basic question (or might not be). I want to use Apex for my Lambda functions, since I need just Lambdas and not API Gateway, so I don't want to use the bulkier Serverless Framework.
But just creating Lambdas won't help; I also need to schedule some events and triggers.
Can someone out there help me with that?
Apex uses Terraform to build AWS infrastructure. It supports building CloudWatch Events rules, which are used to trigger Lambda on a cron schedule.
Here are the docs from Terraform: https://www.terraform.io/docs/providers/aws/r/cloudwatch_event_rule.html
And this is the S3 bucket notification resource, which allows you to set up S3-event triggers to Lambda: https://www.terraform.io/docs/providers/aws/r/s3_bucket_notification.html
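If it helps to see the shape of such a rule outside Terraform, here is the same idea expressed with the AWS CDK in TypeScript (names and the schedule are placeholders): a CloudWatch Events / EventBridge rule on a cron schedule with the Lambda as its target, equivalent in spirit to aws_cloudwatch_event_rule plus aws_cloudwatch_event_target.

    // Sketch (CDK, TypeScript): cron-scheduled rule that invokes a Lambda.
    import { App, Stack, aws_events as events, aws_events_targets as targets, aws_lambda as lambda } from "aws-cdk-lib";

    const app = new App();
    const stack = new Stack(app, "CronStack");

    const taskFn = new lambda.Function(stack, "TaskFn", {
      runtime: lambda.Runtime.NODEJS_18_X,
      handler: "index.handler",
      code: lambda.Code.fromInline("exports.handler = async () => {};"),
    });

    // Fire every day at 12:00 UTC and invoke the Lambda
    new events.Rule(stack, "DailyCron", {
      schedule: events.Schedule.cron({ minute: "0", hour: "12" }),
      targets: [new targets.LambdaFunction(taskFn)],
    });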