I know you can trigger SageMaker pipelines with all kinds of events. Can you also trigger a SageMaker pipeline when another pipeline finishes its execution?
Yes, use an Amazon EventBridge event, such as
{
  "source": [
    "aws.sagemaker"
  ],
  "detail-type": [
    "SageMaker Model Building Pipeline Execution Status Change"
  ],
  "detail": {
    "currentPipelineExecutionStatus": [
      "Succeeded"
    ]
  }
}
Then set the next pipeline as the target of the EventBridge rule.
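For completeness, here is a minimal boto3 sketch of wiring this up; the rule name, role ARN, downstream pipeline ARN, and pipeline parameter are placeholders, and the SageMakerPipelineParameters block is only needed if the downstream pipeline takes parameters.

import json
import boto3

events = boto3.client("events")

# Event pattern: fire when the upstream pipeline finishes successfully
pattern = {
    "source": ["aws.sagemaker"],
    "detail-type": ["SageMaker Model Building Pipeline Execution Status Change"],
    "detail": {"currentPipelineExecutionStatus": ["Succeeded"]},
}

events.put_rule(
    Name="trigger-downstream-pipeline",  # placeholder rule name
    EventPattern=json.dumps(pattern),
    State="ENABLED",
)

# Target the downstream pipeline; the role must allow sagemaker:StartPipelineExecution
events.put_targets(
    Rule="trigger-downstream-pipeline",
    Targets=[{
        "Id": "downstream-pipeline",
        "Arn": "arn:aws:sagemaker:us-east-1:111122223333:pipeline/downstream",        # placeholder
        "RoleArn": "arn:aws:iam::111122223333:role/eventbridge-invoke-sagemaker",     # placeholder
        "SageMakerPipelineParameters": {
            "PipelineParameterList": [
                {"Name": "InputPrefix", "Value": "s3://bucket/prefix"}  # hypothetical parameter
            ]
        },
    }],
)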
You can use any EventBridge event to trigger a pipeline. Since EventBridge supports SageMaker pipeline status changes as an event source, you should be able to trigger one pipeline from another.
Yes, use an Amazon EventBridge event, such as
{
  "source": ["aws.sagemaker"],
  "detail-type": ["SageMaker Model Building Pipeline Execution Status Change"],
  "detail": {
    "currentPipelineExecutionStatus": ["Succeeded"],
    "previousPipelineExecutionStatus": ["Executing"],
    "pipelineArn": ["your-pipelineArn"]
  }
}
Related
I have a setup wherein I need to trigger a Lambda function when my Glue crawler has run and data is ready in Redshift. Is there a way to create such a trigger?
Edit:
I added an EventBridge rule for crawler state change; that works and triggers the Lambda function, but it fires when any of my crawlers runs. I want to isolate it so it triggers only after a specific crawler has run. I tested the rule below, but it doesn't seem to pick up my crawler name. Is there another way to specify the crawler name in the rule, or am I making a syntax error?
{
  "source": ["aws.glue"],
  "detail-type": ["Glue Crawler State Change"],
  "eventName": "crawler_name",
  "detail": {
    "state": ["Succeeded"]
  }
}
Solution: add an EventBridge rule with the following event pattern:
{
  "source": ["aws.glue"],
  "detail-type": ["Glue Crawler State Change"],
  "detail": {
    "crawlerName": ["newton_pfi_new_raw_to_source"],
    "state": ["Succeeded"]
  }
}
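As a rough boto3 sketch, creating that rule and pointing it at the Lambda could look like this (the rule name, function name, and ARNs are placeholders); unlike service targets such as SageMaker or Step Functions, a Lambda target needs a resource-based permission rather than a target role:

import json
import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

pattern = {
    "source": ["aws.glue"],
    "detail-type": ["Glue Crawler State Change"],
    "detail": {
        "crawlerName": ["newton_pfi_new_raw_to_source"],
        "state": ["Succeeded"],
    },
}

rule = events.put_rule(
    Name="crawler-succeeded",  # placeholder rule name
    EventPattern=json.dumps(pattern),
    State="ENABLED",
)

# Allow EventBridge to invoke the function (placeholder function name)
lambda_client.add_permission(
    FunctionName="my-redshift-loader",
    StatementId="crawler-succeeded-rule",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule["RuleArn"],
)

events.put_targets(
    Rule="crawler-succeeded",
    Targets=[{
        "Id": "redshift-loader",
        "Arn": "arn:aws:lambda:us-east-1:111122223333:function:my-redshift-loader",  # placeholder
    }],
)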
I want to run a CloudWatch rule once the previous Step Function completes.
This needs to be done multiple times (in other words, it should be reusable).
Example: once I trigger a Step Function and its execution completes, the next CloudWatch rule should get triggered, and so on.
Could this be done by publishing a message to SQS once the Step Function completes, and then triggering a CloudWatch event from that queue?
We can create a CloudWatch rule with Step Functions as the event source and the status SUCCEEDED:
{
  "source": [
    "aws.states"
  ],
  "detail-type": [
    "Step Functions Execution Status Change"
  ],
  "detail": {
    "status": [
      "SUCCEEDED"
    ],
    "stateMachineArn": [
      "arn:aws:states:us-east-1:555666611111:stateMachine:my-state-machine-qa"
    ]
  }
}
and add the next Step Function as the target.
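A minimal sketch of adding that target with boto3, assuming a rule created from the pattern above; the rule name, state machine ARN, and role ARN are placeholders, and the role needs states:StartExecution on the target state machine:

import boto3

events = boto3.client("events")

events.put_targets(
    Rule="my-state-machine-qa-succeeded",  # placeholder: rule created with the pattern above
    Targets=[{
        "Id": "next-state-machine",
        "Arn": "arn:aws:states:us-east-1:555666611111:stateMachine:next-state-machine-qa",  # placeholder
        "RoleArn": "arn:aws:iam::555666611111:role/eventbridge-start-execution",            # placeholder
    }],
)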
I want to be able to set up an AWS CloudWatch event rule that publishes to an SNS topic whenever one of my Step Functions completes (either success or failure). I do not want this to run for all Step Functions, but there will be an indeterminate number of them sharing a common name prefix. Ideally, I'd like to be able to do something like the pattern below, but it appears that wildcards are not allowed in event patterns. Are there any creative ways to work around this?
{
  "source": [
    "aws.states"
  ],
  "detail-type": [
    "Step Functions Execution Status Change"
  ],
  "detail": {
    "status": [
      "FAILED",
      "SUCCEEDED"
    ],
    "stateMachineArn": [
      "arn:aws:states:us-west-1:123456789012:stateMachine:Prefix-*"
    ]
  }
}
Wildcards are not supported in CloudWatch event rules, according to the official AWS forum.
You will have to add all of the ARNs to the stateMachineArn list. To do this easily, you can write a script that does the following (see the sketch after this list):
Gets all the state machine ARNs with the specific prefix.
Updates the CloudWatch event rule to include all of those ARNs.
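A rough sketch of such a script with boto3; the rule name and prefix are placeholders:

import json
import boto3

PREFIX = "Prefix-"                        # placeholder state machine name prefix
RULE_NAME = "prefix-executions-finished"  # placeholder rule name

sfn = boto3.client("stepfunctions")
events = boto3.client("events")

# 1. Collect all state machine ARNs whose name starts with the prefix
arns = []
for page in sfn.get_paginator("list_state_machines").paginate():
    arns.extend(
        sm["stateMachineArn"]
        for sm in page["stateMachines"]
        if sm["name"].startswith(PREFIX)
    )

# 2. Rewrite the rule's event pattern with the full ARN list
pattern = {
    "source": ["aws.states"],
    "detail-type": ["Step Functions Execution Status Change"],
    "detail": {
        "status": ["FAILED", "SUCCEEDED"],
        "stateMachineArn": arns,
    },
}

events.put_rule(Name=RULE_NAME, EventPattern=json.dumps(pattern), State="ENABLED")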
My solution is below:
{
  "source": ["aws.states"],
  "detail-type": ["Step Functions Execution Status Change"],
  "detail": {
    "status": ["SUCCEEDED", "FAILED"],
    "stateMachineArn": [{ "prefix": "arn:aws:states:us-west-1:123456789012:stateMachine:Prefix-" }]
  }
}
I have a requirement where I need to trigger my Lambda function when all of the Glue crawlers have run and my data is ready to be queried in Redshift.
I have set up the following AWS CloudWatch rule, but it triggers the Lambda if any of the crawlers has succeeded.
{
  "detail-type": [
    "Glue Crawler State Change"
  ],
  "source": [
    "aws.glue"
  ],
  "detail": {
    "crawlerName": [
      "crw-raw-recon-the-hive-ces-cashflow",
      "crw-raw-recon-the-hive-ces-position",
      "crw-raw-recon-the-hive-ces-trade",
      "crw-raw-recon-the-hive-ces-movement",
      "crw-raw-recon-the-hive-ces-inventory"
    ],
    "state": [
      "Succeeded"
    ]
  }
}
Now my question is: is there a way I could ensure the Lambda is triggered only when all of them have succeeded?
Also, I am not sure whether Redshift generates any similar events when it receives data.
I am trying to add some notifications to my Pipeline in AWS.
I have a build stage where I use AWS CodeBuild and I want to receive an email whenever the build fails.
I have followed the tutorial that Amazon offers, and it notifies me about the failure just fine, but I can't figure out how to include the build logs in the email.
I have created the following CloudWatch event rule, which monitors the execution of the entire pipeline:
{
  "source": [
    "aws.codepipeline"
  ],
  "detail-type": [
    "CodePipeline Pipeline Execution State Change"
  ],
  "detail": {
    "state": [
      "FAILED",
      "SUCCEEDED",
      "CANCELED"
    ],
    "pipeline": [
      "Pipeline_Trial"
    ]
  }
}
Can anyone help me figure out how to add the logs to this rule?
The event from CodePipeline does not contain the CodeBuild logs, so you can't pass them through to your email without something in the middle.
A solution could be to have your CloudWatch event target a Lambda function that looks up the logs via the CodeBuild / CloudWatch Logs API. It can then generate the email, including the logs, and send the notification via SNS.
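A minimal sketch of such a Lambda, assuming you can derive the CodeBuild build ID from the incoming event (here it is simply hard-wired as a placeholder) and that the SNS topic ARN is supplied via an environment variable:

import os
import boto3

codebuild = boto3.client("codebuild")
logs = boto3.client("logs")
sns = boto3.client("sns")

def handler(event, context):
    # Placeholder: in practice, resolve the build ID from the pipeline
    # execution in the event (e.g. via the CodePipeline API)
    build_id = "Pipeline_Trial-build:example-build-id"

    build = codebuild.batch_get_builds(ids=[build_id])["builds"][0]
    log_info = build["logs"]  # contains the CloudWatch Logs group and stream

    # Fetch the last 50 log events from the build's log stream
    tail = logs.get_log_events(
        logGroupName=log_info["groupName"],
        logStreamName=log_info["streamName"],
        limit=50,
        startFromHead=False,
    )
    log_text = "".join(e["message"] for e in tail["events"])

    # Email the log tail via SNS (placeholder topic ARN in an env var)
    sns.publish(
        TopicArn=os.environ["TOPIC_ARN"],
        Subject=f"Build {build['buildStatus']}: {event['detail']['pipeline']}",
        Message=log_text,
    )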