AWS EventBridge Pattern for ECR Scan Issues - amazon-web-services

I can't quite figure out what Event Pattern is required to trigger an EventBridge rule when an ECR scan comes back having found vulnerabilities at ANY severity level. Can anyone share an Event Pattern that allows this?

You can find sample ECR Scanning events at https://docs.aws.amazon.com/AmazonECR/latest/userguide/ecr-eventbridge.html#ecr-eventbridge-bus
{
  "version": "0",
  "id": "85fc3613-e913-7fc4-a80c-a3753e4aa9ae",
  "detail-type": "ECR Image Scan",
  "source": "aws.ecr",
  "account": "123456789012",
  "time": "2019-10-29T02:36:48Z",
  "region": "us-east-1",
  "resources": [
    "arn:aws:ecr:us-east-1:123456789012:repository/my-repo"
  ],
  "detail": {
    "scan-status": "COMPLETE",
    "repository-name": "my-repo",
    "finding-severity-counts": {
      "CRITICAL": 10,
      "MEDIUM": 9
    },
    "image-digest": "sha256:7f5b2640fe6fb4f46592dfd3410c4a79dac4f89e4782432e0378abcd1234",
    "image-tags": []
  }
}
You can create a rule that matches on values within finding-severity-counts. The EventBridge documentation on content filtering with exists matching may help: https://docs.aws.amazon.com/eventbridge/latest/userguide/content-filtering-with-event-patterns.html#filtering-exists-matching
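For example, a pattern along these lines should fire on any completed scan with at least one finding. This is a sketch, not a tested rule: it assumes the severity keys only appear when their count is non-zero (as in the sample event above), that your account supports the $or content filter, and that the listed severity names (including UNDEFINED) cover the levels ECR emits.

```json
{
  "source": ["aws.ecr"],
  "detail-type": ["ECR Image Scan"],
  "detail": {
    "scan-status": ["COMPLETE"],
    "$or": [
      { "finding-severity-counts": { "CRITICAL": [ { "exists": true } ] } },
      { "finding-severity-counts": { "HIGH": [ { "exists": true } ] } },
      { "finding-severity-counts": { "MEDIUM": [ { "exists": true } ] } },
      { "finding-severity-counts": { "LOW": [ { "exists": true } ] } },
      { "finding-severity-counts": { "INFORMATIONAL": [ { "exists": true } ] } },
      { "finding-severity-counts": { "UNDEFINED": [ { "exists": true } ] } }
    ]
  }
}
```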

Related

EventBridge pattern invalid when I add a path prefix. "Event pattern is not valid. Reason: "name" must be an object or an array at..."

I am trying to create an EventBridge rule that triggers when objects are created under a path prefix of my bucket. When I write the event pattern without the path prefix, it works. When I add the path prefix, I get a failure. I am following the official documentation for the syntax, and another SO question seems to confirm what I'm doing, but the solution doesn't work.
I am creating the rule in EventBridge: Step 2, Build event pattern > Event pattern.
Error message:
Event pattern is not valid. Reason: "name" must be an object or an array at [Source: (String)"{"source":["aws.s3"],"detail-type":["Object Created"],"detail":{"bucket":{"name":"test-test-20230118"},"object":{"key":[{"prefix":"raw"}]}}}"; line: 1, column: 83]
Unsuccessful pattern:
{
  "source": ["aws.s3"],
  "detail-type": ["Object Created"],
  "detail": {
    "bucket": {
      "name": ["test-test-20230118"]
    },
    "object": {
      "key": [{
        "prefix": "raw"
      }]
    }
  }
}
Successful pattern without prefix:
{
  "source": ["aws.s3"],
  "detail-type": ["Object Created"],
  "detail": {
    "bucket": {
      "name": ["test-test-20230118"]
    }
  }
}
Your pattern will work if you modify the sample event to match the name and prefix you're filtering on. I've not seen that error, so I'm not sure what's going on, but I think it's related to the sample event you're testing your pattern against. Start again with the sample event (I copied the sample event from Event type -> AWS events, Sample events -> Object Created, and pasted it into "Enter my own") and update resources, bucket -> name, and detail -> object -> key so your pattern will match it.
I assume "raw" is a directory in your "test-test-20230118" bucket. If that is the case, use a trailing slash, such as "raw/", as the prefix; otherwise plain prefix matching on "raw" would also match keys like "raw-backup.txt".
{
  "source": ["aws.s3"],
  "detail-type": ["Object Created"],
  "detail": {
    "bucket": {
      "name": ["test-test-20230118"]
    },
    "object": {
      "key": [{
        "prefix": "raw/"
      }]
    }
  }
}

EventBridge rule is not getting triggered via custom event pattern of parameter store

I have created an EventBridge rule with the custom event pattern below, intended to fire when the value of an SSM Parameter Store parameter is updated. When I update the value, the rule is not triggered. Am I missing anything?
{
  "version": ["0"],
  "id": ["80e9b391-6a93-413c-839a-453b528083af"],
  "source": ["aws.ssm"],
  "detail-type": ["Parameter Store Change"],
  "account": ["647587844964"],
  "time": ["2017-05-22T16:44:48Z"],
  "region": ["us-east-1"],
  "resources": ["arn:aws:ssm:us-east-1:647587844964:parameter/bfom_date"],
  "detail": {
    "operation": ["Update"],
    "name": ["bfom_date"],
    "type": ["String"],
    "description": ["Sample Parameter"]
  }
}
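A likely cause (an assumption on my part, as no accepted answer is shown here): EventBridge patterns use exact matching, and this pattern pins id and time to literal values copied from a sample event, which no real event will ever reproduce, so the rule can never match. A minimal pattern keeping only the stable fields would be:

```json
{
  "source": ["aws.ssm"],
  "detail-type": ["Parameter Store Change"],
  "detail": {
    "operation": ["Update"],
    "name": ["bfom_date"]
  }
}
```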

AWS EventBridge: How to send only 1 notification when multiple objects deleted

I use AWS EventBridge with the following settings to invoke a Lambda function. If there are three files under s3://testBucket/test/ and I delete all of them at the same time, EventBridge sends a notification that invokes the Lambda three times.
In this situation, I want only one notification, to avoid duplicate executions of the Lambda. Does anyone know how to configure EventBridge to do this?
{
  "source": [
    "aws.s3"
  ],
  "detail-type": [
    "Object Deleted"
  ],
  "detail": {
    "bucket": {
      "name": [
        "testBucket"
      ]
    },
    "object": {
      "key": [{
        "prefix": "test/"
      }]
    }
  }
}
It is not possible.
An event will be generated for each object deleted.

EventBridge Rule for findings from SecurityHub

I am trying to create an EventBridge rule for an event pattern as below.
My JSON structure:
{
  "Findings": [
    {
      "SchemaVersion": "2018-10-08",
      "Id": "arn:aws:securityhub:us-west-2:220307202362:subscription/aws-foundational-security-best-practices/v/1.0.0/EC2.9/finding/eeecfc8d-cb70-4686-8615-52d488f87959",
      "ProductArn": "arn:aws:securityhub:us-west-2::product/aws/securityhub",
      "ProductName": "Security Hub",
      "CompanyName": "AWS",
      "Region": "us-west-2",
      "GeneratorId": "aws-foundational-security-best-practices/v/1.0.0/EC2.9",
      "AwsAccountId": "220311111111",
      "Types": [
        "Software and Configuration Checks/Industry and Regulatory Standards/AWS-Foundational-Security-Best-Practices"
      ],
      "FirstObservedAt": "2021-09-27T20:01:59.019Z",
      "LastObservedAt": "2021-10-12T16:35:29.556Z",
      "CreatedAt": "2021-09-27T20:01:59.019Z",
      "UpdatedAt": "2021-10-12T16:35:29.556Z",
      "Severity": {
        "Product": 0,
        "Label": "INFORMATIONAL",
        "Normalized": 0,
        "Original": "INFORMATIONAL"
      },
      "Title": "EC2.9 EC2 instances should not have a public IPv4 address"
    }
  ]
}
My JSON structure does not look like the event pattern shown on the right-hand side of the picture above, so I thought of modifying the event pattern to match my JSON posted above. As soon as I edit the event pattern, the option on the left-hand side changes to "Custom pattern", as below.
When I try to test my JSON, it gives me the error below.
What am I missing here? How can I configure my Security Hub findings so that the rule identifies my JSON and it goes to my target (Kinesis Firehose)?
In the test event pattern, you need to write the full event, including items like version, id, and so on.
This tutorial shows a simple example (for EC2, though).
For Security Hub findings, the test event will look like the one shown in this doc.
Update:
Here is a screenshot of what I tried using your JSON. Note that the event pattern contains only "source". For the test event headers other than Findings, I took the values from the "Use sample event provided by AWS" option in the custom event dropdown.
The full test event JSON is:
{
  "version": "0",
  "id": "8e5622f9-d81c-4d81-612a-9319e7ee2506",
  "detail-type": "Security Hub Findings - Imported",
  "source": "aws.securityhub",
  "account": "123456789012",
  "time": "2019-04-11T21:52:17Z",
  "region": "us-west-2",
  "resources": ["arn:aws:securityhub:us-west-2::product/aws/macie/arn:aws:macie:us-west-2:123456789012:integtest/trigger/6294d71b927c41cbab915159a8f326a3/alert/f2893b211841"],
  "detail": {
    "Findings": [{
      "SchemaVersion": "2018-10-08",
      "Id": "arn:aws:securityhub:us-west-2:111122223333:subscription/aws-foundational-security-best-practices/v/1.0.0/EC2.9/finding/eeecfc8d-cb70-4686-8615-52d488f87959",
      "ProductArn": "arn:aws:securityhub:us-west-2::product/aws/securityhub",
      "ProductName": "Security Hub",
      "CompanyName": "AWS",
      "Region": "us-west-2",
      "GeneratorId": "aws-foundational-security-best-practices/v/1.0.0/EC2.9",
      "AwsAccountId": "220311111111",
      "Types": [
        "Software and Configuration Checks/Industry and Regulatory Standards/AWS-Foundational-Security-Best-Practices"
      ],
      "FirstObservedAt": "2021-09-27T20:01:59.019Z",
      "LastObservedAt": "2021-10-12T16:35:29.556Z",
      "CreatedAt": "2021-09-27T20:01:59.019Z",
      "UpdatedAt": "2021-10-12T16:35:29.556Z",
      "Severity": {
        "Product": 0,
        "Label": "INFORMATIONAL",
        "Normalized": 0,
        "Original": "INFORMATIONAL"
      },
      "Title": "EC2.9 EC2 instances should not have a public IPv4 address"
    }]
  }
}
You can now use Test event.
It's confusing that the event pattern and the test event differ so much; attributes like source are handled by EventBridge automatically.
For detecting a specific attribute, "Event type -> Security Hub Findings - Imported" might be useful.
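Putting the answer together: the rule's event pattern itself stays minimal, for example matching only the source and detail-type mentioned above, while the full sample event is used only on the testing side.

```json
{
  "source": ["aws.securityhub"],
  "detail-type": ["Security Hub Findings - Imported"]
}
```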

CloudWatch Event Rule and SNS for updates on ECS service

I want to receive an email every time I update my ECS service (and once the update finishes or the desired state is reached).
I thought about a CloudWatch Events rule with an SNS topic as the target (with a confirmed email address). However, it doesn't work.
This is my custom event pattern:
{
  "detail-type": [
    "ECS Update"
  ],
  "resources": [
    "arn:aws:ecs:us-east-1:aws-account:service/myService"
  ],
  "source": [
    "aws.ecs"
  ],
  "detail": {
    "clusterArn": [
      "arn:aws:ecs:us-east-1:aws-account:cluster/myCluster"
    ],
    "eventName": [
      "SERVICE_STEADY_STATE"
    ],
    "eventType": [
      "INFO"
    ]
  }
}
I also tried:
TASKSET_STEADY_STATE
CAPACITY_PROVIDER_STEADY_STATE
SERVICE_DESIRED_COUNT_UPDATED
I'm updating the service through the CLI:
aws ecs update-service --cluster myCluster --service myService --task-definition myTaskDef --force-new-deployment --desired-count 2
The event rule is enabled and its target is the SNS topic; the input is the matched event.
I don't have a clue. Am I using the wrong event name?
You can also set an email notification on the task instead of the service; note there is also a known issue regarding ECS notifications.
I was not able to make it work based on ECS service status changes, so I controlled the notification at the Lambda level. You can set this rule; it's working for me:
{
  "source": [
    "aws.ecs"
  ],
  "detail-type": [
    "ECS Service Action"
  ]
}
You can expect a bit of a delay; I have experienced this myself, and it has also been reported in a GitHub issue.
Here is the JSON event that you will receive for the above rule:
{
  "version": "0",
  "id": "c3c27e7b-abcd-efgh-c84e-highgclkl",
  "detail-type": "ECS Service Action",
  "source": "aws.ecs",
  "account": "1234567890",
  "time": "2020-06-27T00:00:00.00Z",
  "region": "us-west-2",
  "resources": [
    "arn:aws:ecs:us-west-2:1234567890:service/test"
  ],
  "detail": {
    "eventType": "INFO",
    "eventName": "SERVICE_STEADY_STATE",
    "clusterArn": "arn:aws:ecs:us-west-2:123456789:cluster/mycluster",
    "createdAt": "2020-06-27T00:00:00.00Z"
  }
}
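Since the "ECS Service Action" rule above matches every service action, the Lambda-level filtering the answer describes can be sketched like this (the function split and return values are illustrative assumptions, not part of any AWS API):

```python
def should_notify(event):
    """Decide whether this EventBridge event is the 'service reached steady state' signal."""
    detail = event.get("detail", {})
    return (event.get("detail-type") == "ECS Service Action"
            and detail.get("eventType") == "INFO"
            and detail.get("eventName") == "SERVICE_STEADY_STATE")

def handler(event, context):
    """Lambda entry point: notify only on steady state, ignore other service actions."""
    if should_notify(event):
        # In a real deployment you would publish to SNS here, e.g. with boto3:
        # boto3.client("sns").publish(TopicArn=..., Message=str(event))
        return "notified"
    return "ignored"
```

Keeping the decision in a separate pure function makes it easy to unit-test against sample events like the one above without any AWS dependencies.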
ecs_cwe_events
Alternatively, you can try task-based state changes:
{
  "source": [
    "aws.ecs"
  ],
  "detail-type": [
    "ECS Task State Change"
  ],
  "detail": {
    "lastStatus": [
      "STOPPED",
      "RUNNING"
    ],
    "clusterArn": [
      "arn:aws:ecs:us-west-2:123456789:cluster/my_cluster"
    ]
  }
}