Is it possible to capture the startTime and endTime of Lambda function executions, along with the parameters that were passed to them?
I couldn't find any state-change event configuration that could be set up to send events when a Lambda function starts or terminates.
A crude alternative is to record the parameters and start time in a database when the Lambda is invoked, and have the Lambda update the end time as the final step before it completes. This appears prone to failure scenarios, such as the function erroring out before updating the DB.
Are there other alternatives to capture this information?
AWS X-Ray may be a good solution here. It is easy to integrate and use, and you can enable it from the AWS console:
Go to your Lambda function's configuration tab.
Scroll down and, in the AWS X-Ray box, choose active tracing.
Without any configuration in the code, it will record the start_time and end_time of the function along with additional metadata. You can also integrate it as a library into your Lambda function and send additional subsegments, such as request parameters. Please check here for the documentation.
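As a rough sketch of the library route in Python (using the aws-xray-sdk package; the subsegment name and the returned payload are illustrative assumptions, not prescribed by the SDK):

import json
from aws_xray_sdk.core import xray_recorder, patch_all

# Patch supported libraries (boto3, requests, ...) so downstream calls
# show up as traced subsegments automatically.
patch_all()

def handler(event, context):
    # X-Ray already records start_time/end_time for the function's
    # segment; this subsegment just attaches the invocation parameters
    # to the trace so they are visible alongside the timings.
    xray_recorder.begin_subsegment("invocation-parameters")
    try:
        xray_recorder.put_metadata("event", event)
    finally:
        xray_recorder.end_subsegment()
    return {"statusCode": 200, "body": json.dumps({"status": "ok"})}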
Here is a sample payload:
{
  "trace_id" : "1-5759e988-bd862e3fe1be46a994272793",
  "id" : "defdfd9912dc5a56",
  "start_time" : 1461096053.37518,
  "end_time" : 1461096053.4042,
  "name" : "www.example.com",
  "http" : {
    "request" : {
      "url" : "https://www.example.com/health",
      "method" : "GET",
      "user_agent" : "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/601.7.7",
      "client_ip" : "11.0.3.111"
    },
    "response" : {
      "status" : 200,
      "content_length" : 86
    }
  },
  "subsegments" : [
    {
      "id" : "53995c3f42cd8ad8",
      "name" : "api.example.com",
      "start_time" : 1461096053.37769,
      "end_time" : 1461096053.40379,
      "namespace" : "remote",
      "http" : {
        "request" : {
          "url" : "https://api.example.com/health",
          "method" : "POST",
          "traced" : true
        },
        "response" : {
          "status" : 200,
          "content_length" : 861
        }
      }
    }
  ]
}
I'm trying to publish an event to AWS EventBridge via an API Gateway while transforming the event body using API Gateway mapping templates written in the Velocity Template Language (VTL), following this guide.
The event body looks like this:
{
  "ordersDelivered": [
    {
      "orderId": "a0874e2c-4ad3-4fda-8145-18cc51616ecd",
      "address": {
        "line2": "10 Broad Road",
        "city": "Altrincham",
        "zipCode": "WA15 7PC",
        "state": "Cheshire",
        "country": "United Kingdom"
      }
    }
  ]
}
and the VTL template like this:
#set($context.requestOverride.header.X-Amz-Target = "AWSEvents.PutEvents")
#set($context.requestOverride.header.Content-Type = "application/x-amz-json-1.1")
#set($inputRoot = $input.path('$'))
{
  "Entries": [
    #foreach($elem in $inputRoot.ordersDelivered)
    {
      "Resources" : ["$context.authorizer.clientId"],
      "Detail" : "$util.escapeJavaScript($elem)",
      "DetailType" : "OrderDelivered",
      "EventBusName" : "hk-playground-more-sole",
      "Source" : "delivery"
    }#if($foreach.hasNext),#end
    #end
  ]
}
However, on making a test call to the REST endpoint method via the API Gateway 'Test' option in the AWS console, I get a malformed request error from the EventBridge integration, as shown below:
Endpoint request body after transformations:
{
  "Entries": [
    {
      "Resources" : [""],
      "Detail" : "{orderId=a0874e2c-4ad3-4fda-8145-18cc51616ecd, address={line2=10 Broad Road, city=Altrincham, zipCode=WA15 7PC, state=Cheshire, country=United Kingdom}}",
      "DetailType" : "OrderDelivered",
      "EventBusName" : "hk-playground-more-sole",
      "Source" : "delivery"
    }
  ]
}
Sending request to https://events.{aws-region}.amazonaws.com/?Action=PutEvents
Received response. Status: 200, Integration latency: 32 ms
Endpoint response headers: {x-amzn-RequestId=6cd086bf-5147-4418-9498-b467ed2b6b58, Content-Type=application/x-amz-json-1.1, Content-Length=104, Date=Thu, 15 Sep 2022 10:17:44 GMT}
Endpoint response body before transformations: {"Entries":[{"ErrorCode":"MalformedDetail","ErrorMessage":"Detail is malformed."}],"FailedEntryCount":1}
Method response body after transformations: {"Entries":[{"ErrorCode":"MalformedDetail","ErrorMessage":"Detail is malformed."}],"FailedEntryCount":1}
The logs above suggest that the $elem object is not being converted to JSON, so instead of $util.escapeJavaScript($elem) I tried $util.toJson($elem), but that assigns an empty string to the Detail element and I get a 400 error. I have also tried changing the VTL template to read ordersDelivered directly using a JSONPath expression string:
#set($inputRoot = $input.path('$.ordersDelivered'))
{
  "Entries": [
    #foreach($elem in $inputRoot)
    {
      "Resources" : ["$context.authorizer.clientId"],
      "Detail" : "$util.escapeJavaScript($elem)",
      "DetailType" : "OrderDelivered",
      "EventBusName" : "hk-playground-more-sole",
      "Source" : "delivery"
    }#if($foreach.hasNext),#end
    #end
  ]
}
but I still get the same MalformedDetail error as above when testing. Am I missing the correct way of converting the object to JSON in the Detail element?
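For reference, EventBridge rejects any Detail value that is not a serialized JSON string, which is exactly what the {orderId=..., address={...}} map dump in the logs above violates. A minimal boto3 sketch (not the API Gateway integration itself, just an illustration of the shape the transformed request must produce):

import json
import boto3

events = boto3.client("events")

order = {
    "orderId": "a0874e2c-4ad3-4fda-8145-18cc51616ecd",
    "address": {"city": "Altrincham", "country": "United Kingdom"},
}

response = events.put_events(
    Entries=[
        {
            "Source": "delivery",
            "DetailType": "OrderDelivered",
            # Detail must be valid JSON text; a Java-style map toString
            # such as "{orderId=..., address={...}}" is rejected with
            # MalformedDetail.
            "Detail": json.dumps(order),
            "EventBusName": "hk-playground-more-sole",
        }
    ]
)
print(response["FailedEntryCount"])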
I am trying to send a binary file and string parameters to AWS API Gateway.
This is the mapping template on the API Gateway POST method:
{
  "imageFile" : $input.params('imageFile'),
  "purdueUsername" : $input.params('purdueUsername'),
  "description" : $input.params('description'),
  "price" : $input.params('price'),
  "longitude" : $input.params('longitude'),
  "latitude" : $input.params('latitude'),
  "category" : $input.params('category'),
}
Making a POST request with this template results in an error.
When I try this instead:
{
  "imageFile" : "$input.params('imageFile')",
  "purdueUsername" : "$input.params('purdueUsername')",
  "description" : "$input.params('description')",
  "price" : "$input.params('price')",
  "longitude" : "$input.params('longitude')",
  "latitude" : "$input.params('latitude')",
  "category" : "$input.params('category')",
}
I am getting empty parameters; the API is not receiving the parameters I am sending through the POST request.
How should I change the mapping template?
Note: when I only have imageFile in the mapping template and only send the binary file without the extra parameters, it works completely fine:
{
  "imageFile" : "$input.body"
}
However, I want to be able to send other parameters besides the binary file.
This is how I solved the problem: I send the binary file in the body of the POST request and the other parameters as headers.
This is the mapping template I put on AWS API Gateway:
{
  "purdueUsername" : "$input.params('purdueUsername')",
  "description" : "$input.params('description')",
  "price" : "$input.params('price')",
  "longitude" : "$input.params('longitude')",
  "latitude" : "$input.params('latitude')",
  "category" : "$input.params('category')",
  "isbnNumber" : "$input.params('isbnNumber')",
  "imageFile" : "$input.body"
}
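For illustration, a client call matching this template might look like the following (Python requests; the endpoint URL and parameter values are placeholders):

import requests

# The binary image travels in the request body (available to the
# mapping template as $input.body); every other parameter travels as
# a header, which $input.params() resolves after checking the path
# and query string.
with open("textbook.jpg", "rb") as image:
    response = requests.post(
        "https://abc123.execute-api.us-east-1.amazonaws.com/prod/listings",
        data=image,
        headers={
            "Content-Type": "application/octet-stream",
            "purdueUsername": "jdoe",
            "description": "Intro to Algorithms, good condition",
            "price": "25",
            "longitude": "-86.9212",
            "latitude": "40.4237",
            "category": "books",
            "isbnNumber": "9780262033848",
        },
    )
print(response.status_code)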
I deployed Elasticsearch and Kibana 7.10.1, and I am streaming CloudWatch metrics data (raw JSON) to Elasticsearch.
The metric raw data format looks like this:
{
  "metric_stream_name" : "metric-stream-elk",
  "account_id" : "264100014405",
  "region" : "ap-southeast-2",
  "namespace" : "AWS/DynamoDB",
  "metric_name" : "ReturnedRecordsCount",
  "dimensions" : {
    "Operation" : "GetRecords",
    "StreamLabel" : "2021-06-18T01:12:31.851",
    "TableName" : "dev-dms-iac-events"
  },
  "timestamp" : 1624924620000,
  "value" : {
    "count" : 121,
    "sum" : 0,
    "max" : 0,
    "min" : 0
  },
  "unit" : "Count"
}
I can see that these raw data are saved in Elasticsearch with a custom index name, aws-metrics-YYYY-MM-DD. Now how can I let Kibana read metrics from this index?
I don't want to use Metricbeat, because it queries metrics from AWS; my event flow streams AWS metrics into Elasticsearch. How can I achieve that?
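For context, Kibana reads an Elasticsearch index through an index pattern, which can be created in Stack Management or programmatically via the saved objects API. A sketch of the latter (Python requests; the Kibana host is a placeholder, and it assumes the timestamp field is mapped as a date in the index):

import requests

KIBANA = "http://localhost:5601"  # placeholder host

# Create an index pattern covering the daily aws-metrics-* indices so
# Discover and the visualization apps can query them. The kbn-xsrf
# header is required by Kibana for write requests.
response = requests.post(
    f"{KIBANA}/api/saved_objects/index-pattern",
    headers={"kbn-xsrf": "true"},
    json={
        "attributes": {
            "title": "aws-metrics-*",
            "timeFieldName": "timestamp",  # assumes a date-mapped field
        }
    },
)
response.raise_for_status()
print(response.json()["id"])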
I have this request JSON body:
[
  {"name" : "Ram"},
  {"name" : "Sam"}
]
and this is the input for the WireMock request.
I need to match the request even if the JSON has the same content but the values are not in the same order. For example:
[
  {"name" : "Sam"},
  {"name" : "Ram"}
]
The method I use is .withRequestBody. I tried equalToJson but it does not work. What is the matcher that checks only the JSON contents and not the order?
This can be resolved using JsonPath, which is part of the bodyPatterns matching functionality:
{
  "request" : {
    "urlPathPattern" : "/jpath/.*",
    "method" : "GET",
    "bodyPatterns" : [ {
      "matchesJsonPath" : "$[?(@.name == 'Sam')]"
    } ]
  },
  "response" : {
    "status" : 200,
    "body" : "Works"
  }
}
Using a JsonPath online evaluator, it is easy to test JsonPath expressions. For more details on what is possible, have a look here.
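As an aside, if equalToJson was rejected only because of array ordering, WireMock's equalToJson matcher also accepts an ignoreArrayOrder flag, which fits the original requirement even more directly. A sketch of the relevant bodyPatterns fragment (values taken from the question):

"bodyPatterns" : [ {
  "equalToJson" : "[ {\"name\" : \"Ram\"}, {\"name\" : \"Sam\"} ]",
  "ignoreArrayOrder" : true
} ]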
Using Amazon Data Pipeline, I'm trying to use a SqlActivity to execute some SQL on a non-Redshift data store (SnowflakeDB, for the curious). It seems like it should be possible to do that with a SqlActivity that uses a JdbcDatabase. My first warning was when the WYSIWYG editor on Amazon didn't even let me try to create a JdbcDatabase, but I plowed on anyway and just wrote and uploaded a JSON definition by hand (here's the relevant bit):
{
  "id" : "ExportToSnowflake",
  "name" : "ExportToSnowflake",
  "type" : "SqlActivity",
  "schedule" : { "ref" : "DefaultSchedule" },
  "database" : { "ref" : "SnowflakeDatabase" },
  "dependsOn" : { "ref" : "ImportTickets" },
  "script" : "COPY INTO ZENDESK_TICKETS_INCREMENTAL_PLAYGROUND FROM #zendesk_incremental_stage"
},
{
  "id" : "SnowflakeDatabase",
  "name" : "SnowflakeDatabase",
  "type" : "JdbcDatabase",
  "jdbcDriverClass" : "com.snowflake.client.jdbc.SnowflakeDriver",
  "username" : "redacted",
  "connectionString" : "jdbc:snowflake://redacted.snowflakecomputing.com:8080/?account=redacted&db=redacted&schema=PUBLIC&ssl=on",
  "*password" : "redacted"
}
When I upload this into the designer, it refuses to activate, giving me this error message:
ERROR: 'database' values must be of type 'RedshiftDatabase'. Found values of type 'JdbcDatabase'
The rest of the pipeline definition works fine without any errors. I've confirmed that it activates and runs to success if I simply leave this step out.
I am unable to find a single mention on the entire Internet of someone actually using a JdbcDatabase from Data Pipeline. Does it just plain not work? Why is it even mentioned in the documentation if there's no way to actually use it? Or am I missing something? I'd love to know if this is a futile exercise before I blow more of the client's money trying to figure out what's going on.
In your JdbcDatabase you need to have the following property:
jdbcDriverJarUri: "[S3 path to the driver jar file]"
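Applied to the definition above, the JdbcDatabase object would look something like this (the S3 path is a placeholder for wherever you host the Snowflake JDBC driver JAR):

{
  "id" : "SnowflakeDatabase",
  "name" : "SnowflakeDatabase",
  "type" : "JdbcDatabase",
  "jdbcDriverClass" : "com.snowflake.client.jdbc.SnowflakeDriver",
  "jdbcDriverJarUri" : "s3://your-bucket/drivers/snowflake-jdbc.jar",
  "username" : "redacted",
  "connectionString" : "jdbc:snowflake://redacted.snowflakecomputing.com:8080/?account=redacted&db=redacted&schema=PUBLIC&ssl=on",
  "*password" : "redacted"
}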