Scala: Get URL Parameters - amazon-web-services

I'm writing my first Scala Lambda, and I have run into a bit of an issue which I think should be straightforward, but I'm having trouble finding the answer.
So I have the following code that will allow me to accept json and use it in the lambda.
--eventTest.scala
val stream: InputStream = getClass.getResourceAsStream("/test_data/body.json")
--request handler
def handleRequest(input: InputStream): Unit = {
  val name = scalaMapper.readValue(input, classOf[NameInfo])
  val result = s"Hello there, ${name.firstName} ${name.lastName}."
  println(result)
}
This works just fine, but I'm having problems figuring out how to get URL parameters. Does it automatically use the same InputStream? There seems to be very little documentation on this for Scala.
Thanks

A Lambda function's event is a JSON object. The Lambda runtime will introspect the handler function and attempt to extract or convert that object based on the function signature. I believe the easiest representation is a java.util.Map[String, String] (IIRC, Lambda doesn't have a Scala runtime, so you'll have to use Java classes and convert them).
An example event from the API Gateway proxy integration: https://github.com/awsdocs/aws-lambda-developer-guide/blob/master/sample-apps/nodejs-apig/event.json
For more information about the Java runtime: https://docs.aws.amazon.com/lambda/latest/dg/java-handler.html
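As a sketch of what that conversion can look like: with the API Gateway proxy integration, the event arrives as nested maps, and the query string parameters sit under the "queryStringParameters" key of the event (per the sample event linked above). The helper below is illustrative, not an official AWS SDK type; it just pulls that key out of a deserialized event map and converts it to a Scala Map.

```scala
import java.util.{HashMap => JHashMap, Map => JMap}
import scala.jdk.CollectionConverters._

// Hypothetical helper: extract the query string parameters from an
// API Gateway proxy event that Lambda has deserialized into nested
// java.util maps. "queryStringParameters" is the key used by the
// proxy-integration event shape.
def queryParams(event: JMap[String, AnyRef]): Map[String, String] =
  event.get("queryStringParameters") match {
    case m: JMap[_, _] =>
      m.asInstanceOf[JMap[String, String]].asScala.toMap
    case _ => Map.empty // no query string on this request
  }
```

In a real handler you would receive the outer map from the runtime (for example by implementing RequestHandler[java.util.Map[String, AnyRef], String] from aws-lambda-java-core) and call a helper like this on it, instead of reading the raw InputStream.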

Related

Invoking binary in aws lambda with rust

So I have the following Rust AWS Lambda function:
use std::io::Read;
use std::process::{Command, Stdio};
use lambda_http::{run, service_fn, Body, Error, Request, RequestExt, Response};
use lambda_http::aws_lambda_events::serde_json::json;

/// This is the main body for the function.
/// Write your code inside it.
/// There are some code examples in the following URLs:
/// - https://github.com/awslabs/aws-lambda-rust-runtime/tree/main/examples
async fn function_handler(_event: Request) -> Result<Response<Body>, Error> {
    // Run the bundled binary and capture its stdout
    let program = Command::new("./myProgram")
        .stdout(Stdio::piped())
        .output()
        .expect("failed to execute process");
    let data = String::from_utf8(program.stdout).unwrap();
    let parsed = data.split('\n').filter(|x| !x.is_empty()).collect::<Vec<&str>>();

    // Return something that implements IntoResponse.
    // It will be serialized to the right response event automatically by the runtime
    let resp = Response::builder()
        .status(200)
        .header("content-type", "application/json")
        .body(json!(parsed).to_string().into())
        .map_err(Box::new)?;
    Ok(resp)
}
#[tokio::main]
async fn main() -> Result<(), Error> {
    tracing_subscriber::fmt()
        .with_max_level(tracing::Level::INFO)
        // disable printing the name of the module in every log line.
        .with_target(false)
        // disabling time is handy because CloudWatch will add the ingestion time.
        .without_time()
        .init();

    run(service_fn(function_handler)).await
}
The idea here is that I want to return the response from the binary in JSON format.
I'm compiling the function with cargo lambda, which produces a bootstrap file; then I'm zipping it manually, including both the bootstrap binary and the myProgram binary.
When I test my function in the Lambda console by sending an event to it, I get the response with the right headers etc., but the response body is empty.
I'm deploying my function through the AWS console, on the custom runtime on Amazon Linux 2, by uploading the zip file.
When I test locally with cargo lambda watch and cargo lambda invoke, the response body is filled with the myProgram stdout parsed to JSON.
Any ideas or thoughts on what goes wrong in the actual cloud are much appreciated!
My problem was with the dynamically linked libraries in the binary. It is actually a Python binary, and it was missing a specific version of GLIBC.
The easiest solution in my case was to compile myProgram on Amazon Linux 2.

While writing a chaincode (Hyperledger Fabric) in Golang, I am confused following the documentation

I am confused. When following the fabric-samples asset-transfer-basic chaincode (https://github.com/hyperledger/fabric-samples/blob/main/asset-transfer-basic/chaincode-go/chaincode/smartcontract.go) for writing a smart contract in Golang, the methods take ctx contractapi.TransactionContextInterface as their function parameter, but every other chaincode I can find on the internet takes stub shim.ChaincodeStubInterface as the function parameter. If I use stub as my function parameter, how can I make use of the client identity methods (the cid package)? Also, in the asset-transfer-basic code, Init/Invoke are not mentioned, and in the main function, when creating a new chaincode (assetChaincode, err := contractapi.NewChaincode(&SmartContract{})), SmartContract{} does not implement ContractInterface. I am trying to do a project on an ERC20 token to apply for jobs, so please help.
The examples you have found that include an Init and Invoke function, and take ChaincodeStubInterface as a parameter are using the older and lower-level fabric-chaincode-go API directly. The asset-transfer-basic sample is using the newer and higher-level fabric-contract-api-go API. This is built on top of the lower-level API and allows you access to all the same information as the lower-level API (note that the transaction context passed into transaction functions using the higher-level API has a GetStub method to obtain the corresponding low-level ChaincodeStubInterface).
I would recommend using the Contract API as demonstrated by the asset-transfer samples. The two approaches are functionally equivalent, but the Contract API provides a cleaner abstraction requiring less boilerplate code.

Why is the JSON output of my task being escaped by AWS Step Functions?

I have a Lambda function that runs in a step function.
The Lambda function returns a JSON string as output.
When I debug the function locally, I see that the JSON is valid but when I run the step function and get to the next step after my function, I can see that all my " have turned to \" and there is a " at the beginning and end of my JSON.
So a JSON object that looks like the following when I debug my function:
{"test":60,"test2":"30000","test3":"result1"}
Ends up looking like the following as the input of the step after my lambda:
"{\"test\":60,\"test2\":\"30000\",\"test3\":\"result1\"}"
Why does my valid JSON object end up being escaped?
How can I prevent this from happening?
The Lambda function returns a JSON string as output.
That is exactly why your JSON is being escaped - you're returning your object as a JSON string (e.g. using JSON.stringify), not as a JSON object.
The easiest way to fix this would be to just return the object & not convert the output to a JSON string. That way, it won't be escaped & will be returned, as you expect, as an object.
However, if it must stay as a JSON string for whatever reason, you can use the States.StringToJson(...) intrinsic function to unescape the escaped JSON string using the ResultSelector property of your task.
So for example, if your output is:
{
  "Payload": "{\"test\":60,\"test2\":\"30000\",\"test3\":\"result1\"}",
  ...
}
To be able to unescape the output before passing it to the next task, set the ResultSelector of your task to:
"ResultSelector": {
  "Payload.$": "States.StringToJson($.Payload)"
}
Or if using the Workflow Studio, click on the task, check the Output > Transform result with ResultSelector - optional checkbox and fill in the text box with the above ResultSelector object:
Either way, the final result of your task definition should look like this:
{
  ...
  "States": {
    "Lambda Invoke": {
      "Type": "Task",
      ...
      "ResultSelector": {
        "Payload.$": "States.StringToJson($.Payload)"
      }
    }
  }
}
The output will then be as you expect:
{
  "Payload": {
    "test": 60,
    "test2": "30000",
    "test3": "result1"
  }
}
While the answer from @Ermiya Eskandary is entirely correct, you also have a few more options that you can use to your advantage, with or without ResultSelector (if it's a stringified JSON, you pretty much have to use ResultSelector, as that answer mentioned): ResultPath and OutputPath.
If you do not need the incoming event for anything else after this Lambda, then have your Lambda return a JSON-like object (i.e. if in Python, return a dict).
In your state machine definition, include two properties in your Lambda task:
"OutputPath": "$.SomeKey",
"ResultPath": "$.SomeKey"
The SomeKey has to be the same for both.
What these two lines together in the task properties say is (ResultPath) "put the output of this Lambda in the event under the key 'SomeKey'" and then (OutputPath) "only send this key on to the next task".
If you still need the data from the input, you can use ResultPath alone, which will put the output of the Lambda under the assigned key and append it to the input event as well.
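Putting that together, a task state using both properties might look like the following sketch (the state name, SomeKey, and the surrounding state machine are placeholders, not taken from the question):

```json
"Lambda Invoke": {
  "Type": "Task",
  "Resource": "arn:aws:states:::lambda:invoke",
  "ResultPath": "$.SomeKey",
  "OutputPath": "$.SomeKey",
  "End": true
}
```

ResultPath writes the task's result under SomeKey on the input event, and OutputPath then trims the event so only that key is passed on. Drop the OutputPath line if you want the original input to travel along with the result.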
See this documentation for more info.
Newbie to Step Functions here. I noticed there are two different ways to call a Lambda from Step Functions:
The AWS SDK way, using Resource: arn:aws:states:::aws-sdk:lambda:invoke
The "optimized" way, using Resource: arn:aws:states:::lambda:invoke
I found that the optimized way does a much better job with the JSON coming back from the Python Lambda, whereas the AWS SDK way produced an escaped mess.

How to modify the HTTP::Response after it has been written to

I'm trying to write some tooling for Crystal (specifically Kemal) where I can see if the response content type is text/html and modify the response body that has already been written to the HTTP::Response before it is sent to the client, by injecting an HTML element into the existing HTML response body.
I've noticed that HTTP::Server::Response is write-only, but things like Gzip::Writer are able to modify the body.
How can I modify the HTTP::Server::Response html body before it is sent to the client?
It's written in Crystal, so let's just take a look at the source to see how others do this.
Taking the CompressHandler as an example, the basic idea is to replace the response's IO with something that allows the desired control:
context.response.output = Gzip::Writer.new(context.response.output, sync_close: true)
# ...
call_next(context)
So how can we make use of that to modify the response that's being written?
A naive (and slow) example would be to just keep hold of the original output and provide an IO::Memory instead:
client = context.response.output
io = IO::Memory.new
context.response.output = io
call_next(context)
body = io.to_s
new_body = inject_html(body)
client.print new_body
Of course that would only work when this handler comes before any handler that turns the response into non-plaintext (like the above CompressHandler).
A smarter solution would provide a custom IO implementation that just wraps the original IO, watching what's written to it and injecting whatever it wants to inject at the right point. Examples of such wrapping IOs can be found in IO::Delimited, IO::Sized and IO::MultiWriter, among others; the pattern is really common to prevent unnecessary allocations.

Create Azure EventHub Message in Azure Functions

I'm trying to do some Proof of Concept with EventHub and Azure Functions. I have a Generic Webhook Function in C# that I want to pass a message to my EventHub.
I get stuck on the parameter name given on the "Integrate" tab. If I declare that name among the parameters I have to give it a type. I can't figure out what kind of type though... I've tried:
String (other functions use that together with the direction; not applicable with webhooks).
EventData
IEnumerable
I can't get it to work. If I don't do this I get the error message:
"Missing binding argument named 'outputEventHubMessage'"
If I give it the wrong type I get the message:
"Error indexing method 'Functions.GenericWebhookCSharp1'. Microsoft.Azure.WebJobs.Host: Can't bind to parameter."
I'm probably a bit lost in documentation or just a bit tired, but I'd appreciate any help here!
/Joakim
Likely you're just missing the out keyword on your parameter. Below is a working WebHook function that declares an out string message parameter that is mapped to the EventHub output, and writes an EventHub message via message = "Test Message".
Because async functions can't return out parameters, I made this function synchronous (returning an object rather than Task<object>). If you want to remain async, rather than use an out parameter, you can instead bind to an IAsyncCollector<string> parameter. You can then enqueue one or more messages by calling the AddAsync method on the collector.
More details on the EventHub binding and the types it supports can be found here. Note that the other bindings follow the same general patterns.
#r "Newtonsoft.Json"

using System;
using System.Net;
using Newtonsoft.Json;

public static object Run(HttpRequestMessage req, out string message, TraceWriter log)
{
    string jsonContent = req.Content.ReadAsStringAsync().Result;
    dynamic data = JsonConvert.DeserializeObject(jsonContent);
    log.Info($"Webhook was triggered! Name = {data.first}");

    message = "Test Message";

    var res = req.CreateResponse(HttpStatusCode.OK, new {
        greeting = $"Hello {data.first} {data.last}!"
    });
    return res;
}