Need xUnit test cases for SignalR trigger with Kafka - unit-testing

I have two methods in my Azure Functions app:
SignalR to Kafka - writes data to Kafka using an HTTP trigger
Kafka to SignalR - reads data from Kafka using a Kafka trigger
I want to write xUnit test cases for both methods.
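A minimal sketch of how the HTTP-triggered method could be unit tested, assuming a hypothetical static class SignalRToKafkaFunction whose Run method takes an HttpRequest, a Kafka output collector, and an ILogger (adjust the names and signature to match your actual functions):

using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging.Abstractions;
using Xunit;

// Simple fake that records everything the function writes to the Kafka output binding.
class CollectingAsyncCollector<T> : IAsyncCollector<T>
{
    public List<T> Items { get; } = new List<T>();
    public Task AddAsync(T item, CancellationToken ct = default) { Items.Add(item); return Task.CompletedTask; }
    public Task FlushAsync(CancellationToken ct = default) => Task.CompletedTask;
}

public class SignalRToKafkaTests
{
    [Fact]
    public async Task Run_ForwardsHttpBodyToKafka()
    {
        // Build a fake HTTP request carrying the payload.
        var context = new DefaultHttpContext();
        context.Request.Body = new MemoryStream(Encoding.UTF8.GetBytes("hello"));

        var collector = new CollectingAsyncCollector<string>();

        // Invoke the function directly; no Functions host or broker required.
        await SignalRToKafkaFunction.Run(context.Request, collector, NullLogger.Instance);

        Assert.Single(collector.Items);
        Assert.Equal("hello", collector.Items[0]);
    }
}

The Kafka-triggered method can be exercised the same way: call it directly with a test message (a string or KafkaEventData, depending on your binding) and a NullLogger, then assert on whatever it hands to the SignalR output collector.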

Related

How to load test an asynchronous pipeline?

I have the following pipeline in AWS
API-Gateway -> Lambda -> Kafka(Amazon MSK) -> Consumers
I need to load test the entire pipeline and each component in the pipeline (to identify bottlenecks)
I don't have any prior experience in load testing, so I didn't know where to start. Some blogs mentioned that JMeter can be used for load testing, but I later learned that the pipeline is asynchronous and supposedly can't be tested with JMeter.
How can I load test the pipeline? Is there any standard way to do it?
Any help is greatly appreciated. Thanks!
You can use any load testing tool that is capable of sending a message to the API Gateway and then reading it from Kafka.
JMeter is capable of both:
API Gateway: Building a WebService Test Plan
Kafka consumer: Apache Kafka - How to Load Test with JMeter
If you want to measure the cumulative duration of a request from the API call until the message gets "consumed", use a Transaction Controller.
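If you'd rather hand-roll the end-to-end measurement, the idea is to tag each request with a unique id, POST it to the gateway, and time how long the tagged message takes to appear on the output topic. Below is a rough C# sketch using HttpClient and the Confluent.Kafka client; the URL, topic, and broker address are placeholders to replace with your own.

using System;
using System.Diagnostics;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Confluent.Kafka;

class EndToEndLatencyProbe
{
    // Placeholders: point these at your API Gateway stage and MSK cluster.
    const string ApiUrl = "https://example.execute-api.us-east-1.amazonaws.com/prod/ingest";
    const string Topic = "pipeline-output";
    const string Brokers = "b-1.example.kafka.us-east-1.amazonaws.com:9092";

    static async Task Main()
    {
        using var http = new HttpClient();
        using var consumer = new ConsumerBuilder<Ignore, string>(new ConsumerConfig
        {
            BootstrapServers = Brokers,
            GroupId = "latency-probe",
            AutoOffsetReset = AutoOffsetReset.Latest
        }).Build();
        consumer.Subscribe(Topic);
        consumer.Consume(TimeSpan.FromSeconds(5)); // warm-up: wait for partition assignment

        // Tag the request with a unique id so it can be matched on the consumer side.
        var id = Guid.NewGuid().ToString("N");
        var sw = Stopwatch.StartNew();
        await http.PostAsync(ApiUrl,
            new StringContent($"{{\"probeId\":\"{id}\"}}", Encoding.UTF8, "application/json"));

        // Poll the output topic until the tagged message makes it through the pipeline.
        while (true)
        {
            var cr = consumer.Consume(TimeSpan.FromSeconds(30));
            if (cr == null) throw new TimeoutException("Tagged message never reached Kafka.");
            if (cr.Message.Value.Contains(id))
            {
                Console.WriteLine($"End-to-end latency: {sw.ElapsedMilliseconds} ms");
                break;
            }
        }
    }
}

Run many of these probes in parallel, ramping the concurrency up, to turn the latency probe into a load test and watch where the latency starts to climb.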

Unit and Integration Test for Azure Function with ServiceBusTrigger

I have an Azure Function which is triggered by an Azure Service Bus Queue.
The function is below.
How can this Run method be unit tested?
And how can an integration test be done, starting with the AddContact trigger, checking the logic in the method and the data being sent to a blob via the output binding?
public static class AddContactFunction
{
    [FunctionName("AddContactFunction")]
    public static void Run([ServiceBusTrigger("AddContact", Connection = "AddContactFunctionConnectionString")]string myQueueItem, ILogger log)
    {
        log.LogInformation($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
    }
}
I had the exact same doubts.
Adding unit tests is not too complicated; at the end of the day it's a function, so all we have to do is call the Azure Function with an appropriate string for the myQueueItem parameter.
Adding integration tests needs some additional groundwork. In the GitHub project, the author uses the TestFunctionHost class from the Azure/azure-functions-host project.
I tried following this strategy, but the amount of code needed to set all of this up is uncomfortably high for my liking. Not a lot of it is well documented, and some of it requires developers to use the Azure App Service MyGet feed.
I wanted a simpler approach, and thankfully I found one.
Azure Functions is built on top of the Azure WebJobs SDK package and leverages its JobHost class to run. So in our integration tests, all we need to do is set up this host and tell it where to look for the Azure Functions to load and run.
IHost host = new HostBuilder()
    .ConfigureWebJobs()
    // Point the test host at the class that contains the Azure Functions under test.
    .ConfigureDefaultTestHost<CLASS_CONTAINING_THE_AZURE_FUNCTIONS>(webjobsBuilder => {
        webjobsBuilder.AddAzureStorage();
        webjobsBuilder.AddServiceBus();
    })
    .ConfigureServices(services => {
        // 'resolver' is an INameResolver you supply to resolve %placeholders% in trigger attributes.
        services.AddSingleton<INameResolver>(resolver);
    })
    .Build();

using (host) {
    await host.StartAsync();
    // ..
}
...
Once this is done, we can send messages to Service Bus and our Azure Functions will get triggered. One can even set breakpoints in the functions being tested and debug issues!
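For example, sending the trigger message with the Azure.Messaging.ServiceBus client looks roughly like this (a sketch; the connection string and queue name come from your test configuration):

using Azure.Messaging.ServiceBus;

// Send a message to the AddContact queue so the ServiceBusTrigger fires inside the test host.
await using var client = new ServiceBusClient(connectionString);
ServiceBusSender sender = client.CreateSender("AddContact");
await sender.SendMessageAsync(new ServiceBusMessage("{\"name\":\"test contact\"}"));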
I have blogged about the whole process here and I have also created a GitHub repository at this link, to showcase test-driven development with Azure Functions.
How can this Run method be unit tested?
The method is a public static method. You can unit test it by invoking the static method AddContactFunction.Run(/* parameters */); you will not need a Service Bus namespace or a real message for that matter, as your function expects to receive a string from the SDK, which you can provide yourself to verify the logic works as expected.
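As a concrete illustration, a minimal xUnit test might look like this (using NullLogger from Microsoft.Extensions.Logging.Abstractions to satisfy the ILogger parameter):

using Microsoft.Extensions.Logging.Abstractions;
using Xunit;

public class AddContactFunctionTests
{
    [Fact]
    public void Run_ProcessesQueueMessage_WithoutThrowing()
    {
        // No Service Bus involved: pass the message body and a no-op logger directly.
        var exception = Record.Exception(
            () => AddContactFunction.Run("{\"name\":\"test\"}", NullLogger.Instance));

        Assert.Null(exception);
    }
}

Since the sample function only logs, there is little to assert beyond "it does not throw"; with real business logic you would assert on the function's observable effects instead.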
And how can an integration test be done, starting with the AddContact trigger, checking the logic in the method and the data being sent to a blob via the output binding?
This is a much more sophisticated scenario. It requires running the Functions runtime and generating a real Service Bus message to trigger the function, as well as validating that the blob was written. There is no integration/end-to-end testing framework shipped with Functions, so you'd need to come up with something custom. Azure Functions Core Tools could be helpful to achieve that.

Mocking Kafka APIs for Unit Testing

I want to mock the Confluent Kafka APIs for Consumer and Producer in Go for unit testing. Is there any way (process/steps/library) to mock them successfully?
producer_test.go in the kafka package has a useful test producer:
// Very short socket and message timeouts make produce attempts fail fast
// without a real broker (qualify with kafka. when used outside the package).
p, err := kafka.NewProducer(&kafka.ConfigMap{
    "socket.timeout.ms":  10,
    "message.timeout.ms": 10})
https://github.com/confluentinc/confluent-kafka-go/blob/master/kafka/producer_test.go

Mock SQS Config

I've created a Mule application with the following SQS configuration, which lives in my config.xml file.
<sqs:config name="amazonSQSConfiguration"
            accessKey="${aws.sqs.accessKey}"
            secretKey="${aws.sqs.secretKey}"
            url="${aws.sqs.baseUrl}"
            region="${aws.account.region}"
            protocol="${aws.sqs.protocol}"
            doc:name="Amazon SQS: Configuration">
    <reconnect count="5" frequency="1000"/>
</sqs:config>
This is problematic for me because when this SQS configuration loads, it tries to connect to the Amazon SQS queue but can't, since the queue is not accessible from my machine.
For MUnit testing purposes, I'm looking for a way to stop it from trying to connect on load.
Is there a way I can mock this sqs:config?
Please note this is different from mocking the connector in my flow; in this case I need to mock the config itself.
I'm also happy to hear any other suggestions.
Thanks!

Schedule a Webservice call using spring integration

I want to schedule a web service call (by date and time) using Spring Integration. I am planning to use the configuration below to invoke the REST web service. I am new to web services and SI. Could anyone help me come up with a scheduler to do this?
<int-http:outbound-gateway request-channel="sampleRequestChannel"
                           reply-channel="sampleReplyChannel"
                           url="http://<server details>"
                           http-method="POST"
                           expected-response-type="java.lang.String"/>
To read the data from the DB there are JDBC adapters. One of them is the <int-jdbc:inbound-channel-adapter>, for example:
<!-- query, data source, and channel names here are hypothetical placeholders -->
<int-jdbc:inbound-channel-adapter query="SELECT run_at FROM schedule" data-source="dataSource" channel="sampleRequestChannel">
    <int:poller fixed-delay="5000"/>
</int-jdbc:inbound-channel-adapter>
with which you can poll a table in the DB periodically for fresh date and time values and send them as the payload into the Spring Integration flow.
Another is the <int-jdbc:outbound-gateway>, which is driven by the upstream flow and can be triggered by a user event.