Camunda incident integration with PagerDuty

Is there a way to integrate Camunda BPMN workflows with PagerDuty? I would like all incidents to be forwarded to PagerDuty. Any references?

You can hook into various events via a generic listener mechanism, e.g. https://docs.camunda.org/manual/latest/user-guide/spring-boot-integration/the-spring-event-bridge/
From there you can call the PagerDuty REST API:
https://developer.pagerduty.com/api-reference/
If you have more complex requirements or want to get fancy, you could also consider a custom incident handler: https://docs.camunda.org/manual/latest/user-guide/process-engine/incidents/
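To make the first approach concrete, here is a minimal sketch assuming the Spring event bridge's history eventing is enabled (`camunda.bpm.eventing.history=true`) and using PagerDuty's Events API v2. The routing key and payload fields are placeholders, and the history entity class comes from the engine's internal API, so treat this as a starting point rather than a finished integration:

```java
import org.camunda.bpm.engine.impl.history.event.HistoricIncidentEventEntity;
import org.camunda.bpm.engine.impl.history.event.HistoryEvent;
import org.springframework.context.event.EventListener;
import org.springframework.stereotype.Component;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

@Component
public class PagerDutyIncidentListener {

    private static final String EVENTS_API = "https://events.pagerduty.com/v2/enqueue";
    private static final String ROUTING_KEY = "YOUR_ROUTING_KEY"; // placeholder

    private final HttpClient http = HttpClient.newHttpClient();

    // The Spring event bridge re-publishes engine history events as Spring events.
    @EventListener
    public void onHistoryEvent(HistoryEvent event) throws Exception {
        if (!(event instanceof HistoricIncidentEventEntity)) {
            return; // only forward incident events
        }
        HistoricIncidentEventEntity incident = (HistoricIncidentEventEntity) event;
        String body = "{"
                + "\"routing_key\":\"" + ROUTING_KEY + "\","
                + "\"event_action\":\"trigger\","
                + "\"payload\":{"
                + "\"summary\":\"Camunda incident: " + incident.getIncidentType() + "\","
                + "\"source\":\"" + incident.getProcessDefinitionKey() + "\","
                + "\"severity\":\"error\"}}";
        http.send(HttpRequest.newBuilder(URI.create(EVENTS_API))
                        .header("Content-Type", "application/json")
                        .POST(HttpRequest.BodyPublishers.ofString(body))
                        .build(),
                HttpResponse.BodyHandlers.discarding());
    }
}
```

Using the Events API (rather than the REST API) keeps the service free of user credentials; only the integration routing key is needed.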


How to externalize API Manager Analytics

Is there a way to externalize API Manager Analytics without using Choreo Cloud?
In a situation where we don't have an ELK stack, can we use custom log files, CSV, or a database to store the Analytics information so we can run custom reports against it?
Can we write a custom handler to write Analytics information into a different external source?
WSO2 API Manager, by default, uses log files to record API Analytics information. It logs all successful, faulty, and throttled events into the log file. With the on-premises Analytics option, we can use the ELK stack; in that approach, Filebeat and Logstash read those logs and filter out the Analytics-related information.
There is no out-of-the-box option to plug in a custom log file, CSV, or a database as the destination for Analytics information. However, we can write a custom handler/event publisher (as demonstrated in [1]) that reads those log files, collects the Analytics-related information, and writes it to a different format such as CSV or a database. Instead of publishing the already available analytics event data, it is also possible to publish custom analytics data using the existing event schema, as demonstrated in [2].
But this requires a lot of effort and amounts to implementing a new Analytics option from scratch: the custom event publisher is responsible only for publishing the events, so we would still need some way of visualizing them.
As of now, there are only two options for API Manager Analytics: Choreo Cloud (cloud) or the ELK stack (on-premises).
[1] https://apim.docs.wso2.com/en/latest/api-analytics/samples/publishing-analytics-events-to-external-systems/
[2] https://apim.docs.wso2.com/en/latest/api-analytics/samples/publishing-custom-analytics-data/
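As a rough illustration of the custom-publisher idea from [1], here is a minimal Java sketch that reads the analytics log and re-emits selected fields as CSV rows. The log path, the `apimMetrics:` marker, and the JSON field names are assumptions about a typical APIM 4.x deployment; verify them against your own log format:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

/**
 * Sketch: read APIM analytics log lines and re-publish selected
 * fields as CSV rows. Path, marker, and field names are assumptions.
 */
public class LogToCsvPublisher {
    private static final String MARKER = "apimMetrics:"; // assumed log marker

    public static void main(String[] args) throws IOException {
        Path log = Paths.get("repository/logs/apim_metrics.log"); // assumed path
        try (Stream<String> lines = Files.lines(log)) {
            List<String> rows = lines
                    .filter(l -> l.contains(MARKER))
                    .map(LogToCsvPublisher::toCsvRow)
                    .collect(Collectors.toList());
            Files.write(Paths.get("analytics.csv"), rows,
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        }
    }

    // Naive field extraction; a real implementation should use a JSON parser.
    private static String toCsvRow(String line) {
        String json = line.substring(line.indexOf(MARKER) + MARKER.length());
        return extract(json, "apiName") + "," + extract(json, "proxyResponseCode");
    }

    private static String extract(String json, String key) {
        int i = json.indexOf("\"" + key + "\"");
        if (i < 0) return "";
        int start = json.indexOf(':', i) + 1;
        int end = start;
        while (end < json.length() && json.charAt(end) != ',' && json.charAt(end) != '}') {
            end++;
        }
        return json.substring(start, end).replace("\"", "").trim();
    }
}
```

As the answer notes, this only gets the raw events out; dashboards and reports would still have to be built on top of whatever store the rows land in.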

When to use Firestore vs Pub/Sub

Can you elaborate on the differences between Pub/Sub and Firestore and provide some scenarios or use cases for choosing one over the other?
I'm not sure which one to use for building an app for a food delivery service, such as UberEats, that needs real-time updates reflected as soon as they are added to or changed in the database, ensuring that customers and drivers know when food is ready for pickup and when it is in transit to its destination.
The difference is quite simple:
- Firestore (RealtimeDB) is for backend-to-frontend (customers/users) communication and realtime updates.
- Pub/Sub is a backend-to-backend message bus for async processing.
In your use case, you won't use Pub/Sub to send notifications to your users! Use Firestore's realtime updates instead.
Pub/Sub is like a notification system wherein you receive updates when something is added, changed, or removed.
Firestore, on the other hand, is a NoSQL database for mobile (Android, iOS) and web apps that can be accessed directly via native SDKs. It supports many data types, from simple strings to complex objects, as well as whatever data structure works best for your app.
It is best to use Firestore for your app as it provides realtime updates.
You can check the detailed documentation of Pub/Sub and Firestore.
For Firestore, you can use either the mobile/web client libraries or the server client libraries.
The Firestore documentation also covers its benefits and key features.
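As a concrete illustration of the Firestore recommendation, here is a minimal sketch using the Firestore Java server client. The `orders` collection, the `status` field, and the document ID are made-up names for the delivery scenario:

```java
import com.google.cloud.firestore.*;
import java.util.Collections;

public class OrderStatusExample {
    public static void main(String[] args) throws Exception {
        // Uses Application Default Credentials from the environment.
        Firestore db = FirestoreOptions.getDefaultInstance().getService();

        // Hypothetical document tracking one delivery order.
        DocumentReference order = db.collection("orders").document("order-123");

        // Clients (the mobile/web SDKs work the same way) attach a snapshot
        // listener and are pushed every change in realtime.
        ListenerRegistration registration = order.addSnapshotListener((snapshot, error) -> {
            if (error != null) {
                System.err.println("Listen failed: " + error);
            } else if (snapshot != null && snapshot.exists()) {
                // e.g. show the customer READY_FOR_PICKUP or IN_TRANSIT
                System.out.println("Order status: " + snapshot.getString("status"));
            }
        });

        // The backend updates the status; every listener sees it immediately.
        order.set(Collections.singletonMap("status", "IN_TRANSIT"), SetOptions.merge()).get();

        Thread.sleep(2000);    // demo only: give the listener time to fire
        registration.remove(); // detach the listener
        db.close();
    }
}
```

Pub/Sub, by contrast, would sit between backend services (e.g. the order service publishing an event that a billing service consumes); it has no notion of end-user clients subscribing from a mobile app.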

Is there any equivalent feature to BPMN UserTask available in AWS Step functions?

We have an existing Camunda-based Spring Boot application, currently deployed to Kubernetes running on an AWS EC2 instance. This application acts as the backend for an Angular-based UI application.
Now we need to develop a new application similar to the above, which also needs to interact with a UI.
Our process flow will contain some BPMN UserTasks, which wait until a human user performs a manual interaction via the Angular UI.
We are evaluating the possibility of using AWS Step Functions instead of Camunda.
I googled but was unable to find a concrete answer.
Does AWS Step Functions have any feature similar to BPMN/Camunda's UserTask?
Short answer: No.
====================================
Long answer:
After a whole day of study, I decided to continue with Camunda BPM for the reasons below.
- AWS Step Functions has no equivalent of the BPMN UserTask.
- Step Functions supports minimal human intervention by sending emails/messages via AWS SQS (Simple Queue Service) and AWS SNS (Simple Notification Service); refer to this link for a full example. This manual interaction is based on a 'task token', so it is limited to a basic conversational style (see the sketch after this list).
- Step Functions does NOT come with built-in database and data-management support; the developer has to design the database schema, create the tables and their relationships, and so on. Camunda, on the other hand, takes care of creating the tables and their relationships and of saving and fetching data.
- No GUI modeler is available for Step Functions; instead you define the workflow in a JSON-like language, which becomes very difficult as the workflow grows complex. Drawing a workflow in Camunda is just drag-and-drop in its Modeler.
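For reference, the task-token interaction mentioned above works roughly like this: a task state declared with the `.waitForTaskToken` suffix pauses the execution and hands a token to your application (e.g. in an SQS message), and the execution resumes only when something returns that token. Here is a minimal sketch of the resume side using the AWS SDK for Java v2; the approval payload is an assumption for illustration:

```java
import software.amazon.awssdk.services.sfn.SfnClient;
import software.amazon.awssdk.services.sfn.model.SendTaskSuccessRequest;

/**
 * Sketch: resume a Step Functions execution that is paused on a
 * ".waitForTaskToken" task, once a human has approved it in the UI.
 */
public class HumanApprovalCallback {

    private final SfnClient sfn = SfnClient.create();

    /**
     * Called by the UI backend when the user clicks "approve".
     * The taskToken was delivered earlier (e.g. in an SQS message
     * published by the paused state).
     */
    public void approve(String taskToken) {
        sfn.sendTaskSuccess(SendTaskSuccessRequest.builder()
                .taskToken(taskToken)
                .output("{\"approved\": true}") // JSON handed to the next state
                .build());
    }
}
```

Even with this, there is no task lifecycle (assignment, forms, escalation) comparable to a BPMN UserTask; your application has to model all of that itself.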

Is there any way that I can read from BigQuery using a Dialogflow chatbot?

I want to implement a function that displays data in the Dialogflow chatbot, retrieved from BigQuery using a SELECT statement. Is this possible? Kindly help.
Based on what was mentioned in the comments section, Dialogflow Fulfillment seems to be exactly what you need here. When the user types an expression, Dialogflow matches the intent and sends a webhook request to the configured fulfillment function; the webhook service then performs the intended action, such as calling API services or running some other business logic.
Integrating Dialogflow with BigQuery likewise requires fulfillment code: a GCP Cloud Function that handles the communication with the BigQuery API. That said, you can use the built-in Inline Editor to write your fulfillment function, although it accepts no programming language other than Node.js.
As for the implementation, you can follow the codelabs tutorial, which covers the general workflow in detail; you can inject your own code by replacing the tutorial's addToBigQuery() function in index.js. For this purpose, the nodejs-bigquery GitHub repository contains a lot of useful code samples, in particular the generic query() function, which may match your initial aim.
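Since the Inline Editor is Node.js-only, a webhook hosted outside it can use any language. As a rough sketch of the BigQuery side in Java (the dataset, table, and column names are invented for illustration), the query result just needs to be folded into the fulfillment response text:

```java
import com.google.cloud.bigquery.*;

public class MenuLookup {
    public static void main(String[] args) throws InterruptedException {
        // Uses Application Default Credentials from the environment.
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

        // Hypothetical dataset/table; replace with your own.
        QueryJobConfiguration query = QueryJobConfiguration
                .newBuilder("SELECT item, price FROM `my_project.my_dataset.menu` LIMIT 10")
                .build();

        // Run the query and build a text reply for the chatbot.
        StringBuilder reply = new StringBuilder("Today's menu:\n");
        for (FieldValueList row : bigquery.query(query).iterateAll()) {
            reply.append(row.get("item").getStringValue())
                 .append(" - ")
                 .append(row.get("price").getStringValue())
                 .append('\n');
        }
        System.out.println(reply); // in a real webhook, return this as the fulfillment text
    }
}
```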

Using EventHub RBAC feature to publish events

I am writing a publisher service that needs to publish events to a client's event hub. Is it correct that by using RBAC I can achieve this without having to worry about SAS keys? If so, is there any sample code I can look at?
I have been looking at the GitHub samples at https://github.com/Azure/azure-event-hubs/tree/master/samples/DotNet/Rbac/EventHubsSenderReceiverRbac/ but they seem to use individual user accounts for RBAC.
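For what it's worth, RBAC-based publishing without SAS keys generally means authenticating with an Azure AD identity (a managed identity or service principal rather than a user account) that holds the "Azure Event Hubs Data Sender" role on the hub. The linked sample is .NET; here is a minimal sketch of the same idea with the Java Event Hubs client, where the namespace and hub names are placeholders:

```java
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.messaging.eventhubs.*;
import java.nio.charset.StandardCharsets;
import java.util.Collections;

public class RbacPublisher {
    public static void main(String[] args) {
        // DefaultAzureCredential resolves a managed identity, a service
        // principal (via environment variables), or a developer login -
        // no SAS key appears anywhere in the code or configuration.
        EventHubProducerClient producer = new EventHubClientBuilder()
                .credential(
                        "my-namespace.servicebus.windows.net", // placeholder namespace
                        "my-event-hub",                        // placeholder hub name
                        new DefaultAzureCredentialBuilder().build())
                .buildProducerClient();

        producer.send(Collections.singletonList(
                new EventData("hello from an RBAC-authorized publisher"
                        .getBytes(StandardCharsets.UTF_8))));
        producer.close();
    }
}
```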