Confusion about Camunda tasks vs. activities - camunda

In the latest Camunda doc pages, I noticed some confusing information in a section that talks about listeners for tasks vs. listeners for activities: https://docs.camunda.org/manual/latest/user-guide/process-applications/process-application-event-listeners/ .
For instance, this section has the following text (my bolding of two words):
The invoice process has a task named “archive invoice”. The application “invoice.war” further provides a Java Class implementing the ExecutionListener interface and is configured to be invoked whenever the END event is fired on the “archive invoice” activity.
I know that giving names to abstract terms is fraught with difficulties, but it sure seems like this text is not being clear on what's a "task" vs. an "activity".

This question was already answered by Thorben and me in the Camunda forum.
See my answer here:
An activity is the general class to which tasks, subprocesses, call activities, etc. belong. So a task is also an activity, but an activity is not necessarily a task. See the reference for a more detailed explanation.
And Thorben's addition:
In addition to Chris' explanation, the term task is overloaded in the Camunda and BPMN context. It refers to a task in the BPMN 2.0 sense (aka a design-time unit of work that is not or cannot be broken down further => service task, human task, send task, etc.) as well as a task in the tasklist (aka a runtime unit of work that needs to be completed by a human).
Hope it helps.

Related

Camunda A/B Testing

Does anybody have experience with A/B testing in Camunda? For example, if I have two external tasks, each tied to a front-end screen (say, built in React), how do we make Camunda go to one external task 80% of the time and to the other one 20% of the time?
Thank you!
Untested: assuming you are using Spring Boot, you can use an expression in the "topic" field of the external-task properties in your BPMN.
That expression could then dynamically define the topic to be used.
randomTopicGenerator could be a bean whose rollDice() method returns either topic-name-v1 or topic-name-v2 using an 80/20 random function; a rough sketch of that selection logic follows below.
You then start two external-task workers, one for each topic.
If you make the topic-name structure and the ratio configurable, you can later disable the old topic.
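For illustration only, here is a sketch of that 80/20 selection logic (shown in Python for brevity; in an actual Spring Boot setup this would live in a Java bean referenced from the topic expression, e.g. something like ${randomTopicGenerator.rollDice()}, and all names here are hypothetical):

import random

def roll_dice():
    # Return the external-task topic name: roughly 80% v1, 20% v2.
    return "topic-name-v1" if random.random() < 0.8 else "topic-name-v2"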

Is Process Mining used just to infer business process models?

I have been searching about mining event logs (process mining). I wonder if there are other uses besides inferring the process model (e.g. improving the process). Until now I haven't found any other practical application. Can someone recommend authors or publications about other applications (if they exist), or suggest keywords that I can search for to find them? Thank you!
Please take a look at this twitter thread: https://twitter.com/JorgeMunozGama/status/1236967153825275904
Many interesting applications, from soccer analysis to wind turbine monitoring.
I would suggest having a look at this wonderful book: https://www.springer.com/gp/book/9783662498507
It gives a detailed understanding of process mining and its applications.
Three uses of process mining other than inferring business process models are:
Discovering patient pathways (a patient moving between different healthcare providers or different departments in a hospital). This information may also be relevant for parties that are not healthcare providers themselves, for example the insurer. This can also help with fraud detection. For example, if the process map shows that procedure X (for example an x-ray) is usually followed by procedure Y (for example a knee operation), and the insurer finds that in certain cases procedure Y is done but X is missing, that may be indicative of a type of fraud. In this example, process mining can easily identify all the cases that had an x-ray but never showed up for a knee operation.
Discovering networks (who refers work to whom, and at what intensity). In this case you do not put the performed procedure in the 'activity' column of the event log, but instead label the 'activity' column with the name of the provider. This is also known as a 'hand-over of work' map. It is slightly different from a regular process map because it no longer visualizes activities, but instead visualizes the flow between key players.
Process mining allows for exact quantification of throughput times and bottlenecks, which regular BPMN models cannot provide (a toy illustration follows below).
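As a rough illustration of the kind of quantification meant here, a few lines of pandas over a toy event log (hypothetical data and column names) already give per-case throughput times:

import pandas as pd

# Toy event log with the usual process-mining columns.
log = pd.DataFrame({
    "case_id":   ["A", "A", "B", "B", "B"],
    "activity":  ["register", "approve", "register", "check", "approve"],
    "timestamp": pd.to_datetime([
        "2023-01-01 09:00", "2023-01-02 10:00",
        "2023-01-01 11:00", "2023-01-03 08:00", "2023-01-05 16:00"]),
})

# Throughput time per case: time of last event minus time of first event.
throughput = log.groupby("case_id")["timestamp"].agg(lambda ts: ts.max() - ts.min())
print(throughput)

Dedicated process-mining tools do the same kind of computation per activity and per path, which is how bottlenecks become visible.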
Process mining can be used to obtain process models, even if there are no event logs to be mined.
See the BPMN Sketch Miner for how to do so.

High level PHP library for Amazon SWF deciders to check state of activity tasks

I'm writing PHP for a fairly simple workflow for Amazon SWF. I've found myself starting to write a library to check if certain actions have been started or completed: essentially looping over the event list to check how things have progressed, and then starting an appropriate activity if it's needed. This can be a bit faffy at times, as the activity type and input information isn't in every event; it seems to be in the ActivityTaskScheduled event. This sort of thing I've discovered along the way, and I'm concerned that I could be missing subtle things about event lists.
It makes me suspect that someone must have already written some sort of generic library for finding the current state of various activities. Maybe even some sort of more declarative way of coding up the flowcharts that are associated with SWF. Does anything like this exist for PHP?
(Googling hasn't come up with anything)
I'm not aware of anything out there that does what you want, but you are doing it right. What you're talking about is coding up the decider, which necessarily has to look at the entire execution state (basically loop through the event list) and decide what to do next.
Here's an example written in Python (Using Amazon SWF To communicate between servers) that looks for events of type 'ActivityTaskCompleted' to decide what to do next, and then, yes, looks at the previous 'ActivityTaskScheduled' entry to figure out what the attributes of the previous task were.
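For illustration, here is a minimal sketch of that decider pattern in Python using boto3 (the domain and task-list names are placeholders, and error handling is omitted):

import boto3

swf = boto3.client("swf")

def poll_once():
    task = swf.poll_for_decision_task(
        domain="my-domain",                    # placeholder
        taskList={"name": "my-decision-list"}, # placeholder
    )
    if not task.get("taskToken"):
        return  # long poll timed out, nothing to decide

    events = task["events"]
    # Index the ActivityTaskScheduled events so completed events can be
    # traced back to the activity type and input they were scheduled with.
    scheduled = {e["eventId"]: e for e in events
                 if e["eventType"] == "ActivityTaskScheduled"}

    for event in events:
        if event["eventType"] == "ActivityTaskCompleted":
            completed = event["activityTaskCompletedEventAttributes"]
            sched = scheduled[completed["scheduledEventId"]]
            attrs = sched["activityTaskScheduledEventAttributes"]
            print("completed:", attrs["activityType"], "input:", attrs.get("input"))
            # ...decide what to schedule next, then call
            # swf.respond_decision_task_completed(taskToken=task["taskToken"],
            #                                     decisions=[...])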
If you write a PHP framework that specifies the workflow in a declarative way and a generic decider that implements it, please consider sharing it :)
I've since found https://github.com/cbalan/aws-swf-fluent-php which looks promising, but I haven't really used it, so I can't speak to whether it works or not.
I've forked it and started a bit of very light refactoring to allow some testing, available at https://github.com/michalc/aws-swf-fluent-php

Workflow frameworks for Django [closed]

I've been looking for a framework to simplify the development of reasonably complex workflows in Django applications. I'd like to be able to use the framework to automate the state transitions, permissioning, and perhaps some extras like audit logging and notifications.
I've seen some older information on the same topic, but not too much in the last 2-3 years. The major choices I've heard of are GoFlow (not updated since 2/2009) and django-workflow (seems more active).
Has anyone used these packages? Are they mature and/or compatible with modern (1.3) Django? Are there other options out there worth considering that might be better or better supported?
Let me give a few notes here, as I'm the author of django-fsm and django-viewflow, two projects that could be called "workflow libraries".
The word "workflow" itself is a bit overloaded. Different kinds of libraries and software call themselves "workflow" but have very different functionality.
The commonality is that a workflow connects the steps of some process into a whole.
General classification
As I see it, workflow implementation approaches can be classified as follows:
Single/Multiple users - Whether the workflow library automates single-user tasks or also provides permission checking and task assignment.
Sequential/Parallel - A sequential workflow is just a state-machine pattern implementation and allows a single active state at a time. Parallel workflows allow several active tasks at once, and usually provide some sort of parallel split/join functionality.
Explicit/Implicit - Whether the workflow is represented as a separate external entity, or is woven into some other class whose main responsibility is different.
Static/Dynamic - Static workflows are implemented once in Python code and then executed; dynamic workflows can typically be configured by changing the contents of workflow database tables. Static workflows are usually better integrated with the rest of the Django infrastructure (views, forms, templates) and support better customization via the usual Python constructs such as class inheritance. Dynamic workflows assume that you have a generic interface that can adapt to workflow changes at runtime.
Of these, the first two could be considered gradual differences, but the other two are fundamental.
Specific packages
Here is a brief description of what we have nowadays in the django, djangopackages and awesome-django project lists under the workflow section:
django.contrib.WizardView - implicit, single-user, sequential, static - the simplest workflow implementation we could have. It stores intermediate state in hidden form POST data.
django-flows - explicit, single-user, sequential, static workflow that keeps flow state in external storage, allowing the user to close the page or open it in another tab and continue working.
django-fsm - implicit, multi-user, sequential, static workflow - the most compact and lightweight state-machine library. State-change events are represented simply as Python method calls on the model class (see the short sketch below). Has rudimentary support for flow inheritance and overrides. Provides hooks for associating permissions with state transitions. Allows optimistic locking to prevent concurrent state updates.
django-states - explicit, multi-user, sequential, static workflow with a separate class for the state machine and state transitions. Transitions are made by passing the string name of a transition to the make_transition method. Provides a way to associate permissions with state transitions. Has a simple generic REST endpoint for changing model states via AJAX calls. State-machine inheritance support is not mentioned in the documentation, but the class-based state definition makes it possible with few or no core library modifications.
django_xworkflows - explicit, sequential, static workflow with no support for user permission checking, and a separate class for the state machine. Uses tuples for state and transition definitions, which makes workflow inheritance hard.
django-workflows - explicit, multi-user, sequential, dynamic workflow storing the state in library-provided Django models. Has a way to attach permissions to workflow transitions, and that's basically all.
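To make the django-fsm style concrete, here is a minimal sketch (model, default state and permission name are made up for illustration) showing that a state change is just a method call on the model class:

from django.db import models
from django_fsm import FSMField, transition

class BlogPost(models.Model):
    state = FSMField(default="new")

    @transition(field=state, source="new", target="published",
                permission="blog.can_publish")
    def publish(self):
        # Side effects of the transition go here; django-fsm updates the
        # state field after this method returns without raising.
        pass

# Usage: post.publish(); post.save()  -- the new state is persisted on save.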
None of these django state machine libraries have support for parallel workflows, which limits their scope of application a lot. But there are two that do:
django-viewflow - explicit, multi-user, parallel, static workflow, with support for parallel task execution and complex split and join semantics. Provides helpers for integrating with Django function-based and class-based views, different background task execution queues, and various pessimistic and optimistic locking strategies to prevent concurrent updates.
GoFlow, mentioned in the question, aims to be an explicit, multi-user, parallel, dynamic workflow, but it has been abandoned by its author for years.
I see a way to implement dynamic workflow construction on top of django-viewflow. As soon as it is completed, it will close the last and most sophisticated case for workflow implementation in the Django world.
I hope that anyone who has read this far now understands the term "workflow" better and can make a conscious choice of workflow library for their project.
Are there other options out there worth considering that might be better or better supported?
Yes.
Python.
You don't need a workflow product to automate the state transitions, permissioning, and perhaps some extras like audit logging and notifications.
There's a reason why there aren't many projects doing this.
The State design pattern is pretty easy to implement.
The Authorization rules ("permissioning") are already a first-class part of Django.
Logging is already a first-class part of Python (and has been added to Django). Using it for audit logging means either an audit table or another logger (or both).
The message framework ("notifications") is already part of Django.
What more do you need? You already have it all.
Using class definitions for the State design pattern, and decorators for authorization and logging works out so well that you don't need anything above and beyond what you already have.
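As a minimal sketch of what that can look like (all class, permission and logger names below are illustrative, not from any particular project): plain classes for the State pattern plus one decorator that does the permission check and the audit logging.

import functools
import logging

audit = logging.getLogger("audit")

def guarded(perm):
    """Check a Django permission and audit-log the transition."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(self, user, *args, **kwargs):
            if not user.has_perm(perm):   # Django's built-in authorization
                raise PermissionError(perm)
            audit.info("%s: %s by %s", self.state.name, func.__name__, user)
            return func(self, user, *args, **kwargs)
        return wrapper
    return decorator

class Draft:
    name = "draft"

class Published:
    name = "published"

class Article:
    def __init__(self):
        self.state = Draft()

    @guarded("articles.can_publish")
    def publish(self, user):
        self.state = Published()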
Read this related question: Implementing a "rules engine" in Python
It's funny because I would have agreed with S.Lott about just using Python as is for a rule engine. I have a COMPLETELY different perspective now having done it.
If you want a full rule engine, it needs quite a few moving parts. We built a full Python/Django rules engine and you would be surprised what needs to be built in to get a great rule engine up and running. I will explain further, but first, the website is http://nebrios.com.
A rule engine should at least have:
Access Control Lists - Do you want everyone seeing everything?
Key/Value pair API - KVPs store the state, and all the rules react to changed states.
Debug mode - Being able to see every changed state, what changed it and why. Paramount.
Interaction through web forms and email - Being able to quickly script a web form is a huge plus, along with parsing incoming emails consistently.
Process IDs - These track a "thread" of business value. Otherwise processes would continually overlap.
Sooo much more!
So try out Nebri, or the others I list below to see if they meet your needs.
Here's the debug mode
An auto generated form
A sample workflow rule:
class task_sender(NebriOS):
    # send a task to the person it got assigned to
    listens_to = ['created_date']

    def check(self):
        return (self.created_date is not None) and (self.creator_status != "complete") and (self.assigned is not None)

    def action(self):
        send_email(self.assigned, """
The ""{{task_title}}"" task was just sent your way!
Once you finish, send this email back to log the following in the system:
i_am_finished := true
It will get assigned back to the task creator to look over.
Thank you!! - The Nebbs
""", subject="""{{task_title}}""")
So, no, it's not simple to build a rules-based, event-based workflow engine in Python alone. We have been at it for over a year! I would recommend using tools like
http://nebrios.com
http://pyke.sourceforge.net (It's Python also!)
http://decisions.com
http://clipsrules.sourceforge.net
A package written by an associate of mine, django-fsm, seems to work--it's both fairly lightweight and sufficiently featureful to be useful.
I can add one more library which, unlike its equivalents, supports on-the-fly changes to workflow components.
Look at django-river
It now has a pretty admin interface called River Admin.
ActivFlow: a generic, light-weight and extensible workflow engine for agile development and automation of complex Business Process operations.
You can have an entire workflow modeled in no time!
Step 1: Workflow App Registration
WORKFLOW_APPS = ['leave_request']
Step 2: Activity Configuration
from django.db.models import CharField, DateField, TextField  # assumed field imports

from activflow.core.models import AbstractActivity, AbstractInitialActivity
from activflow.leave_request.validators import validate_initial_cap


class RequestInitiation(AbstractInitialActivity):
    """Leave request details"""
    employee_name = CharField(
        "Employee", max_length=200, validators=[validate_initial_cap])
    # "from" is a reserved keyword in Python, so the date fields are named *_date here
    from_date = DateField("From Date")
    to_date = DateField("To Date")
    reason = TextField("Purpose of Leave", blank=True)

    def clean(self):
        """Custom validation logic should go here"""
        pass


class ManagementApproval(AbstractActivity):
    """Management approval"""
    approval_status = CharField(verbose_name="Status", max_length=3, choices=(
        ('APP', 'Approved'), ('REJ', 'Rejected')))
    remarks = TextField("Remarks")

    def clean(self):
        """Custom validation logic should go here"""
        pass
Step 3: Flow Definition
FLOW = {
    'initiate_request': {
        'name': 'Leave Request Initiation',
        'model': RequestInitiation,
        'role': 'Submitter',
        'transitions': {
            'management_approval': validate_request,
        }
    },
    'management_approval': {
        'name': 'Management Approval',
        'model': ManagementApproval,
        'role': 'Approver',
        'transitions': None
    }
}
Step 4: Business Rules
def validate_request(self):
    return self.reason == 'Emergency'
I migrated django-goflow from Django 1.x / Python 2.x to Django 2.x / Python 3.x; the project is at django2-goflow.

Java EE -- using the same stateful object for several users

Even though I've been in Java SE for quite some time now, I started EE & web w/ Java only about a month ago, so pardon if the question seems a bit noobish...
So here's the situation: I'm trying to write a JS-based multi-player game with real-time interaction (let's say chess in this example, though it really doesn't matter what particular game it is, could be tennis or w/ever). The clients would interact with the server through JS calls, sending the move etc. Now, while I could just receive the move from one client & pass it straight on to the other player, not maintaining the game state on the server would mean putting a huge sign out saying "user JS scripts welcome" (and that's out of experience -- "hacked" a crapload of that kind myself). This brings me to my problem -- how do I share a stateful object between several sessions? One idea that came to mind was a singleton storing a HashMap of stateful beans, so that each session could retrieve its bean by hash, but I've no idea how right that is (and it seems rather complex for a fairly common thing like that). Tying it to application scope seems overkill as well...
P.S. I do understand that the object would need concurrency managing etc, I just can't seem to put my finger on how to get it shared...
EDIT: I'm sorry I didn't mention it before -- using Glassfish, EE6.
You have a business process scenario, which is defined according to the Seam framework documentation as follows:
The business process spans multiple interactions with multiple users, so this state is shared between multiple users, but in a well-defined manner. The current task determines the current business process instance, and the lifecycle of the business process is defined externally using a process definition language, so there are no special annotations for business process demarcation.
Here you can see a Seam business process management Tutorial
Notice Seam uses JBoss BPM behind the scenes to handle its business process context. If you just want to use plain JBoss BPM capabilities, you can see here how to integrate with JBoss
See also JBoss BPM User guide
Solved. I shared it via the ServletContext, which I initially thought wouldn't work because FacesServlet is a separate servlet and I assumed it might have something like a different container.