We want to collect data during the day and create a User Task once a day. How can that be done with Camunda? Is there a way to use process variables, or do we need to access our own database and mark the corresponding items as processed as soon as we create the daily User Task?
Do we need to create these User Tasks programmatically? (We are using an embedded Spring Boot Camunda instance.)
One very good option would be to use a Timer Start Event per the documentation here: https://docs.camunda.org/manual/7.10/reference/bpmn20/events/timer-events/#timer-start-event.
It seems that you may want to use that in conjunction with a Timer Intermediate Catching Event (https://docs.camunda.org/manual/7.10/reference/bpmn20/events/timer-events/#timer-intermediate-catching-event) in something like the following manner:
Start a process instance at a specific time in the morning with the Timer Start Event. Perhaps 6:30 AM in your local time zone?
Execute certain steps to gather data, perhaps through external service invocations, etc.
At a specific time (in the afternoon?), create the User Task and present the data. The User Task could follow the Timer Intermediate Catching Event noted above, as in the sketch below.
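For illustration, a trimmed BPMN snippet along those lines might look like the following; element IDs and timer expressions are placeholders to adapt, and sequence flows and diagram elements are omitted for brevity:

<bpmn:process id="dailyDataProcess" isExecutable="true">
  <bpmn:startEvent id="dailyStart" name="Every day at 6:30">
    <bpmn:timerEventDefinition>
      <!-- cron fields: second minute hour day-of-month month day-of-week -->
      <bpmn:timeCycle>0 30 6 * * *</bpmn:timeCycle>
    </bpmn:timerEventDefinition>
  </bpmn:startEvent>
  <bpmn:serviceTask id="gatherData" name="Gather data" />
  <bpmn:intermediateCatchEvent id="waitForReviewTime" name="Wait until review time">
    <bpmn:timerEventDefinition>
      <!-- e.g. wait 8 hours after the morning start; adjust as needed -->
      <bpmn:timeDuration>PT8H</bpmn:timeDuration>
    </bpmn:timerEventDefinition>
  </bpmn:intermediateCatchEvent>
  <bpmn:userTask id="reviewDailyData" name="Review daily data" />
</bpmn:process>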
I hope this helps!
Does anyone know a way to generate a record in a table without the user actively using the system?
I need to generate something similar to a notification or reminder, built from data obtained from other tables, somewhat like a report.
Thank you
To run periodic tasks, you will need some sort of task scheduler like Celery or Huey. With that in place, you can create and save instances of whatever model you have in mind from the task scripts, and the task scheduler will repeat the job periodically.
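A minimal sketch using Huey's Django integration; the model names (Notification, Report) and the query are hypothetical placeholders for whatever you actually need to create:

from huey import crontab
from huey.contrib.djhuey import db_periodic_task

from myapp.models import Notification, Report  # hypothetical models

@db_periodic_task(crontab(hour='6', minute='0'))
def create_daily_notifications():
    # Runs once a day: gather data from other tables and save new records.
    for report in Report.objects.filter(needs_notification=True):
        Notification.objects.create(
            title=f"Daily summary: {report.name}",
            body=report.summary,
        )

Celery works the same way conceptually, with a beat schedule entry in place of the crontab decorator.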
I have a requirement to capture session-level details like session start time, end time, source success rows, failed rows, etc. in an audit table. As all those details are available in prebuilt session variables, I need to store them in a table. As of now, what I am doing is taking an Assignment task in a workflow, assigning all these prebuilt session variables' values for a particular session to workflow variables, and passing these workflow variables to mapping variables in another non-reusable session (the mapping which loads the table) using the pre-session variable assignment option. It works fine for a workflow having one session. But if I have to implement this for a workflow having a larger number of sessions, the process becomes tedious, as I have to create an Assignment task for each of these sessions plus a non-reusable session which calls a mapping to load into the audit table.
So I am wondering: is there any alternative solution to get this job done? I am thinking of a solution in which we capture the audit details of all sessions in a file and pass this file as input to a mapping that loads the data into the table at once. Is this possible? Any solution?
Check this out: ETL Operational framework
It covers an end-to-end solution that should fit your needs and be quite easy to extend if you have multiple sessions - all you'd need to do is apply similar post-session commands before running the final session that loads the stats into the database.
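Informatica itself would implement this with post-session commands plus one final load session, but as a rough, language-agnostic illustration of the accumulate-then-load idea (the file name, columns, and target table below are all made up):

import csv

STATS_FILE = "session_audit.csv"

def record_session_stats(session_name, start_time, end_time,
                         success_rows, failed_rows):
    # Called once per session (in Informatica, a post-session command
    # appending the built-in session variables' values).
    with open(STATS_FILE, "a", newline="") as f:
        csv.writer(f).writerow(
            [session_name, start_time, end_time, success_rows, failed_rows]
        )

def load_audit_table(cursor):
    # Final step: bulk-load every accumulated row into the audit table.
    with open(STATS_FILE, newline="") as f:
        for row in csv.reader(f):
            cursor.execute(
                "INSERT INTO session_audit (session_name, start_time, "
                "end_time, success_rows, failed_rows) "
                "VALUES (%s, %s, %s, %s, %s)",
                row,
            )

The point is that each session only appends a line, so adding sessions costs nothing extra; only the final load step touches the database.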
I have a scenario where I have to pull data from a web service using a REST Web Service Consumer transformation. For example, the endpoint URL is http://example/2015/Q1. Here I have to parameterise 2015/Q1 as $$DATES. But I cannot change the parameter value manually. I have to design my mapping so that it dynamically keeps incrementing the dates across all runs, past to future, without manual intervention. Please suggest a way to do this.
You can have a parent workflow which dynamically creates a script with "pmcmd startworkflow" calls for all the quarters you need. The parent workflow calls the script to invoke the child workflow n times. You also need a table or file listing all the quarters, with a flag that says whether each quarter has been processed. In the child workflow (the actual one that you already have), you update the flag and mark the quarter as processed. Each run of the child workflow picks up the first unprocessed quarter and processes it. A rough sketch of the script-generation step is below.
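A minimal sketch of the script generation in Python; the quarters file layout (one "2015/Q1,N" line per quarter, N meaning unprocessed) and all pmcmd connection details are hypothetical placeholders:

# Count the quarters still flagged unprocessed, then emit one pmcmd call
# per quarter; each child run picks up the first unprocessed quarter itself.
with open("quarters.csv") as f:
    remaining = sum(1 for line in f if line.strip().endswith(",N"))

with open("run_quarters.sh", "w") as script:
    for _ in range(remaining):
        # -wait blocks until the run finishes, so the calls execute in order.
        script.write(
            "pmcmd startworkflow -sv INT_SVC -d DOMAIN -u USER -p PASS "
            "-f MY_FOLDER -wait wf_my_child_workflow\n"
        )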
Hope that helps.
I would like to use the list method in the Reports API to periodically fetch Activities of all users of some applications (e.g. 'admin' and 'login') and keep a local copy of all that data (using watch and push notifications is not an option in my particular scenario).
The idea is to define small time windows (e.g. 60 seconds) and, at the end of each time window plus some small delay, use the 'list' method with startTime and endTime set accordingly, fetching all events logged during the already-finished time window.
This way I would have an almost-real-time list of events stored locally. However, I'm not sure what minimal delay should be used to ensure that the list method will be able to fetch all events. I'm assuming some delay is required here. Am I right? If so, is there any minimum delay that guarantees all events will be fetched?
In theory you wouldn't need a delay, but 10 seconds would probably be fine if you want to be sure. Another important thing is the API quota: in this case the project would be limited to 5 queries per second. Here is the documentation on that: https://developers.google.com/admin-sdk/reports/v1/limits
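A minimal sketch of fetching one closed window with google-api-python-client; credential setup (e.g. a service account with domain-wide delegation) is assumed to exist already, and the helper name is made up:

import datetime
from googleapiclient.discovery import build

def fetch_window(creds, app_name, window_end, window_secs=60):
    # window_end must be a naive UTC datetime so the 'Z' suffix is correct.
    start = window_end - datetime.timedelta(seconds=window_secs)
    service = build('admin', 'reports_v1', credentials=creds)
    request = service.activities().list(
        userKey='all',
        applicationName=app_name,              # e.g. 'login' or 'admin'
        startTime=start.isoformat() + 'Z',
        endTime=window_end.isoformat() + 'Z',
    )
    events = []
    while request is not None:                 # follow pagination
        response = request.execute()
        events.extend(response.get('items', []))
        request = service.activities().list_next(request, response)
    return events

Calling this once per window, a few seconds after the window closes, also keeps you well under the 5 queries/second limit.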
I have a Django application.
One of my models looks like this:
class MyModel(models.Model):
    def house_cleaning(self):
        # clean up this instance's data
        ...
Every time I update an instance of MyModel, I need to clean up its data N days later. So I'd like to schedule a job to call
this_instance.house_cleaning()
N days from now.
Is there any job queue that would allow me to:
Integrate well with Django - allow me to call a method of individual model instances
Only run jobs that are scheduled to run today
Ideally handle failures gracefully
Thanks
django-chronograph might be good for your use case. If you write your cleanup jobs as Django management commands, you can then schedule them to run at some time. It runs using Unix cron behind the scenes.
Is there any reason why a cron job wouldn't work? Or something like django-cron that behaves the same way? It's pretty easy to write stand-alone Django scripts. If you want to trigger house cleaning on some change to your model after a certain number of days, why not add a date field to the model that is set to N days in the future whenever the job needs to be scheduled? You could run a script on a daily basis that pulls all records where the date is <= today, calls each instance's house_cleaning() method, and then clears the date field. If an exception is raised during the process, it's easy enough to log it or dispatch an email. A minimal sketch of such a daily script is below.
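Something like the following, written as a Django management command; the house_cleaning_due date field (nullable) is a hypothetical addition to MyModel:

import datetime
import logging

from django.core.management.base import BaseCommand

from myapp.models import MyModel  # adjust to your app

logger = logging.getLogger(__name__)

class Command(BaseCommand):
    help = "Run house_cleaning() on every instance whose due date has arrived."

    def handle(self, *args, **options):
        due = MyModel.objects.filter(
            house_cleaning_due__lte=datetime.date.today()
        )
        for instance in due:
            try:
                instance.house_cleaning()
                instance.house_cleaning_due = None   # clear the flag
                instance.save(update_fields=["house_cleaning_due"])
            except Exception:
                # One bad record shouldn't block the rest; log and move on.
                logger.exception("house_cleaning failed for pk=%s", instance.pk)

A daily cron entry pointing at manage.py then runs it, e.g. once every night.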