I am following this tutorial: https://medium.com/@naduni_pamudika/how-to-create-a-simple-bpmn-process-adc94f0b2f86
After completing all the steps I can see the BPMN project, but after starting it the project does not appear in the Claimable Task List or the Completed Task List; both lists are empty.
How can I fix this issue?
A task will be considered 'claimable' only if the Candidate User or Candidate Group of that particular task is specified.
In this case you have logged in as admin, so setting the Candidate User for the HelloWorld task to 'admin' should make the task claimable. Any other user you have defined in the BPS server can also be set as a Candidate User.
You can also make a claimable task available to an entire group of users by specifying a user role. For example, if you have defined user roles like 'employee' and 'manager', setting the Candidate Group to 'employee' makes the task claimable for all users with the 'employee' role.
I am developing a scenario with Camunda Modeler 5 and Camunda Run 7.17.
Scenario:
I am using a service task to invoke an external REST API via the HTTP connector, and that external REST API sends a response containing multiple employees' info in JSON format:
[
  {"regId":"0XFY1FX00W","fname":"abc","lname":"def","email":"abc.def@gmail.com"},
  {"regId":"0XFY1F000X","fname":"ghi","lname":"jklm","email":"ghi.jklm@ymail.com"},
  {"regId":"0XFY1F000Y","fname":"nop","lname":"qrs","email":"nop.qrs@xmail.com"},
  {"regId":"0XFY1F000Z","fname":"tuv","lname":"wxyz","email":"tuv.wxyz@zmail.com"}
]
I am trying to create 4 user tasks for those 4 employees, and each user task should be populated with that employee's first and last name.
I have used a parallel multi-instance activity and succeeded in creating the 4 user tasks, but all of them contain the last employee's details, i.e.
"fname":"tuv","lname":"wxyz".
Please let me know how to achieve this, i.e. how to make each user task contain a different employee's details.
Note: I have tested with a sequential multi-instance activity and it works as expected. My requirement is to use a user task, not a subprocess.
So I created a Python script similar to the [BQ tutorial on SQ][1]. The service account has been set via os.environ. When executing with BigQuery Admin and other similar permissions (Data User, Data Transfer Agent, Data Viewer, etc.), the scheduled query creation fails with:
status = StatusCode.PERMISSION_DENIED
details = "The caller does not have permission"
The lowest permission level it accepts is 'Project Owner'. Since this is a service account, I was hoping a lower permission level could be applied, e.g. BigQuery Admin, as all I need the service account to do is remotely create scheduled queries. Even the how-to guide says it should work. Can anyone suggest another combination of permissions that will allow this to work?
[1]: https://cloud.google.com/bigquery/docs/scheduling-queries#set_up_scheduled_queries
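For context, the creation call in question looks roughly like this: a sketch based on the linked guide, where the project, dataset, display name, and query are all placeholders, and the live API call is only attempted when credentials are configured.

```python
import os

def scheduled_query_config(query: str, dataset_id: str) -> dict:
    # Fields for a scheduled query, per the BigQuery Data Transfer API's
    # "scheduled_query" data source.
    return {
        "destination_dataset_id": dataset_id,
        "display_name": "my scheduled query",           # placeholder
        "data_source_id": "scheduled_query",
        "params": {
            "query": query,
            "destination_table_name_template": "my_table",  # placeholder
            "write_disposition": "WRITE_TRUNCATE",
        },
        "schedule": "every 24 hours",
    }

config = scheduled_query_config("SELECT 1 AS x", "my_dataset")

# The actual API call needs credentials, so only attempt it when
# GOOGLE_APPLICATION_CREDENTIALS points at the service account key.
if os.environ.get("GOOGLE_APPLICATION_CREDENTIALS"):
    from google.cloud import bigquery_datatransfer

    client = bigquery_datatransfer.DataTransferServiceClient()
    client.create_transfer_config(
        bigquery_datatransfer.CreateTransferConfigRequest(
            parent=client.common_project_path("my-project"),  # placeholder
            transfer_config=bigquery_datatransfer.TransferConfig(**config),
        )
    )
```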
I have an AWS Amplify application that has a structure with multi-organizations:
Organization A -> Content of Organization A
Organization B -> Content of Organization B
Let's say we have the user Alice. Alice belongs to both organizations, but she has different roles in each: in Organization A she is an administrator with extra privileges (e.g. she can delete content or modify others' content), while in Organization B she is a regular user.
For this reason I cannot simply use regular groups in Amplify (Cognito), because some users, like Alice, belong to different groups in different organizations.
One solution I considered was having a group for each combination of organization and role,
e.g. OrganizationA__ADMIN, OrganizationB__USER, etc.
So I could restrict the access on the schema using a group auth directive on the Content model:
{allow: group, groupsField: "group", operations: [update]},
The content would have a group field with a value: OrganizationA__ADMIN
Then I could add the user to the group using the Admin Queries API
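Adding a user to one of those combined groups would be a single call against the Admin Queries API's addUserToGroup route. A minimal sketch, assuming the endpoint URL and auth token come from the environment (both are placeholders here):

```python
import os

def add_user_to_group_request(username: str, groupname: str) -> dict:
    # Payload shape accepted by the Amplify Admin Queries API's
    # POST /addUserToGroup route.
    return {"username": username, "groupname": groupname}

payload = add_user_to_group_request("alice", "OrganizationA__ADMIN")

# Only fire the real request when an endpoint is configured;
# ADMIN_QUERIES_URL and ADMIN_QUERIES_TOKEN are assumed names.
if os.environ.get("ADMIN_QUERIES_URL"):
    import requests

    requests.post(
        os.environ["ADMIN_QUERIES_URL"] + "/addUserToGroup",
        json=payload,
        headers={"Authorization": os.environ.get("ADMIN_QUERIES_TOKEN", "")},
    )
```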
However, it doesn't seem to be possible to create the groups themselves dynamically: I'd have to manually create each group every time a new organization is created, which pretty much kills my idea.
Any other idea on how I can achieve the result I'm aiming for?
I know that I can add the restriction in code, but this is less safe, and I'd rather have this constraint at the database layer.
Look into generating additional claims in your pre-token-generation handler.
Basically, you can create an attribute that stores the organization-to-role mapping, e.g.:
{
  // ...
  "custom:orgmapping": "OrgA:User,OrgB:Admin"
}
then transform it in your pre-token-generation handler into "pseudo" groups that don't actually exist in the pool.
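A minimal sketch of such a handler as a Python Lambda; the `custom:orgmapping` format and the `Org__Role` group naming are illustrative, but the `claimsOverrideDetails.groupOverrideDetails` response shape is the one Cognito's Pre Token Generation trigger expects:

```python
def lambda_handler(event, context):
    """Pre Token Generation trigger: turn a custom org/role attribute
    into pseudo group claims (groups that need not exist in the pool)."""
    attrs = event["request"]["userAttributes"]
    # e.g. "custom:orgmapping": "OrgA:User,OrgB:Admin"
    mapping = attrs.get("custom:orgmapping", "")
    pseudo_groups = [
        "{}__{}".format(*pair.split(":", 1))
        for pair in mapping.split(",") if ":" in pair
    ]
    # Override the cognito:groups claim in the issued tokens.
    event["response"] = {
        "claimsOverrideDetails": {
            "groupOverrideDetails": {
                "groupsToOverride": pseudo_groups,
            }
        }
    }
    return event
```

Your authorization logic can then treat `OrgA__User`-style claims exactly like ordinary group memberships, without pre-creating a Cognito group per organization.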
Some of the scheduled queries in Google Cloud Platform suddenly don't run anymore, with the message "Access Denied: ... User does not have bigquery.tables.get permission for table..."
First, is it possible to see under which user the scheduled query is running?
Second, is it possible to change the user?
Thanks, Silvan
I always use service accounts for command-line execution.
If you can use the bq CLI, look at --service_account and --service_account_credential_file.
If you still want to use the scheduled query, there is some documentation on service accounts at https://cloud.google.com/bigquery/docs/scheduling-queries (per above).
This can also be done (for a normal non-service account user) via the console as per the instructions at: https://cloud.google.com/bigquery/docs/scheduling-queries#update_scheduled_query_credentials
"To refresh the existing credentials on a scheduled query:
Find and view the status of a scheduled query.
Click the MORE button and select Update credentials."
Although this thread is 2 years old, it is still relevant, so I will guide you through troubleshooting this issue below:
Cause:
This issue happens when the user running the query does not have the required permissions. This could be caused by a permissions removal or by an update of the scheduled query's user.
Step 1 - Checking which user is running the query:
Head to GCP > BigQuery > Scheduled Queries.
On the scheduled queries screen, click the display name of the query that needs to be checked and open its configuration. There you will find the user that currently runs the query.
Step 2 - Understanding the permissions that are needed for running the query:
As specified on Google Cloud's website, you need three permissions:
bigquery.transfers.update and, on the dataset, bigquery.datasets.get and bigquery.datasets.update.
Step 3 - Check running user's permissions:
From the GCP menu, head to IAM & Admin > IAM.
There you will find the permissions assigned to different users. Verify the permissions possessed by the user running the query.
Now we can solve this issue in 2 different ways:
Step 4 - Edit current user's roles or update the scheduler's credentials with an email that has the required permissions:
Option 1: Edit the current user's roles: on the IAM screen you can click "Edit principal" next to a user to add, remove, or update roles (remember to add a role that grants the permissions mentioned in Step 2).
Option 2: Update the credentials (as @coderintherye suggested in another answer): head to GCP > BigQuery > Scheduled Queries, select the query you want to troubleshoot, click MORE (in the top-right corner of the screen) > Update credentials, and finally choose an account. WARNING: that account will now be the user that runs the query, so make sure it has the permissions mentioned in Step 2.
To change a scheduled query from a user to a service account, you need to:
make sure that the service account is from the same project as the project where you are running your scheduled query.
Both you as a user and the service account should have the appropriate permissions:
https://cloud.google.com/bigquery/docs/scheduling-queries#required_permissions
You can run a command from the CLI or python code to make the change from user to service account:
CLI:
bq update \
  --transfer_config \
  --update_credentials \
  --service_account_name=abcdef-test-sa@abcdef-test.iam.gserviceaccount.com \
  projects/862514312345/locations/us/transferConfigs/5dd12f12-0000-122f-bc38-089e0820fe38
Python:
from google.cloud import bigquery_datatransfer
from google.protobuf import field_mask_pb2

transfer_client = bigquery_datatransfer.DataTransferServiceClient()

service_account_name = "email address of your service account"
transfer_config_name = "projects/SOME_NUMBER/locations/EUROPE_OR_US/transferConfigs/A_LONG_ALPHANUMERIC_ID"

transfer_config = bigquery_datatransfer.TransferConfig(name=transfer_config_name)

transfer_config = transfer_client.update_transfer_config(
    {
        "transfer_config": transfer_config,
        "update_mask": field_mask_pb2.FieldMask(paths=["service_account_name"]),
        "service_account_name": service_account_name,
    }
)

print("Updated config: '{}'".format(transfer_config.name))
See also here for code examples:
https://cloud.google.com/bigquery/docs/scheduling-queries#update_scheduled_query_credentials
bq update --transfer_config --update_credentials --service_account_name=<service_accounnt> <resource_name>
service_account_name = the service account ID that you wish to use as the credential.
resource_name = the resource name of the scheduled query, shown in the configuration section of the scheduled query's detail page.
Using the RallyDev Web Services API v2.0 I would like to request the iterations for a users default project.
I can do this now by first calling:
https://rally1.rallydev.com/slm/webservice/v2.0/iteration:current?pretty=true
parsing out Iteration->Project->Ref, and then calling:
https://rally1.rallydev.com/slm/webservice/v2.0/project/[ProjectID]/Iterations?pretty=true
or
https://rally1.rallydev.com/slm/webservice/v2.0/iteration?query=(Project.Oid=[ProjectID])&pretty=true
Is there a better way?
I saw that UserProfile has DefaultProject and DefaultWorkspace, but I couldn't figure out how to use them, as fetching them just returned 'null'.
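The two-step lookup above can be sketched in Python as follows; the URL shapes come straight from the calls listed in the question, while the `ZSESSIONID` API-key header and the exact JSON field names are assumptions, and the live calls only run when a key is configured:

```python
import os

BASE = "https://rally1.rallydev.com/slm/webservice/v2.0"

def current_iteration_url() -> str:
    # First call: resolve the current iteration (whose payload
    # includes the owning Project's ref).
    return BASE + "/iteration:current?pretty=true"

def iterations_url(project_ref: str) -> str:
    # Second call: all iterations for the project ref parsed
    # out of the first response.
    return project_ref.rstrip("/") + "/Iterations?pretty=true"

# Only hit the live API when an API key is available;
# RALLY_API_KEY and the response field names are assumptions.
if os.environ.get("RALLY_API_KEY"):
    import requests

    headers = {"ZSESSIONID": os.environ["RALLY_API_KEY"]}
    cur = requests.get(current_iteration_url(), headers=headers).json()
    project_ref = cur["Iteration"]["Project"]["_ref"]
    iterations = requests.get(iterations_url(project_ref), headers=headers).json()
```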
Your queries on Iteration are spot-on for looking up Iterations for a particular Project. Note that for UserProfile, the Default Workspace/Project settings are not required fields: they are empty unless the User has explicitly set them in his or her profile settings. Only the user can set these; a (Workspace/Subscription) Administrator cannot set them on behalf of the User. So if you're getting empty values back, it is likely because the User in question has not set a Default Workspace/Project.