Is there a way to call custom code when an item is deleted within Sitecore? Or can I somehow attach code to a publish action?
You can subscribe to the item:deleted event. Check this article: Using events.
Attaching to item:deleted will only work in a single-server solution. If you have separate content delivery and content editing servers, it will be a bit more involved.
The GCP Python client method for creating a Monitoring notification channel, create_notification_channel, is not idempotent: duplicate channels under the same project are getting created. Is there a way to avoid the duplication?
According to the documentation it's not possible to handle duplicates directly, but you can solve the issue with the following steps (a sketch follows the list):
List the existing notification channels in GCP via list_notification_channels.
Loop over the notification channels you want to exist (for example, the ones configured in your code): if a channel already exists in GCP (check against the list retrieved previously), update it via update_notification_channel; otherwise create it with create_notification_channel.
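A minimal sketch of that list-then-update-or-create flow with the google-cloud-monitoring Python client might look like the following; the project id, the example email channel, and the use of display_name as the deduplication key are assumptions, not part of the original answer.

from google.cloud import monitoring_v3

client = monitoring_v3.NotificationChannelServiceClient()
project_name = "projects/my-project"  # hypothetical project id

# Channels we want to exist, keyed by display_name (assumption: display_name
# uniquely identifies a channel within this project).
desired_channels = [
    monitoring_v3.NotificationChannel(
        type_="email",  # proto field "type" is exposed as "type_" in Python
        display_name="on-call",
        labels={"email_address": "oncall@example.com"},
    ),
]

# Step 1: list the channels that already exist in the project.
existing = {
    ch.display_name: ch
    for ch in client.list_notification_channels(name=project_name)
}

# Step 2: update the channel if it already exists, otherwise create it.
for channel in desired_channels:
    match = existing.get(channel.display_name)
    if match:
        channel.name = match.name  # reuse the existing resource name
        client.update_notification_channel(notification_channel=channel)
    else:
        client.create_notification_channel(
            name=project_name, notification_channel=channel
        )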
So I'm trying to build a content pipeline with Kentico Cloud. One of the requirements is that pressing the big green Publish button is not the final step of the process. The published content then has to be collected, its representation transformed and forwarded elsewhere. Subscribing to the publish/unpublish webhook events and then processing the related content looked like the way to go, but apparently these sometimes fire before the content is available via the Delivery API.
What are my options? I really don't want to do polling - the nested structure of the content combined with the inability to filter by parent items makes it far from trivial.
As it turns out, the answer is right there in the API docs:
https://developer.kenticocloud.com/reference#list-content-types
X-KC-Wait-For-Loading-New-Content
If the requested content has changed since the last request, the header determines whether to wait while fetching content. This can be useful when retrieving changed content in reaction to a webhook call. By default, when the header is not set, the API serves old content (if cached by the CDN) while it's fetching the new content to minimize wait time. To always fetch new content, set the header value to true.
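For example, a raw call to the Delivery API with the header set might look like this; the project id is a placeholder and the endpoint shown is the generic items listing, so adjust it to whatever your webhook handler actually needs to fetch.

import requests

PROJECT_ID = "your-project-id"  # placeholder
url = f"https://deliver.kenticocloud.com/{PROJECT_ID}/items"

# Ask the Delivery API to wait for fresh content instead of serving a
# possibly stale cached version - useful inside a webhook handler.
response = requests.get(
    url,
    headers={"X-KC-Wait-For-Loading-New-Content": "true"},
    timeout=30,
)
response.raise_for_status()
items = response.json()["items"]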
Suppose I have a task of updating a user via a third-party API call. Is it okay to put the actual user data inside the message (if it fits)? Or should I only provide an ID in the message so the worker can retrieve the updated record from my local database?
You need to check what level of compliance is required for your infrastructure to decide what kind of data you can put in the queue.
If there aren't any compliance restrictions, you are free to put any kind of data in your own infrastructure on AWS.
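To make the two options from the question concrete, here is a small boto3 sketch of both message shapes; the queue URL and field names are made up, and which shape is acceptable depends on the compliance constraints mentioned above.

import json
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/user-updates"  # placeholder

# Option 1: self-contained message - the worker needs no extra lookup,
# but user data now lives in the queue.
sqs.send_message(
    QueueUrl=QUEUE_URL,
    MessageBody=json.dumps(
        {"user_id": 42, "email": "new@example.com", "name": "New Name"}
    ),
)

# Option 2: reference-only message - the worker re-reads the user from the
# local database, so the queue never stores personal data and the worker
# always sees the latest state of the record.
sqs.send_message(
    QueueUrl=QUEUE_URL,
    MessageBody=json.dumps({"user_id": 42}),
)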
Simple question, but I suspect it doesn't have a simple or easy answer. Still, worth asking.
We're creating an implementation for push notifications using AWS, with our web server running on EC2 sending messages to a queue on SQS, which is processed by Lambda and finally sent to SNS to be delivered to the iOS/Android apps.
The question I have is this: is there a way to query SNS endpoints based on the custom user data that you can provide on creation? The only way I see to do this so far is to list all the endpoints in a given platform application, and then search through that list for the user data I'm looking for... however, a more direct approach would be far better.
Why I want to do this is simple: if I could attach a user identifier to these device endpoints and query based on that, I could completely avoid having to save the ARN to our DynamoDB database. It would save a lot of implementation time and complexity.
Let me know what you guys think, even if what you think is that this idea is impractical and stupid, or if searching through all of them is the best way to go about this!
Cheers!
There's no way to have a "where" clause in ListTopics. I see two possibilities:
Create a new SNS topic per user that has some identifiable id in it. So, for example, the ARN would be something like "arn:aws:sns:us-east-1:123456789:known-prefix-user-id". The obvious downside is that you have the potential for a boatload of SNS topics.
Use a service designed for this type of usage like PubNub. Disclaimer - I don't work for PubNub or own stock but have successfully used it in multiple projects. You'll be able to target one or many users this way.
According to the AWS documentation, if you try to create a new Platform Endpoint with the same User Data, you should get a response with an exception that includes the ARN associated with the existing PlatformEndpoint.
It's definitely not ideal, but it would be a roundabout way of querying the User Data endpoint attribute via an exception.
// Query CustomUserData by exception
CreatePlatformEndpointRequest cpeReq = new CreatePlatformEndpointRequest()
        .withPlatformApplicationArn(applicationArn)
        .withToken("dummyToken")
        .withCustomUserData("username");
CreatePlatformEndpointResult cpeRes = client.createPlatformEndpoint(cpeReq);
You should get an exception containing the ARN if an endpoint with the same CustomUserData exists.
Then you just use that ARN and away you go.
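If the exception-based trick feels too fragile, the fallback described in the question, listing every endpoint of the platform application and filtering on CustomUserData client-side, looks roughly like this with boto3; the application ARN and the target user id are placeholders.

import boto3

sns = boto3.client("sns")
APPLICATION_ARN = "arn:aws:sns:us-east-1:123456789012:app/APNS/my-app"  # placeholder
target_user = "user-42"  # placeholder

# Page through every endpoint of the platform application and keep the ones
# whose CustomUserData matches - there is no server-side filter for this.
matching_arns = []
paginator = sns.get_paginator("list_endpoints_by_platform_application")
for page in paginator.paginate(PlatformApplicationArn=APPLICATION_ARN):
    for endpoint in page["Endpoints"]:
        if endpoint["Attributes"].get("CustomUserData") == target_user:
            matching_arns.append(endpoint["EndpointArn"])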
I use Camunda 7.2.0 and I'm not very experienced with it. I'm trying to write data to a database about the users who have done something with a process instance (I'm using REST services), so I can build some kind of reports later. The problem is that I don't know how to trigger my REST call (the one that sends information about the current user and assignee to the database) when a user assigns a task to somebody else or claims a task for himself. I see that the Camunda engine sends a request like
POST engine/engine/default/task/5f965ab7-e74b-11e4-a710-0050568b5c8a/assignee
{"userId": "Tom"}
As a partial solution I can think of creating a global variable "currentUser" and, on form load, checking whether the user is different from the current one; if so, call the REST service and update the variable. But this solution doesn't look correct to me. Is there a better way to do it? Thanks in advance.
You could use a task listener which updates your data when the assignee of a task is changed. If you want this behavior for every task, you can define a global task listener.