I have used one of Microsoft's templates to create a unique SharePoint list. One of the columns is labeled Priority. The issue is that when a priority changes, I have to go in and manually update the other priority numbers. This is a work-related SharePoint site, and it would be useful if the priority column could be updated automatically whenever a priority changes. I am also using Power Automate to send out notifications when a priority changes, so it can be tracked. Any assistance would be greatly appreciated.
Since you are already using Power Automate to send the notifications, all you need to do is add an Update item action to that same flow to update the priority numbers according to your requirements. As I am not sure about the pattern of your priority numbers, here is only a general example of updating a list item via Power Automate:
How do I update a list item after adding it with Power Automate
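If you ever need to perform the same update outside of Power Automate, the flow's Update item action boils down to one SharePoint REST call. A minimal Python sketch, where the site URL, list name ('Tasks'), item type, and item ID are all hypothetical, and an access token is assumed to come from your usual auth flow:

import requests

SITE = "https://contoso.sharepoint.com/sites/TeamSite"  # hypothetical site URL
ITEM_ID = 7                                             # hypothetical item id

headers = {
    "Authorization": "Bearer <access-token>",  # obtained via your auth flow
    "Accept": "application/json;odata=verbose",
    "Content-Type": "application/json;odata=verbose",
    "IF-MATCH": "*",            # overwrite regardless of the item's version
    "X-HTTP-Method": "MERGE",   # POST + MERGE = partial update of one item
}

# Update only the Priority field of a single list item.
resp = requests.post(
    f"{SITE}/_api/web/lists/getbytitle('Tasks')/items({ITEM_ID})",
    headers=headers,
    # The __metadata type name depends on your list; this one is hypothetical.
    json={"__metadata": {"type": "SP.Data.TasksListItem"}, "Priority": 2},
)
resp.raise_for_status()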
Let me know if you need more help.
I have an AWS Personalize solution with 3 successful solution versions trained with trainingMode = "FULL", using both the AWS console and the AWS SDK for Python. However, when I tried to create a new solution version with trainingMode = "UPDATE", as shown in the code below:
import boto3

personalize = boto3.client('personalize')
# solution_arn is the ARN of the existing solution with the FULL versions
personalize.create_solution_version(solutionArn=solution_arn, trainingMode="UPDATE")
I got back the following exception:
Exception has occurred: InvalidInputException
An error occurred (InvalidInputException) when calling the CreateSolutionVersion operation: There should be updates to at least one dataset after last active solution version with training mode set to FULL.
Is anyone else experiencing this issue? Is there anything I am missing that would let me train with UPDATE mode?
The purpose of trainingMode="UPDATE" is to process new items added to the items dataset (via PutItems or a bulk upload), as well as impression data for new interactions added to the interactions dataset, since the last FULL/UPDATE training. The new items and impressions are used to update the exploration feature for solutions created with the aws-user-personalization recipe; that is, to bring in new/cold items for exploration and to adjust the probabilities for existing cold items in further exploration. Note that UPDATE only brings in new items and impression data and does not retrain the model.
Therefore, if there are no dataset updates since the last FULL/UPDATE, there is no value in creating a new solution version with UPDATE.
Finally, keep in mind that Personalize automatically updates solution versions created with the aws-user-personalization recipe every two hours at no cost. This essentially does an UPDATE for you.
With User-Personalization, Amazon Personalize automatically updates the latest model (solution version) every two hours behind the scenes to include new data without creating a new solution version. With each update, Amazon Personalize updates the solution version with the latest item information and adjusts the exploration according to implicit feedback from users. This allows Amazon Personalize to gauge item quality based on new interactions for already explored items and continually update item exploration.
If you create a solution version with UPDATE, you will be charged for the server hours to perform the update. Practically speaking, the only time you would need to manually create a solution version with UPDATE is when you do not want to wait for the next automatic update.
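If you do decide to create one manually, the sequence is straightforward. A minimal boto3 sketch, with placeholder ARNs, that first adds an item so there is actually a dataset change for UPDATE to pick up:

import json
import boto3

personalize = boto3.client("personalize")
personalize_events = boto3.client("personalize-events")

# Placeholder ARNs for illustration.
ITEMS_DATASET_ARN = "arn:aws:personalize:...:dataset/my-group/ITEMS"
SOLUTION_ARN = "arn:aws:personalize:...:solution/my-solution"

# 1. Add a new item so there is a dataset change since the last FULL training.
personalize_events.put_items(
    datasetArn=ITEMS_DATASET_ARN,
    items=[{
        "itemId": "new-item-123",
        "properties": json.dumps({"category": "electronics"}),
    }],
)

# 2. Now an UPDATE solution version succeeds, because the items
#    dataset changed after the last active FULL solution version.
response = personalize.create_solution_version(
    solutionArn=SOLUTION_ARN,
    trainingMode="UPDATE",
)
print(response["solutionVersionArn"])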
I am building a recommendation system for a classified ads website; ads are added and deleted daily.
What I thought of is to use PutItems to add new ads with a field called status = 0. If a user deletes an ad, I will use the same PutItems API with the same ITEM_ID to update the stored item, and use a filter to select only ads with status = 0 when generating recommendations.
Is that correct? Will the PutItems API update the existing ad? And is there any way to delete the item?
Currently there is no way to remove items that were already added to Datasets.
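On the update question: yes, calling PutItems again with an existing ITEM_ID overwrites that item's metadata, so flipping the status field is a workable stand-in for deletion. A minimal sketch, with a placeholder ARN and assuming your items schema has a STATUS field (status = 0 meaning active, per your convention):

import json
import boto3

personalize_events = boto3.client("personalize-events")

# Placeholder ARN for illustration.
ITEMS_DATASET_ARN = "arn:aws:personalize:...:dataset/ads/ITEMS"

# Re-sending the same itemId overwrites the stored item, so setting
# status to "1" here marks the ad as deleted / no longer available.
personalize_events.put_items(
    datasetArn=ITEMS_DATASET_ARN,
    items=[{
        "itemId": "ad-123",
        # Property keys are camel-cased versions of the schema fields.
        "properties": json.dumps({"status": "1"}),
    }],
)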
Your workaround looks good; however, from my experience working with Personalize, the filter might decrease your recommendation quality.
To understand why, here is (more or less) the algorithm that Personalize uses for filtering recommendations:
Get recommended items for the user
Filter the recommendations using the filter expression
Return the first N recommended items left after filtering
Because the filtering is done after getting the recommendations, Personalize will simply fill the recommendations list with items that were further down the recommended list.
And there is a problem with that approach: items lower on the list have a lower "Score" value, which indicates the accuracy of the recommendation. That is why you will generally end up with worse recommendations, though it depends on how many unavailable ads (status ≠ 0) were recommended before being filtered out.
To check your recommendation scores, simply get recommendations in the Personalize web UI; it will return the list of recommendations along with their scores.
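For completeness, a minimal sketch of the filter setup and a filtered request, with placeholder ARNs and assuming a STATUS field in the items schema (status = 0 meaning active, matching the convention above):

import boto3

personalize = boto3.client("personalize")
runtime = boto3.client("personalize-runtime")

# Placeholder ARNs for illustration.
DATASET_GROUP_ARN = "arn:aws:personalize:...:dataset-group/ads"
CAMPAIGN_ARN = "arn:aws:personalize:...:campaign/ads-campaign"

# One-time setup: only recommend ads that are still active (status = 0).
flt = personalize.create_filter(
    name="only-active-ads",
    datasetGroupArn=DATASET_GROUP_ARN,
    filterExpression='INCLUDE ItemID WHERE Items.STATUS IN ("0")',
)
# Note: wait for the filter to reach ACTIVE status before using it.

# At request time the results come back already filtered, but the
# candidates were scored *before* the filter was applied.
recs = runtime.get_recommendations(
    campaignArn=CAMPAIGN_ARN,
    userId="user-42",
    filterArn=flt["filterArn"],
    numResults=10,
)
for item in recs["itemList"]:
    print(item["itemId"], item.get("score"))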
Better approach
If your ads are updated daily, then you can definitely work around it with the following steps (see the sketch after this list):
Create a Lambda function that is triggered every 24 hours
The Lambda fetches all of the ads and puts them into an S3 bucket as a CSV file; it should exclude ads that are no longer available (status ≠ 0)
Call the CreateDatasetImportJob API using any AWS SDK of your choice, pointing it at the data stored in the S3 bucket
Personalize will start the import job; when it finishes, all of the items are replaced with the newest dump
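A minimal Python Lambda sketch of those steps, with placeholder bucket name and ARNs, and a stubbed-out data access helper:

import csv
import io
import boto3

s3 = boto3.client("s3")
personalize = boto3.client("personalize")

# Placeholder names/ARNs for illustration.
BUCKET = "my-ads-bucket"
KEY = "personalize/items.csv"
ITEMS_DATASET_ARN = "arn:aws:personalize:...:dataset/ads/ITEMS"
IMPORT_ROLE_ARN = "arn:aws:iam::123456789012:role/PersonalizeImportRole"

def fetch_active_ads():
    # Stand-in for your own data access code; it should return only
    # ads that are still available (status = 0).
    return [{"id": "ad-1", "category": "cars"}]

def handler(event, context):
    ads = fetch_active_ads()

    # Build a CSV that matches the items dataset schema.
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["ITEM_ID", "CATEGORY"])
    for ad in ads:
        writer.writerow([ad["id"], ad["category"]])
    s3.put_object(Bucket=BUCKET, Key=KEY, Body=buf.getvalue())

    # Kick off the bulk import; when it finishes, the items dataset
    # is replaced with the contents of this file.
    personalize.create_dataset_import_job(
        jobName=f"ads-import-{context.aws_request_id}",
        datasetArn=ITEMS_DATASET_ARN,
        dataSource={"dataLocation": f"s3://{BUCKET}/{KEY}"},
        roleArn=IMPORT_ROLE_ARN,
    )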
However, this approach has some downsides.
If you are not using the User-Personalization (aws-user-personalization) recipe, then after each import of items you need to update your solution by creating a new solution version; otherwise, it will not include the changes made by the items dataset import job.
Creating a new solution version is quite slow and expensive, which is why I would recommend the User-Personalization recipe if you want to use this approach; and since the HRNN recipes are marked as legacy, it is a good idea to migrate anyway.
If you are using the User-Personalization recipe, then according to the AWS documentation:
Amazon Personalize automatically updates your latest solution version every two hours to include new data. Your campaign automatically uses the updated solution version. For more information see Automatic Updates.
So pretty much all of the work is done on the Personalize side, and you don't have to worry about retraining your solution after each items import job.
And the last problem...
Since the documentation claims that a User-Personalization solution is updated within two hours, you might end up recommending items that are no longer available for some short period of time. If you are updating items daily, that might be a significant problem.
To cover that case, I would recommend also using the filter approach that you mentioned. That way you get the benefits of both approaches and your recommendations are always valid.
In my org we would like to create a template for product backlog items that also includes a predefined list of tasks, to be attached to each product backlog item created from this template.
For example:
Task for adding configuration and settings
Task for documentation
Task for E2E test, etc.
I could not find an option to add tasks to a template, or to define conditions on the product backlog item that would add the tasks automatically.
Thanks !!!
According to your description, it seems you want to create work items in batch (a PBI plus its attached tasks) when applying a PBI work item template.
Creating a work item template with predefined linked tasks (or other work item types) is not supported; we can only set the values of fields in a template, and cannot link or create other child/related work items within it. Please refer to Use templates to add and update work items for details.
The only way I can think of is to write a script that calls the REST API (Work Items - Create) to create the PBI and the tasks, then links the tasks to that PBI.
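A minimal Python sketch of such a script, with hypothetical organization, project, PAT, and task titles, using the JSON Patch format that the work item REST API expects:

import base64
from urllib.parse import quote
import requests

# Hypothetical org/project/PAT for illustration.
ORG = "myorg"
PROJECT = "myproject"
PAT = "<personal-access-token>"
BASE = f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/wit/workitems"

auth = base64.b64encode(f":{PAT}".encode()).decode()
headers = {
    "Authorization": f"Basic {auth}",
    "Content-Type": "application/json-patch+json",
}

def create_work_item(work_item_type, title, parent_url=None):
    # Create a work item; optionally link it as a child of parent_url.
    body = [{"op": "add", "path": "/fields/System.Title", "value": title}]
    if parent_url:
        body.append({
            "op": "add",
            "path": "/relations/-",
            "value": {
                # Hierarchy-Reverse means "this item's parent is ...".
                "rel": "System.LinkTypes.Hierarchy-Reverse",
                "url": parent_url,
            },
        })
    resp = requests.post(
        f"{BASE}/${quote(work_item_type)}?api-version=7.0",
        headers=headers,
        json=body,
    )
    resp.raise_for_status()
    return resp.json()

# Create the PBI, then the predefined child tasks.
pbi = create_work_item("Product Backlog Item", "New feature")
for task in ["Configuration and settings", "Documentation", "E2E test"]:
    create_work_item("Task", task, parent_url=pbi["url"])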
I want to know list archival methods specific to SharePoint Online/O365. I don't think Record Center is applicable to SharePoint Online. I see a workflow as the only solution for moving the items to an archive list, but the problem I see here is that I will not be able to set metadata fields like CreatedBy, etc. on the archive list as expected.
1. Does Microsoft suggest any ways of list item archival?
2. Any 3rd party tool for the same?
3. Microsoft suggests a list can hold up to 50 million records; what is the real significance, and how does it affect performance?
4. If items are archived to a different list, how will the associated workflow be affected, and how can that be handled?
1. If you move list items between lists using the "Content and Structure" tool in Site Settings, the system fields such as Created will be preserved, as the items are moved and not copied. In-place record management of list items can be implemented by: manual declaration, content type policies, list workflows, or reusable workflows. Record Center is used for documents and not list items, but is available in SharePoint Online.
2. Lots of pricey third-party solutions.
3. Haven't seen the 50 million figure; here is a lengthy response to dealing with large lists in SharePoint Online.
4. Reusable workflows will "handle" being used in multiple lists if configured properly.
Personally, I recommend using content type policies that declare items in-place as records.
I have a product catalog with a few hundred categories in it, and I am dynamically creating an SqlDependency for each category in the catalog. The SqlCommands that these dependencies are based on differ only in the categoryID. The problem is that I want these dependencies to perform different actions depending on which SqlDependency fired them. How can I do that? Do I have to create a different OnChange event handler for each SqlDependency? Or is there a way for all these dependencies to fire the same OnChange event, with the handler knowing which dependency fired it, or receiving a parameter passed in during the dependency's creation?
This problem arose while trying to create an SqlDependency mechanism that works with AppFabric Cache.
Thank you in advance.
See if you can look into the cache tables that ASP.NET creates for you and the triggers that are created on the original tables. Once you see what is going on, you can create the tables and triggers yourself and implement the caching through your own ASP.NET code. It really is not that hard. Then, instead of reacting whenever the whole table is updated (which is what SqlDependency gives you), you can react when the relevant rows in that table are updated, refresh the relevant cache entries, or perform whatever unique actions you want. You are better off doing it yourself once you learn how!
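On the original question of sharing one handler: a common trick is to capture the category ID in a closure when wiring up each dependency, so a single callback body knows which dependency fired. A sketch of that pattern in Python (matching the other examples in this thread collection); with SqlDependency the C# equivalent is subscribing a lambda that closes over categoryID to each dependency's OnChange event:

def make_on_change(category_id):
    # Each dependency gets its own callback with category_id baked in,
    # but all callbacks share the same body.
    def on_change(sender, event_args):
        print(f"Category {category_id} changed, refreshing its cache")
        refresh_cache_for(category_id)
    return on_change

def refresh_cache_for(category_id):
    # Stand-in for your cache-invalidation logic.
    pass

# Wire up one callback per category.
handlers = {cid: make_on_change(cid) for cid in [101, 102, 103]}
handlers[102]("sender", "args")  # simulates dependency 102 firing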