iCloud records only seen when directly searched

If I go to the iCloud dashboard and search (Filter:) for a record by its exact recordName, the record is found. However, if I search using any range or inexact method (e.g. by date range), the record is not found. The record is also not returned in my Objective-C program when I query all records of that type. The affected records appear to be random: I recently inserted 40 or more records of this type (via software) and this behavior shows up in only 6 of them. I have tried using the dashboard to update the affected records, hoping this might trigger "update" events on the backend. Nothing I have tried fixes this!
I should add that my data is in the Development environment, not Production.
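For reference, the fetch-all query described above corresponds roughly to the sketch below, written against CloudKit JS rather than the Objective-C CloudKit API; the record type, database choice, and container configuration are placeholders, not my actual setup.

```typescript
// Sketch only: a CloudKit JS fetch-all query, roughly equivalent to the
// Objective-C query in my program. Record type, database, and configuration
// are placeholders.
declare const CloudKit: any; // provided at runtime by Apple's cloudkit.js script

const container = CloudKit.getDefaultContainer();
const database = container.privateCloudDatabase; // or publicCloudDatabase, as appropriate

// Query every record of the type (analogous to a CKQuery with a TRUEPREDICATE).
database.performQuery({ recordType: 'MyRecordType' }).then((response: any) => {
  if (response.hasErrors) {
    console.error(response.errors);
    return;
  }
  // The affected records are missing from this result set even though the
  // dashboard can find them by exact recordName.
  response.records.forEach((record: any) => console.log(record.recordName));
});
```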

I may have found a workaround for this problem. In the iCloud dashboard, I deleted the QUERYABLE index on the recordName field (the only index on this field) and then added it back. After waiting about five minutes, the problem appeared to be fixed. However, I believe this is likely a bug in the iCloud backend.

Related

Older Power BI .pbix using a SharePoint Online list that has had new fields added

I have a Power BI dashboard that was created some time ago. Its data source is a SharePoint Online list. Since the dashboard was created, several new fields have been added to the SharePoint list. Now I am being asked to add a new page to the dashboard that reports on those new fields. However, I have not found a way to get the existing Power BI list/dataset to show the new fields.
Refreshing the data does refresh the values, but refresh does not add the new fields.
I've spent the last 4 hours looking on the internet for a solution. The only thing I have been able to do so far is to attach the list again under a different name; the new fields DO show up when I do this. (I can't just replace the older Power BI list/dataset because several calculated columns and measures have been added to it.)
I can work with this and create the report, but is this the only way? It doesn’t seem like it should be.
Any help would be appreciated! Thank you!
(I'm using Power BI April 2021 and SharePoint Online)
So it looks like there's no good answer to this issue. I found that adding another instance of the referenced SharePoint list, which includes the new columns, did work (however inelegant). That seems to be the best direct answer when the older .pbix file must continue to be used.
What I ended up doing, though, was to create a new, separate .pbix file that includes the latest version of the SharePoint list. This was the best solution for my organization, since it allows us to stay focused on the specific manufacturing processes involved.
Thanks to #Jon and #Alejandro for their efforts to help!
If you have access to Power Automate, you could refresh the dataset by creating a flow so that at a given time (say, once or twice a day) the dataset gets refreshed with the newly created items.
Otherwise, if you are working with the Power BI service, you can schedule a refresh of the dataset directly from the workspace by going to the dataset's settings. You would need a gateway set up for that, which can be in personal mode or not.
Also, if you want to update the data in the service version, you can do it manually in the workspace.
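If a scripted refresh fits better than a flow, the Power BI REST API also exposes a dataset refresh endpoint. A minimal TypeScript sketch is below; the dataset ID and Azure AD access token are placeholders, and acquiring the token is out of scope here. Note that, as discussed above, a refresh reloads values but does not add newly created SharePoint columns to the existing dataset.

```typescript
// Sketch only: queue an on-demand refresh of a dataset via the Power BI REST API.
// DATASET_ID and ACCESS_TOKEN are placeholders; obtaining the Azure AD token
// (for example through a service principal) is not shown.
const DATASET_ID = '<dataset-id>';
const ACCESS_TOKEN = '<azure-ad-access-token>';

async function refreshDataset(): Promise<void> {
  const response = await fetch(
    `https://api.powerbi.com/v1.0/myorg/datasets/${DATASET_ID}/refreshes`,
    {
      method: 'POST',
      headers: { Authorization: `Bearer ${ACCESS_TOKEN}` },
    }
  );
  if (!response.ok) {
    throw new Error(`Refresh request failed: ${response.status} ${response.statusText}`);
  }
  // A 202 Accepted response means the refresh was queued; it completes asynchronously.
}

refreshDataset().catch(console.error);
```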

Is there a way to force Sitecore to sync MongoDB data with its SQL database?

I am setting up Sitecore xDB and am trying to test exactly what information gets through the system for authenticated and non-authenticated users. I would like to be able to make a change and see the results quickly in Sitecore. I found the setting to lower the session lifetime to 1 minute rather than 20. I have not found a way to force Sitecore to sync with Mongo on demand, or at least within 1-5 minutes rather than what currently appears to be about 20 minutes. Does such a mechanism exist, or is "rebuilding" the database explained here the only existing process?
See this blog post by Martina Welander for this and more good info about xDB sessions: https://mhwelander.net/2016/08/24/whats-in-a-session-what-exactly-happens-during-a-session-and-how-does-the-xdb-know-who-you-are/
You just need a utility page that calls System.Web.HttpContext.Current.Session.Abandon(). You may also want to redirect the user to a page that doesn't exist.
Update to address comment
My understanding is that once an xDB session has expired, processing should take place quickly. In the Sitecore.Analytics.Processing.Services.config file, the BackgroundService agent is set to run on an interval of 15 seconds by default.
You may just be seeing cached reporting data. Try clearing the cache using the /sitecore/admin/cache.aspx page. You could also decrease the defaultCacheExpiration setting for the reporting cacheProvider in the Sitecore.Analytics.Reporting.config file. The default is 10 minutes.

Sitecore Item Buckets When User Doesn't Have Read Access to All Items

If a user searches a Sitecore bucket and does not have read access to all of the items within the bucket, issues occur when they search.
Example:
Say there are 1000 items returned in a bucket search and the user only has read access to 100 of them. What ends up happening is that it still says "Your search has returned 1000 items". Worse yet, all of the items the user does not have read access to still show up in the list, but are blank.
This creates the illusion that no results were returned when, in reality, the user simply doesn't have access to any of the items for a given page index/size.
Has anyone run into this issue before? I'm guessing Sitecore is doing post-search processing on the items, which causes the items to be found by the search but not displayed.
My Thoughts:
One possible solution I was thinking about is tapping into the correct Sitecore pipeline and adjusting the Lucene HitCollector to verify user security. If I can find the correct pipeline, this would resolve the issue, but I'm wondering whether there is a better way.
I would much rather write code more specific to Sitecore buckets, such as code that adds a Lucene query term (searching a custom role field or something similar) that gets auto-added to the search based on the current user's roles.

Magento products not visible after being added via the REST API

I'm currently running into a problem while using a web service to load products into Magento.
I'm using the REST API in conjunction with OAuth to create products and assign them to a category. It works: when I go to the admin I can see the products and see that they are properly assigned to the correct category. When I open category management in the admin console, I can see that I have (for example) 106 items assigned to the category.
However, the problem is that the products do not show on the site, even after refreshing every cache and index.
When I open the admin console, open one product, and save it without changing any property, I can suddenly see the item in the front-end webshop.
I'm at a loss as to why this occurs. Also, with 19k product updates, it is becoming an annoying amount of work to update this many products, since no bulk-update method does the same thing as editing just one product at a time.
Any help is much appreciated.
In the end I discovered the answer myself; I thought it might be nice to list it here as well.
In the 'rights' tab I added all the access rights for the user that calls the API. This allowed me to read products, etc. A very silly mistake, but somehow I overlooked it at first.
If you'd expect security errors, you won't get any: just empty lists and null responses.

Ember choking upon encountering large data sets

I'm looking for a solution to an issue where large data sets force Ember to lock up the browser while it tries to process the data.
For pagination, I'm using tchak's handy pagination mixin to paginate approximately 13,000 objects loaded from a backend API.
The Ember Data objects contain an ID, one text attribute and several number attributes.
The problem is that it takes close to a minute before the browser finishes processing the data, rendering the browser unusable in the meantime. Firefox even goes as far as issuing a warning that a script is using up all browser resources and suggesting that the script be terminated.
I've written my own pagination mixin that requests objects by range, e.g. items 10-25, and it generally works well except for one serious limitation: sorting. To sort the data, I need to make additional requests to the backend and reload the objects, even if some of them have already been loaded.
I would love to be able to load all of the content upfront to simplify sorting without making additional requests to the backend API. I'm looking for guidance on how to tackle this issue, but I'm open to an entirely different approach.
If nothing else, is it possible to reduce the resource footprint Ember places on the browser as it tries to load all 13k objects into the ArrayController?
I'm using Ember 1.0.0-pre2 with the latest Ember Data (currently at Revision 10).
On the backend is Rails 3.2.8.
Update: I sidestepped the issue by loading the data into an ArrayController property other than content. This brought load times down from over a minute to only a few seconds. I then slice the requested number of items and load those into content. This works well for any number of items, at the cost of not being able to easily sort the data.
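That workaround looks roughly like the sketch below, written as plain TypeScript rather than actual Ember 1.0.0-pre2 code; the class and property names are illustrative only.

```typescript
// Sketch of the "load everything off to the side, slice into content" workaround.
// In the real app, `allRows` would live on the ArrayController as a property other
// than `content`, and `content` is what the templates bind to and render.
interface Row {
  id: number;
  name: string;
  value: number;
}

class PagedList {
  allRows: Row[] = []; // the full ~13k-item set, loaded once from the backend API
  content: Row[] = []; // only the small slice that actually gets rendered

  load(rows: Row[]): void {
    this.allRows = rows;
    this.showPage(0, 25);
  }

  // Only `size` objects are ever pushed into the rendered content, which is why
  // load times drop from over a minute to a few seconds.
  showPage(start: number, size: number): void {
    this.content = this.allRows.slice(start, start + size);
  }
}
```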
I suggest you take a look at Ember Table. The demo shows a table with 500,000 records and works very fast. Digging around in the source code might help.
Can't you query a view from your DB that handles the sorting? Pass the sort conditions in the query string, e.g. ?sortBy=name&sortAsc=true.
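A minimal sketch of that suggestion, assuming a hypothetical /items endpoint on the Rails backend that translates sortBy/sortAsc into an ORDER BY clause:

```typescript
// Illustrative only: ask the backend for one already-sorted page instead of
// sorting 13k objects in the browser. The endpoint and parameter names are assumed.
async function fetchSortedPage(
  page: number,
  perPage: number,
  sortBy: string,
  sortAsc: boolean
): Promise<{ items: unknown[]; total: number }> {
  const params = new URLSearchParams({
    page: String(page),
    per_page: String(perPage),
    sortBy,
    sortAsc: String(sortAsc),
  });
  const response = await fetch(`/items?${params.toString()}`);
  if (!response.ok) {
    throw new Error(`Request failed: ${response.status}`);
  }
  return response.json();
}

// Usage: fetch the first 25 items sorted by name, ascending.
fetchSortedPage(1, 25, 'name', true).then(({ items, total }) => {
  console.log(`Showing ${items.length} of ${total} items`);
});
```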