Sitecore returns incorrect items in multi-server farm

I have a Sitecore website deployed in a multi-server environment. When I make changes to Sitecore items, sometimes they are shown correctly, but sometimes old data appears.
I understand that Sitecore caches items, but it sometimes shows wrong data and sometimes it's fine. If it were simply caching, it should at least return the same data consistently.
For example:
Sitecore.Globalization.Translate.TextByDomain("MyDictionary", "Category");
Sometimes it returns the correct data and sometimes it returns stale data, i.e. the value from before I changed the item.
I am using Sitecore 8.0

Items get cached in memory on the individual servers, and these caches are not cleared unless you enable the event queue. Content might additionally be cached in the output cache, which needs to be cleared after you publish.
Here is a guide on how to enable the event queue, and here is also a good description.
Here is how to make your sites clear the output cache after a publish.
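For reference, clearing the output cache on publish typically boils down to a small event handler registered under the publish:end and publish:end:remote events. Here is a minimal sketch, assuming that config wiring is in place and using "website" as a placeholder site name:

// Minimal sketch of a publish:end handler that clears the HTML (output) cache.
// Assumes it is registered under publish:end and publish:end:remote in config;
// "website" is a placeholder for your real site names.
using System;
using Sitecore.Caching;
using Sitecore.Configuration;

public class ClearHtmlCacheOnPublish
{
    public void OnPublishEnd(object sender, EventArgs args)
    {
        foreach (var siteName in new[] { "website" })
        {
            var site = Factory.GetSite(siteName);
            if (site == null)
            {
                continue;
            }

            // The HTML cache can be null if output caching is disabled for the site.
            var htmlCache = CacheManager.GetHtmlCache(site);
            if (htmlCache != null)
            {
                htmlCache.Clear();
            }
        }
    }
}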

Thanks Jens for your help. The links really helped my understanding of a Sitecore farm.
But the issue turned out to be rather silly: for some reason, on one content delivery server, the Application Pool account didn't have permission on the virtual directory.

Related

Sitecore Database Cleanup Fails

Sitecore 6.6
I'm speaking with Sitecore Support about this as well, but thought I'd reach out to the community too.
We have a custom agent that syncs media on the file system with the media library. It's a new agent, and we made the mistake of not monitoring the database size. It should be importing about 8 GB of data, but the database ballooned to 713 GB in a pretty short amount of time. It turns out the "Blobs" table in both the "master" and "web" databases is holding pretty much all of this space.
I attempted to use the "Clean Up Databases" tool from the Control Panel. I only selected one of the databases. This ran for 6 hours before it bombed due to consuming all the available locks on the SQL Server:
Exception: System.Data.SqlClient.SqlException
Message: The instance of the SQL Server Database Engine cannot obtain a LOCK
resource at this time. Rerun your statement when there are fewer active users.
Ask the database administrator to check the lock and memory configuration for
this instance, or to check for long-running transactions.
It then rolled everything back. Note: I increased the SQL and DataProvider timeouts to infinity.
Has anyone else dealt with something like this? It would be good if I could 'clean up' the databases in smaller chunks to avoid overwhelming the SQL Server.
Thanks!
Thanks for the responses, guys.
I also spoke with support and they were able to provide a SQL script that will clean the Blobs table:
DECLARE @UsableBlobs table(
    ID uniqueidentifier
);

INSERT INTO @UsableBlobs
SELECT convert(uniqueidentifier, [Value]) AS EmpID
FROM [Fields]
WHERE [Value] != ''
  AND (FieldId = '{40E50ED9-BA07-4702-992E-A912738D32DC}'
    OR FieldId = '{DBBE7D99-1388-4357-BB34-AD71EDF18ED3}');

DELETE TOP (1000) FROM [Blobs]
WHERE [BlobId] NOT IN (SELECT ID FROM @UsableBlobs);
The only change I made to the script was to add the "top (1000)" clause so that it deletes in smaller chunks. I eventually upped that number to 200,000, and it would run for about an hour at a time.
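If you want to keep each chunk small without babysitting the query, a simple batch runner can re-execute the delete until nothing is left. A hedged sketch (the connection string is a placeholder, and the SQL is the script above with SET NOCOUNT ON plus a SELECT @@ROWCOUNT so the deleted row count can be read back):

// Sketch of a chunked cleanup loop; re-runs the DELETE until no rows remain.
using System;
using System.Data.SqlClient;

class BlobCleanupRunner
{
    // The script above, with SET NOCOUNT ON so only the final SELECT returns a
    // result, and SELECT @@ROWCOUNT so we can see how many rows the DELETE removed.
    const string BatchSql = @"
        SET NOCOUNT ON;
        DECLARE @UsableBlobs table(ID uniqueidentifier);

        INSERT INTO @UsableBlobs
        SELECT convert(uniqueidentifier, [Value]) FROM [Fields]
        WHERE [Value] != ''
          AND (FieldId = '{40E50ED9-BA07-4702-992E-A912738D32DC}'
            OR FieldId = '{DBBE7D99-1388-4357-BB34-AD71EDF18ED3}');

        DELETE TOP (1000) FROM [Blobs]
        WHERE [BlobId] NOT IN (SELECT ID FROM @UsableBlobs);

        SELECT @@ROWCOUNT;";

    static void Main()
    {
        // Placeholder connection string; point it at the master or web database.
        var connectionString = "Server=.;Database=Sitecore_Master;Integrated Security=true";

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            int deleted;
            do
            {
                using (var command = new SqlCommand(BatchSql, connection))
                {
                    command.CommandTimeout = 600; // each chunk should finish well within this
                    deleted = (int)command.ExecuteScalar();
                }
                Console.WriteLine("Deleted {0} orphaned blob rows", deleted);
            } while (deleted > 0);
        }
    }
}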
Regarding the cause, we're not quite sure yet. We believe our custom agent was running too frequently, causing the inserts to stack on top of each other.
Also note that there was a Sitecore update that apparently addressed a problem with the Blobs table getting out of control: Sitecore 6.6, Update 3.
I faced such a problem previously, and we contacted Sitecore Support.
They gave us a Sitecore Support DLL and suggested a Web.config change for the data provider -- from the main type="Sitecore.Data.$(database).$(database)DataProvider, Sitecore.Kernel" to the new one.
The reason I am posting on this question of yours is that most of the time taken for us was in cleaning up blobs, and they gave us this DLL to increase the Blobs cleanup speed. So I think it might help you too.
Hence, I would suggest that you contact Sitecore Support about your case; I am sure you will get the best solution.
Hope this helps you!
Regards,
Varun Shringarpure
If you have a staging environment, I would recommend taking a copy of the database and trying to shrink it. Part of the database size might also be related to the transaction log.
If you have a DBA, please get him (or her) involved.

Timeline items on Glass file structure

I assume that the cards on the timeline are stored locally on Glass; I was wondering whether details about where and how they are stored are available. I see the images are stored in a particular cache folder, but I'm not sure how they are organized. Or is this something that Google doesn't want played with at this point?
I was trying to reorganize timeline items (via the Mirror API) into bundles by content type, but forgot about the restrictions imposed by OAuth, so I assume a native or root application would be required.
The local timeline items are organized by time received.
I think the timeline items are hosted somewhere else. Glass then syncs up (I think using GoogleCloudMessaging) with the hosted timeline, and the items are then fetched and stored locally.
I'm not sure if this answers your question; however, I want to bring this thread back up, as I was about to ask the same question regarding the hosting of the timeline items in both scenarios.

Sitecore performance enhancements

We need our Sitecore web application to process 60-80 web requests per second. We are using Sitecore 7.0. We have tried a deployment with one web server and one database server, but it only processes 20-25 requests per second; the web server queues up all the other requests in memory, and as we increase the load, the memory fills up. (We have applied all the recommended Sitecore performance enhancements.) We need 4x performance to reach the goal :).
Will it be possible to achieve this goal by upgrading the existing servers, or do we have to add more web servers to the production environment?
Note: We are using Lucene indexing as well.
Here are some things you can consider without changing the overall architecture of your deployment:
CDN to offload media and static asset requests
This leaves your content delivery server available to handle important content queries and display logic.
Example: www.cloudflare.com
Configure and use Sitecore's built-in caching
This is from the guide:
Investigation and configuration of the Sitecore Caches is broken down into multiple tasks. This way each task is more focused and simplified. The focus is on configuration and tuning of the Sitecore Database Caches (prefetch, data, and item caches.) For configuration of the output rendering caching properties, the customer should be made aware of both the Sitecore Cache Configuration Reference and the Sitecore Presentation Component Reference as to how properly enable and the properties to expire these caches.
Check out the Sitecore Tuning Guide
Find Slow Queries or Controls
It sounds like your application follows Sitecore best practices, but I'll leave this note in for anyone who might find this answer. Use Sitecore's built-in debug mode to identify the slowest-running controls and sublayouts. Additionally, if you have Analytics set up, there is a "Slow Pages" report that might give you some information on where your application is slowing down.
Those things being said, if you're prepared to provision additional servers and set up a load-balanced environment then read on.
Separate Content Delivery and Content Management
To me the first logical step before load-balancing content delivery servers is to separate the content management from the equation. This is pretty easy and the Scaling Guide walks you through getting the HistoryEngine set up to keep those Lucene indexes up to date.
Set up Load Balancer with 2 or more Content Delivery servers
Once you've done the first step, this can be as easy as cloning your content delivery server and adding it to your load balancer "pool". There are a couple of things to consider here: Does your web application allow users to log in? If so, you'll need to worry about sticky sessions or machine keys. Does your web application use file media instead of blob media? I haven't had to deal with this, but I understand that's another consideration.
Scale your SQL solution
I've seen applications with up to four load-balanced content delivery servers where the SQL Server did not have a problem. I think this will be unique to each case, depending on a lot of factors: the horsepower and tuning of SQL Server, the content model of your application, the complexity of your queries, the caching configuration on the content delivery servers, etc. Again, the Scaling Guide covers SQL mirroring and failover, so that is going to be your first stop on getting that going.
Finally, I would say contact Sitecore. These guys have probably seen more of what's gone right and what's gone wrong with installations and could get you on the right path. Good luck!
This answer written from a Sitecore developer perspective:
Bottom line: You need to figure out exactly where your performance bottleneck is. That is going to take some digging, but will be very worthwhile. You should definitely be able to serve 60-80 requests/s without any trouble... but of course that makes a lot of assumptions about the nature of your site and the requests.
For my site, I found Sitecore's caching implementation to be sub-par, so I created some very simple and aggressive application-specific caches in my app, and this made all the difference in the world. For instance, we have 900+ "Partner" items where our sites' advertisements live, and simply putting all of these objects in an array in the Application object sped up page requests significantly. Finding an object in a Hashtable indexed by its Item.Name or ID is going to be a lot faster than Sitecore.Context.Database.GetItem("/itempath") or a SelectItems() call (at least, that's my experience). If your architecture and data set allow this strategy, we've had good experience with it.
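Here is a minimal sketch of that idea (the item path and the "Partner" naming are assumptions; substitute your own content root):

// Minimal sketch of an application-specific lookup cache, loaded once and
// then served from memory. "/sitecore/content/Partners" is a placeholder path.
using System.Collections.Generic;
using Sitecore.Data.Items;

public static class PartnerCache
{
    private static readonly object SyncRoot = new object();
    private static volatile Dictionary<string, Item> _partnersByName;

    public static Item GetPartner(string name)
    {
        if (_partnersByName == null)
        {
            lock (SyncRoot)
            {
                if (_partnersByName == null)
                {
                    // One Sitecore call on first use; dictionary lookups afterwards.
                    Item root = Sitecore.Context.Database.GetItem("/sitecore/content/Partners");
                    var map = new Dictionary<string, Item>();
                    if (root != null)
                    {
                        foreach (Item child in root.Children)
                        {
                            map[child.Name] = child;
                        }
                    }
                    _partnersByName = map;
                }
            }
        }

        Item partner;
        return _partnersByName.TryGetValue(name, out partner) ? partner : null;
    }
}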
Another thing to watch out for is XSLT renderings. Personally, I avoid them completely in favor of ASP.NET UserControls. XSLT rendering is just slow: as much as 10x slower than a native UserControl producing the same HTML. So if you have a few of these, replace them with some custom code and you'll see a world of difference.
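To illustrate the replacement, a sublayout code-behind rendering the same field can be tiny. A sketch, where the litTitle control and the "Title" field name are assumptions:

// Sketch of a sublayout code-behind replacing an XSLT rendering.
// Assumes the .ascx declares <asp:Literal ID="litTitle" runat="server" />
// and that the context item has a "Title" field; both are placeholders.
using System;
using Sitecore.Web.UI.WebControls;

public partial class TitleSublayout : System.Web.UI.UserControl
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // FieldRenderer renders the field markup much as an XSLT sc:field
        // call does, and keeps the field editable in the Page Editor.
        litTitle.Text = FieldRenderer.Render(Sitecore.Context.Item, "Title");
    }
}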

How do I stop Sitecore caching my custom data provider?

I have a custom data provider pulling in data via a webservice.
I have pretty much followed this example: http://www.techphoria414.com/Blog/Black-Art-of-Sitecore-Data-Providers.aspx
Sitecore seems to be caching my data provider's results, though. For example, when I load my application for the first time, the data gets pulled from the provider as I would expect. However, if I collapse the tree and reopen it, I would expect my provider to be called again, but it is not.
I thought the data might be stored in the prefetch cache or in a Lucene index, but clearing those didn't cause a call back to my provider.
How do I tell it to always get data from my data provider?
It turns out the issue was a bug in my own code. If you want to disable caching on your provider, though, you can do:
this.CacheOptions.DisableAll = true;
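For context, CacheOptions comes from Sitecore's DataProvider base class, so the flag is typically set in the provider's constructor. A minimal sketch (the provider name is a placeholder):

// Minimal sketch; WebServiceDataProvider is a placeholder for your provider class.
using Sitecore.Data.DataProviders;

public class WebServiceDataProvider : DataProvider
{
    public WebServiceDataProvider()
    {
        // Ask Sitecore not to cache anything this provider returns.
        this.CacheOptions.DisableAll = true;
    }
}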

Use cookies without sending them back to the server

I need a way to stash some data that is global to the browser. If I open a new window with a URL from my app, e.g. via a bookmark, I need to access some data that was created in another window and never sent to the server.
As far as I can tell, the only thing that is global to the browser, and not just to a window (like window.name), is a cookie. The problem I'm running into is that if I set a cookie, it is then sent with every request to the server, but I don't ever want this data on the wire. Is there any way to set a cookie and use it purely as a bucket for storing some data, without ever sending that data to the server?
The HTML 5 storage API looks like exactly what you want here, but unfortunately it's only supported by a handful of browsers right now.
Is there any way to set a cookie and just use it purely as a bucket for storing some data and never send that data to the server?
No.
You'll need to look into a plugin that provides a dedicated offline storage facility, or use the HTML5 storage API and tell everyone to upgrade their browsers.
If you decide to go the plugin route, as far as I am aware you have three options:
Google Gears
Flash - it has an offline storage facility; you could write a small Flash app to store things using this facility, then interop with it from JavaScript.
Silverlight also has offline storage - as with Flash, you could write a small app to do the storage, then interop with it from JavaScript.
I'd probably look into using Flash first, as everyone already has it.
Development would likely be a lot easier if you were to use Silverlight. It's not as widely installed, but it is spreading pretty rapidly. Last I heard*, something like 30% of browsers had it installed, which is pretty impressive.
Google Gears would unfortunately be a distant third. People are going to install Flash and Silverlight for other reasons, but nobody has Gears.
*This is an entirely unsubstantiated figure, but it does seem to fit with what I've seen on various people's computers, etc.
Can you mandate that your users install Google Gears? It's a JavaScript API that lets you store local info and also persist it between sessions, which may be useful for your app.
Why not just read a field in the parent window using window.opener? Or, if you have three windows running (a parent and two children, which I think you might be implying), then read/write to a hidden field in the parent from the children.
It sounds like your app is running 100% locally; if that is the case, the browser isn't the way to go anyway, and cookies can be easily deleted. If your app isn't local, the web server should be the one supplying the information. Cookies are never the correct way to store sensitive information or information that should persist over longer periods of time.